Search Results

Search found 81493 results on 3260 pages for 'file size'.


  • Win'08 - Extend volume size on SAN attached storage in a failover cluster

    - by user53207
    Running Windows Server 2008, I'd like to extend the volume of a SAN-attached drive that is part of a failover cluster. The SAN team has allocated additional drive space, which is visible in Windows Storage Manager. However, the "Extend Volume" option is disabled, as is the option to convert the disk to a dynamic disk. Is extending volumes disabled when the disk is part of a failover cluster, or is it simply not available for SAN-attached storage?

    Read the article

  • Limit share size on a LaCie d2 Network

    - by zote
    I share a LaCie d2 Network with my brother. We are planning to have 3 shares: a 1000 GB public share and 250 GB for each of us. The shares are created, but how can I set a limit (quota) on a share, to prevent one of us from taking up all the NAS space?

    Read the article

  • Config file (App.config) does not update on new installation

    - by Muhammad Kashif Nadeem
    I am creating a setup for my project using Visual Studio 2008 and am having a problem with the setup installation. If I uninstall the old application and then install the new setup, the config file (App.config) is updated (it is certainly a new file), but if I install the new setup without uninstalling the old one, the config file is not updated. By config file I mean MyProject.exe.config. Why does the config file behave this way? Shouldn't it be updated when the new setup is installed? Is it possible to delete the old config file and copy in the new setup's copy? Is there a way to forcefully update only the config file during installation? Thanks for your help!

    Read the article

  • NFSv3 Asynchronous Write Depends on Block Size?

    - by Joe Swanson
    I am trying to figure out whether my NFSv3 deployment is performing safe asynchronous writes. I suspect that it is doing strictly synchronous writes, as I am getting poor performance in general. I used Wireshark to look at the 'stable' flag in write calls and to look for 'commit' calls. I noticed that, with especially large block sizes, writes appear to be performed asynchronously:

    dd if=/dev/zero of=/proj/re3/0/zero bs=2097152 count=512

    However, smaller block sizes appear to be written strictly synchronously:

    dd if=/dev/zero of=/proj/re3/0/zero bs=8192 count=655360

    What gives? How does the client decide whether to tell the server to perform writes synchronously or asynchronously? Is there any way I can get smaller block sizes to be written asynchronously?
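
    A minimal sketch (not taken from the original post) that mimics the dd comparison in Python: it writes the same amount of data with a large and a small block size and reports throughput. A large gap between the two runs would be consistent with small blocks being committed synchronously. The target path is a placeholder for a file on the NFS mount.

    import os
    import time

    PATH = "/proj/re3/0/zero"      # placeholder: a file on the NFS mount
    TOTAL = 1 << 30                # write 1 GiB per run

    def timed_write(block_size):
        buf = b"\0" * block_size
        fd = os.open(PATH, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
        start = time.monotonic()
        written = 0
        while written < TOTAL:
            written += os.write(fd, buf)
        os.fsync(fd)               # force the final commit so both runs are comparable
        os.close(fd)
        return TOTAL / (time.monotonic() - start) / 1e6   # rough MB/s

    for bs in (2 * 1024 * 1024, 8 * 1024):
        print(f"block size {bs:>8}: {timed_write(bs):8.1f} MB/s")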

    Read the article

  • Proper 16:9 video size for non-HD 4:3 video (for YouTube/Vimeo)

    - by Xeoncross
    Since high-definition video came out on all the online sites, the default aspect ratio of the player has changed from 4:3 to 16:9. This means that people posting SD video have to resize some of their videos to get them to fit right. For example, NTSC DVD quality (aka 480i/p) is 720x480 pixels (width x height), while low-end high definition (720p) is 1280x720. Now that the video players are built for HD, uploading standard-quality videos results in videos that are "letterboxed", meaning they have extra black bars on the top and bottom (or sides). Correct me if I'm wrong, but to get a 720x480 video to fit a box that is designed for HD, the best practice would be to crop some of it off so that it fits as 720x405 (or 720x404 if the height must be even), since:

    16/9 = 1.78 (1.7777777777778)
    720/405 = 1.78
    405 x 1.78 = 720.9

    The same would stand for 640x480 (old TV quality) video, which would need to become 640x360, correct? I'm asking because I'm not sure about all this and whether this is the proper way to fix these letterboxing/display problems.
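
    As a quick check of the arithmetic, a short Python sketch (assuming square pixels and a 16:9 target, with the height rounded down to an even number of lines, as many encoders expect) that computes the crop height for a given source width:

    from fractions import Fraction

    TARGET = Fraction(16, 9)

    def crop_height(width):
        exact = width / TARGET          # exact 16:9 height for this width
        h = round(exact)
        return h - (h % 2)              # round down to an even number of lines

    for w in (720, 640):
        print(f"{w} wide -> {w}x{crop_height(w)}")
    # 720 wide -> 720x404   (the exact value is 405, rounded down to an even height)
    # 640 wide -> 640x360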

    Read the article

  • Recommended boot partition size for Windows 7

    - by dwj
    I started using One Big Partition for everything, separating data out with folders, when I got my current computer years ago. I'm preparing to upgrade my system from Windows XP to Windows 7, and I thought I might go back to putting my data on a separate partition. Most likely I'll just use the default OS install. My current Program Files tree has ~16 GB of stuff. Thinking ahead, though, I've had XP installed for years; who knows what apps I'm going to install down the line? This, of course, raises the question: how big do I make my Windows 7 install partition?

    Read the article

  • What could cause the file command in Linux to report a text file as data?

    - by Jonah Bishop
    I have a couple of C++ source files (one .cpp and one .h) that are being reported as type 'data' by the file command in Linux. When I run the file -bi command against these files, I'm given this output (the same output for each file):

    application/octet-stream; charset=binary

    Each file is clearly plain text (I can view them in vi). What's causing file to misreport the type of these files? Could it be some sort of Unicode thing? Both of these files were created in Windows-land (using Visual Studio 2005), but they're being compiled in Linux (it's a cross-platform application). Any ideas would be appreciated.

    Update: I don't see any null characters in either file. I found some extended characters in the .cpp file (in a comment block) and removed them, but file still reports the same encoding. I've tried forcing the encoding in SlickEdit, but that didn't seem to have an effect. When I open the file in vim, I see a [converted] line as soon as the file opens. Perhaps I can get vim to force the encoding?
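
    One way to narrow this down is a small diagnostic sketch (a Python illustration, not from the original post): file usually falls back to "data" when it hits something like an unexpected byte-order mark or stray control bytes, so this scans a file and reports the first suspicious byte it finds.

    import sys

    ALLOWED_CTRL = {0x09, 0x0A, 0x0D}  # tab, LF and CR are normal in text files

    def first_suspicious_byte(path):
        data = open(path, "rb").read()
        if data[:2] in (b"\xff\xfe", b"\xfe\xff"):
            return 0, "UTF-16 byte-order mark"
        for offset, byte in enumerate(data):
            if byte < 0x20 and byte not in ALLOWED_CTRL:
                return offset, f"control byte 0x{byte:02x}"
        return None

    for name in sys.argv[1:]:
        hit = first_suspicious_byte(name)
        print(name, "->", "looks like plain text" if hit is None else f"{hit[1]} at offset {hit[0]}")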

    Read the article

  • Buying Dual Monitors of different size and resolution

    - by rutherford
    I'm about to go choose a dual-monitor setup:

    1) Is there any reason why I can't just walk out and buy the two TFT screens I like (a widescreen and a 'portrait' screen) and combine them? Mainly, the widescreen would be for gaming and the portrait one for browsing. I'd want the desktop stretched from one to the other (i.e. dragging the pointer and apps from one screen to the other).

    2) Do I need a separate graphics card for each monitor, or can one cover both? Is there any performance cost?

    3) Can I have a separate background image for each, seeing as they'll be different resolutions?

    Read the article

  • Loopback connection via PHP's getimagesize crashes server (Magento CMS)

    - by Alex
    We were able to trace a problem that is crashing our NGINX server running Magento down to the following point.

    Background info: the Magento backend has a CMS function with a WYSIWYG editor. This editor loads some pictures via a Magento controller (cms/directive). When we set the NGINX error_log level to info, we get the following lines (line break inserted for better readability):

    2012/10/22 18:05:40 [info] 14105#0: *1 client closed prematurely connection, so upstream connection is closed too while sending request to upstream, client: XXXXXXXXX, server: test.local, request: "GET index.php/admin/cms_wysiwyg/directive/___directive/BASEENCODEDIMAGEURL,,/ HTTP/1.1", upstream: "fastcgi://127.0.0.1:9024", host: "test.local"

    When checking the code in the debugger, the following call never returns (in Varien_Image_Adapter_Abstract::getMimeType()):

    # $this->_fileName is http://test.local/skin/adminhtml/base/default/images/demo-image-not-existing.gif
    # $_SERVER['REQUEST_URI'] = http://test.local/admin/cms_wysiwyg/directive/___directive/BASEENCODEDIMAGEURL
    list($this->_imageSrcWidth, $this->_imageSrcHeight, $this->_fileType, ) = getimagesize($this->_fileName);

    The requested filename is a URL pointing back to the same server that is running the script: a link to a static .gif that does not exist (sample URL: http://test.local/skin/adminhtml/base/default/images/demo-image-not-existing.gif). Once the line above executes, the NGINX server stops responding to any subsequent request. After waiting around 10 minutes, it starts answering requests again. I tried to reproduce the error with a simple test script that only calls getimagesize() with the given URL, but this does not crash; it simply leads to an exception saying that the URL could not be loaded (which is fine, as the URL is wrong).

    Read the article

  • Rename/delete a folder from a multipart RAR file

    - by kikio
    Hello, I have a question (I sent it before). I have a multipart RAR file. Its contents are:

    file.part01.rar: myfolder (a folder), data.cab
    file.part02.rar: myfolder (a folder), data.cab
    file.part03.rar: myfolder (a folder), data.cab
    file.part04.rar: difffolder (a folder), anfolder (a folder), data.cab
    file.part05.rar: myfolder (a folder), data.cab

    I want to extract it, so I right-click on "file.part01.rar" and select "Extract to...". It extracts 3 parts, but on part 4 WinRAR reports a CRC error: "This file is corrupt." I think the problem is the folder names in part04.rar. Is there any way to rename the folders in part04.rar, and to move "data.cab" from "anfolder" to "difffolder"? I really need this; it is urgent.

    Read the article

  • How to restore from file using Symantec NetBackup 7.5

    - by Tony
    I have an install of Symantec NetBackup 7.5 and I want to restore the server from a NetBackup image file. The file was created using NetBackup before I arrived. We had a hardware failure that corrupted this server and it needed to be rebuilt; now we want to restore from this image file. I can't for the life of me figure out how to restore from that file. I've installed the NetBackup application, but it can't find the file when I use the restore command within the application. If I double-click the file, the application opens but then gives me the same "can't find any NetBackup files" error. I also can't simply drag the file into the NetBackup window. Any advice on how to restore from this file would be appreciated, thank you.

    Read the article

  • What is the difference between the BIN files generated by ImgBurn and UltraISO

    - by user275517
    I have a CD that I would like to turn into a BIN file (with a CUE file to accompany it). I used ImgBurn and UltraISO to generate two BIN files. However, I have found that the BIN files generated by these programs are not identical (they have different file sizes). So, what is the difference between the BIN file formats, and which one should I use to back up the CD? The same applies to ISO files generated by these two programs: the file sizes do not match.
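
    A rough way to see what each tool did is to check which common CD sector layout the file size fits. The Python sketch below is an illustration based on the usual layouts (2,048 user-data bytes per sector for ISO-style images, 2,352 raw bytes per sector for BIN/CUE rips) and simply guesses the layout from the size:

    import os
    import sys

    SECTOR_LAYOUTS = {2048: "cooked / ISO-style (user data only)",
                      2352: "raw BIN/CUE (full 2,352-byte sectors)"}

    def guess_layout(path):
        size = os.path.getsize(path)
        hits = [f"{size // s} sectors of {s} bytes ({desc})"
                for s, desc in SECTOR_LAYOUTS.items() if size % s == 0]
        return "; ".join(hits) if hits else "size fits neither common sector layout"

    for image in sys.argv[1:]:
        print(image, "->", guess_layout(image))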

    Read the article

  • SQL Error (1064) when importing data from SQL file

    - by mejpark
    I have a MySQL database which was originally set up with the default latin1 character set and latin1_swedish_ci collation. I was using the database like this for some time, until I noticed strange characters on my production web site, which is powered by a database exported from my development machine. At that point, I changed the default character set of the database and tables to utf8 and the collation to utf8_unicode_ci, converted the latin1 data inside each table to utf8 (using the 'convert data' option) and exported the database as a single SQL file using HeidiSQL.

    When the resulting SQL file is opened in Notepad++, several characters are rendered incorrectly. For example, en dashes (–) are displayed as – and e with an accent (é) is displayed as é. I changed the encoding of the file from ANSI to UTF-8 (using the encoding menu option in Notepad++) and the offending characters are rendered correctly. I saved the new UTF-8-encoded SQL file and attempted to import the contents into the MySQL database on my production server. The import process fails with the following error:

    /* SQL Error (1064): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '?# -------------------------------------------------------- # Host: ' at line 1 */
    /* Error with snippets directory: The specified path was not found */

    The head of the SQL file:

    # --------------------------------------------------------
    # Host:              127.0.0.1
    # Server version:    5.1.33-community
    # Server OS:         Win32
    # HeidiSQL version:  6.0.0.3773
    # Date/time:         2011-04-20 09:48:36
    # --------------------------------------------------------

    It chokes on the first line of the file, which is commented out. Why is this happening? I didn't have a problem loading data from SQL files until I changed the character set and collation of the database. I came up with an ugly workaround to this problem by performing the following steps:

    1. Export the database as a single SQL file using HeidiSQL.
    2. Open the resulting file in Notepad++ and convert it from ANSI to UTF-8 encoding.
    3. Create a new empty file in Notepad++, paste in the UTF-8 text and save the file normally.

    What am I missing here?
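
    A minimal workaround sketch in Python, assuming (as the stray '?' before the first '#' in the error suggests) that Notepad++ wrote a UTF-8 byte-order mark at the start of the dump, which the MySQL parser then chokes on. The file names are placeholders; it simply rewrites the dump without the BOM so the header comment starts cleanly.

    import codecs

    SRC = "dump.sql"            # hypothetical input file
    DST = "dump-no-bom.sql"     # hypothetical output file

    with open(SRC, "rb") as f:
        data = f.read()

    if data.startswith(codecs.BOM_UTF8):
        data = data[len(codecs.BOM_UTF8):]   # drop the 0xEF 0xBB 0xBF prefix

    with open(DST, "wb") as f:
        f.write(data)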

    Read the article

  • Crop display size in Linux

    - by TomSW
    My laptop has a slightly damaged LCD: it has a black strip on the right that reduces the working resolution from 1400x1050 to about 1375x1050, hiding part of the right edge of the desktop, windows, or the mouse pointer should it be unwise enough to stray there. Is there a way in Linux to crop the monitor output so as to keep the screen inside the working area of the panel?

    Edit: this is the same question as "Limit video output to a section of a display and leave the rest blank".

    Read the article

  • rkhunter warns of inode changes but no file modification date changes

    - by Nicholas Tolley Cottrell
    I have several systems running CentOS 6 with rkhunter installed, and a daily cron job that runs rkhunter and reports back via email. I very often get reports like:

    ---------------------- Start Rootkit Hunter Scan ----------------------
    Warning: The file properties have changed:
    File: /sbin/fsck
    Current inode: 6029384    Stored inode: 6029326
    Warning: The file properties have changed:
    File: /sbin/ip
    Current inode: 6029506    Stored inode: 6029343
    Warning: The file properties have changed:
    File: /sbin/nologin
    Current inode: 6029443    Stored inode: 6029531
    Warning: The file properties have changed:
    File: /bin/dmesg
    Current inode: 13369362    Stored inode: 13369366

    From what I understand, rkhunter will usually report a changed hash and/or modification date on the scanned files too, so this leads me to think that there is no real change. My question: is there some other activity on the machine that could make the inode change (running ext4), or is this really yum making regular (~once a week) changes to these files as part of normal security updates?
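
    For reference, a small Python sketch of the kind of property baseline rkhunter keeps (the file list and baseline path are illustrative, not rkhunter's own): it records the inode and size of a few binaries and reports inode changes on later runs. When a package manager updates a file it writes a new file and hence a new inode, even when the packaged timestamps are preserved, which would match the warnings above.

    import json
    import os

    WATCHED = ["/sbin/fsck", "/sbin/ip", "/sbin/nologin", "/bin/dmesg"]
    BASELINE = "/var/tmp/inode-baseline.json"   # hypothetical baseline location

    def snapshot():
        return {p: {"inode": os.stat(p).st_ino, "size": os.stat(p).st_size}
                for p in WATCHED if os.path.exists(p)}

    if not os.path.exists(BASELINE):
        json.dump(snapshot(), open(BASELINE, "w"))      # first run: store the baseline
    else:
        stored = json.load(open(BASELINE))
        for path, current in snapshot().items():
            old = stored.get(path)
            if old and old["inode"] != current["inode"]:
                print(f"{path}: inode changed {old['inode']} -> {current['inode']}")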

    Read the article

  • Accidentally dd'ed an image to wrong drive / overwrote partition table + NTFS partition start

    - by Kento Locatelli
    I screwed up and set the wrong output for dd when trying to copy a FreeNAS ISO, overwriting the wrong external hard drive. Ironically, I was trying to set up a FreeNAS server for data backup...

    - The external drive is only used for data storage; the system is entirely intact.
    - The drive had a single NTFS partition filling the entire device (2 TB WD Elements).
    - The drive originally had an MBR partition table. It now shows as having a GPT, presumably from the FreeNAS image.
    - The drive was mounted at the time, with maybe a couple of kB of data written/read after running dd.
    - The drive is just a few months old and healthy (regular SMART / fs checks).
    - I have not rebooted the OS (CrunchBang).
    - /proc/partitions still holds the correct information (and has been stored).
    - I have dd's output (records in / out / bytes).
    - testdisk did not find any partitions on a quick or deep search.
    - I am running photorec to recover the more important data (a couple of recent plain-text files that hadn't been backed up yet). The vast majority of the disk content (>80%) is unnecessary media files.

    My current plan is to let photorec do its thing, then recreate the MBR with gparted and use cfdisk to create another NTFS partition using the sector information from /sys/block/.../. Is that a good course of action (that is, does it have a chance of success)? Or is there anything else I should try first?

    Possibly relevant information:

    dd if=FreeNAS-8.0.4-RELEASE-p3-x86.iso of=/dev/sdc:
    194568+0 records in
    194568+0 records out
    99618816 bytes (100 MB) copied

    grep . /sys/block/sdc/sdc*/{start,size}:
    /sys/block/sdc/sdc1/start:2048
    /sys/block/sdc/sdc1/size:3907022848

    cat /proc/partitions:
    major minor  #blocks  name
    ** Snipped **
       8       32  1953512448 sdc
       8       33  1953511424 sdc1

    Current fdisk -l output:

    WARNING: GPT (GUID Partition Table) detected on '/dev/sdc'! The util fdisk doesn't support GPT. Use GNU Parted.

    Disk /dev/sdc: 2000.4 GB, 2000396746752 bytes
    255 heads, 63 sectors/track, 243201 cylinders
    Units = cylinders of 16065 * 512 = 8225280 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0x00000000

    Disk /dev/sdc doesn't contain a valid partition table
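
    As a sanity check on that plan, here is a short Python sketch using only the numbers quoted above: it converts the saved sector values into byte offsets and shows how far into the NTFS partition the interrupted dd write reached.

    SECTOR = 512                      # logical sector size reported by fdisk
    PART_START = 2048                 # /sys/block/sdc/sdc1/start
    PART_SIZE = 3907022848            # /sys/block/sdc/sdc1/size (in sectors)
    DD_BYTES = 99618816               # bytes copied before dd finished

    part_start_bytes = PART_START * SECTOR
    part_end_sector = PART_START + PART_SIZE - 1

    print(f"partition starts at byte {part_start_bytes:,} (sector {PART_START})")
    print(f"partition spans sectors {PART_START}..{part_end_sector}")
    print(f"dd overwrote the first {DD_BYTES:,} bytes = {DD_BYTES // SECTOR:,} sectors, "
          f"i.e. about {(DD_BYTES - part_start_bytes) / 1e6:.1f} MB of the NTFS partition itself")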

    Read the article

  • Size of bad data on HD in Windows 7

    - by acidzombie24
    While using my external hard drive (NTFS) I had a CRC32 error. Now I would like to see how much data is corrupted. If it's a few KB I won't mind, but if it's a few MB I should consider getting a new hard drive. How can I check this using Windows 7?

    Read the article

  • Gifsicle: How to set it to not overwrite the original GIF file if the resulting modified GIF file is larger than the original?

    - by galacticninja
    About Gifsicle: Gifsicle is a command-line tool for creating, editing, and getting information about GIF images and animations. One of its features (from its website):

    Optimize your animations! This stores only the changed portion of each frame, and can radically shrink your GIFs. You can also use transparency to make them even smaller. Gifsicle's optimizer is pretty powerful, and usually reduces animations to within a couple bytes of the best commercial optimizers.

    I call Gifsicle through this .BAT file in the right-click 'Send to' menu:

    @echo off
    :compressFile
    "C:\Programs\Compression Scripts\gifsicle\bin\gifsicle.exe" --batch -V -O3 %1%
    echo.
    echo.
    SHIFT
    if exist %1% goto compressFile
    PAUSE

    This animated GIF file, however (http://i.minus.com/i7WdodY5Zwot3.gif), ends up larger after being optimized with the above commands, and Gifsicle overwrites the original GIF with the larger result. Initial file size: 7.57 MiB (7,942,886 bytes); after running it through Gifsicle with the above commands: 7.64 MiB (8,017,622 bytes). Is there a way to prevent Gifsicle from overwriting the original file when its output is larger than the original, while still overwriting the original when the output is smaller?

    Details:
    OS: Windows 7
    Gifsicle version: 1.63, from the binary provided here: http://www.lcdf.org/gifsicle/
    Gifsicle manual
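
    One way to get that behaviour is to wrap Gifsicle so it writes to a temporary file first and only replaces the original if the result shrank. A Python sketch of such a wrapper follows (the gifsicle path is the one from the .BAT file above; -O3 and -o OUTPUT are options described in the Gifsicle manual):

    import os
    import subprocess
    import sys

    GIFSICLE = r"C:\Programs\Compression Scripts\gifsicle\bin\gifsicle.exe"

    def optimize_keep_smaller(path):
        tmp = path + ".opt.gif"
        subprocess.run([GIFSICLE, "-O3", "-o", tmp, path], check=True)
        if os.path.getsize(tmp) < os.path.getsize(path):
            os.replace(tmp, path)           # optimized copy is smaller: keep it
        else:
            os.remove(tmp)                  # optimized copy grew: keep the original
            print(f"skipped {path}: optimized output was larger")

    for gif in sys.argv[1:]:
        optimize_keep_smaller(gif)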

    Read the article

  • Divide a network into two subnets of equal size

    - by kylex
    I have been given the IP address 192.168.14.137/25 and asked to divide the network in two. This is what I've come up with:

    - The subnet mask is therefore 255.255.255.128.
    - The network address is 192.168.14.128.
    - There are a total of 128 addresses (including the network address and broadcast address).
    - To divide the network, we create two subnets: 192.168.14.128/26 and 192.168.14.192/26, each with a subnet mask of 255.255.255.192.

    Am I missing anything, or is this correct?
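
    As a quick check of the arithmetic, a Python sketch (not part of the original question) using the standard library's ipaddress module: it derives the /25 network containing the given host and splits it into two /26 subnets.

    import ipaddress

    host = ipaddress.ip_interface("192.168.14.137/25")
    net = host.network
    print(net, "netmask", net.netmask, "addresses", net.num_addresses)
    for subnet in net.subnets(prefixlen_diff=1):
        print(" ", subnet, "netmask", subnet.netmask,
              "hosts", subnet.network_address + 1, "-", subnet.broadcast_address - 1)
    # 192.168.14.128/25 netmask 255.255.255.128 addresses 128
    #   192.168.14.128/26 netmask 255.255.255.192 hosts 192.168.14.129 - 192.168.14.190
    #   192.168.14.192/26 netmask 255.255.255.192 hosts 192.168.14.193 - 192.168.14.254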

    Read the article

  • Split (parts of) image into several smaller (same size) pieces non-manually

    - by hlovdal
    If I have an image with a table containing several rows, say the periodic table, is there any tool I can use to split it into one smaller image for the H He row, another image for the Li Be ... Ne row, etc.? The tool does not have to detect the row borders by itself; specifying a start offset plus a row height in pixels is OK. Manually selecting and cutting/copying in GIMP is not an option, as I have far too many rows to process.
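
    A short sketch of that offset-plus-row-height approach using Pillow (an assumption; any imaging library with crop/save would do, and the file name, offset and row height below are placeholders): it slices the image into fixed-height strips and writes one file per row.

    from PIL import Image

    SRC = "periodic_table.png"   # hypothetical input file
    TOP_OFFSET = 40              # pixels above the first row
    ROW_HEIGHT = 55              # height of each row in pixels

    img = Image.open(SRC)
    row = 0
    top = TOP_OFFSET
    while top + ROW_HEIGHT <= img.height:
        strip = img.crop((0, top, img.width, top + ROW_HEIGHT))  # left, upper, right, lower
        strip.save(f"row_{row:02d}.png")
        row += 1
        top += ROW_HEIGHT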

    Read the article
