Search Results

Search found 9992 results on 400 pages for 'space efficiency'.

Page 2 of 400

  • iOS File Saving Efficiency

    - by Guvvy Aba
    I was working on my iOS app, and my goal is to save a file that I am receiving from the internet bit by bit. My current setup is that I have an NSMutableData object and I append a chunk of data to it as I receive my file. After the last "packet" is received, I write the NSData to a file and the process is complete. I'm worried that this isn't the ideal way to do it, because RAM is limited on a mobile device and receiving large files would be problematic. My next thought was to use an NSFileHandle so that as the file arrives, it is saved to disk rather than held in memory. In terms of speed and efficiency, which method do you think will work decently on an iOS device? I am currently using the first, NSMutableData approach. Is it worth changing my app to use NSFileHandle? Thanks in advance, Guvvy
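
    A minimal sketch of the file-handle approach, shown in Swift rather than Objective-C (the file name and the didReceive(chunk:)/downloadFinished() hooks are placeholders, not the asker's code). Each chunk is appended straight to disk, so only the current chunk is ever held in memory:

        import Foundation

        // Hypothetical destination path; create an empty file and open a handle for writing.
        let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("download.bin")
        _ = FileManager.default.createFile(atPath: url.path, contents: nil)
        guard let handle = try? FileHandle(forWritingTo: url) else {
            fatalError("could not open \(url.path) for writing")
        }

        // Called for every chunk received from the network.
        func didReceive(chunk: Data) {
            _ = handle.seekToEndOfFile()   // position after the data already on disk
            handle.write(chunk)            // append only the current chunk
        }

        // Called once the last "packet" has arrived.
        func downloadFinished() {
            handle.closeFile()
        }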

    Read the article

  • The Space Invader – A Childhood to Adulthood Story [Comic]

    - by Asian Angel
    Did you ever wonder what life is like for the Invaders? Take a journey through time with this particular Invader as he grows from a child into adulthood and decides to join the war. Note: We have shown only the first panel here. You can view the entire comic story by visiting the link below. The Invader – EL Comics [via Neatorama]

    Read the article

  • Lost disk space in Windows 7, cannot find the missing

    - by Tsanders
    My hard drive is complaining that it is low on disk space, but something strange seems to be happening: Explorer reports 10 GB of available space (on a 120 GB hard disk), and chkdsk in the command prompt reports the same, but a disk space tool such as SpaceSniffer or WinDirStat finds only 50 GB of data. My guess is that a large block of disk space is somehow being held (but that's just a guess) because of a prior very large (40 GB) download attempt that didn't complete. There isn't 40 GB of files on the drive (hidden or visible), yet Explorer insists that something is there. How can I claim back this disk space (without formatting my hard disk)? SpaceMonger is providing a clue, reporting four unscannable folders which add up to 43 GB:
        C:\RRBackups
        C:\System Volume Information
        C:\Windows\Csc\v2.06
        C:\Windows\System32\LogFiles\Wmi\RtBackup
    Does anybody know what these folders are for, and how I can claim back at least some space? System Restore claims about 4 GB, so that doesn't seem to be the main problem.

    Read the article

  • No space left on device with encrypted disk that takes all space

    - by Yosef
    I use Ubuntu 11.04 and there is no space left on the device. I have encrypted the disk that takes up the space (maybe it would be good to disable the encryption, but I don't know how). In the shell I get this message: No space left on device. When I run df -i:
        Filesystem            Inodes   IUsed   IFree IUse% Mounted on
        /dev/sda1            3055616  602499 2453117   20% /
        none                  210161     890  209271    1% /dev
        none                  214789       8  214781    1% /dev/shm
        none                  214789      53  214736    1% /var/run
        none                  214789       3  214786    1% /var/lock
        /home/myuser/.Private 3055616 602499 2453117   20% /home/myuser
    Edit: When I run plain df:
        Filesystem           1K-blocks     Used Available Use% Mounted on
        /dev/sda1             48060296 45618928        0 100% /
        none                   1538340      684  1537656   1% /dev
        none                   1547596      808  1546788   1% /dev/shm
        none                   1547596      104  1547492   1% /var/run
        none                   1547596        0  1547596   0% /var/lock
        /home/myuser/.Private 48060296 45618928        0 100% /home/myuser
    Edit: I am thinking about a few solutions, but I don't know which is better or how exactly to do them:
        - enlarge the partition (I can't install gparted, since there is no more disk space)
        - remove the encryption of the partition (I don't really need it)

    Read the article

  • Can We Survive the Sun’s Death?

    - by Akemi Iwaya
    In the distant future, our sun will begin its descent into death after using up all of the hydrogen fuel in its core. When that happens, the inner parts of our solar system will suffer horrible consequences. But what will happen at that point in time and how quickly will things ‘deteriorate’? Is there anything that could be done to help our planet survive? AsapSCIENCE looks at this ‘hot’ topic in their latest video. Can We Survive The Sun’s Death?     

    Read the article

  • No Gentle Galaxy Collision

    - by Akemi Iwaya
    It is hard to imagine a more cataclysmic event than the collision of two galaxies and the effect such an event has on each one as they are ripped apart. See the destructive power of collision at work in this video showing the latest Hubble imagery of Arp 142, a.k.a. the collision of NGC 2936 and NGC 2937. Note: The video also shows stunning imagery of other colliding galaxies. No Gentle Galaxy Collision [YouTube]     

    Read the article

  • Enjoy 1.3 Billion Pixels of Mars Surface Panoramic Photography from the Curiosity Rover

    - by Akemi Iwaya
    Have you been waiting for more awesome photos of Mars’ surface from the Curiosity Rover mission? Then you are definitely going to love this bit of news! NASA and GigaPan have teamed up to create a truly inspiring 1.3 billion pixel panoramic view of Mars that you can ‘zoom around’ and explore at your leisure. There are two websites that you can visit to enjoy this awesome scenery: NASA’s official website with two viewing options…     

    Read the article

  • How to analyze the efficiency of this algorithm Part 2

    - by Leonardo Lopez
    I found an error in the way I explained this question before, so here it goes again:
        FUNCTION SEEK(A, X)
        1. FOUND = FALSE
        2. K = 1
        3. WHILE (NOT FOUND) AND (K < N)
           a. IF (A[K] = X) THEN
              1. FOUND = TRUE
           b. ELSE
              1. K = K + 1
        4. RETURN
    Analyzing this algorithm (pseudocode), I can count the number of steps it takes to finish and analyze its efficiency in theta notation: T(n) is linear. OK. The following code, however, depends only on the formulas inside the loop in order to finish; there is no variable N in the code, so the efficiency of this algorithm will always be the same, since we assign the value 1 to both the A and B variables:
        1. A = 1
        2. B = 1
        3. UNTIL (B > 100)
           a. B = 2A - 2
           b. A = A + 3
    Now I believe this algorithm always performs in constant time. But how can I use algebra to find out how many steps it takes to finish?
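
    One way to count the passes with algebra (a sketch, assuming the UNTIL condition is tested after each pass of the loop body): after k passes, A has been incremented k times and B was computed from the previous value of A, so

        A_k = 1 + 3k
        B_k = 2 * A_(k-1) - 2 = 2 * (1 + 3(k - 1)) - 2 = 6(k - 1)

    The loop stops at the first k with 6(k - 1) > 100, i.e. k = 18 (where B = 102). The body therefore always runs 18 times, independent of any input, which is why the running time is constant.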

    Read the article

  • 12.04 ext4 - cannot create regular file / No space left - with plenty of space and inodes

    - by user1434058
    This seems similar: EXT4 "No space left on device (28)" incorrect, but there is no explanation there. I created an ext4 filesystem on a RAID 1 array with:
        mke2fs -t ext4 -T small /dev/md0
    Copying a single directory with many tiny files I get:
        cp: cannot create regular file `/mnt/raid1_new/pics/pic3412.jpg': No space left on device
    Space used is 5% and inodes used is 1%. I manually tried:
        cp /source/test1.jpg /mnt/raid1_new/pics/test1.jpg   --- error
        cp /source/test1.jpg /mnt/raid1_new/pics/test2.jpg   --- ERROR
        cp /source/test1.jpg /mnt/raid1_new/pics/test3.jpg   --- no error
    Notes: the RAID 1 disks are error-free; I tried mv instead of cp and got the same thing; I tried omitting -T small with no effect. Can somebody help me understand this magic?

    Read the article

  • What PHP configuration and extensions are recommended for speed, efficiency and security?

    - by Sanoj
    I am setting up an Ubuntu server with nginx and PHP. I have read about many different configurations and extensions that could be added, and it is pretty hard to know about all of them. I would like to hear from you, sysadmins: what PHP configuration and extensions do you recommend? I have read about:
        - Suhosin, for security
        - Alternative PHP Cache (APC), for speed and efficiency
        - Memcache, for speed and efficiency
        - PHP FastCGI Process Manager (PHP-FPM), for speed and efficiency
    But I have no idea whether they are good or not, or whether I should use them together.

    Read the article

  • Program complains not enough disk space even though the disk space exists

    - by user1189899
    I have an ext3 partition mounted in ordered data mode. If a power failure occurs while a program is creating files on that partition, the space usage reported afterwards looks normal and I don't see any partially written files. But when I try to run the same program again after the system comes back up, it complains that there is not enough disk space, even though the free space reported is far more than required. The program always succeeds under normal conditions. The problem also seems to disappear when the partition is remounted. I was wondering what the right way to handle this situation would be, other than unmounting and remounting.

    Read the article

  • Get extra hard drive space from Windows 7

    - by abhinole
    I am using Ubuntu 12.04 and Windows 7 (dual-boot) on my laptop. For various reasons I want some more space in my Ubuntu partition, and I have installed GParted in Ubuntu. Is it recommended to take this extra space directly from the Windows 7 partition (on the same drive where my Linux is installed) using GParted? Will it damage my boot loader, or the data on the partition I want to grab space from? Any help will be appreciated. Thanks in advance.

    Read the article

  • MySQL index cardinality - performance vs storage efficiency

    - by Sean
    Say you have a MySQL 5.0 MyISAM table with 100 million rows, with one index (other than the primary key) on two integer columns. From my admittedly poor understanding of B-tree structure, I believe that a lower cardinality means the storage efficiency of the index is better, because there are fewer parent nodes, whereas a higher cardinality means less efficient storage but faster read performance, because a lookup has to navigate through fewer branches to narrow down the rows for the query. (Note: by "low" vs "high" I don't mean e.g. 1 million vs 99 million for a 100-million-row table; I mean more like 90 million vs 95 million.) Is my understanding correct? Related question: how does cardinality affect write performance?
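
    For a rough sense of scale, a back-of-the-envelope model (not from the question; it assumes a B-tree index with a fanout of f keys per internal node and one index entry per row):

        height ~ log_f(N)             e.g. N = 10^8, f = 1000  =>  height ~ 3
        rows matched per key ~ N / cardinality

    so under these assumptions cardinality mostly determines how many leaf entries a lookup has to scan after descending the tree, rather than the depth of the tree itself.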

    Read the article

  • Efficiency of PHP arrays cast as objects?

    - by keithjgrant
    From what I understand, PHP objects are generally much faster than arrays. How is that efficiency affected if I'm typecasting to define stdClass objects on the fly:
        $var = (object) array('one' => 1, 'two' => 2);
    If the code doing this is deeply nested, will I be better off explicitly defining $var as an object instead:
        $var = new stdClass();
        $var->one = 1;
        $var->two = 2;
    Is the difference negligible, since I'll then be accessing $var as an object from there on either way?

    Read the article

  • MySQL efficiency as it relates to the database/table size

    - by mlissner
    I'm building a system using django, Sphinx and MySQL that's very quickly becoming quite large. The database currently has about 2,000 rows, and I've written a program that's going to populate it with another 40,000 rows in a couple of days. Since the database is live right now, and since I've never had a database with this much information in it, I'm worried about some things:
        - Is adding all these rows going to seriously degrade the efficiency of my django app? Will I need to go back through it and optimize all my database calls so they're doing things more cleverly? Or will this make the database so slow all around that I can't do anything about it at all?
        - If you scoff at my 40k rows, then my next question is: at what point SHOULD I be concerned? I will likely be adding another couple hundred thousand soon, so I worry, and I fret.
        - How is Sphinx going to feel about all this? Is it going to freak out when it realizes it has to index all this data? Or will it be fine? Is this normal for it? If it is, at what point should I be concerned that it's too much data for Sphinx?
    Thanks for any thoughts.

    Read the article

  • pfSense: dhcpd: send_packet: No buffer space available

    - by Tillebeck
    pfSense 2.0.1-RELEASE (i386). I get a lot of entries in the log saying:
        dhcpd: send_packet: No buffer space available
    There are approximately 40 active users, and they complain about prolonged waits for getting an IP and, periodically, about problems accessing the internet, prolonged response times, etc. In the log this entry appears repeatedly (periodically):
        Jun 10 18:27:30 dhcpd: send_packet: No buffer space available
        Jun 10 18:40:53 dhcpd: send_packet: No buffer space available
        Jun 10 19:01:15 dhcpd: send_packet: No buffer space available
        Jun 10 19:10:47 dhcpd: send_packet: No buffer space available
        Jun 10 19:31:10 dhcpd: send_packet: No buffer space available
        Jun 10 19:53:51 dhcpd: send_packet: No buffer space available
        Jun 10 20:23:32 dhcpd: send_packet: No buffer space available
        Jun 10 21:01:42 dhcpd: send_packet: No buffer space available
        Jun 10 21:04:52 dhcpd: send_packet: No buffer space available
        Jun 10 21:23:29 dhcpd: send_packet: No buffer space available
        Jun 10 21:46:05 dhcpd: send_packet: No buffer space available
        Jun 10 22:02:17 dhcpd: send_packet: No buffer space available
        Jun 10 22:02:45 dhcpd: send_packet: No buffer space available
        Jun 10 22:06:21 dhcpd: send_packet: No buffer space available
        Jun 10 22:08:45 dhcpd: send_packet: No buffer space available
        Jun 10 22:09:19 dhcpd: send_packet: No buffer space available
        Jun 10 22:22:23 dhcpd: send_packet: No buffer space available
    I guess these are the same periods during which the users complain about reduced access to the internet. I have seen other threads about this error, but none related to pfSense. Any ideas what to do?

    Read the article

  • How to see installed programs and the occupied space (not in Synaptic or Terminal)?

    - by cipricus
    I am often running low on space on my system and home partitions. I would like to see a list of installed programs and the space they take up, so I can make decisions on this matter. I do not mean a list with all components, 'secondary' programs and dependencies like in Synaptic, which I know can also be displayed in the Terminal -- so please do not mark this as a duplicate of that question -- but just a list of the main programs (those that appear in the main menu) and the space they occupy on the disk. (I use Xubuntu 12.10)

    Read the article

  • Where is my free space?

    - by doug
    Hi there. I'm using Pinnacle Studio 12 to edit some videos and, as you can guess, it uses a lot of resources. The problem is that it eats all the free space on my C drive, and after I close the application I don't get my free space back. I assume it creates some swap files, but where, and why doesn't it delete them? I don't store the project and saved files on the C drive. I'm running Pinnacle Studio 12 on a Windows XP SP3 machine. I've tried to clean my system with CCleaner, but it doesn't find where my space is lost! Thank you.

    Read the article

  • ANTLR: How to replace all characters defined as SPACE with an actual space

    - by Puneet Pawaia
    Hi all, my ANTLR code is as follows:
        LPARENTHESIS : ('(');
        RPARENTHESIS : (')');
        fragment CHARACTER : ('a'..'z'|'0'..'9'|);
        fragment QUOTE : ('"');
        fragment WILDCARD : ('*');
        fragment SPACE : (' '|'\n'|'\r'|'\t'|'\u000C'|';'|':'|',');
        WILD_STRING : (CHARACTER)* ( ('?') (CHARACTER)* )+ ;
        PREFIX_STRING : (CHARACTER)+ ( ('*') )+ ;
        WS : (SPACE) { $channel=HIDDEN; };
        PHRASE : (QUOTE)(LPARENTHESIS)?(WORD)(WILDCARD)?(RPARENTHESIS)?((SPACE)+(LPARENTHESIS)?(WORD)(WILDCARD)?(RPARENTHESIS)?)*(SPACE)+(QUOTE);
        WORD : (CHARACTER)+;
    What I would like to do is replace every character matched as SPACE with an actual space character in PHRASE. Also, if possible, I would like all consecutive spaces to be collapsed into a single space. Any help would be most appreciated. For some reason, I am finding it hard to understand ANTLR. Any good tutorials out there?

    Read the article

  • Extra white space in HTML output (PHP MVC)

    - by user316841
    Hi, I'm getting extra white space in the view output (the HTML) that is not caused by CSS or anything like it. I've checked for stray closing PHP tags (removed them where I could) and saved the files as UTF-8 without BOM. I've also checked for whitespace at the beginning and at the end of each file. This is the structure:
        index.php  (the entry point)
        MODEL/
        CONTROLLER/
        VIEW/
    Let's say that, through the GET method, the variable TPL is sent with some value; let's call it LIST. The controller then pulls the LIST model with all its data and shows the right template to the user, with the right data. I have used and tested require_once, include_once, include, and even readfile (just to test). The LIST template opens header.tpl and footer.tpl; I also tried removing both includes from the LIST template, but the extra white space was still there. The controller activity runs here, and this is where the extra white space is coming from:
        $model_works->getRows();
        $rows = $model_works->rows;
        if ( !require_once('views/list_works.tpl.php') ) {
            echo "Error.";
        } // end if clause
    The list_works.tpl.php file is basically HTML with PHP tags; I tested by changing the extension to something else, like .html. Also, note that at the top of this file we use require_once to open header.tpl, and at the bottom footer.tpl. I tested by removing both, and the extra white space was still generated. The extra white space is being generated here: [EXTRA WHITE SPACE HERE]. Thanks a lot for looking. ;D

    Read the article

  • the effect of large number of files on disk space in unix filesystems

    - by user46976
    If I have a text file in Unix that contains N independent entries (e.g. records about employees, where each employee has a separate record), is it expected that this file will take up less space than if I split it into N files, each containing the entry for one employee? In other words, can one save significant space on Unix filesystems by concatenating many files together, or is the difference negligible? Thanks.
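
    A rough illustration (my numbers, not the asker's, assuming a typical 4 KiB filesystem block size and ignoring inode and directory overhead): every file occupies a whole number of blocks, so each small file wastes on average about half a block.

        one file:  10,000 records * ~200 bytes   =  ~2 MB   (about 500 blocks)
        N files:   10,000 files * 4 KiB minimum  =  ~39 MiB on disk

    Under these assumptions, concatenating pays off mainly when individual entries are much smaller than the block size; for entries already several blocks long, the difference is negligible.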

    Read the article

  • Efficiency Question for an Ajax App

    - by Kubi
    Hi, I am currently working on a web application which, for testing, uses a txt file as its database; we will connect it to a real server later on. My question is whether there is a more efficient way to get my objects than the way I am using now. During Page_Init I load all my objects into a collection (a List) and then populate the AJAX Toolkit Accordion controls on the page with it. I have some client-side buttons which fire callbacks to fetch other objects and populate the accordions inside an UpdatePanel. I am also using .NET collections such as Dictionary and List heavily, and I am wondering whether using arrays would be more efficient. Could you advise me on how to make this site better and faster? Is it better (or even possible) to initialize those TravelP objects in JavaScript at the beginning and use them like that? Any comments would be greatly appreciated. Thanks.

    Read the article
