Search Results

Search found 45804 results on 1833 pages for 'large files'.


  • Binary files printing and desired precision

    - by yCalleecharan
    Hi, I'm printing a variable, say z1, which is a 1-D array containing floating point numbers, to a text file so that I can import it into Matlab or GNUPlot for plotting. I've heard that binary files (.dat) are smaller than .txt files. The definition that I currently use for printing to a .txt file is:

        void create_out_file(const char *file_name, const long double *z1, size_t z_size)
        {
            FILE *out;
            size_t i;

            if ((out = _fsopen(file_name, "w+", _SH_DENYWR)) == NULL) {
                fprintf(stderr, "***> Open error on output file %s", file_name);
                exit(-1);
            }
            for (i = 0; i < z_size; i++)
                fprintf(out, "%.16Le\n", z1[i]);
            fclose(out);
        }

    I have three questions:

    1. Are binary files really more compact than text files?
    2. If yes, I would like to know how to modify the above code so that I can print the values of the array z1 to a binary file. I've read that fprintf has to be replaced with fwrite. My output file, say dodo.dat, should contain the values of array z1 with one floating point number per line.
    3. I have %.16Le in my code, but I think that %.15Le is right as I have 15 precision digits with long double. I have put a dot (.) in the width position as I believe that this allows expansion to an arbitrary field to hold the desired number. Am I right? As an example, with %.16Le I can have an output like 1.0047914240730432e-002, which gives me 16 precision digits and a field wide enough to display the number correctly. Is placing a dot (.) in the width position instead of a width value a good practice?

    Thanks a lot...
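    A minimal sketch of the fwrite variant the question asks about (an illustration, not a drop-in answer): fwrite emits raw bytes, so the .dat file has no lines at all, and whatever reads it back must use the same element size.

        #include <stdio.h>
        #include <stdlib.h>

        /* Hedged sketch: dump z1 as raw binary. Note that sizeof(long double)
           is compiler-specific, so the reader must match this layout. */
        void create_bin_file(const char *file_name, const long double *z1, size_t z_size)
        {
            FILE *out = fopen(file_name, "wb");   /* "b": binary mode */
            if (out == NULL) {
                fprintf(stderr, "***> Open error on output file %s", file_name);
                exit(EXIT_FAILURE);
            }
            if (fwrite(z1, sizeof *z1, z_size, out) != z_size) {
                fprintf(stderr, "***> Write error on output file %s", file_name);
                exit(EXIT_FAILURE);
            }
            fclose(out);
        }

    The size win comes from storing each value in sizeof(long double) bytes rather than roughly 23 characters of text per line.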

    Read the article

  • How do I use Group Policy on a domain to delete Temporary Internet Files?

    - by Muhammad Ali
    I have a domain controller running Windows Server 2008 R2, and users log in to application servers running Windows Server 2003 SP2. I have applied a Group Policy to clean temporary internet files on exit, i.e. to delete all temporary internet files when users close the browser. But the Group Policy doesn't seem to work: user profile sizes keep increasing, and most of the space is occupied by temporary internet files, driving up disk usage. How can I enforce automatic deletion of temporary internet files?
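    One hedged workaround sketch, separate from fixing the GPO itself: deploy a logoff script through Group Policy that purges the cache directly. The profile path below assumes the default Windows Server 2003 layout:

        @echo off
        rem Hypothetical logoff script: purge Temporary Internet Files.
        rem Path assumes the default Windows Server 2003 profile layout.
        del /q /s /f "%USERPROFILE%\Local Settings\Temporary Internet Files\*.*"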

    Read the article

  • Compress Large Video Files with DivX / Xvid and AutoGK

    - by DigitalGeekery
    Have you ever recorded home video on a camcorder only to find the video size is enormous? What if you wanted to share a video clip on YouTube or another video sharing site, but the file size was bigger than the maximum upload size? Today we’ll look at a way to compress certain video files, such as MPEG and AVI, with Auto Gordian Knot (AutoGK).

    AutoGK is a free application that runs on Windows. It supports Mpeg1, Mpeg2, Transport Streams, Vobs, and virtually any codec used for an .AVI file. AutoGK accepts the following file types as input: MPG, MPEG, VOB, VRO, M2V, DAT, IFO, TS, TP, TRP, M2T, and AVI. Files are output as .AVI files and are converted using the DivX or XviD codecs.

    Installing and Using AutoGK

    Download and install AutoGK (link below), then open it. You’ll need to navigate a few wizard screens, but you can just accept the defaults. Choose your video file by clicking on the folder to the right of the Input file text box, then browse for and select your video file and click “Open.” For this example, we’ll be working with an .AVI file that’s 167MB in size.

    The output file is copied into the same directory as the input file by default, but you can change this if you choose. If the input file is also .AVI, AutoGK appends _agk to the output file name so that the original is not overwritten. Next, you’ll see any audio tracks listed. You can uncheck an audio track if you’d like to remove it.

    You can choose one of the predefined size options, or select a custom size in MB or a target quality in percentage. For our example, we’ll be compressing our 167MB file to 35MB. Click on Advanced Settings. Here you can choose your codec, if you have a preference, as well as the output resolution and output audio. If you’d like to use the DivX codec, you’ll need to download and install it separately (see link below). Typically you’ll want to keep the defaults. Click “OK.”

    Now you’re ready to add your file conversion job to the job queue. Click Add Job to add it to the queue. You can add multiple file conversions to the job queue and convert them in one batch. Click Start to begin the conversion process. You’ll be able to see the progress in the log window on the bottom left. When the conversion is complete you’ll see a “Job finished” message and the total time in the log window.

    Check your output file to see its compressed size, and test the video to make sure the output quality is satisfactory.

    Note: Conversion times can vary greatly depending on the size of the file and your computer hardware. Files that are several GBs in size may take several hours to compress. AutoGK is no longer being actively developed but is still a wonderful DivX/XviD conversion tool. It can also be used to compress and convert non-copy-protected DVDs.
    Downloads: AutoGordianKnot, DivX (optional)

    Read the article

  • Visual C#, Large Arrays, and LOH Fragmentation. What is the accepted convention?

    - by Gorchestopher H
    I have another active question HERE regarding some hopeless memory issues that possibly involve LOH fragmentation, among possibly other unknowns. What my question now is: what is the accepted way of doing things? If my app needs to be done in Visual C#, and needs to deal with large arrays to the tune of int[4000000], how can I not be doomed by the garbage collector's refusal to deal with the LOH? It would seem that I am forced to make any large arrays global, and never use the word "new" around any of them. So, I'm left with ungraceful global arrays with "maxindex" variables instead of neatly sized arrays that get passed around by functions. I've always been told that this was bad practice. What alternative is there? Is there some kind of function to the tune of System.GC.CollectLOH("Seriously")? Is there some way to outsource garbage collection to something other than System.GC? Anyway, what are the generally accepted rules for dealing with large (85KB+) variables?
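    A sketch of the usual convention (an illustration under stated assumptions, not the one accepted answer): allocate each large buffer once and reuse it with a fill-level counter, so the LOH holds a few long-lived objects instead of fragmenting under churn.

        // Hedged sketch: one long-lived LOH allocation, reused across runs.
        class ReusableBuffer
        {
            private readonly int[] data = new int[4000000]; // allocated once; lives on the LOH
            private int count;                              // "maxindex"-style fill level

            public void Reset(int newCount)
            {
                System.Array.Clear(data, 0, data.Length);   // clear contents, keep the allocation
                count = newCount;
            }
        }

    As a side note, .NET 4.5.1 and later (which postdates this question) expose a one-shot LOH compaction: set System.Runtime.GCSettings.LargeObjectHeapCompactionMode to GCLargeObjectHeapCompactionMode.CompactOnce and call GC.Collect().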

    Read the article

  • using a "temporary files" folder in python

    - by zubin71
    I recently wrote a script which queries PyPI and downloads a package; however, the package gets downloaded to a user-defined folder. I'd like to modify the script in such a way that my downloaded files go into a temporary folder if no folder is specified. The temporary-files folder on *nix machines is "/tmp"; is there a Python method I could use to find out the temporary-files folder on a particular machine? If not, could someone suggest an alternative to this problem?
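    A minimal sketch using the standard library's tempfile module, which resolves the platform's temp directory portably (the function name here is just an illustration):

        import tempfile

        def download_dir(user_dir=None):
            """Return the user-specified folder, else the platform temp folder."""
            return user_dir if user_dir is not None else tempfile.gettempdir()

        print(download_dir())       # e.g. /tmp on most *nix machines
        print(tempfile.mkdtemp())   # alternative: a fresh, private temp subfolder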

    Read the article

  • Google I/O 2011: Large-scale Data Analysis Using the App Engine Pipeline API

    Speaker: Brett Slatkin. The Pipeline API makes it easy to analyze complex data using App Engine. This talk covers how to build multi-phase MapReduce workflows; how to merge multiple large data sources with "join" operations; and how to build reusable analysis components. It also covers the API's concurrency model, how to debug in production, and built-in testing facilities. (Time: 51:39)
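    For flavor, a minimal sketch in the basic style of the appengine-pipeline library (the exact class and method names are recalled from its docs and should be treated as assumptions):

        # Hedged sketch of the Pipeline API's basic pattern (API details assumed).
        import pipeline

        class Add(pipeline.Pipeline):
            def run(self, a, b):
                return a + b

        stage = Add(2, 3)
        stage.start()   # executes asynchronously on App Engine task queues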

    Read the article

  • Undocumented Gmail Search Operator Ferrets Out Large Email Attachments

    - by Jason Fitzpatrick
    If you’re looking for a way to quickly find large email attachments in your Gmail account, this undocumented search operator makes it simple to zero in on the hulking attachments hiding out in your inbox. To use the search operator simply plug in “size:” and some value to narrow your search to only emails that size or larger. In the screenshot above we searched for “size:20000000” to find emails roughly 20MB or larger (if you want to be extremely precise, a true 20MB search would be “size:20971520”). If you’re looking to clean up your Gmail account this is a nearly zero-effort way to find the biggest space hogs; in our case, we found an email packed with massive PDF files from a 5-year-old project that we were more than happy to purge. Finding Large Attachments in Google Mail/Gmail [via gHacks]

    Read the article

  • GNU make copy files to distro directory

    - by TheRoadrunner
    I keep my source html (and images etc.) in separate directories for source control. Part of making the distro is to have make copy files to the output folder and set the attributes. Today my makefile shows (extract):

        %.html:
        	/usr/bin/install -c -p -m 644 $< $@

        www: $(HTMLDST)/firmware.html $(HTMLDST)/firmware_status.html $(HTMLDST)/index.html

        $(HTMLDST)/firmware.html: $(HTMLSRC)/firmware.html
        $(HTMLDST)/firmware_status.html: $(HTMLSRC)/firmware_status.html
        $(HTMLDST)/index.html: $(HTMLSRC)/index.html

    This is shown with only three html files, but in reality there are lots. I would like to just list the filenames (without paths) and have make do the comparison between source and destination and copy the files that have been updated. Thank you in advance. Søren
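    A hedged sketch of one way to do this (reusing the question's HTMLSRC/HTMLDST variables; the pattern rule assumes the two trees mirror each other):

        # List the bare filenames once; make derives both paths.
        HTMLFILES := firmware.html firmware_status.html index.html

        www: $(addprefix $(HTMLDST)/,$(HTMLFILES))

        # Pattern rule: each destination depends on its source twin, so
        # install runs only when the source file is newer than the copy.
        $(HTMLDST)/%.html: $(HTMLSRC)/%.html
        	/usr/bin/install -c -p -m 644 $< $@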

    Read the article

  • Trying to cat files - unrecognized wildcard

    - by Barb
    Hello, I am trying to create a file that contains all of the code of an app. I have created a file called catlist.txt so that the files are added in the order I need them. A snippet of my catlist.txt:

        app/controllers/application_controller.rb
        app/views/layouts/*
        app/models/account.rb
        app/controllers/accounts_controller.rb
        app/views/accounts/*

    When I run the command, the files that are explicitly listed get added but the wildcard files do not:

        cat catlist.txt | xargs cat > fullcode

    I get:

        cat: app/views/layouts/*: No such file or directory
        cat: app/views/accounts/*: No such file or directory

    Can someone help me with this? If there is an easier method I am open to all suggestions. Barb
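    A hedged sketch of the likely cause and one workaround: xargs invokes cat directly, with no shell in between, so the * patterns reach cat as literal filenames. Looping in the shell with the variable left unquoted lets the shell expand the globs:

        # The unquoted $pattern is expanded by the shell, so wildcard
        # lines in catlist.txt resolve to real filenames before cat runs.
        while read -r pattern; do
            cat $pattern
        done < catlist.txt > fullcode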

    Read the article

  • cannot edit any php files using specific functions

    - by user458474
    I cannot update any txt files using PHP. When I write simple code like the following:

        <?php
        // create file pointer
        $fp = fopen("C:/Users/jj/bob.txt", 'w')
            or die('Could not open file, or file does not exist and failed to create.');
        $mytext = '<b>hi. This is my test</b>';
        // write text to file
        fwrite($fp, $mytext) or die('Could not write to file.');
        $content = file("C:/Users/jj/bob.txt");
        // close file
        fclose($fp);
        ?>

    Both files do exist in the folder. I just cannot see any updates on bob.txt. Is this a permission error in Windows? It works fine on my laptop at home. I also cannot change the php files on my website using FileZilla.
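    A small diagnostic sketch, assuming permissions are the culprit (the path is the one from the question): make PHP report errors and check writability before writing.

        <?php
        // Hedged diagnostic: surface errors and confirm PHP can write the target.
        error_reporting(E_ALL);
        ini_set('display_errors', '1');

        $path = 'C:/Users/jj/bob.txt';
        if (file_exists($path) && !is_writable($path)) {
            die("$path exists but is not writable - check Windows permissions.");
        }
        ?>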

    Read the article

  • Best way to store a large amount of game objects and update the ones onscreen

    - by user3002473
    Good afternoon guys! I'm a young beginner game developer working on my first large-scale game project and I've run into a situation where I'm not quite sure what the best solution may be (if there is a lone solution). The question may be vague (if anyone can think of a better title after having read the question, please edit it) or broad, but I'm not quite sure what to do and I thought it would help just to discuss the problem with people more educated in the field.

    Before we get started, here are some of the questions I've looked at for help in the past: Best way to keep track of game objects; Elegant way to simulate large amounts of entities within a game world; What is the most efficient container to store dynamic game objects in? I've also read articles about different data structures commonly used in games to store game objects, such as this one about slot maps, but none of them are really what I'm looking for. Also, if it helps at all, I'm using Python 3 to design the game. It has to be Python 3; if I could I would use C++ or Unityscript or something else, but I'm restricted to having to use Python 3.

    My game will be a form of side-scrolling shooter. In said game the player will traverse large rooms with large amounts of enemies and other game objects to update (think some of the larger areas in Cave Story or Iji). The player obviously can't see the entire room all at once, so there is a viewport that follows the player around and renders only a selection of the room and the game objects it contains. This is not a foreign concept. The part that's getting me confused has to do with how certain game objects are updated. Some of them are to be updated constantly, regardless of whether or not they can be seen. Other objects, however, are only to be updated when they are onscreen (for example, an enemy would only be updated to react to the player when it is onscreen or when it is within a certain range of the screen).

    Another problem is that game objects have to be easily referable by other game objects; something that happens in the player's update() method may affect another object in the world. Collision detection in games is always a serious problem; I need a way of containing the game objects that minimizes the number of cases when testing for collisions against one another. The final problem is that of creating and destroying game objects, which I think is pretty self-explanatory.

    To store the game objects I've considered a number of different methods. The original method was to simply store all the objects in a hash table by an id. This method is simple and decently fast, as it allows all the objects to be looked up in O(1) complexity, and also allows them to be deleted fairly easily. Hash collisions would not be a major problem; I wasn't originally planning on using computer-generated ids to store the game objects, I was going to rely on ids given to them by the game designer (names like 'Player' or 'EnemyWeapon4'), and even if I did use computer-generated ids, with a decent hashing algorithm the chances of collisions would be around 1 in 4 billion. The problem with a hash table, however, is that it is inefficient at checking which objects are in range of the viewport. Considering the fact that certain game objects move (as well as the viewport itself), the only solution I could think of in order to only update objects in the viewport would be to iterate through every object in the hash table and check whether it is in the viewport or not, updating only the ones in the valid area. This would be incredibly slow in scenarios where the number of game objects exceeds 500, or even 200.

    The second solution was to store everything in a 2-d list. The world is partitioned into cells (a tilemap essentially), where each cell, or tile, is the same size and is square. Each cell would contain a list of the game objects currently occupying it (each game object would be inserted into a cell depending on the center of the object's collision mask). A 2-d list would allow me to take the top-left and bottom-right corners of the viewport and easily grab a rectangular area of the grid containing only the cells with entities in valid range to be updated. This method also solves the problem of collision detection: when I take an entity I can find the cell that it is currently in, then check only against entities in its cell and the 8 cells around it. One problem with this system, however, is that it prohibits easy lookup of game objects. One solution would be to simultaneously keep a hash table containing the positions of the objects in the 2-d list, indexed by the id of said object. The major problem with a 2-d list is that it would need to be rebuilt every single game frame (along with the hash table of object positions), which may be a serious detriment to game speed.

    Both systems have ups and downs and seem to solve some of each other's problems; however, using them both together doesn't seem like the best solution either. If anyone has any thoughts, ideas, suggestions, comments, opinions or solutions on new data structures or better implementations of the existing data structures I have in mind, please post; any and all criticism and help is welcome. Thanks in advance! EDIT: Please don't close the question because it has a bad title, I'm just bad with names!
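    A hedged Python 3 sketch of a middle road (the GameObject attributes .x and .y are assumptions): keep the hash table for O(1) lookup, but update the grid incrementally when an object crosses a cell boundary instead of rebuilding it every frame.

        from collections import defaultdict

        CELL = 64  # assumed square cell size in pixels

        class SpatialGrid:
            def __init__(self):
                self.cells = defaultdict(set)  # (cx, cy) -> set of object ids
                self.by_id = {}                # id -> (object, current cell key)

            def _key(self, x, y):
                return (int(x) // CELL, int(y) // CELL)

            def add(self, obj_id, obj):
                key = self._key(obj.x, obj.y)
                self.cells[key].add(obj_id)
                self.by_id[obj_id] = (obj, key)

            def remove(self, obj_id):
                obj, key = self.by_id.pop(obj_id)
                self.cells[key].discard(obj_id)

            def moved(self, obj_id):
                # Call after an object moves: re-bucket only on cell changes,
                # so there is no per-frame rebuild of the whole structure.
                obj, old = self.by_id[obj_id]
                new = self._key(obj.x, obj.y)
                if new != old:
                    self.cells[old].discard(obj_id)
                    self.cells[new].add(obj_id)
                    self.by_id[obj_id] = (obj, new)

            def in_rect(self, x0, y0, x1, y1):
                # Yield objects in the cells overlapping the viewport rectangle.
                for cx in range(int(x0) // CELL, int(x1) // CELL + 1):
                    for cy in range(int(y0) // CELL, int(y1) // CELL + 1):
                        for obj_id in self.cells[(cx, cy)]:
                            yield self.by_id[obj_id][0]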

    Read the article

  • VSS - Solution file between multiple users

    - by BhejaFry
    Hi folks, we have a solution with multiple projects that is being developed by a team of developers. The project paths in the solution file as initially checked in contain paths specific to one developer, so when another dev gets latest on the solution, some of the projects won't load because the paths differ. What's a better way to manage this? TIA
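    A hedged illustration of the usual fix (the project name and the second GUID below are placeholders): Visual Studio resolves project paths relative to the .sln file, so a solution checked in with relative paths loads from any developer's working folder:

        Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "MyApp", "MyApp\MyApp.csproj", "{11111111-2222-3333-4444-555555555555}"
        EndProject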

    Read the article

  • How to resize a /home partition in Kubuntu?

    - by Devon
    I was distro hopping for a while in the past few months, so in order to keep all of my files secure, I made a partition of around 50 GB named Files to store all of my files in, and still have them for quick and easy access. However, now that I've found a distribution I'm comfortable with (Kubuntu 11.10), I would like to remove this partition and have all of my files in my /home folder, in order to deal with them more easily. I've moved all of the files in the partition to my /home folder (and still have plenty of room to spare), and now I'm trying to delete the partition and use the space for my /home folder. I can delete the partition just fine; however, I can't extend the /home folder into the unallocated space. Here's a screenshot of what I'm talking about. In order to change the size of the /home partition, I need to unmount it. But I am unable to unmount it! How do I best extend the size of the partition?

    Read the article

  • caching static files for ruby on rails application using nginx

    - by splintercell
    I have been trying for some time to serve and cache static files for my Rails app using nginx. The Rails app server runs mongrel_cluster and is deployed on a different host than nginx. Following many of the available discussions I tried the following:

        server {
            listen 80;
            server_name www.myappserver.com;
            ssl on;
            root /var/apps/myapp/current/public;

            location ~ ^/(images|javascripts|stylesheets)/ {
                root /var/apps/myapp/current;
                expires 10y;
            }

            location / {
                proxy_pass http://myapp_upstream;
            }
        }

    But nginx fails to find the images and to load the css and js files. Can anyone help me out here? My aim is to configure nginx in such a way that it caches the static files till expiry. Please suggest a way to achieve this, or am I missing something here?
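    One hedged guess at the failure (paths assumed from the question): nginx appends the full request URI to root, so /images/foo.png is looked up under /var/apps/myapp/current/images/, while Rails keeps static assets under public/. A sketch of the corrected location block:

        # root + URI must land inside public/, where the asset files live
        location ~ ^/(images|javascripts|stylesheets)/ {
            root /var/apps/myapp/current/public;
            expires 10y;
            add_header Cache-Control public;
        }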

    Read the article

  • Using JDBC to asynchronously read large Oracle table

    - by Ben George
    What strategies can be used to read every row in a large Oracle table, only once, but as fast as possible with JDBC and Java? Consider that each row has a non-trivial amount of data (30 columns, including large text in some columns). Some strategies I can think of are:

    1. Single thread that reads the whole table (too slow, but listed for clarity).
    2. Read the ids into a ConcurrentLinkedQueue, use threads to consume the queue and query by id in batches.
    3. Read the ids into a JMS queue, use workers to consume the queue and query by id in batches.

    What other strategies could be used? For the purpose of this question, assume processing of rows to be free.
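    A minimal Java sketch of the single-pass streaming approach (connection URL, credentials, and table name are placeholders): a large fetch size keeps round trips down, and a worker pool consumes rows while one reader thread streams them.

        import java.sql.*;
        import java.util.concurrent.*;

        public class TableReader {
            public static void main(String[] args) throws Exception {
                ExecutorService pool = Executors.newFixedThreadPool(8);
                try (Connection conn = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//dbhost:1521/SVC", "user", "pass");
                     Statement st = conn.createStatement()) {
                    st.setFetchSize(5000); // rows fetched per round trip
                    try (ResultSet rs = st.executeQuery("SELECT * FROM big_table")) {
                        while (rs.next()) {
                            // Copy values out of the ResultSet before handing off;
                            // the cursor keeps advancing while workers run.
                            final String id = rs.getString(1);
                            pool.submit(() -> process(id));
                        }
                    }
                }
                pool.shutdown();
                pool.awaitTermination(1, TimeUnit.HOURS);
            }

            static void process(String id) { /* per the question, row processing is free */ }
        }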

    Read the article

  • socket.accept error 24: Too many open files

    - by Creotiv
    I have a problem with open files under Ubuntu 9.10 when running a server in Python 2.6, and the main problem is that I don't know why it is so. I have set

        ulimit -n = 999999
        net.core.somaxconn = 999999
        fs.file-max = 999999

    and lsof gives me about 12000 open files when the server is running. I'm also using epoll. But after some time it starts throwing the exception:

        File "/usr/lib/python2.6/socket.py", line 195, in accept
        error: [Errno 24] Too many open files

    And I don't know how it can reach the file limit when that limit isn't actually reached. Thanks for help)
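    A minimal sketch of how to inspect and raise the limit that actually applies to the process: "ulimit -n" only affects the shell that ran it, so a server started elsewhere (e.g. by an init script) may still run with a much lower default.

        # Check and raise RLIMIT_NOFILE from inside the Python 2.6 server itself.
        import resource

        soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
        print soft, hard                 # the soft limit is what accept() hits
        resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))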

    Read the article

  • Why am I unable to open exe files?

    - by Aaron
    It doesn't matter what disk I use, it cannot open the program. I keep getting the following error:

        Archive:  /media/xxxxxxxx/INSTALL/_Setupa.exe
        [/media/xxxxxxxxxx/INSTALL/_Setupa.exe]
          End-of-central-directory signature not found. Either this file is not
          a zipfile, or it constitutes one disk of a multi-part archive. In the
          latter case the central directory and zipfile comment will be found on
          the last disk(s) of this archive.
        zipinfo: cannot find zipfile directory in one of /media/xxxxxx/INSTALL/_Setupa.exe or /media/xxxxxxxxx/INSTALL/_Setupa.exe.zip, and cannot find /media/xxxxxxxxx/INSTALL/_Setupa.exe.ZIP, period.

    Any ideas?

    Read the article

  • How to create a file of files?

    - by TheMachineCharmer
        class Node
        {
            FooType Data; // I can save Data to a file with extension .foo

            void Save()
            {
                // save Data to .foo file
            }
        }

    Now,

        class Graph
        {
            List<Node> Nodes;

            void Save()
            {
                foreach (Node node in Nodes)
                {
                    node.Save();
                }
            }
        }

    Now when I invoke someGraph.Save(); it creates Nodes.Count files. I would like those files to appear as one file with extension somename.graph, and to be able to read it back in as Nodes. How can I do this? Is there a way to bundle files into a single different file?
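    A hedged bundling sketch using the framework's zip support (assumes .NET 4.5+ and a reference to System.IO.Compression.FileSystem; here .graph is simply a zip container):

        using System.Collections.Generic;
        using System.IO;
        using System.IO.Compression;

        static class GraphBundler
        {
            // Pack every per-node .foo file into one somename.graph archive.
            public static void Bundle(string graphPath, IEnumerable<string> fooFiles)
            {
                using (ZipArchive zip = ZipFile.Open(graphPath, ZipArchiveMode.Create))
                    foreach (string f in fooFiles)
                        zip.CreateEntryFromFile(f, Path.GetFileName(f));
            }

            // Unpack the archive back into individual .foo files for reading.
            public static void Unbundle(string graphPath, string targetDir)
            {
                ZipFile.ExtractToDirectory(graphPath, targetDir);
            }
        }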

    Read the article

  • Reason for monolithic data files

    - by Ali Lown
    Primarily this seems to be a technique used by games, where they have all the sounds in one file, textures in another, etc., with these files commonly reaching GB sizes. What is the reason for doing this rather than maintaining it all in subdirectories as small files, one per texture? Many small games use the small-file approach, with the monolithic system being favoured by larger companies. Is there some file system overhead with lots of small files? Or are they trying to protect their property, although most just seem to be a compressed file with a new extension?

    Read the article

  • Cloud computing cost savings for large enterprise

    - by user13817
    I'm trying to understand whether cloud computing is meant only for small to medium-sized companies or also for large companies. Imagine a website with a very large user base. The storage and bandwidth demands, as well as the number of database transactions, are incredibly high. The website might be hosting videos, music, images, etc. that keep the demands high. Does it make sense to be in the cloud when you know you need huge volumes of storage, bandwidth, and GET, PUT, etc. requests? (Each of these variables costs money in the cloud.) Or does it make sense to build your own infrastructure? I can see the cost savings of cloud computing if you are a small business, but if you were aiming at the next big thing on the Internet, I can't quite see the benefits.

    Read the article

  • Can I share files via SAMBA while in root?

    - by user212501
    I have a couple of things going on here. I am currently running Ubuntu 10.04 as root via the "startx" command. The reason is that I keep getting the "low graphics" error and cannot start up normally. I have researched and researched the error; I mean, I got my sudo on, o.k. I'm really done trying to fix this error; I just want to re-install. (Funny thing: this startx workaround is how I'm using the laptop now, so I know that it is a software issue.) So here's my dilemma inside another dilemma: I have about 30 gigs of info that I need to get off the laptop before I can re-install. Any thoughts? What say you? LOTR LOL
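    On the title question itself, a minimal share sketch (share name and path are placeholders; Samba runs as a system service, so it serves files regardless of the root startx session):

        # Hypothetical share appended to /etc/samba/smb.conf
        [rescue]
            path = /home/someuser/files
            read only = no
            guest ok = yes

    Restarting Samba (e.g. sudo /etc/init.d/samba restart on releases of that era) then exposes the folder to other machines for copying the 30 gigs off.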

    Read the article

  • Monitor flashes black when starting video files, opening folders with video files

    - by Bob
    My entire monitor flashes black for a second or so when starting any video file, and does the same thing when closing the file. It also does this when opening a folder that has video files and a thumbnail view, or when I do anything that presumably accesses the video stream inside a video file. It happens in VLC and Windows Media Player, though certain video editors I have, like avidemux and virtualdub, don't have that issue when opening video files. It also does not happen when watching flash-based videos on web sites, and doesn't seem to happen with QuickTime. I went through and uninstalled any extraneous programs a week or so ago and didn't see any change. I also updated the graphics card driver. What can I do to troubleshoot/fix this?

    Read the article

  • Large Queries in Google

    - by marienbad
    I have a large query I want to do in Google. It's just a string of ORs, for the purpose of determining which search terms are ranked the highest compared to the others. It's not absurdly large; it's only 5,500 characters. But Google says:

        Request-URI Too Large
        The requested URL /search... is too large to process.

    Is there a way around this?

    Read the article

  • Recovering files that do not appear in the Recycle Bin, but are in the $Recycle.bin folder on external drive

    - by Zach Morgan
    Problem: I have an NTFS external drive with a $Recycle.bin folder on the root (E:/$Recycle.bin/) that has about 70GB worth of data. For whatever reason, the folder is no longer a hidden system file, and no Windows machine I have used the drive on will show the files in the actual Recycle Bin. What I Want To Do: I want to at least view the recycle bin files from this external drive; all of the help articles I have read just talk about deleting the folder altogether. I plan on reformatting the drive, but first I need to see if there are any important deleted files. What Didn't Work: Recuva didn't see any of my files. Resetting the external's Recycle Bin via command prompt and moving the old $Recycle.bin files into the new external $Recycle.bin folder also failed (I didn't read this anywhere, just made it up on my own).
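    A hedged sketch of how to inspect the folder directly from a command prompt (drive letter and target folder are placeholders): on Vista-style recycle bins, deleted items sit in per-user SID subfolders as paired $I files (metadata, including the original name and path) and $R files (the actual content).

        :: Make the folder visible and list everything, including system files.
        attrib -s -h "E:\$Recycle.bin"
        dir /a "E:\$Recycle.bin"

        :: Copy the contents out, preserving hidden/system files and subfolders.
        xcopy "E:\$Recycle.bin\*" "C:\Recovered\" /h /e /k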

    Read the article
