Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • Open source C program with few dependencies and a binary size of at least 2MB.

    - by max
    Hi, I'm developing an operating system and I need a test program (a function of any kind) to exercise certain internal features. I cannot find an appropriate program for this job, but probably one of you knows one. The program should be open source, written in C with very little user-library usage (only file I/O, pthreads, stdio, stdlib preferred), and must have a binary size of at least 2MB. Thanks for any suggestions.

    Read the article

  • How can I quickly parse large (>10GB) files?

    - by Andrew
    Hi - I have to process text files 10-20GB in size, with the format:

        field1 field2 field3 field4 field5

    I would like to parse the data from each line of field2 into one of several files; the file it gets pushed into is determined line-by-line by the value in field4. There are 25 different possible values in field4 and hence 25 different files the data can get parsed into. I have tried using Perl (slow) and awk (faster but still slow) - does anyone have any suggestions or pointers toward alternative approaches? FYI here is the awk code I was trying to use; note I had to revert to going through the large file 25 times because I wasn't able to keep 25 files open at once in awk:

        chromosomes=(1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25)
        for chr in ${chromosomes[@]}
        do
            awk < my_in_file_here -v pat="$chr" '{if ($4 == pat) for (i = $2; i <= $2+52; i++) print i}' >> my_out_file_"$chr".query
        done
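
    A single-pass alternative: since the open-file limit was the obstacle in awk, here is a minimal Python sketch (file names are illustrative, taken from the awk snippet above) that keeps one output handle per field4 value and writes the same $2..$2+52 range:

        out = {}  # one open output file per field4 value
        with open("my_in_file_here") as fin:
            for line in fin:
                fields = line.split()
                key = fields[3]           # field4 selects the output file
                start = int(fields[1])    # field2 gives the start position
                f = out.get(key)
                if f is None:
                    f = out[key] = open("my_out_file_%s.query" % key, "w")
                f.write("\n".join(str(i) for i in range(start, start + 53)) + "\n")
        for f in out.values():
            f.close()

    This touches the 10-20GB input only once instead of 25 times; whether it beats awk in practice depends on how I/O-bound the job is.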

    Read the article

  • Java Oracle Installation /usr/bin/java: cannot execute binary file

    - by Dave
    Hi, I've been trying for a day to get Oracle Java running on Ubuntu. I have a PowerMac G5 with Ubuntu 12.04 ppc64.

    uname -a:

        Linux LK37 3.2.0-53-powerpc64-smp #81-Ubuntu SMP Thu Aug 22 21:17:14 UTC 2013 ppc64 ppc64 ppc64 GNU/Linux

    lspci:

        david@LK37:~$ sudo lspc
        [sudo] password for david:
        sudo: lspc: command not found
        david@LK37:~$ sudo lspci
        0000:00:0b.0 PCI bridge: Apple Inc. CPC945 PCIe Bridge
        0000:0a:00.0 VGA compatible controller: NVIDIA Corporation NV43 [GeForce 6600 LE] (rev a2)
        0001:00:00.0 Host bridge: Apple Inc. U4 HT Bridge
        0001:00:01.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-X bridge (rev a3)
        0001:00:02.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-X bridge (rev a3)
        0001:00:03.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3)
        0001:00:04.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3)
        0001:00:05.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3)
        0001:00:06.0 PCI bridge: Broadcom BCM5780 [HT2000] PCI-Express Bridge (rev a3)
        0001:00:07.0 PCI bridge: Apple Inc. Shasta PCI Bridge
        0001:00:08.0 PCI bridge: Apple Inc. Shasta PCI Bridge
        0001:00:09.0 PCI bridge: Apple Inc. Shasta PCI Bridge
        0001:01:07.0 Unassigned class [ff00]: Apple Inc. Shasta Mac I/O
        0001:01:0b.0 USB controller: NEC Corporation OHCI USB Controller (rev 43)
        0001:01:0b.1 USB controller: NEC Corporation OHCI USB Controller (rev 43)
        0001:01:0b.2 USB controller: NEC Corporation uPD72010x USB 2.0 Controller (rev 04)
        0001:03:0c.0 IDE interface: Broadcom K2 SATA
        0001:03:0d.0 Unassigned class [ff00]: Apple Inc. Shasta IDE
        0001:03:0e.0 FireWire (IEEE 1394): Apple Inc. Shasta Firewire
        0001:05:04.0 Ethernet controller: Broadcom Corporation NetXtreme BCM5780 Gigabit Ethernet (rev 03)
        0001:05:04.1 Ethernet controller: Broadcom Corporation NetXtreme BCM5780 Gigabit Ethernet (rev 03)
        david@LK37:~$

    I tried various ways to install Oracle Java, but I always end up with:

        bash: /usr/bin/java: cannot execute binary file

    At the moment I have installed jdk-7u25-linux-x64.tar.gz in /usr/lib/jvm/jdk1.7.0/bin/ as said in this post. I already tried the web install but I get a 404 error. I hope you can help me. I started using Ubuntu yesterday, so please give me the complete terminal commands; it will be a lot easier for me. For those who care: I want to play Minecraft, and with OpenJDK I got a java.lang error. That's why I want to install Oracle Java.

    Read the article

  • Batch deletion of smaller files from a group of files via the Unix command line

    - by artlung
    I have a large number (more than 400) of directories full of photos. What I want to do is keep the larger sizes of these photos. Each directory has 31 to 66 files in it. Each directory has thumbnails and larger versions, plus a file called example.jpg. I dispatched the example.jpg files easily with:

        rm */example.jpg

    I initially thought it would be easy to delete the thumbnails too, but the problem is that they are not consistently named. The typical pattern was photo1.jpg and photo1s.jpg, so I did:

        rm */photo*s.jpg

    but it turned out that some of the files named photoXs.jpg were actually the larger versions, not the smaller ones. Argh. So what I want to do is scan each directory for file size and delete (or move) the thumbnails. I initially thought I'd just ls -R every file, extract the size of each one, and pick out those under a threshold. The problem? In one directory the large photo will be 1.1 MB and the thumb 200k; in another the large is 200k and the small 30k. Even worse, the files really are mostly named photo1.jpg - so simply putting them all in one folder, sorting by size, and deleting in groups would not work without renaming them first, and if possible I'd prefer to keep them in their folders. I was almost resolved to doing this all manually, but then thought I'd ask here. How would you do this task?
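
    A sketch of the size-comparison idea in Python, assuming the thumbnail shares its base name with the larger file plus a trailing "s" (photo1.jpg / photo1s.jpg), and moving the smaller file of each pair rather than deleting it so mistakes are recoverable; the root and destination directory names are illustrative:

        import os
        import shutil

        root = "photos"          # top-level directory holding the ~400 folders
        trash = "thumbnails"     # smaller files get moved here, mirroring the layout

        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                if not name.endswith("s.jpg"):
                    continue
                candidate = os.path.join(dirpath, name)
                partner = os.path.join(dirpath, name[:-5] + ".jpg")  # photo1s.jpg -> photo1.jpg
                if not os.path.exists(partner):
                    continue
                # Whichever of the pair is smaller is treated as the thumbnail.
                smaller = min(candidate, partner, key=os.path.getsize)
                dest = os.path.join(trash, os.path.relpath(smaller, root))
                os.makedirs(os.path.dirname(dest), exist_ok=True)
                shutil.move(smaller, dest)

    Comparing within each pair sidesteps the problem that "small" in one directory is "large" in another.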

    Read the article

  • How do I use Group Policy on a domain to delete Temporary Internet Files?

    - by Muhammad Ali
    I have a domain controller running Windows Server 2008 R2, and users log in to application servers running Windows Server 2003 SP2. I have applied a Group Policy to clean temporary internet files on exit, i.e. to delete all temporary internet files when users close the browser. But the Group Policy doesn't seem to work: user profile sizes keep increasing, and most of the space is occupied by temporary internet files, driving up disk usage. How can I enforce automatic deletion of temporary internet files?

    Read the article

  • Convert NSData to primitive variables with IEEE-754 or two's complement?

    - by William GILLARD
    Hi everyone. I am a new programmer in Objective-C and Cocoa. I am trying to write a framework which will be used to read binary files (Flexible Image Transport System, or FITS, files, usually used by astronomers). The binary data that I am interested in extracting can have various formats, and I get its properties by reading the header of the FITS file. Up to now, I have managed to create a class to store the content of the FITS file, isolating the header into an NSString object and the binary data into an NSData object. I have also managed to write methods which allow me to extract from the header the key values that are needed to interpret the binary data. I am now trying to convert the NSData object into a primitive array (array of double, int, short ...). But here I get stuck and would appreciate any help. According to the documentation I have about the FITS format, there are several possibilities for interpreting the binary data depending on the value of the BITPIX key:

        BITPIX value | Data represented
                   8 | Char or unsigned binary int
                  16 | 16-bit two's complement binary integer
                  32 | 32-bit two's complement binary integer
                  64 | 64-bit two's complement binary integer
                 -32 | IEEE single precision floating-point
                 -64 | IEEE double precision floating-point

    I have already written the piece of code, shown below, to try to convert the NSData into a primitive array.

        // self refers to my FITS class, which contains an NSString object
        // with the content of the header and an NSData object with the binary data.
        -(void*) GetArray
        {
            switch (BITPIX) {
                case 8:   return [self GetArrayOfUInt];
                case 16:  return [self GetArrayOfInt];
                case 32:  return [self GetArrayOfLongInt];
                case 64:  return [self GetArrayOfLongLong];
                case -32: return [self GetArrayOfFloat];
                case -64: return [self GetArrayOfDouble];
                default:  return NULL;
            }
        }

        // Then the method to convert the NSData into a primitive array.
        // I restrict my example to the case of 'double'. The code is similar for the
        // other methods; just replace double by 'unsigned int' (BITPIX 8), 'short' (BITPIX 16),
        // 'int' (BITPIX 32), 'long long' (BITPIX 64), or 'float' (BITPIX -32).
        -(double*) GetArrayOfDouble
        {
            int Nelements = [self NPIXEL];  // method to extract, from the header,
                                            // the number of elements in the array
            NSLog(@"TOTAL NUMBER OF ELEMENTS [%i]\n", Nelements);

            // CREATE THE ARRAY
            double (*array)[Nelements];

            // Get the total number of bits in the binary data
            int Nbit = abs(BITPIX) * GCOUNT * (PCOUNT + Nelements);  // GCOUNT and PCOUNT are
                                                                     // defined in the header
            NSLog(@"TOTAL NUMBER OF BIT [%i]\n", Nbit);

            int i = 0;
            // FILL THE ARRAY
            double Value;
            for (int bit = 0; bit < Nbit; bit += sizeof(double)) {
                [Img getBytes:&Value range:NSMakeRange(bit, sizeof(double))];
                NSLog(@"[%i]:(%u)%.8G\n", i, bit, Value);
                (*array)[i] = Value;
                i++;
            }
            return (*array);
        }

    However, the values I print in the loop are very different from the expected values (compared using official FITS software). Therefore, I think that the Obj-C double does not use the IEEE-754 convention, or that the Obj-C int is not two's complement. I am really not familiar with these two conventions (IEEE-754 and two's complement) and would like to know how I can do this conversion in Obj-C. In advance, many thanks for any help or information.
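
    One detail worth noting: the FITS standard stores numeric data big-endian, so on a little-endian machine the raw bytes must be byte-swapped before being reinterpreted as native doubles; both sides do use IEEE-754, it is the byte order that differs. Here is a minimal illustration of a byte-order-aware decode, sketched in Python rather than the asker's Objective-C (the function name is made up for the example):

        import struct

        def decode_fits_doubles(data, count):
            # FITS numeric data is big-endian; '>' forces big-endian IEEE-754
            # doubles regardless of the host's native byte order.
            return struct.unpack(">%dd" % count, data[:count * 8])

        # usage: values = decode_fits_doubles(raw_bytes, npixel)

    In Objective-C the same effect comes from swapping each 8-byte group with the byte-order helpers in NSByteOrder.h (NSSwapBigDoubleToHost and friends) before treating the bits as a double.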

    Read the article

  • Using a "temporary files" folder in Python

    - by zubin71
    I recently wrote a script which queries PyPI and downloads a package; however, the package gets downloaded to a user-defined folder. I'd like to modify the script so that the downloaded files go into a temporary folder if no folder is specified. The temporary-files folder on *nix machines is "/tmp"; is there a Python method I could use to find out the temporary-files folder on a particular machine? If not, could someone suggest an alternative approach?
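
    The standard library's tempfile module covers both halves of this; a short sketch (the package file name is just for illustration):

        import os
        import tempfile

        # Platform-appropriate temp directory ("/tmp" on most *nix, %TEMP% on Windows)
        default_dir = tempfile.gettempdir()

        # Or create a dedicated scratch directory for this run's downloads
        download_dir = tempfile.mkdtemp(prefix="pypi-download-")
        target = os.path.join(download_dir, "some-package.tar.gz")

    mkdtemp() leaves cleanup to the caller; if a downloaded file is only needed briefly, tempfile.NamedTemporaryFile handles deletion automatically.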

    Read the article

  • What is the fastest way to create a checksum for large files in C#

    - by crono
    Hi, I have to sync large files across several machines. The files can be up to 6GB in size. The sync will be done manually every few weeks. I can't take the filename into consideration because it can change at any time. My plan is to create checksums on the destination PC and on the source PC, and then copy to the destination all files whose checksum is not already there. My first attempt was something like this:

        using System.IO;
        using System.Security.Cryptography;

        private static string GetChecksum(string file)
        {
            using (FileStream stream = File.OpenRead(file))
            {
                SHA256Managed sha = new SHA256Managed();
                byte[] checksum = sha.ComputeHash(stream);
                return BitConverter.ToString(checksum).Replace("-", String.Empty);
            }
        }

    The problem was the runtime:

        - SHA256 with a 1.6 GB file: 20 minutes
        - MD5 with a 1.6 GB file: 6.15 minutes

    Is there a better - faster - way to get the checksum (maybe with a better hash function)?
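
    Timings like these usually point at the read pattern rather than the hash itself. For comparison, a minimal sketch of the same chunked-hashing idea in Python (not the asker's C#), reading in large blocks so the hash update, not per-read overhead, dominates:

        import hashlib

        def file_checksum(path, algo="md5", block_size=4 * 1024 * 1024):
            # Read the file in 4 MB blocks so I/O happens in large sequential chunks.
            h = hashlib.new(algo)
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(block_size), b""):
                    h.update(block)
            return h.hexdigest()

    The same buffering idea carries over to C# by wrapping the FileStream in a buffered stream, or by reading into a large byte array and feeding the hash incrementally.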

    Read the article

  • GNU make copy files to distro directory

    - by TheRoadrunner
    I keep my source HTML (and images etc.) in separate directories for source control. Part of making the distro is to have make copy the files to the output folder and set the attributes. Today my makefile contains (extract):

        %.html:
                /usr/bin/install -c -p -m 644 $< $@

        www: $(HTMLDST)/firmware.html $(HTMLDST)/firmware_status.html $(HTMLDST)/index.html

        $(HTMLDST)/firmware.html: $(HTMLSRC)/firmware.html
        $(HTMLDST)/firmware_status.html: $(HTMLSRC)/firmware_status.html
        $(HTMLDST)/index.html: $(HTMLSRC)/index.html

    This is shown with only three HTML files, but in reality there are lots. I would like to just list the filenames (without paths) and have make do the comparison between source and destination and copy the files that have been updated. Thank you in advance. Søren

    Read the article

  • Trying to cat files - unrecognized wildcard

    - by Barb
    Hello, I am trying to create a file that contains all of the code of an app. I have created a file called catlist.txt so that the files are added in the order I need them. A snippet of my catlist.txt:

        app/controllers/application_controller.rb
        app/views/layouts/*
        app/models/account.rb
        app/controllers/accounts_controller.rb
        app/views/accounts/*

    When I run the command, the files that are explicitly listed get added, but the wildcard entries do not:

        cat catlist.txt | xargs cat > fullcode

    I get:

        cat: app/views/layouts/*: No such file or directory
        cat: app/views/accounts/*: No such file or directory

    Can someone help me with this? If there is an easier method I am open to all suggestions. Barb
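
    The wildcards survive unexpanded because xargs hands the lines straight to cat without a shell in between. If a small script is acceptable, here is a Python sketch that expands each pattern itself (reusing the catlist.txt and fullcode names from the question):

        import glob
        import shutil

        with open("catlist.txt") as listing, open("fullcode", "wb") as out:
            for pattern in listing:
                pattern = pattern.strip()
                if not pattern:
                    continue
                # Expand wildcards here, preserving the order given in catlist.txt.
                for path in sorted(glob.glob(pattern)):
                    with open(path, "rb") as src:
                        shutil.copyfileobj(src, out)

    Alternatively, routing each line through a shell (e.g. xargs -I{} sh -c 'cat {}' < catlist.txt > fullcode) also gets the globs expanded, with the usual quoting caveats.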

    Read the article

  • cannot edit any php files using specific functions

    - by user458474
    I cannot update any txt files using PHP. When I write simple code like the following:

        <?php
        // create file pointer
        $fp = fopen("C:/Users/jj/bob.txt", 'w') or die('Could not open file, or file does not exist and failed to create.');
        $mytext = '<b>hi. This is my test</b>';
        // write text to file
        fwrite($fp, $mytext) or die('Could not write to file.');
        $content = file("C:/Users/jj/bob.txt");
        // close file
        fclose($fp);
        ?>

    Both files do exist in the folder. I just cannot see any updates in bob.txt. Is this a permission error in Windows? It works fine on my laptop at home. I also cannot change the PHP files on my website using FileZilla.

    Read the article

  • VSS - Solution file between multiple users

    - by BhejaFry
    Hi folks, we have a solution with multiple projects that is being developed by a team of developers. The project paths in the solution file that was initially checked in are specific to the developer who created it. Now when another developer gets the latest version of the solution, some of the projects won't load because the paths differ. What's a better way to manage this? TIA

    Read the article

  • How to resize a /home partition in Kubuntu?

    - by Devon
    I was distro hopping for a while in the past few months, so in order to keep all of my files safe, I made a partition of around 50 GB named Files to store all of my files in and still have them for quick and easy access. However, now that I've found a distribution I'm comfortable with (Kubuntu 11.10), I would like to remove this partition and have all of my files in my /home folder, in order to deal with them more easily. I've moved all of the files from the partition to my /home folder (and still have plenty of room to spare), and now I'm trying to delete the partition and use the space for my /home folder. I can delete the partition just fine; however, I can't extend the /home partition into the unallocated space. Here's a screenshot of what I'm talking about. In order to change the size of the /home partition, I need to unmount it, but I am unable to unmount it. How do I best extend the size of the partition?

    Read the article

  • caching static files for ruby on rails application using nginx

    - by splintercell
    I have been trying for some time to serve and cache static files for my Rails app using nginx. The Rails app server runs mongrel_cluster and is deployed on a different host than nginx. Following many of the available discussions, I tried the following:

        server {
            listen 80;
            server_name www.myappserver.com;
            ssl on;
            root /var/apps/myapp/current/public;

            location ~ ^/(images|javascripts|stylesheets)/ {
                root /var/apps/myapp/current;
                expires 10y;
            }

            location / {
                proxy_pass http://myapp_upstream;
            }
        }

    But nginx fails to find the images and to load the CSS and JS files. Can anyone help me out here? My aim is to configure nginx in such a way that it caches the static files until expiry. Please suggest a way to achieve this, or tell me if I am missing any point here.

    Read the article

  • socket.accept error 24: Too many open files

    - by Creotiv
    I have a problem with open files on Ubuntu 9.10 when running a server in Python 2.6, and the main problem is that I don't know why this happens. I have set:

        ulimit -n 999999
        net.core.somaxconn = 999999
        fs.file-max = 999999

    and lsof shows about 12000 open files while the server is running. I'm also using epoll. But after some time it starts throwing the exception:

        File "/usr/lib/python2.6/socket.py", line 195, in accept
        error: [Errno 24] Too many open files

    And I don't know how it can hit the file limit when the limit hasn't been reached. Thanks for any help.
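
    One thing worth checking from inside the server process itself: the descriptor limit it actually inherited, since a ulimit raised in one shell does not follow a process started elsewhere (an init script, cron, another terminal). A small sketch using the standard resource module:

        import resource

        # The limits this process actually has, which may differ from what
        # `ulimit -n` reports in your interactive shell.
        soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
        print("RLIMIT_NOFILE: soft=%d hard=%d" % (soft, hard))

        # Raise the soft limit up to the hard limit for this process.
        # Raising the hard limit itself requires root or matching entries
        # in /etc/security/limits.conf.
        resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))

    If the soft limit printed here turns out to be the default 1024, the accept() failures are explained even though the system-wide settings look generous.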

    Read the article

  • Why am I unable to open exe files?

    - by Aaron
    It doesn't matter what disk I use, it cannot open the program. I keep getting the following error:

        Archive: /media/xxxxxxxx/INSTALL/_Setupa.exe
        [/media/xxxxxxxxxx/INSTALL/_Setupa.exe]
        End-of-central-directory signature not found. Either this file is not a zipfile, or it constitutes one disk of a multi-part archive. In the latter case the central directory and zipfile comment will be found on the last disk(s) of this archive.
        zipinfo: cannot find zipfile directory in one of /media/xxxxxx/INSTALL/_Setupa.exe or /media/xxxxxxxxx/INSTALL/_Setupa.exe.zip, and cannot find /media/xxxxxxxxx/INSTALL/_Setupa.exe.ZIP, period.

    Any ideas?

    Read the article

  • How to create file of files?

    - by TheMachineCharmer
        class Node
        {
            FooType Data; // I can save Data to a file with extension .foo
            void Save()
            {
                // save Data to a .foo file
            }
        }

    Now,

        class Graph
        {
            List<Node> Nodes;
            void Save()
            {
                foreach (Node node in Nodes)
                {
                    node.Save();
                }
            }
        }

    Now when I invoke someGraph.Save(), it creates Nodes.Count files. I would like those files to appear as one file with the extension somename.graph and to be able to read it back in as Nodes. How can I do this? Is there a way to bundle files into a single different file?
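
    One common approach is to treat the .graph file as an archive that contains the individual .foo payloads. Here is a sketch of the idea in Python rather than the question's C# (where a zip-based container such as System.IO.Packaging plays the same role); the function names are illustrative:

        import zipfile

        def save_graph(path, node_files):
            # Bundle each node's .foo file into a single .graph container.
            with zipfile.ZipFile(path, "w") as bundle:
                for foo in node_files:
                    bundle.write(foo)

        def load_graph(path):
            # Read the .foo payloads back without unpacking them to disk.
            with zipfile.ZipFile(path) as bundle:
                return {name: bundle.read(name) for name in bundle.namelist()}

    The alternative is a hand-rolled container (a small index of offsets and lengths followed by the concatenated payloads), which avoids the archive format at the cost of writing the bookkeeping yourself.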

    Read the article

  • Reason for monolithic data files

    - by Ali Lown
    Primarily this seems to be a technique used by games, where they have all the sounds in one file, all the textures in another, etc., with these files commonly reaching gigabyte sizes. What is the reason for doing this rather than keeping everything in subdirectories as small files, one per texture? Many small games use the small-file approach, while the monolithic system seems to be favoured by larger companies. Is there some file-system overhead with lots of small files? Or are they trying to protect their property, although most of these containers just seem to be a compressed file with a new extension?

    Read the article

  • Can I share files via SAMBA while in root?

    - by user212501
    I have a couple of things going on here. I am currently running Ubuntu 10.04 as root via the startx command. The reason is that I keep getting the "low graphics" error and I cannot start up normally. I have researched and researched the error - I mean, I got my sudo on - and at this point I'm done trying to fix it; I just want to re-install. (Funny thing: this startx workaround is how I'm using the laptop right now, so I know it is a software issue.) So here's my dilemma inside another dilemma: I have about 30 gigs of data that I need to get off the laptop before I can re-install. Any thoughts? What say you? (LOTR, lol)

    Read the article

  • The binary you uploaded was invalid. A pre-release beta version of the SDK was used to build the app

    - by Nick
    I am having a problem submitting my new app to the App Store. iTunes Connect gives me the error: "The binary you uploaded was invalid. A pre-release beta version of the SDK was used to build the application." I haven't changed anything; I can compile with an ad-hoc certificate and that works fine. I uploaded another app yesterday and that worked fine too. All the targets and the project info are set to compile against the base SDK, iPhone OS 3.0. I even upgraded to the latest SDK, but got the same result. Any ideas?

    Read the article
