Search Results

Search found 4547 results on 182 pages for 'haskell io'.

Page 47/182

  • Create a sparse BufferedImage in Java

    - by elgcom
    I have to create an image with a very large resolution, but the image is relatively "sparse": only some areas of the image need to be drawn. For example, with the following code:

        /* this takes about 5 GB of memory */
        final BufferedImage img = new BufferedImage(
            36000, 36000, BufferedImage.TYPE_INT_ARGB);

        /* draw something */
        Graphics g = img.getGraphics();
        g.drawImage(....);

        /* output as PNG */
        final File out = new File("out.png");
        ImageIO.write(img, "png", out);

    The PNG image I create in the end is only about 200-300 MB. The question is: how can I avoid creating a 5 GB BufferedImage at the beginning? I do need an image with large dimensions, but with very sparse color information. Is there any stream for BufferedImage so that it will not take so much memory?
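
    A BufferedImage always backs every pixel with an in-memory raster, and TYPE_INT_ARGB costs 4 bytes per pixel, so 36000 x 36000 x 4 bytes is roughly 5.2 GB no matter how sparse the drawing is. One mitigation, sketched below under the assumption that a 256-color palette (and no full alpha channel) is enough, is an indexed image at 1 byte per pixel (about 1.3 GB); truly sparse or streamed PNG output would need tiled rendering or a third-party streaming PNG encoder.

        import java.awt.Graphics2D;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;

        public class SparseImageDemo {
            public static void main(String[] args) throws Exception {
                // 1 byte per pixel instead of 4: ~1.3 GB for 36000 x 36000,
                // using the default 256-color palette of TYPE_BYTE_INDEXED.
                BufferedImage img = new BufferedImage(
                        36000, 36000, BufferedImage.TYPE_BYTE_INDEXED);
                Graphics2D g = img.createGraphics();
                // ... draw the sparse content here ...
                g.dispose();
                ImageIO.write(img, "png", new File("out.png"));
            }
        }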

    Read the article

  • Is there a Base64Stream for .NET? where?

    - by Cheeso
    If I want to produce Base64-encoded output, how would I do that in .NET? I know that since .NET 2.0 there is the ICryptoTransform interface, and the ToBase64Transform() and FromBase64Transform() implementations of that interface. But those classes live in the System.Security.Cryptography namespace and require the use of TransformBlock, TransformFinalBlock, and so on. Is there an easier way to Base64-encode a stream of data in .NET?
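
    One option, sketched here with placeholder file names: CryptoStream accepts any ICryptoTransform, so wrapping the output stream in a CryptoStream over a ToBase64Transform gives streaming Base64 without calling TransformBlock yourself. For data that is already in memory, Convert.ToBase64String is the one-liner.

        using System;
        using System.IO;
        using System.Security.Cryptography;

        class Base64StreamDemo
        {
            static void Main()
            {
                // Stream input.bin through a base64 encoder into encoded.txt.
                using (Stream output = File.Create("encoded.txt"))
                using (var base64 = new CryptoStream(output, new ToBase64Transform(),
                                                     CryptoStreamMode.Write))
                using (Stream input = File.OpenRead("input.bin"))
                {
                    input.CopyTo(base64);   // .NET 4+; on 2.0, copy with a byte[] loop instead
                }

                // For an in-memory buffer, no streams are needed at all.
                Console.WriteLine(Convert.ToBase64String(new byte[] { 1, 2, 3 }));
            }
        }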

    Read the article

  • Why isn't this file reading/writing program working?

    - by user320950
    This program is supposed to read files and write them. I took the file-open checks out because they kept causing errors. The problem is that the files open as they are supposed to and the names are correct, but nothing ends up in any of the text files. Do you know what is wrong?

        #include<iostream>
        #include<fstream>
        #include<cstdlib>
        #include<iomanip>
        using namespace std;

        int main()
        {
            ifstream in_stream;   // reads itemlist.txt
            ofstream out_stream1; // writes in items.txt
            ifstream in_stream2;  // reads pricelist.txt
            ofstream out_stream3; // writes in plist.txt
            ifstream in_stream4;  // read recipt.txt
            ofstream out_stream5; // write display.txt

            float price=' ', curr_total=0.0;
            int wrong=0;
            int itemnum=' ';
            char next;

            in_stream.open("ITEMLIST.txt", ios::in);        // list of avaliable items
            out_stream1.open("listWititems.txt", ios::out); // list of avaliable items
            in_stream2.open("PRICELIST.txt", ios::in);
            out_stream3.open("listWitdollars.txt", ios::out);
            in_stream4.open("display.txt", ios::in);
            out_stream5.open("showitems.txt", ios::out);

            in_stream.close();   // closing files.
            out_stream1.close();
            in_stream2.close();
            out_stream3.close();
            in_stream4.close();
            out_stream5.close();

            system("pause");

            in_stream.setf(ios::fixed);
            while(in_stream.eof())
            {
                in_stream >> itemnum;
                cin.clear();
                cin >> next;
            }

            out_stream1.setf(ios::fixed);
            while (out_stream1.eof())
            {
                out_stream1 << itemnum;
                cin.clear();
                cin >> next;
            }

            in_stream2.setf(ios::fixed);
            in_stream2.setf(ios::showpoint);
            in_stream2.precision(2);
            while((price== (price*1.00)) && (itemnum == (itemnum*1)))
            {
                while (in_stream2 >> itemnum >> price) // gets itemnum and price
                {
                    while (in_stream2.eof()) // reads file to end of file
                    {
                        in_stream2 >> itemnum;
                        in_stream2 >> price;
                        price++;
                        curr_total= price++;
                        in_stream2 >> curr_total;
                        cin.clear(); // allows more reading
                        cin >> next;
                    }
                }
            }

            out_stream3.setf(ios::fixed);
            out_stream3.setf(ios::showpoint);
            out_stream3.precision(2);
            while((price== (price*1.00)) && (itemnum == (itemnum*1)))
            {
                while (out_stream3 << itemnum << price)
                {
                    while (out_stream3.eof()) // reads file to end of file
                    {
                        out_stream3 << itemnum;
                        out_stream3 << price;
                        price++;
                        curr_total= price++;
                        out_stream3 << curr_total;
                        cin.clear(); // allows more reading
                        cin >> next;
                    }
                    return itemnum, price;
                }
            }

            in_stream4.setf(ios::fixed);
            in_stream4.setf(ios::showpoint);
            in_stream4.precision(2);
            while ( in_stream4.eof())
            {
                in_stream4 >> itemnum >> price >> curr_total;
                cin.clear();
                cin >> next;
            }

            out_stream5.setf(ios::fixed);
            out_stream5.setf(ios::showpoint);
            out_stream5.precision(2);
            out_stream5 << setw(5) << " itemnum " << setw(5) << " price " << setw(5) << " curr_total " << endl; // sends items and prices to receipt.txt
            out_stream5 << setw(5) << itemnum << setw(5) << price << setw(5) << curr_total; // sends items and prices to receipt.txt
            out_stream5 << " You have a total of " << wrong++ << " errors " << endl;
        }
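
    Two things stand out in the code above: every stream is closed right after it is opened, so all the later reads and writes act on closed streams, and loops of the form while (stream.eof()) only execute once the stream is already at end of file (testing eof() on, or reading back from, an ofstream makes no sense either). A minimal sketch of the usual pattern, reusing the file names from the question:

        #include <fstream>
        #include <iomanip>
        #include <iostream>

        int main() {
            std::ifstream in("PRICELIST.txt");
            std::ofstream out("listWitdollars.txt");
            if (!in || !out) {
                std::cerr << "could not open the files\n";
                return 1;
            }

            out << std::fixed << std::showpoint << std::setprecision(2);

            int itemnum;
            float price, total = 0.0f;
            while (in >> itemnum >> price) {   // loop on the read itself, not on eof()
                total += price;
                out << std::setw(8) << itemnum << std::setw(10) << price << '\n';
            }
            out << "total: " << total << '\n';
            return 0;                          // streams are closed by their destructors
        }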

    Read the article

  • Execute an Application on the Server Using VBScript

    - by Nathan Campos
    I have an application on my server called leaf.exe that takes two arguments, an input file and an output file, like this example: leaf.exe input.jpg output.leaf. The executable and the input file are in the same directory as my home page file. I need a VBScript that can run the application like that; how can I do this?
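
    A sketch in classic-ASP VBScript, assuming the page is allowed to spawn processes (the IIS worker identity needs execute rights on leaf.exe) and that the file names are the ones from the question:

        ' Run leaf.exe from the page's own folder and wait for it to finish.
        Dim shell, folder, q, cmd
        q = Chr(34)                                   ' a double-quote character
        Set shell = Server.CreateObject("WScript.Shell")

        folder = Server.MapPath(".")                  ' physical path of this page's directory
        cmd = q & folder & "\leaf.exe" & q & " " & _
              q & folder & "\input.jpg" & q & " " & _
              q & folder & "\output.leaf" & q

        shell.Run cmd, 0, True                        ' 0 = hidden window, True = wait for exit
        Set shell = Nothing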

    Read the article

  • What is the fastest way for reading huge files in Delphi?

    - by dummzeuch
    My program needs to read chunks from a huge binary file with random access. I have got a list of offsets and lengths which may have several thousand entries. The user selects an entry and the program seeks to the offset and reads length bytes. The program internally uses a TMemoryStream to store and process the chunks read from the file. Reading the data is done via a TFileStream like this:

        FileStream.Position := Offset;
        MemoryStream.CopyFrom(FileStream, Size);

    This works fine but unfortunately it becomes increasingly slower as the files get larger. The file size starts at a few megabytes but frequently reaches several tens of gigabytes. The chunks read are around 100 kbytes in size. The file's content is only read by my program. It is the only program accessing the file at the time. Also the files are stored locally so this is not a network issue. I am using Delphi 2007 on a Windows XP box. What can I do to speed up this file access?
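
    One thing worth trying (a hedged sketch, not a guaranteed fix): tell the Windows cache manager that access is random rather than sequential by opening the handle yourself with FILE_FLAG_RANDOM_ACCESS and wrapping it in a THandleStream, so the cache does not keep reading ahead through a multi-gigabyte file.

        // uses Windows, SysUtils, Classes
        procedure ReadChunk(const FileName: string; Offset: Int64; Size: Integer;
          MemoryStream: TMemoryStream);
        var
          h: THandle;
          fs: THandleStream;
        begin
          // FILE_FLAG_RANDOM_ACCESS hints the cache manager not to read ahead aggressively.
          h := CreateFile(PChar(FileName), GENERIC_READ, FILE_SHARE_READ, nil,
            OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL or FILE_FLAG_RANDOM_ACCESS, 0);
          if h = INVALID_HANDLE_VALUE then
            RaiseLastOSError;
          fs := THandleStream.Create(h);
          try
            fs.Position := Offset;
            MemoryStream.CopyFrom(fs, Size);
          finally
            fs.Free;
            CloseHandle(h);
          end;
        end;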

    Read the article

  • Read a specific range of lines from a file using C

    - by James Joy
    I have the following content in a file:

        hhasfghgsafjgfhgfhjf
        gashghfdgdfhgfhjasgfgfhsgfjdg
        jfshafghgfgfhfsghfgffsjgfj
        ...
        startread
        hajshjsfhajfhjkashfjf
        hasjgfhgHGASFHGSHF
        hsafghfsaghgf
        ...
        stopread
        ...
        jsfjhfhjgfsjhfgjhsajhdsa
        jhasjhsdabjhsagshaasgjasdhjk
        jkdsdsahghjdashjsfahjfsd

    I need to read the lines from the line after startread until the line before stopread using C code and store them in a string variable (of course with a \n for every line break). How can I achieve this? I have used fgets(line, sizeof(line), file); but it starts reading from the beginning. I don't have the exact line numbers at which to start and stop reading, since the file is written by another C program, but the identifiers startread and stopread mark where to start reading. The operating platform is Linux. Thanks in advance.
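
    A minimal C sketch: read line by line with fgets, start collecting after the startread line, stop at the stopread line, and grow a single buffer with realloc. The marker names come from the question; the per-line buffer size is an assumption.

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* Returns a malloc'd string with every line between startread and stopread
           (newlines included); the caller frees it. NULL on error. */
        char *read_between(const char *path)
        {
            FILE *fp = fopen(path, "r");
            if (!fp) return NULL;

            char line[1024];
            char *result = NULL;
            size_t len = 0;
            int inside = 0;

            while (fgets(line, sizeof(line), fp)) {
                if (strstr(line, "stopread")) break;       /* stop before this line */
                if (inside) {
                    size_t add = strlen(line);
                    char *tmp = realloc(result, len + add + 1);
                    if (!tmp) { free(result); fclose(fp); return NULL; }
                    result = tmp;
                    memcpy(result + len, line, add + 1);    /* copies the trailing '\0' too */
                    len += add;
                }
                if (strstr(line, "startread")) inside = 1;  /* start after this line */
            }
            fclose(fp);
            return result;
        }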

    Read the article

  • Spring Integration 1.0 RC2: Streaming file content?

    - by gdm
    I've been trying to find information on this, but due to the immaturity of the Spring Integration framework I haven't had much luck. Here is my desired work flow:

    1. New files are placed in an 'Incoming' directory.
    2. Files are picked up using a file:inbound-channel-adapter.
    3. The file content is streamed, N lines at a time, to a 'Stage 1' channel, which parses the line into an intermediary (shared) representation.
    4. This parsed line is routed to multiple 'Stage 2' channels.
    5. Each 'Stage 2' channel does its own processing on the N available lines to convert them to a final representation. This channel must have a queue which ensures no Stage 2 channel is overwhelmed in the event that one channel processes significantly slower than the others.
    6. The final representation of the N lines is written to a file. There will be as many output files as there were routing destinations in step 4.

    *'N' above stands for any reasonable number of lines to read at a time, from [1, whatever I can fit into memory reasonably], but is guaranteed to always be less than the number of lines in the full file.

    How can I accomplish streaming (steps 3, 4, 5) in Spring Integration? It's fairly easy to do without streaming the files, but my files are large enough that I cannot read the entire file into memory. As a side note, I have a working implementation of this work flow without Spring Integration, but since we're using Spring Integration in other places in our project, I'd like to try it here to see how it performs and how the resulting code compares for length and clarity.
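
    Setting the Spring Integration wiring itself aside (this sketch does not cover channels or adapters), the streaming part of step 3 can be plain Java: hand out N lines at a time from a BufferedReader so the whole file is never in memory, and let each chunk become one message payload. Class, method and file names below are placeholders.

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;
        import java.util.ArrayList;
        import java.util.List;

        public class ChunkedLineReader {
            public static void main(String[] args) throws IOException {
                int n = 1000;                                        // chunk size, tune to taste
                try (BufferedReader reader = new BufferedReader(new FileReader("incoming/big.txt"))) {
                    List<String> chunk = new ArrayList<>(n);
                    String line;
                    while ((line = reader.readLine()) != null) {
                        chunk.add(line);
                        if (chunk.size() == n) {
                            process(chunk);                          // e.g. send to the Stage 1 channel
                            chunk = new ArrayList<>(n);
                        }
                    }
                    if (!chunk.isEmpty()) process(chunk);            // trailing partial chunk
                }
            }

            static void process(List<String> lines) {
                System.out.println("got " + lines.size() + " lines");
            }
        }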

    Read the article

  • Java FileWriter/BufferedWriter - appending to the end of a text file?

    - by KP65
    I've done this once before; I'm trying to replicate what I did, and this is what I've got so far:

        try {
            BufferedWriter writer = new BufferedWriter(new FileWriter("file.P", true));
            System.out.println("entered");
            if (!(newUserName.isEmpty()) || (newUserPass.isEmpty())){
                writer.newLine();
                writer.write("hellotest123");
                writer.close();
            }

    It seems to find file.P, which is just a text file, but it doesn't seem to append anything onto it. It enters the code and passes the if statement fine, but nothing is appended to the text file. I'm slightly stuck!
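
    Two details in the snippet above are worth checking: !(newUserName.isEmpty()) || (newUserPass.isEmpty()) is true whenever the password is empty, which is probably not the intended test, and when the condition is false the writer is never flushed or closed, so buffered output is silently dropped. A minimal sketch that appends reliably (the user-name and password values are placeholders):

        import java.io.BufferedWriter;
        import java.io.FileWriter;
        import java.io.IOException;

        public class AppendDemo {
            public static void main(String[] args) throws IOException {
                String newUserName = "alice";    // placeholder values
                String newUserPass = "secret";

                BufferedWriter writer = new BufferedWriter(new FileWriter("file.P", true));
                try {
                    if (!newUserName.isEmpty() && !newUserPass.isEmpty()) {
                        writer.newLine();
                        writer.write("hellotest123");
                    }
                } finally {
                    writer.close();              // flushes the buffer even if nothing was written
                }
            }
        }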

    Read the article

  • Determine if the current thread has low I/O priority

    - by Magnus Hoff
    I have a background thread that does some I/O-intensive background-type work. To please the other threads and processes running, I set the thread priority to "background mode" using SetThreadPriority, like this:

        SetThreadPriority(GetCurrentThread(), THREAD_MODE_BACKGROUND_BEGIN);

    However, THREAD_MODE_BACKGROUND_BEGIN is only available on Windows Server 2008 and Windows Vista or newer, and the program also needs to work well on Windows Server 2003 and XP. So the real code is more like this:

        if (!SetThreadPriority(GetCurrentThread(), THREAD_MODE_BACKGROUND_BEGIN))
        {
            SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_LOWEST);
        }

    The problem with this is that on Windows XP it will totally disrupt the system by using too much I/O. I have a plan for an ugly and shameful way of mitigating this problem, but it depends on me being able to determine whether the current thread has low I/O priority or not. Now, I know I can store which thread priority I ended up setting, but the control flow in the program is not really well suited for this. I would rather be able to test later whether or not the current thread has low I/O priority -- whether it is in "background mode". GetThreadPriority does not seem to give me this information. Is there any way to determine if the current thread has low I/O priority?
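
    GetThreadPriority reports only the CPU priority, and I am not aware of a documented user-mode query for a thread's I/O priority on these OS versions, so a pragmatic workaround (a sketch of the "store it yourself" idea, kept out of the main control flow) is a per-thread flag set at the single place that enters background mode:

        #include <windows.h>

        #ifndef THREAD_MODE_BACKGROUND_BEGIN
        #define THREAD_MODE_BACKGROUND_BEGIN 0x00010000   // for older SDK headers
        #endif

        namespace {
            __declspec(thread) bool g_lowIoPriority = false;  // one flag per thread
        }

        void EnterBackgroundMode()
        {
            if (SetThreadPriority(GetCurrentThread(), THREAD_MODE_BACKGROUND_BEGIN)) {
                g_lowIoPriority = true;                       // Vista / Server 2008 or later
            } else {
                SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_LOWEST);
                g_lowIoPriority = false;                      // XP / Server 2003 fallback
            }
        }

        bool CurrentThreadHasLowIoPriority()
        {
            return g_lowIoPriority;                           // queryable from anywhere on this thread
        }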

    Read the article

  • Foreach File in a Folder in Flash?

    - by msandbot
    Hey, I have an image slideshow program working right now, and it takes in a folder with a hard-coded number of images. I would like to change this so that it can take in a folder and display all of the images, no matter how many there are. Is there a way to do this in Flash? I'm thinking of something like the foreach loop in Perl or another scripting language. It would be possible to store the number of images in a text file, but I don't know how to read that in Flash either. I'm working in ActionScript 3. Any help would be greatly appreciated. Thanks -Mike
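
    Flash running in the browser cannot list a server folder by itself (AIR's File.getDirectoryListing() can, but only in a desktop app), so the usual workaround is the one hinted at in the question: publish a small manifest text file next to the images and load it with URLLoader. A sketch in ActionScript 3; the manifest name and format (one file name per line) are assumptions.

        import flash.events.Event;
        import flash.net.URLLoader;
        import flash.net.URLRequest;

        var manifestLoader:URLLoader = new URLLoader();
        manifestLoader.addEventListener(Event.COMPLETE, onManifestLoaded);
        manifestLoader.load(new URLRequest("images/images.txt"));   // one image file name per line

        function onManifestLoaded(e:Event):void {
            var names:Array = String(URLLoader(e.target).data).split("\n");
            for each (var fileName:String in names) {
                if (fileName.length > 0) {
                    trace("will load image: " + fileName);   // hand each name to a Loader here
                }
            }
        }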

    Read the article

  • Java how to copy part of a file

    - by user3479074
    I have to read a file and, depending on the content of the last lines, copy most of its content into a new file. Unfortunately I haven't found a way to copy the first n lines or characters of a file in Java. The only way I found is copying the file using NIO FileChannels, where I can specify the length in bytes. However, for that I would need to know how many bytes the content I read occupies in the source file. Does anyone know a solution for one of these problems?
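
    If the cut point is known as a line count, FileChannel byte arithmetic is not needed: copy line by line through a Reader/Writer pair and stop after n lines. A short sketch; the file names, charset and line count are assumptions.

        import java.io.*;
        import java.nio.charset.StandardCharsets;

        public class HeadCopy {
            // Copies the first n lines of src into dst.
            static void copyFirstLines(File src, File dst, int n) throws IOException {
                try (BufferedReader in = new BufferedReader(new InputStreamReader(
                         new FileInputStream(src), StandardCharsets.UTF_8));
                     BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
                         new FileOutputStream(dst), StandardCharsets.UTF_8))) {
                    String line;
                    for (int copied = 0; copied < n && (line = in.readLine()) != null; copied++) {
                        out.write(line);
                        out.newLine();
                    }
                }
            }

            public static void main(String[] args) throws IOException {
                copyFirstLines(new File("source.txt"), new File("copy.txt"), 100);
            }
        }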

    Read the article

  • How to append text into a text file dynamically

    - by niraj deshmukh
    I have a file like this:

        [12]
        key1=val1
        key2=val2
        key3=val3
        key4=val4
        key5=val5

        [13]
        key1=val1
        key2=val2
        key3=val3
        key4=val4
        key5=xyz

        [14]
        key1=val1
        key2=val2
        key3=val3
        key4=val4
        key5=val5

    I want to update the key5 value in the [13] section.

        try {
            br = new BufferedReader(new FileReader(oldFileName));
            bw = new BufferedWriter(new FileWriter(tmpFileName));
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
                if (line.contains("[13]")) {
                    while (line.contains("key5")) {
                        if (line.contains("key5")) {
                            line = line.replace("key5", "key5= Val5");
                            bw.write(line+"\n");
                        }
                    }
                }
            }
        } catch (Exception e) {
            return;
        } finally {
            try {
                if(br != null) br.close();
            } catch (IOException e) {
                //
            }
            try {
                if(bw != null) bw.close();
            } catch (IOException e) {
                //
            }
        }
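
    Two problems with the snippet: after line.replace("key5", ...) the line still contains "key5", so the inner while never terminates, and only the replaced lines are ever written, so the temp file would miss everything else. A sketch of the usual approach (file names and the new value are placeholders): copy every line, track which [section] you are in, and rewrite the one key while inside [13].

        import java.io.BufferedReader;
        import java.io.BufferedWriter;
        import java.io.FileReader;
        import java.io.FileWriter;
        import java.io.IOException;

        public class UpdateKeyDemo {
            public static void main(String[] args) throws IOException {
                String newValue = "val5";                         // the value to write
                try (BufferedReader br = new BufferedReader(new FileReader("old.ini"));
                     BufferedWriter bw = new BufferedWriter(new FileWriter("tmp.ini"))) {
                    String line;
                    boolean inSection13 = false;
                    while ((line = br.readLine()) != null) {
                        if (line.trim().startsWith("[")) {        // every section header resets the flag
                            inSection13 = line.trim().equals("[13]");
                        }
                        if (inSection13 && line.trim().startsWith("key5=")) {
                            line = "key5=" + newValue;            // rewrite just this key
                        }
                        bw.write(line);
                        bw.newLine();                             // every line is copied, changed or not
                    }
                }
            }
        }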

    Read the article

  • Explain this shell statement

    - by Mats Stijlaart
    The following statement removes the line numbers in a text file:

        cat withLineNumbers.txt | sed 's/^.......//' >> withoutLineNumbers.txt

    The input file is created with the following statement (this one I understand):

        nl -ba input.txt >> withLineNumbers.txt

    I know the functionality of cat, and I know the output is written to the withoutLineNumbers.txt file. But the | sed 's/^.......//' part is not really clear to me. Thanks for your time.
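
    In sed, s/^.......// substitutes an empty string for whatever matches ^.......: the ^ anchors the match at the start of the line and each . matches exactly one character, so the first seven characters of every line are deleted. That is the width of the prefix nl writes by default (a 6-column right-aligned number plus a tab). The cat is not needed, by the way; sed can read the file directly. A quick check, plus an equivalent with cut:

        # both print just "hello": the 7-character "     1<TAB>" prefix is stripped
        printf '     1\thello\n' | sed 's/^.......//'
        printf '     1\thello\n' | cut -c8-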

    Read the article

  • C++ file input/output

    - by Myx
    Hi, I am trying to read from a file using fgets and sscanf. In my file, I have characters on each line which I wish to put into a vector. So far, I have the following:

        FILE *fp;
        fp = fopen(filename, "r");
        if(!fp) {
            fprintf(stderr, "Unable to open file %s\n", filename);
            return 0;
        }

        // Read file
        int line_count = 0;
        char buffer[1024];
        while(fgets(buffer, 1023, fp)) {
            // Increment line counter
            line_count++;

            char *bufferp = buffer;
            ...
            while(*bufferp != '\n') {
                char *tmp;
                if(sscanf(bufferp, "%c", tmp) != 1) {
                    fprintf(stderr, "Syntax error reading axiom on "
                            "line %d in file %s\n", line_count, filename);
                    return 0;
                }
                axiom.push_back(tmp);
                printf("put %s in axiom vector\n", axiom[axiom.size()-1]);

                // increment buffer pointer
                bufferp++;
            }
        }

    My axiom vector is defined as vector<char *> axiom;. When I run my program, I get a segfault, and it happens when I do the sscanf. Any suggestions on what I'm doing wrong?
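
    The crash comes from sscanf(bufferp, "%c", tmp): tmp is an uninitialized char*, so sscanf writes the scanned character through a garbage pointer. Reading into a real char (or just dereferencing bufferp) and storing chars by value avoids it. A minimal sketch; the file name is a placeholder.

        #include <cstdio>
        #include <vector>

        int main() {
            std::vector<char> axiom;                       // chars by value, not char*
            FILE *fp = std::fopen("axiom.txt", "r");
            if (!fp) return 1;

            char buffer[1024];
            while (std::fgets(buffer, sizeof(buffer), fp)) {
                for (char *p = buffer; *p != '\0' && *p != '\n'; ++p) {
                    char c;
                    if (std::sscanf(p, "%c", &c) != 1)     // &c: a valid address to write into
                        break;                             // (char c = *p; would do the same)
                    axiom.push_back(c);
                    std::printf("put %c in axiom vector\n", axiom.back());
                }
            }
            std::fclose(fp);
            return 0;
        }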

    Read the article

  • Read whole ASCII file into C++ std::string

    - by Arrieta
    Hello, I need to read a whole file into memory and place it in a C++ std::string. If I were to read it into a char*, the answer would be very simple:

        std::ifstream t;
        int length;
        t.open("file.txt", "r");      // open input file
        t.seekg(0, std::ios::end);    // go to the end
        length = t.tellg();           // report location (this is the length)
        t.seekg(0, std::ios::beg);    // go back to the beginning
        buffer = new char[length];    // allocate memory for a buffer of appropriate dimension
        t.read(buffer, length);       // read the whole file into the buffer
        t.close();                    // close file handle

        // ... do stuff with buffer here ...

    Now, I want to do the exact same thing, but using a std::string instead of a char*. I want to avoid loops, i.e., I don't want to do:

        std::ifstream t;
        t.open("file.txt", "r");
        std::string buffer;
        std::string line;
        while(t){
            std::getline(t, line);
            // ... append line to buffer and go on
        }
        t.close();

    Any ideas?
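
    Two loop-free ways that are commonly used, sketched below: stream the file's buffer into a std::stringstream via rdbuf(), or construct the string from a pair of istreambuf_iterators. Both put the whole file into the returned string.

        #include <fstream>
        #include <iterator>
        #include <sstream>
        #include <string>

        // Slurp via the stream buffer.
        std::string slurp(const char *path) {
            std::ifstream t(path);
            std::stringstream buffer;
            buffer << t.rdbuf();            // copies the whole file in one shot
            return buffer.str();
        }

        // Slurp via iterators over the underlying streambuf.
        std::string slurp_iter(const char *path) {
            std::ifstream t(path);
            return std::string(std::istreambuf_iterator<char>(t),
                               std::istreambuf_iterator<char>());
        }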

    Read the article

  • Unable to open a file for writing

    - by asdasdas
    I am trying to write to a file. I do a file_exists check on it before I do fopen, and it returns true (the file does exist). However, the file fails this code and gives me the error every time:

        $handle = fopen($filename, 'w');
        if($handle) {
            flock($handle, LOCK_EX);
            fwrite($handle, $contents);
        } else {
            echo 'ERROR: Unable to open the file for writing.',PHP_EOL;
            exit();
        }
        flock($handle, LOCK_UN);
        fclose($handle);

    Is there a way I can get more specific error details as to why this file cannot be opened for writing? I know that the filename is legit, but for some reason it just won't let me write to it. I do have write permissions; I was able to write to and overwrite another file.
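
    To see why fopen() refuses the handle, check is_writable() (the web-server user, not your own account, needs write permission on the file and execute permission on its directory) and read the warning text back with error_get_last(). A sketch with a placeholder path:

        <?php
        $filename = '/path/to/file.txt';   // placeholder

        clearstatcache();
        var_dump(file_exists($filename));  // the file is there...
        var_dump(is_writable($filename));  // ...but is it writable by the PHP process user?

        $handle = @fopen($filename, 'w');
        if ($handle === false) {
            $err = error_get_last();       // fopen's own warning, e.g. "Permission denied"
            echo 'fopen failed: ', $err ? $err['message'] : 'unknown error', PHP_EOL;
            exit(1);
        }
        fclose($handle);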

    Read the article

  • Extract some data from a lot of XML files

    - by LifeH2O
    I have cricket player profiles saved in the form of .xml files in a folder. Each file has these tags in it:

        <playerid>547</playerid>
        <majorteam>England</majorteam>
        <playername>Don</playername>

    The playerid is the same as in the .xml file name (each file is a different size, 1 KB to 5 KB). There are about 500 files. What I need is to extract the playername, majorteam, and playerid from all these files into a list. I will convert that list to XML later. If you know how I can do it directly to XML, I will be very thankful.
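
    The question does not name a language, so here is a sketch in Python (an assumption) using only the standard library: glob the folder, pull the three tags out of each file with ElementTree, and write the collected list straight back out as one XML document. The folder and output file names are placeholders.

        import glob
        import xml.etree.ElementTree as ET

        players = []
        for path in glob.glob('players/*.xml'):              # folder of ~500 profile files
            root = ET.parse(path).getroot()
            players.append({
                'playerid':   root.findtext('.//playerid'),
                'playername': root.findtext('.//playername'),
                'majorteam':  root.findtext('.//majorteam'),
            })

        # Convert the list straight to one XML document.
        out = ET.Element('players')
        for p in players:
            e = ET.SubElement(out, 'player')
            for tag, value in p.items():
                ET.SubElement(e, tag).text = value
        ET.ElementTree(out).write('players.xml', encoding='utf-8', xml_declaration=True)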

    Read the article

  • Error copying file from app bundle

    - by Michael Chen
    I used the Firefox add-on SQLite Manager and created a database, which was saved to my desktop as "DB.sqlite". I copied the file into the supporting files for my project. But when I run the app, I immediately get the error:

        Assertion failure in -[AppDelegate copyDatabaseIfNeeded], /Users/Mac/Desktop/Note/Note/AppDelegate.m:32
        2014-08-19 23:38:02.830 Note[28309:60b] Terminating app due to uncaught exception 'NSInternalInconsistencyException',
        reason: 'Failed to create writable database file with message 'The operation couldn’t be completed. (Cocoa error 4.)'.'
        First throw call stack: ...

    Here is the app delegate code where the error takes place:

        -(void) copyDatabaseIfNeeded {
            NSFileManager *fileManager = [NSFileManager defaultManager];
            NSError *error;
            NSString *dbPath = [self getDBPath];
            BOOL success = [fileManager fileExistsAtPath:dbPath];

            if (!success) {
                NSString *defaultDBPath = [[[NSBundle mainBundle] resourcePath]
                                              stringByAppendingPathComponent:@"DB.sqlite"];
                success = [fileManager copyItemAtPath:defaultDBPath toPath:dbPath error:&error];
                if (!success)
                    NSAssert1(0, @"Failed to create writable database file with message '%@'.",
                              [error localizedDescription]);
            }
        }

    I am very new to SQLite, so maybe I didn't create the database correctly in the Firefox SQLite Manager, or maybe I didn't "properly" copy the .sqlite file in? (I did check the target membership of the .sqlite file and it correctly has my project selected. Also, the .sqlite file names all match up perfectly.)
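
    Cocoa error 4 is NSFileNoSuchFileError: one of the two paths in the copy does not exist. The usual culprits are DB.sqlite not actually ending up inside the built bundle (check the Copy Bundle Resources build phase) or getDBPath pointing into a directory that was never created. A checked variant of the method, sketched with the Documents directory as an assumed destination:

        - (void)copyDatabaseIfNeeded {
            NSString *bundlePath = [[NSBundle mainBundle] pathForResource:@"DB" ofType:@"sqlite"];
            if (bundlePath == nil) {
                // The file never made it into the bundle: fix Copy Bundle Resources / target membership.
                NSLog(@"DB.sqlite is missing from the app bundle");
                return;
            }

            NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                     NSUserDomainMask, YES) firstObject];
            NSString *dbPath = [docsDir stringByAppendingPathComponent:@"DB.sqlite"];

            NSFileManager *fileManager = [NSFileManager defaultManager];
            NSError *error = nil;
            if (![fileManager fileExistsAtPath:dbPath] &&
                ![fileManager copyItemAtPath:bundlePath toPath:dbPath error:&error]) {
                NSLog(@"Copy failed: %@", error);   // the NSError says which path is the problem
            }
        }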

    Read the article

  • How to read from a database and write into a text file with C#?

    - by user147685
    How do I read from a database and write into a text file? I want to write/copy (not sure what to call it) the records inside my database into a text file, where one row in the database corresponds to one line in the text file. I'm having no problem on the database side. For creating the text file, I have seen both FileStream and StreamWriter mentioned. Which one should I use?
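
    StreamWriter is the natural fit here, since the output is text; FileStream deals in raw bytes, and you would end up wrapping it in a StreamWriter anyway. A sketch assuming SQL Server; the connection string, table and columns are placeholders.

        using System.Data.SqlClient;
        using System.IO;

        class DumpTable
        {
            static void Main()
            {
                using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
                using (var cmd = new SqlCommand("SELECT Id, Name, Price FROM Products", conn))
                using (var writer = new StreamWriter("products.txt"))
                {
                    conn.Open();
                    using (SqlDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // one database row -> one tab-separated line
                            writer.WriteLine("{0}\t{1}\t{2}", reader[0], reader[1], reader[2]);
                        }
                    }
                }
            }
        }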

    Read the article

  • Embarrassingly parallel workflow creates too many output files

    - by Hooked
    On a Linux cluster I run many (N > 10^6) independent computations. Each computation takes only a few minutes and the output is a handful of lines. When N was small I was able to store each result in a separate file to be parsed later. With large N however, I find that I am wasting storage space (for the file creation) and simple commands like ls require extra care due to internal limits of bash: -bash: /bin/ls: Argument list too long. Each computation is required to run through a qsub scheduling algorithm so I am unable to create a master program which simply aggregates the output data to a single file. The simple solution of appending to a single file fails when two programs finish at the same time and interleave their output. I have no admin access to the cluster, so installing a system-wide database is not an option. How can I collate the output data from embarrassingly parallel computation before it gets unmanageable?
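
    One low-tech option that needs no admin rights, sketched below: each job writes its handful of lines to a private temp file and then appends it to one shared results file under flock(1), so two jobs finishing at the same moment cannot interleave. run_one_computation and the paths are placeholders for whatever qsub actually launches.

        #!/bin/bash
        # Append this job's output to a shared file, serialized with an exclusive lock.
        OUT=/shared/results.txt      # aggregate results file (placeholder path)
        LOCK=/shared/results.lock    # lock file next to it

        TMP=$(mktemp)
        run_one_computation > "$TMP"         # placeholder for the real job

        (
            flock -x 200                     # block until we hold the exclusive lock
            cat "$TMP" >> "$OUT"             # append atomically with respect to other jobs
        ) 200>"$LOCK"

        rm -f "$TMP"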

    Read the article
