Search Results

Search found 3953 results on 159 pages for 'overlapped io'.

Page 21/159 | < Previous Page | 17 18 19 20 21 22 23 24 25 26 27 28  | Next Page >

  • Why does File::Slurp return a scalar when it should return a list?

    - by BrianH
    I am new to the File::Slurp module, and on my first test with it, it was not giving the results I was expecting. It took me a while to figure it out, so now I am interested in why I was seeing this behavior. My call to File::Slurp looked like this:

      my @array = read_file( $file ) || die "Cannot read $file\n";

    I included the "die" part because I am used to doing that when opening files. My @array would always end up with the entire contents of the file in the first element of the array. Finally I took out the "|| die" section, and it started working as I expected. Here is an example to illustrate:

      perl -de0
      Loading DB routines from perl5db.pl version 1.22
      Editor support available.
      Enter h or `h h' for help, or `man perldebug' for more help.
      main::(-e:1): 0
      DB<1> use File::Slurp
      DB<2> $file = '/usr/java6_64/copyright'
      DB<3> x @array1 = read_file( $file )
      0  'Licensed material - Property of IBM.'
      1  'IBM(R) SDK, Java(TM) Technology Edition, Version 6'
      2  'IBM(R) Runtime Environment, Java(TM) Technology Edition, Version 6'
      3  ''
      4  'Copyright Sun Microsystems Inc, 1992, 2008. All rights reserved.'
      5  'Copyright IBM Corporation, 1998, 2009. All rights reserved.'
      6  ''
      7  'The Apache Software License, Version 1.1 and Version 2.0'
      8  'Copyright 1999-2007 The Apache Software Foundation. All rights reserved.'
      9  ''
      10  'Other copyright acknowledgements can be found in the Notices file.'
      11  ''
      12  'The Java technology is owned and exclusively licensed by Sun Microsystems Inc.'
      13  'Java and all Java-based trademarks and logos are trademarks or registered'
      14  'trademarks of Sun Microsystems Inc. in the United States and other countries.'
      15  ''
      16  'US Govt Users Restricted Rights - Use duplication or disclosure'
      17  'restricted by GSA ADP Schedule Contract with IBM Corp.'
      DB<4> x @array2 = read_file( $file ) || die "Cannot read $file\n";
      0  'Licensed material - Property of IBM. IBM(R) SDK, Java(TM) Technology Edition, Version 6 IBM(R) Runtime Environment, Java(TM) Technology Edition, Version 6 Copyright Sun Microsystems Inc, 1992, 2008. All rights reserved. Copyright IBM Corporation, 1998, 2009. All rights reserved. The Apache Software License, Version 1.1 and Version 2.0 Copyright 1999-2007 The Apache Software Foundation. All rights reserved. Other copyright acknowledgements can be found in the Notices file. The Java technology is owned and exclusively licensed by Sun Microsystems Inc. Java and all Java-based trademarks and logos are trademarks or registered trademarks of Sun Microsystems Inc. in the United States and other countries. US Govt Users Restricted Rights - Use duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp. '

    Why does the || die make a difference? I have a feeling this might be more of a Perl precedence question than a File::Slurp question. I looked in the File::Slurp module and it looks like it is set to croak if there is a problem, so I guess the proper way to do it is to allow File::Slurp to croak for you. Now I'm just curious why I was seeing these differences.

    Read the article

  • Load binary file using fstream

    - by Kirill V. Lyadvinsky
    I'm trying to load a binary file using fstream in the following way:

      #include <iostream>
      #include <fstream>
      #include <iterator>
      #include <vector>
      using namespace std;

      int main()
      {
          basic_fstream<uint32_t> file( "somefile.dat", ios::in|ios::binary );
          vector<uint32_t> buffer;
          buffer.assign( istream_iterator<uint32_t, uint32_t>( file ),
                         istream_iterator<uint32_t, uint32_t>() );
          cout << buffer.size() << endl;
          return 0;
      }

    But it doesn't work. On Ubuntu it crashes with a std::bad_cast exception. In MSVC++ 2008 it just prints 0. I know that I could use file.read to load the file, but I want to use an iterator and operator>> to load parts of the file. Is that possible? Why doesn't the code above work?
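
    A note on the failure and a possible workaround (a sketch, not the asker's code): the std::bad_cast on Ubuntu is consistent with basic_fstream<uint32_t> needing a codecvt facet for uint32_t that the default locale does not provide. One way around it is to read the raw bytes through an ordinary char-based stream and reinterpret them as uint32_t afterwards; the file name below is just the one from the question.

      #include <cstdint>
      #include <cstring>
      #include <fstream>
      #include <iostream>
      #include <iterator>
      #include <vector>

      int main()
      {
          // Open with a plain (char-based) stream; streams of uint32_t need
          // traits and a codecvt facet the standard library does not supply.
          std::ifstream file("somefile.dat", std::ios::binary);
          if (!file)
              return 1;

          // Slurp the raw bytes.  istreambuf_iterator, unlike istream_iterator,
          // does not skip "whitespace" bytes, so binary data survives intact.
          std::vector<char> bytes((std::istreambuf_iterator<char>(file)),
                                  std::istreambuf_iterator<char>());

          // Reinterpret whole 4-byte groups as host-endian uint32_t values.
          std::vector<std::uint32_t> buffer(bytes.size() / sizeof(std::uint32_t));
          if (!buffer.empty())
              std::memcpy(buffer.data(), bytes.data(),
                          buffer.size() * sizeof(std::uint32_t));

          std::cout << buffer.size() << std::endl;
          return 0;
      }

    This gives up on extracting values with operator>>, so it is a fallback rather than a direct answer to the iterator part of the question.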

    Read the article

  • How can I redirect the output of Perl's system() to a filehandle?

    - by syker
    With the open command in Perl, you can use a filehandle. However, I have trouble getting back the exit code with the open command. With the system command in Perl, I can get back the exit code of the program I'm running, but I want to redirect just the STDOUT to some filehandle (not STDERR). My STDOUT is going to be line-by-line output of key-value pairs that I want to insert into a map in Perl. That is why I want to redirect only the STDOUT of my Java program in Perl. Is that possible? Note: if I get errors, they get printed to STDERR. One possibility is to check whether anything gets printed to STDERR so that I can quit the Perl script.

    Read the article

  • \n not working in my fwrite()

    - by brett
    Not sure what could be the problem. I'm dumping data from an array $theArray into theFile.txt, each array item on a separate line.

      $file = fopen("theFile.txt", "w");
      foreach ($theArray as $arrayItem) {
          fwrite($file, $arrayItem . '\n');
      }
      fclose($file);

    The problem is that when I open theFile.txt, I see the \n being output literally. Also, if I try to programmatically read the file line by line (just in case the lines are there), it shows them as one line, meaning the \n really are not having their desired effect.

    Read the article

  • question about fgets

    - by user105033
    Is this safe to do? (Does fgets terminate the buffer with a null?) Or should I be setting the 20th byte to null after the call to fgets, before I call clean?

      // strip new lines
      void clean(char *data) {
          while (*data) {
              if (*data == '\n' || *data == '\r')
                  *data = '\0';
              data++;
          }
      }

      // for this, assume that the file contains 1 line no longer than 19 bytes
      // buffer is freed elsewhere
      char *load_latest_info(char *file) {
          FILE *f;
          char *buffer = (char*) malloc(20);
          if (f = fopen(file, "r"))
              if (fgets(buffer, 20, f)) {
                  clean(buffer);
                  return buffer;
              }
          free(buffer);
          return NULL;
      }
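
    For what it's worth, fgets does null-terminate: on success it reads at most size - 1 characters and always appends '\0', so no manual termination is needed as long as the call succeeded (on failure the buffer contents are indeterminate). A minimal sketch of that guarantee, using a hypothetical file name:

      #include <cstdio>
      #include <cstring>

      int main()
      {
          // "latest.txt" is a hypothetical file name used only for illustration.
          FILE *f = std::fopen("latest.txt", "r");
          if (!f)
              return 1;

          char buffer[20];
          // fgets reads at most sizeof(buffer) - 1 characters and writes a
          // terminating '\0' whenever it returns non-NULL, so strlen is safe here.
          if (std::fgets(buffer, sizeof(buffer), f))
              std::printf("read %zu characters\n", std::strlen(buffer));

          std::fclose(f);
          return 0;
      }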

    Read the article

  • Keeping the UI responsive while parsing a very large logfile

    - by Carlos
    I'm writing an app that parses a very large logfile, so that the user can see the contents in a treeview format. I've used a BackgroundWorker to read the file, and as it parses each message, I use a BeginInvoke to get the GUI thread to add a node to my treeview. Unfortunately, there are two issues:

    1. The treeview is unresponsive to clicks or scrolls while the file is being parsed. I would like users to be able to examine (i.e. expand) nodes while the file is parsing, so that they don't have to wait for the whole file to finish parsing.
    2. The treeview flickers each time a new node is added.

    Here's the code inside the form:

      private void btnChangeDir_Click(object sender, EventArgs e)
      {
          OpenFileDialog browser = new OpenFileDialog();
          if (browser.ShowDialog() == DialogResult.OK)
          {
              tbSearchDir.Text = browser.FileName;
              BackgroundWorker bgw = new BackgroundWorker();
              bgw.DoWork += (ob, evArgs) => ParseFile(tbSearchDir.Text);
              bgw.RunWorkerAsync();
          }
      }

      private void ParseFile(string inputfile)
      {
          FileStream logFileStream = new FileStream(inputfile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
          StreamReader LogsFile = new StreamReader(logFileStream);
          while (!LogsFile.EndOfStream)
          {
              string Msgtxt = LogsFile.ReadLine();
              Message msg = new Message(Msgtxt.Substring(26)); // reads the text into a class with appropriate members
              AddTreeViewNode(msg);
          }
      }

      private void AddTreeViewNode(Message msg)
      {
          TreeNode newNode = new TreeNode(msg.SeqNum);
          BeginInvoke(new Action(() =>
          {
              treeView1.BeginUpdate();
              treeView1.Nodes.Add(newNode);
              treeView1.EndUpdate();
              Refresh();
          }));
      }

    What needs to be changed?

    Read the article

  • Ruby: Is there a better way to iterate over multiple (big) files?

    - by zxcvbnm
    Here's what I'm doing (sorry for the variable names, I'm not using those in my code):

      File.open("out_file_1.txt", "w") do |out_1|
        File.open("out_file_2.txt", "w") do |out_2|
          File.open_and_process("in_file_1.txt", "r") do |in_1|
            File.open_and_process("in_file_2.txt", "r") do |in_2|
              while line_1 = in_1.gets do
                line_2 = in_2.gets # input files have the same number of lines
                # process data and output to files
              end
            end
          end
        end
      end

    The open_and_process method is just to open the file and close it once it's done. It's taken from the pickaxe book. Anyway, the main problem is that the code is nested too deeply. I can't load all the files' contents into memory, so I have to iterate line by line. Is there a better way to do this? Or at least prettify it?

    Read the article

  • How do I find out what process ID and thread ID / name has a file open

    - by peter
    Hi All, I am using C# in an application and am having some problems with a file becoming locked. The piece of code does this:

      while (true) {
          Read a packet from a socket (with data in it to add to the file)
          Open a file
          Write data to it
          Close the file
      }

    But in the process the file becomes locked. I don't really understand how; we are definitely catching and reporting exceptions, so I don't see how the file doesn't get closed every time. My best guess is that something else is opening the file, but I want to prove it. Can someone please provide a piece of code to check whether the file is open and, if so, report what process ID and thread ID have the file open? For example, if I had this code:

      StreamWriter streamWriter1 = new StreamWriter(@"c:\logs\test.txt");
      streamWriter1.WriteLine("Test");
      // code to check for locks??
      StreamWriter streamWriter2 = new StreamWriter(@"c:\logs\test.txt");
      streamWriter1.Close();
      streamWriter2.Close();

    That will throw an exception because the file is locked when we try to open it the second time. So where the comment is, what could I put in there to report that the current app (process ID) and the current thread (thread ID) have the file locked? Thanks.
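
    On Windows, one way to find which processes have a file open is the Restart Manager API; the question is about C#, but here is a rough native C++ sketch of that idea (an illustration, not a drop-in answer: Restart Manager reports process IDs and application names, not thread IDs, and the same API can also be reached from C# via P/Invoke):

      #include <windows.h>
      #include <RestartManager.h>
      #include <iostream>
      #pragma comment(lib, "Rstrtmgr.lib")

      int main()
      {
          DWORD session = 0;
          WCHAR sessionKey[CCH_RM_SESSION_KEY + 1] = {};
          if (RmStartSession(&session, 0, sessionKey) != ERROR_SUCCESS)
              return 1;

          // Ask the Restart Manager which processes are holding this file.
          LPCWSTR files[] = { L"c:\\logs\\test.txt" };
          RmRegisterResources(session, 1, files, 0, NULL, 0, NULL);

          UINT needed = 0, count = 16;
          RM_PROCESS_INFO info[16];
          DWORD reasons = 0;
          if (RmGetList(session, &needed, &count, info, &reasons) == ERROR_SUCCESS)
          {
              for (UINT i = 0; i < count; ++i)
                  std::wcout << L"pid " << info[i].Process.dwProcessId
                             << L": " << info[i].strAppName << L"\n";
          }

          RmEndSession(session);
          return 0;
      }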

    Read the article

  • Not able to open a file in php

    - by ehsanul
    The following code works when invoked from the command line with php -f test.php, as root. It does not work, though, when invoked via Apache when loading the PHP page. The code chokes at fopen() and the resulting web page just says "can't open file".

      <?php
      $fp = fopen("/path/to/some_file.txt", "a") or die("can't open file");
      fwrite($fp, "some text");
      fclose($fp);
      ?>

    I tried to play with the file permissions, but to no avail. I changed the user/group with chown apache:apache test.php and changed permissions with chmod 755 test.php. Here is the relevant result of ls -l /path/to/some_file.txt:

      -rwxr-xr-x 1 apache apache 0 Apr 12 04:16 some_file.txt

    Read the article

  • uploading multiple files from client to server with asp.net

    - by Maestro1024
    I have been looking at the ASP.NET upload control, but that is for one file (unless someone knows a better way to do it): http://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.fileupload.aspx For what I want to do I don't even really need a browse dialog; I know the files on the client are at a certain location. Is it possible to create a collection of HttpPostedFile objects and upload those? http://msdn.microsoft.com/en-us/library/system.web.httppostedfile.aspx I don't think it is possible, but I would be glad to be proven wrong. Is there a different ASP.NET method or control that will easily allow uploading multiple files from client to server?

    Read the article

  • Python unicode problem

    - by Somebody still uses you MS-DOS
    I'm receiving some data from a ZODB (Zope Object Database). I receive a mybrains object. Then I do:

      o = mybrains.getObject()

    and I receive a "Person" object in my project. Then I can do b = o.name, and doing print b in my class I get:

      José Carlos

    and print b.name.__class__ gives:

      <type 'unicode'>

    I have a lot of "Person" objects. They are added to a list:

      names = [o.nome, o1.nome, o2.nome]

    Then I try to create a text file with this data:

      delimiter = ';'
      all = delimiter.join(names) + '\n'

    No problem. Now, when I do a print all I have:

      José Carlos;Jonas;Natália Juan;John

    But when I try to create a file with it:

      f = open("/tmp/test.txt", "w")
      f.write(all)

    I get an error like this (the positions aren't exactly the same, since I changed the names):

      UnicodeEncodeError: 'ascii' codec can't encode character u'\xe9' in position 84: ordinal not in range(128)

    If I can already print it in the "correct" form for display, why can't I write a file with it? Which encode/decode method should I use to write a file with this data? I'm using Python 2.4.5 (can't upgrade it).

    Read the article

  • EDIT Control Showing Squares Instead Of Returns

    - by Nathan Campos
    I'm playing a little bit with PocketC by writing a simple text editor, with this code to read and display the contents of the file in the EDIT control:

      int filehandle;
      int file_len;
      string file_mode;

      initComponents() {
          createctrl("EDIT", "test", 2, 1, 0, 24, 70, 25, TEXTBOX);
          wndshow(TEXTBOX, SW_SHOW);
          guigetfocus();
      }

      main() {
          filehandle = fileopen(OpenFileDlg("Plain Text Files (*.txt)|*.txt; All Files (*.*)|*.*"), 0, FILE_READWRITE);
          file_len = filegetlen(filehandle);
          if(filehandle == -1) {
              MessageBox("File Could Not Be Found!", "Error", 3, 1);
          }
          initComponents();
          editset(TEXTBOX, fileread(filehandle, file_len));
      }

    It's all OK, but my test file now has line breaks ("returns") in it: Hello, World! PocketC Test Of My Editor. When I open this file in the editor, instead of the line breaks I just see two squares (meaning it's an unknown character for that control), but if I change the control to a STATIC, it shows the line breaks fine; the problem is I can't edit the text if I use a STATIC. What do I need to do to display the line breaks instead of those squares?

    Read the article

  • Opening a file from a pack URI in WPF

    - by cptmorgan
    Hi All, I am looking to open a .csv file from the application pack to do some unit testing. So what I would really love is some analog to File.ReadAllText(string path) which is instead X.ReadAllText(Uri uri). I haven't as yet been able to find this. Does anyone know if it is possible to read text / bytes (don't mind which) from a file in the pack without compiling this file to disk first? Oh and btw, File.ReadAllText(@"pack://application:,,,/SpreadSheetEngine/Tests/Example.csv") didn't work for me.. Thanks in advance.. Gav

    Read the article

  • How do I take advantage of Android's "Clear Cache" button

    - by Jay Askren
    In Android's settings, in the "Manage Applications" activity when clicking on an app, the data is broken down into Application, Data, and Cache. There is also a button to clear the cache. My app caches audio files and I would like the user to be able to clear the cache using this button. How do I store them so they get lumped in with the cache and the user can clear them? I've tried storing files using both of the following techniques:

      newFile = File.createTempFile("mcb", ".mp3", context.getCacheDir());

      newFile = new File(context.getCacheDir(), "mcb.mp3");
      newFile.createNewFile();

    In both cases, these files are listed as Data and not Cache.

    Read the article

  • Import and Export for CSV are both broken in Mathematica

    - by dreeves
    Consider the following 2 by 2 array:

      x = {{"a b c", "1,2,3"}, {"i \"comma-heart\" you", "i \",heart\" u, too"}}

    If we Export that to CSV and then Import it again we don't get the same thing back:

      Import[Export["tmp.csv", x]]

    Looking at tmp.csv it's clear that the Export didn't work, since the quotes are not escaped properly. According to the RFC, which I presume is summarized correctly in Wikipedia's entry on CSV, the right way to export the above array is as follows:

      a b c, "1,2,3"
      "i ""heart"" you", "i "",heart"" u, too"

    Importing the above does not yield the original array either, so Import is broken as well. I've reported these bugs to [email protected] but I'm wondering if others have workarounds in the meantime. One workaround is to just use TSV instead of CSV. I tested the above with TSV and it seems to work (even with tabs embedded in the entries of the array).

    Read the article

  • Getting Error When Opening Files

    - by Nathan Campos
    I'm developing a simple text editor to better understand the PocketC language, and I've done this:

      #include "\\Storage Card\\My Documents\\PocketC\\Parrot\\defines.pc"

      int filehandle;
      int file_len;
      string file_mode;

      initComponents() {
          createctrl("EDIT", "test", 2, 1, 0, 24, 70, 25, TEXTBOX);
          wndshow(TEXTBOX, SW_SHOW);
          guigetfocus();
      }

      main() {
          filehandle = fileopen(OpenFileDlg("Plain Text Files (*.txt)|*.txt; All Files (*.*)|*.*"), 0, FILE_READWRITE);
          file_len = filegetlen(filehandle);
          if(filehandle = -1) {
              MessageBox("File Could Not Be Found!", "Error", 3, 1);
          }
          initComponents();
          editset(TEXTBOX, fileread(filehandle, file_len));
      }

    Then I tried to run the application: it opens the Open File dialog, I select a file (at \test.txt) that I created with Notepad, and then I get my MessageBox saying that the file wasn't found. So I want to know why I'm getting this if the file is all correct. PS: when I dismiss the MessageBox, I see that the TextBox displays where the file is (I've tested with many other files, and with all of them I got the error and this behavior).

    Read the article

  • Check to see if file transfer is complete

    - by Cymon
    We have a daily job that processes files delivered from an external source. The process usually runs fine without any issues but every once in a while we have an issue of attempting to process a file that is not completely transferred. The external source SCPs these files from a UNIX server to our Windows server. From there we try to process the files. Is there a way to check to see if a file is still being transferred? Does UNIX put a lock on a file while SCPing it that we could check on the Windows side?
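
    Not from the question, but a common heuristic for this situation is to treat a delivered file as complete only once its size has stopped changing for a few polls (or to have the sender upload to a temporary name and rename it when the transfer finishes). A minimal C++17 sketch of the size-stability check; the file name and timing values are made up for illustration:

      #include <chrono>
      #include <cstdint>
      #include <filesystem>
      #include <iostream>
      #include <thread>

      namespace fs = std::filesystem;

      // Returns true once the file's size has been unchanged for
      // `stableChecks` consecutive polls spaced `interval` apart.
      bool looksComplete(const fs::path& p,
                         int stableChecks = 3,
                         std::chrono::seconds interval = std::chrono::seconds(5))
      {
          std::uintmax_t last = fs::file_size(p);
          for (int i = 0; i < stableChecks; ++i) {
              std::this_thread::sleep_for(interval);
              std::uintmax_t now = fs::file_size(p);
              if (now != last)
                  return false;   // still growing, come back later
              last = now;
          }
          return true;
      }

      int main()
      {
          // "incoming.dat" is a hypothetical delivered file.
          if (looksComplete("incoming.dat"))
              std::cout << "size is stable, safe to process\n";
      }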

    Read the article

  • synchronized block in JSP tag class

    - by Sudhakar
    Hi, I have been trying to find an answer to the following for the past couple of days, but couldn't find a comprehensive one. Problem statement: I have a custom JSP tag class which handles a web form submission, captures data, and writes it to the same file in the filesystem. As with all web applications, this can be triggered simultaneously, and I fear that multiple threads would be in action handling each of the submissions (we all know that's how servlets work). Code:

      synchronized (this) {
          final String reportFileName = "testReport.csv";
          File reportDir = new File( rootCsDirectory, "reports" );
          if (!reportDir.isDirectory()) reportDir.mkdir();
          File reportFile = new File (reportDir, reportFileName);
          logReport(reportFile, reportContent.toString());
      }

    Issue: a File object can be opened by one thread for writing, and at the same time another thread might try to access it, fail, and throw an exception. So I thought synchronizing (on the object) should solve the issue, but I read somewhere that the JSP engine keeps a pool of JSP tag objects, so I am afraid that synchronized (this) won't work and that it should be changed to synchronized (this.getClass()).

    Read the article

  • Execute an Application On The Server Using PHP

    - by Nathan Campos
    I have an application on my server called leaf.exe that takes two arguments, inputfile and outputfile, like this example:

      pnote.exe input.pnt output.txt

    The executable is at exec/, the input file is at upload/, and the output file goes to compiled/. I need PHP to run the application like that, so I want to know: How could I do this? How could I echo the output of the program?

    Read the article

  • What should be the ideal number of parallel Java threads for copying a large set of files from a quad core Linux box to an external shared folder?

    - by ukgenie
    What should be the ideal number of parallel Java threads for copying a large set of files from a quad core Linux box to an external shared folder? I can see that with a single thread it takes a hell of a lot of time to move the files one by one. Multiple threads improve the copy performance, but I don't know what the exact number of threads should be. I am using the Java ExecutorService to create the thread pool.

    Read the article

  • Spaces while using "Print" in VBA

    - by Josh
    For some reason I am getting a lot of spaces in front of each value while trying to print to a flat text file.

      'append headers
      Cells(start_row - 2, 1).Select
      For i = 1 To ActiveCell.SpecialCells(xlLastCell).Column
          If ActiveCell.Offset(0, 1).Column = ActiveCell.SpecialCells(xlLastCell).Column Then
              Print #finalCSV, Cells(start_row - 2, i) & "\n",
          Else
              Print #finalCSV, Cells(start_row - 2, i) & ",",
          End If
      Next i

    Example output:

      DC Capacity:hi, Resistive Capacity:lo, Resistive Capacity:hi, Reactive Capacity:lo,

    Is there any way to get rid of these spaces?

    Read the article

  • How do I write to a file and print to a terminal concurrently in Unix?

    - by bias
    I have a little bash function to log my MacPorts output to a file (since installs often spew little tidbits that are easy to lose in terminal noise); then I just cat the file to the terminal:

      function porti {
          command sudo port install $@ >> $1.log 2>&1; cat $1.log
      }

    Is there a way to do this concurrently? I don't care about it being in Bash, that's just how I started it. BTW I pass $@ to install but only $1 for the file name, so that I can do something like:

      porti git-core +bash_completion

    and only get the file git-core.log; however, someone else might prefer to include variants in the file name ...

    Read the article

  • aio_read from file error on OS X

    - by Pyetras
    The following code:

      #include <fcntl.h>
      #include <unistd.h>
      #include <stdio.h>
      #include <aio.h>
      #include <errno.h>

      int main (int argc, char const *argv[])
      {
          char name[] = "abc";
          int fdes;
          if ((fdes = open(name, O_RDWR | O_CREAT, 0600 )) < 0)
              printf("%d, create file", errno);
          int buffer[] = {0, 1, 2, 3, 4, 5};
          if (write(fdes, &buffer, sizeof(buffer)) == 0){
              printf("writerr\n");
          }
          struct aiocb aio;
          int n = 2;
          while (n--){
              aio.aio_reqprio = 0;
              aio.aio_fildes = fdes;
              aio.aio_offset = sizeof(int);
              aio.aio_sigevent.sigev_notify = SIGEV_NONE;
              int buffer2;
              aio.aio_buf = &buffer2;
              aio.aio_nbytes = sizeof(buffer2);
              if (aio_read(&aio) != 0){
                  printf("%d, readerr\n", errno);
              }else{
                  const struct aiocb *aio_l[] = {&aio};
                  if (aio_suspend(aio_l, 1, 0) != 0){
                      printf("%d, suspenderr\n", errno);
                  }else{
                      printf("%d\n", *(int *)aio.aio_buf);
                  }
              }
          }
          return 0;
      }

    works fine on Linux (Ubuntu 9.10, compiled with -lrt), printing:

      1
      1

    but fails on OS X (10.6.6 and 10.6.5, I've tested it on two machines):

      1
      35, readerr

    Is it possible that this is due to some library error on OS X, or am I doing something wrong?
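
    One observation (not from the question): errno 35 on OS X is EAGAIN, and the aiocb above is a stack variable with only some fields set, so the remaining fields hold garbage between iterations. Whether that is the actual cause here is an assumption, but zeroing the control block before each request is the usual first thing to try. A minimal sketch, assuming the file "abc" already contains the integers written by the program above:

      #include <aio.h>
      #include <cerrno>
      #include <cstdio>
      #include <cstring>
      #include <fcntl.h>
      #include <unistd.h>

      int main()
      {
          int fdes = open("abc", O_RDWR);
          if (fdes < 0)
              return 1;

          int value = 0;
          struct aiocb aio;
          std::memset(&aio, 0, sizeof(aio));      // clear every field, not just a few
          aio.aio_fildes = fdes;
          aio.aio_offset = sizeof(int);           // read the second integer
          aio.aio_buf = &value;
          aio.aio_nbytes = sizeof(value);
          aio.aio_sigevent.sigev_notify = SIGEV_NONE;

          if (aio_read(&aio) != 0) {
              std::printf("aio_read failed: %s\n", std::strerror(errno));
              close(fdes);
              return 1;
          }

          const struct aiocb *list[] = { &aio };
          if (aio_suspend(list, 1, NULL) == 0 && aio_return(&aio) > 0)
              std::printf("%d\n", value);

          close(fdes);
          return 0;
      }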

    Read the article

  • CFBundleDocumentTypes & UIFileSharingEnabled issues

    - by carloe
    Has anyone gotten UIFileSharingEnabled or CFBundleDocumentTypes to work? I added UIFileSharingEnabled as true to my plist and used Apple's example from the link below for CFBundleDocumentTypes, but can't seem to get it to work. I don't see my app under file sharing in iTunes, and I do not get the option to open documents I registered in my app when I click on them in the mail.app http://developer.apple.com/iphone/library/documentation/General/Conceptual/iPadProgrammingGuide/CoreApplication/CoreApplication.html

    Read the article

  • Problem with reading file line-by-line

    - by Maulrus
    I'm trying to complete an exercise to write a program that takes the following command line arguments: an input file, an output file, and an unspecified number of words. The program is to read the contents of the input file line by line, find for each given word which lines contain it, and print those lines with their line numbers to the output file. Here's my code:

      #include <iostream>
      #include <fstream>
      #include <string>
      #include <sstream>
      using namespace std;

      int main(int argc, char* argv[]) {
          if (argc < 4) {
              cerr << "Error #1: not enough arguments provided\n";
              return 1;
          }
          ifstream in(argv[1]);
          if (!in.is_open()) {
              cerr << "Error #2: input file could not be opened\n";
              return 2;
          }
          ofstream out(argv[2]);
          if (!out.is_open()) {
              cerr << "Error #3: output file could not be opened\n";
              return 3;
          }
          ostringstream oss;
          for (int i = 3; i < argc; ++i) {
              int k = 0;
              string temp;
              oss << argv[i] << ":\n\n";
              while (getline(in, temp)) {
                  ++k;
                  unsigned x = temp.find(argv[i]);
                  if (x != string::npos)
                      oss << "Line #" << k << ": " << temp << endl;
              }
          }
          string copy = oss.str();
          out << copy;
          in.close();
          out.close();
          return 0;
      }

    If I try to run that, I get the predicted output for the first word given, but any words following it aren't found. For example, running it on the source code above gives the following output:

      in:

      Line #1: #include <iostream>
      Line #2: #include <fstream>
      Line #3: #include <string>
      Line #4: #include <sstream>
      Line #5: using namespace std;
      Line #7: int main(int argc, char* argv[]) {
      Line #12: ifstream in(argv[1]);
      Line #13: if (!in.is_open()) {
      Line #14: cerr << "Error #2: input file could not be opened\n";
      Line #22: ostringstream oss;
      Line #23: string temp;
      Line #24: for (int i = 3; i < argc; ++i) {
      Line #26: int k = 0;
      Line #28: while (getline(in, temp)) {
      Line #30: unsigned x = temp.find(argv[i]);
      Line #31: if (x != string::npos)
      Line #32: oss << "Line #" << k << ": " << temp << endl;
      Line #35: string copy = oss.str();
      Line #37: in.close();

      out:

    That is, it'll find all the instances of the first word given but not any following. What am I doing wrong here?
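
    A likely explanation (an observation, not part of the question): the inner getline loop reads the input stream to end-of-file while handling the first word, and the stream is never rewound, so every later word searches an already-exhausted stream. A sketch of one way to restructure it — read the lines into memory once, then search them for each word — assuming the input file fits in memory (otherwise in.clear() followed by in.seekg(0) between words achieves the same thing):

      #include <fstream>
      #include <iostream>
      #include <string>
      #include <vector>

      int main(int argc, char* argv[]) {
          if (argc < 4) {
              std::cerr << "usage: prog <in> <out> <word>...\n";
              return 1;
          }
          std::ifstream in(argv[1]);
          std::ofstream out(argv[2]);
          if (!in || !out) {
              std::cerr << "could not open input or output file\n";
              return 2;
          }

          // Read every line once, so each word can be searched without
          // having to rewind the input stream.
          std::vector<std::string> lines;
          for (std::string line; std::getline(in, line); )
              lines.push_back(line);

          for (int i = 3; i < argc; ++i) {
              out << argv[i] << ":\n\n";
              for (std::size_t k = 0; k < lines.size(); ++k) {
                  if (lines[k].find(argv[i]) != std::string::npos)
                      out << "Line #" << k + 1 << ": " << lines[k] << '\n';
              }
              out << '\n';
          }
          return 0;
      }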

    Read the article
