Search Results

Search found 77950 results on 3118 pages for 'large file upload'.

  • Writing to a file in Python inserts null bytes

    - by Javier Badia
    I'm writing a todo list program. It keeps a file with one item to do per line, and lets the user add or delete items. The problem is that for some reason, I end up with a lot of zero bytes at the start of the file, even though the item is correctly deleted. Two screenshots (not reproduced here) showed the file in Notepad++ before running the program and after deleting item 3 (counting from 1). This is the relevant code; the actual program is bigger, but running just this part triggers the error:

        import os

        TODO_FILE = r"E:\javi\code\Python\todo-list\src\todo.txt"

        def del_elems(f, delete):
            """Takes an open file and either a number or a list of numbers,
            and deletes the lines corresponding to those numbers (counting from 1)."""
            if isinstance(delete, int):
                delete = [delete]
            lines = f.readlines()
            f.truncate(0)
            counter = 1
            for line in lines:
                if counter not in delete:
                    f.write(line)
                counter += 1

        f = open(TODO_FILE, "r+")
        del_elems(f, 3)
        f.close()

    Could you please point out where the mistake is?
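
    A probable cause, offered as a hedged sketch: file.truncate(0) cuts the file to zero bytes but does not move the file position, which readlines() left at the old end of file; the next write therefore lands at that old offset, and the gap before it is filled with null bytes. Seeking to the start before truncating avoids this:

        def del_elems(f, delete):
            if isinstance(delete, int):
                delete = [delete]
            lines = f.readlines()
            f.seek(0)      # rewind first: truncate() alone leaves the position at the old EOF
            f.truncate()
            for number, line in enumerate(lines, start=1):
                if number not in delete:
                    f.write(line)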

  • Determine which process (b)locks a file, programmatically (under Windows >= XP)

    - by fred-hh
    How can a process P programmatically determine which other process P' holds a lock on a file, preventing P from recreating that file? I know there are tools that do this, but how do they achieve it? (Context: a batch program that runs overnight fails because of a locked file. Running an admin tool the next day may be too late to get useful information, so it would be nice if the batch program itself could determine the culprit.) EDIT: Added complexity: the file resides on a DFS and P' might not run on the same machine as P (but maybe does). Still, a solution that works locally would be a good beginning.
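
    One way such tools work, sketched here with the third-party psutil package (an assumption, not the Windows-native route; on Vista and later the documented API for this is the Restart Manager): enumerate processes and scan their open file handles. This needs sufficient privileges, misses some lock types, and only sees the local machine, so it cannot identify a P' running on another DFS client. The path below is hypothetical:

        import psutil  # third-party: pip install psutil

        def find_lockers(path):
            """Best-effort scan for local processes holding `path` open."""
            lockers = []
            for proc in psutil.process_iter(['pid', 'name']):
                try:
                    for f in proc.open_files():
                        if f.path.lower() == path.lower():
                            lockers.append((proc.info['pid'], proc.info['name']))
                except (psutil.AccessDenied, psutil.NoSuchProcess):
                    continue  # skip processes we are not allowed to inspect
            return lockers

        print(find_lockers(r"C:\batch\output\nightly.dat"))  # hypothetical path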

  • Failure remediation strategy for File I/O

    - by Brett
    I'm doing buffered I/O into a file, both read and write, using fopen(), fseeko(), and the other standard ANSI C file I/O functions. In all cases, I'm writing to a standard local file on a disk. How often do these file I/O operations fail, and what should the strategy be for failures? I'm not exactly looking for stats, but for a general-purpose statement on how far I should go to handle error conditions. For instance, everyone recognizes that malloc() could and probably will fail someday on some user's machine, and that the developer should check for a NULL return, but there is no great remediation strategy since it probably means the system is out of memory. At least, this seems to be the approach taken with malloc() on desktop systems; embedded systems are different. Likewise, is it worth reattempting a file I/O operation, or should I just consider a failure to be basically unrecoverable? I would appreciate some code samples demonstrating proper usage, or a library guide reference that indicates how this is to be handled. Any other data is, of course, welcome.
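
    A sketch of one general-purpose policy (in Python rather than the ANSI C the question uses, purely for brevity; the policy itself is language-independent): check every I/O call, retry a small set of transient errors once or twice, and treat everything else like malloc() returning NULL, i.e. report and stop. The file name is hypothetical:

        import errno
        import time

        TRANSIENT = {errno.EINTR, errno.EAGAIN, errno.EBUSY}

        def append_with_retry(path, data, attempts=3, delay=0.1):
            """Retry transient errors briefly; let hard failures (ENOSPC, EIO, ...) propagate."""
            for attempt in range(attempts):
                try:
                    with open(path, "ab") as f:
                        f.write(data)
                    return
                except OSError as e:
                    if e.errno in TRANSIENT and attempt < attempts - 1:
                        time.sleep(delay)  # brief backoff before retrying
                        continue
                    raise  # basically unrecoverable: surface it to the caller

        append_with_retry("journal.log", b"record\n")  # hypothetical file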

  • removing a line from a text file?

    - by Blackbinary
    Hi all. I am working with a text file which contains a list of processes under my program's control, along with relevant data. At some point, one of the processes will finish, and thus will need to be removed from the file (as it's no longer under control). Here is a sample of the file contents (entries are added "randomly"):

        PID=25729 IDLE=0.200000 BUSY=0.300000 USER=-10.000000
        PID=26416 IDLE=0.100000 BUSY=0.800000 USER=-20.000000
        PID=26522 IDLE=0.400000 BUSY=0.700000 USER=-30.000000

    So, for example, how could I remove the line that says PID=26416, without writing the whole file over again? I can use external unix commands, but I am not very familiar with them, so if that is your suggestion, please give an example. Thanks!
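
    A hedged sketch of the usual answer: a plain text file cannot have a middle line deleted in place, because the bytes after it would have to shift, so every practical approach rewrites the file. The unix one-liner sed -i '/^PID=26416 /d' file does exactly that behind the scenes. The same rewrite in Python (the file name is hypothetical):

        def remove_pid(path, pid):
            """Rewrite `path` without the line for `pid` (rewriting is unavoidable)."""
            prefix = "PID=%d " % pid
            with open(path) as f:
                kept = [line for line in f if not line.startswith(prefix)]
            with open(path, "w") as f:
                f.writelines(kept)

        remove_pid("processes.txt", 26416)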

  • c source code to remove subset transactions from text file

    - by user324887
    I have a file containing data as follows:

        10 20 30 40 70
        20 30
        70 30 40
        10 20
        29 70 80 90
        20 30 40
        40 45 65
        10 20 80
        45 65 20

    I want to remove all subset transactions from this file. The output file should be as follows:

        10 20 30 40 70
        29 70 80 90
        20 30 40
        40 45 65
        10 20 80

    where records like 20 30, 70 30 40, 10 20 and 45 65 20 are removed because they are subsets of other records.
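
    The question asks for C source; as a hedged illustration of the algorithm only, here is the set-based approach in Python: keep a record unless some other record's item set strictly contains it (keeping the first of any exact duplicates). Note that applied literally, this also drops 20 30 40, which is a subset of the first record, even though the sample output above retains it. The input file name is hypothetical:

        def remove_subsets(records):
            """records: list of lists of ints; drop any record contained in another."""
            sets = [set(r) for r in records]
            kept = []
            for i, s in enumerate(sets):
                dominated = any(s < t or (s == t and j < i)
                                for j, t in enumerate(sets) if j != i)
                if not dominated:
                    kept.append(records[i])
            return kept

        with open("transactions.txt") as f:
            records = [[int(x) for x in line.split()] for line in f if line.strip()]
        for r in remove_subsets(records):
            print(" ".join(map(str, r)))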

  • android ftp upload has stopped error

    - by Goxel Arp
        class Asenkron extends AsyncTask<String,Integer,Long> {
            @Override
            protected Long doInBackground(String... aurl) {
                FTPClient con = null;
                try {
                    con = new FTPClient();
                    con.connect(aurl[0]);
                    if (con.login(aurl[1], aurl[2])) {
                        con.enterLocalPassiveMode(); // important!
                        con.setFileType(FTP.BINARY_FILE_TYPE);
                        FileInputStream in = new FileInputStream(new File(aurl[3]));
                        boolean result = con.storeFile(aurl[3], in);
                        in.close();
                        con.logout();
                        con.disconnect();
                    }
                } catch (Exception e) {
                    Toast.makeText(getApplicationContext(), e.toString(), Toast.LENGTH_LONG).show();
                }
                return null;
            }

            protected void onPostExecute(String result) {}
        }

    I am using this class as below: there is a button, and whenever I click it the FTP upload should start in the background, but I get a "program has stopped unfortunately" error. Assume that the FTP address, username, password, and file path are correct, and that I have already requested the internet and network permissions.

        button1.setOnClickListener(new OnClickListener() {
            public void onClick(View arg0) {
                new Asenkron().execute("ftpaddress", "username", "pass", "pathfile on telephone");
            }
        });

    And here is the logcat for you to analyse the potential error:

        10-13 13:01:25.591: I/dalvikvm(633): threadid=3: reacting to signal 3
        10-13 13:01:25.711: I/dalvikvm(633): Wrote stack traces to '/data/anr/traces.txt'
        10-13 13:01:25.921: D/gralloc_goldfish(633): Emulator without GPU emulation detected.
        10-13 13:01:31.441: W/dalvikvm(633): threadid=11: thread exiting with uncaught exception (group=0x409c01f8)
        10-13 13:01:31.461: E/AndroidRuntime(633): FATAL EXCEPTION: AsyncTask #1
        10-13 13:01:31.461: E/AndroidRuntime(633): java.lang.RuntimeException: An error occured while executing doInBackground()
        10-13 13:01:31.461: E/AndroidRuntime(633): at android.os.AsyncTask$3.done(AsyncTask.java:278)
        10-13 13:01:31.461: E/AndroidRuntime(633): at java.util.concurrent.FutureTask$Sync.innerSetException(FutureTask.java:273)
        10-13 13:01:31.461: E/AndroidRuntime(633): at java.util.concurrent.FutureTask.setException(FutureTask.java:124)
        10-13 13:01:31.461: E/AndroidRuntime(633): at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:307)
        10-13 13:01:31.461: E/AndroidRuntime(633): at java.util.concurrent.FutureTask.run(FutureTask.java:137)
        10-13 13:01:31.461: E/AndroidRuntime(633): at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:208)
        10-13 13:01:31.461: E/AndroidRuntime(633): at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
        10-13 13:01:31.461: E/AndroidRuntime(633): at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
        10-13 13:01:31.461: E/AndroidRuntime(633): at java.lang.Thread.run(Thread.java:856)
        10-13 13:01:31.461: E/AndroidRuntime(633): Caused by: java.lang.RuntimeException: Can't create handler inside thread that has not called Looper.prepare()
        10-13 13:01:31.461: E/AndroidRuntime(633): at android.os.Handler.<init>(Handler.java:121)
        10-13 13:01:31.461: E/AndroidRuntime(633): at android.widget.Toast$TN.<init>(Toast.java:317)
        10-13 13:01:31.461: E/AndroidRuntime(633): at android.widget.Toast.<init>(Toast.java:91)
        10-13 13:01:31.461: E/AndroidRuntime(633): at android.widget.Toast.makeText(Toast.java:233)
        10-13 13:01:31.461: E/AndroidRuntime(633): at com.example.ftpodak.ODAK$Asenkron.doInBackground(ODAK.java:74)
        10-13 13:01:31.461: E/AndroidRuntime(633): at com.example.ftpodak.ODAK$Asenkron.doInBackground(ODAK.java:1)
        10-13 13:01:31.461: E/AndroidRuntime(633): at android.os.AsyncTask$2.call(AsyncTask.java:264)
        10-13 13:01:31.461: E/AndroidRuntime(633): at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
        10-13 13:01:31.461: E/AndroidRuntime(633): ... 5 more

    By the way, I changed the relevant code: instead of

        catch (Exception e) {
            Toast.makeText(getApplicationContext(), e.toString(), Toast.LENGTH_LONG).show();
        }

    I now use

        catch (Exception e) {
            HATA = e.toString();
        }

    and added textview1.setText(HATA); to the button handler, so I can see the error in the TextView. It reads "Android java.net.UnknownHostException: Host is unresolved". But I know the FTP server details are correct: with the same address, login, and password the server works from the AndFTP application. So I think the problem is in my code. Any help would be much appreciated; I can also give TeamViewer access to anyone willing to analyse the problem.

  • read chars from a file - c#

    - by Saskaaa
    How can I read an array of numbers from a file? I mean, how do I read chars from a file? Sorry for the bad English. Update: yes, I can :) The file is just "1 2 3 4 5 6 7 8" and so on. I just don't know how to read the chars from the file.

  • uploading images with the help of arrays and fetch errors

    - by bonny
    I use a script to upload a couple of images to a directory. The code works great in the case of just one picture being uploaded. If I upload two or more images and one has an extension that is not accepted, the script uploads the ones with allowed extensions and shows the error message for the one that is not accepted, but the upload still takes place. That's my first problem. The second problem: in case of an error, I would like to display a message saying which of the images was not allowed, but I don't know how to fetch the one with the unaccepted extension into a variable that I can echo in the error message. Here is the code I use:

        if (!empty($_FILES['image']['tmp_name'])) {
            $allowed_extension = array('jpg', 'jpeg', 'png', 'bmp', 'tiff', 'gif');
            foreach ($_FILES['image']['name'] as $key => $array_value) {
                $file_name = $_FILES['image']['name'][$key];
                $file_size = $_FILES['image']['size'][$key];
                $file_tmp = $_FILES['image']['tmp_name'][$key];
                $file_extension = strtolower(end(explode('.', $file_name)));
                if (in_array($file_extension, $allowed_extension) === false) {
                    $errors[] = 'its an unaccepted format in picture $variable_that_count';
                    continue;
                }
                if ($file_size > 2097152) {
                    $errors[] = 'reached maxsize of 2MB per file in picture $variable_that_count';
                }
                if (count($errors) == 0) {
                    $path = "a/b/c/";
                    $uploadfile = $path . "/" . basename($_FILES['image']['name'][$key]);
                    if (move_uploaded_file($_FILES['image']['tmp_name'][$key], $uploadfile)) {
                        echo "success.";
                    }
                }
            }
        }

    I hope it is clear what I am trying to achieve. If there is someone who could help out, I would really appreciate it. Thanks a lot.

  • How to best transfer large payloads of data using wsHttp with WCF with message security

    - by jpierson
    I have a case where I need to transfer large amounts of serialized object graphs (via NetDataContractSerializer) using WCF over wsHttp. I'm using message security and would like to continue to do so. With this setup I would like to transfer a serialized object graph that can sometimes approach 300 MB or so, but when I try to do so I've started seeing an exception of type System.InsufficientMemoryException. After a little research, it appears that by default in WCF the result of a service call is contained within a single message, which holds the serialized data, and that data is buffered on the server until the whole message is completely written. The memory exception is thus caused by the server running out of the memory resources it is allowed to allocate, because that buffer is full. The two main recommendations I've come across are streaming and chunking, but it is not clear to me what either involves or whether it is possible with my current setup (wsHttp/NetDataContractSerializer/message security). So far I understand that streaming would not work with message security, because message encryption and decryption need to operate on the whole set of data, not a partial message. Chunking, however, sounds like it might be possible, but it is not clear to me how it would be done within the other constraints I've listed. If anybody could offer guidance on what solutions are available and how to implement them, I would greatly appreciate it. Related resources:

    - Chunking Channel
    - How to: Enable Streaming
    - Large attachments over WCF
    - Custom Message Encoder
    - Another spotting of InsufficientMemoryException

    I'm also interested in any type of compression that could be applied to this data, but it looks like I would be best off doing that at the transport level once I can transition to .NET 4.0, so that the client will automatically support the gzip headers, if I understand this properly.

  • Scalable / Parallel Large Graph Analysis Library?

    - by Joel Hoff
    I am looking for good recommendations for scalable and/or parallel large-graph analysis libraries in various languages. The problems I am working on involve significant computational analysis of graphs/networks with 1-100 million nodes and 10 million to 1+ billion edges. The largest SMP computer I am using has 256 GB memory, but I also have access to an HPC cluster with 1000 cores, 2 TB aggregate memory, and MPI for communication. I am primarily looking for scalable, high-performance graph libraries that could be used in either single- or multi-threaded scenarios, but parallel analysis libraries based on MPI or a similar protocol for communication and/or distributed memory are also of interest for high-end problems. Target programming languages include C++, C, Java, and Python. My research to date has come up with the following possible solutions for these languages:

    C++ -- The most viable solutions appear to be the Boost Graph Library and the Parallel Boost Graph Library. I have looked briefly at MTGL, but it is currently slanted more toward massively multithreaded hardware architectures like the Cray XMT.

    C -- igraph and SNAP (Small-world Network Analysis and Partitioning); the latter uses OpenMP for parallelism on SMP systems.

    Java -- I have found no parallel libraries here yet, but JGraphT and perhaps JUNG are leading contenders in the non-parallel space.

    Python -- igraph and NetworkX look like the most solid options, though neither is parallel. There used to be Python bindings for BGL, but these are now unsupported; the last release, in 2005, looks stale now.

    Other topics here on SO that I've looked at have discussed graph libraries in C++, Java, Python, and other languages, but none of them focused significantly on scalability. Does anyone have recommendations based on experience with any of the above, or with other library packages, applied to large-graph analysis problems? Performance, scalability, and code stability/maturity are my primary concerns. Most of the specialized algorithms will be developed by my team, with the exception of any graph-oriented parallel communication or distributed-memory frameworks (where the graph state is distributed across a cluster).
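
    As a rough scale check on the Python end (a sketch, assuming NetworkX is installed): NetworkX stores graphs as nested dicts, which costs hundreds of bytes per edge in practice, so the lower end of the stated range is workable but a billion edges would far exceed 256 GB in pure Python:

        import networkx as nx

        # 1M nodes, 10M edges: the small end of the problem sizes above
        G = nx.gnm_random_graph(1_000_000, 10_000_000, seed=42)
        print(G.number_of_nodes(), G.number_of_edges())
        print(nx.number_connected_components(G))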

  • Large Reports for MSRS

    - by Greg Lorenz
    I have a report that needs to be able to render a very large number of pages (about 4500 in this instance) in a web browser. The total time needed on the report server, from start to finish, is about 30 minutes for the instance I am looking at. Does anyone know what options exist for handling the rendering of such a large report in a web browser? In terms of investigating how this can be resolved, I have already performed the following tasks. The report gets its data from a database table that already has the data flattened, to the point that TimeDataRetrieval on the report server is 17812 ms, or about 18 seconds. The report itself has been reformatted to use the least expensive report objects it can to render the data in the correct format; it basically consists of a table with about 4 nested tables, and that's it. We were trying to accomplish this on a 2005 report server but kept running into memory issues that were not acceptable to our clients. In response, we moved this onto a 2008 report server to take advantage of the fact that it uses the file system instead of memory; we were finally able to get this to work without running out of available memory, but of course it takes much longer.

  • Export large amount of data from Oracle 10G to SQL Server 2005

    - by uniball
    Dear all, I need to export 100 million data rows (average row length ~100 bytes) from an Oracle 10G database table into SQL Server 2005 (over a WAN/VLAN with 6 Mbit/s capacity) on a regular basis. So far, these are the options that I have tried, with a quick summary of each. Has anyone tried this before? Are there other, better options? Which option would be the best in terms of performance and reliability? The times below were measured on smaller amounts of data and extrapolated to estimate the time required.

    1. Using the data import wizard on the SQL Server, or SSIS packages, to import the data. It will take around 150 hours to complete the task.

    2. Using an Oracle batch job to spool data into a comma-delimited flat file, then using an SSIS package to FTP this file to the SQL Server and load directly from the flat file. The issue here is the size of the flat file, which is expected to run into gigabytes.

    3. Although this option is drastically different, I am even considering using a Linked Server to query the Oracle data directly at run time, to avoid bringing in data. Performance is a big problem, and I have limited control over the Oracle database in terms of creating table indexes.

    Regards, Uniball
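
    A fourth option, sketched under loudly stated assumptions (the cx_Oracle and pyodbc packages, and hypothetical connection strings, table, and column names): stream the rows in batches directly from Oracle to SQL Server, avoiding both the multi-gigabyte flat file and row-at-a-time linked-server queries. Note that at 6 Mbit/s the wire alone needs roughly 10 GB / 0.75 MB/s, i.e. about four hours, as a lower bound for any method:

        import cx_Oracle  # assumed available
        import pyodbc     # assumed available

        BATCH = 50000

        src = cx_Oracle.connect("user/password@oracle-host/SERVICE")  # hypothetical DSN
        dst = pyodbc.connect("DRIVER={SQL Server};SERVER=sqlhost;"
                             "DATABASE=stage;Trusted_Connection=yes")  # hypothetical

        read = src.cursor()
        read.arraysize = BATCH  # fetch in large blocks over the WAN
        read.execute("SELECT col1, col2, col3 FROM big_table")  # hypothetical table

        write = dst.cursor()
        write.fast_executemany = True  # pyodbc bulk parameter binding

        while True:
            rows = read.fetchmany(BATCH)
            if not rows:
                break
            write.executemany(
                "INSERT INTO big_table_stage (col1, col2, col3) VALUES (?, ?, ?)",
                rows)
            dst.commit()  # per-batch commit keeps transaction log growth bounded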

  • Do you have any additions or alterations to this list of popular audio formats?

    - by roja
    All, I am trying to compile a list of common audio file formats used in both personal storage and peer transmission. I have compiled the following list, do you think that there are any significant formats missing? Are any of them not actually common formats? Any advice/alterations are highly useful. advanced audio coding, apple lossless audio file, atrac3 audio file, atrac audio file, audio interchange file format, core audio file, free lossless audio codec file, mpeg 1 audio layer 3, mpeg 2 audio, mpeg 4 audio book file, musical instrument digital interface, ogg vorbis compressed audio file, open media framework file, real audio, real audio media, waveform audio file format, windows media audio Kind regards, Roja

  • Does the upload file INPUT HTML element give any personal information away?

    - by Senseful
    I'm wondering what personal information the file input (<input type="file">) element gives the website. I noticed that it does show the file name and the website does seem to have access to it. What about the file's path? If the file is located in My Documents, they could find out the user name via the path (e.g. C:\Documents and Settings\Bob\My Documents) which many times is the actual user's name that is using the website. What information do most modern browsers allow the website to access when a user uses the file input element? Could JavaScript somehow be used to gain more information? What about when plugins (such as Flash or Java) implement file uploading?

  • Rename file in XP, only select file name, but show file extension.

    - by RasmusWriedtLarsen
    So if I have a file called test.txt and I want to rename it, there are two options (depending on whether extensions for known file types are shown): 1) shown: it selects everything (test.txt), meaning I have to manually select "test" and replace it with the new file name (which is irritating); 2) hidden: only "test" is editable (and visible). The problem is that I frequently need to change the file extension of a file, but with extensions shown, renaming is a pain. I know that Windows 7 does something smart: it only selects the file name when you press Rename [F2], but still lets you edit the file extension. Is there a way to accomplish this in XP?

  • Difference between Web.Config and Machine.Config File

    - by SAMIR BHOGAYTA
    Two types of configuration files are supported by ASP.NET; they are used to control and manage the behavior of a web application: i) Machine.config and ii) Web.config.

    Machine.config:
    i) This is automatically installed when you install Visual Studio .NET.
    ii) This is also called the machine-level configuration file.
    iii) Only one machine.config file exists on a server.
    iv) This file is at the highest level in the configuration hierarchy.

    Web.config:
    i) This is automatically created when you create an ASP.NET web application project.
    ii) This is also called the application-level configuration file.
    iii) This file inherits settings from machine.config.
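
    As an illustration of that inheritance (a minimal sketch; the key name is hypothetical): any setting an application's Web.config declares overrides what machine.config established for that application, while everything else is inherited unchanged:

        <?xml version="1.0"?>
        <configuration>
          <appSettings>
            <add key="SiteTitle" value="Example" />  <!-- hypothetical application setting -->
          </appSettings>
          <system.web>
            <!-- overrides the machine-wide default for this application only -->
            <compilation debug="false" />
          </system.web>
        </configuration>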

  • Why is Samba Access from Windows So Slow?

    - by swalker2001
    I have set up a file server using Ubuntu 12.04 Server. The purpose is to serve several network drives to Windows users that have heretofore been served by numerous NAS drives. I have Samba set up with one share defined so far. I can connect to it fine from my test Windows 7 and Windows XP machines. When I do a directory listing on the share from Windows, it can take up to two minutes to get all the files listed; it would have taken about 1.5 seconds with the Buffalo NAS it replaces. Sometimes it times out with no response at all. I have used the default smb.conf and simply added the following for the share I have set up so far:

        [engineering]
            comment = Ubuntu File Server Share
            path = /networkdriveshares/engineering
            browsable = yes
            guest ok = yes
            read only = no
            create mask = 0755

    I have tried changing the workgroup setting to the Active Directory domain name our Windows computers use, but didn't notice any difference. The only other change I made to the default smb.conf was adding the commonly recommended socket settings:

        socket options = TCP_NODELAY SO_RCVBUF=8192 SO_SNDBUF=8192

    There is lots of information about slow Samba shares online, but I have tried all of the solutions I found and none have made a lick of difference. If there is no solution, is there a better way to set up a file server to be used by Windows clients?
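
    One hedged suggestion, since the question quotes them as "recommended": the 8 KB SO_RCVBUF/SO_SNDBUF values come from very old Samba documentation and prevent the kernel from autotuning its TCP buffers, which on a modern kernel usually makes transfers slower, so a first experiment is to delete that line entirely. If huge directories are the pain point, Samba's case-insensitive name lookups are another commonly reported culprit. A sketch of both changes (test before adopting; the case options change how clients see mixed-case names):

        # smb.conf sketch: remove the "socket options = ..." line from [global], then:
        [engineering]
            path = /networkdriveshares/engineering
            read only = no
            guest ok = yes
            # avoid a full case-folding scan of large directories on each lookup
            case sensitive = true
            default case = lower
            preserve case = no
            short preserve case = no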

  • Jumpshare Makes It Dead Simple To Drag, Drop, and Share 150+ File Formats

    - by Jason Fitzpatrick
    If you’re looking for a super simple way to share files with friends and coworkers, Jumpshare offers drag-and-drop file transfer with a powerful built-in file viewer. You don’t need to register, install any software, or do anything but drag the file, drop it onto the Jumpshare interface, and share the link with your friend. Share the link and your friend can watch the real-time progress of the file upload, as well as download or view the completed files within the Jumpshare file viewer. Files are hosted for two weeks before deletion. Hit up the link below to take it for a test drive. Jumpshare

  • Python: slicing a very large binary file

    - by Duncan Tait
    Say I have a binary file of 12 GB and I want to slice 8 GB out of the middle of it. I know the position indices I want to cut between. How do I do this? Obviously 12 GB won't fit into memory, and that's fine, but 8 GB won't either... which I thought was fine, but it appears binary data doesn't like being handled in chunks! I was appending 10 MB at a time to a new binary file, and there are discontinuities at the edges of each 10 MB chunk in the new file. Is there a Pythonic way of doing this easily?
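
    Chunked copying is the right approach; a hedged sketch follows, with the guess (stated as an assumption) that the chunk-edge discontinuities come from the files being opened in text mode, since on Windows "r"/"w" translate newline bytes while "rb"/"wb" do not. File names and offsets are hypothetical:

        def slice_file(src, dst, start, end, chunk=10 * 1024 * 1024):
            """Copy bytes [start, end) from src to dst; binary mode is essential."""
            with open(src, "rb") as fin, open(dst, "wb") as fout:
                fin.seek(start)
                remaining = end - start
                while remaining > 0:
                    data = fin.read(min(chunk, remaining))
                    if not data:  # unexpected end of file
                        break
                    fout.write(data)
                    remaining -= len(data)

        slice_file("big.bin", "middle.bin", 2 * 1024**3, 10 * 1024**3)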
