Search Results

Search found 75611 results on 3025 pages for 'copy file'.

Page 669 of 3025

  • Java function call from JSP page is not returning value

    - by Satyendra
    I am calling a Java function from a JSP page; the function returns a file name after creating an XML file. In some cases where the file is large (the Java function takes a long time to execute because of the amount of data), the call returns a blank value, whereas the XML file is generated some time later. Can anyone help me get the file name in this case, so that the user can know the name of the generated file?
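
    One possible workaround (a sketch only; the class and method names below are illustrative and not from the original code) is to run the XML generation on a background thread and have the JSP poll for the result, so the page does not come back empty before the file name is available:

        import java.util.Map;
        import java.util.UUID;
        import java.util.concurrent.*;

        // Illustrative sketch, not the poster's code: generate the XML file in the
        // background and let the page look up the resulting file name by job id.
        public class XmlGenerationJobs {

            private static final ExecutorService pool = Executors.newFixedThreadPool(2);
            private static final Map<String, Future<String>> jobs = new ConcurrentHashMap<>();

            // Start generating the XML file; returns a job id the JSP can poll with.
            public static String submit(Callable<String> xmlGenerator) {
                String jobId = UUID.randomUUID().toString();
                jobs.put(jobId, pool.submit(xmlGenerator));
                return jobId;
            }

            // Returns the generated file name once it is ready, or null while the
            // job is still running.
            public static String poll(String jobId) throws Exception {
                Future<String> future = jobs.get(jobId);
                if (future == null || !future.isDone()) {
                    return null;
                }
                jobs.remove(jobId);
                return future.get();   // the file name returned by the generator
            }
        }

    The JSP would call submit(...) once with the existing XML-generating method wrapped in a Callable, then call poll(jobId) on each refresh (or from a small AJAX request) until the file name comes back.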

    Read the article

  • rsync doesn't use delta transfer on first run

    - by ockzon
    I'm trying to synchronize a large local directory (with a batch file using rsync 3.0.7 on Cygwin, Windows 7 x64; 30k files, 200 GB in size) to a remote server (Debian x64 with kernel 2.6, rsyncd 3.0.7) over a slow internet connection (90 KB/s upload). I know almost all files are identical, and I verified that using md5sum locally and remotely. However, when executing rsync from my local machine, every file gets transferred completely the first time. When I terminate the batch file after a few transfers and run it again, the already transferred files are skipped; but as soon as it gets to a file not yet transferred, it uploads the whole file again instead of noticing that the checksum is the same locally and remotely. The batch file calling rsync looks like this (backslashes and line breaks added here for readability):

        c:\cygwin\bin\rsync.exe --verbose --human-readable --progress --stats \
            --recursive --ignore-times --password-file pwd.txt \
            /cygdrive/d/ftp/data/ \
            rsync://[email protected]:33400/data/ | \
        c:\cygwin\bin\tee.exe --append rsync.log

    I experimented with the following parameters in varying combinations, but that didn't help either: --checksum, --partial, --partial-dir=/tmp/.rsync-partial, --compress

    Read the article

  • Restoring a Subversion repository to the working copy revision

    - by tinny
    My Subversion VM died the other day (the host hardware melted) and I had to restore a backed-up copy of the VMware Server image. The restore went well and the VM is running again on a new host. The problem I have is that my restored repository is at revision 60, but my working copy on my PC is at 66. When I try to commit my working copy I get the following error message:

        svn: Commit failed (details follow):
        svn: No such revision 61

    What is the best way to force this commit and bring Subversion up to the same revision as my working copy? Thanks

    Read the article

  • Why can't I insert a record with a foreign key in a single server request?

    - by Eran Betzalel
    I'm trying to do a simple insert with a foreign key, but it seems that I need to call db.SaveChanges() for every record insert. How can I manage to use only one db.SaveChanges() at the end of this program?

        var files = new List<File>
        {
            new File { Name = "Test1" },
            new File { Name = "Test2" },
            new File { Name = "Test3" }
        };

        foreach (var file in files)
        {
            db.AddToFileSet(file);
            db.SaveChanges();
            db.AddToDirectorySet(new Directory
            {
                DirectoryName = file.Name + "Dir",
                CreationDate = DateTime.UtcNow,
                file_relation = file
            });
            db.SaveChanges();
        }

    Read the article

  • Sending files from server to client in Java

    - by Lee Jacobson
    Hi, I'm trying to find a way to send files of different file types from a server to a client. I have this code on the server to put the file into a byte array:

        File file = new File(resourceLocation);
        byte[] b = new byte[(int) file.length()];
        FileInputStream fileInputStream;
        try {
            fileInputStream = new FileInputStream(file);
            try {
                fileInputStream.read(b);
            } catch (IOException ex) {
                System.out.println("Error, Can't read from file");
            }
            for (int i = 0; i < b.length; i++) {
                fileData += (char) b[i];
            }
        } catch (FileNotFoundException e) {
            System.out.println("Error, File Not Found.");
        }

    I then send fileData as a string to the client. This works fine for txt files, but when it comes to images I find that although the file is created fine with the data in it, the image won't open. I'm not sure if I'm even going about this the right way. Thanks for the help.
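
    A likely cause of the broken images is that appending each byte to a String as a char corrupts binary data once the string is re-encoded. A minimal sketch of streaming the raw bytes instead (the class and method names here are illustrative, not from the original post):

        import java.io.*;

        // Illustrative sketch: write the file length followed by the raw bytes to
        // the given stream (e.g. a socket's OutputStream on the server side), so
        // binary formats such as images survive the transfer unchanged.
        public class FileSender {

            public static void sendFile(File file, OutputStream out) throws IOException {
                DataOutputStream dataOut = new DataOutputStream(out);
                dataOut.writeLong(file.length());           // let the client know how much to expect
                try (FileInputStream in = new FileInputStream(file)) {
                    byte[] buffer = new byte[8192];
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        dataOut.write(buffer, 0, read);     // forward raw bytes, no char conversion
                    }
                }
                dataOut.flush();
            }
        }

    The client would read the long first, then copy exactly that many bytes from its InputStream into a FileOutputStream.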

    Read the article

  • How to retrieve the file previews used by Windows Explorer in Windows Vista and 7?

    - by user193655
    I am developing a Delphi document management application, so I am giving the user some functionality similar to Windows Explorer. I would like to know if there is a way to get the preview used by Windows Explorer. For example, Windows Explorer creates a small thumbnail for a PDF document and displays it when the user chooses to view "big icons". Is there a way to retrieve that preview?

        MyTImage := GiveMePreviewForFile('C:\Test\File.pdf');

    Read the article

  • How to partially FTP a file (using ftp, wget with shell scripts, or PHP)?

    - by Dave
    Hi, I want to partially download a file over FTP. I just need to download, let's say, 10 MB, but after skipping 100 MB (for example). In PHP, http://php.net/manual/en/function.ftp-fget.php this function allows an arbitrary starting point:

        bool ftp_fget ( resource $ftp_stream , resource $handle , string $remote_file , int $mode [, int $resumepos = 0 ] )

    However, it does not allow me to set how many bytes I want to download.
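
    The question asks for ftp/wget/PHP, but as an illustration of the underlying idea (an FTP REST offset plus reading only a fixed number of bytes before aborting the transfer), here is a sketch in Java using Apache Commons Net; the host, credentials, path and sizes are placeholders, and error handling is omitted:

        import java.io.IOException;
        import java.io.InputStream;
        import org.apache.commons.net.ftp.FTP;
        import org.apache.commons.net.ftp.FTPClient;

        // Sketch only: download 10 MB of a remote file starting at a 100 MB offset.
        public class PartialFtpDownload {
            public static void main(String[] args) throws IOException {
                FTPClient ftp = new FTPClient();
                ftp.connect("ftp.example.com");
                ftp.login("user", "password");
                ftp.enterLocalPassiveMode();
                ftp.setFileType(FTP.BINARY_FILE_TYPE);

                ftp.setRestartOffset(100L * 1024 * 1024);   // skip the first 100 MB (FTP REST command)
                InputStream in = ftp.retrieveFileStream("/path/to/remote/file");

                long wanted = 10L * 1024 * 1024;            // read only 10 MB
                byte[] buffer = new byte[8192];
                long total = 0;
                int read;
                while (total < wanted
                        && (read = in.read(buffer, 0, (int) Math.min(buffer.length, wanted - total))) != -1) {
                    // ... write buffer[0..read) to a local file here ...
                    total += read;
                }
                in.close();
                ftp.abort();                                // stop the transfer early instead of completing it
                ftp.logout();
                ftp.disconnect();
            }
        }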

    Read the article

  • MySQL: can't set max_allowed_packet to anything greater than 16 MB

    - by sas
    I'm not sure if this is the right place to post this kind of question; if it's not, please (politely) let me know... :-)

    I need to save files greater than 16 MB in a MySQL database from a PHP site. I've already changed c:\xampp\mysql\bin\my.cnf and set max_allowed_packet to 16 MB, and everything worked fine. Then I set it to 32 MB, but there's no way I can handle a file bigger than 16 MB; I get the following error: 'MySQL server has gone away' (the same error I had when max_allowed_packet was set to 1 MB). There must be some other setting that doesn't allow me to handle files bigger than 16 MB, maybe in the PHP client, I guess, but I don't know where to edit it.

    This is the code I'm running. When file.txt is smaller than 16,776,192 bytes, it works fine, but if file.txt is 16,777,216 bytes I get the aforementioned error. Oh, and the field download.content is a longblob...

        $file = 'file.txt';
        $file_handle = fopen( $file, 'r' );
        $content = fread( $file_handle, filesize( $file ) );
        fclose( $file_handle );
        db_execute( 'truncate table download', true );
        $sql = "insert into download( code, title, name, description, original_name, mime_type,
                                      size, content, user_insert_id, date_insert, user_update_id, date_update )
                values ( 'new file', 'new file', 'sas.jpg', 'new file', '$file', 'mime', "
               . filesize( $file ) . ", '" . addslashes( $content ) . "', 0, "
               . db_char_to_sql( now_char(), 'datetime' ) . ", 0, "
               . db_char_to_sql( now_char(), 'datetime' ) . " )";
        db_execute( $sql, true );

    (The db_execute function just opens the connection and executes the SQL.)

    I'm running on Windows XP SP2, server version 5.0.67-community, PHP version 4.4.9, MySQL client API version 3.23.49, using ApacheFriends XAMPP (base package) version 1.6.8, which comes with Apache 2.2.9, MySQL 5.0.67 (Community Server), PHP 5.2.6, PHP 4.4.9, PEAR and phpMyAdmin 2.11.9.2. This is part of the content of c:\xampp\mysql\bin\my.cnf:

        # The MySQL server
        [mysqld]
        port = 3306
        socket = "C:/xampp/mysql/mysql.sock"
        basedir = "C:/xampp/mysql"
        tmpdir = "C:/xampp/tmp"
        datadir = "C:/xampp/mysql/data"
        skip-locking
        key_buffer = 16M
        # max_allowed_packet = 1M
        max_allowed_packet = 32M
        table_cache = 128
        sort_buffer_size = 512K
        net_buffer_length = 8K
        read_buffer_size = 256K
        read_rnd_buffer_size = 512K
        myisam_sort_buffer_size = 8M

    Read the article

  • When does log4net write or commit the log to file?

    - by Jollian
    We use log4net to log the WinForms application's events and errors. Our customer wants to check the log file while the application is running, but I can't find out when and how log4net does the write (commit) operation. How can I meet the customer's requirement, other than creating another logger myself? Any help? Thanks.

    Read the article

  • Make Excel defined names within a worksheet global

    - by idazuwaika
    Hi, I wrote a PowerShell script to copy a worksheet from a workbook A to another workbook B. The worksheet contains defined names for ranges within that sheet. Originally, the defined names are global in workbook A, i.e. they can be referenced from any worksheet within workbook A. But now, after the copy into workbook B, the defined names are limited to that one worksheet only. How do I programmatically (via PowerShell script, preferably) make all those named ranges global, i.e. referenceable from all worksheets within workbook B? Some code for clarity:

        # Script to update SOP from 5.1 to 5.2
        $missing = [System.Type]::missing

        # Open files
        $excel = New-Object -Com Excel.Application
        $excel.Visible = $False
        $excel.DisplayAlerts = $False
        $newTemplate = "C:\WorkbookA.xls"
        $wbTemplate = $excel.Workbooks.Open($newTemplate)
        $oldSop = "C:\WorkbookB.xls"
        $wbOldSop = $excel.Workbooks.Open($oldSop)

        # Delete 'DATA' worksheet from old file
        $wsOldData = $wbOldSop.Worksheets.Item("DATA")
        $wsOldData.Delete()

        # Copy new 'DATA' worksheet to old file
        $wbTemplate.Worksheets.Item("DATA").Copy($missing,$wbOldSop.Worksheets.Item("STATUS"))

        # Save
        $wbOldSop.Save()
        $wbOldSop.Close()

        # Quit Excel
        $excel.Quit()

    Read the article

  • Reverse SCP over SSH connection

    - by pavpanchekha
    I pretty often need some file from a server when I'm on my laptop. But if I don't know where that file is, I have to ssh into the server, look around, exit, and then run scp server:file . locally. If I'm working with my desktop and my server, both of which have static IPs, I can just SCP the file in reverse (scp desktop:~ file), but I can't do that for my laptop. Is there any nice way to SCP a file backwards over an SSH connection, so that the computer I connect to with SSH sends a file back to the client?

    Read the article

  • How do I change from a locally generated file in Silverlight to a website?

    - by Kris Erickson
    Because I want to do things like load images from the web, I guess I need to move my Silverlight project from using a default file to some kind of web package. I don't really want an ASP.NET site (in fact, I totally don't want one), and yet I want to be able to develop and load images from the web. How do I move my development files to a website and still be able to compile, debug, etc. from Visual Studio?

    Read the article

  • How to load image data from a resource bitmap file for a DirectShow filter?

    - by Forrest
    I need to put one bitmap image into my DirectShow filter, so that the user can use this bitmap image without caring where it is. First, I imported this bitmap file into the resource bundle and got an IDB_BITMAP1. Then I need to read this IDB_BITMAP1 using OpenCV's cvLoadImage or some Windows image API to load the image into a buffer. So the question is: how do I do this, or is it even possible? Thanks

    Read the article

  • What the VC++ compiler/linker does when building a C++ project with Managed Extensions

    - by ???
    The initial problem is that I rebuilt a C++ project with debug symbols and copied it to a test machine. The output of the project is an external COM server (an .exe file). When calling a COM interface function, there is an RPC failure: COMException(0x800706BE): The remote procedure call failed. According to the COM HRESULT design, if the facility code is 7 it is actually a Win32 error, and the Win32 error code is 0x6BE, which is the above-mentioned "remote procedure call failed". All I did was replace the COM server .exe file; the original file works well.

    When I looked into the project, I found it is a C++ project with Managed Extensions. Checking the DLL with Reflector shows two additional .NET assembly references, yet I checked the project settings and found nothing about those two extra references. I turned on the compiler's "show includes" option and the linker's verbose library output, and tried to analyze whether the assemblies are referenced indirectly via a .h file. I collected all the .h files and grepped them for '#using', '#import' and the assembly file name itself. There really is a '#using ' in one of the .h files, but it is not relevant to the referenced assemblies.

    As for the linked .lib files, only one of them is a side product of another Managed-Extensions-enabled C++ project; all the others are produced by pure, traditional C++ projects. For that Managed-Extensions-enabled C++ project, I checked the output DLL assembly and it did NOT reference the two assemblies. I even tried to capture access to the additional assembly files via Sysinternals' Filemon and Procmon, but the rebuild process does NOT access these files.

    I'm very confused about the compile and link process model of a VC++/CLI project. Where did the additional assembly references slip into the final assembly? Thanks in advance for any of your help.

    Read the article
