Search Results

Search found 39577 results on 1584 pages for 'temp files'.

  • Uploading many large files to a remote server

    - by TiernanO
    I am in the process of creating an offsite backup and need to do an initial load of data. Currently that's about 400 GB, give or take 10 GB or so... The backup system produces files which are about 4 GB each, plus some other, smaller related files. So, I need to transfer all 400-ish gigs to a remote server, but how? What is the best method? I have full remote access to the server, so I can install anything I need to install. There are Windows, Linux and a Solaris VM running on the box itself, so any of those can be used on that end, and I have Windows and Linux at home. I have 2 internet connections in the house, 10 Mb/s upload on each, so something that could split the transfer across connections would be handy (kind of like GetRight, but in reverse... PutRight?).
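
    A minimal sketch of one approach, assuming rsync over key-based SSH is available on both ends and that each uplink is reachable through its own local address (the host name, paths and addresses below are made up):

        import subprocess
        from pathlib import Path

        files = sorted(Path("/backups").iterdir())    # the ~4 GB chunks plus the smaller files
        batch_a, batch_b = files[0::2], files[1::2]   # interleave to roughly balance the batches

        def push(batch, bind_ip):
            # one rsync per uplink; --partial lets an interrupted 4 GB transfer
            # resume, and binding ssh to a local address selects the connection
            return subprocess.Popen(
                ["rsync", "-av", "--partial", "-e", f"ssh -b {bind_ip}",
                 *map(str, batch), "me@backupserver:/initial-load/"])

        procs = [push(batch_a, "192.168.1.10"), push(batch_b, "192.168.2.10")]
        for p in procs:
            p.wait()

    Whether both links actually get used depends on the local routing setup, so treat this as a sketch of the splitting idea rather than a ready-made solution.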

    Read the article

  • javascript splice() indexing problem

    - by markzzz
    hello! I have to add some values into an array. Code for example:

        temp[0] = new Array("0", "0");
        temp[1] = new Array("0", "0");
        temp[2] = new Array("0", "0");
        temp[3] = new Array("0", "0");
        temp[4] = new Array("0", "0");
        vt = new Array("1", "0");
        temp.splice(3, 0, vt);
        temp.splice(4, 0, vt);
        temp[3][1] = "R";

    I expect this output:

        1 - 0,0
        2 - 0,0
        3 - 0,0
        4 - 1,R
        5 - 1,0
        6 - 0,0
        7 - 0,0

    but the output is:

        1 - 0,0
        2 - 0,0
        3 - 0,0
        4 - 1,R
        5 - 1,R
        6 - 0,0
        7 - 0,0

    Any idea? I think it's an indexing problem with the splice() function! cheers
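
    For what it's worth, splice() is indexing correctly here: both calls insert a reference to the same vt array, so mutating temp[3][1] also shows up at temp[4][1]. The same shared-reference effect, sketched in Python for illustration:

        temp = [["0", "0"] for _ in range(5)]  # five independent rows
        vt = ["1", "0"]
        temp.insert(3, vt)       # inserts a reference to vt
        temp.insert(4, vt)       # inserts another reference to the SAME list
        temp[3][1] = "R"         # mutates vt, so temp[4] changes as well
        print(temp[3], temp[4])  # ['1', 'R'] ['1', 'R']

    Inserting a fresh copy in the second splice() call (for example vt.slice() in JavaScript) would give the expected output.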

    Read the article

  • Copying files between servers by creation time

    - by driftux
    My bash scripting knowledge is very weak, so I'm asking for help here. What is the most efficient bash script, performance-wise, to find and copy files from one Linux server to another under the requirements described below? I need a bash script which finds only new files, created on server A within the last 0 to 10 minutes, in directories named "Z", and then transfers them to server B. I think it can be done by building a command like "scp /X/Y.../Z/file root@hostname:/X/Y.../Z/" and executing it for each new file found. If the script finds no matching remote path on server B, it should move on and copy the next file whose directory does exist. Files should be copied with permissions, group, owner and creation time preserved. X/Y... stands for various directory paths. I want to set up a cron job to execute this script every 10 minutes, so performance is very important in this case. Thank you.
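
    A sketch of the find-and-copy step in Python terms, keeping the question's placeholder paths and host and assuming SSH keys are set up. One caveat: Linux filesystems generally don't record a creation time, so modification time is the usual stand-in:

        import os, subprocess, time

        SRC_ROOT = "/X/Y"            # placeholder base path from the question
        cutoff = time.time() - 600   # modified within the last 10 minutes

        for dirpath, dirnames, filenames in os.walk(SRC_ROOT):
            if os.path.basename(dirpath) != "Z":
                continue
            for name in filenames:
                path = os.path.join(dirpath, name)
                if os.path.getmtime(path) >= cutoff:
                    # scp -p preserves modification times and modes
                    subprocess.run(["scp", "-p", path, f"root@hostname:{path}"])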

    Read the article

  • Which is faster in memory, ints or chars? And file-mapping or chunk reading?

    - by Nick
    Okay, so I've written a (rather unoptimized) program before to encode images to JPEGs; now, however, I am working with MPEG-2 transport streams and the H.264 encoded video within them. Before I dive into programming all of this, I am curious what the fastest way to deal with the actual file is. Currently I am file-mapping the .mts file into memory to work on it, although I am not sure if it would be faster to (for example) read 100 MB of the file into memory in chunks and deal with it that way. These files require a lot of bit-shifting and such to read flags, so I am wondering whether, when I reference some of the memory, it is faster to read 4 bytes at once as an integer or 1 byte at a time as a character. I thought I read somewhere that x86 processors are optimized to a 4-byte granularity, but I'm not sure if this is true... Thanks!
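
    For reference, the two access patterns under discussion look like this (a Python sketch just to illustrate the shapes; the file name and offsets are made up):

        import mmap, struct

        # pattern 1: map the file and index into it directly
        with open("input.mts", "rb") as f:
            mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
            word = struct.unpack("<I", mm[0:4])[0]  # 4 bytes read as one integer
            byte = mm[0]                            # 1 byte read on its own

        # pattern 2: read fixed-size chunks and parse within each chunk
        with open("input.mts", "rb") as f:
            while True:
                chunk = f.read(100 * 1024 * 1024)   # 100 MB at a time
                if not chunk:
                    break
                # ... parse packets inside the chunk ...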

    Read the article

  • searching for hidden files using winapi

    - by Kristian
    Hi, I want to search for hidden files and directories in a specific given path, but I don't know how to do it for hidden files. I do know how to search for normal files and directories. I wrote this code, but I'm stuck and can't make it search for only hidden files:

        #include "stdafx.h"
        #include <windows.h>

        int _tmain(int argc, _TCHAR* argv[])
        {
            TCHAR *fn = L"d:\\*";
            HANDLE f;
            WIN32_FIND_DATA data;

            // this statement has no effect: the constant is never
            // compared against anything
            { FILE_ATTRIBUTE_HIDDEN; }

            f = FindFirstFile(fn, &data);
            if (f == INVALID_HANDLE_VALUE) {
                printf("not found\n");
                return 0;
            } else {
                _tprintf(L"found this file: %s\n", data.cFileName);
                while (FindNextFile(f, &data)) {
                    _tprintf(L"found this file: %s\n", data.cFileName);
                }
            }
            FindClose(f);
            return 0;
        }
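
    The missing piece is testing each entry's attributes: in the WinAPI loop that means checking data.dwFileAttributes for FILE_ATTRIBUTE_HIDDEN instead of letting the bare constant stand alone. Here is the same check sketched in Python on Windows, for illustration:

        import os, stat

        for entry in os.scandir("d:\\"):
            attrs = entry.stat().st_file_attributes  # Windows-only stat field
            if attrs & stat.FILE_ATTRIBUTE_HIDDEN:
                print("found hidden file:", entry.name)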

    Read the article

  • testing directory S_ISDIR acts inconsistently

    - by coubeatczech
    hi, I'm doing simple tests on all files in a directory, but for some reason they sometimes behave wrongly. What's bad with my code?

        #include <iostream>
        #include <string>
        #include <dirent.h>
        #include <sys/stat.h>

        using namespace std;

        int main()
        {
            string s = "/home/";
            struct dirent *file;
            DIR *dir = opendir(s.c_str());
            while ((file = readdir(dir)) != NULL) {
                struct stat *file_info = new (struct stat);
                // note: d_name is relative to the scanned directory, so this
                // stat() call resolves against the current working directory
                stat(file->d_name, file_info);
                if ((file_info->st_mode & S_IFMT) == S_IFDIR)
                    cout << "dir" << endl;
                else
                    cout << "other" << endl;
            }
            closedir(dir);
        }

    Read the article

  • Getting data from closed files with concatenate formula

    - by Pav
    Each day a program creates an Excel file for me with some data for the current day, like the prices of products, how many people are available today, and things like that. Based on all this I need to make some forecasts and workplace allocations for workers. The problem is that I have to drag all this information in manually every time. To make it automatic I placed formulas in cells like: ='c:\ABC\[ABC 29-01-14.xlsx]sheet'!a1 Everything works fine, but the next day I have to change the file name to "ABC 30-01-14" in each cell, which is the same as entering the data manually. So I used the CONCATENATE formula to build the date from today's date automatically, used the INDIRECT formula to turn the text string into a real reference, and realized that INDIRECT works only for open files, not closed ones. Is there any way to do this for closed files without VBA (because I don't know it), or with VBA but explained for an idiot?

    Read the article

  • How do I effectively write to 146 output files in C++ using the cstdlib library

    - by Elpezmuerto
    I have a very large binary file and I need to create separate files based on the id within the input file. There are 146 output files and I am using cstdlib and fopen and fwrite. FOPEN_MAX is 20, so I can't keep all 146 output files open at the same time. I also want to minimize the number of times I open and close an output file. How can I write to the output files effectively? I also must use the cstdlib library due to legacy code.
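
    One common strategy is to buffer records per id in memory and flush each buffer in append mode, so at most one output file is open at any moment and each file is opened far fewer times than once per record. A sketch of the idea, in Python for brevity; the same pattern maps directly onto fopen("...", "ab"), fwrite and fclose:

        from collections import defaultdict

        FLUSH_AT = 64 * 1024          # flush once a buffer holds ~64 KB
        buffers = defaultdict(bytearray)

        def flush(file_id):
            # append mode: open, write one large batch, close immediately
            with open(f"out_{file_id:03d}.bin", "ab") as f:
                f.write(buffers[file_id])
            buffers[file_id].clear()

        def write_record(file_id, record):
            buffers[file_id] += record
            if len(buffers[file_id]) >= FLUSH_AT:
                flush(file_id)

        # after the input pass, flush whatever is left:
        # for fid in list(buffers): flush(fid)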

    Read the article

  • Pushing Large Files to 500+ Computers [closed]

    - by WMIF
    I work with a team to manage 500-600 rented Windows 7 computers for an annual conference. We have a large amount of data that needs to be synced to these computers, up to 1 TiB. The computers are divided into rooms and connected through unmanaged gigabit switches. We prepare these computers ahead of time with the Windows installation and configuration, plus any files that we have available to us before we send the base image in for replication by the rental company. Every year, presenters approach us on site with data that needs to be pushed to the room they will be presenting in. Sometimes it is only a few small files, such as a slide PDF, but it can sometimes be much larger, up to 5 GiB. Our current strategy for pushing these files is using batch scripts and RoboCopy. For the large pushes, we actually use a BitTorrent client to generate a torrent file, and then we use the batch/RoboCopy step to push the torrent into a folder on the remote machines that is being monitored by an installed BT client. Often, this data needs to be pushed immediately within a small time window. We have several machines in a control room, identical to the machines on the floor, that we use for these pushes. We occasionally need to execute a program on the remote machines, and we currently use batch and PsExec to handle this task. We would love to be able to respond to these last-minute pushes with "sorry, your own fault", but it won't happen. The BT method has given us a much faster response time, but the whole batch process can get messy when there are multiple jobs being pushed. We use Enterprise Ghost for other processes, and it doesn't work well at this scale, plus it is really quite expensive for a once-a-year task like this. EDIT: There is a hard requirement that the remote machines on the floor run Windows. The control machines have no hard OS requirement. I would really like to stay away from multicast because of complications with upstream routers. Is multicast or BitTorrent the better way to go on this? Is there another protocol that might work better?

    Read the article

  • When and how to delete temporarily uploaded but uncommitted files in ASP.NET

    - by slowlycooked
    I'm using the EO Ajax toolkit to upload files, so a file gets uploaded to the server, and when the user clicks Save the database is updated with what was uploaded or changed. Now I need a cleanup process for the case where a user uploads a file to the server but then closes his/her browser before clicking the Save button. In this case, how should I program things so that the file the user just uploaded is deleted, since it's now useless and not associated with anything in the database? Thanks. Maybe I should upload all files to a temp folder, move a file to the target folder only when the user clicks Save, and delete the temp folder's contents at the end of each session.

    Read the article

  • write a batch file to copy files from one folder to another folder

    - by user73628
    I have a storage folder on the network in which all users store their active data on a server. That server is going to be replaced by a new one because of space problems, so I need to copy the subfolders and files from the old server's storage folder to the new server's storage folder. For example: from \\Old server\storage\data & files to \\New server\storage\data & files.

    Read the article

  • VS: Separating headers from source files?

    - by jco
    I know this is completely subjective, but I'm curious: do you use separate filters for headers and source files in your Visual Studio solutions? Visual Studio creates "Header Files" and "Source Files" filters by default. To me, this dichotomy causes more annoyance than anything else. What's your take on this?

    Read the article

  • Boost Include Files in VC++

    - by Dr. K
    For the last few years, I have been exclusively a C# developer. Previously, I developed in C++ and have a C++ application that I built about 3 years ago using VS2005. It made extensive use of the Boost libraries. I recently decided to brush off the old app and rebuild it in VS2008 with the latest version of Boost that has the "easy" installation program from BoostPro Computing, 1.39. Previously, when I last had the program running, I was at 1.33. Also, the last time the program was running was at least 2 OS installations ago. The Boost installation is located on my machine at "C:\Program Files\boost\boost_1_39". Anyway, I have done the following: set the project's "Additional Include Directories" to "C:\Program Files\boost\boost_1_39", and added "C:\Program Files\boost\boost_1_39" to VS2008's Tools - Options - Projects and Solutions - VC++ Directories - Include Files. I have a number of Boost includes in my stdafx.h file. The compiler fails upon attempting to open the first one: #include <boost/algorithm/string/string.hpp> I have confirmed that the above file is indeed located at "C:\Program Files\boost\boost_1_39\boost\algorithm\string\string.hpp", yet I continue to get: fatal error C1083: Cannot open include file: 'boost/algorithm/string/string.hpp': No such file or directory Any tips on what else to check would be greatly appreciated. Again, this is an application that compiled fine a few years ago, but the source has now been moved to a new machine/compiler.

    Read the article

  • unable to transfer files from Handycam to PC

    - by user143989
    I am using a Windows 7 PC and a Sony DCR-SR88 Handycam, and I need to transfer all my videos from the Handycam to the PC. When I connect it through USB, the PC detects the USB drive in the Handycam and shows the used memory, but when I open the folder it says "folder is empty". How can I copy the files? I have tried the following: changed the USB cable; changed the USB port. I can play the videos on the Handycam, but the files are not visible on the PC when connected in USB mode. Please help, it's a bit urgent!

    Read the article

  • Windows Server 2008 scheduled tasks cannot create files

    - by Nick Cartwright
    We have a series of tasks which, when run interactively from the command line, run fine, creating temporary files and (importantly) logs and backups. When we schedule the task with Administrator privileges to run at the highest priority, however, no logs or temporary files are created! All the directories have read/write permissions for Administrator. Has anyone else experienced this? We are running Windows Server 2008 and the job is configured for 'Windows Vista or Windows Server 2008'. Any help would be much appreciated! OK, so we installed Z-Cron and it works perfectly... Still a really, really strange error from the Windows Server 2008 Task Scheduler, but a solution is perhaps not quite so urgent now that we have Z-Cron working!

    Read the article

  • "svn up" misses files!

    - by Blastura
    When trying to do an svn up I get the normal "At revision XX" even though some files are missing. The missing files do show when doing an svn list. Example:

        $ svn list
        ConditionTest.java
        persistence
        $ ls
        persistence
        $ svn list
        ConditionTest.java
        persistence/
        $ svn up
        At revision 55.
        $ ls
        persistence

    The file ConditionTest.java is not added unless I manually run svn up ConditionTest.java. What is up? Can't I trust svn anymore? Running svn, version 1.6.6 (r40053).

    Read the article

  • Find command: exclude files whose paths match a certain pattern

    - by user40570
    I have a find command that looks for files that were modified recently and outputs the date: find /path/on/server -mtime -1 -name '*.js' -exec ls -l {} \; I would like it to exclude any deeply nested folder that matches a certain pattern, e.g. there are a number of folders that contain a "statistics" directory and ".svn" directories. So I'd like to be able to say: if a file that was modified yesterday is in a folder named statistics, ignore it. Or, better, don't search for files in those folders at all.
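
    find itself can skip whole subtrees with its -prune action; here is the same pruning expressed in Python, in case a script is easier to extend (the path and folder names are those given in the question):

        import os, time

        cutoff = time.time() - 24 * 3600  # "modified within the last day"
        for dirpath, dirnames, filenames in os.walk("/path/on/server"):
            # prune in place so os.walk never descends into these folders
            dirnames[:] = [d for d in dirnames if d not in (".svn", "statistics")]
            for name in filenames:
                if not name.endswith(".js"):
                    continue
                path = os.path.join(dirpath, name)
                if os.path.getmtime(path) > cutoff:
                    print(path, time.ctime(os.path.getmtime(path)))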

    Read the article

  • scp all files starting with 'file' from a server

    - by user209691
    Hi, I use this command to copy all files whose names start with 'file' from a server: scp -vp me@server:/location/files* ./ But I get a 'No Match' error, probably concerning the '*' in the command. How can I protect the '*' so that ssh understands it refers to a list of files rather than taking it literally as part of a filename? Thx, August

    Read the article

  • Windows DIR listing switch to exclude files in hidden folders

    - by Jason
    I'm trying to get a list of files from a directory, excluding files in hidden folders. With the following command, hidden folders are still traversed even though I've set /A:-H to exclude hidden items: dir "C:\SVN" /A:-H /w /b /s Is there a different switch to stop them from being traversed too? Alternatively, for this use case I know the names of the hidden folders I want to exclude, so if there is a way to exclude the folders by name ("\.svn\") that might have to suffice. Thanks!

    Read the article

  • Does Microsoft make available the .obj files for its CRT versions to enable whole program optimization?

    - by Leeks and Leaks
    Given the potential performance improvements from LTCG (link-time code generation, or whole-program optimization), which requires the availability of .obj files, does Microsoft make available the .obj files for the various flavors of its MSVCRT releases? One would think this would be a good place for some potential gains, and it's not clear what they would have to lose, since the IL generated into the .obj files is undocumented and processor-specific.

    Read the article

  • .htaccess - deny downloading of files

    - by user317005
    I keep several fonts in the directory "/fonts/" on my server, which I then load into my CSS files via @font-face. However, I want to make sure that people cannot download a font simply by going to http://www.domain.com/fonts/fontname.ttf. Can I somehow prevent this and still be able to load the font files from my CSS files? I think putting "deny from all" into the .htaccess file would prevent even the CSS files from loading the fonts correctly. I hope this makes sense.

    Read the article

  • Can't copy files from network drive

    - by user630320
    I have a weird problem with copying files. When I copy a file from a network drive into the C: drive, nothing happens, but when I copy a file from the network drive to the desktop, the copy works. Copying files from the desktop into C: also works fine. I have full local admin permissions on this PC and on the network drive. I have tried these things: created a new profile; ran Windows Update; ran Check Disk. I'm using Windows XP Pro 32-bit. Update: Network path: \\server1\shared\folder PC: C:\ (this doesn't work) C:\Documents and Settings\Userid\Desktop (this works fine)

    Read the article

  • Corrupted files, hard drive test?

    - by all-R
    Hi guys, I'm currently on a MacBook with a 1 TB external hard drive connected through a USB hub which is plugged into the MacBook. The problem is that my disk, which is partitioned in two (one HFS+ and one NTFS), keeps getting corrupted. Recently it was my HFS+ partition: I could not repair it using Apple's Disk Utility, but I was able to back up my files. Is this a sign that my hard drive is failing? Could it be because of my USB hub? I also keep all of my iTunes library on the external HD (the HFS+ partition) and did a lot of transfers lately, adding files, removing them, etc.; the last time, the partition got corrupted after a lot of deleted items. If anybody has an idea of what to check first, or what could cause the problem, I would appreciate it :) Thanks!

    Read the article

  • Controlling access to large files in Apache

    - by obeattie
    Hi there, I am looking to control access to some large files (we're talking many GB here) through the use of signed URLs. The files are currently restricted by LDAP Basic authentication (mod_auth_ldap), but I need to change this to verify a signature (passed as a query parameter in the URL). Basically, I just need to run a script to verify the signature and allow the request to proceed as if authentication had succeeded. My initial thought was just to use a simple CGI script, but as the files are so large I'm concerned about performance. So, really, this question is (probably) more like "are there any performance implications of streaming large files from a CGI script via Apache?"... and if so, "is there a better way of doing this (short of writing a dedicated authentication module)?" If this makes any sense, help would be much appreciated :) P.S. I wasn't sure exactly what to search for here (10 minutes of Googling were fruitless), so I may very well be duplicating someone else's post.
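
    A minimal sketch of the verification step, assuming an HMAC-SHA256 signature over the path plus an expiry timestamp (the exact signing scheme here is an assumption):

        import hashlib, hmac, time

        SECRET = b"shared-secret"  # hypothetical signing key

        def verify(path, expires, signature):
            if time.time() > float(expires):  # reject expired links outright
                return False
            expected = hmac.new(SECRET, f"{path}?expires={expires}".encode(),
                                hashlib.sha256).hexdigest()
            # constant-time comparison avoids leaking the digest byte by byte
            return hmac.compare_digest(expected, signature)

    Since the check itself is cheap, the performance question is really about who streams the bytes afterwards; handing the transfer back to Apache (for example via mod_xsendfile's X-Sendfile header) is one way to keep the large-file I/O out of the script.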

    Read the article
