Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.

Page 25/1877 | < Previous Page | 21 22 23 24 25 26 27 28 29 30 31 32  | Next Page >

  • VB.NET: how to check for changes in a folder upon application start?

    - by Luay
    While the application is running I'm using FileSystemWatcher to monitor the folder. But what if there are changes to the folder while the application is not running? How can I check for those changes when the application starts? (Similar to how Windows Media Player, for example, monitors your music folder: even when you add songs to that folder while it is not running, it discovers them the next time it runs.) Thanks

    Read the article
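
    For the question above, one common pattern (a hedged sketch, not the asker's code) is to persist a snapshot of the folder's contents on exit and diff it against the current directory listing at startup, before handing live monitoring back to FileSystemWatcher. The sketch is C# rather than VB.NET, and the snapshot path and plain-text format are illustrative assumptions.

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Linq;

        static class StartupScan
        {
            // Illustrative snapshot location; a real app would choose its own path/format.
            const string SnapshotPath = @"C:\MyApp\folder-snapshot.txt";

            // Compare the folder's current contents with the snapshot saved last run.
            public static void CheckForOfflineChanges(string watchedFolder)
            {
                var previous = File.Exists(SnapshotPath)
                    ? new HashSet<string>(File.ReadAllLines(SnapshotPath))
                    : new HashSet<string>();

                var current = new HashSet<string>(
                    Directory.GetFiles(watchedFolder, "*", SearchOption.AllDirectories));

                foreach (var added in current.Except(previous))
                    Console.WriteLine("Added while the app was closed: " + added);

                foreach (var removed in previous.Except(current))
                    Console.WriteLine("Removed while the app was closed: " + removed);

                // Save the new snapshot so the next startup has a baseline.
                File.WriteAllLines(SnapshotPath, current.ToArray());
            }
        }

    Storing last-write times alongside the paths would also catch files that were modified, not just added or removed, while the application was closed.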

  • VSS: Move file from one folder to another?

    - by shoosh
    Is there a way in Visual SourceSafe to move a file from one directory to another while retaining its history? Edit: I actually found a roundabout way to do this. First I drag the file I want to move into the target directory, which creates a "Link" to the file there, and then I "permanently destroy" the file in its original location. Does this actually do what I think it does?

    Read the article

  • AS3 Working With Arbitrarily Large Files

    - by Kekoa
    I am trying to read a very large file in AS3 and am having problems with the runtime just crashing on me. I'm currently using a FileStream to open the file asynchronously. This does not work (it crashes without an exception) for files bigger than about 300 MB.

        _fileStream = new FileStream();
        _fileStream.addEventListener(IOErrorEvent.IO_ERROR, loadError);
        _fileStream.addEventListener(Event.COMPLETE, loadComplete);
        _fileStream.openAsync(myFile, FileMode.READ);

    Looking at the documentation, it sounds like the FileStream class still tries to read the entire file into memory (which is bad for large files). Is there a more suitable class to use for reading large files? I would really like something like a buffered FileStream class that only loads the bytes from the file that are going to be read next. I expect that I may need to write a class that does this for me, but then I would need to read only a piece of the file at a time. I'm assuming that I can do this by setting the position and readAhead properties of the FileStream to read a chunk out of the file at a time. I would love to save some time if a class like this already exists. Is there a good way to process large files in AS3 without loading their entire contents into memory?

    Read the article
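
    The chunked pattern the asker describes (seek to an offset, read a bounded block, process it, repeat) is language-agnostic; the hedged sketch below shows the loop in C# purely as an illustration, with an assumed 1 MiB chunk size. In AS3 the same idea would drive FileStream.position and readBytes instead.

        using System;
        using System.IO;

        static class ChunkedReader
        {
            // Read a large file in fixed-size chunks so memory use stays bounded.
            public static void Process(string path, int chunkSize)
            {
                var buffer = new byte[chunkSize];
                using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read))
                {
                    int bytesRead;
                    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        // Only the first 'bytesRead' bytes of 'buffer' are valid on this pass.
                        Console.WriteLine("Read {0} bytes ending at offset {1}",
                                          bytesRead, stream.Position);
                    }
                }
            }
        }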

  • ClickOnce Application Files dialog filename problem

    - by Ted N
    In the ClickOnce "Application Files" dialog, most of the file entries are listed with the name "C". I have seen this on a colleague's machine for a different project as well. Has anyone else seen this, and is there a way to get the correct filename inserted? We are both using VS 2008.

    Read the article

  • Continuous Build Integration with SourceSafe and Batch Files

    - by CraigS
    I want to create a continuous build integration system for .NET using just Windows batch files and Visual SourceSafe. I've come up with the following batch file so far:

        set ssdir=\\xxxx\vss
        cd d:\mydir
        "C:\Program Files\Microsoft Visual SourceSafe\ss.exe" diff "$/sourcedir" -R -Q > diffout.txt

    This will spit out a file containing lines like "SourceSafe files different from local files" when a change has been made. My challenge is to figure out whether those lines are in the file and, if they are, do a get and kick off MSBuild. I'd then schedule the batch file to run every 10 minutes or so. Anyone got any thoughts on how to do that? Or any other ways of doing continuous build integration without downloading a complicated build automation system? Update: I'm happy to use cscript or PowerShell too, though I'm not really familiar with those environments. My main aim is to avoid installing third-party software.

    Read the article
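
    One hedged way to finish the loop described above, while still avoiding third-party tools, is a small C# console checker: scan diffout.txt for the marker line and, only if it appears, shell out to ss.exe get and then MSBuild. The paths, the solution name, and the ss.exe arguments beyond -R are illustrative assumptions, not taken from the question.

        using System;
        using System.Diagnostics;
        using System.IO;
        using System.Linq;

        class BuildTrigger
        {
            static void Main()
            {
                // Illustrative paths; adjust to the real working directory and tool locations.
                const string diffFile = @"d:\mydir\diffout.txt";
                const string ssExe = @"C:\Program Files\Microsoft Visual SourceSafe\ss.exe";
                const string msbuild = @"C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe";

                // Point ss.exe at the same VSS database the batch file uses.
                Environment.SetEnvironmentVariable("SSDIR", @"\\xxxx\vss");

                bool hasChanges = File.Exists(diffFile) &&
                    File.ReadAllLines(diffFile)
                        .Any(l => l.Contains("SourceSafe files different from local files"));

                if (!hasChanges)
                    return;

                // Pull the latest sources, then build; each step runs to completion before the next.
                Run(ssExe, "get \"$/sourcedir\" -R");
                Run(msbuild, @"d:\mydir\MySolution.sln");   // hypothetical solution file
            }

            static void Run(string exe, string args)
            {
                var info = new ProcessStartInfo(exe, args) { UseShellExecute = false };
                using (var p = Process.Start(info))
                {
                    p.WaitForExit();
                }
            }
        }

    The same check could equally be done from the batch file itself with findstr and an errorlevel test; the C# version above just spells the logic out explicitly.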

  • Use php to zip large files

    - by Joseph
    Hi, I have a PHP form with a bunch of checkboxes that each link to a file. Once a user selects the checkboxes (files) they want, it zips up those files and forces a download. I got a simple PHP zip-and-force-download script to work, but when one of the files is huge, or if someone, let's say, selects the whole list to zip up and download, my server errors out. I understand that I can increase the server size, but are there any other ways? Thanks!

    Read the article

  • C# On Quit WebPage Delete Files and Folders on Server with no user action

    - by user325558
    Hi, I have some problems deleting temporary folders and files on my server when users don't finish an action on a webpage and leave for other pages. Initially, at Page_Load, folders are created to allow the user to load files. I have tried implementing the cleanup through IDisposable without success. Could someone point me to the best method for deleting folders and files when the user quits the page without completing the action or pressing a cancel button? Thanks.

    Read the article
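
    There is no reliable browser event for "the user walked away", so one hedged approach is to tag each temporary folder with the session ID (or a timestamp) when it is created and sweep stale folders later, for example from Session_End in Global.asax or a periodic job. The sketch below only shows the sweep; the temp root path and age threshold are illustrative assumptions.

        using System;
        using System.IO;

        public static class TempFolderCleanup
        {
            // Delete per-user temp folders that have not been touched for a while.
            // Call this from Session_End, Application_Start, or a scheduled task.
            public static void SweepStaleFolders(string tempRoot, TimeSpan maxAge)
            {
                if (!Directory.Exists(tempRoot))
                    return;

                foreach (var dir in Directory.GetDirectories(tempRoot))
                {
                    if (DateTime.UtcNow - Directory.GetLastWriteTimeUtc(dir) > maxAge)
                    {
                        try
                        {
                            Directory.Delete(dir, true);   // true = delete contents as well
                        }
                        catch (IOException)
                        {
                            // A file may still be in use; the next sweep will retry this folder.
                        }
                    }
                }
            }
        }

    A call such as TempFolderCleanup.SweepStaleFolders(Server.MapPath("~/App_Data/tmp"), TimeSpan.FromHours(2)) would then clean up anything a user abandoned, without needing any event from the page itself.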

  • Computer-generated files - how do they work?

    - by hory.incpp
    Hello, .... ‹BÿЃÀ‰D$Ç„$  ....... that's what you see when you open the kind of file I'm talking about in Notepad. How do algorithms decode that information, and when does a program use/generate it? Does some notepad-like application exist that can open such files and transform them into readable code/data? Any more information that clarifies how these files work would be very helpful. Thank you for your time. P.S. I'm not talking strictly about .exe files.

    Read the article
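
    Files like that are binary data rather than text, so Notepad just renders whatever glyphs the raw byte values happen to map to; the program that wrote the file interprets those same bytes against its own format (headers, offsets, records) rather than as characters. The usual "notepad-like application" for this is a hex editor, which shows each byte as a number. A minimal hedged illustration of that idea in C#:

        using System;
        using System.IO;

        class HexDump
        {
            // Print the first bytes of any file as hexadecimal, which is roughly
            // what a hex editor shows in place of Notepad's garbled glyphs.
            static void Main(string[] args)
            {
                byte[] bytes = File.ReadAllBytes(args[0]);
                int count = Math.Min(bytes.Length, 64);

                for (int offset = 0; offset < count; offset += 16)
                {
                    int lineLength = Math.Min(16, count - offset);
                    string hex = BitConverter.ToString(bytes, offset, lineLength).Replace("-", " ");
                    Console.WriteLine("{0:X8}  {1}", offset, hex);
                }
            }
        }

    Making the bytes genuinely "readable" beyond this requires knowing the file format (or, for executables, a disassembler); a hex view only exposes the raw values.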

  • Getting rid of unused php files

    - by scott
    I'm looking into removing PHP files that are no longer used on my site. I can use something like get_included_files to show the included files, but that would mean I would have to put it on every child page. If I put it on a parent page, it won't show me the child page that called it. Has anybody else run into a similar situation? If so, what did you do to remove the unused PHP files?

    Read the article

  • Data files from development machine to iOS device

    - by StoneBreaker
    My app has created a bunch of data files as development has progressed through the simulator. Their location is obtained by this function:

        NSString *pathInDocumentDirectory(NSString *fileName) {
            NSArray *documentDirectories = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
            NSString *documentDirectory = [documentDirectories objectAtIndex: 0];
            return [documentDirectory stringByAppendingPathComponent: fileName];
        }

    The files are now required on the device, as testing of the app is moving from the simulator to actual devices. How do I transfer the data files from my current working environment to the devices?

    Read the article

  • Get a random folder C# .NET

    - by Joshua
    Hi.

        public static class FolderHelper
        {
            public static string GetRandomFolder()
            {
                // do work
            }
        }

    But.... how? Like start at C:\ (or whatever the main drive is) and then randomly take routes? I'm not even sure how to do that.

    Read the article
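
    A hedged sketch of one way to fill in that stub: start at a root, and at each step either stop or descend into a randomly chosen subfolder. The early-stop probability and the handling of inaccessible folders below are illustrative choices, not anything the question specifies.

        using System;
        using System.IO;

        public static class FolderHelper
        {
            static readonly Random Rng = new Random();

            // Randomly walk down from 'root', stopping at a dead end, an
            // inaccessible folder, or (sometimes) early, and return that path.
            public static string GetRandomFolder(string root)
            {
                string current = root;
                while (true)
                {
                    string[] children;
                    try
                    {
                        children = Directory.GetDirectories(current);
                    }
                    catch (UnauthorizedAccessException)
                    {
                        return current;   // can't look any deeper
                    }

                    if (children.Length == 0 || Rng.Next(4) == 0)
                        return current;   // dead end, or stop early on purpose

                    current = children[Rng.Next(children.Length)];
                }
            }
        }

    Called as FolderHelper.GetRandomFolder(@"C:\"), this favours folders near the root; enumerating every directory first and picking one uniformly is the unbiased alternative, at the cost of a full drive scan.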

  • AWS EC2: Why is my web folder called "html"?

    - by heathub
    P.S. Q stands for Question. My environment is Amazon Linux 64-bit (Q1: I don't know if it's Ubuntu or Red Hat based; is there any way to check?). I need to run PHP and MySQL, so I installed httpd (Q2: is httpd the same thing as Apache?), but my default page says: please upload files to the /var/www/html folder. Q3: This is the first time I have set up an AWS EC2 server myself; my previous experience is hosting with a hosting company. Normally with a hosting company my web directory is called "www", "public_html", or "htdocs". Why is my folder named "/var/www/html"? Did I install the wrong Apache?

    Read the article

  • CIFS mounted drive setting the "sticky bit" on all files, cannot change permissions or modify files

    - by mattmcmanus
    I have a folder mounted on an Ubuntu 8.10 server through CIFS whose permissions I simply cannot change once it is mounted. Here is a breakdown of what's going on: all files within the mounted folder automatically have their permissions set to -rwxrwSrwx, regardless of whether the file is created on the Windows server or on the Linux machine. I have the same directory mounted on two other Linux servers (both running 9.10 instead of 8.10) with no problems at all. They all use the same fstab options and the same credentials:

        //server/folder /media/backups cifs credentials=/etc/samba/.arcadia_cred,noexec,noserverino 0 0

    I've run chmod a million different ways, all of which report successfully changing the permissions; however, nothing actually changes. The issue began after I updated from 8.04 to 8.10. Any idea why this may be happening on this one machine? Since it started after an upgrade I'm not sure what the best thing to do is. Any help you could give would be great! None of my automated backup scripts are working because of this!

    Read the article
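
    One hedged thing to check (an illustration, not a confirmed fix for this particular box): when the server is a Windows machine, mount.cifs synthesizes the Unix-side permissions, and the file_mode and dir_mode options let you pin those modes explicitly instead of relying on the client's defaults. The mode values below are illustrative; only the two added options differ from the asker's fstab line.

        //server/folder /media/backups cifs credentials=/etc/samba/.arcadia_cred,noexec,noserverino,file_mode=0644,dir_mode=0755 0 0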

  • NT Server: deleting files takes ages

    - by Fuxi
    Hi all, I'm running an NT server. When trying to delete even just one file, it takes several minutes (!!) until the file actually gets deleted. I only get the deleting dialog, but nothing else happens. Any ideas what could be wrong? Thanks

    Read the article

  • nginx/PHP-FPM uploaded files have nginx ownership after executing move_uploaded_file

    - by Vangel
    I have a problem with PHP 5.3.6 using PHP-FPM and file uploads. My nginx runs as user nginx; PHP-FPM uses pools configured for each vhost, so for example user test, group test runs one pool. When PHP uploads to a temp file, that file is owned by user test. After move_uploaded_file is executed by the PHP script, it is owned by nginx :/. The reason for this could be the fact that the original upload script does an exec('/usr/bin/php do_uploadedfiles.php'), which does the actual moving. I don't know if the change of ownership to the web server user is the correct behaviour. Is there a way in PHP to change the ownership back to the user I want? Maybe make exec "run as" another user?

    Read the article

  • Storing large amounts of small files into bigger files on Windows

    - by asmo
    Let's say I have 50 GiB of files that weigh around 500 KiB each. My guess is that having, for example, 5 large files of 10 GiB each with the same content archived in them would be better for hard drive performance. Am I correct? Will there be a noticeable gain on an NTFS filesystem?

    Finally, which tool could I use to group the files together while retaining the ability to modify the contents of the archive with zero or minor performance loss? For example, I like TrueCrypt archiving because after mounting an archive file, it creates a drive which I can use seamlessly as if it were a normal drive. The only thing is that with TrueCrypt I don't need encryption/compression, only archiving.

    Read the article

  • Can access website but images, css stylesheets and javascript files do not download

    - by Triztian
    I have this problem and I'm not sure about its source. Basically the title describes the issue: I can access the webpage and see the HTML structure, but no resources are being downloaded, nor can I access them directly from the browser. That means no JavaScript, no CSS styles, and no images. Any solutions? I'm using Tomcat, by the way. EDIT 1: If I access the Tomcat manager from within the server, it also blocks the images. I'm running on Windows Server 2008 R2.

    Read the article

  • Is there a way to disable Windows automatically choosing folder templates?

    - by Scott Leis
    Windows Vista (and I guess Win 7 though I haven't used it) sometimes automatically applies templates to folders opened in Explorer based on their content. E.g. a folder with photos automatically gets the columns "Date taken", "Tags", and "Rating". Is there a way to disable the automatic application of this feature while still allowing manual customisation? I really want to apply the "All Items" template to all folders on all drives, and have it stay that way except on a few folders that I manually customise. The reason I want to disable the automatic behaviour is that it's often just wrong. I have folders with over 100 files where Windows has automatically applied a template based on the types of one or two of those files, and the template is wrong for everything else in the same folder.

    Read the article

  • SQL Server plus small files

    - by user1467163
    I have an MSSQL server with 3 volumes that runs some processes that seem to take way too long. One of these processes reads in a zip file, then writes to a database based on what's in the zip file... for each record. I have 2 volumes in use and am creating the third, so I am trying to plan how to lay this out. The OS has to remain on volume 1. The transaction logs should probably go on the new volume and the MDFs on the existing volume 2. Do I put the file store on the volume with the MDFs, so it doesn't interfere with the TLog writes, or with the TLogs, so it doesn't interfere with the TLog flush to the MDFs? I know it's best to have more servers/volumes, but I have to make do with what's on hand for now. I appreciate any suggestions.

    Read the article

  • How to version large binary files?

    - by Walter White
    I run Windows XP inside a virtual machine for some tasks. I attempted to use git to version the VirtualBox image; however, it is about 6 GB after all the service packs, I only have 6 GB of RAM, and git bombs out saying it is out of memory. I would basically like to have snapshots of Windows so that I can simply blow away an image and start anew when I want to. I'd like to have something I can roll back to in the event that an upgrade doesn't work, so I would prefer to use version control, or snapshots if the filesystem supports them. Any ideas on what tools I can use to do that?

    Read the article
