Search Results

Search found 39200 results on 1568 pages for 'zip files'.

  • Virtual hosting in Varnish with individual vcl files for configuration

    - by Michael Sørensen
    I want to put Varnish in front of an Apache and a Tomcat running on the same server. Depending on the IP requested, the request goes to a different backend; this works. For most of the sites the default Varnish logic will work just fine, but for some specific sites I wish to use custom VCL code. I can test for the host name and include config files for the specific domains, but this only works inside the individual subroutines (vcl_recv, etc.). Is there a way to include a complete set of instructions, in one file, per domain, without having to manage separate files for subdomain_recv, subdomain_fetch, etc.? And preferably without running separate instances of Varnish. When I try to include a file at the "root level" of default.vcl, I get a compilation error. Best regards, Michael
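
    A minimal VCL sketch of the per-subroutine include approach described above; the host name and file paths are assumptions, and each subroutine still needs its own include, which is exactly the bookkeeping the question hopes to avoid:

        # default.vcl (sketch; per-domain snippets live in separate files)
        sub vcl_recv {
            if (req.http.host == "www.example.com") {
                include "/etc/varnish/example.com_recv.vcl";
            }
        }

        sub vcl_fetch {
            if (req.http.host == "www.example.com") {
                include "/etc/varnish/example.com_fetch.vcl";
            }
        }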

  • Files slow to save sometimes after Ubuntu upgrade

    - by Matchu
    I haven't been able to track down why this happens sometimes in Ubuntu 10.04 and not at other times. I'll go into gedit or OpenOffice.org and try to save files, and during some sessions it will take up to 10 seconds to save the file, sometimes causing the program to become unresponsive; during those same sessions, the files sometimes save instantly. This didn't start happening until after the 10.04 (Lucid) upgrade. I suspect that something is reading all the changes I make, or that there's some other big file activity going on, or something like that. I disabled Tracker a while back, before the upgrade, and don't see it under the settings; could it be back under a different name in Lucid? You probably don't already know the answer, but how can I go about finding the cause of this problem?

  • Considering modified files for rebuild

    - by harik
    I have a C++ project and I am using Bakefile for the build process; Makefiles are generated for msvc, mingw, gnu, etc. for cross-platform support. The problem is that if I change a .h file (one that is included by other .cpp files), performing a rebuild does not recompile the files that include it, whereas changing any .cpp file does get it recompiled. I expect any file in the project whose modification timestamp has changed to be considered for rebuild, along with whatever includes it. Am I missing a tag that needs to be added to the .bkl files? Please help.
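
    With make-based builds, a changed header only triggers recompilation when that header is listed as a prerequisite of the object file; a minimal hand-written illustration (file names are invented, and the recipe line must start with a tab):

        # main.o is rebuilt whenever main.cpp OR config.h has a newer timestamp
        main.o: main.cpp config.h
            $(CXX) $(CXXFLAGS) -c main.cpp -o main.o

    If the generated Makefiles list only the .cpp files as prerequisites, header edits will never cause a rebuild, which matches the behaviour described above.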

  • Webserver sending corrupt or corrupting served files

    - by NotIan
    EDIT: It looks like the problem was a rootkit that corrupted a number of low-level Linux commands, including top, ps, ifconfig, netstat and others. The problem was resolved by taking all web files off the server and wiping it. A dedicated server we operate is having a strange issue: files are not being sent complete, or are showing up with garbage data. Example: http://sustainablefitness.com/images/banner_bootcamps.jpg To make matters more confusing, this corruption does NOT happen when the files are served over https (I would post a link, but I don't have enough rep points; just add an 's' after http in the link above). When I throw load at the server I get dozens of (swapd)s in top; this is the only thing that really jumps out. I can't post images, but ( imgur.com / ZArSq.png ) is a screenshot of top. I have tried a lot of stuff so far, and I am willing to try anything that I can.

  • Dreamweaver Files uploaded to Win 2008 server cause login prompt

    - by Lil
    I have a customer who uses a four-year-old version of Dreamweaver to edit her web pages. My hosting reseller account is with a company that uses Windows Server 2008. Every time my customer edits a page and uploads it, I have to set the permissions for that file to be readable, manually, from the site's control panel. The customer is furious with me because her files cause the login prompt. Files I upload myself, with either FileZilla or FrontPage, remain readable on the site. I am assuming that her Dreamweaver settings are the cause of the problem, but I don't have that program myself and don't know what to advise her. Any suggestions?

  • How to delete files quicker than rm -rf?

    - by Byakugan
    Is there any way to delete folders/files quicker than with rm -rf? It seems my disk is filled with billions of files (PHP5 session files) which were not deleted by cron, so I need to delete them manually, but it takes hours and the amount barely goes down. Thank you. My command:

        rm -rf /var/lib/php5/*

    I also tried these commands:

        find /var/lib/php5 -name "sess_*" -exec rm {} \;

    and

        perl -e 'chdir "/var/lib/php5/" or die; opendir D, "."; while ($n = readdir D) { unlink $n }'
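
    As a hedged aside: the -exec rm {} \; form above forks one rm process per file, which dominates the run time when there are millions of session files. GNU find can delete matches itself, or batch many arguments per rm invocation:

        # delete matches without spawning any external command (GNU find)
        find /var/lib/php5 -type f -name 'sess_*' -delete

        # or pass many files to each rm invocation instead of one at a time
        find /var/lib/php5 -type f -name 'sess_*' -exec rm -f {} +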

  • Visual Studio 2008: version info for all files updated from one place

    - by ravenspoint
    The version information displayed when the mouse cursor hovers over a file in Windows Explorer is set, for a file built by Visual Studio, in the VERSIONINFO resource. I would like to set the version in one place for all the files built by a solution, preferably when I change the version in the installer properties. Is there a way to do this? The motivation is that if the version is not updated for a file, the installer will leave previous versions of files in place instead of replacing them with new ones. This happens even when the 'RemovePreviousVersions' property is set. To avoid the tedious and error-prone task of updating the version in every file built and installed, I currently remove the version resource from all files, which is not elegant.
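
    One common way to keep the version in a single place is a shared header that every project's .rc file includes; a rough sketch with invented names (only this header is edited when the version changes):

        /* version.h - the only place the version number is edited */
        #define APP_VERSION_MAJOR  1
        #define APP_VERSION_MINOR  4
        #define APP_VERSION_PATCH  0
        #define APP_VERSION_BUILD  123

        /* in each project's .rc file */
        #include "version.h"
        VS_VERSION_INFO VERSIONINFO
         FILEVERSION    APP_VERSION_MAJOR,APP_VERSION_MINOR,APP_VERSION_PATCH,APP_VERSION_BUILD
         PRODUCTVERSION APP_VERSION_MAJOR,APP_VERSION_MINOR,APP_VERSION_PATCH,APP_VERSION_BUILD
         /* ... the rest of the VERSIONINFO block (StringFileInfo etc.) stays unchanged ... */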

  • Copy files with filter (XP)

    - by fire
    I have a huge folder (over 6 GB) with multiple sub-folders that I want to copy onto an external hard drive; however, I do not want it to copy any PDF, EXE or ZIP files across, to save space. Is there any software that will help me achieve this? I have looked at TeraCopy, but it doesn't seem to have any filter mechanism. I am using Windows XP (sigh). Edit: I found the xcopy command; will this do it? Can anyone help me with the syntax?
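
    For what it's worth, xcopy's /EXCLUDE switch skips any file whose path contains one of the strings listed in a plain-text file, so extensions work as filters; a sketch with made-up paths:

        rem excl.txt contains three lines: .pdf  .exe  .zip  (one string per line)
        xcopy "C:\BigFolder" "E:\Backup\BigFolder" /S /E /I /EXCLUDE:C:\excl.txt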

  • Tool for advanced ID3 tag handling and audio file ordering

    - by Juhele
    I have the following problem: some of my files do not have complete ID3 tags, and some have typos or small differences in spelling, so my portable player sees "Mr. President" as a different artist from "Mr President", and so on. I need a tool that can find similar tags and then let me correct the typos or, for example, override "artist" in all selected files with manually entered text. The same goes for empty tag fields: sometimes the track name, album, etc. are OK, but the artist is missing. Without touching the audio quality, of course (but this should be no problem, I think). I have already tried the tools in Winamp, Songbird and other players, and currently the most advanced free tool I have tried is TagScanner; however, it is not able to solve the problem with similar tags. Do you know such a tool? Preferably free and for Windows, if possible, but if you know a commercial app that can do this, please let me know.

  • Looping through a directory on the web and displaying its contents (files and other directories) via Python

    - by al jaffe
    In the same vein as http://stackoverflow.com/questions/2593399/process-a-set-of-files-from-a-source-directory-to-a-destination-directory-in-pyth I'm wondering if it is possible to create a function that, when given a web directory, will list out the files in that directory. Something like:

        files = []
        for file in urllib.listdir(dir):
            if file.isdir:
                # handle this as a directory
            else:
                # handle as a file

    I assume I would need to use the urllib library, but there doesn't seem to be an easy way of doing this, that I've seen at least.
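
    There is no urllib.listdir; a common workaround is to fetch the server's HTML directory index (if the server exposes one) and scrape the links out of it. A rough Python 2 sketch under that assumption, with an invented URL:

        import re
        import urllib

        def list_web_dir(url):
            """Scrape an auto-generated (Apache-style) directory index page."""
            html = urllib.urlopen(url).read()
            # crude: pull href targets, skip query strings and the parent-directory link
            for name in re.findall(r'href="([^"?]+)"', html):
                if name in ('../', '/'):
                    continue
                if name.endswith('/'):
                    print 'directory:', name   # could recurse into url + name here
                else:
                    print 'file:', name

        list_web_dir('http://example.com/some/dir/')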

  • Haskell lazy I/O and closing files

    - by Jesse
    I've written a small Haskell program to print the MD5 checksums of all files in the current directory (searched recursively). Basically a Haskell version of md5deep. All is fine and dandy except if the current directory has a very large number of files, in which case I get an error like:

        <program>: <currentFile>: openBinaryFile: resource exhausted (Too many open files)

    It seems Haskell's laziness is causing it not to close files, even after the corresponding line of output has been completed. The relevant code is below; the function of interest is getList.

        import qualified Data.ByteString.Lazy as BS

        main :: IO ()
        main = putStr . unlines =<< getList "."

        getList :: FilePath -> IO [String]
        getList p = let getFileLine path = liftM (\c -> (hex $ hash $ BS.unpack c) ++ " " ++ path) (BS.readFile path)
                    in mapM getFileLine =<< getRecursiveContents p

        hex :: [Word8] -> String
        hex = concatMap (\x -> printf "%0.2x" (toInteger x))

        getRecursiveContents :: FilePath -> IO [FilePath]
        -- ^ Just gets the paths to all the files in the given directory.

    Are there any ideas on how I could solve this problem? The entire program is available here: http://haskell.pastebin.com/PAZm0Dcb
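
    A way this is often avoided is to read each file strictly, so its handle is closed before the next file is opened; a hedged sketch of getList rewritten that way, reusing the hex, hash and getRecursiveContents helpers from the program above (the trade-off is that each file is held in memory in full):

        import qualified Data.ByteString as B   -- strict ByteString: read whole file, close handle

        getList :: FilePath -> IO [String]
        getList p = mapM getFileLine =<< getRecursiveContents p
          where
            getFileLine path = do
              c <- B.readFile path                           -- strict read; no handle left open
              return (hex (hash (B.unpack c)) ++ " " ++ path)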

  • Ignoring generated files when using "Treat warnings as errors"

    - by krystan honour
    We have started a new project, but we also have this problem in an existing project. When we compile with warning level 4 we also want to switch on 'Treat all warnings as errors'. We are unable to do this at the moment because generated files (in particular Reference.cs files) are missing things like XML comments, and this generates a warning. We do not want to suppress the XML comment warnings entirely across all files, just for specific types of files (namely generated code). I have thought of a way this could be achieved but am not sure if it is the best approach, or indeed where to start. My thinking is that we need to do something with the T4 templates for the generated code so that it does get XML documentation filled in. Does anyone have any ideas? Currently I'm at well over 2,000 warnings (it's a big project).
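
    One hedged option that keeps 'Treat warnings as errors' on everywhere is to disable only CS1591 (missing XML comment) around the generated code, emitted from the T4 template or a post-generation step; a sketch with an invented class name:

        // Reference.cs (generated) - only this file opts out of CS1591
        #pragma warning disable 1591
        namespace Generated.Example
        {
            public partial class ServiceClient
            {
                public int UndocumentedField;   // would normally warn at level 4
            }
        }
        #pragma warning restore 1591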

  • Best Place to Store Config Files and Log Files on Windows for My Program?

    - by Dave
    I need to store log files and config files for my application. Where is the best place to put them? Right now I'm just using the current directory, which ends up putting them in the Program Files directory where my program lives. The log files will probably be accessed by the user somewhat regularly, so %APPDATA% seems a little hard to get to. Is a directory under %USERPROFILE%\My Documents the best choice? It needs to work on all versions of Windows from 2000 onward.
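
    A small C# sketch of the usual split, with invented company and application names: per-user configuration under Application Data, and logs under My Documents where the user can find them easily; both special folders resolve on Windows 2000 and later:

        using System;
        using System.IO;

        class StoragePaths
        {
            static void Main()
            {
                // roaming per-user config, e.g. ...\Application Data\ExampleCo\ExampleApp
                string configDir = Path.Combine(
                    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                    @"ExampleCo\ExampleApp");

                // user-visible logs under My Documents, since users will open them regularly
                string logDir = Path.Combine(
                    Environment.GetFolderPath(Environment.SpecialFolder.Personal),
                    @"ExampleApp\Logs");

                Directory.CreateDirectory(configDir);   // safe to call repeatedly
                Directory.CreateDirectory(logDir);
                Console.WriteLine(configDir);
                Console.WriteLine(logDir);
            }
        }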

  • Search Files (Preferably with index) on Windows 2000 Server

    - by ThinkBohemian
    I have many files on a Windows 2000 Server machine that is set up to act as a networked disk drive. Is there any way I can index the files and make that index available as a search to more people than just me? Bonus if the index can look inside documents such as readme.txt. If there is no easy way to do this globally (for all users), is there a way I could generate and store an index locally on my computer? If this is the wrong place to ask this question, any advice on a more suitable community?

  • multiple FileSystemWatchers to monitor files on local system?

    - by Jason Crowes
    We're writing a text-editor-like tool for our internal accounting package, with actions that can be driven by our own XML language specs. These macro commands are specified in XML files, and we need the ability to monitor whether open files have been modified externally. The only problem is that there may be 20-30 files with different paths open at any one time. Would it be good to use multiple FileSystemWatchers for this scenario? Or would it be better to monitor the root drive and catch the specific events that match a file open in the editor (though lots of events could be raised)? Some are local drives (C, D, E), others are network drives (U, X, G, H). The files are quite chunky too, about 300-400 KB.
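
    For 20-30 known files, one watcher per open document is usually manageable, and each watcher can be filtered down to a single file name so very few events fire; watching whole drive roots instead would flood the editor with unrelated events, and change notifications on network drives depend on the remote server supporting them. A hedged C# sketch (the method name is invented):

        using System.IO;

        static class DocumentWatchers
        {
            // One narrowly filtered watcher per open macro file; dispose it when the file closes.
            public static FileSystemWatcher Watch(string fullPath, FileSystemEventHandler onChanged)
            {
                var watcher = new FileSystemWatcher(
                    Path.GetDirectoryName(fullPath),    // only the containing directory...
                    Path.GetFileName(fullPath));        // ...and only this one file
                watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size;
                watcher.Changed += onChanged;
                watcher.EnableRaisingEvents = true;
                return watcher;
            }
        }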

  • Rename files and directories using substitution and variables

    - by rednectar
    I have found several similar questions that have solutions, except they don't involve variables. I have a particular pattern in a tree of files and directories: the word TEMPLATE. I want a script to rename all of the files and directories, replacing the word TEMPLATE with some other name contained in the variable ${newName}. If I knew that the value of ${newName} was, say, "Fred lives here", then this command would do the job:

        find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/Fred lives here}"' {} \;

    However, if my script is:

        newName="Fred lives here"
        find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/${newName}}"' {} \;

    then the word TEMPLATE is replaced by nothing rather than "Fred lives here". I need the quotes around $0 because there are spaces in the path names, so I can't do something like:

        find . -name '*TEMPLATE*' -exec bash -c 'mv "$0" "${0/TEMPLATE/"${newName}"}"' {} \;

    Can anyone help me get this script to work, so that all files and directories that contain the word TEMPLATE have TEMPLATE replaced by the value of ${newName}? E.g., if newName="A different name" and I had a directory /foo/bar/some TEMPLATE directory/with files, then the directory would be renamed to /foo/bar/some A different name directory/with files, and a file called "some TEMPLATE file" would be renamed to "some A different name file".
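
    The substitution comes out empty because find starts a new bash whose environment does not contain newName; the single quotes correctly stop the outer shell from expanding it, but nothing passes the value in. A hedged variant that exports the variable so the inner shell can see it (as in the original command, only the first TEMPLATE in each path is replaced), with -depth so contents are renamed before their parent directories:

        #!/bin/bash
        newName="Fred lives here"
        export newName     # make it visible to the bash that find spawns
        find . -depth -name '*TEMPLATE*' \
          -exec bash -c 'mv "$0" "${0/TEMPLATE/$newName}"' {} \;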

  • Cannot chown my own files from NFS

    - by valpa
    We have an NFS server that provides home directories for many accounts, which come from a NIS server. I have accounts A and B. As user A, in /home/A, I try to copy "cp -a /home/B/somedir ~/". I then find that in /home/A/somedir all files are owned by user A; "cp -a" did not preserve the original owner (B). If I then do "chown -R B:B somedir", I get an "Operation not permitted" error, so I cannot chown my own files. Any suggestions? I worked around the issue with "chmod 777 /home/A", "su - B", "cp -a somedir /home/A/", then "su - A" and "chmod 755 /home/A", but it is not a good solution.

  • How to programmatically cut/copy/get files to/from the Windows clipboard in a system-standard-compliant way

    - by Ivan
    How do I put a cut/copy reference to specific files and/or folders onto the Windows clipboard, so that when I open a standard Windows Explorer window, go somewhere and press Ctrl+V, the files are pasted? And if I copy or cut some files/folders in Windows Explorer, how do I get that information (the full names, and whether they were cut or copied) in my program? I program in C# 4, but ways of doing it in other languages are also interesting to know.
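
    A hedged C# sketch of both directions using the Windows Forms clipboard API; the path is invented, and the cut-versus-copy flag travels in the "Preferred DropEffect" format (Explorer conventionally stores 2 for cut/move and 5 for copy):

        using System;
        using System.Collections.Specialized;
        using System.IO;
        using System.Windows.Forms;

        static class ClipboardFiles
        {
            [STAThread]
            static void Main()
            {
                // Put a file reference on the clipboard so Explorer can paste it with Ctrl+V.
                var files = new StringCollection();
                files.Add(@"C:\temp\example.txt");        // invented path
                Clipboard.SetFileDropList(files);

                // Read back whatever Explorer (or the code above) placed there.
                if (Clipboard.ContainsFileDropList())
                {
                    foreach (string path in Clipboard.GetFileDropList())
                        Console.WriteLine(path);

                    // Cut vs. copy is signalled via the "Preferred DropEffect" stream.
                    var effect = Clipboard.GetData("Preferred DropEffect") as MemoryStream;
                    if (effect != null)
                        Console.WriteLine((DragDropEffects)effect.ReadByte());
                }
            }
        }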

  • How to copy protected files as an Administrator in Vista (easily)

    - by earlz
    Hello, I have a hard drive I need to back up. On the hard drive are, of course, things like Documents and Settings, which is set so that other people cannot see inside someone's personal folders. I am an administrator, but I cannot figure out how to mark these files so that I am permitted to access and copy them. When I double-click on My Documents, a dialog pops up saying I must have permission to access this and gives me options like OK or Cancel. I click OK and then it says I do not have permission to access these files. I'm an administrator on the system, so I don't understand why Vista is locking me out. How can I set up Vista so that it will let me copy every file, even ones I don't currently have permission to?
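
    One commonly suggested route on Vista is to take ownership of the folder tree from an elevated command prompt and then grant the Administrators group access, instead of clicking through each prompt; a hedged sketch with an invented drive letter:

        takeown /f "D:\Documents and Settings" /r /d y
        icacls "D:\Documents and Settings" /grant Administrators:F /t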

  • puppet agent doesn't retrieve files from master

    - by nicmon
    I have a very basic question regarding Puppet 3.0.1 configuration. I set up a Puppet master server (CentOS) with 2 agents (CentOS and Windows 7); all 3 can ping and access each other, and there is no error at all. I have copied a file to /etc/puppet/files/test2.txt. My site.pp (in /etc/puppet/manifests) contains these lines:

        node default {
          include test
          file { "/tmp/testmaster.txt":
            owner  => root,
            group  => root,
            mode   => 644,
            source => "puppet:///files/test2.txt",
          }
        }

    but no file is created under /tmp/ on the agent servers when I run "puppet agent --test". Here is the output:

        [root@agent1 ~]# puppet agent --test
        Info: Retrieving plugin
        Info: Caching catalog for agent1.mydomain.com
        Info: Applying configuration version '1354267916'
        Finished catalog run in 0.02 seconds

    Running "puppet apply /etc/puppet/manifests/site.pp" on the master does create testmaster.txt under /tmp/.
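
    For what it's worth, a puppet:///files/... URL only resolves if a [files] mount is defined in fileserver.conf on the master; with a stock configuration the file is normally served from a module instead. A hedged sketch of that layout, taking the module name from the "include test" line and assuming the rest:

        # on the master: /etc/puppet/modules/test/files/test2.txt
        file { "/tmp/testmaster.txt":
          owner  => "root",
          group  => "root",
          mode   => "0644",
          source => "puppet:///modules/test/test2.txt",
        }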

  • Making archive from files with same names in different directories

    - by Tim
    Hi, I have some files with the same names but under different directories. For example: path1/filea, path1/fileb, path2/filea, path2/fileb, ... What is the best way to put these files into an archive? Under those directories there are other files that I don't want in the archive. Off the top of my head I think of using Bash, probably ar, tar and other commands, but I am not sure how exactly to do it. Renaming the files seems to make the file names a little complicated; I tend to keep the directory structure inside the archive. Or I might be wrong. Other ideas are welcome! Thanks and regards!
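
    Assuming GNU tar and that only the listed files are wanted, the simplest approach is to name them explicitly, or to let find build the list so the directory structure is kept inside the archive; a sketch using the example paths from the question:

        # explicit list; path1/... and path2/... are preserved inside the archive
        tar -cvf archive.tar path1/filea path1/fileb path2/filea path2/fileb

        # or build the list with find (scales to many directories and odd file names)
        find path1 path2 -maxdepth 1 \( -name filea -o -name fileb \) -print0 |
          tar --null --files-from=- -cvf archive.tar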

  • Which open source repository or version control systems store files' original mtime, ctime and atime

    - by sampablokuper
    I want to create a personal digital archive. I want to be able to check digital files (some several years old, some recent, some not yet created) into that archive and have them preserved, along with their metadata such as ctime, atime and mtime. I want to be able to check these files out of that archive, modify their contents and commit the changes back to the archive, while keeping the earlier commits and their metadata intact. I want the archive to be very reliable and secure, and able to be backed up remotely. I want to be able to check files in and out of the archive from PCs running Linux, Mac OS X 10.5+ or Win XP+. I want to be able to check files in and out of the archive from PCs with RAM capacities lower than the size of the files. E.g. I want to be able to check in/out a 13GB file using a PC with 2GB RAM. I thought Subversion could do all this, but apparently it can't. (At least, it couldn't a couple of years ago and as far as I know it still can't; correct me if I'm wrong.) Is there a libre VCS or similar capable of all these things? Thanks for your help.
