Search Results

Search found 39200 results on 1568 pages for 'zip files'.


  • .htaccess mod_rewrite is preventing certain files from being served

    - by Lucas
    I have a successful mod_rewrite setup that makes the site display as I wish. However, I have migrated the 'mock-up' folder to the root directory, and with these rules in place for the site, some files in the ^pdfs folder are not being served:

        RewriteCond %{REQUEST_FILENAME} -f
        RewriteRule ^ - [L]

        # (old directory)
        RewriteRule ^redesign_03012010/mock-up/([^/]+)/([^/]+)$ /redesign_03012010/mock-up/index.php?page=$1&section=$2 [PT]
        RewriteRule ^redesign_03012010/mock-up/([^/]+)$ /redesign_03012010/mock-up/index.php?page=$1 [PT,L]

        # (new directory)
        RewriteRule ^([^/]+)/([^/]+)$ /index.php?test=1&page=$1&section=$2 [PT]
        RewriteRule ^([^/]+)$ /index.php?test=1&page=$1 [PT,L]

    ... ^pdfs (aka /pdfs/) is not serving the files... any suggestions?

    Read the article

  • Generating PDF files from .NET by using standard .NET GDI printing classes

    - by Philippe Leybaert
    I'm looking for a way to generate PDF files using the standard PrintDocument and Graphics (GDI) classes in .NET. As far as I know, the only way to do that is by printing to a PDF printer. The problem is that a PDF printer driver always asks for a filename, but I need to control the filename from my code. Using a PDF library like PDFSharp or DynamicPDF is not an option, because they all provide their own API for generating PDF files. I need this for an internal application, so dependencies are not a problem. My question is simple: is there a way to control a printer driver (Adobe Acrobat, PDFCreator, ...) in such a way that a filename can be specified and the user is not prompted for anything?

    Read the article

  • Shell script to process files

    - by Harish
    I need to write a shell script to process a huge folder nearly 20 levels deep. I have to process each and every file and check which files contain lines with select, insert, or update. By "line" I mean it should take the text up to the point where it finds a semicolon in that file. I should get a result like this:

        C:/test.java select * from dual
        C:/test.java select * from test
        C:/test1.java select * from tester
        C:/test1.java select * from dual

    and so on. Right now I have a script that reads all the files:

        #!/bin/ksh
        FILE=<FILEPATH to be traversed>
        TEMPFILE=<Location of Temp file>
        cd $FILE
        for f in `find . ! -type d`; do
            cat $FILE/addedText.txt>>$TEMPFILE/newFile.txt
            cat $f>>$TEMPFILE/newFile.txt
            rm $f
            cat $TEMPFILE/newFile.txt>>$f
            rm $TEMPFILE/newFile.txt
        done

    I have very little knowledge of awk and sed, so I'm not sure how to proceed with reading each file and achieving what I want. Can anyone help me with this?
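
    For reference, here is a rough sketch of the scanning step in Python rather than ksh/awk, assuming each statement runs from one of those keywords to the next semicolon and that the files can be treated as plain text (the function name and the '.' starting directory are placeholders):

        import os
        import re

        # Capture from a SQL keyword up to (but not including) the next semicolon.
        STATEMENT = re.compile(r'\b(?:select|insert|update)\b[^;]*', re.IGNORECASE)

        def extract_statements(root):
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        with open(path, errors='ignore') as handle:
                            text = handle.read()
                    except OSError:
                        continue
                    for match in STATEMENT.findall(text):
                        # Collapse whitespace so each statement prints on one line.
                        print(path, ' '.join(match.split()))

        extract_statements('.')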

    Read the article

  • How do you sort files numerically?

    - by Zachary Young
    Hello all, First off, I'm posting this because when I was looking for a solution to the problem below, I could not find one on Stack Overflow. So, I'm hoping to add a little bit to the knowledge base here. I need to process some files in a directory and need the files to be sorted numerically. I found some examples on sorting--specifically using the lambda pattern--at wiki.python.org, and I put this together:

        #!/usr/bin/env python
        import re

        tiffFiles = """ayurveda_1.tif
        ayurveda_11.tif
        ayurveda_13.tif
        ayurveda_2.tif
        ayurveda_20.tif
        ayurveda_22.tif""".split('\n')

        numPattern = re.compile('_(\d{1,2})\.', re.IGNORECASE)

        tiffFiles.sort(cmp, key=lambda tFile: int(numPattern.search(tFile).group(1)))

        print tiffFiles

    I'm still rather new to Python and would like to ask the community if there are any improvements that can be made to this: shortening the code up (removing the lambda), performance, style/readability? Thank you, Zachary
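
    One possible tightening, offered only as a sketch (the key function name is mine): on Python 2.4+ the cmp argument can be dropped entirely, and pulling the key out into a named function removes the lambda; on Python 3, cmp is gone and print is a function.

        import re

        num_pattern = re.compile(r'_(\d+)\.', re.IGNORECASE)

        def numeric_key(filename):
            """Sort key: the integer between the underscore and the extension."""
            match = num_pattern.search(filename)
            # Names without a number sort first instead of raising AttributeError.
            return int(match.group(1)) if match else -1

        tiff_files = ["ayurveda_1.tif", "ayurveda_11.tif", "ayurveda_13.tif",
                      "ayurveda_2.tif", "ayurveda_20.tif", "ayurveda_22.tif"]
        tiff_files.sort(key=numeric_key)
        print(tiff_files)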

    Read the article

  • How to update individual files in a war file

    - by ziggy
    Hi guys, I am doing some research into how I can deploy an application efficiently using a war file. What I currently do is deliver the war file every time there is a release. This means that every time there is a change, no matter how small, I have to build and deliver all the files that make up the application. I am thinking that maybe this is not the correct way to do this. For example, if there is a change to a CSS file, I have to rebuild the war file, which will include all files. This includes recompiling all *.java files as well. In the above example, is it possible to build a war file with just the CSS file, deploy it to the Tomcat server, and have Tomcat just replace the CSS file and leave everything else as is?

    Read the article

  • Would dynamically created JavaScript files be cached?

    - by venksster
    So my application uses a LOT of .js files; that's a lot of HTTP requests. I decided to combine them dynamically at the server in packs of 3-4 files, grouped by functionality. My client-side request is:

        <script type="text/javascript" src="http://mydomain.com/core-js.php"></script>

    My server side does:

        -- core-js.php --
        header("Content-type: application/x-javascript");
        include_once('file1.js');
        include_once('file2.js');
        include_once('file3.js');
        include_once('file4.js');

    I am setting a far-future Expires header on core-js.php. My question is: would core-js.php be cached at the client side? If it would be, could someone please explain how? Thanks!

    Read the article

  • Perl: Recursively rename all files and directories

    - by user305801
    I need to recursively rename every file and directory, converting spaces to underscores and making all file/directory names lowercase. How can I make the following script rename everything in one run? Currently the script needs to be run several times before all the files/directories are converted. The code is below:

        #!/usr/bin/perl
        use File::Find;

        $input_file_dir = $ARGV[0];

        sub process_file {
            $clean_name=lc($_);
            $clean_name=~s/\s/_/g;
            rename($_,$clean_name);
            print "file/dir name: $clean_name\n";
        }

        find(\&process_file, $input_file_dir);
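
    For what it's worth, single-pass behaviour usually comes from processing the tree bottom-up, so a directory's contents are renamed before the directory itself changes name (File::Find's finddepth traverses in that order). A rough sketch of that idea in Python rather than Perl (function names are mine):

        import os
        import re
        import sys

        def clean(name):
            """Lowercase the name and replace whitespace with underscores."""
            return re.sub(r'\s', '_', name).lower()

        def rename_tree(root):
            # Walk bottom-up so entries are renamed before their parent
            # directory's own name changes; one pass is then enough.
            for dirpath, dirnames, filenames in os.walk(root, topdown=False):
                for name in filenames + dirnames:
                    cleaned = clean(name)
                    if cleaned != name:
                        os.rename(os.path.join(dirpath, name),
                                  os.path.join(dirpath, cleaned))

        rename_tree(sys.argv[1])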

    Read the article

  • Loading all files in a directory in a Java applet

    - by WarrenB
    How would one go about programmatically loading all the resource files in a given directory in a JAR file for an applet? The resources will probably change several times over the lifetime of the program, so I don't want to hardcode names in. Normally I would just traverse the directory structure using File.list(), but I get permission issues when trying to do that within an applet. I also looked at using an enumeration with something like ClassLoader.getResources(), but it only finds files of the same name within the JAR file. Essentially what I want to do is (something like) this:

        ClassLoader imagesURL = this.getClass().getClassLoader();
        MediaTracker tracker = new MediaTracker(this);
        Enumeration<URL> images = imagesURL.getResources("resources/images/image*.gif");
        while (images.hasMoreElements()) {
            tracker.add(getImage(images.nextElement()), i);
            i++;
        }

    I know I'm probably missing some obvious function, but I've spent hours searching through tutorials and documentation for a simple way to do this within an unsigned applet.

    Read the article

  • Batch files - number of command line arguments

    - by pyko
    Just converting some shell scripts into batch files, and there is one thing I can't seem to find... and that is a simple count of the number of command line arguments. e.g. if you have:

        myapp foo bar

    In shell:

        $# - 2
        $* - foo bar
        $0 - myapp
        $1 - foo
        $2 - bar

    In batch:

        ?? - 2   <---- what command?!
        %* - foo bar
        %0 - myapp
        %1 - foo
        %2 - bar

    So I've looked around, and either I'm looking in the wrong spot or I'm blind, but I can't seem to find a way to get a count of the number of command line arguments passed in. Is there a command similar to shell's "$#" for batch files? P.S. The closest I've found is to iterate through the %1s and use 'shift', but I need to reference %1, %2, etc. later in the script, so that's no good.

    Read the article

  • TortoiseSvn Merge followed by Create Patch does not include new files

    - by JoelFan
    I am doing a Merge in TortoiseSvn, which modifies some files, deletes some, and adds some. Next I am doing a Create Patch to create a patch file with these changes. The problem is that the resulting patch file includes only the modifications and deletions, not the adds. I have discovered a workaround. If I revert the adds and then do an explicit Add of those files in TortoiseSVN, then do a Patch, it picks up everything, including the Adds. Is there a way to avoid this workaround?

    Read the article

  • Drupal htaccess redirect - all files in directory to new directory

    - by hfidgen
    Hiya, I've moved a site to Drupal, but am now getting a lot of 404 errors due to the search engines taking their time to update their indexes. The 404 paths all look similar to this:

        recipedata/ccp1300006/633_L.jpg
        recipedata/ccp1500005/risotto.jpg
        recipedata/ccp1500006/haddock.jpg

    So I'd like to do some htaccess redirection with mod_rewrite to take care of this lot. All the images DO exist - the path has just changed to /sites/default/files/images/. I've edited a lot of redirects into my htaccess already, but because the ccpXXXXXX directory changes I can't quite figure out the regex. This was my last attempt, but yeah - it doesn't work :) Can anyone give me some pointers?

        RewriteRule ^recipedata/(ccp+)/(.+)$ http://domain.co.uk/sites/default/files/images/$2 [R=301,L]

    This has to sit in the context of the Drupal mod_rewrite rules which already exist:

        <IfModule mod_rewrite.c>
          RewriteEngine on
          # Rewrite URLs of the form 'x' to the form 'index.php?q=x'.
          RewriteCond %{REQUEST_FILENAME} !-f
          RewriteCond %{REQUEST_FILENAME} !-d
          RewriteCond %{REQUEST_URI} !=/favicon.ico
          RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
        </IfModule>

    Thanks!

    Read the article

  • Real-time aggregation of files from multiple machines to one

    - by dmitry-kay
    I need a tool which takes a list of machine names and file wildcards. It then connects to all these machines (over SSH) and begins to monitor changes (lines appended to the end) in each file matched by the wildcards. New lines in each such file are saved on the local machine to a file with the same name. (This is a real-time log-collecting task.) I could use ssh + tail -f, of course, but it is not very robust: if a monitoring process dies and then restarts, some data from the remote files may be lost, because tail -f does not save the position at which it finished. I could write this tool myself, but first I'd like to know whether such a tool already exists.
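
    For reference, the robustness gap is essentially about persisting the read offset between runs. A minimal local sketch of that idea in Python (no SSH handling; the state-file name and polling interval are placeholders):

        import json
        import os
        import time

        STATE_FILE = 'offsets.json'   # placeholder: remembers how far each log was read

        def load_offsets():
            try:
                with open(STATE_FILE) as handle:
                    return json.load(handle)
            except (OSError, ValueError):
                return {}

        def follow(paths, interval=1.0):
            offsets = load_offsets()
            while True:
                for path in paths:
                    try:
                        size = os.path.getsize(path)
                    except OSError:
                        continue
                    last = offsets.get(path, 0)
                    if size < last:        # the file was truncated or rotated
                        last = 0
                    if size > last:
                        with open(path, 'rb') as src:
                            src.seek(last)
                            chunk = src.read()
                        offsets[path] = last + len(chunk)
                        # Append the new bytes to a local file with the same base name.
                        with open(os.path.basename(path), 'ab') as dst:
                            dst.write(chunk)
                        # Persist offsets so a restart resumes where it left off.
                        with open(STATE_FILE, 'w') as handle:
                            json.dump(offsets, handle)
                time.sleep(interval)

        follow(['/var/log/app.log'])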

    Read the article

  • PowerShell Script to Find and Replace for all Files with a Specific Extension

    - by Brandon
    I have several configuration files on Windows Server 2008, nested like such:

        C:\Projects\Project_1\project1.config
        C:\Projects\Project_2\project2.config

    In my configuration I need to do a string replace like such:

        <add key="Environment" value="Dev"/>

    will become:

        <add key="Environment" value="Demo"/>

    I thought about using batch scripting, but there was no good way to do this, and I heard that with PowerShell scripting you can easily perform this. I have found examples of find/replace, but I was hoping for a way that would traverse all folders within my C:\Projects directory and find any files that end with the '.config' extension. When it finds one, I want it to replace my string values. Any good resources to find out how to do this, or any PowerShell gurus who can offer some insight?
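
    Not PowerShell, but for comparison, the traversal-and-replace shape sketched in Python (the old/new strings come from the example above; the function name is mine):

        import os

        OLD = '<add key="Environment" value="Dev"/>'
        NEW = '<add key="Environment" value="Demo"/>'

        def replace_in_configs(root):
            for dirpath, _, filenames in os.walk(root):
                for name in filenames:
                    if not name.endswith('.config'):
                        continue
                    path = os.path.join(dirpath, name)
                    with open(path) as handle:
                        text = handle.read()
                    if OLD in text:
                        # Rewrite the file only when the target string is present.
                        with open(path, 'w') as handle:
                            handle.write(text.replace(OLD, NEW))

        replace_in_configs(r'C:\Projects')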

    Read the article

  • Flash uploader that can handle >2GB files?

    - by Alvin SMith
    Is there an open source Flash uploader that can handle files larger than 2 GB? ASP.net implementations like SlickUpload are not an option, and SWFUpload (and others that I've seen) do not handle files larger than 2 GB. Nor is requiring the user to have Java installed to run applets. This would be for both IE and Firefox. I've seen a couple "large file transfer" sites that have a Flash uploader and claim to go past the 2GB limit (which is the limit for http uploads for most browsers) so I know it is technically possible.

    Read the article

  • extend web server to serve static files

    - by Turtle
    Hello, I want to extend a web server which currently only handles RPC. The web server is written in C#. It provides an abstract handler function like the following:

        public string owsHandler(string request, string path, string param,
                                 OSHttpRequest httpRequest, OSHttpResponse httpResponse)

    And I wrote the following code to handle image files:

        Bitmap queryImg = new Bitmap(path);
        System.IO.MemoryStream stream = new System.IO.MemoryStream();
        queryImg.Save(stream, System.Drawing.Imaging.ImageFormat.Bmp);
        queryImg.Dispose();
        byte[] byteImage = stream.ToArray();
        stream.Dispose();
        return Convert.ToBase64String(byteImage);

    When I test it in the browser, the image is returned but the image dimension info is missing. Should I add something more to the code? Or is there a general way to serve static files? I do not want to serve them from an ASP.NET server. Thanks

    Read the article

  • Synchronising scripts / db / files from dev system to web server

    - by Spoonface
    I work as a freelance web dev, and up until now I have been FTPing my scripts / databases / static files to my web server manually, but I'm finding that too error prone. So I'm looking for an app to automate uploading new and updated scripts / files / databases / etc. I know a lot of independent devs use WinSCP or Unison, but I don't think those apps can sync databases. Does anyone have any other suggestions? It doesn't need to be anything overly feature-rich, as I'm not working within a team or across multiple operating systems or anything like that. I can purchase any reasonably priced license if necessary. My work is primarily for PHP / MySQL / Apache on a Windows system, and is then uploaded to a Linux / Apache server. Thanks for your time!

    Read the article

  • Effective methods for reading and writing large files in C

    - by Bertholt Stutley Johnson
    I'm writing an application that deals with very large user-generated input files. The program will copy about 95 percent of the file, effectively duplicating it and switching a few words and values in the copy, and then appending the copy (in chunks) to the original file, such that each block (consisting of between 10 and 50 lines) in the original is followed by the copied and modified block, and then the next original block, and so on. The user-generated input conforms to a certain format, and it is highly unlikely that any line in the original file is longer than 100 characters. Which would be the better approach?

    a) Use one file pointer, along with variables that hold the current read and write positions, seeking the file pointer back and forth to read and write; or

    b) Use multiple file pointers, one for reading and one for writing.

    I am mostly concerned with the efficiency of the program, as the input files will reach up to 25,000 lines, each about 50 characters long. Thanks!
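
    Not C, but as a point of comparison, here is a rough Python sketch of the interleaving step that sidesteps the seek-juggling entirely by streaming the original into a temporary file and swapping it in afterwards (the block size and the modify step are placeholders):

        import os
        import tempfile

        BLOCK_LINES = 25  # placeholder: the question describes blocks of 10-50 lines

        def modify(lines):
            """Placeholder for the word/value switching described above."""
            return [line.replace('old', 'new') for line in lines]

        def interleave(path):
            out_fd, out_path = tempfile.mkstemp(dir=os.path.dirname(path) or '.')
            with open(path) as src, os.fdopen(out_fd, 'w') as dst:
                block = []
                for line in src:
                    block.append(line)
                    if len(block) == BLOCK_LINES:
                        dst.writelines(block)          # original block
                        dst.writelines(modify(block))  # modified copy right after it
                        block = []
                if block:                              # trailing partial block
                    dst.writelines(block)
                    dst.writelines(modify(block))
            os.replace(out_path, path)                 # swap the result into place

        interleave('input.txt')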

    Read the article

  • Using Ant to merge two different properties files

    - by Justin
    I have a default properties file, and some deployment specific properties files that override certain settings from the default, based on deployment environment. I would like my Ant build script to merge the two properties files (overwriting default values with deployment specific values), and then output the resulting properties to a new file. I tried doing it like so but I was unsuccessful:

        <target depends="init" name="configure-target-environment">
            <filterset id="application-properties-filterset">
                <filtersfile file="${build.config.path}/${target.environment}/application.properties" />
            </filterset>
            <copy todir="${web-inf.path}/conf"
                  file="${build.config.path}/application.properties"
                  overwrite="true" failonerror="true">
                <filterset refid="application-properties-filterset" />
            </copy>
        </target>

    Read the article

  • How to overwrite specific lines on text files

    - by iTayb
    I have two text files. I'd like to take a specific part of the first text file and use it to overwrite a part of the second text file. This is how I read the files:

        List<string> PrevEp = File.ReadAllLines(string.Format(@"{0}naruto{1}.ass", url, PrevEpNum)).ToList();
        List<string> Ep = File.ReadAllLines(string.Format(@"{0}naruto{1}.ass", url, EpNum)).ToList();

    The part of PrevEp that I need: from the start until it meets a line that includes Creditw,,0000,0000,0000. The part I would like to overwrite in Ep: from the start to a line which is exactly Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text. I'm not sure how to do it. Could you lend me a hand? Thank you very much, gentlemen.
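
    Not C#, but a sketch of the list-splicing idea in Python, assuming the Creditw line is excluded from the copied head and the Format line in the new file is kept (both boundary choices are guesses; the file names are placeholders):

        MARKER_PREV = 'Creditw,,0000,0000,0000'
        MARKER_EP = 'Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text'

        def splice(prev_lines, ep_lines):
            # First line of the previous episode that contains the credits marker.
            end_prev = next(i for i, line in enumerate(prev_lines) if MARKER_PREV in line)
            # Line of the new episode that is exactly the Format header.
            end_ep = next(i for i, line in enumerate(ep_lines) if line.strip() == MARKER_EP)
            # Head of the previous file replaces everything before the Format line.
            return prev_lines[:end_prev] + ep_lines[end_ep:]

        with open('prev_episode.ass', encoding='utf-8') as f:
            prev_lines = f.read().splitlines()
        with open('episode.ass', encoding='utf-8') as f:
            ep_lines = f.read().splitlines()
        with open('episode.ass', 'w', encoding='utf-8') as f:
            f.write('\n'.join(splice(prev_lines, ep_lines)) + '\n')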

    Read the article

  • Is there a limit for the number of files in a directory on an SD card?

    - by jamesh
    I have a project written for Android devices. It generates a large number of files, text and images, each day. The app uses a database to reference these files. The app is supposed to clear up these files after a little use (perhaps after a few days), but this process may or may not be working. That is not the subject of this question. Due to a historic accident, the organization of the files is somewhat naive: everything is in the same directory, a .hidden directory which contains a zero-byte .nomedia file to prevent the MediaScanner from indexing it. Today, I am seeing an error reported:

        java.io.IOException: Cannot create: /sdcard/.hidden/file-4200.html
            at java.io.File.createNewFile(File.java:1263)

    Regarding the SD card, I see it has plenty of storage left, but counting the files:

        $ cd /Volumes/NO_NAME/.hidden
        $ ls | wc -w
        9058

    Deleting a number of files seems to have allowed the file creation for today to proceed. Regrettably, I did not try touching a new file to try and reproduce the error on a command line; I also deleted several hundred files rather than a handful. However, my question is: are there hard limits on file size or on the number of files in a directory? Am I even on the right track here?

    Nota bene: the SD card is as-is - i.e. I haven't formatted it, so I would guess it is a FAT-* format. The FAT-32 format has hard limits of a file size of 2GB (well above the file sizes I am dealing with) and a limit on the number of files in the root directory. I am definitely not writing files in the root directory.

    Read the article
