Search Results

Search found 40999 results on 1640 pages for 'duplicate files'.


  • Batch file script to remove special characters from filenames (Windows)

    - by njreed.myopenid.com
    I have a large set of files, some of which contain special characters in the filename (e.g. ä, ö, %, and others). I'd like a script to iterate over these files and rename them, removing the special characters. I don't really mind exactly what it does; it could replace them with underscores, for example, so Störung%20.doc would be renamed to St_rung_20.doc. In order of preference: (1) a DOS batch file; (2) a Windows script file to run with cscript (VBS); (3) a third-party piece of software that can be run from the command line (i.e. no user interaction required); (4) a script in another language, for which I'd have to install an additional script engine. Background: I'm trying to encrypt these files with GnuPG on Windows, but it doesn't seem to handle special characters in filenames with the --encrypt-files option.
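
    Since option 4 is on the list, here is a minimal sketch in Python (my own choice of language, and the folder path is hypothetical) that replaces anything other than letters, digits, dot, dash, and underscore with an underscore:

        import os
        import re

        folder = r"C:\files-to-encrypt"  # hypothetical path; point it at the real folder

        for name in os.listdir(folder):
            # Keep ASCII letters, digits, dot, dash, and underscore; replace everything else.
            safe = re.sub(r"[^A-Za-z0-9._-]", "_", name)
            if safe != name:
                target = os.path.join(folder, safe)
                if os.path.exists(target):
                    print("skipped (target already exists):", name)
                else:
                    os.rename(os.path.join(folder, name), target)

    With this rule, Störung%20.doc becomes St_rung_20.doc, matching the example above.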

    Read the article

  • Auditing events 4656 and 4658 on the Windows folder on Server 2008

    - by PCurd
    During an overnight system state backup we are seeing thousands of success audit events (4656, 4658) on c:\windows\servicing, system32, and other folders inside the Windows directory. We use file success auditing on some files, so I can't disable it, but this deluge is filling up the logs and making reporting tricky. What is the harm of changing the auditing settings on the Windows folder? What are the recommended settings to put on the files for people doing system state backups? Thanks,

    Read the article

  • APC (PHP Cache) Uptime 0 minutes, not caching

    - by Jussi
    My goal is to implement APC as an opcode cache for a Drupal 6 production site. I have so far tested APC with several PHP files, with and without including other PHP files with include_once. I also tried to tweak the apc.ini values for shm_size, apc.include_once_override, and apc.stat, restarting Apache every time. The result is that apc.php does not show any changes in any values (except, of course, the changed apc.ini values are shown as they should be). Every time I refresh the apc.php test page, the start time resets to the current time, showing an uptime of 0 minutes. The apc.php test page shows:

        General Cache Information: APC Version 3.1.9, PHP Version 5.2.10, APC Host xxxx.xx.xx, Server Software Apache/2.2.3 (CentOS), Shared Memory 1 Segment(s) with 128.0 MBytes (mmap memory, pthread mutex Locks locking), Start Time 2011/07/26 11:53:56, Uptime 0 minutes, File Upload Support 1
        Cached Files 0 (0.0 Bytes), Hits 1, Misses 1, Request Rate (hits, misses) 2.00 cache requests/second, Hit Rate 1.00 cache requests/second, Miss Rate 1.00 cache requests/second, Insert Rate 0.00 cache requests/second, Cache full count 0
        Cached Variables 0 (0.0 Bytes), Hits 0, Misses 0, Request Rate (hits, misses) 0.00 cache requests/second, Hit Rate 0.00 cache requests/second, Miss Rate 0.00 cache requests/second, Insert Rate 0.00 cache requests/second, Cache full count 0
        apc.cache_by_default 1, apc.canonicalize 1, apc.coredump_unmap 0, apc.enable_cli 0, apc.enabled 1, apc.file_md5 0, apc.file_update_protection 2, apc.filters, apc.gc_ttl 3600, apc.include_once_override 0, apc.lazy_classes 0, apc.lazy_functions 0, apc.max_file_size 16, apc.mmap_file_mask /tmp/apcphp5.095eRm, apc.num_files_hint 1024, apc.preload_path, apc.report_autofilter 0, apc.rfc1867 0, apc.rfc1867_freq 0, apc.rfc1867_name APC_UPLOAD_PROGRESS, apc.rfc1867_prefix upload_, apc.rfc1867_ttl 3600, apc.serializer default, apc.shm_segments 1, apc.shm_size 128M, apc.slam_defense 0, apc.stat 0, apc.stat_ctime 0, apc.ttl 7200, apc.use_request_time 1, apc.user_entries_hint 4096, apc.user_ttl 7200, apc.write_lock 1
        Host Status Diagrams: Free: 128.0 MBytes (100.0%), Hits: 1 (50.0%), Used: 20.3 KBytes (0.0%), Misses: 1 (50.0%)
        Detailed Memory Usage and Fragmentation: Fragmentation: 0%

    phpinfo shows:

        Server API CGI/FastCGI
        APC: Version 3.1.9, APC Debugging Enabled, MMAP Support Enabled, MMAP File Mask /tmp/apcphp5.JkKDk7, Locking type pthread mutex Locks, Serialization Support php, Revision $Revision: 308812 $, Build Date Jul 21 2011 14:31:12

    I followed these steps to find out whether suexec settings would prevent caching: http://www.litespeedtech.com/support/forum/showthread.php?t=4189

        [root@host /]# ps -ef|grep lsphp
        root 20402 17833 0 11:21 pts/0 00:00:00 grep lsphp
        [root@host /]# ps -waux
        root 17833 0.0 0.1 5004 1484 pts/0 S 10:39 0:00 bash

    ...which indicates that there is no lsphp running on the host. I also read the following article and its comments, concluding that in my case the problem is not suexec, as the user apache is the httpd process owner: http://www.brandonturner.net/blog/2009/07/fastcgi_with_php_opcode_cache/ The suexec command is also not recognized when logged in and launched as root on the host, and I'm almost confident that there is no cPanel running on the host, so a setting there should not be resetting the running cache process at some interval. This leaves me with few clues about where to head next. I tried to set (with chown and chgrp) apache as the owner of the apc.php file and some test PHP files, resulting in a 500 server error. Is there a way to check whether file permissions prevent APC from staying running? I'm tremendously grateful for any suggestions or help.

    Read the article

  • Why does Perl's readdir() cache directory entries?

    - by Frank Straetz
    For some reason Perl keeps caching the directory entries I'm trying to read using readdir:

        opendir(SNIPPETS, $dir_snippets); # or die...
        while ( my $snippet = readdir(SNIPPETS) ) {
            print ">>>".$snippet."\n";
        }
        closedir(SNIPPETS);

    Since my directory contains two files, test.pl and test.man, I'm expecting the following output:

        .
        ..
        test.pl
        test.man

    Unfortunately Perl returns a lot of files that have since vanished, for example because I tried to rename them. After I move test.pl to test.yeah, Perl returns the following list:

        .
        ..
        test.pl
        test.yeah
        test.man

    What's the reason for this strange behaviour? The documentation for opendir, readdir, and closedir doesn't mention any caching mechanism. "ls -l" clearly lists only two files.

    Read the article

  • web.config, configSource, and "The 'xxx' element is not declared" warning.

    - by UpTheCreek
    I have broken down the horribly unwieldy web.config file into individual files for some of the sections (e.g. connectionStrings, authentication, pages, etc.) using the configSource attribute. This is working fine, but the individual XML files that hold the section 'snippets' cause warnings in VS. For example, a file named roleManager.config is used for the role manager section and looks like this:

        <roleManager enabled="false">
        </roleManager>

    However, I get a blue squiggle under the roleManager element in VS and the following warning: "The 'roleManager' element is not declared". I guess this is something to do with valid XML and schemas. Is there an easy way to fix this? Something I can add to the individual files? Thanks. P.S. I have heard that it is bad practice to break the web.config file out like this, but I don't really understand why - can anyone enlighten me?

    Read the article

  • Script to recursively check for and select dependencies

    - by rp.sullivan
    I have written a script that does this, but it is one of my first scripts ever, so I am sure there is a better way :) Let me know how you would go about doing this. I'm looking for a simple yet efficient way to do it. Here is some important background info (it might be a little confusing, but hopefully by the end it will make sense):

    1) This image shows the structure/location of the relevant dirs and files.

    2) The packages file located at ./config/default/config/packages is a space-delimited file. Field 5 is the "package name", which I will call $a for explanation's sake. Field 4 is the name of the dir containing the $a dir, which I will call $b. Field 1 shows whether the package is selected or not: "X" (capital x) for selected and "O" (capital o as in orange) for not selected. Here is an example of what the packages file might contain:

        ...
        X ---3------ 104.800 database gdbm 1.8.3 / base/library CROSS 0
        O -1---5---- 105.000 base libiconv 1.13.1 / base/tool CROSS 0
        X 01---5---- 105.000 base pkgconfig 0.25 / base/tool CROSS 0
        X -1-3------ 105.000 base texinfo 4.13a / base/tool CROSS DIETLIBC 0
        O -----5---- 105.000 develop duma 2_5_15 / base/development CROSS NOPARALLEL 0
        O -----5---- 105.000 develop electricfence 2_4_13 / base/development CROSS 0
        O -----5---- 105.000 develop gnupth 2.0.7 / extra/development CROSS NOPARALLEL FPIC-QUIRK 0
        ...

    3) For almost every package listed in the packages file there is a corresponding .cache file. The .cache file for package $a would be located at ./package/$b/$a/$a.cache The .cache files contain a list of dependencies for that particular package. Here is an example of what one of the .cache files might look like. Note that the dependencies are field 2 of lines containing "[DEP]", and these dependencies are all names of packages in the packages file.

        [TIMESTAMP] 1134178701 Sat Dec 10 02:38:21 2005
        [BUILDTIME] 295 (9)
        [SIZE] 11.64 MB, 191 files
        [DEP] 00-dirtree
        [DEP] bash
        [DEP] binutils
        [DEP] bzip2
        [DEP] cf
        [DEP] coreutils
        ...

    So with all that in mind, I'm looking for a shell script that, from within the "main dir": looks at the ./config/default/config/packages file and finds the "selected" packages; reads the corresponding .cache files; compiles a list of dependencies that excludes the already selected packages; then selects those dependencies (by changing field 1 to X) in the ./config/default/config/packages file, and repeats until all dependencies are met. Note: the script will ultimately end up in the "scripts dir" and be called from the "main dir". A rough sketch of the logic follows. If this is not clear, let me know what needs clarification. For those interested, I'm playing around with T2 SDE. If you are into playing around with Linux it might be worth taking a look.
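
    The asker wants a shell script; as a point of comparison, here is a minimal sketch of the same selection loop in Python (my choice of language, using only the field layout described above - treat the paths and field positions as assumptions to verify against a real T2 tree):

        import os

        PACKAGES = "./config/default/config/packages"

        def read_packages():
            """Parse the packages file into dicts: selected flag, repo dir ($b), package name ($a)."""
            entries = []
            with open(PACKAGES) as fh:
                for line in fh:
                    fields = line.split()
                    if len(fields) < 5:
                        continue
                    entries.append({"selected": fields[0] == "X",
                                    "repo": fields[3],     # field 4: dir containing the package dir
                                    "name": fields[4]})    # field 5: package name
            return entries

        def deps_of(repo, name):
            """Collect the [DEP] entries from ./package/<repo>/<name>/<name>.cache, if present."""
            cache = os.path.join("package", repo, name, name + ".cache")
            deps = set()
            if os.path.isfile(cache):
                with open(cache) as fh:
                    for line in fh:
                        parts = line.split()
                        if len(parts) >= 2 and parts[0] == "[DEP]":
                            deps.add(parts[1])
            return deps

        entries = read_packages()
        by_name = {e["name"]: e for e in entries}

        changed = True
        while changed:                  # keep going until a pass selects nothing new
            changed = False
            for entry in [e for e in entries if e["selected"]]:
                for dep in deps_of(entry["repo"], entry["name"]):
                    target = by_name.get(dep)
                    if target and not target["selected"]:
                        target["selected"] = True
                        changed = True

        print("selected:", sorted(e["name"] for e in entries if e["selected"]))

    Writing the selections back means rewriting field 1 (O to X) of the matching lines in the packages file, which is left out of this sketch.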

    Read the article

  • Add Command prompt in VS 2008 Express Edition manually

    - by Kumar
    Hi all, to add the command prompt in VS 2008 Express Edition, I did the following: Tools > External Tools > Add, then entered the following information:

        Title: Visual Studio 2008 Command Prompt
        Command: cmd.exe
        Arguments: %comspec% /k ""C:\Program Files\Microsoft Visual Studio 9.0\VC\vcvarsall.bat"" x86
        Initial Directory: $(ProjectDir)

    Then I clicked OK/Apply. After this, when I went to the Tools menu and clicked Visual Studio 2008 Command Prompt, the command prompt opened but I got the following error message:

        '"C:\Program Files\Microsoft Visual Studio 9.0\VC\vcvarsall.bat"' is not recognized as an internal or external command, operable program or batch file.
        C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE

    Please can somebody help me fix this problem, or show me from scratch how to add a command prompt to the Tools menu manually in VS 2008 Express Edition? Thanks, Kumar

    Read the article

  • Logging Virus Definition Updates for MS Security Essentials in The Security Event Log

    - by Steve
    I would like to log a security event in Windows 7 whenever the Microsoft Security Essentials 2 virus definition files are updated, deleted, or changed. I was expecting to do this with an audit setting on one of the MS Security Essentials folders, but I wasn't sure which one, or how to avoid getting swamped with messages. What folder or files should I audit to track definition updates (or corruption) in the security event log, or is there a better approach?

    Read the article

  • chkdsk, SeaTools, and "does not have enough space to replace bad clusters"

    - by Zian Choy
    When I tried to do a Windows Vista Complete PC Backup, I received an error message that blathered about bad sectors. Then, when I ran chkdsk /r on the destination drive, this is what I got:

        C:\Windows\system32>chkdsk /R E:
        The type of the file system is NTFS.
        Volume label is Desktop Backup.
        CHKDSK is verifying files (stage 1 of 5)...
        822016 file records processed. File verification completed.
        1 large file records processed. 0 bad file records processed. 0 EA records processed. 0 reparse records processed.
        CHKDSK is verifying indexes (stage 2 of 5)...
        848938 index entries processed. Index verification completed. 0 unindexed files processed.
        CHKDSK is verifying security descriptors (stage 3 of 5)...
        822016 security descriptors processed. Security descriptor verification completed. 13461 data files processed.
        CHKDSK is verifying file data (stage 4 of 5)...
        The disk does not have enough space to replace bad clusters detected in file 239649 of name .
        The disk does not have enough space to replace bad clusters detected in file 239650 of name .
        The disk does not have enough space to replace bad clusters detected in file 239651 of name .
        An unspecified error occurred.f 822000 files processed)

    Yet, when I ran the SeaTools short and long generic tests on the Seagate disk, I didn't receive any errors. I know that I could reformat the disk and try running chkdsk /r again, but I'd prefer to avoid waiting 4 hours in the hope that the problem has magically fixed itself. On the other hand, if I RMA the drive to Seagate, I have no SeaTools error number to use and they may claim that the drive is just fine. What should I try next? Side frustration: there is plenty of free hard drive space. The E: partition has 182 GB free.

    Read the article

  • How to stop HomeGroup from sharing folders?

    - by srisar
    Hi, I have a HomeGroup on my PC and laptop, both running Windows 7. I can share folders and files easily, but the problem is I can't stop sharing a folder. Even when I went to Computer Management and stopped sharing from there, the "stopped" shares still show up inside the HomeGroup. I can't open them anymore because it says the network resource is unavailable, but the folders are still showing. How do I hide them?

    Read the article

  • Does anyone know of a library in Java that can parse ESRI Shapefiles?

    - by KingNestor
    I'm interested in writing a visualization program for the road data in the 2009 TIGER/Line Shapefiles. I'd like to draw the line data to display all the roads for my county. The ESRI Shapefile, or simply shapefile, is a popular geospatial vector data format for geographic information systems software. It is developed and regulated by ESRI as a (mostly) open specification for data interoperability among ESRI and other software products. A "shapefile" commonly refers to a collection of files with ".shp", ".shx", ".dbf", and other extensions on a common prefix name (e.g., "lakes.*"). The actual shapefile relates specifically to the file with the ".shp" extension; however, this file alone is incomplete for distribution, as the other supporting files are required. Does anyone know of existing libraries for parsing and reading in the line data for Shapefiles?
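
    Not a Java library, but as an illustration of how little code reading the line geometry takes, here is a sketch using the Python pyshp package (the filename roads.shp is hypothetical; verify the record layout against the TIGER/Line documentation):

        # pip install pyshp
        import shapefile

        sf = shapefile.Reader("roads.shp")   # also picks up roads.shx / roads.dbf

        for rec in sf.shapeRecords():
            shape = rec.shape
            if shape.shapeType == shapefile.POLYLINE:
                # shape.points is the list of (x, y) vertices to draw for this road segment
                print(rec.record[0], len(shape.points), "points")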

    Read the article

  • How to mount an Oracle database to a new instance?

    - by Vimvq1987
    I have an instance of Oracle 10g R2 installed on Windows Server 2003. This instance was running a database which does not have any backup. The OS then went down and could not be repaired; all I have left are the files of the old instance. How can I restore the database from these files to a new instance? A step-by-step guide would be much appreciated because I'm new to Oracle. Thank you very much

    Read the article

  • How to add a whole directory or project output to WiX package

    - by Coincoin
    We decided to switch from the VS integrated setup to WiX. However, what we currently do is use project output files as the input for the setup project. This lets us easily add Application Files to a directory (for images, samples, and other resources...) and those files are automatically added to the setup when we build. I could not find any similar feature in WiX. WiX seems to require one Directory entry and one File entry for each and every directory and file. This would require us to change the WiX source every time a file is added, which, to my eyes, is prohibitive since we have so many of them. Is there any integrated way of doing that with WiX, or do I have to write my own task that will create a WiX source before calling candle?
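
    For reference, WiX ships a harvesting tool (heat.exe) that generates this authoring from a directory. If a home-grown pre-candle step is preferred instead, here is a rough sketch of the idea in Python; the element layout assumes WiX 3.5-style authoring and the paths and group names are hypothetical, so check them against the WiX documentation:

        import os
        from xml.sax.saxutils import quoteattr

        SOURCE_DIR = r"..\MyApp\bin\Release"   # hypothetical output folder to harvest
        OUTPUT_WXS = "AppFiles.generated.wxs"  # fragment to pass to candle with the main .wxs

        lines = [
            '<?xml version="1.0" encoding="utf-8"?>',
            '<Wix xmlns="http://schemas.microsoft.com/wix/2006/wi">',
            '  <Fragment>',
            '    <ComponentGroup Id="AppOutputGroup" Directory="INSTALLFOLDER">',
        ]

        # One component per file; Guid="*" asks WiX to derive a stable GUID from the install path.
        # Subdirectories are skipped here for brevity; a real harvester would recurse and emit
        # Directory elements for them.
        for name in sorted(os.listdir(SOURCE_DIR)):
            path = os.path.join(SOURCE_DIR, name)
            if os.path.isfile(path):
                lines.append('      <Component Guid="*">')
                lines.append('        <File Source=%s />' % quoteattr(path))
                lines.append('      </Component>')

        lines += ['    </ComponentGroup>', '  </Fragment>', '</Wix>']

        with open(OUTPUT_WXS, "w") as fh:
            fh.write("\n".join(lines))

    The main .wxs would then pull in AppOutputGroup with a ComponentGroupRef, and the generator runs as a pre-build step before candle.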

    Read the article

  • Javascript upload in spite of same origin policy

    - by Brian M. Hunt
    Can one upload files to a domain other than the domain a script originates from? For example, suppose you're hosting files on www.example.com and you want to upload files to uploads.example.com: would the following script (using uploadify) violate the same origin policy?

        <!-- from http://www.example.com/upload.html -->
        <input id="fileInput" name="fileInput" type="file" />
        <script type="text/javascript">// <![CDATA[
        $(document).ready(function() {
          $('#fileInput').uploadify({
            'uploader'  : 'uploadify.swf',
            'script'    : 'http://upload.example.com/uploadify.php',
            'cancelImg' : 'cancel.png',
            'auto'      : true,
            'folder'    : '/uploads'
          });
        });
        // ]]></script>

    I haven't seen a clear reference on the same origin policy that would indicate one way or the other whether this would be an issue with any browsers.

    Read the article

  • What's the fastest filesystem for developer builds?

    - by Dan Fabulich
    I'm putting together a Linux box that will act as a continuous integration build server; we'll mostly build Java stuff, but I think this question applies to any compiled language. What filesystem and configuration settings should I use? (For example, I know I won't need atime for this!) The build server will spend a lot of time reading and writing small files, and scanning directories to see which files have been modified.

    Read the article

  • Checking the integrity of a Windows Server 2008 backup

    - by Norberto
    Hi all, is there a simple way to check the integrity of a backup made with Windows Backup on Windows Server 2008 Enterprise (not R2)? Today I tried to restore some files from a backup I made some days ago, and the Windows Backup snap-in reported that the file was unreadable. I think the corruption occurred because the network share on which the files were being stored went offline during the backup process; that was my fault. Regards Norberto

    Read the article

  • How to use Mahout in a Windows environment?

    - by oopdemo
    I am trying to use Mahout in an application running on Windows. I want to build clusters from a Lucene index using k-means. As soon as I have to create sequence files (creating vectors from a Lucene index), I get a Hadoop exception, since Hadoop makes command-line calls to programs that don't exist in a Windows environment (e.g. chmod). Running in Cygwin is not an option, since I want to be able to run the app from Eclipse. So my question is: is there a way to avoid having to create sequence files to retrieve my vectors from a Lucene index, or is there a way to create sequence files in a Windows environment?

    Read the article

  • How do I access a hard drive from a computer where Windows has been deleted?

    - by Intredasting
    Here is the current situation: my cousin deleted Windows from his hard drive (yeah, don't ask...). The hard drive still has about 200 GB of files on it that he may want to recover before we format it and reinstall Windows 7. Is there a way I can create a bootable CD from some utility that will allow me to access the files on the hard drive and copy them to a flash drive? What's the best utility for that?

    Read the article

  • Compressing with RAR vs ZIP

    - by FerranB
    A lot of people compress files with RAR, send compressed files as RAR, and so on. ZIP is more standard and works on all platforms: Windows users have ZIP support included, and Linux users have no trouble with that file format. The tests I did some time ago showed me that RAR compresses better (by a few kilobytes, no more), but not enough to justify non-free software when ZIP works on almost all computers for free. Why do some people use RAR rather than ZIP for compressing?

    Read the article

  • Looking for speech-to-text tool (convert .wav to text)

    - by David
    I have the ability to get .wav files of voice mails emailed to me, but sometimes I'll be sitting in a meeting and I need to know the content of a message without playing it out loud. Are there any good (and, preferably, free) tools for converting .wav files to text? I know Google Voice has this capability, but I can't determine if it'll work on a file-by-file basis. I realize that this is a difficult research problem, but even an 80% solution might be workable.
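
    One free option worth a try is sketched below, assuming the open-source SpeechRecognition Python package with its bundled Google Web Speech backend (an internet connection is required, the voicemail.wav filename is hypothetical, and transcription accuracy on phone-quality audio will vary):

        # pip install SpeechRecognition
        import speech_recognition as sr

        recognizer = sr.Recognizer()

        with sr.AudioFile("voicemail.wav") as source:   # PCM WAV attachment saved from the email
            audio = recognizer.record(source)           # read the whole file into memory

        try:
            print(recognizer.recognize_google(audio))   # free web API, rate-limited
        except sr.UnknownValueError:
            print("Speech was unintelligible")
        except sr.RequestError as err:
            print("API request failed:", err)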

    Read the article

  • Selective patching

    - by Franz
    Hi, I have a folder and a patch for that folder. Now, I do not want to include every change made in the patch in my commit. I can select which files I want to exclude in Subclipse, but can I do the same with only certain lines in those files?

    Read the article

  • Kohana and JavaScript path

    - by Leonti
    Hi! I have the following Kohana setup: all my files are placed under 'public_html/koh', and my JS files are placed under 'public_html/koh/media/js/'. I use the html::script helper to include those JavaScript files, which generates the script tags for me. In my JS I access one of the controllers at 'json/getsomething' (which is http://localhost/koh/json/getsomething). It works OK as long as I stay at the top of the controller: http://localhost/koh/home. When I go to 'http://localhost/koh/home/index' it renders the same page, of course, but 'json/getsomething' is not accessible from JavaScript anymore. How can I solve this problem? Include the JavaScript using an absolute path? Create a variable in JS like var fullPath = 'http://localhost/koh/'? What is the best practice here? Leonti

    Read the article
