Search Results



  • How to analyze the CBS.log file from an SFC scan to detect corrupt files in Windows 7?

    - by energydream2007
    Hey, I have some problems with my Win7, so I ran sfc /scannow and got this message: "Windows Resource Protection found corrupt files and was unable to fix some of them..." I was also told to look in the log file's folder for cbs.log. After that I ran this command to pull out the actual problems/corrupt files: findstr /c:"[SR]" %windir%\Logs\CBS\CBS.log >"%userprofile%\Desktop\sfcdetails.txt" This command created an sfcdetails text file. Can someone help me analyze this file? I haven't found a detailed article about it so far.
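
    A quick way to narrow that extract down further, assuming the [SR] entries use the usual "Cannot repair member file" wording for files SFC could not fix:

        findstr /c:"Cannot repair" "%userprofile%\Desktop\sfcdetails.txt"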

    Read the article

  • Online OCR website for processing an entire pdf file at one time?

    - by Tim
    I am looking for an online OCR website that can process a multi-page PDF file at one time, free preferably. I know http://www.newocr.com/. If I am correct, it can only OCR one page at a time, by manually clicking "Preview" and then "OCR" for each page. After each page is OCRed, I have to copy out the text result manually too. If my PDF file has ~30 pages, it will be tedious to repeat that process for every page. Are there other online websites that OCR a whole PDF file without requiring manual steps for each page? Thanks!

    Read the article

  • Why does Tomcat like deleting my context.xml file?

    - by staticsan
    I'm developing a web-based Java application at work and (obviously) have to run it locally during development. I've worked through the Tomcat docs and have a suitable context.xml file in /etc/tomcat6/Catalina/localhost/, but every so often Tomcat decides to delete it, which means I have to put it back and restart Tomcat. Why does it do this? I have searched the Tomcat docs about it and am none the wiser. (Oh yes: it's not actually called context.xml but owners.xml, as that's the HTTP path prefix for this application.) Update: I've now seen Tomcat delete the file while it was running. I think I need to file a bug...
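
    One stopgap some people use while tracking this down (a sketch, assuming the file lives on an ext filesystem and you accept that Tomcat will log errors instead of deleting it) is to make the descriptor immutable:

        # make the context descriptor immutable so nothing, including Tomcat, can delete it
        sudo chattr +i /etc/tomcat6/Catalina/localhost/owners.xml
        # undo it when you need to edit the file again
        sudo chattr -i /etc/tomcat6/Catalina/localhost/owners.xml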

    Read the article

  • Windows 7 - swap file on a USB disk? [closed]

    - by Sara Cohen
    Possible Duplicate: "How to move the page file to another physical disk location Windows 7". I was temporarily given a PC running Windows 7 Ultimate. The problem is its hard drive is full; there are only about 250 MB free. The swap file is set to none. It has 4 GB of RAM. When I load a few tabs in Chrome or IE and start a game, it runs out of memory. I already emptied the Recycle Bin, %temp%, etc. Deleting/moving user files or adding RAM is not an option. I do have a USB 3 7200 RPM drive; it's connected to a USB 3 port and is really fast. Is there a way to create a swap file on that drive?
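
    For reference, the command-line way to define a page file on another drive in Windows 7 is via wmic; this is only a sketch (E: is an assumed drive letter), and Windows will usually refuse to keep a page file on a disk it classes as removable, so it may simply not stick for a USB drive:

        wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False
        wmic pagefileset create name="E:\pagefile.sys"
        wmic pagefileset where name="E:\\pagefile.sys" set InitialSize=2048,MaximumSize=4096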

    Read the article

  • How can you change the OST data file that a Windows/Outlook email profile points to?

    - by Howiecamp
    I'm running Outlook 2007 on Windows 7. I'd like to change the OST data file that a Windows/Outlook email profile points to. I've attempted to change it as follows:
    1. Go into the "Mail" Control Panel applet.
    2. Click the "Data Files..." button.
    3. Double-click the particular email account.
    4. Click the "Advanced" tab.
    5. Click the "Offline Folder File Settings..." button.
    The resulting dialog box has the "File:" textbox and the "Browse" button disabled. I've searched the registry for a match (so I could possibly change it there) but found nothing.
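
    One registry-based option worth knowing about is the ForceOSTPath policy value; hedged, because it only affects profiles created after it is set, not an existing profile, and the folder below is just an example:

        reg add "HKCU\Software\Microsoft\Office\12.0\Outlook" /v ForceOSTPath /t REG_EXPAND_SZ /d "D:\OutlookData" /f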

    Read the article

  • How to Shrink/Reset File stderr1 on a SAP System?

    - by Techboy
    I have a file called stderr1 in the work directory of several of the SAP servers in my production cluster. It has grown to around 19 GB and has filled the hard disk on each server. I have deleted all trace files and WP files from within transaction SM50, but that hasn't deleted it (or renamed it to .old). If I try to rename or delete it manually, I'm told I can't because the file is in use. Can you please tell me how I can delete or shrink the stderr1 file?
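
    If these instances run on Unix/Linux, the usual trick is to truncate the open file in place rather than delete it, since the work processes keep their handle on it; a sketch, with the SID and instance directory as placeholders:

        # truncate the open log to zero bytes; the writing process keeps its file handle
        > /usr/sap/PRD/DVEBMGS00/work/stderr1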

    Read the article

  • fsck: FILE SYSTEM WAS MODIFIED after each check with -c, why?

    - by Chris
    Hi, I use a script to partition and format CF cards (connected with a USB card writer) in an automated way. After the main process I check the card again with fsck. To check bad blocks I also tried the '-c' switch, but I always get a return value != 0 and the message "FILE SYSTEM WAS MODIFIED" (see below). I get the same result when checking the very same drive several times... Does anyone know why a) the file system is modified at all and b) why this seems to happen every time I check and not only in case of an error (like bad blocks)? Here's the output:
        linux-box# fsck.ext3 -c /dev/sdx1
        e2fsck 1.40.2 (12-Jul-2007)
        Checking for bad blocks (read-only test): done
        Pass 1: Checking inodes, blocks, and sizes
        Pass 2: Checking directory structure
        Pass 3: Checking directory connectivity
        Pass 4: Checking reference counts
        Pass 5: Checking group summary information
        Volume (/dev/sdx1): ***** FILE SYSTEM WAS MODIFIED *****
        Volume (/dev/sdx1): 5132/245760 files (1.2% non-contiguous), 178910/1959896 blocks
    Thanks, Chris
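
    For what it's worth, e2fsck's -c option updates the bad-blocks inode as part of its run, which is generally why the filesystem is reported as modified even when nothing else changed. A sketch that keeps the surface scan separate from the consistency check:

        badblocks -sv /dev/sdx1      # read-only surface scan, does not alter the filesystem
        fsck.ext3 -f -n /dev/sdx1    # forced check that answers "no" to every fix, so nothing is modified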

    Read the article

  • Separate php.ini file for each Apache virtual host?

    - by Calvin L
    Is it possible to have a separate php.ini file that overrides the default php.ini file for each virtual host? I'm running Apache/2.2.14 and PHP 5.3.2-1. For example, I have several vhosts pointing to domains in my /var/www/ directory:
        /var/www/website1.com
        /var/www/website2.com
    What I'd like is to be able to place a custom php.ini file in each directory that would override the default values only for that vhost, but keep the original defaults if a value isn't specified:
        /var/www/website1.com/htdocs/
        /var/www/website1.com/php.ini
    EDIT: I found more info on the topic here for those interested: http://serverfault.com/questions/34078/how-do-i-set-up-per-site-php-ini-files-on-a-lamp-server-using-namevirtualhosts
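
    mod_php reads a single global php.ini, so a literal per-vhost php.ini isn't possible there, but individual settings can be overridden inside each vhost definition; a sketch, assuming PHP runs as mod_php (not CGI/FastCGI) and with hypothetical values:

        <VirtualHost *:80>
            ServerName website1.com
            DocumentRoot /var/www/website1.com/htdocs
            # anything not overridden here falls back to the global php.ini
            php_value upload_max_filesize 16M
            php_admin_value open_basedir /var/www/website1.com
        </VirtualHost>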

    Read the article

  • Opening Office 2007 files from a Server 2008 file share using a Vista or Win7 client causes lockups

    - by DrZaiusApeLord
    I think this mostly happens when trying to open files already opened by other users. In the XP/2003 days you would get some kind of warning about a locked/read-only file. With 7/Vista/2008 I'm just seeing clients hang (Word just sits there), and if I go into the file share and attempt to right-click on the file, Explorer hangs for several minutes. I tried disabling AV on the file server as well as locally; no luck. I've read that SMB 2.0 might be the culprit here, but even testing that theory means disabling it on both the client and the server, and requires a server reboot. Does this sound like an SMB2 issue? The server is 2008 SP1. The clients are vanilla Win7 and Vista SP2 with all current updates, and Office 2007 SP2 with all updates. Thanks.
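
    If you do want to test the SMB 2.0 theory, these are the commonly cited registry/service changes for disabling it on 2008-era systems; treat them as a sketch, back up the registry first, and note that both sides need a reboot:

        :: on the Windows 7 / Vista client
        sc config lanmanworkstation depend= bowser/mrxsmb10/nsi
        sc config mrxsmb20 start= disabled

        :: on the Server 2008 file server
        reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v Smb2 /t REG_DWORD /d 0 /f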

    Read the article

  • How can I add autocomplete in Notepad++ for JavaScript in an .html file?

    - by Nikhil
    Notepad++ does auto-completion for HTML and also for JavaScript, but the auto-completion depends on the file extension: an .html file supports completion for HTML only. Is there a way to enable auto-completion for JavaScript inside the script tag of an HTML file? I mean, other than copying the auto-completion keyword list from "javascript.xml" into "html.xml"... As suggested on Stack Overflow, I'm asking on Super User, as I could not find a satisfactory answer there. If anyone has any idea, please let me know! (I hope I am not the only one with this kind of requirement! :))

    Read the article

  • What command can I use to unpack a .tar.gz file in Windows?

    - by Ivan
    I often receive tar.gz files from colleagues, but unpacking them with 7-Zip is not convenient: it first extracts the .tar file from the .gz, while what I actually want is the tar file's contents (so I have to unpack the .tar explicitly as a second step). I've tried adding a tar -zxvf %P%N command to the Total Commander button bar, but it turned out that tar -zxvf doesn't work and returns the following error (tried it in a bare command prompt):
        tar: Cannot fork: Function not implemented
        tar: Error is not recoverable: exiting now
    I have both GnuWin32 and Cygwin installed, but I'm looking for a solution that will work in the Windows command prompt (and so will be callable from Total Commander), not in the Cygwin command prompt.
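
    One way to get the contents in a single pass, without Cygwin, is to chain 7-Zip's console tool (7z.exe): decompress the gzip layer to stdout and feed it straight into a second 7z that untars from stdin. The archive name and output folder below are placeholders:

        7z x archive.tar.gz -so | 7z x -si -ttar -o"C:\extracted"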

    Read the article

  • How to make VLC play a .vlm configuration file in "With no interface mode"?

    - by Ole Jak
    How can I make VLC play a VLM configuration file (.vlm) in "with no interface" mode on Windows? I have a VLM configuration file that streams audio from the mic to localhost, so no VLC user interface is needed. If I tell Windows to "play VLM file with VLC" it works correctly: it starts the server where I need it and streams the data. But how do I do the same thing manually from the command line (assume we can invoke vlc.exe and we are in the folder containing the vlc.exe and vlcConfig.vlm files)?
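
    A command-line sketch that should do this without any interface: the dummy interface plus VLC's VLM configuration option (--dummy-quiet keeps the extra console window from popping up on Windows):

        vlc.exe -I dummy --dummy-quiet --vlm-conf vlcConfig.vlm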

    Read the article

  • How can I restore Outlook 2007 from a PST file without having to import everything?

    - by schnapple
    I recently upgraded to Windows 7 and went the "format from scratch" route. I backed up my C:\ drive to the free space on my D:\ drive. So now I have Outlook 2007 reinstalled and I have my .pst files and so forth from the previous installation. If memory serves, the answer for getting all those emails back into Outlook is "create a new .pst file for the account and then reimport everything". What I'd like is to just put the .pst file where it's supposed to go and have Outlook 2007 "remember" everything, but I'm pretty sure that doesn't work. Is there a way to restore Outlook from a .pst file without having to re-import everything?

    Read the article

  • How to display certain lines from a text file in Linux?

    - by Boaz
    Hi, I guess everyone knows the useful Linux command-line utilities head and tail. head lets you print the first X lines of a file; tail does the same but prints the end of the file. What is a good command to print the middle of a file? Something like middle --start 10000000 --count 20 (print the 10,000,000th through the 10,000,010th lines). I'm looking for something that will deal with large files efficiently. I tried tail -n 10000000 | head 10 and it's horrifically slow. Thanks, Boaz
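
    Two standard shell approaches (line numbers taken from the example in the question); both quit as soon as the requested range has been printed, so the rest of the file is never read:

        sed -n '10000000,10000019p; 10000019q' file     # print lines 10,000,000-10,000,019, then quit
        tail -n +10000000 file | head -n 20             # same range: start at line 10,000,000, take 20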

    Read the article

  • How to convert dvr-ms file in Ubuntu to DVD?

    - by edmicman
    I have a .dvr-ms file of a recorded TV show from my Vista Media Center. I would like to burn it to a DVD that can play on any standalone DVD player. The main PC I want to use to convert it is running Ubuntu 10.04. I am able to play the file in Ubuntu using VLC (which surprised me), so I'm assuming I have what I need to decode it. I guess my questions are: what format do I need to convert this file to so that I can burn it to a playable DVD? I started to go through VLC's conversion process and chose, I think, H.264 and AAC, and it gave a message about not having an AAC encoder. I'll look into that some more tonight, but is that something I could then burn to a DVD? Thanks for any help!
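
    A rough command-line route on Ubuntu (a sketch only; the filenames are placeholders, and DeVeDe wraps the same steps in a GUI): transcode to DVD-compliant MPEG-2 with ffmpeg, author the DVD structure with dvdauthor, then build an ISO to burn:

        ffmpeg -i show.dvr-ms -target ntsc-dvd show.mpg        # use -target pal-dvd for PAL players
        dvdauthor -o dvd/ -t show.mpg                          # add the title
        dvdauthor -o dvd/ -T                                   # write the table of contents (VIDEO_TS)
        genisoimage -dvd-video -o show.iso dvd/                # ISO ready to burn with Brasero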

    Read the article

  • How do I open a PDF file with PDF X-Change Viewer so that I can still modify the PDF source?

    - by ltcomdata
    Whenever I open a PDF file with PDF X-Change Viewer, it locks the source PDF file against edits. Is there a way to open the PDF (with PDF X-Change Viewer) so that it doesn't lock up the source file, perhaps as a shell command with an option? The background: I use LaTeX to produce my PDF files and preview the result with PDF X-Change Viewer. I must first close PDF X-Change Viewer before I can preview any changes I make in the LaTeX source. It would be nice if PDF X-Change Viewer did not lock up the PDF source so that I could modify it without first closing PDF X-Change Viewer.

    Read the article

  • Can I do a "one-time" file content search in Windows Server 2008 without adding the folder to the index?

    - by G-.
    Can I search for files which contain a specific string in a folder if that folder is not in the search index? So, let's say the folder 'textFiles' is not in the index. I navigate to this folder in Windows Explorer and type '.ini' in the search box. I want to see a result list containing only 'b.txt'.
        FOLDER C:\textFiles\
        FILE a.php
        CONTENT once twice thrice mice moose monkey
        FILE b.txt
        CONTENT mingle muddle middle.ini banana beer
        FILE c.spo
        CONTENT sellotape stapler phone book
    I do not have permission to add folders to the Windows index, and I do not have permission to install or run any executables that did not ship with the server or approved applications. I'd be happy with a Windows-native command-line solution if necessary. Thanks, G
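
    findstr alone will do a one-off content search without touching the index; a sketch for the example above (/m prints only the names of files whose contents contain the string, and /c: makes '.ini' a literal string rather than a regular expression):

        findstr /m /c:".ini" "C:\textFiles\*"
        :: add /s to recurse into subfolders, e.g.  findstr /s /m /c:".ini" "C:\textFiles\*"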

    Read the article

  • Error 0x80070522: not able to create a file in the C:\ directory

    - by Abbas
    Hello everybody... "Error 0x80070522: not able to create a file in the C:\ directory." One of our customers has just found a problem when trying to create a file in the root of the C:\ drive on a Windows 7 Professional PC. I know they shouldn't be keeping files there, but there is a valid reason in this case, so I've relaxed the security on the root of C:\ by giving the group 'users' modify permission. Before I relaxed the security, the user was receiving 'access denied', but now they are receiving the message: "An unexpected error is keeping you from creating the file. If you continue to receive this error, you can use the error code to search for help with this problem. Error 0x80070522: A required privilege is not held by the client." Googling for this suggests that it is caused by UAC, but how can I get around this when the user doesn't have admin rights on their PC? So, did you find a solution for this issue? It's urgent for my accounting software.

    Read the article

  • How do I reassemble a zip file that has been emailed in multiple parts?

    - by Guy
    I received 3 emails, each containing part of a zip file. The extensions end in .z00, .z01 and .z02. (They were emailed this way to get around the typical 10 MB attachment limit per email.) I have put all 3 files into one directory. I can use both 7-Zip and WinZip to open the first file (the .z00 file) and it lists the contents of the zip, but when I try to extract the files, both programs report errors. What is the least error-prone way of reassembling this zip and getting to the files?
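
    If the parts came from a simple file splitter, straight binary concatenation (in .z00, .z01, .z02 order) often reproduces the original zip; if they are WinZip-style spanned segments, the joined file may also need Info-ZIP's repair pass. A hedged sketch, with file names as placeholders:

        copy /b archive.z00 + archive.z01 + archive.z02 archive_joined.zip
        :: if 7-Zip/WinZip still complain, try rebuilding the central directory with Info-ZIP's zip tool
        zip -FF archive_joined.zip --out archive_fixed.zip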

    Read the article

  • Tool to convert a file of HEX to ASCII character set?

    - by Aaron
    Question: Is there a known tool to convert a file consisting of 2-byte hex values into ASCII? Note: the file offset listing (in bytes) should be maintained. Example file contents:
        00000000 0054 0065 0073 0074 0020 0054 0065 0073
        00000008 0074 0020 0054 0065 0073 0074 0020 0054
        00000016 0065 0073 0074 0020 0054 0065 0073 0074
        00000024 0020 0054 0065 0073 0074 0020 0054 0065
        00000032 0073 0074 0020 0054 0065 0073 0074 0020
        00000040 0054 0065 0073 0074 000a 0054 0065 0073
        00000048 0074 0020 0054 0065 0073 0074 0020 0054
        00000056 0065 0073 0074 0020 0054 0065 0073 0074
        00000064 0020 0054 0065 0073 0074 0020 0054 0065
    Expected output:
        00000016 0065 0073 0074 0020 0054 0065 0073 0074 |est Test Test Te|
        00000032 0073 0074 0020 0054 0065 0073 0074 0020 |st Test Test.Tes|
        00000048 0074 0020 0054 0065 0073 0074 0020 0054 |t Test Test Test|
        00000064 0020 0054 0065 0073 0074 0020 0054 0065 | Test Test Test |
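
    A small GNU awk sketch that appends the ASCII column while leaving the offsets and hex untouched; it treats each 4-digit value as a character code and prints a dot for non-printables (gawk-specific because of strtonum; input.txt is a placeholder):

        gawk '{ printf "%s", $0; out="";
                for (i = 2; i <= NF; i++) {
                    c = strtonum("0x" $i)
                    out = out sprintf("%c", ((c >= 32 && c < 127) ? c : 46))   # 46 is "."
                }
                printf " |%s|\n", out }' input.txt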

    Read the article

  • How to determine if a file has been backed up?

    - by Console
    I'm trying to consolidate old drives onto new ones of larger capacity. Sometimes files have been renamed but are otherwise identical. Sometimes an old directory has just a few more files in it than a newer directory with the same name. Sometimes a file has the same name but the size differs. So I often find myself asking the question: are there any files on this old drive or directory that I haven't already copied to the new drive? I just want to know that I have the files; I don't want to sync things automatically (syncing tools tend to just sync, creating duplicate folder structures and other problems, so I prefer to do it by hand). Basically, if an old drive has a file called "foo.bar" ten directories deep, and my new big drive has an identical file called "oldstuff.zip" in the root, I just want a "yes, you have it" or "no, unique files exist". Is there a free tool, a script or a quick and easy method (Mac/Unix or Windows) to get the answer?
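
    One quick, hand-run method is a content-hash sweep, which catches renamed files because it compares file contents rather than names or paths; a sketch for a Linux/macOS shell (GNU md5sum assumed, mount points are placeholders):

        # index every file on the new drive by content hash
        find /mnt/newdrive -type f -exec md5sum {} + | awk '{print $1}' | sort -u > new-hashes.txt

        # list old files whose content does not exist anywhere on the new drive
        find /mnt/olddrive -type f | while IFS= read -r f; do
            h=$(md5sum "$f" | awk '{print $1}')
            grep -qx "$h" new-hashes.txt || printf 'not backed up: %s\n' "$f"
        done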

    Read the article

  • How to rsync a large file, with as little CPU and bandwidth expense as possible?

    - by Johan Allgoth
    I have a 500 GB file that I plan on backing up remotely. The file changes often. I'll be rsyncing it from a desktop to a server. Both can run the rsync client or server. What is the proper command for this? The ones I've tried so far have either taken forever or simply acted strangely. Example and results: rsync -cv --partial --inplace --no-whole-file /desktop/file1 myserver.com::module/file1 This seems to work, but only if I do it twice (?!). Also, it's slow. Does the above command do the checksumming on both computers, or only on the sending one? Is it correct otherwise?
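
    For what it's worth, the -c flag is the expensive part: it makes rsync read and checksum the entire file on both ends just to decide whether it needs transferring, on every run (so yes, the checksumming happens on both computers). Dropping it lets rsync skip the file when size and mtime match and still use the block-level delta algorithm when they don't; a sketch of the same command without it:

        # same transfer, but without -c: rsync skips the file when size and mtime match
        rsync -v --partial --inplace --no-whole-file /desktop/file1 myserver.com::module/file1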

    Read the article

  • kill -SIGABRT does not generate a core file from a daemon started from crontab

    - by Guma
    I am running CentOS 5.5 and working on a server application from which I sometimes need to force a core dump so I can see what is going on. If I start my server from a shell and send it SIGABRT with kill, a core file is created. If I start the same program from crontab and then send it the same signal, the server is "killed" but no core file is generated. Does anyone know why that is, and what needs to be added to my code or changed in the system settings to allow core file generation? Just a side note: I have ulimit set to unlimited in /etc/profile, and I have set kernel.core_uses_pid = 1 and kernel.core_pattern = /var/cores/%h-%e-%p.core in /etc/sysctl.conf. Also, my server app was added to crontab under the same login ID I use when running it from the shell. Any help greatly appreciated.
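
    A likely cause: cron does not read /etc/profile, so the daemon inherits the default core-size limit of 0 even though your interactive shells get "unlimited". A sketch of a crontab entry that raises the limit in the shell that actually launches the server (the path is hypothetical):

        # raise the core limit in the same shell that starts the daemon
        @reboot /bin/bash -c 'ulimit -c unlimited && exec /usr/local/bin/myserver'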

    Read the article

  • Double-clicking to open Office docs is slow, but File -> Open is fast

    - by Keith
    I have 2 separate networks. They both share a similar architecture:
    - Windows 2003 SBS SP2
    - Running Symantec Endpoint
    - Running Symantec Information Foundation
    - Shared drives off a data partition
    - Clients running Office 2003 or 2007
    - Clients connect to the file server through mapped drives
    When users try to open a file from their local PC by double-clicking, it takes 30-60 seconds to open. When they do File - Open, those same documents open almost immediately. So far I've tried the following:
    - CCleaner to parse the registry of outdated mapped drives
    - Disabled "using DDE"
    - Disabled A/V
    - Reboot
    Any ideas beyond that? I figured this question belongs here instead of SU since it's the same issue on different networks.

    Read the article
