Search Results

Search found 72319 results on 2893 pages for 'file explorer'.


  • Mac OS X script - find a mounted disc by name, copy a file from the desktop to it, and unmount the disc when the copy completes

    - by Joshc
    Is there some kind of Mac script which finds a newly mounted disc drive named 'EXAMPLE', copies a file/folder to it, and safely ejects the disc when the copy has finished? I don't mind if the script needs to be executed by a shortcut. And will it work if there are multiple drives plugged in? The reason I am asking is because I have 5000 USB memory sticks that I need to copy about 20 MB of data to. Thanks in advance for any ideas.
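
    For illustration, a minimal sketch of such a script in bash (the source folder, the volume name EXAMPLE and the polling interval are assumptions to adjust; handling several sticks mounted at once would need a glob over /Volumes/EXAMPLE*):

        #!/bin/bash
        # Whenever a volume named EXAMPLE appears, copy a folder from the
        # desktop onto it, then eject it so the next stick can be plugged in.
        SRC="$HOME/Desktop/data-to-copy"
        VOL="/Volumes/EXAMPLE"

        while true; do
            if [ -d "$VOL" ]; then
                cp -R "$SRC" "$VOL/" && diskutil eject "$VOL"
            fi
            sleep 2    # poll every couple of seconds for the next stick
        done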


  • Why does cpio say "WARNING! These file names were not selected" when copying a large number of files

    - by mmm bacon
    For over 10 years, I've been using this strategy to copy a large number of files between UNIX filesystems:

        cd source_directory
        find . -depth -print | cpio -pdm /path/to/destination_directory

    It works like a champ. However, I'm now getting this error from cpio:

        cpio: WARNING! These file names were not selected: (long list of files here...)

    The source directory is on OS X 10.5, and the destination directory is an NFS filesystem from an OpenSolaris server. Copying over NFS has never been a problem in the past. There's nothing strange about the filenames, meaning there aren't special characters or anything like that. Any ideas?
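
    As a cross-check rather than an answer, the same tree copy can be done with rsync, which names each file as it goes and reports per-file errors explicitly (a sketch; the paths are the question's placeholders):

        # The trailing slash on the source copies its contents rather than the
        # directory itself; -a preserves permissions, times and symlinks.
        rsync -av /path/to/source_directory/ /path/to/destination_directory/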


  • Why can't I copy a 7 GB file to an external USB HD with 120 GB free?

    - by Johann Gerell
    Yes, why can't I? I was stashing away some old photography backup zips last night. I had copied 4 of my 1 GB backup zips to my external USB-connected hard drive when I got the error message "Cannot copy file. Not enough free space." (sort of) for a zip of roughly 7 GB. But there are 120 GB free. Why is this? EDIT: Clarification - the files that I could copy were smaller than 4 GB. The failing one was 7 GB. The cause seems to be the FAT32 4 GB limit.
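
    Given that limit, one workaround (a sketch, assuming a Unix-like shell is available and the archive is called backup.zip) is to split the zip into pieces under 4 GB and rejoin them later:

        # Split into 3900 MB pieces, safely under the FAT32 file-size cap.
        split -b 3900m backup.zip backup.zip.part-
        # Later, on the destination machine, reassemble the original archive:
        cat backup.zip.part-* > backup.zip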


  • How to open first n bytes of file in hexadecimal and edit it?

    - by Larssend
    I want to edit some .avi videos (cut them, to be precise) in VirtualDub, but it fails to open the files. They are encoded in Xvid, which I have installed, and they play in KMPlayer without problems. Also, all other Xvid videos can be opened and cut just fine by VirtualDub. I suspect there's something wrong in the first few bytes of these particular videos (the magic number?). This means I have to open the offending files in a hex editor and make the necessary adjustments to the header. Problem is, they are very large (3 GB each) and take a very long time to open in UltraEdit. Can UltraEdit open just the first few bytes of a file? If not, do you know of an application that can do that? Edit: I'm using Windows XP.
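
    If a dd-style tool is available (standard on Unix-like systems; on Windows XP it would need a port such as the one shipped with Cygwin), the header can be carved out, edited in any hex editor, and written back without ever loading the whole 3 GB file - a sketch, with the file name as a placeholder:

        # Copy only the first 512 bytes of the video into a small file.
        dd if=broken.avi of=header.bin bs=512 count=1
        # ... edit header.bin in a hex editor, keeping its size unchanged ...
        # Write the edited bytes back over the start of the video without
        # truncating the rest of the file.
        dd if=header.bin of=broken.avi bs=512 count=1 conv=notrunc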


  • Lenovo ThinkPad: What does the PWMDBSVC.exe service do? It's writing a C:\Log.txt file.

    - by thinkPadUser
    I found a file that keeps popping up in my C:\ drive root, Log.txt ... after installing Process Monitor and seeing what process was writing to it, I came across PWMDBSVC.exe, which appears to be part of the Lenovo ThinkPad software. Even if I delete it, I can get it to re-create the Log.txt when I lock and unlock my workstation. Does anybody know what this software does and whether it is safe to disable? I searched Google already and got the usual pile of useless hits on the process name but nothing seemingly definitive!


  • How do I make an Illustrator file "higher resolution"?

    - by drewjoh
    I was given an Illustrator file, but all the curves on the artwork are jagged. I've tried "rasterizing" and exporting by increasing the size of the image. I don't know what else to do or what I'm doing wrong. My understanding is that the beauty of Illustrator is that it's all done mathematically, so I can scale it up to infinity and it will be perfect (more or less), and that lines are drawn that way also, so they should be (or can be) infinitely smooth if they want to be. Here's what I have right now: Here's what I have with the image selected, showing the plot lines: And a zoomed-in view: I'm not experienced in Illustrator at all; I only know whatever I can carry over from moderate Photoshop experience.


  • How to block users of an Apache httpd server from accessing *.php files inside a directory, so that they must access the content using the directory name instead

    - by Oxi
    My requirement looks simple, but Googling has not helped me yet. I want to show a 404 page to a user (not redirect to another folder or file) who tries to access *.php files on my website directly. For example: when a client asks for www.example.com/home/ I want to show the content, but when the user requests www.example.com/home/index.php I want to show a 404 page. I tried different methods and nothing worked for me; one attempt is shown below:

        <Directory "C:/xampp/htdocs/*">
            <FilesMatch "^\.php">
                Order Deny,Allow
                Deny from all
                ErrorDocument 403 /test/404/
                ErrorDocument 404 /test/404/
            </FilesMatch>
        </Directory>

    Thanks in advance
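
    Whatever directives end up being used, a quick way to confirm the intended behaviour is to syntax-check the configuration and compare the response codes (a sketch using the question's example host; on XAMPP for Windows, httpd.exe -t is the equivalent syntax check):

        apachectl configtest && apachectl graceful

        # The directory index should still return 200 ...
        curl -s -o /dev/null -w '%{http_code}\n' http://www.example.com/home/
        # ... while a direct .php request should now return 404.
        curl -s -o /dev/null -w '%{http_code}\n' http://www.example.com/home/index.php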


  • What is the easiest way to get perfmon counter names into a text file?

    - by Bill Paetzke
    I'd like to create a settings file for my logman command. I expect to have lots of perfmon counters. Is there any easy way to get all the perfmon counters' exact text anywhere? The only thing I thought of was to create a Perfmon Counter Log through the GUI and then export the list of selected counters--but I don't see an export option! I guess I could manually copy what I see on the screen, but that seems inefficient. I'm going to be dealing with tens of counters. Maybe there is a list somewhere? That'd be easier to copy and paste from.


  • File storage service that allows clients to upload large files to my account?

    - by deceze
    Can anyone recommend an online file storage service which fulfills these requirements?
      - I can create an account
      - I can invite clients to upload files into my account
      - clients do not need to register to be able to upload
      - clients must not be able to see anything but their own files, or they must not see any files at all - they get only a dropbox
      - only I can access the uploaded files, everything is non-public
      - service is multi-lingual

    I just need clients to be able to send me potentially large files in a dead simple manner online, that's all. No registration step to go through, no software to download, no synching or sharing. No setting up of individual folders and permissions for each individual client. No copying and pasting of links (a la Mediafire, Rapidshare etc).


  • tar: How to create a tar file with arbitrary leading directories w/o 'cd'ing to parent dir

    - by Yan
    Say I have a directory of files at /home/user1/dir1 and I want to create a tar with only "dir1" as the leading directory:

        /dir1/file1
        /dir1/file2

    I know I can first cd to the directory:

        cd /home/user1/
        tar czvf dir1.tar.gz dir1

    But when writing scripts, jumping from directory to directory isn't always favorable. I am wondering: is there a way to do it with absolute paths without changing the current directory? I know I can always create a tar file with absolute paths INSIDE and use --strip-components when extracting, but sometimes extra path names are extra private information that you don't want to distribute with your tar files. Thanks!
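
    For illustration, the usual way to avoid the cd is tar's -C option, which makes tar change directory internally while the script's own working directory stays put (paths as in the question; the output location is an assumption):

        # tar chdirs to /home/user1 before archiving, so the archive contains
        # dir1/... entries without the /home/user1 prefix.
        tar -czvf /tmp/dir1.tar.gz -C /home/user1 dir1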


  • Software that will burn DVDs with a SFV or Pararchive file for each disc?

    - by Matt
    I'd like to burn several thousand RAW files (.DNG) of around 10-30 MB each to DVD to back up my photo archive. I'm looking for software that can do this and include an SFV-type file on each disc burnt. These are my requirements:
      - compression is optional, and probably not desirable due to the extra time involved
      - files should be spread amongst the discs as self-contained units, i.e. I don't want to have to load files from more than one disc to be able to read the files on that disc, so that excludes WinRAR's spanning options
      - I don't want to spend time writing ISO images first, as this will be a task I'll need to repeat often as I add new images to my archive - the software should write to the DVDs for me as simply as possible
      - I'd like the SFV/Pararchive/recovery record to be stored with the files on each disc, so it only references the files on that particular disc

    Thanks in advance!


  • Script to gather all the files ending in .log and create a tar.gz file.

    - by Oscar Reyes
    I'm currently using this script line to find all the log files from a given directory structure and copy them to another directory where I can easily compress them:

        find . -name "*.log" -exec cp \{\} /tmp/allLogs/ \;

    The problem I have is that the directory/subdirectory information gets lost, because I'm copying only the file. For instance I have:

        ./product/install/install.log
        ./product/execution/daily.log
        ./other/conf/blah.log

    And I end up with:

        /tmp/allLogs/install.log
        /tmp/allLogs/daily.log
        /tmp/allLogs/blah.log

    And I would like to have:

        /tmp/allLogs/product/install/install.log
        /tmp/allLogs/product/execution/daily.log
        /tmp/allLogs/other/conf/blah.log
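
    One way to keep the relative paths is cpio's pass-through mode, run from the same starting directory (a sketch using the question's /tmp/allLogs destination; GNU cp --parents would be another option):

        # -p copies the files listed on stdin, -d creates the needed
        # directories under the destination, -m preserves modification times.
        find . -name '*.log' -print | cpio -pdm /tmp/allLogs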


  • What is the correct root folder to import into Aptana from a XAMPP folder / existing web app or Git repository?

    - by gaff
    Very noob question - all of this is new to me and I'm not really sure how to get started. Overview: I'm taking over an existing web project that has been developed in Aptana and deployed in a XAMPP setup. I also have access to a Git repository in a local directory. I want to import the web application into Aptana, run it and begin updating/editing - a bit of a steep learning curve for me. What is the best way to import? And what should I import? What should it look like in Aptana? I tried importing what I think is the root folder from the Git folder ("existing folder as new project") - it contains things like css, doc, img and js. This looks right to me - but might not be. Thanks, Gaff


  • Automation of software installation - should I ask for text or file?

    - by Denis
    I am preparing a software installation in a Windows environment for my application. During installation it asks for a Subscriber ID which should be entered into a text field. I am wondering if this is the best solution for mass installations. I know that for mass installations IT teams use systems like Microsoft System Center which allow automated deployment, but I do not know much about the capabilities of such systems. Can such systems automate data entry into text fields? Would it be better to change the installation process and ask not for text but for a file which contains the Subscriber ID? By the way, I am looking for beta testers for my software. This software lets users view Microsoft Project files without having Microsoft Project installed.


  • Does ZFS cache Compressed or Uncompressed data in a ZFS file-system with compression turned on?

    - by George Bailey
    ZFS supports file-system compression and it also caches frequently or recently accessed data. If a system has lots of CPU but the underlying data storage system is slow, it is possible that ZFS would perform better with compression turned on. This can easily be tested when writing files by measuring CPU and disk usage and throughput (of course latency may exist, but this would not be an issue for large files). But what about the cache? If data has to be decompressed every time it is read, then this is probably less of a good idea. Is the cached data compressed? Does anybody have any information on this?
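
    For the write-side test described above, a minimal sketch (the pool name tank and the sample file are assumptions):

        # Two otherwise identical filesystems, one compressed and one not.
        zfs create -o compression=on  tank/ctest
        zfs create -o compression=off tank/utest

        # Write the same data to each and compare elapsed time and pool I/O.
        time cp /var/tmp/sample.bin /tank/ctest/
        time cp /var/tmp/sample.bin /tank/utest/
        zpool iostat -v tank 1 5          # how much actually reaches the disks
        zfs get compressratio tank/ctest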


  • Why do we still have to use drive letters to identify file systems?

    - by Charles E. Grant
    A friend has run into a problem where they installed Windows 7 from an external drive, and the internal boot drive is now assigned to H:. Theoretically this shouldn't cause problems, because there are programming interfaces for getting the drive letter of the system drive. In practice, though, there are quite a few programs that assume that C: is the only possible location for the system directories, and they refuse to run with the system directories on H:. That's not Microsoft's fault, but it's a pain nonetheless. The general consensus seems to be that a re-install, setting the internal boot drive to C:, is the only way to fix these problems. UNIX-like systems display all file systems in a single unified directory tree and mostly seem to avoid problems like this. Is it possible to configure a Windows system without reference to drive letters, or does the importance of backwards compatibility mean that Windows will be working with drive letters from now until doomsday?


  • pyexiv2 build error src/exiv2wrapper.hpp:32:29: error: exiv2/preview.hpp: No such file or directory

    - by Jake
    The other day I used apt-get install python-pyexiv2 on my Ubuntu server, but it seems to have given me an old version. It's not compatible with the code I wrote in my local development environment, so I'd like to update it. I downloaded the latest tar.gz from the website, extracted it and ran scons as per the readme. But it will not build; I get the error:

        src/exiv2wrapper.hpp:32:29: error: exiv2/preview.hpp: No such file or directory

    I've also used apt-get to install libboost-python-dev and libexiv2-dev. Can anyone help me with this?
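
    A couple of quick checks (a diagnostic sketch, not a fix) to see whether the packaged development headers are new enough - exiv2/preview.hpp only ships with more recent exiv2 releases:

        # Which libexiv2-dev version is installed, and what is available?
        apt-cache policy libexiv2-dev
        # Does the installed package actually contain the header the build wants?
        dpkg -L libexiv2-dev | grep preview.hpp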


  • How to remove leading whitespace from file and folder names?

    - by timoto
    How to remove leading whitespace from file and folder names? (I'm running OS X 10.6 Snow Leopard.) As provided below by @Lri I was able to remove trailing whitespace using this:

        #!/bin/bash
        IFS=$'\n'
        for d in {1..9}; do
            find ~/Desktop -name '* ' -depth $d | while read f; do
                mv "$f" "$(sed 's/ *$//' <<< "$f")"
            done
        done

    Now I'm trying to remove leading whitespace with this:

        #!/bin/bash
        IFS=$'\n'
        for d in {1..9}; do
            find ~/Desktop -name '* ' -depth $d | while read f; do
                mv "$f" "$(sed 's/^ *//;s/ *$//' <<< "$f")"
            done
        done

    but it doesn't work. What am I doing wrong?
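
    For comparison, a sketch of how the second script could be adjusted (an assumption about the intent, not a quoted answer): -name '* ' only matches names ending in a space, and sed 's/^ *//' acts on the whole path, whose first character is always '/', so the spaces after the final slash are never removed.

        #!/bin/bash
        # Rename items whose names begin with spaces; find is re-run at each
        # depth, so earlier renames do not invalidate later paths.
        IFS=$'\n'
        for d in {1..9}; do
            find ~/Desktop -name ' *' -depth $d | while read -r f; do
                dir=$(dirname "$f")
                base=$(basename "$f")
                mv "$f" "$dir/$(sed 's/^ *//' <<< "$base")"
            done
        done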


  • Testing DNS configuration of domain by using hosts file?

    - by Alex Blundell
    I'm currently migrating a website to another server, and want to test the DNS configuration (more specifically, the email MX records) before moving the domain over. I've configured the DNS on the new server to have MX entries for Google Apps in the same way that it's configured on the old server. The domain is controlled by nameservers on the old server at the moment, so the change would simply be updating the nameservers to the new server's. (What I'm getting at is that DNS is controlled at the server level, not the registrar level.) Since the website has quite a number of users, I want to make sure the configuration is right before flicking the switch. For this, can I add an entry to the hosts file of my local computer to point the domain to the new server? I've done this, and the web server works, but would this also test the email MX records on the new server?
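
    One thing worth noting: a hosts-file entry only overrides local name-to-address lookups, so it exercises the web server but says nothing about MX records. The new server's zone can be queried directly instead (a sketch; NEW.SERVER.IP stands for the new server's address):

        # Ask the candidate nameserver itself, bypassing the current delegation.
        dig @NEW.SERVER.IP example.com MX +short
        dig @NEW.SERVER.IP example.com NS +short
        dig @NEW.SERVER.IP www.example.com A +short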


  • Component MSINET.OCX or one of its dependencies not correctly registered: a file is missing or invalid

    - by tintincute
    Hi, can someone please help me? I'm using Windows 7 64-bit and I'm trying to download this program, "Croque-Mort". But when I do, I get an error: "Component MSINET.OCX or one of its dependencies not correctly registered: a file is missing or invalid". I tried to check it by opening cmd.exe as administrator and typing:

        regsvr32 msinet.ocx

    Then I got an error: "The module "MSINET.OCX" failed to load. Make sure the binary is stored at the specified path or debug it to check for problems with the binary or dependent .DLL files. The specified module could not be found." Is there a way to install this component? I would appreciate your help.


  • Will USMT 4.0 in MDT 2010 Move/Migrate the .NK2 File for Outlook?

    - by Mitch
    We're about to begin a refresh project for about 100 XP Pro laptops and have a concern with regards to the .NK2 file which holds cached email addresses(?). If possible we'd like to have USMT move/migrate this but I can't find anything that confirms that this happens automatically or has been done before. I see lots of manual processes but at this point I'm not sure that we can use that. Has anyone done this or seen this done? Perhaps you can point me to a resource that can give me an idea how its done? Any information would be appreciated. USMT seems to get a lot of the details but missing this part seems odd. Thanks in advance for any responses.


  • How can I change the flow through this PAM (Pluggable Authentication Modules) file?

    - by Jamie
    I'd like the PAM configuration to skip the pam_mount.so line when a Unix login succeeds. I've tried various things including:

        auth    [success=2 default=ignore]  pam_unix.so nullok_secure
        auth    [success=2 default=ignore]  pam_winbind.so krb5_auth krb5_ccache_type=FILE cached_login try_first_pass
        auth    requisite                   pam_deny.so
        auth    requisite                   pam_permit.so
        auth    required                    pam_permit.so
        auth    optional                    pam_mount.so

    But can't get it to work. Conversely, when a session shuts down, how can I modify the following so that an unmount command (via pam_mount.so) is avoided during a Unix login?

        session [default=1]                 pam_permit.so
        session requisite                   pam_deny.so
        session required                    pam_permit.so
        session required                    pam_unix.so
        session optional                    pam_winbind.so
        session optional                    pam_mount.so


  • How can I specify multiple rules for a particular log file(s) with logrotate?

    - by Ether
    I have a logrotate.d config file that looks something like this:

        /home/myapp/log/* {
            daily
            compress
            dateext
            ifempty
            delaycompress
            olddir /home/myapp/baklog
        }

    There are a few particular log files where I want to apply additional rules, such as "mail". How can I apply additional rules to just some files? If I add another rule above that matches the additional files (e.g. /home/myapp/log/warning.log { ... }), I get an error like:

        error: /etc/logrotate.d/myapp:3 duplicate log entry for /home/myapp/log/warning.log

    How can I specify multiple rules that match particular files in an overlapping kind of way?
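
    Whatever layout is chosen, logrotate's debug mode shows how a candidate configuration is parsed without rotating anything (a sketch; the fragment path matches the question's error message and is an assumption):

        # -d (debug) prints which files each stanza would handle, and reports
        # duplicate-entry errors, without touching any logs.
        logrotate -d /etc/logrotate.conf
        logrotate -d /etc/logrotate.d/myapp     # or just the one fragment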


  • Can I remove a RAR file's (known) password without recompressing the archive?

    - by Abluescarab
    Long title. Anyway, I haven't been able to find an answer to this question. I know the password to the RAR file - I locked it myself - but now all I want to do is remove the password, because it's too much of a pain in the butt to type it in every time. Is there a way to do this in WinRAR or an equivalent program? The only thing I knew to do was to extract it, then create a new RAR without the password. It's not a life-or-death issue, but it would be nice to know. Thanks for your time! EDIT: I just saw a bunch of related questions that appear to ask the same thing. The only solution I saw was using a DOS command to yadda yadda yadda. Here it is: "How to remove password protection from compressed files". Is there an easier way? Thanks again!
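
    If the command-line rar/unrar tools are acceptable, the extract-and-repack workaround mentioned above can at least be scripted (a sketch; the archive name and password are placeholders):

        mkdir -p tmp_unrar
        # Extract with the known password, then repack without one.
        unrar x -pMYPASSWORD locked.rar tmp_unrar/
        (cd tmp_unrar && rar a ../unlocked.rar *)
        rm -rf tmp_unrar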

