Search Results

Search found 37684 results on 1508 pages for 'msp files'.

  • Unable to copy files previously extracted from archives created on a Mac, even after claiming ownership

    - by Maxim Zaslavsky
    I reinstalled Windows on my computer today and backed up my music to a USB drive. Now I'm trying to copy the files onto my fresh Windows partition, but I'm unable to copy files that I obtained within my previous Windows installation from zip archives created on Macs. When I try to copy those previously-extracted files, I get an error saying that I need permission from S-1-5-21-...-1000 (a bizarre long ID). The first thing I tried was to take ownership of the files by setting my new user account as the owner, but that resulted in errors saying that I need permission from myself! Some Googling suggested adding antivirus exclusions, so I excluded the relevant folders from Microsoft Security Essentials, but the issue persists. For what it's worth, it seems that some program (so far I've only installed Chrome, Microsoft Security Essentials, and the latest Windows updates) created an empty folder named 601c8c7f0e0c03f725 at the root of my external USB hard drive. What gives?
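
    That long SID is most likely the user account from the old installation, which no longer exists. A minimal sketch from an elevated Command Prompt that takes ownership and resets the inherited ACLs (drive letter and folder assumed):

        takeown /f "E:\Music" /r /d y
        icacls "E:\Music" /reset /t /c
        icacls "E:\Music" /grant "%username%":(OI)(CI)F /t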

    Read the article

  • proftpd: copying uploaded files to an additional directory

    - by Matthew Iselin
    Using proftpd, is there a good way to automatically synchronise uploaded files from the upload directory to some other directory? Our layout ends up a bit like this: ~/ftp/some/path <-- Files are uploaded here ~/some/other/path/not/accessible/via/ftp <-- But also need to be here after uploading Is there a good way to do this automatically, or do I have to tell anyone uploading files to upload twice, and open up an additional directory (containing data we cannot redistribute)?
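
    If no upload hook is available, a scheduled rsync can mirror the upload tree into the private directory; a minimal sketch (paths assumed, run as a user that can read the FTP tree):

        # crontab entry: mirror new uploads every 5 minutes
        */5 * * * * rsync -a /home/ftpuser/ftp/some/path/ /path/not/accessible/via/ftp/

    proftpd's mod_exec module can also trigger a script on each upload, if building that module in is an option.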

    Read the article

  • Using NginX and Apache alongside for both static and dynamic files

    - by faridv
    Background: I've searched a lot and found these useful threads about using Apache or Nginx for static or dynamic files, but they are old (mostly from 1 or 2 years ago), and I think both web servers, Nginx in particular, have seen important changes in performance and usage since then. So taking another look at this issue can't hurt. Nginx (for static files) and Apache (for dynamic content)? nginx better than apache for dynamic content? [closed] Apache or NGINX for PHP? Nginx as reverse proxy to Apache with only dynamic content? My question: I have a PHP web application with lots of dynamic files and lots of static content (videos, images, etc.), and it has been running on a CentOS 6 server with Apache 2.2 for the past 2 months. In the past few days the number of visitors to our site has grown quickly, and if it keeps increasing at the current rate, we will need to change many things (web server, application, etc.) to prevent failures. Because of the hardware limitations we are facing, I thought it best to start with the web server. Should I start with something else? Should I try to improve the performance of my PHP application and forget about the web server for now (even if that's going to take a long time)? Because of our heavy use of .htaccess files (for redirection, rewrites, etc.), I think it's going to be painful to migrate to Nginx as the default web server, or even just for dynamic files. Does this mean I can't even use Nginx as a reverse proxy? I'm not sure the latest stable versions of Nginx and PHP-FPM perform better than my current Apache, and my limitations (too many things) won't let me give it a try. Which one is doing better currently? What would I lose by migrating to Nginx? To make it short: what should I do?
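
    For the common split (Nginx in front serving static files, proxying everything else to Apache so .htaccess keeps working), a minimal sketch of the Nginx side (hostname, port, and paths assumed):

        server {
            listen 80;
            server_name example.com;            # hypothetical
            # serve static assets straight from disk
            location ~* \.(?:jpe?g|gif|png|css|js|ico|mp4|flv)$ {
                root /var/www/app;
                expires 30d;
            }
            # everything else goes to Apache, which still honors .htaccess
            location / {
                proxy_pass http://127.0.0.1:8080;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
            }
        }

    Apache would be moved to listen on 127.0.0.1:8080 in this setup, so nothing about the PHP application itself has to change.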

    Read the article

  • SQL Server 2005 standard filegroups / files for performance on SAN

    - by Blootac
    I submitted this to Stack Overflow (here) but realised it should really be on Server Fault, so apologies for the incorrect and duplicate posting. OK, so I've just been on a SQL Server course where we discussed the usage scenarios of multiple filegroups and files over local RAID and local disks, but we didn't touch on SAN scenarios, so my question is as follows. I currently have a 250 GB database running on SQL Server 2005 where some tables have a huge number of writes and others are fairly static. The database and all objects reside in a single filegroup with a single data file. The log file is also on the same volume. My interpretation is that separate data files should be used across different disks to lessen disk contention, and that filegroups should be used for partitioning of data. However, with a SAN you obviously don't really have the same disk contention that you do with a small RAID setup (or at least we don't at the moment), and Standard Edition doesn't support partitioning. So what should I do to improve parallelism? My understanding of various Microsoft publications is that if I increase the number of data files, separate threads can act on each file separately. Which leads to the question: how many files should I have? One per core? Should I be putting tables and indexes with high levels of activity in separate filegroups, each with the same number of data files as we have cores? Thank you
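
    If you do experiment with multiple files, the T-SQL is straightforward; a sketch with hypothetical names and paths (one file per core is a common starting point to test against your workload, not a guarantee):

        -- add a filegroup for the write-heavy tables
        ALTER DATABASE MyDb ADD FILEGROUP BusyTables;
        -- spread its data files across separate volumes/LUNs
        ALTER DATABASE MyDb ADD FILE
            (NAME = BusyTables1, FILENAME = 'E:\SQLData\BusyTables1.ndf', SIZE = 10GB),
            (NAME = BusyTables2, FILENAME = 'F:\SQLData\BusyTables2.ndf', SIZE = 10GB)
        TO FILEGROUP BusyTables;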

    Read the article

  • Linux command to concatenate audio files and output them to ogg

    - by hasen j
    What command-line tools do I need in order to concatenate several audio files and output them as one Ogg (and/or MP3)? If you can provide the complete command to concatenate and output to Ogg, that would be awesome. Edit: the input files (in my case, currently) are in WMA format, but ideally the solution should be flexible enough to support a wide range of popular formats. Edit 2: just to clarify, I don't want to merge all the WMAs in a certain directory; I just want to concatenate 2 or 3 files into one. Thanks for the proposed solutions, but they all seem to require creating temporary files; if at all possible, I'd like to avoid that.
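
    ffmpeg's concat filter can decode, join, and re-encode in one pass with no temporary files; a sketch for two inputs (filenames assumed, and the build needs libvorbis for Ogg output):

        ffmpeg -i in1.wma -i in2.wma \
            -filter_complex "[0:a][1:a]concat=n=2:v=0:a=1[a]" \
            -map "[a]" -c:a libvorbis -q:a 5 out.ogg

    For three files, add a third -i input, list [2:a] in the filter, and set n=3.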

    Read the article

  • How to crash a program

    - by user2949019
    I have a program called BlueCoat Proxy installed on my school-issued laptop that basically blocks every second website on the Internet, including Stack Exchange, YouTube and Yahoo Answers. I do not have administrator rights, nor can I delete anything in Program Files; I've tried every method of obtaining admin rights I could find. It is not accessible in Task Manager (it doesn't even appear there). I tried to close it from the Windows command prompt with commands like 'taskkill', but it returns 'Access is Denied' (I'm only denied access with that program). Does anyone know a method of crashing a program with a batch file or VB program? I was thinking of something like the ping command, but for a program; maybe automating 1000 meaningless requests to it? Your input on the subject matter is appreciated; however, telling me that this is wrong or illegal is not.

    Read the article

  • Windows Batch Scripting: Newest File Matching a Pattern

    - by Eddie Parker
    This would be lightning quick in Linux, but I'm not familiar enough with the Windows flavour of batch scripting. Basically, I want to look for a series of files matching a certain wildcard and get the one with the most recent modified date. I've gotten as far as: for %%X in (*.exe) do ( REM Do stuff.... ) But I'm not sure what comparison operators there are, or if there's a better way of doing this. Can anyone offer up any good solutions? Ideally it would involve a vanilla install of Vista, so no special trickery like Cygwin etc.
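
    Rather than comparing dates inside the loop, you can let dir do the sorting; a sketch that should work in a stock Vista cmd.exe:

        @echo off
        rem /b bare names, /a-d files only, /o-d newest first; keep the first line
        set "newest="
        for /f "delims=" %%X in ('dir /b /a-d /o-d *.exe') do (
            if not defined newest set "newest=%%X"
        )
        echo Newest .exe: %newest%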

    Read the article

  • Windows 7 - Non Admin run as Admin for Explorer - still can't see all tmp internet files

    - by Steve
    I'm trying to retrieve video files from the IE 8 cache for a user that's not an admin in Windows 7. As the non-admin user, I run Explorer as admin and still can't see the Temporary Internet Files for the non-admin user. Only if I log in as a user that is an admin can I see the files. Is there any way I can see the files without having to go through the login process? Essentially, I want the video file from this page and others like it: http://video.yahoo.com/watch/111585/1027823
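
    Explorer hides protected operating-system folders even when elevated; an elevated Command Prompt usually shows the cache directly. A sketch (user name assumed, and the exact subfolder layout varies between machines):

        cd /d "C:\Users\someuser\AppData\Local\Microsoft\Windows\Temporary Internet Files"
        dir /a /s *.flv *.mp4

    If NTFS permissions still block you, the takeown/icacls approach from the first question on this page applies here too.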

    Read the article

  • How to back up millions of small files?

    - by grassbl8d
    What is the best way to back up millions of small files in a very short time period? We have less than 5 hours to back up a file system which contains around 60 million mostly small files. We have tried several solutions, such as richcopy, 7z and rsync, and all of them seem to have a hard time. We are looking for the fastest way, and are open to putting the files in an archive first, or transferring them to another location over the network or by physical hard disk. Thanks
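
    With tens of millions of small files, per-file overhead dominates, so a multithreaded copier helps; a sketch using robocopy (the /MT switch needs the Windows 7 / Server 2008 R2 version; paths assumed):

        robocopy D:\data \\backupserver\share /MIR /MT:64 /R:1 /W:1 /NFL /NDL /LOG:C:\backup.log

    /MT:64 runs 64 copy threads in parallel; /NFL /NDL keep the log from itself growing to millions of lines.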

    Read the article

  • Recovery of Pinnacle Studio Project Files

    - by seanieb
    My external hard drive had some sort of issue a few months ago, but I was able to recover my files using a data-recovery program. However, my Pinnacle Studio files are not being recovered as before: they are being recovered as directories/folders that contain subdirectories and files. I have tried several different recovery programs, and they all recover the projects as directories. Each project contains one file called README.TXT: * WARNING This directory contains the descriptive data of the project, split into various subdirectories and files for better access. DO NOT EDIT, ADD, CHANGE OR MODIFY ANY OF IT'S CONTENTS! This gives me hope that I could somehow just turn the directory back into a .stu Pinnacle Studio project file. How would I go about doing this? Or is there another way to solve this problem?

    Read the article

  • Using rsync when files on one end are all lowercase

    - by DormoTheNord
    I want to rsync a lot of files from a Windows box to a Linux server. The problem is, the files on Windows are all mixed case, and the files on the Linux server need to be all lowercase. One solution is to have a script that rsyncs to a different directory on the server, copies the files into the main directory, and then converts them all to lowercase. I'd rather find a more elegant solution, though. I'd prefer a command-line application, but I'd be willing to go with a GUI application if that's the best option.
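
    As far as I know rsync has no case-conversion option, so a post-pass rename on the server is the usual approach; a sketch (destination path assumed, untested against name collisions, hence mv -n):

        # -depth renames children before their parent directories
        find /dest -depth -name '*[A-Z]*' | while IFS= read -r f; do
            dir=$(dirname "$f"); base=$(basename "$f")
            mv -n "$f" "$dir/$(printf '%s' "$base" | tr '[:upper:]' '[:lower:]')"
        done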

    Read the article

  • Problem opening files with Gvim and NERDTree on Windows 7

    - by Oscar Duignan
    I've just installed gvim on Windows 7 for the first time, and I'm having a problem opening some files. When I open files (I'm doing this through NERDTree), vim flashes up a cmd window for a few seconds before closing it (too quickly to make out the contents), and I end up with a C:/Program folder and a /Files/vim/vimfiles/doc/ folder in the directory of the file I just opened. I can see it's trying to access C:/Program Files/vim/vimfiles/doc, which is where vim is installed, but it's choking on the space, and I'm not familiar enough with gvim to work out why. Any and all ideas are greatly appreciated.

    Read the article

  • Importing photoimpact .ab3 file database into another album program

    - by Mel F
    I have PhotoImpact Album v5, which has now maxed out at 65,000 photos with database information. I can export the database and file information to a CSV or dBase IV file, but I cannot find any way to import the information into a competitive photo organization/database product like Adobe Lightroom, ACDSee Pro 3 or Photo Collector, etc., that is more modern, works on Vista/Windows 7, and has no file limitations. Any help would be greatly appreciated. I really cannot re-enter the data for 65,000 photos manually. I would use any good program that I can transfer my files and data info to.

    Read the article

  • Microsoft Word files have weird icons suddenly?

    - by leaf68
    All of a sudden, my Microsoft Office files have the generic file icon; it happens with PowerPoint and Word, but not Excel. I can open the files just fine. I tried restarting my computer, creating new files, pressing F5 on the desktop, and turning off my Windows XP theme patcher, and none of it worked. Any ideas? EDIT: I've also noticed it has happened to the following programs: Windows Live Messenger, iTunes, and Skype, so it's not just MS Office. Yet some are still fine, like IE, Paint, GIMP, Paint.NET, a few web browsers, etc.
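
    Several unrelated programs losing icons at once usually points to a corrupted icon cache rather than the applications themselves; a sketch for rebuilding it on XP (on Vista/7 the file lives under %localappdata% instead):

        taskkill /f /im explorer.exe
        del /a "%userprofile%\Local Settings\Application Data\IconCache.db"
        start explorer.exe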

    Read the article

  • locked files on HFS+ home partition shared between OSX/Linux

    - by HazyBlueDot
    I dual boot into Arch Linux and OS X 10.6 on my MacBook Pro. I synced my UID between both OSes and created an HFS partition (with no journaling) to use as a shared home/Users partition. For the most part it works just as I'd expect, but sometimes when I'm booted into OS X certain files are "locked" (when I Get Info on a particular file, the "Locked" box is checked under the "General" pane; I can resolve the issue by manually unchecking the box) and/or I get "Operation not permitted" when I try deleting or chmod'ing a file. In both cases I don't see anything out of the ordinary in the permission bits displayed with ls -l, except for a trailing '@' character in the position where the sticky bit would normally appear: -rw-r--r--@ 1 myuser mygroup 296 Mar 29 11:44 myfile This '@' character shows up on ALL normal files, so it doesn't seem to be linked to the locked/operation-not-permitted situation. On the Linux side of things I never have permission problems. To the best of my limited knowledge and experience with ACLs, I've not found any ACLs on any of the files in question. For what it's worth, I do most of my file editing using emacs (Aquamacs in OS X); is it possible it is setting weird permission bits? What is the "locked" setting that OS X uses, and does it have a permission-bit equivalent (so that, at the very least, I could recursively unlock all files in my home directory from the terminal)? Why might some, but not other, files get "locked" when booting into OS X? What is the meaning of the '@' character?
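
    On OS X the "Locked" checkbox maps to the BSD uchg file flag (not a permission bit), and the trailing '@' means the file carries extended attributes; a sketch for inspecting and clearing both from Terminal:

        ls -lO myfile           # -O shows BSD flags; "uchg" is the locked flag
        chflags -R nouchg ~/    # recursively clear the locked flag
        ls -l@ myfile           # list the extended attributes behind the '@'
        xattr -l myfile         # print each attribute's contents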

    Read the article

  • SCCM 2012 R2 - OSD Task Sequence failure on physical computers

    - by Svanste
    I'm trying to deploy Windows 7 with SCCM 2012 R2 to physical desktops and laptops. The task sequence works fine on a VM but keeps failing on physical machines, no matter what I try, so I think it has something to do with drivers. I have already tried both the "Auto Apply Drivers" step with a WMI query for the model, and the "Apply Driver Package" step with a WMI query for the model. In the link below I have added a zip file containing two other zip files: one is a captured log from a failed OSD on a desktop, the other is the export of my task sequence. Download zip-file with log and TS If anyone could resolve the issue, or share their own task sequence for such a deployment (pure SCCM 2012 (R2), no MDT), that would be great.
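
    For reference, a frequent cause of this failure mode is a model condition that never matches the hardware's exact string; verify it on the target first:

        wmic computersystem get model

    and then use that string in the WQL condition on the driver step (model shown here is hypothetical):

        SELECT * FROM Win32_ComputerSystem WHERE Model LIKE "%Latitude E6430%"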

    Read the article

  • All my folders and files on my flash drive have been renamed automatically and I can no longer open them... I need those files

    - by jennifer
    I opened up my flash drive this morning, and all of my folders and files are normal except for one folder (the most important one) and all the files it contains. Those subfolders and files have been renamed with bizarre characters, and when I click to open them, a pop-up appears saying they're not accessible and the filename or directory name is incorrect. I don't want to reformat the flash drive because I'd lose all those files. Is there a way for me to restore them or something? I would attach a screenshot, but apparently new users do not have that privilege. If you have a vague idea of what I'm talking about, let me know and I can email you screenshots so you can have a better understanding. Any help is greatly appreciated!
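
    Garbled names plus "the filename or directory name is incorrect" often indicates FAT file-system corruption rather than lost data; checking the disk is a common first step (drive letter assumed; ideally image the drive first, since repairs modify the disk):

        chkdsk E: /f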

    Read the article

  • Iterating through folders and files in batch file?

    - by Will Marcouiller
    Here's my situation. A project has as an objective to migrate some attachments to another system. These attachments will be located under a parent folder, let's say "Folder 0" (see this question's diagram for a better understanding), and they will be zipped/compressed. I want my batch script to be called like so: BatchScript.bat "c:\temp\usd\Folder 0" I'm using 7za.exe as the command-line extraction tool. What I want my batch script to do is to iterate through "Folder 0"'s subfolders and extract all the ZIP files it finds into their respective folders. It is obligatory that the extracted files end up in the same folder as their respective ZIP files, so files contained in "File 1.zip" must land in "Folder 1", and so forth. I have read about the FOR...DO command on Windows XP Professional Product Documentation - Using Batch Files. Here's my script: @ECHO OFF FOR /D %folder IN (%%rootFolderCmdLnParam) DO FOR %zippedFile IN (*.zip) DO 7za.exe e %zippedFile I guess that I would also need to change the current directory before calling 7za.exe e %zippedFile for the extraction, but I can't figure out how in this batch file (though I know how on the command line, and even that it is the same instruction, "cd"). Anyone's help is gratefully appreciated.
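
    A corrected sketch of that loop (in a batch file, FOR variables need doubled percent signs; pushd/popd handles the directory change, and 7za.exe is assumed to be on the PATH):

        @echo off
        rem Usage: BatchScript.bat "c:\temp\usd\Folder 0"
        for /d /r "%~1" %%D in (*) do (
            pushd "%%D"
            for %%Z in (*.zip) do 7za.exe e "%%Z" -y
            popd
        )

    for /d /r walks every subfolder under the argument, so each archive is extracted in place next to its ZIP file.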

    Read the article

  • Restoring the owners on debian system files

    - by Vlad
    Due to my inattention, tiredness (and probably stupidity) I've run chown -R someuser:someuser / and now all your base are belong to us: every file on the server belongs to one user (lol). After a system restart, apache, bind9, mysql, and a dozen other applications don't start and fill their log files with permission errors. I haven't made any backups of system files, only of the db and website files... Please suggest some ways to revive my web server. I have only 2 months' experience with Linux, so please keep it simple...
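
    There is no single command to restore every ownership, but reinstalling the affected packages re-extracts their files with the packaged owners, and a few well-known spot fixes cover the common daemons; a partial sketch (ownerships assumed from a typical Debian layout; compare against a clean install before trusting it):

        chown -R mysql:mysql /var/lib/mysql
        chown -R bind:bind /var/cache/bind /var/lib/bind
        chown root:root /etc/sudoers && chmod 440 /etc/sudoers
        chown root:root /usr/bin/sudo && chmod 4755 /usr/bin/sudo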

    Read the article

  • Files listed by bash but unaccessible

    - by Cerin
    What would cause the following behavior on an Ubuntu 12.04 system? I've SSHed into a machine as the "ubuntu" user. Running ls -lah /data/* shows dozens of non-empty files (e.g. file1.txt, file2.txt, etc), all owned by the "ubuntu" user/group, and with full read/write access. If I try to cat /data/file1.txt, bash gives me the error "cat: /data/file1.txt: No such file or directory" In short, ls is listing files, but in every other way, the files essentially don't exist. I can't cat them or read them in any way. Even giving all the files 777 permission doesn't change anything. This is really bizarre. What's going on here?
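
    When ls shows a name that cat can't open, the name itself is often the culprit (non-printing characters or trailing whitespace embedded in it); a diagnostic sketch:

        ls -b /data                # -b escapes non-printing characters in names
        stat /data/file1.txt       # does a direct path lookup really fail?
        find /data -maxdepth 1 -name 'file1*' -exec stat {} +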

    Read the article

  • Amazon S3 bucket - download only certain files

    - by mottey
    Hi, I have an Amazon S3 bucket with 10,000 images sitting in it, following a standard naming convention: 001_small.jpg 001_large.jpg 002_small.jpg 002_large.jpg Because there is such a large number of files, I don't want to download ALL of them, and I don't want to sit there for a couple of hours selecting just the *_large.jpg files... Can someone suggest an S3 file manager that lets me select only the *_large.jpg files to download? Thanks!
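
    If a command-line tool is acceptable, the AWS CLI's include/exclude filters handle this directly; a sketch (bucket name hypothetical):

        # exclude everything, then re-include only the large variants
        aws s3 cp s3://my-bucket/ . --recursive --exclude "*" --include "*_large.jpg"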

    Read the article

  • background copy large files to a laptop?

    - by Roy Pardee
    Hey all, I like to watch Windows Media Center recorded-TV files on my laptop in bed. I find, though, that when the programs are in HD I get a lot of stuttering and delays, no doubt because of the amount of data being transferred. I actually have a fair amount of space on the laptop's HDD and wouldn't mind moving the files onto that drive, where no doubt my problem would go away, but that requires some planning and time for the files to move. Is there a utility out there that would kind of 'trickle' the files over to the laptop over a long period of time, without soaking its bandwidth? Something like MS's BITS tech? Both machines are running Win7. Many thanks! -Roy
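
    BITS is in fact scriptable from PowerShell on Windows 7; a sketch (share and paths hypothetical):

        # Low priority makes BITS yield bandwidth to everything else on the machine
        Import-Module BitsTransfer
        Start-BitsTransfer -Source '\\htpc\RecordedTV\*.wtv' -Destination 'C:\TV' -Priority Low
        # add -Asynchronous (and later Complete-BitsTransfer) to detach it from the console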

    Read the article
