Search Results

Search found 13068 results on 523 pages for 'copy and paste'.


  • Access is denied while moving a file between volumes

    - by logeeks
    I have a portable 500GB HDD plugged into my Dell XPS system, which runs Windows 7 Professional. The problem is that when I try to open a file (a Visual Studio .sln file) I get an "access is denied" error, and when I try to copy the file to a different location (within my local HDD) Windows says I need permission to complete the task. I've checked and confirmed the following: 1) I was logged into an admin account before attempting these operations; 2) my admin account has 'Full Control'; 3) I have full control over the portable HDD; 4) I changed the UAC setting to 'Never notify'. Please help.
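
    The usual fix when NTFS entries written by another machine block access is to take ownership and re-grant rights from an elevated command prompt. A hedged sketch (E:\projects and YourUser are placeholders, not names from the question):

      takeown /F E:\projects /R /D Y
      icacls E:\projects /grant YourUser:F /T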


  • File permissions issue with an NFSv4 share, uploaded from a Mac Lion

    - by POP.sicle
    I have an NFSv4 share that was working fine for sharing files across the network when the Macs used Snow Leopard. The NFS share has one cloned user/group that all clients autoconnect as. However, when I use a Lion Mac to copy a file from its user directory to the NFS, no other computer (Mac SL / Mac Lion / Win7) can edit, delete, or write to the uploaded file, despite the correct read/write/execute permissions being visible on the NFS and through the terminal. Attempting to edit the file permissions through Finder completely locks the file. I suspect this has something to do with Lion's ACLs (or maybe its version control) conflicting with NFSv4. Is there a way to disable or ignore extended ACLs or extended file permissions on the NFSv4 side that would keep users from running into this conflict? The current workaround is to use NFS Manager and set automounts to ignore ownership, but installing NFS Manager and configuring automounts on all of the computers seems more troublesome than reconfiguring the NFS settings. Advice?
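
    One avenue worth checking, as a hedged sketch rather than a confirmed fix: Lion's Finder tends to attach extended ACL entries that override the POSIX bits other clients rely on. From the Lion Mac, those entries can be inspected and stripped with OS X's chmod (/Volumes/share/... are placeholder paths):

      ls -le /Volumes/share/file    # a '+' and indented ACL lines mark extended entries
      chmod -N /Volumes/share/file  # remove all ACL entries from one file
      chmod -RN /Volumes/share/dir  # or recursively for a directory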


  • chmod -R 777 /. - RHEL 5.5

    - by user1263746
    A shell script test went bad and issued chmod -R 777 /. against the system, instead of chmod -R 777 ./, and as expected it clobbered the permission metadata. We have turned the system off, and it will not function properly the next time it is turned on. I am told that rpm --setperms -a and rpm --setugids -a should at least fix the permissions of the files maintained by rpm. Is it worth doing? And is there any script available that will copy the permissions from an identical system, to at least get the box working? The box is running RHEL 5.5. Thanks!
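
    A sketch of both steps mentioned above, with file names as placeholders: the rpm pass covers packaged files, and a getfacl dump taken from a healthy, identical RHEL 5.5 box covers the rest (getfacl records owner, group, and mode, and setfacl --restore reapplies them when run as root):

      # on the damaged box: restore owners/groups first, then modes, for packaged files
      rpm --setugids -a
      rpm --setperms -a

      # on the identical healthy box: record permissions for the whole tree
      cd / && getfacl -R . > /tmp/perms.acl

      # copy perms.acl to the damaged box, then replay it from /
      cd / && setfacl --restore=/tmp/perms.acl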


  • How to convert an mpeg4-aac video to mpeg-ts using MEncoder?

    - by Mahendra Liya
    Hello, I know this can be done using FFmpeg, and I have done it. The only problem is that FFmpeg actually tries to re-encode the file while converting to mpeg-ts, which I don't want; I just want to change the container format to mpeg-ts without re-encoding the media. Is this type of conversion possible with FFmpeg? (I know about the "copy" option, but it works with H264-AAC and not with MPEG4-AAC.) Is it possible to change an MPEG4-AAC file to the mpeg-ts container format with MEncoder? I would like the personal opinion/advice of those who have already worked on such stuff. Thanks in advance.
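
    For reference, the remux syntax for both tools, as a sketch rather than a verified recipe (whether the MPEG-TS muxer accepts MPEG-4 Part 2 video depends on the build; input.mp4 and output.ts are placeholders):

      # FFmpeg: copy both streams, forcing the MPEG-TS container
      ffmpeg -i input.mp4 -vcodec copy -acodec copy -f mpegts output.ts

      # MEncoder: stream copy through the libavformat MPEG-TS muxer
      mencoder input.mp4 -ovc copy -oac copy -of lavf -lavfopts format=mpegts -o output.ts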


  • Replacing files in a folder structure with files from an unsorted folder

    - by Andrew
    I have over 50,000 PDFs organized into folders inside a folder called PDFACT. I needed to compress these files, so I ran them through Adobe to batch-compress them, and this worked, except that Adobe could only output the files without their folder structure. So where I started with 50,000 PDFs organized into hundreds of subfolders, I ended up with a single folder holding 50,000 compressed PDFs in alphabetical order. Somehow I need to replace all the original PDFs with their compressed copies. Let me give an example: in the folder PDFACT we have the file C:\PDFACT\BIG DINNER\BILL\NEWESTBILL.PDF, and in the output folder that Adobe created we have just C:\COMPRESSED_PDF_FOLDER\NEWESTBILL.PDF. This copy is smaller than the one in PDFACT and has the same name, but it is just lumped in with every other PDF; the folder structure and subfolders are gone. Is there any way to replace all the larger uncompressed PDFs inside the original folder structure with their now-compressed counterparts?
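
    A hedged one-liner for a Windows command prompt, using the folder names from the question and assuming file names are unique across the PDFACT tree (if two subfolders held different PDFs with the same name, both would receive the same compressed copy):

      for /R "C:\PDFACT" %F in (*.pdf) do if exist "C:\COMPRESSED_PDF_FOLDER\%~nxF" copy /Y "C:\COMPRESSED_PDF_FOLDER\%~nxF" "%F"

    (Inside a .bat file, double the percent signs: %%F.)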


  • Default values of options in autoconf ./configure scripts

    - by hamstergene
    Having run ./configure --help, I get a list of options that can be tweaked for the future build, for example "--enable-luajit  Enable LuaJit Support" and "--disable-env  clearing/setting of ENV vars". Though any option can be used with either an enable or a disable prefix, some are presented as --enable-me and others as --disable-me in the help output. Is this supposed to hint at the default values, and if yes, how do I figure them out? Both readings make sense to me: luajit is disabled by default and is therefore presented as --enable-luajit, so I can enable it by conveniently copy-pasting it from the help output to the command line; or, being listed with --enable in the help output indicates that luajit is enabled by default.
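
    One way to settle it, assuming you have the source tree (a sketch; the luajit option name is taken from the help output above): the default is whatever the AC_ARG_ENABLE definition falls back to when the flag is absent, so grep for that definition:

      grep -n -A 3 'AC_ARG_ENABLE(\[luajit\]' configure.ac
      # or read the generated script itself:
      grep -n -B 2 -A 5 'enable_luajit' ./configure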


  • Linux file permissions not being preserved

    - by yellavon
    I am deploying some custom software as root (a necessity in this situation). I set the owner/group to user1:user1 and set all the files to 644 beforehand in the shell, then copy and deploy with ant. However, when the files get copied over from the deployment directory, the ownership changes back to root and all the files are installed with 666 permissions. This seems to occur whether the file is overwritten or newly created. I believe there is a way to set an option on the cp and mv commands to preserve permissions, but that would be a lot of commands to change. How can I fix this? Is there some setting I can change temporarily for root so the install always preserves the file permissions?
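
    For the cp/mv side the options are short (a sketch; /opt/app is a placeholder deploy path). If ant is doing the copying, note that ant's copy task is known for not preserving Unix permissions, so a post-deploy fix-up pass may be the practical answer:

      cp -p file dest/          # -p preserves mode, ownership, timestamps
      cp -a srcdir/ destdir/    # archive mode: recursive and preserving

      # or repair ownership and modes after the ant deploy
      chown -R user1:user1 /opt/app
      find /opt/app -type f -exec chmod 644 {} +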


  • Fabric and Cygwin don't work with Windows UNC paths

    - by tcoopman
    I have some strange problems with Fabric deployment to Windows Server 2008 R2. The thing I am trying to accomplish is to copy some files to a shared folder with a Fabric script (this script does a lot of other things too, but only this step gives me problems). The problem: when I try to access a UNC (Universal Naming Convention) path, I always get access-denied style errors if I run the script through Fabric, but when I run the same command in an SSH prompt (same user), it works fine. Examples: with "robocopy f:/.... //share", SSH works fine, but from Fabric I get "Logon failure: the user has not been granted the requested logon type at this computer." With "cd //share", SSH works fine, but from Fabric I get "//share: Not a directory". Further information: uname -a and whoami return exactly the same thing under Fabric and SSH. I also tried things like mount and net use, but those commands all run into the same kind of problem.
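
    One workaround worth trying, hedged since it depends on why the second-hop logon fails: authenticate to the share explicitly inside the Fabric-run command, with placeholder server, share, and credentials:

      net use \\server\share /user:DOMAIN\deployuser secret
      robocopy F:\build \\server\share\dest /E
      net use \\server\share /delete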


  • Home server - HP ProLiant MicroServer - Software and setup - OS on USB stick? [closed]

    - by Lloyd Watkin
    I've just purchased an HP ProLiant MicroServer for home use. I want to set it up with a web server, Samba shares, the usual stuff. My question is really about system setup. It has an internal USB socket, so I attempted to install a copy of Fedora 14 onto a stick in it. I turned off X/Gnome, but it still ran like a pig. I've now put the OS on one of the internal disks (250GB, 7200rpm), but I was wondering if there is a way to utilise the internal USB to give me better power saving, allowing the hard drives to be shut down when not in use. How would you set this server up? I'd rather not go to the extra cost of an SSD right now, but if that's the best way, then so be it.
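
    On the power-saving point: if the OS ends up on the USB stick or an SSD, the data disks can be told to spin down when idle. A sketch with hdparm, where /dev/sdb is a placeholder data disk:

      hdparm -S 120 /dev/sdb    # -S 120 = spin down after 120 * 5 s = 10 minutes idle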


  • Subversion error: (405 Method Not Allowed) in response to MKCOL

    - by Sergio del Amo
    I am getting the following error while trying to commit a new directory addition: svn: Commit failed (details follow): svn: Server sent unexpected return value (405 Method Not Allowed) in response to MKCOL request for '.... I have never seen this error before. How can I fix this problem? Solution: I managed to solve it. A folder with the same name as the new one already existed in the repository, so: 1) delete the parent directory of the folder giving the problem; 2) run SVN update; 3) delete the conflicting folder and SVN commit; 4) copy the new folder back in, schedule it for addition, and SVN commit.
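
    The same recovery sketched as commands (working-copy paths are placeholders):

      svn update parent/
      svn delete parent/newdir            # remove the stale same-named folder
      svn commit -m "remove stale folder" parent/
      cp -r /backup/newdir parent/newdir  # bring the new content back in
      svn add parent/newdir
      svn commit -m "re-add folder" parent/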


  • Suggestions for migrating a Windows 7 install to a new 4K-sector disk

    - by myCubeIsMyCell
    I'm looking to upgrade the disk in a Windows 7 box to a new, larger drive. In the past, for such migrations, I'd just hook both drives up, boot a Linux disc, and use dd to copy from one disk to the next, then boot the new drive and expand the partition. The drive I just purchased, however, is a Western Digital model using 4K sectors, and I'm not sure if there would be any complications using my old method when moving from a 512-byte-sector drive. My current plan is to try the migration by making a Windows 7 system image backup to an external drive, then restoring the image to the new drive via the system repair boot disc. Any suggestions or recommendations on how best to complete this migration would be greatly appreciated. Thanks!
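
    For reference, the old method itself, as a sketch (device names are examples, so verify with fdisk -l first). A raw copy carries over the old partition offsets, so on the 4K-sector drive it is worth checking afterwards that each partition still starts on a sector divisible by 8:

      dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync
      fdisk -lu /dev/sdb    # start sectors should be multiples of 8 for 4K alignment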


  • Reliable Backup Solution for Linux for Complete System Restoration

    - by Chris S
    What's the best backup solution for Linux that can completely restore the entire filesystem to a blank hard drive (including partitioning) after an old hard drive dies? I'm currently running a few Ubuntu machines, some with RAID-1 and others without RAID (mostly laptops). I'd like to implement a backup solution that can take incremental snapshots of the entire filesystem, so that if I were to replace all the hard drives in a machine, I could use the backup to restore a perfect copy of the previous filesystem. Unfortunately, nearly all the backup solutions I've found seem to be glorified rsync scripts, which only back up some files and have no easy way to restore once the entire filesystem is gone. Some of the more complicated solutions, like Bacula, might do what I need, but require a complicated server/client setup and are notoriously difficult to maintain. I've heard that Apple's Time Machine utility has this ability, and I've had similar success taking differential disk images with Acronis True Image on Windows, but of course neither of these works on Linux. Is there anything comparable for Ubuntu?


  • Is there an Alternative to TextMate's mate and rmate for Windows?

    - by TiernanO
    As part of the upcoming TextMate 2 release, there will be a new feature called rmate, which will allow you to edit files on a remote machine (Linux/Unix/OS X) via SSH using your local copy of TextMate. I know I could use Cyberduck: find the file I want to edit, download it locally, work on it, and then re-upload it. But with rmate it looks like you just type rmate on the remote server and TextMate pops up with the file (I have not tried it, since I am not a TextMate owner). Is there something similar for Windows?


  • How do I keep multiple copies of Outlook in sync when using RPC over HTTP?

    - by Don
    I use Outlook 2007 at work with our Exchange 2003 server. I just set up my home system with Outlook 2007 so that I could use RPC over HTTP to access Exchange without having to use a VPN. It works fine: I can get mail, send mail, etc. What it doesn't seem to be doing is staying in sync. For example, I read a few messages at home and moved them from the Inbox into different folders, and that all seemed fine. But when I log in to my work machine and look at the copy of Outlook there, the mail is still unread and nothing has been moved. Am I missing something simple here? I would have assumed that my home machine should be telling Exchange that these messages have been read and where they belong. Both machines are running Windows 7, if that matters. Ideas?


  • Bootable SD card still shows reduced capacity, even after formatting

    - by Inazuma
    I have an SD card which I used to run my Raspberry Pi. I wanted to update the copy of Raspbian on it, so I formatted the card using the software from www.sdcard.com. I followed all the instructions correctly, but the size of my SD card didn't go back to its full capacity. It is a 4GB SD card which, after its spell in the Raspberry Pi, had shrunk to 52MB, which I understand is normal. After formatting, the size rose only to 3.69GB. This means that there is not enough space to install a new OS, so how can I make my SD card 4GB again? Any help would be much appreciated!
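
    On Windows, diskpart can usually reclaim the space by wiping the card's partition table and recreating a single partition. A hedged sketch; double-check that "select disk" picks the SD card and not a hard drive, because clean erases the selected disk:

      diskpart
      list disk
      select disk 1
      clean
      create partition primary
      format fs=fat32 quick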


  • Exchange ActiveSync does not work for one user

    - by jshin47
    One particular user on our system is unable to connect to Exchange ActiveSync from her iPhone. When I try to connect using my own credentials on her iPhone, it works (everything begins syncing), but when I input her credentials, the Settings app verifies that the credentials are correct, yet nothing syncs. For example, if I open Mail, no items are shown, and when I attempt to force a sync, it says "Cannot connect to server." In the Exchange 2010 Management Console the user is no different from the others, and Exchange ActiveSync is set to "Enable" under Mailbox Features. EDIT: Alternatively, if there is some easy way to create a new user account/mailbox and copy all of the contents of the old one over, I bet that would work, and that would be fine as well. She is a Mac user, so we do not have to worry about her Active Directory account.


  • Is there no such thing as a Gigabit switch?

    - by Torben Gundtofte-Bruun
    According to the manufacturer specification, even my rather plain desktop computer has "Gigabit Ethernet". So when I want to copy large files over the LAN (not Internet) it would make sense to have a gigabit switch. I'm searching eBay for a gigabit switch for a planned home network upgrade. The products I find are all labeled "gigabit" but they all have 24 x 10/100Mbit autosensing ports and sometimes 2 x 10/100/1000Mbit autosensing ports. It was my understanding that 10/100 is ancient and that modern computers have network interfaces that work with 1000Mbit, so it would make sense to get a switch that has 24 x 1000Mbit ports. Did I misunderstand, or are sellers (deliberately?) mislabeling older hardware? (Let's not dive into wired vs. wireless networks and how "N" wireless is fast. You'd be right, but not answering the question.)


  • DBD::mysql gives mysql_init not found

    - by highBandWidth
    I have to install a non-admin copy of MySQL and the Perl module DBD::mysql in my home directory. I installed MySQL in ~/software/db/mysql, and this works, since I can start and stop the server and get to the mysql prompt. Then I downloaded the Perl module and installed it using perl Makefile.PL PREFIX=~/myperl/ LIB=~/myperl/lib/lib64/perl5/ --mysql_config=/myhome/software/db/mysql/bin/mysql_config --libs=/myhome/software/db/mysql/lib/libmysqlclient.a, followed by make and make install. I did this to use the statically linked MySQL client library. perl -MDBD::mysql -e 1 gives no errors. However, when I actually try to use the module, I get: /usr/bin/perl: symbol lookup error: /myhome/myperl/lib/lib64/perl5/x86_64-linux-thread-multi/auto/DBD/mysql/mysql.so: undefined symbol: mysql_init
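
    Since mysql_init lives in the client library, the symptom suggests mysql.so ended up expecting the shared libmysqlclient rather than the static archive. A hedged diagnostic and workaround sketch:

      # see which libraries mysql.so actually wants at run time
      ldd ~/myperl/lib/lib64/perl5/x86_64-linux-thread-multi/auto/DBD/mysql/mysql.so

      # workaround: point the loader at the private MySQL install
      export LD_LIBRARY_PATH=/myhome/software/db/mysql/lib:$LD_LIBRARY_PATH
      perl -MDBD::mysql -e 1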


  • Symantec CPS / Backup Exec 11D Service stuck in "Starting" Status

    - by user42289
    I have two Windows 2003 servers (one SE, one SBS), both SP2, and both virtual machines on Microsoft Virtual Server 2005 R2. All of a sudden, about two weeks ago, Symantec Backup Exec / CPS 11d stopped working on them. One is the media server; the other is our Exchange 2003 server. There is another copy of CPS on our file server whose service runs fine, but that machine is not a VM. By "stopped working" I mean the "Backup Exec Continuous Protection Agent" service is stuck in "Starting" status. On the non-Exchange server I've tried uninstalling the Windows Updates that were applied around the time of the failure, repairing the CPS install, and uninstalling and reinstalling it. Exactly the same problem in the end.
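
    One generic, hedged first step for a service wedged in "Starting": find its PID and kill the hung process before retrying. The service name below is a placeholder (look up the real one in the Services console properties):

      sc queryex "CPSAgentService"
      taskkill /F /PID 1234    # use the PID reported by sc queryex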


  • Need disk image help pronto!

    - by data
    I recently got a job as a junior network administrator. Last week the senior admins did their yearly reinstall of Server 2003, Exchange, drivers, etc. on the main server. I've been asked to back up the disk so that next year they can just copy over the pre-made image. What tools can I use to create an image of the entire server's HDD and to load it back on (I'd like to test it in the sandbox)? To impress them, a program that is free is preferable, and ideally a tool that can do it all booted from a USB drive.
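
    The bare-bones free approach is dd from a live Linux USB stick, writing the image to an external drive. A sketch, with /dev/sda as the server disk and /mnt/usb as the mounted external drive (verify both before running):

      dd if=/dev/sda of=/mnt/usb/server.img bs=4M conv=noerror,sync
      # restore later, e.g. into the sandbox machine's blank disk:
      dd if=/mnt/usb/server.img of=/dev/sda bs=4M

    Clonezilla packages this same workflow into a free bootable menu with compression, which may impress more than raw dd.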


  • Why not install Msvcr71.dll into system32?

    - by hillu
    While looking for an authoritative source for the missing Msvcr71.dll that is needed by a few old applications, I stumbled across the MSDN article Redistribution of the shared C runtime component in Visual C++. The advice given to developers is to drop the DLL into the application's directory instead of system32 since DLLs in this directory are considered before the system paths. What can/will go wrong if I (as an administrator, not a developer) decide to take the lazy path and install Msvcr71.dll (and Msvcp71.dll while I'm at it) into the system32 directory (of 32 bit Windows XP or Windows 7 systems) instead of putting a copy in each application's directory? Is there another good solution to provide the applications with the needed DLLs that doesn't involve copying stuff to the application directories?


  • Can someone correct me about Windows 2008 DFS implementation

    - by cwheeler33
    I have two Windows 2003 file servers using DoubleTake at the moment. They are in a 2003 AD domain, and each server has its own disk set. It is time to replace the servers, and I want to use Windows 2008 64-bit Enterprise. I was thinking of using DFS-R to replace DoubleTake. The part I'm not sure about is clustering: do I need shared storage if each server has a copy of the data? I want the data available under the same share name, so maybe I don't fully understand how DFS is set up. I currently have about 6TB of data and expect to grow by 3TB a year on these file servers. Any resources/books that could teach me would be good to know about as well.


  • How can I transfer a blog hosted on Community Server to WordPress?

    - by Martin Plante
    I have a blog hosted on Community Server 2.1 (an old version). I want to transfer it to WordPress (.com). I tried exporting it to the BlogML format, but that failed with an exception. I saved a copy of the full RSS feed by setting the number of posts to show to a huge number, saving, and then lowering it back. I have all the images and do not mind having to upload and rename them one by one. There must be a way to read that RSS file and either import it directly into WordPress or, for a more complicated path, transform it with the proper XSLT into BlogML (or another format) to import it into WordPress?
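
    For the XSLT path, the command side is the easy part. A sketch, where rss-to-wxr.xsl is a hypothetical stylesheet you would still have to write (WXR, WordPress's extended RSS, is what its importer expects):

      xsltproc rss-to-wxr.xsl feed.xml > import-me.xml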


  • Is it possible to manually install RDP client 6.1 for platforms that aren't directly supported?

    - by Matt
    I have some clients that need RDP client 6.1 in order to use the new Easy Print driver. However, the installer doesn't allow it to install on several platforms, such as Windows Home Server and Windows Server 2003, because they are not XP; the version check prevents it from going further. I'm reasonably confident that it should actually run, however, and I want to try it. Has anyone done this before? What I should really ask is: what files should I manually copy (backing up the originals first)? Just the exe, or are there lots of dependent DLLs that need upgrading too?


  • Create Virtual Image of Laptop before Formatting

    - by Simon Mark Smith
    I have a 3-year-old laptop running Windows XP that I used for business. Although I have not used the laptop in over a year, I now want to re-commission it with Windows 7 and a fresh install. Before I do the fresh install, I want to create a virtual image of the laptop that I can keep and potentially run on my desktop machine, should I ever need to access any of the old files/projects it currently contains. I know that most people will say to just copy the files over to the desktop, but my concern is the configuration of the laptop: I used it for development, and it has older versions of Visual Studio, SQL Server, ActiveX controls, etc. than I currently use, so I really want to preserve the environment, not just the files. So really I am asking: what is the best tool-set/method to achieve this? I understand there are free VM tools available, but I have never done this before and would appreciate any help.
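
    One free tool that fits this exactly, offered as a suggestion rather than a tested recipe: Sysinternals' Disk2vhd runs on the live XP laptop and snapshots its disks into a VHD that Virtual PC or VirtualBox can boot (E:\ is a placeholder external drive):

      disk2vhd * E:\laptop.vhd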

