Search Results

Search found 7545 results on 302 pages for 'backup and restore'.

Page 39 of 302

  • How can ShadowProtect SBS back up to alternating external drives?

    - by detly
    I am trying to configure ShadowProtect SBS (v. 4.1.5.10129) on Windows Server 2003 SBS to back up my server's hard drives to two alternating external drives. What I want is to be able to swap one drive for the other every Friday and have ShadowProtect continue on the same schedule. Ideally, this would require no user interaction whatsoever, apart from physically unplugging one drive and reconnecting the other.

    The trouble is, Windows Server 2003 does not allow you to assign the same drive letter to two different devices. So if I plug in drive #1 and assign it drive letter "X:", then the next week, when I unplug it and plug in drive #2, that drive gets some other letter. And since ShadowProtect is set to back up to "X:\", it can't find the destination and the backup fails. The drives are Samsung STORY Station 3.0 2TB drives.

    How can I configure things so I can just swap the drives over every week without having to reconfigure drive letters every time?
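
    Not from the thread, but a sketch of one common workaround: give both external drives the same volume label, and let a scheduled script pin whichever one is currently attached to X: via WMI. A minimal sketch in PowerShell, assuming PowerShell is installed on the SBS 2003 box (the same Win32_Volume call works from VBScript if not); the "SPBACKUP" label is a placeholder:

        # Hypothetical sketch: run from Task Scheduler at startup or on a short
        # interval. Assigns X: to whichever attached volume is labeled SPBACKUP.
        $vol = Get-WmiObject Win32_Volume -Filter "Label = 'SPBACKUP' AND DriveType = 3"
        if ($vol -and $vol.DriveLetter -ne "X:") {
            $vol.DriveLetter = "X:"   # Win32_Volume.DriveLetter is writable
            $vol.Put() | Out-Null     # persist the new letter
        }

    With both drives labeled identically, ShadowProtect's "X:\" destination should resolve no matter which drive is plugged in.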

    Read the article

  • Backup solution, or, how Duplicati duped me

    - by blarghmaster
    TL/DR version: Mono + Duplicati.commandline.exe restore (etc.) spits this out for several files regardless of what I try. I am able to list sets, list files in said sets, even do a verify, but each time I do a restore of any kind, I get errors to the effect of:

        Failed to restore file: "snapshot/blahblah/2005-11-07.tar.gz",
        Error message: The partial file record for snapshot/blahblah/2005-11-07.tar.gz does not match the file

    Any advice here, or an idea of where to look for a better solution?

    FULL STORY: I've recently put together a nice, clean, friendly backup solution for several servers, predominantly Linux, though occasionally a Windows box is added too. The solution as-is meets all my requirements and does so well... save one: cross-compatibility.

    The solution is based on a combination of several elements, but eventually comes down to using Duplicity and Duplicati for the actual storage of files. The entire solution was ready to go before I realized that Duplicati does not, in fact, allow me to restore my files to a Linux box, regardless of what the command line under Mono might tell you. It just spits out errors on random zip and image files, for apparently no good reason; I have tried several options to get it to restore, and several versions of Mono, including installing it pretty much lib-for-lib. There is no effective log of the reasons for these errors, and even the "--debug-output=true" flag does nothing. Listing sets, listing files in those sets, and verifying all work, but every restore fails with errors like the one quoted above.

    Now I could most likely use the friendly instructions on Duplicati's site and script a bash equivalent of the restore, but that's not exactly ideal. Any advice on this? Or possibly an alternative solution that offers the same benefits as Duplicati/Duplicity but actually works across platforms?

    Read the article

  • Concurrent modification during backup: rsync vs dump vs tar vs ?

    - by pehrs
    I have a Linux log server where multiple applications write data. Data is written in bursts, and to a lot of different files. I need to make a backup of this mess, preferably preserving as much coherence between the file versions as possible and avoiding truncated files. The total amount of data on the server is about 100 GB. What I really want (but can't have) is to shut down, back up the system cold, and then start it up again.

    What kind of guarantees against concurrent modification do the various backup tools give? When do they "freeze" the file versions? I am looking at rsync, dump and tar at the moment, but I am open to other (open-source) alternatives. Changing the applications or blocking writes during backups is sadly not an option. The system is not running LVM (yet), but I have considered it for when I rebuild the system, and then snapshots.

    Read the article

  • How to reuse a backup in Time Machine on Snow Leopard after a logic board change, after choosing the wrong option

    - by kmiffy
    After my logic board was replaced, I connected my laptop back to my network, and Time Machine gave me a popup, as shown in this thread: http://superuser.com/questions/78068/recycle-time-machine-for-new-machine/78264#78264 I misread the question and clicked on "Create New Backup" when I should have clicked on "Reuse Backup" to connect to my old backup file. How can I trigger that popup again? Turning Time Machine on and off does not work, and the instructions on forums to fix it via the terminal don't work because Snow Leopard is missing the fsaclctl command (and I'm also not familiar with terminal commands). Thanks.

    Read the article

  • Why can't I create a Windows backup on my secondary disk?

    - by Brian Sullivan
    I've installed Windows 7 Ultimate on an SSD that I've added to the XPS desktop I bought from Dell. I would like to use the built-in backup functionality to create incremental backups and store them on the large drive that came with the machine. I formatted the large drive and turned it into a Basic disk. However, when I try to set the backup location to the large internal disk (E:\) in the "Set up backup" wizard, I get a message saying, "A system image cannot be saved on a drive that your computer boots from or that Windows is installed on." Windows is not installed on that disk. I even deleted the OEM partition that was on the disk and removed it completely from the boot order in the BIOS. Any clue why Windows is griping at me about this?
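
    One thing worth checking (an assumption, not a confirmed diagnosis): Windows may still flag E: as the "system" volume if the boot files from the original Dell install ended up there, and the wizard refuses system volumes with exactly this message. A quick check from PowerShell:

        # Hypothetical check: list which volumes Windows considers boot/system.
        # If E: shows SystemVolume = True, the boot files still live on it,
        # which would explain the wizard's refusal.
        Get-WmiObject Win32_Volume |
            Select-Object DriveLetter, Label, BootVolume, SystemVolume |
            Format-Table -AutoSize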

    Read the article

  • How do I back up (from Vista Home Premium) to a FAT32 HDD connected to a wireless router with no user account or password set?

    - by Scubadooper
    I have a Seagate Expansion 2TB HDD which I've managed to get working on my wireless router (thanks to fat32format); however, I haven't managed to get my backup to work with it. Vista Home Premium requires that I input a username and password in the backup utility. As far as I'm aware, that type of access control isn't set on the HDD. Can I set up access control on the HDD, and if so, how? Or can I set the backup to work without it? If so, how? Thanks for your help; I haven't been able to find the answer anywhere on the net so far.
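
    A speculative workaround, not a confirmed fix: some account-less SMB shares will accept a guest session, and pre-authenticating before the wizard runs sometimes satisfies its credential prompt. The share name below is a placeholder, and whether the router's SMB server accepts guest logons at all is an assumption:

        # Hypothetical: establish a guest session to the share first,
        # then point the Vista backup utility at it.
        net use \\ROUTER\SeagateExpansion /user:guest ""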

    Read the article

  • Saving backup files automatically in (g)Vim after saving a file.

    - by Somebody still uses you MS-DOS
    I had a problem with my gVim: I lost some important modifications after restoring my machine from hibernation. To avoid this kind of problem, I would like to know if it's possible to add something to my .vimrc (or a plugin) that automatically backs up every save made to my files. Disk space is not an issue; I can delete the backup files later. I'm already using:

        set backup
        set backupdir=~/.backup/vim
        set directory=~/.swap/vim

    This creates a myfile.extension~ in my ~/.backup/vim. ...but I would like this configuration to add ~ to the first save, ~0 to the second, ~1 to the third, ~2 to the fourth, and so on - something that keeps copies of every modification I make to a file. Is this possible? Do you know if there's a plugin for this?

    Read the article

  • DPM 2010 PowerShell Script to Easily Restore Multiple Files

    - by bmccleary
    I've got what I thought would be a simple task with Data Protection Manager 2010 that is turning out to be quite frustrating. I have a file server that is the only server in a protection group. This file server is the repository for a document management application, which stores the files according to data within a SQL database. Sometimes users inadvertently delete files from within our application and we need to restore them. We have all the information needed to restore a file: the file name, the folder the file was stored in, and the exact date the file was deleted.

    It is easy for me to restore the file from within the DPM console, since we have a recovery point created every day: I simply go to the day before the delete, browse to the proper folder, and restore the file. The problem is that in the DPM console, the cumbersome wizard requires about 20 mouse clicks to restore a single file and takes 2-4 minutes to get through all the windows. This becomes very irritating when a client needs hundreds of files restored... it takes all day of redundant mouse clicks.

    Therefore, I want to use a PowerShell script (and I'm a novice at PowerShell) to automate this process. I want a script that I pass a file name, a folder, a recovery point date (and a protection group/server name if needed) and that simply restores the file back to its original location, with some sort of success/failure notification. I thought this was a basic task for a backup solution, but I am having a heck of a time finding the right code. I have seen the sample code at http://social.technet.microsoft.com/wiki/contents/articles/how-to-use-a-windows-powershell-script-to-recover-an-item-in-data-protection-manager.aspx and tried to follow it, but it doesn't accomplish what I really want to do (it's too simplistic) and there are errors in it. Therefore, I would like some help writing a script to restore these files. An example of the known values needed to restore the data:

        DPM Server: BACKUP01
        Protection Group: Document Repository
        Data Protected Server: FILER01
        File Path: R:\DocumentRepository\ToBackup\ClientName\Repository\2010\07\24\filename.pdf
        Date Deleted: 8/2/2010 (last recovery point = 8/1/2010)

    Bonus points: if you can help me not only create this script, but also show me how to automate it by providing a text file with the above information that the PowerShell script loops through - or, even better, have it query our SQL server for the needed data - then I would be more than willing to pay for this development.
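
    For reference, here is a rough sketch of the shape such a script could take, using the DPM 2010 Management Shell cmdlets (Get-ProtectionGroup, Get-RecoveryPoint, Recover-RecoverableItem). This is an untested outline, not working code: the exact parameter names, the way Get-RecoverableItem is browsed down to a file in a subfolder, and the job object returned by Recover-RecoverableItem should all be verified against Get-Help on the DPM server.

        # Hypothetical sketch -- run inside the DPM 2010 Management Shell on BACKUP01.
        param(
            [string]  $ProtectionGroup  = "Document Repository",
            [string]  $ProductionServer = "FILER01",
            [string]  $FilePath = "R:\DocumentRepository\ToBackup\ClientName\Repository\2010\07\24\filename.pdf",
            [datetime]$RecoveryDate = "8/1/2010"
        )
        $fileName = Split-Path $FilePath -Leaf

        $pg = Get-ProtectionGroup -DPMServerName "BACKUP01" |
              Where-Object { $_.FriendlyName -eq $ProtectionGroup }
        $ds = Get-Datasource -ProtectionGroup $pg |
              Where-Object { $_.Computer -eq $ProductionServer }

        # Newest recovery point on or before the requested date
        $rp = Get-RecoveryPoint -Datasource $ds |
              Where-Object { $_.RepresentedPointInTime -le $RecoveryDate } |
              Sort-Object RepresentedPointInTime -Descending |
              Select-Object -First 1

        # Browse the recovery point for the file (for a file in a nested folder
        # this needs to walk down one level at a time -- the hand-wavy part)
        $item = Get-RecoverableItem -RecoverableItem $rp -BrowseType Child |
                Where-Object { $_.UserFriendlyName -eq $fileName }

        $opt = New-RecoveryOption -TargetServer $ProductionServer `
               -RecoveryLocation OriginalServer -FileSystem `
               -OverwriteType Overwrite -RecoveryType Recover
        $job = Recover-RecoverableItem -RecoverableItem $item -RecoveryOption $opt
        while (-not $job.HasCompleted) { Start-Sleep -Seconds 5 }
        Write-Host "Restore of $fileName finished with status: $($job.Status)"

    Feeding it from a text file would then be a matter of Import-Csv over a file with FilePath/RecoveryDate columns, calling the script once per row.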

    Read the article

  • r1soft agent is failing with the error: "write error while sending code: Broken pipe"

    - by curiousguy
    I have an Ubuntu 10.04.4 LTS server with the r1soft agent installed on it. Recently, the backups have been failing with the following error:

        write error while sending code: Broken pipe

    I have reinstalled the buagent, but to no avail. On checking the server logs, I can see the following errors listed:

        # tail -f /var/log/messages | grep -i buagent
        Nov 17 03:35:06 microscope buagent: Need to back up 126 sectors
        Nov 17 03:35:06 microscope buagent: (Righteous Backup Linux Agent) 1.79.0 build 12433
        Nov 17 03:35:06 microscope buagent: allowing control from backup server (10.128.136.195) with valid RSA key
        Nov 17 03:35:06 microscope buagent: allowing control from backup server (10.128.136.201) with valid RSA key
        Nov 17 03:35:06 microscope buagent: sending auth challenge for allowed host at (10.128.136.201) port (47890)
        Nov 17 03:35:06 microscope buagent: host (10.128.136.201) port (47890) authentication successful
        Nov 17 03:35:06 microscope buagent: Backup request accepted. Starting backup.
        Nov 17 03:35:06 microscope buagent: Snapshot completed in 0.010 seconds.
        Nov 17 03:45:03 microscope buagent: Error reading blocks from snapshot.
        Nov 17 03:45:03 microscope buagent: Reading blocks failed
        Nov 17 03:45:03 microscope buagent: error backup aborted
        Nov 17 03:45:03 microscope buagent: backup failed on agent closing connection
        Nov 17 03:45:03 microscope buagent: Backup failed.
        Nov 17 03:45:03 microscope buagent: write error while sending code: Broken pipe (32)
        Nov 17 03:45:03 microscope buagent: tell child write failed

    I tried changing the 'Timeout' and 'DiskAsPartition' values in the '/etc/buagent/agent_config' file, but no luck. I have also verified that the proper route to the backup server is in place, and the agent itself is running fine. Am I missing anything? Any help would be much appreciated.

    Note: CDP 2.0 is installed on the backup server.

    Read the article

  • How do I restore a non-system hard drive using Time Machine under OS X?

    - by richardtallent
    I dropped one of the external drives on my Mac Pro and it started making noises... so I bought a replacement drive. No biggie, that's why I have Time Machine, right? Now that I have the new drive installed and initialized, how do I actually restore the drive from backup? Time Machine is intuitive when it comes to restoring the system drive or restoring individual folders/files to the same literal device, but I'm a bit stuck on how to properly restore an entire drive that is not the boot drive.

    I saw one suggestion to give the new drive the same volume name as the old one and then go into Time Machine; I haven't tried that, since the information is unconfirmed. For now, I've gone to the Time Machine volume, found the latest backup folder for that volume, and I'm copying the files over via Finder. Of course, I expect this to work just fine, but I feel like I'm missing something if that's the "proper" way to do this.

    Read the article

  • How to back up/restore excluding FILESTREAM varbinary in SQL Server 2008?

    - by fdierre
    There is an application used at a production site that uses SQL Server 2008 as its DBMS. The database schema uses FILESTREAM varbinary columns to save binary data on the filesystem instead of directly in the DB tables. The point is that, now and then, it would be useful to copy the production database onto development machines, mostly for troubleshooting. The database is too big to comfortably move around, but it would be fine if it could be moved leaving out the FILESTREAM varbinary fields. In other words, I am trying to make an "imperfect" copy of a database: on the destination database, it is OK to have NULL values instead of the varbinary data.

    Is this possible? I looked for the feature in SQL Server Management Studio and did a backup that excludes the filegroup containing the FILESTREAM varbinary data, but I cannot restore: Management Studio complains that the restore cannot be done because the backup is incomplete (of course). Is it possible to achieve what I am trying to do in some way?
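
    One approach worth testing (a sketch, not a verified recipe): back up only the primary filegroup and bring it online on the development side with a partial restore. A caveat: the destination won't actually have NULLs - the FILESTREAM filegroup is simply offline, so queries that touch the varbinary columns will raise errors until those columns are nulled out or avoided. Database, filegroup, server, and path names below are placeholders:

        # Hypothetical sketch, driven from PowerShell via sqlcmd.
        # Back up only the PRIMARY filegroup (skipping the FILESTREAM filegroup)...
        $backupSql  = "BACKUP DATABASE ProdDb FILEGROUP = 'PRIMARY' " +
                      "TO DISK = 'D:\Dumps\ProdDb_primary.bak' WITH INIT;"
        # ...then restore it as a partial database on the dev box.
        $restoreSql = "RESTORE DATABASE DevDb FILEGROUP = 'PRIMARY' " +
                      "FROM DISK = 'D:\Dumps\ProdDb_primary.bak' " +
                      "WITH PARTIAL, RECOVERY, " +
                      "MOVE 'ProdDb' TO 'D:\Data\DevDb.mdf', " +
                      "MOVE 'ProdDb_log' TO 'D:\Data\DevDb.ldf';"
        sqlcmd -S PRODSERVER -E -Q $backupSql
        sqlcmd -S DEVSERVER  -E -Q $restoreSql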

    Read the article

  • SQL 2008 Backups to UNC Share Failing 0xC002F210

    - by Matty Brown
    This problem is driving me NUTS!! We take backups of all of our production databases to a network share, which is then backed up to tape nightly:

        8pm Mon-Fri - full backup, followed by log backup
        7am-7pm Mon-Fri, at half-hour intervals - log backup

    Our backups have been working in this manner since we migrated from SQL Server 2000 Standard to 2008, three years ago. Recently, the first log backup on Mondays has been failing. Not every time, but almost every time! The rest of the week we have no problems. I guess the issue may have something to do with the size of the log backup that's attempted after a weekend of no backups.

    Now onto the issue I need a fix for... All this week, every full backup of our two biggest databases has failed (both backups < 1GB compressed). There's plenty of disk space on the source and destination servers. I'm guessing the issue has to do with the amount of time it takes to complete the backups of these databases, and/or the size of the backup files involved. Changing the backup destination to local storage works fine (and very, very fast in comparison). From the job history, I can find a few hints as to what the problem could be:

        Code: 0xC002F210 (always this code, but a mix of the following descriptions...)

        The operating system returned the error '64(failed to retrieve text for this error. Reason: 1815)'
        while attempting 'SetEndOfFile' on '\\drserver\SQLBackups\Database.bak'.
        BACKUP DATABASE is terminating abnormally.

        The operating system returned the error '64(failed to retrieve text for this error. Reason: 1815)'
        while attempting 'FlushFileBuffers' on '\\drserver\SQLBackups\Database.bak'.
        BACKUP DATABASE is terminating abnormally.

    Please help save my hair and sanity!!
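
    Error 64 is ERROR_NETNAME_DELETED ("the specified network name is no longer available"), i.e. the SMB session to \\drserver is dropping mid-write. One pragmatic workaround (a sketch under that assumption, not a root-cause fix) is to back up to local disk first and then move the file to the share with a restartable copy, which tolerates the network hiccups that kill BACKUP DATABASE outright:

        # Hypothetical sketch: back up locally, then push to the share.
        # Database name, paths, and server are placeholders.
        sqlcmd -S . -E -Q "BACKUP DATABASE [Database] TO DISK = 'D:\LocalBackups\Database.bak' WITH INIT"
        # /Z = restartable mode; retry 5 times, waiting 30s between attempts
        robocopy D:\LocalBackups \\drserver\SQLBackups Database.bak /Z /R:5 /W:30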

    Read the article

  • How do I image my hard drive for backup? Or do I just need to back up the files?

    - by NoCatharsis
    I have never imaged a hard drive, so I don't know what software to use or how to prepare my system for imaging. Is this the best way to back up? In the past, I've always just kept a copy of my important files on an external drive, plus Gmail or Dropbox for smaller stuff, but it would be nice to just take one image and restore from that if something ever goes wrong. Thanks for the help.

    EDIT: I'm sorry, I forgot to mention the OS. I would like to do this for my home and work computers, which run Vista and XP respectively. And actually I'm about to upgrade to Windows 7 at home, so details on that would be appreciated too.
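
    For the Windows 7 machine specifically, imaging is built in (Backup and Restore -> "Create a system image") and can also be driven from an elevated prompt; Vista Home Premium and XP lack this tool, so they would need third-party imaging software. A minimal sketch, assuming E: is the external drive:

        # Hypothetical sketch (elevated prompt): image the system drive plus
        # everything needed to boot ("critical" volumes) onto external drive E:.
        wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet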

    Read the article

  • Problem restoring from tar backup: why are there /dev/disk/by-id/ symlinks and how can I avoid them?

    - by SK.
    Hello, I'm trying to make a bare-bones backup system with the most basic tools available on openSUSE 11.3 (in this case: bash, fdisk, tar & GRUB legacy). Here's the workflow for my scripts:

    backup.sh (run from an external system, e.g. a LiveCD):

        1. make an fdisk script ($fscript) from fdisk -l's output [works]
        2. mount the partitions from the system's fstab [works]
        3. tar the crucial stuff into file.tgz [works]

    restore.sh (run from an external system, e.g. a LiveCD):

        1. run fdisk $dest < $fscript to restore the partitioning [works]
        2. format and mount the partitions from the system's fstab [fails]
        3. extract from file.tgz [works when mounting manually]
        4. restore GRUB [fails]

    I have recently noticed that openSUSE (though I'm sure it has nothing to do with the distro) uses different device names in /etc/fstab and /boot/grub/menu.lst: the partition name is, for example, "/dev/disk/by-id/numbers-brandname-morenumbers-part2" instead of "/dev/sda2" - but it is basically a simple symlink. My questions about this:

    - What is the point of such symlinks, especially if we're restoring onto a different disk?
    - Is there a way to cleanly prevent the creation of those symlinks and use the "true" /dev/sdx everywhere instead?
    - If not, do you know a way to replace those symlinks on the fly in a text file? I tried the script below, but it only works if the line starts with the symlink (the case for fstab, not menu.lst):

        ### search and replace /dev/disk/by-id/... with /dev/sdx
        while read oldVolume rest; do                 # take the first field, keep the rest of the line
            if [[ "$oldVolume" =~ ^/dev/disk/by-id/ ]]; then
                newVolume=$(readlink "$oldVolume")    # resolve the symlink; returns "../../sdx"
                echo /dev/${newVolume##*/} $rest >> TMP   # rewrite as "/dev/sdx", keep the rest
            else
                echo $oldVolume $rest >> TMP          # nothing to do
            fi
        done < $file
        mv -f TMP $file                               # save changes

    I've had trouble finding a solution to this on Google, so I was hoping some of the members here could help me. Thank you.

    Read the article

  • PowerShell scripts to back up SQL, SVN

    - by bszom
    I'm trying to use PowerShell to create some backups and then copy them to a web folder (or, in other words, upload them to a WebDAV share). At first I thought I'd do the WebDAV work from within PowerShell, but it seems this still requires a fair amount of manual labour, i.e. constructing HTTP requests. I then settled for creating a web folder from the script and letting Windows handle the WebDAV part. It seems that all it takes to create a web folder is to create a standard shortcut, as described here. What I can't figure out is how to actually copy files to the shortcut's target.

    Maybe I'm going about this the wrong way. It would be ideal if I could somehow encrypt the credentials for the WebDAV share in the script, then have it create the web folder, shunt over the files, and delete the web folder again. Or, even better, not use a web folder at all. A third option would be to just create the web folder manually and leave it there, though I'd rather not. Any ideas/pointers/tips? :)
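
    A sketch of the "no web folder" route, under the assumption that the WebClient service is running so UNC paths of the form \\host@SSL\path reach the WebDAV server. Credentials are stored once with DPAPI (Export-Clixml encrypts the SecureString so only the same user on the same machine can read it back); host, share, and paths are placeholders:

        # One-time, interactive: cache the WebDAV credentials encrypted on disk.
        # Get-Credential | Export-Clixml C:\scripts\webdav.cred

        # In the backup script: read them back and map the share.
        $cred  = Import-Clixml C:\scripts\webdav.cred
        $plain = $cred.GetNetworkCredential().Password
        net use Z: "\\backup.example.com@SSL\DavWWWRoot\backups" $plain /user:$($cred.UserName)

        Copy-Item D:\Backups\*.bak Z:\ -Force   # shunt the files over
        net use Z: /delete                      # drop the mapping again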

    Read the article

  • Eclipse workspace backup

    - by MarcoS
    What do I lose if I skip the .metadata/ directory when backing up my Eclipse workspace? (Is there documentation describing what Eclipse stores in this directory?) I've noticed that it changes very often - essentially every time I use Eclipse (Galileo). I've seen this question, but I'm not interested in backing up plug-ins and settings (also because I'm not sure they would work properly when restored after a reinstallation of my PC, or on a new PC). I'm just interested in backing up my projects (source code, libraries, possible data, .svn and .git directories). So, can I safely ignore the .metadata/ directory?

    Read the article

  • Making a backup of a large site with 100,000+ files/images

    - by niggles
    I tried backing up our site today using the Unix 'cp' command and ended up getting our office blocked by Plesk - it added my IP to /etc/hosts.deny because it thought I was flooding the server. After tech support fixed the issue, they suggested I back it up folder by folder, but there are about 10,000 folders on the site totalling half a terabyte, each with multiple sub-folders, so this isn't viable.

    Basically, I want to be able to mirror the domain onto another domain we've got set up on the same dedicated server, so I can test with live images (the bulk of our content). Any suggestions? E.g. adding some rules to open_basedir and getting PHP to recursively copy the folders to the other domain (remember, it's on the same dedicated box, so it just needs to traverse the directory, not FTP things).

    Read the article

  • Opening Time Machine OS X backup files on Windows 7?

    - by user39279
    Hi, I have Time Machine backups on a Western Digital external HD. The backups were made on my now-dead Mac G4 running OS X Leopard. I am waiting on a new iMac, but in the meantime I need to access some of my backup files urgently. I have a laptop running Windows 7, so is there any safe way of accessing some of the files from the Time Machine backup on my laptop while still being able to do a full restore when the iMac arrives? Thanks.

    Read the article

  • How to do an automatic backup of a running VMware virtual machine?

    - by Radek
    I want to make regular automatic backups of my VMware virtual machine (16 GB, Windows XP) while it is running. I do not have access to the ESX admin area; I can ask our admin to set something up there, but I have no access myself. I have installed a few programs that are important to me, so I want to have a working backup at any point in time. Note: I know I can copy all the files when the virtual machine is not up and running.
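
    For what the admin-side job could look like, here is a hedged PowerCLI sketch (it needs ESX/vCenter credentials, so it is something to hand to your admin rather than run yourself). The idea: a quiesced snapshot freezes the base VMDKs, which can then be copied while the VM keeps running. All names are placeholders, and the cmdlets should be verified against the installed PowerCLI version:

        # Hypothetical sketch: snapshot, copy the now read-only base disks, clean up.
        Connect-VIServer -Server esx.example.com -Credential (Get-Credential)
        $vm   = Get-VM -Name "WinXP-Radek"
        $snap = New-Snapshot -VM $vm -Name ("backup-" + (Get-Date -Format yyyyMMdd)) -Quiesce -Memory:$false
        Get-HardDisk -VM $vm | ForEach-Object {
            # $_.Filename is a datastore path like "[datastore1] vm/vm.vmdk"
            Copy-DatastoreItem -Item $_.Filename -Destination "E:\vm-backups\"
        }
        Remove-Snapshot -Snapshot $snap -Confirm:$false
        Disconnect-VIServer -Confirm:$false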

    Read the article

  • Are these the correct instructions to back up TFS 2010?

    - by Keith Sirmons
    Howdy, I am working on a backup plan for TFS 2010. I found this site http://msdn.microsoft.com/en-us/library/ms253070(VS.100).aspx that details a complex backup solution. Has anyone tested these procedures and can confirm they are accurate? There are a couple of steps that violate the SharePoint rule "Do Not Modify the Database!" Thank you, Keith

    Read the article

  • Best way to back up a SQL Server database nightly?

    - by Urda
    What is the best way to back up a SQL Server 2005 database nightly? I'm looking for a solution/strategy that would create the database backup file and place it in an "outgoing" directory. We want our FTP software to move it out to an offsite server. Any suggestions on how to make this work as desired?
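
    A minimal sketch of the first half (producing a dated .bak in the outgoing folder), assuming sqlcmd from the SQL Server client tools is on the PATH and Windows authentication; server, database, and path names are placeholders. Scheduled nightly via Task Scheduler or a SQL Agent job, the FTP software then just watches the folder:

        # Hypothetical sketch: nightly full backup into the outgoing directory.
        $stamp = Get-Date -Format yyyyMMdd
        $file  = "D:\Outgoing\MyDb_$stamp.bak"
        sqlcmd -S MYSERVER -E -Q "BACKUP DATABASE [MyDb] TO DISK = '$file' WITH INIT"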

    Read the article
