Search Results

Search found 6101 results on 245 pages for 'incremental backup'.

Page 42 of 245

  • What strategies are available to minimise bad blocks on an encrypted partition?

    - by David Andreoletti
    Let me explain my backup strategy and the problem I am facing. My current backup strategy: open the encrypted container and run Carbon Copy Cloner on it at least once a week, and rotate backup disks. Problem: I have a TrueCrypt partition on my first external hard disk. I recently found out that some files on this encrypted partition cannot be read due to bad blocks (reported by Antonio Diaz's GNU ddrescue). My backup strategy is ineffective in this scenario because the bad blocks are only discovered during backup. Possible strategies: Strategy #0: keep the encrypted partition on a RAID 1 of two disks. Is this a suitable strategy? Strategy #1: can you think of any other? Environment: Mac OS X 10.8, external 2.5" SATA hard disk, no RAID.
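
    One way to catch bad blocks before backup day is a periodic read-only surface scan of the raw device. A minimal sketch using the GNU ddrescue already mentioned in the question, where the device node and mapfile path are assumptions:

        # read every sector, discard the data, record unreadable areas in a mapfile
        ddrescue --force /dev/disk2 /dev/null scan.mapfile
        # a later re-run with the same mapfile retries only the areas marked bad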

    Read the article

  • Renting a Linux server just to make backups of my personal data?

    - by Matthieu
    Hi all, I would like to be able to back up ALL my computers' data on a Linux server. For now I have a home server, but soon I will be travelling, without a home (so no home server). I was thinking of renting a dedicated Linux web server, but this is expensive, and I don't need a fast "web-oriented" machine with a MySQL server and all; I just need full SSH access (full control, so I can install my own programs). Do "backup servers" exist? Am I doing it wrong (maybe this is not a good solution)? Note: I run Mac OS, Windows and Linux, I back up through rsync, and I want full control over my backup, not an automated "magic" backup like MobileMe or anything like that. Edit: I need around 500 GB of storage.
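
    For reference, the rsync-over-SSH push described here needs nothing more than a shell account on the remote side. A minimal sketch in which the user, host and paths are placeholders:

        # mirror a local directory to the rented box over SSH,
        # deleting on the remote side whatever was deleted locally
        rsync -az --delete -e ssh /Users/me/Documents/ backup@storage.example.com:backups/documents/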

    Read the article

  • Fault tolerance with a pair of tightly coupled services

    - by cogitor
    I have two tightly coupled services that can run on completely different nodes (e.g. ServiceA and ServiceB). If I start up a replicated copy of both services for backup purposes (ServiceA-2 and ServiceB-2), what would be the best way of setting up a fault-tolerant distributed system so that on a fault in either of the tightly coupled services, ServiceA or ServiceB, the whole communication goes through the backups ServiceA-2 and ServiceB-2? Overall, all communication should go either through both primary services or through both of their backup replicas:

        |---- Service A ---- Service B
        |
        |---- Service A-2 ---- Service B-2   (backup branch - used only on fault in Service A or B)

    Note that if Service A goes down, data from Service B would be incorrect (and vice versa). Load balancing between the primary and backup branches is also not feasible.

    Read the article

  • Hot swapping for Linux web/database servers

    - by Art
    Is there a way to achieve the following under Linux? There are two web servers, main and backup, and two database servers (Postgres), main and backup. The web servers are in sync with each other, i.e. configuration, content and applications are the same. The backup database is continuously synced with the main database. If either main server goes down, it is replaced by its backup on the fly, and when the main database server comes back up, all the data from the backup server is uploaded to it. Essentially, I need the hot swapping to happen automatically with no or minimal user intervention, if possible. The recovery procedure is preferably automatic but can include some manual steps.
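
    One common building block for this, not named in the question, is a floating IP managed by keepalived, so that clients always reach whichever box currently holds the address. A minimal, hedged vrrp_instance sketch with a placeholder interface and address:

        vrrp_instance VI_1 {
            state MASTER          # the standby box uses state BACKUP and a lower priority
            interface eth0
            virtual_router_id 51
            priority 100
            virtual_ipaddress {
                192.0.2.10        # the address web or database clients connect to
            }
        }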

    Read the article

  • (Preferably) Encrypted Server Backups

    - by Shoaibi
    I have somehow managed to purchase a VPS after collecting money for some time; now the problem is that I can't find a way to back up the server. My previous approach: I got a WebDAV account from mydisk.se, mounted it on the VPS, and used duplicity to create encrypted backups. The problem is that it was only 2 GB, and it is running out of space. At my own place I don't have a stable internet connection, otherwise I have a 500 GB drive that I could surely use for backups. The VPS has a 12 GB HD, and I would like to back up /home, /root, /etc and /var (especially log and www). Any ideas are welcome. [EDIT] I am mostly looking for guidance on setting up a backup destination or similar (I know how to set up a backup server, but I can't, as I don't have a stable connection or the money to buy another VPS/disk for backup); I already have the tools needed.
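
    For reference, the duplicity workflow already in use here works against any SSH-reachable target, so a larger WebDAV account is not the only option. A hedged sketch in which the GPG key ID, host and paths are placeholders:

        # encrypted, incremental backups of /etc and /var/www pushed over SCP
        duplicity --encrypt-key 1A2B3C4D /etc scp://user@backuphost//backups/etc
        duplicity --encrypt-key 1A2B3C4D /var/www scp://user@backuphost//backups/www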

    Read the article

  • Make backups of Dropbox folder every week

    - by ilansch
    I have a Dropbox folder which is shared by a couple of users. I would like to make a backup of this folder every week and store it on another hard drive. I could simply copy the entire folder each time, but I would like to copy only the files that have been changed or created during that week. I thought of writing a batch script that recursively checks each file in the Dropbox folder and looks at its modified date; if that date is later than a given one (the last backup date), it copies the file to a folder named BackUP[Date]. Do you think this solution is OK?
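
    The modified-date check described above is built into robocopy's age filters, so the batch script can be a one-liner. A hedged sketch (the paths and the literal destination folder name are assumptions; deriving the name from %DATE% also works, but its format depends on locale):

        rem copy only files modified in the last 7 days, recursing into subfolders
        robocopy "%USERPROFILE%\Dropbox" "E:\BackUp-2012-06-01" /E /MAXAGE:7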

    Read the article

  • Do I need to run a verification on LTO tape backups even though the drives themselves perform verification as they write?

    - by ObligatoryMoniker
    We have an LTO-3 tape drive in a Dell media library that we use for our tape backups. The article about LTO on Wikipedia states that: "LTO uses an automatic verify-after-write technology to immediately check the data as it is being written, but some backup systems explicitly perform a completely separate tape reading operation to verify the tape was written correctly. This separate verify operation doubles the number of end-to-end passes for each scheduled backup, and reduces the tape life by half." What I would like to know is: do I need my backup software (Backup Exec in this case) to perform a verify on these tapes, or is the verify-after-write technology inherent in LTO drives sufficient? I would also be curious whether Backup Exec understands the verify-after-write technology well enough to alert me if that technology could not verify the data, or whether it just ignores it, making it useless anyway, since even if the drive detects a problem I would never know about it.

    Read the article

  • Backing up a Windows 2003 Server that has a Certificate Authority

    - by Dina
    I want to export and migrate a Certificate Authority (CA) role from a Windows 2003 machine to a new Windows 2008 R2 virtual machine. I was told that I cannot have two CA roles on the same network at the same time. Therefore, I must first export the certificates on the older machine, delete the CA role, then add the CA role on the new machine and import the certificates into it. As a safety precaution, I am tasked with finding a backup solution in case this does not work and I need to revert to the old Windows 2003 CA. My question is: what is the best software for this type of backup? I am currently trying out Symantec Backup Exec 2012, which I hope will let me create a backup prior to removing the CA role on Windows 2003. If the CA migration fails, the backup will allow me to revert the old machine to a point before I removed its CA role.
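
    Independent of the imaging software chosen, Windows has a native way to capture the CA's key material and database before the role is touched. A hedged sketch, with the target folder as a placeholder (the command prompts for a password to protect the exported key):

        certutil -backup C:\CABackup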

    Read the article

  • iMac with Mountain Lion and Mavericks

    - by bob
    I have been starting up my iMac (Mavericks) with a flash drive that boots Mountain Lion, so that I can use an app that is incompatible with Mavericks. Question: I have a single external backup drive always connected to my Mac. I am thinking of partitioning that drive and giving each partition a unique name, e.g. Backup 1 and Backup 2. I then intend to boot into Mavericks and tell Time Machine to back up to Backup 1, and then boot into Mountain Lion and tell Time Machine to back up to Backup 2. Will this work, i.e. will the appropriate partition automatically back up the system in which it is booted?
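
    If it helps, each installation's Time Machine destination can also be pinned explicitly from the command line, which makes the per-OS pairing unambiguous. A hedged sketch using the volume names proposed in the question:

        sudo tmutil setdestination /Volumes/Backup1   # run once while booted into Mavericks
        sudo tmutil setdestination /Volumes/Backup2   # run once while booted into Mountain Lion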

    Read the article

  • A few questions about backing up the OS

    - by user23950
    I'm running Windows 7 with 2 GB of memory and a 2.50 GHz dual-core CPU. Here are my questions regarding backing up an entire drive. I plan to use Macrium Reflect because it's free and I can't afford to buy anything. If I back up a hard drive, can I back up only the partition where the operating system is? If I have installed applications which require activation keys, and I have already entered an activation key, does backing up the hard drive also back up those applications, so that I won't have to re-enter the keys later? If I have a multi-boot system, would the backup also include the other OSes that are installed on my hard drive, and can I still boot into them after restoring the image? Do you have any links that could enlighten me on what drive backup really is? Thanks!

    Read the article

  • Compress snapshot backups with duplicates

    - by Usavich
    I have a set of backups, mostly of photos. The directory looks roughly like this:

        backup/Day1/photos/1.jpg
                        .../2.jpg
        backup/Day2/photos/2.jpg
                        .../3.jpg
                        .../4.jpg
        backup/DayN/photos/2.jpg
                        .../3.jpg
                        .../9.jpg

    Files with the same name are identical, so there are many duplicates. Due to the way the backup system works, it is not possible to create incremental backups directly; I always get an entire dump each day. If I want to create a compressed archive for a date range, say Day 5-9, what is the best tool/compression algorithm to do that?
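
    One approach, not named in the question, is to collapse the duplicates into hard links first and then archive, since tar stores the data of a set of linked files only once. A hedged sketch using rdfind, with the day range written out as an assumption:

        # replace identical files across the snapshots with hard links to a single copy
        # (note: this modifies the snapshot directories in place)
        rdfind -makehardlinks true backup/Day5 backup/Day6 backup/Day7 backup/Day8 backup/Day9
        # tar records the extra names as hard-link entries, so the data is stored once
        tar -czf days5-9.tar.gz backup/Day5 backup/Day6 backup/Day7 backup/Day8 backup/Day9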

    Read the article

  • Add a form to a backup routine

    - by Gerard Flynn
    Hi, how do you put a progress bar and a button onto this code? I have the class below and want to add a GUI on top of it.

    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Data;
    using System.Drawing;
    using System.Text;
    using System.Windows.Forms;
    using System.Data.SqlClient;
    using System.IO;
    using System.Threading;
    using Tamir.SharpSsh;
    using System.Security.Cryptography;
    using ICSharpCode.SharpZipLib.Checksums;
    using ICSharpCode.SharpZipLib.Zip;
    using ICSharpCode.SharpZipLib.GZip;

    namespace backup
    {
        public partial class Form1 : Form
        {
            public Form1()
            {
                InitializeComponent();
            }

            /// <summary>
            /// Summary description for Class1.
            /// </summary>
            public class Backup
            {
                private string dbName;
                private string dbUsername;
                private string dbPassword;
                private static string baseDir;
                private string backupName;
                private static bool isBackup;
                private string keyString;
                private string ivString;
                private string[] backupDirs = new string[0];
                private string[] excludeDirs = new string[0];
                private ZipOutputStream zipOutputStream;
                private string backupFile;
                private string zipFile;
                private string encryptedFile;

                static void Main()
                {
                    Backup.Log("BackupUtility loaded");
                    try
                    {
                        new Backup();
                        if (!isBackup) MessageBox.Show("Restore complete");
                    }
                    catch (Exception e)
                    {
                        Backup.Log(e.ToString());
                        if (!isBackup) MessageBox.Show("Error restoring!\r\n" + e.Message);
                    }
                }

                private void LoadAppSettings()
                {
                    this.backupName = System.Configuration.ConfigurationSettings.AppSettings["BackupName"].ToString();
                    this.dbName = System.Configuration.ConfigurationSettings.AppSettings["DBName"].ToString();
                    this.dbUsername = System.Configuration.ConfigurationSettings.AppSettings["DBUsername"].ToString();
                    this.dbPassword = System.Configuration.ConfigurationSettings.AppSettings["DBPassword"].ToString();
                    //default to using where we are executing this assembly from
                    Backup.baseDir = System.Reflection.Assembly.GetExecutingAssembly().Location.Substring(0, System.Reflection.Assembly.GetExecutingAssembly().Location.LastIndexOf("\\")) + "\\";
                    Backup.isBackup = bool.Parse(System.Configuration.ConfigurationSettings.AppSettings["IsBackup"].ToString());
                    this.keyString = System.Configuration.ConfigurationSettings.AppSettings["KeyString"].ToString();
                    this.ivString = System.Configuration.ConfigurationSettings.AppSettings["IVString"].ToString();
                    this.backupDirs = GetSetting("BackupDirs", ',');
                    this.excludeDirs = GetSetting("ExcludeDirs", ',');
                }

                private string[] GetSetting(string settingName, char delimiter)
                {
                    if (System.Configuration.ConfigurationSettings.AppSettings[settingName] != null)
                    {
                        string settingVal = System.Configuration.ConfigurationSettings.AppSettings[settingName].ToString();
                        if (settingVal.Length > 0) return settingVal.Split(delimiter);
                    }
                    return new string[0];
                }

                public Backup()
                {
                    this.LoadAppSettings();
                    if (isBackup) this.DoBackup();
                    else this.DoRestore();
                    Log("Finished");
                }

                private void DoRestore()
                {
                    System.Windows.Forms.OpenFileDialog fileDialog = new System.Windows.Forms.OpenFileDialog();
                    fileDialog.Title = "Choose .encrypted file";
                    fileDialog.Filter = "Encrypted files (*.encrypted)|*.encrypted|All files (*.*)|*.*";
                    fileDialog.InitialDirectory = Backup.baseDir;
                    if (fileDialog.ShowDialog() == System.Windows.Forms.DialogResult.OK)
                    {
                        //string encryptedFile = GetFileName("encrypted");
                        string encryptedFile = fileDialog.FileName;
                        string decryptedFile = this.GetDecryptedFilename(encryptedFile);
                        //string originalFile = GetFileName("original");
                        this.Decrypt(encryptedFile, decryptedFile);
                        //this.UnzipFile(decryptedFile, originalFile);
                    }
                }

                //use the same filename as the backup except replace ".encrypted" with ".decrypted.zip"
                private string GetDecryptedFilename(string encryptedFile)
                {
                    string name = encryptedFile.Substring(0, encryptedFile.LastIndexOf("."));
                    name += ".decrypted.zip";
                    return name;
                }

                private void DoBackup()
                {
                    this.backupFile = GetFileName("bak");
                    this.zipFile = GetFileName("zip");
                    this.encryptedFile = GetFileName("encrypted");
                    this.DeleteFiles();
                    this.zipOutputStream = new ZipOutputStream(File.Create(zipFile));
                    try
                    {
                        //backup database first
                        if (this.dbName.Length > 0)
                        {
                            this.BackupDB(backupFile);
                            this.ZipFile(backupFile, this.GetName(backupFile));
                        }
                        //zip any directories specified in config file
                        this.ZipUserSpecifiedFilesAndDirectories(this.backupDirs);
                    }
                    finally
                    {
                        this.zipOutputStream.Finish();
                        this.zipOutputStream.Close();
                    }
                    this.Encrypt(zipFile, encryptedFile);
                    this.SCPFile(encryptedFile);
                    this.DeleteFiles();
                }

                /// <summary>
                /// Deletes any files created by the backup process, namely the DB backup file,
                /// the zip of all files backed up, and the encrypted zip file
                /// </summary>
                private void DeleteFiles()
                {
                    File.Delete(this.backupFile);
                    File.Delete(this.zipFile);
                    ///File.Delete(this.encryptedFile);
                }

                private void ZipUserSpecifiedFilesAndDirectories(string[] fileNames)
                {
                    foreach (string fileName in fileNames)
                    {
                        string name = fileName.Trim();
                        if (name.Length > 0)
                        {
                            Log("Zipping " + name);
                            this.ZipFile(name, this.GetNameFromDir(name));
                        }
                    }
                }

                private void SCPFile(string inputPath)
                {
                    string sshServer = System.Configuration.ConfigurationSettings.AppSettings["SSHServer"].ToString();
                    string sshUsername = System.Configuration.ConfigurationSettings.AppSettings["SSHUsername"].ToString();
                    string sshPassword = System.Configuration.ConfigurationSettings.AppSettings["SSHPassword"].ToString();
                    if (sshServer.Length > 0 && sshUsername.Length > 0 && sshPassword.Length > 0)
                    {
                        Scp scp = new Scp(sshServer, sshUsername, sshPassword);
                        //Copy a file from local machine to remote SSH server
                        scp.Connect();
                        Log("Connected to " + sshServer);
                        //scp.Put(inputPath, "/home/wal/temp.txt");
                        scp.Put(inputPath, GetName(inputPath));
                        scp.Close();
                    }
                    else
                    {
                        Log("Not SCP as missing login details");
                    }
                }

                private string GetName(string inputPath)
                {
                    FileInfo info = new FileInfo(inputPath);
                    return info.Name;
                }

                private string GetNameFromDir(string inputPath)
                {
                    DirectoryInfo info = new DirectoryInfo(inputPath);
                    return info.Name;
                }

                private static void Log(string msg)
                {
                    try
                    {
                        string toLog = DateTime.Now.ToString() + ": " + msg;
                        System.Diagnostics.Debug.WriteLine(toLog);
                        System.IO.FileStream fs = new System.IO.FileStream(baseDir + "app.log", System.IO.FileMode.OpenOrCreate, System.IO.FileAccess.ReadWrite);
                        System.IO.StreamWriter m_streamWriter = new System.IO.StreamWriter(fs);
                        m_streamWriter.BaseStream.Seek(0, System.IO.SeekOrigin.End);
                        m_streamWriter.WriteLine(toLog);
                        m_streamWriter.Flush();
                        m_streamWriter.Close();
                        fs.Close();
                    }
                    catch (Exception e)
                    {
                        Console.WriteLine(e.ToString());
                    }
                }

                private byte[] GetFileBytes(string path)
                {
                    FileStream stream = new FileStream(path, FileMode.Open);
                    byte[] bytes = new byte[stream.Length];
                    stream.Read(bytes, 0, bytes.Length);
                    stream.Close();
                    return bytes;
                }

                private void WriteFileBytes(byte[] bytes, string path)
                {
                    FileStream stream = new FileStream(path, FileMode.Create);
                    stream.Write(bytes, 0, bytes.Length);
                    stream.Close();
                }

                private void UnzipFile(string inputPath, string outputPath)
                {
                    ZipInputStream zis = new ZipInputStream(File.OpenRead(inputPath));
                    ZipEntry theEntry = zis.GetNextEntry();
                    FileStream streamWriter = File.Create(outputPath);
                    int size = 2048;
                    byte[] data = new byte[2048];
                    while (true)
                    {
                        size = zis.Read(data, 0, data.Length);
                        if (size > 0)
                        {
                            streamWriter.Write(data, 0, size);
                        }
                        else
                        {
                            break;
                        }
                    }
                    streamWriter.Close();
                    zis.Close();
                }

                private bool ExcludeDir(string dirName)
                {
                    foreach (string excludeDir in this.excludeDirs)
                    {
                        if (dirName == excludeDir) return true;
                    }
                    return false;
                }

                private void ZipFile(string inputPath, string zipName)
                {
                    FileAttributes fa = File.GetAttributes(inputPath);
                    if ((fa & FileAttributes.Directory) != 0)
                    {
                        string dirName = zipName + "/";
                        ZipEntry entry1 = new ZipEntry(dirName);
                        this.zipOutputStream.PutNextEntry(entry1);
                        string[] subDirs = Directory.GetDirectories(inputPath);
                        //create directories first
                        foreach (string subDir in subDirs)
                        {
                            DirectoryInfo info = new DirectoryInfo(subDir);
                            string name = info.Name;
                            if (this.ExcludeDir(name))
                                Log("Excluding " + dirName + name);
                            else
                                this.ZipFile(subDir, dirName + name);
                        }
                        //then store files
                        string[] fileNames = Directory.GetFiles(inputPath);
                        foreach (string fileName in fileNames)
                        {
                            FileInfo info = new FileInfo(fileName);
                            string name = info.Name;
                            this.ZipFile(fileName, dirName + name);
                        }
                    }
                    else
                    {
                        Crc32 crc = new Crc32();
                        this.zipOutputStream.SetLevel(6); // 0 - store only to 9 - means best compression
                        FileStream fs = null;
                        try
                        {
                            fs = File.OpenRead(inputPath);
                        }
                        catch (IOException ioEx)
                        {
                            Log("WARNING! " + ioEx.Message); //might be in use, skip file in this case
                        }
                        if (fs != null)
                        {
                            byte[] buffer = new byte[fs.Length];
                            fs.Read(buffer, 0, buffer.Length);
                            ZipEntry entry = new ZipEntry(zipName);
                            entry.DateTime = DateTime.Now;
                            // set Size and the crc, because the information
                            // about the size and crc should be stored in the header
                            // if it is not set it is automatically written in the footer.
                            // (in this case size == crc == -1 in the header)
                            // Some ZIP programs have problems with zip files that don't store
                            // the size and crc in the header.
                            entry.Size = fs.Length;
                            fs.Close();
                            crc.Reset();
                            crc.Update(buffer);
                            entry.Crc = crc.Value;
                            this.zipOutputStream.PutNextEntry(entry);
                            this.zipOutputStream.Write(buffer, 0, buffer.Length);
                        }
                    }
                }

                private void Encrypt(string inputPath, string outputPath)
                {
                    RijndaelManaged rijndaelManaged = new RijndaelManaged();
                    byte[] encrypted;
                    byte[] toEncrypt;
                    //Create a new key and initialization vector.
                    //myRijndael.GenerateKey();
                    //myRijndael.GenerateIV();
                    /*des.GenerateKey();
                    des.GenerateIV();
                    string temp1 = Convert.ToBase64String(des.Key);
                    string temp2 = Convert.ToBase64String(des.IV);*/
                    //Get the key and IV.
                    byte[] key = Convert.FromBase64String(keyString);
                    byte[] IV = Convert.FromBase64String(ivString);
                    //Get an encryptor.
                    ICryptoTransform encryptor = rijndaelManaged.CreateEncryptor(key, IV);
                    //Encrypt the data.
                    MemoryStream msEncrypt = new MemoryStream();
                    CryptoStream csEncrypt = new CryptoStream(msEncrypt, encryptor, CryptoStreamMode.Write);
                    //Convert the data to a byte array.
                    toEncrypt = this.GetFileBytes(inputPath);
                    //Write all data to the crypto stream and flush it.
                    csEncrypt.Write(toEncrypt, 0, toEncrypt.Length);
                    csEncrypt.FlushFinalBlock();
                    //Get encrypted array of bytes.
                    encrypted = msEncrypt.ToArray();
                    WriteFileBytes(encrypted, outputPath);
                }

                private void Decrypt(string inputPath, string outputPath)
                {
                    RijndaelManaged myRijndael = new RijndaelManaged();
                    //DES des = new DESCryptoServiceProvider();
                    byte[] key = Convert.FromBase64String(keyString);
                    byte[] IV = Convert.FromBase64String(ivString);
                    byte[] encrypted = this.GetFileBytes(inputPath);
                    byte[] fromEncrypt;
                    //Get a decryptor that uses the same key and IV as the encryptor.
                    ICryptoTransform decryptor = myRijndael.CreateDecryptor(key, IV);
                    //Now decrypt the previously encrypted message using the decryptor
                    //obtained in the above step.
                    MemoryStream msDecrypt = new MemoryStream(encrypted);
                    CryptoStream csDecrypt = new CryptoStream(msDecrypt, decryptor, CryptoStreamMode.Read);
                    fromEncrypt = new byte[encrypted.Length];
                    //Read the data out of the crypto stream.
                    int bytesRead = csDecrypt.Read(fromEncrypt, 0, fromEncrypt.Length);
                    byte[] readBytes = new byte[bytesRead];
                    Array.Copy(fromEncrypt, 0, readBytes, 0, bytesRead);
                    this.WriteFileBytes(readBytes, outputPath);
                }

                private string GetFileName(string extension)
                {
                    return baseDir + backupName + "_" + DateTime.Now.ToString("yyyyMMdd") + "." + extension;
                }

                private void BackupDB(string backupPath)
                {
                    string sql = @"DECLARE @Date VARCHAR(300), @Dir VARCHAR(4000)
                    --Get today date
                    SET @Date = CONVERT(VARCHAR, GETDATE(), 112)
                    --Set the directory where the back up file is stored
                    SET @Dir = '";
                    sql += backupPath;
                    sql += @"'
                    --create a 'device' to write to first
                    EXEC sp_addumpdevice 'disk', 'temp_device', @Dir
                    --now do the backup
                    BACKUP DATABASE " + this.dbName;
                    sql += @" TO temp_device WITH FORMAT
                    --Drop the device
                    EXEC sp_dropdevice 'temp_device' ";
                    //Console.WriteLine("sql="+sql);
                    Backup.Log("Starting backup of " + this.dbName);
                    ExecuteSQL(sql);
                }

                /// <summary>
                /// Executes the specified SQL
                /// Returns true if no errors were encountered during execution
                /// </summary>
                /// <param name="procedureName"></param>
                private void ExecuteSQL(string sql)
                {
                    SqlConnection conn = new SqlConnection(this.GetDBConnectString());
                    try
                    {
                        SqlCommand comm = new SqlCommand(sql, conn);
                        conn.Open();
                        comm.ExecuteNonQuery();
                    }
                    finally
                    {
                        conn.Close();
                    }
                }

                private string GetDBConnectString()
                {
                    StringBuilder builder = new StringBuilder();
                    builder.Append("Data Source=127.0.0.1; User ID=");
                    builder.Append(this.dbUsername);
                    builder.Append("; Password=");
                    builder.Append(this.dbPassword);
                    builder.Append("; Initial Catalog=");
                    builder.Append(this.dbName);
                    builder.Append(";Connect Timeout=30");
                    return builder.ToString();
                }
            }
        }
    }
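
    On the actual question, wiring a button and a progress bar to this class, a minimal hedged sketch using a BackgroundWorker so the form stays responsive; the control names btnBackup and progressBar1 are assumptions, and since the Backup class reports no progress of its own, the bar runs in indeterminate (marquee) style:

        // inside Form1: a Button named btnBackup and a ProgressBar named progressBar1
        private readonly BackgroundWorker worker = new BackgroundWorker();

        public Form1()
        {
            InitializeComponent();
            worker.DoWork += (s, e) => new Backup();              // runs off the UI thread
            worker.RunWorkerCompleted += (s, e) =>
            {
                progressBar1.Style = ProgressBarStyle.Blocks;     // stop the marquee
                btnBackup.Enabled = true;
                if (e.Error != null)
                    MessageBox.Show("Backup failed: " + e.Error.Message);
            };
        }

        private void btnBackup_Click(object sender, EventArgs e)
        {
            btnBackup.Enabled = false;
            progressBar1.Style = ProgressBarStyle.Marquee;        // indeterminate progress
            worker.RunWorkerAsync();
        }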

    Read the article

  • Can't get rsync over sftp to work

    - by Patrik
    I'm trying to set up a backup system from an Ubuntu server to a Synology NAS (DS413j) using rsync and sftp. I have created a user for this that we can call ubuntu-backup. I have a directory in ubuntu-backup's home directory called www where the backup will be saved. I have enabled Network Backup in DSM, and the user ubuntu-backup has full access to its home directory. Here is my rsync config file on the Synology NAS:

        #motd file = /etc/rsyncd.motd
        #log file = /var/log/rsyncd.log
        pid file = /var/run/rsyncd.pid
        lock file = /var/run/rsync.lock
        use chroot = no

        [NetBackup]
        path = /var/services/NetBackup
        comment = Network Backup Share
        uid = root
        gid = root
        read only = no
        list = yes
        charset = utf-8
        auth users = root
        secrets file = /etc/rsyncd.secrets

        [ubuntu-backup]
        path = /volume1/homes/ubuntu-backup/www
        comment = Ubuntu Backup
        uid = ubuntu-backup
        gid = users
        read only = false
        auth users = ubuntu-backup
        secrets file = /etc/rsyncd.secrets

    The permissions on /volume1/homes/ubuntu-backup/www are ubuntu-backup:users 777. Here is the command I'm running:

        rsync -aHvhiPb /var/www/ [email protected]:./

    The result:

        sending incremental file list
        ERROR: module is read only
        rsync error: syntax or usage error (code 1) at main.c(1034) [Receiver=3.0.9]
        rsync: connection unexpectedly closed (9 bytes received so far) [sender]
        rsync error: error in rsync protocol data stream (code 12) at io.c(605) [sender=3.0.9]

    If I run this instead:

        rsync -aHvhiPb /var/www/ [email protected]

    it looks like it is sending files, with no errors, but I can't find anything on the NAS.
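
    A hedged observation: the single-colon form user@host:path runs rsync over SSH and never consults rsyncd.conf, while the double-colon form addresses a module from that file, so the [ubuntu-backup] module settings only apply to the latter. A sketch with a placeholder hostname:

        # talk to the rsync daemon and its [ubuntu-backup] module definition
        rsync -aHvhiPb /var/www/ ubuntu-backup@synology-nas::ubuntu-backup/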

    Read the article

  • Incremental Backups / Versioned backups using PHP

    - by Adam
    I'm trying to back up a folder containing several folders and files to a remote location (I will be uploading zipped files). Are there any existing scripts that may help me, which check whether files have been modified after the date of the last backup, and only back up files created or modified after that? The current size of the data is around 1 GB, and I expect to add 50-200 MB each month. Also, what would be the best way to extract the state of the files as of a specific date?
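
    Before writing any PHP, the selection rule is easy to prototype from the shell (a swapped-in technique rather than what the question asks for, but it shows the modification-time comparison; the paths are placeholders):

        # list files changed since the last backup marker, then move the marker forward
        find /data -type f -newer /backups/last-run.stamp
        touch /backups/last-run.stamp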

    Read the article

  • lftp make-backup not functioning as expected

    - by Felipe Alvarez
    With default settings, when putting a file it is clobbered without complaint or warning. When getting, lftp complains: "get: super.sh: file already exists and xfer:clobber is unset". I changed my /etc/lftp.conf and appended:

        set xfer:make-backup yes
        set xfer:clobber yes

    When putting and getting, the files get clobbered; however, no backup is made. I've checked the settings with "set -a | grep clob" and "set -a | grep backup", and the values are correct.

    Read the article

  • CentOS Backup BASH Script

    - by user1062058
    I just wrote this script for backing up everything into a tar.gz file. Does it look okay? How can I get the tar file to transfer itself over to another server after executing, FTP from the server itself? I'm going to put this script into a weekly cron.

        #!/bin/bash
        rm -f ~/backup.tar.gz       # remove the old backup first
        BACKUP_DIRS=$HOME           # $HOME expands to this user's home directory and all child dirs
        tar -cvzf ~/backup.tar.gz $BACKUP_DIRS
        # run tar -zxvf backup.tar.gz to extract

    Thanks.
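
    On the transfer question, one hedged option is to push the archive over SSH at the end of the same script; the host, user and destination path are placeholders, and key-based authentication is assumed so cron can run it unattended:

        scp ~/backup.tar.gz backupuser@backuphost:/srv/backups/backup-$(hostname).tar.gz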

    Read the article

  • Windows server backup question

    - by serveradminguy
    Hi, is it possible to make a backup of my Windows Server with the built-in Microsoft tool such that, as long as my Hyper-V virtual machines are stored safely elsewhere and are not part of that backup, I can still restore the Windows Server from the native backup and keep using the Hyper-V machines? In other words, if I lost my C:\ and my VMs are stored remotely, could I restore from an earlier backup and use those VMs?

    Read the article

  • How do I restore my system from a "Backup and Restore Center" backup?

    - by Daniel R Hicks
    The Windows (Vista) documentation and available online info are comprehensively vague. If I have a moderately brain-dead system and want to restore it, and I have a "Backup and Restore Center" backup whose "delta" is not quite a week old (but with a "full backup" behind it), what steps do I go through to get my box back to that backup point? It's totally unclear whether simply doing "restore all" from the (advanced) "Center" is sufficient, or whether I need to first take the box back to day zero with the system restore DVD, etc.

    Read the article

  • SQL Server: "This filegroup cannot be used as a backup destination" error when attempting restore

    - by Ariel
    When running a command like the following:

        RESTORE FILELISTONLY FROM DISK = '\\server\folder\DummyDB.bak'

    I'm getting this error:

        Backup destination "\\server" supports a FILESTREAM filegroup. This filegroup cannot be used as a backup destination.
        Rerun the BACKUP statement with a valid backup destination. RESTORE FILELIST is terminating abnormally.

    Unless someone comes up with a better idea, it seems that the drive from which the restore is being attempted must not contain any database file that belongs to a filegroup. Is that the case? Thanks in advance.

    Read the article

  • PowerShell (sqlps) LastBackupDate not changing despite having run a SQL Server backup

    - by user1666376
    I'm using PowerShell to check last backup times across all our SQL Server databases. This seems to work really well, but I've got a question. If I run this (a cut-down version of the actual script):

        dir SQLSERVER:\SQL\Server1\default\databases | select parent, name, lastbackupdate

    I get:

        Parent    Name               LastBackupDate
        ------    ----               --------------
        [Server1] ADBA               10/09/2012 21:15:37
        [Server1] ReportServer       10/09/2012 21:00:17
        [Server1] ReportServerTempDB 10/09/2012 21:00:18
        [Server1] db1                10/09/2012 21:15:35

    If I then run a SQL backup of the Server1 default instance and run the same query, the last backup date doesn't change:

        PS C:\temp> dir SQLSERVER:\SQL\Server1\default\databases | select parent, name, lastbackupdate

        Parent    Name               LastBackupDate
        ------    ----               --------------
        [Server1] ADBA               10/09/2012 21:15:37
        [Server1] ReportServer       10/09/2012 21:00:17
        [Server1] ReportServerTempDB 10/09/2012 21:00:18
        [Server1] db1                10/09/2012 21:15:35

    ...but if I open a new PowerShell window, it shows the backup I just took:

        PS SQLSERVER:\> dir SQLSERVER:\SQL\Server1\default\databases | select parent, name, lastbackupdate

        Parent    Name               LastBackupDate
        ------    ----               --------------
        [server1] ADBA               12/09/2012 09:03:23
        [server1] ReportServer       12/09/2012 08:48:03
        [server1] ReportServerTempDB 12/09/2012 08:48:04
        [server1] db1                12/09/2012 09:03:21

    My guess is that this is expected behaviour, but could anybody show me where it's documented or explained? I just want to understand what's going on. This is the sqlps that came with 2008, running against a 2008 instance. Thanks, Matt
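
    For what it's worth, this behaviour is consistent with the SMO objects behind the SQLSERVER: provider caching property values for the life of the session. A hedged one-liner that forces a re-read without opening a new window:

        dir SQLSERVER:\SQL\Server1\default\databases | ForEach-Object { $_.Refresh(); $_ } | select parent, name, lastbackupdate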

    Read the article

  • rsync over ssh backup failing after relocation of server

    - by OlduvaiHand
    I've got two FreeBSD machines set up; one serves video data and the other is the backup for the first. At this point I've got around 4 TB of data. I add files to the video server a few at a time, and was planning to use rsync over ssh to keep the backup machine up to date. I did the initial, large backup with both machines hooked up to the same subnet at the lab with no problems using rsync. Then, when I moved the backup machine off-site (but still on the university network), I attempted a sync without changing anything other than the IP (as the machine is now on a different subnet) and got the following error:

        2010/03/22 15:55:21 [1260] rsync: connection unexpectedly closed (6340840244 bytes received so far) [receiver]
        2010/03/22 15:55:21 [1260] rsync error: error in rsync protocol data stream (code 12) at io.c(601) [receiver=3.0.7]
        2010/03/22 15:55:21 [1258] rsync: connection unexpectedly closed (60 bytes received so far) [generator]
        2010/03/22 15:55:21 [1258] rsync error: unexplained error (code 255) at io.c(601) [generator=3.0.7]

    The script that handles the backup hasn't been changed, nor has the crontab that invokes it. Does anyone have any ideas about what might be causing the hiccup? I was under the impression that it might have something to do with the ssh connection timing out or something along those lines, but am not entirely clear on how to diagnose the cause of the problem.
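
    If an idle-connection timeout somewhere on the new network path is the culprit, one hedged test is to add keepalives at both layers; the remote host and paths here are placeholders:

        rsync -az --timeout=600 -e "ssh -o ServerAliveInterval=15" /video/ backup@offsite.example.edu:/video/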

    Read the article

  • Could not retrieve backup settings for primary ID in Log shipping

    - by user1723139
    I am doing log shipping between two Amazon EC2 instances running Windows Server 2008 R2 with SQL Server 2008 R2 Standard Edition. Both instances are in the same domain and I can access the shared folders between the instances. The SQL Server service account and agent service account are all running under a domain account. When I activate log shipping (with standby-mode restore on the secondary server), the initial backup gets restored on the secondary. After that, the backup operation fails and I get the following error message:

        *** Error: Could not retrieve backup settings for primary ID 'xxxxxx-xxxx-xxxx-xxxx-4d772cd7337e'. (Microsoft.SqlServer.Management.LogShipping) ***
        *** Error: Failed to connect to server IP-0A7653F2. (Microsoft.SqlServer.ConnectionInfo) ***
        *** Error: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server) (.Net SqlClient Data Provider) ***
        ----- END OF TRANSACTION LOG BACKUP -----

    Any ideas?

    Read the article

  • Backup Exec 12.5 or 2010? [closed]

    - by Chris Thorpe
    Backup Exec 2010 has just dropped, and I'm about to implement a new BEWS infrastructure, complete with CALs and new central servers. When I specced this up last year I ignored 2010 and focused on Backup Exec 12.5, since it's a mature product. In my previous experience, major releases of BE had numerous technical issues and seemed to improve significantly at the first service pack. However, our refresh cycle on the backup infrastructure is slow, the main driver usually being lack of support for some new server type (in this case, ESX has driven our current upgrade need). With this in mind, I'm wondering if Backup Exec 2010 should be my first choice, as it'll last longer under current support than 12.5, which will approach end of life soon. Has anyone got any perspective they could add to this? Right now I'm leaning towards biting the bullet and going with 2010.

    Read the article

  • BTrFS subvolume / snapshot question

    - by bumbling fool
    I think I'm having difficulty fully understanding subvolumes and snapshots. The /home partition is btrfs. I want to create a "backup" snapshot of /home/user (for example), but user has existed for years (the filesystem was previously converted from ext4 with btrfs-convert). I believe you can only make a snapshot of a subvolume, and I checked that there are no "default" subvolumes already present. 1) Is there another way for me to back up /home/user other than creating a subvolume /home/user2, copying everything from user to user2, and snapshotting that?
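
    A hedged note: on a mounted btrfs filesystem the top level is itself a subvolume, so it can be snapshotted directly, and the snapshot will contain /home/user without any copying. A sketch using the paths from the question:

        # snapshot the whole of /home (its top-level subvolume) into a visible directory
        btrfs subvolume snapshot /home /home/snapshot-$(date +%Y%m%d)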

    Read the article

  • Cloud storage that works with rsnapshot?

    - by humbledude
    I've started using rsnapshot as the backup system for my home PC. I really like the idea of hard links and how they are handled, but I can't find the best workflow. Currently I keep my snapshots on the same partition and copy the newest snapshot to a pen drive at the end of the week. Cloud storage is what I'm looking for. Dropbox doesn't fit my needs, because there is no way to make Dropbox respect hard links; all snapshots are treated as full snapshots. Renting a server is pretty expensive, so my question is: are there better alternatives for backup in the cloud? I would like to benefit from hard links and send only incremental backups, just as I do on my local host.
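
    For context, the hard-link trick rsnapshot relies on is plain rsync functionality, so any SSH-capable host can hold the snapshots. A hedged sketch in which the host and directory layout are placeholders:

        # files unchanged since the previous snapshot become hard links, not new copies
        rsync -a --delete --link-dest=../daily.1 /home/user/ backup@host.example.com:/backups/daily.0/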

    Read the article
