Search Results

Search found 5747 results on 230 pages for 'backup'.

Page 21 of 230

  • Backup Your Windows Home Server Off-Site with Asus Webstorage

    - by Mysticgeek
    Windows Home Server lets you back up machines on your network easily. But what about backing up the server data? Today we take a look at ASUS WebStorage for Windows Home Server, which provides you with secure off-site backup for WHS. To use the ASUS WebStorage service you’ll need to sign up for a free account. It offers 1GB of free storage; after that you can purchase an unlimited backup package for $39.99 for a one-year subscription. Note: They offer online storage for individual PCs as well.

    Install ASUS WebStorage for WHS
    Browse to your shared folders on the server, open the Add-Ins folder, and copy over the WHSConnectorSetup2.2.4.088.msi file (link below), then close out of the folder. Now launch the Windows Home Server Console from one of the computers on your network, click Settings, then Add-ins. Click the Available tab and you’ll see the ASUS WebStorage installer file we just copied over. Click the Install button. Installation kicks off, and when it’s complete you’ll need to close out of the console and reconnect.

    Using the ASUS WebStorage WHS Connector
    When you reconnect to the WHS Console, scroll over to the ASUS WebStorage icon and click on Settings. Now log into your ASUS account, then select the folders you want to back up to the WebStorage service. Select the radio button next to Enable to initialize the backup process. The backup process begins. You can change which folders are backed up simply by disabling the backup process, unchecking the folder(s), then enabling the backup again.

    ASUS WebStorage Site
    After you have files backed up to the ASUS site, log into your account and you’re presented with an overview of the amount of storage you’re using. It also shows which types of files are taking up how much space. You can browse through your backed-up files and folders, and the service lets you share and sync backed-up data as well. Navigate to the file you want and you can easily download it by clicking on it, or share it out by clicking the share link below it. If you choose to share it, you’re provided with a link to the file to send out to other users.

    Conclusion
    Users of Windows Home Server have been looking for an inexpensive cloud backup solution for quite some time. There are services such as JungleDisk, KeepVault, Wuala, etc. These services probably do a better job, but they can start getting expensive once you upload GBs of data. Another disappointment of ASUS WebStorage is that you can only back up your WHS shares (from what we’ve been able to determine); it’s an “all or nothing” type of thing, and you cannot go in and select individual files and folders. The initial upload speeds can be a bit slow as well, although that might have something to do with the limited upload speed of the DSL connection we used to test it. Retrieving your data from the ASUS site is a breeze, though, and the data files are organized quite well. The WHS add-in is very easy to install and use. If you’re looking for an off-site solution to back up your WHS data, you can test out ASUS WebStorage for free with a 1GB limit. This is good for testing the service, and it might be exactly what you’re looking for. Other users may want a more advanced solution like KeepVault or CloudBerry, which is a front end for Amazon S3 storage.
Download ASUS WebStorage WHS Addin | Other WHS offsite backup solutions: CloudBerry, JungleDisk, KeepVault, Wuala

    Read the article

  • Windows Server 2008 backup VHDs - is it possible to mount/open them in Windows 7?

    - by Simon
    Hi All, Is it possible to mount the VHD files created by the Windows Server 2008 backup utility onto a Windows 7 (release) client? Following an array failure I was very worried that there was a problem with both of the backup sets on different USB drives, as attaching the VHD to a Win 7 box did not show the expected structure (instead they behaved like unformatted disk space). Subsequently, I've attached the backup drive to a 2008 R2 machine that I'd intended to be the replacement, and the backup set can be browsed without issue (seemingly). When the new disks arrive I'll go through the recovery process and see where we are, but it looks promising so far. Is it simply the case that you can't take server-created VHDs and mount them on desktop machines? (Rather than hyperventilating at the thought of years of lost photos and email, I'm now just mildly curious.)

    Edit: One thing that has confused things is that the backup utility on Win 7 is more restrictive about restoring from external devices than the equivalent on 2008 R2. With R2, I can restore files 'from another server' and browse to external storage. Win 7 only allows the backup to be located on a network share. Once my box of new disks arrives and I've got something to restore onto, I'll move the smaller of the backup VHDs onto network storage reachable by Win 7 and see if the VHD is readable. I haven't read up on the VHD process used by the backup app - I'm assuming it's a base VHD plus differencing files for incremental backups, and that the restore app understands this.

    Finally: In retrospect the question should have been, 'can I restore a 2008 R2 backup set via a Win 7 client?' Thanks
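    (For anyone wanting to try the direct mount again: Windows 7 can attach a VHD natively via diskpart, read-only so the backup set isn't touched. A minimal sketch - the path is illustrative; the backup VHDs normally sit under a WindowsImageBackup folder on the backup drive:)

        diskpart
        rem pick the backup VHD (substitute the real path on your USB drive)
        select vdisk file="E:\WindowsImageBackup\SERVER\Backup 2009-12-01 120000\disk.vhd"
        rem attach without a write path so the contents stay pristine
        attach vdisk readonly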

    Read the article

  • Why am I not able to create a backup plan for TFS?

    - by noocyte
    I am trying to create a backup plan using the TFS Power Tools, but I keep running into this error message: I have checked that the account has Full Control on the share; I can edit, create and delete files there. From the log:

    [Info @07:15:00.403] Starting creating backup test validation
    [Error @07:15:00.700] Microsoft.SqlServer.Management.Smo.FailedOperationException: Backup failed for Server 'WMSI003714N\SqlExpress'.
    ---> Microsoft.SqlServer.Management.Common.ExecutionFailureException: An exception occurred while executing a Transact-SQL statement or batch.
    ---> System.Data.SqlClient.SqlException: Cannot open backup device '\\wmsi003714n\sql dump\Tfs_Configuration_20100910091500.bak'. Operating system error 5 (failed to retrieve text for this error. Reason: 1815). BACKUP DATABASE is terminating abnormally.
       at Microsoft.SqlServer.Management.Common.ConnectionManager.ExecuteTSql(ExecuteTSqlAction action, Object execObject, DataSet fillDataSet, Boolean catchException)
       at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
    --- End of inner exception stack trace ---
       at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
       at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(StringCollection sqlCommands, ExecutionTypes executionType)
       at Microsoft.SqlServer.Management.Smo.ExecutionManager.ExecuteNonQuery(StringCollection queries)
       at Microsoft.SqlServer.Management.Smo.BackupRestoreBase.ExecuteSql(Server server, StringCollection queries)
       at Microsoft.SqlServer.Management.Smo.Backup.SqlBackup(Server srv)
    --- End of inner exception stack trace ---
       at Microsoft.SqlServer.Management.Smo.Backup.SqlBackup(Server srv)
       at Microsoft.TeamFoundation.PowerTools.Admin.Helpers.BackupFactory.TestBackupCreation(String path)
    [Error @07:15:00.731] !Verify Error!: Account GROUPINFRA\SA-NO-TeamService failed to create backups using path \\wmsi003714n\sql dump
    [Info @07:15:00.731] "Verify: Grant Backup Plan Permissions\Root\VerifyDummyBackupCreation(VerifyTestBackupCreatedSuccessfully): Exiting Verification with state Completed and result Error"

    Any ideas?
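    (Operating system error 5 is access denied for the account SQL Server itself runs as: the BACKUP DATABASE statement is executed by the SQL Server service, so it is that service account - not the account you verified on the share - that needs write access, at both the share and NTFS level. A hedged sketch, run on the machine hosting the share; the service account name here is illustrative:)

        REM grant the SQL Server service account modify rights on the folder behind the share
        icacls "D:\sql dump" /grant "GROUPINFRA\sqlsvcaccount:(OI)(CI)M"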

    Read the article

  • Are ZFS snapshots + S3 a viable backup system for several VMs and general fileserver storage?

    - by AllanA
    I've been tasked with setting up a backup system for my small office (around 12 people). Most of our production stuff is on the AWS cloud, so what I need to back up are some small office/development files (under 100G right now), plus our operational VMs and development, which round out to a bit under 1T. I just need something reliable, convenient, and straightforward. I'm comfortable with Linux, FreeBSD, and to some extent Solaris 10, so I'm leaning toward a full server rather than an appliance system like Openfiler or FreeNAS. What I'm contemplating is a small fileserver for general storage and nightly backups of the virtual machines, followed up by an offsite backup to Amazon's S3 storage service. It'd be the usual incremental backups nightly and a full backup weekly. My question is whether ZFS snapshots, kept locally and also dumped to S3 via 'zfs send [-i]', are viable as a backup tool, or whether I should stick to using Duplicity, or some other method entirely. ZFS snapshots on the internal fileserver/backup machine sound like a perfect way to provide quick and convenient data recovery, so I'm likely to go with that for local redundancy. (If you folks see scenarios where relying on ZFS snapshots would be worse than a more traditional archiving backup, feel free to convince me.) But are snapshots flexible enough to lean on for recovery from the loss of my backup server? Or am I better off with something more traditional? (Feel free to recommend free or commercial backup solutions you favor.)
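    (A minimal sketch of the snapshot-and-ship cycle in question, assuming a pool named tank, GPG for encryption, and a command-line S3 uploader; all names and the bucket are illustrative:)

        # nightly: snapshot, then send only the increment since the previous snapshot
        zfs snapshot tank/data@2012-06-02
        zfs send -i tank/data@2012-06-01 tank/data@2012-06-02 | gpg --symmetric --cipher-algo AES256 | aws s3 cp - s3://example-backups/tank-data-2012-06-02.zfs.gpg

        # weekly: send the full snapshot stream instead of an increment
        zfs send tank/data@2012-06-03 | gpg --symmetric --cipher-algo AES256 | aws s3 cp - s3://example-backups/tank-data-full-2012-06-03.zfs.gpg

    (One caveat to weigh against Duplicity: a stored 'zfs send' stream is all-or-nothing - 'zfs receive' rejects a stream with any corruption, and you cannot pull individual files out of it without first receiving the whole stream into a pool, so verify your uploads.)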

    Read the article

  • Backing up Windows Server 2008 R2 to FTP server

    - by Adrian Grigore
    Hi, I'm looking for an inexpensive way of backing up my Windows 2008 R2 dedicated server to an FTP server. To be of any use, the software should also be able to restore the server by using a bootable CD and the backup set stored on the FTP server. So Windows Server Backup seems to be out of the question. Can anyone recommend any suitable products? Preferably some you have actually tried yourself? Thanks, Adrian

    Edit: Just to clarify, by inexpensive I mean something that costs 250 EUR or less...

    Read the article

  • Back up Windows 2008 SBS to iSCSI disk

    - by Farseeker
    I've almost no experience with SBS 2008, so please excuse my noob question! SBS 2008 only has the most basic backup utility built in as far as I can tell (similar to Vista), and it will only back up to physical volumes. I've read that you can set up a batch task to backup to a network volume, but right now I just need to get something deployed ASAP. We have an iSCSI target with plenty of free space. Is it worth backing up to an iSCSI target? Or am I wasting my time? If I need to do a recovery from the iSCSI disk, how would I go about it?
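    (One data point: once the LUN is connected through the iSCSI Initiator, brought online, and formatted, Windows treats it as a local disk, so the built-in backup can target it like any other volume. A hedged sketch using the command-line side of the same utility - the drive letter E: is illustrative:)

        REM one-off full backup of the system volume to the iSCSI-backed disk
        wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet

    (For recovery, you'd boot the SBS installation media into the recovery environment and point it at a system image; whether that environment can reach the iSCSI target directly is worth a test restore before relying on it.)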

    Read the article

  • Backing Up vs. Redundancy

    - by TK Kocheran
    I'm currently in stage 2 of 3 of building my home workstation. What this means is that my RAID-0 array of solid state disks will be backed up nightly to a RAID-5 or RAID-6 array of traditional spinning hard disks. However, it recently dawned on me that redundancy is not backup. The main reason for setting up a RAID array with redundancy was to protect myself in the event of a drive failure and to serve as an effective backup solution.

    Wait. What if a bolt of lightning finds a way to travel into my house, through my surge protector, into my power supply, and physically destroys all of my hard disks and SSDs? Well, in that case, I guess I'd be fine, because I generally keep the most important files (music, pictures, videos) stored in multiple places, like on my laptop, my wife's laptop, and an encrypted USB hard drive.

    Wait. What if a giant hedgehog meteor attacks my house from space traveling at Mach 3 and all machines and hard disks are blown to smithereens? Well, I guess I could find a way to do ridiculously slow and cumbersome rsyncs or backups to Amazon's Glacier.

    Wait. What if there's a nuclear apocalypse... and at this point I start laughing hysterically. At what point does backing up become irrelevant? I completely understand situation one (mechanical drive failure), situation two (workstation compromised or destroyed somehow), possibly even situation three (all machines and disks destroyed), but situation four? There's no questioning the need for backups. None. However, there are three questions I'd really like addressed:

    To what level should one back up? I definitely understand the merits of physical disk redundancy. I also believe in keeping important files on multiple machines, thinning out the possibility of losing all of my files. Online backups make sense, but they beg the following question.

    What should I be backing up remotely, and how often? It's no problem storage-wise to back up important files (music, pictures, videos) and even configuration and temporal data for all of the machines in my network (all Linux based)... albeit locally. Transferring to the cloud is another story. Worst-case scenario, if I lost all of the configuration for my individual computers, the reality is that I probably lost the machines too. The cloud is a long way away from here; I can run backups over CAT-6 here and see 100MB/s easily, but I'm afraid that I'm only going to see 2MB/s at best when transferring up to the cloud.
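    (On the "ridiculously slow and cumbersome rsyncs" point: an incremental push of only the irreplaceable subset keeps each nightly transfer proportional to what changed rather than to the total size, which matters at 2MB/s. A sketch - host and paths are illustrative:)

        # nightly cron entry: ship only the irreplaceable data; rsync transfers just the deltas
        0 3 * * * rsync -az --delete /home/tk/photos /home/tk/music /home/tk/documents backup@offsite.example.com:/srv/backups/workstation/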

    Read the article

  • Windows 7 Image Restore with a smaller hard drive

    - by Vaccano
    I have a 500 GB drive that I have made a system image of. I would like to move that to a 250 GB drive (because it is a solid state drive). I have made a Windows 7 backup image of my 500 GB drive. I am currently only using 163 GB of that drive. Can I just restore that to the target drive, or will the restore be expecting a 500 GB drive? If it is expecting it, I can shrink my partition to less than 250 GB and back up again. But I would rather not if that is not needed. Will the restore realize that it is not using all the space and just take what it needs?
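    (If it turns out the restore does insist on a 500 GB target - image restores generally want a disk at least as large as the original, regardless of used space - shrinking first is the usual route. A sketch with diskpart; the shrink amount is illustrative and given in MB:)

        diskpart
        select volume C
        rem shrink by ~260 GB so the partition fits comfortably on the 250 GB SSD
        shrink desired=266240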

    Read the article

  • Western Digital SmartWare - still problematic?

    - by FrustratedWithFormsDesigner
    I've seen a lot of bad press on WD SmartWare (I think it comes on most WD backup devices now, such as their MyBook product line), mostly related to how it's impossible to remove properly or replace. There are allegations (I couldn't tell how true they were) that it has/is a rootkit, as well. Most of the articles are a couple of years old, so I'm wondering if SmartWare is still just as problematic as it was. Does it still have a nasty rootkit reputation and should I just stick with the Windows 7 built-in backup system, or is the current SmartWare generation improved and better behaved?

    Read the article

  • Deleting pagefile.sys on shutdown

    - by Daniel E. Shub
    I have a Windows XP machine (it is a VM running in Xen) that I would like to back up. I have enabled ClearPageFileAtShutdown by following MS KB 314834. If I cleanly shut down the XP machine and then mount the drive in another machine (which is trivial since the machine is virtual), I still have a large pagefile.sys. I was hoping that enabling ClearPageFileAtShutdown would result in a pagefile.sys with a size near zero. I have two questions. First, is it possible to have pagefile.sys be deleted, or have a drastic size reduction, at shutdown? Second, can I exclude pagefile.sys from my backup?
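    (For what it's worth: ClearPageFileAtShutdown overwrites the pagefile with zeros at shutdown as a security measure; it does not delete or shrink the file, which matches what you're seeing. The setting itself, as a sketch - this is the registry value KB 314834 describes:)

        REM zero the pagefile at shutdown (takes effect after the next reboot)
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /v ClearPageFileAtShutdown /t REG_DWORD /d 1 /f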

    Read the article

  • Using mongodump with an auth enabled mongodb server

    - by bb-generation
    I'm trying to do a daily backup of my mongodb server (auth enabled) using the mongodump tool. mongodump provides two parameters to set the credentials:

    -u [ --username ] arg    username
    -p [ --password ] arg    password

    Unfortunately it provides no parameter to read the password from stdin, so every time I run this command, everyone on the server can read the password (e.g. by using ps aux). The only workaround I have found is stopping the database and directly accessing the database files using the --dbpath parameter. Is there any other solution which allows me to back up the mongodb database without stopping the server and without "publishing" my password? I am using Debian squeeze 6.0.5 amd64 with mongodb 1.4.4-3.
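    (One hedged option: much newer releases of the MongoDB tools than the 1.4.4 shipped with squeeze accept a YAML config file for sensitive values, so the password never appears in the process list. A sketch, assuming an upgraded mongodump; the file name and credentials are illustrative:)

        # /root/mongodump.yml, chmod 600, containing a single line:
        #   password: s3cret
        mongodump --username backupuser --config /root/mongodump.yml --out /var/backups/mongo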

    Read the article

  • Is it a good idea to take onsite/offsite backups of server images?

    - by ServerAdminGuy45
    Assuming a non-virtualized environment, is it a good idea to take actual images of servers (using something like Acronis True Image) and store them on/off site? Backing up data is great, but I feel it would be good to have copies of OS images so that in the event hardware dies or an upgrade gets botched I can always revert. What would be your recommended way to do this (preferably using a NAS and an online backup service)? I was talking with the Iron Mountain folks and the service they described is more geared toward taking incremental snapshots of data. I'm not sure if there's a way to back up images in an incremental way such that only the changes between them are saved (that way I'm not wasting X GB each time I take an image).

    Read the article

  • Extract duplicity difftar files manually

    - by isnogud
    I have a duplicity backup which I am not able to recover with duplicity. Calling duplicity file:///path/to/backups /path/to/dir returns "Local and Remote metadata are synchronized, no sync needed." but /path/to/dir is empty. I decrypted all the backup volumes and I'm able to view and extract the files from the different difftar files. My only problem is that large files are partitioned and saved in folders named after the files. Can anyone give me a simple script, or at least a hint, on how to untar these difftar files so I get the actual files instead of the partitioned ones?
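    (A hedged sketch of the reassembly: in a duplicity difftar, small files live under snapshot/, while large files are split into numbered parts under multivol_snapshot/<original path>/. Concatenating the parts in numeric order rebuilds each file; paths here are illustrative:)

        # first extract every decrypted volume into ./restored, e.g.:
        #   for v in duplicity-full.*.difftar; do tar -xf "$v" -C restored; done
        cd restored
        find multivol_snapshot -type f -name '[0-9]*' -printf '%h\n' | sort -u |
        while read -r dir; do
            out="rebuilt/${dir#multivol_snapshot/}"
            mkdir -p "$(dirname "$out")"
            # parts are named 1, 2, 3, ... - join them in numeric order
            ls "$dir" | sort -n | while read -r part; do
                cat "$dir/$part"
            done > "$out"
        done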

    Read the article

  • Cloud storage provider lost my data. How to back up next time?

    - by tomcam
    What do you do when cloud storage fails you? First, some background. A popular cloud storage provider (rhymes with Booger Link) damaged a bunch of my data. Getting it back was an uphill battle with all the usual accusations that it was my fault, etc. Finally I got the data back. Yes, I can back this up with evidence. Idiotically, I stayed with them, so I totally get that the rest of this is on me.

    The problem had been with a shared folder that works with all 12 computers my business and family use with the service. We'll call that folder the Tragic Briefcase. It is a sort of global folder that's publicly visible to all computers on the service. It's our main repository. Today I decided to deal with some residual effects of the Crash of '11. Part of the damage they did was that on just one of my computers (my primary, of course) all the documents in the Tragic Briefcase were duplicated in the Windows My Documents folder. I finally started deleting them. But guess what: though they appeared to be duplicated in the file system, removing them from My Documents on the primary PC caused them to disappear from the Tragic Briefcase too. They efficiently disappeared from all the other computers' Tragic Briefcases as well. So now 21 gigs of files are gone, and of course I don't know which ones.

    I want to avoid this in the future. Apart from using a different storage provider, the bigger picture is this: how do I back up my cloud data? A complete backup every week or so from web to local storage would cause me to exceed my ISP's bandwidth. Do I need to back up each of my 12 PCs locally? I do use Backupify for my primary Google Docs, but I have been storing taxes, confidential documents, Photoshop source, video source files, and so on using the web service. So it's a lot of data, but I need to keep it safe. Backing up locally would also mean 2 backup drives or some kind of RAID per PC, right? Because you can't trust a single point of failure. Assuming I move to Dropbox or something of its ilk, what is the best way to make sure that if the next cloud storage provider messes up I can restore?

    Read the article

  • SharePoint 2010 and Windows Server Backup

    - by Enrique Lima
    A couple of months ago, a friend found a bit of information on TechNet that has proven to be quite useful. See, I am of the opinion that SharePoint allows for smaller deployments to be made, and with that said, I am talking about SharePoint Foundation 2010 being used for the most part. But truly the point here is not to discuss whether or not a deployment of SharePoint Foundation 2010 or SharePoint Server 2010 is right or not. The fact is they do take place and happen. And information will reside there.

    Now, the point of this post is to raise awareness of options available for companies that have implemented it and maybe are a bit “iffy” on how to protect the information being placed in libraries and lists. In many cases I have found SharePoint comes first and business continuity becomes an afterthought. The documentation piece from TechNet states:

    “You can register SharePoint Server 2010 with Windows Server Backup by using the stsadm.exe -o registerwsswriter operation to configure the Volume Shadow Copy Service (VSS) writer for SharePoint Server. Windows Server Backup then includes SharePoint Server 2010 in server-wide backups. When you restore from a Windows Server backup, you can select Microsoft SharePoint Foundation (no matter which version of SharePoint 2010 Products is installed), and all components reported by the VSS writer for SharePoint Server 2010 on that server at the time of the backup will be restored. Windows Server Backup is recommended only for use with single-server deployments.”

    Even in the event of single-server deployments you will have options to safeguard your data. The process requires that, after you have executed the stsadm command above, you use Windows Server Backup to do a full server backup. Then, when the restore operation is needed, you will be able to select specifically the section that has the SharePoint technologies backup.

    The restore process:

    Hope you find this to be a helpful post. I have found this to be especially handy in SharePoint deployments that are part of a Team Foundation Server deployment and that are isolated from any other SharePoint farm and such.

    Credits: Sean McDonough for passing along the information available on TechNet.
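    (The registration step itself, as a hedged sketch - stsadm lives in the SharePoint 2010 hive's BIN directory; the backup target letter is illustrative:)

        cd /d "%CommonProgramFiles%\Microsoft Shared\Web Server Extensions\14\BIN"
        stsadm -o registerwsswriter
        REM then a regular full server backup picks up the SharePoint VSS writer
        wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet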

    Read the article

  • SQL Server database backup: Network Service file access

    - by Keith Maurino
    When trying to run the following database backup command from my code I get an "Operating system error 5 (Access is denied.)" error. This is because the logon account for the SQL Server Windows service is 'Network Service', and that does not have the right to write to this folder.

    BACKUP DATABASE [AE3DB] TO DISK = 'c:\AE3\backup\AE3DB.bak'

    My question is: from my code, how would I go about figuring out where on the C drive 'Network Service' is allowed to write the backup to? NOTE: This is a distributed application, so I cannot easily change the logon for the SQL Server Windows service to the 'Local System' account that would be able to write to that folder.
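    (An alternative worth considering, hedged: instead of hunting for a folder Network Service can already write to, grant it rights on the folder you want at install time. icacls ships with Vista/Server 2008 and later; the path is the one from the backup command:)

        REM give the SQL Server service account modify rights on the backup folder
        icacls "c:\AE3\backup" /grant "NT AUTHORITY\NETWORK SERVICE:(OI)(CI)M"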

    Read the article

  • Java database backup/restore

    - by jawath
    How do I back up/restore any kind of database inside my Java application to flat files? Are there any tools or frameworks available to back up a database to a flat file like CSV, XML, or a secure encrypted file, or to restore from CSV or XML files to the database? It should also be capable of table-wise backup and restore.

    Read the article

  • Custom Online Backup Solution Advice

    - by Martín Marconcini
    I have to implement a way for our customers to back up their SQL 2000/2005/2008 databases online. The application they use is a C#/.NET 3.5 WinForms application that connects to a SQL Server (can be 2000/2005/2008, sometimes Express editions). The SQL Server is on the same LAN. Our application has a very specific UI and we must code each form following those guidelines. There’s lots of GDI+ to give it the look and feel we want. For that reason, using a 3rd-party application is not a very good idea. We need to charge the customer on a monthly/annual basis for the service. Preferably, the customer doesn’t need to care about bandwidth and storage space. It must be transparent. Given the above requirements, my first thoughts are:

    Solution 1: Code some sort of basic FTP functionality with a behind-the-scenes SQL backup mechanism, then hire a hosting service and compress-transfer the .BAK to the hosting. Maintain a series of folders (one for each customer). They won’t see what’s happening. They will just see a list of their files and a big “Backup now” button that will perform the SQL backup, compress it and upload it (and update the file list). ;)
    Pros: Not very complicated to implement, simple to use, fairly simple to configure (could have a dedicated ftp user/pass).
    Cons: Finding an “FTP only” hosting plan is probably not going to be easy; they usually come with a bunch of stuff. FTP is not always the best protocol. More?

    Solution 2: Similar to 1, but instead of FTP, find a cloud computing service like Amazon S3, Mosso or similar.
    Pros: Cloud storage is fast, reliable, etc. It’s kind of easy to implement (specially if there are APIs like AWS or Mosso).
    Cons: I have been unable to come up with a service optimized for resellers where I can have multiple sub-accounts (one for each customer). Billing is going to be a nightmare because these services bill per GB, and with one account it’s impossible to differentiate each customer.

    Solution 3: Similar to 2, but letting the user create their own account on Amazon S3 (for example).
    Pros: You forget about billing and such.
    Cons: A mess for the customer, who has to open the Amazon (or whatever) account and will be charged for it by them, not by you. You can’t really charge the customer (since you’re just not doing anything).

    Solution 4: Use one of the many online backup solutions built on cloud storage.
    Pros: Many of these include SQL Server backup, and a lot of features that we’d have to implement. Plus web access and stuff like that will come included.
    Cons: Still have the billing problem described in number 2. Few of these companies (if any) offer “reseller” accounts. You have to eventually use their software (some offer certain branding).

    Any better approach? Summary: You have a software product (.NET WinForms app). You want your users to be able to back up their SQL Server databases online (and be able to retrieve the backups if needed). You ideally would like to charge the customer for this service (i.e. XX EUR a year).

    Read the article

  • Mercurial local repository backup

    - by Ricket
    I'm a big fan of backing things up. I keep my important school essays and such in a folder of my Dropbox. I make sure that all of my photos are duplicated to an external drive. I have a home server where I keep important files mirrored across two drives inside the server (like a software RAID 1). So for my code, I have always used Subversion to back it up. I keep the trunk folder with a stable copy of my application, but then I create a branch named with my username, and inside there is my working copy. I make very few changes between commits to that branch, with the understanding that the code in there is my backup.

    Now I'm looking into Mercurial, and I must admit I haven't truly used it yet, so I may have this all wrong. But it seems to me that you have a server-side repository, and then you clone it to a working directory in the form of a local repository. Then as you work on something, you make commits to that local repository, and when things are in a state to be shared with others, you hg push to the parent repository on the server. Between pushes of stable, tested, bug-free code, where is the backup?

    After doing some thinking, I've come to the conclusion that it is not meant for backup purposes and it assumes you've handled that on your own. I guess I need to keep my Mercurial local repositories in my Dropbox or some other backed-up location, since my in-progress code is not pushed to the server. Is this pretty much it, or have I missed something?

    If you use Mercurial, how do you back up your local repositories? If you had turned on your computer this morning and your hard drive went up in flames (or, more likely, the read head went bad, or the OS corrupted itself, ...), what would be lost? If you spent the past week developing a module, writing test cases for it, documenting and commenting it, and then a virus wipes your local repository away, isn't that the only copy? So then, on the flip side, do you create a remote repository for every local repository and push to it all the time? How do you find a balance? How do you ensure your code is backed up? Where is the line between using Mercurial as backup and using a local filesystem backup utility to keep your local repositories safe?
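    (For what it's worth, the common answer is exactly that: a push target per repository, automated so you never think about it. A minimal sketch - paths are illustrative, and the mirror could equally live inside a Dropbox folder or on the home server:)

        # one-time: create an empty mirror repository on the backed-up drive
        hg init /mnt/backup/myproject
        # hourly cron entry: push any new changesets to the mirror
        # (hg push exits non-zero when there is nothing new, hence the || true)
        0 * * * * cd ~/src/myproject && hg push -q /mnt/backup/myproject || true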

    Read the article

  • Web site backup in PHP?

    - by Pekka
    Does anybody know a clean PHP-based solution that can back up remote web sites using FTP?

    Must haves:
    - Recursive FTP backups
    - Possible to run through a cron job
    - Easy to configure (easy adding of multiple sites)
    - Local storage of backup files is sufficient

    Would be nice:
    - Backed-up sites are stored as zip files
    - A nice interface to manage things
    - Provides notification when backup has succeeded or failed
    - Does incremental backups
    - Does MySQL database backups

    I need this to be PHP (or Perl) based because it's going to be used on shared hosting packages that do not allow usage of the standard GNU/Linux tools available.

    Read the article

  • Best choice for off-site backup: dd vs tar

    - by plok
    I have two 1TB single-partition hard disks configured as RAID1, of which I would like to make an off-site backup on a third disk, which I am still to buy. The idea is to store the backup at a relative's house, considerably far away from my place, in the hope that all the information will be safe in the case of a global thermonuclear apocalypse. Of course, this backup would be well encrypted. What I still have to decide is whether I am going to simply tar the entire partition or, instead, use dd to create an image of the disks. Is there any non-trivial difference between these two approaches that I could be overlooking? This off-site backup would be updated no more than two or three times a year, in the best of the cases, so performance should not be a factor to be pondered at all. What, and why, would you use if you were me? dd, tar, or a third option?
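    (The non-trivial difference in a nutshell: tar works at the file level, so the archive skips free space and restores onto a disk of any size or filesystem; dd copies every block, so the image is the full 1TB regardless of usage, but it also preserves the partition table and boot sector exactly. A hedged sketch of both, with device, mount point, and passphrase handling illustrative:)

        # file-level: archive the mounted filesystem, encrypting on the fly
        tar -cpf - -C /mnt/raid . | gpg --symmetric --cipher-algo AES256 -o /mnt/offsite/raid-files.tar.gpg

        # block-level: image one member of the RAID1 mirror, encrypting on the fly
        dd if=/dev/sda bs=1M | gpg --symmetric --cipher-algo AES256 -o /mnt/offsite/raid-image.img.gpg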

    Read the article
