Search Results

Search found 5810 results on 233 pages for 'offsite backup'.

Page 24/233

  • Centralized backup method recommendation for SMEs with various OSes

    - by Akinator
    Hi, I was wondering what, in your opinion, is the "best" method for having "everything" backed up in the following situation. We are an SME with 10 computers in total: three of them are Macs, and the rest run Windows (1 Vista, 4 Win7 and 2 XPs). I'm very open on what the method should be, but please also consider the following:
      - Very limited resources.
      - Quite "small" bandwidth: 4 MB for everyone (download) and 0.4 MB (upload; yep, that's it), though this might get a little better.
    One of the main things to back up would be the mail:
      - All Windows computers use Outlook, mainly 2003.
      - One Mac uses Outlook too (for Mac, of course; not 2011 yet).
    We also have to back up the files:
      - Not a huge amount.
      - Very few very big files.
      - Very organized (by machine).
    What I would like is to hear your opinions on which method (or combination of methods, preferably just one) would be best. We are not sure what we need and I'm open to suggestions; an online (cloud-based) application would be great, but remember that the bandwidth is unbearable. The last thing to consider is that we would like to do weekly backups (unless the method is very easy, of course). Thanks in advance! I tried to be as specific as possible, but if anything is missing I'll gladly update; please ask for any clarification needed. Please avoid answers like "upgrade everything to Windows 7 and throw away your Macs" :) Ours may not be an ideal situation, but it is what it is, and right now it would be impossible for us to change it for a lot of reasons.

    Read the article

  • How should I configure backup of my server?

    - by ed209
    I have just rented a dedicated server. If it helps, this is the configuration I have:
      - CPU: Intel(R) Core(TM) i7-2600 @ 3.40GHz (8 cores)
      - RAM: 15975 MB
      - Disk /dev/sda: 120.0 GB (114 GiB), no valid partition table yet
      - Disk /dev/sdb: 3000.6 GB (2861 GiB), no valid partition table yet
      - Disk /dev/sdc: 3000.6 GB (2861 GiB), no valid partition table yet
    /dev/sda is a 120 GB SSD. This is where I have Ubuntu/LAMP installed; it's the drive that will run my site. The account also came with the two other 3000 GB drives, which I really don't need, but I figured I could use them to back up my main 120 GB drive. So a couple of things I wondered were:
      - Should I use these drives for backups?
      - How should I back up?
    The data I want to back up is a user uploads directory full of images, plus the database. Everything else is either in a code repo or backed up some other way. For example, it would be nice to know there is a disk image of the 120 GB drive somewhere that I can copy over should there be any problems, but equally I don't mind doing a fresh install of all the software and copying over just the images and a database dump. Thanks for your advice! (Also, I'm happy not to use the two other drives and back up elsewhere if that's more sensible.)
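    One hedged sketch of the kind of nightly job this could be, assuming one of the spare 3 TB drives is formatted and mounted at /mnt/backup; the uploads path and database name are placeholders:

      #!/bin/bash
      # Nightly backup sketch: copy the uploads directory and a database dump to the spare drive.
      set -euo pipefail

      STAMP=$(date +%F)
      DEST=/mnt/backup/$STAMP                  # assumes a spare drive is mounted at /mnt/backup
      UPLOADS=/var/www/example.com/uploads     # hypothetical uploads directory

      mkdir -p "$DEST"

      # Copy the uploads into today's folder; adding --link-dest pointing at yesterday's
      # folder would hard-link unchanged files instead of re-copying them (omitted for brevity).
      rsync -a "$UPLOADS/" "$DEST/uploads/"

      # Dump the database; keep credentials in ~/.my.cnf rather than on the command line.
      mysqldump --single-transaction example_db | gzip > "$DEST/example_db.sql.gz"

      # Keep roughly two weeks of daily folders.
      find /mnt/backup -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} +

    A full image of the 120 GB SSD is also possible (for example with dd from a rescue environment), but for the stated needs the files-plus-dump approach above is usually enough.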

    Read the article

  • How Windows 8's Backup System Differs From Windows 7's

    - by Chris Hoffman
    Windows 8 contains a completely revamped backup system. Windows 8's File History replaces Windows 7's Windows Backup; if you use Windows Backup and upgrade to Windows 8, you'll find quite a few differences. Microsoft redesigned Windows' backup features because less than 5% of PCs used Windows Backup. The new File History system is designed to be simple to set up and to work automatically in the background. This post focuses on the differences between File History and the Windows Backup feature you may be familiar with from Windows 7; check out our full walkthrough of File History for more information.

    Read the article

  • Auto backup mysql database to dropbox [closed]

    - by Rob
    Is it possible to automatically back up my database to Dropbox? If so, how can I do it? The key criteria are that it should:
      - Be automatic.
      - Be Mac compatible.
      - Run weekly.
      - Sync with Dropbox (http://www.dropbox.com) automatically.
      - Be able to back up several databases from several websites.
      - Be free... or relatively cheap!
      - Come with a guide on how to set up the solution.
    UPDATE: I've managed to set up an automatic weekly backup using a cron job:
      mysqldump -u username -pMyPassword Mydatabase > backup-file.sql
    That saves the backup to my hosting space. It's a start but isn't ideal; how can I save that backup to a folder on my computer? Automatically, of course.
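    A minimal sketch of one way to close that last gap, assuming the Mac runs the Dropbox client and the hosting account is reachable over SSH; host names and paths are placeholders:

      # On the hosting account (crontab -e): dump the database every Sunday at 03:00.
      # Note that % must be escaped as \% inside a crontab line.
      0 3 * * 0 mysqldump -u username -pMyPassword Mydatabase > "$HOME/backups/Mydatabase-$(date +\%F).sql"

      # On the Mac (crontab -e): an hour later, pull the dumps into the local Dropbox
      # folder; the Dropbox client then syncs them to the cloud automatically.
      0 4 * * 0 rsync -az username@example-host.com:backups/ "$HOME/Dropbox/db-backups/"

    Several databases just means several mysqldump lines (or a small loop over a list of database names), and putting the password in ~/.my.cnf instead of the crontab keeps it out of the crontab file and the process list.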

    Read the article

  • How to Eliminate Tape Backup and Off-site Storage Service?

    - by Daniel Lucas
    PLEASE READ THE UPDATES AT THE BOTTOM. THANKS! ;)
    Environment info (all Windows):
      - 2 sites
      - 30 servers at site #1 (3 TB of backup data)
      - 5 servers at site #2 (1 TB of backup data)
      - MPLS backbone tunnel connecting site #1 and site #2
    Current backup process:
      - Online backup (disk-to-disk): Site #1 has a server running Symantec Backup Exec 12.5 with four 1 TB USB 2.0 disks. BE jobs for full backups run nightly on all servers in site #1 to these disks. Site #2 backs up to a central file server there using software they already had when we purchased them. A BE job pulls that data nightly to site #1 and stores it on said disks.
      - Off-site backup (tape): Connected to our backup server is a tape drive. BE backs up the external disks to tape once a week, which gets picked up by our off-site storage company. Obviously we rotate two tape libraries; one is always here and one is always there.
    Requirements:
      - Eliminate the need for tape and the off-site storage service by doing disk-to-disk at each site and replicating site #1 to site #2 and vice versa.
      - A software-based solution, as hardware options have been too pricey (e.g. SonicWall, Arkeia).
      - Agents for Exchange, SharePoint, and SQL.
    Some ideas so far:
      - Storage: a DroboPro at each site with an initial 8 TB of storage (expandable up to 16 TB at present). I like these because they are rackmountable, allow disparate drives, and have iSCSI interfaces. They are relatively cheap too.
      - Software: Symantec Backup Exec 12.5 already has all the agents and licenses we need. I'd like to keep using it unless there is a better solution, similarly priced, that does everything BE does plus deduplication and replication.
      - Server: because there is no more need for a SCSI adapter (for the tape drive), we are going to virtualize our backup server, as it is currently the only physical machine save for the SQL boxes.
    Problems: when replicating between sites we want as little data as possible to go across the pipe. There is no deduplication or compression in what I have laid out so far. The files being replicated are BE's virtual tape libraries from our disk-to-disk backup, so each of those huge files will go across the wire every week because they change every day.
    And finally, the question: is there any software out there that does deduplication, or at least compression, to handle just our site-to-site replication? Or, looking at our setup, is there any other solution that I am missing that might be cheaper, faster, better? Thanks. Sorry so long.
    UPDATE: I've set a bounty on this question to get it more attention. I'm looking for software that will handle replication of data between two sites using the least amount of data possible (either compression, deduplication, or some other method). Something similar to rsync would work, but it needs to be native to Windows and not a port involving shenanigans to get up and running. I prefer a GUI-based product and I don't mind shelling out a few bones if it works. Please, answers that meet the above criteria only. If you don't think one exists or if you think I'm being too restrictive, keep it to yourself. If after seven days there is no answer at all, so be it.
    UPDATE 2: I really appreciate everyone coming forward with suggestions. There is no way for me to try all of these before the bounty expires. For now I'm going to let the bounty run out and whoever has the most votes will get the 100 rep points. Thanks again!

    Read the article

  • How to access previous VHD versions of system backup?

    - by feklee
    Quote from the 31 Oct 2009 TechNet article "Learn more about system image backup":
      "During the first backup, the backup engine scans the source drive and copies only blocks that contain data into a .vhd file stored on the target, creating a compact view of the source drive. The next time a system image is created, only new and changed data is written to the .vhd file, and old data on the same block is moved out of the VHD and into the shadow copy storage area. Volume Shadow Copy Service is used to compute the changed data between backups, as well as to handle the process of moving the old data out to the shadow copy area on the target. This approach makes the backup fast (since only changed blocks are backed up) and efficient (since data is stored in a compact manner). When restoring the image, blocks will be restored to their original locations on the source disk. If you want to restore from an older backup, the engine reads from the shadow copy area and restores the appropriate blocks."
    For the last few days, a daily system backup of drive C: to drive E: has been scheduled and run by Windows 7 Backup and Restore. Drive C: currently holds 233 GB of data, which fits comfortably on drive E:, a 1 TB drive with 727 GB of free space remaining. How do I access the previous versions of the VHD? I right-clicked on files and folders in E:\WindowsImageBackup and looked for Previous Versions, but it always says: "There are no previous versions available."

    Read the article

  • What's the best Linux backup solution?

    - by Jon Bright
    We have four Linux boxes (all running Debian or Ubuntu) on our office network. None of these boxes is especially critical and they're all using RAID. To date, I've therefore been backing them up by having a cron job upload tarballs containing the contents of /etc, MySQL dumps, and other such changing, non-packaged data to a box at our geographically separate hosting centre. I've realised, however, that:
      - the tarballs are sufficient to rebuild from, but it's certainly not a painless process to do so (I recently tried this out as part of a hardware upgrade of one of the boxes); long-term, the process isn't sustainable;
      - each of the boxes is currently producing a tarball of a couple of hundred MB each day, 99% of which is the same as the previous day;
      - partly due to the size issue, the backup process requires more manual intervention than I want (to find whatever 5 GB file is inflating the size of the tarball and kill it);
      - again due to the size issue, I'm leaving out stuff which it would be nice to include, such as the contents of users' home directories. There's almost nothing of value there that isn't in source control (and these aren't our main dev boxes), but it would be nice to keep them anyway;
      - there must be a better way.
    So, my question is: how should I be doing this properly? The requirements are:
      - needs to be an offsite backup (one of the main things I'm doing here is protecting against fire/whatever);
      - should require as little manual intervention as possible (I'm lazy, and box-herding isn't my main job);
      - should continue to scale with a couple more boxes, slightly more data, etc.;
      - preferably free/open source (cost isn't the issue, but especially for backups, openness seems like a good thing);
      - an option to produce some kind of DVD/Blu-ray/whatever backup from time to time wouldn't be bad.
    My first thought was that this kind of incremental backup was what tar was created for: create a tar file once a month, add to it incrementally, and rsync the results to the remote box. But others probably have better suggestions.
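    As one concrete sketch of that "better way", here is a hard-link-based incremental rsync run from each box, pushing to the offsite host; host names and the list of paths are placeholders:

      #!/bin/bash
      # Daily incremental backup sketch using rsync --link-dest: files unchanged since
      # yesterday are hard-linked on the receiver, so each day only costs the changed data.
      set -euo pipefail

      REMOTE=backup@offsite.example.com        # hypothetical offsite host
      BASE=/srv/backups/$(hostname)
      TODAY=$(date +%F)

      ssh "$REMOTE" "mkdir -p $BASE/$TODAY"

      rsync -a --delete \
            --link-dest="$BASE/latest" \
            /etc /home /root/db-dumps \
            "$REMOTE:$BASE/$TODAY/"

      # Repoint 'latest' at the snapshot we just made so tomorrow's run links against it.
      ssh "$REMOTE" "ln -sfn $BASE/$TODAY $BASE/latest"

    Tools such as rsnapshot, rdiff-backup, and duplicity wrap essentially this pattern and add retention handling, which may save you from maintaining the script yourself.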

    Read the article

  • SEO Implications of blog on site versus offsite?

    - by Kelli Davis
    I recently added a blog to one of our company's websites and was confident that this increase in on-site content would have only positive SEO results. My boss, however, feels that we should instead have located the blog off-site, on a blogging platform such as WordPress, Typepad, etc., in order to generate a backlink (assuming we'd link from the blogging platform back to the website). While I know that backlinks are important for SEO, isn't content creation equally, if not more, important? Granted, I'd be creating content either way, but I figured we'd get more site traffic by having the blog on our site rather than on a separate blogging platform. Am I incorrect in my priorities here? My boss's top priority is increasing the ranking of our website, so maybe a backlink would be better...? If we do need to relocate the blog to an off-site platform, is there a blogging platform that is more conducive to SEO than others? Is there a platform from which backlinks would be more valuable than from others?

    Read the article

  • What is the Benefit of Offsite SEO

    Off-site SEO is a massive field, which is extremely complex and made up of lots of different factors. What I will do here is touch on what I consider to be the most important areas and the impact they can potentially have on your website's ranking within search engines. The primary and most important item, in my experience, is links.

    Read the article

  • What Constitutes Offsite Web Optimization?

    Off-page optimization is about getting links to your pages. There are many ways to get them, but the most valuable links come from websites whose webmasters naturally link to your pages without any intervention.

    Read the article

  • The Importance of Onsite and Offsite SEO

    The Internet has become one of the major sources of information on almost any topic imaginable. More importantly, it provides people and companies with a venue to introduce themselves, their products, and their expertise, such as multimedia or web design, to a larger audience unbounded by place or time.

    Read the article

  • How do I change the default backup folder in Win7?

    - by mafutrct
    I'm using the built-in Windows Backup. The default destination folder (backup location\computer name\Backup Set year-month-day time) is unsuitable for me because I already have some other files in that location and I'd prefer to keep them there. I found no way to change it. Is it possible to have a backup on the same partition but in a different folder, or do I have to store backups on a different disk?

    Read the article

  • How to automatically back up TFS 2010

    - by Julien Ferraro
    Hello, I'm evaluating Team Foundation Server 2010. I would like to know if there is a command line to back up my TFS data. I currently have a folder that is sent to the cloud; this backup contains all the data I need to back up (MySQL databases, Word documents, and so on). What I want is a way to automatically back up my TFS collections (and any other important TFS data) into one (or more) files in this directory. A command line would be perfect. Many thanks, Julien

    Read the article

  • Can I use "Online Backup" to backup my DVS instead of pushing to an external repo?

    - by Matt Brailsford
    Hi guys, I'm currently signed up with a third-party service that hosts my Mercurial repositories as a central hub to push my changes to, as a sort of backup. Now I'm looking at a system to back up my laptop and am considering Mozy. I'm a lone developer working on a laptop, usually connected to the internet via wifi, and the laptop is only really on when I'm working, so something like Mozy feels like my best option. My question is: if I'm the only developer, could I get away with just using local Mercurial repos and using Mozy to back everything up, rather than pushing to an external repo? Many thanks, Matt
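    If you do stay with local-only repositories, one hedged sketch (assuming a standard Mercurial install) is to bundle each repo into a folder that Mozy already backs up, so the backup always contains a consistent snapshot rather than an .hg directory caught mid-write; the paths are placeholders:

      #!/bin/bash
      # Bundle every repository under ~/code into a folder included in the Mozy backup set.
      set -euo pipefail

      OUT="$HOME/MozyBackups/hg"
      mkdir -p "$OUT"

      for repo in "$HOME"/code/*/.hg; do
          dir=$(dirname "$repo")
          name=$(basename "$dir")
          # 'hg bundle --all' writes every changeset into one self-contained file that
          # can be restored later with 'hg init' followed by 'hg unbundle'.
          hg --repository "$dir" bundle --all "$OUT/$name.hg"
      done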

    Read the article

  • online backup solution with api for desktop

    - by user161179
    I made a small backup application that simply creates an archive out of specified files and folders. Now I need an online service to back that up. Which service can I use that can be integrated into my app? Options I've considered:
      - Dropbox would be ideal, but they have all but abandoned the desktop.
      - SkyDrive has no API.
      - I couldn't find any free, reliable backup service that uses FTP.
    Anything else? It should provide 1-2 GB of free space and be reasonably reliable. Thanks. My app is in C#, but it can be ported to any other language as well.

    Read the article

  • Secure way to backup MySQL databases?

    - by user203538
    What is the best/most secure way to back up a MySQL database on Windows Server 2008? I have MySQL Administrator, but it requires that you save passwords for a backup project, which I'm not keen on doing, as anyone gaining access to the server would then have easy access to the database. Can you do anything similar to SQL Server, like using Windows authentication? If not, what is the most secure (and practical) way to do backups? Lastly, what privileges are needed to back up a database? I have created a single user just for this task. Please advise.
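    One hedged approach, given that MySQL has no built-in equivalent of SQL Server's Windows authentication: create a dedicated account with only the privileges mysqldump needs (SELECT and LOCK TABLES on the databases being dumped, plus SHOW VIEW, TRIGGER, or EVENT if the schema uses those objects), and keep its credentials in an option file that NTFS permissions restrict to the account running the backup task. File names and paths here are placeholders:

      # C:\backup\backup-my.cnf -- readable only by the backup task's Windows account:
      #   [mysqldump]
      #   user=backup
      #   password=YourSecretHere

      # Scheduled-task command; no password appears on the command line or in the job
      # definition. --defaults-extra-file must be the first option given.
      mysqldump --defaults-extra-file=C:\backup\backup-my.cnf --single-transaction mydb > C:\backup\mydb-backup.sql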

    Read the article

  • Database Backup

    - by Jungle_hacker
    Scenario: I want to take backups from 7 client databases into 1 server database. I don't know the structure of the databases (either the server or the client DBs), and both sides already contain old data. Now I have to build a tool that takes the backup, and it should also be possible to back up old data (if any updates were done on old data). Please help me find a solution for this:
      1. How can I proceed with the problem?
      2. The database is not specified; it may be MS Access or SQL Server 2005.
      3. What should I implement this in? (I am thinking of doing it in C#.)

    Read the article

  • What is safe to exclude for a full system backup?

    - by seb
    Hi, I'm looking for a list of paths/files that are safe to exclude from a full system/home backup, given that I have a list of installed packages.
      /home/*/.thumbnails
      /home/*/.cache
      /home/*/.mozilla/firefox/*.default/Cache
      /home/*/.mozilla/firefox/*.default/OfflineCache
      /home/*/.local/share/Trash
      /home/*/.gvfs/
      /tmp/
      /var/tmp/
    Not real folders, but they can cause severe problems when restoring:
      /dev
      /proc
      /sys
    What about /var/ in general?
      /var/backups/ - can get quite large
      /var/log/ - does not require much space and can help for later comparison
      /lost+found/
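    For reference, a hedged sketch of how an exclude list like this plugs into an rsync-based full backup (the destination path is a placeholder):

      # /etc/backup-excludes.txt -- one pattern per line, for example:
      #   /dev/*
      #   /proc/*
      #   /sys/*
      #   /tmp/*
      #   /var/tmp/*
      #   /home/*/.cache
      #   /home/*/.thumbnails
      #   /home/*/.local/share/Trash
      #   /home/*/.gvfs

      # -aAX keeps permissions, ACLs and extended attributes; run as root.
      rsync -aAX --exclude-from=/etc/backup-excludes.txt / /mnt/backup/rootfs/

    /var/log is usually worth keeping, while /var/cache/apt/archives and /lost+found are common additional exclusions, since their contents can be regenerated or are not meaningful to restore.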

    Read the article

  • I am one step closer to installing Ubuntu, Back is grayed out, and I was stupid enough not to make a backup

    - by user283544
    OK, so yeah, I've been stupid enough not to make a backup and I'm one step closer to installing Ubuntu. What I need to know is whether Windows is totally safe and not going to be touched. I made a new partition from Windows, and in the Ubuntu installer I chose "Something else". Then I chose that partition as ext4 with mount point "/". The device for the bootloader is my hard disk, /dev/sda, which is what worries me a little. So I need a quick answer: is my Windows 7 going to be touched?

    Read the article

  • Automatically Snapshotting AWS instances (or other backup strategy)

    - by user1172468
    I just realized that my AWS instance count has risen into the double digits. I'm currently backing up portions of my folders and DBs and moving them off to a backup instance. What I think I should be doing is taking snapshots of the instances (automatically) and persisting them on S3, so I have a rolling 7-day collection of daily backups. There is a question asking the same thing here, but the answers don't go into depth. The closest answer seems to be: use a cron job to snapshot the instance. So do I run the cron job on the instance itself, or do I run a micro instance just for these snapshots? Could I get an example script or the commands for, say, a Linux flavor? What software must I have installed to get this to run? Thanks.
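    A hedged sketch of the cron-driven approach, assuming the aws command-line tools are installed and credentialed on whichever box runs the job (a small dedicated admin instance or any existing instance both work); the volume IDs and retention window are placeholders. Note that EBS snapshots are stored in S3 by AWS itself, so nothing extra is needed for that part:

      #!/bin/bash
      # Daily EBS snapshot sketch: snapshot each listed volume, then delete this job's
      # own snapshots that are older than 7 days.
      set -euo pipefail

      VOLUMES="vol-0123456789abcdef0 vol-0fedcba9876543210"   # placeholder volume IDs
      TAG="cron-backup"
      CUTOFF=$(date -d '7 days ago' +%F)

      for vol in $VOLUMES; do
          aws ec2 create-snapshot --volume-id "$vol" \
              --description "$TAG $vol $(date +%F)" > /dev/null
      done

      # Prune: list this job's snapshots and delete any that started before the cutoff.
      aws ec2 describe-snapshots --owner-ids self \
          --filters "Name=description,Values=${TAG}*" \
          --query 'Snapshots[].[SnapshotId,StartTime]' --output text |
      while read -r id start; do
          if [[ "${start%%T*}" < "$CUTOFF" ]]; then
              aws ec2 delete-snapshot --snapshot-id "$id"
          fi
      done

    If you want whole-machine images rather than per-volume snapshots, aws ec2 create-image does the equivalent at the AMI level; either way the job can run from cron on any instance whose IAM role allows the snapshot calls.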

    Read the article

  • How to Approach CentOS Back Up on GoDaddy Dedicated Hosting

    - by Scott
    Does anyone have any experience with backing up a dedicated server at GoDaddy or anywhere else? I have a CentOS system. I recently made a big newbie mistake working on Linux and toasted my server; I had to start over from scratch because I damaged it so badly. GoDaddy says I need to handle it all myself because I am not paying them for backups. Does anyone have any idea how to approach this backup? I'm not sure how a backup on dedicated hosting would be different from a normal Linux backup. In any case, I don't know how to do normal Linux backups either.

    Read the article

  • RAID-1 and regular drive removal (using RAID-1 as a backup measure)

    - by Vi
    Is using mdadm RAID-1 across 2 partitions (one on the laptop's internal HDD, one on an external HDD) a good idea? I want the system to work as RAID-1 if both drives are present, work as a regular volume (degraded RAID-1) if the external HDD is unplugged, and resync quickly when I plug the external HDD in again. Questions:
      1. Is it a good idea?
      2. Will a write-intent bitmap be enough for this task, or do I need something else?
      3. Should I consider doing it at the filesystem level instead? If yes, how?
    Basic requirements:
      - Quick resync when I re-add the external drive (provided I haven't changed that partition).
      - More or less consistent data on the removed drive if I remove it outside of a write/resync operation. If I remove the drive during a resync I expect the data to be somewhat inconsistent, but I expect the resync to complete quickly when I re-add it again.
    In other words, I want the remaining drive to track what has changed (there can be a lot of changes) and to sync back only those parts that need it.
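    A hedged sketch of the mdadm side of this, with device names as placeholders; the internal write-intent bitmap is exactly what turns the re-add into a quick catch-up instead of a full resync:

      # One-time setup: a mirror of an internal and an external partition, with an
      # internal write-intent bitmap so resyncs only touch regions marked dirty.
      mdadm --create /dev/md0 --level=1 --raid-devices=2 --bitmap=internal /dev/sda3 /dev/sdb1

      # (If the array already exists without a bitmap, one can be added later:)
      mdadm --grow /dev/md0 --bitmap=internal

      # Before unplugging the external disk, take it out of the array cleanly:
      mdadm /dev/md0 --fail /dev/sdb1 --remove /dev/sdb1

      # After plugging it back in, re-add it; with the bitmap present, only the blocks
      # written while the disk was away are copied:
      mdadm /dev/md0 --re-add /dev/sdb1

      # Watch the catch-up:
      cat /proc/mdstat

    If the disk is simply yanked instead, the kernel marks it faulty by itself and --re-add should still give the quick bitmap-based resync, as long as the removed disk hasn't been written to elsewhere in the meantime.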

    Read the article

  • System backup with Norton Ghost

    - by Mehper C. Palavuzlar
    I will upgrade my system from Windows Vista Home Premium (x64) to Windows 7 (x64). Before starting the upgrade process, I want to back up my current system with Norton Ghost. I have never used it before, so I need assistance. At the moment, Vista occupies 139 GB of used space, and I have a 1 TB external HD connected via USB. If you can give me step-by-step instructions on how to back up, and how to restore if the upgrade somehow fails, I'd appreciate it. Thanks.

    Read the article

  • Local area network computer to computer backup software

    - by thegreyspot
    Hi! I would like to back up the other computers on my local home network to my computer. What software would you suggest? I liked CrashPlan, but its confusing UI and its failure to sync turned me away, though it has the right idea. A bonus would be if the software did not compile everything into one file, but it's OK if it does. Also, I would like it to be free.

    Read the article
