Search Results

Search found 1556 results on 63 pages for 'backups'.

  • Reducing storage cost by moving old files to external USB HDDs. Your thoughts?

    - by cparker4486
    I've got about 300 GB of pictures and marketing data that is rarely accessed, and I'd like to get it off my main storage. I was thinking of simply adding two external USB HDDs to the server and moving all the files to one of the drives. The second drive would be the backup destination for the first. I'm working with Server 2003 R2 SP2. This would free a good amount of space on my main storage as well as reduce the complexity, backup window, and tape usage of my backups.

    Read the article

  • How do I backup a git repo?

    - by acidzombie24
    I am planning to switch from SVN to git. With SVN I just copy my repo folder when I want to back it up. However, git doesn't have a central repository folder in the same way, so what do I do? Should I create a clone on a separate drive and update it by pulling from my project? Then I could burn/archive this folder and it would have all the history? This is probably obvious, but I want to be sure when it comes to backups. I still pretend there is a root repository.
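
    That clone-on-a-separate-drive approach works; a minimal sketch, assuming the working repository lives in ~/project and the backup drive is mounted at /mnt/backup (both paths are made up):

      # create a bare mirror clone once; it carries every branch, tag, and the full history
      git clone --mirror ~/project /mnt/backup/project.git

      # refresh the mirror before each backup run, then archive it for burning
      (cd /mnt/backup/project.git && git remote update)
      tar czf /mnt/backup/project-$(date +%Y%m%d).tar.gz -C /mnt/backup project.git

      # or produce a single self-contained file per backup that git can clone from later
      (cd ~/project && git bundle create /mnt/backup/project-$(date +%Y%m%d).bundle --all)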

    Read the article

  • WHM Backup recommended?

    - by user77284
    I have a VPS (CentOS) with WHM, about 25 GB, with about 20 accounts on it. I am looking to back it up effectively. My thoughts: back it up with WHM Backup locally, then use rsync to mirror it to another server. My questions: Is WHM Backup a good solution? How can I keep several backups while using a minimal amount of space? Is there a different solution I should consider? I am not an expert, so I want something simple that works with minimal maintenance. Thanks.
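
    If you do go the rsync route, --link-dest is one way to keep several dated copies while paying for changed files only, since unchanged files are hard-linked to the previous snapshot. A rough sketch, run from the second server (host name and paths are invented):

      #!/bin/bash
      # pull the VPS's local WHM backup directory into a dated snapshot,
      # hard-linking anything unchanged against the most recent snapshot
      SRC="root@myvps.example.com:/backup/"
      DEST="/srv/whm-backups"
      TODAY=$(date +%Y-%m-%d)
      LAST=$(ls -1d "$DEST"/20* 2>/dev/null | tail -n 1)

      if [ -n "$LAST" ]; then
          OPTS="--link-dest=$LAST"
      else
          OPTS=""
      fi

      rsync -a --delete $OPTS "$SRC" "$DEST/$TODAY/"

      # keep only the 14 newest snapshots
      ls -1d "$DEST"/20* | head -n -14 | xargs -r rm -rf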

    Read the article

  • Who deleted my files?

    - by akalter
    I have some Linux servers. On two of them we have MySQL, with a daily backup on both machines, but the scripts are different. I looked at both scripts. In one of them I saw the "delete older files" step, but on the other machine old files are also deleted, and not by the script. I am trying to find out what deletes my files, because I want to use the same script on both machines: the script with the deletion also copies the files to another server, and I want to do that on both servers. Does anyone have an idea what is deleting my older backups? Thank you!

    Read the article

  • Using the Dropbox API instead of an FTP server.

    - by Somebody still uses you MS-DOS
    This is a small application scenario. Usually, when you have to back up source code or a database on your server, you use a second FTP server and a cron job to tar.gz your DB dumps and source files and send the archive from your application server to the FTP server. Dropbox has created an API to use its infrastructure. Since they provide 2 GB for free accounts, I thought about uploading to it instead of an FTP server. So, if you do some freelance work, you can create a free account for each client and use this approach, perhaps encrypting the files you send. You even get a revision for each uploaded file for the last 30 days, like a revision control system, for free. What do you think of this approach? Is it possible? And, more importantly: what are the security risks involved? (That's why I'm asking this on Server Fault, since the point of view of sysadmins will be more accurate.) Thanks!

    Read the article

  • Why hasn't anyone made a way for Time Machine to back up wirelessly to Amazon S3?

    - by Jordan
    Seriously. I'm looking at you, Apple. If Time Machine is supposed to be 'simple backup that just works', why is it impossible to back up into my S3 filespace? Why hasn't some 3rd-party developer (JungleDrive???) made it so that Time Machine is happy backing up to Amazon S3 storage? It just seems like the most convenient, robust answer. I'd gladly pay $20-25 a month for complete, unlosable backups that I can sync to wirelessly on a proper schedule.

    Read the article

  • ESXi disaster recovery plan

    - by Marlin
    I have a VMware infrastructure where I am using the free version of ESXi 5. As a result I cannot use vMotion and the other features that come with a paid ESXi license. I am using snapshots for backups, but they are stored on the local hard drive. I need a better backup scenario where I can recover in the event of a hard drive failure. I tried Openfiler but could not get it right. What backup method can I try given my situation?
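
    One approach that works on the free license is copying the VM disks from the ESXi shell to a second datastore (NFS or another local disk), which is essentially what scripts such as ghettoVCB automate. A rough sketch, assuming SSH access to the host and a datastore named backup_nfs (the VM ID, names and paths are made up):

      vim-cmd vmsvc/getallvms                              # find the VM's ID
      vim-cmd vmsvc/snapshot.create 12 backup "" 0 0       # temporary snapshot so the base disk is stable
      mkdir -p /vmfs/volumes/backup_nfs/myvm
      vmkfstools -i /vmfs/volumes/datastore1/myvm/myvm.vmdk \
                 -d thin /vmfs/volumes/backup_nfs/myvm/myvm.vmdk
      cp /vmfs/volumes/datastore1/myvm/myvm.vmx /vmfs/volumes/backup_nfs/myvm/
      vim-cmd vmsvc/snapshot.removeall 12                  # consolidate and drop the snapshot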

    Read the article

  • Backup all plesk MySQL Databases to individual files

    - by Michael
    Hi, because I'm new to shell scripting I need a hand. I currently back up all my databases to a single file, which makes restores pretty hard. The second problem is that my MySQL password doesn't work because of a Plesk bug, so I get the password from "/etc/psa/.psa.shadow". Here is the code I use to back up all my databases to a single file:

      mysqldump -uadmin -p`cat /etc/psa/.psa.shadow` --all-databases | bzip2 -c > /root/21.10.2013.sql.bz2

    I found some scripts on the web that back up each database to an individual file, but I don't know how to make them work for my situation. Here is an example script:

      for db in $(mysql -e 'show databases' -s --skip-column-names); do
          mysqldump $db | gzip > "/backups/mysqldump-$(hostname)-$db-$(date +%Y-%m-%d-%H.%M.%S).gz";
      done

    Can someone help me make the script above work for my situation? Requirements: back up each database to an individual file using the Plesk password location.
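
    A hedged sketch that combines the two, feeding the Plesk admin password from /etc/psa/.psa.shadow to both the mysql and mysqldump calls (the output directory is made up; try it against a test database first):

      #!/bin/bash
      # dump every database, except MySQL's internal schemas, to its own compressed file
      PASS=$(cat /etc/psa/.psa.shadow)
      OUTDIR=/root/db-backups
      mkdir -p "$OUTDIR"

      for db in $(mysql -uadmin -p"$PASS" -e 'show databases' -s --skip-column-names); do
          case "$db" in
              information_schema|performance_schema) continue ;;
          esac
          mysqldump -uadmin -p"$PASS" "$db" | \
              bzip2 -c > "$OUTDIR/$db-$(date +%Y-%m-%d).sql.bz2"
      done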

    Read the article

  • How do I restore a SQL Server database from last night's full backup and the active transaction log file?

    - by Dylan Beattie
    I have been told that it's good practice to keep your SQL Server data files and log files on physically separate disks, because it'll allow you to recover your data to the point of failure if the data drive fails. So... let's say that mydata.mdf is on drive D:, mydata_log.ldf is on drive E:, it's 16:45, and drive D: has just died horribly. So - I have last night's full backup (mydata.bak). I have hourly transaction-log backups that will bring the data back up to 16:00... but that means I'll lose 45 minutes' worth of updates. I still have mydata_log.ldf on the E: drive, which should contain EVERY transaction that was committed right up to the point where the drive failed. How do I go about recreating the database and restoring data from the backup file and the live transaction log, so I don't lose any updates? Is this possible?
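
    The usual sequence is: back up the tail of the log from the surviving .ldf, restore the full backup and each log backup with NORECOVERY, then restore the tail with RECOVERY. A rough sketch using sqlcmd (the instance name and paths are invented, and since D: is dead the full restore will likely also need MOVE clauses to relocate the data file):

      # 1. tail-log backup from the surviving E: drive (NO_TRUNCATE because the data file is gone)
      sqlcmd -S myserver -E -Q "BACKUP LOG mydata TO DISK = N'E:\tail\mydata_tail.trn' WITH NO_TRUNCATE"

      # 2. restore last night's full backup without recovering
      #    (add e.g. MOVE 'mydata' TO N'E:\data\mydata.mdf' if D: is still unavailable)
      sqlcmd -S myserver -E -Q "RESTORE DATABASE mydata FROM DISK = N'E:\dump\mydata.bak' WITH NORECOVERY, REPLACE"

      # 3. restore each hourly log backup in order, still with NORECOVERY
      sqlcmd -S myserver -E -Q "RESTORE LOG mydata FROM DISK = N'E:\dump\mydata_1600.trn' WITH NORECOVERY"

      # 4. finish with the tail-log backup and bring the database online
      sqlcmd -S myserver -E -Q "RESTORE LOG mydata FROM DISK = N'E:\tail\mydata_tail.trn' WITH RECOVERY"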

    Read the article

  • How to backup data from linux servers to linux server (incremental + snapshot)?

    - by wag2639
    We have a handful of hosted servers running RHEL4 and RHEL5 and would like to backup some key folders (I'm thinking /var /srv and /etc) to a local server we have in house. The local server is running Ubuntu 9.10 Server edition. I'm looking for a free (preferably OSS) way to grab (or push) incremental backups to my local server and once a month or so, make a new snapshot for incremental updates in between snapshots. Also, while I'm comfortable with using a command line, others may need to use the system in the future, and I would like some kind of graphical or web interface to browse the backup repository. Suggestions?
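
    One free tool that fits this pattern (pull-based, hard-link incremental snapshots that each look like a full copy, easy to expose through a read-only web server or file share for browsing) is rsnapshot on the Ubuntu box. A rough sketch of the relevant /etc/rsnapshot.conf lines plus cron entries (host names are invented; rsnapshot requires literal tabs between fields):

      # /etc/rsnapshot.conf (excerpt)
      snapshot_root   /backups/rsnapshot/
      cmd_ssh         /usr/bin/ssh
      interval        daily   7
      interval        weekly  4
      interval        monthly 6
      backup          root@rhel4-host1.example.com:/etc/   rhel4-host1/
      backup          root@rhel4-host1.example.com:/var/   rhel4-host1/
      backup          root@rhel5-host2.example.com:/srv/   rhel5-host2/

      # /etc/cron.d/rsnapshot (times are arbitrary)
      30 2 * * *   root  /usr/bin/rsnapshot daily
      0  4 * * 1   root  /usr/bin/rsnapshot weekly
      30 5 1 * *   root  /usr/bin/rsnapshot monthly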

    Read the article

  • who deleted my files?

    - by akalter
    I have some Linux servers. On two of our servers we have MySQL. We have a daily backup on both machines, but the scripts are different. I saw both scripts. In one of them I saw the "delete older files" algorithm, but on the other this is happening and not from the script. I am trying to discover what deletes my files, because I want to use the same script on both machines: in the script with the deletion I also copy the files to another server, and I want to do that on both servers. Does anyone have an idea what deleted my older backups? Thank you!

    Read the article

  • Backup software for Ubuntu - which one?

    - by Industrial
    Hi everybody, I have spent some time testing different backup solutions for my small home office during the last few weeks, but still haven't found anything that works out well. We can definitely work with a non-GUI script if that's what it takes, as long as the requirements are fulfilled: Upload to Amazon S3 Europe. We get unbelievably slow upload speeds to the US, so uploading 400+ GB of data will not be happening anytime this year... Incremental backups - only changed files should be uploaded, or we will have a big bill from Amazon at the end of each month. Files should not be uploaded as one big per-folder archive. That is not efficient at all, since if we change one file in a subfolder, a huge two-digit-GB file would have to be uploaded during the next backup - bad again for cost and for the traffic overhead on our internet connection. What options are available to us? Thanks!
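
    One candidate that covers all three points is duplicity, which does per-file incremental uploads and has a switch for EU buckets. A rough sketch (the bucket name and paths are made up, and the keys obviously have to be your own):

      #!/bin/bash
      export AWS_ACCESS_KEY_ID="AKIA..."          # your Amazon credentials
      export AWS_SECRET_ACCESS_KEY="..."
      export PASSPHRASE="long-gpg-passphrase"     # duplicity encrypts with GnuPG

      TARGET="s3+http://my-office-backup-eu/office"

      # the first run uploads a full backup; later runs upload only the files that changed
      duplicity --s3-use-new-style --s3-european-buckets /home/office "$TARGET"

      # keep the two most recent full chains and drop anything older
      duplicity remove-all-but-n-full 2 --force --s3-use-new-style --s3-european-buckets "$TARGET"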

    Read the article

  • How can I get all my iTunes files from multiple computers onto my new PC?

    - by jasondavis
    Managing my iTunes files (music, video, TV shows, iPhone apps) is a really hard task and I have not had much luck yet. If I had a single PC for the rest of my life with the same hard drive, it would work great, but unfortunately it cannot be that simple. I just built a new PC because my old hard drive crashed with all my iTunes-purchased songs, videos, etc. So is there any way to get iTunes to recognize that I own all of them and let me re-download everything? As I see it so far, it knows what I own by my iTunes account login (it knows whether I ever purchased a file) but does not seem to let you just re-download everything to a new PC or new hard drive. I realize I should have had backups, but I didn't. My backup plan is a work in progress and it is getting much better, but that doesn't help my situation at the moment. So I am asking for any help or ideas on how to retrieve all my previous iTunes purchases.

    Read the article

  • New Servers Active Directory and Exchange

    - by user3164638
    I have 3 Dell PowerEdge servers, each with 2 quad-core processors. I am going to bring this office out of its stone-age network (P2P, sharing files on a flash drive, email through Google, etc.) and set up Active Directory and Exchange 2013. Our needs are not that great at the moment - our staff consists of approximately 40 people, and our network may eventually be managed by an external company. We need only one domain for our email (though we may serve email for a few partner domains as well). I was thinking of setting up something like this: Server 1: primary DC, with Active Directory and Exchange on separate virtual machines. Server 2: redundant copy of Server 1. Server 3: shared resources, storage, backups, etc. How would you utilize 3 servers for an Active Directory / Exchange setup for a small/medium office? We do have plans to grow, so the solution must be scalable. I'm not sure that I want to split permissions, though I'd consider it if that is something that could be changed down the road.

    Read the article

  • Is there a limit on the number of files in an external hard drive folder?

    - by tfs
    I have a FAT32 external hard drive where I keep backups downloaded from a web server. I have a directory with 30 subdirectories. One of the subdirectories contains 21,381 files, and when I try to copy more files into it I get error 0x80070052. However, it's possible to copy one more file into this directory (only one) if I make its name shorter (8 characters instead of the 22 of its original name). How do I solve this problem? Right now I cannot synchronize the external hard disk files with the server files, which is very important for me.

    Read the article

  • Why does my CentOS logrotate run at random times?

    - by Mike Pennington
    I put a logrotate configuration file in /etc/logrotate.d/ and expected the logs to rotate at a consistent time; however, they do not... log rotation times are seemingly random +/- one hour. Why are the log rotation start times random, and how can I change this? Informational: my logrotate config file looks like this...

      /opt/backups/network/*.conf {
          copytruncate
          rotate 30
          daily
          create 644 root root
          dateext
          maxage 30
          missingok
          notifempty
          compress
          delaycompress
          postrotate
              ## Create symbolic links in daily/
              PATH=`/usr/bin/dirname $1`; FILE=`/bin/basename $1`;
              /bin/ln -s $1 $PATH/daily/$FILE
          endscript
      }
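
    On CentOS the start time comes from the /etc/cron.daily run (and crond/anacron's run-parts window), not from logrotate itself, so one way to pin it down is to take this config out of /etc/logrotate.d/ and run it from its own cron entry at a fixed time. A hedged sketch (the file name network-backups is made up):

      # move the config out of the directory that the daily cron job processes
      mv /etc/logrotate.d/network-backups /etc/logrotate.network-backups.conf

      # /etc/cron.d/logrotate-network-backups -- rotate at 00:05 sharp, with its own state file
      5 0 * * * root /usr/sbin/logrotate -s /var/lib/logrotate.network-backups.status /etc/logrotate.network-backups.conf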

    Read the article

  • Dead USB flash drive

    - by Unsliced
    So a friend has come to me with a problem. They have a dead USB thumb drive which no longer responds when plugged into a machine. I've tried it in a Mac and it doesn't respond at all; a Windows XP machine at least sees that something is there but can't show it in Explorer, reporting only that whatever is plugged in has malfunctioned. There is obviously current because the activity light on the drive illuminates. I'm looking for suggestions, please. I have access to Mac or Windows hardware and am happy to experiment (and even to pay if the solution works!). It's a bit late to recommend regular backups, but in the absence of that, what's the next best forensic advice? Edit: I should stress that, if possible, we're trying to rescue the data; after all, thumb drives are basically disposable and hardly worth the bother if there's no emotional or functional reason for wanting to rescue them!

    Read the article

  • Edit write-protected files by breaking hard links

    - by Taymon
    A directory which I own and can write to contains hard links to files that I don't own and don't have write permission for. I want to open and edit these files in Emacs. When I save my changes, Emacs should rename the existing hard link by appending ~, then write my new version of the file as a new file owned by me. I was under the impression that Emacs could just do this (because of the way it does backups), but it's not working; when I save, it attempts to change the file's permissions in order to write to it (and fails because I don't own the file). How do I make this happen?

    Read the article

  • Snapshots are disappearing from Time Machine. Why?

    - by AntonAL
    I have Time Machine enabled for my external HDD and trigger backups manually twice a week. I have noticed that some snapshots I made are gone. I first noticed this behaviour several weeks ago, but I had doubts. To make sure, I recorded a screen video listing the snapshots in Time Machine. After watching that recording today, I'm sure that for August I had 9 snapshots. Now I have only 3 of them. What's happening? I have had no disk crash reports, errors, etc.

    Read the article

  • Forcing Windows to be 'awake' during certain hours?

    - by Mun
    I'm running Windows 7 and have the power options currently set up to put the machine to sleep after 2 hours of inactivity. Is there any way to force the machine not to sleep between certain hours, or to wake up automatically? I'm assuming this can't be done directly from within the Windows power settings, but is there a third-party utility (preferably free) that could take care of this? I've got backups scheduled every morning at 3-4 am, and these don't run if the machine is suspended, so I just need to make sure that everything is powered up during this time.

    Read the article

  • Windows VPS/Cloud Host Recommendations?

    - by user18937
    As Hosting.com is no longer offering Windows VPS accounts, we are looking for a new US-based provider. We're looking for something that offers standard Windows IIS hosting for a DotNetNuke portal site with SQL Server Express or Web. Basic managed services for OS updates as well as regular backups are required. A good level of support and solid uptime are critical. Budget is in the $150-$200 per month range but flexible depending on quality and services offered. Does anyone have suggestions or feedback they can share? We are currently looking at JodoHost as an option but would like some other possibilities, as we have found their support can be suspect at times. A cloud solution in the same range would also be an option.

    Read the article

  • How do you limit the bandwidth for a file copy?

    - by wizard
    I've got an old Windows 2000 box in a remote location with a T1 connection and a VPN to my location. I normally use SMB mounts to transfer files, but now it's time to decommission the server and copy its backups to my location. I have about 40 gigabytes (compressed) to copy. I'm prepared for it to take a long time, but I have a few caveats: I need to limit the bandwidth so Terminal Services connections to the site are not affected, and I want to be able to resume a partial transfer. There are a few small files and several large files (10-20 gigabytes). I'm familiar with rsync on *nix platforms but have had bad luck with it on Windows, and I don't know that it will really keep partially transferred files. What do you use?
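
    For what it's worth, rsync does cover both caveats; a rough sketch pulling over the VPN (host and paths are invented, and on the Windows 2000 end this assumes an rsync port such as cwRsync or rsync under Cygwin):

      # --bwlimit is in KB/s; 64 KB/s leaves most of a T1 (~190 KB/s) free for Terminal Services
      # --partial keeps a partially transferred file so a re-run resumes instead of starting over
      rsync -av --partial --progress --bwlimit=64 \
          user@old-server:/backups/ /archive/old-server-backups/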

    Read the article

  • Backing Up User Data when data is not in use. Should I be concerned?

    - by jberryman
    This may be a dumb question. I would like to use duplicity to make backups to Amazon S3 of directories, each of which contains a different user's data. Each directory could be written to at any time. So I have two questions: Should I be concerned that a scheduled backup of a directory might occur in the middle of data being written to files in the directory, resulting in a corrupted backup? And if that is a valid concern, how would I go about temporarily delaying an operation while IO was happening, to try to minimize that effect. Thanks for the advice
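
    If mid-write backups do turn out to be a problem, one way to sidestep it (assuming the user directories sit on an LVM volume; the volume group and paths below are invented) is to point duplicity at a read-only snapshot so everything it reads is frozen at a single instant:

      #!/bin/bash
      # assumes AWS keys and PASSPHRASE are already exported for duplicity
      lvcreate --snapshot --size 5G --name users_snap /dev/vg0/users   # freeze a point-in-time view
      mkdir -p /mnt/users_snap
      mount -o ro /dev/vg0/users_snap /mnt/users_snap

      # back up each user's directory from the frozen, consistent view
      for d in /mnt/users_snap/*/; do
          name=$(basename "$d")
          duplicity "$d" "s3+http://my-backup-bucket/$name"
      done

      umount /mnt/users_snap            # drop the snapshot once the backups finish
      lvremove -f /dev/vg0/users_snap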

    Read the article

  • How can I boot a vm on Hyper-V 2012 when it has a virtual hard-drive missing?

    - by Zone12
    We have a Hyper-V 2012 server with 8 VMs on it. We have attached extra virtual hard drives to each of the machines to store backups on, and these drives are stored on a NAS. After a power failure, we tried to boot the VMs and found that they couldn't be booted without the attached backup drives. We couldn't boot the NAS at that point, so we had to remove all the extra drives manually, boot the VMs, and re-attach the drives later when we got the NAS back up and running. These backup drives are non-essential to the running of the system. I would like to know if there is a way to boot a VM on Hyper-V 2012 with some of its (SCSI) hard drives missing, so that we can recover automatically from a power failure.

    Read the article
