Search Results

Search found 6759 results on 271 pages for 'backup strategies'.


  • Back up drivers for a device in Windows 7

    - by Moses
    If I can't find a driver on the internet when I'm prepping for a reinstall of Windows 7, I figure that since the driver is already installed, why not back it up and reinstall it later? The only method I know of is going through Device Manager > (device) Properties > Driver > Driver Details. This gives me a nice list of all the files, but then I need to go retrieve each of them manually. For devices with a lot of driver files, this is inefficient. Is there a better way to accomplish this?
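
    A low-tech route worth knowing: Windows 7 keeps a copy of every installed driver package under the DriverStore, so copying that folder out preserves the .inf packages for later reinstallation. A hedged sketch (the folder name is standard, but the destination E:\DriverBackup is just a placeholder, and this grabs all packages rather than one device's):

      :: run from an elevated command prompt
      robocopy "%SystemRoot%\System32\DriverStore\FileRepository" "E:\DriverBackup" /E /R:1 /W:1

    After the reinstall, point Device Manager's "Update Driver > Browse my computer" at the backup folder.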


  • Multi Gateway and Backup Routing on a Cisco router

    - by user64880
    Hi all, I have a Cisco 2611 router with only one FastEthernet port and two internet gateways. I want to configure the router so that when the primary route fails, the second route automatically takes over all my traffic. With the two ip route commands below, failover works well when the interface goes down, but when the peer IP on the primary route is down, the router does not switch to the second route until I remove the first route command. My current settings follow. How can I set this up?

      interface FastEthernet0/0
       ip address 81.12.21.100 255.255.255.248 secondary
       ip address 62.220.97.14 255.255.255.252
      !
      ip route 0.0.0.0 0.0.0.0 62.220.97.13
      ip route 0.0.0.0 0.0.0.0 81.12.21.97 100

    Cheers, Kamal
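
    The usual fix is to tie the primary route to reachability tracking, so the router withdraws it when the peer stops answering pings and the floating route takes over. A sketch for IOS images that support IP SLA, with your addresses assumed from the question (older 2600 images use the rtr / ip sla monitor syntax instead, so treat this as a starting point, not a drop-in config):

      ip sla 1
       icmp-echo 62.220.97.13 source-interface FastEthernet0/0
       frequency 10
      ip sla schedule 1 life forever start-time now
      !
      track 1 ip sla 1 reachability
      !
      ip route 0.0.0.0 0.0.0.0 62.220.97.13 track 1
      ip route 0.0.0.0 0.0.0.0 81.12.21.97 100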


  • Laptop Backup Software (Corporate)?

    - by Hutch
    I wondered if any of you who have a fleet of laptops are using anything to back them up, and if so, what? In particular I'm looking for a solution that is totally hands-off once installed, i.e. the user doesn't have to do anything, press anything, or remember to change something when their domain password changes. Right now we use Druva inSync, which I have to say is pretty damned good; however, our license is up for renewal in a couple of months, so I want to be sure it's the best solution before renewing. The only other vaguely comparable product that I know of is from Atempo, but the cost of a SQL Server license is a big problem there. Thanks.


  • Boot to VHD backup plan

    - by Josh Barker
    I have a machine that I just reinstalled Windows and all of my applications onto... what a chore that is. I want to avoid that completely from now on by keeping an image. My first thought was to see if it is possible to copy a VHD file while you are booted into it, since I am using Windows 7 Ultimate as boot-to-VHD (without a parent machine). Is this possible, and if so, how could I accomplish it? Keep in mind, this is my personal machine and I'm trying to keep things inexpensive (a good script would work). Thanks, Josh
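
    You generally can't get a consistent copy of the VHD while you're booted from it, but a Volume Shadow Copy snapshot gives you a frozen view to copy from. A rough sketch of the well-known wmic/mklink trick, with the details hypothetical: when booted from a VHD, the physical volume holding the .vhd is usually not C:, so D: and win7.vhd are assumptions, and the real shadow device name comes from vssadmin output:

      :: snapshot the physical volume that holds the .vhd (elevated prompt)
      wmic shadowcopy call create Volume='D:\'
      :: note the shadow device name, e.g. \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1
      vssadmin list shadows
      :: expose the snapshot as a folder and copy the frozen VHD out of it
      mklink /d C:\shadow \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\
      copy C:\shadow\win7.vhd E:\backup\win7.vhd
      rmdir C:\shadow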


  • Collect and Back Up Photos from Multiple Photographers

    - by Graviton
    I have a few photographers working under me (well, not exactly under me, but I say it that way for illustration purposes), and they shoot a lot of photos. The problem is that they all store their photos on their own hard disks, and I have no way to retrieve them unless I pass around a USB drive and ask each of them to fill it with their photos. Very labor-intensive and inefficient! Is there a better (more automated) way of doing this? For fear of losing resolution, I don't really favor an online-sync approach, because I think photos uploaded to any website will suffer a resolution loss, which is the last thing I want. Is there a better idea? Edit: Being artistic as they are, I can't guarantee that they all use PCs and Windows, so the software must at least be able to run on a Mac.
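
    One lossless, cross-platform option is plain rsync over SSH on a schedule: files are copied bit-for-bit, so nothing is recompressed. A minimal sketch, assuming a central server (photos.example.com is a placeholder) and rsync on each photographer's machine (built into Mac OS X; cwRsync or DeltaCopy can provide it on Windows):

      #!/bin/sh
      # nightly cron job on each photographer's machine: push new shots to the server
      # ~/Pictures/shoots and the server-side path are assumptions; adjust per person
      rsync -az --partial ~/Pictures/shoots/ anna@photos.example.com:/srv/photos/anna/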


  • Backup Linux to an FTP server

    - by Alakdae
    What do you use for backups to an FTP server? I've tried the setup with Amanda and virtual tapes on the FTP server mounted with curlftpfs, and I'm not satisfied with it; I just don't feel confident about Amanda. I also cannot use anything rsync-based on the FTP-mounted filesystem, because rsync only creates the directories and never the files, as it cannot execute "mkstemp". I've been thinking about Bacula, but I can't find a good HOWTO for it.
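
    One tool designed for exactly this kind of dumb remote storage is duplicity: it writes incremental tar volumes straight over the FTP protocol, so no FUSE mount (and no mkstemp) is involved. A quick sketch with placeholder host and paths (duplicity's FTP backend relies on ncftp being installed):

      # full backup on the first run, incrementals afterwards
      export FTP_PASSWORD=secret
      duplicity --no-encryption /home ftp://backupuser@ftp.example.com/server-backups
      # restores work the same way with source and destination swapped
      duplicity --no-encryption ftp://backupuser@ftp.example.com/server-backups /tmp/restored-home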


  • PST backup with Volume Shadow Copy Service

    - by NoMadMan
    I was asked to implement a backup of 35 PST files ranging from 800MB to 2000MB. The users have Windows XP and Windows 2000 workstations, and we have a Windows 2000 domain controller that we use to back files up onto three 500GB external hard drives. I found several methods, from applications to scripts; local or remote applications would be my last resort. I came across CopyWithVss, a script based on the Volume Shadow Copy Service. I wanted to know whether there would be a problem if the path had spaces. Would mounting the destination path of each PST folder as a drive letter be more practical? My concern with the mounting option is that I would eventually run out of letters, since I have 35 (and possibly more) workstations to back up. Lastly, can someone give me an example of CopyWithVss as it would be run on a production network? The script is a bit cryptic even after several readings: where in the script do I enter the source and the destination? I'm a Mac user, so please excuse my ignorance of the Windows platform.
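
    I can't vouch for CopyWithVss's internals, but on the general points: paths with spaces are fine in Windows batch tooling as long as every path argument is double-quoted, and quoting also avoids burning a drive letter per workstation. As one concrete alternative, HoboCopy is a small open-source VSS-based copier whose basic usage is simply source then destination; a hedged sketch with hypothetical paths:

      :: copy an in-use PST via a shadow copy; quote any path containing spaces
      hobocopy "C:\Documents and Settings\jsmith\Local Settings\Application Data\Microsoft\Outlook" "\\dc1\backups\jsmith"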


  • Backup and file server for 50+ TB of data

    - by a-bomb
    Our office wants to build a new server to handle our data. Over the last 10 years the data has been stored on CDs, DVDs, and HDDs, but now they want all of it in one place, attached to the network so that everybody in the office can access it. About 20TB of the data is new and the rest is old; the immediate priority is to store the 20TB, then gradually migrate the other 30TB over time. What is the best solution? We have thought of getting an HP server and connecting it to an external enclosure holding either tape drives or HDDs (we haven't decided yet), or getting a NAS and connecting it to the HP server. What should we do? This is new for us...


  • Restoring file properties, but not the complete files, from a backup

    - by Jon
    While copying data from my old storage on a Linux computer to the new (Linux-based) NAS, I accidentally failed to carry the file properties (most importantly, the modify dates) over to the new location. I have also continued to use and modify the files at the new location, so I cannot just copy everything over again. What I would like to do is a diff between the files in the old and the new storage, and for those that are identical, restore the properties from the Linux storage to the NAS storage files. Is there a clever way, such as a script or a tool, to do this? I could run it either on the Linux box or, in the worst case, from a remote Windows computer. Grateful for any suggestions. /Jon
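
    A small shell loop can do this directly: compare each old file byte-for-byte with its counterpart and, only on a match, copy the old timestamp across. A minimal sketch, assuming both trees are mounted locally on the Linux box (the two paths are placeholders):

      #!/bin/sh
      OLD=/mnt/oldstorage
      NEW=/mnt/nas
      cd "$OLD" || exit 1
      find . -type f | while IFS= read -r f; do
          # restore the mtime only when the contents are still identical
          if [ -f "$NEW/$f" ] && cmp -s "$f" "$NEW/$f"; then
              touch -r "$f" "$NEW/$f"
          fi
      done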


  • How to restore qmail backup files

    - by Maysam
    We are using qmail as our mail application on a Linux server. A few weeks ago our server crashed, everything had to be installed from scratch, and our users started to send and receive email again. The problem is that they have lost their old emails. We have a backup of the whole qmail directory, but I don't know how to restore the old emails without losing the new ones. It's worth mentioning that I don't have any problem restoring old sent mail: when I copy email files into the .sent-mail/cur directory, they show up in the users' sent boxes. But copying files into the inbox's cur directory doesn't work, and I can't get those messages restored.
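
    Since Maildir stores one message per file with effectively unique names, merging old inbox mail back in is normally just a copy that preserves permissions; when restored files don't show up, the usual culprit is ownership (files copied as root that the mail software can't serve to the user). A sketch, with the backup path and username assumed:

      # copy the old inbox messages in alongside the new ones, keeping timestamps
      cp -p /backup/home/user/Maildir/cur/* /home/user/Maildir/cur/
      # make sure the user (not root) owns the restored files
      chown user:user /home/user/Maildir/cur/*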


  • Lazy Linux backup system?

    - by Alex
    OK, so I want to say Time Machine, but that's not exactly what I'm looking for. I want to set up a system that will regularly (hourly?) back up the /home directory of our machine. Time Machine-style things are naturally preferable, since they save space by only saving the changes, but honestly, this is important enough that I can suffer some waste. Any ideas?
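
    The classic DIY answer is rsync with --link-dest: every run looks like a full snapshot, but unchanged files are hard links into the previous run, so you get Time Machine-style space savings anyway (rsnapshot packages this same idea with rotation built in). A minimal sketch for an hourly cron job, paths being placeholders:

      #!/bin/sh
      # snapshot /home into a timestamped directory, hard-linking unchanged files
      SRC=/home
      DST=/backup/home-snapshots
      NOW=$(date +%Y-%m-%d_%H%M)
      rsync -a --delete --link-dest="$DST/latest" "$SRC/" "$DST/$NOW/"
      ln -sfn "$DST/$NOW" "$DST/latest"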


  • Using Windows Azure storage for backup

    - by Bruno
    I am currently looking at Windows Azure blobs as an option for backing up archive data. I want to be able to upload files from an external Windows machine via the internet, but I don't know enough about Windows Azure storage to make a decision. Some of the questions I have: How do I upload the files? Is there a client application? Can I use robocopy? Would it be fast enough, i.e. could I download or upload 1TB of data in a week? Is it secure? Hopefully someone smarter than me can help me :-)
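
    On the upload question: blob storage speaks an HTTP REST API rather than SMB, so robocopy can't target it directly; Microsoft's AzCopy command-line tool is the usual bulk uploader, and transfers run over HTTPS. A sketch in the current AzCopy syntax (account, container and SAS token are placeholders; older AzCopy releases use /Source: /Dest: flags instead):

      azcopy copy "C:\archive" "https://myaccount.blob.core.windows.net/backup?<SAS-token>" --recursive

    On speed: 1TB in a week only requires roughly 1.7MB/s (about 14Mbit/s) sustained, so the bottleneck is almost certainly your own uplink rather than Azure.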


  • Scheduled Folder Backup

    - by Junaid Saeed
    I have some folders on the C: drive that I work in on a daily basis, and the data in them is very critical, so every night when I shut down my PC I copy, paste, and overwrite these folders at a separate location. That way, if the system crashes or something bad happens, I will be able to format C: without losing them. I cannot move these folders off the C: drive, because they include C:\wamp\www\ of my WAMP server and other such folders. Is there a tool with which I can schedule that every day at time X these folders will be backed up to path Y?
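
    Windows ships with both halves of this: robocopy to mirror the folders and the Task Scheduler to run it at a fixed time. A sketch, assuming a batch file at C:\scripts\backup.cmd and D:\backup as the destination (both placeholders); note that /MIR deletes files from the destination that no longer exist in the source:

      :: C:\scripts\backup.cmd - mirror the critical folders
      robocopy C:\wamp\www D:\backup\www /MIR /R:2 /W:5 /LOG:D:\backup\www.log

      :: one-time registration: run the script every day at 22:00
      schtasks /Create /TN "NightlyFolderBackup" /TR "C:\scripts\backup.cmd" /SC DAILY /ST 22:00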


  • How to place a Drupal site onto a server from backup

    - by CCG121
    I backed up my site, then totally redid my server with a different control panel, which created a different directory structure: /var/www/vhosts/user/site.com/httpdocs. I put the files into the httpdocs folder and restored the database correctly (I think). I can see the main page, but clicking on any link gets me a "Not Found" message. I have tried running update.php, and I cannot access /user/login either.
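
    "Not Found" on every path except the front page is the classic signature of Drupal's clean URLs breaking after a move: either the hidden .htaccess file didn't make it into httpdocs (dotfiles are easy to miss when restoring a backup), or the new control panel's Apache config doesn't allow its mod_rewrite rules. Two things worth checking, sketched:

      # 1) confirm the hidden .htaccess actually got restored
      ls -a /var/www/vhosts/user/site.com/httpdocs/.htaccess

      # 2) make sure the vhost lets Drupal's rewrite rules apply, then reload Apache:
      #      <Directory /var/www/vhosts/user/site.com/httpdocs>
      #          AllowOverride All
      #      </Directory>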


  • Backup with bash and rsync...

    - by Roger
    Is there a way to auto-rename an existing file on the receiver? For example: if filename already exists, auto-rename it to something like filename_001, filename_002, and so on. So far all I have is this:

      $ rsync -rh --progress --stats --exclude '.thumb' \
            --update --perms /origin /destination

    By the way, I know rsync has --ignore-existing to "skip updating files that exist on receiver", but I guess what I need would be something like --rename-existing.
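
    rsync has no numbered --rename-existing, but the built-in --backup option comes close: before overwriting a file on the receiver, it renames the old copy aside with a suffix of your choosing. A sketch built on your command (the date suffix is just one choice):

      rsync -rh --progress --stats --exclude '.thumb' \
          --update --perms \
          --backup --suffix="_$(date +%Y%m%d)" /origin /destination

    The caveat: only files that would be overwritten get renamed, and you don't get _001-style counters without wrapping rsync in an external script.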


  • Backup, Migrate or Clone Failing CentOS 4 (LVM)

    - by Hegelworm
    I've been running a BlueQuartz CentOS 4 system (Nuonce.net distro) for a few years now, and although the hard drive (a Deskstar) has always been a bit noisy, on a few recent occasions I've heard it having trouble spinning up. Basically, I want to clone this drive to a similar-sized one (80GB). I've spent many hours reading up on dd, dd_rescue, rsync, Clonezilla and LVM mirroring, yet the sheer number of options and the nightmarish accounts have left me frozen, unable to make an informed decision on how to start.

    I've made a few attempts. dd failed after about 2 hours because, although the drives appeared identical on the surface (ATA Seagate Barracudas, Thai not Chinese), the destination drive is slightly smaller. My most recent attempt involved using a Debian CD to format the new drive, rsync-ing everything over, and editing the new drive's grub and fstab to reflect the changes. No joy there either, as I hadn't chosen LVM when partitioning the destination drive and it wouldn't boot. As you can probably tell, I'm out of my depth here, and a panic-invoking mixture of caution and frustration has prompted me to sign up here.

    The server itself, although not strictly a production environment, has a very specific installation of Festival, LAME and FFmpeg, and provides the back end for a text-to-speech jQuery plugin that I've built over the last 2 years. I'm also planning to rebuild the whole TTS system on Debian, as the existing CentOS system still has PHP4 etc. For now, though, I'd really like to just shift everything over to a new drive. As this is my first post, please feel free to lay any house rules on me that I might have overlooked; I've been hovering around StackOverflow for a while now but have only just signed up. Many thanks.

    Update: Thanks for your responses so far - it's much appreciated, and it makes me feel a little more confident when I can double-check things here. I had the idea of doing a fresh install of CentOS (from the original disk) onto the new drive so the partitions and LVM were all set up correctly (after disconnecting my source drive to prevent painful mistakes). I then booted into rescue mode from the same CD and, to avoid a conflicting label, changed the /boot partition's label to /bootnew using e2label, and changed the volume group name from VolGroup00 to VolGroup001 using lvm vgrename. I could then boot with both drives in. After mounting the new drive (via its VolGroup001 alias) at /newhd, I rsync-ed everything I could over to the new drive, using -avr switches and backslashes, like mentioned here. I then disconnected my original source drive again, booted from the live CD again, changed the boot partition label back from /bootnew to /boot using e2label, and renamed the volume group back to VolGroup00. I then rebooted, and it went through the familiar start-up routine only to fail to find a host of files in proc, usr, lib, var etc. The boot did complete, but there were lots of red FAILs. I could log in with my existing credentials, but the network was kaput, I couldn't start X (the desktop GUI), and there were also a few (a lot of) error messages pertaining to iptables. Back to square one; I naively thought I'd nailed it. Shall I just buy a bigger hard drive and attempt the dd route? I've read that this can mess with LVM setups, and there's the added risk of working on two unmounted drives at once with a low-level tool. Thanks again.
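
    For what it's worth, missing trees under proc, usr, lib and var after an rsync clone usually mean the copy recursed into pseudo-filesystems or wasn't run in archive mode from the right starting point. The usual pattern for cloning a live system into a mounted target like /newhd is sketched below; it's a sketch under assumptions, not a vetted recipe (whether CentOS 4's rsync supports ACL/xattr flags is something to verify, so they are omitted, and /dev is excluded on the assumption the fresh install already populated it):

      # clone the running root into /newhd, skipping kernel pseudo-filesystems
      rsync -av --numeric-ids \
          --exclude=/proc --exclude=/sys --exclude=/dev \
          --exclude=/tmp --exclude=/newhd \
          / /newhd/

      # recreate the mount points the excluded trees need at boot
      mkdir -p /newhd/proc /newhd/sys /newhd/tmp
      chmod 1777 /newhd/tmp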


  • Setting up logging for a remote backup script

    - by Brian Dainis
    So I wrote up a short script that I am planning to run via a daily cron job to package up my site files and send them to a remote location. I also plan to incorporate DB dumps, but I have not gotten that far yet. My issue today, however, is that I am uncertain how to log the output of each command for errors, warnings, or other pertinent information the command may produce. I would also like to install some type of fail-safe, so that if something goes horribly wrong, the script will stop dead in its tracks and notify me via email or something. OK, the email thing is not as critical, but it would be nice. Does anybody have any ideas for that? Here is what I have so far. By the way, both servers are CentOS 6.2 running a standard LAMP stack.

      #!/bin/sh
      #################################
      ### Set Vars
      #################################
      THEDATE=`date +%m%d%y%H%M`

      #################################
      ### Create Archives
      #################################
      tar -cf /root/backups/files/server_BAK_${THEDATE}.tar -C / var/www/vhosts
      gzip /root/backups/files/server_BAK_${THEDATE}.tar

      #################################
      ### Send Data to Remote Server
      #################################
      scp /root/backups/files/server_BAK_${THEDATE}.tar.gz user@host:/home/bak1/ftp/backups/

      #################################
      ### Remove Data from this Server
      #################################
      rm -rf /root/backups/files/server_BAK_${THEDATE}.tar.gz
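
    One hedged way to get both the logging and the fail-safe is to redirect each step into a logfile and bail out, with a notification mail, the moment any step returns non-zero. A sketch reworking the script above (the recipient address and log directory are placeholders, and it assumes the mail/mailx command is installed):

      #!/bin/sh
      THEDATE=$(date +%m%d%y%H%M)
      ARCHIVE=/root/backups/files/server_BAK_${THEDATE}.tar.gz
      LOG=/root/backups/logs/backup_${THEDATE}.log
      mkdir -p /root/backups/logs

      fail() {
          echo "FATAL: $1" >>"$LOG"
          mail -s "backup FAILED on $(hostname)" admin@example.com <"$LOG"
          exit 1
      }

      tar -czf "$ARCHIVE" -C / var/www/vhosts >>"$LOG" 2>&1 || fail "tar step failed"
      scp "$ARCHIVE" user@host:/home/bak1/ftp/backups/ >>"$LOG" 2>&1 || fail "scp step failed"
      rm -f "$ARCHIVE" >>"$LOG" 2>&1 || fail "cleanup failed"
      echo "backup OK $(date)" >>"$LOG"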


  • Best tool to back up your Firefox shortcuts

    - by vaccano
    I have lost my shortcuts a few times (from hard drive crashes). Is there a good tool to back them up easily? (I would prefer not to have to remember to do it.) Backing them up to the internet would be a nice bonus, but it is not required for my needs.


  • stsadm farm backup exits with ffffffff

    - by overbyte
    I have a SharePoint 2007 farm that uses stsadm, run through Scheduled Tasks, for farm backups. It always worked fine; however, one day it ran for a couple of seconds and just exited with code ffffffff. I looked at Event Viewer and at the SharePoint logs themselves, and nothing unusual happened at the time the job ran. No files were created, so an spbackup.log doesn't exist. I've searched the net for batch files and STSADM return codes, but the error code doesn't even show up. Any other recommended places to look for issues like this?
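
    One thing worth ruling out first: when stsadm dies before creating spbackup.log, whatever it printed to stdout/stderr is silently discarded by the Scheduled Task. A sketch of redirecting the scheduled command into a file so the next ffffffff exit leaves evidence behind (the paths are placeholders; the stsadm location shown is the standard SP2007 one):

      :: in the scheduled batch file: keep whatever stsadm prints before it exits
      "C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\stsadm.exe" ^
          -o backup -directory \\server\backups -backupmethod full ^
          > C:\logs\stsadm_backup.log 2>&1
      echo exit code: %ERRORLEVEL% >> C:\logs\stsadm_backup.log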


  • How do I back up a git repo?

    - by acidzombie24
    I am planning to switch from SVN to git. With svn I just copy my repo folder when I want to back it up. However, git doesn't have one, so what do I do? Should I create a clone on a separate drive and update it by pulling from my project? Then I could burn/archive this folder and it would have all the history? This is probably obvious, but I want to be sure when it comes to backups. I still pretend there is a root repository.
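
    Yes, any clone carries the full history, and git has two purpose-built variants for backups: a mirror clone (a bare copy of every ref that you can refresh in place) and a bundle (a single file that is easy to burn or archive). A sketch with placeholder paths:

      # bare mirror of all branches and tags; refresh it before each archive run
      git clone --mirror /path/to/project /backup/project.git
      cd /backup/project.git && git remote update

      # alternatively, pack the entire history into one portable file
      cd /path/to/project && git bundle create /backup/project.bundle HEAD --all
      # a bundle can later be cloned from like any repository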


  • Bootable backup for Windows 7 - like SuperDuper! for Mac

    - by Dan F.
    I just got an SSD installed in my notebook, and as people suggested, I want to have my bases covered in case it fails - and I expect it to fail. Here is what I have in mind: keep a partition on the main drive (HDD) the same size as the SSD and keep a "clone" there, so if the SSD fails, I take the SSD out and boot from the clone partition. From my understanding, SuperDuper! does just that for Mac OS, but I can't seem to find something similar for Windows. I've found a lot of great tools out there that enable you to make bootable images (Clonezilla, DriveImage XML, and Acronis True Image, to name a few), but that is not what I'm looking for.


  • Are there free FTP server backup repositories?

    - by Saif Bechan
    I was wondering if there is a free service that provides a repository for your backups. These are backups of my server; they are usually about 200MB, and I want to FTP the last 2 or 3 of them. I am looking for a service that provides maybe a couple of gigs of space with FTP access. Looking at email providers such as Gmail and Hotmail that give you several gigs of free space, this should also be possible, or am I horribly wrong?


  • How can I capture a one-time full backup of a server using AMANDA?

    - by Daemon
    Suppose I have a preconfigured AMANDA server running automated network backups of the directories specified in my disklist file. Normally, AMANDA backs the target disks up to /dumps/amanda. Is there any single command or method to perform a manual, one-time, full backup dump to another destination drive? I ask because I'm investigating the possibility of introducing rotating external hard-drive backups for offsite storage, and I want to leverage our existing backup strategies wherever possible. Ideally, a full restore should be achievable from any one of these offsite backup disks.
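
    There's no single built-in flag for this, but a common pattern is a second AMANDA configuration whose virtual tapes live on the external drive, combined with amadmin force so the next run does level 0 (full) dumps. A sketch, where "offsite" is a hypothetical config name and the paths are placeholders:

      # /etc/amanda/offsite/amanda.conf points tapedev at file:/mnt/external/vtapes
      amadmin offsite force myhost    # request full (level-0) dumps for that host
      amdump offsite                  # run the one-off full backup to the external drive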

