Search Results

Search found 5747 results on 230 pages for 'backup'.


  • Backing Up User Data when data is not in use. Should I be concerned?

    - by jberryman
    This may be a dumb question. I would like to use duplicity to make backups to Amazon S3 of directories, each of which contains a different user's data. Each directory could be written to at any time. So I have two questions: should I be concerned that a scheduled backup of a directory might occur in the middle of data being written to files in that directory, resulting in a corrupted backup? And if that is a valid concern, how would I go about temporarily delaying the backup while I/O is happening, to minimize that effect? Thanks for the advice.
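
    A common mitigation is to back up a point-in-time snapshot rather than the live directory, so files cannot change mid-backup. A minimal sketch, assuming the user directories live on an LVM volume; the volume, mount point, and bucket names below are placeholders:

        # Freeze a consistent view of the data, back it up, then drop it
        lvcreate --snapshot --size 1G --name userdata-snap /dev/vg0/userdata
        mkdir -p /mnt/snap
        mount -o ro /dev/vg0/userdata-snap /mnt/snap
        duplicity /mnt/snap/alice s3+http://my-bucket/alice
        umount /mnt/snap
        lvremove -f /dev/vg0/userdata-snap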

    Read the article

  • Remove folder structure from archive, ignore folders while archiving, and fix an error

    - by Michael
    I am trying to make a script to back up each of my Plesk hosts to an individual file, and I am having a few problems: I would like to remove the folder structure from the archive (the tar is three folders deep); I am getting the error "tar: Removing leading `/' from member names"; and I need my archive to ignore folders named "catch", because I don't need them in my archive. The code (a corrected sketch follows below):

        FILES=/var/www/vhosts/*
        FNAME=""
        for f in $FILES
        do
            FNAME=`basename $f`
            tar cfv "/root/backup/ftp/$FNAME.tar" $f
        done

    Sample output:

        tar: Removing leading `/' from member names
        /var/www/vhosts/mydomain.com/
        /var/www/vhosts/mydomain.com/conf
        /var/www/vhosts/mydomain.com/etc/
        /var/www/vhosts/mydomain.com/etc/group
        /var/www/vhosts/mydomain.com/etc/termcap
        /var/www/vhosts/mydomain.com/etc/passwd
        /var/www/vhosts/mydomain.com/usr/
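
    A sketch of a corrected loop, assuming GNU tar: -C changes into the vhosts directory so paths inside the archive start at the domain name (which also removes the leading-`/' notice), and --exclude skips every folder named "catch".

        for f in /var/www/vhosts/*; do
            FNAME=$(basename "$f")
            # Archive relative to /var/www/vhosts and skip "catch" folders
            tar cvf "/root/backup/ftp/$FNAME.tar" \
                --exclude='catch' \
                -C /var/www/vhosts "$FNAME"
        done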

    Read the article

  • gzip: stdout: File too large when running customized backup script

    - by Roland
    I've created a plain and simple backup script that only backs up certain files and folders:

        tar -zcf $DIRECTORY/var.www.tar.gz /var/www
        tar -zcf $DIRECTORY/development.tar.gz /development
        tar -zcf $DIRECTORY/home.tar.gz /home

    The script runs for about 30 minutes and then gives me the following error: "gzip: stdout: File too large". Is there another way to back up my files using shell scripting, or a way to solve this error? I'm grateful for any help.
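
    That message usually means the destination filesystem cannot hold a file that large (a 2 GB per-file cap on some older filesystems, 4 GB on FAT32). A sketch of a workaround, assuming GNU split: pipe the archive through split so no single output file exceeds the limit.

        tar -zc /home | split -b 1G - "$DIRECTORY/home.tar.gz.part-"
        # Restore later by concatenating the parts:
        # cat "$DIRECTORY"/home.tar.gz.part-* | tar -zx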

    Read the article

  • Copying MySQL backup to another server

    - by Yeti
    I'm new to SSH. How do I copy a .gz file from one server to another using SSH? I'm using cron to back up MySQL databases and want to also automate copying the .gz files to a different web host. Any information on limits to the file size that can be copied would also be great; the backup files range from 100 MB to a few GB.
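
    A minimal sketch using scp, assuming key-based SSH authentication is already set up so cron can run it without a password prompt; the remote host and paths are placeholders.

        scp /backups/db-$(date +%F).sql.gz user@remote.example.com:/backups/
        # scp itself imposes no practical file-size limit; for multi-GB files,
        # rsync can resume an interrupted transfer:
        # rsync --partial -e ssh /backups/db.sql.gz user@remote.example.com:/backups/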

    Read the article

  • Taking a backup of data in a table

    - by Rishabh Ohri
    Hi all, is there some way to back up the data in a table in SQL Server using SQL scripts, and later restore the same data in case of any failure? Note: I don't want to take a backup of the whole database. Please suggest a solution.
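
    A sketch of two common approaches, driven here from the command line; the server, database, and table names are placeholders. SELECT INTO copies the rows to a backup table inside the database, while bcp exports them to a file that can be re-imported later.

        REM Copy the rows into a new backup table (the table must not already exist):
        sqlcmd -S myserver -E -d MyDb -Q "SELECT * INTO dbo.MyTable_backup FROM dbo.MyTable"
        REM Or export the table to a file with bcp, and re-import after a failure:
        bcp MyDb.dbo.MyTable out C:\backup\MyTable.bcp -S myserver -T -n
        REM bcp MyDb.dbo.MyTable in C:\backup\MyTable.bcp -S myserver -T -n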

    Read the article

  • What is the "removing backup files" installation step?

    - by mfeingold
    In many windows installers the last step is called "removing backup files". I understand that to provide transactional integrity of the install process some "backup files" could've been created and have to be cleaned up. What I do not understand is why on many occasions this step takes considerably longer than the rest of the installation. Any idea why?

    Read the article

  • MS SQL Server backup issue

    - by Dhivagar
    In SQL Server 2008, I have taken a full backup of my database and it completed successfully. My question is: before restoring it, how can I check or know that my whole backup is good?
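
    A sketch using RESTORE VERIFYONLY, which checks a backup file without actually restoring it; the server name and path are placeholders, and the only definitive test remains a trial restore to a scratch database.

        sqlcmd -S myserver -E -Q "RESTORE VERIFYONLY FROM DISK = N'D:\Backups\MyDb.bak'"
        REM If the backup was taken WITH CHECKSUM, verify the page checksums too:
        REM sqlcmd -S myserver -E -Q "RESTORE VERIFYONLY FROM DISK = N'D:\Backups\MyDb.bak' WITH CHECKSUM"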

    Read the article

  • Create a backup file descriptor?

    - by BobTurbo
    Currently I am doing the following to "back up" stdin so that it can be restored from the backup later, after stdin has been redirected somewhere else:

        stdinBackup = 4;
        dup2(0, stdinBackup);

    I have a feeling that I am doing a lot wrong (e.g., arbitrarily assigning 4 is surely not right). Can anyone point me in the right direction?

    Read the article

  • How to do yum backup and restore?

    - by tomaszs
    Is there a way to make a backup of a package that will be changed by yum update? For example, when I do yum update lighttpd, is there a way to back up lighttpd and restore it if the update is unsuccessful or results in unexpected errors or bugs?
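
    A sketch of two options, assuming a yum new enough to keep transaction history (RHEL/CentOS 6 era and later) and, for the second option, the yum-utils package; <ID> is a placeholder read from the history listing.

        # Roll a bad update back using yum's transaction history:
        yum update lighttpd
        yum history list lighttpd     # note the transaction ID
        yum history undo <ID>
        # Or keep a copy of the current RPM first, then downgrade if needed:
        # yumdownloader lighttpd
        # yum downgrade lighttpd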

    Read the article

  • Two questions about restoring Thunderbird from a backup

    - by Eric
    Setting up a new Windows 7 PC, I'm puzzled by two things in Thunderbird 3.1.9: I restored a profile from a three-month-old backup, no problem. I then copied more recent files into the Mail/ directory, but TBird still shows the old messages. The last message in Inbox is dated 3/16/2011 -- how do I get TBird to display all the messages in the Local Folders/Inbox view? A large number of the existing messages are now displayed in separate tabs -- I can't tell you how many, but there could be over 1000. Which file governs this? Or can I hire someone from Mechanical Turk to come over and manually close each tab?

    Read the article

  • How do you back up your localhost?

    - by justjoe
    I have a method to back up my work on localhost on a weekly basis: I use multiple DOS commands, such as copy and xcopy, saved in a .bat file, and copy my localhost to another place. Since my server has grown larger, this takes too much space. Is there a way to solve this problem, maybe software that can track changes in our PHP code, or another method to preserve your code when things go bad? EDIT: I use Windows XP SP2 with XAMPP (Apache, PHP 5.2.1); the localhost is my laptop, where the server is installed.
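
    A sketch of one common answer: put the code under version control, which stores only the changes between snapshots instead of a full copy every week, and can restore any earlier state. This assumes Git is installed and that C:\xampp\htdocs (a placeholder) is the web root; the commands are for Git Bash.

        cd /c/xampp/htdocs
        git init
        git add -A
        git commit -m "initial snapshot"
        # Each later backup stores only what changed:
        # git add -A && git commit -m "weekly snapshot"
        # Restore a file from any earlier snapshot:
        # git checkout <commit> -- path/to/file.php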

    Read the article

  • What cloud backup solution supports a "backup server to cloud" configuration?

    - by Gepeto
    What online backup tool allows you to: A) back up Windows, Linux, and optionally Mac desktops and servers to the cloud; B) do so by first backing up to a central server or appliance; C) allow restoring from that appliance when possible, and from the cloud if not? For now the best option I have seen is i365 by Seagate, with an appliance between the local computers and the cloud. I know Microsoft also has an i365 plugin for DPM, as well as an Iron Mountain plugin. However, I feel that there must be a simpler way to do this. Can any of the "simpler" solutions (Jungle Disk or anything else going to S3, Mozy, Carbonite, CrashPlan, etc.) do this? Thank you.

    Read the article

  • Best way to back up and restore millions of files

    - by bongo
    Hi, I'm facing a rebuild of the volume on which I host the mail storage (Kerio MailServer, which uses maildirs). I need to back up and restore, as quickly as possible, the 3.5+ million small files (about 600 GB) in the store directory. It takes more than 12 hours via rsync to an NFS share, but I also have a 1 TB FireWire 800 RAID 1 disk that I can use (from some preliminary tests it's faster). I'm working off an Intel Xserve. What is the fastest way to do it: rsync, Finder copy, or tar?
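
    A sketch of the tar-pipe approach, which usually beats rsync for millions of small files because it streams the data sequentially instead of doing a per-file comparison; both paths are placeholders.

        mkdir -p /Volumes/FW800RAID/mailstore
        tar -C /var/mailstore -cf - . | (cd /Volumes/FW800RAID/mailstore && tar -xpf -)
        # Or write a single archive to the backup disk and unpack it later:
        # tar -C /var/mailstore -cf /Volumes/FW800RAID/mailstore.tar .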

    Read the article

  • OSX Time Machine: deletion of backup folders

    - by jml
    I saw this question and was hoping that someone could expand upon the chosen answer (which I understood): Can you sudo mv Time Machine backup files from the trash back to their original locations? I have tried doing this as root, to no avail ("operation not permitted"). If not, can you successfully rm them from the trash via the terminal, faster than the endless "preparing to empty the trash" dialog suggests? If you do get the files back out of the trash, can you tell whether they are intact via Disk Utility (and how)? And can you force indexing on a Time Machine drive, the same way you would on a normal drive, to rebuild the TM index? I realize that a single answer could clarify all of the above, but I wanted to include details to be clear on what I am asking. Thanks for any help.
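
    A sketch of the command-line route, assuming OS X 10.7 or later where tmutil is available; the volume and snapshot paths are placeholders. tmutil deletes backup folders safely without going through the Trash, and mdutil rebuilds the Spotlight index on the drive.

        sudo tmutil delete "/Volumes/TM Drive/Backups.backupdb/MyMac/2011-03-16-120000"
        # Erase and rebuild the index on the backup drive:
        sudo mdutil -E "/Volumes/TM Drive"
        # Verify the volume itself from the terminal:
        # diskutil verifyVolume "/Volumes/TM Drive"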

    Read the article

  • rsync bash script to backup specific directories nightly to remote server

    - by Janice Young
    Hello, I am looking for an rsync script that will back up specific directories from my home machine to a remote server nightly, say /home/me/Pictures to [email protected] (SSH on port 6587). It would be nice if it could look for changes, but I'm not so worried about that aspect as about having a script that runs at a certain time of night, with cron or however. I have googled and found scripts, but those scripts were specific to the needs of their creators. Any help would be happily accepted, as the scripting part really throws me off. Thank you, Janice
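
    A minimal sketch, keeping the redacted user@host from the question as a placeholder. rsync transfers only what changed since the last run, and cron supplies the nightly schedule.

        #!/bin/bash
        # backup.sh - push Pictures to the remote server over SSH on port 6587
        rsync -az --delete -e "ssh -p 6587" /home/me/Pictures/ "[email protected]:Pictures/"

    Then schedule it (crontab -e) to run at, say, 02:30 every night:

        30 2 * * * /home/me/backup.sh >> /home/me/backup.log 2>&1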

    Read the article

  • Windows 2008 Domain Controller - Backup (BDC) to Primary (PDC)

    - by Klaptrap
    I have created a new domain controller within my single-domain forest. I have also made it DHCP- and DNS-ready; all three services have synchronised with the existing W2K8 domain controller. I even migrated the FSMO roles and thought everything was fine. Indeed, all machines on the network appear to obtain DHCP and DNS from the new server, and AD is working on the new server, as my internal website uses it for login authentication. I have just noticed, via BgInfo (Sysinternals), that the new server is showing as "backup" and the old as "primary"; I thought I had already achieved this change. Have the FSMO roles swapped back? I have yet to remove the old server from AD (dcpromo). Do I need to do anything before I run dcpromo on the old server? Any thoughts appreciated...
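
    A sketch of how to confirm role placement before demoting the old server; note that BgInfo reports the legacy NT-style role, so the DC holding the PDC emulator FSMO role is the one shown as "primary". Run from a command prompt on either DC; the server name in the dcdiag line is a placeholder.

        netdom query fsmo
        REM Health-check the DC you are about to demote:
        REM dcdiag /s:OLDSERVER /v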

    Read the article

  • ESXi 4.0u1 - backup/copy options

    - by Hanadarko
    I have a machine built using ESXi 4.0u1 with 3 hard drives. I have my hosts built on different hard drives, but I am wondering about backup options. I do not have RAID, but I have 3 drives, one of which is totally empty; I had been using it to store ISOs for loading. So what options do I have to make either a one-time copy onto the spare drive or some sort of snapshot to the spare drive? There must be some way to do this, either via the vSphere client or by SSHing into the ESXi box and going from there. Thoughts? -JD
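
    A sketch of the SSH route, assuming Tech Support Mode/SSH is enabled on the host and the VM is powered off; the datastore and VM names are placeholders. vmkfstools clones the virtual disk onto the spare datastore.

        mkdir -p /vmfs/volumes/spare/myvm
        vmkfstools -i /vmfs/volumes/datastore1/myvm/myvm.vmdk \
                   /vmfs/volumes/spare/myvm/myvm.vmdk
        # Copy the VM's configuration file alongside the disk:
        cp /vmfs/volumes/datastore1/myvm/myvm.vmx /vmfs/volumes/spare/myvm/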

    Read the article

  • Any way to back up nginx before recompiling

    - by JM4
    I am looking to install the HttpGeoipModule for nginx, but I have learned I have to recompile the whole thing from source in order to do so. I have a new Media Temple DV 4.0 server that comes with nginx 1.3.0 stock, and I have never had to recompile from source before; I am a bit nervous about making changes without being able to revert to a previous state in the event something messes up (that, and the fact it affects a live server, so I have no idea what the downtime would be). My plan was to record all the existing modules used (nginx -V to list them all), then rebuild from source with that information plus the ./configure --with-http_geoip_module reference. Is it possible to back up the existing nginx installation in the event something goes wrong?
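
    A sketch of a simple safety net; the paths assume a typical layout and may differ on a Plesk/Media Temple box.

        nginx -V 2> /root/nginx-configure-args.txt    # record the current build flags (printed on stderr)
        cp /usr/sbin/nginx /root/nginx.bak            # save the running binary
        tar -czf /root/nginx-conf.tar.gz /etc/nginx   # save the configuration
        # Roll back by restoring the binary and config, then restarting:
        # cp /root/nginx.bak /usr/sbin/nginx && service nginx restart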

    Read the article

  • How to prevent Windows Home Server waking for backup

    - by Andronicus
    Since changing the motherboard in my Windows Home Server, it has been switching itself on every night to perform a scheduled backup. I don't want this to happen, though: I want to switch the server on using Wake-on-LAN and perform backups manually if/when required. I know I can disable backups of each computer so the machine won't turn on, but I want to allow manually triggered backups, or automatic backups if the home server happens to be on; if it's in sleep mode, I wish it to stay in that state unless I wake it. How can this be achieved?
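
    A sketch of how to find and disarm whatever is waking the machine, from an elevated command prompt; the powercfg options assume a Vista/Server 2008-era Windows or later, and the task name is a placeholder for the actual backup task.

        REM List timers and devices that are armed to wake the machine:
        powercfg -waketimers
        powercfg -devicequery wake_armed
        REM In Task Scheduler, untick "Wake the computer to run this task"
        REM on the backup task, or disable the task outright:
        REM schtasks /change /tn "BackupTaskName" /disable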

    Read the article

  • Two mail servers: need help with DNS configuration for the backup one

    - by user92231
    I need to run a redundant backup mail server in case the main one goes down. The settings in GoDaddy look something like the following:

        A (Host) records:
          @      ->  IP address of mail1 (41.x.x.x)
          mail1  ->  IP address of mail1 (41.x.x.x)
          mail2  ->  IP address of mail2 (196.x.x.x)

        MX records (priority / host / points to):
          10  @  mail1.mydomain.com
          20  @  mail2.mydomain.com

    When mail1 goes down, mail2 is able to get email, and I can access it through the browser with no problem, but I want my users to be able to use POP3/SMTP as well, without changing anything in their Outlook; I don't want any impact on the users when mail1 is down. Also, I'm using Windows Server DFS to keep both servers' mail folders in sync. Is this the right way, or should I be using something else?
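
    A quick way to sanity-check the MX setup from outside (mydomain.com is the placeholder domain from the question). Note that MX priorities only fail over inbound delivery; POP3/SMTP clients configured to talk to mail1 will not follow them, which is why client access is usually given its own DNS name whose record can be repointed, or a floating IP.

        dig +short MX mydomain.com
        # Expected output:
        # 10 mail1.mydomain.com.
        # 20 mail2.mydomain.com.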

    Read the article
