Search Results

Search found 14013 results on 561 pages for 'remote backup'.

  • Restoring an ntbackup backup from a Server 2003 share

    - by user38040
    I have access to a share on a Windows 2003 server and can create files and folders in it. I used ntbackup to back up my files from this share, but when I delete or rename my folder and try to restore, only the folders come back; the files are not restored. I can, however, restore the files to a local location. Thanks.

  • Backing up files on Ubuntu for reinstall. Will there be problems with permissions?

    - by adam
    I have some very important files I want to back up before I reinstall Ubuntu, going back to 9.04 from 9.10 (it's causing me all sorts of problems). The files' total size is small, so I'm just going to copy them over to Dropbox. I'm wondering: when I reinstall Ubuntu and copy them back, will there be any permission issues with those files, given that my old user account which created them and the new user I'll set up on the fresh install will be different?
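
    One way to sidestep the permission question entirely is to archive the files before uploading: tar's -p flag stores ownership and modes inside the archive, so they survive the round trip regardless of what Dropbox does with them. A minimal sketch, assuming the files live in /home/adam/important (the paths and the new username are assumptions):

        # before the reinstall: archive with permissions preserved
        tar -cpzf important.tar.gz -C /home/adam important

        # after the reinstall: extract, then hand everything to the new account
        sudo tar -xpzf important.tar.gz -C /home/newuser
        sudo chown -R newuser:newuser /home/newuser/important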

  • Is it a very bad idea to create a disk image of a mounted disk?

    - by Maciek Sawicki
    I would like to back up my server, for example using dd: dd if=/dev/md0 of=/some_network_share I wonder how inconsistent this image will be if /dev/md0 is mounted. Would it be possible to convert such a dd image to a VDI drive and create a working virtual machine, using this command for example: VBoxManage convertfromraw ImageFile.dd OutputFile.vdi Network traffic is disabled on the firewall (there is only a connection to/from the one remote machine where the image is copied).
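
    One hedged way to get a consistent image without unmounting is to freeze the filesystem for the duration of the copy. A sketch, assuming /dev/md0 is mounted at /data and the network share at /mnt/share (both paths are assumptions):

        sudo fsfreeze -f /data      # flush buffers and block new writes
        sudo dd if=/dev/md0 of=/mnt/share/md0.img bs=4M conv=fsync
        sudo fsfreeze -u /data      # thaw the filesystem

    Writes block for the whole copy, so this suits a quiet window; an LVM snapshot avoids the pause if LVM is in use. An image taken this way should then convert cleanly with VBoxManage convertfromraw.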

  • Bacula optimization/profiling tools

    - by pufferfish
    I'm trying to get an idea of where the bottlenecks are in our backup system. Are there tools available for profiling this? If not, any pointers to a home-grown method would also help. I guess most of the info would be in the Bacula logs, but I'd also like to see what gets saturated during despooling: disk, CPU, or network? This feels like a problem most Bacula admins will have encountered.
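
    In the absence of a Bacula-specific profiler, a rough sketch is to run the standard sysstat monitors on the storage daemon host while a job despools and watch which resource pegs first:

        # run these side by side during a despool (5-second samples)
        iostat -x 5     # %util near 100 on the spool or tape device = disk-bound
        sar -n DEV 5    # rxkB/s and txkB/s against the NIC's rated throughput
        mpstat 5        # CPU saturation, e.g. from compression or encryption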

  • Can rdiff-backup do incremental backups?

    - by Mirage
    I am new to Ubuntu and have installed rdiff-backup. I have a folder called sqlfiles on a remote FTP server; the SQL files are kept there for three days and then deleted, but I want to download all copies to my local computer. I want incremental backups on my local server such that: 1) if a file is unchanged, it is not copied; 2) if it is different, it is overwritten; 3) if a file exists in the local directory but not on the FTP server, it is left as it is. How can I apply those rules with rdiff-backup?
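
    rdiff-backup itself cannot speak FTP, so a hedged alternative that matches all three rules is lftp's mirror mode (the host, user, and paths below are assumptions):

        # fetch only files that are new or changed; local-only files are untouched
        lftp -e "mirror --only-newer /sqlfiles /local/sqlfiles; quit" ftp://user@ftp.example.com

    Because mirror deletes nothing unless --delete is given, files that have already vanished from the FTP server stay in the local copy, which is exactly rule 3.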

  • Rsync command to synchronize two NTFS drives?

    - by radman
    I have two 1.5 TB drives containing numerous video, audio, and document files that I would like to mirror to two other drives for backup. I would like to do this using rsync (as it seems the most appropriate tool). What command should I use, and is there anything to be aware of when rsyncing NTFS-partitioned drives/files?
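
    A sketch of a reasonable starting point, assuming the drives are mounted at /media/src and /media/dst (both paths are assumptions): NTFS mounted via ntfs-3g does not carry Unix owners or permissions, so the usual -a is replaced by recursion plus timestamps, compared with a little slack:

        rsync -rtv --modify-window=1 --delete /media/src/ /media/dst/

    --modify-window=1 absorbs timestamp rounding between filesystems, and --delete makes the destination a true mirror; drop it if you want the copy to be additive only.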

  • What are the differences between rsync applications to use for backups?

    - by Thr4wn
    There are many concerns about backing up servers (naturally), and the best response I found for backup best practices was http://serverfault.com/questions/512/best-practices-to-keep-your-computers-backed-up-efficiently/1076#1076. Many people recommended rsync, but there are many rsync-based applications out there (like rsync-backup and duplicity), and I want to know the trade-offs and recommendations for which one to use. Is there one that is simply newer and technically superior to all the rest?
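
    For a concrete feel of the trade-off, here are the two tools most often recommended, invoked side by side (the host and paths are made up for illustration):

        # rdiff-backup: the latest state is a plain browsable mirror, with
        # history kept as reverse diffs alongside it
        rdiff-backup /home/me user@backuphost::/backups/me

        # duplicity: everything is stored as encrypted incremental volumes,
        # so the remote end never sees plaintext
        duplicity /home/me sftp://user@backuphost/backups/me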

  • Restoring a database

    - by jjj
    I have two different server database connections, let's call them j and k. I have a backup file on j, which I named jtok.bak. Now I am trying to restore a database named kkk on k from jtok.bak. Is there a way to do that?
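
    The .bak extension suggests SQL Server, so here is a hedged sketch using sqlcmd, assuming jtok.bak has already been copied to server k; the logical file names and paths are assumptions, so list them first:

        REM inspect the logical file names stored in the backup
        sqlcmd -S k -Q "RESTORE FILELISTONLY FROM DISK = 'C:\backups\jtok.bak'"

        REM restore as database kkk, remapping the files to new paths
        sqlcmd -S k -Q "RESTORE DATABASE kkk FROM DISK = 'C:\backups\jtok.bak' WITH MOVE 'jjj_data' TO 'C:\data\kkk.mdf', MOVE 'jjj_log' TO 'C:\data\kkk_log.ldf'"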

  • Backing up Oracle to tape

    - by andreas
    Our Oracle database has grown very large lately (roughly 400-500 GB), and backing up to the filesystem no longer scales for us. We are looking at using RMAN to back up to tape (directly, not to the filesystem and then to tape). Can anyone shed some light on this?
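
    RMAN reaches tape through a vendor-supplied media-management library (the SBT interface), so that has to be installed and configured first. Once it is, a minimal sketch looks like this; the channel settings are assumptions that depend on the library:

        rman target / <<'EOF'
        CONFIGURE DEFAULT DEVICE TYPE TO sbt;
        CONFIGURE DEVICE TYPE sbt PARALLELISM 2;
        BACKUP DATABASE PLUS ARCHIVELOG;
        EOF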

  • How can I back up my iPhoto library to DVDs?

    - by Patrick McElhaney
    I'm using iPhoto '09 and have an 80 GB library. I want to back everything up to DVDs, because I figure that's the cheapest and most reliable solution. (I plan to keep a couple of copies in different places.) Ideally, after the initial backup, I'd back up everything that has changed (new photos, edits, metadata) to a single DVD every couple of months and add it to the set. How would you go about doing that?

  • IBM LTO 3 Tape drive periodically going "offline"

    - by bruno077
    I have a problem with an LTO 3 tape drive, using BrightStor ARCserve Backup on Windows Server 2003. Sometimes the scheduled backups stop working, and going into "Devices" inside the software reveals that the tape drive is offline. The only way to make it work again is to uninstall the tape drive's driver and reboot the server: Windows then auto-detects the drive, reinstalls the required drivers, and the tape drive works again. Has anyone ever encountered a similar problem?

  • Back up Google Calendar programmatically: https://www.google.com/calendar/exporticalzip

    - by Michael
    I'm struggling to write a Python script that automatically grabs the zip file containing all my Google calendars and stores it (as a backup) on my hard disk. I'm using ClientLogin to get an authentication token (and can successfully obtain the token). Unfortunately, I'm unable to retrieve the file at https://www.google.com/calendar/exporticalzip It always asks me for the login credentials again by returning a login page as HTML (instead of the zip). Here's the critical code:

        import urllib
        import urllib2

        # token, zip_url, and header are set up earlier in the script
        post_data = urllib.urlencode({'auth': token, 'continue': zip_url})
        request = urllib2.Request('https://www.google.com/calendar', post_data, header)
        try:
            f = urllib2.urlopen(request)
            result = f.read()
        except urllib2.URLError as e:
            print "Error:", e

    Does anyone have any ideas, or has anyone done this before? Or an alternative idea for how to back up all my calendars (automatically)?

  • Backup and restore database

    - by Suman.hassan95
    I am using MySQL as my database server. I want to take a backup of the database and restore it in case of a machine crash. I am entering the data through a GUI in Windows. I googled and read a lot about automysqldump, but I couldn't find a downloadable exe; I downloaded the .sh file, but I am very confused about how to use it. I don't need to use automysqldump specifically; I just want some way to back up and restore a database on Windows. Please explain in detail; it's quite urgent.
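
    On Windows, the simplest route is the mysqldump client that ships with MySQL itself; a hedged sketch, where the user, database, and paths are assumptions:

        REM back up a single database to a plain SQL file
        mysqldump -u root -p mydatabase > C:\backups\mydatabase.sql

        REM restore it after a crash (create the empty database first)
        mysql -u root -p mydatabase < C:\backups\mydatabase.sql

    Wrapping the first command in a Windows Scheduled Task gives automated backups without any extra tooling.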

  • MS DPM 2007: Testing the Recovery for a Production Domain

    - by NewToDPM
    MS DPM 2007 is a new technology in my company, and so am I to the product. We have a classic Microsoft domain with two DCs, Exchange 2007, and a couple of web/MS SQL servers. I deployed DPM on the domain a month ago, and after fixing various replica-inconsistency issues and adapting the schedule and retention range to the size of the server storage pool, I can say the backup system is working correctly (no errors) as of today.

    However, there is one problem: we have not yet attempted to restore from the backups, which is a big no-no of course. I'm not sure how to handle this, my main concern being Exchange and the System State of the DCs. From my understanding, DPM can only protect and restore data on a server that is part of the same domain as the backup server. If I restore the System State (containing Active Directory) and the Exchange storage groups to a testing server, I am afraid it would completely disturb the functioning of the domain (for example, by having two primary DCs on the domain). I am thinking about building a second DPM server in a separate testing domain that would mirror the replicas, and then restoring to testing servers in that new domain.

    Is that the right way to handle recovery testing? How did you do it on your domain when you first deployed DPM? I'd be grateful for any link, documentation, or advice. Thank you in advance for your help!

    EDIT: Two options seem possible so far: (i) create another DC/Exchange server in the alternate location; (ii) create a separate domain in the alternate location and set up a trust between that domain and the production one. Option (i) is certainly the best, but it implies setting up a secondary Exchange server with a dedicated public IP address so that if Exchange #1 dies we can still send email with Exchange #2; I don't know how complex that would be and need to discuss it with my colleagues. Option (ii) would only fit testing purposes. My only question regarding it: if my production and DPM servers are part of domain A, and there is a trust between domains A and B, can I restore domain A content to any domain B server?

  • Piping from tar to FTP

    - by facha
    I have FTP access to a server I do not control, and I'd like to start sending archives of my server's filesystem to it. The problem is that I don't have enough free space on my system to create a backup archive first (storing it on my own filesystem) and then send it to the FTP server. So I'm wondering if it is possible to do something like this: tar -jcpvf - / | ftp-put ftp://user:pass@host/file.tbz Normally there would be no problem doing this over ssh, but in this case I only have FTP available.
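
    A sketch of one real-world stand-in for the hypothetical ftp-put: curl can upload its standard input straight to an FTP URL, so the archive never touches the local disk:

        # -T - means "upload from standard input"
        tar -cjpf - / | curl -T - ftp://user:pass@host/file.tbz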

  • SQL Server 2012 Backups Not Compressing

    - by Chris
    We recently upgraded from SQL Server 2008 R2 to SQL Server 2012 Enterprise. After doing this, I noticed that our DB backups were barely compressing at all: our 22 GB DB compressed down to about 4 GB before the upgrade, but now only compresses down to 19 GB. I've checked and double-checked the compression setting on the server and on all of my backup jobs, and compression is turned on. Any ideas on what may be going on?
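
    A hedged first step is to confirm what the server is actually doing rather than what the jobs are configured to do; the instance name below is an assumption:

        REM instance-wide default (1 = compress unless a job says otherwise)
        sqlcmd -S myserver -Q "SELECT name, value_in_use FROM sys.configurations WHERE name = 'backup compression default'"

        REM compare raw vs. compressed sizes of recent backups
        sqlcmd -S myserver -Q "SELECT TOP 5 database_name, backup_size, compressed_backup_size FROM msdb.dbo.backupset ORDER BY backup_finish_date DESC"

    If compressed_backup_size tracks backup_size closely, the data itself is resisting compression; one common culprit is Transparent Data Encryption, since encrypted pages compress poorly.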

  • How do I preserve ownership permissions when copying to an external volume?

    - by Yitzchak
    When I upgrade or reinstall Linux, I back up my home folder by running sudo cp -pr /home/users/yitzchak /media/externalHDD/backups. When I do this, I get errors saying that permissions could not be preserved, and when I copy the folder back onto my local disk I see that ownership has been changed to root, so I have to chown everything back, which must be done piecemeal because not all files have the same group. Is there any way around that?
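
    If the external drive is formatted FAT32 or NTFS, it simply cannot store Unix ownership, which would explain the errors. A hedged workaround is to archive into a single file, so ownership lives inside the archive rather than on the external filesystem (paths follow the question):

        # back up: -p keeps owners, groups, and modes inside the archive
        sudo tar -cpzf /media/externalHDD/backups/yitzchak.tar.gz -C /home/users yitzchak

        # restore: run as root, tar re-applies the stored ownership by default
        sudo tar -xpzf /media/externalHDD/backups/yitzchak.tar.gz -C /home/users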

  • Encrypt Windows 8 File History

    - by SnippetSpace
    File History is great, but it saves your files on the external drive without any encryption and stores them using exactly the same folder structure as the originals. If a bad guy gets his hands on the drive, it could hardly be easier to get at your important files. Is there any way to encrypt the File History backup without breaking its functionality and without having to encrypt the original content itself? Thanks for your input!
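
    One hedged approach that leaves File History itself untouched is to BitLocker-encrypt the destination drive; File History keeps writing normally once the volume is unlocked. A sketch from an elevated prompt, assuming the external drive is E: and your Windows edition includes BitLocker:

        REM encrypt the File History drive and generate a recovery password
        manage-bde -on E: -RecoveryPassword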
