Search Results

Search found 6101 results on 245 pages for 'incremental backup'.

Page 72/245 | < Previous Page | 68 69 70 71 72 73 74 75 76 77 78 79  | Next Page >

  • How to copy files from HDD to HDD with integrity checking

    - by RafaelM
    I am moving data from an almost dead HDD to an external USB drive using Linux, because for some reason Windows cannot see the data. I want to copy a large amount of data from the HDD to the USB drive with integrity checking. I thought about copying everything over and then verifying with md5summer, but that would take a really long time because there is a lot of data and this is not a very powerful PC. What tool can I use to do this on Linux?
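
    A minimal sketch with rsync, assuming the dying disk is mounted at /mnt/dying and the USB drive at /mnt/usb (both placeholders): copy first, then let a checksum pass re-read both sides and report anything that differs:

        # copy, keeping attributes and showing progress
        rsync -avP /mnt/dying/ /mnt/usb/
        # verify: -c compares full checksums, -n makes it report-only
        rsync -avcn --itemize-changes /mnt/dying/ /mnt/usb/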

  • Nearest PC equivalent to Mac Target Disk Mode?

    - by username
    Mac firmware has a special boot mode that allows you to offer the machine's internal HDD to another computer as an external disk (you just connect the two machines via an IEEE 1394 cable). Only the second machine needs a functioning OS installed. Any good suggestions for something similar on the PC side of things? Block-level access isn't important to me; I'd just like to be able to copy files off it. It doesn't matter to me if it uses Ethernet, IEEE 1394, or wifi - I just like having a quick way to access files on a client PC. Is there any single-purpose Linux distro specially designed to do this? It'd be nice to have something super simple, quick-booting, and small that I could install on a USB drive. I used to use Knoppix, but it's overkill as a Target Mode replacement.
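
    One improvised equivalent, sketched under the assumption of a generic live Linux stick (addresses, device names and the SSH service name are placeholders and vary by distro): boot the client PC from the stick, share its internal disk over SSH, and pull files from the other machine:

        # on the client PC, booted from the live stick
        sudo mount -o ro /dev/sda1 /mnt       # expose the internal disk read-only
        passwd                                # give the live user a password first
        sudo systemctl start sshd             # the service is "ssh" on Debian-likes
        # on the machine doing the copying
        rsync -avP user@192.168.1.50:/mnt/ ./rescued-files/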

  • Rsync and wildcards

    - by Jay White
    I am trying to back up both the "Last Session" and "Current Session" files for Google Chrome in one command, but using a wildcard doesn't seem to work. I am trying the following command:

        rsync -e "ssh -i new.key" -r --verbose -tz --stats --progress --delete '/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/*Session' user@host:"/chrome\ sessions/"

    and get the following error:

        rsync: link_stat "/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/*Session" failed: No such file or directory (2)

    What am I doing wrong?
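
    The likely cause: the single quotes stop the shell from expanding *Session, and rsync does not glob local source paths itself. A sketch of the usual fix: keep the part with spaces quoted but leave the wildcard outside the quotes so the shell can expand it:

        rsync -e "ssh -i new.key" -rtz --verbose --stats --progress --delete \
            '/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/'*Session \
            user@host:"/chrome\ sessions/"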

  • Carbonite has taken over my iMac

    - by Larry Rothfork
    I used Carbonite to back up 75GB on my iMac. I also created a folder on my iMac to copy files to from an external hard drive, and then used Carbonite to back up from there. And THEN, thinking I had everything safely backed up, and in order to make room on my hard drive, I DELETED some of those files; instead of increasing, my free disk space has shrunk to 2GB. I know, I know, you can't use Carbonite like that, but now I have two questions. 1) What is the explanation for the decrease in disk space even though I have deleted about 20GB of those backed-up files from my hard drive? It must have something to do with the way Carbonite references backed-up files. And 2) Is there a way to extricate myself from this situation?

  • Looking for an open source email archiving application

    - by Joel
    I'm looking for an open source application that will archive my email. It might do this by logging in to my POP3 account on a regular basis and copying the emails across, or it might just read my Unix mbox/maildir file/directory directly on the mail server. It must be open and it must run on Linux (or any open OS, actually). Ideally it would have a web interface, but this is not a major requirement. MXsense (http://www.mxsense.com/mxsense.html) seems to be pretty much what I want, except it's not open. I have no requirement for MS Exchange support. Any suggestions? The rationale (maybe a bit silly) is that I run Linux exclusively and it still doesn't have an email client that is anywhere close to MS Outlook in terms of awesomeness, so I find myself switching between mail clients often. I would feel better about this if I had an archive of my emails, so it wouldn't matter which mail client I was using this month.
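
    A minimal sketch of the fetch-and-store half, assuming getmail run from cron; the server, credentials and paths are placeholders:

        # ~/.getmail/getmailrc
        [retriever]
        type = SimplePOP3SSLRetriever
        server = pop.example.com
        username = joel
        password = secret

        [destination]
        type = Maildir
        path = ~/MailArchive/

        [options]
        delete = false      # leave mail on the server for the everyday client
        read_all = false    # only fetch messages not retrieved before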

  • Windows XP restore point file from disk.

    - by Dragos Toader
    Suppose I copied a Windows XP restore point to a USB memory stick, i.e. copied

        C:\System Volume Information\MountPointManagerRemoteDatabase
        C:\System Volume Information\tracking.log
        C:\System Volume Information\_restore{45B5E8B9-949A-471E-999D-F381DA56A2D3}
        C:\System Volume Information\catalog.wci

    to F:\System Volume Information\. How can I restore this restore point? Can I fool the system into using those files by copying them back from F:\System Volume Information\ into the restore point folder C:\System Volume Information\?

  • recover files from encrypted home folder

    - by maskiepop
    I can't seem to find the answer to my questions, hence my posting this here. 1) I have encrypted my home folder in Linux Mint 15 Cinnamon x64. If I create images of the partitions via fsarchiver, how do I go about selectively restoring home folders and files from the filesystem images? 2) I haven't done this yet, but can I restore one user's home folder into another user's, both unencrypted? Is that a fairly common thing to do in Linux/Ubuntu? I mean, is the process fairly straightforward? What if the home folder I want to copy over to another user is encrypted? Thanks.
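
    A sketch for both questions, with placeholder devices, archive IDs and usernames: fsarchiver restores whole filesystems, so restore the image to a spare partition and cherry-pick files from there; for an encrypted home, decrypt the restored copy with ecryptfs-recover-private (part of ecryptfs-utils), which asks for the login passphrase:

        # restore the imaged filesystem somewhere harmless, then mount it
        fsarchiver restfs home.fsa id=0,dest=/dev/sdb1
        mount /dev/sdb1 /mnt/restored
        # decrypt the eCryptfs home found in the restored tree
        sudo ecryptfs-recover-private /mnt/restored/.ecryptfs/username/.Private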

  • Dell External SAS 5/E HBA and Hyper-V

    - by JohnyD
    I have a Dell R710 running Windows 2008 R2 + Hyper-V with dual SAS 5/E HBAs. I'm building a Linux VM to install Bacula on, and I need to connect it to my Dell PowerVault 124T via the SAS HBA. I've been doing some looking online and have yet to find a straightforward answer on how to connect a SAS HBA to a VM, let alone a Linux VM. The flavor is 32-bit Ubuntu.

  • How to recover MySQL database from .mysql file?

    - by Brayn
    We had some problems with our MySQL server, and somehow all I've got is a database.mysql file for the database I want to restore. I've done a bit of googling but didn't find anything about how I should handle this type of file. It's worth mentioning that the server was running Plesk and the database wasn't using InnoDB. Edit: I forgot to mention that I don't know what application created the .mysql file and that it's in binary format. Thanks.
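
    A first diagnostic step worth sketching (filenames are placeholders): find out what the file actually is before choosing a restore path:

        # SQL dump, compressed dump, or raw binary datafile?
        file database.mysql
        # if it reports SQL/ASCII text:
        mysql -u root -p restored_db < database.mysql
        # if it reports gzip compressed data:
        zcat database.mysql | mysql -u root -p restored_db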

  • MySQL: stopping just one DB to allow it to be moved

    - by DrStalker
    I want to do some work on the files that make up a few MySQL DBs (moving the files to a different partition and symlinking the original location to them), and if possible I'd like to shut down just the database being moved rather than shutting MySQL down altogether. Is there any way in MySQL to do this, or will I need to do a full MySQL shutdown to be able to move the files?
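
    MySQL has no per-database stop, but for MyISAM tables a global read lock held for the duration of the move is usually enough. A hedged sketch with placeholder paths (it does not apply to InnoDB tables, whose data lives in the shared tablespace):

        mysql> FLUSH TABLES WITH READ LOCK;  -- block writes, flush table files to disk
        -- keep this client open, then in another shell:
        $ mv /var/lib/mysql/mydb /data/mysql/mydb
        $ ln -s /data/mysql/mydb /var/lib/mysql/mydb
        mysql> UNLOCK TABLES;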

  • Backing up Initial and Running configurations for Nortel Baystack 325-24G

    - by i.h4d35
    I recently came across a Nortel BayStack 325-24G switch. This is the first time I've come across a Nortel device of any sort, so I am a little intimidated. My problem is that I have been trying to get the startup and running configurations via both the CLI and the menus, but it's become quite apparent that it isn't like Cisco switches/routers. I've searched online but have only found configuration guides by Avaya. Also, I'd like to know: is there a way to take backups regularly (something like TFTP)? Pardon me, but I'm a n00b when it comes to routers and switches. Thanks in advance. EDIT: Still haven't found a way to get the running config via the CLI.

  • stsadm -o. What does the -o mean?

    - by ddono25
    I am working on a large SharePoint farm, mainly with the backend SQL Servers. We have always used stsadm -o for all stsadm functions, but no one seems to know why. I can't seem to find the info specific to stsadm; would it be general Windows command-line syntax?
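
    For reference, a hedged example (the URL and filename are placeholders): -o simply names the operation stsadm should run, and the remaining switches belong to that operation:

        stsadm -o backup -url http://sharepoint/sites/team -filename C:\backups\team.bak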

  • backing up a virtual machine

    - by ErocM
    I inquired with justcloud.com support, telling them that I have a VMware VM and asking whether it could be backed up while in use. I can back up the VM once it is shut down, but I was wondering if their "shadow copy" would back it up while running. This was their response:

        Thank you for your email. I am really very sorry but virtual machines can't be backed up for a simple reason that they are virtual, they have virtual memory, not physical memory. Please let me know if there is anything else I can help with. Kind Regards, Barry James User Experience Team www.justcloud.com

    These are physical files, so I wasn't sure I even understood the response. Am I wrong in thinking that a VM can be backed up while in use? Does this response even make sense? I need a cheap alternative for backing up the VM off the server in case it goes down. Any suggestions?
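
    For what it's worth, one common pattern, sketched with placeholder paths (the -T flag depends on the VMware product): take a snapshot so the base disk files stop changing, copy the VM directory off, then merge the snapshot back. vmrun ships with most VMware products:

        # freeze the base .vmdk files behind a snapshot
        vmrun -T ws snapshot "/vms/myvm/myvm.vmx" backup-snap
        # base disks are now stable; copy the VM directory off the server
        rsync -avP /vms/myvm/ backuphost:/backups/myvm/
        # merge the snapshot back once the copy finishes
        vmrun -T ws deleteSnapshot "/vms/myvm/myvm.vmx" backup-snap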

  • Automate backing up e-mails in Outlook Express

    - by Michael Itzoe
    My client is a small business (three employees) that uses Outlook Express. They'd like to back up their email. I showed them how to export, but they balked at that. Is there a way I can automate exporting email? They already have a batch file they use that zips a copy of their data and I'd like to be able to add something to that to include email. Is this possible?
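
    A hedged addition to that batch file, assuming a default Windows XP profile; Outlook Express keeps its message store (.dbx files) under the Identities folder, so copying that folder while OE is closed captures the mail (the destination path is a placeholder):

        rem back up the Outlook Express store; run while OE is closed
        xcopy "%UserProfile%\Local Settings\Application Data\Identities" ^
              "D:\Backup\OutlookExpress" /E /H /I /Y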

  • Bash script to keep last x number of files and delete the rest

    - by Brady
    I have this bash script which nicely backs up my database on a cron schedule:

        #!/bin/sh
        PT_MYSQLDUMPPATH=/usr/bin
        PT_HOMEPATH=/home/philosop
        PT_TOOLPATH=$PT_HOMEPATH/philosophy-tools
        PT_MYSQLBACKUPPATH=$PT_TOOLPATH/mysql-backups
        PT_MYSQLUSER=*********
        PT_MYSQLPASSWORD="********"
        PT_MYSQLDATABASE=*********
        PT_BACKUPDATETIME=`date +%s`
        PT_BACKUPFILENAME=mysqlbackup_$PT_BACKUPDATETIME.sql.gz
        PT_FILESTOKEEP=14
        $PT_MYSQLDUMPPATH/mysqldump -u$PT_MYSQLUSER -p$PT_MYSQLPASSWORD --opt $PT_MYSQLDATABASE | gzip -c > $PT_MYSQLBACKUPPATH/$PT_BACKUPFILENAME

    The problem is that it keeps dumping backups into the folder without cleaning up old files. This is where the variable PT_FILESTOKEEP comes in: whatever number it is set to is the number of backups I want to keep. All backups are timestamped, so ordering them by name descending gives the latest first. Can anyone please help me with the rest of the bash script to add the clean-up of files? My knowledge of bash is lacking and I'm unable to piece together the code to do the rest.
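
    A minimal sketch of the clean-up, appended to the end of the script above; it relies on the timestamped names containing no spaces, which this naming scheme guarantees:

        # keep the newest $PT_FILESTOKEEP dumps, delete the rest
        cd "$PT_MYSQLBACKUPPATH" || exit 1
        ls -1t mysqlbackup_*.sql.gz | tail -n +$(($PT_FILESTOKEEP + 1)) | xargs -r rm -f
        # -r (GNU xargs) skips rm entirely when nothing is old enough to delete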

  • How to dump remote database without mysqldump?

    - by deceze
    I want to dump the database on my remotely hosted site in regular intervals using a shell script. Unfortunately the server is locked down pretty tight, has no mysqldump installed, binary files can't be executed by normal users/in home directories (so I can't "install" it myself) and the database lives on a separate server, so I can't grab the files directly. The only thing I can do is log into the webserver via SSH and establish a connection to the database server using the mysql command line client. How can I dump the contents to a file a la mysqldump in SQL format? Bonus: If possible, how can I dump the contents directly to my end of the SSH connection?
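
    One workaround sketch, assuming the database server accepts connections from the webserver (it must, for the site to work) and using placeholder hostnames and credentials: tunnel the MySQL port over SSH and run mysqldump on your own machine. This also covers the bonus question, since the dump is written at your end of the connection:

        # forward local port 3307 to the database server, hopping via the webserver
        ssh -f -N -L 3307:dbserver.example.com:3306 user@webserver.example.com
        # run the dump locally through the tunnel
        mysqldump -h 127.0.0.1 -P 3307 -u dbuser -p dbname > dump.sql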

  • Areca 1280ml RAID6 volume set failed

    - by Richard
    Today we hit some kind of worst-case scenario and are open to any kind of good ideas. Here is our problem: we are using several dedicated storage servers to host our virtual machines. Before I continue, here are the specs:

    - Dedicated server machine
    - Areca 1280ML RAID controller, firmware 1.49
    - 12x Samsung 1TB HDDs

    We configured one RAID6 set with 10 discs that contains one logical volume, and we have two hot spares in the system. Today one HDD failed. This happens from time to time, so we replaced it. Upon rebuilding, a second disc failed. Normally this is no fun, so we stopped heavy IO operations to ensure a stable RAID rebuild. Sadly the hot-spare disc failed while rebuilding and the whole thing stopped. Now we have the following situation:

    - The controller says that the RAID set is rebuilding
    - The controller says that the volume failed

    It is a RAID6 system and two discs failed, so the data has to be intact, but we cannot bring the volume online again to access it. While searching we found the following leads; I don't know whether they are good or bad:

    1. Mirroring all the discs to a second set of drives, so we would have the possibility to try different things without losing more than we already have.
    2. Trying to rebuild the array in R-Studio, but we have no real experience with the software.
    3. Pulling all drives, rebooting the system, changing into the Areca controller BIOS, and reinserting the HDDs one by one. Some people say they brought the system online this way; some say the effect is zero; some say they blew the whole thing.
    4. Using undocumented Areca commands like "rescue" or "LeVel2ReScUe".
    5. Contacting a computer forensics service, but whoa... primary estimates by phone exceeded 20,000 €.

    That's why we would kindly ask for help. Maybe we are missing the obvious? And yes, of course we have backups, but some systems lost one week of data, which is why we'd like to get the system up and running again. Any help, suggestions and questions are more than welcome.
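
    For lead 1, a minimal sketch using GNU ddrescue (device names and the map file are placeholders; each source disc gets its own target disc and map file). Unlike plain dd, ddrescue retries and maps bad sectors, which matters on drives that are already failing:

        # clone one RAID member; repeat per drive with its own target and map file
        ddrescue -f /dev/sda /dev/sdm sda.map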

  • How to copy files from shadow copy with long source path

    - by Jake
    The files and folders in my shared network drive (set up with DFS) were mass-deleted. Currently I am trying to recover the files from the shadow copy "Previous Versions". The problem is that thousands of files are deeply nested with long paths, making the file paths too long: when copying, Explorer shows the dialog "Source Path Too Long". My guess is that the file paths just barely hit the limit when saved into the network drive, but the shadow copy service appends the date and time to the folders, so the path character limit is exceeded. How else can I copy the files from the shadow copy?
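
    One approach worth trying, with placeholder paths: robocopy is not bound by the 260-character MAX_PATH limit that Explorer's copy dialog enforces, and it can read the snapshot through the @GMT path token that Previous Versions exposes (the timestamp below is a placeholder):

        robocopy "\\server\share\@GMT-2013.05.01-12.00.00\folder" "D:\Restore\folder" /E /COPY:DAT /R:1 /W:1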

  • Is LiveDrive.com reliable?

    - by Marc
    I'm currently using Dropbox (50GB account), which works fine, but at the moment I'm not impressed with its speed. I have a 60/6 (down/up) connection and only LiveDrive can use almost the full bandwidth of it. Dropbox is often very slow (avg. 100-500Kb/s, compared to LiveDrive at 6MB/s). If I only look at speed and storage costs then LiveDrive is much better, but I don't have enough experience with it to be able to say anything about reliability. Can anyone comment on this? Thanks.

  • Help creating image from LVM

    - by jackhab
    I need to duplicate a CentOS hard drive image for multiple stations. The HD has the following layout:

        Disk /dev/sdb: 250GB
        Sector size (logical/physical): 512B/512B
        Partition Table: msdos

        Number  Start   End    Size   Type     File system  Flags
        1       32.3kB  107MB  107MB  primary  ext3         boot
        2       107MB   250GB  250GB  primary               lvm

    I saved /dev/sdb1 to file with fsarchiver, but for sdb2 I get:

        /fsarchiver savefs an2.fsa /dev/sdb2
        oper_save.c#1006,filesystem_mount_partition(): can't detect and mount filesystem of partition [/dev/sdb2], cannot continue. removed an2.fsa

    Although fsarchiver probe simple correctly detects sdb2 as LVM2_member. Is fsarchiver the correct tool for this job? What's wrong? I'm on Ubuntu 9.1 with fsarchiver 0.6.8 and LVM tools installed. Thanks.
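
    A likely fix, sketched with a placeholder volume group name: fsarchiver archives filesystems, and /dev/sdb2 is only an LVM physical volume, so point it at the logical volumes inside instead:

        # activate the volume group and list its logical volumes
        vgchange -ay
        lvs
        # archive the filesystem on the logical volume, not the raw PV
        fsarchiver savefs an2.fsa /dev/VolGroup00/LogVol00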

  • Better approach to archiving large amounts of original video footage using optical media (DVD/Blu-ray)

    - by Rob
    This question is to share my experience as well as ask for suggestions for better methods.

    Along with 2 friends, I completed the making of a short documentary film in 2006. A clip is at: http://www.youtube.com/mediamotioninvision The film was edited in Adobe Premiere Pro 1.5 on Windows XP. More details and a screenshot are here: http://www.flickr.com/photos/smilingrobbie/1350235514/ (note this is not intended to be a plug; we've moved on from this initial learning-curve project ;) )

    The film is in 4:3 standard definition 720x576 PAL format. As well as retaining the final 30-minute film, I wanted to keep all the original files that were assembled together to make it. The footage was 83.5GB, so I archived it to over 20 4.7GB DVD recordables in the original .avi format (i.e. data DVD-ROM format, NOT DVD-Video MPEG-2).

    Some .avi DV video files were larger than 4.7GB, so I used 7-Zip to split them (here is a guide on how to do that: http://www.linglom.com/2008/10/12/how-to-split-a-large-file-using-7-zip/). To recombine them, a DOS shell command like this does the job:

        copy /b file.avi.* file.avi

    where .* is a wildcard to include all the split parts, e.g. 001, 002...00n, assuming they are all in the same directory path folder; file.avi is the recombined file, identical to the original.

    Later on, I bought an LG BE06 LU10 USB 2.0 Super Multi Blu-ray burner and archived the footage to 2 (two) x 50GB BD-R DL discs, again in the original format, written as files to BD-R in the BD-ROM UDF format readable by PC/Mac etc., NOT the Blu-ray video/film format.

    This seems to be a good solution for me, because:

    - the archive is in a robust, reasonably permanent, non-volatile medium, i.e. DVD recordable / Blu-ray (debates about the stability of optical media organic chemical dyes/substrates aside)
    - the format of the archive is accessible by open source tools or just plain Windows Explorer, and it's not a proprietary format

    I just thought I'd ask folks for their experience of better methods, if such exist.
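
    One small addition that fits this workflow, sketched for the same shell as the copy command above (md5sum comes from Cygwin or GnuWin32 on Windows; filenames are placeholders): write a checksum manifest next to the files before burning, so later disc degradation is detectable rather than silent:

        rem before burning: record a checksum for every split part
        md5sum file.avi.* > manifest.md5
        rem years later, after copying the disc contents back to HDD:
        md5sum -c manifest.md5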
