Search Results

Search found 13175 results on 527 pages for 'live backup'.

  • Problem running Python backup code on Ubuntu?

    - by Akash
    I was trying to run the following Python code from the terminal, but something is wrong, as it produces errors:

        import os
        import time

        source = ['/home/akash/', '/home/akash/code']
        target_dir = '/media'
        target = target_dir + os.sep + time.strftime('%Y%m%d%H%M%S') + '.zip'
        zip_command = "zip -qr {0} {1}".format(target, ' '.join(source))
        if os.system(zip_command) == 0:
            print('Successful backup to', target)
        else:
            print('Backup FAILED')

    When I try to run it, the following error appears:

        zip I/O error: Permission denied
        zip error: Could not create output file (/media/20131019083404.zip)
        Backup FAILED
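
    The error is a filesystem permission problem rather than a Python one: on Ubuntu, /media is owned by root, so zip cannot create the archive there. A minimal sketch of two possible fixes (the backup directory names are assumptions) - either make a user-writable directory under /media once, or point target_dir somewhere already writable:

        # option 1: create a writable target under /media (run once)
        sudo mkdir -p /media/backups
        sudo chown "$USER" /media/backups   # then set target_dir = '/media/backups'

        # option 2: keep backups in the home directory instead
        mkdir -p ~/backups                  # then set target_dir = '/home/akash/backups'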

  • Shared storage solution for our SQL Server backups

    - by Gokhan
    We have 3 clustered SQL Servers with 5+ multi-terabyte databases, and their backup files (compressed using Quest LiteSpeed) are hitting over 600 GB each. We are required to keep at least one to two weeks of weekly full backups, then 6 days of differential backups, and one to two weeks' worth of log backups locally. We are currently limited to 2 TB volumes by our SAN team; we can have multiple volumes, but they are expensive ($200 per raw TB per month), and having to deal with many backup volumes instead of a single big volume is difficult. I think it would be good if we could have shared network storage of 20 TB+ (RAID 10 or so) for all our servers to keep the backups on; another department would then copy them to tape from the network storage and delete files according to the retention period. If this box came with a built-in operating system (even Unix - a complete file storage system), that would be good. What do you think - does this make sense to you, and is there any manufacturer that sells a storage product like that which would work in a clustered environment? Thank you
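
    A retention window like the one described is often enforced with a small scheduled cleanup job rather than by juggling volumes. A sketch using find's -mtime, assuming hypothetical /backups/full, /backups/diff and /backups/log directories and typical file extensions:

        #!/bin/bash
        # age out backups past the retention window (paths and extensions are assumptions)
        find /backups/full -name '*.bak' -mtime +14 -delete   # two weeks of weekly fulls
        find /backups/diff -name '*.bak' -mtime +6  -delete   # six days of differentials
        find /backups/log  -name '*.trn' -mtime +14 -delete   # two weeks of log backups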

  • Undelete files using a live CD

    - by doug
    I'm trying to recover some data using TestDisk, running Ubuntu from a live CD. When I choose the undelete option, I get the following message and it drops me out of TestDisk:

        file_undelete
        Aborted (core dumped)

    Do you know why? Can you give me some advice or tips about what to do?
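
    A crash like this often comes from the outdated TestDisk build shipped on older live CDs. Two things worth trying in the live session - sketched below, assuming network access - are installing a current build from the archives and falling back to PhotoRec, which ships in the same package and often works where undelete aborts:

        sudo apt-get update
        sudo apt-get install testdisk   # pull a newer build into the live session
        sudo testdisk /log              # /log writes testdisk.log, useful for reporting the crash
        sudo photorec                   # signature-based recovery tool from the same package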

  • Ubuntu 10.04 live CD keyboard problem

    - by Octa
    Today I downloaded the Ubuntu 10.04 x64 .iso, but when trying it out as a live CD, the keyboard didn't work - neither my laptop's own keyboard nor a Microsoft-branded one. I was planning to install Ubuntu on a partition, but I couldn't even enter my info in the installation process to do it. Is this a problem with the .iso or with a driver? I'm afraid that if I go ahead with the installation, the keyboard won't work after that either.
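
    One quick way to rule out a corrupted download is to verify the image's checksum against the hash published on the Ubuntu release page (the filename below is an assumption):

        md5sum ubuntu-10.04-desktop-amd64.iso
        # compare the printed hash with the MD5SUMS file for the release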

  • Process for migrating Dropbox to SpiderOak

    - by Marcel Janus
    I want to move my data from Dropbox to SpiderOak. I have 3 computers running Dropbox, but a poor WAN connection with very limited upload bandwidth. So I thought that, as a first step, I would install the Dropbox client on my server on the internet and download my Dropbox data there. After this, I would upload/back up my data from this server, over its broadband connection, to SpiderOak. Once the backup is completed, I would set up the sync between my 3 computers so that they won't have to upload the data again. Will this process work, so that I don't have to upload my data again over my WAN connection at home?
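
    For the first step, Dropbox can run headless on a Linux server. A sketch following Dropbox's documented headless install; treat the download URL as an assumption that may have changed:

        cd ~
        # fetch and unpack the headless Dropbox daemon
        wget -O - "https://www.dropbox.com/download?plat=lnx.x86_64" | tar xzf -
        ~/.dropbox-dist/dropboxd   # the first run prints a link for tying the server to your account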

  • Alternatives to CrashPlan for a VPS?

    - by Chloe
    I use SFTP Net Drive to mount a remote VPS so I can back it up. However, it has taken over 3 days to scan! I ran 'ls -lR' from my desktop over the mounted network drive and it only took about 5 minutes to list all the files! There are only about 5000 files and 2 GB. I know CrashPlan can run headless on the VPS itself, but that sounds like a pain to set up, and it takes so much memory on the server. The VPS doesn't have a lot of memory to spare - it has less than my desktop. Is there another program that can communicate with the CrashPlan backup protocol and has a command-line interface?
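
    As far as I know, CrashPlan's protocol is proprietary and nothing third-party speaks it. A lightweight alternative technique - plain rsync over SSH, pulled from the desktop so nothing heavy runs on the VPS - is sketched here, with hostname and paths as placeholders:

        # pull-style backup run from the desktop; 5000 files / 2 GB should take minutes, not days
        rsync -az --delete user@vps.example.com:/home/ ~/vps-backup/home/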

  • Including files in a symlinked directory when backing up with duplicity

    - by Rob
    I'm backing up using Duplicity - great tool. But I'm unable to include files in the backup when they sit under a directory that is a symlink. Using the following:

        duplicity <dup args> --include /var/www/**/current --exclude '**'

    duplicity will only back up the symlink itself. I've tried:

        duplicity <dup args> --include /var/www/**/current/* --exclude '**'
        duplicity <dup args> --include /var/www/**/current/** --exclude '**'

    and then not even the symlink is backed up. The "current" directory links to a directory like:

        /var/www/host.com/de9f2c7fd25e1b3afad3e85a0bd17d9b100db4b3

    which contains a few static HTML & CSS files. I want those files backed up, regardless of which SHA'd directory "current" points to. Any help appreciated.
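
    duplicity stores symlinks as symlinks and does not follow them, which is why the patterns never descend into the target. One workaround - a sketch, not duplicity's documented answer, with the backup target URL as a placeholder - is to resolve each link and include its real path instead:

        #!/bin/bash
        # build --include flags for the real target of every */current symlink
        INCLUDES=()
        for link in /var/www/*/current; do
            INCLUDES+=( --include "$(readlink -f "$link")" )
        done
        duplicity "${INCLUDES[@]}" --exclude '**' /var/www sftp://user@backuphost//backups/www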

  • Getting users' LastLogonTime on Live@edu using PowerShell

    - by Eagles
    I am trying to get a CSV file of all users in a Live@edu environment with their LastLogonTime, but I am having some issues. Here is my script:

        foreach ($i in (Get-Mailbox -ResultSize unlimited)) {
            Get-MailboxStatistics -LastLogonTime $i.DistinguishedName |
                where {$_.LastLogonTime} |
                select-object MailboxOwnerID,Name,LastLogonTime |
                export-csv -path "c:\filepath\UserLastLogon.csv"
        }

    I get the error:

        A positional parameter cannot be found that accepts argument
        '[email protected],OU=domain.edu,OU=Microsoft Exchange Hosted Organizations,DC=prod,DC=exchangelabs,DC=com'.
        + CategoryInfo: InvalidArgument: (:) [Get-MailboxStatistics], ParameterBindingException
        + FullyQualifiedErrorId: PositionalParameterNotFound,Get-MailboxStatistics

    Any help would be great!
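
    Get-MailboxStatistics has no -LastLogonTime parameter, so the distinguished name is being bound positionally and rejected; the mailbox belongs to -Identity, or can arrive via the pipeline. Export-Csv inside the loop also overwrites the file on every pass. A sketch of both fixes, in PowerShell against the standard Exchange cmdlets:

        Get-Mailbox -ResultSize Unlimited |
            Get-MailboxStatistics |                             # mailboxes bind via the pipeline
            Where-Object { $_.LastLogonTime } |
            Select-Object MailboxOwnerID, Name, LastLogonTime |
            Export-Csv -Path "c:\filepath\UserLastLogon.csv" -NoTypeInformation   # written once, not per user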

  • Live football stats API

    - by peks
    Hi everyone, I'm looking for a service that provides live football/soccer statistics (match and player stats, preferably) over an API or RSS (or whatever), or at least provides easily parsable stats. Does anyone have any suggestions? Thanks in advance, Martin

  • I'll be setting up a dedicated web server at work soon, my first non-hobby server - what should I know?

    - by Rogue Coder
    I've been running my own dedicated server with CentOS and a LAMP stack for 2-3 years now, but it has only been hosting my own websites, which aren't super important. However, I will soon be setting up a Linux web server and a Linux database server at work, and I'm wondering what important things I should be doing. It's an internal server only, so only people in the company can access it. Should I get a slave server for both of my servers for backups? If I do this, how many backups should I keep, and how often should those backups be made? Right now on my current server I run a nightly cron job to back up my MySQL databases (usually 40 MB files once compressed), and bi-weekly cron jobs to back up my web root; I just store these files on my local computer via FTP. Also, for an internal server like this, should I look at using lighttpd or nginx to increase performance, or will Apache be fine?
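
    For the database side, a nightly dump like the one described is usually one cron entry; a sketch, with paths and schedule as placeholders (note that % must be escaped in crontabs):

        # /etc/cron.d/db-backup -- nightly MySQL dump, compressed and dated
        0 2 * * * root mysqldump --all-databases --single-transaction | gzip > /var/backups/mysql-$(date +\%F).sql.gz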

  • How can I stop ntbackup requiring my new password every time I'm forced to change my Windows password?

    - by Lunatik
    I have a scheduled job that runs each night using ntbackup, copying a folder on my HDD to a network share. The problem is that every time I'm required to change my Windows password, I have to remember to change it in ntbackup as well; otherwise the backup fails silently, i.e. I get no warning that the backup isn't being done. Is there a way to schedule this job so that it automatically picks up my new Windows password, or is somehow not tied to my main login? My user account type is Debugger, not full Administrator, so I'm not sure if that restricts me in any way - e.g. a dedicated user account for this might still be forced through the four-weekly password change. The PC runs XP SP2 on a Windows Server 2003 R2 domain.

  • Is rsync --delete safe in case of disk failure

    - by enedene
    I have two data hard drives in my Linux server, and I use the second as a backup of the first. I use rsync for that purpose, for example:

        rsync -r -v --delete /media/disk1/ /media/disk2/

    This copies every file/directory from /media/disk1/ to /media/disk2/, but also deletes any difference. For example, say files A and B, but not file C, are on disk1, and disk2 has no files A and B but does have C. The result after the command is that disk2 has files A and B, and file C is deleted - just like disk1. Now, a rather disastrous scenario has crossed my mind: what if disk1 dies? The system would continue to work, since the system files are on my system disk, but when rsync tried to back up my data to disk2 from the broken disk1, would it delete all the files on disk2 because it can't read anything from disk1? Is this a possible scenario, or is there protection against it built into rsync?
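
    If disk1 is its own filesystem, a dead or unplugged drive typically leaves an empty mount point, and rsync --delete would then faithfully mirror that emptiness onto disk2. A common guard - a sketch reusing the paths from the question - is to refuse to run unless the source is actually mounted:

        #!/bin/bash
        # abort unless disk1 is really mounted, so --delete can't mirror an empty mount point
        mountpoint -q /media/disk1 || { echo 'disk1 not mounted, skipping backup' >&2; exit 1; }
        rsync -r -v --delete /media/disk1/ /media/disk2/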

  • How to back up OWA Exchange emails without Outlook

    - by fpghost
    My university uses Microsoft Outlook Web Access (OWA) e-mail. Soon they will close my e-mail account, as my course has ended, but I really do not want to lose my archive of emails and attachments. I've read that a backup to something called a .pst file can be made if one has access to Microsoft Outlook, but I do not. Is there any other way I could back these up? (Preferably on Ubuntu 12.04, but I also have access to Win 7 if need be.) One idea I played with is using DavMail to allow access via Thunderbird and performing some kind of backup with that. However, I cannot seem to get past:

        Authentication failed: invalid user or password, retry with domain\user
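
    That DavMail error usually means the account must be entered in DOMAIN\username form, and DavMail must point at the real OWA URL. A minimal davmail.properties sketch - the URL, path and port are assumptions that depend on the university's Exchange version:

        # davmail.properties -- point DavMail at the university's OWA endpoint
        davmail.url=https://owa.example.edu/owa
        davmail.imapPort=1143
        # then connect Thunderbird to localhost:1143 and log in as DOMAIN\username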

  • Uploading many large files to a remote server

    - by TiernanO
    I am in the process of creating an offsite backup and need to do an initial load of the data. Currently that's about 400 GB, give or take 10 GB or so. The backup system produces files of about 4 GB each, plus some other, smaller related files. So I need to transfer all 400-ish gigs to a remote server - but how? What is the best method? I have full remote access to the server, so I can install anything I need to install. There are Windows, Linux and Solaris VMs running on the box itself, so any of those can be used there, and I have Windows and Linux at home. I have 2 internet connections in the house, with 10 Mb/s upload on each, so something that could split the transfer across connections would be handy (kind of like GetRight, but in reverse... PutRight?).
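
    For a bulk seed like this, resumability matters more than raw speed: rsync over SSH keeps partial files and picks up where a dropped connection left off. A sketch with host and paths as placeholders; it could be run once per uplink against different subsets of the files to use both connections:

        # resumable bulk upload; -P = --partial --progress, so interrupted 4 GB files resume
        rsync -avP /backups/seed/ user@offsite.example.com:/backups/seed/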

  • Ways to deduplicate files

    - by User1
    I want to simply back up and archive the files on several machines. Unfortunately, among the files are some large ones that are the same file stored in different places on different machines. For instance, there may be a few hundred photos that were copied from one computer to the other as an ad-hoc backup. Now that I want to make a common repository of files, I don't want several copies of the same photo. If I copy all of these files to a single directory, is there a tool that can go through them, recognize duplicate files, and give me a list - or even delete one of the duplicates?
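
    fdupes is one such tool: it matches by size, then by MD5 hash, then byte-by-byte. A sketch against a hypothetical /srv/repository:

        fdupes -r /srv/repository        # list sets of duplicates recursively
        fdupes -r -d /srv/repository     # interactively pick which copy of each set to keep
        fdupes -r -d -N /srv/repository  # keep the first of each set, delete the rest, no prompts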

  • Backing up to a smaller drive

    - by Dave
    In a few hours I'll have a new 500 GB Sony laptop, filled with the usual Sony rubbish, which I'll promptly be replacing with Ubuntu or Crunchbang or something. However, first I want to make a full clone of the drive (including recovery partitions), should I wish to return it to Sony or sell it on in its factory state. The problem is that all the backup drives I have are smaller than 500 GB - the biggest is 250 GB or so! So I need to back up and compress on the fly. What's the best way to do this? Presumably dd piped into gzip would do the trick, or does anyone have any other suggestions for accomplishing this?
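
    The dd-into-gzip idea works, and a factory image is mostly empty space, so it will often (though with no guarantee) squeeze under 250 GB. A sketch, assuming the laptop disk is /dev/sda and the backup drive is mounted at /media/backup:

        # image the whole disk, compressing on the fly
        sudo dd if=/dev/sda bs=4M | gzip -c > /media/backup/sony-factory.img.gz

        # restore later with the inverse pipeline
        gunzip -c /media/backup/sony-factory.img.gz | sudo dd of=/dev/sda bs=4M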

  • Is it useful to check data integrity on a DAT tape?

    - by maxim
    I back up my data every day on tape using an HP StorageWorks DAT 160 drive. I use one tape for each day and rotate the tapes weekly. Every Monday I check one tape at random by recovering some files saved on it. I know that when data is saved to tape, the drive and the backup software check data integrity, but I wonder whether a manual check of some saved data makes sense or not. I re-use these tapes many times, and I want to be sure the data is safe.
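
    A random spot-restore only proves the files touched that day; a checksum manifest turns the Monday test into an end-to-end verification of whatever is restored. A sketch, with the data root and restore directory as assumed paths:

        # before backing up: record checksums relative to the data root
        cd /data && find . -type f -exec md5sum {} + > /var/lib/backup/manifest.md5

        # after a test restore into /restore/test: report only mismatches
        cd /restore/test && md5sum --quiet -c /var/lib/backup/manifest.md5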

  • Concerning persistence size in Linux Live USB Creator

    - by user63085
    Hello everyone! For the last several months I have used Linux Live USB Creator, a very useful app for putting portable OSes onto flash drives. I mostly use it to test and try out new OSes as they are released, before deciding on a hard disk installation. In many cases the application developers allow the "persistence" feature in the flash-drive-installed OS, which is another way of saying that across multiple boot-ups and shutdowns, all changes made to the OS are saved on the flash drive. But I have a question about the persistence size limit in Linux Live USB Creator (currently version 2.6). I installed Super OS 10 onto a 30 GB partition on my external drive and wanted to reserve 10 GB for persistence, so that I can install more applications and space will not run out as I update the installed applications or do system updates. Why is it that only 3950 MB can be allotted for persistence? It would be great if, when desired, much more persistence space could be set aside, so that the space will not run out so soon. Also, having installed the OS on a 30 GB partition, I tried to see how much space is left, but it seems only the remainder of the persistence space is displayed when I click on the File System folder - for example, right after installing there is 3.5 GB of free space. Where, and how, can I access the remaining 26 GB or so of space on the same drive? Most importantly, it would be a big relief if the persistence could somehow be expanded by a workaround, so that I can continue using my now heavily customized Super OS 10.04, which unfortunately has just over 576 MB of space left after I removed OpenOffice.org and installed LibreOffice earlier today - that is what remains of the maximum allowable 3950 MB set aside for persistence at set-up. Thanks in advance!
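
    The 3950 MB ceiling comes from the FAT32 filesystem most USB keys use: persistence lives in a single casper-rw file, and FAT32 caps any single file at 4 GB. The usual workaround for Ubuntu-based systems such as Super OS - a sketch; the device name is an assumption, and the command erases that partition - is to dedicate a second partition labelled casper-rw, which has no such limit:

        # create an ext4 persistence partition; double-check the device name first!
        sudo mkfs.ext4 -L casper-rw /dev/sdb2
        # boot the live system with the 'persistent' kernel option to use it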

  • ESXi disaster recovery plan

    - by Marlin
    I have a VMware infrastructure where I am using the free version of ESXi 5. As a result, I cannot use vMotion and the other cool features that come with the paid ESXi licenses. I am using snapshots as backups, but they are stored on the local hard drive. I need a better backup scenario, where I can recover in the event of a hard drive failure. I tried Openfiler but could not get it right. What backup method can I try, given my situation?
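
    One widely used option on the free license is the community ghettoVCB script, which runs in the ESXi shell and copies VMs (via temporary snapshots) to another datastore - for example an NFS export, which gets the backups off the local disk. A sketch; the flags match the script's documentation, but paths and the NFS target are assumptions:

        # on the ESXi host, after unpacking the script onto a datastore
        ./ghettoVCB.sh -a -g ghettoVCB.conf   # -a = all VMs; point VM_BACKUP_VOLUME in the conf at an NFS datastore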

  • How do you do offsite hard drive backups?

    - by kentchen
    I have been doing hard drive backups for a while, which I believe a lot of you do as well, but I'm having trouble figuring out a better way to store them offsite. I am wondering how you do that. Any policies or tips & tricks when it comes to storing your backups offsite - mainly hard drives, not tapes? Thanks in advance. [Update] Thanks for mentioning online backup. We are actually in the middle of that process, and I 100% agree that it's the ultimate way to go. However, considering the cost, it may not always be an option - it's quite expensive once you also consider the application level. I guess online backup could make a very good separate topic. :)

  • Hyper-V backup

    - by Ddave23
    We are trying to decide on a good backup strategy for our new Hyper-V setup. We have 3 VMs on a Windows 2008 R2 Hyper-V host. We installed Symantec Backup Exec 2010 on the host and have the Hyper-V agent installed. We would like to perform a full backup to tape at night, and an incremental twice a day to a daily tape. Our environment needs constant protection for our database (Microsoft Access). Any thoughts? Should I be looking at different software?

  • Best server sync software/methods [closed]

    - by Meep3D
    I have a test server at home and a test server at the office. I'd like to somehow sync multiple folders in both directions automatically, so I can work at home and also have an offsite backup. I've tried Live Sync (Microsoft's own product), but it chokes on large numbers of files and seems a bit rudimentary. Dropbox is also a bit small and does not adapt to our filesystem setup. I have seen a few online backup services, but none seemed geared to multiple computers using the same account. I don't mind paying a monthly fee provided the service is good. Suggestions would be gratefully appreciated!
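
    For two fixed endpoints, Unison is a common self-hosted answer: true two-way sync over SSH with conflict detection. A sketch with host and paths as placeholders:

        # two-way sync between the local tree and the office server, non-interactive
        unison /srv/projects ssh://office.example.com//srv/projects -auto -batch
        # schedule it with cron on whichever side should initiate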

  • Restore data from one Windows edition to another

    - by Lindhe94
    I have Swedish Windows 7 Home Premium on my PC, and I really want to change the system language to English. I know that Home Premium can't change system language (only W7 Ultimate does that), so I'm considering buying an English version and doing a clean install. However, I have many settings, programs, etc. that I don't want to lose. I therefore have two questions:

    1. Can I take a backup of my Swedish W7 Home Premium, install the English W7 Home Premium, and then restore everything back to normal, except that the system language is now English?
    2. Can I take a backup of my Swedish W7 Home Premium, install W7 Ultimate, and then restore everything back to normal (now with the option to change the system language)?

    Thanks!

  • Backing up user data while it may be in use - should I be concerned?

    - by jberryman
    This may be a dumb question. I would like to use duplicity to make backups to Amazon S3 of directories, each of which contains a different user's data. Each directory could be written to at any time. So I have two questions:

    1. Should I be concerned that a scheduled backup of a directory might occur in the middle of data being written to files in that directory, resulting in a corrupted backup?
    2. If that is a valid concern, how would I go about temporarily delaying the operation while I/O is happening, to try to minimize that effect?

    Thanks for the advice
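
    It is a valid concern: files changing mid-run can leave the backup internally inconsistent. Rather than delaying I/O, the usual approach is to back up a point-in-time snapshot. A sketch using LVM, assuming the user directories live on a hypothetical logical volume /dev/vg0/home and that AWS credentials are exported for duplicity:

        #!/bin/bash
        # freeze a point-in-time view, back it up, then drop the snapshot
        lvcreate --snapshot --name home_snap --size 5G /dev/vg0/home
        mkdir -p /mnt/home_snap && mount -o ro /dev/vg0/home_snap /mnt/home_snap
        duplicity /mnt/home_snap s3+http://my-backup-bucket/home   # bucket name is a placeholder
        umount /mnt/home_snap
        lvremove -f /dev/vg0/home_snap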

  • Remove folder structure from archive, ignore a folder while archiving, and fix an error

    - by Michael
    I am trying to make a script that backs up each of my Plesk hosts to an individual file, and I am having two problems. First, I would like to remove the folder structure from the archive - the tar is 3 folders deep - and I am getting this error:

        tar: Removing leading `/' from member names

    Second, I need my archive to ignore folders named "catch", because I don't need them in the archive. The code:

        FILES=/var/www/vhosts/*
        FNAME=""
        for f in $FILES
        do
            FNAME=`basename $f`
            tar cfv "/root/backup/ftp/$FNAME.tar" $f
        done

    Sample output:

        tar: Removing leading `/' from member names
        /var/www/vhosts/mydomain.com/
        /var/www/vhosts/mydomain.com/conf
        /var/www/vhosts/mydomain.com/etc/
        /var/www/vhosts/mydomain.com/etc/group
        /var/www/vhosts/mydomain.com/etc/termcap
        /var/www/vhosts/mydomain.com/etc/passwd
        /var/www/vhosts/mydomain.com/usr/
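
    Both problems can be handled inside the tar invocation: -C changes into the parent directory first, so members are stored relative to each vhost (which also silences the leading-/ message), and --exclude skips any component named "catch". A sketch of the amended loop:

        #!/bin/bash
        # archive each vhost relative to its parent directory, skipping any 'catch' folders
        for f in /var/www/vhosts/*; do
            FNAME=$(basename "$f")
            tar cvf "/root/backup/ftp/$FNAME.tar" --exclude='catch' -C /var/www/vhosts "$FNAME"
        done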
