Search Results

Search found 14789 results on 592 pages for 'pro backup'.


  • File versioning software

    - by blade7
    Hi, I am looking for software that can version-control all the files on my OS (Windows Server), so that I can go back to a copy of a file from five versions ago. I know Genie can do this, but I already have BackupAssist for backups, which Genie also covers. I need an app that offers just the versioning described above.
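
    A minimal sketch, in Python, of the kind of per-file versioning being asked about (the paths and the retention count of five are illustrative assumptions, not anything from the question): before a file is overwritten, the current copy is stashed in a versions folder with a timestamp suffix, and only the newest five copies are kept.

        import shutil
        import time
        from pathlib import Path

        def keep_version(path, versions_dir="C:/FileVersions", keep=5):
            """Stash the current contents of 'path' as a timestamped copy, pruning old ones."""
            src = Path(path)
            vdir = Path(versions_dir) / src.name
            vdir.mkdir(parents=True, exist_ok=True)
            stamp = time.strftime("%Y%m%d-%H%M%S")
            shutil.copy2(src, vdir / f"{src.name}.{stamp}")
            # the newest 'keep' copies survive; anything older is deleted
            for old in sorted(vdir.iterdir(), reverse=True)[keep:]:
                old.unlink()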

    Read the article

  • One Way Sync with Dropbox?

    - by user244805
    Is there any way I can mirror a Dropbox folder to my C drive by just running a portable file? Extra background information, because I know you hate it when you don't get the entire situation: I go back to university in the fall and I need a new storage solution. I decided to use Dropbox to sync my tiny university files (< 5 MB). I need to access these files from 4 machines:
    1. Windows 7 home machine
    2. Windows 7 university A machine
    3. Windows 7 university B machine
    4. Android tablet
    Machines 1 and 4 are a non-issue. The problem lies with 2 and 3. I want to be able to edit my files on 2 and 3, but those machines are not mine. There is an easy fix: run a portable version of the Dropbox syncer from a USB drive. But the problem is that I don't want to carry a USB drive around with me all the time. In that case, I can just run the small portable Dropbox syncer off the internet. But where will it store the files? In a temporary directory on the C drive. That leaves only one issue: there are hundreds of machines fitting categories 2 and 3 that I will use at random. My portable Dropbox syncer will notice that the temporary directory is empty on each new PC I use and, instead of downloading my Dropbox folder to the machine, it will sync the other way around, i.e. delete my entire Dropbox. The solution is to mirror my Dropbox onto the temporary directory before running the Dropbox syncer.
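
    A rough sketch of that "mirror first, then sync" step in Python, assuming the official Dropbox SDK and an API access token (the token, the remote folder and the temp directory are placeholders, and this is not something the portable syncer itself provides): the script pre-populates the empty temp directory with the current Dropbox contents so the syncer has nothing to "delete".

        import dropbox  # pip install dropbox
        from pathlib import Path

        TOKEN = "YOUR-ACCESS-TOKEN"                 # placeholder
        REMOTE = "/University"                      # placeholder Dropbox folder
        LOCAL = Path("C:/Temp/DropboxMirror")       # placeholder temp directory

        dbx = dropbox.Dropbox(TOKEN)
        listing = dbx.files_list_folder(REMOTE, recursive=True)
        # a small folder fits in one page; a real script would follow
        # files_list_folder_continue() until has_more is False
        for entry in listing.entries:
            if isinstance(entry, dropbox.files.FileMetadata):
                target = LOCAL / entry.path_lower.lstrip("/")
                target.parent.mkdir(parents=True, exist_ok=True)
                dbx.files_download_to_file(str(target), entry.path_lower)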

    Read the article

  • S3sync not working

    - by user57833
    Hello, I managed to get s3sync to upload my test folder to Amazon S3 and can see it in the AWS Management Console. Downloading the data back to a test folder results in the following error message:

        root@mybucketname:/var/s3sync# ./week_download.sh
        s3Prefix backups/weekly
        localPrefix /var/s3sync/testdown/weekly
        s3TreeRecurse mybucketname backups/weekly
        Creating new connection
        Trying command list_bucket mybucketname prefix backups/weekly max-keys 200 delimiter / with 100 retries left
        Response code: 200
        prefix found: /
        s3TreeRecurse mybucketname backups/weekly /
        Trying command list_bucket mybucketname prefix backups/weekly/ max-keys 200 delimiter / with 100 retries left
        Response code: 200
        S3 item backups/weekly/
        s3 node object init. Name: Path:backups/weekly Size:0 Tag:d41d8cd98f00b204e9800998ecf8427e Date:Fri Oct 29 14:21:53 UTC 2010
        local node object init. Name: Path:/var/s3sync/testdown/weekly/ Size: Tag: Date:
        source:
        dest:
        Update node
        s3sync.rb:638:in `initialize': No such file or directory - /var/s3sync/testdown/weekly/.s3syncTemp (Errno::ENOENT)
                from s3sync.rb:638:in `open'
                from s3sync.rb:638:in `updateFrom'
                from s3sync.rb:393:in `main'
                from s3sync.rb:735

    I am using the following download script:

        #!/bin/bash
        # script to download local directory up to s3
        cd /var/s3sync/
        export AWS_ACCESS_KEY_ID=nothing to see here
        export AWS_SECRET_ACCESS_KEY=nothing to see here
        export SSL_CERT_DIR=/var/s3sync/certs
        ruby s3sync.rb -r -v -d --progress --make-dirs mybucket:backups/weekly /var/s3sync/testdown
        # copy and modify line above for each additional folder to be synced

    Any ideas? Does the download script need to download to the source folder on Amazon S3, i.e. the testup folder? I was hoping that, in the event of a complete failure where the original folders no longer exist, it would just download everything for me. Note: I changed my bucket names to "mybucketname" so that they are not public!
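
    The Errno::ENOENT in the traceback suggests s3sync is trying to create its .s3syncTemp file inside a local destination directory (/var/s3sync/testdown/weekly/) that does not exist yet. A minimal, hedged sketch of that guess in Python, reusing the paths and flags from the script above (creating the directory first is an assumption about the fix, not a confirmed one):

        import os
        import subprocess

        local_dest = "/var/s3sync/testdown/weekly"
        os.makedirs(local_dest, exist_ok=True)  # pre-create the local target directory
        # the AWS_* and SSL_CERT_DIR exports from week_download.sh still need to be set
        subprocess.run(
            ["ruby", "s3sync.rb", "-r", "-v", "-d", "--progress", "--make-dirs",
             "mybucket:backups/weekly", "/var/s3sync/testdown"],
            cwd="/var/s3sync", check=True,
        )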

    Read the article

  • Is there any chance that my data will get silently corrupted with a robocopy SMB network transfer?

    - by Archagon
    I'm setting up a NAS box for the first time. At the moment, I have most of my data backed up to a few local hard drives, and I intend to transfer all the data to my NAS over ethernet once the RAID array is set up. Since this is all happening over the network, I'm a bit worried about my data getting silently corrupted during the transfer. From what I understand, data generally doesn't get corrupted without notice on local transfers because a checksum is performed at some point by the drive or the OS. (This could be totally wrong.) Does the same thing happen with SMB, or is it up to the party doing the transfer to check the integrity of the data? And if it doesn't happen with SMB, is there a protocol that does ensure data integrity? I know that rsync can checksum a transfer, but I'm on Windows and I already have a robocopy configuration that I like. Will my data be safe, or do I have to use an external checksum tool to make sure?
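
    Robocopy decides what to copy from sizes and timestamps and, as far as I know, does not verify contents after the copy, so one option for an independent end-to-end check is to hash both trees after the transfer. A small Python sketch of that idea (the paths are placeholders; for very large files you would hash in chunks instead of reading whole files into memory):

        import hashlib
        from pathlib import Path

        def tree_hashes(root):
            """Map each file path (relative to root) to its SHA-256 digest."""
            root = Path(root)
            return {p.relative_to(root): hashlib.sha256(p.read_bytes()).hexdigest()
                    for p in root.rglob("*") if p.is_file()}

        src = tree_hashes("D:/LocalBackup")         # placeholder source
        dst = tree_hashes(r"\\nas\backup")          # placeholder destination share
        for rel, digest in src.items():
            if dst.get(rel) != digest:
                print("MISMATCH or MISSING:", rel)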

    Read the article

  • Cloning my Windows boot drive--Windows hangs on booting off new drive.

    - by idyllhands
    I copied my Windows XP partition to a new drive using the GParted live CD (using the GUI). I made sure to flag it as boot, then used my XP disc to enter the Recovery Console and ran fixboot and fixmbr on it. Now it boots up to the Windows splash screen but hangs at that point. Any suggestions on how to proceed? I'm just trying to come up with a quick way to clone my system and make the drive bootable, and GParted seemed like the easiest way, but I've now been working on it for over an hour.

    Read the article

  • Rsync fails for files that start with underscore when destination is zfs

    - by Eric
    Hi everyone. I'm using rsync 3.1.0pre1 on Mac OS X 10.8.5, and am trying to rsync one folder to another. The destination is a ZFS volume mounted via SMB. The problem I'm having is that files that start with an underscore (e.g., '_filename.jpg') are not being successfully synced to the destination. I get the following error message:

        rsync: mkstemp "/path/to/destination/._filename.jpg.NUgYJw" failed: Permission denied (13)

    In this case, '_filename.jpg' does not make it to the destination. I understand that rsync creates hidden, temporary files at the destination which are prefixed with '.' and have a random extension appended to the end. But the original filename starts with '_', not '.', and I haven't asked rsync to copy extended attributes / resource forks over (unless it always does). The rsync command I'm using is:

        rsync -avE --exclude='.DS_Store' --exclude '.Trash' --exclude 'Thumbs.db' --exclude '._*' --delete /source/ /destination/

    Has anyone found a way around this problem? Thank you!
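
    One thing that may be worth trying, sketched here with Python's subprocess and not verified against this particular SMB/ZFS share: tell rsync not to create its dot-prefixed temporary files on the destination at all, either with --inplace (shown) or with --temp-dir pointing somewhere local, so the "._"-style temp names never reach the server.

        import subprocess

        cmd = [
            "rsync", "-avE", "--delete",
            "--exclude=.DS_Store", "--exclude=.Trash",
            "--exclude=Thumbs.db", "--exclude=._*",
            # --inplace writes straight to the final file name, so rsync never
            # needs to mkstemp a '._'-prefixed temporary file on the share
            "--inplace",
            "/source/", "/destination/",
        ]
        subprocess.run(cmd, check=True)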

    Read the article

  • Splitting an archive on multiple media

    - by Robert Munteanu
    I'm generating archives which are larger than my current physical media (DVD). I'd like to split those archives: automatically, instead of generating mini-archives by hand; and consistently, so that each archive can be extracted independently of the others. For instance, for a 24 GB tree that archives down to 10 GB, I would get 3 archives, all of them under 4.7 GB and each of them extractable without the other 2. I'm using dirvish, so I'm archiving a filesystem tree. Update: I'm using Linux.
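
    A sketch of the "independent slices" idea in Python (greedy packing of files into bins below the DVD size; dirvish trees full of hard links are not handled here, so treat it as an illustration rather than a finished tool): files are grouped until a bin would pass 4.7 GB, and each bin becomes its own archive that extracts without the others.

        import tarfile
        from pathlib import Path

        LIMIT = 4_700_000_000  # rough DVD capacity in bytes

        def pack(root):
            """Greedily group files into bins whose total size stays under LIMIT."""
            bins, current, size = [], [], 0
            for f in sorted(Path(root).rglob("*")):
                if not f.is_file():
                    continue
                fsize = f.stat().st_size  # a file over LIMIT still gets its own oversized bin
                if current and size + fsize > LIMIT:
                    bins.append(current)
                    current, size = [], 0
                current.append(f)
                size += fsize
            if current:
                bins.append(current)
            return bins

        root = "/backup/tree"  # placeholder
        for i, group in enumerate(pack(root), 1):
            with tarfile.open(f"slice-{i:03d}.tar.gz", "w:gz") as tar:
                for f in group:
                    tar.add(f, arcname=str(f.relative_to(root)))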

    Read the article

  • Samsung SSD SMART failure warning on MacBook Pro

    - by user37303
    My 9-month-old 256 GB Samsung SSD is now reporting the following SMART failures:

        Rsvd_Blk_Cnt_Chip        FAILING NOW  184
        Available_Reservd_Space  FAILING NOW  20

    Can anyone explain the meaning of these two attributes that appear to be failing (Rsvd_Blk_Cnt_Chip and Available_Reservd_Space)? Also, aren't SSDs supposed to be much more immune to these types of failures? Everything seems to be working fine now, but I'm fearful of a looming failure.

    Read the article

  • Amazon S3: allow users to upload on a restricted basis (per bucket maybe)?

    - by Tom
    Hi there, I'm thinking about signing up to the Amazon S3 storage service. What I want to do is create a service where other people can register their own bucket with a certain amount of storage. These users will install my software, which then uploads their files. Of course, the users may only upload what they have paid for. For this to work I would like to create a separate bucket for each customer, each with its own properties. Question 1: is this possible with the API? How? This means that the installed software must have the rights needed to upload to my Amazon S3 account. Question 2: can I create individual authentication IDs for each bucket or customer, so that they can only upload with restrictions I have set? Thanks in advance.
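
    A hedged boto3 sketch of the bucket-per-customer idea (names and the exact policy are illustrative, and enforcing "only what they paid for" in terms of quota is a separate problem): create a bucket plus an IAM user whose access key can only touch that one bucket, and ship that key with the installed software.

        import json
        import boto3

        s3, iam = boto3.client("s3"), boto3.client("iam")
        customer = "customer-42"                       # placeholder
        bucket = f"myservice-{customer}"

        s3.create_bucket(Bucket=bucket)
        iam.create_user(UserName=customer)
        iam.put_user_policy(
            UserName=customer,
            PolicyName="own-bucket-only",
            PolicyDocument=json.dumps({
                "Version": "2012-10-17",
                "Statement": [{
                    "Effect": "Allow",
                    "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
                    "Resource": [f"arn:aws:s3:::{bucket}",
                                 f"arn:aws:s3:::{bucket}/*"],
                }],
            }),
        )
        keys = iam.create_access_key(UserName=customer)["AccessKey"]
        # keys["AccessKeyId"] / keys["SecretAccessKey"] go into that customer's installer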

    Read the article

  • Is there a Time Machine equivalent for Windows that can back up network files?

    - by Jim Thio
    This question is similar to "Does an equivalent of Time Machine exist for Windows?", with one difference: the files I want to back up are on a network drive. The computer hosting that network drive is running Windows XP. I want to run the backup from Windows 7. How would I do so? I'd like something similar to Mac OS X's Time Machine: a copy of the data every hour, day and week, with older copies thinned out automatically as time goes by. For example, the data for the last day is kept as hourly snapshots, for the last week as daily snapshots, and for the last month as weekly snapshots. How can I achieve this?
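
    A rough Python sketch of the snapshot-plus-thinning schedule described above (the robocopy paths are placeholders, and full copies are wasteful compared to what shadow copies or rsync-style hard links would give you): take a timestamped copy of the share, then prune so that only hourlies from the last day, one snapshot per day for the last week and one per week for the last month survive.

        import shutil
        import subprocess
        from datetime import datetime, timedelta
        from pathlib import Path

        SRC, DEST = r"\\xpbox\share", Path(r"D:\Snapshots")      # placeholders

        def snapshot():
            target = DEST / datetime.now().strftime("%Y%m%d-%H%M")
            # robocopy exit codes above 0 are not all errors, hence check=False
            subprocess.run(["robocopy", SRC, str(target), "/E"], check=False)

        def prune():
            now = datetime.now()
            seen_days, seen_weeks = set(), set()
            for snap in sorted(DEST.iterdir(), reverse=True):    # newest first
                ts = datetime.strptime(snap.name, "%Y%m%d-%H%M")
                age = now - ts
                if age <= timedelta(days=1):
                    continue                                     # keep every hourly
                day, week = ts.date(), ts.isocalendar()[:2]
                if age <= timedelta(days=7) and day not in seen_days:
                    seen_days.add(day)
                    continue                                     # newest one per day
                if age <= timedelta(days=30) and week not in seen_weeks:
                    seen_weeks.add(week)
                    continue                                     # newest one per week
                shutil.rmtree(snap)                              # everything else goes

        snapshot()
        prune()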

    Read the article

  • Wondering about the Windows 7 serial number my laptop has, and its uses.

    - by overmann
    So that's the serial number of my pre-installed Windows copy, I take it. But am I allowed to use it again when, say, my system gets crippled by a sneaky virus? If I format my computer and install Windows Starter again from a USB drive (speculating here; I've never formatted before, but I suppose it's completely possible), is that serial number still valid? I'm talking about the number printed on the back of my laptop.

    Read the article

  • Time on files differs by 1 sec, causing Robocopy sync to fail

    - by csmba
    I am trying to use Robocopy to sync (/IMG) a folder on my PC with a shared network drive. The problem is that the file time attributes (creation, modified and access) differ by 1 sec between the two locations, so every time I run robocopy, it syncs the files again. BTW, the problem is the same if I delete the target file and robocopy it fresh: the new file still has timestamps that are 1 sec off. Environment details:

        Source: Windows 7 64-bit
        Target: WD My Book World Edition NAS, 1 TB, which takes its time from the online NTP pool pool.ntp.org (I don't know whether its file system is FAT or not)
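
    For what it is worth, robocopy has a /FFT switch that compares timestamps with 2-second (FAT-style) granularity, which is the usual suggestion when a NAS rounds file times; a minimal sketch with placeholder paths, assuming a mirror-style sync like the one described above:

        import subprocess

        subprocess.run(
            ["robocopy", r"C:\Data", r"\\mybookworld\backup\Data",
             "/MIR",    # mirror the tree
             "/FFT"],   # assume FAT file times: 2-second timestamp granularity,
                        # so a 1-second drift no longer counts as "changed"
            check=False,  # robocopy exit codes 0-7 indicate success variants
        )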

    Read the article

  • Second user vimrc file usage in Vim running on Mac OS X 10.8.5 (Mountain Lion)

    - by Deesbek
    I am using MacVim:

        :version
        VIM - Vi IMproved 7.4 (2013 Aug 10, compiled Aug 10 2013 17:49:20)
        MacOS X (unix) version

    I executed :version in Vim (to check what patches I had installed) and noticed the following two lines as part of the output:

        user vimrc file: "$HOME/.vimrc"
        2nd user vimrc file: "~/.vim/vimrc"

    What is the 2nd user vimrc file for, and how would you use it? I've found and read this question: http://apple.stackexchange.com/q/34996/10733, but the answer shows how to integrate ~/.vim/vimrc into .vimrc. I also did the following Google searches, which did not yield anything interesting: "2nd user in vim and ~/.vim/vimrc" and "how to use ~/.vim/vimrc".

    Read the article

  • Using the Dropbox API instead of an FTP server.

    - by Somebody still uses you MS-DOS
    This is a small application scenario. Usually, when you have to back up source code or a database on your server, you use a second FTP server, plus a cronjob to tar.gz your DB dumps and source files and send that file from your application server to the FTP server. Dropbox has created an API for using its infrastructure. Since they provide 2 GB for free accounts, I thought about uploading to it instead of to an FTP server. So, if you do some freelance work, you can create a free account for each client and use this approach, maybe encrypting the files you send. You even gain a revision for each uploaded file, like a revision control system, for free, for the last 30 days. What do you think of this approach? Is it possible? And, more importantly: what are the security risks involved? (That's why I'm asking this on Server Fault, since the sysadmin point of view will be more accurate.) Thanks!
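
    A hedged sketch of the cronjob side in Python, assuming the official Dropbox SDK and a per-client access token (token and paths are placeholders, and files_upload is only suitable for fairly small files; larger ones need an upload session): tar up the sources and the dump, optionally encrypt the result, and push it to that client's Dropbox.

        import tarfile
        import time
        import dropbox  # pip install dropbox

        TOKEN = "CLIENT-ACCESS-TOKEN"                     # placeholder
        dump = f"backup-{time.strftime('%Y%m%d')}.tar.gz"

        with tarfile.open(dump, "w:gz") as tar:
            tar.add("/var/www/site")                      # source files (placeholder)
            tar.add("/var/backups/db.sql")                # database dump (placeholder)

        # encrypting 'dump' before uploading is left out of this sketch
        dbx = dropbox.Dropbox(TOKEN)
        with open(dump, "rb") as f:
            dbx.files_upload(f.read(), f"/{dump}",
                             mode=dropbox.files.WriteMode("overwrite"))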

    Read the article

  • Will 5 Terabyte NAS drive be compatible with Windows XP SP3 32 bit?

    - by TrevorBoydSmith
    (NOTE: the operating system, in this case Windows XP SP3 32-bit, is not a choice.) I am trying to set up a short-term storage device. First, I found a large 5-terabyte NAS drive that would, IMO, fulfill my storage requirements. Second, I also found that Windows XP seems to have a hard drive size limit (see 'Is there a limit to the size of a hard drive for Windows XP pre-SP1?'):

        XP should handle up to 2 TB per volume after the service packs are applied. You are correct. There was a 137 GB limit on the original pre-service-pack Windows XP. This was addressed/fixed in SP1.

    My question is, will my Windows XP SP3 32-bit machine see the 5-terabyte NAS and be able to read/write properly to the NAS drive?

    Read the article

  • Automate uploading to YouTube

    - by John
    Here's the problem: I would like to keep lots of home-made videos. Of course, they are subject to being lost, or somebody could steal the computer, or water or fire could destroy them. Secondly, I have to plug in my hard drive every time I want to watch something, which I find slow and cumbersome. I was thinking that perhaps I could upload the videos to YouTube with the privacy set to invite-only and then delete the videos from the hard drive automatically. Could this be done?
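
    A hedged sketch using the YouTube Data API's Python client (the OAuth credential setup is omitted, and YouTube's quotas and terms around using it as a private backup target are exactly what I would check first): upload a video with its privacy set to private, then remove the local copy once the upload finishes.

        import os
        from googleapiclient.discovery import build   # pip install google-api-python-client
        from googleapiclient.http import MediaFileUpload

        def upload_and_delete(youtube, path):
            request = youtube.videos().insert(
                part="snippet,status",
                body={
                    "snippet": {"title": os.path.basename(path)},
                    "status": {"privacyStatus": "private"},   # not publicly visible
                },
                media_body=MediaFileUpload(path, resumable=True),
            )
            response = None
            while response is None:                   # resumable upload loop
                _, response = request.next_chunk()
            os.remove(path)                           # free the local disk space
            return response["id"]

        # youtube = build("youtube", "v3", credentials=creds)  # creds from an OAuth flow
        # upload_and_delete(youtube, "/videos/holiday-clip.mp4")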

    Read the article

  • What's the best way to be able to reimage windows computers?

    - by mos
    I've got a low-end machine for testing our software. It needs to be tested under various versions of Windows, so I was planning on installing each one on its own partition. Then I realized that after testing our software, I'd want to roll back to the previous, clean state. I don't want to use any virtualization software because it tends to interfere with the workings of our app. That said, what's the best way to achieve my goal? Norton Ghost? Edit: I work for a pretty monstrously huge organization. Money is no object here (and sometimes, if the wrong people get wind of it, "open source" software is bad).

    Read the article

  • Games on Windows 8 in Boot Camp lag even on lowest graphics settings

    - by Jackson Gariety
    I've been playing Crysis 2 and Skyrim on my Retina MacBook Pro (10,1) for months now. The two games used to run super smoothly even on nearly maxed-out settings. This laptop has an Nvidia GeForce GT 650M graphics card inside; it runs great. But I recently replaced my Windows 8 Consumer Preview with the retail copy, and since then, 3D games lag in this odd way no matter what the graphics settings. Every second, Skyrim and Crysis alternate between running smoothly and lagging. It's a cyclical lag that comes and goes like clockwork. I can turn the graphics down to 800x600 with no antialiasing and low texture quality, and it runs much smoother on the "up" part of the cycle, but every second it drops back into a lag spike. I've tried installing beta graphics drivers, reinstalling the operating system, reinstalling the Boot Camp support software, and freeing up space (I have about 20 GB free). I can't figure out what suddenly caused this other than some obscure difference between the Consumer Preview and the retail version. What can I try? Is my video card failing? Are there some other drivers I can install? This isn't normal lag from maxing out the card, it

    Read the article

  • Is there such a thing as a file hosted container which deduplicates data held within?

    - by Mallow
    Background: I have backups of a website which stores all of its data in a single file. This file is several gigs large, and I have many different backups of it. Most of the data within is the same, plus whatever was added or changed since the previous backup. I want to keep all the backups I've made through the years in case I find a horrible surprise of data corruption somewhere along the line. However, storing a 10-gig file every month gets expensive.
    Seeking a solution: I've often thought about different ways of alleviating this problem. One thought that comes up very often combines the idea of a deduplicating file system with something that doesn't require its own partitioned volume on a hard drive. Something like what TrueCrypt does with what it calls "file-hosted containers", which the TrueCrypt program lets you mount and dismount as a regular hard drive.
    Question: is there a virtual hard drive mounter which uses a file-based container with a data-deduplication file system? (This question is a little awkward to put into words; if you have a better idea of how to ask it, please feel free to help out.)
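
    Whether or not a ready-made container of that kind exists, the underlying idea is easy to sketch in Python: split each backup file into chunks, store every chunk once under its hash in a shared pool, and keep a small per-backup manifest (the chunk size and layout here are arbitrary choices, and real tools use content-defined chunking so that insertions do not break the alignment):

        import hashlib
        import json
        from pathlib import Path

        STORE = Path("dedup-store")          # shared pool of chunks for all backups
        CHUNK = 4 * 1024 * 1024              # 4 MiB, arbitrary

        def backup(big_file, manifest_path):
            STORE.mkdir(exist_ok=True)
            digests = []
            with open(big_file, "rb") as f:
                while chunk := f.read(CHUNK):
                    digest = hashlib.sha256(chunk).hexdigest()
                    blob = STORE / digest
                    if not blob.exists():    # identical chunks are stored only once
                        blob.write_bytes(chunk)
                    digests.append(digest)
            # when little has changed, a new backup costs little more than its manifest
            Path(manifest_path).write_text(json.dumps(digests))

        backup("site-backup-2013-01.dat", "2013-01.manifest")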

    Read the article

  • Automate BESR 8.5 Deployment

    - by Mike
    I have been searching for a way, script, rain dance, to automate the deployment of several BESR 8.5 created images (v2i file extension). Does anyone have any experience on how to pull this off? I have tried Ghost Solution Suite 2.5, but it doesn't seem to work with images that are password protected. Any help, tool, 3rd party program, etc, would be greatly appreciated. Thanks

    Read the article

  • BackupPC backup has not finished in 12 hours(!)

    - by chronoz
    I installed BackupPC today on a server and set it to do a backup 12 hours ago. It has been backing up ever since, but it seems very, very slow and has not completed yet. It's just backing up a test server with a total disk usage of 1.8 GB. What could cause the backup process to be so slow? rsnapshot always worked wonderfully fast, but I want to improve my backup solution. df shows that the space used on the backup disk is actually still increasing.

    Read the article
