Search Results

Search found 6101 results on 245 pages for 'incremental backup'.


  • Using the Dropbox API instead of an FTP server.

    - by Somebody still uses you MS-DOS
    This is a small application scenario. Usually, when you have to do some backups of source code/database on your server, you use a second FTP server, a cron job to tar.gz your db dumps and source files, and send this file to your FTP server from your application server. Dropbox created an API to use its infrastructure. Since they provide 2 GB for free accounts, I thought about being able to upload to it instead of an FTP server. So, if you do some freelance work, you can create a free account for each client and use this approach, maybe encrypting the files you send. You even gain a revision for each sent file, like a revision control system, for free, for the last 30 days. What do you think of this approach? Is it possible? And, more importantly: what are the security risks involved? (That's why I'm asking this on Server Fault, since this POV from sysadmins will be more accurate.) Thanks!

    Read the article
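
    Below is a minimal sketch of the cron-driven workflow the question describes (dump, compress, upload), written against the current Dropbox HTTP API's /2/files/upload endpoint rather than the original API the question refers to; the database name, source path, destination path and DROPBOX_TOKEN variable are placeholders, not details from the question.

        #!/bin/sh
        # Hypothetical nightly backup: dump the database, bundle it with the
        # source tree, and push the archive to Dropbox instead of a second
        # FTP server. Assumes mysqldump can authenticate via ~/.my.cnf.
        STAMP=$(date +%Y%m%d)
        DUMP=/tmp/db-$STAMP.sql
        ARCHIVE=/tmp/backup-$STAMP.tar.gz

        mysqldump mydb > "$DUMP"
        tar czf "$ARCHIVE" /var/www/myapp "$DUMP"

        # A single /2/files/upload call handles archives up to roughly 150 MB;
        # larger files need the upload-session endpoints instead.
        curl -s -X POST https://content.dropboxapi.com/2/files/upload \
            -H "Authorization: Bearer $DROPBOX_TOKEN" \
            -H "Dropbox-API-Arg: {\"path\": \"/backups/backup-$STAMP.tar.gz\", \"mode\": \"add\"}" \
            -H "Content-Type: application/octet-stream" \
            --data-binary @"$ARCHIVE"

        rm -f "$DUMP" "$ARCHIVE"

    Encrypting the archive (for example with gpg) before uploading, as the question suggests, addresses the main risk of handing backups to a third-party service.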

  • Automate uploading to YouTube

    - by John
    Here's the problem: I would like to keep lots of home-made videos. Of course, they are subject to being lost, somebody could steal the computer, or water or fire could destroy them. Secondly, I have to plug in my hard drive every time I want to watch something, which I find slow and cumbersome. I was thinking that perhaps I could upload the videos to YouTube with the privacy set to invite-only and then delete them from the hard drive automatically. Could this be done?

    Read the article

  • Will a 5-terabyte NAS drive be compatible with Windows XP SP3 32-bit?

    - by TrevorBoydSmith
    (NOTE: The operating system (in this case Windows XP SP3 32-bit) we are using is not a choice.) I am trying to set up a short-term storage device. First, I found a large 5-terabyte NAS drive that would, IMO, fulfill my storage requirements. Second, I also found that Windows XP seems to have a hard drive size limit (see 'Is there a limit to the size of a hard drive for Windows XP pre-SP1?'): "XP should handle up to 2 TB per volume after the service packs are applied. You are correct. There was a 137 GB limit on the original pre-service-pack Windows XP. This was addressed/fixed in SP1." My question is: will my Windows XP SP3 32-bit machine see the 5-terabyte NAS and be able to read/write properly to the NAS drive?

    Read the article

  • What's the best way to be able to reimage Windows computers?

    - by mos
    I've got a low-end machine for testing our software. It needs to be tested under various versions of Windows, so I was planning on installing each one on its own partition. Then I realized that after testing our software, I'd want to roll back to the previous, clean state. I don't want to use any virtualization software because it tends to interfere with the workings of our app. That said, what's the best way to achieve my goal? Norton Ghost? Edit: I work for a pretty monstrously huge organization. Money is no object here (and sometimes, if the wrong people get wind of it, "open source" software is bad).

    Read the article

  • Automate BESR 8.5 Deployment

    - by Mike
    I have been searching for a way (script, rain dance, anything) to automate the deployment of several BESR 8.5-created images (.v2i file extension). Does anyone have any experience with how to pull this off? I have tried Ghost Solution Suite 2.5, but it doesn't seem to work with images that are password-protected. Any help, tool, third-party program, etc. would be greatly appreciated. Thanks!

    Read the article

  • Is there such a thing as a file-hosted container which deduplicates the data held within?

    - by Mallow
    Background: I have backups of a website which stores all of its data in a single file. This file is several gigs large, and I have many different backups of it. Most of the data within is the same from backup to backup, plus whatever was added or changed. I want to keep all the successive backups I've made through the years in case I find a horrible surprise of data corruption somewhere along the line. However, storing a 10 GB file every month gets expensive. Seeking a solution: I've often thought about different ways of alleviating this problem. One thought that comes up very often is the idea of a deduplicating file system which doesn't require its own partitioned volume on a hard drive. Something like what TrueCrypt does with what it calls "file-hosted containers", which the TrueCrypt program lets you mount and dismount as a regular hard drive. Question: Is there a virtual hard drive mounter that uses a file-hosted container backed by a deduplicating file system? (This question is a little awkward to put into words; if you have a better idea of how to ask it, please feel free to help out.)

    Read the article

  • BackupPC backup has not finished in 12 hours(!)

    - by chronoz
    I installed BackupPC today on a server and set it to do a backup 12 hours ago... while it's been backing up since, it seems very, very slow and has not completed yet. It's just backing up a test server with a total disk usage of 1.8 GB. What could cause the backup process to be so slow? rsnapshot always worked wonderfully fast, but I want to improve my backup solution. df shows that usage on the backup disk is actually still increasing.

    Read the article

  • Download and locally store all emails from all mailboxes on Office365?

    - by scape
    We have a business that uses Office 365 and we want to be able to save all the emails locally. I found a thread on the Office 365 community pertaining to this, and basically it was stated that there is no direct way of accomplishing it. I am curious whether anyone has considered this and whether there is a good method for storing these emails locally, even if it's some nifty PowerShell programming. All I've come up with is having a master mailbox which can view all mailboxes, and just having it sync and archive locally to the computer. I have not tried this yet, as the storage file sounds like it will be huge, so this does not seem like a fantastic idea and I'm open to any suggestions!

    Read the article

  • Safest snapshot of a failing hard drive?

    - by ironfroggy
    I have a headless machine that stopped booting, so I pulled it out for diagnostics and got a message that one of the hard drives was about to fail. I pulled them all out, and I need to get everything off before figuring out which one I need to get rid of. I wasn't sure which drive was failing, because the message only said "Harddrive 1" and I don't know which drive it referred to. I'm wondering what the best way is to get everything off. I'm worried that if I copy everything, I could get corrupt data and not realize some files are wrong until the drive is completely out of commission. What are my best options for getting everything off in a way I can safely move to new storage?

    Read the article

  • Automate BESR 8.5 Restore

    - by Mike
    I have been searching for a way (script, rain dance, anything) to automate the restore of several BESR 8.5-created images (.v2i file extension). Does anyone have any experience with how to pull this off? I have tried Ghost Solution Suite 2.5, but it doesn't seem to work with images that are password-protected. Any help, tool, third-party program, etc. would be greatly appreciated. Thanks!

    Read the article

  • Mirroring a drive in Debian

    - by James Willson
    I have a drive with data on it. I don't want to use RAID; instead, I want to do hourly backups to a second drive. I basically want to mirror the data drive and resync every hour. It is inefficient to re-copy all the data each time, so really I only want to move across what changes; i.e., if I add a new file to the data drive, only that file will be moved across. What tools are there for doing this on the command line? I used to use luckybackup on Ubuntu, but now I'm on command-line Debian.

    Read the article
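
    For context, a minimal sketch of the rsync-based setup this question usually leads to (luckybackup is essentially a front end for rsync); the mount points and log path below are examples, not taken from the question.

        # Hourly one-way mirror of /mnt/data onto /mnt/backup. Only files that
        # changed since the last run are transferred, and --delete removes
        # files from the copy that no longer exist on the source, so the
        # second drive stays an exact mirror.
        # crontab -e entry:
        0 * * * * rsync -a --delete /mnt/data/ /mnt/backup/ >> /var/log/mirror.log 2>&1

    The trailing slash on /mnt/data/ matters: it tells rsync to copy the directory's contents rather than the directory itself.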

  • Hard drive partition size wrong. How do I resize without loss of data?

    - by BreezyChick89
        $ fsck
        fsck from util-linux 2.20.1
        e2fsck 1.42 (29-Nov-2011)
        The filesystem size (according to the superblock) is 610471680 blocks
        The physical size of the device is 536870911 blocks
        Either the superblock or the partition table is likely to be corrupt!

    It should be one partition, but it now shows 2.2 TB partitioned and 0.3 TB unpartitioned. How do I make the first partition correctly be 2.5 TB without destroying whatever is in either partition? I did not use RAID or anything. My devices have been getting repeatedly corrupted by thunderstorms. It looks like people elsewhere recommend doing something like:
        sudo resize2fs /dev/sdc1 610471680

    Read the article

  • What can cause the system to freeze in a way where even the reset button takes a long time to react?

    - by ThiefMaster
    What can be the reason for system freezes that are so "hard" that even the hardware reset button takes about 3 seconds until it actually resets the system (and then it actually powers down and up again instead of doing a "clean" hard reset like when pressing it on a normally running system)? Since it initially happened mainly while playing videos from YouTube, I suspected the graphics card - however, I replaced it recently and that did not change anything. It still happens from time to time (and sometimes more often, like a few times in the last few hours). The system is running Windows 7 - but I don't think this matters, since I don't think any software, not even the OS, can actually affect the reset button's behaviour. The PC is not overheated and the freezes happen randomly. There is also no malware on the system. The CPU is an Intel Core i7-920 on a Gigabyte EX58-UD5 mainboard. What could be the cause of this problem? Faulty RAM? I have not run a full memtest86 check yet, but I wonder if there is a more likely issue than faulty RAM - checking 12 GB of RAM does take some time, after all! There are no entries in the event log - but that's what I expected, since the system freezes so hard that I doubt it has time to write anything to any log.

    Read the article

  • How do you limit the bandwidth for a file copy?

    - by wizard
    I've got an old Windows 2000 box in a remote location with a T1 connection and a VPN to my location. I normally use SMB mounts to transfer files, but now it's time to decommission the server and copy its backups to my location. I have about 40 gigabytes (compressed) to copy. I'm prepared for it to take a long time, but I have a few caveats: I need to limit the bandwidth so terminal services connections to the site are not affected; I want to be able to resume a partial transfer; and there are a few small files and several large files (10-20 gigabytes). I'm familiar with rsync on *nix platforms but have had bad luck with it on Windows, and I don't know that it will really keep partially transferred files. What do you use?

    Read the article
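
    Two hedged sketches of the sort of throttled, resumable transfer the question describes: one with rsync (assuming a Windows rsync build such as cwRsync on the remote box) and one with robocopy; host names and paths are placeholders.

        # rsync: --bwlimit is in KB/s (a T1 is roughly 190 KB/s, so 100 leaves
        # headroom for terminal services), and --partial keeps partially
        # transferred files so an interrupted copy of a large file can resume.
        rsync -av --partial --progress --bwlimit=100 /cygdrive/d/backups/ user@myhost:/srv/backups/

        # robocopy: /Z copies in restartable mode and /IPG inserts an
        # inter-packet gap (in milliseconds) to throttle throughput; the gap
        # value is tuned by trial and error.
        robocopy D:\backups \\myserver\backups /E /Z /IPG:200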

  • Connecting a WD World Book to my new laptop using a downloaded Windows 7 driver

    - by Jeanie
    HELP!!! I have just been reading through and seeing that you can connect using a USB cable straight to your laptop. This would be great if I actually had the cable! My problem is I have a lovely new laptop with Windows 7 - I had to buy this as my old laptop isn't working anymore, and I have just ordered an external SATA enclosure to see whether I can get any information off the old drive and put it onto my new laptop, so at least I don't lose any work. This in itself will probably present problems, as my old OS was Windows XP! I will deal with that when I get to it!! My problem at present is that I have a World Book and I wish to connect and configure it with my new laptop, but again I need to download drivers for Windows 7, which I believe I have done; now I can't seem to work out just what to do next to help these two devices recognize each other and configure. If anyone has any answers or could talk me through it, I would be really grateful. Thanks - Jeanie

    Read the article

  • I want to "image" 40+ laptops quickly... I welcome suggestions on reliable software

    - by Joldfield101
    I often have batches of laptops/PCs to re-image and have tried various methods, but each of them has been problematic and often takes more time to troubleshoot than it would have taken to image the machines individually! For example, I have tried to use Ghost - I installed GhostCast Server on my laptop, but the clients never seem to boot to the LAN successfully, or it takes an hour to get everything sorted (drivers, LAN, DHCP, etc.). I want a reliable tool that makes imaging quick and easy - and I don't mind paying for it if it's going to work (but obviously free = always good!)

    Read the article

  • How to determine if my router is causing a bottleneck in uploads?

    - by Jimi
    I have a home network with a cheap-o little router with a development server and a few devices hooked up to it. I am finding that backups of my server are taking FOREVER (a week for 60 GB), and running backups renders my Internet connection useless from any other box in the house. I have maxed out the pipe to my house from the ISP (10 down, 3 up), but is there a way for me to test and see if my router is bottlenecking anything? I feel like 60 GB backups shouldn't take this long, so any help would be great!

    Read the article

  • Can You Specify Where LVM Snapshots Are (Initially) Stored?

    - by bottles
    Disclaimer: this is my first time using LVM. Upon RTFM, it appears that LVM snapshots are automatically stored in the same directory as the original logical volume. In my case, that would mean the /dev directory. This isn't very nice, because there's not enough disk space in there for me to store a large snapshot. So when I run a command like lvcreate --size 1G --snapshot --name snapshot /dev/lvmData/usr, do I need an additional 1 GB of space free in /dev? Is there any way to specify a different directory in which to store my snapshot?

    Read the article
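
    For what it's worth, a short sketch of how the space is actually allocated: the entry under /dev is only a device node, while the snapshot's copy-on-write space comes from free extents in the volume group that holds the origin volume, so it is the volume group (not any directory) that needs the headroom. The snapshot name and the extra disk below are hypothetical.

        # Check how much unallocated space the volume group has (VFree column).
        vgs lvmData

        # Create the snapshot; the 1G is how much changed data it can absorb
        # before it fills up and is invalidated.
        lvcreate --size 1G --snapshot --name usr-snap /dev/lvmData/usr

        # If VFree is too small, grow the volume group with another physical
        # volume rather than looking for a different directory, e.g.:
        # vgextend lvmData /dev/sdb1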

  • If I have two external hard drives connected to my computer by USB (2.0, I think), will they load with consistent letters?

    - by Bec
    (I'm using Windows 7, and the hard drives are Western Digital with whatever formatting they came with from the factory.) I'm thinking of setting up two different backups: one through Windows and one with the software that came with the drive (because Windows gives me a system image but isn't very user-friendly for my files). But will my computer get confused and load the drives as different letters each time?

    Read the article

  • Linux RHEL: Making a disk image efficiently

    - by TheProfoundGeek
    I have a Linux box running RHEL. Its disk (hda1) has about 25 GB of free space. I have another disk (hda2) which is 250 GB and holds another RHEL instance; it's partitioned for 200 GB. Data on the disk occupies about 21 GB. An image of hda2 needs to be taken and restored onto another disk of the same specs. What is the best way to make an image file of hda2? Ideally the image size should be around 25 GB, as the actual data on the disk is just 21 GB. I am aware of the following two methods.
    Method 1: raw image.
        dd if=/dev/hda2 of=/path/to/image
        dd if=/path/to/image of=/dev/hda3
    Question 1: Will the above method make a gigantic image of 250 GB? Is it efficient?
    Method 2: compressed image.
        dd if=/dev/hda2 | gzip > /path/to/image.gz
        gzip -dc /path/to/image.gz | dd of=/dev/hda2
    Question 2: I tried method 2; it's taking too long. What are the pitfalls of these methods?
    Which of the above methods is efficient, and why? Is there any other Linux utility which can do the job? Third-party tools are a no-no.

    Read the article
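
    A hedged sketch of a middle ground between the two methods (not from the question itself): a raw dd image is always the full size of the source device regardless of how much data it holds, so Method 1 will produce something far larger than 25 GB; compression brings the image closer to the used-data size, and a larger block size plus a lighter compression level remove most of Method 2's slowness. The device names follow the question; the mount point and image path are placeholders.

        # Optional: fill the free space with zeros first so unused blocks
        # compress to almost nothing (dd stops with "No space left on device",
        # which is expected here).
        mount /dev/hda2 /mnt/src
        dd if=/dev/zero of=/mnt/src/zerofill bs=4M
        rm -f /mnt/src/zerofill
        umount /mnt/src

        # Image with a 4 MB block size and fast compression; dd's default
        # 512-byte blocks are a large part of why Method 2 feels slow.
        dd if=/dev/hda2 bs=4M | gzip -1 > /path/to/image.gz

        # Restore onto the target disk:
        gzip -dc /path/to/image.gz | dd of=/dev/hda3 bs=4M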
