Search Results

Search found 5747 results on 230 pages for 'backup'.

  • java database backup and restore

    - by jawath
    How do I back up and restore any kind of database inside my Java application to flat files? Are there any tools or frameworks available to back up a database to a flat file such as CSV, XML, or a secure encrypted file, or to restore a database from CSV or XML files? It should also be capable of table-wise backup and restore.

    Read the article

  • Backup multiple Exchange Accounts without direct access to exchange server

    - by Mike Wallace
    For e-mail, we use Microsoft Exchange, hosted by 1and1.com. We have about 30 Exchange accounts that I would like to back up to PST files. That is, for each account we have (all 30), I would like to create a single PST file (1.pst through 30.pst). I do not have direct access to the Exchange server. Basically, for each Exchange account I can supply the IP address of the Exchange server (or the URL of OWA), the username, and the password. Is there a tool out there that can do this for me? Microsoft's "Online Services Migration Tools" comes awfully close, but it appears to be geared towards pulling data out of any Exchange server and pushing it into Microsoft Online. I don't believe it can be used to simply pull the data out and generate PSTs.

    Read the article

  • Backup and Restore ADAM database

    - by kuoson
    Hi, I was trying to back up and restore an ADAM database to a different server the other day. I copied all files under the "Program Files/Microsoft ADAM" folder to the same path on the destination server and started the ADAM service there. Although the service came back up successfully and I was able to connect to the instance with the ADAM ADSI Edit MMC snap-in, I found I had to reset every single user's password before they could log in again. Has anyone run into this issue before? Is the password encrypted with the server IP address or something like that?

    Read the article

  • Unlimited and multi-computer online storage solutions with automatic backup

    - by JRL
    As the title says, what are the existing online storage solutions that provide unlimited storage, automatic backup, and an unlimited number of computers (use not tied to a single computer)? There are several existing questions on this site related to online storage solutions, but none specifically targeted at what I want, so I thought I'd ask. This Wikipedia article lists some of them; are there others? How do they compare in terms of price, feature set, and ease of use? Update: Kinda disappointed no one has any answers to this so far. JungleDisk looks promising; anyone have experience with it? Update 2: To answer the comments, what I'm looking for definitely DOES exist. These solutions all seem to fit the bill: BackMii, CrashPlan, DataPreserve, Humyo, JungleDisk, KeepVault, SpiderOak. And some of them are quite cheap (CrashPlan is $100 a year). For unlimited space and computers, I'd say that's pretty good. Does anyone have experience with CrashPlan or any of the other solutions above?

    Read the article

  • Backup strategy for Windows Server 2008 R2 and Hyper-V

    - by winserveradmin
    I am in the process of planning a Hyper-V deployment with SCVMM 2008 R2. I have several VMs on VMware Player (a temporary solution) for things like SharePoint 2007 and 2010 and a couple of other server apps. I want to develop a resilient backup plan for this. On the software side, Data Protection Manager 2010 will support all of the server apps I run (SQL Server 2008 R2, SharePoint, Exchange, etc.). But on the hardware side (storage), what is the best way to go? Drobo seems to have issues with Hyper-V, judging by a few threads on here, and doesn't support DPM 2010 (which is in beta anyway) or even the 2007 version (see http://www.drobo.com/support/best_practices.php). What storage device would work well? Do I need a home server or just an external USB drive? Capacity-wise, 2 TB will probably be best so I can have a small archive and implement a round-robin system. Thanks

    Read the article

  • How to backup 20+TB of data?

    - by Jesus Fidalgo
    We have a NAS server at the company I work for that is being used for storing photography sessions. Each session is approximately 100 GB. Over the last couple of years this server has accumulated 10+ TB of data, and the number of photo shoots is increasing exponentially. I estimate that by the end of next year we will have 20+ TB stored on this NAS. We are currently backing this server up to tape using LTO-5 tapes with Symantec Backup Exec. Since the server has grown, full backups no longer complete overnight. Does anyone have any suggestions on how to back up this amount of data? Should we keep backing it up to tape? Are there other options which may be better?
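
    One disk-to-disk alternative worth weighing against tape is rotating hard-link snapshots with rsync: only changed files consume new space, yet every dated directory remains a complete restore point. A minimal sketch, assuming the NAS data is reachable at /nas/photos and a large backup array is mounted at /backup (both paths are placeholders):

      # Hedged sketch: hard-link snapshot rotation. Unchanged files are
      # hard-linked against the previous snapshot, so each daily run only
      # transfers and stores what actually changed.
      TODAY=$(date +%F)
      rsync -a --delete --link-dest=/backup/latest /nas/photos/ /backup/$TODAY/
      ln -sfn /backup/$TODAY /backup/latest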

    Read the article

  • Best way to backup Xbox 360 USB Drive

    - by TekiusFanatikus
    What is the best way to back up and restore the USB drive that I use for my Xbox? I want to make sure that if the USB drive dies, I can retrieve my saved games and such onto another USB drive. I was able to view the contents of the drive; however, I wasn't sure whether I could simply copy the contents onto a TrueCrypt volume and restore them from there at a later date. The file system is not FAT or NTFS, and I wasn't sure about the implications of copying between two different file systems... I currently have a DataTraveler G3 16GB. After a bit of googling, I was able to find this article, which mentions an app called USBXtafGUI.
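
    Because the stick uses the Xbox's own XTAF layout rather than FAT or NTFS, a raw sector-level image preserves everything without the host needing to understand the file system, and the resulting image file can then be stored on a TrueCrypt volume like any other file. A minimal sketch, assuming the stick is attached to a Linux machine and appears as /dev/sdb (device name and paths are placeholders):

      # Hedged sketch: raw image of the whole stick; the XTAF file system
      # inside is copied verbatim. Double-check the device name first.
      dd if=/dev/sdb of=xbox-usb.img bs=4M
      # Restore later onto a stick of the same size (or larger):
      dd if=xbox-usb.img of=/dev/sdb bs=4M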

    Read the article

  • Backup to disk, encrypted, without any installed local software

    - by user30064
    Hi, OK, this is a tough one, and it might not even be possible, but no harm in asking I guess. I have a Buffalo Terastation file server that I use for network attached storage. After a couple of phone calls to customer services I realised that there is no way to back up to disk encrypted. In effect, I would be carrying unencrypted company data off-site daily, which is obviously unacceptable. I had a go at TrueCrypt, EncFS, and a few others, and as far as I could see all of them require you to install some software on the machine that is to use the file system, which makes sense. Unfortunately the firmware on the Terastation is closed and I cannot install any software (and I can't build from source either, since Buffalo didn't include a compiler). Is there any way to copy files to disk so that, as soon as they are written, they are transparently encrypted, without having to install additional software? I'm not sure it matters too much, but the Terastation firmware is Linux-based, although, as I mentioned, closed. Many thanks, Andreas
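
    If nothing can be installed on the Terastation itself, the encryption has to happen on whichever machine reads the data before it lands on the off-site disk. A minimal sketch, assuming the Terastation exports a CIFS share and a Linux box does the copying (share name, mount points and output path are placeholders):

      # Hedged sketch: mount the Terastation share read-only, then write a
      # symmetrically encrypted archive straight onto the off-site disk
      # (gpg will prompt for a passphrase).
      mount -t cifs //terastation/share /mnt/tera -o ro,username=backup
      tar -cz -C /mnt/tera . | gpg --symmetric --cipher-algo AES256 \
          -o /mnt/offsite/tera-$(date +%F).tar.gz.gpg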

    Read the article

  • Creating hard drive backup images efficiently

    - by Arrieta
    We are in the process of pruning our directories to reclaim some disk space. The 'algorithm' for the pruning/backup process consists of a list of directories and, for each one of them, a set of rules, e.g. 'compress *.bin', 'move *.blah', 'delete *.crap', 'leave *.important'; these rules change from directory to directory but are well known. The compressed and moved files are stored in a temporary file system, burned onto a Blu-ray disc, verified on the disc, and finally deleted from their original locations. I am doing this in Python (basically os.walk with a dictionary holding the rules for each extension in each folder). Do you recommend a better methodology for pruning file systems? How do you do it? We run on Linux.
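
    The same per-directory rules can also be expressed directly with find, which keeps each action explicit and easy to re-run or audit before the Blu-ray step. A minimal sketch for one directory, using the example rules from the question (the directory and staging paths are placeholders):

      # Hedged sketch: one pruning pass over a single directory.
      DIR=/data/project1
      STAGING=/staging/project1
      mkdir -p "$STAGING"

      find "$DIR" -maxdepth 1 -type f -name '*.bin'  -exec gzip {} \;           # compress *.bin
      find "$DIR" -maxdepth 1 -type f -name '*.blah' -exec mv {} "$STAGING"/ \; # move *.blah
      find "$DIR" -maxdepth 1 -type f -name '*.crap' -delete                    # delete *.crap
      # *.important files are left untouched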

    Read the article

  • Backup VSS Issues

    - by MJ
    I've got this server, and we've been having VSS issues for quite a while. We've tried every form of re-registering the VSS components we can think of. NTBackup won't run a system state backup, and vssadmin list writers doesn't list anything. We're getting VSS error 12302 in the event log. Has anyone run into a situation like this before? We're thinking the next step, if possible, is a complete reinstall of VSS. Any other ideas?

    Read the article

  • Cloud based backup solutions based on open standards?

    - by Rick
    I am looking for a solution to back up and consolidate important media from a couple of Windows laptops and a Mac laptop. I would like a solution that is based on open standards, so my data isn't trapped by proprietary formats and protocols. I would like the ability to switch clients or change providers in the future. For example, something like Jungle Disk plus S3 sounds like a great option. However, I am having trouble confirming how, or whether, this can be set up while meeting these criteria. Are there any real or de facto standards for treating S3 as a filesystem? If so, which Windows and Mac clients support them?
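
    One way to stay client-neutral is to keep the backups as plain S3 objects pushed up by a generic sync tool; anything that speaks the S3 API (including s3fs-style filesystem mounts) can then read them later. A minimal sketch with s3cmd; the bucket name and local path are placeholders:

      # Hedged sketch: mirror a local folder to S3 as ordinary objects,
      # removing remote copies of files that were deleted locally.
      s3cmd sync --delete-removed ~/Pictures/ s3://my-backup-bucket/pictures/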

    Read the article

  • Using rsync to take backup of folder

    - by Ali
    Hi, I have a Linux server with a NAS that is mounted as the folder "mount". I have a website in the "public_html" folder. I want to back up the website into the mount folder automatically at certain intervals, e.g. every hour. I read that there is something called "rsync" which is used to keep two folders in sync, and that it doesn't copy all files every time; instead it checks whether a file has changed and only updates changed files. How do I use it to make automatic backups? I have root access to the server. Thanks
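
    A minimal sketch of an hourly rsync job, assuming the site lives in /home/youruser/public_html and the NAS is mounted at /mount (both paths are placeholders to adjust):

      #!/bin/bash
      # Hedged sketch: copy only new/changed files into the NAS mount.
      # --delete also removes files from the backup that were deleted
      # from the site, keeping the two trees identical.
      rsync -a --delete /home/youruser/public_html/ /mount/public_html-backup/

    Saved as, say, /usr/local/bin/site-backup.sh and made executable, it can be scheduled from root's crontab with a line such as 0 * * * * /usr/local/bin/site-backup.sh to run at the top of every hour.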

    Read the article

  • How to schedule daily backup in MSSQL Server 2008 Web Edition

    - by Xenon
    In SQL Server Management Studio I created a maintenance plan, but it won't work. The error is: "Message Executed as user: LITESPELL-19C34\Administrator. Microsoft (R) SQL Server Execute Package Utility Version 10.0.1600.22 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. The SQL Server Execute Package Utility requires Integration Services to be installed by one of these editions of SQL Server 2008: Standard, Enterprise, Developer, or Evaluation. To install Integration Services, run SQL Server Setup and select Integration Services. The package execution failed. The step failed." But on the Microsoft page http://www.microsoft.com/sqlserver/2008/en/us/web.aspx, in the "Automate tasks and policies" section, it says that backups can be scheduled in this edition, so how?

    Read the article

  • HP-UX -> Linux incremental remote backup

    - by stack_zen
    Hi. I need to set up a differential backup process from a range of remote HP-UX boxes to a central RHEL5 server. I'd happily go with rsync; the problem is that stock HP-UX 11.11 has no built-in rsync and I don't have permission to install any software on the remote stock HP-UX machines. How should I approach this? HP-UX provides fbackup (HP-UX exclusive) and cpio (also available in RHEL5; it allows backing up only the files which changed, but always grabs each changed file in its entirety), e.g.:

      ssh remote_user@remote_host 'find /u01/engine/logs/ -type f -name "*.log" | cpio -o | gzip -' | gunzip - | cpio -idmv

    Those solutions don't really answer my incremental (bandwidth-efficiency) problem, do they?
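
    With only stock tools on the HP-UX side, one workable pattern is to keep a timestamp file on each HP-UX box and let find -newer select only files modified since the last run, shipping just those through cpio over ssh. This still transfers whole changed files (no rsync-style deltas), but it avoids re-sending everything. A minimal sketch run from the RHEL5 side; host names and paths are placeholders, and the stamp file must be created once with touch before the first run:

      # Hedged sketch: pull only files changed since the last transfer,
      # using nothing but find/cpio/gzip on the HP-UX end.
      ssh remote_user@hpux01 \
        'find /u01/engine/logs -type f -name "*.log" -newer /var/tmp/.last_backup | cpio -o | gzip -c' \
        | gunzip -c | ( cd /backup/hpux01 && cpio -idmv )
      ssh remote_user@hpux01 'touch /var/tmp/.last_backup'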

    Read the article

  • How to securely store and update backup on remote server via ssh/rsync

    - by Sergey P. aka azure
    I have about 200 GB of pictures (say about 1 MB per file, 200k files) on my desktop. I have access (including root access) to a remote Linux server, and I want to keep an updatable backup of my pictures on that server. rsync seems to be the right tool for this kind of job. But other people also have access (including root access) to this server, and I want to keep my pictures private. So the question is: what is the best way to keep private files on a remote "shared" Linux server securely?
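
    Since other people have root on the remote machine, anything stored there in the clear can be read by them; the data has to be encrypted on the desktop before it leaves. duplicity does that on top of incremental uploads over SSH. A minimal sketch; the host, user and paths are placeholders:

      # Hedged sketch: duplicity encrypts with GPG locally and only
      # uploads changed data on subsequent runs.
      export PASSPHRASE='a long secret passphrase'
      duplicity /home/me/pictures sftp://me@backup-host//srv/backups/pictures
      unset PASSPHRASE

    An alternative with the same property is exposing an encfs --reverse view of the picture folder and rsyncing that, so only ciphertext ever reaches the server.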

    Read the article

  • Backup script that excludes large files using Duplicity and Amazon S3

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. The script prints the proper command, but when the command is run from within the script it produces an error; if the same command is run manually, everything works...??? Here is the script, based on one easily found with Google:

      #!/bin/bash
      # Export some ENV variables so you don't have to type anything
      export AWS_ACCESS_KEY_ID="accesskey"
      export AWS_SECRET_ACCESS_KEY="secretaccesskey"
      export PASSPHRASE="password"

      SOURCE=/home/
      DEST=s3+http://s3bucket
      GPG_KEY="7743E14E"

      # exclude files over 100MB
      exclude () {
        find /home/jason -size +100M \
        | while read FILE; do
            echo -n " --exclude "
            echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g'   #Replace whitespace with "\ "
          done
      }

      echo "Using Command"
      echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

      duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

      # Reset the ENV variables.
      export AWS_ACCESS_KEY_ID=
      export AWS_SECRET_ACCESS_KEY=
      export PASSPHRASE=

    When run, I receive the error:

      Command line error: Expected 2 args, got 6
      Enter 'duplicity --help' for help screen.

    Any help you could offer would be greatly appreciated.
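
    The likely culprit is that the single quotes printed by the exclude function are never re-parsed by the shell: the backtick substitution only undergoes word splitting, so patterns containing spaces become several extra arguments and duplicity sees more than its expected source and target. A minimal sketch of one way around it, building the exclude list as a bash array so each pattern stays a single argument (it assumes GPG_KEY, SOURCE and DEST are set as in the script above):

      # Hedged sketch: collect "--exclude <pattern>" pairs in an array;
      # "${EXCLUDES[@]}" expands each element as exactly one argument,
      # spaces and all, so no echo/sed quoting tricks are needed.
      EXCLUDES=()
      while IFS= read -r FILE; do
          EXCLUDES+=( --exclude "**${FILE##*/}" )   # ${FILE##*/} keeps just the file name
      done < <(find /home/jason -size +100M)

      duplicity --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "${EXCLUDES[@]}" "$SOURCE" "$DEST"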

    Read the article

  • Switching to Linux, Need backup help

    - by Stephen
    I have an existing Vista installation on my ThinkPad X200 and I want to install Linux on the machine. I've done this several times before, but I usually format the whole disk and dual-boot Windows and Linux, which means I have to reinstall and reconfigure everything I had on Windows. What I want to do this time is back up my Windows installation (into an image), start a clean Linux installation, and run the Windows image through VMware or VirtualBox. What's the easiest way to do this? Are there any free tools available? I've seen Acronis, but I don't want to buy it for a one-time job.
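
    VMware's free Converter tool is the usual no-scripting route for this kind of physical-to-virtual move; a more manual alternative is to image the disk from a Linux live CD and convert the raw image for VirtualBox. A minimal sketch of the manual path; the device name and destination paths are assumptions:

      # Hedged sketch: raw-image the Vista disk, then turn the image into
      # a VirtualBox disk. Run from a live CD with the laptop disk unmounted.
      dd if=/dev/sda of=/mnt/external/vista.img bs=4M conv=noerror,sync
      VBoxManage convertfromraw /mnt/external/vista.img vista.vdi --format VDI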

    Read the article

  • Norton Backup "Failed to Restore"

    - by Teknophilia
    I recently had one of my computers (XP) die on me. I had its files set to automatically back up to another PC's hard drive using Norton. I've tried using Norton's restore on the second computer to recover some files (Word documents, pictures), but when I try to do this I get a dialog box saying that it "Failed to Restore". When I click to continue, it shows a list of the files I tried to restore, along with a status indicator for each file (which says "invalid file"). Any ideas?

    Read the article

  • Backup and restore MySQL database without system access

    - by Sencerd
    Hi guys, I am trying to move a database from one provider to another. The problem is that I don't have system access at either end (i.e. no SSH), so I cannot run mysqldump on the servers. I have already tried using MySQL Administrator; the backup took about 45 minutes, but when it came to restoring it was moving at a snail's pace, estimating 12+ hours. This is a live app, so I need to keep the downtime to an absolute minimum. The database consists of 35 tables, a mixture of MyISAM and InnoDB, and the whole thing comes to about 4.4 GB. The source and destination databases are both running on very powerful servers. Any suggestions on a quick way of doing this will be gratefully received. Thanks
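
    mysqldump doesn't actually need shell access on either server: it can run from any machine that is allowed to connect to MySQL over the network, which most providers permit (sometimes only after whitelisting your IP). A minimal sketch, with host names, users and the database name as placeholders; piping the dump straight into the new host keeps the downtime window to a single pass:

      # Hedged sketch: dump from the old provider and load into the new one
      # in one stream, run from your own workstation. Create the empty
      # database on the new provider first. Note that --single-transaction
      # only guarantees a consistent snapshot for the InnoDB tables.
      mysqldump -h old-provider.example.com -u olduser -p \
          --single-transaction --routines mydb \
        | mysql -h new-provider.example.com -u newuser -pNEWPASS mydb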

    Read the article

  • Network backup for Macs and PCs - formatting question

    - by neilfein
    I'm trying to use a LaCie 2TB drive as an AirPort drive for backup on a home network. We have one Mac and two PC laptops. My plan is to create a Mac partition and a Windows partition. However, Disk Utility won't let me set the Windows partition to a Windows format; there's no option for it in the menu on the Partition tab. Am I doing something wrong? Alternatively, is there a way to partition the drive with one partition that all three machines can see? We have a Mac G5 with 10.4 and two laptops with Windows 7. Thanks!

    Read the article

  • Win 7 Explorer backup and long paths

    - by user53299
    I use Explorer to do backups because Win 7's backup program asks me to take previously made backups and put them back in the drive. I am opposed to that idea, since I believe backups should remain in storage. With Explorer backups (burn and burn to disc) I have encountered the "destination path too long" error message, and it shows the name of a folder, "Debug", three times. I have hundreds of folders named "Debug", thanks to Visual Studio. At this moment I'm too angry at Microsoft to write a program to determine my 3 longest paths. (Aside: this is all after coincidentally reading two articles about path junctions earlier this evening, which already made me kind of unhappy.) Please, is there an easy way to continue making backups with Explorer? Edit: I should add that renaming paths wrecks Visual Studio projects, so I really need to isolate the small number of problem paths or find a cleaner solution.
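
    Isolating the handful of offending paths doesn't need a full program; a one-liner will list the longest ones. A minimal sketch, assuming a Unix-style shell such as Git Bash or Cygwin is available on the Windows box (the starting directory is a placeholder):

      # Hedged sketch: print the 3 longest relative paths under the backup source.
      cd /c/Users/me/projects
      find . -type f | awk '{ print length, $0 }' | sort -rn | head -3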

    Read the article

  • Backup broken PostgreSQL 8.4 without pg_dump

    - by Daniil
    So, I have a problem. PostgreSQL 8.4 won't start or restart, and gives no output. It worked for 3 months, until the hosting provider rebooted the server. Now it is completely broken: it won't start and doesn't produce any output or log. pg_dump just reports: pg_dump: [archiver (db)] connection to database "postgres" failed: No such file or directory Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"? Now I want to back up my database (or just get the PostgreSQL socket up) so I can reinstall PostgreSQL. How?
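
    With the postmaster down, pg_dump can't help, but a file-level copy of the data directory preserves the entire cluster and can be brought back up later under matching 8.4 binaries (a file-level backup is only restorable onto the same major version). A minimal sketch; the data directory path is an assumption based on a Debian/Ubuntu layout, so check your distro or postgresql.conf for the real location:

      # Hedged sketch: archive the whole data directory while the server is
      # stopped (it already refuses to start, so nothing should be writing).
      tar -czf /root/pg84-data-$(date +%F).tar.gz /var/lib/postgresql/8.4/main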

    Read the article

  • What RAID level for a backup server?

    - by ispirto
    I'm building a server with 12 x 3 TB disks to use for daily backups. I'm thinking of using RAID 50 to get a good 27 TB of usable space. The disks will be used brutally to back up 9 servers with 1.5 TB of data each, once a day. I'll keep the backups for 2 days, so each server gets its own 3 TB partition. Do you think this kind of huge backup load would stress the disks too much and make them fail? Would I be better off with RAID 10? Oktay

    Read the article

  • Backup virtual hard disk

    - by Harshil Sharma
    I have a VM created in VMware Player. Its VHD is currently about 17 GB, split among multiple 2 GB files. The host OS is Windows 8, and I use CrashPlan on the host for file backup. The problem is that whenever I use the VM, CrashPlan detects every part of the VHD as altered and backs up the whole 17 GB again. What I want is software that can run on the host OS (Windows 8), treat the VHD as a physical hard disk, and create incremental backups of it, including all files, programs and the OS.

    Read the article
