Search Results

Search found 13175 results on 527 pages for 'live backup'.

Page 70 of 527

  • Windows Live Messenger stops space bar working

    - by PP
    On Windows Vista, the included Windows Live Messenger frequently stops the space bar from working after a long period (or possibly after I put the computer into standby). It is frustrating to have to exit Windows Live Messenger and restart it just so I can type spaces in conversations. Does anybody else have this problem?

  • PHP Postgres backup

    - by russell kinch
    Hi. I am trying to make a PHP backup script for Postgres. I have downloaded one for the command line, which looks like this:

        #!/bin/bash
        find /home/russell/pg_bkp -type f -mtime +7 -exec rm {} \;
        time=`date +%Y-%m-%d`; # date in reverse so that the latest date appears last in the list of backup files.
        PGPASSWORD=****** pg_dump -i -h localhost -p 5432 -U postgres -F c -b -v -f "/home/russell/pg_bkp/$time.backup" ah3

    How can I implement this in PHP? The extension this creates is .backup. It works great and I have used it many times; the data is perfect. But running it from inside my website would be better. Thanks

  • Backup/restore without content in SharePoint

    - by Abhishek Rao
    I want to take a backup of my existing site and restore it on a different SharePoint server. I know this is easily possible through stsadm backup/restore, but I don't want the content of my lists to appear in the restored site. Is there any way to achieve this using the backup or export command?
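
    As far as I recall, neither stsadm backup nor stsadm export has a switch to exclude list contents, so the usual workaround is to save the site as a template with "Include Content" unchecked and create the new site from that template. For reference, a minimal export/import pair looks like the sketch below (URLs and file names are placeholders):

        stsadm -o export -url http://oldserver/sites/mysite -filename mysite.cmp
        stsadm -o import -url http://newserver/sites/mysite -filename mysite.cmp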

  • GAE/J datastore backup

    - by jb
    What is the easiest way to back up a GAE/J datastore? It looks like there is a Python bulkloader.py tool to do backups for Python apps, but what should I do to back up a Java app? Is there any way to use the Python tool?
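
    The Python SDK's bulkloader can talk to a Java app once the remote API servlet is exposed: map com.google.apphosting.utils.remoteapi.RemoteApiServlet to a path such as /remote_api in web.xml, then point the Python tool at it. A hedged sketch (app id, servlet path and file names are examples):

        # Dump the datastore of a Java app through the remote_api servlet.
        bulkloader.py --dump --app_id=myapp \
            --url=http://myapp.appspot.com/remote_api --filename=datastore.dump
        # Restore later with:
        bulkloader.py --restore --app_id=myapp \
            --url=http://myapp.appspot.com/remote_api --filename=datastore.dump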

  • VDR - Trouble writing to destination volume, error -1020 (sharing violation)

    - by woodwarp
    Using VMware Data Recovery 1.1, backing up to a CIFS share, and getting this error:

        1/18/2010 8:55:31 AM: Performing incremental back up of disk "Lun VM/VM-DB1-flat.vmdk" using "SCSI Hot-Add"
        1/18/2010 8:55:32 AM: Trouble writing to destination volume, error -1020 (sharing violation)

    Integrity checks of the destination complete successfully, and I tried rebooting the VDR appliance just in case. To work around the issue I removed the share from the VDR, pointed the backups at other destinations, renamed the VMware Data Recovery subfolder in the destination, then re-added the share and re-pointed the backups; this of course creates a new Backup Store. Does anyone have any ideas why this error is occurring? It means I can't back up into the original Backup Store any longer.

  • Backing up Information Store - Recovering to Different Information Store / RSG

    - by Kip
    Hi all, I have a question about a situation that hasn't yet arisen, but I wondered about the possibilities and how we would go about it. Currently we back up our Exchange 2003 cluster with Backup Exec; it is set to back up the Microsoft Information Store on that server and all of the mailbox stores beneath it. We have previously used this in conjunction with a recovery storage group on the same server to recover lost mailboxes. However, due to space constraints on that server (a separate issue that is being addressed in the very near future, but outside the scope of this question) we no longer have enough space on that server to do a recovery-storage-group restore. Is it possible to restore an information store to a different server in the same administrative group (i.e. First)? By that I mean we have the following:

        Server1 | First Storage Group | Mailbox Store1/2/3

    Could Mailbox Store 1 be restored to:

        Server2 | First Storage Group | Recovery Storage Group

    Both servers are under the same administrative group. Currently, for whatever reason (mainly time), the mailboxes are not being backed up individually. Regards, Kip

  • Content server backups

    - by Dan Sosedoff
    What is the best way to back up data on content servers? For example, I have 15 servers that just hold content; no applications run on them. Each server has a 250 GB hard drive, so it's a pretty large amount of data. All the data is externally accessible (via HTTP). So the question is: what methodology is best in my case? The most useful method I know is cross-backup, where each server holds its own data plus a backup of one other server, but that significantly reduces total capacity. RAID?
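
    As a minimal sketch of the cross-backup idea (hostnames and paths are hypothetical), each server could mirror its content to its designated peer from a nightly cron job:

        # On server N: push local content to its backup peer.
        rsync -az --delete /srv/content/ backup-peer.example.com:/srv/backup-of-server-n/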

  • Duplicity not writing to a pre-existing S3 bucket

    - by Saurabh Nanda
    I'm trying to back up a directory to a pre-existing Amazon S3 bucket using the following command:

        duplicity --no-encryption system/ s3+http://MY_BUCKET_NAME/backup

    However, I'm consistently getting the following error:

        S3CreateError: S3CreateError: 409 Conflict
        <?xml version="1.0" encoding="UTF-8"?>
        <Error><Code>BucketAlreadyOwnedByYou</Code><Message>Your previous request to create the named bucket succeeded and you already own it.</Message><BucketName>vacationlabs</BucketName><RequestId>3C1B8C49469E3374</RequestId><HostId>4dU1TKf3Td6R0yvG9MaLKCYvQfwaCpdM8FUcv53aIOh0LeJ6wtVHHduPSTqjDwt0</HostId></Error>

    The S3 bucket is empty and does NOT contain the backup directory. The bucket is in the Singapore region.
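
    For buckets outside the default US region, duplicity (via boto) generally has to be told to use subdomain-style bucket addressing; otherwise it tries to (re)create the bucket against the wrong endpoint and S3 answers 409. A sketch, assuming the installed duplicity version supports the option:

        # --s3-use-new-style switches boto to subdomain (virtual-host)
        # bucket addressing, which non-US buckets require.
        duplicity --no-encryption --s3-use-new-style \
            system/ s3+http://MY_BUCKET_NAME/backup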

  • Plesk error: pmm-ras error (Error code = -6) during restore. Increase the /tmp folder?

    - by eric
    I had to reinstall Plesk on a CentOS 6 system after a crash. The full backup file is 11 GB, but at the beginning of the restore I get the error:

        Error: pmm-ras error (Error code = -6)

    Argh! My disk layout is like this:

        Filesystem             Size   Used  Avail  Use%  Mounted on
        /dev/xvda1             3.7G   801M  2.9G    22%  /
        /dev/mapper/vg00-usr    14G   1.5G   12G    12%  /usr
        /dev/mapper/vg00-var   155G    14G  134G    10%  /var
        /dev/mapper/vg00-home  3.9G   136M  3.6G     4%  /home
        none                   1000M  7.5M  993M     1%  /tmp

    I suppose I have to increase my /tmp folder to accommodate the backup size, but I don't know how to. I'm on a 1&1 cloud server. Thanks for your help. You can imagine the urgency of this situation...
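
    Two hedged options, depending on how /tmp is mounted: grow it temporarily if it is a size-limited tmpfs, or point Plesk's backup temp directory at the large /var volume (the Plesk versions I have seen keep a DUMP_TMP_D setting in /etc/psa/psa.conf; the 12G size and /var/pmm-tmp path below are examples):

        # If /tmp is a tmpfs, remount it larger for the duration of the restore:
        mount -o remount,size=12G /tmp

        # Or repoint Plesk's dump temp directory to the roomy /var volume:
        mkdir -p /var/pmm-tmp
        sed -i 's|^DUMP_TMP_D.*|DUMP_TMP_D /var/pmm-tmp|' /etc/psa/psa.conf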

  • Automatically wake up notebooks not on the ethernet

    - by gletscher
    I am looking for an automated backup system, and I like Bacula. I have three notebooks and a desktop computer that need regular backups. I don't want to leave them running all night just to do the backups, so I was thinking I could use wake-on-LAN to have Bacula wake the machines, do the backups, and shut them down afterwards. While this may work with devices on the Ethernet, it won't work with the notebooks on the wifi. So is it possible to schedule the notebooks to automatically wake up from suspend or shutdown? Or is it possible to intercept a shutdown command, if it comes after a certain hour, and call the Bacula director to start the backup right then? I'm new to controlling a Linux system with scripts, so any hints on how and where to start are greatly appreciated. Thanks a lot for your help, input and ideas.
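
    Two pieces could be combined here, sketched below with placeholder MAC address, time and script name: Bacula can run a wake-up script before a job (via a RunBeforeJob directive in the Job resource), which covers the wired desktop, and a suspended Linux notebook can arm its real-time clock to wake at a set time with rtcwake, which works regardless of wifi.

        # Sent from the director before the job starts, e.g. via
        #   RunBeforeJob = "/usr/local/bin/wake-client.sh"
        # in the Job resource; this only reaches wired clients:
        wakeonlan 00:11:22:33:44:55

        # On a notebook: suspend to RAM now and wake at 02:55 tomorrow,
        # just before the backup window:
        rtcwake -m mem -t "$(date -d '02:55 tomorrow' +%s)"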

  • To Delete or Not to Delete Arcserve Makeup Jobs?

    - by Cliff Racer
    Every once in a while, a backup job fails on my backup server running ARCserve 12.5. A failure results in the creation of a 'makeup' job. I run the makeup job, and even if it completes it stays in my job queue. These have piled up over the past couple of years, and I find myself wondering whether it's OK to delete them, knowing that ARCserve relies on a SQL database to catalog backup info and data. If I run a makeup job, can I just delete it afterward? Should I just collect them? I have not seen anything so far that makes me feel confident about what to do with these leftover jobs.

  • Use Mac Pro as Time Machine target, server and editing station?

    - by Dan
    Background: My fiancée needs a Mac Pro for movie editing and rendering. I need a web server and a backup solution for my MacBook Pro.

    Idea: We thought we could split the cost of the Mac Pro and set it up to act as both a web server and a backup device.

    Question: Is this a good idea? Specifically: Is it easy to set it up to incrementally back up one or several laptops over wifi, and what software would you recommend? Is it silent and stable enough to run a web server continuously? Will it manage all this, including simultaneous editing? Thanks.

  • Run script when shutting down Ubuntu before the logged-in user is logged out

    - by Travis
    I'm writing a script to back up some local directories on a Unix machine (Ubuntu) to a Samba share. The script works fine, and I've got it running at shutdown and restart using the method described at http://en.kioskea.net/faq/3348-ubuntu-executing-a-script-at-startup-and-shutdown, which works by placing the backup script into the /etc/rc6.d and /etc/rc0.d directories. However, there is a problem: looking at the script's logfile, it seems to run after the user is logged out. We are using LDAP authentication, and once the user has logged out, the system cannot back up to their Samba share. Does anyone know of a way to run the script before the user is logged out?
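
    One possibility, assuming the machines log in through GDM: hook the backup into GDM's session-end script instead of the rc runlevels, so it runs as the session ends rather than at runlevel teardown. A sketch; /usr/local/bin/backup-to-samba.sh stands in for the existing script, and whether the user's Samba credentials are still usable at that point depends on how the share authentication is set up.

        #!/bin/sh
        # /etc/gdm/PostSession/Default -- run by GDM (as root) when a
        # session ends; $USER holds the name of the user logging out.
        /usr/local/bin/backup-to-samba.sh "$USER"
        exit 0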

  • Avoid cache overflow in Atempo LiveBackup

    - by Vebjorn Ljosa
    When attempting the initial backup of a new client, Atempo LiveBackup seems to require a very large cache; for instance, a 20 GB cache is not enough to back up a computer that has 100 GB of data. It appears that LiveBackup adds new files to the cache faster than it can send them to the server, and when the cache fills up, the backup fails. Aside from removing most of the data from the computer and adding it back gradually after the initial backup, is there a good workaround? Is it possible to make LiveBackup slow down its scan so as not to fill the cache? Or is it possible to place the cache on an external drive?

  • Unable to create a Windows 7 system image of a failing hard drive

    - by Rahul
    The hard disk of my one-year-old T400 ThinkPad has started failing periodic hardware tests: I get a "Targeted Read Test Failed" error, and the "SMART short self test" times out. I am now trying to create a Windows 7 system image of the hard disk, but it fails without giving any specific error message. I also tried Comodo Backup but got an error (code 101117) there as well. I have copied the important files to Dropbox, but would like to take a full system backup as I have plenty of software installed on the machine. Does anyone know why this is happening and how I can take a backup of the system image?
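
    Imaging tools that abort on the first unreadable sector are probably what both Windows backup and Comodo are running into here. A different tool, named plainly: GNU ddrescue, run from a Linux live CD, copies around bad sectors and keeps a map file so the copy can be resumed. A sketch, assuming the failing disk is /dev/sda and an external drive is mounted at /mnt/ext:

        # First pass: copy everything easily readable, record errors in the map.
        ddrescue -n /dev/sda /mnt/ext/t400.img /mnt/ext/t400.map
        # Second pass: retry the bad areas up to three times.
        ddrescue -r3 /dev/sda /mnt/ext/t400.img /mnt/ext/t400.map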

  • Is there a way to schedule an automatic WinClone run on my Boot Camp partition?

    - by user17873
    Hi, I currently have Time Machine set up to back up my entire OS X installation. I also have a backup tool within my Boot Camp Windows 7 installation which automatically backs up my Windows profile data to an external drive partition. Finally, I'm backing up my Boot Camp partition weekly with WinClone and storing the image on an external drive. The final piece I need to complete my external backup process is to have WinClone image my Boot Camp partition automatically once a week, rather than having to run it manually and remember to do so. Is this possible?

  • How to harden a Windows 2003 service account

    - by ITMan
    I remember a TechNet or Windows IT Pro article from a couple of years ago about how to harden Windows 2003 service accounts. For backup software purposes (such as Backup Exec, AppAssure, etc.; please don't bash these) I have to create a domain admin account (usually called something such as "backup") and have the services run under that account. I remember that, per the article, you can create the domain admin user "backup" but prevent it from logging on interactively. Does any of you remember such an article, or know how to do this?
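
    The usual mechanism is the "Deny log on locally" user right (SeDenyInteractiveLogonRight), set under Group Policy at Computer Configuration > Windows Settings > Security Settings > Local Policies > User Rights Assignment. If you want it scripted, a sketch using ntrights.exe from the Windows Server 2003 Resource Kit Tools (the account name is an example):

        rem Allow running services, but deny local and RDP logons.
        ntrights -u MYDOMAIN\backup +r SeServiceLogonRight
        ntrights -u MYDOMAIN\backup +r SeDenyInteractiveLogonRight
        ntrights -u MYDOMAIN\backup +r SeDenyRemoteInteractiveLogonRight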

  • Replicate a big, dense Windows volume over a WAN -- too big for DFS-R

    - by Jesse
    I've got a server with a LOT of small files: many millions of files, and over 1.5 TB of data. I need a decent backup strategy. Any filesystem-based backup takes too long; just enumerating which files need to be copied takes a day. Acronis can do a disk image in 24 hours, but fails when it tries to do a differential backup the next day. DFS-R won't replicate a volume with this many files. I'm starting to look at Double-Take, which seems to be able to do continuous replication. Are there other solutions that can do continuous replication at a block or sector level, not file-by-file, over a WAN?

  • Turning a (dual boot) Ubuntu 9 into a virtual machine

    - by xain
    I have a dual-boot box (Ubuntu 9 and Vista) and I'm about to upgrade Vista to Windows 7. Ubuntu is my main development environment, so I'd like to keep using it as-is from the new environment via VirtualBox or VMware. I know of tools like Clonezilla that back up entire drives; in my case, the Linux partitions are distributed across several disks, which in turn contain both Linux and Windows data. My intention is to use some backup tool (like Clonezilla, if it fits) that allows me to back up ONLY the Linux partitions distributed across several disks. Any hints? Thanks in advance.
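
    One hedged sketch of the copy step (device names and paths are examples): image the relevant Linux partitions with dd from a live CD, then convert the raw images for VirtualBox with VBoxManage. Note this only moves the data; a partition image has no partition table or boot sector, so the guest will still need a bootloader installed and /etc/fstab adjusted.

        # Image one Linux partition from a live CD (example device).
        dd if=/dev/sdb3 of=/mnt/ext/ubuntu-root.img bs=4M
        # Convert the raw image into a VirtualBox disk.
        VBoxManage convertfromraw /mnt/ext/ubuntu-root.img ubuntu-root.vdi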

  • Oracle 10g Failover Database - How to fail back?

    - by rrkwells
    I want to know how the failover-database concept works after recovery. We have configured our application to connect to a backup database if the production database fails. When that happens, all transactions take place on the backup database. Once the production DB server is running again, how do we make sure the changes made in the backup database are reflected in the production database? We want to make sure that no changes made while failed over are lost. We are using Oracle 10g.

  • Get Directory Size through SFTP

    - by Nongo
    A client has a managed web server; on this server is an e-commerce script, and part of the script dumps a backup every week. The backup is stored on the web server (not under the HTTP root). The ISP takes a copy, and my client wants to take a copy too. Before downloading the file, I want to be able to calculate the size of the backup directory, but the only access I have is through SFTP. Is it possible to easily get the directory size, and then use this in a PowerShell script? Note: I have already written an automatic download script in PowerShell, and I want to extend it. Forgive me if this sounds vague; I can provide further info if you have any specific questions.
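
    SFTP has no recursive directory-size command, so the usual trick is to take a long listing and sum the size column; a PowerShell script can do the same parsing around whatever SFTP client it already invokes. As a quick illustration with the OpenSSH sftp client (host, remote path and key-based login are assumptions):

        # Sum field 5 (file size) of a long listing over SFTP.
        # Batch mode (-b -) requires non-interactive, e.g. key-based, auth.
        echo "ls -l /home/site/backups" | sftp -q -b - user@server.example.com \
            | awk '$5 ~ /^[0-9]+$/ { total += $5 } END { print total, "bytes" }'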

  • SQL SERVER – 2008 – Introduction to Snapshot Database – Restore From Snapshot

    - by pinaldave
    Snapshot database is one of the most interesting concepts that I have used at some places recently. Here is a quick definition of the subject from Books Online: "A Database Snapshot is a read-only, static view of a database (the source database). Multiple snapshots can exist on a source database and can always reside on the same server instance as the database. Each database snapshot is consistent, in terms of transactions, with the source database as of the moment of the snapshot's creation. A snapshot persists until it is explicitly dropped by the database owner."

    If you do not know how a Snapshot database works, here is a quick note on the subject; however, please refer to the official description in Books Online for accuracy. A Snapshot database is a read-only database created from an original database called the "source database". This database operates at the page level. When a Snapshot database is created, it is produced on sparse files; in fact, it does not occupy any space (or occupies very little space) in the operating system. When any data page is modified in the source database, that data page is copied to the Snapshot database, making the sparse file size increase. When an unmodified data page is read in the Snapshot database, it actually reads the pages of the original database. In other words, the changes that happen in the source database are reflected in the Snapshot database.

    Let us see a simple example of a Snapshot. In the following exercise, we will do a few operations. Please note that this script is for demo purposes only; there are a few considerations of CPU, disk I/O and memory, which will be discussed in future posts.

    1. Create Snapshot
    2. Delete Data from Original DB
    3. Restore Data from Snapshot

    First, let us create the Snapshot database and observe the sparse file details.

        USE master
        GO
        -- Create Regular Database
        CREATE DATABASE RegularDB
        GO
        USE RegularDB
        GO
        -- Populate Regular Database with Sample Table
        CREATE TABLE FirstTable (ID INT, Value VARCHAR(10))
        INSERT INTO FirstTable VALUES(1, 'First');
        INSERT INTO FirstTable VALUES(2, 'Second');
        INSERT INTO FirstTable VALUES(3, 'Third');
        GO
        -- Create Snapshot Database
        CREATE DATABASE SnapshotDB ON
        (Name = 'RegularDB', FileName = 'c:\SSDB.ss1')
        AS SNAPSHOT OF RegularDB;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO

    Now let us see the resultset for the same. Next, let us delete something from the original DB and check the same details we checked before.

        -- Delete from Regular Database
        DELETE FROM RegularDB.dbo.FirstTable;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO

    When we check the details of the sparse file created by the Snapshot database, we will find some interesting details. The details of the regular DB remain the same. This clearly shows that when we delete data from the regular/source DB, the data pages are copied to the Snapshot database; this is the reason why the size of the snapshot DB increases. Now let us take this small exercise to the next level and restore our deleted data from the Snapshot DB to the original source DB.

        -- Restore Data from Snapshot Database
        USE master
        GO
        RESTORE DATABASE RegularDB
        FROM DATABASE_SNAPSHOT = 'SnapshotDB';
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Clean up
        DROP DATABASE [SnapshotDB];
        DROP DATABASE [RegularDB];
        GO

    Now let us check the details of the select statement, and we can see that we have successfully restored the database from the Snapshot database. We can clearly see that this is a very useful feature in case you encounter a business need for it. I would like to request the readers to suggest more details if they are using this feature in their business. Also, let me know if you think it can potentially be used to achieve any other tasks.

    The complete script of the aforementioned operation, for easy reference, is as follows:

        USE master
        GO
        -- Create Regular Database
        CREATE DATABASE RegularDB
        GO
        USE RegularDB
        GO
        -- Populate Regular Database with Sample Table
        CREATE TABLE FirstTable (ID INT, Value VARCHAR(10))
        INSERT INTO FirstTable VALUES(1, 'First');
        INSERT INTO FirstTable VALUES(2, 'Second');
        INSERT INTO FirstTable VALUES(3, 'Third');
        GO
        -- Create Snapshot Database
        CREATE DATABASE SnapshotDB ON
        (Name = 'RegularDB', FileName = 'c:\SSDB.ss1')
        AS SNAPSHOT OF RegularDB;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Delete from Regular Database
        DELETE FROM RegularDB.dbo.FirstTable;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Restore Data from Snapshot Database
        USE master
        GO
        RESTORE DATABASE RegularDB
        FROM DATABASE_SNAPSHOT = 'SnapshotDB';
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Clean up
        DROP DATABASE [SnapshotDB];
        DROP DATABASE [RegularDB];
        GO

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: SQL, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
