Search Results

Search found 6101 results on 245 pages for 'incremental backup'.

Page 109 of 245

  • Force initial Google Drive sync with a non-empty folder?

    - by Terrance Shaw
    I upgraded my iMac with an SSD last night and restored from a Time Capsule backup. Everything is now substantially zippier, with one exception: Google Drive refuses to sync with the folder it had been using before the upgrade. I ultimately had to delete the folder and let it resync from scratch to get past the stubborn error (alternatively, I suppose I could have moved the contents out, pointed Google Drive at the now-empty folder, then moved them back). Is there any way past this particular issue (for future reference), or is it a safeguard Google put in place so that a new user doesn't point the sync at their root drive?

  • Recovering OS X Mail Accounts Lost in Crash

    - by Tim
    I had a hard crash on my Mac PowerBook, and when I restarted, Mail came up with only my MobileMe account still available; I cannot figure out how to restore the other eight email accounts I have. The directories in ~/Library/Mail all seem to be there. I even rsynced the modified .plist files from a Time Machine backup of the directory from before the crash (unfortunately I was on travel, so the backup is more than a week old, and I'd like to try to recover from that point without restoring entirely from Time Machine). I also ran a permissions repair. So my questions are: Where exactly is the account information for Mac Mail kept? Any thoughts on what might have caused the failure? Why does only MobileMe come up? Any other thoughts on how to fix things?
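
    On older OS X systems the account list lives in Mail's preferences plist rather than under ~/Library/Mail, so restoring the mail directories alone won't bring accounts back. A hedged restore sketch under that assumption (the Time Machine mount point is a placeholder; quit Mail first):

        # keep a copy of the current (broken) preferences
        cp ~/Library/Preferences/com.apple.mail.plist ~/Desktop/com.apple.mail.plist.bak
        # pull the pre-crash preferences out of the Time Machine snapshot
        rsync -av "/Volumes/TimeMachine/Latest/Users/$USER/Library/Preferences/com.apple.mail.plist" \
            ~/Library/Preferences/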

  • Can't write to file - 'Operation not permitted' WITH sudo

    - by charliehorse55
    I am having trouble writing to a few files on an external HD. I am using it to store media files as well as my Time Machine backup. The drive is formatted as HFS+ Journaled, and other files on the drive can be written successfully. Additionally, the Time Machine backup is working perfectly. Permissions for the file:

        $ ls -le -@ Parks\ and\ Recreation\ -\ S01E01.avi
        -rw-rw-rw-@ 1 evantandersen staff 182950496 22 May 2009 Parks and Recreation - S01E01.avi
            com.apple.FinderInfo    32

    Things I have already tried:
    - sudo chflags -N
    - sudo chown myusername
    - sudo chown 666
    - sudo chgrp staff
    - checked that the file is not locked (Get Info in Finder)

    Why can't I modify that file? Even with sudo I can't modify it at all.
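
    A hedged diagnostic sketch: on HFS+ a file can refuse writes even for root when a BSD immutable flag or a deny ACL is set, neither of which chown/chgrp touches (filename as in the question):

        # show BSD flags; uchg/schg block writes even for root
        ls -lO "Parks and Recreation - S01E01.avi"
        # clear the immutable flags if they are set
        sudo chflags nouchg,noschg "Parks and Recreation - S01E01.avi"
        # list any ACL entries, then strip them
        ls -le "Parks and Recreation - S01E01.avi"
        sudo chmod -N "Parks and Recreation - S01E01.avi"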

  • Switching BIOS SATA RAID/AHCI setting causes BSOD at Windows Start - Why?

    - by thephatp
    I just changed my disk setup from one SATA HDD as the primary OS disk plus two SATA HDDs as backup disks in RAID 1, to one SATA SSD as the primary OS disk plus one SATA HDD as a backup disk (no RAID). Everything worked great, no problem. So, since I don't have a RAID array anymore, I decided I could change my BIOS setting to AHCI instead of RAID. I have a Gigabyte GA-P35-DS3R v1.0 mobo. These are my steps:
    1. In Integrated Peripherals, changed "SATA RAID/AHCI Mode" from RAID to AHCI and rebooted.
    2. The Windows start screen shows up, but as the color orbs are spinning into focus: BSOD and immediate restart. Repeated the reboot several times, same outcome.
    3. Back in the BIOS, under Integrated Peripherals, also changed "Onboard SATA/IDE Ctrl Mode" from RAID to AHCI and rebooted.
    4. Same outcome: BSOD and immediate restart as Windows starts.
    Switching both settings back to RAID and rebooting brings Windows up just fine, no issues. What am I missing? Why can't I set it to AHCI mode without BSODs?
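
    The usual cause on Windows 7 (which the spinning orbs suggest) is that the in-box AHCI driver is marked disabled because Windows was installed while the controller was in RAID mode, so the boot volume can't be mounted after the switch. A hedged sketch of the registry change typically made before flipping the BIOS setting:

        rem enable the Microsoft AHCI boot-start driver, then reboot and switch the BIOS to AHCI
        reg add "HKLM\SYSTEM\CurrentControlSet\Services\msahci" /v Start /t REG_DWORD /d 0 /f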

  • Trouble backing up a large MySQL database

    - by Patrick
    I have a WordPress MU database with something like 10,000+ tables for various users' blogs. I need to upgrade WordPress MU to the newest version, but want to back up the DB beforehand. phpMyAdmin fails to even load the page when I click export. I've tried going into the server (Windows) and using the DOS command line:

        mysqldump -u USERNAME -p PASSWORD > BACKUP.sql

    but it hangs for a minute and gives me the error:

        error 23: out of resources when opening file '.\USERNAME\wp_1037_links.MYD' (Errorcode: 24) when using LOCK TABLES

    What am I doing wrong, or what should I be doing? Is phpMyAdmin right for something this size? Is there a better way of doing this than the two methods I tried? Note that this is not my site, so any suggestions as to the setup of the DB I'll have to run by the owner. I'm just here for WP-related crap; this is kind of out of scope for what I was brought on to do.
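
    Errcode 24 is the operating system's "too many open files" limit: by default mysqldump LOCKs every table in the database at once, which with 10,000+ MyISAM tables means tens of thousands of file handles. A hedged sketch of the usual workaround (database name is a placeholder):

        rem dump tables one at a time instead of locking them all up front
        mysqldump -u USERNAME -p --skip-lock-tables --quick wordpress_mu > BACKUP.sql

    Raising open_files_limit in my.ini (and restarting MySQL) is the other common fix, at the cost of a server setting change the owner would need to approve.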

  • Failover tmpfs mirroring. Am I doing it right?

    - by user45286
    My goal is to have a certain directory available as tmpfs. There will be some modifications in this dir during server uptime, and those modifications must be synced to a non-tmpfs persistent dir on the HDD via rsync. After the server boots, the latest version from the non-tmpfs persistent dir must be copied to tmpfs and the rsync syncing started. I'm afraid that rsync will erase the non-tmpfs backup if the tmpfs dir is empty. I'm doing it this way right now:
    1. Create the tmpfs partition in /etc/fstab.
    2. In /etc/rc.local (pseudocode):
       - delete the "tmpfs rsync" cronjob from /var/spool/cron/crontabs, if there is any
       - cp -r /path/to/non-tmpfs-backup /path/to/tmpfs/dir
       - append the "tmpfs rsync" cronjob to /var/spool/cron/crontabs
    What do you think?
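
    A minimal bash sketch of that boot sequence (paths and schedule are assumptions). Removing the cron job before the copy and re-adding it only afterwards closes the window where a sync could mirror an empty tmpfs over the persistent copy:

        #!/bin/sh
        # /etc/rc.local sketch
        PERSIST=/path/to/non-tmpfs-backup
        TMPFS=/path/to/tmpfs/dir
        # drop any stale sync job so it cannot fire mid-restore
        crontab -l 2>/dev/null | grep -v 'tmpfs-sync' | crontab -
        # seed tmpfs from the persistent copy
        cp -a "$PERSIST/." "$TMPFS/"
        # re-register the sync job (every 5 minutes, tagged for later removal)
        ( crontab -l 2>/dev/null; \
          echo "*/5 * * * * rsync -a --delete $TMPFS/ $PERSIST/ # tmpfs-sync" ) | crontab -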

  • Disk redundancy across different servers

    - by Mascarpone
    I have 3 servers, all with the same specs: Intel CPU, 8 GB RAM, Linux or BSD, and a single 2TB desktop SATA drive with more than 10K hours of operation and less than 300 GB used. My provider cannot install a second hard drive, but can guarantee me that the drive will be replaced immediately in case of failure, with another equally crappy drive. The likelihood of drive failure is high, and since I can't use RAID, I was thinking about keeping a backup of each machine on all the other machines, so that there are always 2 copies on 2 different drives, plus the original. I would synchronize the drives every hour with rsync to guarantee some sort of redundancy; since bandwidth inside the DC is free, this would be much cheaper than offsite backup. (A daily offsite backup is kept anyhow.) What do you think? Any suggestions?
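
    A hedged sketch of such a rotation (hostnames and paths are assumptions). Pulling rather than pushing means a failing or compromised peer cannot overwrite its own backups on the other machines:

        # crontab on server1; mirror entries run on the other two servers
        0  * * * * rsync -a --delete server2:/srv/data/ /backups/server2/
        30 * * * * rsync -a --delete server3:/srv/data/ /backups/server3/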

  • How to express inter-project dependencies in Eclipse PDE

    - by Roland Tepp
    I am looking for the best practice for handling inter-project dependencies between mixed project types, where some of the projects are Eclipse plug-in/OSGi bundle projects (an RCP application) and others are plain old Java projects (web services modules). A few of the Eclipse plug-ins depend on the Java projects. My problem is that, at least as far as I've looked, there is no way to cleanly express such a dependency in the Eclipse PDE environment. I can have plug-in projects depend on other plug-in projects (via Import-Package or Require-Bundle manifest headers), but not on plain Java projects. I seem to be able to have a project declare a dependency on a jar from another project in the workspace, but those jar files get picked up by neither the export nor the launch configuration (although Java code editing sees the libraries just fine). The "Java projects" are used to build services deployed on a J2EE container (JBoss 4.2.2 for the moment) and in some cases produce multiple jars: one deployed in the JBoss ear and another used by client code (the RCP application). The way we've "solved" this for now is with two more External Tools launch configurations: one builds all the jars, the other copies them into the plug-in projects. This works (sort of), but the "whole build" and "copy jars" targets incur quite a large build step, bypassing Eclipse's incremental build feature, and by copying the jars instead of referencing the projects I am decoupling the dependency information and triggering a massive workspace refresh that eats up development time like it was candy. What I would like is a much more "natural" workspace setup that manages dependencies between projects, requests incremental rebuilds only as they are needed, lets the RCP application's plug-ins use client code from the service libraries, and launches the RCP application with all the necessary classes in place. So can I have my cake and eat it too? ;)

    NOTE: To be clear, this is not so much about dependency or module management as it is about Eclipse PDE configuration. I am well aware of products like Maven, Ivy and Buckminster; they solve a quite different problem (once the workspace configuration issue is solved, those products can come in handy for materializing the workspace and building the product).
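
    One common workaround (a hedged sketch, not necessarily the best practice being asked for) is to give each plain Java project a minimal OSGi manifest — after converting it via PDE's "Convert Projects to Plug-in Projects" — so PDE can resolve it like any other bundle, while the JBoss build keeps packaging the same source into plain jars. All names below are hypothetical:

        Manifest-Version: 1.0
        Bundle-ManifestVersion: 2
        Bundle-SymbolicName: com.example.services.client
        Bundle-Version: 1.0.0.qualifier
        Export-Package: com.example.services.client

    With that in place, RCP plug-ins can depend on the project via Import-Package, and exports and launch configurations pick it up through the normal PDE mechanisms.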

  • How many hardlinks in a drive?

    - by acidzombie24
    I made a backup of one of my external drives to another; they are both NTFS filesystems. I moved ALL disk contents into a folder called a and right-clicked to get the file/folder/size count: they are exactly the same. However, Windows reports J: (the backup) as having 1.33 GB and Q: as 521 MB. I think maybe it's because of hardlinks; I must have more on J: than on Q:. How might I figure out how many hardlinks I have on a drive?
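
    A hedged sketch using fsutil (run from an elevated prompt; the drive letter is an assumption) that walks a drive and reports every file reachable by more than one NTFS hardlink:

        @echo off
        rem count the names pointing at each file; more than 1 means it is hardlinked
        for /r Q:\ %%F in (*) do (
            for /f %%N in ('fsutil hardlink list "%%F" ^| find /c /v ""') do (
                if %%N GTR 1 echo %%N links: %%F
            )
        )

    Running the same loop against J:\ and comparing the listings would show which links the copy turned into independent files.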

  • Backing up VM data to host drive on Windows 7

    - by malcolms
    I have created a VM in Windows Virtual PC on Windows 7. I am writing a batch file to back up data in the VM to a USB drive on the host. I have shared the host drives, and I have a USB drive that I want to back up to. But how do I refer to the USB drive in the batch file? I cannot seem to map a drive to it; it is called "H on Malcolm-Desktop" in Windows Explorer. This is what I have tried:

        XCOPY C:\Inetpub\wwwroot "\\H on Malcolm-Desktop\HALII_VHD_Backup\DataBackup\Inetpub\wwwroot" /S /E /Y /D

    How do I write this command?
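
    Windows Virtual PC hands host drives to the guest through its RDP-based integration features, so inside the guest they live under the \\tsclient namespace rather than a normal share name. A hedged sketch (the folder path is taken from the question; the drive-letter choice is arbitrary):

        rem map the host's H: drive, copy, then drop the mapping
        net use Z: \\tsclient\H
        XCOPY C:\Inetpub\wwwroot "Z:\HALII_VHD_Backup\DataBackup\Inetpub\wwwroot" /S /E /Y /D
        net use Z: /delete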

  • IIS 6 SSL Restore from PFX without Deleting Pending Request

    - by Sev
    I requested a new SSL certificate from a certificate authority, but until they process it my site is losing business. Before making the request, I had backed up the original certificate to a PFX file. Now when I try to restore the backup, IIS forces me to either delete the pending request or process it. Since the new certificate isn't ready yet, is there any way to restore the backup without deleting the request? Or will it cause any issues if I delete the request and then install the new certificate when it comes in? The server is IIS 6.

  • SQL Server 2000 msdb database loading/suspect

    - by Blake Parcell
    My SQL Server recently suffered a RAID controller/hard drive crash. After getting the hard drive problem corrected, I soon found that some of my databases were suspect, most notably msdb. I am not a DBA by any means, but I am somewhat familiar with the daily SQL activities on my server. So I restored from backup and tried to bring the msdb database online. It is now forever stuck in (Loading\Suspect), and I am unable to schedule backups for my important databases. I can recreate all of the backup plans if I can somehow get a working msdb. Any help would be greatly appreciated. I am currently using Microsoft SQL Server 2000, version 8.00.194.
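
    A database that stays in "Loading" was typically restored WITH NORECOVERY and is still waiting for log backups that are never coming. A hedged T-SQL sketch of finishing the restore (the file path in the second form is a placeholder):

        -- if the restore is merely unfinished, complete recovery without another backup
        RESTORE DATABASE msdb WITH RECOVERY
        -- or redo the whole restore in one step from the backup file
        -- RESTORE DATABASE msdb FROM DISK = 'D:\Backups\msdb.bak' WITH REPLACE, RECOVERY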

  • How to Set Linux Bonding Interface to Gigabit

    - by Kyle Brandt
    I have enabled Linux active-backup mode bonding. Each interface is a gigabit interface, but the bond interface seems to end up at 100 Megabit:

        bonding: bond0: Warning: failed to get speed and duplex from eth1, assumed to be 100Mb/sec and Full.
        ...
        bnx2: eth0 NIC Link is Up, 1000 Mbps full duplex, receive & transmit flow control ON
        ...
        bonding: bond0: backup interface eth1 is now up

    ethtool apparently can't provide info on the bond:

        $ sudo ethtool bond0
        Settings for bond0:
        No data available

    So does this mean I am operating at 100 or 1000 Megabit (my guess is 1000)? If it is only 100, what options in the ifcfg scripts or the modprobe bonding options do I need to set to make it 1000?
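
    A hedged diagnostic sketch: the bond is a virtual device, so query the slave NICs directly and read the bonding driver's own record of what it detected (interface names as in the question):

        # per-slave link speed and duplex
        sudo ethtool eth0
        sudo ethtool eth1
        # the bonding driver's view: active slave, plus speed/duplex per slave
        cat /proc/net/bonding/bond0

    If the active slave shows 1000 Mbps there, traffic is running at gigabit; the boot-time warning likely only means the driver read eth1 before its link had finished negotiating.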

  • rsync command deletion error "IO error encountered -- skipping file deletion"

    - by Jam88
    I use an rsync command to back up files from one of my Ubuntu servers to another Ubuntu machine. The backup server triggers a script that runs rsync. Here is the command I use:

        rsync -rltvh --partial --stats --exclude=.beagle/ --exclude='.*' --delete-after root@live_server:/home/ /home/live_server_backup/home > /tmp/logfile.log 2>&1

    live_server is ssh-able without a password, so this works. Now the problem is with the --delete-after option. After all files are synced, the deletion step is skipped; the logfile error is:

        IO error encountered -- skipping file deletion

    When I looked for the cause, there were errors during the file sync:

        rsync: send_files failed to open "/home/xyz/Desktop/PPT_session_1_context.pdf": Permission denied (13)

    So my understanding is that since rsync could not read all the files from the source, it skips the deletion for safety. Is there any way to make --delete-after work even if there is some permission error? I do not want to use forced deletion, as it would be dangerous in some situations.
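
    rsync's documented switch for this is --ignore-errors, which tells --delete to proceed despite I/O errors; it carries exactly the risk the question worries about, so a hedged sketch that previews the deletions with a dry run first:

        # -n (dry run) lists what would be transferred and deleted, without doing it
        rsync -rltvhn --partial --stats --ignore-errors --delete-after \
            --exclude=.beagle/ --exclude='.*' \
            root@live_server:/home/ /home/live_server_backup/home
        # rerun without -n once the deletion list looks sane

    Fixing the unreadable files' permissions on live_server is the safer cure, since it removes the IO error instead of overriding it.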

  • Robocopying DLLs and EXEs

    - by BinaryDeuce
    I've used Robocopy for several years now for backup purposes, and never looked back at any backup application I used in the past. I replicate whole valued directories to a removable HD, then from this HD to an equivalent system. Over time, quite a few DLLs and EXEs have accumulated in these directories, none of which Robocopy seems able to copy to my external HD. Thus, my 2 computers drift apart slowly... Is there any way, using one of the eighty-nine (89) switches (or one of the 2^89 - 1 = 6.19 × 10^26 combinations thereof), to force Robocopy to robotically copy EXEs, DLLs and the other "access is denied" friends of mine? Thanks
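
    A hedged sketch: /B (or /ZB, which falls back to backup mode only when a normal copy is denied) makes Robocopy use backup-mode I/O, which bypasses ordinary ACL checks when run from an elevated prompt; /R and /W keep it from stalling for ages on files that still refuse:

        rem run elevated; restartable copy with backup-mode fallback
        robocopy C:\Valued H:\Backup /MIR /ZB /R:1 /W:1

    If the failures are permission-based, this usually gets the DLLs and EXEs across; if the files are locked by running processes, no switch combination will.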

  • HDD is not recognized/initialized via USB, only via SATA - is a reformat through USB a bad idea?

    - by Wuschelbeutel Kartoffelhuhn
    I have a 4TB Hitachi HDD that I purchased in Europe and use as a backup disk; I run Windows 7. When I connect it to a SATA port, it is recognized in Windows Explorer and gives no problems, even after transferring 3TB at a time or after being on for days. When I connect it via a SATA-to-USB 2.0 adapter, it is also recognized, but when I transfer a large amount of data it will intermittently stop being recognized by Windows Explorer and the transfer is cancelled. When I connect it via an external enclosure (technically a SATA-to-USB 3.0 adapter), it does not show up in Windows Explorer at all, but Disk Management shows the drive, albeit uninitialized (it prompts for a format). I only got the external enclosure because I want to back up my files more conveniently (instead of having to open the computer case each time). Do you advise against reformatting/initializing via the external enclosure? Can it screw things up in an irrevocable way (Master Boot Record, etc.)?
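
    A hedged explanation worth checking before any reformat: many USB enclosures translate the drive's 512-byte sectors to 4K so that capacities over 2TB fit in an MBR partition table, and a disk partitioned through such a bridge becomes unreadable over native SATA (and vice versa), which matches the "uninitialized" symptom. A quick comparison of what Windows sees through each path:

        rem run once with the drive on SATA and once in the enclosure, then compare
        wmic diskdrive get Model,BytesPerSector,Size

    If the two runs disagree, initializing through the enclosure would create a disk only that enclosure can read; repartitioning can undo it, but not without losing what is on the drive.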

  • Ubuntu OpenLDAP and Mac OS X Roaming Profiles

    - by Sam Hammamy
    Today I'm installing OpenLDAP on Ubuntu 11.04. I have gone through the installation a few times before, but what I have never been able to do is use Mac OS X Lion's Directory Utility to administer the LDAP server, and I never got roaming profiles working. So the questions are:
    1. Is there a configuration I need in OpenLDAP so I can administer it remotely? I vaguely remember something like that, but I can't find the answer online.
    2. What do I need to set up roaming profiles for Mac OS X? I'd like to back up all my laptop data to the LDAP machine, and potentially also back up with Time Machine to an external HD mounted on the Ubuntu machine (but that's a few days ahead for now).
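
    On the first question, remote administration mostly comes down to the server's ACLs granting your admin DN write access, since slapd listens on the network by default. A hedged slapd.conf-style sketch with hypothetical DNs (on Ubuntu 11.04 the equivalent rules go into the cn=config olcAccess attributes):

        # let the admin identity write everything; users may edit their own entries
        access to *
                by dn.exact="cn=admin,dc=example,dc=com" write
                by self write
                by * read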

  • Best way to rip DVD movies to ISO files

    - by alex
    I'm trying to back up my DVD collection. I have Handbrake, and will eventually experiment with the best settings to use. For now, I'd like to back up the DVDs to ISO files that I can mount and then run Handbrake on later, or burn back onto DVD should the original get damaged. I also have a WD TV box that is capable of playing ISO files. What's the best program for doing this? I'm not so much concerned with file size.

  • Advantages of Terminal Server instead of a normal client-server installation?

    - by Sam
    What are the advantages of using a (Windows) terminal server and thin clients instead of a normal server and full clients? So far I've only really used normal servers and clients, but now customers are asking about terminal servers, and I'd like to know the pros and cons of using them instead of an "old-fashioned" client-server network. Some things I can guess: easier administration (no need to install/update Office and the like on 20 computers, only on the server) and easier backup (no need to back up client computers). And I'd guess it would be hard (impossible?) to connect and use local hardware, like USB devices, with a terminal server? What else speaks for or against switching to a terminal server?

  • Can't mount Time Capsule via Wi-Fi

    - by Grnmntn
    I have an Apple Time Capsule and have been able to back up to it without a problem for the past few months. However, today I found I cannot connect to the Time Capsule from Finder: it appears there, but when I click "Connect As...", it takes a few seconds and then reports that the connection failed: "the server may not exist or it is not operational at this time. Check the server name or IP address and your network connection and try again". When I check Time Machine, it says the last backup failed because "the backup volume could not be mounted". The strange part is that the Time Capsule is also my router, and I am still able to use it as a wireless access point without a problem. Any ideas?

  • Moving VMWare Fusion image to Boot Camp

    - by Kristopher Johnson
    I have 64-bit Windows 7 running in VMware Fusion on my MacBook, but I am disappointed with the performance, so I want to try Boot Camp. However, I'd like to avoid reinstalling Windows and all my applications; I just want to somehow copy my VMware Fusion disk image to a Boot Camp partition. My initial thought was that I should be able to run a Windows backup program inside VMware Fusion to back up the entire virtual disk, then set up Boot Camp and restore from that backup. However, Googling finds a few posts by people who have tried that and encountered problems. So, is there a "known good" procedure for doing this?

  • P2V using Acronis True Image Home 10 and Windows 7

    - by Anthony
    I have a full system image made with Acronis True Image Home 10 and want to run it as a virtual machine on Windows 7 Professional. I have created a virtual machine, but Windows Virtual PC doesn't allow access to a USB external hard disk when booting from the Acronis recovery CD. I've copied the backup onto the host machine and can access it over the network from the Acronis boot CD, but I'm wondering if there is an easier way. Does any other free virtual machine software support USB devices during boot (i.e. so I could restore a backup image from the USB hard disk directly)?

  • Setting up a git server on CentOS 6.4

    - by hguser
    We have a server running CentOS 6.4. We want to make this server our backup and version control server. There are ten users on our team, so I created ten accounts accordingly; each user can back up files to their own home directory using FTP. However, I do not know how to set up the version control side; we would prefer to use git. We want to implement this: everyone can create git repositories in his home directory, with read/write access using his own account. Is this possible?
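
    Yes; with plain SSH accounts this needs no extra server software, because git speaks its native protocol over SSH. A hedged sketch (user, host and path are hypothetical):

        # on the server, as one of the ten users:
        git init --bare ~/repos/project.git
        # from any client machine with SSH access to that account:
        git clone user1@server:repos/project.git

    Read/write access then follows ordinary Unix permissions on the repository directory, which is exactly the per-account model described.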

  • Install Mountain Lion using the .dmg from the installer app

    - by Leonardo
    I already have the Mountain Lion installer app, downloaded from the Mac App Store. Now I would like to install ML on another machine I own. I didn't want to download it again, so I copied the .app to the other machine and tried to run it; unfortunately, due to some error about the Mac being unable to run the .app, I wasn't able to install. Most tutorials suggest making a bootable USB drive from the installer's InstallESD.dmg, found inside the .app. I would instead like to run the .dmg directly. So I have three preliminary questions:
    1. Can I just mount and run the .dmg without making a bootable drive?
    2. I do have a backup, a Time Capsule one to be precise. In case of failure, can I just use that previous backup and restore to Lion 10.7?
    3. From the App Store's point of view, would my machine be recognized as upgraded and eligible for future updates?
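
    On the first question, the image can at least be mounted and inspected without building boot media; a hedged sketch (the path is the usual location inside the 10.8 installer app, but verify it on your copy):

        # mount the embedded installer image
        hdiutil attach "/Applications/Install OS X Mountain Lion.app/Contents/SharedSupport/InstallESD.dmg"

    Whether the installer it contains will run to completion from a mounted image rather than from boot media is exactly what the first question asks, so treat this as a way to experiment, not a guarantee.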

  • Public key always asking for password and passphrase

    - by Andrew Atkinson
    I am trying to SSH from a NAS to a webserver using a public key. The NAS user is 'root' and the webserver user is 'backup'. I have all permissions set correctly, and when I debug the SSH connection I get (the last little bit of the debug):

        debug1: SSH2_MSG_SERVICE_REQUEST sent
        debug1: SSH2_MSG_SERVICE_ACCEPT received
        debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,password
        debug1: Next authentication method: publickey
        debug1: Offering DSA public key: /root/.ssh/id_dsa.pub
        debug1: Server accepts key: pkalg ssh-dss blen 433
        debug1: key_parse_private_pem: PEM_read_PrivateKey failed
        debug1: read PEM private key done: type <unknown>
        Enter passphrase for key '/root/.ssh/id_dsa.pub':

    I am using the command:

        ssh -v -i /root/.ssh/id_dsa.pub [email protected]

    The fact that it is asking for a passphrase is a good sign, surely, but I do not want it to prompt for this or for a password (which comes afterwards if I just press return at the passphrase prompt).
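
    The debug output contains the answer: ssh is being handed the public key file as its identity, fails to parse a private key out of it (PEM_read_PrivateKey failed), and falls back to prompting. A hedged sketch of the usual fix (hostname hypothetical):

        # -i must point at the private key; ssh picks up id_dsa.pub on its own
        ssh -v -i /root/.ssh/id_dsa backup@webserver.example.com

    If the private key itself carries a passphrase, removing it with ssh-keygen -p (or loading it into an agent) is what makes the login fully unattended.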
