Search Results

Search found 2822 results on 113 pages for 'scheduled backups'.

Page 48 of 113

  • DFS "clobbering" files

    - by Badger
    We have DFS set up using the DFS Management administrator tool. I turned on replication in the Distributed File System administrator tool as well, and this morning we lost tons of files from that share. Please explain to me why this was wrong, and whether there is anything that can be done to repair it. (No, we don't have backups. We had some shadow copies, but those were deleted as well. We have been using DFS as its own backup.)

  • Has anyone achieved true differential sync with rsync in ESXi?

    - by Julius
    Berate me later for the fact that I'm using the service console to do anything in ESXi... I've got a working rsync binary (v3.0.4) that I can use in ESXi 4.1U1. I tend to use rsync over cp when copying VMs or backups from one local datastore to another local datastore. I've used rsync to copy data from one ESXi box to another, but that was just for small files. I'm now trying to do true differential syncs of backups taken via ghettoVCB between my primary ESXi machine and a secondary one. But even when I do this locally (one datastore to another datastore on the same ESXi machine), rsync appears to copy the files in their entirety. I've got two VMDKs totalling 80 GB in size, and rsync still takes anywhere between 1 and 2 hours, but the VMDKs aren't growing that much daily.

    Below is the rsync command I'm executing. I am copying locally because ultimately these files will be copied onto a datastore created from a LUN on a remote system; it's not an rsync that will be serviced by an rsync daemon on a remote system.

        rsync -avPSI VMBACKUP_2011-06-10_02-27-56/* VMBACKUP_2011-06-01_06-37-11/ --stats --itemize-changes --existing --modify-window=2 --no-whole-file
        sending incremental file list
        >f..t...... VM-flat.vmdk 42949672960 100% 15.06MB/s 0:45:20 (xfer#1, to-check=5/6)
        >f..t...... VM.vmdk 556 100% 4.24kB/s 0:00:00 (xfer#2, to-check=4/6)
        >f..t...... VM.vmx 3327 100% 25.19kB/s 0:00:00 (xfer#3, to-check=3/6)
        >f..t...... VM_1-flat.vmdk 42949672960 100% 12.19MB/s 0:56:01 (xfer#4, to-check=2/6)
        >f..t...... VM_1.vmdk 558 100% 2.51kB/s 0:00:00 (xfer#5, to-check=1/6)
        >f..t...... STATUS.ok 30 100% 0.02kB/s 0:00:01 (xfer#6, to-check=0/6)

        Number of files: 6
        Number of files transferred: 6
        Total file size: 85899350391 bytes
        Total transferred file size: 85899350391 bytes
        Literal data: 2429682778 bytes
        Matched data: 83469667613 bytes
        File list size: 129
        File list generation time: 0.001 seconds
        File list transfer time: 0.000 seconds
        Total bytes sent: 2432530094
        Total bytes received: 5243054

        sent 2432530094 bytes  received 5243054 bytes  295648.92 bytes/sec
        total size is 85899350391  speedup is 35.24

    Is this because ESXi itself is making so many changes to the VMDKs that, as far as rsync is concerned, the entire file has to be retransmitted? Has anyone actually achieved a true differential sync with ESXi?
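
    One hedged variation worth testing (an assumption, not a confirmed fix): -I forces rsync to re-run the delta algorithm on every file even when size and modification time match, and without --inplace the destination VMDK is rewritten as a fresh copy rather than patched. The sketch below drops -I and adds --inplace; note that -S (sparse) and --inplace cannot be combined on older rsync releases such as 3.0.x, so sparse handling is dropped as well.

        # Hypothetical variation on the command above, not a verified fix:
        # update destination files in place and let the size/mtime quick check
        # decide which files need the delta pass at all.
        rsync -avP --inplace --no-whole-file --existing --modify-window=2 \
            --stats --itemize-changes \
            VMBACKUP_2011-06-10_02-27-56/* VMBACKUP_2011-06-01_06-37-11/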

  • How to migrate the data directory for MSSQL Server?

    - by Ryan
    I have an installation of MSSQL where I would like to move the data directory to another drive so that all the existing databases are located there and all new databases are created there, as well as the backups, logs, etc. I know I can detach/attach the existing databases, but what about the rest of the settings (backup, new databases)? Is this possible without an uninstall/reinstall? Thank you.

  • Volume Shadow Copy Remotely?

    - by Wringley
    I'm currently running a Windows Server 2008 R2 box and want to use Volume Shadow Copy, but my development machine doesn't have enough hard drive space for 30 days' worth of copies, which is what I am aiming for; I currently only have 5 days of backups. I have another Windows XP box, and I was wondering if it is possible to store the shadow copy images remotely on that XP box. If it isn't possible with the built-in Volume Shadow Copy, is there an open source alternative I can use that has the same or similar functionality?

  • VSS Information

    - by MJ
    I'm currently taking care of backups for about 100 clients, and I'm really getting hammered with VSS issues. I've tried many different things: re-registering the VSS DLLs, reboots, patches, etc. What I want to know is: where can I find some detailed, technical information about VSS?

  • Restore DPM 2010 protection groups from partitions

    - by Dragouf
    I have Data Protection Manager (DPM) 2010. I did a backup of my system, which was saved onto different partitions. The computer running DPM crashed and is not allowing me to restore the backup; however, I still have all the backups as partitions. How can I restore the multiple protection groups from the physically existing partitions? I have been searching the MSDN documentation for a solution, but no luck so far. Thanks for your help.

  • Ubuntu server: backup of server, MySQL database, and Solr database

    - by Camran
    How is backup done on Ubuntu servers? I have a server (Ubuntu 9.10) with apache2, php5, mysql, etc. installed. The website is a classifieds site where all classifieds are stored in MySQL and Solr. I need to back up this server with all its data to be able to fully restore it if something goes wrong. How should I start? Is it an automated task, or will I do the backups manually? (I prefer manually.) Thanks
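
    A minimal manual-backup sketch, assuming a typical LAMP layout: web root under /var/www, Apache config in /etc/apache2, a Solr data directory under /opt/solr/data, and MySQL credentials supplied via ~/.my.cnf. Every one of those paths is an assumption and will differ per install.

        #!/bin/bash
        # Sketch: dump MySQL, then archive the web root, Apache config, the
        # Solr index and the dump into one dated tarball. All paths below are
        # assumptions; adjust to the real layout.
        STAMP=$(date +%F)
        DEST=/var/backups
        mysqldump --all-databases --single-transaction > "$DEST/mysql-$STAMP.sql"
        tar -czpf "$DEST/site-$STAMP.tar.gz" \
            /var/www /etc/apache2 /opt/solr/data \
            "$DEST/mysql-$STAMP.sql"

    The resulting tarball should then be copied off the machine (for example with scp or rsync to another host) so a disk failure does not take out both the server and its backups.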

  • How to set up a file server in a restricted corporate environment

    - by Emilio M Bumachar
    I work in a big corporation, and the disk space my team gets on the corporate file server is so low that I am considering turning my work PC into a file server. I ask this community for links to tutorials, software suggestions, and advice in general about how to set it up.

    My machine is an Intel Core2Duo E7500 @ 3GHz with 3 GB of RAM, running Windows XP Service Pack 3. Upgrading, formatting or installing another OS is out of the question, but I do have Administrator privileges on the PC, and I can install programs (at least for now). A lot of security software I don't even know about is, and must remain, installed. However, I only need communication within the corporate network, which is not restricted.

    People have usernames (logins) on the corporate network, and I need to use them to restrict access. Simply put, I have a list of logins of team members, and only people on the list should access the files. I have about 150 GB of free disk space and am thinking of allocating 100 GB to the team's shared files. I plan monthly backups onto co-workers' machines of the same configuration, but automation of backups is a nice, unnecessary feature: it's totally acceptable for me to manually copy the contents to a different machine once a month. Uptime is important, as everyone would use these files in their daily work.

    I have experience as a Python and C programmer, but no experience whatsoever as a sysadmin, and almost none of my programming experience is network programming. I'm a complete beginner at this. Thanks in advance for any help.

    EDIT: I honestly appreciate all the warnings, I really do, but what I plan to make available is mostly stuff that is now solely on DVDs just for space reasons. It's 'daily work' to read them, but 'daily work write' files will remain on the corporate server. As for the importance of uptime, I think I overstated it: a few outages are OK; it's already an improvement over getting the DVDs. As for policy, my manager is kind of on my side, and I will confirm that before making my move. As for getting more space through the proper channels, well, that was Plan A, and it's still on the table... but I don't have much hope. I'm not as "core business" as I'd like.

  • Evaluate cron expression

    - by Jake A. Smith
    Is there a command-line tool that will simply evaluate a cron expression and return a boolean response indicating whether it is supposed to be running right now? I'm looking for something I can use as a utility in another bash script, something like:

        run_script=$(/tools/evaluate-cron-expression "02 4 * * *")
        if [ "$run_script" -eq "1" ]
        # etc etc

    I know, I know, I could just set up a real cron job, but I'm playing with the idea of wrapping all of my scheduled scripts inside another script.
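
    No such standard tool ships with cron itself, as far as I know. A minimal sketch of what such a helper could look like is below; the script name and behaviour are hypothetical, and it only understands "*" and plain numeric fields, not ranges, lists or step values.

        #!/bin/bash
        # Hypothetical evaluate-cron-expression: exit 0 if the current time
        # matches a 5-field cron expression; handles only "*" and plain numbers.
        read -r c_min c_hour c_dom c_mon c_dow <<< "$1"
        match() { [ "$1" = "*" ] || [ "$1" -eq "$2" ] 2>/dev/null; }
        match "$c_min"  "$(date +%-M)" &&
        match "$c_hour" "$(date +%-H)" &&
        match "$c_dom"  "$(date +%-d)" &&
        match "$c_mon"  "$(date +%-m)" &&
        match "$c_dow"  "$(date +%w)"

    A caller would then test the exit status directly, e.g. if /tools/evaluate-cron-expression "02 4 * * *"; then ... fi, rather than comparing captured output against "1".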

  • Making Puppet manifests/modules available to a wide audience

    - by Kyle Smith
    Our team rolled Puppet out to our systems over the last six months. We're managing all sorts of resources, and some of them involve sensitive data (database passwords for automated backups, license keys for proprietary software, etc.). Other teams want to get involved in the development of (or at least be able to see) our modules and manifests. What have other people done to keep sensitive data secure as it moves through Puppet, while sharing the modules and manifests with a larger audience?

  • How do I (robustly) remotely execute tasks on Windows workstations in a domain?

    - by Zac B
    I'm not even sure if "robustly" is a word. Anyway.

    Context: We have a few hundred Windows 7 workstations on a LAN. We use AD/GPO management pretty heavily, but there are a lot of periodic and/or manual maintenance tasks we need to do that can't be done via GPO/scheduled task. For example, say I want to execute program X (which runs silently, in the background, and doesn't bother the user) on workstation Y, or say I want to execute task A on workstation group B, either on a schedule or on demand. Kicking the users off of their computers to do this (i.e. using RDP) is a no-no, and doesn't work on groups anyway.

    Question: What's the best way to do this that is robust enough that, after setup, I could give it to beginner support people (read: people who are phobic of the command line, and get confused with GUI interfaces more complicated than Firefox)? I'm a competent programmer, and if there is a robust set of tools or a framework out there for this type of task, I'd consider hacking something together myself if it didn't take too long. If there's some combination of tools or techniques that others use to make remote workstation administration doable by beginners, I have yet to find it.

    For those who care about the "why": I'm mid-level IT, and was told to implement a remote management solution that allows arbitrary/scheduled remote execution, with confirmation that programs actually ran remotely, and the ability to view what they returned. "Why?" I asked. "Can't I just use PsExec and the task scheduler on a dispatcher machine?" "No," I was told, "'Joe' the second-week tech is going to be in charge of this one, and he needs something simple with a GUI."

    What I've tried: I've played with making a bunch of one-clickable "transfer files to the remote computer and run them with PsExec" batch/VB scripts, but those tend to break down and don't easily support running on customizable groups. I've played a little bit with the Windows version of Puppet, but it doesn't support arbitrary-time remote execution (its ability to group computers into a tree/node structure is really nice, though). I've used an older version of Altiris, and while it does a lot of what I want, its interface is awful, it's slow, it crashes a lot, and it's probably too expensive for management. SwiftWater's DMS solution does some of what I want, but it's very underdeveloped, closed source (not a deal breaker but not ideal), and I get the impression that support and reliability are lacking.

  • Permissions for Creating a Scheduled Task

    - by RPS
    What permissions are needed to create a scheduled task on Windows 2008 with AD? The error returned is 0x80070005 in hex: facility 7 = Windows, error code 5 = Access Denied, i.e. the user account used to run this code doesn't have sufficient rights.

  • Where to put my backup.sh?

    - by Temnovit
    I'm writing a shell script that will make backups of my system (like this: https://help.ubuntu.com/8.04/serverguide/C/backup-shellscripts.html). What is the best location in my system to store this file? I know I can put it anywhere, but what happens if it is stored in a directory that is itself being backed up? What is the best practice here? I'm running an Ubuntu server.
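
    A common convention (an assumption here, not something the guide mandates) is to keep the script somewhere like /usr/local/sbin, outside the trees being archived, and to exclude the backup destination itself so the archive never swallows earlier archives:

        # Sketch with assumed paths: the script lives in /usr/local/sbin/backup.sh,
        # archives land in /var/backups, and that directory is excluded so the
        # tarball never contains older tarballs.
        tar --exclude=/var/backups -czpf /var/backups/full-$(date +%F).tgz /etc /home /var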

  • Command-line tool to deduplicate a single humongous file?

    - by romkyns
    I make regular snapshots of my VM using a nightly script. These backups are compressed using WinRAR and do shrink considerably, but I suspect the result is not as efficient as it would be if the file had been deduplicated first (just a hunch which I'm hoping to test). So instead of compressing the VHD itself, I would like to deduplicate the single file first, and then compress the output of the deduplicator. Is anyone aware of such a CLI tool?

  • restore -A usage

    - by Martin v. Löwis
    I have created a number of dump files using Linux dump(8), using the -A option to get a table of contents on disk (the backups themselves are on tape). Now I'm trying to look into these archive files, using:

        restore -i -A <archive>

    However, this insists on asking what tape to use, and complains if I say none. What am I doing incorrectly? I was hoping that I could use these archive index files without having to insert the tape.

  • NAS vs. Windows Home Server

    - by Makach
    I'm about to invest in something that will help me with backups etc. So, NAS vs. Windows Home Server? It feels like I'm getting a bit more with Windows Home Server - but I'm really not sure. Can I use Mac clients with Windows Home Server or any NAS?

  • Optimal disk partitions for database setup (15 Drives)

    - by Jason
    We are setting up a new database system and have 15 drives to play with (+2 on-board for the OS). With a total of 15 drives, would it be better to set up all 14 as one RAID-10 block (+1 hot spare), or split them into two RAID-10 sets: one for data (8 disks) and one for logs/backups (6 disks)? My question boils down to the following: is there a specific point where having more drives in a single RAID-10 setup will outperform having the drives broken into smaller RAID-10 sets?

  • Redirect all outgoing traffic on port 80 to a different IP on the same server

    - by Spacedust
    I have multiple IP addresses on the same server, and I would like to redirect all outgoing traffic on port 80 to a different IP on the same server, just so it doesn't always use the main IP. Currently I'm using this:

        /sbin/iptables -t nat -A POSTROUTING -o eth0 -j SNAT --to-source IP;

    It works well, but it redirects everything, and when I make backups over SSH the backup fails. System: CentOS 5.8 64-bit.
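
    A hedged guess at a fix, assuming the goal is to rewrite the source address only for outbound HTTP: restrict the SNAT rule to TCP traffic with destination port 80, so SSH and everything else keeps the default source IP. "IP" below is the same placeholder used above.

        # Sketch: only SNAT outbound connections to port 80; other traffic
        # (including SSH backups) is left untouched.
        /sbin/iptables -t nat -A POSTROUTING -o eth0 -p tcp --dport 80 -j SNAT --to-source IP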

  • Shell script for daily disk usage report

    - by Master
    I am doing backups to my local drives. The drives are mounted under the /media folder. Now I want to run a cron job daily which will report, in table format, how much disk space is used by each folder and how much free space is left on each drive. It would be good if I could insert that info into a database and view it through a web page on localhost. Ubuntu 10.
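
    A rough sketch of what such a daily job could look like (the paths and the report location are assumptions); the output could later be parsed into a database or served by a local web page:

        #!/bin/bash
        # Sketch: append a dated usage/free-space report for everything mounted
        # under /media to a log file that a localhost web page could display.
        {
          echo "=== $(date) ==="
          du -sh /media/*/ 2>/dev/null   # space used by each backup folder
          df -h | grep /media            # free space left on each mounted drive
        } >> /var/log/disk-usage-report.log

    It could then be scheduled with a crontab entry such as 0 6 * * * /usr/local/sbin/disk-report.sh (again, a hypothetical path).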

  • Executing a batch file from a SQL Server job

    - by uzay95
    I want to create a backup job on SQL Server, and I want to execute a batch file from that job. The part I'm unsure about is how to execute the batch file from the SQL job. Do you have any ideas? Any help would be appreciated.

        USE MyDb
        GO
        BACKUP DATABASE MyDb TO DISK = 'C:\BackUps\MyDb.bak' WITH DIFFERENTIAL
        GO
        -- Call my batch file (which will zip the MyDb.bak file)
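
    One hedged way to finish that last step, shown only as a sketch: call the batch file through xp_cmdshell from the same T-SQL job step. The feature has to be enabled by a sysadmin first, and C:\BackUps\zip_backup.bat is a hypothetical path standing in for the real batch file. An alternative, not shown here, is a separate "Operating system (CmdExec)" job step in SQL Server Agent.

        -- Sketch, not a confirmed setup: enable xp_cmdshell once (sysadmin only),
        -- then run the (hypothetical) batch file that zips the backup.
        EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
        EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;
        EXEC master.dbo.xp_cmdshell 'C:\BackUps\zip_backup.bat';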
