Search Results

Search found 30724 results on 1229 pages for 'backup solution'.


  • Backup a hosted Sharepoint

    - by David Mackintosh
    One of my customers has outsourced their SharePoint and Exchange services to a hosted services provider; I believe it is a SharePoint 2007 service. It is a shared hosting solution, so we do not have any kind of access to the server itself; we only have user-level and SharePoint-administrator-level access to the SharePoint application. They have come to the point where they would like to have a copy of everything that is on the SharePoint server. I have downloaded Office SharePoint Designer 2007 (OSD), and it features three (!) ways to back up a SharePoint server, none (!) of which work for me:

    - File > Export > Personal Web Package: when selecting everything, it calculates a negative size, then barfs with a 'No "content-type" in CGI environment' error.
    - File > Export > SharePoint Template: barfs with an "A World Wide Web browser, such as Windows Internet Explorer, is required to use this feature" error.
    - Site > Administration > Backup Web Site: wants to create the backup .cmp file on the SharePoint server itself. I don't have access to any servers on the same network, so I can't redirect it to any form of the suggested \\server\place, and it barfs with a "The Web application at $URL could not be found. [...]" error. Possibly moot anyway, because Google tells me that bad things happen when using OSD to back up sites larger than 24MB (which this site most definitely is).

    So I called the help desk of the outsource provider and was told that they recommend using OSD, but that they don't actually provide any application support for OSD (not that I blame them for that); however, they could do an stsadm.exe backup and provide us with that, and OSD should be able to read the resulting .cmp file. Then, for authorization reasons, they had my customer call them directly (since I can't authorize such an operation), and they told him that he didn't want an stsadm.exe backup, he wanted to get into an 'explorer view' and deal with things that way (they were vague). Google hasn't been much help in figuring out what an 'explorer view' is, let alone how I bring one up.

    The end goal of this operation is to have a backup of the site as it exists (hopefully today, but shortly anyway) in a format that does not require another SharePoint server to restore it to, i.e. we'd like to be able to pick individual content directly out of this backup. We are not excessively concerned with things like formatting; we just want the documents. This is a fairly complex site with multiple subsites and multiple folders per subsite, so sitting there and manually downloading each file isn't really going to happen if there is a better, easier way.

    So, my questions: Is the stsadm.exe backup what I want? If not, what do I want? If I manage to convince them that I do want the stsadm.exe backup, can I pick files out of the resulting backup file with OSD? And if OSD isn't going to let me extract individual files, is there a tool that can?
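    For reference, this is roughly what the stsadm.exe options look like on the provider's side. A minimal sketch only, run on the SharePoint server itself; the URL and file paths are placeholders, and whether SharePoint Designer can open either output is exactly the open question above, so treat this as what to ask the provider for rather than a confirmed answer:

        REM Full site-collection backup (restorable with "stsadm -o restore" on a SharePoint farm)
        stsadm.exe -o backup -url http://sharepoint.example.com/sites/customer -filename C:\Backups\customer.bak

        REM Granular export of one site to a content migration package (.cmp),
        REM keeping user security and all document versions
        stsadm.exe -o export -url http://sharepoint.example.com/sites/customer/somesite -filename C:\Backups\somesite.cmp -includeusersecurity -versions 4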

    Read the article

  • 8 Backup Tools Explained for Windows 7 and 8

    - by Chris Hoffman
    Backups on Windows can be confusing. Whether you're using Windows 7 or 8, you have quite a few integrated backup tools to think about, and Windows 8 made quite a few changes. You can also use third-party backup software, whether you want to back up to an external drive or back up your files to online storage. We won't cover third-party tools here, just the ones built into Windows.

    Backup and Restore on Windows 7: Windows 7 has its own Backup and Restore feature that lets you create backups manually or on a schedule. You'll find it under Backup and Restore in the Control Panel. The original version of Windows 8 still contained this tool, renamed Windows 7 File Recovery. This allowed former Windows 7 users to restore files from their old Windows 7 backups or keep using the familiar backup tool for a little while. Windows 7 File Recovery was removed in Windows 8.1.

    System Restore: System Restore on both Windows 7 and 8 functions as a sort of automatic system backup feature. It creates backup copies of important system and program files on a schedule or when you perform certain tasks, such as installing a hardware driver. If system files become corrupted or your computer's software becomes unstable, you can use System Restore to restore your system and program files from a System Restore point. This isn't a way to back up your personal files; it's more of a troubleshooting feature that uses backups to restore your system to its previous working state.

    Previous Versions on Windows 7: Windows 7's Previous Versions feature allows you to restore older versions of files, or deleted files. These files can come from backups created with Windows 7's Backup and Restore feature, but they can also come from System Restore points. When Windows 7 creates a System Restore point, it will sometimes contain your personal files, and Previous Versions allows you to extract these personal files from restore points. This only applies to Windows 7: on Windows 8, System Restore won't create backup copies of your personal files, and the Previous Versions feature was removed.

    File History: Windows 8 replaced Windows 7's backup tools with File History, although this feature isn't enabled by default. File History is designed to be a simple, easy way to create backups of your data files on an external drive or network location. File History replaces both Windows 7's Backup and Previous Versions features. Because System Restore won't create copies of personal files on Windows 8, you can't actually recover older versions of files until you enable File History yourself.

    System Image Backups: Windows also allows you to create system image backups. These are backup images of your entire operating system, including your system files, installed programs, and personal files. This feature was included in both Windows 7 and Windows 8, but it was hidden in the preview versions of Windows 8.1. After many user complaints, it was restored and is still available in the final version of Windows 8.1: click System Image Backup in the File History Control Panel.

    Storage Space Mirroring: Windows 8's Storage Spaces feature allows you to set up RAID-like features in software. For example, you can use Storage Spaces to set up two hard disks of the same size in a mirroring configuration. They'll appear as a single drive in Windows. When you write to this virtual drive, the files will be saved to both physical drives; if one drive fails, your files will still be available on the other drive. This isn't a good long-term backup solution, but it is a way of ensuring you won't lose important files if a single drive fails.

    Microsoft Account Settings Backup: Windows 8 and 8.1 allow you to back up a variety of system settings, including personalization, desktop, and input settings. If you're signing in with a Microsoft account, OneDrive settings backup is enabled automatically. This feature can be controlled under OneDrive > Sync settings in the PC settings app. It only backs up a few settings and is really more of a way to sync settings between devices.

    OneDrive Cloud Storage: Microsoft hasn't been talking much about File History since Windows 8 was released. That's because they want people to use OneDrive instead. OneDrive, formerly known as SkyDrive, was added to the Windows desktop in Windows 8.1. Save your files here and they'll be stored online, tied to your Microsoft account. You can then sign in on any other computer, smartphone, tablet, or even via the web and access your files. Microsoft wants typical PC users "backing up" their files with OneDrive so they'll be available on any device.

    You don't have to worry about all these features. Just choose a backup strategy to ensure your files are safe if your computer's hard disk fails. Whether it's an integrated backup tool or a third-party backup application, be sure to back up your files.
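    As a concrete illustration of the system image feature described above, the same kind of backup can also be started from an elevated command prompt with wbadmin on Windows 7 and 8. A minimal sketch, with the target drive letter as a placeholder:

        REM One-time system image of the system volume plus everything needed to boot,
        REM written to the external drive E:
        wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet

        REM List the backup versions stored on that target
        wbadmin get versions -backupTarget:E: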

    Read the article

  • Recommended total system backup solution

    - by bioShark
    I hope this question won't get closed immediately since it has a generic title. I have already searched a bit around the answers here, but nothing satisfied me. I want a backup solution that makes a total backup, so that I can restore my Ubuntu installation in case of major failures, like the HDD failing. As far as I can see, I have two choices: 1) Backing up with Deja Dup to an external disk. This is fine and I am already doing it, but if my HDD fails and I make a new Ubuntu install on a new disk, will Deja Dup be able to restore all my settings and stuff from the backed-up files? If it can, what other files/folders should I add to the Deja Dup backup (currently I have set only the recommended /home folder)? Is there a point in telling Deja Dup to back up everything under "/"? 2) A disk/partition cloning tool, something similar to Norton Ghost. Is there such a tool with a nice GUI that you could recommend for Ubuntu? Even better, it would be nice if Ubuntu's live CD could recognize such a clone at the install step. I am using 11.10.

    Read the article

  • Backup hardware and strategy on distributed Windows Server 2008 network

    - by CesarGon
    This question is a follow-up to this one. We have a Windows Server 2008 R2 domain over a network that spans two different buildings, linked by a 100-Mbps point-to-point line. Over 60 users work in the organisation. We are planning to use DFS folders and DFS replication for file serving across the organisation. The estimated data volume is over 2 TB and will grow at approximately 20% annually. The idea is to set up a DFS file server in each building and use DFS replication so that all the contents stay replicated over the 100-Mbps link. We are now considering backup hardware and strategies. We are Dell customers and, after browsing the online Dell catalogue, I can see a number of backup hardware options. My main doubts are the following: Would you go for a tape library, disk backup, or are there other options worth considering? Would you perform batch backups (i.e. nightly), or would you use continuous backup (i.e. while users are working)? Would you use a dedicated backup server to which the tape library (or any other backup device) is attached, or is there another way of doing things? My experience with backup hardware and overall setup is limited, so I appreciate any good piece of advice that you may have. Thanks.

    Read the article

  • Exchange DiskShadow/Robocopy backup does not purge log files

    - by Robert Allan Hennigan Leahy
    I have a series of scripts set up to back up my Exchange. The following command is executed to start the process:

        diskshadow /s C:\Backup_Scripts\exchangeserverbackupscript1.dsh

    This is exchangeserverbackupscript1.dsh:

        #DiskShadow script file
        set verbose on
        #delete shadows all
        set context persistent
        writer verify {76fe1ac4-15f7-4bcd-987e-8e1acb462fb7}
        set metadata C:\Backup_Scripts\shadowmetadata.cab
        begin backup
        add volume C: alias SH1
        create
        expose %SH1% P:
        exec C:\Backup_Scripts\exchangeserverbackupscript1.cmd
        end backup
        delete shadows exposed P:
        exit
        #End of script

    And this is exchangeserverbackupscript1.cmd:

        robocopy "P:\Program Files\Microsoft\Exchange Server\Mailbox\First Storage Group" "\\leahyfs\J$\E-Mail Backups\Day 1" /MIR /R:0 /W:0 /COPY:DT /B

    This is not causing Exchange to purge its log files. The edb file is 4.7 gigabytes, but the First Storage Group folder itself is 50+ gigabytes due to many, many log files for each day going back to 2009. Is there any way -- I've Googled and haven't found anything -- to notify Exchange when I've completed a full backup, and have it purge its log files? According to this and this, end backup should cause Exchange to "flush the transaction logs for that storage group", but only "if a successful backup of a storage group occurred", which leaves my question as: what constitutes a "successful backup", and why is what I'm doing not it?
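    Not a confirmed fix, but one detail worth ruling out while investigating the "successful backup" question: robocopy's exit codes are unusual (0 and 1 indicate success, values of 8 or above mean something failed to copy), so if DiskShadow's exec pays attention to the return code of exchangeserverbackupscript1.cmd, the raw robocopy code may not mean what it appears to. A sketch that normalizes the exit code and adds a log for inspection:

        robocopy "P:\Program Files\Microsoft\Exchange Server\Mailbox\First Storage Group" "\\leahyfs\J$\E-Mail Backups\Day 1" /MIR /R:0 /W:0 /COPY:DT /B /LOG:C:\Backup_Scripts\robocopy.log
        REM Robocopy exit codes: 0 = nothing to copy, 1 = files copied, 8+ = failures.
        REM Return 0 only when the copy completed without failures.
        IF %ERRORLEVEL% GEQ 8 EXIT /B 1
        EXIT /B 0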

    Read the article

  • Backup solution to backup terabytes and lots of static files on linux server?

    - by user28679
    Which backup tool or solution would you use to back up terabytes of data and lots of files on a production Linux server? Note that the files are all different and almost never modified, and usage is mostly adding files, so the data volume is currently 3TB, growing all the time at around +15GB/day. Please do not reply rsync: basic Unix tools are not enough; rsync does not keep history, and rdiff-backup miserably fails from time to time and screws up the history. Moreover, these are all file-based backups, which generate a lot of I/O wait just browsing directories and calling stat(). But I guess, except for R1Soft CDP, there is no way around that. We tried R1Soft CDP backup, which is block-level backup, and it proved good and efficient for all our other servers, but it systematically fails on the server with 3 terabytes and gazillions of files. The engineers at R1Soft and at the datacenter have been playing a hot-potato game for more than 2 months now... and there is still no backup except regular rsync. We have never tried the big commercial solutions, except R1Soft CDP, since it was provided as an optional service by the datacenter hosting our servers.

    Read the article

  • Backup Exec 11D and Thecus N5200 Pro

    - by JohnyD
    I have a Thecus N5200 Pro integrated into my Windows 2003 AD network. My current backup solution involves Backup Exec 11D, but this requires a running service on Windows boxes or a similar daemon on Linux machines. The N5200 runs a custom Linux kernel, but as of yet I am unable to add it to my backups through Backup Exec. Does anyone know of a method of backing up directly from the N5200 to Backup Exec without moving the data to an intermediary for archiving?

    Read the article

  • Drupal Sites Backup and Restore to Amazon S3

    - by Ngu Soon Hui
    There are modules written for database backup and file backup, but what I want is a complete backup to Amazon S3 or another cloud platform, covering both the data and the sites. Currently, as it stands, I have to back up the two separately and manually. Is there any module, tool, or already-written script that allows me to do that?

    Read the article

  • Sending email after backup (Windows Server 2008)

    - by woodsbw
    I have a client who is using Windows Server 2008 (Small Business Server) and Windows Backup. What I need to do is configure the backup task so that, upon completion, it sends an email notifying the client of the backup's success or failure. I have been able to find the task in Task Scheduler, and I can even see where I can send an email... but I cannot find a way to make the content of the email different based on the success or failure of the backup. How might I do this?
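    One workable pattern, sketched under assumptions (the target drive, volume list, addresses and SMTP host are placeholders, and Send-MailMessage needs PowerShell 2.0 or later to be installed): instead of mailing from the backup task itself, schedule a small wrapper script that runs the backup and branches on its exit code:

        @echo off
        REM Wrapper task: run the backup once, then send a different mail depending
        REM on whether wbadmin returned success (0) or an error code.
        wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet

        IF %ERRORLEVEL% EQU 0 (
            powershell.exe -Command "Send-MailMessage -To 'admin@example.com' -From 'backup@example.com' -SmtpServer 'mail.example.com' -Subject 'Backup succeeded' -Body 'Windows Server Backup completed without errors.'"
        ) ELSE (
            powershell.exe -Command "Send-MailMessage -To 'admin@example.com' -From 'backup@example.com' -SmtpServer 'mail.example.com' -Subject 'Backup FAILED' -Body 'Windows Server Backup returned a non-zero exit code; check the Backup event log.'"
        )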

    Read the article

  • Trouble restoring an iPod touch backup

    - by stringo0
    I ran into some problems with my iPod and am trying to restore a backup following this link - http://support.apple.com/kb/HT1414. I tried following the section under "Restoring your iPhone or iPod touch". When it gets to step 9, the prompt simply doesn't show up - the restore happens and the iPod restarts, but I DO NOT get an option to set it up as a new device or to restore a backup. A backup does exist (I checked the location). Any ideas on how I can restore the backup?

    Read the article

  • Set-and-forget Windows backup software with NAS-support?

    - by Evert
    I am looking for set-and-forget backup software for Windows (Vista & 7, and if possible XP/2003). The idea is that it runs in the background on the clients, and does its thing towards a network-share. In case the HDD of one of these clients spontaneously combusts, all I want to have to do is: replace the drive, insert a USB-stick, boot from it, and restore the machine. It should support drives which use [ICH]-RAID. What are my options here? It looks like WHS meets all the requirements, but I am curious about my other options here.

    Read the article

  • SQL Backup files, distinguish partial and full backup files

    - by ccook
    I have scheduled backups running through SQL Server Agent, with full backups nightly and differential backups hourly. Is there a way to determine which of the backup files is the full backup and which is the latest differential? Intuitively, it would seem the largest backup within 24 hours is the full and the latest smaller backup is the differential; however, this isn't robust. Is there a way to probe the backup file to check the backup type? (Preferably in C#.)
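    Rather than guessing from file sizes, the backup header itself records the type. A minimal sketch using sqlcmd (server name and paths are placeholders); the same RESTORE HEADERONLY statement can be issued from C# through an ordinary SqlCommand, reading the BackupType column from the result set (1 = full database, 5 = differential, 2 = transaction log):

        REM Ask SQL Server to read the header of a backup file and report its type.
        sqlcmd -S .\SQLEXPRESS -E -Q "RESTORE HEADERONLY FROM DISK = N'D:\Backups\MyDb_20240101.bak'"

        REM Alternatively, list recent backups recorded in msdb, newest first.
        REM backupset.type is 'D' for a full database backup, 'I' for differential, 'L' for log.
        sqlcmd -S .\SQLEXPRESS -E -Q "SELECT TOP 20 bs.database_name, bs.type, bs.backup_finish_date, bmf.physical_device_name FROM msdb.dbo.backupset bs JOIN msdb.dbo.backupmediafamily bmf ON bs.media_set_id = bmf.media_set_id ORDER BY bs.backup_finish_date DESC"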

    Read the article

  • Which are the most important directories to backup on a Linux server?

    - by QAH
    Hello everyone! I'm running an Ubuntu 9.10 Linux server. I'm trying to find a way to back up the machine while it is running, and from what I can see, this rules out the disk-cloning utilities: all of the disk-cloning tools I have seen for Linux require that you reboot into a special live CD. So my question is this: what is the best solution for backing up the system while it is running? Also, I don't really care about the OS config too much; I just want to be able to keep my stored files and the programs I have installed. Thanks

    Read the article

  • How to store 250TB of data and develop a backup/recovery plan?

    - by luccio
    I'm really new to this topic, so apologies in advance for the basic questions. I have a school project and I want to know how to store 250TB of data with an 18-month life cycle, meaning every record is stored for 18 months and can be deleted after that period. There are two issues: storing the data, and backing it up. Due to the amount of data, I will probably need to combine data tapes and hard drives. I'd like to have "fast" access to data up to 3 months old, so roughly 42TB on disk. I really don't know what RAID level I should use, or whether there is a better solution than combining disks and data tapes. Thanks for any advice, articles, anything. I'm getting lost.

    Read the article

  • MySQL Enterprise Backup 3.8.2 has been released!

    - by Hema Sridharan
    MySQL Enterprise Backup v3.8.2, a maintenance release of the online MySQL backup tool, is now available for download from the My Oracle Support (MOS) website as our latest GA release. It will also be available via the Oracle Software Delivery Cloud in approximately 1-2 weeks. A brief summary of the changes in MySQL Enterprise Backup version 3.8.2 is given below.

    A. Functionality Added or Changed:

    - MySQL Enterprise Backup has a new --on-disk-full command line option. mysqlbackup could hang when the disk became full, rather than detecting the low-space condition. mysqlbackup now monitors disk space when running backup commands, and users can specify the action to take at a disk-full condition with the --on-disk-full option. For more details, refer to this page.
    - MySQL Enterprise Backup has a new progress report feature, which periodically outputs short progress indicators on its operations to user-selected destinations (for example, stdout, stderr, a file, or other choices). For more details on the progress report options, refer here.

    B. Bugs Fixed:

    - When --innodb-file-per-table=ON, if a table was renamed while backup-to-image was in progress, apply-log would fail when run on the backup. (Bug #16903973)
    - MySQL Server failed to start after a backup was restored if there had been online DDL transactions on partitioned tables at the time of the backup. (Bug #16924499)
    - apply-log failed if ALTER TABLE ... REORGANIZE PARTITION was applied to partitioned InnoDB tables during backup. (Bug #16721824, Bug #16903951)
    - apply-incremental-backup might fail with an assertion error if the InnoDB tables being backed up were created in Barracuda format with KEY_BLOCK_SIZE values different from the innodb_page_size. This fix ensures that different KEY_BLOCK_SIZE values are handled properly during incremental backup and apply-incremental-backup operations.
    - If a table was renamed following a full backup, a subsequent incremental backup could copy the .frm file with the new name but not the associated .ibd file with the new name. After a restore, the InnoDB data dictionary could be in an inconsistent state. This issue primarily occurred if the table was not changed between the full backup and the subsequent incremental backup. (Bug #16262690)
    - After a full backup, if a table was renamed and modified, apply-incremental-backup would crash when run on the backup directory. (Bug #16262609)
    - The value of the binary log position in backup_variables.txt could differ from the output displayed during the backup-and-apply-log operation. (This issue did not occur if the backup and apply-log steps were done separately.) (Bug #16195529)
    - When using the --only-innodb-with-frm option, MySQL Enterprise Backup tried to create temporary files at unintended locations in the file system, which might cause a failure when, for example, the user had no write privilege for those locations. This fix makes sure the paths for the temporary files are correct. (Bug #14787324)
    - A backup process might hang when it ran into an LSN mismatch between a data file and the redo log. This fix makes sure the process does not hang, and it displays an error message showing the name of the problematic data file. (Bug #14791645)

    Please post your questions / comments about Backup in the forums. Thanks, MEB Team
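    For context, a minimal sketch of how the new option might be used on the command line. The connection options, paths, and the backup-and-apply-log command are standard mysqlbackup usage; the abort value shown for --on-disk-full is an assumption to check against the manual page the note refers to:

        REM Take a backup, apply the log in the same run, and abort if the
        REM destination disk fills up instead of hanging.
        mysqlbackup --user=backup --password --backup-dir=D:\mysql-backups\full ^
            --on-disk-full=abort backup-and-apply-log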

    Read the article

  • Where did my backup files go? Can they be recovered?

    - by Ken
    I just purchased a Western Digital Essential SE 1TB external hard drive from Best Buy at their recommendation, then exchanged it for a Toshiba Canvio (I think that was the name). I have a Toshiba Qosmio X505-Q898. The Canvio locked up my computer, rewrote some kind of OS file, and erased all the restore points as well as the system image backup (according to Best Buy), just by plugging it in for the first time. It never even got to the install part or anything -- I plugged it in and it fried my computer. They spent about an hour and a half on my computer, got it back to a somewhat working condition, and gave me access to my files. So now they say I have to back everything up, use my recovery disk, and rewrite my OS. Enter the Essential. I brought it home last night, plugged it in, and installed everything. It worked perfectly, no problems, and I backed everything up onto it. I unplugged and replugged it twice to make sure that everything was on it; the Essential told me it had both the HDD and SSD backed up. So I reinstalled my OS. I plugged the Essential in and everything loads right up, but when I went to retrieve my files, the Western Digital has nothing on it. It shows all my music, pics, etc. as still being on my computer and needing to be backed up, but there are no files on my computer now. Where is this information coming from, and where did my files go? It's about 810GB worth of files I've amassed over several years. Is there any way to recover data from this? I plan to contact Western Digital and Best Buy; I just thought I would check here too. Any advice will be appreciated, as a lot of these files are invaluable to me.

    Read the article

  • AppCmd backup for IIS7 gives access denied error (hresult:80070005)

    - by TruMan1
    I have a script I have been using on another Windows 2008 to delete the IIS7 backup of configs and create a fresh one:

        SET DEST=C:\Backup\Web\IIS7
        SET BACKUPNAME=IIS7-CONFIGS
        %windir%\system32\inetsrv\appcmd.exe delete backup "%BACKUPNAME%"
        %windir%\system32\inetsrv\appcmd.exe add backup "%BACKUPNAME%"
        robocopy %windir%\system32\inetsrv\backup "%DEST%" /MIR /R:6 /W:10 /ZB

    But on a new Windows 2008 server, I get an access denied on the delete:

        ERROR ( hresult:80070005, message:Command execution failed. Access is denied. )

    I have UAC turned off and pretty much copied all the settings from the old server (including user role being an admin). What am I missing?

    Read the article

  • Reduce the I/O priority of Windows Backup (Windows Server 2008 R2)

    - by HelloSam
    I have PostgreSQL running on a Windows Server 2008 R2 x64 box, and I have scheduled a daily backup from the RAID 1 DB disk to a dedicated standalone disk. The disks are 15k SAS on a Dell PERC 6i. I am using the built-in Windows Server Backup for this purpose. The problem is, whenever the backup process kicks in, database performance is hammered; I would say there is almost a 10x performance reduction. In Resource Monitor, the disk queue is in the double-digit range while backing up, and less than 1 during the day. Disk activity is around 30-50MB/s during the backup, so I guess the hardware is behaving normally, though wbengine.exe accounts for most of it. I think reducing the I/O priority of the backup process would be an answer, but I couldn't find a way to do it. Tuning the process CPU priority does not seem to help.

    Read the article

  • Reliable Backup Solution for Linux for Complete System Restoration

    - by Chris S
    What's the best backup solution for Linux that can completely restore the entire filesystem to a blank hard drive (including partitioning) after an old hard drive dies? I'm currently running a few Ubuntu machines, some with RAID 1 and others without RAID (mostly laptops). I'd like to implement a backup solution that can take incremental snapshots of the entire filesystem, so that if I were to replace all the hard drives in a machine, I could use the backup to restore a perfect copy of the previous filesystem. Unfortunately, nearly all the backup solutions I've found seem to be glorified rsync scripts, which only back up some files and have no easy way to restore once the entire filesystem is gone. Some of the more complicated solutions, like Bacula, might do what I need, but require a complicated server/client setup and are notoriously difficult to maintain. I've heard that Apple's Time Machine utility has this ability, and I've had similar success taking differential disk images with Acronis True Image on Windows, but of course neither of these works on Linux. Is there anything comparable for Ubuntu?

    Read the article

  • SQL Server backup and restore process

    - by Nai
    Just wondering what backup processes you guys have. I am currently running a weekly full database backup with daily differential backups. My understanding is that with such a setup, the difference between the Full and Simple recovery models is that with the Full recovery model, I will be able to use the transaction logs to roll the DB back to a specific point in time, after applying the latest differential backup. Assuming that in my scenario the last differential backup serves as my last and ultimate 'save point', I don't see a need to roll my DB back any further using the logs. This brings me to my question: are there any additional benefits to be had from using the Full recovery model with my current backup process?
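    The extra benefit of the Full recovery model only shows up if log backups are also taken, because then a restore can stop at an arbitrary moment between differentials. A rough sketch of that restore sequence using sqlcmd (database name, file paths and the STOPAT time are placeholders):

        REM 1. Restore the last full backup, leaving the database in the restoring state.
        sqlcmd -S . -E -Q "RESTORE DATABASE MyDb FROM DISK = N'D:\Backups\MyDb_full.bak' WITH NORECOVERY, REPLACE"

        REM 2. Restore the most recent differential on top of it.
        sqlcmd -S . -E -Q "RESTORE DATABASE MyDb FROM DISK = N'D:\Backups\MyDb_diff.bak' WITH NORECOVERY"

        REM 3. Replay transaction log backups up to the chosen moment, then recover.
        sqlcmd -S . -E -Q "RESTORE LOG MyDb FROM DISK = N'D:\Backups\MyDb_log.trn' WITH STOPAT = N'2024-01-01T12:30:00', RECOVERY"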

    Read the article

  • Win7 backup fails with "The system cannot find the file specified" 0x80070002 - has worked before

    - by Thorbjørn Ravn Andersen
    I have a 500 GB USB disk which has been used as a backup target for a few years now, but backups now fail without telling me why. This is a Dell box with Intel USB controllers (so the NVIDIA problem does not apply here). I have previously had problems with a Git package marking a non-existent directory for backup (adding the directory made the backup succeed). The inaccessible boot partition problem described in "Windows Backup fails with 0x80070002: 'The system cannot find the file specified'" does not apply to me. I would appreciate hints on where to look to identify why my backup fails so I can fix it.

    Read the article

  • Datacenter Backup Strategy

    - by EasyEcho
    What are common approaches to backup solutions in remote data centers? I am already familiar with general backup principles and have a very good backup strategy for our local data center, but am having great difficulty extending it to a remote data center. We currently do a full backup on Friday and differentials Monday through Thursday, then rotate the disks offsite on Friday morning... rinse and repeat week after week. BTW, we use disks and have been very happy with this approach. We could buy a large storage server and back up everything to it, but this doesn't give us an offsite copy. We could encrypt and upload to Amazon or some other online storage, but that would take a large amount of time given the volume of data, and would be rather expensive once we pay for the bandwidth leaving the data center and arriving at Amazon. We could drive to the data center every Friday and continue to rotate disks as we do now, but that just seems old-fashioned. What am I missing? Are there better options?

    Read the article

  • Pervasive database backup

    - by Steven
    I'm looking for the best way to back up my Pervasive database. I've read the documentation but still have a few questions. It appears that the Continuous Operations method only allows me to back up the entire database? So I'd run butil -startbu @filelist, back up the entire database (copy, rsync, etc.), then run butil -endbu @filelist. Looking through the documentation, I don't see a way to get transaction logs out of this method, the way I would for MSSQL (BACKUP LOG ACCT TO DISK) or Postgres (archive_command). With rsync, it might still be feasible to do this every 15 minutes. The Archival Logging method means I would have to occasionally stop the database to get a full backup, which is acceptable for me. But can I copy the log files off the server every 15 minutes, i.e. log shipping? Thank you.
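    A rough sketch of the Continuous Operations flow described above, as a scheduled batch job. The @filelist contents, data path and destination are placeholders, and whether shipping anything more granular than the full data directory between -startbu and -endbu is supported is exactly the question to confirm with Pervasive:

        @echo off
        REM Put the files listed in filelist.txt into Continuous Operations mode,
        REM copy the data directory while the engine keeps serving requests,
        REM then end Continuous Operations so pending changes are rolled back in.
        butil -startbu @C:\Backup_Scripts\filelist.txt
        robocopy "C:\ProgramData\Pervasive\Data" "\\backupserver\pervasive\daily" /MIR /R:2 /W:5
        butil -endbu @C:\Backup_Scripts\filelist.txt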

    Read the article
