Search Results

Search found 31902 results on 1277 pages for 'sql backup'.


  • ODBC in SSIS 2012

    - by jamiet
    In August 2011 the SQL Server client team published a blog post entitled "Microsoft is Aligning with ODBC for Native Relational Data Access" in which they basically said "OLE DB is the past, ODBC is the future. Deal with it." From that blog post: "We encourage you to adopt ODBC in the development of your new and future versions of your application. You don’t need to change your existing applications using OLE DB, as they will continue to be supported on Denali throughout its lifecycle. While this gives you a large window of opportunity for changing your applications before the deprecation goes into effect, you may want to consider migrating those applications to ODBC as a part of your future roadmap."
    I recently undertook a project using SSIS 2012 and heeded that advice by opting to use ODBC Connection Managers rather than OLE DB Connection Managers. Unfortunately my finding was that the ODBC Connection Manager is not yet ready for primetime use in SSIS 2012. The main issue I found was that you can't populate an Object variable with a recordset when using an Execute SQL Task connecting to an ODBC data source; any attempt to do so will result in an error: "Disconnected recordsets are not available from ODBC connections." I have filed a bug on Connect at "ODBC Connection Manager does not have same functionality as OLE DB". For this reason I strongly recommend that you don't make the move to ODBC Connection Managers in SSIS just yet - best to wait for the next version of SSIS before doing that.
    I found another couple of issues with the ODBC Connection Manager that are worth keeping in mind:
    - It doesn't recognise System Data Source Names (DSNs), only User DSNs (bug filed at "ODBC System DSNs are not available in the ODBC Connection Manager"). UPDATE: According to a comment on that Connect item this may only be a problem on 64bit.
    - In the OLE DB Connection Manager parameter ordinals are 0-based; in the ODBC Connection Manager they are 1-based (oh I just can't wait for the upgrade mess that ensues from this one!!!)
    You have been warned! @jamiet

    Read the article

  • Could not continue scan with NOLOCK due to data movement during installation

    - by dbdev1
    I am running Windows Server 2008 Standard Edition R2 x64 and I installed SQL Server 2008 Developer Edition. All of the preliminary checks run fine (apart from a warning about Windows Firewall and opening ports, which is unrelated to this and shouldn't be an issue - I can open those ports). Half way through the actual installation, I get a popup with this error: "Could not continue scan with NOLOCK due to data movement." The installation still runs to completion when I press OK. However, at the end, it states that the following services "failed":
    - Database Engine Services
    - SQL Server Replication
    - Full-Text Search
    - Reporting Services
    How do I know if this actually means that anything from my installation (which is on a clean Windows Server setup - nothing else on there, no previous SQL Servers, no upgrades, etc.) is missing? I know from my programming experience that locks are for concurrency control, and the Microsoft help on this issue points to changing my query's locks/transactions in a certain way to fix the issue. But I am not touching any queries.
    Also, now that I have installed the app, when I log in, I keep getting this message:
    TITLE: Connect to Server
    Cannot connect to MSSQLSERVER.
    ADDITIONAL INFORMATION:
    A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server) (Microsoft SQL Server, Error: 67)
    For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=67&LinkId=20476
    I went into the Configuration Manager and enabled named pipes and restarted the service (this is something I have done before, as this message is common and not serious). I have disabled Windows Firewall temporarily. I have checked the instance name against the error logs. Please advise on both of these errors - I think they are related. Thanks

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 2 (sys.dm_exec_sessions)

    - by Tamarick Hill
    The sys.dm_exec_sessions DMV is another server-scoped DMV, which returns information for each authenticated session that is running on your SQL Server box. Let's take a look at some of the information that this DMV returns:
    SELECT * FROM sys.dm_exec_sessions
    This DMV is very similar to the one we reviewed yesterday, sys.dm_exec_requests, and returns some of the same information, such as reads, writes, and status for a given session_id (SPID). But it also returns additional information, such as the host name of the machine that owns the SPID, the program that is being used to connect to SQL Server, and the client interface name. In addition, this DMV provides useful information on session-level settings that may be on or off, such as QUOTED_IDENTIFIER, ARITHABORT, ANSI_PADDING, ANSI_NULLS, etc. It will also tell you which isolation level the session is executing under and whether the default deadlock priority for your SPID has been changed. Lastly, this DMV provides you with an original login name, which comes in handy whenever you have some type of context switching taking place due to an 'EXECUTE AS' statement and you need to identify the original login that started a session.
    For more information on this DMV, please see the Books Online entry: http://msdn.microsoft.com/en-us/library/ms176013.aspx
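    As a concrete sketch of the query this post is describing (the column names are per the Books Online entry above, not from the original post, so verify against your build):
        -- Session-level settings and the original login for user sessions
        SELECT session_id,
               host_name,                   -- machine that owns the SPID
               program_name,                -- program used to connect
               client_interface_name,
               quoted_identifier,           -- session-level SET options
               arithabort,
               ansi_padding,
               ansi_nulls,
               transaction_isolation_level,
               deadlock_priority,
               login_name,
               original_login_name          -- survives EXECUTE AS context switches
        FROM   sys.dm_exec_sessions
        WHERE  is_user_process = 1;         -- filter out system sessions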

    Read the article

  • Rsync from godaddy to OS X

    - by Ola
    I would like to use rsync to back up my website to my local computer (OS X). I started off with this guide and got pretty far. I use the following rsync line:
    rsync -PzrlptgD --del --delete-excluded -r --rsync-path=~/bin/rsync user@server:~/ /local/backup/folder/
    I wanted to use the -a option (same as -rlptgoD) but it crashes as soon as I use the -o flag:
    receiving file list ... rsync: connection unexpectedly closed (8 bytes received so far) [receiver]
    rsync error: unexplained error (code 255) at /SourceCache/rsync/rsync-42/rsync/io.c(452) [receiver=2.6.9]
    If I skip the --owner flag it copies the files, but I'm not really sure what difference it makes (I've tried to read up on it but found nothing). Should I just skip using the --owner flag? Or have I made some other mistake? Thanks in advance //OL

    Read the article

  • Update RDS db via mysqlbinlog: "you need (at least one of) the SUPER privilege(s)"

    - by timoxley
    We are moving a production site to EC2/RDS, following these instructions: http://geehwan.posterous.com/moving-a-production-mysql-database-to-amazon
    I have set up row-based binary logging on the production server and did a:
    mysqldump --single-transaction --master-data=2 -C -q -u root -p <database> > backup.sql
    then imported that into the RDS instance. No dramas. Due to the size of the db, and minimal downtime requirements, I've got to bring the RDS db up to the latest data via the binlogs, and it won't let me:
    mysqlbinlog mysql-bin.000004 --start-position=360812488 | mysql -u root -p -h <rds-host>
    It says:
    ERROR 1227 (42000) at line 6: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
    My guess, based on what is on line 6 of the binlog, is that it's the 'write to the BINLOG' statements in the SQL backup, and because RDS doesn't support this, it can't run those statements, or something - I don't really know. Please help.

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 1 (sys.dm_exec_requests)

    - by Tamarick Hill
    The first DMO that I would like to introduce you to is one of the most common and basic DMVs out there. I use the term DMV because this DMO is actually a view, as opposed to a function. This DMV is server-scoped, and it returns information about all requests that are currently executing on your SQL Server instance. To illustrate what this DMV returns, let's take a look at the results of SELECT * FROM sys.dm_exec_requests. As you can see, this DMV returns a wealth of information about requests occurring on your server. You are able to see the SPID, the start time of a request, its current status, and the command the SPID is executing. In addition, you see columns for sql_handle and plan_handle. These columns (when combined with other DMOs we will discuss later) can return the actual SQL text that is being executed on your server, as well as the actual execution plan that is cached and being used. This DMV also returns information about the various wait types that may be occurring for your SPID. The percent_complete column displays a percentage to completion for certain database actions, such as DBCC CHECKDB, database restores, rollbacks, etc. In addition to these, you are also able to see the amount of reads, writes, and CPU that the SPID has consumed. You will find this to be one of the primary DMVs you use when looking for information about what is occurring on your server.
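    To illustrate combining those handles with other DMOs, here is a rough sketch (sys.dm_exec_sql_text and sys.dm_exec_query_plan are the documented companion functions; this query is an illustration, not part of the original post):
        -- Current requests with their SQL text and cached plan
        SELECT r.session_id,
               r.start_time,
               r.status,
               r.command,
               r.wait_type,
               r.percent_complete,           -- meaningful for BACKUP, RESTORE, DBCC, rollbacks
               r.reads, r.writes, r.cpu_time,
               t.text AS sql_text,
               p.query_plan
        FROM   sys.dm_exec_requests r
        OUTER APPLY sys.dm_exec_sql_text(r.sql_handle)    AS t   -- OUTER APPLY keeps rows with NULL handles
        OUTER APPLY sys.dm_exec_query_plan(r.plan_handle) AS p
        WHERE  r.session_id > 50;             -- user sessions, by convention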

    Read the article

  • Get-ChildItem fails to connect in SQLSERVER drive

    - by Norman Kelm
    I'm having some trouble with the SQLSERVER PSDrive - see the error below. I only have named instances on my PC, both 2005 and 2008, and I have added the SQL snap-ins. The PC is named YODA and the SQL instance is SQL2008. Navigate to the Databases folder for YODA\SQL2008 (you can see the path below). dir -name spits out a connection error trying to connect to YODASQL2008\DEFAULT when it should be trying to connect to YODA\SQL2008. Then it outputs the db name, which is Twitter in this case. Is there something missing from my config?
    Output:
    PS SQLSERVER:\SQL\YODA\SQL2008\Databases> dir -name
    Get-ChildItem : SQL Server PowerShell provider error: Could not connect to 'YODASQL2008\DEFAULT'. [Failed to connect to server YODASQL2008. -- A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)]
    At line:1 char:4
    + dir <<<< -name
    + CategoryInfo : OpenError: (SQLSERVER:\SQL\...tabases\Twitter:SqlPath) [Get-ChildItem], GenericProviderException
    + FullyQualifiedErrorId : ConnectFailed,Microsoft.PowerShell.Commands.GetChildItemCommand
    Twitter
    The error repeats for every database. Thanks, Norman

    Read the article

  • HP DAT72x6 autoloader

    - by ericmayo
    Hoping someone here has seen a similar issue and can offer some advice... I have an HP DAT72x6 autoloader tape backup unit, the external kind; here is a link to an owner's manual I found for it: http://www.dectrader.com/docs/set2/emr_na-c00070400-1.pdf
    I purchased the unit used about 6 months ago. It stopped working after 3-4 backups; it's used one day a month to do a monthly backup of another system, so suffice it to say the unit gets very little usage. There is an amber light on the front of the unit called the OAR (Operator Attention Required) light. The manual states to call for service when this light comes on and stays on. I've tried a few things to resolve it but none are working: power cycling, and re-securing the SCSI cables at both ends. The unit was used so I didn't pay much ($500), and I don't want to spend a lot to have it fixed; might as well buy something new if fixing this is going to cost more than $100-$150 bucks. I'm curious to see if anyone here has been around these devices, or possibly is an HP repair person who can give me some things to try. The manual states that a solid amber OAR light indicates a hardware failure. When I power cycle the unit I see one of two scenarios so far:
    1. The unit powers up, shows a self test in the LCD, then the LCD changes to show all possible images and the OAR light comes on.
    2. The unit powers up, the LCD is completely blank, the green lights go through some sort of process of going on and off, and later the amber OAR light comes on and stays on.
    If it's a simple misalignment issue I may be able to fix it myself, but not knowing what could cause the OAR light to come on gives me nowhere to even start. Googling around gave no help either. I'm hoping someone here has experience with this and can help or point me in the right direction. Also, I don't have the HP diagnostic tools mentioned in many manuals. The unit is connected to a Linux box. The 3-4 backups I've done with it so far have had no issues; we run Amanda backup. Before this incident the unit was backing up and reading tapes fine. Thanks for any help or suggestions.

    Read the article

  • BackupExec 12 to Bacula questions

    - by LVDave
    We have a 128-node compute cluster for environmental modeling, with a master/head node which we currently back up with a Windows 2003 system running BackupExec 12 and a single HP LTO3 tape drive. We have recently ordered an Overland NEO200s 12-slot library, and are considering migrating off Windows to CentOS 5 for the backup server. The master/head node is RHEL5, with the compute nodes currently being migrated from a mix of RHEL3/4/5 to CentOS5. I'm fairly familiar with RH/CentOS, but have no experience with Bacula; we've tentatively settled on it as our cluster vendor recommended it. My questions are:
    1) Does Bacula support an Overland NEO200s/LTO3 library?
    2) Can Bacula catalog/restore tapes written by BE?
    3) I've heard of Amanda, but am even more unfamiliar with it than Bacula.
    Any assistance would be appreciated. Dave Frandin

    Read the article

  • Users in the field and Time Machine

    - by Bart Silverstrim
    We have several users in the field with MacBooks. Because they're not always on the network, they save files locally to their notebooks. Normally I back things up by making quick copies to other media, but with the Macs they have the option to run Time Machine (and the way OS X is designed, they're heavily encouraged to use it). Question: for maintaining Macs and data, how reliable and thorough is Time Machine for backup/restoration? Does it just back up the user's home directory, or can it restore the Mac if, for example, the drive fails? And are there options for "securing" the data, like corporate backup software for Windows does, to encrypt the data on the Time Machine drive?

    Read the article

  • How do I restore a non-system hard drive using Time Machine under OSX?

    - by richardtallent
    I dropped one of the external drives on my Mac Pro and it started making noises... so I bought a replacement drive. No biggie, that's why I have Time Machine, right? So now that I have the new drive up and initialized, how do I actually restore the drive from backup? Time Machine is intuitive when it comes to restoring the system drive or restoring individual folders/files on the same literal device, but I'm a bit stuck on how to properly restore an entire drive that is not the boot drive. I saw one suggestion to use the same volume name as the old drive and then go into Time Machine. I haven't tried that, since the information is unconfirmed. For now, I just went to the Time Machine volume, found the latest backup folder for that volume, and I'm copying the files via Finder. Of course, I expect this to work just fine, but I feel like I'm missing something if that's the "proper" way to do this.

    Read the article

  • best practice? Consumer data in MySQL on Amazon EBS (Elastic block store)

    - by jeff7091
    This is a consumer app, so I care about storage costs - I don't want to have 5x copies of data lying about. The app shards very well, so I can use MySQL and not have scaling issues. Amazon EBS has a nice baseline+snapshot backup capability that uses S3, which should have a light footprint in terms of storage cost. BUT: the magnolia.com story scares the crap out of me - basically a flawless block-level backup of a corrupt DB or filesystem. Is there anything that is nearly as storage-efficient as EBS snapshots but operates at the MySQL level?
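    For what it's worth, the usual way to avoid the magnolia.com failure mode (a perfect block-level copy of an inconsistent database) is to quiesce MySQL while the EBS snapshot is initiated. A minimal sketch, assuming the data lives on a single EBS volume and the snapshot is triggered from the AWS API/console while the lock is held (any filesystem freeze step, e.g. xfs_freeze, varies by setup):
        -- Session 1: block writes and flush tables to disk; keep this session open
        FLUSH TABLES WITH READ LOCK;
        SHOW MASTER STATUS;   -- record the binlog position for point-in-time recovery
        -- ... initiate the EBS snapshot now; it completes in the background ...
        UNLOCK TABLES;        -- release as soon as the snapshot has been created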

    Read the article

  • How to transfer files and settings from Windows 7 x64 to Windows 2008 R2?

    - by Mohamed Meligy
    If I want to re-install Windows 7 Enterprise 64-bit (or any other edition of Windows 7), I'd typically use the "Windows Easy Transfer" utility built into the OS to back up and restore my files and settings. But in my case, I'm migrating to Windows Server 2008 R2. If I remember correctly (having worked on both Windows 2008/2008 R2 before), "Windows Easy Transfer" is NOT installed on Windows Server, and it doesn't even understand the format of the backup file it generates (".MIG" file). I can't remember for sure whether this is true - is it? And if it's true, what is the alternative for transferring the files and, more importantly, program settings to Windows 2008 R2? Of course I'm aware of the "manual" option, and that automatic transfer surely will not transfer everything. Options??

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 3 (sys.dm_exec_connections)

    - by Tamarick Hill
    The third DMV we will review is sys.dm_exec_connections. This DMV is server-scoped and displays information about each and every current connection on your SQL Server instance. Let's take a look at some information that this DMV returns:
    SELECT * FROM sys.dm_exec_connections
    After reviewing this DMV, in my opinion, there's not a whole lot of useful information returned from it from a monitoring or troubleshooting standpoint. The primary use case I have for it is when I need a quick count of how many connections I have on one of my SQL Server boxes; for that, a quick SELECT COUNT(*) satisfies my need. However, for those who need it, there is other information available, such as what type of authentication a specific connection is using, the network packet size, and the client/local TCP ports being used. This information can come in handy in specific scenarios, but you probably won't need it very much for your day-to-day monitoring/troubleshooting. Still, this is an important DMV to be aware of in the event that you need it.
    For more information on this DMV, please see the Books Online entry: http://msdn.microsoft.com/en-us/library/ms181509.aspx
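    For the quick-count use case (and the detail columns mentioned above), a small sketch, with column names per the Books Online entry above rather than the original post:
        -- How many connections does this instance have right now?
        SELECT COUNT(*) AS connection_count
        FROM   sys.dm_exec_connections;

        -- Per-connection detail: authentication, packet size, ports
        SELECT session_id,
               auth_scheme,           -- SQL, NTLM, KERBEROS
               net_transport,
               net_packet_size,
               client_net_address,
               client_tcp_port,
               local_tcp_port
        FROM   sys.dm_exec_connections;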

    Read the article

  • ASP.NET 4.0- CompressionEnabled Property in session state 4.0

    - by Jalpesh P. Vadgama
    Hello guys, this blog has been quiet for a few days because I was busy with both personal and professional work, so I was not able to write up the things I have discovered recently. Here is one feature of ASP.NET 4.0 that I am going to explain.
    As web developers we all know about session state. Without the use of session, any database-driven web application is incomplete. Unlike Windows Forms, Web Forms are stateless, so when a user interacts with a web application we need to maintain state amongst web pages, and we use session for maintaining state between web pages for each user. ASP.NET provides exactly this kind of session state functionality: it identifies requests coming from the same user and same browser within a specific session timeout interval, and preserves values in the session for that interval, which helps us maintain state amongst web pages for a specific user.
    ASP.NET allows us to store session state in three ways: 1. InProc 2. Session State Service 3. SQL Server. In SQL Server mode it stores sessions in SQL Server tables instead of in server memory. ASP.NET 4.0 provides a new property called compressionEnabled, which means values stored in serialized form in SQL Server are compressed with GZip, and that results in better performance. To use it, set the property in web.config like the following:
    <sessionState mode="SQLServer" allowCustomSqlDatabase="true" sqlConnectionString="data source=Server;Initial Catalog=aspnetsessionstatedb" compressionEnabled="true" />
    That's it - with this property you can get better performance when you are storing a large amount of data in session. But you should still ask yourself why you want to store a large amount of data in session, because that's against best practices.
    Technorati Tags: Session, ASP.NET 4.0
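    If you want a rough sense of what the compression buys you, one way to check (a sketch, assuming the standard ASPState database created by aspnet_regsql.exe; the table and column names below come from that schema and may differ across versions) is to compare serialized item sizes with the setting off and on:
        -- Approximate serialized size of each session item in the ASPState database
        SELECT SessionId,
               DATALENGTH(SessionItemShort) AS short_item_bytes,
               DATALENGTH(SessionItemLong)  AS long_item_bytes
        FROM   ASPState.dbo.ASPStateTempSessions;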

    Read the article

  • How do I protect business critical data against fire?

    - by Bill Knowles
    We have 72 hard drives that contain our webcast inventory. The number is increasing. We're located in a frame building and we are afraid of not only fire, but catastrophic fire. I've priced fireproof safes that hold to the required 125F for hard drives. Their price is through the roof. Seems to me if we made backups of each of the hard drives and stored them off-site somewhere, or contracted with an online backup storage company, we might run up a bill buying backup drives that would approach the $7,000 cost of the safe! What's the best way to protect our data from the risk of fire?

    Read the article

  • Where does Picasa store albums?

    - by Dan
    For people searching, the question might also be phrased: How do I restore Picasa albums from backup?
    When I reinstalled my computer and restored my photos from backup, some of my albums showed up, but many didn't. I've found the following info: Picasa on Windows stores (stored?) album info in these places:
    Vista: C:\Users\<myaccount>\AppData\Local\Google\Picasa2Albums\
    XP: C:\Documents and Settings\<myaccount>\Local Settings\Application Data\google\Picasa2Albums\
    I restored that folder and was still missing many of my albums. That folder also contained a folder of backups, but the most recent one was from a long time ago and I've created albums since then. According to https://support.google.com/picasa/bin/picasa.google.com/support/bin/static.py?hl=en&page=release_notes.cs, since the Dec 8, 2011 build, Picasa saves album info in .ini file(s). This probably explains the albums that I do see. http://katelharrison.blogspot.com/2012/01/how-to-restore-picasa-albums-mac.html has some great info on restoring albums on Macs, but the folder structure seems to be different there than on Windows.

    Read the article

  • Incremental backups from Rackspace Cloud Files to Amazon Glacier

    - by Martin Wilson
    Is there a software product/module (open-source or commercial) that can provide incremental backups from Rackspace Cloud Files to Amazon Glacier? We are looking for something that will provide the following functionality (or achieve the same result, i.e. a cost-effective backup strategy for files stored in Rackspace Cloud Files):
    1. Work out which files have been added to or modified in a Rackspace Cloud account (since the last backup).
    2. Create a ZIP (or similar) of these files and store them in Amazon Glacier.
    3. Keep a record of which files are in which ZIPs.
    4. Ideally, restore either a single file or all files from Glacier back into Rackspace.

    Read the article

  • SQL Azure Service Issues &ndash; 10.27.2012 (Restored Now)

    - by ToStringTheory
    Please note that if you have a Windows Azure website, or use SQL Azure, your site may be experiencing downtime currently.
    Notice: I just called in regarding one of my public-facing internet sites, because the site was failing to load anything but its error page, I couldn't connect to the database to inspect application error logs, and the Windows Azure Management portal wouldn't load the SQL Azure extension. After speaking to the representative, he also mentioned that they were having some problems updating the Service Dashboard which shows service up/down time, and for now they are posting messages at http://account.windowsazure.com. Please note that this issue may only be affecting certain regions. Last, I may have misheard the representative, but he said that the outage was being categorized as a level 8, and if I heard correctly, I think he said that level 8 was the worst level. I can't say for sure on this, though, because the phone connection to their support number was bad - large amounts of white noise. Good luck!
    Update: It appears that this outage may also be affecting the following services: SQL Database, Service Bus, Datamarket, Windows Azure Marketplace, Shared Caching, Access Control 2.0, and SQL Reporting. The note on the account page says the South Central US region; however, I believe the representative I spoke to also mentioned North Central. As I said before, though, the connection was bad.
    Update 2: My site regained connectivity about an hour ago, and it appears that the service dashboard is back in operation with correct status and history. It does appear that I misheard on the phone regarding multiple regions, so chances are this only affected a percentage of the platform. All in all, if this WAS their worst level of problem, they really got it fixed and back up pretty fast. I understand that it is inherent for a complex system such as Azure to have ups and downs, but at the end of the day, I am still happy to support Azure to its fullest!

    Read the article

  • Incremental backups in Quickbooks 2005

    - by Nathan DeWitt
    My church uses QuickBooks 2005. They back up to a 512 MB thumbdrive, and have been backing up about every week for the past 18 months. The filesize of the backups has grown from 14 MB to about 23 MB. I was planning on giving them a 1 or 2 GB thumb drive and calling it a day, but when I dumped this info into Excel and projected out the growth rate, I found that we'll hit 1 GB in July, 10 GB in about another 18 months, and then 100 GB about 18 months after that. It looks to me like QuickBooks saves all the transactions with every backup. Is there a way to force incremental backups? If this is the way it is, that's fine, but I'd rather not keep buying another order of magnitude of storage space every 18 months. Can I safely delete the previous backups and just keep the most recent 2 or 3 months' worth? Thanks.

    Read the article

  • Cluster Nodes as RAID Drives

    - by BuckWoody
    I'm unable to sleep tonight so I thought I would push this post out VERY early. When you don't sleep your mind takes interesting turns, which can be a good thing. I was watching a briefing today by a couple of friends as they were talking about various ways to arrange a Windows Server cluster for SQL Server. I often see an "active" node of a cluster with a "passive" node backing it up. That means one node is working and accepting transactions, and the other is not doing any work but simply "standing by" waiting for the first to fail over. The configuration in the demonstration I saw was a bit different. In this example, there were three nodes actively working, and a fourth standing by for all three. I've put configurations like this one into place before, but as I was looking at their architecture diagram, it looked familiar - it looked like a RAID drive setup! And that's not a bad way to think about your cluster arrangements: the same concerns you might weigh for a particular RAID configuration provide a good way to think about protecting your systems in general. So even if you're not staying awake all night thinking about SQL Server clusters, take this post as an opportunity for "lateral thinking" - a way of combining in your mind the concepts from one piece of knowledge to another. You might find a new way of making your technical environment a little better.

    Read the article

  • Alternative to SecondCopy?

    - by overtherainbow
    Hello
    I've been using SecondCopy (7.0.0.146) on XP for a few years now to copy important files from one hard disk to another. One thing that bothers me is that it is unable to copy some files that are open. I assume Windows provides an API that allows an application to put an exclusive lock on an open file, and backup utilities like SecondCopy just can't access such files until they are closed. As a result, since I have to close a bunch of files/applications for SecondCopy to complete successfully, I typically don't run SecondCopy regularly like I should... which pretty much defeats the whole purpose of backing up data :-/
    For those of you using a similar solution to back up your important files onto a second mass storage device... Can you confirm that an open file can be put off-limits with an exclusive lock, and that no backup solution will work with those? If you've tried SecondCopy and other solutions recently and ended up using another solution, which one did you choose and why? Thank you for any feedback.

    Read the article
