Search Results

Search found 22449 results on 898 pages for 'complete pc backup'.


  • Backing up an NTFS disk using rsync on Ubuntu

    - by user70366
    For a long time I was using Windows. I have a separate drive I use to keep copies of my media files, photos, etc., which I periodically back up to an external drive. In Windows I used SyncToy to do this. After my Windows install stopped booting, I decided to switch to Linux (Ubuntu 10.10). That seems to be going fine, but now I want to back up my drive to the external drive like before. Mostly the two drives will already be the same, with maybe about 10GB of extra files added. So I try to use rsync to synchronise the two drives like this: rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup /media/FREECOM\ HDD/Backup The problem is that the dry run indicates every file on the drive will be copied, not just the files I have recently added. What is the correct command to sync two NTFS drives under Ubuntu so that files that already exist don't get copied again? Thanks.
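
    A minimal sketch of two likely fixes, using the paths from the question: a trailing slash on the source makes rsync compare directory contents instead of nesting Backup inside Backup, and a wider --modify-window tolerates the one-hour NTFS timestamp skew that can appear across DST changes when drives move between Windows and Linux (--size-only is a cruder fallback):

      rsync --dry-run -rvlt --modify-window=3601 \
          /media/Antonio1TB/Backup/ "/media/FREECOM HDD/Backup/"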

    Read the article

  • Is there an IDE/compiler PC benchmark I can use to compare my PC's performance?

    - by RickL
    I'm looking for a benchmark (and results on other PCs) which would give me an idea of the development performance gain I could get by upgrading my PC; the benchmark could also be used to justify the upgrade to my boss. I use Visual Studio 2008 for my development, so I'd like to get an idea of by what factor the build times would be improved, and it would also be good if the benchmark could incorporate IDE performance (i.e. when editing, using IntelliSense, opening code files, etc.) into its result. I currently have an AMD 3800 X2 with 2GB RAM on Vista 32. For example, I'd like to know what kind of performance gain I'd see in Visual Studio 2008 with a Q6600 and 4GB RAM on Vista 64, and also with other processors and other RAM sizes, and whether hard disk performance is a big factor. EDIT: I mentioned Vista 64 because I'm aware that Vista 32 can only use 3GB RAM maximum. So I'd presume that wanting to use more RAM would require Vista 64, but perhaps it could still be slower overall if there is a large overhead in using the 32-bit VS 2008 on a 64-bit OS.

    Read the article

  • Can't restore backup from SQL Server 2008 R2 to SQL Server 2005 or 2008

    - by Erick
    Hi everyone, I'm trying to get a backup from SQL Server 2008 R2 restored to SQL Server 2008, but when we try to do the restore we get this: The database was backed up on a server running version 10.50.1092. That version is incompatible with this server, which is running version 10.00.2531. Either restore the database on a server that supports the backup, or use a backup that is compatible with this server. I can use the script wizard to generate a script, but that takes over an hour to run. I also tried just exporting the data from server to server, but it had issues with the primary keys/identity columns. I will be running into this issue with several other clients so any help you could offer about how to get around this would be great. Thanks for your help!
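
    For context, SQL Server never allows restoring a backup onto an older version, so the error is expected; the data has to be moved some other way. A hedged sketch of working around the identity-column problem when copying table by table (the table name and linked server are illustrative, not from the question):

      -- run on the target 2005/2008 server after scripting the schema across
      SET IDENTITY_INSERT dbo.SomeTable ON;
      INSERT INTO dbo.SomeTable (Id, Name)
      SELECT Id, Name FROM [R2SERVER].[SourceDb].dbo.SomeTable;
      SET IDENTITY_INSERT dbo.SomeTable OFF;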

    Read the article

  • Scripting a database copy from MS Sql 2005 to 2008 without detach/backup/RDP

    - by James Santiago
    My goal is to move a single SQL 2005 database to a separate 2008 server. The issue is my level of access to both servers. On each I can only access the database and nothing else. I can't create a backup file or detach the database because I don't have access to the file system or to create a proxy. I've tried using the generate script function of SQL 2005 Management Studio Express to restore the schema, but I receive "command not supported" errors when attempting to execute the SQL on the new database. Similarly, I tried using EMS SQL Manager 2005 Lite to script a backup of the schema and data but ran into similar problems. How do I go about accomplishing this? I can't seem to find any solutions outside of using the detach and backup functions.

    Read the article

  • How do I back up Credential Manager passwords (Windows 7)?

    - by Andrew J. Brehm
    I am trying to create a backup of my stored passwords in Credential Manager. But after Windows switches to the secure desktop to get the password for the backup file, it simply announces that "Your stored logon credentials could not be backed up" and gives as explanation "Element not found", neither of which is helpful. (In fact I hate the "X could not Y" type of error message.) I am an administrator on the machine and there is only one password in Credential Manager. The sole point of the backup is to create a nearly empty Credential Manager so that I don't have to manually delete hundreds of password entries every time I have to change my domain password. (I think Microsoft haven't thought this through properly. There appears to be no way to delete more than one entry at a time.) Any ideas?
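
    A hedged workaround for the bulk-deletion half of the problem: the built-in cmdkey tool can list and delete stored credentials from a script, avoiding the one-at-a-time UI. A minimal sketch (the target name is illustrative; cmdkey /list shows the exact names to pass to /delete):

      cmdkey /list
      cmdkey /delete:LegacyGeneric:target=example.com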

    Read the article

  • How do I delete a differential backup?

    - by BlueMonkMN
    I often like to create backups when testing the software I work on, and will sometimes create a differential backup if I want to be able to get back to multiple previous states. However, sometimes I realize that I forgot one thing I wanted to include in a differential backup, or I no longer need a previous differential backup. Sometimes I simply want to create a new scenario from the original base image and start working with a new series of differential backups. So I'd like to be able to delete some older differential backups so I don't get confused about which ones I'm using. But I can't find any way to delete just the differential backups, selectively or all at once.

    Read the article

  • CA ArcServe r11.1 - have to switch Tape Drive Offline then Online to finish backup

    - by Richard
    I'll keep it brief. I have an HP Ultrium 1 in a server currently running CA ArcServe r11.1. I have 5 daily backup tapes, each of which is new. 3 of the 5 work fine without intervention, but 2 of them stop at varying points through the backup asking for a new tape, even though that tape is not full. The way I have found around this is to switch the tape drive offline for 10 minutes, then switch it back online while the backup is still running. Has anyone ever seen this before? If so, any ideas how to permanently fix this? If all else fails, just some pointers in the right direction. Thanks

    Read the article

  • Extract registry key from NTBackup System State backup

    - by phoenix8
    A Windows Server 2003 machine died recently, but I need some information that was contained in the now-defunct server's registry. I have a "System State" backup file created by the Windows Server 2003 built-in backup program (NTBackup.exe). Is there any way to extract a key/value out of the backup file? I might be able to do a Win2003 install on a similar machine and then do a system-state restore, but that's a lot of effort and I don't know for certain that the system-state restore will work on a different-spec machine. (Would it work if I booted up in 'safe mode'?) But I'd really rather just get at the data straight out of the NTBackup file, zip-file style, if that's possible.
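
    One hedged middle path: NTBackup on another machine can restore the System State to an alternate location, which writes the registry hive files out as plain files without applying them, and an offline hive can then be mounted and queried. A sketch (the restore path and key path are illustrative):

      reg load HKLM\DeadServer C:\Restore\Registry\software
      reg query HKLM\DeadServer\SomeVendor\SomeApp /s
      reg unload HKLM\DeadServer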

    Read the article

  • Use a Backup Exec 2010 R3 configuration file on 2012

    - by Roger M
    I'm looking to upgrade my backup solution from Symantec Backup Exec 2010 R3 (which I believe is the same version as 13) to Backup Exec 2012. Now, it's pretty easy to open BEutility and use the "Copy media server configuration" function in 2010 R3, but I have not found any answers as to whether this file can be imported flawlessly into 2012 or not. It would save loads of time if it's doable. Since I HAVE TO remove the 2010 installation before installing 2012, it's not possible to just test it. I need to know before I go through with it. Anyone who's tried the same? PS: Running Windows Server Standard 2008 R2

    Read the article

  • Restoring Exchange 2003 from a backup

    - by user64204
    Hi all, I'm restoring an Exchange server from a backup:

    [1] the backup was created on 19/12/2010
    [2] the server kept running until 20/12/2010
    [3] we're restoring the server today 21/12/2010 with the backup from [1]

    My understanding is that when the server comes back:

    [4] whatever is in users' inboxes since [1] will be deleted.
    [5] whatever is in users' sent boxes since [2] should be re-sent.
    [6] As a safety measure we've moved all emails sent/received between [1] and [3] to .PST files.

    Questions:

    - are [4] & [5] statements correct?
    - is there any way to move back emails from the PST file [6] to the current inbox/sent folders so that Exchange takes these emails into account (instead of deleting them)?
    - what happens to the Calendar items that were added after [1]? Is there any way to back those up as well if needed?

    Many thanks

    Read the article

  • recommendations for disk -> usb backup software

    - by TWood
    Recently I lost a tape drive, and rather than repair the unit I decided that backups to USB external drives would be cheaper. In the past I used NTBackup and figured that the new Server 2008 R2 backup utility, wbadmin, would be able to meet my needs. It does not. I am looking for recommendations for another utility that I can use. My requirements are:

    - backup of a local disk in addition to files on a network share
    - scheduled task integration (or some GUI options to manage the schedule)
    - non-incremental backup

    Basically I could do this all with WBAdmin if it just supported network shares. I saw some links that described attaching a VHD pointed at a network share, but I am trying to avoid hacks like that. If I'm going to all that trouble I'd just as well manually copy the directories over myself (see the sketch below). If anyone has any software suggestions that might make this task easier for me, please let me know. I am considering BackupAssist but can only find a few reviews here and there for it.
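
    A hedged sketch of the hand-rolled fallback mentioned above: robocopy (built into Server 2008 R2) handles local disks and UNC shares alike and does plain non-incremental mirror copies, while schtasks covers the scheduling requirement. Paths and names are illustrative:

      robocopy D:\Data E:\Backup\Data /MIR /R:2 /W:5 /LOG:C:\Logs\data.log
      robocopy \\server\share E:\Backup\Share /MIR /R:2 /W:5 /LOG:C:\Logs\share.log
      schtasks /create /tn NightlyBackup /tr C:\Scripts\backup.cmd /sc daily /st 23:00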

    Read the article

  • XP Mode (Windows Virtual PC for Windows 7) no longer requires hardware virtualisation - hurrah !

    - by Liam Westley
    Windows Virtual PC (aka XP Mode)

    When XP Mode was released, it insisted on hardware virtualisation being present on your CPU and enabled in the BIOS. Given that Windows Virtual PC was based on an improved Virtual PC 2007, which provided hardware virtualisation as a user-selectable option, I did wonder why on earth Microsoft thought this was a good idea. Not only do many people not have a CPU with hardware virtualisation support, some manufacturers don't provide a BIOS option to enable this setting, especially on laptops - yes Sony, Toshiba and Acer, I'm looking at you.

    Dumb and dumber

    This issue became a double whammy; not only was Microsoft a bit dumb in not supporting Windows Virtual PC without hardware virtualisation, your hardware manufacturer was also dumb in not supporting the option in the BIOS.

    Microsoft update to Windows Virtual PC

    Belatedly, Microsoft has seen the problem with this hardware virtualisation requirement and has now released a new version of Windows Virtual PC that works without hardware virtualisation. This is really good news for those with older (or limited) CPUs and rubbish BIOS firmware. You can find details of how to download the new versions of XP Mode here:

    http://blogs.msdn.com/virtual_pc_guy/archive/2010/03/18/windows-virtual-pc-no-hardware-virtualization-update-now-available-for-download.aspx

    And there is also an explanation of why the hardware virtualisation requirement was in place for previous releases:

    http://blogs.msdn.com/virtual_pc_guy/archive/2010/03/18/windows-virtual-pc-now-without-the-need-for-hardware-virtualization.aspx

    Read the article

  • SQL Server – Undelete a Table and Restore a Single Table from Backup

    - by Mladen Prajdic
    This post is part of the monthly community event called T-SQL Tuesday started by Adam Machanic (blog|twitter) and hosted by someone else each month. This month the host is Sankar Reddy (blog|twitter) and the topic is Misconceptions in SQL Server. You can follow posts for this theme on Twitter by looking at the #TSQL2sDay hashtag.

    Let me start by saying: this code is a crazy hack that is never to be used unless you really, really have to. Really! And I don't think there's a time when you would really have to use it for real. Because it's a hack there are a number of things that can go wrong, so play with it knowing that. I've managed to totally corrupt one database. :) Oh… and for those saying "yeah, yeah… you have a single table in a filegroup and you're restoring that", I say "nay nay" to you. As we all know, SQL Server can't do single table restores from backup. This is kind of an obvious thing due to relational integrity (RI) concerns: since we have to maintain RI, we have to restore all tables represented in an RI graph. For this exercise I say BAH! to those concerns.

    Note that this method "works" only for simple tables that don't have LOB and off-row data. The code can be expanded to include those, but I've tried to leave things "simple". Note also that for this to work our table needs to be relatively static data-wise; this doesn't work for an OLTP table. Products are a perfect example of static data: they don't change much between backups, pretty much everything depends on them, and their table is one of those tables that are relatively easy to accidentally delete everything from. This only works if the database is in Full or Bulk-Logged recovery mode, for tables whose contents have been deleted or truncated, but NOT when a table was dropped.

    Everything we'll talk about has to be done before the data pages are reused for other purposes. After deletion or truncation the pages are marked as reusable, so you have to act fast. The best thing is probably to put the database into single user mode ASAP while you're performing this procedure and return it to multi user after you're done.

    How do we do it? We will be using undocumented but well-known commands: DBCC PAGE, the undocumented function sys.fn_dblog, and the little-known RESTORE DATABASE ... PAGE option. All tests will be on a copy of the Production.Product table in the AdventureWorks database, called Production.Product1, because the original table has FK constraints that prevent us from truncating it for testing.

    -- create a duplicate table. This doesn't preserve indexes!
    SELECT *
    INTO AdventureWorks.Production.Product1
    FROM AdventureWorks.Production.Product

    After we run this code, take a full backup to perform further testing.

    First let's see what the difference between DELETE and TRUNCATE is when it comes to logging. With DELETE, every row deletion is logged in the transaction log. With TRUNCATE, only whole data page deallocations are logged in the transaction log. Getting deleted data pages is simple: all we have to look for is the row delete entry in the sys.fn_dblog output. But getting data pages that were truncated from the transaction log presents a bit of an interesting problem. I will not go into the depths of IAM (Index Allocation Map) and PFS (Page Free Space) pages, but suffice to say that every IAM page has intervals that tell us which data pages are allocated for a table and which aren't.
    If we dive deep into the sys.fn_dblog output, we can see that once you truncate a table all the pages in all the intervals are deallocated, and this is shown in the PFS page transaction log entry as deallocation of pages. For every 8 pages in the same extent there is one PFS page row in the transaction log. This row holds information about all 8 pages in CSV format, which means we can get to this data with some parsing. A great help for parsing this stuff is Peter Debetta's handy function dbo.HexStrToVarBin, which converts a hexadecimal string into a varbinary value that can be easily converted to an integer, thus giving us a readable page number. The shortened (columns removed) sys.fn_dblog output for a PFS page with CSV data for 1 extent (8 data pages) looks like this:

    -- [Page ID] is displayed in hex format.
    -- To convert it to a readable int we'll use the dbo.HexStrToVarBin function found at
    -- http://sqlblog.com/blogs/peter_debetta/archive/2007/03/09/t-sql-convert-hex-string-to-varbinary.aspx
    -- This function must be installed in the master database
    SELECT Context, AllocUnitName, [Page ID], Description
    FROM sys.fn_dblog(NULL, NULL)
    WHERE [Current LSN] = '00000031:00000a46:007d'

    The pages at the end marked with 0x00--> are pages that are allocated in the extent but are not part of the table. We can inspect the raw content of each data page with a DBCC PAGE command:

    -- we need this trace flag to redirect output to the query window.
    DBCC TRACEON (3604);
    -- WITH TABLERESULTS gives us data in table format instead of message format
    -- we use format option 3 because it's the easiest to read and manipulate further on
    DBCC PAGE (AdventureWorks, 1, 613, 3) WITH TABLERESULTS

    Since the DBCC PAGE output can be quite extensive I won't put it here. You can see an example of it in the link at the beginning of this section.

    Getting deleted data back

    When we run a delete statement, every row to be deleted is marked as a ghost record. A background process periodically cleans up those rows. A huge misconception is that the data is actually removed. It's not; only the pointers to the rows are removed, while the data itself is still on the data page. We just can't access it by normal means. To get those pointers back we need to restore every deleted page using the RESTORE PAGE option mentioned above. This restore must be done from a full backup, followed by any differential and log backups that you may have. This is necessary to bring the pages up to the same point in time as the rest of the data. However, the restore doesn't magically connect the restored page back to the original table. It simply replaces the current page with the one from the backup. After the restore we use DBCC PAGE to read data directly from all data pages and insert that data into a temporary table. To finish the RESTORE PAGE procedure we finally have to take a tail log backup (a simple backup of the transaction log) and restore it back. We can now insert data from the temporary table into our original table by hand.

    Getting truncated data back

    When we run a truncate, the truncated data pages aren't touched at all. Even the pointers to rows stay unchanged. Because of this, getting data back from a truncated table is simple: we just have to find out which pages belonged to our table and use DBCC PAGE to read data off them. No restore is necessary. It turns out that the problem we had with finding the data pages is alleviated by not having to do a RESTORE PAGE procedure.

    Stop stalling… show me The Code!
    This is the code for getting deleted and truncated data back. It's commented in all the right places, so don't be afraid to take a closer look. Make sure you have a full backup before trying this out. Also, I suggest that the last step of backing up and restoring the tail log is performed by hand.

    USE master
    GO
    IF OBJECT_ID('dbo.HexStrToVarBin') IS NULL
        RAISERROR ('No dbo.HexStrToVarBin installed. Go to http://sqlblog.com/blogs/peter_debetta/archive/2007/03/09/t-sql-convert-hex-string-to-varbinary.aspx and install it in master database' , 18, 1)

    SET NOCOUNT ON
    BEGIN TRY
        DECLARE @dbName VARCHAR(1000), @schemaName VARCHAR(1000), @tableName VARCHAR(1000),
                @fullBackupName VARCHAR(1000), @undeletedTableName VARCHAR(1000),
                @sql VARCHAR(MAX), @tableWasTruncated bit;

        /* THE FIRST LINE ARE OUR INPUT PARAMETERS
           In this case we're trying to recover the Production.Product1 table in the AdventureWorks database.
           My full backup of the AdventureWorks database is at e:\AW.bak */
        SELECT @dbName = 'AdventureWorks',
               @schemaName = 'Production',
               @tableName = 'Product1',
               @fullBackupName = 'e:\AW.bak',
               @undeletedTableName = '##' + @tableName + '_Undeleted',
               @tableWasTruncated = 0,
               -- copy the structure from original table to a temp table that we'll fill with restored data
               @sql = 'IF OBJECT_ID(''tempdb..' + @undeletedTableName + ''') IS NOT NULL DROP TABLE ' + @undeletedTableName +
                      ' SELECT *' + ' INTO ' + @undeletedTableName +
                      ' FROM [' + @dbName + '].[' + @schemaName + '].[' + @tableName + ']' + ' WHERE 1 = 0'
        EXEC (@sql)

        IF OBJECT_ID('tempdb..#PagesToRestore') IS NOT NULL DROP TABLE #PagesToRestore

        /* FIND DATA PAGES WE NEED TO RESTORE */
        CREATE TABLE #PagesToRestore ([ID] INT IDENTITY(1,1), [FileID] INT, [PageID] INT,
                                      [SQLtoExec] VARCHAR(1000)) -- DBCC PAGE statement to run later

        RAISERROR ('Looking for deleted pages...', 10, 1)
        -- use T-LOG direct read to get deleted data pages
        INSERT INTO #PagesToRestore([FileID], [PageID], [SQLtoExec])
        EXEC('USE [' + @dbName + '];
        SELECT FileID, PageID, ''DBCC TRACEON (3604); DBCC PAGE ([' + @dbName + '], '' + FileID + '', '' + PageID + '', 3) WITH TABLERESULTS'' as SQLToExec
        FROM (SELECT DISTINCT LEFT([Page ID], 4) AS FileID, CONVERT(VARCHAR(100), ' +
            'CONVERT(INT, master.dbo.HexStrToVarBin(SUBSTRING([Page ID], 6, 20)))) AS PageID
        FROM sys.fn_dblog(NULL, NULL)
        WHERE AllocUnitName LIKE ''%' + @schemaName + '.' + @tableName + '%'' ' +
            'AND Context IN (''LCX_MARK_AS_GHOST'', ''LCX_HEAP'') AND Operation in (''LOP_DELETE_ROWS''))t');

        SELECT * FROM #PagesToRestore

        -- if upper EXEC returns 0 rows it means the table was truncated so find truncated pages
        IF (SELECT COUNT(*) FROM #PagesToRestore) = 0
        BEGIN
            RAISERROR ('No deleted pages found. Looking for truncated pages...', 10, 1)
            -- use T-LOG read to get truncated data pages
            INSERT INTO #PagesToRestore([FileID], [PageID], [SQLtoExec])
            -- dark magic happens here
            -- because truncation simply deallocates pages we have to find out which pages were deallocated.
            -- we can find this out by looking at the PFS page row's Description column.
            -- for every deallocated extent the Description has a CSV of 8 pages in that extent.
            -- then it's just a matter of parsing it.
            -- we also remove the pages in the extent that weren't allocated to the table itself
            -- marked with '0x00-->00'
            EXEC ('USE [' + @dbName + '];
            DECLARE @truncatedPages TABLE(DeallocatedPages VARCHAR(8000), IsMultipleDeallocs BIT);
            INSERT INTO @truncatedPages
            SELECT REPLACE(REPLACE(Description, ''Deallocated '', ''Y''), ''0x00-->00 '', ''N'') + '';'' AS DeallocatedPages,
                   CHARINDEX('';'', Description) AS IsMultipleDeallocs
            FROM (SELECT DISTINCT LEFT([Page ID], 4) AS FileID,
                         CONVERT(VARCHAR(100), CONVERT(INT, master.dbo.HexStrToVarBin(SUBSTRING([Page ID], 6, 20)))) AS PageID,
                         Description
                  FROM sys.fn_dblog(NULL, NULL)
                  WHERE Context IN (''LCX_PFS'') AND Description LIKE ''Deallocated%''
                        AND AllocUnitName LIKE ''%' + @schemaName + '.' + @tableName + '%'') t;
            SELECT FileID, PageID,
                   ''DBCC TRACEON (3604); DBCC PAGE ([' + @dbName + '], '' + FileID + '', '' + PageID + '', 3) WITH TABLERESULTS'' as SQLToExec
            FROM (SELECT LEFT(PageAndFile, 1) as WasPageAllocatedToTable,
                         SUBSTRING(PageAndFile, 2, CHARINDEX('':'', PageAndFile) - 2) as FileID,
                         CONVERT(VARCHAR(100), CONVERT(INT, master.dbo.HexStrToVarBin(SUBSTRING(PageAndFile, CHARINDEX('':'', PageAndFile) + 1, LEN(PageAndFile))))) as PageID
                  FROM (SELECT SUBSTRING(DeallocatedPages, delimPosStart, delimPosEnd - delimPosStart) as PageAndFile, IsMultipleDeallocs
                        FROM (SELECT *,
                                     CHARINDEX('';'', DeallocatedPages)*(N-1) + 1 AS delimPosStart,
                                     CHARINDEX('';'', DeallocatedPages)*N AS delimPosEnd
                              FROM @truncatedPages t1
                              CROSS APPLY (SELECT TOP (case when t1.IsMultipleDeallocs = 1 then 8 else 1 end)
                                                  ROW_NUMBER() OVER(ORDER BY number) as N
                                           FROM master..spt_values) t2
                             )t)t)t
            WHERE WasPageAllocatedToTable = ''Y''')

            SELECT @tableWasTruncated = 1
        END

        DECLARE @lastID INT, @pagesCount INT
        SELECT @lastID = 1, @pagesCount = COUNT(*) FROM #PagesToRestore
        SELECT @sql = 'Number of pages to restore: ' + CONVERT(VARCHAR(10), @pagesCount)
        IF @pagesCount = 0
            RAISERROR ('No data pages to restore.', 18, 1)
        ELSE
            RAISERROR (@sql, 10, 1)

        -- If the table was truncated we'll read the data directly from data pages without restoring from backup
        IF @tableWasTruncated = 0
        BEGIN
            -- RESTORE DATA PAGES FROM FULL BACKUP IN BATCHES OF 200
            WHILE @lastID <= @pagesCount
            BEGIN
                -- create CSV string of pages to restore
                SELECT @sql = STUFF((SELECT ',' + CONVERT(VARCHAR(100), FileID) + ':' + CONVERT(VARCHAR(100), PageID)
                                     FROM #PagesToRestore
                                     WHERE ID BETWEEN @lastID AND @lastID + 200
                                     ORDER BY ID
                                     FOR XML PATH('')), 1, 1, '')
                SELECT @sql = 'RESTORE DATABASE [' + @dbName + '] PAGE = ''' + @sql + ''' FROM DISK = ''' + @fullBackupName + ''''
                RAISERROR ('Starting RESTORE command:' , 10, 1) WITH NOWAIT;
                RAISERROR (@sql , 10, 1) WITH NOWAIT;
                EXEC(@sql);
                RAISERROR ('Restore DONE' , 10, 1) WITH NOWAIT;
                SELECT @lastID = @lastID + 200
            END

            /* If you have any differential or transaction log backups you should restore them here
               to bring the previously restored data pages up to date */
        END

        DECLARE @dbccSinglePage TABLE (
            [ParentObject] NVARCHAR(500),
            [Object] NVARCHAR(500),
            [Field] NVARCHAR(500),
            [VALUE] NVARCHAR(MAX)
        )
        DECLARE @cols NVARCHAR(MAX), @paramDefinition NVARCHAR(500), @SQLtoExec VARCHAR(1000),
                @FileID VARCHAR(100), @PageID VARCHAR(100), @i INT = 1

        -- Get deleted table columns from information_schema view
        -- Need sp_executeSQL because database name can't be passed in as variable
        SELECT @cols = 'select @cols = STUFF((SELECT '', ['' + COLUMN_NAME + '']''
        FROM ' + @dbName + '.INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_NAME = ''' + @tableName + ''' AND TABLE_SCHEMA = ''' + @schemaName + '''
        ORDER BY ORDINAL_POSITION
        FOR XML PATH('''')), 1, 2, '''')',
               @paramDefinition = N'@cols nvarchar(max) OUTPUT'
        EXECUTE sp_executesql @cols, @paramDefinition, @cols = @cols OUTPUT

        -- Loop through all the restored data pages,
        -- read data from them and insert them into temp table
        -- which you can then insert into the original deleted table
        DECLARE dbccPageCursor CURSOR GLOBAL FORWARD_ONLY FOR
            SELECT [FileID], [PageID], [SQLtoExec]
            FROM #PagesToRestore
            ORDER BY [FileID], [PageID]

        OPEN dbccPageCursor;
        FETCH NEXT FROM dbccPageCursor INTO @FileID, @PageID, @SQLtoExec;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            RAISERROR ('---------------------------------------------', 10, 1) WITH NOWAIT;
            SELECT @sql = 'Loop iteration: ' + CONVERT(VARCHAR(10), @i);
            RAISERROR (@sql, 10, 1) WITH NOWAIT;
            SELECT @sql = 'Running: ' + @SQLtoExec
            RAISERROR (@sql, 10, 1) WITH NOWAIT;
            -- if something goes wrong with DBCC execution or data gathering, skip it but print error
            BEGIN TRY
                INSERT INTO @dbccSinglePage EXEC (@SQLtoExec)
                -- make the data insert magic happen here
                IF (SELECT CONVERT(BIGINT, [VALUE]) FROM @dbccSinglePage WHERE [Field] LIKE '%Metadata: ObjectId%')
                   = OBJECT_ID('['+@dbName+'].['+@schemaName +'].['+@tableName+']')
                BEGIN
                    DELETE @dbccSinglePage
                    WHERE NOT ([ParentObject] LIKE 'Slot % Offset %' AND [Object] LIKE 'Slot % Column %')

                    SELECT @sql = 'USE tempdb; ' +
                           'IF (OBJECTPROPERTY(object_id(''' + @undeletedTableName + '''), ''TableHasIdentity'') = 1) ' +
                           'SET IDENTITY_INSERT ' + @undeletedTableName + ' ON; ' +
                           'INSERT INTO ' + @undeletedTableName + '(' + @cols + ') ' +
                           STUFF((SELECT ' UNION ALL SELECT ' +
                                         STUFF((SELECT ', ' + CASE WHEN VALUE = '[NULL]' THEN 'NULL' ELSE '''' + [VALUE] + '''' END
                                                FROM (
                                                    -- the unicorn helps here to correctly set ordinal numbers of columns in a data page
                                                    -- it's turning STRING order into INT order (1,10,11,2,21 into 1,2,..10,11...21)
                                                    SELECT [ParentObject], [Object], Field, VALUE,
                                                           RIGHT('00000' + O1, 6) AS ParentObjectOrder,
                                                           RIGHT('00000' + REVERSE(LEFT(O2, CHARINDEX(' ', O2)-1)), 6) AS ObjectOrder
                                                    FROM (SELECT [ParentObject], [Object], Field, VALUE,
                                                                 REPLACE(LEFT([ParentObject], CHARINDEX('Offset', [ParentObject])-1), 'Slot ', '') AS O1,
                                                                 REVERSE(LEFT([Object], CHARINDEX('Offset ', [Object])-2)) AS O2
                                                          FROM @dbccSinglePage
                                                          WHERE t.ParentObject = ParentObject
                                                         )t)t
                                                ORDER BY ParentObjectOrder, ObjectOrder
                                                FOR XML PATH('')), 1, 2, '')
                                  FROM @dbccSinglePage t
                                  GROUP BY ParentObject
                                  FOR XML PATH('')), 1, 11, '') + ';'
                    RAISERROR (@sql, 10, 1) WITH NOWAIT;
                    EXEC (@sql)
                END
            END TRY
            BEGIN CATCH
                SELECT @sql = 'ERROR!!!' + CHAR(10) + CHAR(13) + 'ErrorNumber: ' + ERROR_NUMBER() + '; ErrorMessage' + ERROR_MESSAGE() +
                              CHAR(10) + CHAR(13) + 'FileID: ' + @FileID + '; PageID: ' + @PageID
                RAISERROR (@sql, 10, 1) WITH NOWAIT;
            END CATCH
            DELETE @dbccSinglePage
            SELECT @sql = 'Pages left to process: ' + CONVERT(VARCHAR(10), @pagesCount - @i) +
                          CHAR(10) + CHAR(13) + CHAR(10) + CHAR(13) + CHAR(10) + CHAR(13),
                   @i = @i+1
            RAISERROR (@sql, 10, 1) WITH NOWAIT;
            FETCH NEXT FROM dbccPageCursor INTO @FileID, @PageID, @SQLtoExec;
        END
        CLOSE dbccPageCursor;
        DEALLOCATE dbccPageCursor;

        EXEC ('SELECT ''' + @undeletedTableName + ''' as TableName; SELECT * FROM ' + @undeletedTableName)
    END TRY
    BEGIN CATCH
        SELECT ERROR_NUMBER() AS ErrorNumber, ERROR_MESSAGE() AS ErrorMessage
        IF CURSOR_STATUS ('global', 'dbccPageCursor') >= 0
        BEGIN
            CLOSE dbccPageCursor;
            DEALLOCATE dbccPageCursor;
        END
    END CATCH

    -- if the table was deleted we need to finish the restore page sequence
    IF @tableWasTruncated = 0
    BEGIN
        -- take a log tail backup and then restore it to complete the page restore process
        DECLARE @currentDate VARCHAR(30)
        SELECT @currentDate = CONVERT(VARCHAR(30), GETDATE(), 112)
        RAISERROR ('Starting Log Tail backup to c:\Temp ...', 10, 1) WITH NOWAIT;
        PRINT ('BACKUP LOG [' + @dbName + '] TO DISK = ''c:\Temp\' + @dbName + '_TailLogBackup_' + @currentDate + '.trn''')
        EXEC ('BACKUP LOG [' + @dbName + '] TO DISK = ''c:\Temp\' + @dbName + '_TailLogBackup_' + @currentDate + '.trn''')
        RAISERROR ('Log Tail backup done.', 10, 1) WITH NOWAIT;
        RAISERROR ('Starting Log Tail restore from c:\Temp ...', 10, 1) WITH NOWAIT;
        PRINT ('RESTORE LOG [' + @dbName + '] FROM DISK = ''c:\Temp\' + @dbName + '_TailLogBackup_' + @currentDate + '.trn''')
        EXEC ('RESTORE LOG [' + @dbName + '] FROM DISK = ''c:\Temp\' + @dbName + '_TailLogBackup_' + @currentDate + '.trn''')
        RAISERROR ('Log Tail restore done.', 10, 1) WITH NOWAIT;
    END
    -- The last step is manual. Insert data from our temporary table to the original deleted table

    The misconception here is that you can do a single table restore properly in SQL Server. You can't. But with a little experimentation you can get pretty close to it. One way to possibly remove the dependency on a backup to retrieve deleted pages is to quickly run a script similar to the one above that gets data directly from data pages while the rows are still marked as ghost records. It could be done if we could beat the ghost record cleanup task.

    Read the article

  • Fujitsu LifeBook T4010D Laptop CPU Fan from Pcpartsltd.com

    - by pcpartsltd
    Tested and 100% working: Fujitsu LifeBook T4010D Laptop CPU Cooling Fan MCF-S4512AM05. Features:

    - Model: MCF-S4512AM05
    - Package content: 1x CPU cooling fan
    - Condition: new
    - Warranty: 3 months

    Compatible model: Fujitsu LifeBook T4010D laptop. Pcpartsltd.com limited is a direct exporter of high-quality PC parts: notebooks, laptop power adapters, laptop batteries, laptop keyboards, laptop inverters, laptop hinges, laptop CPU fans, laptop drivers, laptop motherboards, Samsung wall mounts, laptop LCD bezels/LCD lids, laptop LCD/LED panels and laptop LCD video cables. We are laptop parts experts.

    Read the article

  • What about laptop parts from Pcpartsltd.com?

    - by pcpartsltd
    I will buy some items from Pcpartsltd.com, what about it? I have seen these words: "Pcpartsltd.com limited is a direct Exporter of high quality pc part notebooks, laptop power adapters, laptop batteries, laptop keyboards, laptop Inverters, laptop Hinges, laptop CPU Fan, laptop driver, laptop MotherBoards, Samsung Wall Mount, laptop LCD Bezel/ LCD lid, laptop lcd/led panel and Laptop LCD Video Cable. We are Laptop Parts experts."

    Read the article

  • Ubuntu Workstation

    - by John Smith
    I bought a Dell Inspiron N5110 hoping that I'd be able to use it fine with Linux; however, I couldn't install the video card drivers, and within 8 months the motherboard burned out because the power management wasn't right and it was overheating. I want to buy a workstation PC that works with Ubuntu. Does anyone have any suggestions? I'd like the video card and all the other hardware to work properly. Thank you

    Read the article

  • VBA Macro to save an excel file to a different backup location

    - by Joe Taylor
    I am trying to create a macro that runs either on close or on save and backs up the file to a different location. At the moment the macro I have used is:

    Private Sub Workbook_BeforeClose(Cancel As Boolean)
    'Private Sub Workbook_BeforeSave(ByVal SaveAsUI As Boolean, Cancel As Boolean)
        'Saves the current file to a backup folder and the default folder
        'Note that any backup is overwritten
        Application.DisplayAlerts = False
        ActiveWorkbook.SaveCopyAs Filename:="T:\TEC_SERV\Backup file folder - DO NOT DELETE\" & _
            ActiveWorkbook.Name
        ActiveWorkbook.Save
        Application.DisplayAlerts = True
    End Sub

    This creates a backup of the file OK the first time; however, if this is tried again I get:

    Run-Time Error '1004'; Microsoft Office Excel cannot access the file 'T:\TEC_SERV\Backup file folder - DO NOT DELETE\Test Macro Sheet.xlsm'. There are several possible reasons: The file name or path does not exist. The file is being used by another program. The workbook you are trying to save has the same name as a...

    I know the path is correct. I also know that the file is not open anywhere else. The workbook has the same name as the one I'm trying to save over, but it should just overwrite. Any help would be much appreciated. Joe
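
    A hedged guess at the cause, with a sketch: SaveCopyAs can fail with error 1004 when the existing copy is locked or has picked up the read-only attribute, so removing any previous backup first often clears it. On Error Resume Next covers the first run, when no backup exists yet:

      Private Sub Workbook_BeforeClose(Cancel As Boolean)
          Dim backupPath As String
          backupPath = "T:\TEC_SERV\Backup file folder - DO NOT DELETE\" & ActiveWorkbook.Name
          On Error Resume Next
          SetAttr backupPath, vbNormal   ' clear read-only if the old copy has it
          Kill backupPath                ' delete the previous backup, if any
          On Error GoTo 0
          Application.DisplayAlerts = False
          ActiveWorkbook.SaveCopyAs Filename:=backupPath
          Application.DisplayAlerts = True
      End Sub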

    Read the article

  • Git pull auto-complete on OS X

    - by vodkhang
    Following the instructions at http://denis.tumblr.com/post/71390665/adding-bash-completion-for-git-on-mac-os-x-leopard I set up git auto-completion on Mac OS. However, when I type git pull origin ma (for master) and then tab, it takes a long time for git to auto-complete to git pull origin master. I think it connects to the server to get the branches, but I am not sure; is there any way to make it faster and only use the branches on the local machine?

    cd /tmp
    git clone git://git.kernel.org/pub/scm/git/git.git
    cd git
    git checkout v`git --version | awk '{print $3}'`
    cp contrib/completion/git-completion.bash ~/.git-completion.bash
    cd ~
    rm -rf /tmp/git
    echo -e "source ~/.git-completion.bash" >> .profile

    Read the article

  • Running complete Native Linux on phones that are bootloader-unlocked [on hold]

    - by james
    Since there are many phones today (HTC, Samsung, LG, Nexus) that have unlocked bootloaders, I want to ask what's preventing them from running a complete native GNU/Linux. GNU/Linux has ARM ports, and we can run a command-line GNU/Linux on top of Android by the chroot method. So what's preventing existing bootloader-unlocked Android phones from running a complete GNU/Linux natively? A device would get a long shelf life if it had such a port. My few thoughts:

    - Proprietary drivers for hardware that cannot be made to work under a different OS, where the binary provider will never support any OS other than what the phone shipped with.
    - Application interface: the interface for desktop apps doesn't fit a mobile display with a different PPI.
    - Kernel: since Android devices use a Linux kernel whose sources should be available, could the device kernel be modified to work with GNU/Linux?

    Any other reasons?

    Read the article

  • Bind shift-tab to complete-backward in fish

    - by Sebastian
    I found myself using the auto-complete functionality of the fish shell, where pressing tab twice or more cycles through the suggestions. But then I accidentally pressed tab once too many times, and I wanted to go back to the previous suggestion, so I pressed shift-tab, which only appended [Z to the command. For example, when I type cd D<tab><tab>:

    ~> cd Desktop/

    I press <tab>, result:

    ~> cd Documents/

    Now when I press <shift+tab>, the prompt changes to

    ~> cd Documents/[Z

    instead of returning to the desired:

    ~> cd Desktop/

    How do I do this (preferably using the fish_user_key_bindings.fish file)? The documentation only provides the special function complete.
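
    A hedged sketch for ~/.config/fish/functions/fish_user_key_bindings.fish: shift-tab reaches the shell as the escape sequence \e[Z, and newer fish releases ship a complete-and-search input function (bound to shift-tab by default in those versions) that opens the completion pager in a searchable mode you can step through. On builds that only have complete, the bind at least stops the stray [Z from being inserted:

      function fish_user_key_bindings
          # Shift-Tab arrives as CSI Z in most terminals
          bind \e\[Z complete-and-search
      end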

    Read the article

  • Complete Public Folder Migration from Exchange 2007 to Exchange 2010

    - by Michael Todd
    We were in the process of migrating from Exchange 2007 to Exchange 2010 and hit a brick wall when trying to migrate public folders. After resolving issues with connectivity (and another issue with an old Exchange 2003 server being listed in AD that was causing the replication to fail), it finally appeared that messages were migrating from one server to the other. However, we appear to have jumped the gun and ran MoveAllReplicas before the process was complete. We are now stuck with about 210MB of public folders on the new server from a 7GB public folder store on the old server. The messages appear to still be available on the old server, since running get-publicfolderstatistics shows that there are messages available. We have waited several days for the move to continue, but we are stuck at 210MB. Is there something we can do to complete the replication so that all of the messages move from the old server to the new server?
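
    A hedged sketch in the Exchange Management Shell (server names are illustrative, and whether touching every folder at once is advisable depends on the size of the tree): re-assert the new server in the replica lists with the AddReplicaToPFRecursive.ps1 script from the Exchange Scripts folder, then force hierarchy and content replication and watch the item counts converge:

      .\AddReplicaToPFRecursive.ps1 -TopPublicFolder "\" -ServerToAdd "EX2010"
      Update-PublicFolderHierarchy -Server "EX2010"
      Get-PublicFolder "\" -Recurse | Update-PublicFolder -Server "EX2010"
      Get-PublicFolderStatistics -Server "EX2007"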

    Read the article

  • New PC doesn't work with existing router, but works fine when connected directly to the cable modem

    - by user34786
    I bought a new desktop PC (eMachines ET1331G-03W from Walmart) with Windows 7 installed, but I cannot access the internet by connecting to my existing wireless router (Linksys BEFW11S4) with a wired cable, though all the other existing desktops and laptops have no problem connecting to the same router. However, the new desktop PC works fine and is able to connect to the internet if I bypass the router and hook it up directly to the cable modem. On the new PC, when connecting to the router, I got the information below by typing ipconfig; the IP address looks wrong to me:

    Autoconfiguration IPv4 Address: 169.254.71.140
    Subnet Mask: 255.255.0.0
    Default Gateway: (empty)
    NetBIOS over Tcpip: Enabled

    Typing ipconfig on all the other desktops and laptops gives values like the ones below, which look good to me:

    Connection-specific DNS Suffix . :
    IP Address. . . . . . . . . . . . : 192.168.1.140
    Subnet Mask . . . . . . . . . . . : 255.255.255.0
    Default Gateway . . . . . . . . . : 192.168.1.1

    The wireless router is on 192.168.1.1. I do not know why the new desktop got a 169.254.71.140 IP; it should have something like 192.168.1.xxx, and it is configured to get an IP automatically by DHCP. I have tried switching cables, powering off the cable modem and router, and rebooting the new PC many times with no luck. So I believe this is an issue with either the router or the new PC's configuration. Can someone help me figure out the issue?
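
    For reference, a 169.254.x.x address is APIPA, which Windows assigns itself when it gets no answer from DHCP, so the new PC and the router are not completing a DHCP exchange (the router's DHCP server, a bad port, or the NIC driver are the usual suspects). A hedged first step from an elevated command prompt on the new PC while it is plugged into the router:

      ipconfig /release
      ipconfig /renew
      ipconfig /all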

    Read the article

  • Offline backup synchronization

    - by Pavan Kumar
    There is a central server running Windows Server 2003 and SQL Server 2005, and there are 7 client machines situated in various places, each with XP Pro and SQL Server 2005 installed. They are not interconnected, so they are physically separate. One person goes to each of these centers maybe twice a month and takes the backup (full database consisting of mdf and ldf files) on a pen drive and brings it to the central server, which contains the central database holding the same schema as all the other client databases. I need to synchronize each backup database (belonging to different centers) one by one, updating existing data or inserting new data in the central database.

    The solution I tried was replication. The pen drive is brought to the central server with the 7 instances of the databases, and the databases are attached one by one to the same SQL Server where the central database exists. My idea was then to replicate the backup databases one by one, i.e. using a single subscription (the central database) and multiple publications (the 7 instances of databases) topology, performing replication locally (on the same machine). So I tried to develop a UI in C# .NET to programmatically run transactional replication with a push subscription using RMO programming (which is incomplete as of now, because there is no point in developing it when you already know it is not the solution).

    Transactional replication can be set to initialize either with or without a snapshot. If I go for the first option, i.e. with a snapshot, whatever data is present in the central database is overwritten by the new data, so the data present initially in the central database is lost. If I try to initialize without a snapshot, no data will be sent from the backup database to the server (the backup database already holds the updated and new data). Replication only works in a scenario where incremental changes are made after you set up the replication: the data present in the backup database when setting up the replication will not be replicated when running the snapshot agent for the first time, and only changes made in the backup database thereafter would be reflected in the central database. (Remember, I am not going to insert new data or make any changes to the backup database after I attach it to the central server.) So this solution is not feasible.

    I want a solution for synchronizing from one client database to the central database present in the same machine using C#.NET. If you can provide a small example, maybe with two databases (with the same schema) DB1 (client) to DB2 (server) consisting of one or two tables, it would be very helpful. The synchronization is not bidirectional: I want to only update existing data or insert new data from DB1 to DB2 (DB2 may contain some data initially).

    Thanks and Regards
    Pavan
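
    A hedged sketch of one approach that matches these constraints: the Microsoft Sync Framework database providers can run a one-way, change-tracked upload between two SQL Server databases on the same machine from C#. This is not from the question; the scope name and connection strings are illustrative, and both databases must be provisioned into the sync scope once (via SqlSyncScopeProvisioning) before the first synchronization:

      // assumes references to Microsoft.Synchronization,
      // Microsoft.Synchronization.Data and Microsoft.Synchronization.Data.SqlServer
      using System;
      using System.Data.SqlClient;
      using Microsoft.Synchronization;
      using Microsoft.Synchronization.Data.SqlServer;

      class OneWaySync
      {
          static void Main()
          {
              using (var client  = new SqlConnection(@"Data Source=.;Initial Catalog=DB1;Integrated Security=True"))
              using (var central = new SqlConnection(@"Data Source=.;Initial Catalog=DB2;Integrated Security=True"))
              {
                  var orchestrator = new SyncOrchestrator
                  {
                      LocalProvider  = new SqlSyncProvider("CenterScope", client),   // DB1 = attached backup
                      RemoteProvider = new SqlSyncProvider("CenterScope", central),  // DB2 = central database
                      Direction      = SyncDirectionOrder.Upload                     // one-way: DB1 -> DB2
                  };
                  SyncOperationStatistics stats = orchestrator.Synchronize();
                  Console.WriteLine("Changes uploaded: " + stats.UploadChangesTotal);
              }
          }
      }

    The Upload direction only pushes DB1's tracked changes into DB2, so data already present in the central database is left in place rather than overwritten, which is what the snapshot-based replication attempt could not guarantee.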

    Read the article
