Search Results

Search found 5474 results on 219 pages for 'tiered storage'.

Page 24/219

  • Silverlight local storage

    - by IMHO
    As you may know, Silverlight has support for local storage. We are looking at creating a Silverlight application that will work in offline mode. This application may require quite a bit of data to be cached on the client side. The obvious solution, local storage with some sort of XML-based structure, won't work due to performance issues, as our proof of concept showed. We are looking at several third-party solutions that implement lightweight database engines on top of Silverlight local storage. If you have solved this problem in the past or have any ideas, I would appreciate some pointers.
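
    For illustration, one way around XML parsing overhead is to cache in a compact binary format on top of isolated storage. A minimal C# sketch, with a hypothetical record layout and file name (not from the question):

        using System.IO;
        using System.IO.IsolatedStorage;

        public class CacheRecord
        {
            public int Id;
            public string Name;
        }

        public static class BinaryCache
        {
            // Save records as length-prefixed binary rather than XML;
            // BinaryReader/BinaryWriter avoid the XML parse cost on reload.
            public static void Save(string fileName, CacheRecord[] records)
            {
                using (IsolatedStorageFile store =
                       IsolatedStorageFile.GetUserStoreForApplication())
                using (IsolatedStorageFileStream stream = store.CreateFile(fileName))
                using (BinaryWriter writer = new BinaryWriter(stream))
                {
                    writer.Write(records.Length);
                    foreach (CacheRecord r in records)
                    {
                        writer.Write(r.Id);
                        writer.Write(r.Name); // strings are length-prefixed
                    }
                }
            }
        }

    Note that Silverlight's default in-browser isolated-storage quota is small (1 MB), so caching "quite a bit of data" would also mean raising the quota with IsolatedStorageFile.IncreaseQuotaTo, which prompts the user.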

    Read the article

  • Is the usage of Isolated Storage in Silverlight 3 a security concern

    - by Prashant
    I am using Silverlight 3 on my website. I have a login page for role-based authentication that routes users with different privileges to different parts of the website. I want to use something analogous to the session variables available in standard ASP.NET applications, and I intend to use Isolated Storage to achieve this. But I am skeptical about security with this option, as Isolated Storage exists on the client side and can be manipulated there. I am new to the Isolated Storage concept and don't know about the security options it provides in terms of encryption, server-side validation, etc. If any of you have used it or are aware of the security implications in this case, could you please shed some light on it? Thanks
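
    For what it's worth, the usual pattern here is to treat isolated storage as untrusted: keep roles and privileges server-side and persist only an opaque, server-issued token. A hedged C# (Silverlight) sketch; the key name and class are illustrative:

        using System.IO.IsolatedStorage;

        public static class SessionToken
        {
            private const string Key = "sessionToken"; // illustrative key name

            // Persist only an opaque token issued by the server after login.
            // Roles stay server-side, so tampering with this value can at
            // worst invalidate the session, never elevate privileges.
            public static void Save(string token)
            {
                IsolatedStorageSettings.ApplicationSettings[Key] = token;
                IsolatedStorageSettings.ApplicationSettings.Save();
            }

            public static string Load()
            {
                string token;
                return IsolatedStorageSettings.ApplicationSettings
                           .TryGetValue(Key, out token) ? token : null;
            }
        }

    Every privileged operation would then re-validate the token on the server, which is where the real security boundary sits.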

    Read the article

  • Google chrome extension: local storage

    - by Rohan
    Hi there, I'm developing an extension for Google Chrome and have run into some trouble. I created an options.html page and added it to the manifest.json file, and the page shows properly. I saved the options and then went back to the page on which the extension is supposed to run. Unfortunately, localStorage for the options was returning null instead of the option value. If I set the localStorage value directly from the extension's JS script it works fine, but not if it was set from the options page. Any idea how I can access the options.html localStorage values from my JavaScript file in the extension? Thanks!

    Read the article

  • Looking for a safe, portable password-storage method

    - by Maciek
    Hello, I'm working on a C++ project that is supposed to run on both Win32 and Linux. The software is to be deployed to small computers, usually working in remote locations. Recently, our client requested that we introduce access control via password protection. We are to meet the following criteria: support remote login; support remote password change; support remote password retrieval; support data retrieval on accidental/purposeful deletion; support secure storage. I'm capable of meeting the "remote" requirements using an existing library, but what I do need to consider is a method of storing this data, preferably one that works on both platforms and will not let the user see or read it. Encryption is not the issue here; it's the storage method itself. Can anyone recommend a safe storage method that could help me meet those criteria?
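
    One common approach to the storage half of this: never store the password itself, only a salted PBKDF2 hash, so the file is useless to a user who reads it. A C# sketch of the technique only (the project above is C++, and the iteration count here is an assumption); note that literal "password retrieval" would then have to become a password reset:

        using System;
        using System.Security.Cryptography;

        public static class PasswordStore
        {
            // Generate a random per-user salt; persist salt + hash, never
            // the password. Verification re-derives and compares.
            public static byte[] NewSalt()
            {
                byte[] salt = new byte[16];
                using (var rng = new RNGCryptoServiceProvider())
                {
                    rng.GetBytes(salt);
                }
                return salt;
            }

            public static byte[] Hash(string password, byte[] salt)
            {
                using (var kdf = new Rfc2898DeriveBytes(password, salt, 10000))
                {
                    return kdf.GetBytes(32); // 256-bit derived key
                }
            }

            public static bool Verify(string attempt, byte[] salt, byte[] stored)
            {
                byte[] candidate = Hash(attempt, salt);
                // A constant-time comparison would be preferable in production.
                return Convert.ToBase64String(candidate) ==
                       Convert.ToBase64String(stored);
            }
        }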

    Read the article

  • Retrieve Storage and Programs Memory on .NET Compact Framework 2 and WM5

    - by wintermute
    Hi! I've been looking for quite a while and still couldn't find a solution for this. All I need is to retrieve the memory levels and percentage of use. OpenNETCF has a MemoryManagement class, which seems to encapsulate a data structure returned through a P/Invoke or something like that, and it gives me TotalPhysicalMemory, TotalVirtualMemory, AvailablePhysicalMemory and such, but those do not directly relate to Storage and Program memory, nor could I find a way to "convert" these attributes into what I need. Has anyone already done this? It must be easy; I just need the very same values shown under Settings > System > Memory. Thanks in advance! Edit: I'm already able to retrieve available and total Storage memory through the GetDiskFreeSpaceEx P/Invoke. Since Storage and Program memory seem to rely on the same hardware, maybe it's just a case of finding out what path to pass as the method's first parameter.
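
    For reference, a hedged sketch of the GetDiskFreeSpaceEx P/Invoke mentioned above, as it typically looks on the Compact Framework (coredll.dll hosts the Win32 API on Windows CE; which path maps to Storage versus Program memory is exactly the open question):

        using System;
        using System.Runtime.InteropServices;

        public static class StorageMemory
        {
            // On Windows CE / WM5 the Win32 API lives in coredll.dll.
            [DllImport("coredll.dll", SetLastError = true, CharSet = CharSet.Unicode)]
            private static extern bool GetDiskFreeSpaceEx(
                string lpDirectoryName,
                out ulong lpFreeBytesAvailableToCaller,
                out ulong lpTotalNumberOfBytes,
                out ulong lpTotalNumberOfFreeBytes);

            // Querying the root "\" typically reads the object store, which
            // is where Storage memory lives on WM5 (an assumption to verify).
            // Program memory would come from GlobalMemoryStatus (also in
            // coredll), not shown here.
            public static bool Query(string path, out ulong free, out ulong total)
            {
                ulong freeToCaller;
                return GetDiskFreeSpaceEx(path, out freeToCaller, out total, out free);
            }
        }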

    Read the article

  • Isolated storage

    - by Costa
    Hi, I am not sure that I understand isolated storage. I read the article http://msdn.microsoft.com/en-us/library/3ak841sy%28VS.80%29.aspx 1) Why don't I just use the App Data folder? 2) From the link above: "With isolated storage, data is always isolated by user and by assembly. Credentials such as the origin or the strong name of the assembly determine assembly identity. Data can also be isolated by application domain, using similar credentials." I can't think of a scenario that makes this feature important. In general, I don't understand the philosophy behind and the need for "isolated storage" that inspired MS to create such a thing. Thanks
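
    One concrete scenario where the isolation pays off: partially trusted code (for example, a ClickOnce or browser-hosted application) is not granted FileIOPermission for arbitrary paths like App Data, yet it can still persist per-user, per-assembly data through isolated storage. A small C# sketch of the scoping (the file name is illustrative):

        using System.IO;
        using System.IO.IsolatedStorage;

        class Scoping
        {
            static void Main()
            {
                // Scoped to this user AND this assembly: two assemblies run
                // by the same user get two separate stores automatically,
                // with no path naming conventions to get wrong and no
                // FileIOPermission needed for an absolute path.
                using (IsolatedStorageFile store =
                       IsolatedStorageFile.GetUserStoreForAssembly())
                using (var stream = new IsolatedStorageFileStream(
                           "settings.txt", FileMode.Create, store))
                using (var writer = new StreamWriter(stream))
                {
                    writer.WriteLine("per-user, per-assembly data");
                }
            }
        }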

    Read the article

  • Storage device not found on ESX4 with AIC-9410

    - by Mads
    I am trying to install ESX 4.0 Update 1 on a Supermicro X7DBR-3 system with an embedded AIC-9410 HBA (this HBA is listed on the HCG with vendor ID 9005 and device ID 041f). All SATA controllers are disabled in the BIOS, and the logical drive shows up in the Adaptec device summary during POST; however, nothing is listed on the Storage Device screen. The HBA itself is listed if I run esxcfg-info, but not if I run esxcfg-scsidevs -a (under ESXi for that last command). Any ideas where I can look next, or what might be wrong?

    Read the article

  • Remote IIS Administration - "Not enough storage available to process this command"

    - by Hainesy
    I'm trying to do remote administration of IIS in C#.NET using the System.Web.Administration tools. Everything works fine on a test server (Windows 2008); however, when I try it against our live server (Windows 2003), it fails with the message: System.Runtime.InteropServices.COMException: Not enough storage is available to process this command. (Exception from HRESULT: 0x80070008) The server itself has plenty of memory free, so I believe this is some kind of memory limit in the RPC itself. http://support.microsoft.com/kb/890425 Is there any way around this?

    Read the article

  • HP USB Storage Format Tool: Error

    - by Srin
    I have a 2 GB Kingston USB drive. It is working fine: I am able to format it and access it from My Computer. But when I try to format the drive using the HP USB Storage Format Tool to load a bootable image onto it, the tool says "device media is write-protected". OS: Windows XP. What could be wrong? Thanks!

    Read the article

  • Alternatives to Remote Storage Service under Windows Server 2008 R2

    - by ObligatoryMoniker
    I am setting up a new Windows Server 2008 R2 file server for our organization. The functionality offered by the Remote Storage Service in previous versions of Windows would meet our need to segment our data, so that we can have different backup schedules for different tiers of data based on how frequently that data is used and updated. What software provides the same or similar functionality for Server 2008 R2?

    Read the article

  • Roundcube connection to storage server failed

    - by sola
    I recently installed Kloxo on a brand-new VPS and set up the mail servers and everything. I am using courier-imap on my VPS and have verified that it is running; however, I cannot for the life of me get into mail with Roundcube. I keep getting the error "connection to storage server failed". Is this an issue with my database? I have tried granting all privileges to the Roundcube user in MySQL and have restarted qmail several times. Any ideas?

    Read the article

  • Storage and bandwidth for a social network

    - by user38141
    I guess I asked a dumb question earlier; I am fairly new at this. I have a social network being built in PHP with MySQL. I was wondering how much bandwidth and storage it would take to give users 500 minutes of streaming video and to let them store photos and videos. Please forgive me, I am not a technology guy and am just doing some research, learning as I go along.
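
    For a back-of-envelope sense of scale (the 500 kbps bitrate here is purely an assumption; real numbers depend entirely on the video encoding), 500 minutes of streaming works out to roughly 1.9 GB of transfer per user, as this C# sketch computes:

        using System;

        class BandwidthEstimate
        {
            static void Main()
            {
                const double minutes = 500;   // streaming allowance per user
                const double kbps = 500;      // assumed average video bitrate
                double bits = minutes * 60 * kbps * 1000;
                double gigabytes = bits / 8 / 1e9;  // bits -> bytes -> GB
                Console.WriteLine("~{0:F2} GB per user", gigabytes); // ~1.88 GB
            }
        }

    Multiply that by the number of active users per month to size bandwidth; storage for uploaded photos and videos is a separate figure, estimated from expected uploads per user.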

    Read the article

  • Can't Mount Databases in the Recovery Storage Group in Exchange 2003

    - by Kyle Brandt
    After restoring some mail stores, mounting them in the Recovery Storage Group errors out. So far I have tried a repair with eseutil (which the log says was successful) and a server reboot. A defrag is currently running on the mail store. I need to go through this again to post the exact error codes, so I will be updating this question. But I thought someone who does this a lot might have a "more times than not..." answer in the meantime.

    Read the article

  • Make a radio-streaming PC pretend to be a mass-storage USB device

    - by monov
    I'm listening to a net radio station on my PC, and I want the sound to go through my boombox because it has nice speakers and a good amp. The boombox has no line-in audio jack that would just play whatever comes over the wire; however, it has a USB port where you can plug in a thumb drive with music. The question: how do I make the PC pretend to be a mass-storage device and dynamically send all received audio data to the boombox over a male-to-male USB cable? Failing that, at least tell me how to do it for local files (rather than streams). OS: Vista

    Read the article

  • Access my router's gateway network?

    - by Danpe
    I have two routers in my place: a main router (connected to the Internet) at 192.168.1.1, and a secondary router (connected to the main router) at 192.168.0.1. I have a network storage device and a few shared directories connected to the main router (network storage: 192.168.1.16). How can I access them from a PC connected to the secondary router? (Home network diagram image omitted.) I currently have Internet access from both the laptop and the main PC, but I want my laptop to reach the storage device and my shared directories. The problem is that my main router always forwards all packets straight to the WAN (Internet).

    Read the article

  • Large RAID 10 vs. small RAID 1

    - by user116399
    The machine will store and serve millions of small files (<15 KB each), and all those files require a total storage space of 400 GB. Considering the exact same SATA hard drive maker and model, in the exact same environment (OS, CPU, RAM, RAID controller, etc.), which one of the setups below would be faster? A) RAID 1 with 2 drives of 2 TB each, giving a total capacity of 2 TB. B) RAID 10 with 4 drives of 2 TB each, giving a total capacity of 4 TB. [EDIT]: I'm aware RAID 10 is faster than RAID 1. But the larger the disk area, at least in theory, the longer seeks and writes should take. So, will the performance gain of RAID 10 be outweighed by the "drag" caused by the larger disk area when seek/write operations happen?
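
    As a rough first-order model (the ~75 IOPS per 7200 rpm SATA spindle is an assumption): RAID 1 serves reads from either mirror but pays every write twice, while RAID 10 stripes across two mirrored pairs and roughly doubles both figures. A C# sketch of the arithmetic:

        using System;

        class RaidIops
        {
            static void Main()
            {
                const double spindleIops = 75; // assumed for a 7200 rpm SATA drive

                // RAID 1 (2 drives): reads can be served by either mirror;
                // every write must hit both drives.
                double raid1Read = 2 * spindleIops;
                double raid1Write = 1 * spindleIops;

                // RAID 10 (4 drives): striping across two mirrored pairs
                // roughly doubles both figures.
                double raid10Read = 4 * spindleIops;
                double raid10Write = 2 * spindleIops;

                Console.WriteLine("RAID 1:  ~{0} read / ~{1} write IOPS", raid1Read, raid1Write);
                Console.WriteLine("RAID 10: ~{0} read / ~{1} write IOPS", raid10Read, raid10Write);
            }
        }

    Seek time is a function of actuator travel rather than raw capacity, and with only 400 GB in use the data can occupy a fraction of the platters; so, to a first approximation, the striping gain would not be expected to be cancelled out by the larger array.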

    Read the article

  • Multiple iSCSI Targets or 1 that's shared?

    - by Joost Verdaasdonk
    On my network I have several types of files I want to store on a SAN: SQL databases and logs, Exchange data, and random files. Now I'm wondering whether I should create one iSCSI target with a large volume, initiate it from one of the servers, and share it so other servers can use it too, or create separate targets so that each server uses its own storage. For the record, the storage could be separated, because the servers aren't using the shared data. One reason I was considering a single store is ease of backup (but perhaps performance could be a problem?). What would be an advisable configuration for these types of data?

    Read the article

  • SQL SERVER – SHRINKFILE and TRUNCATE Log File in SQL Server 2008

    - by pinaldave
    Note: Please read the complete post before taking any action. This blog post discusses SHRINKFILE and TRUNCATE of the log file. An email received from a reader contains the following questionable code:

    "Hi Pinal, If you remember, my manager and I met you at TechEd in Bangalore. We just upgraded to SQL Server 2008. One of our jobs failed because it was using the following code. The error was: Msg 155, Level 15, State 1, Line 1 'TRUNCATE_ONLY' is not a recognized BACKUP option. The code was:

        DBCC SHRINKFILE(TestDbLog, 1)
        BACKUP LOG TestDb WITH TRUNCATE_ONLY
        DBCC SHRINKFILE(TestDbLog, 1)
        GO

    I have modified that code to the following, and it works fine. But do you have any other suggestions at the moment?

        USE [master]
        GO
        ALTER DATABASE [TestDb] SET RECOVERY SIMPLE WITH NO_WAIT
        DBCC SHRINKFILE(TestDbLog, 1)
        ALTER DATABASE [TestDb] SET RECOVERY FULL WITH NO_WAIT
        GO

    The configuration of our server and system is as follows: [removed irrelevant data]"

    An email like this popping up early in the morning is alarming. Because I was dead busy, I had only one minute to reply, so I quickly wrote down the following note. (As I said, it was a one-minute email, so it is not completely accurate.) Here is that quick email, shared with all of you:

    "Hi Mr. DBA [name removed], Thanks for your email. I suggest you stop this practice. There are many issues here, but I would list two major ones. 1) By setting the database to simple recovery, shrinking the file, and then setting it back to full recovery, you are in fact losing your valuable log data and will not be able to restore to a point in time; not only that, you will also not be able to use subsequent log backups, because the log chain is broken. 2) Shrinking a file or a database adds fragmentation. There are a lot of things you can do. First, start taking proper log backups using the following command, instead of truncating the log and losing it frequently:

        BACKUP LOG [TestDb] TO DISK = N'C:\Backup\TestDb.bak'
        GO

    Remove the code that shrinks the file. If you are taking proper log backups, your log file usually (again, usually; special cases are excluded) does not grow very big. There are so many things to add here, but you can call me on my [phone number]. Before you call, for accuracy I suggest you read Paul Randal's two posts here and here, and Brent Ozar's post here. Kind Regards, Pinal Dave"

    I guess this post is very clear to you. Please leave your comments here. As mentioned, this is a very large subject; I have just touched the tip of the iceberg and tried to point to authentic knowledge. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Attach mdf file without ldf file in Database

    - by pinaldave
    Background story: One of my friends recently called and asked me if I had spare time to look at his database and give him performance tuning advice. Because I had some free time to help him out, I said yes. I asked him to send me the details of his database structure and sample data. Since his database is at a very early stage and is small at the moment, he said he would like me to have the complete database. My response was: "Sure! In that case, take a backup of the database and send it to me. I will restore it onto my computer and play with it." He did send me his database; however, his method made me write this quick note here. Instead of taking a full backup of the database and sending it to me, he sent me only the .mdf (primary database file). In fact, I had asked for a complete backup (I wanted to review file groups, files, as well as a few other details). Upon calling my friend, I found that he was not available. So he had left me with only an .mdf file. As I had some extra time, I decided to check out his database structure and get back to him regarding the full backup whenever I could get in touch with him again.

    Technical talk: If the database was shut down gracefully and there was no abrupt shutdown (power outage, pulling the plug on the machine, machine crash, or any other reason), it is possible (though there is no guarantee) to attach the .mdf file alone to the server. Please note that there can be many more reasons for a database not getting attached or restored. In my case, the database had a clean shutdown and there were no complex issues, so I was able to recreate a transaction log file and attach the received .mdf file. There are multiple ways of doing this, and I am listing all of them here. Before using any of them, please consult the domain expert in your company or industry. Also, never attempt this on a live/production server without the presence of a disaster recovery expert.

        USE [master]
        GO
        -- Method 1: I use this method
        EXEC sp_attach_single_file_db @dbname='TestDb',
        @physname=N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf'
        GO
        -- Method 2:
        CREATE DATABASE TestDb ON
        (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf')
        FOR ATTACH_REBUILD_LOG
        GO

    With Method 2, if one or more log files are missing, they are recreated. There is one more method, which I am demonstrating here but have not used myself before. According to Books Online, it will work only if exactly one log file is missing; if more than one log file is involved, all of them must undergo the same procedure.

        -- Method 3:
        CREATE DATABASE TestDb ON
        (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf')
        FOR ATTACH
        GO

    Please read Books Online in depth and consult DR experts before working on a production server. In my case, the above syntax worked fine because the database was clean when it was detached. Feel free to write about your opinions and experiences; it will help the IT community learn more from your suggestions and skills. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, Readers Question, SQL, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Save BIG on Storage - with Oracle Advanced Compression

    - by [email protected]
    Recently, we published a podcast revealing just how much Oracle benefits from its internal use of Oracle Database 11g and Advanced Compression. With hundreds of terabytes and millions of dollars saved, Oracle Advanced Compression is dramatically reducing storage costs and substantially improving efficiency across the company. Now here's your chance: meet the experts, have your questions answered by them, and immediately start using your storage more efficiently. On April 14th, join me for a live webcast with Oracle's Tim Shetler, Vice President of Product Management, and Bill Hodak, Principal Product Manager, to learn how Oracle Advanced Compression can: reduce disk space requirements for all types of data; improve query and storage performance; and lower storage costs throughout the datacenter. Register here!

    Read the article

  • SQL SERVER – Retrieve and Explore Database Backup without Restoring Database – Idera virtual database

    - by pinaldave
    I recently downloaded Idera's SQL virtual database and tested it. There are a few things about this tool that caught my attention.

    My scenario: It is quite common in real life that observing or retrieving older data becomes necessary, even though it has changed as time passed. Our full database backup was 40 GB in size, and restoring it on our production server usually takes around 16 to 22 minutes, depending on the load on the server; this range varies from one server to another as per the configuration of the computer. Some other issues we used to have are the following: when restoring a large 40 GB database, we needed at least that much free space on our production server; and once in a while we even had to make changes in the restored database and use the changed, restored copy for our purposes, making it even more time-consuming.

    My solution: I had heard a lot about Idera's SQL virtual database tool. Right after we started to test it, we found that it really delivers what it promises. Using this software was very easy, and we were able to restore our database from backup in less than 2 minutes, sparing us the usual 16-22 minutes; the whole job was finished in a total of 10 minutes. Another interesting observation is that no additional space is needed for restoring the database: for a complete database restoration, not a single additional MB on the drive is required. We can use the database in the same way as our regular database, with no need for any additional configuration and setup.

    The most relevant points of this product, based on my initial experience:
    - Quick restoration of the database backup
    - No additional space required for database restoration
    - The virtual database has no physical .MDF or .LDF; the restored database is, in fact, the backup file presented as a virtual database
    - DDL and DML queries can be executed against the virtually restored database
    - Regular backup operations can be run against the virtual database, creating a physical .bak file for future use
    - No observed performance degradation on either the original database or the restored virtual database
    - Additional T-SQL queries can be run against the virtual database

    Well, this summarizes my quick review. As I was saying, I am very impressed with the product and plan to explore it more. There are many features in this tool that I think can be very useful if properly understood. I took a few screenshots using my demo database afterwards. I am surprised by its performance, so I want to know exactly how this feature works, specifically why it does not create any additional files and yet still allows updates to the virtually restored database. I guess I will have to send an email to the developers at Idera and try to figure this out with them. I think this tool is very useful, and it delivers a level of performance well beyond what I expected. Soon, I will write a review of additional uses of SQL virtual database. If you are using SQL virtual database in your production environment, I am eager to learn about your experience with it.

    The "virtual" part of virtual database: When I set out to test this software, I thought virtual database had something to do with Hyper-V or virtualization. In fact, a virtual database is a kind of database that shows up in your SQL Server Management Studio without actually being restored or even created. The tool creates a database in SSMS from the backup of that same database; the backup, however, behaves virtually the same way as the original database.

    Potential usage of virtual database: As soon as I described this tool to my teammate, his very first reaction was, "Hey, if we have this, then there is no need for log shipping." I find his comment very interesting, as log shipping is something where logs are moved to another server; here, there are no updates applied to the database from logs, so I would rather compare it with snapshot replication. Whatever a snapshot-replicated database can be used and configured for can similarly be done with a virtual database. I totally believe that we can use it for reporting purposes. Once this database is configured, I think the uses of this tool are unlimited. I will have to spend some more time studying it and will get back to you.

    Screenshot captions (images omitted): virtual database console; hard drive space before virtual database setup; attach full backup screen; backup on hard drive; attach full backup screen with settings; virtual database setup in less than 60 seconds; virtual database setup online; hard drive space after virtual database setup; point-in-time recovery option (timeline view); virtual database summary; no performance difference between regular DB and virtual DB.

    Please note that all SQL Server MVPs get a free license of this software. Reference: Pinal Dave (http://blog.SQLAuthority.com), Idera (virtual database) Filed under: Database, Pinal Dave, SQL, SQL Add-On, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, SQLAuthority News, T SQL, Technology Tagged: Idera

    Read the article

  • SQL SERVER – 2008 – Introduction to Snapshot Database – Restore From Snapshot

    - by pinaldave
    The snapshot database is one of the most interesting concepts I have used in several places recently. Here is a quick definition of the subject from Books Online: A database snapshot is a read-only, static view of a database (the source database). Multiple snapshots can exist on a source database, and they always reside on the same server instance as the database. Each database snapshot is transactionally consistent with the source database as of the moment of the snapshot's creation. A snapshot persists until it is explicitly dropped by the database owner.

    If you do not know how database snapshots work, here is a quick note on the subject; however, please refer to the official description in Books Online for accuracy. A snapshot database is a read-only database created from an original database called the "source database". It operates at the page level. When a snapshot database is created, it is backed by sparse files; in fact, it occupies no space (or very little space) in the operating system. When any data page is modified in the source database, that data page is copied to the snapshot database, making the sparse file grow. When an unmodified data page is read in the snapshot database, the read actually goes to the pages of the original database. In other words, changes that happen in the source database are reflected in the snapshot database.

    Let us see a simple example of a snapshot. In the following exercise we will do a few operations. Please note that this script is for demo purposes only; there are considerations of CPU, disk I/O, and memory, which will be discussed in future posts. We will: create a snapshot; delete data from the original DB; restore data from the snapshot. First, let us create the snapshot database and observe the sparse file details.

        USE master
        GO
        -- Create Regular Database
        CREATE DATABASE RegularDB
        GO
        USE RegularDB
        GO
        -- Populate Regular Database with Sample Table
        CREATE TABLE FirstTable (ID INT, Value VARCHAR(10))
        INSERT INTO FirstTable VALUES(1, 'First');
        INSERT INTO FirstTable VALUES(2, 'Second');
        INSERT INTO FirstTable VALUES(3, 'Third');
        GO
        -- Create Snapshot Database
        CREATE DATABASE SnapshotDB ON
        (Name ='RegularDB', FileName='c:\SSDB.ss1')
        AS SNAPSHOT OF RegularDB;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO

    Now let us delete something from the original DB and check the same details we checked before.

        -- Delete from Regular Database
        DELETE FROM RegularDB.dbo.FirstTable;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO

    When we check the details of the sparse file created by the snapshot database, we find some interesting details, while the details of the regular DB remain the same. This clearly shows that when we delete data from the regular/source DB, the affected data pages are copied to the snapshot database; this is the reason the size of the snapshot DB increases. Now let us take this small exercise to the next level and restore our deleted data from the snapshot DB to the original source DB.

        -- Restore Data from Snapshot Database
        USE master
        GO
        RESTORE DATABASE RegularDB
        FROM DATABASE_SNAPSHOT = 'SnapshotDB';
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Clean up
        DROP DATABASE [SnapshotDB];
        DROP DATABASE [RegularDB];
        GO

    Now let us check the results of the SELECT statements, and we can see that we were successfully able to restore the database from the snapshot database. Clearly, this is a very useful feature if you encounter a business need for it. I would like to request readers to share more details if they are using this feature in their business, and to let me know if you think it can potentially be used to achieve other tasks. The complete script of the aforementioned operations, for easy reference, is as follows:

        USE master
        GO
        -- Create Regular Database
        CREATE DATABASE RegularDB
        GO
        USE RegularDB
        GO
        -- Populate Regular Database with Sample Table
        CREATE TABLE FirstTable (ID INT, Value VARCHAR(10))
        INSERT INTO FirstTable VALUES(1, 'First');
        INSERT INTO FirstTable VALUES(2, 'Second');
        INSERT INTO FirstTable VALUES(3, 'Third');
        GO
        -- Create Snapshot Database
        CREATE DATABASE SnapshotDB ON
        (Name ='RegularDB', FileName='c:\SSDB.ss1')
        AS SNAPSHOT OF RegularDB;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Delete from Regular Database
        DELETE FROM RegularDB.dbo.FirstTable;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Restore Data from Snapshot Database
        USE master
        GO
        RESTORE DATABASE RegularDB
        FROM DATABASE_SNAPSHOT = 'SnapshotDB';
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Clean up
        DROP DATABASE [SnapshotDB];
        DROP DATABASE [RegularDB];
        GO

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article
