Search Results

Search found 32386 results on 1296 pages for 'sql tips and tricks'.


  • Automating deployments with the SQL Compare command line

    - by Jonathan Hickford
    In my previous article, “Five Tips to Get Your Organisation Releasing Software Frequently”, I looked at how teams can automate processes to speed up release frequency. In this post, I’m looking specifically at automating deployments using the SQL Compare command line. SQL Compare compares SQL Server schemas and deploys the differences. It works very effectively in scenarios where only one deployment target is required – source and target databases are specified, compared, and a change script is automatically generated and applied. But if multiple targets exist, and pressure to increase the frequency of releases builds, this solution quickly becomes unwieldy.

    This is where SQL Compare’s command line comes into its own. I’ve put together a PowerShell script that loops through the Servers table and pulls out the server and database; these are then passed to sqlcompare.exe to be used as target parameters. In the example the source database is a scripts folder, a folder structure of scripted-out database objects used by both SQL Source Control and SQL Compare. The script can easily be adapted to use schema snapshots.

        -- Create a DeploymentTargets database and a Servers table
        CREATE DATABASE DeploymentTargets
        GO
        USE DeploymentTargets
        GO
        CREATE TABLE [dbo].[Servers](
            [id] [int] IDENTITY(1,1) NOT NULL,
            [serverName] [nvarchar](50) NULL,
            [environment] [nvarchar](50) NULL,
            [databaseName] [nvarchar](50) NULL,
            CONSTRAINT [PK_Servers] PRIMARY KEY CLUSTERED ([id] ASC)
        )
        GO
        -- Now insert your target server and database details
        INSERT INTO dbo.Servers ( serverName , environment , databaseName) VALUES ( N'myserverinstance' , N'myenvironment1' , N'mydb1')
        INSERT INTO dbo.Servers ( serverName , environment , databaseName) VALUES ( N'myserverinstance' , N'myenvironment2' , N'mydb2')

    Here’s the PowerShell script you can adapt for yourself as well.

        # We're holding the server names and database names that we want to deploy to in a database table.
        # We need to connect to that server to read these details
        $serverName = ""
        $databaseName = "DeploymentTargets"
        $authentication = "Integrated Security=SSPI"
        #$authentication = "User Id=xxx;PWD=xxx" # If you are using database authentication instead of Windows authentication.

        # Path to the scripts folder we want to deploy to the databases
        $scriptsPath = "SimpleTalk"

        # Path to SQLCompare.exe
        $SQLComparePath = "C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe"

        # Create SQL connection string, and connection
        $ServerConnectionString = "Data Source=$serverName;Initial Catalog=$databaseName;$authentication"
        $ServerConnection = new-object system.data.SqlClient.SqlConnection($ServerConnectionString);

        # Create a Dataset to hold the DataTable
        $dataSet = new-object "System.Data.DataSet" "ServerList"

        # Create a query
        $query = "SET NOCOUNT ON;"
        $query += "SELECT serverName, environment, databaseName "
        $query += "FROM dbo.Servers; "

        # Create a DataAdapter to populate the DataSet with the results
        $dataAdapter = new-object "System.Data.SqlClient.SqlDataAdapter" ($query, $ServerConnection)
        $dataAdapter.Fill($dataSet) | Out-Null

        # Close the connection
        $ServerConnection.Close()

        # Populate the DataTable
        $dataTable = new-object "System.Data.DataTable" "Servers"
        $dataTable = $dataSet.Tables[0]

        # For every row in the DataTable
        $dataTable | FOREACH-OBJECT {
            "Server Name: $($_.serverName)"
            "Database Name: $($_.databaseName)"
            "Environment: $($_.environment)"

            # Compare the scripts folder to the database and synchronize the database to match
            # NB. Have set SQL Compare to abort on medium level warnings.
            $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/AbortOnWarnings:Medium") # + @("/sync" ) # Commented out the 'sync' parameter for safety,
            write-host $arguments
            & $SQLComparePath $arguments
            "Exit Code: $LASTEXITCODE"

            # Some interesting variations
            # Check that every database matches a folder.
            # For example this might be a pre-deployment step to validate everything is at the same baseline state.
            # Or a post deployment script to validate the deployment worked.
            # An exit code of 0 means the databases are identical.
            #
            # $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/Assertidentical")

            # Generate a report of the difference between the folder and each database. Generate a SQL update script for each database.
            # For example use this after the above to generate upgrade scripts for each database
            # Examine the warnings and the HTML diff report to understand how the script will change objects
            #
            # $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/ScriptFile:update_$($_.environment+"_"+$_.databaseName).sql", "/report:update_$($_.environment+"_"+$_.databaseName).html" , "/reportType:Interactive", "/showWarnings", "/include:Identical")
        }

    It’s worth noting that the above example generates the deployment scripts dynamically. This approach should be problem-free for the vast majority of changes, but it is still good practice to review and test a pre-generated deployment script prior to deployment. An alternative approach would be to pre-generate a single deployment script using SQL Compare, and run this en masse to multiple targets programmatically using sqlcmd, or using a tool like SQL Multi Script. You can use the /ScriptFile, /report, and /showWarnings flags to generate change scripts, difference reports and any warnings. See the commented-out example in the PowerShell:

        #$arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/ScriptFile:update_$($_.environment+"_"+$_.databaseName).sql", "/report:update_$($_.environment+"_"+$_.databaseName).html" , "/reportType:Interactive", "/showWarnings", "/include:Identical")

    There is a drawback to running a pre-generated deployment script: it assumes that a given database target hasn’t drifted from its expected state. Often there are (rightly or wrongly) many individuals within an organization who have permissions to alter the production database, and changes can therefore be made outside of the prescribed development processes. The consequence is that at deployment time, the applied script has been validated against a target that no longer represents reality. The solution here would be to add a check for drift prior to running the deployment script. This is achieved by using sqlcompare.exe to compare the target against the expected schema snapshot using the /Assertidentical flag. Should this return any differences (sqlcompare.exe exit code 79), a drift report is output instead of executing the deployment script. See the commented-out example.

        # $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/Assertidentical")

    Any checks and processes that should be undertaken prior to a manual deployment should also happen during an automated deployment. You might think about triggering backups prior to deployment – even better, automate the verification of the backup too.

    You can use SQL Compare’s command line interface along with PowerShell to automate multiple actions and checks that you need in your deployment process. Automation is a practical solution where multiple targets and a higher release cadence come into play. As we know, with great power comes great responsibility – responsibility to ensure that the necessary checks are made so deployments remain trouble-free. (The code sample supplied in this post automates the simple dynamic deployment case – if you are considering more advanced automation, e.g. the drift checks, script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have further questions.)

    Read the article

  • Heroku Postgres: A New SQL Database-as-a-Service

    Idera, a Houston-based company known worldwide for its SQL Server solutions in the realms of backup and recovery, performance monitoring, auditing, security, and more, recently announced that it had won five of SQL Server Magazine's 2011 Community Choice Awards. SQL Server Magazine, a publication produced by Penton Media, offers SQL Server users, both beginning and advanced, a host of hands-on information delivered by SQL Server experts. The magazine presented Idera with 2011 Community Choice Awards for five separate products which will only serve to boost the already strong reputation of it...

    Read the article

  • Idera Releases SQL Diagnostic Manager v7.1

    Idera recently beefed up its portfolio of SQL Server management and administration tools with the release of SQL diagnostic manager 7.1. Idera has enhanced SQL diagnostic manager's already impressive set of features in version 7.1 with new additions that should appeal to database administrators. The release is another example of Idera's dedication to growing its portfolio of SQL Server solutions that has allowed the Microsoft Managed Partner to expand its client base to over 10,000 customers worldwide. The highlights of SQL diagnostic manager 7.1's new features begin with an impressive Serve...

    Read the article

  • How to Schedule Backups with SQL Server Express

    - by The Official Microsoft IIS Site
    Microsoft’s SQL Server Express is a fantastic product for anyone needing a relational database on a limited budget. By limited budget I’m talking free. Yes, SQL Server Express is free, but it comes with a few limitations, such as using at most 1 GB of RAM, a 10 GB cap per database, and no SQL Profiler. For low-volume sites that do not need enterprise-level capabilities, this is a compelling solution. Here is a complete SQL Server feature comparison of all the SQL Server...(read more)
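
    Since the Express edition does not ship with SQL Server Agent, a common workaround (sketched below with hypothetical database and path names, not necessarily the approach the linked article lands on) is to put the backup into a plain T-SQL script and schedule it with sqlcmd from Windows Task Scheduler:

        -- backup_mydb.sql: nightly full backup with a date-stamped file name.
        -- Schedule with Task Scheduler, e.g.:
        --   sqlcmd -S .\SQLEXPRESS -E -i "C:\Scripts\backup_mydb.sql"
        DECLARE @file nvarchar(260);
        SET @file = N'C:\Backups\MyDatabase_' + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';

        BACKUP DATABASE MyDatabase
            TO DISK = @file
            WITH INIT, CHECKSUM;

    Old backup files still need to be cleaned up separately, for example with a small PowerShell or forfiles step in the same scheduled task.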

    Read the article

  • Quick Tip - Speed a Slow Restore from the Transaction Log

    - by KKline
    Here's a quick tip for you: During some restore operations on Microsoft SQL Server, the transaction log redo step might be taking an unusually long time. Depending somewhat on the version and edition of SQL Server you've installed, you may be able to increase performance by tuning the read-ahead behavior of the redo operations. To do this, use the MAXTRANSFERSIZE parameter of the RESTORE statement. For example, if you set MAXTRANSFERSIZE=1048576, it'll use 1 MB buffers. If you...(read more)
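
    If you want to try it, the setting is simply an extra option on the RESTORE statement; a minimal sketch with hypothetical file names (test on your own version and edition before relying on it):

        -- Restore using 1 MB transfer buffers instead of the default size.
        RESTORE DATABASE MyDatabase
            FROM DISK = N'C:\Backups\MyDatabase_full.bak'
            WITH NORECOVERY, MAXTRANSFERSIZE = 1048576;

        RESTORE LOG MyDatabase
            FROM DISK = N'C:\Backups\MyDatabase_log.trn'
            WITH RECOVERY, MAXTRANSFERSIZE = 1048576;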

    Read the article

  • Querying Visual Studio project files using T-SQL and Powershell

    - by jamiet
    Earlier today I had a need to get some information out of a Visual Studio project file and in this blog post I’m going to share a couple of ways of going about that because I’m pretty sure I won’t be the only person that ever wants to do this. The specific problem I was trying to solve was finding out how many objects in my database project (i.e. in my .dbproj file) had any warnings suppressed, but the techniques discussed below will work pretty well for any Visual Studio project file because every such file is simply an XML document, hence it can be queried by anything that can query XML documents.

    Ever heard the phrase “when all you’ve got is a hammer everything looks like a nail”? Well that’s me with querying stuff – if I can write SQL then I’m writing SQL. Here’s a little noddy database project I put together for demo purposes: Two views and a stored procedure, nothing fancy. I suppressed warnings for [View1] & [Procedure1] and hence the pertinent part of my project file looks like this:

        <ItemGroup>
          <Build Include="Schema Objects\Schemas\dbo\Views\View1.view.sql">
            <SubType>Code</SubType>
            <SuppressWarnings>4151,3276</SuppressWarnings>
          </Build>
          <Build Include="Schema Objects\Schemas\dbo\Views\View2.view.sql">
            <SubType>Code</SubType>
          </Build>
          <Build Include="Schema Objects\Schemas\dbo\Programmability\Stored Procedures\Procedure1.proc.sql">
            <SubType>Code</SubType>
            <SuppressWarnings>4151</SuppressWarnings>
          </Build>
        </ItemGroup>

    Note the <SuppressWarnings> elements – those are the bits of information that I am after. With a lot of help from folks on the SQL Server XML forum I came up with the following query that nailed what I was after. It reads the contents of the .dbproj file into a variable of type XML and then shreds it using T-SQL’s XML data type methods:

        DECLARE @xml XML;
        SELECT @xml = CAST(pkgblob.BulkColumn AS XML)
        FROM   OPENROWSET(BULK 'C:\temp\QueryingProjectFileDemo\QueryingProjectFileDemo.dbproj' -- <-Change this path!
                          ,single_blob) AS pkgblob
        ;WITH XMLNAMESPACES( 'http://schemas.microsoft.com/developer/msbuild/2003' AS ns)
        SELECT  REVERSE(SUBSTRING(REVERSE(ObjectPath),0,CHARINDEX('\',REVERSE(ObjectPath)))) AS [ObjectName]
               ,[SuppressedWarnings]
        FROM   (
                SELECT  build.query('.') AS [_node]
                ,       build.value('ns:SuppressWarnings[1]','nvarchar(100)') AS [SuppressedWarnings]
                ,       build.value('@Include','nvarchar(1000)') AS [ObjectPath]
                FROM    @xml.nodes('//ns:Build[ns:SuppressWarnings]') AS R(build)
               )q

    And here’s the output: And that’s it – an easy way of discovering which warnings have been suppressed and for which objects in your database projects. I won’t bother going over the code as it is fairly self-explanatory – peruse it at your leisure.

    Once I had the SQL above I figured I’d share it around a little in case it was ever useful to anyone else; hence I’m writing this blog post and I also posted it on the Visual Studio Database Development Tools forum at FYI: Discover which objects have had warnings suppressed. Luckily Kevin Goode saw the thread and he posted a different solution to the same problem, one that uses Powershell. The advantage of Kevin’s Powershell approach is that it is easy to analyse many .dbproj files at the same time. Below is Kevin’s code, which I have tweaked ever so slightly so that it produces the same results as my SQL script (I just want any object that had had a warning suppressed whereas Kevin was querying specifically for warning 4151):

        cd 'C:\Temp\QueryingProjectFileDemo\'
        cls
        $projects = ls -r -i *.dbproj
        Foreach($project in $projects)
        {
            $xml = new-object System.Xml.XmlDocument
            $xml.set_PreserveWhiteSpace( $true )
            $xml.Load($project)
            #$xpath = @{Start="/e:Project/e:ItemGroup/e:Build[e:SuppressWarnings=4151]/@Include"}
            #$xpath = @{Start="/e:Project/e:ItemGroup/e:Build[contains(e:SuppressWarnings,'4151')]/@Include"}
            $xpath = @{Start="/e:Project/e:ItemGroup/e:Build[e:SuppressWarnings]/@Include"}
            $ns = @{ e = "http://schemas.microsoft.com/developer/msbuild/2003" }
            $xml | Select-Xml -XPath $xpath.Start -Namespace $ns | Select -Expand Node | Select -expand Value
        }

    and here’s the output: Nice reusable Powershell and SQL scripts – not bad for an evening’s work. Thank you to Kevin for allowing me to share his code. Don’t forget that these techniques can easily be adapted to query any Visual Studio project file; they’re only XML documents after all! Doubtless many people out there already have code for doing this but nonetheless here is another offering to the great script library in the sky. Have fun! @Jamiet

    Read the article

  • Top 10 Transact-SQL Statements a SQL Server DBA Should Know

    Microsoft SQL Server is a feature-rich database management system with an enormous number of T-SQL commands. With each feature supporting its own list of commands, it can be difficult to remember them all. MAK shares his top 10 T-SQL statements that a DBA should know.

    Read the article

  • SQL Server 2012 RTM Available!

    - by Davide Mauri
    SQL Server 2012 is available for download! http://www.microsoft.com/sqlserver/en/us/default.aspx The Evaluation version is available here: http://www.microsoft.com/download/en/details.aspx?id=29066 and along with the SQL Server 2012 RTM there’s also the Feature Pack available: http://www.microsoft.com/download/en/details.aspx?id=29065 The Feature Pack is full of useful and interesting downloads: some are required by specific features, like the Semantic Language Statistics Database, while others are very good (I would say necessary) downloads if you use certain technologies, like MDS or Data Mining. By the way, the updated Excel Add-in for Data Mining has also been released and is available in the Feature Pack. As if this weren’t enough, the SQL Server Data Tools IDE has also been released in RTM: http://msdn.microsoft.com/en-us/data/hh297027 Remember that SQL Server Data Tools is completely free and can be used with SQL Server 2005 and later. Happy downloading!

    Read the article

  • Easiest way to retrofit retry logic on LINQ to SQL migration to SQL Azure

    - by Pat James
    I have a couple of existing ASP .NET web forms and MVC applications that currently use LINQ to SQL with a SQL Server 2008 Express database on a Windows VPS: one VPS for both IIS and SQL. I am starting to outgrow the VPS's ability to effectively host both SQL and IIS and am getting ready to split them up. I am considering migrating the database to SQL Azure and keeping IIS on the VPS. After doing initial research it sounds like implementing retry logic in the data access layer is a must-do when adopting SQL Azure. I suspect this is even more critical to implement in my situation where IIS will be on a VPS outside of the Azure infrastructure. I am looking for pointers on how to do this with the least effort and impact on my existing code base. Is there a good retry pattern that can be applied once at the LINQ to SQL data access layer, as opposed to having to wrap all of my LINQ to SQL operations in try/catch/wait/retry logic?

    Read the article

  • Delphi Application using COMMIT and ROLLBACK for Multiple SQL Updates

    - by Matt
    Is it possible to use SQL BEGIN TRANSACTION, COMMIT TRANSACTION, and ROLLBACK TRANSACTION when embedding SQL queries into an application that makes multiple calls to SQL Server for table updates? For example, I have the following code:

        Q.SQL.ADD(<UPDATE A RECORD>);
        Q.ExecSQL;
        Q.Close;
        Q.SQL.Clear;
        Q.SQL.ADD(<Select Some Data>);
        Q.Open;
        // Set some variables
        Q.Close;
        Q.SQL.Clear;
        Q.SQL.ADD(<UPDATE A RECORD>);
        Q.ExecSQL;

    What I would like to do is roll back the first update if the second update fails. Is it feasible to give the BEGIN, COMMIT, and ROLLBACK a unique name so as to specify what is being committed or rolled back? i.e. before the first update specify BEGIN TRANSACTION_A, then after the last update specify COMMIT TRANSACTION_A. I hope that makes sense. If I were doing this in a SQL stored procedure I would be able to specify this at the start and end of the procedure, but I have had to break the code down into manageable chunks due to process blocks and deadlocks on a heavily loaded SQL Server.
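
    For what it's worth, a hedged T-SQL sketch of the named-transaction idea (SQL Server 2005 or later, hypothetical table names), assuming both updates are sent over the same connection/session:

        BEGIN TRY
            BEGIN TRANSACTION TRANSACTION_A;

            UPDATE dbo.TableA SET Col1 = 'x' WHERE Id = 1;   -- first update
            -- ... select some data, set variables in the application ...
            UPDATE dbo.TableB SET Col2 = 'y' WHERE Id = 9;   -- second update

            COMMIT TRANSACTION TRANSACTION_A;
        END TRY
        BEGIN CATCH
            -- Any failure after BEGIN TRANSACTION undoes both updates.
            IF @@TRANCOUNT > 0
                ROLLBACK TRANSACTION;
        END CATCH;

    The transaction name is only a label on the outermost BEGIN/COMMIT; what actually ties the statements together is that they run on the same connection, so the same effect can be had by issuing BEGIN TRANSACTION before the first ExecSQL and COMMIT or ROLLBACK after the last one on that connection.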

    Read the article

  • Minimum privileges to read SQL Jobs using SQL SMO

    - by Gustavo Cavalcanti
    I wrote an application that uses SQL SMO to find all SQL Servers, databases, jobs and job outcomes. This application is executed through a scheduled task using a local service account. This service account is local to the application server only and is not present on any SQL Server to be inspected. I am having problems getting information on jobs and job outcomes when connecting to the servers using a user with db_datareader rights on the system tables. If we set the user to be sysadmin on the server it all works fine. My question to you is: what are the minimum privileges a local SQL Server user needs in order to connect to the server and inspect jobs/job outcomes using the SQL SMO API? I connect to each SQL Server by doing the following:

        var conn = new ServerConnection
        {
            LoginSecure = false,
            ApplicationName = "SQL Inspector",
            ServerInstance = serverInstanceName,
            ConnectAsUser = false,
            Login = user,
            Password = password
        };
        var smoServer = new Server(conn);

    I read the jobs by reading smoServer.JobServer.Jobs and read the JobSteps property on each of these jobs. The variable smoServer is of type Microsoft.SqlServer.Management.Smo.Server; user/password are those of the user found on each SQL Server to be inspected. If "user" is sysadmin on the SQL Server to be inspected all works ok, as it also does if we set ConnectAsUser to true and execute the scheduled task using my own credentials, which grant me sysadmin privileges on SQL Server per my Active Directory membership. Thanks!
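
    I can't vouch for an exact documented minimum, but a commonly suggested starting point is to create the user in msdb and add it to the SQLAgentReaderRole fixed database role, which can view all jobs and their history without being able to modify them; a sketch for SQL Server 2005/2008 with a hypothetical login name:

        USE msdb;
        GO
        CREATE USER sql_inspector FOR LOGIN sql_inspector;
        -- SQLAgentReaderRole: read access to all jobs, schedules and job history.
        EXEC sp_addrolemember N'SQLAgentReaderRole', N'sql_inspector';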

    Read the article

  • Good practices - database programming, unit testing

    - by Piotr Rodak
    Jason Brimhal wrote today on his blog that a new book, Defensive Database Programming, written by Alex Kuznetsov (blog), is coming to bookstores. Alex writes about various techniques that make your code safer to run. SQL injection is not the only vulnerability the code may be exposed to. Others include inconsistent search patterns, unsupported character sets, locale settings, issues that may occur under high-concurrency conditions, and logic that breaks when certain conditions are not met. The...(read more)

    Read the article

  • COLUMNS_UPDATED() for audit triggers

    - by Piotr Rodak
    In SQL Server 2005, triggers are pretty much the only option if you want to audit changes to a table. There are many ways you can decide to store the change information. You may decide to store every changed row as a whole, either in a history table or as XML in an audit table. The former case requires a history table with exactly the same schema as the audited table; the latter makes data retrieval and management of the table a bit tricky. Both approaches also suffer from the tendency to consume...(read more)
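
    As a taste of the technique the article covers, here is a hedged sketch of an audit trigger that stores the raw COLUMNS_UPDATED() bitmask and tests one column's bit (hypothetical tables; column ordinals come from sys.columns.column_id):

        CREATE TRIGGER trg_Customer_Audit ON dbo.Customer
        AFTER UPDATE
        AS
        BEGIN
            SET NOCOUNT ON;
            INSERT dbo.CustomerAudit (CustomerId, ChangedAt, ColumnsUpdatedMask, NameChanged)
            SELECT i.CustomerId,
                   GETDATE(),
                   COLUMNS_UPDATED(),
                   -- Column ordinal 2 lives in byte 1 of the mask; its bit value is 2.
                   CASE WHEN SUBSTRING(COLUMNS_UPDATED(), 1, 1) & 2 = 2 THEN 1 ELSE 0 END
            FROM inserted AS i;
        END;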

    Read the article

  • T-SQL Tuesday #31 - Logging Tricks with CONTEXT_INFO

    - by Most Valuable Yak (Rob Volk)
    This month's T-SQL Tuesday is being hosted by Aaron Nelson [b | t], fellow Atlantan (the city in Georgia, not the famous sunken city, or the resort in the Bahamas) and covers the topic of logging (the recording of information, not the harvesting of trees) and maintains the fine T-SQL Tuesday tradition begun by Adam Machanic [b | t] (the SQL Server guru, not the guy who fixes cars, check the spelling again, there will be a quiz later). This is a trick I learned from Fernando Guerrero [b | t] waaaaaay back during the PASS Summit 2004 in sunny, hurricane-infested Orlando, during his session on Secret SQL Server (not sure if that's the correct title, and I haven't used parentheses in this paragraph yet).

    CONTEXT_INFO is a neat little feature that's existed since SQL Server 2000 and perhaps even earlier. It lets you assign data to the current session/connection, and maintains that data until you disconnect or change it. In addition to the CONTEXT_INFO() function, you can also query the context_info column in sys.dm_exec_sessions, or even sysprocesses if you're still running SQL Server 2000, if you need to see it for another session. While you're limited to 128 bytes, one big advantage that CONTEXT_INFO has is that it's independent of any transactions. If you've ever logged to a table in a transaction and then lost messages when it rolled back, you can understand how aggravating it can be. CONTEXT_INFO also survives across multiple SQL batches (GO separators) in the same connection, so for those of you who were going to suggest "just log to a table variable, they don't get rolled back": HA-HA, I GOT YOU! Since GO starts a new batch all variable declarations are lost.

    Here's a simple example I recently used at work. I had to test database mirroring configurations for disaster recovery scenarios and measure the network throughput. I also needed to log how long it took for the script to run and include the mirror settings for the database in question. I decided to use AdventureWorks as my database model, and Adam Machanic's Big Adventure script to provide a fairly large workload that's repeatable and easily scalable. My test would consist of several copies of AdventureWorks running the Big Adventure script while I mirrored the databases (or not). Since Adam's script contains several batches, I decided CONTEXT_INFO would have to be used. As it turns out, I only needed to grab the start time at the beginning; I could get the rest of the data at the end of the process. The code is pretty small:

        declare @time binary(128)=cast(getdate() as binary(8))
        set context_info @time

        ... rest of Big Adventure code ...

        go
        use master;
        insert mirror_test(server,role,partner,db,state,safety,start,duration)
        select @@servername, mirroring_role_desc, mirroring_partner_instance, db_name(database_id),
               mirroring_state_desc, mirroring_safety_level_desc,
               cast(cast(context_info() as binary(8)) as datetime),
               datediff(s,cast(cast(context_info() as binary(8)) as datetime),getdate())
        from sys.database_mirroring
        where db_name(database_id) like 'Adv%';

    I declared @time as a binary(128) since CONTEXT_INFO is defined that way. I couldn't convert GETDATE() to binary(128) as it would pad the first 120 bytes as 0x00. To keep the CAST functions simple and avoid using SUBSTRING, I decided to CAST GETDATE() as binary(8) and let SQL Server do the implicit conversion. It's not the safest way perhaps, but it works on my machine. :)

    As I mentioned earlier, you can query system views for sessions and get their CONTEXT_INFO. With a little boilerplate code this can be used to monitor long-running procedures, in case you need to kill a process, or are just curious how long certain parts take. In this example, I added code to Adam's Big Adventure script to set CONTEXT_INFO messages at strategic places I want to monitor. (His code is in UPPERCASE as it was in the original, mine is all lowercase):

        declare @msg binary(128)
        set @msg=cast('Altering bigProduct.ProductID' as binary(128))
        set context_info @msg
        go
        ALTER TABLE bigProduct ALTER COLUMN ProductID INT NOT NULL
        GO
        set context_info 0x0
        go
        declare @msg1 binary(128)
        set @msg1=cast('Adding pk_bigProduct Constraint' as binary(128))
        set context_info @msg1
        go
        ALTER TABLE bigProduct ADD CONSTRAINT pk_bigProduct PRIMARY KEY (ProductID)
        GO
        set context_info 0x0
        go
        declare @msg2 binary(128)
        set @msg2=cast('Altering bigTransactionHistory.TransactionID' as binary(128))
        set context_info @msg2
        go
        ALTER TABLE bigTransactionHistory ALTER COLUMN TransactionID INT NOT NULL
        GO
        set context_info 0x0
        go
        declare @msg3 binary(128)
        set @msg3=cast('Adding pk_bigTransactionHistory Constraint' as binary(128))
        set context_info @msg3
        go
        ALTER TABLE bigTransactionHistory ADD CONSTRAINT pk_bigTransactionHistory PRIMARY KEY NONCLUSTERED(TransactionID)
        GO
        set context_info 0x0
        go
        declare @msg4 binary(128)
        set @msg4=cast('Creating IX_ProductId_TransactionDate Index' as binary(128))
        set context_info @msg4
        go
        CREATE NONCLUSTERED INDEX IX_ProductId_TransactionDate ON bigTransactionHistory(ProductId,TransactionDate) INCLUDE(Quantity,ActualCost)
        GO
        set context_info 0x0

    This doesn't include the entire script, only those portions that altered a table or created an index. One annoyance is that SET CONTEXT_INFO requires a literal or variable; you can't use an expression. And since GO starts a new batch I need to declare a variable in each one. And of course I have to use CAST because it won't implicitly convert varchar to binary. And even though context_info is a nullable column, you can't SET CONTEXT_INFO NULL, so I have to use SET CONTEXT_INFO 0x0 to clear the message after the statement completes. And if you're thinking of turning this into a UDF, you can't, although a stored procedure would work.

    So what does all this aggravation get you? As the code runs, if I want to see which stage the session is at, I can run the following (assuming SPID 51 is the one I want):

        select CAST(context_info as varchar(128))
        from sys.dm_exec_sessions
        where session_id=51

    Since SQL Server 2005 introduced the new system and dynamic management views (DMVs) there's not as much need for tagging a session with these kinds of messages. You can get the session start time and currently executing statement from them, and neatly presented if you use Adam's sp_whoisactive utility (and you absolutely should be using it). Of course you can always use xp_cmdshell, a CLR function, or some other tricks to log information outside of a SQL transaction. All the same, I've used this trick to monitor long-running reports at a previous job, and I still think CONTEXT_INFO is a great feature, especially if you're still using SQL Server 2000 or want to supplement your instrumentation. If you'd like an exercise, consider adding the system time to the messages in the last example, and an automated job to query and parse it from the system tables. That would let you track how long each statement ran without having to run Profiler. #TSQL2sDay
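
    Taking up that closing exercise, here is an untested sketch of one way to pack the time into the same 128 bytes, keeping the text in the first 120 bytes and the current datetime in the last 8:

        declare @msg binary(128)
        set @msg = cast('Altering bigProduct.ProductID' as binary(120)) + cast(getdate() as binary(8))
        set context_info @msg
        go
        -- ... the statement being monitored ...
        go
        -- from another session (again assuming SPID 51):
        select cast(substring(context_info, 1, 120) as varchar(120))                as current_step,
               cast(cast(substring(context_info, 121, 8) as binary(8)) as datetime) as step_started
        from sys.dm_exec_sessions
        where session_id = 51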

    Read the article

  • How to control order of assignment for new identity column in SQL Server?

    - by alpav
    I have a table with a CreateDate datetime field default(getdate()) that does not have any identity column. I would like to add an identity(1,1) field that reflects the same order for existing records as the CreateDate field (ordering by either column would give the same results). How can I do that? I guess if I create a clustered key on the CreateDate field and then add the identity column it will work (not sure if it's guaranteed); is there a good/better way? I am interested in SQL Server 2005, but I guess the answer will be the same for SQL Server 2008 and SQL Server 2000.
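
    One approach whose ordering behaviour is actually documented is to rebuild the table and repopulate it with INSERT ... SELECT ... ORDER BY, since identity values are guaranteed to be generated in the ORDER BY order; a sketch with hypothetical names:

        -- New copy of the table with the identity column added.
        CREATE TABLE dbo.MyTable_new
        (
            id         int IDENTITY(1,1) NOT NULL,
            CreateDate datetime NOT NULL DEFAULT (GETDATE()),
            Payload    nvarchar(100) NULL   -- ...the remaining columns from dbo.MyTable...
        );

        -- Identity values are assigned in ORDER BY order (the physical insert
        -- order is not guaranteed, but the id/CreateDate correlation is).
        INSERT INTO dbo.MyTable_new (CreateDate, Payload)
        SELECT CreateDate, Payload
        FROM dbo.MyTable
        ORDER BY CreateDate;

        -- Then swap the tables:
        -- DROP TABLE dbo.MyTable;
        -- EXEC sp_rename 'dbo.MyTable_new', 'MyTable';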

    Read the article

  • Where should the partitioning column go in the primary key on SQL Server?

    - by Bialecki
    Using SQL Server 2005 and 2008. I've got a potentially very large table (hundreds of millions of rows) consisting of the following columns:

        CREATE TABLE ( date SMALLDATETIME, id BIGINT, value FLOAT )

    which is being partitioned on the date column in daily partitions. The question, then, is whether the primary key should be on (date, id) or (value, id). I can imagine that SQL Server is smart enough to know that it's already partitioning on date and therefore, if I'm always querying for whole chunks of days, I can have it second in the primary key. Or I can imagine that SQL Server will need that column to be first in the primary key to get the benefit of partitioning. Can anyone lend some insight into which way the table should be keyed?
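
    One constraint worth keeping in mind whichever key order you choose: for the index that enforces the primary key to be aligned with the partition scheme, the partitioning column has to be part of its key, which tends to push you towards (date, id). A sketch with hypothetical names and boundary values:

        CREATE PARTITION FUNCTION pf_daily (SMALLDATETIME)
            AS RANGE RIGHT FOR VALUES ('20100101', '20100102', '20100103');

        CREATE PARTITION SCHEME ps_daily
            AS PARTITION pf_daily ALL TO ([PRIMARY]);

        CREATE TABLE dbo.Measurements
        (
            [date] SMALLDATETIME NOT NULL,
            id     BIGINT        NOT NULL,
            value  FLOAT         NULL,
            -- The partitioning column must be part of the key for an aligned unique index.
            CONSTRAINT PK_Measurements PRIMARY KEY CLUSTERED ([date], id)
        ) ON ps_daily ([date]);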

    Read the article

  • Does anyone have a backup strategy for SQL Azure databases?

    - by Pete
    I'm using SQL Azure on a project and it works great. The problem is that the usual backup features do not exist. I have exported the database a couple of times using SQLAzureMW ( http://sqlazuremw.codeplex.com/ ) but this tool is now choking trying to download the database data with bcp. In any case, it's not as nice a solution as SQL Server backups. Is anyone aware of a commercial or open source tool, or other technique, for making reliable backups of SQL Azure databases? This is really a showstopper.
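
    Not a full answer, but one server-side option SQL Azure does offer is the database copy feature, which produces a transactionally consistent copy on the same or another SQL Azure server and can stand in for a snapshot-style backup; a sketch with hypothetical names, run against the destination server's master database:

        -- Transactionally consistent copy of the source database.
        CREATE DATABASE MyDb_copy_20120115
            AS COPY OF myazureserver.MyDb;

        -- The copy runs asynchronously; check progress from master:
        SELECT name, state_desc
        FROM sys.databases
        WHERE name = 'MyDb_copy_20120115';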

    Read the article

  • What is the Sql Server equivalent for Oracle's DBMS_ASSERT?

    - by dotNetYum
    DBMS_ASSERT is one of the keys to preventing SQL injection attacks in Oracle. I tried a cursory search... is there any SQL Server 2005/2008 equivalent for this functionality? I am looking for a specific implementation that has a counterpart for each of the respective Oracle package members of DBMS_ASSERT: NOOP, SIMPLE_SQL_NAME, QUALIFIED_SQL_NAME, SCHEMA_NAME. I know the best practices for preventing injection... bind variables being one of them. But in this question I am specifically looking for a good way to sanitize input... in scenarios where bind variables were not used. Do you have any specific implementations? Is there a library that actually is a SQL Server port of the Oracle package?
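
    There is no single DBMS_ASSERT package in SQL Server, but the built-in QUOTENAME and PARSENAME functions cover some of the same ground (roughly the enquoting and qualified-name checks) when identifier input has to be concatenated into dynamic SQL; a hedged sketch:

        DECLARE @table sysname, @sql nvarchar(max);
        SET @table = N'Customer]; DROP TABLE dbo.Orders; --';

        -- QUOTENAME escapes the closing bracket, so the injected text stays
        -- inside the delimited identifier instead of terminating it.
        -- (It returns NULL for input over 128 characters; treat NULL as a failed check.)
        SET @sql = N'SELECT COUNT(*) FROM dbo.' + QUOTENAME(@table) + N';';
        PRINT @sql;

        -- PARSENAME splits a dotted name into its parts (NULL if the part is missing),
        -- a rough stand-in for qualified-name validation.
        SELECT PARSENAME(N'AdventureWorks.dbo.Customer', 3) AS database_part,
               PARSENAME(N'AdventureWorks.dbo.Customer', 2) AS schema_part,
               PARSENAME(N'AdventureWorks.dbo.Customer', 1) AS object_part;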

    Read the article

  • Add the Recycle Bin to Start Menu in Windows 7

    - by Matthew Guay
    Have you ever tried to open the Recycle Bin by searching for “recycle bin” in the Start menu search, only to find nothing? Here’s a quick trick that will let you find the Recycle Bin directly from your Windows Start menu search.

    The Start menu search may be the best timesaver ever added to Windows. In fact, we use it so much that it seems painful to manually search for a program when using Windows XP or older versions of Windows. You can easily find files, folders, programs and more through the Start menu search in both Vista and Windows 7. However, one thing you cannot find is the Recycle Bin; if you enter this in the Start menu search it will not find it. Here’s how to add the Recycle Bin to your Start menu search.

    What to do

    To access the Recycle Bin from the Start menu search, we need to add a shortcut to the Start menu. Windows includes a personal Start menu folder, and an All Users Start menu folder which all users on the computer can see. This trick only works in the personal Start menu folder. Open up an Explorer window (simply click the Computer link in the Start menu), click the white part of the address bar, enter the following (substitute your username for your_user_name) and hit Enter.

        C:\Users\your_user_name\AppData\Roaming\Microsoft\Windows\Start Menu

    Now, right-click in the folder, select New, and then click Shortcut. In the location box, enter the following:

        explorer.exe shell:RecycleBinFolder

    When you’ve done this, click Next. Now, enter a name for the shortcut. You can enter Recycle Bin like the standard shortcut, or you could name it something else such as Trash…if that’s easier for you to remember. Click Finish when you’re done.

    By default it will have a folder icon. Let’s switch that to the standard Recycle Bin icon. Right-click on the new shortcut and click Properties. Click Change Icon… Type the following in the “Look for icons in this file:” box, and press the Enter key on your keyboard:

        %SystemRoot%\system32\imageres.dll

    Now, scroll and find the Recycle Bin icon and click Ok. Click Ok in the previous dialog, and now your Recycle Bin shortcut has the correct icon. You can even have multiple shortcuts with different names, so when you searched either Recycle Bin or Trash it would come up in the Start menu. To do that, simply repeat these directions, and enter another name of your choice at the prompt. Here we have both a Recycle Bin and a Trash icon.

    Now, when you enter Recycle Bin (or Trash, depending on what you chose) in your Start menu search, you will see it at the top of your Start menu. Simply press Enter or click on the icon to open the Recycle Bin. This trick will work in Windows Vista too! Simply follow these same directions, and you can add the Recycle Bin to your Vista Start menu and find it via search.

    This is a simple trick, but it may make it much easier for you to open your Recycle Bin directly from your Windows Vista or 7 Start menu search. If you’re using Windows 7, you can also check out our directions on how to Add the Recycle Bin to the Taskbar in Windows 7.

    Read the article

  • OData.org updated, gives clues about future SQL Azure enhancements

    - by jamiet
    The OData website at http://www.odata.org/home has been updated today to provide a much more engaging page than the previous sterile attempt. Moreover, it’s now chock-full of information about the progress of OData, including a blog, a list of products that produce OData feeds plus some live OData feeds that you can hit up today, a list of OData-compliant clients and an FAQ. Most interestingly, SQL Azure is listed as a producer of OData feeds: If you have a SQL Azure database account you can easily expose an OData feed through a simple configuration portal. You can select authenticated or anonymous access and expose different OData views according to permissions granted to the specified SQL Azure database user. A preview of this upcoming SQL Azure service is available at https://www.sqlazurelabs.com/OData.aspx and it enables you to select one of your existing SQL Azure databases and, with a few clicks, turn it into an OData feed. It looks as though SQL Azure will soon be added to the stable of products that natively support OData, good news indeed. @Jamiet Related blog posts: OData gunning for ubiquity across Microsoft products

    Read the article

  • Remove Programs from the Open With Menu in Explorer

    - by Matthew Guay
    Would you like to clean up the Open with menu in Windows Explorer? Here’s how you can remove program entries you don’t want in this menu on any version of Windows.

    Have you ever accidentally opened an mp3 with Notepad, or a zip file with Word? If so, you’re also likely irritated that these programs now show up in the Open with menu in Windows Explorer every time you select one of those files. Whenever you open a file type with a particular program, Windows will add an entry for it to the Open with menu. Usually this is helpful, but it can also clutter up the menu with wrong entries. On our computer, we have tried to open a PDF file with Word and Notepad, neither of which can actually view the PDF itself. Let’s remove these entries.

    To do this, we need to remove the registry entries for these programs. Enter regedit in your Start menu search or in the Run command to open the Registry Editor. Back up your registry first just in case, so you can roll back any changes you make if you accidentally delete the wrong value. Now, browse to the following key:

        HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\FileExts\

    Here you’ll see a list of all the file extensions that are registered on your computer. Browse to the file extension you wish to edit, click the white triangle beside it to see the subfolders, and select OpenWithList. In our test, we want to change the programs associated with PDF files, so we select the OpenWithList folder under .pdf. Notice the names of the programs under the Data column on the right. Right-click the value for the program you don’t want to see in the Open With menu and select Delete. Click Yes at the prompt to confirm that you want to delete this value. Repeat these steps with all the programs you want to remove from this file type’s Open with menu. You can go ahead and remove entries from other file types as well if you wish.

    Once you’ve removed the entries you didn’t want to see, check out the Open with menu in Explorer again. Now it will be much more streamlined and will only show the programs you want to see.

    Conclusion

    This simple trick can help you keep your Open with menu tidy, and only show the programs you want in the list. It can be irritating to accidentally open files in programs that can’t even read them. This trick works in all versions of Windows, including 2000, XP, Vista, and Windows 7.

    Read the article

  • SQL Server 2008 R2 transactional replication over VPN

    - by enashnash
    I'm having difficulty setting up replication over a VPN. I have a SQL Server 2008 R2, Enterprise Edition database on a Windows 2008 R2 server. SQL Server is running on a non-standard port. I have set it up so that it is acting as its own distributor and have configured a publisher on this server. It is set as an updatable transactional publication (yes, this is necessary). On this server, I have Routing and Remote Access enabled in order to be able to establish VPN connections. It is configured with a static IP address pool, of which the first in the range is always assigned to the server. I have assigned a test user a static address within this range (I don't know if this is necessary or not). All clients will be 2008 R2 versions, but could be SQL Express or standalone developer instances of the full product.

    I can establish a VPN connection from the client without problems and can see that the correct IP addresses are allocated. After connecting to the database to test that I can establish a connection, I realised that I needed to be able to connect to the database using the server name rather than an IP address - required for replication - which wouldn't work initially. I created an entry in the hosts file for the server on the client using the NETBIOS name of the server, and now I can connect to the server, from the client, using the SERVER\INSTANCE, PORT syntax, over the VPN. As it is the default instance on the server, I can also connect with simply SERVER, PORT syntax. After all that, I still get the following dreaded error:

        SQL Server replication requires the actual server name to make a connection to the server.
        Connections through a server alias, IP address, or any other alternate name are not supported.
        Specify the actual server name, 'SERVER\INSTANCE'. (Replication.Utilities)

    What have I missed? How do I get this to work? TIA
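
    That error is usually about @@SERVERNAME (the name replication insists on) not matching the name being used to connect, so a common first check, and the usual remedy when the instance's metadata name has drifted, looks like the following sketch (hypothetical names; a SQL Server service restart is needed after the change):

        -- Compare the name replication will demand with the machine's actual name.
        SELECT @@SERVERNAME                 AS servername_metadata,
               SERVERPROPERTY('ServerName') AS actual_instance_name;

        -- If they differ, re-register the instance under its real name:
        EXEC sp_dropserver 'OLDNAME\INSTANCE';
        EXEC sp_addserver  'REALNAME\INSTANCE', 'local';

    On the client side, the hosts-file entry or alias then needs to resolve exactly that name, since any alternate name will keep triggering the same message.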

    Read the article
