Search Results

Search found 28985 results on 1160 pages for 'sql training'.


  • SQL SERVER – CSVExpress and Quick Data Load

    - by pinaldave
    One of the newest ETL tools is CSVexpress.com.  This is a program that uses data integration to quickly load any CSV file into ODBC-compliant databases.  For those of you familiar with databases and how they operate, the question that comes to mind might be what use this program will have in your life. I have written an earlier article on this subject: SQL SERVER – Import CSV into Database – Transferring File Content into a Database Table using CSVexpress. You might know that RDBMSs have built-in support for loading CSV files into tables – but it is not quite as easy as one click of a button.  First of all, most databases have a command-line interface, and you need the file and a configuration script in order to load it.  You also need to know enough to write the script – which for novices can be extremely daunting.  On top of all this, if you work with more than one type of RDBMS, you need to know the ins and outs of uploading and writing scripts for more than one program. So you might begin to see how useful CSVexpress.com can be!  There are many other tools that enable uploading files to a database.  They can be very fancy – some can generate configuration files automatically, others load the data directly.  Again, novices will be able to tell you why these aren’t the most useful programs in the world.  You see, these programs were created with SQL in mind, not for uploading data.  Even if you don’t have large amounts of data to upload, getting the configuration right can be a long process and you will have to check the generated code yourself.  Not exactly “easy to use” for novices. That makes CSVexpress.com one of the best new tools available for everyone – but especially for people who don’t want to learn a lot of new material all at once.  CSVexpress has an easy-to-navigate graphical user interface and no scripting or coding is required.  There are built-in constraints and data validations, and you can configure transforms and reject records right there on the screen.  But the best thing of all – it’s free! That’s right, you can download CSVexpress for free from www.csvexpress.com and start uploading and configuring files almost immediately.  If you’re currently happy with your method of data loading, keep up the good work.  For the rest of us, there’s CSVexpress.com. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology
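    For readers who want to stay with the built-in route the excerpt alludes to, here is a minimal, hedged sketch of loading a CSV file with T-SQL's BULK INSERT. The table dbo.ImportTarget and the file path C:\data\input.csv are hypothetical placeholders; the table's column layout must already match the file.

        CREATE TABLE dbo.ImportTarget (ID INT, Name VARCHAR(100), Amount DECIMAL(10,2));
        BULK INSERT dbo.ImportTarget
        FROM 'C:\data\input.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);  -- FIRSTROW = 2 skips a header row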

    Read the article

  • SQL SERVER – Validating Spatial Object with IsValidDetailed Function

    - by pinaldave
    What do you prefer – an error, or a warning indicating an error may happen along with the reason for it? While writing the previous statement I remembered the movie “Minority Report”. This blog post is not about Minority Report, but I will still sum up its concept in a single statement: “Let us predict the future and prevent the crime which is about to happen.” (Please feel free to correct me if I am wrong about the movie’s concept; I really do not want to hurt your sentiments if you are a dedicated fan.) Let us switch to the SQL Server world. Spatial data types are an interesting concept. I love writing about spatial data types because they allow me to be creative with shapes (just like toddlers). When working with spatial datatypes, everything is fine as long as the spatial object works fine. However, when the spatial object has an issue or is created with invalid coordinates, it used to give a simple error that there was an issue with the object but did not provide much more information. This made it very difficult to debug. If the spatial object was used in a big procedure and that procedure errored out because of the invalid spatial object, it was indeed very difficult to debug. I always wished that more information were provided regarding what the problem with the spatial datatype actually was. SQL Server 2012 has introduced the new function IsValidDetailed(). This function has made my life very easy. In simple words, this function checks whether the spatial object passed to it is valid or not. If it is valid, it returns information saying that it is valid. If the spatial object is not valid, it returns the answer that it is not valid along with the reason. This makes it very easy to debug the issue and make the necessary correction. DECLARE @p GEOMETRY = 'Polygon((2 2, 6 6, 4 2, 2 2))' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'Polygon((2 2, 3 3, 4 4, 5 5, 6 6, 2 2))' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'Polygon((2 2, 4 4, 4 2, 2 3, 2 2))' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'CIRCULARSTRING(2 2, 4 4, 0 0)' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'LINESTRING(2 2, 4 4, 0 0)' SELECT @p.IsValidDetailed() GO Here is the resultset of the above query. You can see some valid objects and some invalid objects. Where the object is invalid, the result also gives the reason along with the error message. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Spatial Database, SQL Spatial
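    As a natural follow-up, once IsValidDetailed() has told you why an object is invalid, the geometry type also exposes a MakeValid() method that attempts to repair it. A minimal sketch, reusing the self-intersecting polygon from the excerpt (the repaired shape MakeValid() produces may differ from what you intended, so always inspect the output):

        DECLARE @p GEOMETRY = 'Polygon((2 2, 4 4, 4 2, 2 3, 2 2))';
        SELECT @p.IsValidDetailed()        AS WhyInvalid,     -- explains the problem
               @p.MakeValid().STAsText()   AS RepairedShape;  -- returns a valid geometry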

    Read the article

  • SQL SERVER – How to easily work with Database Diagrams

    - by Pinal Dave
    Databases are very widely used in the modern world. Regardless of the complexity of a database, each one requires in-depth design. To practice along, please download dbForge Studio now.  The right methodology for designing a database is based on the foundations of data normalization, according to which we should first define the database’s key elements – entities. Afterwards, the attributes of the entities and the relations between them are determined. There is a strong opinion that the process of database design should start with a pencil and a blank sheet of paper. This might look old-fashioned nowadays, because SQL Server provides much wider functionality for designing databases – Database Diagrams. When using SSMS for working with Database Diagrams I realized two things – on the one hand, visualization of a schema allows designing a database more efficiently; on the other, when it came to creating a big schema, some difficulties occurred when designing with SSMS. Alternatives were not long in coming, and dbForge Studio for SQL Server is one of them. Its functions offer more advantages for working with Database Diagrams. For example, unlike SSMS, dbForge Studio supports dragging and dropping several tables at once from the Database Explorer. This is just my opinion, but personally I find this option very useful. Another great thing is that a diagram can be saved as both a graphic file and a special XML file, which, given an identical environment, can easily be opened on another server to continue the work. While working with dbForge Studio, it turned out that it offers a wide set of elements to work with on the diagram. Noteworthy among these elements are containers, which allow aggregating diagram objects into thematic groups. Moreover, you can even place an image directly on the diagram if the diagram design is based on a standard template. Each of the development environments has a different approach to storing a diagram (for example, SSMS stores them on the server side, whereas dbForge Studio stores them in a local file). I haven’t yet found a way to convert existing diagrams from SSMS to dbForge Studio; however, I hope the Devart developers will implement this feature in one of the upcoming releases. All in all, editing Database Diagrams through dbForge Studio was a nice experience and sped up common database design tasks. Download dbForge Studio now. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #007

    - by pinaldave
    Here is a list of selected articles from SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my favorite articles and listed them here with additional notes below each. Let me know which one of the following is your favorite article from memory lane. 2006 Find Stored Procedure Related to Table in Database – Search in All Stored Procedure In 2006 I wrote a small script which helps users find all the Stored Procedures (SPs) that are related to one or more specific tables. This was quite a popular script; however, in SQL Server 2012 the same can be achieved using the new sys.sql_expression_dependencies catalog view. I recently blogged about it in Find Referenced or Referencing Object in SQL Server using sys.sql_expression_dependencies. 2007 SQL SERVER – Versions, CodeNames, Year of Release 1993 – SQL Server 4.21 for Windows NT 1995 – SQL Server 6.0, codenamed SQL95 1996 – SQL Server 6.5, codenamed Hydra 1999 – SQL Server 7.0, codenamed Sphinx 1999 – SQL Server 7.0 OLAP, codenamed Plato 2000 – SQL Server 2000 32-bit, codenamed Shiloh (version 8.0) 2003 – SQL Server 2000 64-bit, codenamed Liberty 2005 – SQL Server 2005, codenamed Yukon (version 9.0) 2008 – SQL Server 2008, codenamed Katmai (version 10.0) 2012 – SQL Server 2012, codenamed Denali (version 11.0) Search String in Stored Procedure Searching for a string in stored procedures is one of the most frequent tasks developers do. They might be searching for a table, a view, or any other detail. I have written a script to do this in SQL Server 2000 and SQL Server 2005. This is a blog post worth bookmarking. There is an alternative way to do the same as well; the example is in the post. 2008 SQL SERVER – Refresh Database Using T-SQL NO! Some questions have a single answer: NO! You may want to read the question in the original blog post. I had a great time saying No! SQL SERVER – Delete Backup History – Cleanup Backup History SQL Server stores the history of every backup taken, forever; the history is stored in the msdb database. Often the older history is no longer required. A stored procedure can be executed with a parameter specifying how much history to keep; in the linked example, 30 is passed to keep a month of history (a minimal sketch of that cleanup appears after this excerpt). 2009 Stored Procedure are Compiled on First Run – SP taking Longer to Run First Time Are stored procedures pre-compiled? Why does a stored procedure take a long time to run the first time? These are very common questions often discussed by developers and DBAs. There is an absolutely definite answer, but the question has been debated forever. There is a misconception that stored procedures are pre-compiled. They are not pre-compiled; they are compiled during the first run, and for every subsequent run the compiled plan is reused. Read the entire article for an example and demonstration. Removing Key Lookup – Seek Predicate – Predicate – An Interesting Observation Related to Datatypes This is one of the most important performance tuning lessons on my blog. I suggest you spend time this weekend reading them, and let me know what you think about the concepts demonstrated in the four-part series. Part 1 | Part 2 | Part 3 | Part 4 The Seek Predicate is the operation that describes the b-tree portion of the Seek. The Predicate is the operation that describes the additional filter using non-key columns. Based on the description, it is very clear that the Seek Predicate is better than the Predicate, as it searches the index, whereas with the Predicate the search is on non-key columns – which implies that the search is on the data in the pages themselves. Policy Based Management – Create, Evaluate and Fix Policies This article covers one of the most spectacular features of SQL Server – Policy-Based Management – and how configuring SQL Server with the policy-based management architecture can make a powerful difference. Policy-based management is loaded with several advantages. It can help you implement various policies for reliable configuration of the system. It also provides additional administration assistance to DBAs and helps them effortlessly manage various SQL Server tasks across the enterprise. 2010 Recycle Error Log – Create New Log file without Server Restart Once I observed a DBA restarting SQL Server when he needed a new error log file. This was both funny and sad at the same time. There is no need to restart the server to create a new log file or recycle the log file. You can run sp_cycle_errorlog and achieve the same result. Get Database Backup History for a Single Database A simple but effective script! Reducing CXPACKET Wait Stats for High Transactional Database The subject is very complex and I have done my best to simplify the concept. In simpler words, when a parallel plan is created for a SQL query, there are multiple threads for a single query. Each thread deals with a different set of the data (or rows). For various reasons, one or more of the threads lag behind, creating the CXPACKET wait stat. Threads that finish first have to wait for the slower threads to finish; the wait experienced by a completed thread is recorded as the CXPACKET wait stat. Information Related to DATETIME and DATETIME2 There is quite a lot of confusion around DATETIME and DATETIME2. DATETIME2 is also one of the underutilized datatypes of SQL Server. In this blog post I have written a follow-up to my earlier datetime series, where I clarify a few of the concepts related to datetime. Difference Between GETDATE and SYSDATETIME Difference Between DATETIME and DATETIME2 – WITH GETDATE Difference Between DATETIME and DATETIME2 2011 Introduction to CUME_DIST – Analytic Functions Introduced in SQL Server 2012 SQL Server 2012 introduces the new analytic function CUME_DIST(). This function provides the cumulative distribution value. It would be very difficult to explain this in words, so I attempt a small example to explain the function. Instead of creating a new table, I use the AdventureWorks sample database, as most developers use it for experiments. Introduction to FIRST_VALUE and LAST_VALUE – Analytic Functions Introduced in SQL Server 2012 SQL Server 2012 introduces the new analytic functions FIRST_VALUE() and LAST_VALUE(). These functions return the first and last value from the list. It would be very difficult to explain this in words, so I attempt to explain them through a brief example. Instead of creating a new table, I use the AdventureWorks sample database, as most developers use it for experiment purposes. OVER clause with FIRST_VALUE and LAST_VALUE – Analytic Functions Introduced in SQL Server 2012 – ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING “Don’t you think there is a bug in your first example, where FIRST_VALUE remains the same but the LAST_VALUE changes on every line? I think the LAST_VALUE should be the highest value in the window or set of results.” Puzzle – Functions FIRST_VALUE and LAST_VALUE with OVER clause and ORDER BY You can see that rows number 2, 3, 4, and 5 have the same SalesOrderID = 43667. The FIRST_VALUE is 78 and the LAST_VALUE is 77. Now, if these functions were working on maximum and minimum values, they should have given 77 and 80 respectively instead of 78 and 77. Also, the value of FIRST_VALUE is greater than the LAST_VALUE of 77. Why? Explain in detail. Introduction to LEAD and LAG – Analytic Functions Introduced in SQL Server 2012 SQL Server 2012 introduces the new analytic functions LEAD() and LAG(). These functions access data from a subsequent row (for LEAD) and a previous row (for LAG) in the same result set without the use of a self-join. It would be very difficult to explain this in words, so I attempt a small example to explain the functions. Instead of creating a new table, I use the AdventureWorks sample database, as most developers use it for experiments. A Real Story of Book Getting ‘Out of Stock’ to A 25% Discount Story Available Our book went out of stock within 48 hours of arriving in stock! We got a call from the online store with a request for more copies within 12 hours. But we had printed only as many as we had sent them; there were no extra copies. We finally talked to the printer to get more copies. However, due to festivals and holidays, the copies could not be shipped to the online retailer for two days. We knew for sure that they were going to be out of the book for 48 hours. This is the story of how we overcame that situation! Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
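    As mentioned in the Delete Backup History item above, here is a minimal sketch of that cleanup. Note that the system procedure msdb.dbo.sp_delete_backuphistory actually takes a cutoff date rather than a number of days, so keeping roughly the last 30 days looks like this (the 30-day window is just an illustrative choice):

        -- remove backup history older than 30 days from msdb
        DECLARE @cutoff DATETIME = DATEADD(DAY, -30, GETDATE());
        EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @cutoff;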

    Read the article

  • SQL SERVER – Free Print Book on SQL Server Joes 2 Pros Kit

    - by pinaldave
    Rick Morelan and I were discussing earlier this month what we could give back to the community. We believe our books have been very successful and very well received by the community. The five books are a journey from novice to expert. The books have changed many lives and helped many get jobs as well as pass the SQL certifications. Rick is from Seattle, USA and I am from Bangalore, India. There is a 12-hour difference between us. We try to do a weekly meeting to catch up on various personal and SQL-related topics. Here is one of our recent conversations. Rick and Pinal Pinal: Good Morning Rick! Rick: Good Morning…err… Good Evening to you – Pinal! Pinal: Hey Rick, did you read the recent email which I sent you – one of our readers is thanking us for writing the Joes 2 Pros series. He wants to dedicate his success to us. Can you believe it? Rick: Yeah, he is very kind, but did you tell him that it is all because of his hard work in learning the subject, and that we had very little contribution to his success? Pinal: Absolutely, I told him the same – I said we just wrote the book but it is he who learned from it and proved himself in his job. It is all him! We were just igniters. Rick: Good response. Pinal: Hey Rick! Are we doing enough for the community? What more can we do? Rick: Hmmm… Let us do something more. Pinal: Remember, once we discussed the idea that if anyone buys our Joes 2 Pros Combo Kit in the next 2 weeks – we will send them SQL Wait Stats for free. What do you say? Rick: I agree! Great idea! Let us do it. Free Giveaway Well, Rick and I liked the idea of doing more. We have decided to give away free SQL Server Wait Stats books to everybody who purchases the Joes 2 Pros Combo Kit between today (Oct 15, 2012) and Oct 26, 2012. This is not a contest or a lucky-winner opportunity. Everybody who participates will qualify for it. Combo Availability USA – Amazon India - Flipkart | Indiaplaza Note 1: The USA kit contains 5 FREE DVDs. The India kit does not contain the 5 DVDs due to legal issues. Note 2: The Indian kit is priced at a special Indian economic price. Qualify for the Free Giveaway You must have purchased our Joes 2 Pros Combo Kit of 5 books between Oct 15, 2012 and Oct 26, 2012. Purchases before Oct 15, 2012 or after Oct 26, 2012 will not qualify for this giveaway. Send your original receipt (email, order details) to the following addresses: “[email protected];[email protected]” with the subject line “Joes 2 Pros Kit Promotion Free Offer”. Do not change the subject line or your email may be missed.  Clearly mention your shipping address with phone number and pin/zip code. Send your receipt before Oct 30, 2012. We will not entertain any conversation after the Oct 30, 2012 cut-off date. The free books will be sent to USA and India addresses only. Availability USA - Amazon | India - Flipkart | Indiaplaza Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLServer, T SQL, Technology

    Read the article

  • SQL SERVER – Enable Identity Insert – Import Export Wizard

    - by pinaldave
    I recently got an email from an old friend who told me that when he tries to execute an SSIS package it fails with some identity error. After some debugging and opening his package, we figured out the following issue. Let us see what kind of setup he had in his package. Source table with an identity column. Destination table with an identity column. The following checkbox was disabled in the Import Export Wizard (as per the image below). What we did was enable the checkbox described above, and that fixed the problem he was having with insertion into the identity column. The reason he was facing this error is that his destination table had the IDENTITY property, which does not allow any insert from the user. The value is automatically generated by the system when new rows are inserted into the table. However, when a user manually tries to insert a value into the column, it stops them and throws an error. Once we enabled the “Enable Identity Insert” checkbox, the values could be inserted into the identity field, and this way the exact identity values from the source database were moved to the destination table. Let me know if this blog post was easy to understand. Reference: Pinal Dave (http://blog.SQLAuthority.com), Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology
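    The wizard checkbox maps to the same switch you can flip in plain T-SQL. A minimal, hedged sketch of what “Enable Identity Insert” does under the covers, using hypothetical dbo.Source and dbo.Destination tables:

        SET IDENTITY_INSERT dbo.Destination ON;      -- allow explicit values in the identity column
        INSERT INTO dbo.Destination (ID, Name)       -- the column list must name the identity column explicitly
        SELECT ID, Name FROM dbo.Source;
        SET IDENTITY_INSERT dbo.Destination OFF;     -- turn it back off when done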

    Read the article

  • SQL SERVER – 2 T-SQL Puzzles and Win USD 50 worth Amazon Gift Card and 25 Other Prizes

    - by pinaldave
    We all love brain teasers and interesting puzzles. Today I decided to come up with 2 interesting puzzles, and the winner of the contest will get a USD 50 Amazon Gift Card. The puzzles are sponsored by NuoDB. Additionally, the first 25 individuals who download NuoDB Beta 8 by midnight Friday, Sept. 21 (EST) will automatically receive a $10 Amazon gift card. Puzzle 1: Why does the following code, when executed in SSMS, display the result as a * (star)? SELECT CAST(634 AS VARCHAR(2)) Puzzle 2: Write the shortest code that produces the result 1 without using any numbers in the SELECT statement. Bonus Q: How many different Operating Systems (OS) does NuoDB support? Click here HINT If you can solve the above puzzles you will be eligible to win the USD 50 Amazon Gift Card. However, you can always enroll yourself for the following bonus prizes, where you have a good chance of winning a USD 10 Amazon Gift Card (if you are among the first 25 individuals in the specified time). Bonus Prizes: The first 25 individuals who download NuoDB Beta 8 by midnight Friday, Sept. 21 (EST) will automatically receive a $10 Amazon gift card. Rules: Please leave an answer in the comments section below. You can resubmit your answer multiple times; the latest entry will be considered valid. The winner will be announced on 1st October. The last day to participate in the puzzle is September 28th, 2012. All valid answers will be kept hidden till September 28th, 2012. Only one winner will get the USD 50 Amazon Gift Card. The first 25 individuals who download NuoDB Beta 8 by midnight Friday, Sept. 21 (EST) will automatically receive a $10 Amazon gift card. The winner will be selected using a random algorithm. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology Tagged: NuoDB

    Read the article

  • SQL SERVER – Demo Script – Keeping CPU Busy

    - by pinaldave
    I recently faced a very interesting situation. During a presentation at an event, I was asked a very famous question: “My CPU is very high all the time, how can I reduce it?” This is a very interesting question, and there are many answers – a single blog post is not enough to do the subject justice. I presented a few scenarios to the person who asked the question. The member of the audience who asked the question came to me afterwards and asked a few detailed questions. To answer him, I quickly wrote a query which simulates high CPU. Here is the script I wrote, which increased CPU from 10% to 80%. I was wondering if there is any similar script which can simulate high CPU usage. If you have one, share it with me and I will publish it with due credit (a small alternative sketch follows this excerpt). Here is my script for the same: USE AdventureWorks GO DECLARE @Flag INT SET @Flag = 1 WHILE(@Flag < 1000) BEGIN ALTER INDEX [PK_SalesOrderDetail_SalesOrderID_SalesOrderDetailID] ON [Sales].[SalesOrderDetail] REBUILD SET @Flag = @Flag + 1 END GO Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
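    In the spirit of the request above, here is one alternative, hedged sketch that burns CPU without rebuilding an index: a tight loop of meaningless math. It only loads a single scheduler (one core), and the loop count is an arbitrary knob to adjust duration; treat it purely as a lab toy, never something to run on a production server.

        DECLARE @i BIGINT = 0, @x FLOAT;
        WHILE @i < 20000000
        BEGIN
            SET @x = SQRT(@i) * LOG(@i + 1);  -- pointless arithmetic, exists only to consume CPU cycles
            SET @i = @i + 1;
        END;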

    Read the article

  • SQL SERVER – Database in RESTORING State for Long Time

    - by Pinal Dave
    A very interesting question I received the other day: “Our database has been in the RESTORING state for a long time. We have already restored all the necessary files. After restoring the files we expected the database to be operational; however, it stays in the restoring mode. Any suggestions?” The question is very common. I sent the user follow-up emails to understand what was actually going on. I realized that after restoring their bak files and log files, their database was in the restoring state because they had not restored the latest log file with the RECOVERY option. As they had completed the whole restore sequence (bak and logs in order), what they really needed was to recover the database from the NORECOVERY state. You can keep restoring log files only while the database is in the NORECOVERY state. Once the database is recovered, it becomes operational and normal database activity can continue. After recovery you cannot restore further logs, as the log chain beyond that point is meaningless for this restore sequence. This is the reason why the database has to be in the NORECOVERY state while it is being restored. There are three different ways to recover the database. 1) Recover the database manually with the following command: RESTORE DATABASE database_name WITH RECOVERY 2) Recover the database with the last log file: RESTORE LOG database_name FROM backup_device WITH RECOVERY 3) Recover the database while restoring the last bak file: RESTORE DATABASE database_name FROM backup_device WITH RECOVERY To understand how the backup and restore timeline works, read Backup Timeline and Understanding of Database Restore Process in Full Recovery Model. (A hedged end-to-end restore sketch follows this excerpt.) Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
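    To make the sequence concrete, here is a minimal sketch of a full restore chain using hypothetical database and file names; every step stays in NORECOVERY except the last, which brings the database online:

        RESTORE DATABASE MyDB FROM DISK = 'C:\backup\MyDB_full.bak' WITH NORECOVERY;
        RESTORE LOG MyDB FROM DISK = 'C:\backup\MyDB_log1.trn' WITH NORECOVERY;
        RESTORE LOG MyDB FROM DISK = 'C:\backup\MyDB_log2.trn' WITH RECOVERY;  -- the last restore recovers the database
        -- if the last file was restored WITH NORECOVERY by mistake, finish with:
        -- RESTORE DATABASE MyDB WITH RECOVERY;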

    Read the article

  • SQL SERVER – Validating If Date is Last Day of the Year, Month or Day

    - by Pinal Dave
    Here is one more question I recently received in an email: “Pinal, is there any ready-made function which will display whether a given date is the last day of the year, month or week? For example, if a date is the last day of the year and the last day of the month, I want to display Last Day of the Year, and if a date is the last date of the month and the last day of the week, I want to display the last day of the week.” Well, it is a very interesting question and there is no such built-in function for this. However, here is the logic I wrote earlier for my personal use, which accomplishes almost the same task. -- Example of Year DECLARE @Day DATETIME SET @Day = '2014-12-31' SELECT CASE WHEN CAST(@Day AS DATE) = CAST(DATEADD(ms,-3,DATEADD(yy,0,DATEADD(yy,DATEDIFF(yy,0,@Day)+1,0))) AS DATE) THEN 'LastDayofYear' WHEN CAST(@Day AS DATE) = CAST(DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,@Day)+1,0)) AS DATE) THEN 'LastDayofMonth' WHEN CAST(@Day AS DATE) = CAST(DATEADD(s,-1,DATEADD(wk, DATEDIFF(wk,0,@Day),0)) AS DATE) THEN 'LastDayofWeek' ELSE 'Day' END GO -- Example of Month DECLARE @Day DATETIME SET @Day = '2014-06-30' SELECT CASE WHEN CAST(@Day AS DATE) = CAST(DATEADD(ms,-3,DATEADD(yy,0,DATEADD(yy,DATEDIFF(yy,0,@Day)+1,0))) AS DATE) THEN 'LastDayofYear' WHEN CAST(@Day AS DATE) = CAST(DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,@Day)+1,0)) AS DATE) THEN 'LastDayofMonth' WHEN CAST(@Day AS DATE) = CAST(DATEADD(s,-1,DATEADD(wk, DATEDIFF(wk,0,@Day),0)) AS DATE) THEN 'LastDayofWeek' ELSE 'Day' END GO -- Example of Week DECLARE @Day DATETIME SET @Day = '2014-05-04' SELECT CASE WHEN CAST(@Day AS DATE) = CAST(DATEADD(ms,-3,DATEADD(yy,0,DATEADD(yy,DATEDIFF(yy,0,@Day)+1,0))) AS DATE) THEN 'LastDayofYear' WHEN CAST(@Day AS DATE) = CAST(DATEADD(s,-1,DATEADD(mm, DATEDIFF(m,0,@Day)+1,0)) AS DATE) THEN 'LastDayofMonth' WHEN CAST(@Day AS DATE) = CAST(DATEADD(s,-1,DATEADD(wk, DATEDIFF(wk,0,@Day),0)) AS DATE) THEN 'LastDayofWeek' ELSE 'Day' END GO Let me know if you know any smarter trick and we can post it here with due credit. (One alternative sketch using a newer built-in function follows this excerpt.) Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL DateTime, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
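    Since the excerpt invites smarter alternatives, here is a hedged sketch that leans on EOMONTH, which was added in SQL Server 2012 (so it is not available on the older versions the original logic targets). The week test below assumes the week ends on Sunday and uses a DATEFIRST-independent trick; adjust it if your week ends on a different day.

        DECLARE @Day DATE = '2014-06-30';
        SELECT CASE
                 WHEN @Day = EOMONTH(@Day) AND MONTH(@Day) = 12 THEN 'LastDayofYear'
                 WHEN @Day = EOMONTH(@Day)                      THEN 'LastDayofMonth'
                 WHEN DATEDIFF(DAY, 0, @Day) % 7 = 6            THEN 'LastDayofWeek'  -- day 0 (1900-01-01) was a Monday, so remainder 6 = Sunday
                 ELSE 'Day'
               END AS DayType;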

    Read the article

  • SQL SERVER – How to Add an Identity Column to a Table in SQL Server

    - by Pinal Dave
    Here is the question I received on the SQLAuthority Fan Page: “How do I add an identity column to a table in SQL Server?” Sometimes the questions are very, very simple but the answer is not easy to find. Scenario 1: If your table does not have an identity column, you can simply add one by executing the following script: ALTER TABLE MyTable ADD ID INT IDENTITY(1,1) NOT NULL Scenario 2: If your table already has a column which you want to convert to an identity column, you can’t do that directly. There is a workaround, which I have discussed in depth in the article Add or Remove Identity Property on Column. (A hedged sketch of one such workaround follows this excerpt.) Scenario 3: If your table already has an identity column and you want to add another identity column for any reason – that is not possible. A table can have only one identity column. If you try to have multiple identity columns in your table, it will give the following error: Msg 2744, Level 16, State 2, Line 2 Multiple identity columns specified for table ‘MyTable‘. Only one identity column per table is allowed. Leave a comment if you have any suggestion. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Identity
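    For Scenario 2, here is a minimal, hedged sketch of one common workaround: add a new identity column, drop the old column, and rename. Note that this variant does not preserve the existing values (the identity generates fresh ones); the linked article covers a value-preserving approach using a new table plus IDENTITY_INSERT. Table and column names are hypothetical.

        ALTER TABLE dbo.MyTable ADD ID_New INT IDENTITY(1,1) NOT NULL;   -- add the replacement identity column
        ALTER TABLE dbo.MyTable DROP COLUMN ID;                          -- drop the original non-identity column
        EXEC sp_rename 'dbo.MyTable.ID_New', 'ID', 'COLUMN';             -- give the new column the old name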

    Read the article

  • SQL SERVER – Select Columns from Stored Procedure Resultset

    - by Pinal Dave
    It is fun to go back to basics often. Here is one classic question: “How do I select columns from a stored procedure’s result set?” Though stored procedures were introduced many years ago, the question about retrieving columns from a stored procedure is still very popular with beginners. Let us see the solution in quick steps. First we will create a sample stored procedure. CREATE PROCEDURE SampleSP AS SELECT 1 AS Col1, 2 AS Col2 UNION SELECT 11, 22 GO Now we will create a table where we will temporarily store the result set of the stored procedure. We will use the INSERT INTO ... EXEC command to retrieve the values and insert them into the temporary table. CREATE TABLE #TempTable (Col1 INT, Col2 INT) GO INSERT INTO #TempTable EXEC SampleSP GO Next we will retrieve our data from the temporary table. SELECT * FROM #TempTable GO Finally we will clean up all the objects which we have created. DROP TABLE #TempTable DROP PROCEDURE SampleSP GO Let me know if you want me to share more such back-to-basics tips. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Stored Procedure, SQL Tips and Tricks, T SQL

    Read the article

  • SQL SERVER – ORDER BY ColumnName vs ORDER BY ColumnNumber

    - by pinaldave
    I strongly favor ORDER BY ColumnName. I read a blog post where the blogger compared the performance of the two SELECT statements and came to the conclusion that there is no harm in using ColumnNumber. Let us first examine the point that there is no performance difference. Run the following two scripts together: USE AdventureWorks GO -- ColumnName (Recommended) SELECT * FROM HumanResources.Department ORDER BY GroupName, Name GO -- ColumnNumber (Strongly Not Recommended) SELECT * FROM HumanResources.Department ORDER BY 3,2 GO If you look at the results and the execution plans, you will see that both queries take the same amount of time. However, this was not the point of this blog post. It is not good enough to stop here. We need to understand the advantages and disadvantages of both methods. Case 1: When Not Using * and Columns are Re-ordered USE AdventureWorks GO -- ColumnName (Recommended) SELECT GroupName, Name, ModifiedDate, DepartmentID FROM HumanResources.Department ORDER BY GroupName, Name GO -- ColumnNumber (Strongly Not Recommended) SELECT GroupName, Name, ModifiedDate, DepartmentID FROM HumanResources.Department ORDER BY 3,2 GO Case 2: When someone changes the schema of the table, affecting column order. I will let you recreate the example for this one. If your development server’s schema is different from the production server’s and you use ColumnNumber, you will get different results on the production server. (A tiny illustration of this risk follows this excerpt.) Summary: When you develop the query it may not be an issue, but as time passes and new columns are added to the SELECT statement or the original table is re-ordered, a query that uses ColumnNumber may start giving you unexpected results and an incorrect ORDER BY. Note that the choice between ORDER BY ColumnName and ORDER BY ColumnNumber should not be made based on performance but on usability and maintainability. It is always recommended to use a proper ORDER BY clause with ColumnName to avoid any confusion. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
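    Here is a tiny, hedged illustration of the Case 2 risk using a hypothetical temp table: the data never changes, but merely re-ordering the select list silently changes what ORDER BY 2 sorts on.

        CREATE TABLE #OrdDemo (ID INT, City VARCHAR(20), Amount INT);
        INSERT INTO #OrdDemo VALUES (1, 'Pune', 500), (2, 'Ahmedabad', 100), (3, 'Bangalore', 300);
        SELECT ID, City, Amount FROM #OrdDemo ORDER BY 2;   -- today column 2 is City, so this sorts by City
        SELECT ID, Amount, City FROM #OrdDemo ORDER BY 2;   -- after a harmless-looking rewrite, column 2 is Amount
        DROP TABLE #OrdDemo;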

    Read the article

  • SQL SERVER – Grouping by Multiple Columns to Single Column as A String

    - by pinaldave
    One of the most common questions I receive in email is how to group data from multiple rows into a comma-separated value in a single row, grouped by another column. I have previously blogged about it in the following two blog posts; however, neither addresses the exact problem below. Comma Separated Values (CSV) from Table Column Comma Separated Values (CSV) from Table Column – Part 2 The question comes in many different formats, but in the following image I demonstrate the same question in simple words. This is the most popular question on my Facebook page as well. (Example) Here is the sample script to build the sample dataset. CREATE TABLE TestTable (ID INT, Col VARCHAR(4)) GO INSERT INTO TestTable (ID, Col) SELECT 1, 'A' UNION ALL SELECT 1, 'B' UNION ALL SELECT 1, 'C' UNION ALL SELECT 2, 'A' UNION ALL SELECT 2, 'B' UNION ALL SELECT 2, 'C' UNION ALL SELECT 2, 'D' UNION ALL SELECT 2, 'E' GO SELECT * FROM TestTable GO Here is the solution which answers the above question. -- Get CSV values SELECT t.ID, STUFF( (SELECT ',' + s.Col FROM TestTable s WHERE s.ID = t.ID FOR XML PATH('')),1,1,'') AS CSV FROM TestTable AS t GROUP BY t.ID GO I hope this is an easy solution. I am going to point to this blog post in the future for all similar questions. Final Clean Up Act -- Clean up DROP TABLE TestTable GO Here is the question back to you – is there any better way to write the above script? Please leave a comment and I will write a separate blog post with due credit. (One newer alternative is sketched after this excerpt.) Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SQL XML
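    For readers on SQL Server 2017 or later (well after this post was written), the built-in STRING_AGG aggregate removes the need for the FOR XML PATH trick entirely. A minimal sketch against the same TestTable:

        -- same CSV-per-ID result using STRING_AGG (SQL Server 2017+)
        SELECT ID,
               STRING_AGG(Col, ',') WITHIN GROUP (ORDER BY Col) AS CSV
        FROM TestTable
        GROUP BY ID;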

    Read the article

  • SQL SERVER – Get Schema Name from Object ID using OBJECT_SCHEMA_NAME

    - by pinaldave
    Sometimes a simple solution has an even simpler alternative, but we often do not practice it because we do not see the value in it or do not find it useful. Well, today’s blog post is also about something which I have seen not practiced much in code. We are so comfortable with the alternative usage that we do not feel like switching how we query the data. I was going over forums and noticed that in one place a user had used the following code to get the schema name from an object ID. USE AdventureWorks2012 GO SELECT s.name AS SchemaName, t.name AS TableName, s.schema_id, t.OBJECT_ID FROM sys.Tables t INNER JOIN sys.schemas s ON s.schema_id = t.schema_id WHERE t.name = OBJECT_NAME(46623209) GO Before I continue, let me say I do not see anything wrong with this script. It is just fine and one of the ways to get the schema name from an object ID. However, I have been using the function OBJECT_SCHEMA_NAME to get the schema name. If I had to write the same code from scratch, I would have written it as follows. SELECT OBJECT_SCHEMA_NAME(46623209) AS SchemaName, t.name AS TableName, t.schema_id, t.OBJECT_ID FROM sys.tables t WHERE t.name = OBJECT_NAME(46623209) GO Now, both of the above scripts give you the exact same result. If you remove the WHERE condition, they will give you information about all the tables of the database. Now the question is which one is better – honestly, it is not about one being better than the other. Use the one which you prefer to use. I prefer the second one as it requires less typing. Let me ask you the same question – which method do you use to get the schema name, and why? (A small round-trip sketch follows this excerpt.) Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL System Table, SQL Tips and Tricks, T SQL, Technology
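    A small, hedged round-trip sketch tying the metadata functions together: look up an object_id by name, then resolve the schema and table name back from that id (the table is one that ships with AdventureWorks; substitute your own).

        DECLARE @oid INT = OBJECT_ID('HumanResources.Department');
        SELECT @oid                     AS ObjectID,
               OBJECT_SCHEMA_NAME(@oid) AS SchemaName,
               OBJECT_NAME(@oid)        AS TableName;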

    Read the article

  • SQL SERVER – Caption the Cartoon Contest – Last 2 Days

    - by pinaldave
    A developer’s life is very interesting; we often want to start the day early at the job so we can go home early. However, that day never comes, as the life of a developer is always about working late hours. If the developer goes to the office early – there is a good chance that his co-workers will come in late. Additionally, I am confident that there will always be something urgent for developers or DBAs to solve right at the time they are ready to go home. This is the life of developers!  Here is the interesting story of a DBA who was about to go home. He had to take his girlfriend to a movie and dinner in 30 minutes. However, his manager asked him to fix the performance-related issues with their production server. In a normal case, he would have had only two choices: a) job or b) girlfriend. Well, our superhero DBA decided to use efficient tools and improve the performance of the production server in merely 30 minutes. When he was done, his manager was absolutely surprised by the efficiency and accuracy of his work. He asked him the following question – Here is the contest – you need to guess the answer our superhero DBA gave. If you guess the answer correctly you may win a Star Wars R2-D2 inflatable remote-controlled device. Additionally, if you download DB Optimizer before Dec 8, 2012 – you will be eligible for a USD 25 Amazon Gift Card (there are 10 such awards in total). Please do not leave comments in this thread – to participate in the contest, please leave a comment on the original contest page. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Parsing SSIS Catalog Messages – Notes from the Field #030

    - by Pinal Dave
    [Note from Pinal]: This is a new episode of Notes from the Field series. SQL Server Integration Service (SSIS) is one of the most key essential part of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications. The tool may also be used to automate maintenance of SQL Server databases and updates to multidimensional cube data. In this episode of the Notes from the Field series I requested SSIS Expert Andy Leonard to discuss one of the most interesting concepts of SSIS Catalog Messages. There are plenty of interesting and useful information captured in the SSIS catalog and we will learn together how to explore the same. The SSIS Catalog captures a lot of cool information by default. Here’s a query I use to parse messages from the catalog.operation_messages table in the SSISDB database, where the logged messages are stored. This query is set up to parse a default message transmitted by the Lookup Transformation. It’s one of my favorite messages in the SSIS log because it gives me excellent information when I’m tuning SSIS data flows. The message reads similar to: Data Flow Task:Information: The Lookup processed 4485 rows in the cache. The processing time was 0.015 seconds. The cache used 1376895 bytes of memory. The query: USE SSISDB GO DECLARE @MessageSourceType INT = 60 DECLARE @StartOfIDString VARCHAR(100) = 'The Lookup processed ' DECLARE @ProcessingTimeString VARCHAR(100) = 'The processing time was ' DECLARE @CacheUsedString VARCHAR(100) = 'The cache used ' DECLARE @StartOfIDSearchString VARCHAR(100) = '%' + @StartOfIDString + '%' DECLARE @ProcessingTimeSearchString VARCHAR(100) = '%' + @ProcessingTimeString + '%' DECLARE @CacheUsedSearchString VARCHAR(100) = '%' + @CacheUsedString + '%' SELECT operation_id , SUBSTRING(MESSAGE, (PATINDEX(@StartOfIDSearchString,MESSAGE) + LEN(@StartOfIDString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@StartOfIDSearchString,MESSAGE) + LEN(@StartOfIDString) + 1)) - (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))) AS LookupRowsCount , SUBSTRING(MESSAGE, (PATINDEX(@ProcessingTimeSearchString,MESSAGE) + LEN(@ProcessingTimeString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@ProcessingTimeSearchString,MESSAGE) + LEN(@ProcessingTimeString) + 1)) - (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1))) AS LookupProcessingTime , CASE WHEN (CONVERT(numeric(3,3),SUBSTRING(MESSAGE, (PATINDEX(@ProcessingTimeSearchString,MESSAGE) + LEN(@ProcessingTimeString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@ProcessingTimeSearchString,MESSAGE) + LEN(@ProcessingTimeString) + 1)) - (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1))))) = 0 THEN 0 ELSE CONVERT(bigint,SUBSTRING(MESSAGE, (PATINDEX(@StartOfIDSearchString,MESSAGE) + LEN(@StartOfIDString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@StartOfIDSearchString,MESSAGE) + LEN(@StartOfIDString) + 1)) - (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1)))) / CONVERT(numeric(3,3),SUBSTRING(MESSAGE, (PATINDEX(@ProcessingTimeSearchString,MESSAGE) + LEN(@ProcessingTimeString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@ProcessingTimeSearchString,MESSAGE) + LEN(@ProcessingTimeString) + 1)) - (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1)))) END AS LookupRowsPerSecond , SUBSTRING(MESSAGE, (PATINDEX(@CacheUsedSearchString,MESSAGE) + LEN(@CacheUsedString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@CacheUsedSearchString,MESSAGE) + LEN(@CacheUsedString) 
+ 1)) - (PATINDEX(@CacheUsedSearchString, MESSAGE) + LEN(@CacheUsedString) + 1))) AS LookupBytesUsed ,CASE WHEN (CONVERT(bigint,SUBSTRING(MESSAGE, (PATINDEX(@StartOfIDSearchString,MESSAGE) + LEN(@StartOfIDString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@StartOfIDSearchString,MESSAGE) + LEN(@StartOfIDString) + 1)) - (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1)))))= 0 THEN 0 ELSE CONVERT(bigint,SUBSTRING(MESSAGE, (PATINDEX(@CacheUsedSearchString,MESSAGE) + LEN(@CacheUsedString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@CacheUsedSearchString,MESSAGE) + LEN(@CacheUsedString) + 1)) - (PATINDEX(@CacheUsedSearchString, MESSAGE) + LEN(@CacheUsedString) + 1)))) / CONVERT(bigint,SUBSTRING(MESSAGE, (PATINDEX(@StartOfIDSearchString,MESSAGE) + LEN(@StartOfIDString) + 1), ((CHARINDEX(' ', MESSAGE, PATINDEX(@StartOfIDSearchString,MESSAGE) + LEN(@StartOfIDString) + 1)) - (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1)))) END AS LookupBytesPerRow FROM [catalog].[operation_messages] WHERE message_source_type = @MessageSourceType AND MESSAGE LIKE @StartOfIDSearchString GO Note that you have to set some parameter values: @MessageSourceType [int] – represents the message source type value from the following results: Value     Description 10           Entry APIs, such as T-SQL and CLR Stored procedures 20           External process used to run package (ISServerExec.exe) 30           Package-level objects 40           Control Flow tasks 50           Control Flow containers 60           Data Flow task 70           Custom execution message Note: Taken from Reza Rad’s (excellent!) helper.MessageSourceType table found here. @StartOfIDString [VarChar(100)] – use this to uniquely identify the message field value you wish to parse. In this case, the string ‘The Lookup processed ‘ identifies all the Lookup Transformation messages I desire to parse. @ProcessingTimeString [VarChar(100)] – this parameter is message-specific. I use this parameter to specifically search the message field value for the beginning of the Lookup Processing Time value. For this execution, I use the string ‘The processing time was ‘. @CacheUsedString [VarChar(100)] – this parameter is also message-specific. I use this parameter to specifically search the message field value for the beginning of the Lookup Cache  Used value. It returns the memory used, in bytes. For this execution, I use the string ‘The cache used ‘. The other parameters are built from variations of the parameters listed above. The query parses the values into text. The string values are converted to numeric values for ratio calculations; LookupRowsPerSecond and LookupBytesPerRow. Since ratios involve division, CASE statements check for denominators that equal 0. Here are the results in an SSMS grid: This is not the only way to retrieve this information. And much of the code lends itself to conversion to functions. If there is interest, I will share the functions in an upcoming post. If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server. Reference: Pinal Dave (http://blog.sqlauthority.com)Filed under: Notes from the Field, PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: SSIS

    Read the article

  • SQL SERVER – Clustered Index and Primary Key – Contest Win Joes 2 Pros Combo (USD 198) – Day 3 of 5

    - by pinaldave
    In August 2011 we ran a contest where every day we gave away one book for an entire month. The contest was extremely successful – lots of people participated and lots of prizes were given away. I have received many questions asking whether we are doing something similar this month. Absolutely – instead of running a month-long contest we are doing something more interesting. We are giving away gifts worth USD 198 every day this week. We are giving away the Joes 2 Pros 5-volume (book) SQL 2008 Development Certification Training Kit every day – one copy in India and one in the USA, two giveaways in total (worth USD 198). All the gifts are sponsored by Koenig Training Solution and Joes 2 Pros. The books are available here Amazon | Flipkart | Indiaplaza How to Win: Read the question. Read the hints. Answer the quiz in the contact form in the following format: Question Answer Name of the country (The contest is open for USA and India residents only) Two winners will be randomly selected and announced on August 20th. Question of the Day: Which of the following datatypes is usually NOT the best choice for a Primary Key and Clustered Index? a) INT b) BIGINT c) GUID d) SMALLINT Query Hints: BIG HINT POST The clustered index is the placement order of a table’s records in memory pages. When you insert new records, each record will be inserted into the memory page in the order it belongs. In the figure below we see another new record (Major Disarray) being inserted, in sequence, between Jonny and Rick. Since there is no room in this memory page, some records will need to shift around. The page split occurs when Irene’s record moves to the second page. Page splits are considered very bad for performance, and there are a number of techniques to reduce, or even eliminate, the risk of page splits. You can create a clustered index on the table on any field you choose. Sometimes SQL Server will create a clustered index for you. Oftentimes the field holding the Primary Key makes a great candidate for the clustered index. Additional Hints: I have previously discussed various concepts from SQL Server Joes 2 Pros Volume 3. SQL Joes 2 Pros Development Series – All about SQL Statistics SQL Joes 2 Pros Development Series – Introduction to Page Split SQL Joes 2 Pros Development Series – The Clustered Index – Simple Understanding SQL Joes 2 Pros Development Series – Geography Data Type – Calculating Distance Between Two Points on the Earth SQL Joes 2 Pros Development Series – Sparse Data and Space Used by Sparse Data SQL Joes 2 Pros Development Series – System and Time Data Types SQL Joes 2 Pros Development Series – Data Row Space Usage and NULL Storage Next Step: Answer the quiz in the contact form in the following format: Question Answer Name of the country (The contest is open for USA and India) Bonus Winner Leave a comment with your favorite article from the “additional hints” section and you may be eligible for a surprise gift. There is no country restriction for this bonus contest. Do mention why you liked the particular blog post, and I will announce the winner along with the main contest. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Beginning New Weekly Series – Memory Lane – #002

    - by pinaldave
    Here is a list of curated articles from SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my favorite articles and listed them here with additional notes below each. Let me know which one of the following is your favorite article from memory lane. 2006 Query to Find ByteSize of All the Tables in Database This was my second blog post, and today I do not remember what the business need was that made me build this query. It was built for SQL Server 2000 and it will not directly run on SQL Server 2005 or later versions now. It measured the byte size of the tables in the database. This can be done in many other ways as well, for example with SP_HELPDB and SP_HELP. I wish to build a similar script for 2005 and later versions. 2007 This week I completed my first year (365 blog posts) and my very first 1 million views. I was pretty excited at the time with this new achievement. SQL SERVER Versions, CodeNames, Year of Release When I started with SQL Server I did not know the correct names for each version and often used to get confused by this. However, as time passed I started to remember all the codenames as well. In this blog post I have not included SQL Server 2012's code name as it was not released at the time. SQL Server 2012's code name is Denali. Here is a question for you – does anyone know the internal code name of SQL Server's next version? Searching String in Stored Procedure I had already started to work with 2005 by this time and was personally converting each of my stored procedures to be SQL Server 2005 compatible. As we were upgrading from SQL Server 2000 to SQL Server 2005, we had to search each of the stored procedures and make sure we removed incompatible code from them. For example, syscolumns of SQL Server 2000 was now being replaced by sys.columns in SQL Server 2005. This stored procedure was pretty helpful at that time. Later on I built a few additional versions of the same stored procedure. Version 1: this version finds the stored procedures related to a table. Version 2: a specific version which works with SQL Server 2005 and later versions. 2008 Clear Drop Down List of Recent Connection From SQL Server Management Studio It happens to all of us: we connect to some remote client server and never have to connect to it again. However, it keeps bothering us that the name shows up in the list all the time. In this blog post I covered a quick tip about how we can remove it. I also wrote a small article about How to Check Database Integrity for all Databases, and there was a funny question from a reader requesting T-SQL code to refresh databases. 2009 Stored Procedure are Compiled on First Run – SP is taking Longer to Run First Time A myth prevails in the industry that stored procedures are pre-compiled and should therefore always run faster. It is not true. Stored procedures are compiled on their very first execution, and that is the reason why they take longer the first time they execute. In this blog post I had a great time discussing this concept. If you do not agree with it, you are welcome to read the blog post. Removing Key Lookup – Seek Predicate – Predicate – An Interesting Observation Related to Datatypes Performance tuning is an interesting concept and a personal favorite of mine. In many blog posts I have described how to do performance tuning and how to improve the performance of queries. In this quick tip I have explained how one can remove the Key Lookup and improve performance. Here are very relevant articles on this subject: Article 1 | Article 2 | Article 3 2010 Recycle Error Log – Create New Log file without a Server Restart During one of my consulting assignments I noticed a DBA restarting the server to create a new log file. This is absolutely not necessary, and restarting the server might have many other negative impacts. There is a common system procedure, sp_cycle_errorlog, which can do the same task efficiently and properly. Have you ever used this SP or feature? (A one-line sketch appears after this excerpt.) Additionally, I had a great time presenting on SQL Server Best Practices at the SharePoint Conference. 2011 SSMS 2012 Reset Keyboard Shortcuts to Default It is very much possible that we mix up various SQL Server shortcuts and at times feel like resetting them to default. In SQL Server 2012 it is not easy to do; there is a process to follow, and I enjoyed blogging about it. Fundamentals of Columnstore Index The columnstore index was introduced in SQL Server 2012 and has been a very popular subject. It increases query speed dramatically and can be an extremely useful feature for data warehousing. However, updating a columnstore index is not as simple as a plain UPDATE statement. Read the detailed blog post about how UPDATE works with a columnstore index. Additionally, you can watch a quick video on this subject. SQL Server 2012 New Features I had decided to explore SQL Server 2012 features last year and went through pretty much every single concept introduced, in separate blog posts. Here are a few blog posts where I describe how SQL Server 2012 functions work. Introduction to CUME_DIST – Analytic Functions Introduction to FIRST_VALUE and LAST_VALUE – Analytic Functions OVER clause with FIRST_VALUE and LAST_VALUE – Analytic Functions I indeed enjoyed writing about SQL Server 2012 functions last year. Have you gone through all the new features introduced in SQL Server 2012? If not, it is still not too late to go through them. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
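    For completeness, the error-log recycling mentioned above is a single call; a hedged sketch (it needs appropriate server-level permissions, and it simply closes the current error log and starts a new one):

        EXEC sp_cycle_errorlog;   -- starts a new SQL Server error log without restarting the service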

    Read the article

  • SQL SERVER – CTE can be Updated

    - by Pinal Dave
    Today I received a fantastic email from Matthew Spieth, a SQL Server expert from Ohio. He recently had a great conversation with his colleagues in the office and wanted to make sure that everybody who reads this blog knows about this little feature, which is commonly misunderstood. We will start our story with Matthew’s own statement: “Users often confuse CTE with Temp Table but technically they both are different, CTE are like Views and they can be updated just like views.“ A very true statement from Matthew. I totally agree with what he is saying. Just like him, I have often come across situations where developers think a CTE is like a temp table. When you update a temp table, the change remains in the scope of the temp table and does not propagate to the table from which the temp table was built. However, this is not the case with a CTE: when you update a CTE, it updates the underlying table, just like a view does. Here is the working example built by Matthew to illustrate this behavior. Check the value in the base table first. USE AdventureWorks2012; -- Check the value in the base table before the update SELECT Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738'; Now let us build a CTE with the same data. ;WITH CTEUpd(ProductID, Name, ProductNumber, Color) AS( SELECT ProductID, Name, ProductNumber, Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738') Now let us update the CTE with the following code. -- Update CTE UPDATE CTEUpd SET Color = 'Rainbow'; Now let us check the base table on which the CTE was built. -- Check - The value in the base table is updated SELECT Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738'; That’s it! You can update a CTE and it will update the base table. Here is the script which you should execute all together. USE AdventureWorks2012; -- Check the value in the base table before the update SELECT Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738'; -- Build CTE ;WITH CTEUpd(ProductID, Name, ProductNumber, Color) AS( SELECT ProductID, Name, ProductNumber, Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738') -- Update CTE UPDATE CTEUpd SET Color = 'Rainbow'; -- Check - The value in the base table is updated SELECT Color FROM [Production].[Product] WHERE ProductNumber = 'CA-6738'; If you are aware of more such scenarios, do let me know and I will post them on my blog with due credit to you. (A related sketch of a popular use of updatable CTEs follows this excerpt.) Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL View, T SQL Tagged: CTE
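    A popular practical use of the fact that CTEs are updatable is deleting duplicate rows through the CTE. A hedged sketch against a hypothetical dbo.Orders table that has accumulated duplicates on (CustomerID, OrderDate):

        ;WITH Dups AS (
            SELECT ROW_NUMBER() OVER (PARTITION BY CustomerID, OrderDate
                                      ORDER BY OrderID) AS rn
            FROM dbo.Orders
        )
        DELETE FROM Dups WHERE rn > 1;   -- the DELETE flows through the CTE to dbo.Orders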

    Read the article

  • SQL SERVER – ColumnStore Index – Batch Mode vs Row Mode

    - by pinaldave
    What do you do when you are in a hurry and hear someone say something you do not agree with or which is simply wrong? Well, let me tell you what I do, or rather what I recently did. I was walking by and heard someone mentioning "Columnstore Indexes are really great as they are using Batch Mode which makes them seriously fast." While I was passing by and heard this statement, my first reaction was that Columnstore Indexes can use both – Batch Mode and Row Mode. I stopped by, even though I was in a hurry, and asked the person if he meant that Columnstore Indexes are seriously fast because they use Batch Mode all the time, or whether Batch Mode is just one of the reasons for a Columnstore Index to be faster. He responded that Columnstore Indexes can run only in Batch Mode. However, I do not like to confront anybody without hearing their complete story. Honestly, I like to share information and avoid confrontation as much as possible. There are always ways to communicate the same thing positively. Well, this is what I did: I quickly pulled up my earlier article on Columnstore Index and copied the script into SQL Server Management Studio. I created two versions of the script: 1) Very Large Table 2) Reasonably Small Table. I ran a query which uses the columnstore index on both of the versions and found the results of my tests very interesting. I saved my tests and sent them to the person who had claimed that Columnstore Indexes use Batch Mode only. He immediately acknowledged that he was indeed incorrect in saying that a Columnstore Index uses only Batch Mode. What really caught my attention is that he also thanked me for sending him a detailed email instead of just having an argument while we both stood in the corridor with no way to prove any theory. Here are the screenshots of both the scenarios. 1) Columnstore Index using Batch Mode 2) Columnstore Index using Row Mode Here is the logic behind when a Columnstore Index uses Batch Mode and when it uses Row Mode. A batch typically represents about 1000 rows of data. Batch mode processing uses algorithms that are optimized for multicore CPUs and increased memory throughput. Batch mode processing spreads metadata access costs and overhead over all the rows in a batch. Batch mode processing operates on compressed data when possible, leading to superior performance. Here is one last point – a Columnstore Index can use Batch Mode or Row Mode, but Batch Mode processing is only available with a Columnstore Index. I hope this statement truly sums up the whole concept. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Index, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
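    If you want to check the execution mode on your own server, here is a minimal sketch under assumed names (dbo.SalesFact and its columns are hypothetical, not from the original scripts). Run the query with the actual execution plan enabled and look at the Actual Execution Mode property of the Columnstore Index Scan operator; on a small or empty table you will typically see Row, on a large table Batch.

    -- Hypothetical demo table; substitute your own large fact table
    CREATE TABLE dbo.SalesFact (SalesID INT, OrderDate DATE, SalesAmount MONEY);
    GO
    -- Nonclustered columnstore index (SQL Server 2012 syntax; the table becomes read-only)
    CREATE NONCLUSTERED COLUMNSTORE INDEX IX_SalesFact_CS
    ON dbo.SalesFact (SalesID, OrderDate, SalesAmount);
    GO
    -- Enable "Include Actual Execution Plan" in SSMS, then run:
    SELECT OrderDate, SUM(SalesAmount) AS TotalSales
    FROM dbo.SalesFact
    GROUP BY OrderDate;
    -- In the plan, the Columnstore Index Scan operator's
    -- Actual Execution Mode property shows Batch or Row.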

    Read the article

  • SQL Contest – Win USD 300 Worth Gift – Cartoon Contest is Back

    - by pinaldave
    There are two excellent contests and we have lots of winning to do this year end. 1) Win USD 25 Amazon Gift Cards (10 Units) This is very simple, you just have to download SQL Server DB Optimizer. That's it! There are only two conditions: You must have a valid email address, as the USD 25 Amazon Gift Card will be sent to that address. Download DB Optimizer between today and Dec 8, 2012. Link to Download DB Optimizer. Every day for the next 10 days, one winner will be notified about winning a USD 25 Amazon Gift Card. 2) Win Star Wars R2-D2 Inflatable R/C This is the coolest thing to win. I personally want one, but as I am running the contest, I can't participate. You get this cool Remote Controlled Device – you just have to answer the following cartoon contest. Read the complete story and think what answer the smart employee would provide. There are only two conditions: Leave your answer in the comment area of this blog post (every comment will be hidden till Dec 8, 2012). Please leave your answer in the comment area between today and Dec 8, 2012. Remember you can participate as many times as you want. Make sure that your answer is correct and creative. The most creative answer will be selected. The decision of the contest owner will be final. We may have runner-up prizes, but for the moment let us try to win R2-D2. Here is the cool video of R2-D2. Now here is the cartoon story; please follow the story and complete the very last cartoon template. Your answer should be correct and should be creative. However, the ideal answer will not be longer than one or two sentences. Hint: (Hint) Well, leave your answer in the comment area of this blog post. If you do not win R2-D2, trust me, there are chances you may win a surprise gift from me. Remember your answer should be correct and should be creative. However, the ideal answer will not be longer than one or two sentences. The last day to participate in both of the contests is Dec 8, 2012. We will announce the winners in the week of December 10. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • SQL SERVER – Identify Numbers of Non Clustered Index on Tables for Entire Database

    - by pinaldave
    Here is the script which will give you the number of non clustered indexes on each table in the entire database. SELECT COUNT(i.TYPE) NoOfIndex, [schema_name] = s.name, table_name = o.name FROM sys.indexes i INNER JOIN sys.objects o ON i.[object_id] = o.[object_id] INNER JOIN sys.schemas s ON o.[schema_id] = s.[schema_id] WHERE o.TYPE IN ('U') AND i.TYPE = 2 GROUP BY s.name, o.name ORDER BY schema_name, table_name Here is the small story behind why this script was needed. I recently went to meet my friend in his office and he introduced me to his colleague as someone who is an expert in SQL Server indexing. I politely said I am still learning about indexing and have a long way to go. My friend's colleague right away said he had a question for me related to indexes. He was looking for a script which would count all the non clustered indexes on all the tables in the database, and he was not able to find it on SQLAuthority.com. I was a bit surprised, as I really do not remember all the details about what I have written so far. I quickly pulled up my phone and tried to look for the script on my custom search engine, and he was correct – I had never written a script which counts all the non clustered indexes on the tables in the whole database. Excessive indexing is not recommended in general. If you have too many indexes it will definitely negatively affect your performance. The above query will quickly give you details of the number of indexes on the tables in your entire database. You can quickly glance at the numbers and use them as a reference. Please note that the number of indexes is not an indication of bad indexing. There is a lot of wisdom I can write here but that is not the scope of this blog post. There are many different rules with indexes and many different scenarios. For example – a table which is a heap (no clustered index) is often not recommended for an OLTP workload (here is the blog post to identify them), drop unused indexes with careful observation (here is the script for it), identify missing indexes and add them after careful testing (here is the script for it). Even though I have given a few links here it is just the tip of the iceberg. If you follow only the above four pieces of advice your ship may still sink. Those who want to learn the subject in depth can watch the videos here after logging in. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Index, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
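    As a small variation on the script above – just a sketch of my own, under the assumption that you also want the opposite view – you can flip the logic to list user tables that have no non clustered index at all, which is often an interesting list to review alongside the counts.

    -- User tables with zero non clustered indexes (sketch; adjust filters as needed)
    SELECT [schema_name] = s.name, table_name = o.name
    FROM sys.objects o
    INNER JOIN sys.schemas s ON o.[schema_id] = s.[schema_id]
    WHERE o.TYPE = 'U'
    AND NOT EXISTS (SELECT 1 FROM sys.indexes i
                    WHERE i.[object_id] = o.[object_id] AND i.TYPE = 2)
    ORDER BY schema_name, table_name;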

    Read the article

  • SQLAuthority News – New Book Released – SQL Server Interview Questions And Answers

    - by pinaldave
    Two days ago, on the birthday of my blog, I asked a simple question – Guess! What is in this box? I have received lots of interesting comments on the blog about what is in it. Many of you got it absolutely incorrect and many got close to the right answer, but no one got it 100% correct. Well, no issue at all; I am going to give away the prize to whoever had the closest answer first, via personal email. Here is the answer to the question about what is in the box. Here it is – the box has my new book. In fact, I should say our new book, as I co-authored this book with my very good friend Vinod Kumar. We had a real blast writing this book together and had lots of interesting conversations while we were writing it. This book has one simple goal – "master the basics." This book is not only for people who are preparing for interviews. This book is for everyone who wants to revisit the basics and prepare themselves for the technology. One always needs practical knowledge to do their duty efficiently. This book talks about more than basics. There are multiple ways to present learning – either we can create a simple book or we can make it interesting. We decided the learning should be interactive and opted for the Interview Questions and Answers format. Here is a quick interview which we did together. Details of the book are here. The core concept of this book will continue to evolve over time. I am sure many of you will come along with us on this journey and submit your suggestions to us to make this book a key reference for anybody who wants to start with SQL Server. Today we want to acknowledge the fact that you will help us keep this book alive forever with the latest updates. We want to thank everyone who participates in this journey with us. You can get the book from [Amazon] | [Flipkart]. Read Vinod's blog post. Do not forget to wish him a happy birthday, as today is his birthday and also the book release day – two reasons to congratulate him. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Best Practices, Data Warehousing, Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL Interview Questions and Answers, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • SQL SERVER – Order By Numeric Values Formatted as String

    - by pinaldave
    When I was writing this blog post I had a hard time coming up with the title, so I did my best to come up with one. Here is the reason why. I wrote a blog post earlier, SQL SERVER – Find First Non-Numeric Character from String. One of the questions was how that blog post can be useful in a real life scenario. This blog post is the answer to that question. Let us first see the problem. We have a table which has a column containing alphanumeric data. The data always has an integer as the first part and a string as the later part. The business need is to order the data based on the first part of the alphanumeric data, which is an integer. Now the problem is that no matter how we use ORDER BY, the result is not produced as expected. Let us understand this with an example. Prepare the sample data: -- How to find the first non numeric character USE tempdb GO CREATE TABLE MyTable (ID INT, Col1 VARCHAR(100)) GO INSERT INTO MyTable (ID, Col1) SELECT 1, '1one' UNION ALL SELECT 2, '11eleven' UNION ALL SELECT 3, '2two' UNION ALL SELECT 4, '22twentytwo' UNION ALL SELECT 5, '111oneeleven' GO -- Select Data SELECT * FROM MyTable GO The above query will give the following result set. Now let us use ORDER BY Col1 and observe the result along with the original SELECT. -- Select Data SELECT * FROM MyTable GO -- Select Data SELECT * FROM MyTable ORDER BY Col1 GO The result is not as expected. We need the result in the following format. Here is a good example of how we can use PATINDEX. -- Use of PATINDEX SELECT ID, LEFT(Col1,PATINDEX('%[^0-9]%',Col1)-1) 'Numeric Character', Col1 'Original Character' FROM MyTable ORDER BY LEFT(Col1,PATINDEX('%[^0-9]%',Col1)-1) GO We can use PATINDEX to identify the length of the digit part of the alphanumeric string (remember: our string always has an integer as its first part; it will not work in any other scenario). Now you can use the LEFT function to extract the integer portion from the alphanumeric string and order the data according to it. You can easily clean up the script by dropping the following table. DROP TABLE MyTable GO Here is the complete script so you can easily refer to it. -- How to find the first non numeric character USE tempdb GO CREATE TABLE MyTable (ID INT, Col1 VARCHAR(100)) GO INSERT INTO MyTable (ID, Col1) SELECT 1, '1one' UNION ALL SELECT 2, '11eleven' UNION ALL SELECT 3, '2two' UNION ALL SELECT 4, '22twentytwo' UNION ALL SELECT 5, '111oneeleven' GO -- Select Data SELECT * FROM MyTable GO -- Select Data SELECT * FROM MyTable ORDER BY Col1 GO -- Use of PATINDEX SELECT ID, Col1 'Original Character' FROM MyTable ORDER BY LEFT(Col1,PATINDEX('%[^0-9]%',Col1)-1) GO DROP TABLE MyTable GO Well, isn't it an interesting solution? Any suggestions for a better solution? Additionally, any suggestions for changing the title of this blog post? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL String, SQL Tips and Tricks, T SQL, Technology
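    As one hedged suggestion in response to the closing question – this is only a variation of mine, not part of the original script: if the business need is true numeric ordering (2 before 11 and 111), cast the extracted prefix to INT, since ordering by the LEFT() result alone compares the prefixes as strings.

    -- Order by the integer prefix as a number rather than as a string
    -- (run this before the DROP TABLE MyTable step above)
    SELECT ID, Col1 'Original Character'
    FROM MyTable
    ORDER BY CAST(LEFT(Col1, PATINDEX('%[^0-9]%', Col1) - 1) AS INT)
    GO
    -- String ordering of the prefixes:  1, 11, 111, 2, 22
    -- Numeric ordering of the prefixes: 1, 2, 11, 22, 111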

    Read the article

< Previous Page | 16 17 18 19 20 21 22 23 24 25 26 27  | Next Page >