Search Results

Search found 79588 results on 3184 pages for 'sql data storage'.


  • SQLAuthority News – Various Microsoft SQL Server Documentations Available for Download

    - by pinaldave
Microsoft has recently released various SQL Server related documentation, and I have listed it here for quick reference. Microsoft SQL Server Protocol Documentation The Microsoft SQL Server protocol documentation provides technical specifications for Microsoft proprietary protocols that are implemented and used in Microsoft SQL Server 2008. SQL Server Data Portability Documentation The SQL Server data portability documentation explains various mechanisms by which user-created data in SQL Server can be extracted for use in other software products. These mechanisms include import/export functionality, documented APIs, industry standard formats, and documented data structures/file formats. SQL Server Standards Support Documentation The SQL Server standards support documentation provides detailed support information for certain standards that are implemented in Microsoft SQL Server. Microsoft Product Support Reports Download the scripted system configuration gathering tools. The Microsoft Product Support Reports utility facilitates the gathering of critical system and logging information used in troubleshooting support issues. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

  • SQL SERVER – Stored Procedure and Transactions

    - by pinaldave
I just overheard the following statement – “I do not use Transactions in SQL as I use Stored Procedures.” I just realized that there are many misconceptions about this subject. Transactions have nothing to do with Stored Procedures. Let me demonstrate that with a simple example. USE tempdb GO -- Create 3 Test Tables CREATE TABLE TABLE1 (ID INT); CREATE TABLE TABLE2 (ID INT); CREATE TABLE TABLE3 (ID INT); GO -- Create SP CREATE PROCEDURE TestSP AS INSERT INTO TABLE1 (ID) VALUES (1) INSERT INTO TABLE2 (ID) VALUES ('a') INSERT INTO TABLE3 (ID) VALUES (3) GO -- Execute SP -- SP will error out EXEC TestSP GO -- Check the Values in Table SELECT * FROM TABLE1; SELECT * FROM TABLE2; SELECT * FROM TABLE3; GO Now, the main point is this: if a Stored Procedure were transactional, it should roll back the complete transaction when it encounters any error. Well, that does not happen in this case, which proves that a Stored Procedure does not, by itself, provide transactional behavior to a batch of T-SQL. Let’s see the result very quickly. It is very clear that there is an entry in TABLE1 which is not matched in the subsequent tables. If the SP were transactional in terms of T-SQL query batches, there would be no entries in any of the tables. If you want to use Transactions with a Stored Procedure, wrap the code with BEGIN TRAN and COMMIT TRAN. The example is as follows. CREATE PROCEDURE TestSPTran AS BEGIN TRAN INSERT INTO TABLE1 (ID) VALUES (11) INSERT INTO TABLE2 (ID) VALUES ('b') INSERT INTO TABLE3 (ID) VALUES (33) COMMIT GO -- Execute SP EXEC TestSPTran GO -- Check the Values in Tables SELECT * FROM TABLE1; SELECT * FROM TABLE2; SELECT * FROM TABLE3; GO -- Clean up DROP TABLE Table1 DROP TABLE Table2 DROP TABLE Table3 GO In this case, there will be no new entries in any of the tables. What is your opinion about this blog post? Please leave your comments about it here. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Stored Procedure, SQL Tips and Tricks, T SQL, Technology
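
    A more defensive pattern than a bare BEGIN TRAN/COMMIT pair is worth sketching here. The following is a minimal sketch of mine (not from the original post; TestSPTranSafe is a hypothetical name) that combines SET XACT_ABORT ON with TRY/CATCH so the rollback happens even for errors that would not otherwise abort the batch; note that THROW requires SQL Server 2012 or later.

    ```sql
    -- Sketch: guaranteed rollback inside a stored procedure
    -- (assumes the three demo tables from the post exist).
    CREATE PROCEDURE TestSPTranSafe
    AS
    BEGIN
        SET XACT_ABORT ON;          -- any run-time error dooms the transaction
        BEGIN TRY
            BEGIN TRAN;
            INSERT INTO TABLE1 (ID) VALUES (11);
            INSERT INTO TABLE2 (ID) VALUES ('b');  -- fails: conversion error
            INSERT INTO TABLE3 (ID) VALUES (33);
            COMMIT TRAN;
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0 ROLLBACK TRAN;
            THROW;                  -- re-raise for the caller (SQL Server 2012+)
        END CATCH
    END
    GO
    ```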

  • SQL SERVER – Log File Growing for Model Database – model Database Log File Grew Too Big

    - by pinaldave
After reading my earlier article SQL SERVER – master Database Log File Grew Too Big, I recently received an email from another reader asking why the log file of the model database grows every day when he is not carrying out any operation in the model database. As per the email, he is absolutely sure that he is doing nothing on his model database; he had used policy management to catch any T-SQL operation in the model database and there were none. This was indeed surprising to me. I sent a request for access to his server, to which he happily agreed, and within a minute we figured out the issue. He was taking a backup of the model database every night. When I explained the same to him, he did not believe it, so I quickly wrote down the following script. The results before and after the usage of the script were very clear. What is the model database? The model database is used as the template for all databases created on an instance of SQL Server. Any object you create in the model database will be automatically created in every subsequent user database created on the server. NOTE: Do not run this in a production environment. During the demo, the model database was in full recovery mode and only full backup operations were performed (no log backups). Before Backup Script. Backup script in a loop: DECLARE @FLAG INT SET @FLAG = 1 WHILE(@FLAG < 1000) BEGIN BACKUP DATABASE [model] TO  DISK = N'D:\model.bak' SET @FLAG = @FLAG + 1 END GO After Backup Script. Why did this happen? The model database was in full recovery mode, and taking a full backup is a logged operation. As there were no log backups and only full backups were performed on the model database, the log could never truncate and the size of the log file kept growing. Resolution: Change the recovery model of the model database from Full to Simple. Take a full backup of the model database only when you change something in it. Have you encountered a situation like this? If so, how did you resolve it? It will be interesting to know about your experience. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
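
    To observe the effect yourself, a small check can be run before and after the backup loop. This is a hedged addition of mine, not part of the original demo:

    ```sql
    -- Watch model's log grow: run before and after the backup loop.
    DBCC SQLPERF (LOGSPACE);   -- log size (MB) and percent used, per database

    -- Or look at model's log file only:
    SELECT name, size * 8 / 1024 AS size_mb
    FROM model.sys.database_files
    WHERE type_desc = 'LOG';
    ```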

  • SQL SERVER – NTFS File System Performance for SQL Server

    - by pinaldave
Note: Before practicing any of the suggestions in this article, consult your IT infrastructure admin; applying them without proper testing can damage your system. Question: “Pinal, we have 80 GB of data including all the database files, and our data is on an NTFS file system. We have proper backups set up. Any suggestions for improving our NTFS file system performance? Our SQL Server box is running only SQL Server and nothing else. Please advise.” When I receive questions like the one just listed, it often sends me into deep thought. Honestly, I know a lot, but there are plenty of things which, I believe, can be built with community knowledge. Today I need you to help me complete this list. I will start the list and you help me complete it. NTFS File System Performance Best Practices for SQL Server: Disable indexing on disk volumes. Disable generation of 8.3 names (command: FSUTIL BEHAVIOR SET DISABLE8DOT3 1). Disable last file access time tracking (command: FSUTIL BEHAVIOR SET DISABLELASTACCESS 1). Keep some space empty (let us say 15% for reference) on the drive if possible (only on the FILESTREAM data storage volume). Defragment the volume. Add your suggestions here… The one that often sparks a pretty big debate is the NTFS allocation unit size. I have seen that on the disk volume which stores FILESTREAM data, increasing the allocation unit size from 4K to 64K reduces fragmentation. Again, I suggest you attempt this only after proper testing on your server. Every system is different and the files stored are different. This is where I would like to request you to share your experience related to NTFS allocation unit size. If you do not agree with any of the above suggestions, leave a comment with a reference and I will modify it. Please note that the above list was prepared assuming SQL Server is the only application running on the computer system. The next question is whether all of this is still relevant for SSDs – I personally have no experience with SSDs and large databases, so I will refrain from commenting. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
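
    As a companion check, free space and file system type per volume can be read from within SQL Server. This is a hedged sketch of my own (sys.dm_os_volume_stats requires SQL Server 2008 R2 SP1 or later, so it does not apply to older versions):

    ```sql
    -- Free space and file system for every volume hosting a database file.
    SELECT DISTINCT
        vs.volume_mount_point,
        vs.file_system_type,                               -- e.g. NTFS
        vs.total_bytes / 1073741824.0     AS total_gb,
        vs.available_bytes / 1073741824.0 AS free_gb,
        CAST(vs.available_bytes * 100.0 / vs.total_bytes AS DECIMAL(5, 2)) AS free_pct
    FROM sys.master_files mf
    CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) vs;
    ```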

  • SQLAuthority News – Microsoft SQL Server 2008 R2 – PowerPivot for Microsoft Excel 2010

    - by pinaldave
Microsoft has really and truly created some buzz for PowerPivot. In recent times I have been asked to show a demo of PowerPivot even when I am doing relational database training. Below are a few links where everyone can download PowerPivot and use it. Microsoft SQL Server 2008 R2 – PowerPivot for Microsoft Excel 2010 – RTM Microsoft® PowerPivot for Microsoft® Excel 2010 provides ground-breaking technology, such as fast manipulation of large data sets (often millions of rows), streamlined integration of data, and the ability to effortlessly share your analysis through Microsoft® SharePoint 2010. Microsoft PowerPivot for Excel 2010 Samples Microsoft® PowerPivot for Microsoft® Excel 2010 provides ground-breaking technology, such as fast manipulation of large data sets (often millions of rows), streamlined integration of data, and the ability to effortlessly share your analysis through Microsoft® SharePoint 2010. Download examples of the types of reports you can create. Microsoft PowerPivot for Excel 2010 Data Analysis Expressions Sample version 1.0 Microsoft® PowerPivot for Microsoft® Excel 2010 provides ground-breaking technology, such as fast manipulation of large data sets (often millions of rows), streamlined integration of data, and the ability to effortlessly share your analysis through Microsoft® SharePoint 2010. Download this PowerPivot workbook to learn more about DAX calculations. Note: The brief descriptions below the download links are taken from the respective download pages. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

  • SQL SERVER – A Puzzle – Illusion – Confusion – April Fools’ Day

    - by pinaldave
Today is April 1st and, just like every other year, I like to bring something interesting and light for the day. At least there should be days in everyone’s life when they can take it easy. Here is a quick puzzle for you, and I believe it will make you feel extremely smart if you can figure out the reasoning behind the result. Run the following in SQL Server Management Studio and observe the output: SELECT 30.0/(-2.0)/5.0; SELECT 30.0/-2.0/5.0; Here are a few questions for you: 1) What will be the result of the above two queries? 2) Why? If you think you can figure out the result without executing them, I encourage you to execute BOTH of them in SSMS and see whether they give you the same result or different results. Well, now I am waiting for your answer here – why? I often post similar things on my Facebook page http://facebook.com/SQLAuth – you are welcome to play with me there. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

  • SQL SERVER – sp_describe_first_result_set New System Stored Procedure in SQL Server 2012

    - by pinaldave
I might have said this many times before, but I will say it again – SQL Server never ceases to amaze me. Here is an example of it: sp_describe_first_result_set. I stumbled upon it when I was looking for something else on BOL. This new system stored procedure attracted me to experiment with it. This SP does exactly what its name suggests – it describes the first result set. Let us see a very simple example. Please note that this works only on SQL Server 2012. EXEC sp_describe_first_result_set N'SELECT * FROM AdventureWorks.Sales.SalesOrderDetail', NULL, 1 GO Here is the partial resultset. Now let us take this simple example to the next level and learn one more interesting detail about this procedure. First I will create a view, and then we will use the same procedure over the view. USE AdventureWorks GO CREATE VIEW dbo.MyView AS SELECT [SalesOrderID] soi_v ,[SalesOrderDetailID] sodi_v ,[CarrierTrackingNumber] stn_v FROM [Sales].[SalesOrderDetail] GO Now let us execute the above stored procedure with various options. You can notice I am changing the very last parameter which I am passing to the stored procedure. This option is known as browse_information_mode. EXEC sp_describe_first_result_set N'SELECT soi_v soi, sodi_v sodi, stn_v stn FROM MyView', NULL, 0; GO EXEC sp_describe_first_result_set N'SELECT soi_v soi, sodi_v sodi, stn_v stn FROM MyView', NULL, 1; GO EXEC sp_describe_first_result_set N'SELECT soi_v soi, sodi_v sodi, stn_v stn FROM MyView', NULL, 2; GO Here is the result of all three queries together in a single image for easier understanding of their differences. You can see that when browse_information_mode is set to 1, the resultset describes the details of the original source database, schema, and source table. When browse_information_mode is set to 2, the resultset describes the details of the view as the source database. I found it really interesting that there now exists a system stored procedure which describes the resultset of a query before it is executed. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Stored Procedure, SQL Tips and Tricks, T SQL, Technology
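
    SQL Server 2012 also exposes the same metadata as a table-valued function, which is easier to filter and join than the stored procedure's output. A small sketch of mine, not covered in the post above:

    ```sql
    -- Same metadata as sp_describe_first_result_set, but queryable as a rowset.
    SELECT name, system_type_name, is_nullable
    FROM sys.dm_exec_describe_first_result_set(
             N'SELECT * FROM AdventureWorks.Sales.SalesOrderDetail', NULL, 1);
    ```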

  • SQL SERVER – Precision of SMALLDATETIME – A 1 Minute Precision

    - by pinaldave
I am myself surprised that I am writing this post today. I am going to present one of the very well-known facts about the SQL Server SMALLDATETIME datatype. Even though this is a very well-known datatype, many a time I have seen developers getting confused by the precision of SMALLDATETIME. The precision of the datatype SMALLDATETIME is 1 minute. It discards the seconds by rounding to the nearest minute: 29 seconds or less rounds down, and 30 seconds or more rounds up. Let us see the following example. DECLARE @varSDate AS SMALLDATETIME SET @varSDate = '1900-01-01 12:12:01' SELECT @varSDate C_SDT SET @varSDate = '1900-01-01 12:12:29' SELECT @varSDate C_SDT SET @varSDate = '1900-01-01 12:12:30' SELECT @varSDate C_SDT SET @varSDate = '1900-01-01 12:12:59' SELECT @varSDate C_SDT Following is the result of the above script; note that any seconds value between 0 (zero) and 59 is rounded up or down. The part that confuses developers is the value of the seconds in the display. I think that if the value is not maintained or recorded, it should not be displayed either. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL DateTime, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
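
    The rounding boundary can be shown in a single statement. A quick sketch of mine, matching the behavior described above:

    ```sql
    -- 29 seconds rounds down to the minute; 30 seconds rounds up.
    SELECT CAST('1900-01-01 12:12:29' AS SMALLDATETIME) AS RoundsDown,  -- 12:12:00
           CAST('1900-01-01 12:12:30' AS SMALLDATETIME) AS RoundsUp;    -- 12:13:00
    ```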

  • SQLAuthority News – SQL Server Performance Series Hyderabad / Pune – Nov/Dec 2010

    - by pinaldave
Just a quick note that the SQL Server Performance Tuning and Optimization Seminar series which I am offering at Hyderabad and Pune is almost sold out. Read the details of the earlier successful seminar conducted at Colombo, Sri Lanka over here. Hyderabad Nov 27-28, 2010 (Last 3 Seats Left) Best Western Amrutha Castle 5-9-16, Opp. Secretariat, Saifabad, Khairatabad Hyderabad, Andhra Pradesh Pune Dec 04-05, 2010 (Last 6 Seats Left) Location TBA, as we are looking for a larger-capacity room. I promise that this is going to be great fun, as these sessions are very different from any usual sessions you have attended. These sessions are absolutely interactive and all the attendees will feel part of the event. As larger groups are not convenient, we have limited these seminars to a very small group of people. This way attendees can go to the instructor any time and feel connected. This 2-day seminar will cover the best of the best concepts and practices from popular courses offered by Solid Quality Mentors. Instead of teaching theory only, the seminar focuses on providing real-world experience by using demos and scenarios derived from customer engagements. The seminar is uniquely structured and well thought out. Sessions are discussion-based and are designed to be an interactive gateway between the instructor and the participants for an optimal learning experience. The seminar is intended to be immersion-based, where participants will have plenty of opportunities to get deeply involved in the concepts presented by the instructor. Agenda of the event: To join the seminars, drop me an email. My email address is pinal “at” SQLAuthority.com and IndiaInfo “at” SolidQ.com. If you mention SQLAuthority.com in the title, you will get a special discount on the specified price – yes, a sure 20%, I promise. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, Pinal Dave, SQL, SQL Authority, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

  • SQL SERVER – Find First Non-Numeric Character from String

    - by pinaldave
It is fun when you have to deal with simple problems for which there is no out-of-the-box solution. I am sure there are many cases when we need the first non-numeric character from a string, but there is no built-in function to identify it right away. Here is the quick script I wrote using PATINDEX. The function PATINDEX has existed in SQL Server for quite a long time, but I hardly see it being used. Well, at least I use it and I am comfortable using it. Here is a simple script which I use when I have to identify the first non-numeric character. -- How to find the first non-numeric character USE tempdb GO CREATE TABLE MyTable (ID INT, Col1 VARCHAR(100)) GO INSERT INTO MyTable (ID, Col1) SELECT 1, '1one' UNION ALL SELECT 2, '11eleven' UNION ALL SELECT 3, '2two' UNION ALL SELECT 4, '22twentytwo' UNION ALL SELECT 5, '111oneeleven' GO -- Use of PATINDEX SELECT PATINDEX('%[^0-9]%',Col1) 'Position of NonNumeric Character', SUBSTRING(Col1,PATINDEX('%[^0-9]%',Col1),1) 'NonNumeric Character', Col1 'Original Character' FROM MyTable GO DROP TABLE MyTable GO Here is the resultset: Where do I use this in the real world? Well, there are lots of examples; in one of the future blog posts I will cover that as well. Meanwhile, do you have a better way to achieve the same? Do share it here. I will write a follow-up blog post with due credit to you. Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Function, SQL Query, SQL Server, SQL String, SQL Tips and Tricks, T SQL, Technology
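
    One edge case worth guarding against: PATINDEX returns 0 when the string contains only digits, so SUBSTRING would return an empty string. A small variation of mine on the script above:

    ```sql
    -- Guard against strings that are entirely numeric.
    DECLARE @s VARCHAR(100) = '12345';
    SELECT CASE WHEN PATINDEX('%[^0-9]%', @s) > 0
                THEN SUBSTRING(@s, PATINDEX('%[^0-9]%', @s), 1)
                ELSE NULL      -- no non-numeric character present
           END AS FirstNonNumeric;
    ```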

  • SQL SERVER – Detecting Leap Year in T-SQL using SQL Server 2012 – IIF, EOMONTH and CONCAT Function

    - by pinaldave
Note: Tomorrow is February 29th. This blog post is dedicated to tomorrow – a special day :) Subu: “How can I find a leap year using SQL Server 2012?“ Pinal: “Are you asking me how to check whether the year 2012 is a leap year using T-SQL? Search online and you will find many examples of the same.” Subu: “No. I am asking – how can I find a leap year using SQL Server 2012?“ Pinal: “Oh, so you are asking – how can I find a leap year using SQL Server 2012?“ Subu: “Yeah – how can I find a leap year using SQL Server 2012?“ Pinal: “Let me do that for you – how can you find a leap year using SQL Server 2012?“ Indeed a fun conversation. Honestly, the only reason I pasted our conversation here is – it was fun. What he was asking is how to do it using the new functions introduced in SQL Server 2012. Here are the articles I have written which introduce the new functions in SQL Server 2012: Summary of All the Analytic Functions – MSDN and SQLAuthority, and 14 New Functions – A Quick Guide. Many functions have been written to figure out whether a given year is a leap year or not. I have written the same using a T-SQL function over here. CREATE FUNCTION dbo.IsLeapYear (@year INT) RETURNS INT AS BEGIN RETURN(IIF(DATEPART(dd,(EOMONTH(CONCAT(@year,'0201')))) = 29,1,0)) END GO What I really like is that I was able to use three newly introduced SQL Server 2012 functions in the above script. You can read more about them here: IIF, EOMONTH and CONCAT. You can validate the above function by running the following script. SELECT dbo.IsLeapYear('2011') 'IsLeapYear'; SELECT dbo.IsLeapYear('2012') 'IsLeapYear'; GO You will get 1 if the year is a leap year and 0 if it is not. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL DateTime, SQL Function, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
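
    For completeness, the same check on versions older than SQL Server 2012 can use the classic divisibility rule. A hedged sketch of mine (dbo.IsLeapYearClassic is a hypothetical name, not from the post):

    ```sql
    -- Divisible by 4, except centuries, unless divisible by 400.
    CREATE FUNCTION dbo.IsLeapYearClassic (@year INT)
    RETURNS INT
    AS
    BEGIN
        RETURN CASE WHEN (@year % 4 = 0 AND @year % 100 <> 0) OR @year % 400 = 0
                    THEN 1 ELSE 0 END;
    END
    GO
    ```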

  • SQL SERVER – Differences Between Left Join and Left Outer Join

    - by pinaldave
There are a few questions that I had decided not to discuss on this blog because I think they are very simple and many of us know them. Many times, I even receive not-so-positive notes from several readers when I write something simple. However, assuming that we know it all and that beginners should know everything is not the right attitude. Since day 1, I have been keeping a small journal of questions that I receive on this blog. There are around 200+ questions I receive every day through emails, comments and occasional phone calls. Yesterday, I received a comment with the following question: What are the differences between Left Join and Left Outer Join? Click here to read the original comment. This question has finally crossed the threshold of being asked repeatedly. Here is the answer: There is absolutely no difference between LEFT JOIN and LEFT OUTER JOIN. The same is true for RIGHT JOIN and RIGHT OUTER JOIN. When you use the LEFT JOIN keyword in SQL Server, it means LEFT OUTER JOIN only. I have already written an in-depth visual diagram discussion of the JOINs. I encourage all of you to read the article for further understanding of the JOINs: Read Introduction to JOINs – Basic of JOINs Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Joins, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
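
    To see the equivalence for yourself, here is a minimal sketch of mine (Customers and Orders are hypothetical tables): both queries compile to the same join operator and return identical results.

    ```sql
    -- The OUTER keyword is optional; these two are the same query.
    SELECT c.CustomerID, o.OrderID
    FROM Customers c
    LEFT JOIN Orders o       ON o.CustomerID = c.CustomerID;

    SELECT c.CustomerID, o.OrderID
    FROM Customers c
    LEFT OUTER JOIN Orders o ON o.CustomerID = c.CustomerID;
    ```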

  • SQL SERVER – Reducing Page Contention on TempDB

    - by pinaldave
I recently received the following email: “We are using trace flag 1118 to reduce the tempdb contention on our servers (2000 and 2005). What is your opinion? We have read lots of material; would you please answer me in a single line.” Wow, this was a very interesting question. What intrigued me was the second-to-last sentence, where I am asked to answer in a single line. There is something about this email; I feel like blogging about it here. I think I could talk on this subject forever – well, there is no clear answer, and there are so many caveats about everything. Again, I must stay honest to the request about answering in a single line. I also do not like giving a plain YES/NO answer. What should I do? Let me ask this question to the community today: what would you answer to this email? Let me start by answering it myself in one line and taking one side: “I enable this trace flag on SQL Server 2000 without the hotfix or service pack, and not on later versions (2005 onwards), as the code has improved.” What do you do in this case? The best answer will be featured on this blog with due credit. For further reading, here is the Microsoft KB which I think is very helpful. In quick summary (read the KB for accuracy): when an object is created, its first 8 pages are allocated from mixed extents. This trace flag forces uniform extent allocations instead, reducing contention. You can enable this trace flag at startup. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SQL TempDB, TempDB
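
    For reference, a hedged illustration of how such a flag is enabled globally at runtime (verify against the KB and test before touching production; the flag can also be set as the -T1118 startup parameter):

    ```sql
    DBCC TRACEON (1118, -1);   -- -1 applies the flag globally
    DBCC TRACESTATUS (1118);   -- confirm whether it is active
    ```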

  • SQL SERVER – Get File Statistics Using fn_virtualfilestats

    - by pinaldave
Quite often when I am staring at SSMS, I wonder what is going on under the hood of my SQL Server. I often want to know which database is very busy and which database is a bit slow because of IO issues. Sometimes I think at the file level as well: I want to know which MDF or NDF is busiest and doing most of the work. The following query gets these results very quickly. SELECT DB_NAME(vfs.DbId) DatabaseName, mf.name, mf.physical_name, vfs.BytesRead, vfs.BytesWritten, vfs.IoStallMS, vfs.IoStallReadMS, vfs.IoStallWriteMS, vfs.NumberReads, vfs.NumberWrites, (Size*8)/1024 Size_MB FROM ::fn_virtualfilestats(NULL,NULL) vfs INNER JOIN sys.master_files mf ON mf.database_id = vfs.DbId AND mf.FILE_ID = vfs.FileId GO When you run the above query you will get much valuable information, such as the size of each file as well as how many reads and writes were done against it. It also displays the read/write data in bytes. If there has been any IO stall (delay) in reads or writes, you can see that as well. I keep this handy but have not shared it on the blog earlier. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL View, T SQL, Technology Tagged: Statistics
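
    On SQL Server 2005 and later, the same information is exposed through the sys.dm_io_virtual_file_stats dynamic management function, which supersedes ::fn_virtualfilestats. A hedged equivalent of mine:

    ```sql
    -- Per-file IO statistics via the modern DMF.
    SELECT DB_NAME(vfs.database_id) AS DatabaseName,
           mf.name, mf.physical_name,
           vfs.num_of_bytes_read, vfs.num_of_bytes_written,
           vfs.io_stall_read_ms, vfs.io_stall_write_ms,
           vfs.num_of_reads, vfs.num_of_writes
    FROM sys.dm_io_virtual_file_stats(NULL, NULL) vfs
    INNER JOIN sys.master_files mf
            ON mf.database_id = vfs.database_id
           AND mf.file_id     = vfs.file_id;
    ```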

  • SQL SERVER – A Funny Cartoon on Index

    - by pinaldave
Performance tuning has been my favorite subject and I have done it for many years now. Today I will share one of the most common conversations about indexes I have heard in my life. Every single time I am at a consultation for performance tuning, I hear the following conversation among various team members. I want to ask you: does this kind of conversation happen in your organization? Anyway, if you think an index solves all of your performance problems, I think that is not true. There are many other areas one has to consider along with indexes. For example, I consider the following topics ones that need to be understood for performance tuning: Logical Query Processing, Efficient Join Techniques, Query Tuning Considerations, Avoiding Common Performance Tuning Issues, Statistics and Best Practices, TempDB Tuning, Hardware Planning, Understanding the Query Processor, Using SQL Server 2005 and 2008 Updated Feature Sets, CPU/Memory/I-O Bottlenecks, Index Tuning (of course), and many more… Well, I wrote this blog post intending to keep it easy and light. I will discuss other performance tuning concepts in the future. Let me know what you think about the cartoon I made. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Humor, SQL Index, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

  • SQL SERVER – T-SQL Scripts to Find Maximum between Two Numbers

    - by pinaldave
There are plenty of things in life one can make simple. I really believe in the same. Yesterday I was traveling for a community-related activity. At the airport, while returning, I met a SQL enthusiast. He asked me if there is any simple way to find the maximum of two numbers in SQL Server. I asked him what he really meant by a simple way, and requested him to demonstrate his code for finding the maximum of two numbers. Here is his code: DECLARE @Value1 DECIMAL(5,2) = 9.22 DECLARE @Value2 DECIMAL(5,2) = 8.34 SELECT (0.5 * ((@Value1 + @Value2) + ABS(@Value1 - @Value2))) AS MaxColumn GO I thought his logic was accurate, but the same script can be written another way. I quickly wrote the following code for him, which worked just fine for him. Here is my code: DECLARE @Value1 DECIMAL(5,2) = 9.22 DECLARE @Value2 DECIMAL(5,2) = 8.34 SELECT CASE WHEN @Value1 > @Value2 THEN @Value1 ELSE @Value2 END AS MaxColumn GO He agreed that my code is much simpler, but as per him there is some problem with my code which he apparently could not remember at the time. There are cases when his code will give accurate values and my code will not. I think his comment has value, but for the moment neither of us could come up with any valid reason. Can you think of any scenario where his code will work and my suggested code will not? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
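
    One speculative candidate answer of mine (the post deliberately leaves the question open): the two forms disagree when one input is NULL, because the comparison in the CASE expression evaluates to UNKNOWN and falls through to the ELSE branch, while NULL propagates through the arithmetic.

    ```sql
    DECLARE @Value1 DECIMAL(5,2) = NULL, @Value2 DECIMAL(5,2) = 8.34;
    -- CASE: NULL > 8.34 is UNKNOWN, so the ELSE branch returns 8.34.
    SELECT CASE WHEN @Value1 > @Value2 THEN @Value1 ELSE @Value2 END AS MaxCase;
    -- Arithmetic: NULL propagates, so the result is NULL.
    SELECT 0.5 * ((@Value1 + @Value2) + ABS(@Value1 - @Value2)) AS MaxArith;
    ```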

  • SQL SERVER – Two Puzzles – Answer and Win USD 25 Gift Card

    - by pinaldave
Today I have two simple T-SQL puzzles. You can answer them and win a USD 25 gift card. The gift card will be sent by email to the winner, who will get a choice of gift card brand based on preference and country location. Puzzle 1: What will be the outcome and why? DECLARE @x REAL; SET @x = 9E-40 SELECT @x; The outcome here is obvious, as I have used a negative exponent in the assignment. What is the reason behind it? Puzzle 2: Why will the outcome be different from Puzzle 1? DECLARE @y REAL; SET @y = 9E+40 SELECT @y; The outcome of this puzzle is very different from Puzzle 1, as I have used a positive exponent. There is a six (6) in the error value – why? Msg 232, Level 16, State 2, Line 2 Arithmetic overflow error for type real, value = 90000000000000006000000000000000000000000.000000. How to participate: To win the USD 25 gift card you will have to answer both of the questions on my Facebook page. If you are on Twitter, you can increase your chance of winning by tweeting your participation. This contest is open to anyone from any country. The winner will be selected randomly and announced on July 7, 2011. Related Post: SQLAuthority News – Monthly list of Puzzles and Solutions on SQLAuthority.com Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

  • SQL SERVER – 2012 – List All The Column With Specific Data Types in Database

    - by pinaldave
5 years ago I wrote the script SQL SERVER – 2005 – List All The Column With Specific Data Types; when I read it again, it is still very much relevant, and I liked it. This is one of those scripts every developer would like to keep handy. I have upgraded the script a bit more, including some additional information which I believe I should have added from the beginning. It is difficult to visualize the final script when writing it the first time. I use every script which I write on this blog; as a matter of fact, I write only those scripts here which I was using at the time. It is quite possible that as time passes my needs change, and so I change my script. Here is the updated script on this subject. If there are any user-defined data types, it will list them as well. SELECT s.name AS 'schema', ts.name AS TableName, c.name AS column_name, c.column_id, SCHEMA_NAME(t.schema_id) AS DatatypeSchema, t.name AS Datatypename ,t.is_user_defined, t.is_assembly_type ,c.is_nullable, c.max_length, c.PRECISION, c.scale FROM sys.columns AS c INNER JOIN sys.types AS t ON c.user_type_id=t.user_type_id INNER JOIN sys.tables ts ON ts.OBJECT_ID = c.OBJECT_ID INNER JOIN sys.schemas s ON s.schema_id = ts.schema_id ORDER BY s.name, ts.name, c.column_id I would be very interested to see your script which lists all the columns of the database with data types. If I am missing something in my script, I will modify it based on your comment. This way this page will be a good bookmark for the future for all of us. Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL DMV, SQL Query, SQL Server, SQL System Table, SQL Tips and Tricks, T SQL, Technology
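
    A lighter-weight alternative of mine (hedged: it omits the user-defined and assembly-type details the catalog-view script exposes) uses the ANSI INFORMATION_SCHEMA views:

    ```sql
    SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE,
           CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE DATA_TYPE = 'datetime'              -- filter to one specific data type
    ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION;
    ```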

  • SQLAuthority News – Download Whitepaper – Power View Infrastructure Configuration and Installation: Step-by-Step and Scripts

    - by pinaldave
Power View, a feature of the SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Server 2010 Enterprise Edition, is an interactive data exploration, visualization, and presentation experience. It provides intuitive ad-hoc reporting for business users such as data analysts, business decision makers, and information workers. Microsoft has recently released a very interesting whitepaper which covers a sample scenario that validates the connectivity of Power View reports to both PowerPivot workbooks and tabular models. This white paper covers the following important concepts about Power View: Understanding the hardware and software requirements and their download locations. Installing and configuring the required infrastructure when Power View and its data models are on the same computer and on different computers. Installing and configuring a computer used for client access to Power View reports, models, SharePoint 2010, and Power View in a workgroup. Configuring single sign-on access for double-hop scenarios with and without Kerberos. You can download the whitepaper from here. This whitepaper discusses many interesting scenarios. It would be really interesting to know if you are using Power View in your production environment. If yes, would you please share your experience over here. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, Data Warehousing, PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, T SQL, Technology

  • SQL SERVER – Auditing and Profiling Database Made Easy with SQL Audit and Comply

    - by Pinal Dave
Do you like auditing your database, or can you think of about a million other things you’d rather do?  Unfortunately, auditing is incredibly important.  As with tax audits, it is important to audit databases to ensure they are following all the rules, but audits are also important for troubleshooting and security. There are several ways to audit SQL Server.  There is manual auditing, which is going through your database “by hand”; this obviously takes a long time and is quite inefficient.  SQL Server also provides programs to help you audit your systems.  Different administrators will have different opinions about best practices and which tools to use, and each tool will be best suited to certain systems and certain users. Today, though, I would like to talk about ApexSQL Audit.  It is an auditing tool that acts like “track changes” in a word processing document.  It will log what has changed in the database, who made the changes, and what effects these changes have had (i.e. what objects were affected down the line).  All this information is logged and can be easily viewed or printed for easy access. One of the best features of Apex is that it is so customizable (and easy to use!).  First, start Apex.  Then you can connect to the database you would like to monitor. Once you select your database, you can select which table you want to audit. You can customize right down to the field you’d like to audit, and then select which types of actions you’d like tracked – insert, delete, or update.  Repeat these steps for every database you want monitored. To create the logs, choose “Create triggers” in the menu.  The script written here will be what logs each insert, delete, and update operation.  Press F5 to execute.  All this tracking information will be stored in the AUDIT_LOG_DATA and AUDIT_LOG_TRANSACTIONS tables.  View these tables using ApexSQL Audit reports. These transaction logs can be extremely detailed – especially on very busy servers, where every move is traced.  Reading them can be overwhelming, to say the least.  Apex has tried to make things easier for the average DBA, though. You can read these tracking logs in Apex, and it will display data and objects that affect your server – even things that were happening on your server before you installed Apex! To read these logs, open Apex, and connect to the database you want to audit. Go to the Transaction Logs tab, and add the logs you want to read. To narrow down what results you want to see, you can use the Filter tab to choose time, operation type, name, users, and more. Click Open, and you can see the results in a grid (as shown below).  You can export these results to CSV, HTML, XML or SQL files and save them to the hard disk. One of the advantages is that since there are no triggers here, there are no other processes that will affect SQL Server performance.  Using this method is also how to view history from your database that occurred before Apex was installed.  This type of tracking does require storage space for the data sources, as the database must be fully running, and the transaction logs must exist (things not stored in the transaction logs will not be recoverable). Apex can also replace SQL Server Profiler and SQL Server Traces – which are much more complex and error-prone – with its ApexSQL Comply.  It can do fault-tolerant auditing, centralized reporting, and “who saw what” information in an easy-to-use interface.
The tracking settings can be altered by the user, or the default options will provide solutions to the most common auditing problems. To get started, open ApexSQL Comply and select Database Filter Settings to choose which database you’d like to audit.  You can select which tracking you’d like in Operation Types – DML, DDL, queries executed, execute statements, and more.  Then click Start Auditing. After this, every action will be stored in the central repository database (ApexSQLCrd).  You can view the audit and create a report (or view the standard default report) using a wizard. You can see how easy it is to use ApexSQL Comply.  You can easily set audits, including the type and time, and create customized reports.  Remote users can easily access the reports through the user interface (available online as well), and security concerns are all taken care of by the program.  Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology
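
    For readers curious what trigger-based change logging looks like conceptually, here is a minimal sketch of mine – illustrative only, not ApexSQL's actual generated code; dbo.MyTable and dbo.AuditLogDemo are hypothetical names:

    ```sql
    -- Minimal audit-trigger pattern: record who changed what, and when.
    CREATE TABLE dbo.AuditLogDemo (
        AuditID   INT IDENTITY PRIMARY KEY,
        TableName SYSNAME,
        Operation CHAR(1),                       -- 'I', 'U', or 'D'
        AuditedAt DATETIME DEFAULT GETDATE(),
        AuditedBy SYSNAME  DEFAULT SUSER_SNAME()
    );
    GO
    CREATE TRIGGER dbo.trgAuditMyTable ON dbo.MyTable
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Skip statements that touched zero rows.
        IF NOT EXISTS (SELECT 1 FROM inserted)
           AND NOT EXISTS (SELECT 1 FROM deleted) RETURN;
        INSERT INTO dbo.AuditLogDemo (TableName, Operation)
        SELECT 'dbo.MyTable',
               CASE WHEN EXISTS (SELECT 1 FROM inserted)
                     AND EXISTS (SELECT 1 FROM deleted) THEN 'U'
                    WHEN EXISTS (SELECT 1 FROM inserted) THEN 'I'
                    ELSE 'D'
               END;
    END
    GO
    ```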

  • SQL SERVER – Pending IO request in SQL Server – DMV

    - by pinaldave
I received the following question: “How do we know how many pending IO requests there are for each database file (.mdf, .ldf) individually?” A very interesting question, and indeed the answer is very interesting as well. Here is the quick script which I use to find the same. It has to be run in the context of the database for which you want to know pending IO statistics. USE [YourDatabaseName] GO SELECT vfs.database_id, df.name, df.physical_name ,vfs.FILE_ID, ior.io_pending FROM sys.dm_io_pending_io_requests ior INNER JOIN sys.dm_io_virtual_file_stats (DB_ID(), NULL) vfs ON (vfs.file_handle = ior.io_handle) INNER JOIN sys.database_files df ON (df.FILE_ID = vfs.FILE_ID) I keep this script handy as it works like magic every time. If you use any other script, please post it here and I will publish it with due credit. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL DMV, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
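
    A hedged server-wide variant of mine: pass NULL instead of DB_ID() and join to sys.master_files to see pending IO across all databases at once.

    ```sql
    SELECT DB_NAME(vfs.database_id) AS DatabaseName,
           mf.name, ior.io_pending, ior.io_pending_ms_ticks
    FROM sys.dm_io_pending_io_requests ior
    INNER JOIN sys.dm_io_virtual_file_stats(NULL, NULL) vfs
            ON vfs.file_handle = ior.io_handle
    INNER JOIN sys.master_files mf
            ON mf.database_id = vfs.database_id
           AND mf.file_id     = vfs.file_id;
    ```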

  • Big Data – Buzz Words: Importance of Relational Database in Big Data World – Day 9 of 21

    - by Pinal Dave
In yesterday’s blog post we learned what HDFS is. In this article we will take a quick look at the importance of the relational database in the Big Data world. A Big Question? Here are a few questions I have often received since the beginning of the Big Data series - Does the relational database have no place in the story of Big Data? Is the relational database no longer relevant as Big Data evolves? Is the relational database not capable of handling Big Data? Is it true that one no longer has to learn about relational databases if Big Data is the final destination? Well, every single time I hear that a person wants to learn about Big Data and is no longer interested in learning about relational databases, I find it a bit far-fetched. I am not here to give the ambiguous answer of It Depends. I am personally very clear that anyone aspiring to become a Big Data Scientist or Big Data Expert should learn about relational databases. NoSQL Movement The NoSQL movement of recent times was driven by two important advantages of NoSQL databases: Performance and Flexible Schema. In my personal experience, I have found both of the above-listed advantages when using a NoSQL database. There are instances when I have found the relational database too restrictive – when my data is unstructured, or comes in a datatype which my relational database does not support. The same goes for the cases where I have found a NoSQL solution performing much better than relational databases. I must say that I am a big fan of NoSQL solutions in recent times, but I have also seen occasions and situations where a relational database is still the perfect fit, even though the database is growing steadily and has all the symptoms of Big Data. Situations Where the Relational Database Outperforms Ad-hoc reporting is one of the most common scenarios where NoSQL does not have an optimal solution. For example, reporting queries often need to aggregate on columns which are not indexed and are chosen only while the report is running; in this kind of scenario, NoSQL databases (document stores, distributed key-value stores) often do not perform well. In the case of ad-hoc reporting I have often found it much easier to work with relational databases. SQL is the most popular computer language of all time. I have been using it for over 10 years and I feel that I will be using it for a long time in the future. There are plenty of tools, connectors, and general awareness of the SQL language in the industry. Pretty much every programming language has written drivers for SQL, and most developers learned this language during their school/college time. In many cases, writing a query in SQL is much easier than writing queries in NoSQL-supported languages. I believe this is the current situation, but in the future this situation can reverse when NoSQL query languages are equally popular. ACID (Atomicity, Consistency, Isolation, Durability) – Not all NoSQL solutions offer ACID compliance. There are always situations (for example banking transactions, eCommerce shopping carts etc.) where, without ACID, the operations can be invalid and database integrity can be at risk. Even when the data volume indeed qualifies as Big Data, there are always operations in the application which absolutely need a mature, ACID-compliant system.
The Mixed Bag I have often heard the argument that all the big social media sites have nowadays moved away from relational databases. Actually, this is not entirely true. While researching Big Data and relational databases, I have found that many of the popular social media sites use Big Data solutions along with relational databases. Many are using relational databases to deliver results to end users at run time, and many still use a relational database as their major backbone. Here are a few examples: Facebook uses MySQL to display the timeline. (Reference Link) Twitter uses MySQL. (Reference Link) Tumblr uses sharded MySQL. (Reference Link) Wikipedia uses MySQL for data storage. (Reference Link) There are many more prominent organizations running large-scale applications that use relational databases along with various Big Data frameworks to satisfy their various business needs. Summary I believe that the RDBMS is like vanilla ice cream. Everybody loves it and everybody has it. NoSQL and other solutions are like chocolate ice cream or custom ice cream – there is a huge base which loves them and wants them, but not every ice cream maker can make them just right for everyone’s taste. No matter how fancy an ice cream store is, there is always plain vanilla ice cream available there. In just the same way, there are always cases and situations in the Big Data story where the traditional relational database is part of the whole story. In real-world scenarios there will always be cases where relational database concepts and their ideology are needed. It is extremely important to accept the relational database as one of the key components of Big Data instead of treating it as a substandard technology. Ray of Hope – NewSQL In this module we discussed that there are places where we need ACID compliance from our Big Data application, and NoSQL will not support that out of the box. There is a new term coined for applications/tools which support most of the properties of the traditional RDBMS and also support Big Data infrastructure – NewSQL. Tomorrow In tomorrow’s blog post we will discuss NewSQL. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

  • SQL SERVER – Finding Shortest Distance between Two Shapes using Spatial Data Classes – Ramsetu or Adam’s Bridge

    - by pinaldave
Recently I was reading an excellent blog post by Lenni Lobel on spatial databases. He has written about the very interesting ShortestLineTo function in the spatial data classes. I really love this new feature of finding the shortest distance between two shapes in SQL Server. The following is the same example Lenni discusses in his blog article. DECLARE @Shape1 geometry = 'POLYGON ((-20 -30, -3 -26, 14 -28, 20 -40, -20 -30))' DECLARE @Shape2 geometry = 'POLYGON ((-18 -20, 0 -10, 4 -12, 10 -20, 2 -22, -18 -20))' SELECT @Shape1 UNION ALL SELECT @Shape2 UNION ALL SELECT @Shape1.ShortestLineTo(@Shape2).STBuffer(.25) GO When you run this script, SQL Server finds the shortest distance between the two shapes and draws the line. We are using STBuffer so we can see the connecting line clearly. Now let us modify one of the objects and see how the connecting shortest line changes. DECLARE @Shape1 geometry = 'POLYGON ((-20 -30, -3 -30, 14 -28, 20 -40, -20 -30))' DECLARE @Shape2 geometry = 'POLYGON ((-18 -20, 0 -10, 4 -12, 10 -20, 2 -22, -18 -20))' SELECT @Shape1 UNION ALL SELECT @Shape2 UNION ALL SELECT @Shape1.ShortestLineTo(@Shape2).STBuffer(.25) GO Once again, let us modify one of the scripts and see how ShortestLineTo works. DECLARE @Shape1 geometry = 'POLYGON ((-20 -30, -3 -30, 14 -28, 20 -40, -20 -30))' DECLARE @Shape2 geometry = 'POLYGON ((-18 -20, 0 -10, 4 -12, 10 -20, 2 -18, -18 -20))' SELECT @Shape1 UNION ALL SELECT @Shape2 UNION ALL SELECT @Shape1.ShortestLineTo(@Shape2).STBuffer(.25) SELECT @Shape1.STDistance(@Shape2) GO You can see that as the objects change, the shortest lines move to the appropriate place. I think that even though this is a very small feature, it is really cool to know. While I was working on this example, I suddenly thought about the distance between Sri Lanka and India. The distance is very short – in fact, it is less than 30 km by sea. I decided to map India and Sri Lanka using the spatial data classes. To my surprise, the plotted shortest line is the same as Adam’s Bridge, or Ramsetu. Adam’s Bridge starts as a chain of shoals from the Dhanushkodi tip of India’s Pamban Island and ends at Sri Lanka’s Mannar Island. Geological evidence suggests that this bridge is a former land connection between India and Sri Lanka. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Function, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Spatial Database, SQL Spatial

  • SQL SERVER – Importance of User Without Login – T-SQL Demo Script

    - by pinaldave
Earlier I wrote a blog post about SQL SERVER – Importance of User Without Login, and my friend and SQL Expert Vinod Kumar has written an excellent follow-up blog post about Contained Databases inside SQL Server 2012. Now lots of people have asked me if I can explain the same concept again, so here is a small demonstration of it. Let me show you how a user without login can help. Before we continue on this subject, I strongly recommend that you read my earlier blog post here. In the following demo I am going to demonstrate this sequence: Login using the System Admin account; Create a user without login; Check access; Impersonate the user without login; Check access; Revert impersonation; Give permission to the user without login; Impersonate the user without login; Check access; Revert impersonation; Clean up. USE [AdventureWorks2012] GO -- Step 1 : Login using the SA -- Step 2 : Create Login-less User CREATE USER [testguest] WITHOUT LOGIN WITH DEFAULT_SCHEMA=[dbo] GO -- Step 3 : Checking access to Tables SELECT * FROM sys.tables; -- Step 4 : Changing the execution context EXECUTE AS USER = 'testguest'; GO -- Step 5 : Checking access to Tables SELECT * FROM sys.tables; GO -- Step 6 : Reverting Permissions REVERT; -- Step 7 : Giving more Permissions to testguest user GRANT SELECT ON [dbo].[ErrorLog] TO [testguest]; GRANT SELECT ON [dbo].[DatabaseLog] TO [testguest]; GO -- Step 8 : Changing the execution context EXECUTE AS USER = 'testguest'; GO -- Step 9 : Checking access to Tables SELECT * FROM sys.tables; GO -- Step 10 : Reverting Permissions REVERT; GO -- Step 11 : Clean up DROP USER [testguest] GO At step 9 we are able to notice how a user without login gets access to some of the data/objects to which we gave permission. What am I going to prove with this example? Well, there can be different rights with different accounts. Once the login is authenticated, it makes sense to impersonate a user with only the necessary permissions for further operations. Again, this is a very basic and fundamental example. There are lots more points to be discussed as we go in future posts. Just do not take this blog post as a template and implement everything as it is. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Security, SQL Server, SQL Tips and Tricks, T SQL, Technology

  • SQL SERVER – Step by Step Guide to Beginning Data Quality Services in SQL Server 2012 – Introduction to DQS

    - by pinaldave
Data Quality Services is a very important feature of SQL Server. I have recently started to explore it and I am really learning some good concepts. Here are two very important blog posts which one should go over before continuing with this one: Installing Data Quality Services (DQS) on SQL Server 2012, and Connecting Error to Data Quality Services (DQS) on SQL Server 2012. This article is an introduction to Data Quality Services for beginners. We will be using an Excel file. In the first article we learned to install DQS. In this article we will see how to build a knowledge base and use it to help us identify the quality of the data as well as correct bad-quality data. Here are the two very important steps we will be learning in this tutorial: 1) Building a New Knowledge Base, and 2) Creating a New Data Quality Project. Let us start building the knowledge base. Click on New Knowledge Base. In our project we will be using Excel as the knowledge base. Here is the Excel file which we will be using. There are two columns: one is Colors and the other is Shade. They are independent columns, not related to each other. The point I am trying to show is that Column A contains unique data and Column B contains duplicate records. Clicking on New Knowledge Base will bring up the following screen. Enter the name of the new knowledge base. Clicking NEXT will bring up the following screen, where you can select the Excel file as well as the source columns. I have selected both Colors and Shade as source columns. Creating a domain is very important. Here you can create individual domains, or a composite domain built from Colors and Shade. As this is the first example, I will create individual domains – for Colors I will create the domain Colors, and for Shade I will create the domain Shade. Here is how the screen looks after creating the domains. Clicking NEXT will bring you to the following screen, where you can do the data discovery. Clicking on START will start the processing of the source data provided. The pre-processed data shows various information related to the source data. In our case it shows that the Colors column has unique data, whereas Shade has non-unique data with only two unique rows. In the next screen you can add more rows as well as see the frequency of the data, as the values are listed uniquely. Clicking NEXT will publish the knowledge base which was just created. Now the knowledge base is created. We will take some random data and attempt a DQS implementation over it. I am using another Excel sheet here for simplicity; in reality you can easily use a SQL Server table for the same. Click on New Data Quality Project to start a DQS project. The next screen will ask which knowledge base to use. We will be using our Colors knowledge base, which we just created. In the Colors knowledge base we had two columns – 1) Colors and 2) Shade – and in our case we will be using both of the mappings. The user can select one or multiple column mappings here. Now comes the most important phase of the complete project. Click on Start and it will run the cleansing process and show various results. In our case there were two columns to be processed, and it completed the task with the necessary information. It showed that in the Colors column it did not correct any value by itself, but for a Shade value it has a suggestion.
We can train DQS to correct values, but let us keep that subject for future blog posts. Now click NEXT and keep the domain Colors selected on the left side. It will show that there are two incorrect values which need to be corrected. This is the place where, once corrected, a value will be auto-corrected in the future. I manually corrected the values here and clicked on the Approve radio buttons. As soon as I clicked the Approve buttons, the rows disappeared from this tab and moved to the Corrected tab. If I had rejected them, the rows would have moved to the Invalid tab instead. In this screen you can see how the 2 corrected rows are shown. You can click on the Correct tab and see the previously validated 6 rows which passed the DQS process. Now let us click on the Shade domain on the left side of the screen. This domain shows very interesting details, as the DQS system guessed the correct answer, Dark, with a confidence level of 77%. That is quite a high confidence level, and manual observation also demonstrates that Dark is the correct answer. I clicked on Approve and the row moved to the Corrected tab. On the next screen, DQS shows the summary of all the activities. It also demonstrates how the correction of the quality of the data was performed. The user can export their data to a SQL Server table, CSV file, or Excel. The user also has the option to export either the data with all the associated cleansing info, or the data only. I will select Data only for demonstration purposes. Clicking Export will generate the file. Let us open the generated file. It looks as follows – pretty complete and corrected. Well, we have successfully completed the DQS process. The process is indeed very easy. I suggest you try this out yourself; you will find it very easy to learn. In the future we will go over advanced concepts. Are you using this feature on your production server? If yes, would you please leave a comment with your environment and business need. It will be indeed interesting to see where it is implemented. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, Data Warehousing, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Data Quality Services, DQS
