Search Results

Search found 2462 results on 99 pages for 'dave keck'.

Page 85/99 | < Previous Page | 81 82 83 84 85 86 87 88 89 90 91 92  | Next Page >

  • SQL SERVER – A Cool Trick – Restoring the Default SQL Server Management Studio – SSMS

    - by pinaldave
    “I do not know where my windows went!” “I just closed my Object Explorer and now I cannot find it.” “How do I get my original windows layout back in SQL Server Management Studio?” “How do I get the window which was there on the left side back again?” For the last 2-3 years, every single day I have received more than 5 emails about SSMS and its layout. It is very common for beginners to get confused when they attempt to change SQL Server Management Studio’s window layout. They often change the layout and are not able to get the original layout back. Often people never change the layout their whole life, which leads to an uncomfortable feeling when they go to another person’s computer where the windows are placed differently. Today’s blog post is dedicated to all the beginners in SQL Server. It is extremely simple to reset the SSMS layout to the default layout. The default layout involves 2 major things: 1) Object Explorer on the left side and 2) Query Windows on the right side (80% of the screen estate). Personally, I am so used to this that if there is any change to it, I do not enjoy working in that environment. Well, the solution to reset the SSMS layout is very simple, and one can do it in a split second. To restore the default configuration, on the Window menu, click Reset Window Layout. Have you ever used this feature? Do you feel uncomfortable when the SSMS layout is not in its default state? How do you address this situation? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQLAuthority News – Whitepaper Download – Using Star Join and Few-Outer-Row Optimizations to Improve Data Warehousing Queries

    - by pinaldave
    The size of databases is growing every day. Many organizations nowadays have more than a TB of data in their systems, and performance is always part of the issue. Microsoft is paying real attention to this and is focusing on improving performance for Data Warehousing. Microsoft has recently released a whitepaper on the subject of Data Warehousing performance tuning. Here is the abstract about the whitepaper from the official site: In this white paper we discuss two of the new features introduced in SQL Server 2008, Star Join and Few-Outer-Row optimizations. These two features are in SQL Server 2008 R2 as well. We test the performance of SQL Server 2008 on a set of complex data warehouse queries designed to highlight the effect of these two features and observed a significant performance gain over SQL Server 2005 (without these two features). The results observed also apply to SQL Server 2008 R2. On average, about 75 percent of the query execution time has been reduced, compared to SQL Server 2005. We also include data that shows a reduction in the number of rows processed and improved balance in parallel queries, both of which highlight the important role the Star Join and Few Outer-Row features played. I encourage all of those interested in Data Warehousing to read it and see if they can learn the tricks. Using Star Join and Few-Outer-Row Optimizations to Improve Data Warehousing Queries Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology
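
    For context, here is a minimal sketch of the kind of fact-to-dimension query the Star Join optimization targets. The FactSales, DimDate and DimProduct tables below are hypothetical placeholders, not taken from the whitepaper:

        -- Typical star join: one large fact table joined to small dimension tables
        SELECT d.CalendarYear, p.Category, SUM(f.SalesAmount) AS TotalSales
        FROM FactSales f
        INNER JOIN DimDate d ON f.DateKey = d.DateKey
        INNER JOIN DimProduct p ON f.ProductKey = p.ProductKey
        WHERE d.CalendarYear = 2008
        GROUP BY d.CalendarYear, p.Category;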

    Read the article

  • WPF and Composite Application Library – Missing The Point

    - by David Totzke
    I have a headache and it’s not even 9AM yet.  Well, ok, it’s nearly ten here now in GMT –5 but it’s before nine somewhere still. Sometimes people will miss the point of something so utterly and completely that one is left wondering how such a person can even dress themselves. Writing an application using WPF and the Composite Application Library (Prism) means that one must learn the various programming idioms common to these frameworks.  The Windows Forms event driven model simply will not suffice.  You need to come to grips with the idea of a very loosely coupled application.  Concepts that must be absorbed and internalized include Data Binding, Control and Data Templates, Commands, Dependency Injection, and Inversion of Control, as well as the Supervising Controller, Presentation Model and Model-View-View-Model patterns. It is as simple as that.  Not to embrace these concepts is to invite pain.  It is to invite noodles; and not the holy kind. Someone actually said to me that “just because it’s not WPF, doesn’t mean it’s wrong.”  And he’s right.  Unless, of course, you are writing a WPF application and especially if you are using the Composite Application Library. In simple terms then; YOU’RE DOING IT WRONG!   Dave Just because I can…

    Read the article

  • Oracle Solaris 11 Summit Day at the LISA Conference 2011-Register Today!

    - by Terri Wischmann
    We have successfully launched and shipped Oracle Solaris 11!  Come to the LISA 2011 Conference in Boston, MA to learn about all the latest and greatest Oracle Solaris 11 technologies. On Tuesday, 12/6/11 we are hosting our 2nd annual Oracle Solaris 11 Summit Day! It's a free, full day of sessions covering the latest OS technologies, and a chance for you to meet key members of the Oracle Solaris engineering team as they conduct a deep-dive exploration of core Solaris features. See the agenda below - Register Today!!
    Time | Topic | Presenter
    9:00 - 9:30 am | Oracle Solaris 11 Strategy | Markus Flierl
    9:30 - 11:00 am | Next Generation OS Lifecycle Management with Oracle Solaris 11 | Dave Miner / Bart Smaalders
    11:00 am - 12:00 pm | Data Management with ZFS | Mark Maybee
    12:00 - 1:00 pm | LUNCH | All
    1:00 - 2:30 pm | Oracle Solaris Virtualization and Oracle Solaris Networking | Mike Gerdts / Sebastian Roy
    2:30 - 3:15 pm | Security in your Oracle Solaris Cloud Environment | Glenn Faden
    3:15 - 3:30 pm | BREAK | All
    3:30 - 4:15 pm | Oracle Solaris - The Best Platform to run your Oracle Applications | David Brean
    4:15 - 5:00 pm | Oracle Solaris Cluster - HA in the Cloud | Gia-Khahn Nguyen
    5:00 - 6:30 pm | Reception | All

    Read the article

  • SQLAuthority News – Download – Microsoft SQL Server Compact 4.0

    - by pinaldave
    Microsoft SQL Server Compact 4.0 is a free, embedded database that software developers can use for building ASP.NET websites and Windows desktop applications. SQL Server Compact 4.0 has a small footprint and supports private deployment of its binaries within the application folder, easy application development in Visual Studio and WebMatrix, and seamless migration of schema and data to SQL Server. You can download the very small SQL Server CE installation file from here. Books Online is the primary documentation for SQL Server Compact 4.0. Books Online includes the following types of information:
    Setup and upgrade instructions.
    Information about new features and backward compatibility.
    Conceptual descriptions of the technologies and features in SQL Server Compact 4.0.
    Procedural topics describing how to use the various features in SQL Server Compact 4.0.
    Tutorials that guide you through common tasks.
    Reference documentation for the graphical tools, programming languages, and application programming interfaces (APIs) that are supported by SQL Server Compact 4.0.
    You can download the SQL Server CE Books Online here. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Load Generator – Free Tool From CodePlex

    - by pinaldave
    One of the most common questions I receive is whether there is any tool available to generate load on SQL Server. Absolutely, there is a fabulous free tool available on CodePlex to generate load on SQL Server. This tool was released in 2008, but it is still extremely relevant for generating load on SQL Server and works fabulously. CodePlex is a project initiated by Microsoft for hosting open source software. The best part of this SQL Server Load Generator is that users can run multiple simultaneous queries against SQL Server using different login accounts and different application names. The interface of the tool is extremely easy to use and very intuitive as well. One of the things which I felt needed improvement was the default configuration, as every single time I added a query the default settings showed up and I had to change them manually. However, when I went to Menu >> Tools >> Options I was really happy, as it has options to change every single default. Here one can set a default username, password, and database name, as well as various configuration-related settings. Additionally, application logging is also possible through the options. A couple of other important points I noticed were the button to reset counters, as well as the status bar containing useful information on Total Threads, Completed Queries and Failed Queries. I use this frequently for my load testing. What tool do you use for SQL Server load generation? Download SQL Load Generator Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology
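
    If you only need some quick ad-hoc load from a query window instead of a dedicated tool, a crude sketch like the following loop can also serve. The query inside the loop is a placeholder; substitute something representative of your own workload:

        -- Sketch: run a sample query in a loop to generate some read load
        DECLARE @i INT = 0;
        WHILE @i < 1000
        BEGIN
            SELECT COUNT(*) FROM sys.objects; -- placeholder workload query
            SET @i = @i + 1;
        END
        GO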

    Read the article

  • SQLAuthority News – NuoDB MeetUp on Nov 8, 2012 in Seattle

    - by pinaldave
    I am pleased to let you know that I will be attending this year’s SQLPASS conference in Seattle again and look forward to meeting all of you while at the conference. In the next two weeks, I will provide you with a full agenda of where I will be during PASS. During the week, I will also be stopping by the NuoDB MeetUp, which will be held close by at the Edge Grill at 1522 6th Ave in Seattle on Thursday, November 8th. This will be an excellent opportunity for you to learn more about their brand new distributed, peer-to-peer database solution, which I believe will revolutionize SQL cloud database technology in the 21st century. I have been personally following NuoDB for months now and am very excited about the architecture and capabilities of this innovative product. Wiqar Chaudry, NuoDB technology evangelist, will give a presentation and demonstration of their elastically scalable SQL cloud database at this Meetup event. Prior to joining NuoDB, Wiqar was a Senior Architect at Epsilon, the data intelligence company with big brand-name customers in insurance, consumer goods, etc. He’s also going to discuss how NuoDB compares with Azure, the hometown favorite, and why cloud-based SQL deployment will pave the way for the future. I will be at the NuoDB MeetUp to briefly talk about my own experiences with NuoDB and will be giving away some signed copies of my latest book, as well as some interesting goodies. So please join me and the NuoDB team at their Meetup event. RSVP here. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: NuoDB

    Read the article

  • SQL SERVER – Validating Unique Columnname Across Whole Database

    - by pinaldave
    I sometimes come across very strange requirements, and often I do not receive a proper explanation for them. Here is one of those examples. Asker: “Our business requirement is that when we add a new column, we want it to be unique across the current database.” Pinal: “Why do you have such a requirement?” Asker: “Do you know the solution?” Pinal: “Sure, I can come up with the answer, but it will help me come up with an optimal answer if I know the business need.” Asker: “Thanks – what will be the answer in that case?” Pinal: “Honestly, I am just curious about the reason why you need your column name to be unique across the database.” (Silence) Pinal: “Alright – here is the answer – I guess you do not want to tell me the reason.”
    Option 1: Check if Column Exists in Current Database
        IF EXISTS (SELECT * FROM sys.columns WHERE Name = N'NameofColumn')
        BEGIN
            SELECT 'Column Exists'
            -- add other logic
        END
        ELSE
        BEGIN
            SELECT 'Column Does NOT Exist'
            -- add other logic
        END
    Option 2: Check if Column Exists in Current Database in Specific Table
        IF EXISTS (SELECT * FROM sys.columns WHERE Name = N'NameofColumn' AND OBJECT_ID = OBJECT_ID(N'tableName'))
        BEGIN
            SELECT 'Column Exists'
            -- add other logic
        END
        ELSE
        BEGIN
            SELECT 'Column Does NOT Exist'
            -- add other logic
        END
    I guess the user did not want to share the reason why he had the unique requirement of having column names unique across the database. Here is my question back to you – have you ever faced a similar situation where you needed unique column names across a database? If not, can you guess what could be the reason for this kind of requirement? Additional Reference: SQL SERVER – Query to Find Column From All Tables of Database Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL System Table, SQL Tips and Tricks, T SQL, Technology
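
    A third option, if you prefer the ANSI catalog views, is a minimal sketch against INFORMATION_SCHEMA using the same hypothetical column name:

        -- Option 3: Check if Column Exists via INFORMATION_SCHEMA (current database)
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
                   WHERE COLUMN_NAME = N'NameofColumn')
            SELECT 'Column Exists'
        ELSE
            SELECT 'Column Does NOT Exist'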

    Read the article

  • SQL SERVER – ERROR – FIX – Msg 3702, Level 16, State 3, Line 1 Cannot drop database “MyDBName” because it is currently in use

    - by pinaldave
    I often go to various organizations to deliver seminars and presentations. During presentations I often create and drop various databases for demonstration purposes. Recently, in one of the presentations, when I tried to remove my recently created database, I got the following error.
    Msg 3702, Level 16, State 3, Line 1
    Cannot drop database “MyDBName” because it is currently in use.
    The reason was very simple: my database was in use by another session or window. One option was to go and find the open session, close it right away, and then drop the database. As I was in a rush, I quickly wrote the following code and was able to drop the database successfully.
        USE MASTER
        GO
        ALTER DATABASE MyDBName
        SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        DROP DATABASE MyDBName
        GO
    Please note that I am doing all this in my demonstrations; do not run the above code on production without proper approvals and supervision. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Error Messages, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
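
    If you prefer to locate the offending sessions first rather than rolling them back wholesale, a sketch along these lines can help. Note that the database_id column of sys.dm_exec_sessions is available in SQL Server 2012 and later, and the list should be reviewed before killing anything:

        -- Sketch: list sessions currently connected to the database
        SELECT session_id, login_name, host_name, program_name
        FROM sys.dm_exec_sessions
        WHERE database_id = DB_ID(N'MyDBName');
        -- then, after review: KILL <session_id>;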

    Read the article

  • SQL SERVER – Fix: Error: 8117: Operand data type bit is invalid for sum operator

    - by pinaldave
    Here is a very interesting error I received from a reader, along with a very interesting question. He attempted to use a BIT field in the SUM aggregation function and got the following error. He went ahead with various other datatypes (i.e. INT, TINYINT, etc.) and was able to do the SUM, but with BIT he faced the problem.
    Error Received:
    Msg 8117, Level 16, State 1, Line 1
    Operand data type bit is invalid for sum operator.
    Reproduction of the error: Set up the environment
        USE tempdb
        GO
        -- Preparing Sample Data
        CREATE TABLE TestTable (ID INT, Flag BIT)
        GO
        INSERT INTO TestTable (ID, Flag)
        SELECT 1, 0
        UNION ALL SELECT 2, 1
        UNION ALL SELECT 3, 0
        UNION ALL SELECT 4, 1
        GO
        SELECT * FROM TestTable
        GO
    The following script will work fine:
        -- This will work fine
        SELECT SUM(ID) FROM TestTable
        GO
    However, the following generates the error:
        -- This will generate error
        SELECT SUM(Flag) FROM TestTable
        GO
    The workaround is to convert or cast the BIT to INT:
        -- Workaround of error
        SELECT SUM(CONVERT(INT, Flag)) FROM TestTable
        GO
    Clean up the setup
        -- Clean up
        DROP TABLE TestTable
        GO
    Workaround: As mentioned in the above script, the workaround is to convert the BIT datatype to another friendly datatype like INT, TINYINT, etc. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
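
    Another variation of the workaround, in case you would rather count than convert, is to wrap the flag in a CASE expression; a small sketch against the same sample table:

        -- Alternative workaround: count the rows where the BIT flag is set
        SELECT SUM(CASE WHEN Flag = 1 THEN 1 ELSE 0 END) AS FlagTotal
        FROM TestTable
        GO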

    Read the article

  • SQLAuthority News – 6th Anniversary and 50 Million Views and Over 2300 Blog Posts – Thank You Thank You

    - by pinaldave
    Six years ago, I started this SQLAuthority.com blog. There are so many things I want to say today – it is very, very emotional. Instead of writing at length, I am including a few images and cartoons. Last month we also reached 50 Million Total Views on this blog. Here is the screen captured at that time. Click Image to Enlarge In 6 years there are a total of 2192 days (including 2 leap year days) and my total blog post count is 2300. That means I have been blogging at a rate of more than 1 blog post every day. Here is a quick glance at all the numbers. Here you can find the list of all the 2300 blog posts. I am very glad to see that many of my friends are in the USA, India, the United Kingdom, Canada and Australia, in that order. You can see the geographic distribution of the support I receive on the blog from worldwide. On this day I would like to call out 2 individuals who contribute equally or more to my success. When I started this blog 6 years ago, I was walking alone. After 2 years my wife Nupur joined my journey, and 3 years later my daughter Shaivi joined the journey. Here is an example of the common conversation among us almost every day - Shaivi: Daddy, play catch-catch. Nupur: Shaivi, daddy will play with you once he finishes tomorrow’s blog. Shaivi: Daddy, Finish Blog. Okey. I play catch-catch (alone). SQLAuthority Family Well, thank you very much! We all love you! Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: About Me, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • SQLAuthority News – Download Whitepaper – Choosing a Tabular or Multidimensional Modeling Experience in SQL Server 2012 Analysis Services

    - by pinaldave
    Data modeling is the most important task for any BI professional. As a matter of fact, the biggest challenge is organizing disparate data into an analytic model that effectively and efficiently supports reporting and analysis. SQL Server 2012 introduces the BI Semantic Model (BISM), a single model that can support a broad range of reporting and analysis while blending two Analysis Services modeling experiences behind the scenes. Multidimensional modeling – enables BI professionals to create sophisticated multidimensional cubes using traditional online analytical processing (OLAP). Tabular modeling – provides self-service data modeling capabilities to business and data analysts. As data modeling evolves and business needs grow, new technologies and tools are emerging to help end users make the necessary adjustments to their reporting and analysis needs. This white paper will provide practical guidance to help you decide which SQL Server 2012 Analysis Services modeling experience to use – tabular or multidimensional. Do let me know your opinion in a comment. In simple words – I would like to know when you would use Tabular modeling and when Multidimensional modeling. Download Choosing a Tabular or Multidimensional Modeling Experience in SQL Server 2012 Analysis Services Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, T SQL, Technology

    Read the article

  • SQL SERVER – Follow up – Usage of $rowguid and $IDENTITY

    - by pinaldave
    The most common question I receive is why I blog. The answer is even simpler – I blog because I get extremely constructive comments and conversation from people like DHall and Kumar Harsh. Earlier this week, I shared a conversation between Madhivanan and myself regarding how to find out whether a table uses ROWGUID or not. I encourage all of you to read the conversation here: SQL SERVER – Identifying Column Data Type of uniqueidentifier without Querying System Tables. In simple words, the conversation between Madhivanan and myself brought out a simple query which returns the values of the UNIQUEIDENTIFIER without knowing the name of the column. David Hall wrote a few excellent comments as a follow up, and every SQL enthusiast must read them first, second and third. David always brings positive energy; he first of all shows the limitations of my solution here and here, which he follows up with his own solution here. As he said, his solution is also not perfect, but it indeed leaves learning bites for all of us – worth reading if you are interested in unorthodox solutions. Kumar Harsh suggested that one can also find the Identity Column used in the table in a very similar way using $IDENTITY. Here is how one can do the same.
        DECLARE @t TABLE (
            GuidCol UNIQUEIDENTIFIER DEFAULT newsequentialid() ROWGUIDCOL,
            IDENTITYCL INT IDENTITY(1,1),
            data VARCHAR(60)
        )
        INSERT INTO @t (data) SELECT 'test'
        INSERT INTO @t (data) SELECT 'test1'
        SELECT $rowguid, $IDENTITY FROM @t
    There are alternate ways to find an identity column in the database as well. The following query will give a list of all identity column names with their corresponding table names.
        SELECT SCHEMA_NAME(so.schema_id) SchemaName,
               so.name TableName,
               sc.name ColumnName
        FROM sys.objects so
        INNER JOIN sys.columns sc ON so.OBJECT_ID = sc.OBJECT_ID AND sc.is_identity = 1
    Let me know if you use any alternate method related to identity; I would like to know what you do and how you do it when you have to deal with an Identity Column. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
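
    If identity columns are all you are after, a shorter sketch against the dedicated catalog view (available since SQL Server 2005) gives the same list:

        -- Alternative: query the dedicated catalog view for identity columns
        SELECT OBJECT_SCHEMA_NAME(object_id) AS SchemaName,
               OBJECT_NAME(object_id) AS TableName,
               name AS ColumnName
        FROM sys.identity_columns;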

    Read the article

  • SQLAuthority News – BI Quiz Question – How to Optimize Cube? – Hints

    - by pinaldave
    I earlier wrote about the SQL BI Quiz over here. The details of the quiz are as follows: Working with huge data is very common when it comes to Data Warehousing. It is necessary to create Cubes on the data to make it meaningful and consumable. There are cases when retrieving the data from a cube takes a lot of time. Let us assume that your cube is returning data very quickly, and suddenly one day it starts returning the data very slowly. What are the three things you will do to diagnose this? After diagnosing it, what will you do to resolve the performance issue? Participate in my question over here. Here are a couple of hints about what I am looking for in an answer: How do you get to the root of the slow performance? Is hardware causing the problem, or something else? Is the slowness due to how the cube is built and its granularity? Do the underlying tables require maintenance? Is there a chance to refactor the process? Are there any tools which can help diagnose the slowness of the cube? It is not necessary to answer all the questions – but they are something to start with. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQLAuthority News – Microsoft Whitepaper – AlwaysOn Solution Guide: Offloading Read-Only Workloads to Secondary Replicas

    - by pinaldave
    SQL Server 2012 has many interesting features, but the most talked-about feature is AlwaysOn. Performance tuning is always a hot topic. I see a lot of need for it and a lot of business around it. However, many times when people talk about performance tuning they think of it as either query tuning, performance tuning, or server tuning. All are valid points, but a performance tuning expert usually understands the business workload and business logic before making suggestions. For example, if a performance tuning expert analyzes the workload and realizes that there are plenty of reports as well as read-only queries on the server, they can certainly consider alternate options. If read-only data is not required in real time, or the application can accept data which is delayed a bit, it makes sense to divide the workload. A secondary replica of the original data which can serve all the read-only queries and reports is a good idea in most cases where there is plenty of workload which is not dependent on real-time data. SQL Server 2012 has introduced the AlwaysOn feature, which fits this scenario very well and provides a solution for read-only workloads. Microsoft has recently announced a white paper based on exactly this subject. I recommend it to every SQL enthusiast who is going to implement a solution to offload read-only workloads to secondary replicas. Download white paper AlwaysOn Solution Guide: Offloading Read-Only Workloads to Secondary Replicas Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: AlwaysOn
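
    For context, offloading in AlwaysOn typically relies on readable secondary replicas plus read-intent connections. A minimal, hypothetical sketch follows; the availability group and replica names are placeholders, not taken from the whitepaper:

        -- Sketch: allow read-intent connections on a secondary replica
        ALTER AVAILABILITY GROUP MyAG
        MODIFY REPLICA ON N'SQLNODE2'
        WITH (SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));
        -- Clients then add ApplicationIntent=ReadOnly to their connection strings.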

    Read the article

  • SQLAuthority News – #SQLPASS 2012 Schedule – Where can You Find Me

    - by pinaldave
    Yesterday I wrote about my memory lane with SQLPASS. It has been a fantastic experience, and I am very confident that this year the same excellent experience is going to be repeated. Before I start for #SQLPASS every year, I plan where I want to be and what I will be doing. As I travel from India to attend this event (22+ hours of flying time and door-to-door travel time of around 36 hours), it is very crucial that I plan things in advance. This year, here is my quick note on where I will be during the SQLPASS event. If you can stop by, I would like to meet you, shake your hand and archive the memory as a photograph.
    Tuesday, November 6, 2012
    6:30pm-8:00pm - PASS Summit 2012 Welcome Reception
    Wednesday, November 7, 2012
    12pm-1pm - Book Signing at Exhibit Hall Joes Pros booth #117 (FREE BOOK)
    5:30pm-6:30pm - Idera Reception at Fox Sports Grill
    7pm-8pm - Embarcadero Booth Book Signing (FREE BOOK)
    8pm onwards - Exhibitor Reception
    Thursday, November 8, 2012
    12pm-1pm - Embarcadero Booth Book Signing (FREE BOOK)
    7pm-10pm - Community Appreciation Party
    Friday, November 9, 2012
    12pm-1pm - Joes 2 Pros Book Signing at Exhibit Hall Joes Pros booth #117
    11:30am-1pm - Birds of a Feather Luncheon
    Rest of the Time! Exhibition Hall Joes 2 Pros Booth #117. Stop by for the goodies! Lots of people have already sent me email asking if we can meet for a cup of coffee to discuss SQL. Absolutely! I like cafe mocha with skim milk and whipped cream and I do not get tired of it. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Grouping by Multiple Columns to Single Column as A String

    - by pinaldave
    One of the most common questions I receive in email is how to combine multiple rows of column data into comma separated values in a single row, grouped by another column. I have previously blogged about it in the following two blog posts; however, neither addresses the following exact problem. Comma Separated Values (CSV) from Table Column Comma Separated Values (CSV) from Table Column – Part 2 The question comes in many different formats, but in the following image I am demonstrating the same question in simple words. This is the most popular question on my Facebook page as well. (Example) Here is the sample script to build the sample dataset.
        CREATE TABLE TestTable (ID INT, Col VARCHAR(4))
        GO
        INSERT INTO TestTable (ID, Col)
        SELECT 1, 'A'
        UNION ALL SELECT 1, 'B'
        UNION ALL SELECT 1, 'C'
        UNION ALL SELECT 2, 'A'
        UNION ALL SELECT 2, 'B'
        UNION ALL SELECT 2, 'C'
        UNION ALL SELECT 2, 'D'
        UNION ALL SELECT 2, 'E'
        GO
        SELECT * FROM TestTable
        GO
    Here is the solution which answers the above question.
        -- Get CSV values
        SELECT t.ID,
               STUFF((SELECT ',' + s.Col
                      FROM TestTable s
                      WHERE s.ID = t.ID
                      FOR XML PATH('')), 1, 1, '') AS CSV
        FROM TestTable AS t
        GROUP BY t.ID
        GO
    I hope this is an easy solution. I am going to point to this blog post in the future for all similar questions. Final Clean Up Act
        -- Clean up
        DROP TABLE TestTable
        GO
    Here is the question back to you - Is there any better way to write the above script? Please leave a comment and I will write a separate blog post with due credit. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SQL XML
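
    One better way, for readers on SQL Server 2017 or later (so not the versions this post was written against), is the built-in STRING_AGG function; a sketch using the same sample table:

        -- STRING_AGG alternative (SQL Server 2017 and later)
        SELECT ID, STRING_AGG(Col, ',') AS CSV
        FROM TestTable
        GROUP BY ID
        GO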

    Read the article

  • SQL SERVER – Storing Variable Values in Temporary Array or Temporary List

    - by pinaldave
    SQL Server does not support arrays or a dynamic-length storage mechanism like a list. Of course, there are some clever workarounds and a few extraordinary solutions, but not everybody can come up with such a solution. Additionally, sometimes the requirements are so simple that extraordinary coding is not required. Here is the simple case. Let us say here are the values: a, 10, 20, c, 30, d. Now the requirement is to store them in an array or list. It is very easy to do the same in C# or C. However, there is no quick way to do the same in SQL Server. Every single time I get such a requirement, I create a table variable and store the values in the table variable. Here is the example: For SQL Server 2012:
        DECLARE @ListofIDs TABLE(IDs VARCHAR(100));
        INSERT INTO @ListofIDs VALUES('a'),('10'),('20'),('c'),('30'),('d');
        SELECT IDs FROM @ListofIDs;
        GO
    When executed, the above script will give the following resultset. The above script will work in SQL Server 2012 only; for SQL Server 2008 and earlier versions, run the following code.
        DECLARE @ListofIDs TABLE(IDs VARCHAR(100), ID INT IDENTITY(1,1));
        INSERT INTO @ListofIDs
        SELECT 'a'
        UNION ALL SELECT '10'
        UNION ALL SELECT '20'
        UNION ALL SELECT 'c'
        UNION ALL SELECT '30'
        UNION ALL SELECT 'd';
        SELECT IDs FROM @ListofIDs;
        GO
    Now in this case, I have to convert the numbers to varchars because I have to store mixed datatypes in a single column. Additionally, this quick solution does not give any of the features of arrays (like inserting values in between, or accessing values using an array index). Well, do you ever have to store multiple temporary values in SQL Server? If the count of the values is dynamic and the datatype is not specified early, how will you go about storing values which can be used later in the programming? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
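
    As a small follow-up on the missing "array index" behavior: since the SQL Server 2008 variant above already carries an IDENTITY column, positional access can be approximated with an ordinary WHERE clause. This is only a sketch and assumes it runs in the same batch as the declaration above:

        -- Approximate array-index access via the IDENTITY column
        SELECT IDs FROM @ListofIDs WHERE ID = 3; -- the third value inserted, i.e. '20'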

    Read the article

  • SQL SERVER – A Puzzle Part 2 – Fun with SEQUENCE in SQL Server 2012 – Guess the Next Value

    - by pinaldave
    Before continuing with this blog post, please read the first part of the SEQUENCE puzzle here: A Puzzle – Fun with SEQUENCE in SQL Server 2012 – Guess the Next Value, where we played a simple guessing game about predicting the next value. The answer to that puzzle is shared on the blog post as a comment. Now here is the next puzzle, based on yesterday’s puzzle. First execute the script which I have written here. The only difference from yesterday’s script is that I have removed MINVALUE 1 from the syntax. Now guess what the next value will be, as requested in the query.
        USE TempDB
        GO
        -- Create sequence
        CREATE SEQUENCE dbo.SequenceID AS BIGINT
        START WITH 3
        INCREMENT BY 1
        MAXVALUE 5
        CYCLE
        NO CACHE;
        GO
        -- Following will return 3
        SELECT next value FOR dbo.SequenceID;
        -- Following will return 4
        SELECT next value FOR dbo.SequenceID;
        -- Following will return 5
        SELECT next value FOR dbo.SequenceID;
        -- Following will return which number
        SELECT next value FOR dbo.SequenceID;
        -- Clean up
        DROP SEQUENCE dbo.SequenceID;
        GO
    The above script gave me the following resultset. 3 is the starting value and 5 is the maximum value. Once the sequence reaches the maximum value, what happens, and why? I (kindly) suggest you try to answer this question without running the code in SQL Server 2012. I am very confident that irrespective of the SQL Server version you are running, you will have great learning. I will follow up with the answer in the comments below. Recently my friend Vinod Kumar wrote an excellent blog post on SQL Server 2012: Using SEQUENCE; you can head over there to learn about sequences in detail. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • OpenJDK 6 B24 Available

    - by user9158633
    On November 16, 2011 the source bundle for OpenJDK 6 b24 was published at http://download.java.net/openjdk/jdk6/. The main changes in b24 are the latest round of security updates (e.g. the security changes in the jdk repo) and a few other fixes. For more information see the detailed list of all the changes in OpenJDK 6 B24. Test Results: All the jdk regression tests run with make test passed.
        cd jdk6
        make
        make test
    Per Kelly's B23 Release blog: the new process is that all the jdk regression tests run with make test should just pass. Over time we will fix the tests that have been excluded, possibly add more tests, and exclude tests that fail to demonstrate stability (with a bug filed against the test). For the current list of excluded tests see the jdk6/jdk/test/ProblemList.txt file: ProblemList.html in B24 | Latest ProblemList.txt (in the tip revision). Special thanks to Kelly O'Hair for his direction and Dave Katleman for his Release Engineering work.

    Read the article

  • Grow Your Oracle Exadata and Manageability Business: Engage With Us to Find Out How

    - by swalker
    Don't miss out on the first EMEA Partner Community Cast! If you are a business decision maker, project leader, technical leader or business development manager you will gain incredible value from these events, and we believe that this introduction to Oracle Partner Communities will bring you a wealth of new opportunities. Join Us on December 7th, 10:00 GMT (11:00 CET) for the first broadcast the Exadata and Manageability solution areas. In just 30 minutes, you will find out more about Oracle's Exadata, Manageability and Oracle Enterprise Manager 12c solutions, and the value they can generate for you and your customers. See the full agenda here. Hosted by Paul Thompson, Senior Director, Alliances and Solutions Partner Programs, Oracle EMEA and Javier Puerta, Director, Core Technology Partner Programs, Oracle EMEA, our special guests include: Steve McNickle, Vice President Europe, cVidya Dave Sanderson, Associate Partner, Technology Reply Patrick Rood, Lead for Indirect Manageability Business, Oracle EMEA Register Now Partner Community Casts are a new series of interactive broadcasts designed to help you truly engage with Oracle on an individual level, build expertise around your specialist solution area and make valuable new contacts in Oracle and other Oracle partners. Community Casts can be viewed live from our online platform. Audience members have the opportunity to submit questions during the show via chat or social media outlets, many of which are answered on-air. Learn more about EMEA Partner Community Casts Register Now to learn how participation in the Exadata and Manageability Partner Communities will help your business flourish!

    Read the article

  • Efficient mapping layout in 2D side-scroller, and collisions between character and the world

    - by Jack
    I haven't touched Visual Studio for a couple of months now, but I was playing a game from the '90s today and had an epiphany: I was looking for something I didn't need, and I wasn't using what I knew correctly. One of those realizations was collision, so let me tell you a bit about the project I was working on. The project's graphics look like Mario or Dangerous Dave, etc.; you get the idea - old-school pixels. So anyway, I remember trying to think of something other than AABB for the character's shape, but I couldn't think of anything. Perhaps I could get a suggestion for this? Another thing is the world - I don't want it to be just a linear world, I want mountains, etc. My idea is to use triangles, but I have no idea yet what to do if I want just part of a cube, say 3/4 or 2/4 or whatever. Hard-coding such things seems inefficient. P.S. I am not looking for the precision level offered by Box2D. Actually I remember trying to implement it at first, but I failed as my understanding of C++ wasn't advanced enough, as will be mentioned below. P.P.S. I am programming in C++, and I haven't done it for a couple of months now. I have no means of testing it either, as my PC is broken down, and this one can barely run games from the late '90s, not to speak of a compiler or a program with inefficient resource management... I am also not an expert (obviously); I don't even know if I can consider myself an average programmer. In short, I am simply curious about my thoughts and my past experience from programming the game. I may come back to it when my PC is fixed; I'm already keeping a note about these things.

    Read the article

  • SQL SERVER – Get Schema Name from Object ID using OBJECT_SCHEMA_NAME

    - by pinaldave
    Sometimes a simple solution has an even simpler alternative, but we often do not practice it because we do not see the value in it or find it useful. Well, today’s blog post is also about something which I have not seen practiced much in code. We are so comfortable with the alternative usage that we do not feel like switching how we query the data. I was going over the forums and noticed that in one place a user had used the following code to get the Schema Name from an Object ID.
        USE AdventureWorks2012
        GO
        SELECT s.name AS SchemaName,
               t.name AS TableName,
               s.schema_id,
               t.OBJECT_ID
        FROM sys.Tables t
        INNER JOIN sys.schemas s ON s.schema_id = t.schema_id
        WHERE t.name = OBJECT_NAME(46623209)
        GO
    Before I continue, let me say I do not see anything wrong with this script. It is just fine and one of the ways to get the SchemaName from an Object ID. However, I have been using the function OBJECT_SCHEMA_NAME to get the schema name. If I had to write the same code from the beginning, I would have written it as follows.
        SELECT OBJECT_SCHEMA_NAME(46623209) AS SchemaName,
               t.name AS TableName,
               t.schema_id,
               t.OBJECT_ID
        FROM sys.tables t
        WHERE t.name = OBJECT_NAME(46623209)
        GO
    Now, both of the above scripts give you the exact same result. If you remove the WHERE condition, they will give you the information for all the tables of the database. Now the question is which one is better – honestly, it is not about one being better than the other. Use the one which you prefer to use. I prefer the second one as it requires less typing. Let me ask you the same question – which method to get the schema name do you use, and why? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL System Table, SQL Tips and Tricks, T SQL, Technology
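
    And if the object id is all you have, you do not even need sys.tables; a minimal sketch:

        -- Schema and table name straight from the object id
        SELECT OBJECT_SCHEMA_NAME(46623209) AS SchemaName,
               OBJECT_NAME(46623209) AS TableName;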

    Read the article

  • Podcast Show Notes: Conversations in the Cloud

    - by Bob Rhubart
    The centerpiece of every OTN Architect Day event is a panel discussion that gathers all of the session speakers together to respond to questions from the audience. I generally try to record these discussions, usually by sticking my iPad on top of one of the PA speakers, with mixed results. Fortunately, the A/V tech at the venue for the Los Angeles event, held on October 25, 2012, had the necessary gear to get a good-quality recording of the panel discussion. So starting this week the OTN ArchBeat Podcast will feature a short series of highlights from those discussions. Listen to Part 1: Dude, What's My Role? Members of the Architect Day panel respond to an audience question about what happens to traditional IT roles in a cloud environment. Listen to Part 2: Migrating Mission-Critical Applications to the Cloud (Nov 21) The panel offers advice and examples in response to an audience question about dealing with mission-critical applications. Listen to Part 3: All Clouds Are Not Equal (Nov 28) The panel responds to a challenging question about cloud strategy with a discussion of enterprise-grade cloud services. Listen to Part 4: Cloud Security and Auditing (Dec 5) The last segment in the series is a short discussion in response to an audience question about auditing and security in the cloud. The Panelists (Listed alphabetically) Ashok Aletty, Senior Director of Product Management, Oracle Cloud Application Foundation Dr. James Baty, Vice President, Oracle Global Enterprise Architecture Program Dave Chappelle, Enterprise Architect, Oracle Global Enterprise Architecture Program Jeff Davies, Senior Principal Product Manager, Oracle Corporation Anbu Krishnaswamy, Enterprise Architect, Oracle Global Enterprise Architecture Program Dhanraj Pondicherry, Sales Consulting Manager, Oracle Exadata Perren Walker, Senior Principal Product Manager, Oracle Enterprise Manager Coming Soon Upcoming programs will focus on DevOps and Continuous Integration, and on Oracle's Java Cloud and Developer Cloud services. Stay tuned: RSS

    Read the article

  • SQL SERVER – Excel Losing Decimal Values When Value Pasted from SSMS ResultSet

    - by pinaldave
    No! It is not a SQL Server issue or an SSMS issue. It is simply how things work, and there is a simple trick to resolve it. It is very common that when users are copying a resultset to Excel, the floating point or decimal digits are lost. The solution is very simple and requires only a small adjustment in Excel. By default Excel is very smart, and when it detects that the value being pasted is numeric, it changes the column format to accommodate it. Since trailing zeros after the decimal point have no numeric value, Excel automatically hides them. To prevent this from happening, the user has to convert the columns to Text format so the formatting is preserved. Here is how you can do it. Select the corner between A and 1 and right-click on it. It will select the complete spreadsheet. If you want to change the format of an individual column, you can select that column the same way. In the menu, click on Format Cells… It will bring up the following menu. Here the selected columns will default to General; change that to Text. It will change the format of all the cells to Text. Now once again paste the values from SSMS into Excel. This time it will preserve the decimal values from SSMS. Solved! Do you know any other trick to preserve the decimal values? Please leave a comment. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL, Technology Tagged: Excel
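
    A hedged alternative is to handle it on the SQL side by returning the value as text, so Excel has nothing to reformat. The table and column below are hypothetical, and this sketch assumes a decimal column rather than float:

        -- Sketch: convert the decimal value to text before copying into Excel
        SELECT CONVERT(VARCHAR(32), Amount) AS AmountText
        FROM dbo.MyTable;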

    Read the article

< Previous Page | 81 82 83 84 85 86 87 88 89 90 91 92  | Next Page >