Search Results

Search found 7706 results on 309 pages for 'inner join'.

Page 143/309

  • SQL SERVER – Enumerations in Relational Database – Best Practice

    - by pinaldave
    Marko Parkkola This article has been submitted by Marko Parkkola, data systems designer at Saarionen Oy, Finland. Marko is an excellent developer who is always thinking at the next level. You can read his earlier comment, which created a very interesting discussion, here: SQL SERVER- IF EXISTS(Select null from table) vs IF EXISTS(Select 1 from table). I must express my special thanks to Marko for sending this best practice for enumerations in relational databases. He has written an excellent piece here, and comments are welcome.
    Enumerations in Relational Database
    This is a very basic subject in relational databases, but it is often not well understood and is sometimes badly implemented. There are of course many ways to do this, but I will concentrate on only two cases: one which is “the right way” and one which is definitely the wrong way.
    The concept
    Let’s say we have a Person table in our database. Person has properties/fields like Firstname, Lastname, Birthday and so on. Then there’s a field that tells the person’s marital status; let’s name it the same way: MaritalStatus. Now MaritalStatus is an enumeration. In C# I would definitely make it an enumeration with values like Single, InRelationship, Married, Divorced. Now here comes the problem: SQL doesn’t have enumerations.
    The wrong way
    This is, in my opinion, absolutely the wrong way to do this. It has one upside though: you’ll see the enumeration’s description instantly when you do a simple SELECT query, and you don’t have to deal with mysterious values. There are plenty of downsides too, one of them being database fragmentation. Consider this (I’ve left all indexes and constraints out of the query on purpose).
    CREATE TABLE [dbo].[Person]
    (
        [Firstname] NVARCHAR(100),
        [Lastname] NVARCHAR(100),
        [Birthday] datetime,
        [MaritalStatus] NVARCHAR(20)
    )
    You have an NVARCHAR(20) field in the table that tells the marital status. The obvious problem with this is: what if you create a new value which doesn’t fit into 20 characters? You’ll have to come back and alter the table. There are other problems as well, but I’ll leave those for the reader to think about.
    The correct way
    Here’s how I’ve done this in many projects. This model still has one problem, but it can be alleviated in the application layer or with CHECK constraints if you like. First I will create a namespace table which holds the name of the enumeration, and I will add one row to it. I’ll write out all the indexes and constraints here too.
    CREATE TABLE [CodeNamespace]
    (
        [Id] INT IDENTITY(1, 1),
        [Name] NVARCHAR(100) NOT NULL,
        CONSTRAINT [PK_CodeNamespace] PRIMARY KEY ([Id]),
        CONSTRAINT [IXQ_CodeNamespace_Name] UNIQUE NONCLUSTERED ([Name])
    )
    GO
    INSERT INTO [CodeNamespace] SELECT 'MaritalStatus'
    GO
    Then I create a table that holds the actual values and references the namespace table in order to group the values under different namespaces. I’ll add a couple of rows here too.
    CREATE TABLE [CodeValue]
    (
        [CodeNamespaceId] INT NOT NULL,
        [Value] INT NOT NULL,
        [Description] NVARCHAR(100) NOT NULL,
        [OrderBy] INT,
        CONSTRAINT [PK_CodeValue] PRIMARY KEY CLUSTERED ([CodeNamespaceId], [Value]),
        CONSTRAINT [FK_CodeValue_CodeNamespace] FOREIGN KEY ([CodeNamespaceId]) REFERENCES [CodeNamespace] ([Id])
    )
    GO
    -- 1 is the 'MaritalStatus' namespace
    INSERT INTO [CodeValue] SELECT 1, 1, 'Single', 1
    INSERT INTO [CodeValue] SELECT 1, 2, 'In relationship', 2
    INSERT INTO [CodeValue] SELECT 1, 3, 'Married', 3
    INSERT INTO [CodeValue] SELECT 1, 4, 'Divorced', 4
    GO
    Now there are four columns in the CodeValue table. CodeNamespaceId tells which namespace the values belong to. Value holds the enumeration value which is used in the Person table (I’ll show how this is done below). Description tells what the value means; you can use this column, for example, in a UI combo box. OrderBy tells whether the values need to be ordered in some way when displayed in the UI.
    And here’s the Person table again, now with the correct columns. I’ll add one row here to show how the enumeration is to be used.
    CREATE TABLE [dbo].[Person]
    (
        [Firstname] NVARCHAR(100),
        [Lastname] NVARCHAR(100),
        [Birthday] datetime,
        [MaritalStatus] INT
    )
    GO
    INSERT INTO [Person] SELECT 'Marko', 'Parkkola', '1977-03-04', 3
    GO
    Now, I said earlier that there is one problem with this. The MaritalStatus column doesn’t have any database-enforced relationship to the CodeValue table, so you can enter any value you like into this field. I’ve solved this problem in the application layer by selecting all the values from the CodeValue table and putting them into a combobox / dropdownlist (with the Value field as the value and Description as the text) so the end user can’t enter any illegal values; and of course I check the entered value in the data access layer as well.
    I said in “The wrong way” section that there is one benefit to it. In fact, you can have the same benefit here by using a simple view, which I have schema bound so you can even index it if you like.
    CREATE VIEW [dbo].[Person_v] WITH SCHEMABINDING
    AS
        SELECT p.[Firstname], p.[Lastname], p.[BirthDay], c.[Description] MaritalStatus
        FROM [dbo].[Person] p
        JOIN [dbo].[CodeValue] c ON p.[MaritalStatus] = c.[Value]
        JOIN [dbo].[CodeNamespace] n ON n.[Id] = c.[CodeNamespaceId] AND n.[Name] = 'MaritalStatus'
    GO
    -- Select from View
    SELECT * FROM [dbo].[Person_v]
    GO
    This is an excellent write-up by Marko Parkkola. Do you have this kind of design setup at your organization? Let us know your opinion. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Best Practices, Database, DBA, Readers Contribution, Software Development, SQL, SQL Authority, SQL Documentation, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
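    A possible extension (an editor's sketch, not from Marko's original article): the missing database-enforced relationship described above can also be closed on the SQL Server side by pinning the namespace with a persisted computed column and adding a composite foreign key against CodeValue's primary key (CodeNamespaceId, Value). The column and constraint names below are illustrative assumptions.
    -- Sketch only: assumes the Person, CodeValue and CodeNamespace tables exactly as defined above,
    -- and that namespace Id 1 is the 'MaritalStatus' row.
    ALTER TABLE [dbo].[Person]
        ADD [MaritalStatusNamespaceId] AS (1) PERSISTED; -- constant, persisted so it can join a FK
    GO
    ALTER TABLE [dbo].[Person]
        ADD CONSTRAINT [FK_Person_CodeValue_MaritalStatus]
        FOREIGN KEY ([MaritalStatusNamespaceId], [MaritalStatus])
        REFERENCES [CodeValue] ([CodeNamespaceId], [Value]);
    GO
    With such a constraint in place, an INSERT with an unknown MaritalStatus value would fail at the database level, rather than relying only on the combobox and data-access-layer checks.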

    Read the article

  • Transforming Customer Experiences Through Agile Commerce With Forrester Research’s Brian Walker – April 4th Webinar

    - by Jeri Kelley
    eBusiness today has fundamentally changed. Platforms and technologies must be flexible to support a number of business functions - marketing, merchandising, shopping, customer service - across a variety of digital channels and provide customers with a seamless, well-designed brand experience. Join us for this complimentary webinar on Wednesday, April 4th, 2012 at 12:00pm ET as Forrester Research’s Brian Walker provides expert insight on:
    - The latest innovations, best practices, and industry trends in agile commerce, and how brands can maximize efforts
    - How forward-thinking companies today are leveraging technology to deliver powerful customer experiences across touchpoints
    - The future of eBusiness and agile commerce
    Register Now!

    Read the article

  • Embed Google’s Pac Man Game On Your Website

    - by Gopinath
    Google is celebrating the 30th anniversary of Pac-Man with a playable Pac Man game doodle on its home page. You can play the full game (255 levels) at http://google.com. This is the first time Google has ever released an interactive doodle. How To Embed the Pac Man Game In Your Web Pages? I’m surprised to see that this game is not a Flash version; it appears to be pure JavaScript + HTML. Michael at RustyBricks.com published an unofficial way of embedding Google’s Pac Man game in any website, along with a link to a demo page. Check out How To Get Google’s Pac Man Game On Your Page for a quick script to offer this game to your website users. Join us on Facebook to read all our stories right inside your Facebook news feed.

    Read the article

  • SQL SERVER – Get Latest SQL Query for Sessions – DMV

    - by pinaldave
    In a recent SQL training session I was asked how one can figure out the last SQL statement executed by each session. The query for this is very simple. It uses two DMVs; I created the following quick script for the purpose.
    SELECT session_id, TEXT
    FROM sys.dm_exec_connections
    CROSS APPLY sys.dm_exec_sql_text(most_recent_sql_handle) AS ST
    While working with DMVs, if you ever find that a DMV has a column named sql_handle (or most_recent_sql_handle), you can right away apply sys.dm_exec_sql_text to that DMV and get the text of the SQL statement. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: DMV, SQL DMV
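    A hedged variation on the same idea (an editor's addition, not from the original post): for sessions that are executing a request right now, sys.dm_exec_requests exposes a sql_handle column that can be passed to sys.dm_exec_sql_text in exactly the same way.
    -- Sketch: currently executing statement per active request
    SELECT r.session_id, r.status, st.TEXT
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS st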

    Read the article

  • Consolidate Data in Private Clouds, But Consider Security and Regulatory Issues

    - by Troy Kitch
    The January 13 webcast Security and Compliance for Private Cloud Consolidation will provide attendees with an overview of private cloud computing based on Oracle's Maximum Availability Architecture and of how security and regulatory compliance affect implementations. Many organizations are taking advantage of Oracle's Maximum Availability Architecture to drive down the cost of IT by deploying private cloud computing environments that can support downtime and utilization spikes without idle redundancy. With two-thirds of sensitive and regulated data residing in organizations' databases, private cloud database consolidation means organizations must be more concerned than ever about protecting their information and addressing new regulatory challenges. Join us for this webcast to learn about greater risks and increased threats to private cloud data and how Oracle Database Security Solutions can assist in securely consolidating data and meeting compliance requirements. Register Now.

    Read the article

  • LIVE WEBCAST March 24 2pm PT- Why Switch from Red Hat and SUSE Linux to Oracle Linux?

    - by Zeynep Koch
    Oracle has been offering affordable Linux support since 2006, and more than 6,000 customers already use it. Oracle's Unbreakable Linux support program draws on the expertise of a world-class support organization that understands how to diagnose and solve Linux issues integrated with the applications being deployed on it. Find out how you can save 50-90% on your support costs. Join Oracle's Monica Kumar, Sr. Director of Linux, Oracle VM and MySQL, and Avi Miller, Principal Sales Consultant, Linux and Virtualization, on Thursday, March 24, 2pm PT to hear:
    - The "why and how" of switching to Oracle Linux
    - Testing and integration with systems and applications
    - Free management and high availability tools
    - Real life customer scenarios
    If you are going to get free access to the most advanced Linux operating system, along with world-class support at a fraction of the cost and better testing and integration with your server and applications, why wouldn't you do it? Register Now

    Read the article

  • SQL Pre-Con…at the Beach

    - by Argenis
    Building upon the success of SQL Rally 2012 (where we packed a room full of DBAs), my friend Robert Davis [Twitter|Blog] and yours truly will again be delivering our day-long Pre-Conference “Demystifying Database Administration Best Practices” this Friday (6/8/2012) – right before SQLSaturday #132 in Pensacola, FL. If you are in the vicinity of Pensacola, come join us! We had tons of fun at Rally. Robert and I love sharing tips and stories that will help you in your day-to-day duties as a DBA. Some of the topics that we’ll touch on (this is by no means a comprehensive list):
    - Active Directory configuration for SQL Server Deployments
    - Windows Server Deployments
    - Storage and I/O
    - High Availability / Disaster Recovery / Business Continuity
    - Replication
    - Day-To-Day Operations
    - Maintenance
    - TempDB
    - Code Reviews
    - Other Database and Server Settings
    Follow this link to sign up for the Pre-Con at Pensacola: http://demystifyingdba.eventbrite.com/ Here’s a blog post that Robert made on the subject of Best Practices. Hope to see you there!

    Read the article

  • MySQL 5.5 is GA!

    - by rob.young(at)oracle.com
    It is my pleasure to announce that MySQL 5.5 is now GA and ready for production deployment. You can read Oracle's official press release here. I am excited about 5.5 because of the performance and scalability gains, new replication enhancements and overall improved technical efficiencies. Congratulations and a sincere "Thanks!" go out to the entire MySQL Community and product engineering teams for making 5.5 the best release of MySQL to date. Please join us for today's MySQL Technology Update webcast where Tomas Ulin and I will cover what's new in MySQL 5.5 and provide an update on the other technologies we are working on. You can download MySQL 5.5 here. All of the documentation and what's-new information is here. There is also a great article on MySQL 5.5 and the MySQL community here. Thanks for reading, and as always, THANKS for your support of MySQL!

    Read the article

  • SQL SERVER – Find Most Active Database in SQL Server – DMV dm_io_virtual_file_stats

    - by pinaldave
    A few days ago, I wrote about SQL SERVER – Find Current Location of Data and Log File of All the Database. There was a very interesting conversation in the comments by blog readers. Blog reader and SQL expert Sreedhar presented a very interesting DMV query which lists the most active databases in SQL Server. For quick reference he has included the size on disk in KB, MB and GB as well.
    SELECT DB_NAME(mf.database_id) AS databaseName,
        name AS File_LogicalName,
        CASE
            WHEN type_desc = 'LOG' THEN 'Log File'
            WHEN type_desc = 'ROWS' THEN 'Data File'
            ELSE type_desc
        END AS File_type_desc,
        mf.physical_name,
        num_of_reads,
        num_of_bytes_read,
        io_stall_read_ms,
        num_of_writes,
        num_of_bytes_written,
        io_stall_write_ms,
        io_stall,
        size_on_disk_bytes,
        size_on_disk_bytes / 1024 AS size_on_disk_KB,
        size_on_disk_bytes / 1024 / 1024 AS size_on_disk_MB,
        size_on_disk_bytes / 1024 / 1024 / 1024 AS size_on_disk_GB
    FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS divfs
    JOIN sys.master_files AS mf
        ON mf.database_id = divfs.database_id
        AND mf.FILE_ID = divfs.FILE_ID
    ORDER BY num_of_Reads DESC
    If you like to read and practice with DMVs, I suggest reading the blog of my very good friend Glenn Berry. He is quite the DMV expert. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
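    As a small follow-on (an editor's sketch, not part of Sreedhar's query): the same DMV can also be aggregated per database, which makes the most active database stand out at a glance.
    -- Sketch: total I/O per database from the same DMV
    SELECT DB_NAME(divfs.database_id) AS databaseName,
        SUM(divfs.num_of_reads) AS total_reads,
        SUM(divfs.num_of_writes) AS total_writes,
        SUM(divfs.io_stall) AS total_io_stall_ms
    FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS divfs
    GROUP BY divfs.database_id
    ORDER BY total_reads DESC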

    Read the article

  • Webcast with Brian Griffin, Ancestry, 2013 Winner 10 Best Web Support Sites

    - by Tuula Fai
    The web is one of the fastest growing channels for providing service, support and information, as seen in The Service Council's (TSC) latest multi-channel research survey. Join TSC's Chief Customer Officer Sumair Dutta as he shares key findings from his current customer experience research covering over 200 organizations. Sumair will be joined by Brian Griffin, Senior Program Manager, Global Support Experience, Ancestry.com, who will show how Ancestry is using the web as a powerful tool to enhance self-service opportunities and increase customer engagement. Smarter Web Service Educast: Thursday, November 14th, 2 pm ET / 11 am PT. Register: http://bit.ly/1cwz4Ns

    Read the article

  • Fast Fashion Freshness

    - by David Dorf
    Fashion retailers such as H&M, Zara, and Wet Seal have perfected the fast fashion retailing model. The concept requires no replenishment in order to maintain assortment freshness and to create a sense of urgency for the consumer to purchase now. However, maintaining assortment freshness results in high product turnover, making markdown optimization a necessity. Wet Seal, for instance, needed to move from ad-hoc markdowns and dealing with surplus inventory to handling markdowns methodically across 8,000 SKUs with only a 12-15 week lifecycle (from DC receipt to exit). By optimizing and automating markdowns, Wet Seal is reaching their goal of assortment freshness, which in turn increases sales. If you're interested in learning more, register for a free webinar occurring on May 13th featuring Daniel Ryu, Vice President of Planning and Allocation at Wet Seal. He'll be discussing how the fast fashion retailer maintains their goal of assortment freshness.

    Read the article

  • Mark Hurd and Balaji Yelamanchili present Oracle’s Business Analytics Strategy

    - by Mike.Hallett(at)Oracle-BI&EPM
    Join Mark Hurd and Balaji Yelamanchili as they unveil the latest advances in Oracle’s strategy for placing analytics into the hands of every decision-maker – so that they can see more, think smarter, and act faster. Wednesday, April 4, 2012 at 1:00 pm UK BST / 2:00 pm CET. Register HERE today for this online event.
    Agenda
    Keynote: Oracle’s Business Analytics Strategy – Mark Hurd, President, Oracle, and Balaji Yelamanchili, Senior Vice President, Analytics and Performance Management, Oracle
    Plus Breakout Sessions:
    - Achieving Predictable Performance with Oracle Hyperion Enterprise Performance Management
    - Explore All Relevant Data – Introducing Oracle Endeca Information Discovery
    - Run Your Business Faster and Smarter with Oracle Business Intelligence Applications on Oracle Exalytics In-Memory Machine
    - Analyzing and Deciding with Big Data

    Read the article

  • Webcast Replay: Upgrading to Oracle Database 11g with Tom Kyte

    - by john.brust
    On Friday, we caught up with Tom Kyte, Oracle database expert and host of AskTom.oracle.com. Join us to hear what Oracle Database 11g Release 2 means to Tom. Learn how you can make 'change' safe and benefit from new functionality to make better use of your standby databases, capture and replay workloads, tune SQL performance, use flashback for rapid recovery from human error, and more. Click to view the webcast (and Q&A) replay. We also invite you to visit our Oracle Database 11g upgrade page on otn.oracle.com.

    Read the article

  • Chrome Apps Office Hours: Chrome Storage APIs

    Chrome Apps Office Hours: Chrome Storage APIs Ask and vote for questions: goo.gl You spoke, we listened. Join Paul Kinlan, Paul Lewis, Pete LePage, and Renato Dias to learn about the new storage APIs that are available to Chrome Packaged Apps in the next installment of Chrome Apps Office Hours. We'll take a look at the new sync-able and local storage APIs as well as other ways you can save data locally on your users' machines. We didn't get through quite as many questions as we hoped last week, and are going to dedicate some extra time this week, so be sure to post your questions on Moderator below! From: GoogleDevelopers

    Read the article

  • The “AfterDark” reception is back!

    - by rituchhibber
    This year, the OPN Exchange “AfterDark” Reception is moving to new heights! Join us on the 5th floor of the Metreon building in San Francisco for this exclusive ‘VIP’ event. The reception will be held from 7:30 p.m. – 10 p.m. on Sunday, September 30th. Enjoy the smooth sounds of Macy Gray over a cocktail, as you network the night away and watch the 2012 live Music Festival performances from above! Best of all, this event is exclusive and free to all Oracle PartnerNetwork Exchange attendees! So come mix and mingle with us as we kick off Oracle OpenWorld 2012 with great conversation and music! See You After Dark! The OPN Communications Team

    Read the article

  • SSIS - XML Source Script

    - by simonsabin
    The XML Source in SSIS is great if you have a 1 to 1 mapping between entity and table. You can do more complex mapping, but it becomes very messy and won't perform. What other options do you have?
    The challenge with XML processing is to not need a huge amount of memory. I remember using the early versions of BizTalk, which loaded the whole document into memory to map from one document type to another. This was fine for small documents but was an absolute killer for large documents. You therefore need a streaming approach. For flexibility, however, you want to be able to generate your rows easily, and if you've ever used the XmlReader you will know it's ugly code to write.
    That brings me on to LINQ. There is an implementation of LINQ over XML which is really nice. You can write nice LINQ queries instead of the XmlReader stuff. The downside is that by default LINQ to XML requires a whole XML document to work with. No streaming. Your code would look like this: we create an XDocument and then enumerate over a set of anonymous types we generate from our LINQ statement.
    XDocument x = XDocument.Load("C:\\TEMP\\CustomerOrders-Attribute.xml");
    foreach (var xdata in (from customer in x.Elements("OrderInterface").Elements("Customer")
                           from order in customer.Elements("Orders").Elements("Order")
                           select new { Account = customer.Attribute("AccountNumber").Value,
                                        OrderDate = order.Attribute("OrderDate").Value }))
    {
        Output0Buffer.AddRow();
        Output0Buffer.AccountNumber = xdata.Account;
        Output0Buffer.OrderDate = Convert.ToDateTime(xdata.OrderDate);
    }
    As I said, the downside to this is that you are loading the whole document into memory. I did some googling and came across some helpful videos from a nice UK DPE, Mike Taulty: http://www.microsoft.com/uk/msdn/screencasts/screencast/289/LINQ-to-XML-Streaming-In-Large-Documents.aspx. These show you how you can combine LINQ and the XmlReader to get a semi-streaming approach. I took what he did and implemented it in SSIS. What I found odd was that when I ran it I got different numbers between the streamed and non-streamed versions. I found the cause was a little bug in Mike's code that causes the pointer in the XmlReader to progress past the start of the element and thus skip elements. Here is the streamed version of the loop:
    foreach (var xdata in (from customer in StreamReader("C:\\TEMP\\CustomerOrders-Attribute.xml", "Customer")
                           from order in customer.Elements("Orders").Elements("Order")
                           select new { Account = customer.Attribute("AccountNumber").Value,
                                        OrderDate = order.Attribute("OrderDate").Value }))
    {
        Output0Buffer.AddRow();
        Output0Buffer.AccountNumber = xdata.Account;
        Output0Buffer.OrderDate = Convert.ToDateTime(xdata.OrderDate);
    }
    These look very similar; the key element is the method we are calling, StreamReader. This method is what gives us streaming. What it does is return an enumerable list of elements; because of the way that LINQ works, this results in the data being streamed in.
    static IEnumerable<XElement> StreamReader(String filename, string elementName)
    {
        using (XmlReader xr = XmlReader.Create(filename))
        {
            xr.MoveToContent();
            while (xr.Read()) // Reads the first element
            {
                while (xr.NodeType == XmlNodeType.Element && xr.Name == elementName)
                {
                    XElement node = (XElement)XElement.ReadFrom(xr);
                    yield return node;
                }
            }
            xr.Close();
        }
    }
    This code is specifically designed to return a list of the elements with a specific name. The first Read moves the reader off the root element, and then the inner while loop checks to see if the current element is the type we want. If not, we do the xr.Read() again until we find the element type we want. We then use the neat function XElement.ReadFrom to read an element and all its sub-elements into an XElement. This is what is returned and can be consumed by the LINQ statement. Essentially, once one element has been read we need to check if we are still on the same element type and name (the inner loop). This was Mike's mistake: if we called .Read again we would advance the XmlReader beyond the start of the element and so the ReadFrom method wouldn't work. So with the code above you can use whatever LINQ statement you like to flatten your XML into the rowsets you want. You could even have multiple outputs and generate your own surrogate keys.

    Read the article

  • SmoothLife Is a Super Smooth Version of Conway’s Game of Life [Video]

    - by Jason Fitzpatrick
    What happens if you change the cellular automaton program Game of Life to use floating point values instead of integers? You end up with SmoothLife, a fluid and organic growth simulator. SmoothLife is a family of rules created by Stephan Rafler. It was designed as a continuous version of Conway’s Game of Life – using floating point values instead of integers. This rule is SmoothLifeL, which supports many interesting phenomena such as gliders that can travel in any direction, rotating pairs of gliders, wickstretchers and the appearance of elastic tension in the ‘cords’ that join the blobs. You can check out the paper outlining how SmoothLife works here and then grab the source code to run your own simulation here. [via Boing Boing]

    Read the article

  • Silverlight Cream for June 13, 2010 -- #881

    - by Dave Campbell
    In this Issue: Mark Monster.
    Shoutouts:
    - Adam Kinney has moved his blog, and his first post there is to announce New tutorials on .toolbox on PathListBox and Fluid UI
    - Awesome graphics for the MEF'ed Video Player by Alan Beasley: New MEF Video Player Controls (1st Draft – Article to follow…)
    It must be a slow relaxing summer weekend, because I only found one post... and Mark submitted this one to me :)
    From SilverlightCream.com: How to improve the Windows Phone 7 Licensing development experience? Mark Monster is ahead of all of us if he's already programming his WP7 apps for 'trial versions'... but maybe it's time to start learning how to do that stuff :)
    Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group
    Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • WalMart Slashes iPhone 3GS Price To $97

    - by Gopinath
    WalMart has slashed the price of the iPhone 3GS 16GB model to $97 with a two-year service contract. This offer saves you $100 and starts from today onwards. Apple slashes the prices of its products whenever it plans to release an upgraded version of a product. The price cut on the iPhone 3GS provides enough confirmation that Apple is planning to release the next version of the iPhone, unofficially dubbed iPhone 4, at the upcoming WWDC conference. Click here to check the availability of iPhone 3GS stock at Walmart. Join us on Facebook to read all our stories right inside your Facebook news feed.

    Read the article

  • Hadron Collider – Can it unveil the hidden secrets of the universe?

    - by samsudeen
    Scientists at the European Centre for Nuclear Research (CERN) today successfully simulated the Big Bang experiment, finally producing the world’s first high-energy particle collisions of this kind. This was achieved by colliding two protons with a total energy of around seven trillion electron volts, sending sub-particles spreading out in every direction. The experiment was conducted successfully at CERN’s facility, which lies about 100 metres below the Franco-Swiss border. This is said to be the biggest experiment ever in terms of investment (around $7 billion) and scientific importance. It could lead to a new era of science and change the theories about the origin of the universe. You can find more videos about the experiment on the LHC Videos page. Join us on Facebook to read all our stories right inside your Facebook news feed.

    Read the article

  • SQL SERVER – 3 Simple Puzzles – Need Your Suggestions

    - by pinaldave
    Last month I posted three simple puzzles and got a very good response. I think there can be many interesting answers to them. I would like to request all of you to take part in the puzzles and provide your answers. I plan to consolidate the answers and publish all the valid ones on this blog with due credit.
    - SQL SERVER – Challenge – Puzzle – Usage of FAST Hint
    - SQL SERVER – Puzzle – Challenge – Error While Converting Money to Decimal
    - SQL SERVER – Challenge – Puzzle – Why does RIGHT JOIN Exists
    I am also thinking that after such a long time we should have the term Database Developer (DBD), just like Database Administrator (DBA), in our dictionary. I have also created a poll where I talk about this subject: SQL SERVER – Are you a Database Administrator or a Database Developer? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Solution – Puzzle – Statistics are not Updated but are Created Once

    - by pinaldave
    Earlier I asked a puzzle about why statistics are not updated. Read the complete details over here: Statistics are not Updated but are Created Once. In that question I demonstrated that even though the statistics should have been updated after lots of inserts into the table, they are not updated. (Read the details in SQL SERVER – When are Statistics Updated – What triggers Statistics to Update.) In the example I created the following situation:
    - Create a table
    - Insert 1,000 records
    - Check the statistics
    - Now insert 10 times more records (10,000)
    - Check the statistics – they will NOT be updated
    - Auto Update Statistics and Auto Create Statistics for the database are TRUE
    I then asked two things in the example: 1) Why is this happening? 2) How can this issue be fixed? I received many answers – here is how I fixed it, which resolved the issue for me. NOTE: There are multiple answers to this problem and I will do my best to list them all. Solution: create a nonclustered index on the column City. Here is the working example for the same. Let us understand this script; there is added explanation at the end.
    -- Execution Plans Difference
    -- Estimated Execution Plan Vs Actual Execution Plan
    -- Create Sample Database
    CREATE DATABASE SampleDB
    GO
    USE SampleDB
    GO
    -- Create Table
    CREATE TABLE ExecTable (ID INT, FirstName VARCHAR(100), LastName VARCHAR(100), City VARCHAR(100))
    GO
    CREATE NONCLUSTERED INDEX IX_ExecTable1 ON ExecTable (City);
    GO
    -- Insert One Thousand Records
    -- INSERT 1
    INSERT INTO ExecTable (ID,FirstName,LastName,City)
    SELECT TOP 1000 ROW_NUMBER() OVER (ORDER BY a.name) RowID,
    'Bob',
    CASE WHEN ROW_NUMBER() OVER (ORDER BY a.name)%2 = 1 THEN 'Smith' ELSE 'Brown' END,
    CASE WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 1 THEN 'New York'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 5 THEN 'San Marino'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 3 THEN 'Los Angeles'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 7 THEN 'La Cinega'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 13 THEN 'San Diego'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 17 THEN 'Las Vegas'
         ELSE 'Houston' END
    FROM sys.all_objects a
    CROSS JOIN sys.all_objects b
    GO
    -- Display statistics of the table
    sp_helpstats N'ExecTable', 'ALL'
    GO
    -- Select Statement
    SELECT FirstName, LastName, City FROM ExecTable WHERE City = 'New York'
    GO
    -- Display statistics of the table
    sp_helpstats N'ExecTable', 'ALL'
    GO
    -- Replace your Statistics over here
    DBCC SHOW_STATISTICS('ExecTable', IX_ExecTable1);
    GO
    --------------------------------------------------------------
    -- Round 2
    -- Insert One Thousand Records
    -- INSERT 2
    INSERT INTO ExecTable (ID,FirstName,LastName,City)
    SELECT TOP 1000 ROW_NUMBER() OVER (ORDER BY a.name) RowID,
    'Bob',
    CASE WHEN ROW_NUMBER() OVER (ORDER BY a.name)%2 = 1 THEN 'Smith' ELSE 'Brown' END,
    CASE WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 1 THEN 'New York'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 5 THEN 'San Marino'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 3 THEN 'Los Angeles'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 7 THEN 'La Cinega'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 13 THEN 'San Diego'
         WHEN ROW_NUMBER() OVER (ORDER BY a.name)%20 = 17 THEN 'Las Vegas'
         ELSE 'Houston' END
    FROM sys.all_objects a
    CROSS JOIN sys.all_objects b
    GO
    -- Select Statement
    SELECT FirstName, LastName, City FROM ExecTable WHERE City = 'New York'
    GO
    -- Display statistics of the table
    sp_helpstats N'ExecTable', 'ALL'
    GO
    -- Replace your Statistics over here
    DBCC SHOW_STATISTICS('ExecTable', IX_ExecTable1);
    GO
    -- Clean up Database
    DROP TABLE ExecTable
    GO
    When I created the nonclustered index on the column City, it also created statistics on the same column with the same name as the index. When we populate the data in the column, the index is updated – which invalidates the execution plan – and this leads to the statistics being updated on the next execution of the SELECT. This behavior does not happen on a heap or on a column where the statistics were auto created (rather than created by an index). If you explicitly update the index, you can often see that the statistics are updated as well. You can see this is happening for sure if you follow the explanation of John Sansom.
    John Sansom's suggestion: That was fun! Although the column statistics are invalidated by the time the second select statement is executed, the query is not compiled/recompiled but instead the existing query plan is reused. It is the "next" compiled query against the column statistics that will see that they are out of date and will then in turn instantiate the action of updating statistics. You can see this in action by forcing the second statement to recompile.
    SELECT FirstName, LastName, City FROM ExecTable WHERE City = 'New York' OPTION(RECOMPILE)
    GO
    Kevin Cross also has another suggestion: I agree with John. It is reusing the execution plan. Aside from OPTION(RECOMPILE), clearing the execution plan cache before the subsequent tests will also work, i.e., run this before round 2:
    -- Clear execution plan cache before next test
    DBCC FREEPROCCACHE WITH NO_INFOMSGS;
    Nice puzzle! Kevin
    As this was a puzzle, John and Kevin both got the correct answer; there was no condition that the answer had to be a best practice. I know John and he is one of the finest DBAs around – his tremendous knowledge has always impressed me. John and Kevin will both agree that clearing the cache using DBCC FREEPROCCACHE or recompiling each query every time is certainly not good advice on a production server. It is a correct answer but not a best practice. By the way, if you have a better solution or a better suggestion, please advise. I am open to changing my answer and publishing further improvements to this solution. On a very separate note, I like to have a clustered index on my primary key, which I have not mentioned here as it is out of the scope of this puzzle. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, Readers Contribution, Readers Question, SQL, SQL Authority, SQL Index, SQL Puzzle, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Statistics
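    A small side note (an editor's sketch, not part of the original puzzle or its answers): if you want to verify for yourself when the statistics on the table were last updated while running the script above, STATS_DATE can be queried against sys.stats. This assumes the ExecTable created by the script.
    -- Sketch: last update time of each statistics object on ExecTable
    SELECT s.name AS statistics_name,
        STATS_DATE(s.object_id, s.stats_id) AS last_updated
    FROM sys.stats AS s
    WHERE s.object_id = OBJECT_ID('dbo.ExecTable');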

    Read the article

  • Unlock the Full Value of Oracle CRM On Demand

    - by charles.knapp
    Register for this live Oracle CRM On Demand Virtual Community Session! Oracle CRM On Demand delivers the most complete On Demand CRM available anywhere. But how can you ensure you are getting maximum value from the many powerful features that Oracle CRM On Demand offers? Join our interactive Oracle CRM On Demand Virtual Community Session on Tuesday, January 11, 2011 from 10:00-11:00 a.m. PT / 1 p.m.-2 p.m. ET to get expert advice and discuss the best ways to unlock the full potential of Oracle CRM On Demand with Mike Lairson, author of 'Oracle CRM On Demand Reporting'. Book Offer: Send your Oracle CRM On Demand configuration ideas before the webcast to [email protected] and you could win a free copy of 'Oracle CRM On Demand Reporting' by Mike Lairson. Learn more and register now!

    Read the article
