Search Results

Search found 42321 results on 1693 pages for 'sql reporting services 05'.

Page 137/1693

  • Smart defaults [SSDT]

    - by jamiet
    I've just discovered a new, somewhat hidden, feature in SSDT that I didn't know about and figured it would be worth highlighting here because I'll bet not many others know it either; the feature is called Smart Defaults. It gets around the problem of adding a NOT NULL column to an existing table that has data in it – prior to SSDT you would need to define a DEFAULT constraint, but it feels rather cumbersome to create an object purely for the purpose of pushing through a deployment – that's the situation that Smart Defaults is meant to alleviate. The Smart Defaults option exists in the advanced section of a Publish Profile file.

    The description of the setting is "Automatically provides a default value when updating a table that contains data with a column that does not allow null values"; in other words, checking that option will cause SSDT to insert an arbitrary default value into your newly created NOT NULL column. In case you're wondering how it does it, here's how: SSDT creates a DEFAULT constraint at the same time as the column is created and then immediately removes that constraint:

        ALTER TABLE [dbo].[T1]
            ADD [C1] INT NOT NULL,
            CONSTRAINT [SD_T1_1df7a5f76cf44bb593506d05ff9a1e2b] DEFAULT 0 FOR [C1];
        ALTER TABLE [dbo].[T1] DROP CONSTRAINT [SD_T1_1df7a5f76cf44bb593506d05ff9a1e2b];

    You can then update the value as appropriate in a Post-Deployment script. Pretty cool! On the downside, you can only specify this option for the whole project, not for an individual table or even an individual column – I'm not sure that I'd want to turn this on for an entire project as it could hide problems that a failed deployment would highlight; in other words, Smart Defaults could be seen to be "papering over the cracks". If you think that should be improved, go and vote (and leave a comment) at [SSDT] Allow us to specify Smart defaults per table or even per column. @Jamiet
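    For illustration, a sketch of the kind of Post-Deployment fix-up alluded to above (this is mine, not part of the original post; the table and column are the ones from the generated script, and the values are hypothetical):

        -- Hypothetical post-deployment step: replace the arbitrary smart
        -- default with the value the column should really hold.
        UPDATE [dbo].[T1]
        SET [C1] = 42      -- the real value, whatever that is for your system
        WHERE [C1] = 0;    -- rows still carrying the smart default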

    Read the article

  • Working with Reporting Services Filters – Part 3: The TOP and BOTTOM Operators

    - by smisner
    Thus far in this series, I have described using the IN operator and the LIKE operator. Today, I'll continue the series by reviewing the TOP and BOTTOM operators.

    Today, I happened to be working on an example of using the TOP N operator and was not successful on my first try because the behavior is just a bit different than we find when using an "equals" comparison, as I described in my first post in this series. In my example, I wanted to display a list of the top 5 resellers in the United States for AdventureWorks, but I wanted it based on a filter. I started with a hard-coded filter like this:

        Expression              Data Type   Operator   Value
        [ResellerSalesAmount]   Float       Top N      5

    And received the following error:

        A filter value in the filter for tablix 'Tablix1' specifies a data type that is not supported by the 'TopN' operator. Verify that the data type for each filter value is Integer.

    Well, that puzzled me. Did I really have to convert ResellerSalesAmount to an integer to use the Top N operator? Just for kicks, I switched to the Top % operator like this:

        Expression              Data Type   Operator   Value
        [ResellerSalesAmount]   Float       Top %      50

    This time, I got exactly the results I expected – I had a total of 10 records in my dataset results, so 50% of that should yield 5 rows in my tablix. So, thinking about the problem with Top N some more, I switched the Value to an expression, like this:

        Expression              Data Type   Operator   Value
        [ResellerSalesAmount]   Float       Top N      =5

    And it worked! So the value for Top N or Top % must reflect a number to plug into the calculation, such as Top 5 or Top 50%, and the expression is the basis for determining what's in that group. In other words, Reporting Services will sort the rows by the expression – ResellerSalesAmount in this case – in descending order, and then filter out everything except the topmost rows based on the operator you specify.

    The curious thing is that, if you're going to hard-code the value, you must enter the value for Top N with an equal sign in front of the integer, but you can omit the equal sign when entering a hard-coded value for Top %. This experience is why working with Reporting Services filters is not always intuitive! When you use a report parameter to set the value, you won't have this problem. Just be sure that the data type of the report parameter is set to Integer. Jessica Moss has an example of using a Top N filter in a tablix which you can view here.

    Working with Bottom N and Bottom % works similarly. You just provide a number for N or for the percentage, and Reporting Services works from the bottom up to determine which rows are kept and which are excluded.
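    For what it's worth, here is a T-SQL analogue of what the Top N filter is doing behind the scenes (my sketch, with illustrative table and column names rather than the report's actual dataset):

        -- Sort by the filter expression, keep only the topmost rows
        SELECT TOP (5) ResellerName, ResellerSalesAmount
        FROM dbo.ResellerSales
        ORDER BY ResellerSalesAmount DESC;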

    Read the article

  • Is the SAN dying???

    - by RickHeiges
    Is the SAN dying? The reason that I ask this question is that MSFT has unleashed technologies this year that point in that direction:

    - AlwaysOn Availability Groups shun shared storage
    - Windows 2012 has storage replication technology that does not require a SAN
    - Windows 2012 has Hyper-V Replica technology that does not require a SAN
    - PDW v2 continues to reinforce the approach of avoiding shared storage

    I'm not saying that SAN technology does not have its place or does not have benefits inherent to the beast....(read more)

    Read the article

  • Defaults for Exporting Data in Oracle SQL Developer

    - by thatjeffsmith
    I was testing a reported bug in SQL Developer today – the bug I was looking for wasn't there (YES!) but I found a different one (NO!) – and I was getting frustrated by having to check the same boxes over and over again. What I wanted was INSERT statements to the clipboard – but that's not what the dialog gives me by default. I'm always doing the same thing, over and over again. And I never go to FILE – that's too permanent for my type of work. I either want stuff to the clipboard or to the worksheet. Surely there's a way to tell SQL Developer how to behave? Oh yeah – check the preferences.

    You can set the defaults for this dialog. Go to: Tools – Preferences – Database – Utilities – Export. Now I will always start with 'INSERT' and 'Clipboard' – woohoo! I can also go INTO the preferences for each of the different formats to save me a few more clicks. I prefer pointy hats (^) for my delimiters, don't you?

    So, spend a few minutes and set each of these to what you're normally doing, and save yourself a bunch of time going forward.

    Read the article

  • SQL Server 2008 R2 Service Pack 2 CTP is available

    - by AaronBertrand
    You can download the Service Pack 2 CTP from the following URL: http://www.microsoft.com/en-us/download/details.aspx?id=29848 The build # is 10.50.3720. This service pack contains all of the fixes from Service Pack 1 & Cumulative Updates 1 through 5, plus a few other minor fixes (a couple of SSRS bugs and a bug where an ALTER TABLE batch was not cached correctly). It does not include fixes from Service Pack 1 Cumulative Update #6, which I mentioned recently. You should *NOT* install this...(read more)
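    If you want to confirm what you're running after installing, a standard check (not specific to this CTP) is:

        -- Should report 10.50.3720 for this CTP
        SELECT SERVERPROPERTY('ProductVersion') AS Build,
               SERVERPROPERTY('ProductLevel')   AS Level;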

    Read the article

  • Oracle Database 12c By Example – SQL Developer and Multitenant

    - by thatjeffsmith
    As you may have heard, Oracle Database 12c is now available. In addition to the binaries and docs going out, we also published a few new Oracle By Example (OBE) chapters. You can find those links here on our product page. Do you know who found these, practically the minute they were published? An enterprising DBA-extraordinaire who just happened to be presenting at the ODTUG KScope13 conference in New Orleans. He thought it would be a good idea to download the new software over a hotel WIFI, install and create a new multitenant database, watch a few OBEs, and then demo that live for his 'SQL Developer for DBAs' session. Pretty crazy, right? Well, he did it, and I was there to watch. Way cool. You can listen to @leight0nn tell his story in his own words via this ODTUG interview with @oraclenered. In case you're too giddy to sit through the video, I'll give you a preview – he successfully cloned a pluggable database in about a minute with only a couple of clicks using Oracle SQL Developer 3.2.20.09 while connected to a 12c database.
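    For the curious, the SQL that a 12c pluggable-database clone boils down to is a one-liner (names here are made up, and in 12.1 the source PDB must be open read-only – a sketch, not what SQL Developer generated in the demo):

        -- Clone an existing PDB, then open the copy
        CREATE PLUGGABLE DATABASE pdb_clone FROM pdb_source;
        ALTER PLUGGABLE DATABASE pdb_clone OPEN;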

    Read the article

  • Differences between Remote Desktop and Terminal Services

    - by Uwe
    What is the difference between Remote Desktop and Terminal Services? We run a Windows 2008 R2 server. There are several administrators who need to access this server, but Windows 2008 allows only two concurrent sessions with different users. So I thought of installing Terminal Services. But I wonder: what will happen to the server if I do so? What will be installed additionally? Will there be more features, ports, or issues with the server?

    Read the article

  • SQL Server Add Primary Key

    - by Derek D.
    Adding a primary key can be done either after a table is created, or at the same time a table is created. It is important to note that, by default, a primary key is clustered. This may or may not be the preferred method of creation. For more information on clustered vs non [...]
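    To illustrate both approaches (a sketch with hypothetical table and column names – the article itself is truncated here):

        -- At creation time; PRIMARY KEY is clustered by default
        CREATE TABLE dbo.Customers (
            CustomerID INT NOT NULL CONSTRAINT PK_Customers PRIMARY KEY,
            Name       VARCHAR(100) NOT NULL
        );

        -- After the fact, explicitly nonclustered this time
        -- (assumes dbo.Orders already exists with a non-nullable OrderID)
        ALTER TABLE dbo.Orders
            ADD CONSTRAINT PK_Orders PRIMARY KEY NONCLUSTERED (OrderID);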

    Read the article

  • Reporting Release History : Q1 2010 SP1 (version 4.0.10.421)

    What's New

    Report Viewers
    - NEW: Report Viewer for WPF
    - Added zoom modes: Percents/Page Width/Full Page to the Silverlight Viewer

    Telerik Report WCF Service
    - Added self-hosting capability on WCF service

    What's Fixed

    Reporting API
    - OBSOLETE: Telerik.Reporting.Report.DataMember property; use ObjectDataSource as DataSource and set its DataMember instead.
    - OBSOLETE: Telerik.Reporting.Table.DataMember property; use ObjectDataSource as DataSource and set its DataMember instead.
    - OBSOLETE: Telerik.Reporting.Chart.DataMember...

    Read the article

  • MWS2K8R2: Enabling Media Sharing using Streaming Media Services Role

    - by TheLizardKing
    So I have a Microsoft Windows Server 2008 R2 box that stores a large collection of media (mostly MP3s), and I want to be able to deliver these files using a server/client setup with Windows Media Player being the client. I downloaded and installed the Streaming Media Services role. I even set up a publishing point with on-demand access. My issue is that I can connect using WMP12, but it only connects as more of a stream and not a shared library. I can pause/play/skip as if it's a powerful radio station, which is OK in my book, but what I'd really like is to control my music remotely, search for and play artists, maybe create playlists (not required but nice), and even connect it to an Xbox. Is the Streaming Media Services role not what I should be using for this? Would installing WMP and sharing using that mechanism be a better option? Any ideas?

    Read the article

  • SQL Server 2000 need to prevent logons whilst performing a backup for a side by side migration

    - by pigeon
    I'm looking for a way to prevent logons from occurring in order to take a full backup of a database to migrate from its current SQL Server 2000 instance to a new SQL Server 2005 instance. A friend of mine suggested running a script which would put the DB into a rollback state. Not being a DBA, my DDL is very poor, and running a script that I don't understand may not be the best idea. One option which might be easier is to simply detach the database and copy it to the new server. Any suggestions would be greatly appreciated.
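    For reference, a sketch of the kind of script the friend may have meant (the database name and path are hypothetical; test before using). SINGLE_USER WITH ROLLBACK IMMEDIATE kicks out other connections, rolling back their open transactions, which matches the "rollback state" described, and it works on SQL Server 2000:

        -- Kick everyone out, rolling back open transactions
        ALTER DATABASE MyAppDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

        -- Take the full backup while nobody else can connect
        BACKUP DATABASE MyAppDB TO DISK = 'D:\Backups\MyAppDB.bak' WITH INIT;

        -- Reopen the database to normal use
        ALTER DATABASE MyAppDB SET MULTI_USER;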

    Read the article

  • SQL Server Manageability Series: how to change the default path of .cache files of a data collector? #sql #mdw #dba

    - by ssqa.net
    How do you change the default path of the .cache files of a data collector after the Management Data Warehouse (MDW) has been set up? This was the question asked by one of the DBAs at a client's site. Instantly I enquired whether any folder had been specified while setting up the MDW, and the obvious answer was no – the settings were left at their defaults. This means all the .CACHE files are stored under the %TEMP% directory, which may pose an out-of-disk-space problem on the server where the MDW is set up to collect. Going back...(read more)

    Read the article

  • High Performance SQL Views Using WITH(NOLOCK)

    - by gt0084e1
    Every now and then you find a simple way to make everything much faster. We often find customers creating data warehouses or OLAP cubes even though they have a relatively small amount of data (a few gigs) compared to their server memory. If you have more server memory than the size of your database or working set, nearly any aggregate query should run in a second or less.

    In some situations there may be high traffic from the transactional application, and SQL Server may wait for several other queries to run before giving you your results. The purpose of this is to make sure you don't get two versions of the truth. In an ATM system, you want to give the bank balance after the withdrawal, not before, or you may get a very unhappy customer. So by default, databases are rightly very conservative about this kind of thing.

    Unfortunately, this split-second precision comes at a cost. The performance of the query may not be acceptable by today's standards because the database has to maintain locks on the server. Fortunately, SQL Server gives you a simple way to ask for the current version of the data without the pending transactions. To better facilitate reporting, you can create a view that includes these directives (the column list is spelled out here because SELECT * over this join would produce duplicate CategoryID columns and the view would not create):

        CREATE VIEW CategoriesAndProducts AS
        SELECT dbo.Categories.CategoryID, dbo.Categories.CategoryName,
               dbo.Products.ProductID, dbo.Products.ProductName
        FROM dbo.Categories WITH(NOLOCK)
        INNER JOIN dbo.Products WITH(NOLOCK)
            ON dbo.Categories.CategoryID = dbo.Products.CategoryID

    In some cases queries that were taking minutes end up taking seconds. Much easier than moving the data to a separate database, and it's still pretty much real time, give or take a few milliseconds. You've been warned not to use this for bank balances, though. More from Data Stream
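    Once the view exists, reporting queries just hit it like any other table and inherit the NOLOCK hints (this example is mine and assumes the Northwind-style columns above):

        SELECT CategoryName, COUNT(*) AS ProductCount
        FROM dbo.CategoriesAndProducts
        GROUP BY CategoryName
        ORDER BY ProductCount DESC;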

    Read the article

  • Introducing Windows Azure Mobile Services

    - by Clint Edmonson
    Today I'm excited to share that the Windows Azure Mobile Services public preview is now available. This preview provides a turnkey backend cloud solution designed to accelerate connected client app development. These services streamline the development process by enabling you to leverage the cloud for common mobile application scenarios such as structured storage, user authentication, and push notifications. If you're building a Windows 8 app and want a fast and easy path to creating backend cloud services, this preview provides the capabilities you need. You can take advantage of the cloud to build and deploy modern apps for Windows 8 devices in anticipation of general availability on October 26th. Subsequent preview releases will extend support to iOS, Android, and Windows Phone.

    Features

    The preview makes it fast and easy to create cloud services for Windows 8 applications within minutes. Here are the key benefits:

    - Rapid development: configure a straightforward and secure backend in less than five minutes.
    - Create modern mobile apps: common Windows Azure plus Windows 8 scenarios that the Windows Azure Mobile Services preview will support include:
        - Automated service API generation providing CRUD functionality and dynamic schematization on top of structured storage
        - Structured storage with powerful query support so a Windows 8 app can seamlessly connect to a Windows Azure SQL database
        - Integrated authentication so developers can configure user authentication via Windows Live
        - Push notifications to bring your Windows 8 apps to life with up-to-date and relevant information
    - Access structured data: connect to a Windows Azure SQL database for simple data management and dynamically created tables. Easy to set and manage permissions.

    Pricing

    One of the key things that we've consistently heard from developers about using Windows Azure with mobile applications is the need for a low-cost and simple offer. The simplest way to describe the pricing for Windows Azure Mobile Services at preview is that it is the same as Windows Azure Websites during preview.

    What's free?

    - Run up to 10 Mobile Services for free in a multitenant environment
    - Free with a valid Windows Azure Free Trial: 1GB SQL database, unlimited ingress, 165MB/day egress

    What do I pay for?

    - Scaling up to dedicated VMs
    - Once the Windows Azure Free Trial expires: SQL database and egress

    Getting Started

    To start using Mobile Services, you will need to sign up for a Windows Azure free trial, if you have not done so already. If you already have a Windows Azure account, you will need to request to enroll in this preview feature. Once you've enrolled, this getting started tutorial will walk you through building your first Windows 8 application using the preview's services. The developer center contains more resources to teach you how to:

    - Validate and authorize access to data using easy scripts that execute securely, on the server
    - Easily authenticate your users via Windows Live
    - Send toast notifications and update live tiles in just a few lines of code

    Our pricing calculator has also been updated to calculate costs for these new mobile services. Questions? Ask in the Windows Azure Forums. Feedback? Send it to [email protected].

    Read the article

  • I can't find Terminal Services Manager

    - by nickfranceschina
    I am obviously stupid... because I'm logged into this Windows 2003 Server box as administrator, yet when I go under Administrative Tools, I can't find "Terminal Services Manager". Terminal Services is installed... I'm using it right now... and I know that's where the manager is supposed to be... yet it is not. What am I missing? Why isn't it showing? Where else can I find it?

    Read the article

  • MERGE gives better OUTPUT options

    - by Rob Farley
    MERGE is very cool. There are a ton of useful things about it – mostly around the fact that you can implement a ton of change against a table all at once. This is great for data warehousing, handling changes made to relational databases by applications, all kinds of things. One of the more subtle things about MERGE is the power of the OUTPUT clause. Useful for logging.

    If you're not familiar with the OUTPUT clause, you really should be – it basically makes your DML (INSERT/DELETE/UPDATE/MERGE) statement return data back to you. This is a great way of returning identity values from INSERT commands (so much better than SCOPE_IDENTITY() or the older (and worse) @@IDENTITY, because you can get lots of rows back). You can even use it to grab default values that are set using non-deterministic functions like NEWID() – things you couldn't normally get back without running another query (or with a trigger, I guess, but that's not pretty).

    That inserted table I referenced – that's part of the 'behind-the-scenes' work that goes on with all DML changes. When you insert data, this internal table called inserted gets populated with rows, and then used to inflict the appropriate inserts on the various structures that store data (HoBTs – the Heaps or B-Trees used to store data as tables and indexes). When deleting, the deleted table gets populated. Updates get a matching row in both tables (although this doesn't mean that an update is a delete followed by an insert, it's just the way it's handled with these tables). These tables can be referenced by the OUTPUT clause, which can show you the before and after for any DML statement. Useful stuff.

    MERGE is slightly different though. With MERGE, you get a mix of entries. Your MERGE statement might be doing some INSERTs, some UPDATEs and some DELETEs. One of the most common examples of MERGE is to perform an UPSERT command, where data is updated if it already exists, or inserted if it's new. And in a single operation too. Here, you can see the usefulness of the deleted and inserted tables, which clearly reflect the type of operation (but then again, MERGE lets you use an extra column called $action to show this). (Don't worry about the fact that I turned on IDENTITY_INSERT, that's just so that I could insert the values.)

    One of the things I love about MERGE is that it feels almost cursor-like – the UPDATE bit feels like "WHERE CURRENT OF …", and the INSERT bit feels like a single-row insert. And it is – but into the inserted and deleted tables. The operations to maintain the HoBTs are still done using the whole set of changes, which is very cool. And $action – very convenient.

    But as cool as $action is, that's not the point of my post. If it were, I hope you'd all be disappointed, as you can't really go near the MERGE statement without learning about it. The subtle thing that I love about MERGE with OUTPUT is that you can hook into more than just inserted and deleted. Did you notice in my earlier query that my source table had a 'src' field that wasn't used in the insert? Normally, this would be somewhat pointless to include in my source query. But with MERGE, I can put that in the OUTPUT clause. This is useful stuff, particularly when you're needing to audit the changes. Suppose your query involved consolidating data from a number of sources, but you didn't need to insert that into the actual table, just into a table for audit.

    This is now very doable, either using the INTO clause of OUTPUT, or surrounding the whole MERGE statement in brackets (parentheses if you're American) and using a regular INSERT statement. This is also doable if you're using MERGE to just do INSERTs. In case you hadn't realised, you can use MERGE in place of an INSERT statement. It's just like the UPSERT-style statement we've just seen, except that we want nothing to match. That's easy to do – we just use ON 1=2. This is obviously more convoluted than a straight INSERT. And it's slightly more effort for the database engine too. But, if you want the extra audit capabilities, the ability to hook into the other source columns is definitely useful.

    Oh, and before people ask if you can also hook into the target table's columns... Yes, of course. That's what deleted and inserted give you.
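    A minimal sketch of that last pattern – the table names and values are mine, not Rob's – using ON 1=2 so the MERGE behaves as a pure INSERT, with composable DML carrying the unsaved src column into an audit table:

        -- Hypothetical tables: src never lands in dbo.Accounts, but the OUTPUT
        -- clause can still hand it to the surrounding INSERT, because MERGE
        -- exposes the source row.
        CREATE TABLE dbo.Accounts (AccountID INT, Balance MONEY);
        CREATE TABLE dbo.AuditLog (AccountID INT, Balance MONEY, src VARCHAR(20));

        INSERT INTO dbo.AuditLog (AccountID, Balance, src)
        SELECT AccountID, Balance, src
        FROM
        (   MERGE dbo.Accounts AS t
            USING (VALUES (101, 50.00, 'FeedA'),
                          (102, 75.00, 'FeedB')) AS s (AccountID, Balance, src)
               ON 1 = 2                -- never matches, so every source row is an INSERT
            WHEN NOT MATCHED THEN
                INSERT (AccountID, Balance)
                VALUES (s.AccountID, s.Balance)
            OUTPUT inserted.AccountID, inserted.Balance, s.src
        ) AS MergeOutput;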

    Read the article

  • Q&A: Will my favourite ORM Foo work with SQL Azure?

    - by Eric Nelson
    short answer: Quite probably, as SQL Azure is very similar to SQL Server

    longer answer: Object Relational Mappers (ORMs) that work with SQL Server are likely but not guaranteed to work with SQL Azure. The differences between the RDBMS versions are small – but may cause problems, for example in tools used to create the mapping between objects and tables, or in generated SQL from the ORM which expects "certain things" :-) More specifically:

    - ADO.NET Entity Framework / LINQ to Entities can be used with SQL Azure, but the Visual Studio designer does not currently work. You will need to point the designer at a version of your database running on SQL Server to create the mapping, then change the connection details to run against SQL Azure.
    - LINQ to SQL has similar issues to ADO.NET Entity Framework above
    - NHibernate can be used against SQL Azure
    - DevExpress XPO supports SQL Azure from version 9.3
    - DataObjects.Net supports SQL Azure
    - Open Access from Telerik works "seamlessly" – their words not mine :-)

    The list above is by no means comprehensive – please leave a comment with details of other ORMs that work (or do not work) with SQL Azure.

    Related Links:
    - General guidelines and limitations of SQL Azure
    - SQL Azure vs SQL Server

    Read the article

  • A little on speaking and evaluations...

    - by AaronBertrand
    Buck Woody ( blog | twitter ) just published a great post on session evaluations , and a lot of his points hit home for me. The premise is that the evaluations are not really meant for the attendee or the event organizers, but so that the speaker can get better and make the next session better. In light of this, at least in my opinion, the existing evaluation forms (and the way attendees tend to fill them out) do not achieve this at all. It may be a little more work for events to generate a more...(read more)

    Read the article

  • Denali CTP3 - Semantic Search 2 (Lots of documents)

    - by sqlartist
    Hi again, I thought I would improve on the previous post by actually putting a decent amount of content into the Filetable – this time I used the opensource DMOZ Health document repository which contains 5,880 files inside 220 folders. The files are all html and are pretty small in size. The entire document collection is about 120Mb unzipped and 30Mb zipped. If anyone is interested in testing this collection drop me a note and I will upload the dmoz_health repository archive to Skydrive. This time...(read more)
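    As a sketch (mine, not from the post – the post's own setup isn't shown here), a FileTable in Denali/SQL Server 2012 is created roughly like this, assuming the database already has FILESTREAM configured with non-transacted access enabled:

        -- Create a FileTable; files copied into its Windows share land as rows
        CREATE TABLE dbo.HealthDocs AS FILETABLE
        WITH (
            FILETABLE_DIRECTORY = 'HealthDocs',  -- folder name under the share
            FILETABLE_COLLATE_FILENAME = database_default
        );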

    Read the article

  • T-SQL Tuesday #19: Blind Spots

    - by merrillaldrich
    A while ago I wrote a post, Visualize Disaster, prompted by a real incident we had at my office. Fortunately we came through it OK from a business point of view, but I took away an important lesson: it's very easy, whether your organization and your team are savvy about disaster recovery or not, to have significant blind spots with regard to recovery in the face of some large, unexpected outage. We have very clear direction and decent budgets to work with, and the safety and recoverability of applications...(read more)

    Read the article

  • Interactive Reporting Translation Workbench utility is available

    - by THE
    As you may have seen in our Newsletter, Oracle has released the "Oracle Hyperion Interactive Reporting Translation Workbench" for Hyperion Interactive Reporting (IR) customers who are moving to Oracle Business Intelligence Enterprise Edition (OBIEE). A summary for this utility can be found here. To get the utility along with documentation and training material, we suggest that you visit the Oracle Technology Network (OTN) "Oracle Hyperion Interactive Reporting Downloads" page. Conveniently, instead of hundreds of pages of "getting started" docs, Oracle has packed some training videos into the downloads, so that getting started is made as easy as possible. But of course the documentation comes with it as well.

    Read the article

  • Choosing the Right Financial Consolidation and Reporting Solution

    Financial reporting requirements for publicly-held companies are changing and getting more complex. With the upcoming convergence of US GAAP and IFRS, demand for more detailed non-financial disclosures, and the SEC mandate for XBRL, financial executives are under pressure to ensure they have the right systems in place to support current and future reporting requirements. Tune into this conversation with Rich Clayton, VP of Enterprise Performance Management and BI products for Oracle, and Annette Melatti, Senior Director of Product Marketing for Financial Applications, to learn about the latest market requirements, what capabilities are provided by Oracle's General Ledgers, and how customers can extend their investment in Oracle General Ledger solutions with Oracle's market-leading financial close and reporting products.

    Read the article

  • Scream if you want to go faster

    - by simonsabin
    My session for 24 Hours of PASS on High Performance Functions will be starting at 11:00 GMT – that's midnight for folks in the UK. To attend, follow this link: https://www.livemeeting.com/cc/8000181573/join?id=N5Q8S7&role=attend&pw=d2%28_KmN3r The rest of the sessions can be found here: http://www.sqlpass.org/24hours/2010/Sessions/ChronologicalOrder.aspx So far the sessions have been great, so no pressure :( See you there in 4.5 hrs...(read more)

    Read the article

  • T-SQL (SCD) Slowly Changing Dimension Type 2 using a merge statement

    - by AtulThakor
    Working on a stored procedure recently which loads records into a data warehouse, I found that the existing record was being expired using an update statement followed by an insert to add the new active record. Playing around with the merge statement, you can actually expire the current record and insert a new record within one clean statement.

    This is how the statement works: we do the normal merge statement to insert a record when there is no match, and if we match the record we update the existing record by expiring and deactivating it. At the end of the merge statement we use the output clause to output the staging values for the update; we wrap the whole merge statement within an insert statement and add new rows for the records which we inserted. I've added the full script at the bottom so you can paste it and play around.

        INSERT INTO ExampleFactUpdate
            (PolicyID,
            Status)
        SELECT -- these columns are returned from the output statement
            PolicyID,
            Status
        FROM
        (
            -- merge statement on unique id, in this case PolicyID
            MERGE dbo.ExampleFactUpdate dp
            USING dbo.ExampleStag s
                ON dp.PolicyID = s.PolicyID
            WHEN NOT MATCHED THEN -- when we can't match the record we insert a new record, and this is all that happens
                INSERT (PolicyID, Status)
                VALUES (s.PolicyID, s.Status)
            WHEN MATCHED -- if it already exists
                AND ExpiryDate IS NULL -- and the expiry date is null
            THEN UPDATE
                SET
                    dp.ExpiryDate = getdate(), -- we set the expiry on the existing record
                    dp.Active = 0 -- and deactivate the existing record
            OUTPUT $Action MergeAction, s.PolicyID, s.Status -- the output clause returns a merge action which can
        ) MergeOutput -- be insert/update/delete; in our example, where a record has been updated (or expired in our case)
        WHERE -- we'll filter using a where clause
            MergeAction = 'Update'; -- here

    Complete source for the example:

        if OBJECT_ID('ExampleFactUpdate') > 0
            drop table ExampleFactUpdate
        go

        Create Table ExampleFactUpdate(
            ID int identity(1,1),
            PolicyID varchar(100),
            Status varchar(100),
            EffectiveDate datetime default getdate(),
            ExpiryDate datetime,
            Active bit default 1
        )

        insert into ExampleFactUpdate(
            PolicyID,
            Status)
        select
            1,
            'Live'

        /*Create staging table*/
        if OBJECT_ID('ExampleStag') > 0
            drop table ExampleStag
        go

        Create Table ExampleStag(
            PolicyID varchar(100),
            Status varchar(100))

        --add some data
        insert into ExampleStag(
            PolicyID,
            Status)
        select
            1,
            'Lapsed'
        union all
        select
            2,
            'Quote'

        select *
        from ExampleFactUpdate

        select *
        from ExampleStag

        INSERT INTO ExampleFactUpdate
            (PolicyID,
            Status)
        SELECT -- these columns are returned from the output statement
            PolicyID,
            Status
        FROM
        (
            -- merge statement on unique id, in this case PolicyID
            MERGE dbo.ExampleFactUpdate dp
            USING dbo.ExampleStag s
                ON dp.PolicyID = s.PolicyID
            WHEN NOT MATCHED THEN -- when we can't match the record we insert a new record
                INSERT (PolicyID, Status)
                VALUES (s.PolicyID, s.Status)
            WHEN MATCHED -- if it already exists
                AND ExpiryDate IS NULL -- and the expiry date is null
            THEN UPDATE
                SET
                    dp.ExpiryDate = getdate(), -- we set the expiry on the existing record
                    dp.Active = 0 -- and deactivate the existing record
            OUTPUT $Action MergeAction, s.PolicyID, s.Status
        ) MergeOutput
        WHERE
            MergeAction = 'Update';

        select *
        from ExampleFactUpdate

    Read the article
