Search Results

Search found 4908 results on 197 pages for 'ssas 2005'.

  • Not able to add .mdf file in App_Data

    - by vinod R
    Hi, I am not able to add an .mdf file to App_Data (VS 2008 Web Developer). If I right-click on App_Data, try to add a new item, select the SQL Server file item and click OK, I get the error: "Connections to Sql Server files (*.mdf) require SQL Server Express 2005 to function properly. Please verify the installation of the component or download from the url: http://www.microsoft.com/Sqlserver/2005/" But I have installed SQL Server Express 2005 and it still gives the same error (I installed SQL Server after installing VS 2008). Please help me.

    Read the article

  • BDD / TDD with JSpec - Removing code duplication

    - by Chetan
    How do I refactor to remove the code duplication in this spec:

        describe 'TestPlugins'
          describe '.MovieScanner(document)'
            before_each
              MoviePage_loggedIn = fixture("movie_logged_in.html")        // Get logged-in movie page
              MoviePage_notloggedIn = fixture("movie_not_logged_in.html") // Get non logged-in movie page
              scanner = new MovieScanner()                                // Get movie scanner
            end

            it 'should scan logged-in movie page for movie data'
              doc = MoviePage_loggedIn // Get document to scan
              // Unit Tests
              // ------------------------------------------------------------
              // Test movie scanner's functions
              scanner.getMovieTitle(doc).should.eql "The Jacket"
              scanner.getMovieYear(doc).should.eql "2005"
              // Test movie scanner's main scan function
              scannedData = scanner.scan(doc)
              scannedData.title.should.eql "The Jacket"
              scannedData.year.should.eql "2005"
            end

            it 'should scan non logged-in movie page for movie data'
              doc = MoviePage_notloggedIn // Get document to scan
              // Unit Tests
              // ------------------------------------------------------------
              // Test movie scanner's functions
              scanner.getMovieTitle(doc).should.eql "The Jacket"
              scanner.getMovieYear(doc).should.eql "2005"
              // Test movie scanner's main scan function
              scannedData = scanner.scan(doc)
              scannedData.title.should.eql "The Jacket"
              scannedData.year.should.eql "2005"
            end
          end
        end

    Read the article

  • Strange problem installing a setup that I have just created.

    - by robUK
    Hello, I have both VS 2005 and 2008 installed. Just today I created a setup project in 2005 and got this error when I tried to install it: "The folder path 'http://sharingcentre.info' contains an invalid character." So I decided to create a simple project and added a new setup in 2005, and got the same problem. I then converted the project to 2008 and added another new setup; the same problem happened again. However, a previous setup project I created before works OK, and I can install that application. It is only from today that I cannot install any of my new setup projects, using either 2005 or 2008. Has anyone else had this same problem? Searching Google doesn't give me any real responses; I guess the error message is too vague. Many thanks for any suggestions.

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #049

    - by Pinal Dave
    Here is the list of selected articles of SQLAuthority.com across all these years. Instead of just listing all the articles, I have selected a few of my favorite articles and listed them here with additional notes below each. Let me know which one of the following is your favorite article from memory lane.

    2007

    Two Connections Related Global Variables Explained – @@CONNECTIONS and @@MAX_CONNECTIONS
    @@CONNECTIONS returns the number of attempted connections, either successful or unsuccessful, since SQL Server was last started. @@MAX_CONNECTIONS returns the maximum number of simultaneous user connections allowed on an instance of SQL Server. The number returned is not necessarily the number currently configured.

    Query Editor – Microsoft SQL Server Management Studio
    This post may be very simple for most of the users of SQL Server 2005. Earlier this year, I received one question many times – where is Query Analyzer in SQL Server 2005? I wrote a small post about it and pointed many users to that post – SQL SERVER – 2005 Query Analyzer – Microsoft SQL SERVER Management Studio. Recently I have been receiving a similar question again.

    OUTPUT Clause Example and Explanation with INSERT, UPDATE, DELETE
    SQL Server 2005 has a new OUTPUT clause, which is quite useful. The OUTPUT clause has access to the inserted and deleted tables (virtual tables), just like triggers. The OUTPUT clause can be used to return values to the client. It can be used with INSERT, UPDATE, or DELETE to identify the actual rows affected by these statements, and it can populate a table variable, a permanent table, or a temporary table. Even though @@IDENTITY still works with SQL Server 2005, I find the OUTPUT clause very easy and powerful to use. Let us understand the OUTPUT clause using an example.

    Find Name of The SQL Server Instance
    Based on the database server, stored procedures have to run different logic. We came up with two different solutions: 1) when the database schema changed substantially, we wrote a completely new stored procedure and deprecated the older version once it was no longer needed; 2) when the logic depended on the server name, we used the global variable @@SERVERNAME. It was very convenient while writing a migration script which depended on the server name for the same database.

    Explanation of TRY…CATCH and ERROR Handling With RAISEERROR Function
    One of the developers at my company thought that we cannot use the RAISERROR function in the new SQL Server 2005 feature TRY…CATCH. When asked for an explanation, he suggested the SQL SERVER – 2005 Explanation of TRY…CATCH and ERROR Handling article as an excuse, suggesting that I did not give an example of RAISERROR with TRY…CATCH. We all thought it was funny. Just to keep the records straight: TRY…CATCH can certainly use the RAISERROR function.

    Different Types of Cache Objects
    Several kinds of objects can be stored in the procedure cache. Compiled plans: when the query optimizer finishes compiling a query plan, the principal output is the compiled plan. Execution contexts: while executing a compiled plan, SQL Server has to keep track of information about the state of execution. Cursors: cursors track the execution state of server-side cursors, including the cursor's current location within a resultset. Algebrizer trees: the Algebrizer's job is to produce an algebrizer tree, which represents the logical structure of a query.

    Open SSMS From Command Prompt – sqlwb.exe Example
    This article was written by request and suggestion of a Sr. Web Developer at my organization. Due to the nature of this article, most of the content is referred from Books Online. sqlwb is a command prompt utility which opens SQL Server Management Studio; the sqlwb command does not run queries from the command prompt. The sqlcmd utility runs queries from the command prompt; read the article for more information.

    2008

    Puzzle – Solution – Computed Columns Datatype Explanation
    Just a day before, I wrote the article SQL SERVER – Puzzle – Computed Columns Datatype Explanation, which was inspired by SQL Server MVP Jacob Sebastian. I suggest that before continuing this article you read the original puzzle question SQL SERVER – Puzzle – Computed Columns Datatype Explanation. The question was: if the computed column was of datatype TINYINT, how to create a computed column of datatype INT?

    2008 – Find If Index is Being Used in Database
    It is very often that I get a query about how to find whether any index is being used in the database or not. If a database has many indexes and not all indexes are used, it can adversely affect performance. A higher number of indexes slows down INSERT / UPDATE / DELETE operations but can speed up SELECT operations. It is recommended to drop any unused indexes from a table to improve performance.

    2009

    Interesting Observation – Execution Plan and Results of Aggregate Concatenation Queries
    If you want to see what's going on here, I think you need to shift your point of view from an implementation-centric view to an ANSI point of view. ANSI does not guarantee processing order. Figure 2 is interesting, but it will be potentially misleading if you don't understand the ANSI rule-set SQL Server operates under in most cases. Implementation thinking can certainly be useful at times when you really need that multi-million row query to finish before the backup fires off, but in this case, it's counterproductive to understanding what is going on.

    SQL Server Management Studio and Client Statistics
    Client Statistics are very important. Many a time, people relate a query's execution plan to its query cost. This is not a good comparison: the two parameters are different, and they are not always related. It is possible that the query cost of a statement is low, but the amount of data returned is considerably large, which is what causes the query to run slowly. How do we know if a query is retrieving a large amount of data or very little data?

    2010

    I encourage all of you to go through the complete series and write your own on the subject. If you write an article and send it to me, I will publish it on this blog with due credit to you. If you write on your own blog, I will update this blog post pointing to your blog post.
    SQL SERVER – ORDER BY Does Not Work – Limitation of the View 1
    SQL SERVER – Adding Column is Expensive by Joining Table Outside View – Limitation of the View 2
    SQL SERVER – Index Created on View not Used Often – Limitation of the View 3
    SQL SERVER – SELECT * and Adding Column Issue in View – Limitation of the View 4
    SQL SERVER – COUNT(*) Not Allowed but COUNT_BIG(*) Allowed – Limitation of the View 5
    SQL SERVER – UNION Not Allowed but OR Allowed in Index View – Limitation of the View 6
    SQL SERVER – Cross Database Queries Not Allowed in Indexed View – Limitation of the View 7
    SQL SERVER – Outer Join Not Allowed in Indexed Views – Limitation of the View 8
    SQL SERVER – SELF JOIN Not Allowed in Indexed View – Limitation of the View 9
    SQL SERVER – Keywords View Definition Must Not Contain for Indexed View – Limitation of the View 10
    SQL SERVER – View Over the View Not Possible with Index View – Limitations of the View 11

    SQL SERVER – Get Query Running in Session
    I was recently looking for the syntax to get the query running in a particular session. I always remembered the syntax and had actually written it down before, but somehow it was not coming to mind quickly this time. I searched online and ended up on my own article written last year, SQL SERVER – Get Last Running Query Based on SPID. I felt that I am getting old because I forgot this really simple syntax.

    Find Total Number of Transaction on Interval
    In one of my recent Performance Tuning assignments I was asked how someone would know how many transactions are happening on a server during a certain interval. I had a handy script for the same. The following script displays the transactions that happened on the server at an interval of one minute. You can change the WAITFOR DELAY to any other interval and it should work.

    2011

    Here are two DMVs which were newly introduced in SQL Server 2012 and provide vital information about SQL Server:
    DMV – sys.dm_os_volume_stats – Information about operating system volume
    DMV – sys.dm_os_windows_info – Information about Operating System

    SQL Backup and FTP – A Quick and Handy Tool
    I have used this tool extensively since 2009 on numerous occasions and found it to be very impressive. What separates it from the crowd the most is its apparent simplicity and speed. When I install SQLBackupAndFTP and configure backups, all in 1 or 2 minutes, my clients are always impressed.

    Quick Note about JOIN – Common Questions and Simple Answers
    In this blog post we are going to talk about JOIN and lots of things related to the JOIN. I recently started office hours to answer questions and issues of the community. I receive so many questions that are related to JOIN. I will share a few of the same over here. Most of them are basic, but note that the basics are of great importance.

    2012

    Importance of User Without Login
    Question: "In recent versions of SQL Server we can create a user without a login. What is the use of it?" Great question indeed. Let me first attempt to answer this question, but after reading my answer I need your help. I want you to help him as well by adding more value to it.

    Preserve Leading Zero While Copying to Excel from SSMS
    Earlier I wrote two articles about how to efficiently copy data from SSMS to Excel. Since I wrote those posts, plenty of interest has been generated on this subject. There are a few questions I keep on getting on this subject; one of them is how to preserve the leading zero while copying data from SSMS to Excel. Well, it is almost the same as in my earlier post SQL SERVER – Excel Losing Decimal Values When Value Pasted from SSMS ResultSet. The key here is in Excel and not in SQL Server.

    Solution – 2 T-SQL Puzzles – Display Star and Shortest Code to Display 1
    Earlier on this blog we had asked two puzzles. The response from all of you was nothing but amazing; I have received 350+ responses. Many are valid, and many were indeed something I had not thought about. I strongly suggest you read all the puzzles and their answers here; trust me, if you start reading the comments you will not stop till you read every single comment. Seriously, trust me on it. Personally I have learned a lot from it.

    Identify Most Resource Intensive Queries – SQL in Sixty Seconds #028 – Video
    http://www.youtube.com/watch?v=TvlYy-TGaaA

    Importance of User Without Login – T-SQL Demo Script
    Earlier I wrote a blog post about SQL SERVER – Importance of User Without Login, and my friend and SQL Expert Vinod Kumar has written an excellent follow-up blog post about Contained Databases inside SQL Server 2012. Now lots of people have asked me if I can also explain the same concept again, so here is a small demonstration of it. Let me show you how a user without login can help. Before we continue on this subject I strongly recommend that you read my earlier blog post here. In the following demo I am going to demonstrate the following situation: login using the system admin account, create a user without login, check access, impersonate the user without login, check access, revert impersonation, give permission to the user without login, impersonate the user without login, check access, revert impersonation, and clean up.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
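
    As a quick, hedged illustration of the OUTPUT clause recalled in the 2007 section above, here is a minimal T-SQL sketch (the table and column names are hypothetical):

        -- Return the rows actually removed by the DELETE via the OUTPUT clause
        DELETE FROM dbo.Orders
        OUTPUT DELETED.OrderID, DELETED.OrderTotal
        WHERE OrderDate < '20050101';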

    Read the article

  • Awesome Serenity (Firefly) – My Little Pony Movie Trailer Mashup [Video]

    - by Asian Angel
    Recently we featured an awesome Watchmen – My Little Pony mashup and today we are back with another great movie trailer mixer. This latest mashup video from BronyVids once again features the ever popular ponies and the movie trailer from the 2005 movie Serenity. Just for fun, here is the original Serenity trailer that the video above is based on. My Little Serenity [via Geeks are Sexy] Serenity (2005) Trailer 1080p HD [YouTube]

    Read the article

  • How your Standard can become AWEsome

    - by NeilHambly
    Having tried to make a fun play on words to illustrate that for Standard Editions of SQL Server 2005/2008, since the releases of these Cumulative Updates (SQL 2005 SP3 & CU4 / SQL 2008 SP1 & CU2), we can make real use of AWE! Since mid-2009, when these CUs were released, we have had the ability to make use of the required privilege "lock pages in memory", which previously was only available in Enterprise Edition, allowing us to make use of those AWE APIs for resolving working set trim issues that resulted...(read more)
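
    As a quick sketch of the server-side switch involved (a hypothetical 32-bit SQL Server 2005/2008 Standard Edition instance patched to the CU levels above, with the service account already granted the "lock pages in memory" privilege):

        -- 'awe enabled' is an advanced option; it takes effect after a service restart
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'awe enabled', 1;
        RECONFIGURE;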

    Read the article

  • An XEvent a Day (21 of 31) – The Future – Tracking Blocking in Denali

    - by Jonathan Kehayias
    One of my favorite features added to SQL Server 2005 has been the Blocked Process Report trace event, which collects an XML report whenever a process is blocked inside the database engine longer than the user-configurable threshold. I wrote an article about this feature on SQL Server Central two years ago titled Using the Blocked Process Report in SQL Server 2005/2008. One of the aspects of this feature is that it requires that you either have a SQL Trace running that...(read more)
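
    For context, the user-configurable threshold mentioned above is set through sp_configure; a minimal sketch follows (the 5-second threshold is an arbitrary example value):

        -- Emit a Blocked Process Report for anything blocked longer than 5 seconds
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'blocked process threshold', 5;
        RECONFIGURE;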

    Read the article

  • VS.Php Standalone Edition

    - by Editor
    VS.Php is a fast, feature-rich development environment for Php developers. It comes with only the Visual Studio 2005 IDE components necessary for developing Php applications, like a WYSIWYG Html editor and a powerful Xml editor. Benefits: one tool for all your Php development needs. Based on Microsoft® Visual Studio® 2005, the most popular IDE on the [...]

    Read the article

  • Using "Show values as" option in Excel 2007 pivot table when source is SSAS cube?

    - by Brad
    I have an Excel 2007 pivot table showing "Year" across the top and "Month" down the side. What I am trying to do is represent the values as "% Difference" from the same month of the previous year. (Ex. If Jan-07 is $100,000 and Jan-08 is $120,000, I would like Jan-08 to show '20%'). However, every time I try to do this (using the "Show values as" tab of Value Field Settings) all of my numbers go to '#N/A'. Is there a way to do this using an Analysis Services cube as the data source? When I do this exact same thing using data on a different sheet as the data source for the pivot table, it works fine. Thanks in advance for any insight into this.

    Read the article

  • Backup solution, or, how Duplicati duped me

    - by blarghmaster
    TL;DR version: Mono + Duplicati.commandline.exe restore etc. etc. spits this out for several files regardless of what I try. I am able to list sets, list files in said sets, even do a verify, but each time I do a restore of any kind, I get errors to the effect of:

        Failed to restore file: "snapshot/blahblah/2005-11-07.tar.gz", Error message: The partial file record for snapshot/blahblah/2005-11-07.tar.gz does not match the file

    Any advice here, or an idea of where to look for a better solution?

    FULL STORY: I've recently put together a nice, clean, friendly backup solution for several servers, predominantly Linux, but occasionally a Windows box is added too. The solution as is meets all my requirements and does it well... save one: cross-compatibility. The solution is based on a combination of several elements, but eventually comes down to using Duplicity and Duplicati for the actual storage of files. The entire solution was ready to go before I realized that Duplicati does not, in fact, allow me to restore my files to a Linux box, regardless of what the command line under Mono might tell you. It just spits out errors on random zip and image files, for apparently no good reason, as I have tried several options to get it to restore and several versions of Mono, including installing it pretty much lib-for-lib. There is no effective log file for the reasons for these errors, and even the "--debug-output=true" flag does nothing. I am able to list sets, list files in said sets, even do a verify, but each time I do a restore of any kind, I get errors to the effect of:

        Failed to restore file: "snapshot/blahblah/2005-11-07.tar.gz", Error message: The partial file record for snapshot/blahblah/2005-11-07.tar.gz does not match the file

    Now I could most likely use the friendly instructions on Duplicati's site and script a bash equivalent of the restore, but that's not exactly ideal. Any advice on this? Or possibly an alternative solution that presents the same benefits of Duplicati/Duplicity but actually works across platforms?

    Read the article

  • "No such file or directory"?

    - by user1509541
    Ok, so I have a VDS lying around, and I thought I would turn it into a TF2 game server. I connect to my server through PuTTY and use wget to download the package "hldsupdatetool.bin" from steampowered.com. When I go to run it, it says "No such file or directory". When I use "ls" to see what files are in the directory, it lists "hldsupdatetool.bin" as being there. So why is it saying it's not there? This has been a headache for the past 2 days. It's returning:

        root@10004:~# wget http://www.steampowered.com/download/hldsupdatetool.bin
        --2012-07-08 06:04:49--  http://www.steampowered.com/download/hldsupdatetool.bin
        Resolving www.steampowered.com... 208.64.202.68
        Connecting to www.steampowered.com|208.64.202.68|:80... connected.
        HTTP request sent, awaiting response... 200 OK
        Length: 3513408 (3.4M) [application/octet-stream]
        Saving to: “hldsupdatetool.bin.3”

        100%[======================================>] 3,513,408   2.45M/s   in 1.4s

        2012-07-08 06:04:51 (2.45 MB/s) - “hldsupdatetool.bin.3” saved [3513408/3513408]

        root@10004:~# chmod +x hldsupdatetool.bin.3
        root@10004:~# ./hldsupdatetool.bin.3
        -bash: ./hldsupdatetool.bin.3: No such file or directory
        root@10004:~#

    More:

        root@10004:~# ls
        ffmpeg-packages  hldsupdatetool.bin.1  hldsupdatetool.bin.3
        hldsupdatetool.bin  hldsupdatetool.bin.2  setup.sh
        root@10004:~# ls -la
        total 13828
        drwx------  4 root root    4096 Jul  8 06:04 .
        drwxr-xr-x 21 root root    4096 Jul  8 05:57 ..
        -rw-------  1 root root    8799 Jul  8 06:26 .bash_history
        -rw-r--r--  1 root root     570 Jan 31  2010 .bashrc
        -rw-r--r--  1 root root       4 Jul  2 19:39 .custombuild
        drwxr-xr-x  2 root root    4096 Jul  4 18:49 ffmpeg-packages
        ---x--xrwx  1 root root 3513408 Sep  2  2005 hldsupdatetool.bin
        -rwxr-xr-x  1 root root 3513408 Sep  2  2005 hldsupdatetool.bin.1
        -rw-r--r--  1 root root 3513408 Sep  2  2005 hldsupdatetool.bin.2
        -rwxr-xr-x  1 root root 3513408 Sep  2  2005 hldsupdatetool.bin.3
        -rw-r--r--  1 root root     140 Nov 19  2007 .profile
        -rw-------  1 root root    1024 Jul  2 19:49 .rnd
        -rwxr-xr-x  1 root root   38866 May 23 22:02 setup.sh
        drwxr-xr-x  2 root root    4096 Jul  2 19:44 .ssh
        root@10004:~#

    Read the article

  • Change namespace prefix in WCF Envelope

    - by activebiz
    I was wondering, is there any way to change the namespace prefix for the WCF SOAP request? As you can see in the example below, the Envelope has the namespace "http://www.w3.org/2005/08/addressing" with prefix 'a'. I want to change this to 'foo'. How can I do that? Note I don't have control over the service code; I can only create a proxy class from the WSDL.

        <s:Envelope xmlns:a="http://www.w3.org/2005/08/addressing" xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
          <s:Header>
            <a:Action s:mustUnderstand="1">http://www.starstandards.org/webservices/2005/10/transport/operations/MyAction</a:Action>
            <h:payloadManifest xmlns="http://www.starstandards.org/webservices/2005/10/transport" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:h="http://www.starstandards.org/webservices/2005/10/transport">
              <manifest contentID="Content0" namespaceURI="http://www.starstandard.org/STAR/5" element="TESTMETHOD" version="5.2.4"></manifest>
            </h:payloadManifest>
            <h:Identity xmlns="urn:xxx/xxx/" xmlns:h="urn:xxx/xxx">
              <SiteCode>XXXXXX</SiteCode>
            </h:Identity>
            <a:To>urn:xxx/xxx/Method1</a:To>
            <MessageID xmlns="http://www.w3.org/2005/08/addressing">XXXXX</MessageID>
            <a:ReplyTo>
              <a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address>
            </a:ReplyTo>
          </s:Header>

    Many thanks

    Read the article

  • Trace File Source Adapter

    The Trace File Source adapter is a useful addition to your SSIS toolbox. It allows you to read SQL Server 2005 and 2008 Profiler traces stored as .trc files into the Data Flow. From there you can perform filtering and analysis using the power of SSIS. There is no need for a SQL Server connection; the component just uses the trace file.

    Example Usages
    Cache warming for SQL Server Analysis Services
    Reading the flight recorder
    Find out the longest running queries on a server
    Analyze statements for CPU, memory by user or some other criteria you choose

    Properties
    The Trace File Source adapter has two properties, both of which combine to control the source trace file that is read at runtime. SQL Server 2005 and SQL Server 2008 trace files are supported for both the Database Engine (SQL Server) and Analysis Services. The properties are managed by the Editor form or can be set directly from the Properties Grid in Visual Studio.

    AccessMode (Enumeration): Determines how the Filename property is interpreted. The available values are DirectInput and Variable.
    Filename (String): Holds the path of the trace file to load (*.trc). The value is either a full path, or the name of a variable which contains the full path to the trace file, depending on the AccessMode property.

    Trace Column Definition
    Hopefully the majority of you can skip this section entirely, but if you encounter problems processing a trace file this may explain them and allow you to fix them. The component is built upon the trace management API provided by Microsoft. Unfortunately, the API methods that expose the schema of a trace file have known issues and are unreliable; put simply, the data often differs from what was specified. To overcome these limitations the component uses some simple XML files, which enable the trace column data types and sizing attributes to be overridden. For example, SQL Server Profiler or TMO generated structures define EventClass as an integer, but the real value is a string.

    TraceDataColumnsSQL.xml - SQL Server Database Engine Trace Columns
    TraceDataColumnsAS.xml - SQL Server Analysis Services Trace Columns

    The files can be found in the %ProgramFiles%\Microsoft SQL Server\100\DTS\PipelineComponents folder, e.g.
    "C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TraceDataColumnsSQL.xml"
    "C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TraceDataColumnsAS.xml"

    If at runtime the component encounters a type conversion or sizing error, it is most likely due to a discrepancy between the column definition as reported by the API and the actual value encountered. While most common issues have already been fixed through these files, we have implemented specific exception traps to direct you to the files, enabling you to fix any further issues caused by different usage or data scenarios that we have not tested. An example error that you can fix through these files is shown below.

        Buffer exception writing value to column 'Column Name'. The string value is 999 characters in length, the column is only 111. Columns can be overridden by the TraceDataColumns XML files in "C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TraceDataColumnsAS.xml".

    Installation
    The component is provided as an MSI file which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache as per Microsoft's recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. Finally you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Trace File Source transformation in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component? We recommend you follow best practice and apply the current Microsoft SQL Server Service Pack to your SQL Server servers and workstations. Please note that the Microsoft Trace classes used in the component are not supported on 64-bit platforms. To use the Trace File Source on a 64-bit host you need to ensure you have the 32-bit (x86) tools available, and that the way you execute your package is set up to use them; please see the help topic 64-bit Considerations for Integration Services for more details.

    Downloads
    Trace Sources for SQL Server 2005
    Trace Sources for SQL Server 2008

    Version History
    SQL Server 2008: Version 2.0.0.382 - SQL Server 2008 public release. (9 Apr 2009)
    SQL Server 2005: Version 1.0.0.321 - SQL Server 2005 public release. (18 Nov 2008)
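
    For comparison, when a Database Engine connection is available, a .trc file can also be queried directly with the built-in fn_trace_gettable function; a minimal sketch covering the "longest running queries" usage above (the file path is hypothetical):

        SELECT TOP (100) TextData, Duration, CPU, Reads, Writes
        FROM fn_trace_gettable(N'C:\Traces\MyTrace.trc', DEFAULT)
        ORDER BY Duration DESC;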

    Read the article

  • Migrating SQL Server Databases – The DBA’s Checklist (Part 2)

    - by Sadequl Hussain
    Continuing from Part 1, our Migration Checklist continues:

    Step 5: Update statistics
    It is always a good idea to update the statistics of the database that you have just installed or migrated. To do this, run the following command against the target database: sp_updatestats. The sp_updatestats system stored procedure runs the UPDATE STATISTICS command against every user and system table in the database. However, a word of caution: running sp_updatestats against a database with a compatibility level below 90 (SQL Server 2005) will reset the automatic UPDATE STATISTICS settings for every index and statistic of every table in the database. You may therefore want to change the compatibility mode before you run the command. Another thing you should remember to do is to ensure the new database has its AUTO_CREATE_STATISTICS and AUTO_UPDATE_STATISTICS properties set to ON. You can do so using the ALTER DATABASE command or from SSMS.

    Step 6: Set database options
    You may have to change the state of a database after it has been restored. If the database was changed to single-user or read-only mode before backup, the restored copy will also retain these settings. This may not be an issue when you are manually restoring from Enterprise Manager or Management Studio, since you can change the properties. However, this is something to be mindful of if the restore process is invoked by an automated job or script and the database needs to be written to immediately after restore. You may want to check the database's status programmatically in such cases. Another important option you may want to set for the newly restored / attached database is PAGE_VERIFY. This option specifies how you want SQL Server to ensure the physical integrity of the data. It is a new option from SQL Server 2005 and can have three values: CHECKSUM (default for SQL Server 2005 and later databases), TORN_PAGE_DETECTION (default when restoring a pre-SQL Server 2005 database) or NONE. Torn page detection was itself an option for SQL Server 2000 databases. From SQL Server 2005, when PAGE_VERIFY is set to CHECKSUM, the database engine calculates the checksum for a page's contents and writes it to the page header before storing it on disk. When the page is read from the disk, the checksum is computed again and compared with the checksum stored in the header. Torn page detection works in much the same way in that it stores a bit in the page header for every 512-byte sector. When data is read from the page, the torn page bits stored in the header are compared with the respective sector contents. When PAGE_VERIFY is set to NONE, SQL Server does not perform any checking, even if torn page data or checksums are present in the page header. This may not be something you would want to set unless there is a very specific reason. Microsoft suggests using the CHECKSUM page verify option as this offers more protection.

    Step 7: Map database users to logins
    A common database migration issue is related to user access. Windows and SQL Server native logins that existed in the source instance and had access to the database may not be present in the destination. Even if the logins exist in the destination, the mapping between the user accounts and the logins will not be automatic. You can use a special system stored procedure called sp_change_users_login to address these situations. The procedure needs to be run against the newly attached or restored database and can accept four parameters. Depending on what you want to do, you may be using fewer than four though. The first parameter, @Action, can take three values. When you specify @Action = 'Report', the system will provide you with a list of database users which are not mapped to any login. If you want to map a database user to an existing SQL Server login, the value for @Action will be 'Update_One'. In this case, you will only need to provide the database user name and the login it will map to. So if your newly restored database has a user account called "bob" and there is already a SQL Server login with the same name and you want to map the user to the login, you will execute a query like the following:

        sp_change_users_login
            @Action = 'Update_One',
            @UserNamePattern = 'bob',
            @LoginName = 'bob'

    If the login does not exist, you can instruct SQL Server to create the login with the same name. In this case you will need to provide a password for the login and the value of the @Action parameter will be 'Auto_Fix'. If the login already exists, it will be automatically mapped to the user account. Unfortunately the sp_change_users_login system stored procedure cannot be used to map database users to trusted logins (Windows accounts) in SQL Server. You will need to follow a manual process to re-map the database user accounts. Continues…
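
    A minimal T-SQL sketch pulling steps 5 through 7 together (the database name, user name and password are placeholders):

        -- Step 5: refresh statistics on the migrated database
        USE MigratedDB;
        EXEC sp_updatestats;

        -- Step 6: switch to the stronger page verification option
        ALTER DATABASE MigratedDB SET PAGE_VERIFY CHECKSUM;

        -- Step 7: list orphaned users, then map or create the matching login
        EXEC sp_change_users_login @Action = 'Report';
        EXEC sp_change_users_login @Action = 'Auto_Fix',
             @UserNamePattern = 'bob', @Password = 'Str0ngP@ss!';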

    Read the article

  • WIF-less claim extraction from ACS: SWT

    - by Elton Stoneman
    WIF with SAML is solid and flexible, but unless you need the power, it can be overkill for simple claim assertion, and in the REST world WIF doesn't have support for the latest token formats. Simple Web Token (SWT) may not be around forever, but while it's here it's a nice easy format which you can manipulate in .NET without having to go down the WIF route. Assuming you have set up a Relying Party in ACS, specifying SWT as the token format, when ACS redirects to your login page it will POST the SWT in the first form variable. It comes through in the BinarySecurityToken element of a RequestSecurityTokenResponse XML payload; the SWT type is specified with a TokenType of http://schemas.xmlsoap.org/ws/2009/11/swt-token-profile-1.0:

        <t:RequestSecurityTokenResponse xmlns:t="http://schemas.xmlsoap.org/ws/2005/02/trust">
          <t:Lifetime>
            <wsu:Created xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2012-08-31T07:31:18.655Z</wsu:Created>
            <wsu:Expires xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2012-08-31T09:11:18.655Z</wsu:Expires>
          </t:Lifetime>
          <wsp:AppliesTo xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy">
            <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">
              <Address>http://localhost/x.y.z</Address>
            </EndpointReference>
          </wsp:AppliesTo>
          <t:RequestedSecurityToken>
            <wsse:BinarySecurityToken wsu:Id="uuid:fc8d3332-d501-4bb0-84ba-d31aa95a1a6c" ValueType="http://schemas.xmlsoap.org/ws/2009/11/swt-token-profile-1.0" EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd" xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">[ base64string ]</wsse:BinarySecurityToken>
          </t:RequestedSecurityToken>
          <t:TokenType>http://schemas.xmlsoap.org/ws/2009/11/swt-token-profile-1.0</t:TokenType>
          <t:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</t:RequestType>
          <t:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</t:KeyType>
        </t:RequestSecurityTokenResponse>

    Reading the SWT is as simple as base-64 decoding, then URL-decoding the element value:

        var wrappedToken = XDocument.Parse(HttpContext.Current.Request.Form[1]);
        var binaryToken = wrappedToken.Root.Descendants("{http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd}BinarySecurityToken").First();
        var tokenBytes = Convert.FromBase64String(binaryToken.Value);
        var token = Encoding.UTF8.GetString(tokenBytes);
        var tokenType = wrappedToken.Root.Descendants("{http://schemas.xmlsoap.org/ws/2005/02/trust}TokenType").First().Value;

    The decoded token contains the claims as key/value pairs, along with the issuer, audience (ACS realm), expiry date and an HMAC hash, all in query string format. Separate them on the ampersand, and you can write out the claim values in your logged-in page:

        var decoded = HttpUtility.UrlDecode(token);
        foreach (var part in decoded.Split('&'))
        {
            Response.Write("<pre>" + part + "</pre><br/>");
        }

    which will produce something like this:

        http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationinstant=2012-08-31T06:57:01.855Z
        http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod=http://schemas.microsoft.com/ws/2008/06/identity/authenticationmethod/windows
        http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname=XYZ
        http://schemas.xmlsoap.org/ws/2005/05/identity/claims/[email protected]
        http://schemas.xmlsoap.org/ws/2005/05/identity/claims/[email protected]
        http://schemas.microsoft.com/accesscontrolservice/2010/07/claims/identityprovider=http://fs.svc.xyz.com/adfs/services/trust
        Audience=http://localhost/x.y.z
        ExpiresOn=1346402225
        Issuer=https://x-y-z.accesscontrol.windows.net/
        HMACSHA256=oDCeEDDAWEC8x+yBnTaCLnzp4L6jI0Z/xNK95PdZTts=

    The HMAC hash lets you validate the token to ensure it hasn't been tampered with. You'll need the token signing key from ACS, then you can re-sign the token and compare hashes. There's a full implementation of an SWT parser and validator here: How To Request SWT Token From ACS And How To Validate It At The REST WCF Service Hosted In Windows Azure, and a cut-down claim inspector on my GitHub code gallery: ACS Claim Inspector. Interestingly, ACS lets you have a value for your logged-in page which has no relation to the realm for authentication, so you can put this code into a generic claim inspector page, and set that to be your logged-in page for any relying party where you want to check what's being sent through. Particularly handy with ADFS, when you're modifying the claims provided, and want to quickly see the results.

    Read the article

  • TechEd North America 2012 – Day 1 #msTechEd

    - by Marco Russo (SQLBI)
    Yesterday Alberto and I delivered the PreCon day about BISM Tabular in Analysis Services 2012. We received very good feedback and now I am looking forward to meeting people who read our blogs and our books! Ping me on Twitter at @marcorus if you want to contact me during the conference. This is my schedule for the next few days:

    Monday, June 11, 2012
    10:30am-12:30pm: I will be in the Technical Learning Center area, at Breakthrough Insights (station #8) in the Database & Business Intelligence area (dedicated to SQL Server 2012).
    I will try to watch some sessions in the afternoon.
    6:30pm-7:00pm: I will be at the O'Reilly booth meeting book readers and doing some book signing.

    Tuesday, June 12, 2012
    12:30pm-3:30pm: I will be in the Technical Learning Center area, at Breakthrough Insights (station #8) in the Database & Business Intelligence area (dedicated to SQL Server 2012).
    5:00pm-6:15pm: I will attend Alberto's session DBI413 Many-to-Many Relationships in BISM Tabular (room S330E).
    6:15pm-9:00pm: Community Night & Ask the Experts, where we'll discuss Analysis Services, Tabular and Multidimensional!

    Wednesday, June 13, 2012
    11:15am-11:30am: Don't miss this special demo session at the Private Cloud, Public Cloud and Data Platform Theater in the Technical Learning Center area (next to the SQL Server 2012 zone). Alberto and I will present Querying multi-billion rows with many-to-many relationships in SSAS Tabular (xVelocity), and you're invited to guess the response time of DAX queries on a 4 billion row table with many-to-many relationships before we run them! We'll give away some 8GB USB keys if you guess the right answer!
    12:30pm-1:00pm: Alberto and I will have a book signing session at the TechEd Bookstore.
    3:00pm-5:00pm: I will be in the Technical Learning Center area, at Breakthrough Insights (station #8) in the Database & Business Intelligence area (dedicated to SQL Server 2012).

    Thursday, June 14, 2012
    2:45pm-4:00pm: I will deliver my DBI319 BISM: Multidimensional vs. Tabular breakthrough session in room S320A. I expect many questions here!

    And if you want to learn more about Analysis Services Tabular, we announced two more online sessions of our SSAS Tabular Workshop:
    July 2-3, 2012 - SSAS Workshop Online - America's time zone
    September 3-4, 2012 - SSAS Workshop Online - America's time zone

    Register now if you are interested; the early bird for the July session expires on June 19, 2012! I will also deliver an SSAS Workshop in Oslo (Norway) on August 27-28, 2012.

    Read the article

  • SQL Server to SQL Server linked server setup

    - by ScottStonehouse
    Please explain what is required to set up a SQL Server linked server. Server A is SQL 2005 windows logins only Server B is the same (SQL 2005 windows logins only) Server A runs windows XP Server B runs Windows Server 2003 Both SQL Server services are running under the same domain account. I am logged into my workstation with a domain account that has administrative rights on both SQL Servers. Note these are both SQL Server 2005 SP2 - I've had old hotfixes pointed out to me, but those are already applied. The issue I am having is this error: "Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'. (Microsoft SQL Server, Error: 18456)"
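
    For reference, a minimal sketch of the usual setup commands, run on Server A (the server name is a placeholder; @useself = 'True' passes your Windows credentials through, which is exactly where the ANONYMOUS LOGON error tends to surface when Kerberos delegation between the two machines is not configured):

        EXEC sp_addlinkedserver @server = N'SERVERB', @srvproduct = N'SQL Server';
        EXEC sp_addlinkedsrvlogin @rmtsrvname = N'SERVERB',
             @useself = N'True', @locallogin = NULL;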

    Read the article

  • Debugging in Eclipse - Run till breakpoint

    - by pragadheesh
    I am trying to debug my Java code in Eclipse. Using breakpoints and Debug mode, the control hits the breakpoint, after which I can use F6 to step through my code. Consider that I have my breakpoint inside a for loop. In Visual Studio 2005, if we hit Execute (F5), it would stop at the next breakpoint. How can I achieve the same in Eclipse? Also consider that I make a change while debugging, so I want to stop the execution and restart it again from the beginning, like we have Stop Execution in VS 2005. Mainly for those who have extensively used Visual Studio 2005: how does Eclipse provide similar functionality?

    Read the article

  • Lync using SQL 2008 R2 SP1 - Publish Topology error

    - by EKS
    Error shown in the web page opened by the Topology Builder:

        Error: An error occurred: "Microsoft.Rtc.Common.Data.SqlConnectionException" "A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)"

    Looking in the log file (I assume this is the actual error making it stop):

        Installed SQL Server 2005 Backward Compatibility version is 8.05.1054
        Error: SQL Server 2005 Backward Compatibility is not installed or its version is not high enough. SQL Server 2005 Backward Compatability SP2 or higher must be installed.

    I have installed SQLServer2005_BC_x64.msi from MS, and can't seem to find this SP2 version. The SQL Server is 2008 R2 SP1. I have also tested with a 2008 SP3 SQL Server; same error. Named pipes output via (OSQL /L):

        SQL-2008-1
        SQL2

    Read the article

  • SQL SERVER – What is AdventureWorks?

    - by pinaldave
    NOTE: If you know the answer to this question, then I request you to stop reading this post right now. Please do not leave a comment saying this blog post was not useful to you if you knew the answer. A few days ago, I received a DM asking what the AdventureWorks database is and why I use it in all my examples instead of any other database (e.g. Pubs or Northwind). As a matter of fact, when I went back to my question list, which I have not yet answered, there were a few more variations of this same question. AdventureWorks is a sample database shipped with SQL Server and it can be downloaded from the http://codeplex.com site. AdventureWorks replaced Northwind and Pubs as the sample database in SQL Server 2005. The Microsoft team keeps updating the sample database as they release new versions. Here are some quick links:
    AdventureWorks SQL Server 2008 SR4
    AdventureWorks 2008R2 November CTP
    AdventureWorks for SQL Azure (December CTP)
    AdventureWorks for SQL Server 2005 SP2A
    SQL SERVER – 2008 – Download and Install Samples Database AdventureWorks 2005 – Detail Tutorial
    I have previously written a few other articles on the same subject; you can find them easily here: [email protected] Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Fraud Detection with the SQL Server Suite Part 1

    - by Dejan Sarka
    While working on different fraud detection projects, I developed my own approach to the solution of this problem. In my PASS Summit 2013 session I am introducing this approach. I also wrote a whitepaper on the same topic, which was generously reviewed by my friend Matija Lah. In order to spread this knowledge faster, I am starting a series of blog posts which will, at the end, make up the whole whitepaper.

    Abstract
    With the massive usage of credit cards and web applications for banking and payment processing, the number of fraudulent transactions is growing rapidly and on a global scale. Several fraud detection algorithms are available within a variety of different products. In this paper, we focus on using the Microsoft SQL Server suite for this purpose. In addition, we will explain our original approach to solving the problem by introducing a continuous learning procedure. Our preferred type of service is mentoring; it allows us to perform the work and consulting together with transferring the knowledge onto the customer, thus making it possible for a customer to continue to learn independently. This paper is based on practical experience with different projects covering online banking and credit card usage.

    Introduction
    A fraud is a criminal or deceptive activity with the intention of achieving financial or some other gain. Fraud can appear in multiple business areas. You can find a detailed overview of the business domains where fraud can take place in Sahin Y., & Duman E. (2011), Detecting Credit Card Fraud by Decision Trees and Support Vector Machines, Proceedings of the International MultiConference of Engineers and Computer Scientists 2011 Vol 1. Hong Kong: IMECS. Dealing with frauds includes fraud prevention and fraud detection. Fraud prevention is a proactive mechanism, which tries to disable frauds by using previous knowledge. Fraud detection is a reactive mechanism with the goal of detecting suspicious behavior when a fraudster surpasses the fraud prevention mechanism. A fraud detection mechanism checks every transaction and assigns a weight in terms of probability between 0 and 1 that represents a score for evaluating whether a transaction is fraudulent or not. A fraud detection mechanism cannot detect frauds with a probability of 100%; therefore, manual transaction checking must also be available. With fraud detection, this manual part can focus on the most suspicious transactions. This way, an unchanged number of supervisors can detect significantly more frauds than could be achieved with traditional methods of selecting which transactions to check, for example with random sampling.

    There are two principal data mining techniques available, both in general data mining and in specific fraud detection techniques: supervised (or directed) and unsupervised (or undirected). Supervised techniques or data mining models use previous knowledge. Typically, existing transactions are marked with a flag denoting whether a particular transaction is fraudulent or not. Customers at some point in time do report frauds, and the transactional system should be capable of accepting such a flag. Supervised data mining algorithms try to explain the value of this flag by using different input variables. When the patterns and rules that lead to frauds are learned through the model training process, they can be used for prediction of the fraud flag on new incoming transactions. Unsupervised techniques analyze data without prior knowledge, without the fraud flag; they try to find transactions which do not resemble other transactions, i.e. outliers. In both cases, there should be more frauds in the data set selected for checking by using the data mining knowledge compared to selecting the data set with simpler methods; this is known as the lift of a model. Typically, we compare the lift with random sampling. The supervised methods typically give a much better lift than the unsupervised ones. However, we must use the unsupervised ones when we do not have any previous knowledge. Furthermore, unsupervised methods are useful for controlling whether the supervised models are still efficient.

    Accuracy of the predictions drops over time. Patterns of credit card usage, for example, change over time. In addition, fraudsters continuously learn as well. Therefore, it is important to check the efficiency of the predictive models with the undirected ones. When the difference between the lift of the supervised models and the lift of the unsupervised models drops, it is time to refine the supervised models. However, the unsupervised models can become obsolete as well. It is also important to measure the overall efficiency of both supervised and unsupervised models over time. We can compare the number of predicted frauds with the total number of frauds that include predicted and reported occurrences. For measuring behavior across time, specific analytical databases called data warehouses (DW) and on-line analytical processing (OLAP) systems can be employed. By controlling the supervised models with unsupervised ones and by using an OLAP system or DW reports to control both, a continuous learning infrastructure can be established.

    There are many difficulties in developing a fraud detection system. As has already been mentioned, fraudsters continuously learn, and the patterns change. The exchange of experiences and ideas can be very limited due to privacy concerns. In addition, both data sets and results might be censored, as the companies generally do not want to publicly expose actual fraudulent behaviors. Therefore it can be quite difficult, if not impossible, to cross-evaluate the models using data from different companies and different business areas. This fact stresses the importance of continuous learning even more. Finally, the number of frauds in the total number of transactions is small; typically much less than 1% of transactions are fraudulent. Some predictive data mining algorithms do not give good results when the target state is represented with a very low frequency. Data preparation techniques like oversampling and undersampling can help overcome the shortcomings of many algorithms.

    The SQL Server suite includes all of the software required to create, deploy and maintain a fraud detection infrastructure. The Database Engine is the relational database management system (RDBMS), which supports all activity needed for data preparation and for data warehouses. SQL Server Analysis Services (SSAS) supports OLAP and data mining (in version 2012, you need to install SSAS in multidimensional and data mining mode; this was the only mode in previous versions of SSAS, while SSAS 2012 also supports the tabular mode, which does not include data mining). Additional products from the suite can be useful as well. SQL Server Integration Services (SSIS) is a tool for developing extract-transform-load (ETL) applications. SSIS is typically used for loading a DW, and in addition, it can use SSAS data mining models for building intelligent data flows. SQL Server Reporting Services (SSRS) is useful for presenting the results in a variety of reports. Data Quality Services (DQS) mitigates the occasional data cleansing process by maintaining a knowledge base. Master Data Services is an application that helps companies maintain a central, authoritative source of their master data, i.e. the most important data of any organization. For an overview of the SQL Server business intelligence (BI) part of the suite that includes the Database Engine, SSAS and SSRS, please refer to Veerman E., Lachev T., & Sarka D. (2009). MCTS Self-Paced Training Kit (Exam 70-448): Microsoft® SQL Server® 2008 Business Intelligence Development and Maintenance. MS Press. For an overview of the enterprise information management (EIM) part that includes SSIS, DQS and MDS, please refer to Sarka D., Lah M., & Jerkic G. (2012). Training Kit (Exam 70-463): Implementing a Data Warehouse with Microsoft® SQL Server® 2012. O'Reilly. For details about SSAS data mining, please refer to MacLennan J., Tang Z., & Crivat B. (2009). Data Mining with Microsoft SQL Server 2008. Wiley. SQL Server Data Mining Add-ins for Office, a free download for Office versions 2007, 2010 and 2013, brings the power of data mining to Excel, enabling advanced analytics in Excel. Together with PowerPivot for Excel, which is also freely downloadable for Excel 2010 and is already included in Excel 2013, it brings OLAP functionalities directly into Excel, making it possible for an advanced analyst to build a complete learning infrastructure using a familiar tool. This way, many more people, including employees in subsidiaries, can contribute to the learning process by examining local transactions and quickly identifying new patterns.
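
    The undersampling technique mentioned in the introduction can be sketched in T-SQL before training an SSAS mining model; this minimal example (the table, column names and sample size are hypothetical) keeps every known fraud and only a random slice of the legitimate transactions:

        SELECT t.*
        FROM dbo.Transactions AS t
        WHERE t.IsFraud = 1
        UNION ALL
        SELECT s.*
        FROM (SELECT TOP (50000) *
              FROM dbo.Transactions
              WHERE IsFraud = 0
              ORDER BY NEWID()   -- random sample of the majority class
             ) AS s;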

    Read the article

  • WIF-less claim extraction from ACS: JWT

    - by Elton Stoneman
    ACS support for JWT still shows as "beta", but it meets the spec and it works nicely, so it's becoming the preferred option as SWT is losing favour. (Note that currently ACS doesn't support JWT encryption; if you want encrypted tokens you need to go SAML.) In my last post I covered pulling claims from an ACS token without WIF, using the SWT format. The JWT format is a little more complex, but you can still inspect claims just with string manipulation. The incoming token from ACS is still presented in the BinarySecurityToken element of the XML payload, with a TokenType of urn:ietf:params:oauth:token-type:jwt:

        <t:RequestSecurityTokenResponse xmlns:t="http://schemas.xmlsoap.org/ws/2005/02/trust">
          <t:Lifetime>
            <wsu:Created xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2012-08-31T07:39:55.337Z</wsu:Created>
            <wsu:Expires xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2012-08-31T09:19:55.337Z</wsu:Expires>
          </t:Lifetime>
          <wsp:AppliesTo xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy">
            <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">
              <Address>http://localhost/x.y.z</Address>
            </EndpointReference>
          </wsp:AppliesTo>
          <t:RequestedSecurityToken>
            <wsse:BinarySecurityToken wsu:Id="_1eeb5cf4-b40b-40f2-89e0-a3343f6bd985-6A15D1EED0CDB0D8FA48C7D566232154" ValueType="urn:ietf:params:oauth:token-type:jwt" EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd" xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">[ base64string ]</wsse:BinarySecurityToken>
          </t:RequestedSecurityToken>
          <t:TokenType>urn:ietf:params:oauth:token-type:jwt</t:TokenType>
          <t:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</t:RequestType>
          <t:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</t:KeyType>
        </t:RequestSecurityTokenResponse>

    The token as a whole needs to be base-64 decoded. The decoded value contains a header, payload and signature, dot-separated; the parts are also base-64, but they need to be decoded using a no-padding algorithm (implementation and more details in this MSDN article on validating an Exchange 2013 identity token). The values are then in JSON; the header contains the token type and the hashing algorithm:

        {"typ":"JWT","alg":"HS256"}

    The payload contains the same data as in the SWT, but in JSON rather than querystring format:

        {"aud":"http://localhost/x.y.z",
         "iss":"https://adfstest-bhw.accesscontrol.windows.net/",
         "nbf":1346398795,
         "exp":1346404795,
         "http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationinstant":"2012-08-31T07:39:53.652Z",
         "http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod":"http://schemas.microsoft.com/ws/2008/06/identity/authenticationmethod/windows",
         "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname":"xyz",
         "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress":"[email protected]",
         "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn":"[email protected]",
         "identityprovider":"http://fs.svc.x.y.z.com/adfs/services/trust"}

    The signature is in the third part of the token. Unlike SWT, which is fixed to HMAC-SHA-256, JWT can support other protocols (the one in use is specified as the "alg" value in the header). How to: Validate an Exchange 2013 identity token contains an implementation of a JWT parser and validator; apart from the custom base-64 decoding part, it's very similar to SWT extraction. I've wrapped the basic SWT and JWT handling in a ClaimInspector.aspx page on GitHub here: SWT and JWT claim inspector. You can drop it into any ASP.Net site and set the URL to be your redirect page in ACS. Swap ACS to issue SWT or JWT, and using the same page you can inspect the claims that come out.

    Read the article
