Search Results

Search found 7857 results on 315 pages for 'scom 2007 r2'.


  • An XEvent a Day (13 of 31) – The system_health Session

    - by Jonathan Kehayias
    Today’s post was originally planned for this coming weekend, but it seems I’ve caught whatever bug my kids had over the weekend, so I am swapping in a post that is shorter and easier to cover.  If you’ve been running some of the queries from the posts in this series, you have no doubt come across an Event Session running on your server with the name of system_health.  In today’s post I’ll go over this session and provide links to references related to it. When Extended Events...(read more)
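
    As a quick sanity check, you can confirm the session is present and running by querying the Active Session DMVs; a minimal sketch:

        -- Is the system_health Event Session currently running?
        SELECT s.name, s.create_time
        FROM sys.dm_xe_sessions AS s
        WHERE s.name = N'system_health';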

    Read the article

  • An XEvent a Day (14 of 31) – A Closer Look at Predicates

    - by Jonathan Kehayias
    When working with SQL Trace, one of my biggest frustrations has been the limitations that exist in filtering.  Using sp_trace_setfilter to establish the filter criteria is a non-trivial task, and it falls short of being able to deliver the complex filtering that is sometimes needed to simplify analysis.  Filtering of trace data was performed globally, applied to the trace as a whole and affecting all of the events being collected.  Extended Events introduces a much better system of filtering using...(read more)
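
    To illustrate the difference, here is a minimal sketch of a per-Event predicate in Extended Events; unlike sp_trace_setfilter, the WHERE clause applies only to the Event it is attached to (the session name and duration threshold are illustrative):

        -- The predicate filters this Event alone, not the whole session
        CREATE EVENT SESSION [LongStatements] ON SERVER
        ADD EVENT sqlserver.sql_statement_completed
            (WHERE (duration > 1000000))  -- illustrative threshold
        ADD TARGET package0.ring_buffer;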

    Read the article

  • An XEvent a Day (15 of 31) – Tracking Ghost Cleanup

    - by Jonathan Kehayias
    If you don’t know anything about Ghost Cleanup, I highly recommend that you go read Paul Randal’s blog posts Inside the Storage Engine: Ghost cleanup in depth , Ghost cleanup redux , and Turning off the ghost cleanup task for a performance gain .  To my knowledge Paul’s posts are the only things that cover Ghost Cleanup at any level online. In this post we’ll look at how you can use Extended Events to track the activity of Ghost Cleanup inside of your SQL Server.  To do this, we’ll first...(read more)
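
    Before building such a session, you can check which ghost-cleanup-related Events your build exposes with a metadata query like this sketch (results vary by version):

        -- Find Events whose name mentions ghost cleanup
        SELECT p.name AS package_name, o.name AS event_name, o.description
        FROM sys.dm_xe_objects AS o
        JOIN sys.dm_xe_packages AS p ON p.guid = o.package_guid
        WHERE o.object_type = N'event'
          AND o.name LIKE N'%ghost%';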

    Read the article

  • An XEvent a Day (17 of 31) – A Look at Backup Internals and How to Track Backup and Restore Throughput (Part 1)

    - by Jonathan Kehayias
    Today’s post is a continuation of yesterday’s post How Many Checkpoints are Issued During a Full Backup? and the investigation of Database Engine Internals with Extended Events.  In today’s post we’ll look at how Backups work inside of SQL Server and how to track the throughput of Backup and Restore operations.  This post is not going to cover Backups in SQL Server as a topic; if that is what you are looking for, see Paul Randal’s TechNet Article Understanding SQL Server Backups . Yesterday...(read more)
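
    To find the backup- and restore-related Events available on your own build before following along, a metadata query like this sketch works (results vary by version):

        -- Find Events whose name or description mentions backup
        SELECT p.name AS package_name, o.name AS event_name, o.description
        FROM sys.dm_xe_objects AS o
        JOIN sys.dm_xe_packages AS p ON p.guid = o.package_guid
        WHERE o.object_type = N'event'
          AND (o.name LIKE N'%backup%' OR o.description LIKE N'%backup%');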

    Read the article

  • An XEvent a Day (5 of 31) - Targets Week – ring_buffer

    - by Jonathan Kehayias
    Yesterday’s post, Querying the Session Definition and Active Session DMV’s , showed how to find information about the Event Sessions that exist inside a SQL Server and how to find information about the Active Event Sessions that are running inside a SQL Server using the Session Definition and Active Session DMV’s.  With the background information now out of the way, and since this post falls on the start of a new week I’ve decided to make this Targets Week, where each day we’ll look at a different...(read more)
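
    For orientation, this is how the raw XML held in a ring_buffer target is typically pulled back for querying; a minimal sketch using the system_health session:

        -- Pull the XML document captured by a session's ring_buffer target
        SELECT CAST(t.target_data AS XML) AS ring_buffer_data
        FROM sys.dm_xe_sessions AS s
        JOIN sys.dm_xe_session_targets AS t
            ON t.event_session_address = s.address
        WHERE s.name = N'system_health'
          AND t.target_name = N'ring_buffer';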

    Read the article

  • An XEvent a Day (12 of 31) – Using the Extended Events SSMS Addin

    - by Jonathan Kehayias
    The lack of SSMS support for Extended Events, coupled with the fact that a number of the existing Events in SQL Trace were not implemented in SQL Server 2008, has no doubt been a key factor in its slow adoption rate. Since the release of SQL Server Denali CTP1, I have already seen a number of blog posts that talk about the introduction of Extended Events in SQL Server, because there is now a stub for it inside of SSMS. Don’t get excited yet, the functionality in CTP1 is very limited at this point,...(read more)

    Read the article

  • Trace Flag 610 – When should you use it?

    - by simonsabin
    Thanks to Marcel van der Holst for providing this great information on the use of Trace Flag 610. This trace flag can be used to get minimal logging into a B-Tree (i.e. a clustered table or an index on a heap) that already has data. It is a trace flag because in testing they found some scenarios where it didn’t perform as well. Marcel explains why below. “ TF610 can be used to get minimal logging in a non-empty B-Tree. The idea is that when you insert a large amount of data, you don't want to...(read more)
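
    For context, the flag is enabled with DBCC TRACEON before the load; a minimal sketch with an illustrative table pair (verify the minimal-logging behavior on your own workload and version):

        -- Session-level trace flag; use DBCC TRACEON (610, -1) for instance-wide
        DBCC TRACEON (610);

        INSERT INTO dbo.TargetTable (Col1, Col2)   -- illustrative non-empty B-Tree
        SELECT Col1, Col2 FROM dbo.StagingTable;   -- illustrative source

        DBCC TRACEOFF (610);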

    Read the article

  • An XEvent a Day (4 of 31) – Querying the Session Definition and Active Session DMV’s

    - by Jonathan Kehayias
    Yesterday’s post, Managing Event Sessions , showed how to manage Event Sessions inside the Extended Events framework in SQL Server. In today's post, we’ll take a look at how to find information about the defined Event Sessions that already exist inside a SQL Server using the Session Definition DMV’s and how to find information about the Active Event Sessions that exist using the Active Session DMV’s. Session Definition DMV’s The Session Definition DMV’s provide information...(read more)
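
    A minimal sketch of the kind of query this enables, comparing the defined sessions (catalog views) against the ones currently running (DMVs):

        -- Which defined Event Sessions are currently running?
        SELECT es.name,
               es.startup_state,
               CASE WHEN xs.name IS NULL THEN 'stopped' ELSE 'running' END AS run_state
        FROM sys.server_event_sessions AS es
        LEFT JOIN sys.dm_xe_sessions AS xs ON xs.name = es.name;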

    Read the article

  • An XEvent a Day (16 of 31) – How Many Checkpoints are Issued During a Full Backup?

    - by Jonathan Kehayias
    This wasn’t my intended blog post for today, but last night a question came across #SQLHelp on Twitter from Varun ( Twitter ). #sqlhelp how many checkpoints are issued during a full backup? The question was answered by Robert Davis (Blog|Twitter) as: Just 1, at the very start. RT @ 1sql : #sqlhelp how many checkpoints are issued during a full backup? This seemed like a great thing to test out with Extended Events so I ran through the available Events in SQL Server 2008, and the only Event related...(read more)
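
    If you want to reproduce this kind of test, a sketch of a session that captures checkpoint activity might look like the following (I believe sqlserver.checkpoint_begin and sqlserver.checkpoint_end are the relevant Event names in SQL Server 2008, but verify them against sys.dm_xe_objects on your build):

        CREATE EVENT SESSION [TrackCheckpoints] ON SERVER
        ADD EVENT sqlserver.checkpoint_begin,
        ADD EVENT sqlserver.checkpoint_end
        ADD TARGET package0.ring_buffer;

        ALTER EVENT SESSION [TrackCheckpoints] ON SERVER STATE = START;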

    Read the article

  • SSAS DMVs: useful links

    - by Davide Mauri
    From time to time it happens that I need to extract metadata information from the Analysis Services DMVs in order to quickly get an overview of the entire situation and/or drill down to detail level. As a memo I post the links I use most when I need documentation on SSAS objects and DMVs:

    SSAS: Using DMV Queries to get Cube Metadata
    http://bennyaustin.wordpress.com/2011/03/01/ssas-dmv-queries-cube-metadata/

    SSAS DMV (Dynamic Management View)
    http://dwbi1.wordpress.com/2010/01/01/ssas-dmv-dynamic-management-view/

    Use Dynamic Management Views (DMVs) to Monitor Analysis Services
    http://msdn.microsoft.com/en-us/library/hh230820.aspx
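
    For reference, SSAS DMVs are queried with a SQL-like SELECT against the $SYSTEM schema rowsets from an MDX query window; a couple of illustrative examples:

        -- List the cubes in the current SSAS database
        SELECT CUBE_NAME FROM $SYSTEM.MDSCHEMA_CUBES;

        -- Show currently open sessions on the instance
        SELECT SESSION_ID, SESSION_USER_NAME FROM $SYSTEM.DISCOVER_SESSIONS;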

    Read the article

  • An XEvent a Day (1 of 31) – An Overview of Extended Events

    - by Jonathan Kehayias
    First introduced in SQL Server 2008, Extended Events provided a new mechanism for capturing information about events inside the Database Engine that was both highly performant and highly configurable. Designed from the ground up with performance as a primary focus, Extended Events may seem a bit odd at first look, especially when you compare it to SQL Trace. However, as you begin to work with Extended Events, you will most likely change how you think about tracing problems, and will find the power...(read more)

    Read the article

  • An XEvent a Day (6 of 31) – Targets Week – asynchronous_file_target

    - by Jonathan Kehayias
    Yesterday’s post, Targets Week - ring_buffer , looked at the ring_buffer Target in Extended Events and how it outputs the raw Event data in an XML document.  Today I’m going to go over the details of the other Target in Extended Events that captures raw Event data, the asynchronous_file_target. What is the asynchronous_file_target? The asynchronous_file_target holds the raw format Event data in a proprietary binary file format that persists beyond server restarts and can be provided to another...(read more)
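
    For orientation, the captured files are read back with sys.fn_xe_file_target_read_file; a minimal sketch (the file and metadata-file paths are illustrative):

        -- Each row's event_data column holds one Event as XML
        SELECT CAST(event_data AS XML) AS event_data
        FROM sys.fn_xe_file_target_read_file
             (N'C:\XE\MySession*.xel', N'C:\XE\MySession*.xem', NULL, NULL);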

    Read the article

  • An XEvent a Day (8 of 31) – Targets Week – synchronous_event_counter

    - by Jonathan Kehayias
    Yesterday’s post, Targets Week - Bucketizers , looked at the bucketizer Targets in Extended Events and how they can be used to simplify analysis and perform more targeted analysis based on their output.  Today’s post will be fairly short, by comparison to the previous posts, while we look at the synchronous_event_counter target, which can be used to test the impact of an Event Session without actually incurring the cost of Event collection. What is the synchronous_event_counter? The synchronous_event_count...(read more)
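
    A minimal sketch of a session that only counts Events, useful for gauging volume before committing to full collection (the session name and Event choice are illustrative):

        CREATE EVENT SESSION [CountOnly] ON SERVER
        ADD EVENT sqlserver.sql_statement_completed
        ADD TARGET package0.synchronous_event_counter;

        ALTER EVENT SESSION [CountOnly] ON SERVER STATE = START;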

    Read the article

  • An XEvent a Day (2 of 31) – Querying the Extended Events Metadata

    - by Jonathan Kehayias
    In yesterday’s post, An Overview of Extended Events , I provided some of the necessary background for Extended Events that you need to understand to begin working with Extended Events in SQL Server. After receiving some feedback by email (thanks Aaron, I appreciate it), I have changed the post naming convention associated with the post to reflect “2 of 31” instead of 2/31, which apparently caused some confusion in Paul Randal’s and Glenn Berry’s series which were mentioned in the round up post for...(read more)
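
    A minimal sketch of the kind of metadata query covered in that post, summarizing what each package exposes:

        -- How many objects of each type does each package expose?
        SELECT p.name AS package_name, o.object_type, COUNT(*) AS objects
        FROM sys.dm_xe_packages AS p
        JOIN sys.dm_xe_objects AS o ON o.package_guid = p.guid
        GROUP BY p.name, o.object_type
        ORDER BY p.name, o.object_type;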

    Read the article

  • An XEvent a Day (7 of 31) – Targets Week – bucketizers

    - by Jonathan Kehayias
    Yesterday’s post, Targets Week - asynchronous_file_target , looked at the asynchronous_file_target Target in Extended Events and how it outputs the raw Event data in an XML document. Continuing with Targets week today, we’ll look at the bucketizer targets in Extended Events which can be used to group Events based on the Event data that is being returned. What is the bucketizer? The bucketizer performs grouping of Events as they are processed by the target into buckets based on the Event data and...(read more)
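
    A sketch of a session that buckets statement completions by database, using the SQL Server 2008 target name and parameters as I recall them (package0.asynchronous_bucketizer with source_type and source); verify them against sys.dm_xe_objects on your build:

        CREATE EVENT SESSION [BucketByDatabase] ON SERVER
        ADD EVENT sqlserver.sql_statement_completed
        ADD TARGET package0.asynchronous_bucketizer
            (SET source_type = 0,                  -- 0 = bucket on an event column
                 source = 'source_database_id');   -- illustrative grouping column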

    Read the article

  • SYS2 Scripts Updated – Scripts to monitor database backup, database space usage and memory grants now available

    - by Davide Mauri
    I’ve just released three new scripts of my “sys2” script collection that can be found on CodePlex:

    Project Page: http://sys2dmvs.codeplex.com/
    Source Code Download: http://sys2dmvs.codeplex.com/SourceControl/changeset/view/57732

    The three new scripts are the following:

    sys2.database_backup_info.sql
    sys2.query_memory_grants.sql
    sys2.stp_get_databases_space_used_info.sql

    Here are some more details:

    database_backup_info

    This script has been made to quickly check if and when backups were done. It will report the last full, differential and log backup date and time for each database. Along with this information you’ll also get some additional metadata that shows whether a database is read-only and its recovery model. By default it will check only the last seven days, but you can change this value just by specifying how many days back you want to check.

    To analyze the last seven days and list only the databases with the FULL recovery model and without a log backup:

        select * from sys2.databases_backup_info(default) where recovery_model = 3 and log_backup = 0

    To analyze the last fifteen days and list only the databases with the FULL recovery model that have a differential backup:

        select * from sys2.databases_backup_info(15) where recovery_model = 3 and diff_backup = 1

    I just love this script; I use it every time I need to check that backups are not too old and that t-log backups are correctly scheduled.

    query_memory_grants

    This is just a wrapper around sys.dm_exec_query_memory_grants that enriches the default result set with the text of the query for which memory has been granted or is waiting for a memory grant and, optionally, its execution plan.

    stp_get_databases_space_used_info

    This is a stored procedure that lists all the available databases and, for each one, the overall size, the used space within that size, the maximum size it may reach and the auto grow options. This is another script I use every day in order to monitor, track and forecast database space usage.

    As usual, feedback and suggestions are more than welcome!

    Read the article

  • An XEvent a Day (9 of 31) – Targets Week – pair_matching

    - by Jonathan Kehayias
    Yesterday’s post, Targets Week – synchronous_event_counter , looked at the counter Target in Extended Events and how it could be used to determine the number of Events an Event Session will generate without actually incurring the cost to collect and store the Events.  Today’s post is coming late, I know, but sometimes that’s just how the ball rolls.  My originally planned demos for today’s post turned out to only work based on a fluke, though they were very consistent at working as expected,...(read more)
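
    For orientation, the canonical pair_matching example pairs lock_acquired with lock_released to surface locks that never get a matching release; a sketch, using the SQL Server 2008 parameter names as I recall them (verify before relying on them):

        CREATE EVENT SESSION [UnmatchedLocks] ON SERVER
        ADD EVENT sqlserver.lock_acquired,
        ADD EVENT sqlserver.lock_released
        ADD TARGET package0.pair_matching
            (SET begin_event = 'sqlserver.lock_acquired',
                 begin_matching_columns = 'resource_0, resource_1, resource_2, transaction_id, database_id',
                 end_event = 'sqlserver.lock_released',
                 end_matching_columns = 'resource_0, resource_1, resource_2, transaction_id, database_id');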

    Read the article

  • An XEvent a Day (27 of 31) – The Future - Tracking Page Splits in SQL Server Denali CTP1

    - by Jonathan Kehayias
    Nearly two years ago Kalen Delaney blogged about Splitting a page into multiple pages , showing how page splits occur inside of SQL Server.  Following her blog post, Michael Zilberstein wrote a post, Monitoring Page Splits with Extended Events , that showed how to see the sqlserver.page_split Events using Extended Events.  Eladio Rincón also blogged about Using XEvents (Extended Events) in SQL Server 2008 to detect which queries are causing Page Splits , but not in relation to Kalen’s blog...(read more)
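
    If you want to watch the Event yourself on SQL Server 2008, a minimal sketch (note that sqlserver.page_split exposes little detail about the cause of each split, which is part of what these posts wrestle with):

        CREATE EVENT SESSION [TrackPageSplits] ON SERVER
        ADD EVENT sqlserver.page_split
        ADD TARGET package0.ring_buffer;

        ALTER EVENT SESSION [TrackPageSplits] ON SERVER STATE = START;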

    Read the article

  • SQL Server Certification - a database platform primer for your career path

    - by ssqa.net
    When you need to upgrade your knowledge, training is required; at the same time, certifications will help you to keep up on what you have learned! There is a big debate on the web about whether certifications are important in your career or not. The bottom line is that if you do not know the stuff, or are unable to answer a few basic technical questions, it doesn’t matter how many certifications you have: you will not get the job. Well, I'm not starting the same discussion here. But in the recent...(read more)

    Read the article

  • Running SSIS packages from C#

    - by Piotr Rodak
    Most developers and DBAs know about two ways of deploying packages: you can deploy them to the database server and run them using a SQL Server Agent job, or you can deploy the packages to the file system and run them using the dtexec.exe utility. Both approaches have their pros and cons. However, I would like to show you that there is a third way (sort of) that is often overlooked, and it can give you capabilities the ‘traditional’ approaches can’t.

    I have been working for a few years with applications that run packages from host applications implemented in .NET. As you know, SSIS provides a programming model that you can use to implement more flexible solutions. SSIS applications are usually thought to be batch oriented, with a fairly rigid architecture and processing model, with fixed timeframes when the packages are executed to process data. It doesn’t have to be the case; you don’t have to limit yourself to a batch oriented architecture. I have very good experiences with service oriented architectures processing large amounts of data. These applications are more complex than what I would like to show here, but the principle stays the same: you can execute packages as a service, on an ad-hoc basis. You can also implement and schedule various signals, HTTP calls, file drops, time schedules, Tibco messages and others to run the packages. You can implement an event handler that will trigger execution of SSIS when a certain event occurs in a StreamInsight stream.

    This post is just a small example of how you can use the API and other features to create a service that can run SSIS packages on demand. I thought it might be a good idea to implement a restful service that would listen to requests and execute appropriate actions. As it turns out, it is trivial in C#. The application is implemented as a console application for ease of debugging and running. In reality, you might want to implement the application as a Windows service. To begin, you have to reference the namespace System.ServiceModel.Web and then add a few lines of code:

        Uri baseAddress = new Uri("http://localhost:8011/");
        WebServiceHost svcHost = new WebServiceHost(typeof(PackRunner), baseAddress);

        try
        {
            svcHost.Open();

            Console.WriteLine("Service is running");
            Console.WriteLine("Press enter to stop the service.");
            Console.ReadLine();

            svcHost.Close();
        }
        catch (CommunicationException cex)
        {
            Console.WriteLine("An exception occurred: {0}", cex.Message);
            svcHost.Abort();
        }

    The interesting parts are the creation of the WebServiceHost object, the call to svcHost.Open() that starts listening on the defined URL, and the call to svcHost.Close() that shuts the service down. As you have noticed, the WebServiceHost constructor accepts the type of an object (here: PackRunner) that will be instantiated as a singleton and subsequently used to process the requests. This is the class where you put your logic, but to tell WebServiceHost how to use it, the class must implement an interface which declares the methods to be used by the host. The interface itself must be ornamented with the ServiceContract attribute.

        [ServiceContract]
        public interface IPackRunner
        {
            [OperationContract]
            [WebGet(UriTemplate = "runpack?package={name}")]
            string RunPackage1(string name);

            [OperationContract]
            [WebGet(UriTemplate = "runpackwithparams?package={name}&rows={rows}")]
            string RunPackage2(string name, int rows);
        }

    Each method that is going to be used by WebServiceHost has to have the OperationContract attribute, as well as a WebGet or WebInvoke attribute. The detailed discussion of the available options is outside the scope of this post. I also recommend using more descriptive names for methods. Then, you have to provide the implementation of the interface:

        public class PackRunner : IPackRunner
        {
            ...

    There are two methods defined in this class. Since the full code is attached to the post, I will show only the more interesting method, RunPackage2:

        /// <summary>
        /// Runs package and sets some of its variables.
        /// </summary>
        /// <param name="name">Name of the package</param>
        /// <param name="rows">Number of rows to export</param>
        /// <returns></returns>
        public string RunPackage2(string name, int rows)
        {
            try
            {
                string pkgLocation = ConfigurationManager.AppSettings["PackagePath"];
                pkgLocation = Path.Combine(pkgLocation, name.Replace("\"", ""));

                Console.WriteLine();
                Console.WriteLine("Calling package {0} with parameter {1}.", name, rows);

                Application app = new Application();
                Package pkg = app.LoadPackage(pkgLocation, null);

                pkg.Variables["User::ExportRows"].Value = rows;
                DTSExecResult pkgResults = pkg.Execute();
                Console.WriteLine();
                Console.WriteLine(pkgResults.ToString());
                if (pkgResults == DTSExecResult.Failure)
                {
                    Console.WriteLine();
                    Console.WriteLine("Errors occurred during execution of the package:");
                    foreach (DtsError er in pkg.Errors)
                        Console.WriteLine("{0}: {1}", er.ErrorCode, er.Description);
                    Console.WriteLine();
                    return "Errors occurred during execution. Contact your support.";
                }

                Console.WriteLine();
                Console.WriteLine();
                return "OK";
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex);
                return ex.ToString();
            }
        }

    The method accepts a package name and the number of rows to export. The packages are deployed to the file system. The path to the packages is configured in the application configuration file. This way, you can implement multiple services on the same machine, provided you also configure the URL for each instance appropriately. To run a package, you have to reference the Microsoft.SqlServer.Dts.Runtime namespace. This namespace is implemented in Microsoft.SQLServer.ManagedDTS.dll, which in my case was installed in the folder “C:\Program Files (x86)\Microsoft SQL Server\100\SDK\Assemblies”. Once you have done that, you can create an instance of Microsoft.SqlServer.Dts.Runtime.Application as in the snippet above. It may be a good idea to create the Application object in the constructor of the PackRunner class, to avoid the necessity of recreating it each time the service is invoked. Then an instance of Microsoft.SqlServer.Dts.Runtime.Package is created; the method LoadPackage in its simplest form just takes the package file name as the first parameter. Before you run the package, you can set its variables to certain values. This is a great way of configuring your packages without all the hassle of dtsConfig files. In the above code sample, the variable “User::ExportRows” is set to the value of the parameter “rows” of the method. Eventually, you execute the package. The method doesn’t throw exceptions; you have to test the result of execution yourself. If the execution wasn’t successful, you can examine the collection of errors exposed by the package. These are the familiar errors you often see during development and debugging of the package. If you run the package from code, you have the opportunity to persist them or log them using your favourite logging framework.

    The package itself is very simple; it connects to my AdventureWorks database and saves the number of rows specified in variable “User::ExportRows” to a file. You should know that before you run the package, you can change its connection strings, logging, events and much more. I attach a solution with the test service, as well as a project with two test packages. To test the service, you have to run it and wait for the message saying that the host is started. Then, just type (or copy and paste) the below command into your browser:

        http://localhost:8011/runpackwithparams?package=%22ExportEmployees.dtsx%22&rows=12

    When everything works fine, and you have modified the package to point to your AdventureWorks database, you should see “OK” wrapped in XML. I stopped the database service to simulate an invalid connection string situation; the output of the request is different then, and the service console window shows more information.

    As you see, implementing a service oriented ETL framework is not a very difficult task. You have the ability to configure the packages before you run them, and you can implement logging that is consistent with the rest of your system. In an application I have worked with we also have resource monitoring and execution control: we don’t allow more than a certain number of packages to run simultaneously. This ensures we don’t strain the server and we use memory and CPUs efficiently. The attached zip file contains two projects. One is the package runner; it has to be executed with administrative privileges as it registers an HTTP namespace. The other project contains two simple packages. This is really a cool thing, you should check it out!

    Read the article

  • Master Data Services Employees Sample Model

    - by Davide Mauri
    I’ve been playing with Master Data Services quite a lot these last days and I’m also monitoring the web for all available resources on it. Today I’ve found this freshly released sample available on MSDN Code Gallery: SQL Server Master Data Services Employee Sample Model http://code.msdn.microsoft.com/SSMDSEmployeeSample This sample shows how Recursive Hierarchies can be modeled in order to represent a typical organizational chart scenario where a self-relationship exists on the Employee entity.

    Read the article

  • SQL Server v.Next (Denali) : OS compatibility & upgrade support

    - by AaronBertrand
    Microsoft's Manageability PPM Dan Jones has asked for our feedback on their proposed list of supported operating systems and upgrade paths for the next version of SQL Server. (See the original post ). This has generated all kinds of spirited debates on twitter, in protected mailing lists, and in private e-mail. If you're going to be involved in moving to Denali, you should be aware of these proposals and stay on top of the discussion until the results are in. (The media are starting to pick up on...(read more)

    Read the article

  • An XEvent a Day (31 of 31) – Event Session DDL Events

    - by Jonathan Kehayias
    To close out this month’s series on Extended Events we’ll look at the DDL Events for the Event Session DDL operations, and how those can be used to track changes to Event Sessions and determine all of the possible outputs that could exist from an Extended Event Session.  One of my least favorite quirks about Extended Events is that there is no way to determine the Events and Actions that may exist inside a Target, except to parse all of the captured data.  Information about the Event...(read more)
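
    A sketch of a server-scoped DDL trigger over those Event Session DDL event types (the audit table and its database are illustrative; verify the event type names on your build):

        CREATE TRIGGER [AuditEventSessionDDL]
        ON ALL SERVER
        FOR CREATE_EVENT_SESSION, ALTER_EVENT_SESSION, DROP_EVENT_SESSION
        AS
        BEGIN
            -- AuditDB.dbo.EventSessionDDLLog is an illustrative audit table
            INSERT INTO AuditDB.dbo.EventSessionDDLLog (EventData, PostTime)
            VALUES (EVENTDATA(), GETDATE());
        END;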

    Read the article

  • Security updates for all supported versions of SQL Server

    - by AaronBertrand
    It's patch Tuesday! [ UPDATE June 19 : Please see my follow-up post about this security update.] Today Microsoft released a security bulletin covering several issues that could potentially affect SQL Server; these exploits include remote code execution, denial of service, information disclosure and elevation of privilege. You should test these patches on all machines running SQL Server, including those running only client tools (e.g. Management Studio or Management Studio Express). The updates affect...(read more)

    Read the article

  • Restricting logons during certain hours for certain users

    - by simonsabin
    Following an email in a DL I decided to look at implementing a logon restriction system to prevent users from logging on at certain times of the day. The poster had a solution but wanted to add auditing. I immediately thought of my post on logging messages during a transaction, because I knew that part of the logon trigger functionality is that you roll back the connection. I therefore assumed you had to do the logging like I talk about in that post (otherwise the logging wouldn’t persist beyond...(read more)
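
    For reference, the skeleton of such a logon trigger (the login name and hours are illustrative; a ROLLBACK inside a logon trigger is what denies the connection):

        CREATE TRIGGER [RestrictLogonHours]
        ON ALL SERVER
        FOR LOGON
        AS
        BEGIN
            -- Illustrative: deny 'ReportUser' outside 08:00-17:59
            IF ORIGINAL_LOGIN() = N'ReportUser'
               AND DATEPART(HOUR, GETDATE()) NOT BETWEEN 8 AND 17
            BEGIN
                ROLLBACK;  -- denies the connection
            END
        END;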

    Read the article
