Search Results

Search found 27337 results on 1094 pages for 't sql'.

Page 732 of 1094

  • Presenting at Usergroup meeting in Dublin

    - by simonsabin
    I'll be over in Dublin doing a usergroup meeting on Thursday evening at the Microsoft office. The subject of the talk is “Almost all queries have to do two things, get data and join it together. In this session we will look at the aspects of these that most people think they know but in reality don’t.” If you think you know SQL then you should come along and we'll see if you are right: http://www.mtug.ie/UserGroups/SQLServer/tabid/82/ctl/Details/Mid/413/ItemID/110/Default.aspx?ContainerSrc...(read more)

    Read the article

  • PASS Summit 2012, Day 1

    - by KKline
    One of the most positive experiences I can have, as a former leader of the PASS organization, is when I see a neophyte become a passionate supporter and champion of the community. On my first day in Seattle, before the event had even begun, I was stopped several times by people who'd attended their first PASS Summit last year. But this year, they were excited to tell me that they'd started user groups in their own community, spoken for the first time at a PASS event, or even helped launch a SQL Saturday...(read more)

    Read the article

  • BonaVista Dimensions used as a report service

    - by Marco Russo (SQLBI)
    Recently I have seen a long demo of BonaVista Dimensions. It is a product that is able to create reports and, most importantly, dashboards. You can use it even without SQL Server and Analysis Services, just by importing data into a local cube file that you can model using its own simple-to-use user interface. But what is interesting to me (in this post) is the capability to connect to a SSAS cube. It is somewhat similar to XLCubed and in reality these two products have something in common, because both...(read more)

    Read the article

  • Visual Studio 2012 Update 1 now available for download

    - by Greg Low
    Good to see the Visual Studio 2012 team get update 1 out the door. I'm using it now and am pretty happy with it. I like the way that the tools are now being updated out of band. Hopefully, the SQL BI folk will get their templates updated to VS2012 soon too.
    You can get it here: http://www.microsoft.com/visualstudio/eng/downloads#d-visual-studio-2012-update
    Detailed list of what's changed is here: http://blogs.msdn.com/b/visualstudioalm/archive/2012/11/26/visual-studio-and-team-foundation-server-2012-update-1-now-available.aspx

    Read the article

  • ClearTrace Performance on 170GB of Trace Files

    - by Bill Graziano
    I’ve always worked to make ClearTrace perform well. That’s probably because I spend so much time watching it work. I’m often going through two or three gigabytes of trace files, but I rarely get the chance to run it on a really large set of files. One of my clients wanted to run a full trace for a week and then analyze the results. At the end of that week we had 847 trace files of 200MB each, for a total of nearly 170GB.

    I regularly use 200MB trace files when I monitor production systems. I usually get around 300,000 statements in a file that size if it’s mostly stored procedures. So those 847 trace files contained roughly 250 million statements. (That’s 730 bytes per statement if you’re keeping track. Newer trace files have some compression in them but I’m not exactly sure what they’re doing.) On a system running 1,000 statements per second I get a new file every five minutes or so.

    It took 27 hours to process these files on an older development box. That works out to 1.77MB/second, which means ClearTrace processed about 2,654 statements per second. You can query the data while you’re loading it, but I’ve found it works better to use a second instance of ClearTrace to do this. I’m not sure why yet, but I think there’s still some dependency between the two processes.

    ClearTrace is almost always CPU bound. It’s really just a huge, ugly collection of regular expressions. It only writes a summary to its database at the end of each trace file, so that usually isn’t a bottleneck. At the end of this process, the executable was using roughly 435MB of RAM. Certainly more than when it started, but I think that’s acceptable.

    The database where all this is stored started out at 100MB. After processing 170GB of trace files the database had grown to 203MB. The space savings are due to the “datawarehouse-ish” design and to only storing a summary of each trace file.

    You can download ClearTrace for SQL Server 2008 or test out the beta version for SQL Server 2012. Happy Tuning!

    Read the article
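
    The throughput figures in the post above follow from straightforward arithmetic. As a rough check, here is a minimal T-SQL sketch of the same calculations; the inputs (847 files, 200MB each, roughly 300,000 statements per file, 27 hours) come from the post, while the variable names are just illustrative:

      DECLARE @files int = 847,
              @file_size_mb decimal(18,2) = 200,
              @stmts_per_file int = 300000,
              @hours decimal(18,2) = 27;

      SELECT total_gb         = @files * @file_size_mb / 1024,                   -- ~165 GB ("nearly 170GB")
             total_statements = @files * @stmts_per_file,                        -- ~254 million statements
             bytes_per_stmt   = @file_size_mb * 1024 * 1024 / @stmts_per_file,   -- ~700 bytes/statement (the post quotes ~730 from 170GB / 250M)
             mb_per_second    = @files * @file_size_mb / (@hours * 3600),        -- ~1.7 MB/second
             stmts_per_second = @files * @stmts_per_file / (@hours * 3600);      -- ~2,600 statements/second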

  • SSIS Training Comes to NYC 30 Jul-3 Aug!

    - by andyleonard
    Linchpin People is excited to announce the scheduling of From Zero To SSIS in New York City 30 Jul – 03 Aug 2012! Training Description From Zero to SSIS was developed by Andy Leonard to train technology professionals in the fine art of using SQL Server Integration Services (SSIS) to build data integration and Extract-Transform-Load (ETL) solutions. The training is focused around labs and emphasizes a hands-on approach. Most technologists learn by doing; this training is designed to maximize the time...(read more)

    Read the article

  • Replication Presentation at the Southampton User Group

    - by GavinPayneUK
    Last night I delivered a presentation about SQL Server’s replication services to Mark Pryce-Maher’s user group in Southampton. As those who were there saw, this is a massive topic, and delivering anything but a high-level overview in 45 minutes does the subject an injustice. Therefore, what I gave the Wednesday night audience was a deliberately high-level introduction through my slides, with an accompanying detailed commentary, as well as answering questions as we went along. The great thing...(read more)

    Read the article

  • Speaker Prep Tip: Use the AV Studio Built into that Laptop

    - by merrillaldrich
    Over at erinstellato.com there is a great post this week about tips for new presenters. Ms. Stellato suggests, insightfully, that we record ourselves, which is really a fantastic piece of advice. What’s extra-cool is that today you don’t need any special equipment or expensive software to do just that. This week I “filmed” two run-throughs of my talk for SQL Saturday tomorrow. For me, the timing is the hardest thing – figuring out how much content I can really present in the time allowed without...(read more)

    Read the article

  • How to find the longitude and latitude of a location

    - by simonsabin
    Just found this really simple but very useful site to give you the latitude and longitude of a location: http://www.getlatlon.com/ The great thing is that you can select from a map and it also gives you the WKT you can use to generate your geography in SQL Server. This is the location for SQLBits VI...(read more)

    Read the article
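
    Following on from the tip above, here is a minimal T-SQL sketch of turning a longitude/latitude pair plus its WKT into a geography instance. The coordinates below are placeholders, not the actual SQLBits VI venue; note that WKT points are written as POINT(longitude latitude), with SRID 4326 for WGS 84:

      -- Build a geography point from the WKT that a site like getlatlon.com gives you
      DECLARE @venue geography = geography::STGeomFromText('POINT(-0.1276 51.5074)', 4326);

      SELECT @venue.Lat        AS Latitude,
             @venue.Long       AS Longitude,
             @venue.ToString() AS WKT;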

  • Management - Level 9 in the Stairway to Reporting Services

    In the last article of the series, you will learn how to manage your reports once you've finished development, including how to use the Report Manager, deploy reports, and send reports to the appropriate end users.

    Read the article

  • LAG function – practical use and comparison to old syntax

    - by Michael Zilberstein
    Recently I had to analyze a huge trace – 46GB of trc files. Looping over the files, I loaded them into a trace table using the fn_trace_gettable function, applying what filters I could in order to exclude irrelevant data. I ended up with a 6.5 million row table, 7.4GB in size. It contained a RowNum column defined as an identity, clustered primary key. One of the first things I detected was that although the time difference between the first and last events in the trace was 10 hours, the total duration of all sql...(read more)

    Read the article
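
    As a companion to the post above, here is a hedged sketch of the two pieces it mentions: loading .trc files with fn_trace_gettable, and then comparing LAG against the pre-2012 self-join style for finding the gap between consecutive events. The table, column and file names are assumptions, not taken from the post:

      -- Load the trace (DEFAULT reads the rollover files too) and number the rows
      SELECT IDENTITY(int, 1, 1) AS RowNum, *
      INTO   dbo.TraceData
      FROM   sys.fn_trace_gettable(N'C:\Traces\MyTrace.trc', DEFAULT);
      -- (the post then defined RowNum as an identity, clustered primary key)

      -- SQL Server 2012+: gap to the previous event using LAG
      SELECT RowNum, StartTime,
             DATEDIFF(ms, LAG(StartTime) OVER (ORDER BY RowNum), StartTime) AS GapMs
      FROM   dbo.TraceData;

      -- Old syntax: the same result with a self-join on RowNum
      SELECT cur.RowNum, cur.StartTime,
             DATEDIFF(ms, prv.StartTime, cur.StartTime) AS GapMs
      FROM   dbo.TraceData AS cur
      LEFT JOIN dbo.TraceData AS prv
             ON prv.RowNum = cur.RowNum - 1;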

  • SSIS Basics: Using the Merge Join Transformation

    SSIS is able to take sorted data from more than one OLE DB data source and merge it into one table, which can then be sent to an OLE DB destination. This 'Merge Join' transformation works in a similar way to a SQL join, by specifying a 'join key' relationship, and it can save a great deal of processing on the destination. Annette Allen, as usual, gives clear guidance on how to do it.

    Read the article
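
    As the summary above notes, the Merge Join transformation behaves much like a T-SQL join over a shared, sorted 'join key'. For readers more at home in T-SQL, a rough analogue of an inner Merge Join is sketched below; the table and column names are purely illustrative:

      -- SSIS Merge Join needs both inputs sorted on the join key;
      -- the T-SQL analogue of that key-based merge is an ordinary join
      SELECT c.CustomerID, c.CustomerName, o.OrderID, o.OrderDate
      FROM   dbo.Customers AS c
      INNER JOIN dbo.Orders AS o
              ON o.CustomerID = c.CustomerID   -- the 'join key' relationship
      ORDER BY c.CustomerID;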

  • DevWeek Slides & Demos available for download

    - by Davide Mauri
    Anyone interested can download Slides, Demos and Demo Database (WhitepagesDB) of my “SQL Server best practices for developers” session here: http://www.davidemauri.it/resources/slide--demos.aspx Happy Downloading! :)

    Read the article

  • TechEd 2014 Day 3

    - by John Paul Cook
    There is some confusion about the durability of data stored in SQL Server in-memory tables, so some review of the concepts is appropriate. The in-memory option is enabled at the database level. Enabling it at the database level only gives you the option to specify the in-memory feature on a table-by-table basis. No existing tables or new tables will by default become in-memory tables when you enable the feature at the database level. If you choose to make a table an in-memory table, by default it is...(read more)

    Read the article
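
    A minimal T-SQL sketch of the points made above for SQL Server 2014: enabling the feature at the database level just adds a memory-optimized filegroup and changes no tables, and each table then opts in individually with an explicit durability setting. The database, path and table names here are assumptions:

      -- Database level: add a memory-optimized filegroup (no table becomes in-memory because of this)
      ALTER DATABASE SalesDB ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;
      ALTER DATABASE SalesDB ADD FILE (NAME = 'imoltp_data', FILENAME = 'C:\Data\imoltp_data')
          TO FILEGROUP imoltp_fg;

      -- Table level: each table opts in explicitly; SCHEMA_AND_DATA keeps the rows durable,
      -- SCHEMA_ONLY keeps only the table definition across restarts
      CREATE TABLE dbo.SessionState
      (
          SessionID int NOT NULL
              PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
          Payload   varbinary(8000) NULL
      )
      WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);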

  • Common database deployment blockers and Continuous Delivery headaches

    Deployability is now a first class concern for databases, so why isn’t it as easy as it should be? Matthew Skelton explores seven of the most common challenges which will bring your database deployments to their knees.

    Read the article

  • 5 Reasons You Must Start Capturing Baseline Data

    It is widely acknowledged within the SQL Server community that baselines represent valuable information that DBAs should capture. Unfortunately, very few companies manage to log and report on this information, and DBAs are then forced to troubleshoot from the hip and scramble to find evidence to prove that the database is not the problem. This article will make a compelling argument for why DBAs must start capturing baseline information, and will create a roadmap for subsequent posts.

    Read the article
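
    A hedged, minimal example of the kind of baseline capture the article argues for: snapshotting sys.dm_os_wait_stats into a history table on a schedule, so there is evidence to compare against when trouble arrives. The table name and schedule are assumptions, not taken from the article:

      -- One-time setup: a history table for wait-statistics snapshots
      CREATE TABLE dbo.WaitStatsBaseline
      (
          capture_time        datetime2    NOT NULL DEFAULT SYSDATETIME(),
          wait_type           nvarchar(60) NOT NULL,
          waiting_tasks_count bigint       NOT NULL,
          wait_time_ms        bigint       NOT NULL,
          signal_wait_time_ms bigint       NOT NULL
      );

      -- Run on a schedule (for example an Agent job every 15 minutes) to build the baseline
      INSERT INTO dbo.WaitStatsBaseline (wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms)
      SELECT wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
      FROM   sys.dm_os_wait_stats
      WHERE  wait_time_ms > 0;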

  • Geek City: What gets logged for index rebuild operations?

    - by Kalen Delaney
    This blog post was inspired by a question from a future student. Someone who was already booked for my SQL Server Internals class in June asked for some information on a current problem he was having with transaction log writes causing excessive wait times during index rebuild operations when run in ONLINE mode. He wanted to know if switching to BULK_LOGGED recovery could help. I knew the difference between ALTER INDEX in FULL vs BULK_LOGGED recovery when doing normal OFFLINE rebuilds, but I wasn't...(read more)

    Read the article
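
    A minimal sketch of the experiment the question above implies: rebuild the index ONLINE under FULL recovery, then under BULK_LOGGED, and compare how much transaction log each run generates. The database, table and index names are assumptions, and whether the ONLINE rebuild is actually minimally logged under BULK_LOGGED is exactly what the post goes on to examine:

      -- Rebuild under FULL recovery and note log use
      ALTER DATABASE TestDB SET RECOVERY FULL;
      ALTER INDEX IX_BigTable_Col1 ON dbo.BigTable REBUILD WITH (ONLINE = ON);
      DBCC SQLPERF(LOGSPACE);   -- log size and percent used, per database

      -- Back up the log, switch to BULK_LOGGED and repeat
      BACKUP LOG TestDB TO DISK = N'C:\Backups\TestDB_log.trn';
      ALTER DATABASE TestDB SET RECOVERY BULK_LOGGED;
      ALTER INDEX IX_BigTable_Col1 ON dbo.BigTable REBUILD WITH (ONLINE = ON);
      DBCC SQLPERF(LOGSPACE);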

  • The 5 Worst Days in a DBA's Life: Day 2

    Steve and the rest of the DBA Team are back for round two. In this episode they have to restore all of a business' data using nothing but a set of off-site backups, kanban, and witty repartee.

    Read the article
