Search Results

Search found 10447 results on 418 pages for 'blog blomqvist no'.

  • More Tables or More Databases?

    - by BuckWoody
    I got an e-mail from someone who has an interesting situation. He has 15,000 customers, and he asks whether he should create a separate database per customer. Without a LOT more data it’s impossible to say, of course, but there are some general concepts to keep in mind. Whenever you’re segmenting data, it’s all about boundary choices. You have not only boundaries around how big the data will get, but things like how many objects (tables, stored procedures and so on) will be involved, whether there are any cross-sections of data (do they share location or product information) and – very important – what the security requirements are.

    From the answers to these types of questions, you can choose between making multiple tables in a single database or using multiple databases. A database carries some overhead – it needs a certain amount of memory for locking and so on. But it has a very clean boundary – everything from objects to security can be kept apart. Having multiple users in the same database is possible as well, using things like schemas. But keeping 15,000 schemas can be challenging too.

    My recommendation in complex situations like this is similar to a post on decisions that I did earlier – I lay out the choices in rows on a spreadsheet, with my requirements in the columns at the top. I give each choice a score based on how well it meets each requirement. At the end, the highest number wins. And many times it’s a mix – perhaps this person could segment customers into larger regions or districts or products within a database, with multiple schemas for the customers inside it. Of course, if he needs to query across all customers, that becomes another requirement. A sketch of the scoring approach appears below.
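
    The same weighted-scoring exercise can be sketched in T-SQL instead of a spreadsheet – a minimal illustration with hypothetical choices and made-up scores (1-5, higher is better), not a recommendation for any particular design:

        -- Choices in rows, requirements as columns; every number here is illustrative.
        DECLARE @choices TABLE (Choice VARCHAR(50), Security INT, Overhead INT, CrossQuery INT);
        INSERT @choices VALUES
            ('One database, schema per customer',   3, 4, 5),
            ('Database per customer',               5, 1, 2),
            ('Database per region, schemas within', 4, 3, 4);
        SELECT Choice, Security + Overhead + CrossQuery AS TotalScore
        FROM @choices
        ORDER BY TotalScore DESC;  -- the highest number wins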

    Read the article

  • How to start with PowerPivot for Excel

    - by Marco Russo (SQLBI)
    Now that Office 2010 has been released, many people will start looking for resources to learn PowerPivot. Of course, the book I’m writing will be helpful when it is published (September 2010), but you can also start with some online content on Microsoft sites. First of all, this is the web site dedicated to PowerPivot: http://www.powerpivot.com/ It contains several videos and demos, and it’s also possible to use a Virtual Lab without installing Office 2010 on your PC. Then, there is...(read more)

    Read the article

  • "CLR Enabled" is not required to use CLR built-ins

    - by AaronBertrand
    Books Online articles referencing built-in CLR functions (such as FORMAT()) have a remark similar to the following: "FORMAT relies on the presence of the .NET Framework Common Language Runtime (CLR)." A lot of people seem to interpret this as meaning: "You must enable the sp_configure option 'CLR enabled' in order to use FORMAT()." Some then go on and suggest you run code similar to the following before you play with these functions: EXEC sp_configure 'show advanced options' , 1 ; GO RECONFIGURE...(read more)
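
    A quick way to verify the claim – a minimal sketch that should run on any SQL Server 2012+ instance with 'clr enabled' still at its default of 0:

        -- Check the option without touching it, then call a built-in CLR function.
        SELECT name, value_in_use
        FROM sys.configurations
        WHERE name = 'clr enabled';   -- value_in_use is 0 by default

        SELECT FORMAT(SYSDATETIME(), 'yyyy-MM-dd HH:mm') AS FormattedNow;  -- works anyway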

    Read the article

  • QueryUnit 0.0.0.8 – Trust No One

    - by Davide Mauri
    Yesterday I released an updated version of QueryUnit, version 0.0.0.8. QueryUnit now supports AreNotEqual, Greater, and Less assertions and is more capable of managing string results. I must say that I cannot live anymore without proper unit testing of a BI solution. Just yesterday, one of the unit tests at a customer site failed, exposing a subtle situation where the release of a new version of a custom application would have corrupted the source of the BI data, with a very low chance that anyone would have noticed it for several days. That can happen when you have more than 15 systems that handle the data needed by your BI solution. The key message of this situation is “Trust No One”: if your data hasn’t passed quality testing, it can’t be trusted. Period. QueryUnit is now officially a hero :) Still no superpowers, but useful above all. http://queryunit.codeplex.com/
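
    The idea generalizes even without the framework. A minimal sketch of a query-based quality assertion in plain T-SQL (hypothetical table and rule; QueryUnit wraps this kind of check in proper assertions, this only shows the spirit of it):

        -- The "test" passes only when the query finds nothing wrong.
        IF EXISTS (SELECT 1 FROM dbo.FactSales WHERE Amount < 0)
            RAISERROR('Quality check failed: negative Amount in dbo.FactSales.', 16, 1);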

    Read the article

  • StreamInsight will not push feature releases through Microsoft Update going forward

    - by Roman Schindlauer
    Until now, we've released StreamInsight through the Microsoft Download Center and also through Microsoft Update. Going forward, we will only release new StreamInsight versions through the Microsoft Download Center and only use MU for service packs and security fixes (should any be needed). As a result of this decision, we are pulling the recent StreamInsight 2.1 release from MU; this release is still available in the Download Center. Don’t worry: there’s nothing wrong with the versions we’ve shipped in MU, we’ve just adjusted how we use MU. No action is necessary from our customers as a result of this change, and we are not rolling back any changes to your current installation, so if you have recently installed StreamInsight 2.1 through Microsoft Update, it will still work fine. Regards, The StreamInsight Team

    Read the article

  • LAG function – practical use and comparison to old syntax

    - by Michael Zilberstein
    Recently I had to analyze a huge trace – 46GB of trc files. Looping over the files, I loaded them into a trace table using the fn_trace_gettable function and whatever filters I could apply to weed out irrelevant data. I ended up with a 6.5 million row table, 7.4GB in total size. It contained a RowNum column defined as identity, primary key, clustered. One of the first things I detected was that although the time difference between the first and last events in the trace was 10 hours, the total duration of all sql...(read more)
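
    The comparison in the title boils down to this – a minimal sketch (hypothetical table and column names, not the post's actual trace table) of LAG, available from SQL Server 2012 on, next to the old self-join idiom:

        -- New syntax: LAG reaches back one row along the clustered identity column.
        SELECT RowNum, StartTime,
               DATEDIFF(ms, LAG(StartTime) OVER (ORDER BY RowNum), StartTime) AS GapMs
        FROM dbo.TraceData;

        -- Old syntax: the same result via a self-join on RowNum.
        SELECT cur.RowNum, cur.StartTime,
               DATEDIFF(ms, prev.StartTime, cur.StartTime) AS GapMs
        FROM dbo.TraceData AS cur
        LEFT JOIN dbo.TraceData AS prev
            ON prev.RowNum = cur.RowNum - 1;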

    Read the article

  • Live from the #summit13 keynote : 2013-10-16

    - by AaronBertrand
    Early morning start here in Charlotte. I'm going to try to keep this post updated as I have new information from the keynote to share, so refresh often! 8:24 AM Bill Graziano takes the stage and welcomes us to the 15th PASS Summit. He mentions that PASS delivered over 700,000 hours of technical training in the previous fiscal year, and shows a Power BI Power Map video talking about all of the SQL Saturday accomplishments in the last few years. He introduces Amy Lewis, who wins this year's PASSion...(read more)

    Read the article

  • Sampling SQL server batch activity

    - by extended_events
    Recently I was troubleshooting a performance issue on an internal tracking workload and needed to collect some very low level events over a period of 3-4 hours. During analysis of the data I found that a common pattern I was using was to find a batch with a duration longer than average and follow all the events it produced. This pattern got me thinking that I was discarding a substantial amount of the event data that had been collected, and that it would be great to reduce the collection overhead on the server if I could still get all activity from some batches.

    In the past I’ve used a sampling technique based on the counter predicate to build a baseline of overall activity (see Mike's post here). This isn’t exactly what I want, though, as there would certainly be events from a particular batch that wouldn’t pass the predicate. What I need is a way to identify streams of work and select, say, one in ten of them to watch, and SQL Server provides just such a mechanism: session_id. Session_id is a server-assigned integer that is bound to a connection at login and lasts until logout. So by combining the session_id predicate source and the divides_by_uint64 predicate comparator we can limit collection and still get all the events in the sampled batches for investigation.

        CREATE EVENT SESSION session_10_percent ON SERVER
        ADD EVENT sqlserver.sql_statement_starting
            (WHERE (package0.divides_by_uint64(sqlserver.session_id, 10))),
        ADD EVENT sqlos.wait_info
            (WHERE (package0.divides_by_uint64(sqlserver.session_id, 10))),
        ADD EVENT sqlos.wait_info_external
            (WHERE (package0.divides_by_uint64(sqlserver.session_id, 10))),
        ADD EVENT sqlserver.sql_statement_completed
            (WHERE (package0.divides_by_uint64(sqlserver.session_id, 10)))
        ADD TARGET ring_buffer
        WITH (MAX_DISPATCH_LATENCY = 30 SECONDS, TRACK_CAUSALITY = ON)
        GO

    There we go; event collection is reduced while still providing enough information to find the root of the problem. By the way, the performance issue turned out to be an IO issue, and the session definition above was more than enough to show long waits on PAGEIOLATCH*.

    Read the article

  • Testing and Validation – You Really Do Have The Time

    - by BuckWoody
    One of the great advantages in my role as a Technical Specialist here at Microsoft is that I get to work with so many great clients. I get to see their environments and how they use them, and the way they work with SQL Server. I’ve been a data professional myself for many years. Over that time I’ve worked with many database platforms, lots of client applications, and written a lot of code in many industries. For a while I was also a consultant, so I got to see how other shops did things as well. But because I now focus on a “set” base of clients (over 500 professionals in over 150 companies) I get to see them over a longer period of time. Many of them help me understand how they use the product in their projects, and I even attend some regular DBA meetings. I see the ways the product succeeds, and I see when it fails.

    Something that has really impacted my way of thinking is the level of importance any given shop is able to place on testing and validation. I’ve always been a big proponent of setting up a test system and following a very disciplined regimen to make sure any new project will work in production, and then taking the lessons learned into production as standards. I know, I know – there’s never enough time to do things right like this. Yet the shops I see that do it turn out the same amount of work as the shops that don’t. They just make the time to do the testing and validation and create a standard that they will follow in production. And what I’ve found (surprise, surprise) is that they have fewer production problems. OK, that might seem obvious – but I’ve actually tracked it, and those places that do the testing and follow best practices really do save stress, time and trouble from that effort.

    We all think that’s a good idea, but we just “don’t have time”. OK – but from what I’m seeing, you can gain time if you spend a little up front. You may find that you’re actually already spending the same amount of time that you would spend in doing the testing, you’re just doing it later, at night, under the gun. Food for thought.

    Read the article

  • Managing Confidence

    - by andyleonard
    Introduction This post is the fifty-third part of a ramble-rant about the software business. The current posts in this series can be found on the series landing page. This post is about inspiring others. Hot Chicks - baby chickens beneath a warming lamp… </NonSubtleSEOPloy> For those who do not know, we raise chickens that lay eggs – referred to as “laying hens”. Natural attrition has taken our flock of laying hens down to 11, plus one rooster. We recently received an order of new chicks (pictured...(read more)

    Read the article

  • Windows Not Sleeping All Night

    - by John Paul Cook
    Having a computer wake up when you don’t want it to wastes electricity and drains the battery on mobile devices. My desktop had been waking up at night, so I assumed it was some network traffic on my home network. I unchecked “Allow this device to wake the computer” on my network adapters. Figure 1: Network adapter Power Management tab. That didn’t solve the problem. I included the screen capture in Figure 1 because it could be part of the solution for someone else. To identify the root cause instead...(read more)

    Read the article

  • PASS Summit Location Redux

    - by andyleonard
    Introduction To quote Ronald Reagan, "There you go again." The Professional Association for SQL Server (PASS) is considering locations for future PASS Summits. The apparent answer is: You Can Have The Summit Anywhere You Want... ...as long as it's in Seattle. PASS conducted a survey on this about a year ago, and I commented on the results and PASS' (mis-)interpretation of said results in a post entitled On PASS Summit Locations, Time Will Tell. "It's About Community" I think every member of the...(read more)

    Read the article

  • TechEd 2014 Day 2

    - by John Paul Cook
    Today people asked me about backing up older versions of SQL Server to Azure. Older versions, back to SQL Server 2005, can easily be backed up to Azure Storage by installing the Microsoft SQL Server Backup to Windows Azure Tool. It installs a service of the same name that applies rules to SQL Server backups. You can tell the tool to back up or encrypt your SQL Server backups, and you can have it automatically upload your backups to Azure Storage. Even if you don’t want to upload your backups to Azure, you might...(read more)
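
    As I understand the tool, it acts as a filter over ordinary backup files: you keep issuing standard backup commands and the service's rules decide what gets encrypted or uploaded to Azure Storage. A minimal sketch with hypothetical names – note the T-SQL itself is unchanged from any other backup:

        -- Nothing Azure-specific here; the tool's rules pick the file up
        -- if its path matches a configured rule.
        BACKUP DATABASE MyDb
            TO DISK = 'C:\Backups\MyDb.bak'
            WITH INIT, STATS = 10;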

    Read the article

  • TechEd 2014 Day 3

    - by John Paul Cook
    There is some confusion about the durability of data stored in SQL Server in-memory tables, so some review of the concepts is appropriate. The in-memory option is enabled at the database level. Enabling it at the database level only gives you the option to specify the in-memory feature on a table-by-table basis. No existing or new tables will become in-memory tables by default when you enable the feature at the database level. If you choose to make a table an in-memory table, by default it is...(read more)
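
    To make that concrete, here is a minimal sketch (hypothetical database, filegroup and table names) of the SQL Server 2014 syntax: the database-level step adds a MEMORY_OPTIMIZED_DATA filegroup, each table then opts in individually, and DURABILITY defaults to SCHEMA_AND_DATA:

        -- Database-level step: a filegroup (container) for memory-optimized data.
        ALTER DATABASE MyDb ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;
        ALTER DATABASE MyDb
            ADD FILE (NAME = 'imoltp_f1', FILENAME = 'C:\Data\imoltp_f1')
            TO FILEGROUP imoltp_fg;

        -- Table-level step: one table opting in; durable is the default.
        CREATE TABLE dbo.SessionState
        (
            SessionId INT NOT NULL
                PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
            Payload   VARBINARY(8000)
        )
        WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);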

    Read the article

  • Oracle Partner Specialists – Sell & Deliver High Value Products to Customers

    - by Richard Lefebvre
    Do you want to know where to find useful information about partner training and other activities needed to complete an Oracle Specialization in the country where you are based? Go to the EMEA partner enablement blog and read the latest information on training opportunities for Cloud Services, Applications, Business Intelligence, Middleware, Database 12c, Engineered Systems as well as Server & Storage. Recently, we announced new TestFest events in France, which you can join to pass your Implementation Assessment within the Specialization category you have already chosen. To find out where and when the next TestFest close to your location will take place, please contact [email protected] or watch for further announcements of TestFest events in your home country. Return to the EMEA Partner Enablement Blog from time to time to update your own Specialization and join the latest training for Sales, Presales or Implementation Specialists: https://blogs.oracle.com/opnenablement/

    Read the article

  • Utility Queries–Structure of Tables with Identity Column

    - by drsql
    I have been doing a presentation on sequences of late (the last planned delivery of that presentation was last week, but you should be able to get the gist of things from the slides and the code posted here on my presentation page), and as part of that, I started writing some queries to interrogate the structure of tables. I started with tables that use an identity column for some purpose, because they are considerably easier to handle than sequences, specifically because the limitations of identity columns make...(read more)
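
    As a flavor of what such an interrogation query can look like – a minimal sketch (not the author's actual code) using the sys.identity_columns catalog view:

        -- List every table that uses an identity column, with its seed,
        -- increment and last generated value.
        SELECT OBJECT_SCHEMA_NAME(ic.object_id) AS SchemaName,
               OBJECT_NAME(ic.object_id)        AS TableName,
               ic.name                          AS ColumnName,
               ic.seed_value, ic.increment_value, ic.last_value
        FROM sys.identity_columns AS ic
        JOIN sys.tables AS t ON t.object_id = ic.object_id
        ORDER BY SchemaName, TableName;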

    Read the article

  • DevWeek Slides & Demos available for download

    - by Davide Mauri
    Anyone interested can download Slides, Demos and Demo Database (WhitepagesDB) of my “SQL Server best practices for developers” session here: http://www.davidemauri.it/resources/slide--demos.aspx Happy Downloading! :)

    Read the article

  • [Speaking] PowerShell at the PASS Summit

    - by AllenMWhite
    Next week is the annual PASS Summit , the event of the year for those of us in the SQL Server community. We get to see our old friends, make new friends, and learn an amazing amount about SQL Server, and it'll be in Seattle, so it's close to the mother ship. I love having Microsoft close, because it's easier to get to know the people who actually make this amazing product we spend our lives working with. This year I'm fortunate to have been selected to present three sessions. One is a regular session...(read more)

    Read the article

  • Redirecting from blogger to custom domain [closed]

    - by mdhar9e
    Possible Duplicate: How to have a blogspot blog in my domain? I have a Blogger blog at www.myclipta.blogspot.com, which I update regularly. Then I bought a custom domain, myclipta.com. Now I want to redirect from the Blogger domain to my custom domain. I don't know how to do this. I've heard that I need to set DNS name servers and a CNAME, but I haven't been able to do it. Can anyone guide me, please?

    Read the article

  • Rules of Holes #1: Stop Digging

    - by ArnieRowland
    You may have heard of the 'First Rule of Holes'. It goes something like this: "When you suspect you might be in a hole, stop digging." That seems like obvious, good advice, but what does it really mean? How does the Rule of Holes apply to you? How does it apply to your job? When things are not going right, stop doing the "same ol', same ol'". You find yourself involved in doing the same type of coding over and over. Maybe it's time to stop, step back, take a little time and learn something new....(read more)

    Read the article

  • Rebuilding system databases in 2008 R2

    - by TiborKaraszi
    All my attempts so far to rebuild the system databases in 2008 R2 have failed. I first tried to run setup from the path below: C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\Release But that turns out to be the 2008 setup program, not the 2008 R2 setup, even though I have no 2008 instances installed (I have only R2 instances installed). Apparently, the 2008 setup program does a version check of the instance to be rebuilt, and since it is > 10.50.0, the rebuild fails. Books Online for R2, the section...(read more)

    Read the article

  • TechEd 2014 Day 1

    - by John Paul Cook
    Today at TechEd 2014, many people had questions about the in-memory database features in SQL Server 2014. A common question is how an in-memory database differs from having a database on a SQL Server with an amount of RAM far greater than the size of the database. In-memory, or memory-optimized, tables use different data structures and are accessed differently, using a latch-free and lock-free approach that greatly improves performance. This provides part of the performance improvement. The rest...(read more)

    Read the article

  • New code release today - 2011.1.4.2

    - by Steve Tunstall
    Wow, two blog entries in the same day! When I wrote the large 'Quota' blog entry below, I did not realize there would be a micro-code update going out the same evening. So here it is: code 2011.1.4.2 has just been released. You can get the readme file for it here: https://wikis.oracle.com/display/FishWorks/ak-2011.04.24.4.2+Release+Notes Download it, of course, through the MOS website. It looks like it fixes a pretty nasty bug, so get it if you think it applies to you. Unless you have a great reason NOT to upgrade, I would strongly advise you to upgrade to 2011.1.4.2. Why? Because the readme file says they STRONGLY RECOMMEND YOU ALL UPGRADE TO THIS CODE IMMEDIATELY, using LOTS OF CAPITAL LETTERS. That's good enough for me. Be sure to run the health check like the readme tells you to.

    Read the article

  • Ranking with PowerPivot – a different approach

    - by Marco Russo (SQLBI)
    Alberto Ferrari wrote an interesting post about a “different approach” to creating a ranking measure with PowerPivot. If you know DAX or have read our book, you will find that a DAX expression can solve the issue. However, such a formula is more complex than necessary. The next version of PowerPivot might have more built-in DAX functions and should solve the ranking need with a simpler formula. In the meantime, it is interesting to know a different approach that relies on Excel skills instead of...(read more)

    Read the article
