Search Results

Search found 10459 results on 419 pages for 'elaine blog'.


  • Rules of Holes -#1: Stop Digging

    - by ArnieRowland
    You may have heard of the 'First Rule of Holes'. It goes something like this: "When you suspect you might be in a hole, stop digging." That seems like obvious, good advice, but what does it really mean? How does the Rule of Holes apply to you? How does it apply to your job? When things are not going right, stop doing the "same ol', same ol'". You find yourself doing the same type of coding over and over. Maybe it's time to stop, step back, take a little time and learn something new....(read more)

    Read the article

  • Have You Downloaded SQL Server 2012 Evaluation Edition? Why Not?!

    - by andyleonard
    I am installing SQL Server 2012 Evaluation Edition on a virtual machine as I type. You can do this. Here’s one way: Grab some virtual machine software. I like Oracle VirtualBox. It’s cool. It’s free. Install VirtualBox. Download the 180-day free trial of Windows Server 2008 R2. Also cool. Also free. Once Windows Server 2008 R2 is downloaded, build a VirtualBox VM. Download and install SQL Server 2012 Evaluation Edition! That’s all there is to it. You can get started today, no need to wait until...(read more)

    Read the article

  • Developing a Support Plan for Cloud Applications

    - by BuckWoody
    Last week I blogged about developing a High-Availability plan. The specifics of a given plan aren't as simple as "Step 1, then Step 2" because in a hybrid environment (which most of us have) the situation changes the requirements. There are those who look for simple "template" solutions, but unless you settle on a single vendor and a single way of doing things, that's not really viable. The same holds true for support.

    As I've mentioned before, I'm not fond of the term "cloud", and would rather use the term "Distributed Computing". That being said, more people understand the former, so I'll just use that for now. What I mean by Distributed Computing is leveraging another system or setup to perform all or some of a computing function. If this definition holds true, then you're essentially creating a partnership with a vendor to run some of your IT - whether that be IaaS, PaaS or SaaS, or more often, a mix.

    In your on-premises systems, you're the first and sometimes only line of support. That changes when you bring in a Cloud vendor. For Windows Azure, we have plans for support that you can pay for if you like: http://www.windowsazure.com/en-us/support/plans/ You're not off the hook entirely, however. You still need to create a plan to support your users in their applications, especially for the parts you control. The last thing they want to hear is "That's vendor X's problem - you'll have to call them."

    I find that this is often the last thing the architects think about in a solution. It's fine to put off the support question prior to deployment, but I would hold off on calling it "production" until you have that plan in place. There are lots of examples, like this one: http://www.va-interactive.com/inbusiness/editorial/sales/ibt/customer.html some of which are technology-specific. Once again, this is an "it depends" kind of approach. While it would be nice if there was just something in a box we could buy, it just doesn't work that way in a hybrid system. You have to know your options and apply them appropriately.

    Read the article

  • Chapter 7–Enforced Data Protection

    - by drsql
    As the book progresses, I find myself veering from the original stated outline quite a bit, because as I teach about this more (and I am teaching a daylong db design class in August at http://www.sqlsolstice.com/ … shameless plug, but it is on topic :) I start to find that a given order works better. Originally I had slated myself to talk more about modeling here for three chapters, then get back to the more implementation topics to finish out the book, but now I am going to keep plugging through...(read more)

    Read the article

  • Speaker Prep Tip: Use the AV Studio Built into that Laptop

    - by merrillaldrich
    Over at erinstellato.com there is a great post this week about tips for new presenters. Ms. Stellato suggests, insightfully, that we record ourselves, which is really a fantastic piece of advice. What’s extra-cool is that today you don’t need any special equipment or expensive software to do just that. This week I “filmed” two run-throughs of my talk for SQL Saturday tomorrow. For me, the timing is the hardest thing – figuring out how much content I can really present in the time allowed without...(read more)

    Read the article

  • Suggestion: ALLFILES option for RESTORE

    - by Greg Low
    The default action when performing a backup is to append to the backup file, yet the default action when restoring a backup is to restore just the first file. I constantly come across customer situations where they are puzzled that they seem to have lost data after they have completed a restore. Invariably, it's just that they haven't restored all the backups contained within a single OS file. This happens most commonly with log backups, but also happens when they have not restored the most recent database backup file.

    It is not trivial to achieve this within simple T-SQL scripts when the number of backup files within the OS file is unknown. It really should be. I'd like to see a FILES=ALLFILES option on the RESTORE command. For RESTORE DATABASE, it should restore the most recent database backup plus any subsequent log files. For RESTORE LOG (which is the most important missing option), it should just restore all relevant log backups that are contained.

    If you agree, you know what to do: please vote: https://connect.microsoft.com/SQLServer/feedback/details/769204/option-to-restore-all-backups-files-within-a-media-set

    Alternately, how would you write a T-SQL command to restore all log backups within a single OS file where the number of files is unknown? Would love to hear creative solutions because all the ones that I think of are pretty messy and need dynamic SQL.
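
    To illustrate the kind of workaround the post is asking about, here is a rough sketch of one of those "messy" dynamic-SQL approaches. All names are hypothetical (database MyDb, file C:\Backups\MyDb_logs.bak), it assumes the OS file contains only log backups for that database, and it simply probes FILE positions until RESTORE reports no more backup sets:

        -- Rough sketch (hypothetical names): restore every log backup contained in a
        -- single OS file by probing FILE positions until no backup set is found.
        DECLARE @pos INT = 1, @sql NVARCHAR(MAX);
        WHILE 1 = 1
        BEGIN
            SET @sql = N'RESTORE LOG MyDb FROM DISK = N''C:\Backups\MyDb_logs.bak'''
                     + N' WITH FILE = ' + CAST(@pos AS NVARCHAR(10)) + N', NORECOVERY;';
            BEGIN TRY
                EXEC (@sql);
            END TRY
            BEGIN CATCH
                BREAK;  -- no backup set at this position: assume we've applied them all
            END CATCH
            SET @pos += 1;
        END;
        -- ...followed by RESTORE DATABASE MyDb WITH RECOVERY; once all logs are applied.

    Error handling around RESTORE can be finicky (some failures abort the whole batch), which is exactly the messiness the post complains about.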

    Read the article

  • Supermicro motherboards and systems

    - by jchang
    I used to buy SuperMicro exclusively for my own lab. SuperMicro always had a deep lineup of motherboards with almost every conceivable variation. In particular, they had the maximum memory and IO configuration that is desired for database servers. But from around 2006, I became too lazy to source the additional components necessary to complete the system, and switched to Dell PowerEdge Tower servers. Now, I may reconsider, as neither Dell nor HP is offering the right combination of PCI-E slots. Nor...(read more)

    Read the article

  • IE 9 RC May Be Released on 10 February

    - by anirudha
    This is no surprise; we all know Microsoft often postpones its product release dates. I don't know the reason for it; maybe it's a trick Microsoft uses to build buzz for its software, but sometimes it leaves a bad impression on users. In 2009, Microsoft put up a countdown widget for the launch of Visual Studio 2010, which was used by many MSDN bloggers. Somasegar was one of them, putting the widget on his blog to show how much time remained until Visual Studio was released. But after the date was postponed, I don't know where the widget went; the site that provided it was down. They used the same trick when they pushed the Visual Studio release date from 20 March to 12 April. So wait and see, and next time don't believe that a product will really be released on the date they show you on a blog.

    Read the article

  • More Tables or More Databases?

    - by BuckWoody
    I got an e-mail from someone with an interesting situation. He has 15,000 customers, and he asks if he should have a separate database for each customer's data. Without a LOT more data it’s impossible to say, of course, but there are some general concepts to keep in mind.

    Whenever you’re segmenting data, it’s all about boundary choices. You have not only boundaries around how big the data will get, but things like how many objects (tables, stored procedures and so on) will be involved, whether there are any cross-sections of data (do they share location or product information?) and – very important – what are the security requirements? From the answers to these types of questions, you have the choice of making multiple tables in a single database, or using multiple databases. A database carries some overhead – it needs a certain amount of memory for locking and so on. But it has a very clean boundary – everything from objects to security can be kept apart. Having multiple users in the same database is possible as well, using things like a Schema. But keeping 15,000 schemas can be challenging as well.

    My recommendation in complex situations like this is similar to a post on decisions that I did earlier – I lay out the choices on a spreadsheet in rows, and then my requirements at the top in the columns. I give each choice a number based on how well it meets each requirement. At the end, the highest number wins. And many times it’s a mix – perhaps this person could segment customers into larger regions or districts or products, in a database. Within that database might be multiple schemas for the customers. Of course, if he needs to query across all customers, that becomes another requirement.
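
    As a minimal sketch of the schema-per-customer boundary mentioned above (all names are hypothetical, not from the post), each customer gets a schema, its own objects, and a user whose rights stop at that schema:

        -- Hypothetical example: one schema per customer as the segmentation boundary
        CREATE SCHEMA Customer0042 AUTHORIZATION dbo;
        GO
        CREATE TABLE Customer0042.Orders
        (
            OrderID   INT IDENTITY(1,1) PRIMARY KEY,
            OrderDate DATETIME NOT NULL
        );
        GO
        -- The security boundary: this user can only touch its own schema
        -- (assumes a login named Customer0042Login already exists)
        CREATE USER Customer0042User FOR LOGIN Customer0042Login;
        GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::Customer0042 TO Customer0042User;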

    Read the article

  • Managing Confidence

    - by andyleonard
    Introduction This post is the fifty-third part of a ramble-rant about the software business. The current posts in this series can be found on the series landing page. This post is about inspiring others. Hot Chicks - Baby chickens beneath a warming lamp… </NonSubtleSEOPloy> For those who do not know, we raise chickens that lay eggs – referred to as “laying hens”. Natural attrition has taken our flock of laying hens to 11, plus one rooster. We recently received an order of new chicks (pictured...(read more)

    Read the article

  • LAG function – practical use and comparison to old syntax

    - by Michael Zilberstein
    Recently I had to analyze a huge trace – 46GB of trc files. Looping over the files, I loaded them into a trace table using the fn_trace_gettable function, applying what filters I could in order to remove irrelevant data. I ended up with a 6.5-million-row table, 7.4GB in total size. It contained a RowNum column which was defined as identity, primary key, clustered. One of the first things I detected was that although the time difference between the first and last events in the trace was 10 hours, the total duration of all sql...(read more)
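
    Although the excerpt cuts off before the syntax comparison, the shape of it is familiar. A hypothetical illustration (table and column names are mine, not the post's) of the old self-join versus LAG for finding the gap between consecutive trace events:

        -- Pre-2012 syntax: self-join on the identity column to reach the previous row
        SELECT cur.RowNum,
               DATEDIFF(ms, prev.StartTime, cur.StartTime) AS MsSincePrevEvent
        FROM   dbo.TraceData AS cur
        LEFT JOIN dbo.TraceData AS prev
               ON prev.RowNum = cur.RowNum - 1;

        -- SQL Server 2012+: LAG reads the previous row directly, no join needed
        SELECT RowNum,
               DATEDIFF(ms, LAG(StartTime) OVER (ORDER BY RowNum), StartTime) AS MsSincePrevEvent
        FROM   dbo.TraceData;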

    Read the article

  • Set and Verify the Retention Value for Change Data Capture

    - by AllenMWhite
    Last summer I set up Change Data Capture for a client to track changes to their application database to apply those changes to their data warehouse. The client had some issues a short while back and felt they needed to increase the retention period from the default 3 days to 5 days. I ran this query to make that change:

        sp_cdc_change_job @job_type='cleanup', @retention=7200

    The value 7200 represents the number of minutes in a period of 5 days. All was well, but they recently asked how they can verify...(read more)
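
    For the "verify" half of the title, one way to confirm the new retention value (a minimal sketch, assuming the standard CDC job metadata in msdb) is:

        -- Check the cleanup job's retention (in minutes) after the change
        SELECT job_type, retention
        FROM msdb.dbo.cdc_jobs
        WHERE job_type = N'cleanup';  -- expect 7200 after the change above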

    Read the article

  • How to start with PowerPivot for Excel

    - by Marco Russo (SQLBI)
    Now that Office 2010 has been released, many people will start looking for resources to start learning PowerPivot. Of course, the book I’m writing will be helpful when it is published (September 2010), but you can also start with some online content on Microsoft sites. First of all, this is the web site dedicated to PowerPivot: http://www.powerpivot.com/ It contains several videos and demos, and it’s also possible to use a Virtual Lab without installing Office 2010 on your PC. Then, there is...(read more)

    Read the article

  • Never Bet Against the Impossible

    - by BuckWoody
    My uncle used to say “If a man tells you that his car squirts milk in his eye when you lift the hood, don’t bet against that. You’ll end up with milk in your eye.” My friend Allen White tells me this is taken from a play (and was said about playing cards), but I think the sentiment holds, even in database work.

    I mentioned the other day that you should allow the other person to talk and actively listen before you propose a solution. Well, I saw a consultant “bet against the impossible” the other day – and it bit her. She explained to the person telling her the problem that the situation simply couldn’t exist that way, and he proceeded to show her that it did. She got silent, typed a few things, muttered a little, and then said “well, must be something else.” She just couldn’t admit she was wrong. So don’t go there. If someone explains a problem to you with their database, listen with purpose, and then explore the troubleshooting steps you know to find the problem. But keep your absolutes to yourself.

    In fact, I have a friend that has recently sent me one of those. He connects to a system with SQL Server Management Studio (SSMS) version 2008 (if I recall correctly) and it shows a certain version number of the target system in the connection tab. Then he connects to it using SSMS 2008 R2 and gets a different number. Now, as far as I know, we didn’t change the connection string information, and that’s provided by the target system, so this is impossible. But I won’t tell him that. Not until I look a little more. :)
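
    Incidentally, one way to cross-check whatever SSMS displays is to ask the target server directly, for example:

        -- Ask the server itself, independent of what the SSMS connection tab shows
        SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
               SERVERPROPERTY('ProductLevel')   AS ProductLevel,
               SERVERPROPERTY('Edition')        AS Edition;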

    Read the article

  • PASS Summit Location Redux

    - by andyleonard
    Introduction To quote Ronald Reagan, "There you go again." The Professional Association for SQL Server (PASS) is considering locations for future PASS Summits. The apparent answer is: You Can Have The Summit Anywhere You Want... ...as long as it's in Seattle. PASS conducted a survey on this about a year ago, and I commented on the results and PASS' (mis-)interpretation of said results in a post entitled On PASS Summit Locations, Time Will Tell. "It's About Community" I think every member of the...(read more)

    Read the article

  • Windows Not Sleeping All Night

    - by John Paul Cook
    Having a computer wake up when you don’t want it to wastes electricity and drains the battery on mobile devices. My desktop had been waking up at night, so I assumed it was some network traffic on my home network. I unchecked Allow this device to wake the computer on my network adapters. Figure 1. Network adapter Power Management tab. That didn’t solve the problem. I included the screen capture in Figure 1 because it could be part of the solution for someone else. To identify the root cause instead...(read more)

    Read the article

  • Live from the #summit13 keynote : 2013-10-16

    - by AaronBertrand
    Early morning start here in Charlotte. I'm going to try to keep this post updated as I have new information from the keynote to share, so refresh often! 8:24 AM Bill Graziano takes the stage and welcomes us to the 15th PASS Summit. He mentions that PASS delivered over 700,000 hours of technical training in the previous fiscal year, and shows a Power BI Power Map video talking about all of the SQL Saturday accomplishments in the last few years. He introduces Amy Lewis, who wins this year's PASSion...(read more)

    Read the article

  • QueryUnit 0.0.0.8 – Trust No One

    - by Davide Mauri
    Yesterday I released an updated version of QueryUnit, version 0.0.0.8. QueryUnit now supports AreNotEqual, Greater, and Less assertions and is more capable of managing string results. I must say that I cannot live anymore without proper unit testing of a BI solution. Just yesterday, one of the unit tests at a customer site failed, showing a subtle situation where the release of a new version of a custom application would have corrupted the source of BI data, with a very low chance that someone would have noticed it before several days had passed. It can happen when you have more than 15 systems that handle the data needed by your BI solution. The key message of this situation is “Trust No One”: if your data hasn’t passed quality testing, it’s not trustworthy. Period. QueryUnit is now officially a hero :) Still no superpowers, but useful above all. http://queryunit.codeplex.com/

    Read the article

  • Testing and Validation – You Really Do Have The Time

    - by BuckWoody
    One of the great advantages in my role as a Technical Specialist here at Microsoft is that I get to work with so many great clients. I get to see their environments and how they use them, and the way they work with SQL Server. I’ve been a data professional myself for many years. Over that time I’ve worked with many database platforms, lots of client applications, and written a lot of code in many industries. For a while I was also a consultant, so I got to see how other shops did things as well. But because I now focus on a “set” base of clients (over 500 professionals in over 150 companies) I get to see them over a longer period of time. Many of them help me understand how they use the product in their projects, and I even attend some DBA regular meetings. I see the way the product succeeds, and I see when it fails.

    Something that has really impacted my way of thinking is the level of importance any given shop is able to place on testing and validation. I’ve always been a big proponent of setting up a test system and following a very disciplined regimen to make sure it will work in production for any new projects, and then taking the lessons learned into production as standards. I know, I know – there’s never enough time to do things right like this. Yet the shops I see that do it have the same level of work that they output as the shops that don’t. They just make the time to do the testing and validation and create a standard that they will follow in production. And what I’ve found (surprise, surprise) is that they have fewer production problems. OK, that might seem obvious – but I’ve actually tracked it, and those places that do the testing and best practices really do save stress, time and trouble from that effort.

    We all think that’s a good idea, but we just “don’t have time”. OK – but from what I’m seeing, you can gain time if you spend a little up front. You may find that you’re actually already spending the same amount of time that you would spend in doing the testing, you’re just doing it later, at night, under the gun. Food for thought.

    Read the article

  • TechEd 2014 Day 2

    - by John Paul Cook
    Today people asked me about backing up older versions of SQL Server to Azure. Older versions back to SQL Server 2005 can be easily backed up to Azure Storage by installing the Microsoft SQL Server Backup to Windows Azure Tool. It installs a service of the same name that applies rules to SQL Server backups. You can tell the tool to compress or encrypt your SQL Server backups. You can have it automatically upload your backups to Azure Storage. Even if you don’t want to upload your backups to Azure, you might...(read more)
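
    Since the tool applies its rules to backups written to folders you configure, ordinary native backup syntax is all that is needed. A sketch, under the assumption that the tool is configured to watch a hypothetical C:\SQLBackups folder:

        -- A plain native backup; the tool's service picks it up from the watched
        -- folder and applies the configured compression/encryption/upload rules.
        BACKUP DATABASE MyDb
        TO DISK = N'C:\SQLBackups\MyDb.bak'
        WITH INIT;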

    Read the article

  • "CLR Enabled" is not required to use CLR built-ins

    - by AaronBertrand
    Books Online articles referencing built-in CLR functions (such as FORMAT()) have a remark similar to the following: "FORMAT relies on the presence of the .NET Framework Common Language Runtime (CLR)." A lot of people seem to interpret this as meaning: "You must enable the sp_configure option 'CLR enabled' in order to use FORMAT()." Some then go on and suggest you run code similar to the following before you play with these functions: EXEC sp_configure 'show advanced options', 1; GO RECONFIGURE...(read more)
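
    A quick way to see the post's point for yourself (a minimal sketch, assuming SQL Server 2012 or later): FORMAT() works even while the 'clr enabled' option is 0.

        -- Confirm 'clr enabled' is off, then use a CLR built-in anyway
        SELECT name, value_in_use
        FROM sys.configurations
        WHERE name = N'clr enabled';   -- typically 0 on a default install

        SELECT FORMAT(GETDATE(), 'yyyy-MM-dd') AS FormattedDate;  -- still works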

    Read the article

  • DevWeek Slides & Demos available for download

    - by Davide Mauri
    Anyone interested can download Slides, Demos and Demo Database (WhitepagesDB) of my “SQL Server best practices for developers” session here: http://www.davidemauri.it/resources/slide--demos.aspx Happy Downloading! :)

    Read the article

  • Sampling SQL Server batch activity

    - by extended_events
    Recently I was troubleshooting a performance issue on an internal tracking workload and needed to collect some very low-level events over a period of 3-4 hours. During analysis of the data I found that a common pattern I was using was to find a batch with a duration that was longer than average and follow all the events it produced. This pattern got me thinking that I was discarding a substantial amount of event data that had been collected, and that it would be great to be able to reduce the collection overhead on the server if I could still get all activity from some batches.

    In the past I’ve used a sampling technique based on the counter predicate to build a baseline of overall activity (see Mike's post here). This isn’t exactly what I want, though, as there would certainly be events from a particular batch that wouldn’t pass the predicate. What I need is a way to identify streams of work and select, say, one in ten of them to watch, and SQL Server provides just such a mechanism: session_id. Session_id is a server-assigned integer that is bound to a connection at login and lasts until logout. So by combining the session_id predicate source and the divides_by_uint64 predicate comparator we can limit collection, and still get all the events in batches for investigation.

        CREATE EVENT SESSION session_10_percent ON SERVER
        ADD EVENT sqlserver.sql_statement_starting(
            WHERE (package0.divides_by_uint64(sqlserver.session_id,10))),
        ADD EVENT sqlos.wait_info(
            WHERE (package0.divides_by_uint64(sqlserver.session_id,10))),
        ADD EVENT sqlos.wait_info_external(
            WHERE (package0.divides_by_uint64(sqlserver.session_id,10))),
        ADD EVENT sqlserver.sql_statement_completed(
            WHERE (package0.divides_by_uint64(sqlserver.session_id,10)))
        ADD TARGET ring_buffer
        WITH (MAX_DISPATCH_LATENCY=30 SECONDS, TRACK_CAUSALITY=ON)
        GO

    There we go; event collection is reduced while still providing enough information to find the root of the problem. By the way, the performance issue turned out to be an IO issue, and the session definition above was more than enough to show long waits on PAGEIOLATCH*.
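
    To pull the sampled events back out of the ring_buffer target afterwards, a minimal sketch (assuming the session above has been started) might look like:

        -- Read the ring_buffer target of the sampling session as XML
        SELECT CAST(t.target_data AS XML) AS captured_events
        FROM sys.dm_xe_sessions AS s
        JOIN sys.dm_xe_session_targets AS t
            ON t.event_session_address = s.address
        WHERE s.name = N'session_10_percent'
          AND t.target_name = N'ring_buffer';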

    Read the article

  • StreamInsight will not push feature releases through Microsoft Update going forward

    - by Roman Schindlauer
    Until now, we've released StreamInsight through the Microsoft Download Center, and also released it through Microsoft Update. Going forward, we will only release new StreamInsight versions through the Microsoft Download Center and only use MU to release service packs and security fixes (should any be needed). As a result of this decision, we are pulling the recent StreamInsight 2.1 release from MU; this release is still available in the Download Center. Don’t worry: there’s nothing wrong with the versions we’ve shipped in MU, we’ve just adjusted how we use MU. There is no action necessary from our customers as a result of this change, and we are not rolling back any changes to your current installation, so if you have installed StreamInsight 2.1 recently through Microsoft Update, it will still work fine. Regards, The StreamInsight Team

    Read the article
