Search Results

Search found 10594 results on 424 pages for 'vizioz umbraco blog'.


  • TechEd 2010 Followup

    - by AllenMWhite
    Last week I presented a couple of sessions at Tech Ed NA in New Orleans. It was a great experience, even though my demos didn't always work out as planned. Here are the sessions I presented. DAT01-INT, Administrative Demo-Fest for SQL Server 2008: SQL Server 2008 provides a wealth of features aimed at the DBA. In this demofest of features we'll see ways to make administering SQL Server easier and faster, such as Centralized Data Management, Performance Data Warehouse, Resource Governor, Backup Compression...(read more)
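
    One of the features that session highlights, backup compression, is a single WITH option on a standard backup. A minimal sketch, assuming a hypothetical database name and backup path (neither is from the post):

        -- Backup compression was introduced in SQL Server 2008 (Enterprise edition)
        BACKUP DATABASE AdventureWorks
        TO DISK = N'C:\Backups\AdventureWorks.bak'
        WITH COMPRESSION, INIT;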

    Read the article

  • SSD-HDD price parity

    - by jchang
    It is hard to believe, but we are essentially at SSD-HDD price parity. Of course I am comparing enterprise class 10K/15K HDDs to consumer grade SSDs. Below are prices I am seeing: 300GB 15K HDD, $370; 900GB 10K HDD, $600; 1TB 7200 HDD, $230 (less for consumer HDDs); 512GB SATA SSD, $400-600; Intel SSD DC S3700 400GB, $940. The 512GB SATA SSDs are consumer grade, MLC NAND, with only 7% over-provisioning. That is 512GB (1GB = 2^30) of NAND, with 512GB (1GB = 10^9) of user capacity. Intel just announced the SSD...(read more)
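
    The 7% figure follows from the two definitions of a gigabyte; a quick back-of-the-envelope check using the numbers above:

        -- 512 binary GB (2^30 bytes each) of NAND exposed as 512 decimal GB (10^9 bytes each)
        SELECT (512e0 * 1073741824) / (512e0 * 1000000000) - 1 AS over_provisioning;
        -- returns ~0.0737, i.e. roughly the 7% quoted above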

    Read the article

  • Bleeding Edge 2012 – session material

    - by Hugo Kornelis
    As promised, here are the slide deck and demo code I used for my presentation at the Bleeding Edge 2012 conference in Laško, Slovenia. Okay, I promised to have them up by Tuesday or Wednesday at worst, and it is now Saturday – my apologies for the delay. Thanks again to all the attendees of my session. I hope you enjoyed it, and if you have any questions, please don’t hesitate to get in touch with me. I had a great time in Slovenia, both during the event and in the after hours. Even if everything...(read more)

    Read the article

  • AngularJS: structuring a web application with multiple ng-apps

    - by mg1075
    The blogosphere has a number of articles on the topic of AngularJS app structuring guidelines, such as these (and others):

    http://www.johnpapa.net/angular-app-structuring-guidelines/
    http://codingsmackdown.tv/blog/2013/04/19/angularjs-modules-for-great-justice/
    http://danorlando.com/angularjs-architecture-understanding-modules/
    http://henriquat.re/modularizing-angularjs/modularizing-angular-applications/modularizing-angular-applications.html

    However, one scenario for which I have yet to come across guidelines and best practices is the case where you have a large web application containing multiple "mini-spa" apps, and the mini-spa apps all share a certain amount of code. I am not referring to the case of trying to have multiple ng-app declarations on the same page; rather, I mean different sections of a large site that have their own, unique ng-app declaration. As Scott Allen writes in his OdeToCode blog: "One scenario I haven't found addressed very well is the scenario where multiple apps exist in the same greater web application and require some shared code on the client." Are there any recommended approaches to take, pitfalls to avoid, or good sample structures of this scenario that you can point to?

    Read the article

  • T-SQL Tuesday #006 Round-up!

    - by Mike C
    T-SQL Tuesday this month was all about LOB (large object) data. Thanks to all the great bloggers out there who participated! The participants this month posted some very impressive articles with information running the gamut from Reporting Services to SQL Server spatial data types to BLOB-handling in SSIS. One thing I noticed immediately was a trend toward articles about spatial data (SQL Server 2008 Geography and Geometry data types, a very fun topic to explore if you haven’t played around with...(read more)

    Read the article

  • SQL Server 2012 Service Pack 1 is available!

    - by AaronBertrand
    Microsoft has released Service Pack 1 for SQL Server 2012, though so far it is only being made available for the x64 platform. You can read about the fixes in the following KB article: http://support.microsoft.com/kb/20120401 The build # appears to be the same as RTM (11.0.2100), which may mean it's just a ploy to bring on board all those folks who still think it's necessary to wait for SP1. I haven't tried installing it yet, as I'm still traveling, but please let me know if you have any issues....(read more)
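
    If you would rather confirm the servicing level than trust the build number, SERVERPROPERTY reports it directly; a quick check:

        SELECT SERVERPROPERTY('ProductVersion') AS build,   -- e.g. 11.0.2100.60
               SERVERPROPERTY('ProductLevel')   AS level;   -- 'RTM', 'SP1', etc.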

    Read the article

  • But what version is the database now?

    - by BuckWoody
    When you upgrade your system to SQL Server 2008 R2, you’ll know that the instance is at that version by using the standard commands like SELECT @@VERSION or EXEC xp_msver. My system came back with this info when I typed those:

        Microsoft SQL Server 2008 R2 (RTM) - 10.50.1600.1 (Intel X86)
        Apr  2 2010 15:53:02
        Copyright (c) Microsoft Corporation
        Developer Edition on Windows NT 6.0 <X86> (Build 6002: Service Pack 2) (Hypervisor)

        Index  Name                 Internal_Value  Character_Value
        1      ProductName          NULL            Microsoft SQL Server
        2      ProductVersion       655410          10.50.1600.1
        3      Language             1033            English (United States)
        4      Platform             NULL            NT INTEL X86
        5      Comments             NULL            SQL
        6      CompanyName          NULL            Microsoft Corporation
        7      FileDescription      NULL            SQL Server Windows NT
        8      FileVersion          NULL            2009.0100.1600.01 ((KJ_RTM).100402-1540)
        9      InternalName         NULL            SQLSERVR
        10     LegalCopyright       NULL            Microsoft Corp. All rights reserved.
        11     LegalTrademarks      NULL            Microsoft SQL Server is a registered trademark of Microsoft Corporation.
        12     OriginalFilename     NULL            SQLSERVR.EXE
        13     PrivateBuild         NULL            NULL
        14     SpecialBuild         104857601       NULL
        15     WindowsVersion       393347078       6.0 (6002)
        16     ProcessorCount       1               1
        17     ProcessorActiveMask  1               1
        18     ProcessorType        586             PROCESSOR_INTEL_PENTIUM
        19     PhysicalMemory       2047            2047 (2146934784)
        20     Product ID           NULL            NULL

    But database properties are separate from the instance. After an upgrade, you always want to make sure that the compatibility options (which have much to do with how NULLs and other objects are treated) are at what you expect. For the most part, as long as the application can handle it, I set my compatibility levels to the latest version. For SQL Server 2008, that was “100” (the 10.0 level). You can do this with the ALTER DATABASE command or you can just right-click the database and select “Properties” and then “Database Options” in SQL Server Management Studio. To check the database compatibility level, I use this query:

        SELECT name, cmptlevel FROM sys.sysdatabases

    When I did that this morning I saw that the databases (all of them) were at 100 – the 10.0 level, not a new level to match the 10.50 instance version. That’s expected – we didn’t revise the compatibility level up with the instance for this particular release. Didn’t want to catch you by surprise on that. While your databases should be at the “proper” level for your situation, you can’t rely on the compatibility level to indicate the instance level. More info on the ALTER DATABASE command in SQL Server 2008 R2 is here: http://technet.microsoft.com/en-us/library/bb510680(SQL.105).aspx
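
    For reference, the same check and change can be written against the newer sys.databases catalog view; a minimal sketch (MyDB is a placeholder name):

        -- Check compatibility levels for all databases
        SELECT name, compatibility_level FROM sys.databases;

        -- Raise one database to the SQL Server 2008/R2 level
        ALTER DATABASE MyDB SET COMPATIBILITY_LEVEL = 100;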

    Read the article

  • Genetic Considerations in User Interface Design

    - by John Paul Cook
    There are several different genetic factors that are highly relevant to good user interface design. Color blindness is probably the best known. But did you know about motion sickness and epilepsy? We’ve been discussing how genetic factors should be considered in user interface design in one of my classes at Vanderbilt University School of Nursing. According to the National Library of Medicine, approximately 8% of males and 0.5% of females have red-green color discrimination problems with the most...(read more)

    Read the article

  • Working with Reporting Services Filters–Part 5: OR Logic

    - by smisner
    When you combine multiple filters, Reporting Services uses AND logic. Once upon a time, there was actually a drop-down list for selecting AND or OR between filters, which was very confusing to people because it was often grayed out. Now that selection is gone, but no matter. It wouldn’t help us solve the problem that I want to describe today. As with many problems, Reporting Services gives us more than one way to apply OR logic in a filter. If I want a filter to include this value OR that value for the same field, one approach is to set up the filter to use the IN operator, as I explained in Part 1 of this series. But what if I want to base the filter on two different fields? I need a different solution. Using the AdventureWorksDW2008R2 database, I have a report that lists product sales. Let’s say that I want to filter this report to show only products that are Bikes (a category) OR products for which sales were greater than $1,000 in a year. If I set up the filter like this:

        Expression       Data Type  Operator  Value
        [Category]       Text       =         Bikes
        [SalesAmount]               >         1000

    then AND logic is used, which means that both conditions must be true. That’s not the result I want. Instead, I need to set up the filter like this:

        Expression: =Fields!EnglishProductCategoryName.Value = "Bikes" OR Fields!SalesAmount.Value > 1000
        Data Type:  Boolean
        Operator:   =
        Value:      =True

    The OR logic needs to be part of the expression so that it can return a Boolean value that we test against the Value. Notice that I have used =True rather than True for the value. In the filtered report, any non-bike product appears only if the total sales exceed $1,000, whereas Bikes appear regardless of sales. (For example, Mountain-400-W Silver, 38 has sales of $923 in 2007 but gets included because it is in the Bikes category.)
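
    For comparison, here is the same OR logic expressed as a T-SQL predicate. The view is hypothetical (imagine one row per product per year); the column names follow AdventureWorksDW2008R2 conventions:

        SELECT EnglishProductCategoryName, EnglishProductName, SalesAmount
        FROM dbo.vProductSales  -- hypothetical view, not part of the sample database
        WHERE EnglishProductCategoryName = 'Bikes' OR SalesAmount > 1000;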

    Read the article

  • Worst code I've written in a while

    - by merrillaldrich
    Here's a nice, compact bit of WTF-ery I had to write for a prod issue today:

        Again:
        UPDATE TOP (1) dbo.someTable
        SET field3 = 'NEW'
        WHERE field2 = 'NEW' AND field3 = '';
        IF @@ROWCOUNT > 0 GOTO Again

    Can you guess from the code what awesomesauce issues I was working around? This was a reminder for me that sometimes there is time to do it right, but sometimes you just have to do it now. I need that lesson sometimes, as I tend to be a perfectionist. If you are trying to do it right, please don't...(read more)

    Read the article

  • [Speaking] Presenting at AITP Region 18 IT Con

    - by AllenMWhite
    The IT Con event this Saturday, June 18, is similar to SQL Saturday. It has three tracks: an IT Professional track, a Development track, and a Career Development track. I'll be presenting two sessions in the IT Professional track. Gather SQL Server Performance Data with PowerShell: We all know how important it is to keep a baseline of performance metrics that allow us to know when something is wrong and help us to track it down and fix the problem. We don't always know how to do this easily...(read more)

    Read the article

  • Connect Digest : 2011-06-27

    - by AaronBertrand
    Sorry I have fallen off the Connect Digest wagon for the past few weeks; been a little swamped since returning from SQLCruise Alaska. Not sure I'll be able to assemble a digest every week, but I'll certainly try to keep a steady pace. This week I wanted to highlight a few suggestions around indexed views. With the coming of SQL Server code-named "Denali" we will be pushed toward the new columnstore index as an alternative to indexed views. But this won't be for all cases, and it likely won't be available...(read more)
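
    For anyone who hasn't built one, an indexed view is just a schema-bound view with a unique clustered index on top; a minimal sketch with hypothetical table and column names:

        CREATE VIEW dbo.vOrderTotals
        WITH SCHEMABINDING
        AS
        SELECT CustomerID,
               COUNT_BIG(*)  AS OrderCount,  -- COUNT_BIG(*) is required when the view uses GROUP BY
               SUM(TotalDue) AS TotalDue     -- assumes TotalDue is declared NOT NULL
        FROM dbo.Orders
        GROUP BY CustomerID;
        GO
        CREATE UNIQUE CLUSTERED INDEX IX_vOrderTotals ON dbo.vOrderTotals (CustomerID);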

    Read the article

  • DTLoggedExec 1.1.2008.4 Released!

    - by Davide Mauri
    Today I've released the latest version of my DTExec replacement tool, DTLoggedExec. The main changes are the following:

    - Used a new strategy for version numbers; they now follow the pattern Major.Minor.TargetSQLServerVersion.Revision
    - Added support for Auto Configurations
    - Fixed a bug that reported an incorrect number of errors and warnings to Log Providers
    - Fixed a bug that prevented correct casting of values when using the /Set and /Param options
    - Errors and warnings are now counted more precisely
    - Updated database and log import scripts to categorize logs by projects and sections (e.g., Project: MyBIProject; Sections: Staging, Datawarehouse)
    - Removed unused report stored procedures from the database
    - Updated samples: 12 samples are now available to show ALL DTLoggedExec features
    - From this version, only SSIS 2008 will be supported

    http://dtloggedexec.codeplex.com/releases/view/62218

    It is useful to say something more on a couple of specific points:

    From this version, only SSIS 2008 will be supported. Yes, Integration Services 2005 is not supported anymore. The latest version capable of running SSIS 2005 packages is 1.0.0.2.

    Updated database and log import scripts to categorize logs by projects and sections. When you import a log file, you can now assign it to a project and to a section of that project. This makes it easier to gather statistical information for an entire project or a subsection of it, and it also allows you to store logged data of packages belonging to different projects in the same database.

    Updated samples. A complete set of samples that shows how to use all DTLoggedExec features is now shipped with the product. Enjoy!

    Added support for Auto Configurations. This point will have a post of its own, since it's quite important and is by far the biggest new feature introduced in this release. To explain it in a few words: you don't need to waste time with complex configuration files or options, since a package will configure itself automatically. You just need to write a single statement as a parameter for DTLoggedExec. This feature can simplify deployment *a lot* :)

    In the next few days I'll write the mentioned post on Auto Configurations, and I'll update the documentation available on the DTLoggedExec website: http://dtloggedexec.davidemauri.it/MainPage.ashx

    Read the article

  • SQL Saturday #156 : Providence, RI

    - by AaronBertrand
    Well, East Greenwich, RI. Another successful event, this one put on by John Miner, Brandon Leach, Steve Simon, Scott Abrants and a host of other folks. Several #SQLFamily friends in attendance as well: Grant Fritchey, Mike Walsh, Jack Corbett, Wayne Sheffield and others. I gave a session in the morning and then a session to cap off the day. Thanks to everyone who attended! The downloads are here: T-SQL: Bad Habits & Best Practices; The Ins & Outs of Contained Databases...(read more)

    Read the article

  • 24 Hours of PASS next week, pre-con preview style

    - by drsql
    I will be doing my Characteristics of a Great Relational Database, which is a session that I haven’t done since last PASS. When I was asked about doing this Summit Preview version of 24 Hours of PASS, I decided that I would do this session, largely because it is kind of light and fun, but also because it is either going to be the basis of the end section of my pre-con at the Summit, or it is going to be the section of the pre-con we don’t get to because we are so involved in working out designs that...(read more)

    Read the article

  • New Whitepaper from SQLBI: Vertipaq vs ColumnStore

    - by AlbertoFerrari
    At the end of June 2012, I was in Amsterdam to present some sessions at TechEd Europe 2012 and, while preparing the material for the demos (yes, the best demos are the ones I prepare at the last minute), I decided to make a comparison between the two implementations of xVelocity in SQL Server 2012: the VertiPaq engine in SSAS Tabular and the ColumnStore index in SQL Server. After some trials, I decided that ColumnStore was a clear loser, because I was not able to see a real improvement...(read more)
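
    For context, creating a ColumnStore index in SQL Server 2012 takes a single statement; a sketch with hypothetical names (note that in 2012 the index also makes the table read-only until it is dropped or disabled):

        CREATE NONCLUSTERED COLUMNSTORE INDEX csx_FactSales
        ON dbo.FactSales (DateKey, ProductKey, SalesAmount);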

    Read the article

  • Cloud Computing Pricing - It's like a Hotel

    - by BuckWoody
    I normally don't go into the economics or pricing side of Distributed Computing, but I've had a few friends who have been surprised by a bill lately, and I wanted to quickly address at least one aspect of it. Most folks are used to buying software and owning it outright - like buying a car. We pay a lot for the car, and then we use it whenever we want. We think of the "cloud" services as a taxi - we'll just pay for the ride we take and no more. But it's not quite like that. It's actually more like a hotel. When you subscribe to Azure using a free offering like the MSDN subscription, you don't have to pay anything for the service. But when you create an instance of a Web or Compute Role, Storage, that sort of thing, think of it as checking into a hotel room: you get the key, you pay for the room. For Azure, using bandwidth, CPU and so on is billed just as it states in the Azure Portal. So in effect there is a cost for the service and then a cost to use it, like water or power or any other utility. Where this bit some folks is that they created an instance, played around with it, and then left it running. No one was using it, no one was on - so they thought they wouldn't be charged. But they were. It wasn't much, but it was a surprise. They had the hotel room key, but they weren't in the room, so to speak. To add to their frustration, they had to talk to someone on the phone to cancel the account. I understand the frustration. Although we have all this spelled out in the sign-up area, not everyone has the time to read through all that. I get that. So why not make this easier? As an explanation, we bill for that time because the instance is still running, and we have to tie up resources to be available the second you want them, and that costs money. As far as being able to cancel from the portal, that's also something that needs to be clearer. You may not be aware that you can spin up instances using code - and so cancelling from the Portal would allow you to do the same thing. Since a mistake in code could erase all of your instances and the account, we make you call to make sure you're you and you really want to take it down. Not a perfect system by any means, but we'll evolve this as time goes on. For now, I wanted to make sure you're aware of what you should do. By the way, you don't have to cancel your whole account to avoid being billed. Just delete the instance from the portal and you won't be charged. You don't have to call anyone for that. And just FYI - you can download the SDK for Azure and never even hit the online version at all for learning and playing around. No sign-up, no credit card, PO, nothing like that. In fact, that's how I demo Azure all the time. Everything runs right on your laptop in an emulated environment.

    Read the article

  • Where's the source code?

    - by Kyle Burns
    I've been contacted by several people through this blog asking about the missing source code for the "Beginning Windows 8 Application Development - XAML Edition" book (the book is available at http://www.amazon.com/gp/product/1430245662/) and wanted to share this with others who may have come to this blog looking for it but may not have communicated with me. The publisher (Apress) does know that the source code is not posted on the book's product page and will be correcting it. Apress is located in New York City and things were slowed down a little bit last week due to the storm, but I've been assured they will be correcting the product page as soon as they can. Thanks to everyone who has bought the book and I especially appreciate your patience.

    Read the article

  • What are the most effective learning methods?

    - by BuckWoody
    After I got done speaking at the SQL Server 2008 R2 Launch Event yesterday I came back to the hotel room for a web-meeting with some of the other teachers at the University of Washington. As teachers we are always looking to improve the knowledge transfer to our students – and the Program Director found an interesting study that I thought I might share here. Below is an un-labeled chart showing the effectiveness of learning methods according to a recent study. At the top are the labels. (“Teaching” here means students teaching each other). Try the experiment we did: place the labels where you think they’ll go. I’ll post the completed chart tomorrow.

    Read the article

  • Process Improvement and the Data Professional

    - by BuckWoody
    Don’t be afraid of that title – I’m not talking about Six Sigma or anything super-formal here. In many organizations, there are more folks in other IT roles than in the Data Professional area. In other words, there are more developers, system administrators and so on than there are the “DBA” role. That means we often have more to do than the time we need to do it. And, oddly enough, the first thing that is sacrificed is process improvement – the little things we need to do to make the day go faster in the first place. Then we get even more behind, the work piles up and…well, you know all about that. Earlier I challenged you to find 10-30 minutes a day to study. Some folks wrote back and asked “where do I start”? Well, why not be super-efficient and combine that time with learning how to make yourself more efficient? Try out a new scripting language, learn a new tool that automates things or find out ways others have automated their systems. In general, find out what you’re doing and how, and then see if that can be improved. It’s kind of like doing a performance tuning gig on yourself! If you’re pressed for time, look for bite-sized articles (like the ones I’ve done here for PowerShell and SQL Server) that you can follow in a “serial” fashion. In a short time you’ll have a new set of knowledge you can use to make your day faster.

    Read the article

  • DAX editor for SQL Server

    - by Davide Mauri
    One of the major criticisms of DAX has been the lack of a decent editor and, more generally, of a dedicated IDE, like the ones we have for T-SQL or MDX. Well, this is no longer true. On CodePlex, a very interesting and promising Visual Studio 2010 extension was released at the beginning of November 2011: http://daxeditor.codeplex.com/ Intellisense, syntax highlighting, and all the typical features offered by Visual Studio are available for DAX too. Right now you have to download the source code and compile it, and that’s it!

    Read the article

  • Human Resources Sucks

    - by andyleonard
    Introduction This post is the twenty-seventh part of a ramble-rant about the software business. The current posts in this series are: Goodwill, Negative and Positive Visions, Quests, Missions Right, Wrong, and Style Follow Me Balance, Part 1 Balance, Part 2 Definition of a Great Team The 15-Minute Meeting Metaproblems: Drama The Right Question Software is Organic, Part 1 Metaproblem: Terror I Don't Work On My Car A Turning Point Human Doings Everything Changes Getting It Right The First Time One-Time...(read more)

    Read the article

  • Free tools for SQL Server - Automating Execution Plan Analysis

    - by jchang
    Since this topic is being discussed, I will plug my own tools, SQL Exec Stats, and (a little dated) documentation. The main capability is cross-referencing index usage with specific execution plans. Another feature is generating execution plans for all stored procedures in a database, along with the index usage cross-reference. There are several sources of execution plans or plan handles: this could be a live trace, a previously saved trace, previously saved sqlplan files, from dm_exec_cached_plans,...(read more)
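
    The dm_exec_cached_plans source mentioned above pairs with two standard DMFs that turn a plan handle into the statement text and the XML plan; a minimal sketch:

        SELECT st.text, qp.query_plan, cp.usecounts
        FROM sys.dm_exec_cached_plans AS cp
        CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
        CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp;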

    Read the article

  • Messages do not always appear in [catalog].[event_messages] in the order that they occur [SSIS]

    - by jamiet
    This is a simple heads up for anyone doing SQL Server Integration Services (SSIS) development using SSIS 2012. Be aware that messages do not always appear in [catalog].[event_messages] in the order that they occur, observe… In the following query I am looking at a subset of messages in [catalog].[event_messages] and ordering them by [event_message_id]:

        SELECT [event_message_id], [event_name], [message_time], [message_source_name]
        FROM   [catalog].[event_messages] em
        WHERE  [event_message_id] BETWEEN 290972 AND 290982
        ORDER  BY [event_message_id] ASC
        --ORDER BY [message_time] ASC

    Take a look at the two rows that I have highlighted, and note how the OnPostExecute event for “Utility GetTargetLoadDatesPerETLIfcName” appears after the OnPreExecute event for “FELC Loop over TargetLoadDates”. I happen to know that this is incorrect, because “Utility GetTargetLoadDatesPerETLIfcName” is a package that gets executed by an Execute Package Task prior to the For Each Loop “FELC Loop over TargetLoadDates”. If we order instead by [message_time] then we see something that makes more sense:

        SELECT [event_message_id], [event_name], [message_time], [message_source_name]
        FROM   [catalog].[event_messages] em
        WHERE  [event_message_id] BETWEEN 290972 AND 290982
        --ORDER BY [event_message_id] ASC
        ORDER  BY [message_time] ASC

    We can see that the OnPostExecute for “Utility GetTargetLoadDatesPerETLIfcName” did indeed occur before the OnPreExecute event for “FELC Loop over TargetLoadDates”; they just did not get assigned an [event_message_id] in chronological order. We can speculate as to why that might be (I suspect the explanation is something to do with the two executables appearing in different packages), but the reason is not the important thing here. Just be aware that you should be ordering by [message_time] rather than [event_message_id] if you want to get 100% accurate insights into your executions. @Jamiet
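
    Following that advice, a stable ordering for analysis keeps [message_time] first and uses [event_message_id] only as a tie-breaker; a sketch:

        SELECT [event_message_id], [event_name], [message_time], [message_source_name]
        FROM   [catalog].[event_messages]
        ORDER  BY [message_time] ASC, [event_message_id] ASC;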

    Read the article

  • Thinking in DAX (#powerpivot and #bism)

    - by Marco Russo (SQLBI)
    Last week Alberto published an interesting post about Counting Products in the Current Status with PowerPivot. Starting from a question raised by a reader, Alberto described how to solve a common issue (report the “current status” of each item at a given point in time, starting from a transactions table) by using a single DAX formula. I suggest you read his post to understand the technical details. What is inspiring about this example is that we can look at Vertipaq and DAX from several...(read more)

    Read the article
