Search Results

Search found 274 results on 11 pages for 'seattle leonard'.

Page 9/11 | < Previous Page | 5 6 7 8 9 10 11  | Next Page >

  • How can I change the lock screen in Windows 8 that appears for the default user?

    - by Mark Allen
    This is about Windows 8 RTM. How can I change the lock screen in Windows 8 that appears after connecting to the machine via RDP? You can change your lock screen for your user account like so: Hit the Windows key. Right click your user name in the upper right hand corner, choose Change Account Picture. Click Lock Screen, choose a new picture. However, if you then connect to the computer where you've done this from another computer via RDP, using the same account, the physical machine you've connected to will display the "default" user lock screen - a stylized Space Needle / Seattle picture. It's not a bad picture, but I'd like to change it.

    Read the article

  • Microsoft Silverlight MVP one more time

    - by pluginbaby
    Another wonderful first email of the year… announcing that I’ve just been re-awarded Most Valuable Professional (MVP) by Microsoft for Silverlight. This is my 5th consecutive year as an MVP and I am still very honoured and excited! In 2010 I had the pleasure of being involved in many community events around Silverlight, speaking at Microsoft conferences and user groups (doing the launch of the Vancouver Silverlight User Group was fun!), as well as taking part in worldwide conferences like MIX in Las Vegas and the MVP Summit in Redmond. I also took on new kinds of activities in 2010: I wrote questions for the first Microsoft Silverlight certification exam (70-506), and I was a technical reviewer for 3 Silverlight books. I finally started to share more on Twitter @LaurentDuveau. In 2010 the content of this blog was mostly about Silverlight; I expect it to be the same in 2011, plus a touch of Windows Phone as well. I already know that 2011 will be a hell of a good year… I’ll be at the next MVP Summit in Seattle, I’ll also be speaking at DevTeach, which comes back to Montreal (at last!), and I have some nice Silverlight training plans for France and Tunisia. More than that, my business RunAtServer is healthy (proud of my team!) and I have insane news and a very big surprise coming on that front… stay tuned! Happy New Year!

    Read the article

  • Not sure how to link json 100% in php

    - by ronhdoge
    I'm trying to create an RSS feed that my Droid app reads, but it has some holes that I can't figure out how to fix. The RSS link page is http://www.mandarich.com/mandarichServer/mlb/indexbaseball.php. When reading the RSS I can see that the icon is missing on some teams and can't figure out why, and I can't get St. Louis to work at all. The PHP code I have is as follows:

      <?php
      $teams["boston"] = "bostonredsox.gif";
      $teams["nyyankees"] = "newyorkyankes.gif";
      $teams["baltimore"] = "baltimoreorioles.gif";
      $teams["tampa"] = "tampabayrays.gif";
      $teams["toronto"] = "torontobluejays.gif";
      $teams["atlanta"] = "atlantabraves.gif";
      $teams["florida"] = "floridamarlins.gif";
      $teams["nymets"] = "newyorkmets.gif";
      $teams["philadelphia"] = "philadelphiaphillies.gif";
      $teams["washington"] = "washingtonnationals.gif";
      $teams["chicagosox"] = "chicagowhitesox.gif";
      $teams["cleveland"] = "clevelandindians.gif";
      $teams["detroit"] = "detroittigers.gif";
      $teams["kansas"] = "kansascityroyals.gif";
      $teams["minnesota"] = "minnesotatwins.gif";
      $teams["chicagocubs"] = "chicagocubs.gif";
      $teams["cincinnati"] = "cinncinatireds.gif";
      $teams["houston"] = "houstonastros.gif";
      $teams["milwaukee"] = "milwaukeebrewers.gif";
      $teams["pittsburgh"] = "pitsburghpirates.gif";
      $teams["st.louis"] = "stlouiscardinals.gif";
      $teams["laangels"] = "losangelesangels.gif";
      $teams["oakland"] = "oaklandathletics.gif";
      $teams["seattle"] = "seattlemariners.gif";
      $teams["texas"] = "texasrangers.gif";
      $teams["arizona"] = "arizonadiamondbacks.gif";
      $teams["colorado"] = "coloradorockies.gif";
      $teams["ladodgers"] = "losangelesdodgers.gif";
      $teams["sandiego"] = "sandiegopadres.gif";
      $teams["sanfrancisco"] = "sanfranciscogiants.gif";

      $abbr["arizona"] = "ARI";
      $abbr["oakland"] = "OAK";
      $abbr["baltimore"] = "BAL";
      $abbr["tampa"] = "TAM";
      $abbr["boston"] = "BOS";
      $abbr["nyyankees"] = "NYY";
      $abbr["texas"] = "TEX";
      $abbr["toronto"] = "TOR";
      $abbr["laangels"] = "LAA";
      $abbr["atlanta"] = "ALT";
      $abbr["colorado"] = "COL";
      $abbr["philadelphia"] = "PHI";
      $abbr["florida"] = "FLA";
      $abbr["milwaukee"] = "MIL";
      $abbr["washington"] = "WAS";
      $abbr["chicagosox"] = "CHW";
      $abbr["cleveland"] = "CLE";
      $abbr["detroit"] = "DET";
      $abbr["seattle"] = "SEA";
      $abbr["sanfrancisco"] = "SFO";
      $abbr["st.louis"] = "STL";
      $abbr["chicagocubs"] = "CHC";
      $abbr["houston"] = "HOU";
      $abbr["nymets"] = "NYM";
      $abbr["cincinnati"] = "CIN";
      $abbr["sandiego"] = "SDG";
      $abbr["ladodgers"] = "LAD";
      $abbr["pittsburgh"] = "PIT";
      $abbr["minnesota"] = "MIN";
      $abbr["kansas"] = "KAN";
      ?>
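    One thing worth checking (a hedged suggestion, not a confirmed fix): a key like "st.louis" only matches if the feed emits exactly that string, so normalizing the incoming name before the lookup, and falling back to a default icon when a key is missing, usually closes holes like these. Spellings such as "newyorkyankes.gif" and the "ALT" abbreviation for Atlanta are also worth verifying against the actual files on the server. A minimal sketch of the idea in Python (the PHP equivalent would use strtolower/str_replace plus isset):

      # Hypothetical sketch (not the poster's actual feed logic): normalize the
      # team name coming from the feed and fall back to a default icon when the
      # key is unknown, so one unmatched spelling doesn't leave a hole.
      TEAM_ICONS = {
          "st.louis": "stlouiscardinals.gif",
          "seattle": "seattlemariners.gif",
          # ... the rest of the teams ...
      }

      def icon_for(raw_name, default="generic.gif"):
          # "St. Louis" -> "st.louis": lowercase and drop spaces so feed
          # variants all map onto the same key.
          key = raw_name.lower().replace(" ", "")
          return TEAM_ICONS.get(key, default)

      print(icon_for("St. Louis"))   # stlouiscardinals.gif
      print(icon_for("Somewhere"))   # generic.gif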

    Read the article

  • PASS Summit book launch and meet the authors - Professional SQL Server 2012 Internals & Troubleshooting

    - by Christian
    I’m very pleased to announce that we’ll be officially launching our new book, Professional SQL Server 2012 Internals and Troubleshooting, at the PASS Summit in Seattle tomorrow. In partnership with our great friends at SQL Sentry, we’ll have most of the authors at the SQL Sentry exhibitor stand from 12:30 on Thursday 8th November for a book signing event, which will give you a rare opportunity to meet the authors and contributors, many of whom have flown in from around the world. SQL Sentry also have lots and lots of copies to give away for free, so be sure to drop by their stand and ask about it! If you really can’t wait, or don’t want to run the risk of not getting a copy, then the PASS bookstore has a few copies for sale, but don’t expect them to be there for long! You can also order it from your favourite online retailer:
    amazon.com: http://amzn.to/U9IlPV
    barnesandnoble.com: http://bitly.com/Ux1gog
    amazon.co.uk: http://bitly.com/WBJ18l
    I’ll be writing a follow-up post very soon explaining why I think you should buy this book, so look out for it!
    Christian Bolton - MCA, MCM, MVP
    Technical Director
    http://coeo.com - SQL Server Consulting & Managed Services

    Read the article

  • Sevensteps and I are joining forces

    - by Dennis Vroegop
    As of today, I will be partnering with Sevensteps when it comes to developing great Surface, Windows Phone 7 and Windows 7 Touch applications. Below you’ll find the press release we sent out today. I am looking forward to this partnership and expect great things coming from us both in the future!
    Dennis Vroegop, Microsoft MVP, joins Sevensteps partner network
    1 March 2011, Seattle / Amersfoort
    Today Dennis Vroegop and Bart Roozendaal, both Microsoft Most Valuable Professionals for Microsoft Surface, announce that Dennis Vroegop is joining the Sevensteps partner network. Dennis and Bart already worked together very closely through the Microsoft MVP connection, but decided to combine their efforts to make the new Microsoft Surface, and our solutions for it, a success. Dennis will join the other Sevensteps partners in creating state-of-the-art solutions for Microsoft Surface, Windows Phone 7 and Windows 7 Touch. Dennis brings a vast amount of knowledge about these technologies, as well as his network in the Dutch developer community. With Dennis joining the Sevensteps partner network we bring unique expertise, power and insight into the platforms that no other company worldwide can offer. This step brings our goal of making Sevensteps the knowledge hub of choice for Microsoft Surface a whole lot closer.
    About Dennis Vroegop
    Dennis is a Microsoft MVP for Microsoft Surface and chairman of the Dutch dotNed user group. He has a long history of promoting Microsoft Surface in the developer community. Dennis is a regular speaker at local and international conferences and a frequent writer of articles, including but not limited to Microsoft Surface. Dennis has a bachelor’s degree in computer science and has spent all of his professional life writing software for the Microsoft platform.
    About Sevensteps
    For more information about Sevensteps and Bart Roozendaal, please visit http://www.sevensteps.com
    Tags: surface,wp7,windows touch

    Read the article

  • Visual Studio ALM MVP of the Year 2011

    - by Martin Hinshelwood
    For some reason this year some of my peers decided to vote for me as a contender for Visual Studio ALM MVP of the year. I am not sure what I did to deserve this, but a number of people have commented that I have a rather useful blog. I feel wholly unworthy to join the ranks of previous winners: Ed Blankenship (2010) and Martin Woodward (2009). Thank you to everyone who voted, regardless of who you voted for. If there were a prize for the best group of MVPs, then the Visual Studio ALM MVPs would be clear winners, as would the product group of product groups that is the Visual Studio ALM Group. To use a phrase that I have learned since moving to Seattle and probably use too much: you guys are all just awesome.
    I have tried my best in the last year to document not only every problem that I have had with Team Foundation Server (TFS), but also as many of the things I am doing as possible. I have taken some of Adam Cogan’s rules to heart, and when a customer asks me a question I always blog the answer and send them a link. This allows both my blog and my understanding of TFS to grow while creating a useful bank of content. The idea is that if one customer asks, all benefit. I try, when writing for my blog, to capture both the essence and the context of a problem being solved. This allows more people to benefit, as they do not need to understand the specifics of an environment to gain value.
    I have a number of goals for this year that I think will help increase value in the community:
    persuade my new colleagues at Northwest Cadence to do more blogging (Steve, Jeff, Shad and Rennie)
    Rangers Project – TFS Iteration Automation with Willy-Peter Schaub, Bill Essary, Martin Hinshelwood, Mike Fourie, Jeff Bramwell and Brian Blackman
    write a book on the Team Foundation Server API with Willy-Peter Schaub, Mike Fourie and Jeff Bramwell
    write more useful blog posts
    I do not think that these things are beyond the realms of do-ability, but we will see…

    Read the article

  • Meet SQLBI at PASS Summit 2012 #sqlpass

    - by Marco Russo (SQLBI)
    Next week Alberto Ferrari and I will be in Seattle at PASS Summit 2012. You can meet us at our sessions, at a book signing, and hopefully while watching some other sessions during the conference. Here are our appointments:
    Thursday, November 08, 2012, 10:15 AM - 11:45 AM – Alberto Ferrari – Room 606-607
    Querying and Optimizing DAX (BIA-321-S)
    Do you want to learn how to write DAX queries and how to optimize them? Don’t miss this session!
    Thursday, November 08, 2012, 12:00 PM - 12:30 PM – Bookstore
    Book signing event at the Bookstore corner with Alberto Ferrari, Marco Russo and Chris Webb
    Visit the bookstore and sign your copy of our Microsoft SQL Server 2012 Analysis Services: The BISM Tabular Model book.
    Thursday, November 08, 2012, 1:30 PM - 2:45 PM – Marco Russo – Room 611
    Near Real-Time Analytics with xVelocity (without DirectQuery) (BIA-312)
    What’s the latency you can tolerate for your data? Discover what the limit is in Tabular without using DirectQuery and learn how to optimize your data model and your queries for a near real-time analytical system. Not a trivial task, but more affordable than you might think.
    Friday, November 09, 2012, 9:45 AM - 11:00 AM
    Parent-Child Hierarchies in Tabular (BIA-301)
    Multidimensional has more advanced support for hierarchies than Tabular, but in reality you can do almost the same things by using data modeling, DAX functions and BIDS Helper!
    Friday, November 09, 2012, 1:00 PM - 2:15 PM – Marco Russo – Room 612
    Inside DAX Query Plans (BIA-403)
    Discover the query plan for your DAX query and learn how to read it and how to optimize a DAX query by using this information.
    If you meet us at the conference, stop us and say hello: it’s always nice to know our readers!

    Read the article

  • Java Road Trip: Code to Coast

    - by Tori Wieldt
    The Java Road Trip: Code to Coast
    Java developers, architects, programmers, and enthusiasts: get ready for a real adrenaline rush! Follow the Java Road Trip: Code to Coast as this high-tech block party on wheels travels to 20 cities across the United States showcasing Oracle's commitment to everything Java. It's a chance to talk to Java leaders and engineers and get your hands on the latest Java technology. The Java Road Trip kicks off June 14 in New York City with Octavian Tanase, Vice President, Java Platform Group at Oracle, headlining the event. Don't miss:
    EJBs in Boston!
    Governance in Washington, DC!
    Swing(ing) in Memphis!
    Mile-high UIs in Denver!
    Java in Seattle! (too easy)
    and more!
    Join or follow the tour here: http://java.com/roadtrip/
    Read the Oracle Magazine article
    Use or follow the hash tag #javaroadtrip

    Read the article

  • SQL Cruise Alaska 2011

    - by Grant Fritchey
    I had the extreme good fortune to get sent on the last SQL Cruise to Alaska. I love my job. In case you don't know what this is, SQL Cruise is a trip on a cruise ship during which you get to attend classes while on the boat, learning all about SQL Server and related topics, as well as network with the instructors and the other Cruisers. Frankly, it's amazing. Classes ran from Monday, 5/30, to Saturday, 6/4. The networking was constant: between classes, at night on the cruise ship, out on excursions in Alaskan rainforests, and while snorkeling in ocean waters. Here's a rundown of the experience from my point of view.
    Because I couldn't travel out 2 days early, I missed the BBQ that occurred the day before the cruise when many of the Cruisers received their swag bags. Some of that swag came from Red Gate. I researched what was useful on a cruise like this and purchased small flashlights and binoculars for all the Cruisers. The flashlights were because, depending on your cabin, ships can be very dark. The binoculars were so that the cruisers could watch all the beautiful landscape as it flowed by. I would have liked to have been there when the bags were opened, but I heard from several people that they appreciated the gifts.
    Cruisers "In" the hot tub. Pictured: Marjory Woody, Michele Grondin, Kyle Brandt, Grant Fritchey, John Halunen
    Sunday I went to board the ship with my wife. We had a bit of an adventure because I messed up our documents. It all worked out and we got on board to meet up at the back of the boat at one of the outdoor bars with the other Cruisers, thanks to tweets letting everyone know where to go. That was the end of electronic coordination on the trip (connectivity in Alaska was horrible for everyone except AT&T). The Cruisers were a great bunch of people and it was a real honor to meet them and get to spend time with them. After everyone settled into their cabins, our very first activity was a contest, sponsored by Red Gate. The Cruisers, in an effort to get to know each other and the ship, were required to go all over taking various photographs, some of them hilarious. The winning team of three would all win prizes. Some of the significant others helped out and I tagged along with a team that tied for first but lost the coin toss. The winning team consisted of Christina Leo (blog|twitter), Ryan Malcolm (twitter), and Neil Hambly (blog|twitter). They then had to do math and identify the cabin with the lowest prime number, oh, and get a picture of it and be the first to get back up to the bar where we were waiting. Christina came in first and very happily carried home an iPad 2. Ryan won a 1TB portable hard drive and Neil won a wireless mouse (picture below; note my special SQL Server Central Friday Shirt. Thanks Steve (blog|twitter)).
    Winners: Christina Leo, Neil Hambly, Ryan Malcolm. Just Lucky: Grant Fritchey
    Monday morning classes started. Buck Woody (blog|twitter) was a special guest speaker on this cruise. His theme was "Three C's on the High Seas: Career, Communication and Cloud." The first session was all on Career. I'm not going to type out all my notes from the session, but let's just say, if you get the chance to hear Buck talk about how to manage your career, I suggest you attend. I have a ton of blog posts that I'll be putting together over the next several months (yes, months), both here and over on ScaryDBA. I also have a bunch of work I'm going to be doing to get my career performance bumped up a notch or two (and let's face it, that won't be easy).
    Later on Monday, Tim Ford (blog|twitter) did a session on DMOs. Specifically, the session was on Tim's Periodic Table of DMOs that he has put together, and how to use some of the more interesting DMOs in your day-to-day job. It was a great session, packed with good information. Next, Brent Ozar (blog|twitter) did a session on how to monitor and guide SAN configuration for the DBA that doesn't have access to the SAN. That was some seriously useful information.
    Tuesday morning we only had a single class. Kendra Little (blog|twitter) taught us all about "No Lock for Yes Fun". It was all about the different transaction isolation levels and how they work. There is so often confusion in this area and Kendra does a great job in clarifying the information. Also, she tosses in her excellent drawings to liven up the presentation. Then it was excursion time in Juneau. My wife and I, along with several other Cruisers, took a hike up around the Mendenhall Glacier. It was absolutely beautiful weather and walking through the Alaskan rain forest was a treat. Our guide, Jason, was a great guy and it was a good day of hiking.
    Wednesday was an all-day excursion in Skagway. My wife and I took the "Ghost and Good Time Girls" walking tour that ended up at a bar that used to be a brothel, the Red Onion. It was a great history of the town. We went back out and hit a few museums and exhibits. We also hiked up the side of the mountain to see Dewey Lake and some great views of the town. Finally we hiked out to the far side of town to see the Gold Rush cemetery. Hiking done, we went back to the boat and had a quiet dinner on our own.
    Thursday we cruised through Glacier Bay and saw at least four different glaciers, including sitting next to the Marjory Glacier for about an hour. It was amazing. Then it got better. We went into class with Buck again, this time to talk about Communication. Again, I've got pages of notes that I'm going to be referring back to for some time to come. This was an excellent opportunity to learn.
    Snorkelers: Nicole Bertrand, Aaron Bertrand, Grant Fritchey, Neil Hambly, Christina Leo, John Robel, Yanni Robel, Tim Ford
    Friday we pulled into Ketchikan. A bunch of us went snorkeling. Yes, snorkeling. Yes, in Alaska. Yes, snorkeling in the ocean in Alaska. It was fantastic. They had us put on 7mm thick wet suits (an adventure all by itself) so it was basically warm the entire time we were in the water (except for the occasional squirt of cold water down my back). Before we got in the water a bald eagle flew up and landed about 15 feet in front of us, which was just an incredible event. Then our guide pointed out about 14 other eagles in the area, hanging out in the trees. Wow! The water was pretty clear and there was a ton of things to see. That was absolutely a blast. Back on the boat I presented a session called Execution Plans: The Deep Dive (note the nautical theme). It seemed to go over well and I had several good questions come out of the session that will lead to new blog posts. After I presented, it was Aaron Bertrand's (blog|twitter) turn. He did a session on "What's New in Denali" that provided a lot of great information. He was able to incorporate new things straight out of Tech-Ed, so this was expanded beyond his usual presentation. The man really knows what he's talking about and communicates it well.
    Saturday we were travelling, so there was time for a bunch of classes. Jeremiah Peschka (blog|twitter) did a great overview of some of the NoSQL databases and what they should be used for. The session was called "The Database is Dead" but it was really about how there are specific uses for these databases that SQL Server doesn't fill, but also that these databases can't replace SQL Server in other areas. Again, good material. Brent Ozar presented again with a session on Defensive Indexing. It was an overview of how indexes work and a deep dive into how to apply them appropriately in your databases to better support access. A good session, as you would expect. Then we pulled into Victoria, BC, in Canada and had a nice dinner with several of the Cruisers, including Denny Cherry (blog|twitter). After that it was back to Seattle on Sunday. By the way, the Science Fiction Museum in Seattle isn't a Science Fiction Museum any more. I was very disappointed to discover this.
    Overall, it was a great experience. I'm extremely appreciative of Red Gate for sending me, and of Tim, Brent, Kendra and Jeremiah for having me. The other Cruisers were all amazing people and it was an honor & privilege to meet them and spend time with them. While this was a seriously fun time, it was also a very serious training opportunity with solid information coming from seasoned industry pros.

    Read the article

  • Craftsmanship Tour Day 1: Didit Long Island

    - by Liam McLennan
    On Monday I was at Didit for my first ever craftsmanship visit. Didit seems to occupy a good part of a nondescript building in Rockville Centre, Long Island. Since I had arrived early from Seattle I had some time to kill, so I stopped at the Rockville Diner on the corner of N Park Ave and Sunrise Hwy. I thoroughly enjoyed the pancakes and the friendly service. After walking to the Didit office I met Rik Dryfoos, the Didit Engineering Manager who organised my visit, and got an introduction to Didit and the work they are doing. I spent the morning in the room shared by the Didit developers, who are working on some fascinating deep engineering problems. After lunch at a local Thai place I set up a webcam to record an interview with Rik and Matt Roman (Didit VP of Engineering). I had a lot of trouble with the webcam, including losing several minutes of conversation, but in the end I was very happy with the result. Here are the full interviews with Rik and Matt:
    Interview with Rik Dryfoos
    Interview with Matt Roman
    We had a great chat, much of which is captured in the recording. It was such a great conversation that I almost missed my train to Manhattan. I'm sure Didit will continue to do well with such a dedicated and enthusiastic team. I sincerely thank them for hosting me for the day. If you are looking for a true agile environment and the opportunity to work with a high-quality team, then you should talk to Didit.

    Read the article

  • Declarative View Objects (VOs) for better ADF performance

    - by Shay Shmeltzer
    Just got back from ODTUG's Kscope13 conference, which had a lot of good deep ADF content. In one of my sessions I ran out of time to do one of my demos, so I wanted to share it here instead. This is a demo of how declarative View Objects can increase your application's performance. For those who are not familiar with declarative VOs, these are VOs that don't actually specify a hard-coded query. Instead, ADF creates their query at runtime, and it does so based on the data that is requested in your UI layer. This can be a huge saver of both DB resources and network resources. More in the documentation. Here is a quick example that shows you how using such a VO can automatically switch to a simpler SQL statement instead of a complex join when needed. (Note: while I demo with 11.1.2.*, the feature is also there in 11.1.1.* versions.) The demo also shows you how you can monitor the SQL that ADF BC issues to the database using the WebLogic logging feature in JDeveloper. As a side note, I would have loved to see more ADF developers attending Kscope. This demo was part of the "ADF intro" track at Kscope; in the advanced ADF track you would have been treated to a full tuning session about ADF with lots of other tips. Consider attending Kscope next year - it is going to be in Seattle this time.

    Read the article

  • Are you at Super Computing 10?

    - by Daniel Moth
    Like last year, I was going to attend SC this year, but other events are unfortunately keeping me here in Seattle next week. If you are going to be in New Orleans, have fun and be sure not to miss out on the following two opportunities. MPI Debugging UX Study Throughout the week, my team is conducting 90-minute studies on debugging MPI applications within Visual Studio. In exchange for your feedback (under NDA) you will receive a Microsoft Gratuity (and the knowledge that you are impacting the development of Visual Studio). If you are interested, sign up at the Microsoft Information Desk in the Exhibitor Hall during exhibit hours. Outside of exhibit hours, send email to [email protected]. If you took part in the GPGPU study, this is very similar except it is for MPI. Microsoft High Performance Computing Summit On Monday 15th, the Microsoft annual user group meeting takes place. Shuttle transportation and lunch is provided. For full details of this event and to register, please visit the official event page. Comments about this post welcome at the original blog.

    Read the article

  • PASS Summit 2011 – Part I

    - by Tara Kizer
    What an amazing week I had at PASS Summit 2011 in Seattle, WA!  I hadn’t attended a PASS conference since September of 2005, when it was in Grapevine, Texas.  It has grown so much since then.  I am not sure how many people attended back then, but I’d guesstimate about 1500.  They announced that at this year’s conference there were 4000 attendees.  WOW!
    Here are my favorite aspects of this conference:
    Networking! – Not only did I meet a lot of new people, but I also got to meet people in person that I’ve known on the Internet for years, like Mladen Prajdic (blog|twitter) and Rob Volk (blog|twitter).  I even met someone that I’d recently helped out in the SQLTeam forums.
    Learning – I took a lot of notes during the sessions I attended and plan on blogging very soon about them.  It is amazing the amount of things you learn and the things that you unlearn.  Yes, I said unlearn.  Some of the stuff that I thought I knew was either outdated or just plain wrong.
    Fun, fun, fun – To say that this conference was fun would be an understatement.  I had a blast!  I attended the “Welcome Reception and Quizbowl” on Tuesday night, the “Exhibitor Reception” on Wednesday night, and the “Community Appreciation Party” at GameWorks on Thursday night.  There were many other after-hours events to attend, but I had to make my kids a priority at night, so I had to get back to my hotel room before 9pm so that I could Skype with them.
    It was very entertaining reading and posting with #sqlpass on Twitter.  Twitter has changed the conference experience for the better.  I will definitely be able to do my job better due to attending this conference.  The return on investment is HUGE!

    Read the article

  • Got that Friday feeling?

    - by Rebecca Amos
    Saturday is just around the corner, and we’re all starting to wrap up for the weekend. If you’re the DBA that ‘Friday feeling’ might be as much about checking and preparing your SQL Servers for the next two days, as about looking forward to spending time with friends and family. Whether you’re double-checking your disaster recovery strategy, or know that it’s your turn to be on-call this weekend, it’s likely you’re preparing for the worst, just in case. The fact that you’re making these checks, and caring about both your servers and your users, means that you might be an exceptional DBA. You’re already putting in that extra effort to make other people’s lives easier. So why not take some time for your professional development and enter the Exceptional DBA Awards? If you’re looking for some inspiration for your entry, download our Judges’ Top Tips poster for advice on what the judges are looking for from this year’s entrants. Not only will you be boosting your professional development, but you could win full conference registration for the 2011 PASS Summit in Seattle (where the awards ceremony will take place), four nights' hotel accommodation, and a copy of Red Gate’s SQL DBA Bundle. So take some time out for yourself this weekend and get started on your entry: www.exceptionaldba.com

    Read the article

  • Almost time to hit the road again

    - by Chris Williams
    I’ve had a few months of not much traveling, but now that the weather is improving… conference season is starting up again. That means it’s time for me to start hitting the road.
    In June, I have Tech Ed 2010 in New Orleans, LA. I lived in New Orleans for several years, both as military and civilian, and I have a few friends still down there. I haven’t been there since before Hurricane Katrina, so I have mixed feelings about returning… but I am still looking forward to it.
    Also in June, I have Codestock in Knoxville, TN. Codestock is one of my favorite events, primarily because of the excellent people that speak there and also attend sessions. It’s a great mix of people and technologies.
    Sometime in July or August, I’m headed to Austin, TX for a couple of days. I don’t know the exact date yet, but if you have an event down there in that timeframe, let me know and maybe we can sort something out.
    In September, I’m heading to Seattle for my first PAX (Penny Arcade Expo). I’m going strictly as an attendee and it looks like a LOT of fun. Really excited to check it out.
    Also in September, I’m headed to Omaha for the Heartland Developers Conference. This is a FANTASTIC event, and certainly one of my local favorites. (I guess local is relative; it’s about a 6 hour drive.) In addition to speaking on WP7, I’ll be doing a series of hands-on labs on XNA the day before the conference starts, so that should be a lot of fun as well.
    In addition to all this stuff, I have my own XNA User Group to take care of. In August, Andy “The Z-Man” Dunn is coming to speak and check out the various food-on-a-stick offerings at the Minnesota State Fair!

    Read the article

  • LASTDATE dates arguments and upcoming events #dax #tabular #powerpivot

    - by Marco Russo (SQLBI)
    Recently I had to write a DAX formula containing a LASTDATE within the logical condition of a FILTER. I found that its behavior was not what I expected, so I investigated further. In the end, I wrote up my findings in this article on SQLBI, which can be applied to any Time Intelligence function with a <dates> argument.
    The key point is that when you write
      LASTDATE( table[column] )
    in reality you obtain something like
      LASTDATE( CALCULATETABLE( VALUES( table[column] ) ) )
    which converts an existing row context into a filter context.
    Thus, if you have something like
      FILTER( table, table[column] = LASTDATE( table[column] ) )
    the FILTER will return all the rows of table, whereas you probably want to use
      FILTER( table, table[column] = LASTDATE( VALUES( table[column] ) ) )
    so that the existing filter context before executing FILTER is used to get the result from VALUES( table[column] ), avoiding the automatic expansion that would include a CALCULATETABLE that would hide the existing filter context.
    If after reading the article you want more insights, read Jeffrey Wang's post here.
    These days I'm speaking at SQLRally Nordic 2012 in Copenhagen, and I will be in Cologne (Germany) next week for a SSAS Tabular Workshop, whereas Alberto will teach the same workshop in Amsterdam one week later. Both workshops still have seats available, and the Amsterdam one is still in early-bird discount until October 3rd!
    Then, in November I expect to meet many blog readers at PASS Summit 2012 in Seattle, and I hope to find the time to write other articles on interesting things about Tabular and PowerPivot. Stay tuned!

    Read the article

  • International Pricing of Software [closed]

    - by arachnode.net
    I operate a small company that charges $99 for a piece of software. I'd like to know what would be a fair price for non-US customers. Today I sold a license to a party in South Africa. He told me he had been watching the project for two years before a business justification could be made for the purchase, as SA's currency is nine times weaker than the US dollar. I found this resource detailing how much a Big Mac costs in various countries: http://howmuchatyourplace.com/how_much_does/Big%20Mac_cost.php. I realize that the cost of producing a Big Mac varies from locale to locale, as does the demand for one. I am aware that many software companies charge prices in local currencies that equate to the price in US dollars. I am aware that my costs remain fixed, and obviously I cannot discount the rate at which my time costs me. I'm OK with earning less per sale, as I would rather get my software onto the desktops of those that need it than have them try to write it themselves. Support is light and I can usually point a user to an existing blog or forum post. Being a resident of Hawaii, I am aware that certain goods and services cost more here. Power is up to six times as much per kWh as it is in, say, Seattle, and wages are approximately 60% of what they are for my profession (programmer). I'd like to offer my software at a price that would be fair for everyone around the globe. If a currency is 2 foreign units to 1 US dollar, and goods and services cost 50% more, and pay for an equivalent job is 50% of what it is here, should I charge, say, $50 instead of $99? Is there a resource which would allow me to input a price in US dollars and adjust it for a list of international locations?
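    The $50-instead-of-$99 arithmetic can be framed as scaling the US price by a relative purchasing-power factor. A minimal sketch of that calculation in Python (the factors are made-up illustrations, not real economic data):

      US_PRICE = 99.0

      # Assumed ratios of local to US purchasing power, e.g. derived from
      # comparative wages or Big Mac prices; illustrative numbers only.
      PURCHASING_POWER = {
          "US": 1.00,
          "ZA": 0.50,   # assumption: equivalent pay ~50% of US
      }

      def local_price(country):
          return round(US_PRICE * PURCHASING_POWER[country])

      print(local_price("ZA"))   # -> 50, roughly the $50-instead-of-$99 case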

    Read the article

  • SEO: Getting site to show in location-specific searches

    - by willvv
    I'm really new to this SEO world and I've been reading a lot to try and figure it out. We have a site, moodbond.com, that allows users to browse/create events anywhere, and we fill it with content from the main cities in the US. We would like it to show up for searches like "events in san francisco" or "what to do in new york"; however, since the site is not really location-specific, I'm not really sure where to begin. I've been thinking about a couple of things; maybe you can help me decide if these would be a good way to start or if I should try something different (a sketch of the first two ideas follows below):
    1. Allow location-specific URLs (e.g. moodbond.com/browse/san-francisco) that just show the main page centered on San Francisco.
    2. Change the headers/title of the page so they adapt automatically to the city being browsed (and change dynamically as the user changes the location of the map).
    3. Add internal links to different locations (e.g. add a link in the footer that says "Events in Seattle" that makes the site load events in that city; this would probably depend on implementing #1).
    What do you guys think? Will any of these really help, or should I look for a different approach? Any advice is welcome. Thanks
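    For ideas 1 and 2 together, the core mechanism is giving each city its own crawlable URL with a city-specific title tag, since search engines index distinct URLs with distinct titles far better than one dynamic map page. A minimal sketch in Python (hypothetical Flask-style routing; the route names and markup are assumptions, not moodbond.com's actual stack):

      from flask import Flask

      app = Flask(__name__)

      # Hypothetical route: each city gets its own crawlable URL
      # (e.g. /browse/san-francisco) with a city-specific <title>.
      @app.route("/browse/<city_slug>")
      def browse_city(city_slug):
          city = city_slug.replace("-", " ").title()   # "san-francisco" -> "San Francisco"
          return ("<html><head><title>Events in %s | moodbond</title></head>"
                  "<body><h1>Events in %s</h1><!-- map centered on %s --></body></html>"
                  % (city, city, city))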

    Read the article

  • Does my API design violate RESTful principles?

    - by peta
    Hello everybody, I'm currently designing (or at least trying to design) a RESTful API for a social network, but I'm not sure whether my current approach still accords with RESTful principles. I'd be glad if some brighter heads could give me some tips. Suppose the following URI represents the name field of a user account:
    people/{UserID}/profile/fields/name
    But there are almost a hundred possible fields. So I want the client to create its own field views or use predefined ones. Let's suppose that the following URI represents a predefined field view that includes the fields "name", "age", "gender":
    utils/views/field-views/myFieldView
    And because field views are a kind of higher-level logic, I don't want to mix support for field views into the "people/{UserID}/profile/fields" resource. Instead I want to do the following:
    utils/views/field-views/myFieldView/{UserID}
    Though Leonard Richardson & Sam Ruby state in their book "RESTful Web Services" that a RESTful design is somehow like an "extreme object oriented" approach, I think that my approach is object-oriented and therefore accords with RESTful principles. Or am I wrong? If not: are such "object oriented" approaches generally encouraged when used with care, in order to avoid query-based REST-RPC hybrids?
    Thanks for your feedback in advance, peta
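    For comparison, a widely used alternative to separate view resources is letting the client name the fields it wants with a query parameter (e.g. people/{UserID}/profile?fields=name,age,gender), which keeps the profile itself as the only resource. A minimal sketch in Python (hypothetical Flask handler and data, not a statement about how the API above must work):

      from flask import Flask, request, jsonify

      app = Flask(__name__)

      # Hypothetical in-memory profile store.
      PROFILES = {"42": {"name": "Peta", "age": 30, "gender": "n/a", "city": "Seattle"}}

      @app.route("/people/<user_id>/profile")
      def profile(user_id):
          data = PROFILES[user_id]
          fields = request.args.get("fields")        # e.g. ?fields=name,age,gender
          if fields:
              wanted = set(fields.split(","))
              data = {k: v for k, v in data.items() if k in wanted}
          return jsonify(data)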

    Read the article

  • C program - Seg fault, cause of

    - by resonant_fractal
    Running this gives me a seg fault (gcc filename.c -lm) when I enter 6 (int) as a value. Please help me get my head around this. The intended functionality has not yet been implemented, but I need to know why I'm heading into seg faults already. Thanks!

      #include <stdio.h>
      #include <math.h>

      int main(void)
      {
          int l = 5;
          int n, i, tmp, index;
          char *s[] = {"Sheldon", "Leonard", "Penny", "Raj", "Howard"};

          scanf("%d", &n);

          /* Solve Sigma(Ai*2^(i-1)) = (n - k)/l */
          if (n/l <= 1)
              printf("%s\n", s[n-1]);   /* for n = 6..9, integer division gives n/l == 1,
                                           so s[n-1] reads past the end of the 5-element array */
          else {
              tmp = n;
              for (i = 1;;) {
                  tmp = tmp - (l * pow(2, i-1));
                  if (tmp <= 5) {
                      /* printf("Breaking\n"); */
                      break;
                  }
                  ++i;
              }
              printf("Last index = %d\n", i);
              /* ***NOTE*** value lies in next array, therefore */
              ++i;
              index = tmp + pow(2, n-1);
              printf("%d\n", index);
          }
          return 0;
      }

    Read the article

  • R Package Installation with Oracle R Enterprise

    - by Sherry LaMonica-Oracle
    Programming languages give developers the opportunity to write reusable functions and to bundle those functions into logical deployable entities. In R, these are called packages. R has thousands of such packages provided by an almost equally large group of third-party contributors. To allow others to benefit from these packages, users can share packages on the CRAN system for use by the vast R development community worldwide. R's package system along with the CRAN framework provides a process for authoring, documenting and distributing packages to millions of users. In this post, we'll illustrate the various ways in which such R packages can be installed for use with R and together with Oracle R Enterprise. In the following, the same instructions apply when using either open source R or Oracle R Distribution. In this post, we cover the following package installation scenarios:
    R command line
    Linux shell command line
    Use with Oracle R Enterprise
    Installation on Exadata or RAC
    Installing all packages in a CRAN Task View
    Troubleshooting common errors

    1. R Package Installation Basics
    R package installation basics are outlined in Chapter 6 of the R Installation and Administration Guide. There are two ways to install packages from the command line: from the R command line and from the shell command line. For this first example on Oracle Linux using Oracle R Distribution, we'll install the arules package as root so that packages will be installed in the default R system-wide location where all users can access it, /usr/lib64/R/library.
    Within R, using the install.packages function always attempts to install the latest version of the requested package available on CRAN:
      R> install.packages("arules")
    If the arules package depends upon other packages that are not already installed locally, the R installer automatically downloads and installs those required packages. This is a huge benefit that frees users from the task of identifying and resolving those dependencies.
    You can also install R packages from the shell command line. This is useful for some packages when an internet connection is not available or for installing packages not uploaded to CRAN. To install packages this way, first locate the package on CRAN and then download the package source to your local machine. For example:
      $ wget http://cran.r-project.org/src/contrib/arules_1.1-2.tar.gz
    Then, install the package using the command R CMD INSTALL:
      $ R CMD INSTALL arules_1.1-2.tar.gz
    A major difference between installing R packages using the R package installer at the R command line and the shell command line is that package dependencies must be resolved manually at the shell command line. Package dependencies are listed in the Depends section of the package's CRAN site. If dependencies are not identified and installed prior to the package's installation, you will see an error similar to:
      ERROR: dependency 'xxx' is not available for package 'yyy'
    As a best practice and to save time, always refer to the package's CRAN site to understand the package dependencies prior to attempting an installation.
    If you don't run R as root, you won't have permission to write packages into the default system-wide location and you will be prompted to create a personal library accessible by your userid. You can accept the personal library path chosen by R, or specify the library location by passing parameters to the install.packages function. For example, to create an R package repository in your home directory:
      R> install.packages("arules", lib="/home/username/Rpackages")
    or
      $ R CMD INSTALL arules_1.1-2.tar.gz --library=/home/username/Rpackages
    Refer to the install.packages help file in R or execute R CMD INSTALL --help at the shell command line for a full list of command line options.
    To set the library location and avoid having to specify this at every package install, simply create the R startup environment file .Renviron in your home area if it does not already exist, and add the following line to it:
      R_LIBS_USER = "/home/username/Rpackages"

    2. Setting the Repository
    Each time you install an R package from the R command line, you are asked which CRAN mirror, or server, R should use. To set the repository and avoid having to specify this during every package installation, create the R startup command file .Rprofile in your home directory and add the following R code to it:
      cat("Setting Seattle repository")
      r = getOption("repos")
      r["CRAN"] = "http://cran.fhcrc.org/"
      options(repos = r)
      rm(r)
    This code snippet sets the R package repository to the Seattle CRAN mirror at the start of each R session.

    3. Installing R Packages for use with Oracle R Enterprise
    Embedded R execution with Oracle R Enterprise allows the use of CRAN or other third-party R packages in user-defined R functions executed on the Oracle Database server. The steps for installing and configuring packages for use with Oracle R Enterprise are the same as for open source R. The database-side R engine just needs to know where to find the R packages.
    The Oracle R Enterprise installation is performed by user oracle, which typically does not have write permission to the default site-wide library, /usr/lib64/R/library. On Linux and UNIX platforms, the Oracle R Enterprise Server installation provides the ORE script, which is executed from the operating system shell to install R packages and to start R. The ORE script is a wrapper for the default R script, a shell wrapper for the R executable. It can be used to start R, run batch scripts, and build or install R packages. Unlike the default R script, the ORE script installs packages to a location writable by user oracle and accessible by all ORE users - $ORACLE_HOME/R/library.
    To install a package on the database server so that it can be used by any R user and for use in embedded R execution, an Oracle DBA would typically download the package source from CRAN using wget. If the package depends on any packages that are not in the R distribution in use, download the sources for those packages, also. For a single Oracle Database instance, replace the R script with ORE to install the packages in the same location as the Oracle R Enterprise packages.
      $ wget http://cran.r-project.org/src/contrib/arules_1.1-2.tar.gz
      $ ORE CMD INSTALL arules_1.1-2.tar.gz
    Behind the scenes, the ORE script performs the equivalent of setting R_LIBS_USER to the value of $ORACLE_HOME/R/library, and all R packages installed with the ORE script are installed to this location.
    For installing a package on multiple database servers, such as those in an Oracle Real Application Clusters (Oracle RAC) or a multinode Oracle Exadata Database Machine environment, use the ORE script in conjunction with the Exadata Distributed Command Line Interface (DCLI) utility.
      $ dcli -g nodes -l oracle ORE CMD INSTALL arules_1.1-1.tar.gz
    The DCLI -g flag designates a file containing a list of nodes to install on, and the -l flag specifies the user id to use when executing the commands. For more information on using DCLI with Oracle R Enterprise, see Chapter 5 in the Oracle R Enterprise Installation Guide.
    If you are using an Oracle R Enterprise client, install the package the same as any R package, bearing in mind that you must install the same version of the package on both the client and server machines to avoid incompatibilities.

    4. CRAN Task Views
    CRAN also maintains a set of Task Views that identify packages associated with a particular task or methodology. Task Views are helpful in guiding users through the huge set of available R packages. They are actively maintained by volunteers who include detailed annotations for routines and packages. If you find one of the task views is a perfect match, you can install every package in that view using the ctv package - an R package for automating package installation.
    To use the ctv package to install a task view, first install and load the ctv package.
      R> install.packages("ctv")
      R> library(ctv)
    Then query the names of the available task views and install the view you choose.
      R> available.views()
      R> install.views("TimeSeries")

    5. Using and Managing R packages
    To use a package, start up R and load packages one at a time with the library command. Load the arules package in your R session:
      R> library(arules)
    Verify the version of arules installed:
      R> packageVersion("arules")
      [1] '1.1.2'
    Verify the version of arules installed on the database server using embedded R execution:
      R> ore.doEval(function() packageVersion("arules"))
    View the help file for the apropos function in the arules package:
      R> ?apropos
    Over time, your package repository will contain more and more packages, especially if you are using the system-wide repository where others are adding additional packages. It's good to know the entire set of R packages accessible in your environment. To list all available packages in your local R session, use the installed.packages command:
      R> myLocalPackages <- row.names(installed.packages())
      R> myLocalPackages
    To access the list of available packages on the ORE database server from the ORE client, use the following embedded R syntax:
      R> myServerPackages <- ore.doEval(function() row.names(installed.packages()))
      R> myServerPackages

    6. Troubleshooting Common Problems
    Installing Older Versions of R packages
    If you immediately upgrade to the latest version of R, you will have no problem installing the most recent versions of R packages. However, if your version of R is older, some of the more recent package releases will not work and install.packages will generate a message such as:
      Warning message: In install.packages("arules") : package 'arules' is not available
    This is when you have to go to the Old sources link on the CRAN page for the arules package and determine which version is compatible with your version of R. Begin by determining what version of R you are using:
      $ R --version
      Oracle Distribution of R version 3.0.1 (--) -- "Good Sport"
      Copyright (C) The R Foundation for Statistical Computing
      Platform: x86_64-unknown-linux-gnu (64-bit)
    Given that R-3.0.1 was released May 16, 2013, any version of the arules package released after this date may work. Scanning the arules archive, we might try installing version 1.1-1, released in January of 2014:
      $ wget http://cran.r-project.org/src/contrib/Archive/arules/arules_1.1-1.tar.gz
      $ R CMD INSTALL arules_1.1-1.tar.gz
    For use with ORE:
      $ ORE CMD INSTALL arules_1.1-1.tar.gz
    The "package not available" error can also be thrown if the package you're trying to install lives elsewhere - either on another R package site, or it's been removed from CRAN. A quick Google search usually leads to more information on the package's location and status.

    Oracle R Enterprise is not in the R library path
    On Linux hosts, after installing the ORE server components, starting R, and attempting to load the ORE packages, you may receive the error:
      R> library(ORE)
      Error in library(ORE) : there is no package called 'ORE'
    If you know the ORE packages have been installed and you receive this error, this is the result of not starting R with the ORE script. To resolve this problem, exit R and restart using the ORE script. After restarting R and running the command to load the ORE packages, you should not receive any errors.
      $ ORE
      R> library(ORE)
    On Windows servers, the solution is to make the location of the ORE packages visible to R by adding them to the R library paths. To accomplish this, exit R, then add the following line to the .Rprofile file. On Windows, the .Rprofile file is located in the R\etc directory, C:\Program Files\R\R-<version>\etc:
      .libPaths("<path to $ORACLE_HOME>/R/library")
    The above line tells R to include the R directory in the Oracle home as part of its search path. When you start R, the path above will be included, and future R package installations will also be saved to $ORACLE_HOME/R/library. This path should be writable by the user oracle, or the userid for the DBA tasked with installing R packages.

    Binary package compiled with different version of R
    By default, R will install pre-compiled versions of packages if they are found. If the version of R under which the package was compiled does not match your installed version of R, you will get an error message:
      Warning message: package 'xxx' was built under R version 3.0.0
    The solution is to download the package source and build it for your version of R.
      $ wget http://cran.r-project.org/src/contrib/Archive/arules/arules_1.1-1.tar.gz
      $ R CMD INSTALL arules_1.1-1.tar.gz
    For use with ORE:
      $ ORE CMD INSTALL arules_1.1-1.tar.gz

    Unable to execute files in /tmp directory
    By default, R uses the /tmp directory to install packages. On security-conscious machines, the /tmp directory is often marked as "noexec" in the /etc/fstab file. This means that no file under /tmp can ever be executed, and users who attempt to install R packages will receive an error:
      ERROR: 'configure' exists but is not executable -- see the 'R Installation and Administration Manual'
    The solution is to set the TMP and TMPDIR environment variables to a location which R will use as the compilation directory. For example:
      $ mkdir <some path>/tmp
      $ export TMPDIR=<some path>/tmp
      $ export TMP=<some path>/tmp
    This error typically appears on Linux client machines and not database servers, as Oracle Database writes to the value of the TMP environment variable for several tasks, including holding temporary files during database installation.

    7. Creating your own R package
    Creating your own package and submitting it to CRAN is for advanced users, but it is not difficult. The procedure to follow, along with details of R's package system, is detailed in the Writing R Extensions manual.

    Read the article

  • php - sort the unsorted text file and rewrite to same text file in sorted order

    - by arrgggg
    Hi, I have a question. I am in the process of learning how to read/write files, but I'm having a little trouble trying to do both at the same time in the same PHP script. I have a text file with words like this:
      Richmond,Virginia
      Seattle,Washington
      Los Angeles,California
      Dallas,Texas
      Jacksonville,Florida
    I wrote code to sort them in order, and this will display them in sorted order by city:
      <?php
      $file = file("states.txt");
      sort($file);
      for ($i = 0; $i < count($file); $i++) {
          $states = explode(",", $file[$i]);
          echo $states[0], $states[1], "<br />";
      }
      ?>
    From this, how can I rewrite the sorted information back into the states.txt file? Thanks for your help.
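    The missing step is simply writing the sorted array back out. In PHP this is one call after sort($file) - file_put_contents("states.txt", implode("", $file)) - since file() keeps each line's trailing newline. The same read-sort-rewrite flow, sketched in Python for illustration:

      # Read the lines, sort them, then write them back to the same file.
      with open("states.txt") as f:
          lines = f.readlines()

      lines.sort()   # same whole-line ordering as PHP's sort($file)

      with open("states.txt", "w") as f:
          f.writelines(lines)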

    Read the article

  • TypeError: Python thinks that I passed a function 2 arguments but I only passed it 1

    - by slhck
    I work on something in Seattle Repy, which is a restricted subset of Python. Anyway, I wanted to implement my own Queue that derives from a list:
      class Queue(list):
          job_count = 0

          def __init__(self):
              list.__init__(self)

          def appendleft(item):
              item.creation_time = getruntime()
              item.current_count = self.job_count
              self.insert(0, item)

          def pop():
              item = self.pop()
              item.pop_time = getruntime()
              return item
    Now I call this in my main server, where I use my own Job class to pass Jobs to the Queue:
      mycontext['queue'] = Queue()
      # ...
      job = Job(str(ip), message)
      mycontext['queue'].appendleft(job)
    The last line raises the following exception:
      Exception (with type 'exceptions.TypeError'): appendleft() takes exactly 1 argument (2 given)
    I'm relatively new to Python, so could anyone explain to me why it would think that I gave appendleft() two arguments when there obviously was only one?
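    The likely cause (standard Python method semantics, not something Repy adds): Python passes the instance as an implicit first argument to every method, conventionally named self, so queue.appendleft(job) actually invokes appendleft(queue, job) - two arguments for a function declared with one parameter. A sketch of the corrected class (pop must also take self, and should call list.pop rather than self.pop to avoid infinite recursion; getruntime is assumed to be Repy's built-in):

      class Queue(list):
          job_count = 0

          def appendleft(self, item):       # self must be declared explicitly
              item.creation_time = getruntime()
              item.current_count = self.job_count
              self.insert(0, item)

          def pop(self):
              item = list.pop(self)         # base-class call; self.pop() would recurse forever
              item.pop_time = getruntime()
              return item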

    Read the article

  • DataSet with many OR clauses

    - by Silvan
    Hello :) I've got a little problem with a query which I create in the Visual Studio Designer. I need a query with a lot of 'OR' clauses for the same column. I found the 'IN' operator, but I don't know how to use it in the Visual Studio Designer. Example of IN:
      SELECT EmployeeID, FirstName, LastName, HireDate, City
      FROM Employees
      WHERE City IN ('Seattle', 'Tacoma', 'Redmond')
    I tried to do it this way:
      SELECT [MachineryId], [ConstructionSiteId], [DateTime], [Latitude], [Longitude], [HoursCounter]
      FROM [PositionData]
      WHERE [MachineryID] IN @MachineryIDs
    But this doesn't work. Is there another way to handle a lot of OR clauses? Thank you very much for your help.
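    The usual reason this fails: in ADO.NET (and most data-access layers) a single parameter binds exactly one scalar value, so @MachineryIDs cannot carry a whole list. Common workarounds are generating one parameter per value in the IN list, a split-string function, or a table-valued parameter on newer SQL Server versions. A sketch of the one-placeholder-per-value technique, shown in Python's DB-API style for illustration (the ids and table are hypothetical):

      import sqlite3

      # Build "IN (?, ?, ?)" with one placeholder per id, then bind the list,
      # instead of trying to bind a list to a single parameter.
      machinery_ids = [3, 7, 42]
      placeholders = ", ".join("?" for _ in machinery_ids)   # "?, ?, ?"
      sql = ("SELECT MachineryId, ConstructionSiteId, DateTime, Latitude, "
             "Longitude, HoursCounter FROM PositionData "
             "WHERE MachineryId IN (%s)" % placeholders)

      conn = sqlite3.connect(":memory:")   # stand-in database for the sketch
      # conn.execute(sql, machinery_ids)   # would run against a real table
      print(sql)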

    Read the article

  • SQL SERVER – Parsing SSIS Catalog Messages – Notes from the Field #030

    - by Pinal Dave
    [Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the key essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications. The tool may also be used to automate maintenance of SQL Server databases and updates to multidimensional cube data. In this episode of the Notes from the Field series I requested SSIS expert Andy Leonard to discuss one of the most interesting concepts, SSIS catalog messages. There is plenty of interesting and useful information captured in the SSIS catalog, and we will learn together how to explore it.

    The SSIS Catalog captures a lot of cool information by default. Here’s a query I use to parse messages from the catalog.operation_messages table in the SSISDB database, where the logged messages are stored. This query is set up to parse a default message transmitted by the Lookup Transformation. It’s one of my favorite messages in the SSIS log because it gives me excellent information when I’m tuning SSIS data flows. The message reads similar to:
      Data Flow Task:Information: The Lookup processed 4485 rows in the cache. The processing time was 0.015 seconds. The cache used 1376895 bytes of memory.
    The query:
      USE SSISDB
      GO

      DECLARE @MessageSourceType INT = 60
      DECLARE @StartOfIDString VARCHAR(100) = 'The Lookup processed '
      DECLARE @ProcessingTimeString VARCHAR(100) = 'The processing time was '
      DECLARE @CacheUsedString VARCHAR(100) = 'The cache used '
      DECLARE @StartOfIDSearchString VARCHAR(100) = '%' + @StartOfIDString + '%'
      DECLARE @ProcessingTimeSearchString VARCHAR(100) = '%' + @ProcessingTimeString + '%'
      DECLARE @CacheUsedSearchString VARCHAR(100) = '%' + @CacheUsedString + '%'

      SELECT operation_id
        , SUBSTRING(MESSAGE,
            (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1),
            ((CHARINDEX(' ', MESSAGE, PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))
             - (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))
          ) AS LookupRowsCount
        , SUBSTRING(MESSAGE,
            (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1),
            ((CHARINDEX(' ', MESSAGE, PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1))
             - (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1))
          ) AS LookupProcessingTime
        , CASE
            WHEN (CONVERT(numeric(3,3), SUBSTRING(MESSAGE,
                   (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1),
                   ((CHARINDEX(' ', MESSAGE, PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1))
                    - (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1))))) = 0 THEN 0
            ELSE CONVERT(bigint, SUBSTRING(MESSAGE,
                   (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1),
                   ((CHARINDEX(' ', MESSAGE, PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))
                    - (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))))
                 / CONVERT(numeric(3,3), SUBSTRING(MESSAGE,
                   (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1),
                   ((CHARINDEX(' ', MESSAGE, PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1))
                    - (PATINDEX(@ProcessingTimeSearchString, MESSAGE) + LEN(@ProcessingTimeString) + 1))))
          END AS LookupRowsPerSecond
        , SUBSTRING(MESSAGE,
            (PATINDEX(@CacheUsedSearchString, MESSAGE) + LEN(@CacheUsedString) + 1),
            ((CHARINDEX(' ', MESSAGE, PATINDEX(@CacheUsedSearchString, MESSAGE) + LEN(@CacheUsedString) + 1))
             - (PATINDEX(@CacheUsedSearchString, MESSAGE) + LEN(@CacheUsedString) + 1))
          ) AS LookupBytesUsed
        , CASE
            WHEN (CONVERT(bigint, SUBSTRING(MESSAGE,
                   (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1),
                   ((CHARINDEX(' ', MESSAGE, PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))
                    - (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))))) = 0 THEN 0
            ELSE CONVERT(bigint, SUBSTRING(MESSAGE,
                   (PATINDEX(@CacheUsedSearchString, MESSAGE) + LEN(@CacheUsedString) + 1),
                   ((CHARINDEX(' ', MESSAGE, PATINDEX(@CacheUsedSearchString, MESSAGE) + LEN(@CacheUsedString) + 1))
                    - (PATINDEX(@CacheUsedSearchString, MESSAGE) + LEN(@CacheUsedString) + 1))))
                 / CONVERT(bigint, SUBSTRING(MESSAGE,
                   (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1),
                   ((CHARINDEX(' ', MESSAGE, PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))
                    - (PATINDEX(@StartOfIDSearchString, MESSAGE) + LEN(@StartOfIDString) + 1))))
          END AS LookupBytesPerRow
      FROM [catalog].[operation_messages]
      WHERE message_source_type = @MessageSourceType
        AND MESSAGE LIKE @StartOfIDSearchString
      GO
    Note that you have to set some parameter values:
    @MessageSourceType [int] – represents the message source type value from the following results:
      Value   Description
      10      Entry APIs, such as T-SQL and CLR Stored procedures
      20      External process used to run package (ISServerExec.exe)
      30      Package-level objects
      40      Control Flow tasks
      50      Control Flow containers
      60      Data Flow task
      70      Custom execution message
    Note: taken from Reza Rad’s (excellent!) helper.MessageSourceType table found here.
    @StartOfIDString [VarChar(100)] – use this to uniquely identify the message field value you wish to parse. In this case, the string ‘The Lookup processed ‘ identifies all the Lookup Transformation messages I desire to parse.
    @ProcessingTimeString [VarChar(100)] – this parameter is message-specific. I use this parameter to specifically search the message field value for the beginning of the Lookup Processing Time value. For this execution, I use the string ‘The processing time was ‘.
    @CacheUsedString [VarChar(100)] – this parameter is also message-specific. I use this parameter to specifically search the message field value for the beginning of the Lookup Cache Used value. It returns the memory used, in bytes. For this execution, I use the string ‘The cache used ‘.
    The other parameters are built from variations of the parameters listed above. The query parses the values into text. The string values are converted to numeric values for the ratio calculations LookupRowsPerSecond and LookupBytesPerRow. Since ratios involve division, CASE statements check for denominators that equal 0. Here are the results in an SSMS grid:
    This is not the only way to retrieve this information, and much of the code lends itself to conversion to functions. If there is interest, I will share the functions in an upcoming post. If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.
    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: Notes from the Field, PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
    Tagged: SSIS
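    As a side note (an illustrative alternative, not part of Andy's article): outside T-SQL, the same three numbers can be pulled from a Lookup message with a regular expression. A sketch in Python against the sample message above:

      import re

      msg = ("Data Flow Task:Information: The Lookup processed 4485 rows in the "
             "cache. The processing time was 0.015 seconds. The cache used "
             "1376895 bytes of memory.")

      pattern = (r"The Lookup processed (\d+) rows.*?"
                 r"The processing time was ([\d.]+) seconds.*?"
                 r"The cache used (\d+) bytes")

      m = re.search(pattern, msg)
      rows, seconds, cache_bytes = int(m.group(1)), float(m.group(2)), int(m.group(3))

      # Same ratios the T-SQL computes, guarding against division by zero.
      rows_per_second = rows / seconds if seconds else 0
      bytes_per_row = cache_bytes // rows if rows else 0
      print(rows, seconds, cache_bytes, rows_per_second, bytes_per_row)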

    Read the article
