Search Results

Search found 12887 results on 516 pages for 'hard boiled wonderland'.


  • Anatomy of a serialization killer

    - by Brian Donahue
As I mentioned last month, I have been working on a project to create an easy-to-use managed debugger. It's still an internal tool that we use at Red Gate as part of product support to analyze application errors on customers' computers, and as such, it should be easy to use and not require installation. Since the project has grown rather large and important, I decided to use SmartAssembly to protect all of my hard work. This was trivial for the most part, but obfuscation broke the loading and saving of XML results, rendering that feature basically useless, although the merging and error reporting were an absolute godsend and definitely worth the price of admission. (Well, I get my Red Gate licenses for free, but you know what I mean!)

    My initial reaction was to simply exclude the serializable results class and all of its members from obfuscation, and that was just dandy, but a few weeks on I decided to look into exactly why serialization had broken and change the code to work with SA, so that any new code I write stays compatible with SmartAssembly and saves me additional testing and changes to the SA project.

    In simple terms, SA does all that it can to prevent serialization problems. For instance, it will not obfuscate public members of a DLL, and it will exclude any types with the Serializable attribute from obfuscation. This prevents public members and properties from being made private and having their names changed. If the serialization is done inside the executable, however, public members have their access changed to private and are renamed. That was my first problem: my types were in the executable assembly and implemented ISerializable, but did not have the Serializable attribute set on them!

```csharp
public class RedFlagResults : ISerializable
{
}
```

    The second problem was caused by the pruning feature. Although RedFlagResults had public members, they were not true properties, and the class used the GetObjectData() method of ISerializable to serialize them. For that reason, SA could not exclude these members from pruning and further broke the serialization.

```csharp
public class RedFlagResults : ISerializable
{
    public List<RedFlag.Exception> Exceptions;

    #region ISerializable Members

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("Exceptions", Exceptions);
    }

    #endregion
}
```

    To fix this, it was necessary to make Exceptions a proper property by implementing get and set on it. I also added the Serializable attribute so that I no longer have to exclude the class from obfuscation in the SA project. The DoNotPrune attribute means I do not need to exclude the class from pruning either.

```csharp
[Serializable, SmartAssembly.Attributes.DoNotPrune]
public class RedFlagResults
{
    public List<RedFlag.Exception> Exceptions { get; set; }
}
```

    Similarly, the Exception class gets the Serializable and DoNotPrune attributes applied so all of its properties are excluded from obfuscation.

    Now my project has some protection from prying eyes by scrambling up the code so it's harder to reverse-engineer, without breaking anything. SmartAssembly has also provided the benefit of merging, so the end-user doesn't need to extract all of the DLL files needed by RedFlag into a directory, and the tool can be run directly from the .zip archive.
When an error occurs (hey, I'm only human!), an exception report can be sent to me so I can see what went wrong without having to, er, debug the debugger.
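
    A minimal round-trip smoke test makes this kind of breakage easy to catch early. The sketch below is illustrative only, not from the original tool: it assumes the fixed RedFlagResults class above is in scope, that RedFlag.Exception is XML-serializable (public, with a parameterless constructor), and that the results really are persisted with XmlSerializer. Run against the SmartAssembly-processed build, it fails loudly if obfuscation or pruning has renamed or stripped the serialized members.

```csharp
// Illustrative smoke test (not from the original post). Assumes the fixed
// RedFlagResults class is compiled into the same assembly and that
// RedFlag.Exception is XML-serializable.
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

static class SerializationSmokeTest
{
    static void Main()
    {
        var results = new RedFlagResults { Exceptions = new List<RedFlag.Exception>() };

        var serializer = new XmlSerializer(typeof(RedFlagResults));
        using (var stream = new MemoryStream())
        {
            // Throws at runtime if obfuscation or pruning broke the type.
            serializer.Serialize(stream, results);
            stream.Position = 0;
            var copy = (RedFlagResults)serializer.Deserialize(stream);
            Console.WriteLine(copy.Exceptions.Count); // expect 0 after a clean round trip
        }
    }
}
```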

    Read the article

  • AutoFit in PowerPoint: Turn it OFF

    - by Daniel Moth
Once a feature has shipped, it is very hard to eliminate it from the next release. If I were in charge of the PowerPoint product, I would not hesitate for a second to remove the dreadful AutoFit feature. Fortunately, AutoFit can be turned off on a slide-by-slide basis and, even better, globally: go to the PowerPoint "Options" and under "Proofing" find the "AutoCorrect Options…" button, which brings up the dialog where you need to uncheck the last two checkboxes (see the screenshot to the right).

    AutoFit is the ability for the user to keep hitting the Enter key as they type more and more text into a slide and have it magically still fit, by shrinking the space between the lines and then the text font size. It is the root of all slide evil. It encourages people to think of a slide as a Word document (which may be your goal, if you are presenting to execs at Microsoft, but that is a different story). AutoFit is the reason you fall asleep in presentations. AutoFit causes too much text to appear on a slide, which by extension causes the following:

    - When the slide appears, the text is so small that it is not readable by everyone in the audience. They dismiss the presenter as someone who does not care for them, and then they stop paying attention.
    - If the text is readable but there is too much of it (hence the AutoFit feature kicked in when the slide was authored), the audience is busy reading the slide and not paying attention to the presenter. Humans can either listen well or read well, but not both at the same time, so when they are done reading they feel that they missed whatever the speaker was saying. So they "switch off" for the rest of the slide until the next slide kicks in, which is the natural point for them to pick up paying attention again.
    - Every slide ends up with different-sized text. The less visual consistency between slides, the more your presentation feels unprofessional. You can do better than dismiss the (subconscious) negative effect a deck with inconsistent slides has on an audience.

    In contrast, the absence of AutoFit:

    - Leads to consistency among all slides in a deck with regards to the amount of text and the size of said text.
    - Ensures the text is readable by everyone in the audience (presuming the PowerPoint template is designed for the room where the presentation is delivered).
    - Encourages the presenter to create slides with the minimum necessary text to help the audience understand the basic structure, flow, and key points of the presentation. The "meat" of the presentation is delivered verbally by the presenter themselves, which is why they are in the room in the first place.
    - Following on from the previous point, the audience can at a quick glance consume the text on the slide when it appears and then concentrate entirely on the presenter and what they have to say.

    You could argue that everything above has nothing to do with the AutoFit feature and everything to do with the advice to keep slide content short. You would be right, but the on-by-default AutoFit feature is the one that stops most people from seeing and embracing that truth. In other words, the slides are the tool that aids the presenter in delivering their message, instead of the presenter being the tool that advances the slides which hold the message. To get there, embrace terse slides: the first step is to turn off this horrible feature (which was probably introduced due to the misuse of this tool within Microsoft). The next steps are described in my next post. Comments about this post welcome at the original blog.

    Read the article

  • Make Text and Images Easier to Read with the Windows 7 Magnifier

    - by DigitalGeekery
Do you have impaired vision or find it difficult to read small print on your computer screen? Today, we'll take a closer look at how to magnify that hard-to-read content with the Magnifier in Windows 7.

    Magnifier was available in previous versions of Windows, but the Windows 7 version comes with some notable improvements. There are now three screen modes in Magnifier. Full Screen and Lens mode, however, require Windows Aero to be enabled. If your computer doesn't support Aero, or if you're not using an Aero theme, Magnifier will only work in Docked mode.

    Using Magnifier in Windows 7

    You can find the Magnifier by going to Start > All Programs > Accessories > Ease of Access > Magnifier. Alternately, you can type magnifier into the Search box in the Start Menu and hit Enter. On the Magnifier toolbar, choose your View mode by clicking Views and choosing from the available options. Clicking the plus (+) and minus (-) buttons will zoom in or out. You can change the zoom in/out percentage by adjusting the slider bar. You can also enable color inversion and select tracking options. Click OK when finished to save your settings.

    After a brief period, the Magnifier toolbar will switch to a magnifying glass icon. Simply click the magnifying glass to display the Magnifier toolbar again.

    Docked Mode

    In Docked mode, a portion of the screen is magnified and docked at the top of the screen. The rest of your desktop remains in its normal state. You can then control which area of the screen is magnified by moving your mouse.

    Full Screen Mode

    This magnifies your entire screen and follows your mouse as you move it around. If you lose track of where you are on the screen, use the Ctrl + Alt + Spacebar shortcut to preview where your mouse pointer is on the screen.

    Lens Mode

    The Lens screen mode is similar to holding a magnifying glass up to your screen. Lens mode magnifies the area around the mouse, and the magnified area moves around the screen with your mouse.

    Shortcut Keys

    - Windows key + (+) to zoom in
    - Windows key + (-) to zoom out
    - Windows key + Esc to exit
    - Ctrl + Alt + F – Full screen mode
    - Ctrl + Alt + L – Lens mode
    - Ctrl + Alt + D – Dock mode
    - Ctrl + Alt + R – Resize the lens
    - Ctrl + Alt + Spacebar – Preview full screen

    Conclusion

    Windows Magnifier is a nice little tool if you have impaired vision or just need to make items on the screen easier to read.

    Read the article

  • Big Data – Beginning Big Data – Day 1 of 21

    - by Pinal Dave
What is Big Data? I want to learn Big Data. I have no clue where and how to start learning about it. Does Big Data really mean the data is big? What are the tools and software I need to know to learn Big Data?

    I often receive the questions I mentioned above. They are good questions, and honestly, when we search online it is hard to find authoritative and authentic answers. I have been working with Big Data and NoSQL for a while, and I have decided to discuss this subject here on the blog. Over the next 21 days we will understand what is so big about Big Data.

    Big Data – Big Thing!

    Big Data is becoming one of the most talked about technology trends nowadays. The real challenge for a big organization is to get the maximum out of the data already available and predict what kind of data to collect in the future. How to take the existing data and make it meaningful so that it provides accurate insight into past data is one of the key discussion points in many executive meetings in organizations. With the explosion of data the challenge has gone to the next level, and now Big Data is becoming a reality in many organizations.

    Big Data – A Rubik's Cube

    I like to compare Big Data with the Rubik's cube. I believe they have many similarities. Just like a Rubik's cube, it has many different solutions. Let us visualize a Rubik's cube solving challenge where many experts are participating. If you take five Rubik's cubes, mix them up the same way, and give them to five different experts to solve, it is quite possible that all five people will solve them in a matter of seconds; but if you pay close attention, you will notice that even though the final outcome is the same, the route taken to solve the cube is not. Every expert will start at a different place and will try to resolve it with different methods. Some will solve one color first and others will solve another color first. Even though they follow the same kind of algorithm to solve the puzzle, they will start and end at different places and their moves will differ on many occasions. It is nearly impossible for two experts to take the exact same route.

    Big Market and Multiple Solutions

    Big Data is exactly like a Rubik's cube – even though the goal of every organization and expert is the same, to get the maximum out of the data, the route and the starting point are different for each organization and expert. As organizations evaluate and architect Big Data solutions, they are also learning the ways and opportunities related to Big Data. There is not a single solution to Big Data, nor is there a single vendor which can claim to know all about Big Data. Honestly, Big Data is too big a concept and there are many players – different architectures, different vendors and different technology.

    What is Next?

    In this 21-day series we will be exploring many essential topics related to Big Data. I do not claim that you will be a master of the subject after 21 days, but I do claim that I will be covering the following topics in easy-to-understand language:

    - Architecture of Big Data
    - Big Data Management and Implementation
    - Different Technologies – Hadoop, MapReduce
    - Real World Conversations
    - Best Practices

    Tomorrow

    In tomorrow's blog post we will try to answer one of the very essential questions – What is Big Data?
Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • P90X or How I Stopped Worrying and Love Exercise

    - by Matt Christian
Last Wednesday, after many UPS delivery failures, I received P90X in the mail. P90X is a series of DVDs and a nutrition guide you use to shed pounds and gain muscle. Odds are you've seen the infomercial on TV at some point if you watch a little tube now and again. I started last Thursday and am still standing to tell this tale.

    At its core, P90X is a 12-DVD set of exercise videos. Each video is a different workout routine that typically lasts around an hour (some up to 1 1/2 hours). Every day you are supposed to do one of the workouts, which differ from day to day (sometimes you may repeat a shorter 6-minute workout dedicated to abs twice a week). There are different 'programs' focused on different areas: for weight loss you do the Lean Program, for standard weight loss and muscle gain the Regular Program, and for those hardcore health-nuts, the Insane Program (which consists of two one-hour exercises per day). Each program has a different set of workouts per week which you repeat for 3 weeks, followed by a 'Relaxation Week', which is essentially a slightly different order. After the month of workouts is over, you've finished 1 phase out of 3. P90X takes 90 days, split into 3 phases (1 phase per month). Every phase has a different workout order which is also focused on different areas (weight loss, muscle gain, etc...). With the DVDs you also get a small glossy book of about 100 pages detailing the different workouts and the different programs, as well as a sample workout to see if you're even ready to start P90X.

    The second part of P90X, which can also be considered the 'core' (actually the other half of the core), is the included nutrition guide. The Nutrition Guide is a book similar to the one that defines the exercises (about 100 glossy pages), though it details the foods you should eat, the amounts, and a number of healthy (and tasty!) recipes. The guide is split up into 3 phases as well, promoting high protein and low carb/dairy during Phase 1, and levelling off through to Phase 3, where you have a relatively balanced amount of every food group.

    So after 1 week, where am I? I've stuck quite close to the nutrition guide (there isn't 'diet food' in here, people; it's ACTUALLY food) and done my exercise every day. I think a lot of the first week is getting into the whole idea and learning the moves performed on the DVD. Have I lost weight? No. Do I feel some definition already starting to poke out? Absolutely (no pun intended).

    Tony Horton (the 51-year-old hulk that runs the whole thing) is very fun to listen and work along with, and the 'diet' really isn't too hard to follow unless all you eat is carbs. I've tried the gym thing and could not get motivated enough to continue going. P90X is the first time I've ached from a workout BEFORE starting my next workout. For anyone interested, Google 'P90X' or 'BeachBody' to find out more information about this awesome program!

    Read the article

  • SQLAuthority News – Book Signing Event – SQLPASS 2011 Event Log

    - by pinaldave
I have been dreaming of writing a book for a really long time, and I finally got the chance – in fact, two chances!  I recently wrote two books: SQL Programming Joes 2 Pros: Programming and Development for Microsoft SQL Server 2008 [Amazon] | [Flipkart] | [Kindle] and SQL Wait Stats Joes 2 Pros: SQL Performance Tuning Techniques Using Wait Statistics, Types & Queues [Amazon] | [Flipkart] | [Kindle].  I had a lot of fun writing these two books, even though I sometimes had to sacrifice family time and time for other personal development. The good side of writing books is having the effort recognized by readers and by kind organizations like expressor studio.

    Book writing is a complex process.  Even after you spend months, maybe years, writing the material, you still have to go through the editing and fact-checking processes.  And once the book is out there, there is no way to take back all the copies to change mistakes or add something you forgot.  Most of the time it is a one-way street.

    Just like every author, I had a dream that after the books were written they would be loved by people and gain acceptance by an audience. My first book, SQL Programming Joes 2 Pros: Programming and Development for Microsoft SQL Server 2008, is extremely popular because it helps lots of people learn various fundamental topics. My second book covers beginning to learn SQL Server wait stats, which is a relatively new subject. This book has had very good acceptance in the community.

    Helping my community is my primary focus, so I was happy to see this year's SQLPASS tag line: 'This is a Community.' At the event, the expressor studio guys came up with a very novel idea. They had previously used my books and found them very useful. They got 100 copies of the book and decided to give them away to community folks. They invited me and my co-author Rick Morelan to hold a book signing event. We did a book signing on Thursday between 1 pm and 2 pm.

    This event was one of the best events for me. This was my first book signing event outside of India. I reached the book signing location around 20 minutes before the scheduled time, and what I saw was a big line for the book signing event. I felt very honored looking at the crowd and all the people around the event location. I felt very humbled when I saw some of my very close friends standing in line to get my signature. It was really heartwarming to see so many enthusiasts waiting for more than an hour to get my signature. While they stood in line I had the chance to have a conversation with every single person who showed up for a signature. I made sure that I repeated every single name and wrote it in every book with my signature. There is a saying that if we write a name once we will remember it forever. I want to remember all of you who saw me at the book signing. Your comments were wonderful, your feedback was amazing and you were all very supportive.

    I have made a note of every conversation I had with all of you while I was signing the books. Once again, I just want to express my thanks for coming to my book signing event. The whole experience was very humbling. On top of it, I want to thank the expressor studio people who made it possible and organized the whole signing event. I am so thankful to them for facilitating the whole experience, which is going to be hard to beat by any future experience.
Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology

    Read the article

  • The Java Community Process: What's Broken and How to Fix It

    - by Tori Wieldt
In a panel discussion today at TheServerSide Java Symposium, Patrick Curran, Head of the Java Community Process, James Gosling, and Reza Rahman, member of the Java EE 6 and EJB 3.1 expert groups, discussed the state of the JCP. Moderated by Cameron McKenzie, Editor of TheServerSide.com, they discussed what's wrong with the JCP and ways to fix it.

    What's wrong with the JCP?

    Reza Rahman was quite supportive of the JCP. "I work as a consultant, and it's much better than getting a decision made at a large company," Reza commented. He gave the JCP "five stars" and explained that as an individual he was able to have an impact on things that mattered to him. Cameron asked, "Now all these JCP problems came after Oracle acquired Sun, right?" The crowd had a good laugh, and the panel all agreed many of the JCP problems existed under Sun. How is the JCP handled differently under Oracle than Sun? "Pretty similar," said James. Oracle "tends more towards practicality," said Reza. "I'm glad to see things moving again; we've got several new JSRs filed," Patrick commented.

    How to Fix It?

    They all agreed greater transparency is a top issue. Without it, people assume sinister behavior whether it's there or not. Patrick said that currently spec leads are "encouraged" to be transparent, and the JCP office is planning to submit JSRs to change the JCP process so transparency is mandated, both for mailing lists and issue tracking. Shining a light on problems is the best way to fix them.

    Reza said the biggest problem is lack of participation from the community. If more people are involved, a lot of the problems go away. "Developers are too nonchalant; they should realize what happens in the JCP has a direct impact on their career and they need to get involved," Reza commented.

    Get Involved!

    During Q&A, someone asked how a developer could get involved. They answered: pick a JSR you are interested in and follow it. To start, you could read an article about the JSR and comment on the article (expert group members do read the comments). Or read the spec, discuss it with others and post a blog about it. Read the expert group proceedings. Join the JCP (free for individuals). Open source projects have code that you can download and play with; download it and provide feedback.

    Patrick mentioned that the JCP really wants more participation. "One way we are working on it is that we are encouraging JUGs to join the JCP as a group, and that makes all members of the JUG JCP members," Patrick said. They commented that most spec leads are desperate for feedback. "And, please get involved BEFORE the spec is finalized!" James declared. Someone from the audience said it's hard to put valuable time into something before it's baked. Patrick explained that Post Final Draft (PFD) is the point in the JCP process when the spec is mature enough to review but not yet finalized. The panel agreed the worst thing that could happen is that most people in the Java community just complain about the JCP without getting involved. Developer Sumit Goyal, conference attendee, thought it was a healthy discussion. "I got insights into how JSRs are worked on and finalized," he said.

    Key Links

    - The Java Community Process website: http://jcp.org/en/home/index
    - Article: A Conversation with JCP Chair Patrick Curran
    - Oracle Technology Network: http://www.oracle.com/technetwork/java/index.html
    - TheServerSide Java Symposium: http://javasymposium.techtarget.com/

    Read the article

  • SQLAuthority News – History of the Database – 5 Years of Blogging at SQLAuthority

    - by pinaldave
Don’t miss the contest: Participate in the 5th Anniversary Contest.

    Today is this blog's birthday, and I want to do a fun, informative blog post. Five years ago this day I started this blog. The intention – my personal web blog. I wrote this blog for me, and still today whatever I learn I share here. I don't want to wander too far off topic, though, so I will write about two of my favorite things – history and databases.  And what better way to cover these two topics than to talk about the history of databases?

    If you want to be technical, databases as we know them today only date back to the late 1960s and early 1970s, when computers began to keep records and store memories.  But the idea of memory storage didn't just appear 40 years ago – there was a history behind wanting to keep these records. In fact, the written word originated as a way to keep records – ancient man didn't decide they suddenly wanted to read novels; they needed a way to keep track of the harvest, of their flocks, and of the tributes paid to the local lord.  And that is how writing and the database began.  You could consider the cave paintings from 17,000 years ago at Lascaux, France, or the clay tokens from the ancient Sumerians in 8,000 BC to be the first instances of record keeping – and thus databases.

    If you prefer, you can consider the advent of written language to be the first database.  Many historians believe the first written language appeared in the 37th century BC, with Egyptian hieroglyphics. The ancient Sumerians, not to be outdone, also created their own written language within a few hundred years.

    Databases could be more closely described as collections of information, in which case the Sumerians win the prize for the first archive.  A collection of 20,000 stone tablets was unearthed in 1964 near the modern-day city of Tell Mardikh, in Syria.  This ancient database is from 2,500 BC and appears to be a sort of law library where apprentice scribes copied important documents.  Further archaeological digs hope to uncover the palace library, and thus an even larger database.

    Of course, the most famous ancient database would have to be the Royal Library of Alexandria, the great collection of records and wisdom in ancient Egypt.  It was created by Ptolemy I and flourished from around 300 BC until 48 BC, when Julius Caesar effectively erased the hard drives by accidentally setting fire to it.  As any programmer knows who has forgotten to hit "save" or has experienced a sudden power outage, thousands of hours of work were lost in a single instant.

    Databases existed in very similar conditions up until recently.  Cuneiform tablets gave way to papyrus, which led to vellum, and eventually modern paper and the printing press.  Someday the databases we rely on so much today will become another chapter in the history of record keeping.  Who knows what the databases of tomorrow will look like!

    Reference:  Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article

  • LibGDX Box2D Body and Sprite AND DebugRenderer out of sync

    - by Free Lancer
I am having a couple of issues with Box2D bodies. I have a GameObject holding a Sprite and a Body. I use a ShapeRenderer to draw an outline of the Body's and Sprite's bounding boxes. I also added a Box2DDebugRenderer to make sure everything's lining up properly. My problem is that the Sprite and Body at first overlap perfectly, but as I turn, the Body moves a bit off the sprite, then comes back when the Car is facing either north or south. Here's an image of what I mean (not sure what that line is; this is the first time it has shown up): BLUE is the Body, RED is the Sprite, PURPLE is the Box2DDebugRenderer. Also, you probably noticed a purple square in the top right corner. That's the Car drawn by the Box2D Debug Renderer. I thought it might be the camera, but I've been playing with the cameras for hours and nothing seems to work; everything gives me weird results. Here's my code:

    Screen:

```java
public void show() {
    // --------------------- SETUP ALL THE CAMERA STUFF ------------------------------ //
    battleStage = new Stage(720, 480, false);
    // Setup the camera. In Box2D we operate on a meter scale, pixels won't do it. So we use
    // an Orthographic camera with a Viewport of 24 meters in width and 16 meters in height.
    battleStage.setCamera(new OrthographicCamera(CAM_METER_WIDTH, CAM_METER_HEIGHT));
    battleStage.getCamera().position.set(CAM_METER_WIDTH / 2, CAM_METER_HEIGHT / 2, 0);

    // The Box2D Debug Renderer will handle rendering all physics objects for debugging
    debugger = new Box2DDebugRenderer(true, true, true, true);
    //debugCam = new OrthographicCamera(CAM_METER_WIDTH, CAM_METER_HEIGHT);
}

public void render(float delta) {
    // Update the Physics World, use 1/45 for something around 45 Frames/Second for mobile devices
    physicsWorld.step(1 / 45.0f, 8, 3); // 1/45 for devices

    // Set the Camera matrices and clear the screen
    Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
    battleStage.getCamera().update();

    // Draw game objects here
    battleStage.act(delta);
    battleStage.draw();

    // Again update the Camera matrices and call the debug renderer
    debugCam.update();
    debugger.render(physicsWorld, debugCam.combined);

    // Vehicle handles its own interaction with the HUD;
    // update all Actors movements in the game Stage
    hudStage.act(delta);
    // Draw each Actor onto the Scene at their new positions
    hudStage.draw();
}
```

    Car (extends Actor):

```java
public Car(Texture texture, float posX, float posY, World world) {
    super("Car");
    mSprite = new Sprite(texture);
    mSprite.setSize(WIDTH * Consts.PIXEL_METER_RATIO, HEIGHT * Consts.PIXEL_METER_RATIO);
    // set the origin to be at the center of the body
    mSprite.setOrigin(mSprite.getWidth() / 2, mSprite.getHeight() / 2);
    // place the car in the center of the game map
    // (note: Sprite.setPosition() positions the sprite's bottom-left corner)
    mSprite.setPosition(posX * Consts.PIXEL_METER_RATIO, posY * Consts.PIXEL_METER_RATIO);

    FixtureDef carFixtureDef = new FixtureDef();
    mBody = Physics.createBoxBody(BodyType.DynamicBody, carFixtureDef, mSprite);
}

public void draw() {
    // sync the sprite to the body (note: mBody.getPosition() is the body's center)
    mSprite.setPosition(mBody.getPosition().x * Consts.PIXEL_METER_RATIO,
                        mBody.getPosition().y * Consts.PIXEL_METER_RATIO);
    mSprite.setRotation(MathUtils.radiansToDegrees * mBody.getAngle());
    // draw the sprite
    mSprite.draw(batch);
}
```

    Physics (creates the Body):

```java
public static Body createBoxBody(final BodyType pBodyType, final FixtureDef pFixtureDef, Sprite pSprite) {
    float pRotation = 0;
    float pWidth = pSprite.getWidth();
    float pHeight = pSprite.getHeight();

    final BodyDef boxBodyDef = new BodyDef();
    boxBodyDef.type = pBodyType;
    boxBodyDef.position.x = pSprite.getX() / Consts.PIXEL_METER_RATIO;
    boxBodyDef.position.y = pSprite.getY() / Consts.PIXEL_METER_RATIO;

    // Temporary Box shape of the Body
    final PolygonShape boxPoly = new PolygonShape();
    final float halfWidth = pWidth * 0.5f / Consts.PIXEL_METER_RATIO;
    final float halfHeight = pHeight * 0.5f / Consts.PIXEL_METER_RATIO;
    boxPoly.setAsBox(halfWidth, halfHeight); // set the anchor point to be the center of the sprite
    pFixtureDef.shape = boxPoly;

    final Body boxBody = BattleScreen.getPhysicsWorld().createBody(boxBodyDef);
    boxBody.createFixture(pFixtureDef);
    return boxBody;
}
```

    Sorry for all the code and the long description, but it's hard to pin down what exactly might be causing the problem.

    Read the article

  • How to convert from amateur web app developer to professional web apper?

    - by Nilesh
This is more of a practical question about the web app development and deployment process. Here is some background information: I use PHP for server-side scripting and JavaScript for the client side. I use NetBeans and Notepad++. I use Firefox and Firebug for debugging and testing.

    The process I use is very amateurish: I code something in NetBeans, something in Notepad++, and since there is nothing to compile, I just refresh the Firefox browser and test it. This is convenient and faster compared to the Java development environment, where you would have to at least compile and deploy the jar files before you could run them. I have been thinking of putting a formal process in place for my development and find it hard to put together. There are so many things to do before you can deploy your final web app. I keep hearing about JSLint, compression, unit testing (Selenium), Ant, YUI Compressor, etc., but I am now looking for some steps I can take to make myself more organized.

    For example, I use NetBeans but don't use any projects within it; I directly update the files. I don't use any source control, but use my Iomega backup that saves each save as a different version, and at the end of the day I back up the dev directory to my Amazon S3 account. For me the development environment is just a DEV directory, TEST is my intermediate stage, and PROD is the final directory that gets pushed out to the server. But all these directories are under the same Apache home. I have a few PHP scripts that just copy the needed files into the production directory. That's about it for my development approach.

    I know I am missing the following:

    - Regression testing (manual or automated?)
    - Automated testing (Selenium?)
    - Automated deployment (Ant?)
    - Source control (SVN?)
    - Quality control (JSLint?)

    Can someone explain what the missing steps are and how to go about filling them in, in order to have a more professional approach? I am looking for tools with example tutorials for streamlining the whole development-to-deployment stage. For me, just getting the hang of database, server-side and client-side development all in synchronization was itself a huge accomplishment. And now I feel there is a lot missing before you can produce a quality web application. For example, I see a lot of mention of using automated testing, but how do I put it to use with respect to JavaScript and PHP? How do I use Ant for deployment, etc.? Is this all too much for a one- or two-person development team? Is there a way to automate all of the above so that I just keep coding in NetBeans and then run a batch file, configured once, that produces the code in the production directory? A lot of this information is scattered around the web and here; if someone can guide me, I would be happy to consolidate it here. Thank you for your patience :)

    Read the article

  • SQL SERVER – Query Hint – Contest Win Joes 2 Pros Combo (USD 198) – Day 1 of 5

    - by pinaldave
In August 2011 we ran a contest where every day we gave away one book for an entire month. The contest was an extreme success: lots of people participated and lots of books were given away. I have received many questions asking whether we are doing something similar this month. Absolutely – instead of running a month-long contest, we are doing something more interesting. We are giving away a gift worth USD 198 every day this week: the Joes 2 Pros 5-volume (book) SQL 2008 Development Certification Training Kit – one copy in India and one in the USA, two giveaways in total (worth USD 198). All the gifts are sponsored by Koenig Training Solution and Joes 2 Pros. The books are available here: Amazon | Flipkart | Indiaplaza

    How to Win:

    - Read the question
    - Read the hints
    - Answer the quiz in the contact form in the following format: Question Answer, Name of the country (the contest is open to USA and India residents only)

    Two winners will be randomly selected and announced on August 20th.

    Question of the Day:

    Which of the following queries will return dirty data?

    a) SELECT * FROM Table1 (READUNCOMMITED)
    b) SELECT * FROM Table1 (NOLOCK)
    c) SELECT * FROM Table1 (DIRTYREAD)
    d) SELECT * FROM Table1 (MYLOCK)

    Query Hints: BIG HINT POST

    Most SQL people know what a "dirty record" is. You might also call it an "intermediate record". In case this is new to you, here is a very quick explanation. The simplest way to describe the steps of a transaction is to use the example of updating an existing record in a table. When the update runs, SQL Server gets the data from storage, such as a hard drive, and loads it into memory and your CPU. The data in memory is changed and then saved to the storage device. Finally, a message is sent confirming the rows that were affected. For a very short period of time the update takes the data and puts it into memory (an intermediate state), not a permanent state.

    For every data change to a table there is a brief moment where the change is made in this intermediate state but is not committed. During this time, any other DML statement (SELECT, INSERT, DELETE, UPDATE) needing that data must wait until the lock is released. This is a safety feature put in place so that SQL Server evaluates only official data. (A small illustrative sketch of reading such an intermediate record appears at the end of this excerpt.)

    Additional Hints:

    I have previously discussed various concepts from SQL Server Joes 2 Pros Volume 1:

    - SQL Joes 2 Pros Development Series – Dirty Records and Table Hints
    - SQL Joes 2 Pros Development Series – Row Constructors
    - SQL Joes 2 Pros Development Series – Finding un-matching Records
    - SQL Joes 2 Pros Development Series – Efficient Query Writing Strategy
    - SQL Joes 2 Pros Development Series – Finding Apostrophes in String and Text
    - SQL Joes 2 Pros Development Series – Wildcard – Querying Special Characters
    - SQL Joes 2 Pros Development Series – Wildcard Basics Recap

    Next Step:

    Answer the quiz in the contact form in the following format: Question Answer, Name of the country (the contest is open for USA and India)

    Bonus Winner:

    Leave a comment with your favorite article from the "additional hints" section and you may be eligible for a surprise gift. There is no country restriction for this bonus contest.
Do mention why you liked that particular blog post, and I will announce the winner of the bonus contest along with the main contest. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
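
    As promised above, here is a small illustrative sketch of a dirty read in action. It is not part of the contest post: the table Table1, its Id and Col1 columns, and the connection string are placeholders, and it uses two ADO.NET connections to show NOLOCK returning an uncommitted value.

```csharp
// Demonstrates a dirty read: connection B sees connection A's uncommitted
// change because READUNCOMMITTED/NOLOCK skips the shared-lock wait.
// "Table1", the column names, and the connection string are placeholders.
using System;
using System.Data.SqlClient;

class DirtyReadDemo
{
    static void Main()
    {
        const string connStr = "Server=.;Database=TestDb;Integrated Security=true;";

        using (var writer = new SqlConnection(connStr))
        using (var reader = new SqlConnection(connStr))
        {
            writer.Open();
            reader.Open();

            // Connection A: change a row but do NOT commit yet.
            var tx = writer.BeginTransaction();
            new SqlCommand("UPDATE Table1 SET Col1 = 'dirty' WHERE Id = 1", writer, tx)
                .ExecuteNonQuery();

            // Connection B: NOLOCK reads the intermediate (dirty) value
            // instead of blocking until the transaction ends.
            var dirty = new SqlCommand(
                "SELECT Col1 FROM Table1 WITH (NOLOCK) WHERE Id = 1", reader)
                .ExecuteScalar();
            Console.WriteLine(dirty); // prints 'dirty' although it was never committed

            tx.Rollback(); // the value connection B just read now never officially existed
        }
    }
}
```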

    Read the article

  • Cutting Edge versus Just Average? Your SOA, Got BPM? by Mala Ramakrishnan

    - by JuergenKress
Service Oriented Architecture (SOA) has completely transformed IT since it was introduced well over a decade ago. Organizations have been re-plumbing their infrastructure for reusability, efficiency and gain, and succeeding with it. Best practices have emerged, and people and technology have matured. We have gotten better at delivering on a stable platform for mission-critical applications and services. Yet there is one secret that sets some SOA customers apart from the others. These companies grow and revolutionize their business, not just transform their IT infrastructure. The differences seem subtle to an untrained eye examining these organizations externally. And from within the company, it's a bit like an ant sitting on an elephant: hard to differentiate between the IT trunk and the business tail.

    What is it that some organizations do differently that makes them succeed beyond SOA? These organizations pull business people more and more into their IT decisions. They insist on understanding processes, not just services. They don't settle easily when bridging business metrics and IT performance. They anguish over business requirements not translating seamlessly and quickly into IT. IT is not just an enabler but a pillar that revolutionizes their business. Okay, I'll give it to you: these organizations layer Business Process Management (BPM) on top of their SOA.

    Think about the lifeblood business processes in your own organization. If you are FedEx, this would be shipping and handling. If you are Stanford Hospital, this would be patient case management: from onboarding through discharge and follow-up care. If you are Wells Fargo, this would be loan origination. Now think about how your SOA ties into your business processes. Can you decouple your business processes from your SOA so that the two can transform and change independently of each other? Can you forecast success metrics for your business process, make the changes across the board, and then look back over different periods of time to see if you are on track? Are your critical business processes entrenched in the minds of a few experts in your organization, or does everyone from the receptionist to your enterprise architect to your CEO understand what they can do to revolutionize them?

    Business Process Management is a superset of SOA. It is the process of getting your business to articulate business value and metrics and having them implemented in IT without any loss in translation. It is the act of extracting the business processes from the minds of experts and the IT applications in your organization and valuing them as assets for performance and gain. BPM is stepping outside your SOA and moving your organization to the next level of innovation. Oracle is accelerating BPM across industries with the latest launch. Join us to understand how BPM can give your organization a cutting edge over your SOA.

    SOA & BPM Partner Community: For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community. For registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Read the article

  • Fill a Flash Drive with Portable Software using Lupo PenSuite

    - by Asian Angel
A flash drive full of portable software is helpful to have along wherever you go. The Lupo PenSuite lets you choose from three different versions to get the best fit for your everyday needs. Note: if running the full version you will need a 512 MB USB flash drive or larger.

    Using Lupo PenSuite

    The one window to watch for during the setup process is where you have the opportunity to add a specific language pack if needed. Outside of that, all you need to do is sit back and wait for the suite to be extracted. Note: extraction times will vary based on version and extraction location. Here we browsed to our flash drive to extract it to…

    Once the setup process is complete, locate and double-click the Lupo_PenSuite.exe file. This one-time window presents the opportunity to start using the suite immediately, or to go directly into the options. When the suite is active you will have a new system tray icon that operates as a start menu button. At the bottom you can monitor the remaining room on your flash drive and use the close button to exit the suite (it may display as a power button based on the menu theme).

    A quick look at the setup inside the suite: there is a pre-configured area for organizing and storing your personal files. Prefer a classic style menu? Just select it in the options (Various tab) and enjoy a smaller, streamlined look. Note: you can also change the theme for the regular menu and add a user pic.

    The suite provides access to your portable software and online sites, so you get to enjoy the best of both, as shown in the following examples. Websites will open using the suite's portable Firefox install. VLC is ready to play your downloaded videos. The suite also has some very nice photo editing programs added in.

    Installing Additional Apps

    If one of your favorite programs is not included in the suite version, it only takes a few minutes to add it in. Go to the Additional Apps webpage, download the app(s), and extract them onto your hard drive. Note: the link for the Additional Apps webpage is provided below. Add the extracted app(s) to the MyApps folder in the suite's folder hierarchy. Click on ASuite in the suite's start menu. Drag and drop the portable app's exe file into the MyApps section in the ASuite window. Your new software's shortcut should display as shown here; close this window when finished. Checking the suite's start menu will show your new software ready to be used.

    Conclusion

    If you need a good portable software collection to carry with you on a flash drive, then Lupo PenSuite is definitely worth taking a look at. We tested Lupo PenSuite on XP, Vista, and Windows 7 and it works great on all three. Another popular choice is PortableApps, and you can check out our review of that too; they are essentially the same thing, each just packaged differently.

    Links

    - Download Lupo PenSuite (Full, Lite, & Zero versions) *Download links approximately one-third down the page.
    - Download Additional Apps for Lupo PenSuite
    - Download Additional Skins for Lupo PenSuite
    - Start Menu View Video Tutorials *Has a tutorial for easy updating of the entire suite.

    Read the article

  • Database – Beginning with Cloud Database As A Service

    - by Pinal Dave
I love my weekend projects. Everybody does different activities on their weekends – traveling, reading, or just nothing. Every weekend I try to do something creative and different in the database world. The goal is that I learn something new, and if I enjoy my learning experience I share it with the world. This weekend, I decided to explore a cloud Database-as-a-Service called Morpheus.

    In my career I have managed many databases in the cloud and I have good experience in managing them. I should highlight that today's applications use multiple databases: SQL for transactions and analytics, NoSQL for documents, in-memory for caching, and indexing for search. Provisioning and deploying these databases often requires extensive expertise and time. Often these databases are also not deployed on the same infrastructure, which can create unnecessary latency between the application layer and the databases – not to mention the differing quality of service based on the infrastructure and the service provider where they are deployed. Moreover, there are additional problems that I have experienced with a traditional database setup when hosted in the cloud:

    - Database provisioning & orchestration
    - Slow speed due to hardware issues
    - Poor monitoring tools
    - High network latency

    Now, if you have great software and expert network engineers, you can continuously work on the above problems and overcome them. However, not every organization has the luxury of having top-notch experts in the field. The above issues are related to infrastructure, but there are a few more problems which are related to the software/application side. Here are the top three things which can be problems if you do not have an application expert:

    - Replication and clustering
    - Simple provisioning of hard drive space
    - Automatic sharding

    Well, Morpheus looks like a product built by experts who have faced similar situations in the past. The product pretty much addresses all the pain points of developers and database administrators. What is different about Morpheus is that it offers a variety of databases, from MySQL, MongoDB and ElasticSearch to Redis, as a service. Users can pick and choose any combination of these databases. All of them can be provisioned in a matter of minutes with a simple and intuitive point-and-click user interface. The Morpheus cloud is built on solid state drives (SSDs) and is designed for high-speed database transactions. In addition it offers a direct link to Amazon Web Services to minimize latency between the application layer and the databases.

    Here are the few steps for getting started with Morpheus. Follow along with me. First go to http://www.gomorpheus.com and register for a new, free account.

    Step 1: Sign up

    It is very simple to sign up for Morpheus.

    Step 2: Select your database

    I use MySQL for my daily routine, so I selected MySQL. Upon clicking the big red button to add an instance, it prompted a dialogue for creating a new instance.

    Step 3: Create a user

    Now we just have to create a user in our portal which we will use to connect to the database hosted at Morpheus. Click on your database instance and it will bring you to the User screen. Over here you will notice once again a big red button to create a new user. I created a user with my first name.

    Step 4: Configure your MySQL client

    I used MySQL Workbench and connected to the MySQL instance I had created, using the IP address and user.

    That's it! You are connected to the MySQL instance. Now you can create your objects just like you would create them on your local box (a minimal connectivity sketch follows at the end of this excerpt). You will have all the features of Morpheus when you are working with your database.

    Dashboard

    While working with Morpheus, I was most impressed with its dashboard. In future blog posts I will write more about this feature. Also, with Morpheus you use the same process for provisioning and connecting with other databases: MongoDB, ElasticSearch and Redis.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: MySQL, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
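
    To make Step 4 concrete, here is the small illustrative sketch mentioned above. It is mine, not from the Morpheus documentation: it assumes the MySQL Connector/NET package (MySql.Data), and the host, user, and password are placeholders for the values shown in the Morpheus portal.

```csharp
// Minimal connectivity check against a cloud-hosted MySQL instance.
// The connection string values below are placeholders.
using System;
using MySql.Data.MySqlClient;

class MorpheusSmokeTest
{
    static void Main()
    {
        var connStr = "Server=YOUR_INSTANCE_IP;Database=test;Uid=YOUR_USER;Pwd=YOUR_PASSWORD;";
        using (var conn = new MySqlConnection(connStr))
        {
            conn.Open();
            // Same DDL you would run against a local box.
            using (var cmd = new MySqlCommand(
                "CREATE TABLE IF NOT EXISTS hello (id INT PRIMARY KEY, note VARCHAR(50))", conn))
            {
                cmd.ExecuteNonQuery();
            }
            Console.WriteLine("Connected to MySQL " + conn.ServerVersion);
        }
    }
}
```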

    Read the article

  • Unit Testing DateTime – The Crazy Way

    - by João Angelo
We all know that unit testing code that depends on DateTime, particularly the current time provided through the static properties (Now, UtcNow and Today), is a PITA. If you go ask how to unit test DateTime.Now on Stack Overflow, I'll bet that you'll get two kinds of answers:

    1. Encapsulate the current time in your own interface and use a standard mocking framework;
    2. Pull out the big guns like Typemock Isolator, JustMock or Microsoft Moles/Fakes and mock the static property directly.

    Now each alternative has its pros and cons, and I would have to say that I lean more toward the second approach, because the first adds a layer of abstraction just for the sake of testability. However, the second approach depends on commercial tools that not every shop wants to buy, or on the not-so-friendly Microsoft Moles. (Sidenote: Moles is now named Fakes and it will ship with VS 2012.)

    This tends to leave people without an acceptable and simple solution, so after reading another of these types of questions on SO I came up with yet another alternative, one based on the first alternative presented here but which tries really hard not to get in your way with yet another layer of abstraction.

    So, without further ado, I present you the Tardis. The Tardis is a single section of conditionally compiled code that overrides the meaning of the DateTime expression inside a single class. You still get the normal coding experience of using DateTime all over the place, but in a DEBUG compilation your tests will be able to mock every static method or property of the DateTime class. An example follows, while the full Tardis code can be downloaded from GitHub:

```csharp
using System;
using NSubstitute;
using NUnit.Framework;
using Tardis;

public class Example
{
    public Example() : this(string.Empty) { }

    public Example(string title)
    {
#if DEBUG
        this.DateTime = DateTimeProvider.Default;
        this.Initialize(title);
    }

    internal IDateTimeProvider DateTime { get; set; }

    internal Example(string title, IDateTimeProvider provider)
    {
        this.DateTime = provider;
#endif
        this.Initialize(title);
    }

    private void Initialize(string title)
    {
        this.Title = title;
        this.CreatedAt = DateTime.UtcNow;
    }

    private string title;

    public string Title
    {
        get { return this.title; }
        set
        {
            this.title = value;
            this.UpdatedAt = DateTime.UtcNow;
        }
    }

    public DateTime CreatedAt { get; private set; }
    public DateTime UpdatedAt { get; private set; }
}

public class TExample
{
    public void T001()
    {
        // Arrange
        var tardis = Substitute.For<IDateTimeProvider>();
        tardis.UtcNow.Returns(new DateTime(2000, 1, 1, 6, 6, 6));

        // Act
        var sut = new Example("Title", tardis);

        // Assert
        Assert.That(sut.CreatedAt, Is.EqualTo(tardis.UtcNow));
    }

    public void T002()
    {
        // Arrange
        var tardis = Substitute.For<IDateTimeProvider>();
        var sut = new Example("Title", tardis);
        tardis.UtcNow.Returns(new DateTime(2000, 1, 1, 6, 6, 6));

        // Act
        sut.Title = "Updated";

        // Assert
        Assert.That(sut.UpdatedAt, Is.EqualTo(tardis.UtcNow));
    }
}
```

    This approach is also suitable for other similar classes with commonly used static methods or properties, like the ConfigurationManager class.
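
    To illustrate that closing point, here is a minimal sketch (not from the original post) of the same provider pattern wrapped around ConfigurationManager; the interface and class names are illustrative.

```csharp
// Illustrative analog of the Tardis pattern for ConfigurationManager.
// IConfigurationProvider and ConfigurationProvider are hypothetical names.
public interface IConfigurationProvider
{
    string AppSetting(string key);
}

public sealed class ConfigurationProvider : IConfigurationProvider
{
    // Production default; tests would substitute their own IConfigurationProvider.
    public static readonly IConfigurationProvider Default = new ConfigurationProvider();

    public string AppSetting(string key)
    {
        return System.Configuration.ConfigurationManager.AppSettings[key];
    }
}
```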

    Read the article

  • How can we improve overall Programmer Education & Training?

    - by crosenblum
Last week, I watched an amazing interview by Kevin Rose of Philip Rosedale, of Second Life, in which they had a great discussion about how to find, hire and identify good programmers, and how hard it is to find good ones. It has led me to really think about the way we programmers learn and are taught.

    A majority of us, myself included, are self-taught. That is what is great about being a programmer: anyone can learn and develop skills. But this also means that there are no real standards for what a good programmer is, and for what kinds of environments encourage the growth of programming skills.

    This isn't so much a question as a desire in me to see how we can change the culture of programming, and the managers of programmers, so that education and self-improvement are encouraged. There are a lot of avenues for continued education – YouTube videos, books, conferences – but because of the experiential nature of what we do, it isn't always clear what's important to learn and to master.

    Let's look at the Joel 12 Steps.

    The Joel Test

    - Do you use source control?
    - Can you make a build in one step?
    - Do you make daily builds?
    - Do you have a bug database?
    - Do you fix bugs before writing new code?
    - Do you have an up-to-date schedule?
    - Do you have a spec?
    - Do programmers have quiet working conditions?
    - Do you use the best tools money can buy?
    - Do you have testers?
    - Do new candidates write code during their interview?
    - Do you do hallway usability testing?

    I think all of these have important value, but because of something I call the Experiential Gap, if a programmer or manager has never experienced the negative consequences of not doing the items on this list, they will never see the need to do any of them.

    The Experiential Gap is my basic theory that each of us has different jobs and different experiences. So for some of us, who have always worked with dozens of programmers, source control is a must-have. But people who have always been the only programmer cannot imagine the need for source control. It's because of this major flaw in how we learn that we evaluate people by which best practices they do or don't follow, and the reasons for either can start a flame war. We always evaluate people in our field by what they do, and think, "Oh, if this guy/gal isn't doing xyz best practice, he/she can't be a good programmer, so let's not waste time or energy talking to them." This is exactly why we have so many programming flame wars: because of the Experiential Gap, we can't imagine people not having made the decisions that we have had to make.

    So this has led me to think that we totally need to rethink how we train, educate and manage programmers. For example, what percentage of you have been encouraged by your managers to go to conferences, and even had them pay for it? For me, and a lot of people, this is extremely rare; a lot of us would love to go to conferences to learn more, but the money just isn't there.

    So the point of this question is really to spark a discussion: how can we train, learn and manage better? How can we create a new culture of learning that doesn't insult people for not having the same job experiences? Yes, we all have jobs and work to do, but our ability to do our jobs well depends on our desire, interest and support in improving the mastery of our skills.

    Right now, I see our culture as rather disorganized: we support the elite, but the many of us who want to get better just don't have enough support to learn and improve ourselves. I mean, do we as an industry want to be perceived as just replaceable cogs? Thank you...

    Read the article

  • SQL Developer Data Modeler v3.3 Early Adopter: Search

    - by thatjeffsmith
    photo: Stuck in Customs via photopin cc

    The next version of Oracle SQL Developer Data Modeler is now available as an Early Adopter (read: beta) release. There are many major new feature enhancements to talk about, but today's focus will be on the brand new Search mechanism.

    Data, data, data – SO MUCH data

    Google has made countless billions of dollars around a very efficient and intelligent search business. People have become accustomed to having their data accessible AND searchable. Data models can have thousands of entities or tables, each having dozens of attributes or columns. Imagine how hard it could be to find what you're looking for there. This is the challenge we have tackled head-on in v3.3.

    Same location as the Search toolbar in Oracle SQL Developer (and most web browsers)

    Here's how it works:

    Search as you type – wicked fast, as the entire model is loaded into memory
    Supports regular expressions (regex)
    Results are loaded to a new panel below
    Search across designs and models
    Search EVERYTHING, or filter by type
    Save your frequent searches
    Save your search results as a report
    Open common properties of objects in the search results and edit basic properties on-the-fly

    Want to just watch the video? We have a new Oracle Learning Library resource available now which introduces the new and improved Search mechanism in SQL Developer Data Modeler. Go watch the video and then come back.

    Some Screenshots

    This will be a pretty easy feature to pick up. Search is intuitive – we've already learned how to do search. Now we just have a better interface for it in SQL Developer Data Modeler. But just in case you need a couple of pointers…

    The SYS data dictionary in model form with Search Results

    If I type 'translation' in the search dialog, the results will come up as hits are 'resolved.' By default, everything is searched, although I can filter the results after the fact. You can see where the search finds a match in the 'Content' column.

    Save the Results as a Report

    If you limit the search results to a category and a model, then you can save the results as a report. All of the usual suspects. You can optionally include the search string, which displays at the top of the report as 'PATTERN.' You can save your common reporting setups as a template and reuse those as well. Here's a sample HTML report. Yes, I like to search my search results report!

    Two More Ways to Search

    You can search 'in context' by opening the 'Find' dialog from an active design. You can do this using the 'Search' toolbar button or from a model context menu. When searching a specific model, instead of bringing up the old modal Find dialog, you now get to use the new and improved Search panel. Notice there's no 'Model' drop-down to select, and that the active Search form is now in the Search panel versus the search toolbar up top.

    What else is new in SQL Developer Data Modeler version 3.3? All kinds of goodies. You can send your model to Excel for quick edits/reviews and suck the changes back into your model, you can share objects between models, and much much more. You'll find new videos and blog posts on the subject in the next few days and weeks. Enjoy! If you have any feedback or want to report bugs, please visit our forums.
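    For instance, assuming the search box accepts standard regex syntax (the attribute names below are made up for illustration), patterns like these would work:

        ^EMP_.*_ID$        matches attribute names such as EMP_DEPT_ID or EMP_MGR_ID
        translat(e|ion)s?  matches translate, translation and translations in one search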

    Read the article

  • Who Makes a Good Product Owner

    - by Robert May
    In general, the best product owners are those that care passionately about the customer of the product.  Note that I didn't say about the product itself.  Actually, people that only care about the product generally do not make good product owners.  Products only matter in relationship to their customers.  If a product doesn't provide value to the customer, then the product has no value, no matter what a person might think of the product, and no matter what cool technologies exist inside of it. A good product owner is also a good negotiator.  They recognize that many different priorities exist inside a corporation, but that there can be only one list that developers work from.  A good product owner recognizes that it's their job to help others around them prioritize (perhaps with a Product Council), but also understands that they alone have the final say about priorities, and is willing to make the tough decisions required.  Deciding the priority between two perfectly valid stories is very difficult, especially when the stories are from two different departments! A good product owner is deeply interested in helping the team be successful.  They don't seek to control the team, but instead seek to understand what the team can do and then work with the team to get the best product possible for the customer.  A good product owner is never denigrating to team members, ever.  They recognize that such behavior would damage the trust that needs to be present between team members and product owners, and will avoid it at all costs. In general, technical people (i.e. former or current developers) make poor product owners.  In their minds, they can't separate implementation details from user functionality, so their stories end up sounding like implementation details.  For example, "The user enters their username on the password screen" is something that a technical product owner would write.  The proper wording for that story is "A user supplies the system with their credentials."  Because technical people think differently from the rest of the population, they are generally not a good fit. A good product owner is also a good writer.  Writing good stories demands good writing.  The art of persuasion, descriptiveness, and just plain good grammar are all required.  A good product owner must also be well spoken, since most of what they convey will be conveyed with the spoken word, not just the written word. A good product owner is a "People Person."  They like talking to people and are very patient.  They don't mind having questions repeated or fielding many questions, because they want to make sure that the ideas they're conveying are properly understood so the customer gets the best product possible.  They are happy to answer any questions a team member may have and invite feedback and criticism of designs and stories, since they want a good product.  They really have little ego that gets in the way of building a great product. All of these qualities can be hard to find, but if you look closely enough, you'll find the right person in your organization.  Product owners can be found anywhere, not just in upper management.  Some of the best product owners are those that are very close to the customer.  In fact, check your customer support staff.  I'd bet that several great product owners are lurking there. One final note about what makes a good product owner:
You’re probably NOT going to find a good product owner in a manager, especially if they consider themselves a “Manager.”  Product owners don’t manage anything but the backlog, so be especially careful if the person you’re selecting for Product Owner is a manager. Up Next, “Messing with the Team.” Technorati Tags: Scrum,Product Owner

    Read the article

  • December 3 is Stephanie Choyer Day

    - by rickramsey
    I don't answer Stephanie Choyer's email just so I can enjoy her French accent when she calls. "Reek! Reek! Why do joo not answer my eemails?" Without the French, life on Earth would be so much poorer. No, they don't bring to the party any motorcycles that grow hair on your chest, and the Citroen is such a frightening study in automobile design that I don't dare climb inside one. But they have French architecture. French sidewalks. French villages. The French Alps. Grenoble. French cheese. French wine. And that glorious French accent. If I were French, I'd spend all my time enjoying being French. Which makes the work that Stephanie does day in and day out with our hard-edged technologies and stubborn technologists so admirable.

    Oracle Solaris 11 Resources for Sysadmins and Developers

    The page in the link above represents the work of many people, but it was Steph who rounded them up. And it wasn't easy. I know, because I ran and hid from her on many, many occasions. But she was tireless. "Reek. Reek. Why have you not published Glynn's article? Pleeeease, you must!" Remember when tech companies gave you a simple choice? You could either read the 27,000 pages of documentation or a double-sided data sheet. Which will it be, pal? Then they started writing white papers. 74 pages of excellent prose did a beautiful job of explaining why the technology was fantastic, but never told you how to use it. Well, have you taken a look at these?

    How-To Technical Articles for System Admins and Developers

    Now you can get wicked excited about a cool technique described in a 74-page white paper, and find a technical article that shows you exactly how to use it. The wicked smart marketing folks on the Oracle Solaris team wrote them, but it was Steph who bribed them with a Cabernet or beat them over the head with a baguette until all that work was finished and posted on OTN. There are songs about French wine, but not about French vintners. There are songs about French cities, but not about French bricklayers. About French sidewalks, but not about the French policemen who keep them safe. As far as I know, there are no songs about OTN, but if there were, they would probably neglect to mention Steph. Which is why today, Dec 3rd, we celebrate Stephanie Choyer Day. We dedicate this day to our relentless, hardworking, tireless, patient and friendly French colleague with the delightful accent. If I knew how to speak French, I'd say "Thanks for all you do" in French. But I don't speak French. And I don't trust online translations. I'd probably wind up saying "My left foot yearns for curdled milk." So here it is in plain old English: Thank you, Stephanie.

    psssst! about that documentation and those white papers ... In case you haven't noticed, the Oracle Solaris doc team has done some pretty cool things with the Solaris docs. And those white papers are interesting reading, well worth setting aside some time. Because with Solaris, as you know, it's not just about getting by with a rudimentary grasp of the basics. It's about the amazing stuff savvy sysadmins and developers can do when they really understand it. Find them here:

    White Papers
    Documentation

    And don't forget training!

    - Rick
    Website Newsletter Facebook Twitter

    Read the article

  • SQLAuthority News – Learning Never Ends – Becoming Student Again

    - by pinaldave
    From my past few blog posts you may see a pattern – learning.  I finished my own college education a few years ago, but I firmly believe that learning should never stop.  We can learn on the job, or from outside reading, but we should always try to be learning new things.  It keeps the brain sharp!  In fact, I often find myself learning new things from reviewing old material.  If you have been reading my blog lately, you will recognize the name Koenig Solutions. You might also be rolling your eyes at me and my enthusiasm for learning and training.  College was hard work, why continue it?  Didn't we all get educations so that we could get jobs and go on vacation? Of course, having a job means that you cannot take vacations all the time.  I have often asked my friend who owns Koenig, jokingly, when he is going to open a Koenig center in Bangalore. I relocated to Bangalore 1.5 years ago, so I wanted a center I could walk to anytime.  Last week I was very happy to hear that they have opened a center in Bangalore.

    Pinal Dave at Friend's Company

    I could not let a new center open without visiting it and congratulating my friend, so I recently stopped by.  I was immediately taken by the desire to go back to "school" and learn something new.  I have signed up for a continuing education course through the new Koenig center, and here is the exciting part: I will be blogging about it so that you all can be inspired to learn, too!  Keep checking back here for further updates and blog posts about my learning experience. However, where is the fun in attending a session in the town where you stay? I did indeed visit their center in Bangalore, but I have opted to take the course in another city. Well, more information about that in the near future.

    Pinal Dave is going to be a student again

    Honestly, why not learn new things and become more confident?  When we have more education we become better at our jobs, which can lead to more confidence and efficiency, but it may also have more tangible rewards – like a raise or promotion.  We don't always have to focus on shallow rewards like money and recognition, so think about how much more you will enjoy your work when you know more about it.  Koenig is offering training for new certificates in SQL Server 2012, and I am planning on investigating these for sure. I feel good that I am going to be a student again and will be learning new stuff. As I said, I will blog my experience as I go. I hope that my continuing education blog posts will inspire you, my readers, to go out and learn more.  I am serious about my education, and my goal is to prove how serious I am here, on my blog. I am a big fan of learning and sharing, and I hope this series will inspire you to learn new technology which can help you progress in your career and help balance your life with work.

    Note: This blog post is about what inspired me to sign up for a learning course. Being a student should be the attitude of a lifetime. This post is not about a career change.

    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • MacGyver Moments

    - by Geoff N. Hiten
    Denny Cherry tagged me to write about my best MacGyver Moment.  Usually I ignore blogosphere fluff and just use this space to write what I think is important.  However, #MVP10 just ended and I have a stronger sense of community.  Besides, where else would I mention that my second-best MacGyver moment was making a BIOS jumper out of a soda can?  Aluminum is conductive and I didn't have any real jumpers lying around. My best moment is probably my entire home computer network.  Every system but one is hand-built, usually cobbled together out of spare parts and 'adapted' from its original purpose. My Primary Domain Controller is a Dell 2300.   The Service Tag indicates it was shipped to the original owner in 1999.  The box has a PERC/1 RAID controller.  I acquired this from a previous employer for $50.  It runs Windows Server 2003 Enterprise Edition, and does DNS, DHCP, and RADIUS services as a bonus.  RADIUS authentication is used for VPN and wireless access.  It is nice to sign in once and be done with it. The Secondary Domain Controller is an old desktop: dual P-III 933 with some extra drives. My VPN box is a P-II 250 with 384MB of RAM and a 21 GB hard drive.  I did a P-to-V to my Hyper-V box a year or so ago and retired the hardware again.  Dynamic DNS lets me connect no matter how often Comcast shuffles my IP. The Hyper-V box is a desktop system with 8GB RAM and an AMD Athlon 5000+ processor.  It cost me less than $500 to put together nearly two years ago.  I reasoned that if Vista and Windows 2008 were the same code, then Vista 64-bit certification meant the drivers for Vista would load into Windows 2008.  Turns out I was right. Later I added three 1TB drives but wasn't too happy with how that turned out.  I recovered two of the drives yesterday and am building an iSCSI storage unit. (Many thanks to StarWind.  Great product.)  I am using an old AMD 1.1GHz box with 1.5 GB RAM (cobbled together from three old PCs) as my storage server.  The Hyper-V box is slated for an OS rebuild to 2008 R2 once I get the storage system worked out.  Maybe in a week or two. A couple of D-Link Gigabit switches tie everything together. Add in the Vonage box, the three PCs, the Wireless-N access point, the two notebooks and the Xbox, and you have gone from MacGyver to darn near Rube Goldberg. The only thing I really spend money on is power supplies and fans.  I buy top-of-the-line for both. I even pull and crimp my own cables. Oh, and if my kids hose up a PC, I have all of their data on a server elsewhere.  Every PC and laptop is pretty much interchangeable for email and basic workstation tasks.  That helps a lot too. Of course I will tag SQLVariant.

    Read the article

  • Surface V2.0

    - by Dennis Vroegop
    It’s been quiet around here. And the reason for that is that it’s been quiet around Surface for a while. Now, a lot of people assume that when a product team isn’t making much noise, it must mean they have stopped working on their product. Remember the PDC keynote in 2010? Just because they didn’t mention WPF there, a lot of people got the idea that WPF was dead and abandoned for Silverlight. Of course, this couldn’t be farther from the truth. The same applies to Surface. While we didn’t hear much from the team in Redmond, they were busy putting together the next version of the platform. And at CES in January the world saw what they had been up to all along: Surface V2.0, as it’s commonly known. Of course, the product is still in development. It’s not here yet; we can’t buy one yet. However, more and more information is becoming available, and I think this is a good time to share with you what it’s all about! The biggest change from an organizational point of view is that Microsoft decided to stop producing the hardware themselves. Instead, they have formed a partnership with Samsung, who will manufacture the devices. This means that you as a buyer get the benefits of a large, worldwide supplier with all the services they can offer. Not that Microsoft didn’t do that before, but since Surface wasn’t a ‘big’ product it was sometimes hard to get to the right people. The new device is officially called the “Samsung SUR40 for Microsoft Surface”, which is quite a mouthful. The software that runs the device is of course still coming from Microsoft. Let’s dive into the technical specs (note: all of this is preliminary, it’s still in the Alpha phase!):

    Audio out: HDMI / Stereo RCA / SPDIF / 2x 3.5mm audio out jack
    Brightness: 300 cd/m2
    Communications: 1Gb Ethernet / 802.11 / Bluetooth
    Contrast ratio: 1:1000
    CPU: AMD Athlon X2 245e 2.9GHz dual core
    Display resolution: Full HD 1080p 1920x1080 / 16:9 aspect ratio
    GPU: AMD Radeon HD 6750, 1GB GDDR5
    HDD: 320 GB / 7200 RPM
    HDMI in / HDMI out: Yes
    I/O ports: 4x USB, SD card reader
    Operating system: Embedded Windows 7 Professional 64-bit
    Panel size: 40” diagonal
    Protection glass: Gorilla Glass
    RAM: 4 GB DDR3
    Weight (with standard legs): 70.0 kg / 154 lbs
    Weight (standalone): 39.5 kg / 87 lbs
    Height (without legs): 4 inches
    Contact points recognized: > 50
    Cool factor: Extremely

    Ok, the last point is not official, but I do think it needs to be there. Let’s talk software. As noted, it runs Windows 7 Professional 64-bit, which means you can run Visual Studio 2010 on it. The software is going to be developed in WPF 4.0 with the additional Surface SDK 2.0. It will contain all the things you’ve seen before, plus some extras. They have taken some steps to align it more with the Surface Toolkit, which you can download today, so if you do things right your software should be portable between a WPF 4.0 Windows 7 multi-touch app and the Surface v2 environment. It still uses infrared to detect contacts, so in that respect nothing much has changed conceptually. We can still differentiate between a finger, a tag, or a blob. Of course, since the new platform has a much higher resolution (compared to the 1024x768 of the first version) you might need to look at your code again. I’ve seen a lot of applications on Surface that assume the old resolution, and moving those to V2 is going to be some work. To be honest: as I am under NDA I cannot disclose much about the new software besides what I have told you here, but trust me: it’s going to blow people away.
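    To make that portability point concrete, here is a minimal sketch of mine (not official Surface sample code) that uses only the plain WPF 4.0 touch API; an app written against these events is the kind that should carry over to the Surface v2 environment, with the Surface SDK layering the contact-type information (finger, tag, blob) on top:

        using System.Diagnostics;
        using System.Windows;
        using System.Windows.Input;

        public class TouchDemoWindow : Window
        {
            public TouchDemoWindow()
            {
                // Plain WPF 4.0 touch events; no Surface-specific types involved.
                this.TouchDown += OnTouchDown;
            }

            private void OnTouchDown(object sender, TouchEventArgs e)
            {
                // Each contact gets its own TouchDevice with a stable Id,
                // so simultaneous fingers can be told apart.
                TouchPoint point = e.GetTouchPoint(this);
                Debug.WriteLine(string.Format(
                    "Touch {0} down at {1}", e.TouchDevice.Id, point.Position));
            }
        }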
Now, the biggest question for me is: when can I get one? Until we can, have a look here: Tags van Technorati: surface,samsung,WPF

    Read the article

  • Application Performance Episode 2: Announcing the Judges!

    - by Michaela Murray
    The story so far… We’re writing a new book for ASP.NET developers, and we want you to be a part of it! If you work with ASP.NET applications, and have top tips, hard-won lessons, or sage advice for avoiding, finding, and fixing performance problems, we want to hear from you! And if your app uses SQL Server, even better – interaction with the database is critical to application performance, so we’re looking for database top tips too. There’s a Microsoft Surface apiece for the person who comes up with the best tip for SQL Server and the best tip for .NET. Of course, if your suggestion is selected for the book, you’ll get full credit, by name, Twitter handle, GitHub repository, or whatever you like. To get involved, just email your nuggets of performance wisdom to [email protected] – there are examples of what we’re looking for and full competition details at Application Performance: The Best of the Web. Enter the judges… As mentioned in my last blogpost, we have a mystery panel of celebrity judges lined up to select the prize-winning performance pointers. We’re now ready to reveal their secret identities! Judging your ASP.NET tips will be:

    Jean-Philippe Gouigoux, MCTS/MCPD Enterprise Architect and MVP Connected System Developer. He’s a board member at French software company MGDIS, and teaches algorithms, security, software tests, and ALM at the Université de Bretagne Sud. Jean-Philippe also lectures at IT conferences and writes articles for programming magazines. His book Practical Performance Profiling is published by Simple-Talk.

    Nik Molnar, a New Yorker, ASP Insider, and co-founder of Glimpse, an open source ASP.NET diagnostics and debugging tool. Originally from Florida, Nik specializes in web development, building scalable, client-centric solutions. In his spare time, Nik can be found cooking up a storm in the kitchen, hanging with his wife, speaking at conferences, and working on other open source projects.

    Mitchel Sellers, Microsoft C# and DotNetNuke MVP. Mitchel is an experienced software architect, business leader, public speaker, and educator. He works with companies across the globe, as CEO of IowaComputerGurus Inc. Mitchel writes technical articles for online and print publications and is the author of Professional DotNetNuke Module Programming. He frequently answers questions on StackOverflow and MSDN and is an active participant in the .NET and DotNetNuke communities.

    Clive Tong, Software Engineer at Red Gate. In previous roles, Clive spent a lot of time working with Common LISP and enthusing about functional languages, and he’s worked with managed languages since before his first real job (which was a long time ago). Long convinced of the productivity benefits of managed languages, Clive is very interested in getting good runtime performance to keep managed languages practical for real-world development.

    And our trio of SQL Server specialists, ready to select your top suggestion, are (drumroll):

    Rodney Landrum, a SQL Server MVP who writes regularly about Integration Services, Analysis Services, and Reporting Services. He’s authored SQL Server Tacklebox, three Reporting Services books, and contributes regularly to SQLServerCentral, SQL Server Magazine, and Simple–Talk. His day job involves overseeing a large SQL Server infrastructure in Orlando.

    Grant Fritchey, Product Evangelist at Red Gate and SQL Server MVP. In an IT career spanning more than 20 years, Grant has written VB, VB.NET, C#, and Java. He’s been working with SQL Server since version 6.0.
Grant volunteers with the Editorial Committee at PASS and has written books for Apress and Simple-Talk. Jonathan Allen, leader and founder of the PASS SQL South West user group. He’s been working with SQL Server since 1999 and enjoys performance tuning, development, and using SQL Server for business solutions. He’s spoken at SQLBits and SQL in the City, as well as local user groups across the UK. He’s also a moderator at ask.sqlservercentral.com.

    Read the article

  • I don't receive mail notifications... Nagios Core 4

    - by alessio
    I have a problem with automatic mail notifications in Nagios Core 4, installed on an Ubuntu 12.04 LTS server... I have tried to send mail as the nagios user and as the root user with the command:

        echo "test" | mail -s "test mail" [email protected]

    and I received the mail correctly... but I don't receive any automatic mail notifications... I don't know what to do to resolve this issue! :( These are my configuration files (commands.cfg, contacts.cfg, nagios.log, mail.log):

    commands.cfg (the path /usr/bin/mail is the right path):

        # 'notify-host-by-email' command definition
        define command{
            command_name    notify-host-by-email
            command_line    /usr/bin/printf "%b" "***** Nagios *****\n\nNotification Type: $NOTIFICATIONTYPE$\nHost: $HOSTNAME$\nState: $HOSTSTATE$\nAddress: $HOSTADDRESS$\nInfo: $HOSTOUTPUT$\n\nDate/Time: $LONGDATETIME$\n" | /usr/bin/mail -s "** $NOTIFICATIONTYPE$ Host Alert: $HOSTNAME$ is $HOSTSTATE$ **" $CONTACTEMAIL$
            }

        # 'notify-service-by-email' command definition
        define command{
            command_name    notify-service-by-email
            command_line    /usr/bin/printf "%b" "***** Nagios *****\n\nNotification Type: $NOTIFICATIONTYPE$\n\nService: $SERVICEDESC$\nHost: $HOSTALIAS$\nAddress: $HOSTADDRESS$\nState: $SERVICESTATE$\n\nDate/Time: $LONGDATETIME$\n\nAdditional Info:\n\n$SERVICEOUTPUT$\n" | /usr/bin/mail -s "** $NOTIFICATIONTYPE$ Service Alert: $HOSTALIAS$/$SERVICEDESC$ is $SERVICESTATE$ **" $CONTACTEMAIL$
            }

        # 'process-host-perfdata' command definition
        define command{
            command_name    process-host-perfdata
            command_line    /usr/bin/printf "%b" "$LASTHOSTCHECK$\t$HOSTNAME$\t$HOSTSTATE$\t$HOSTATTEMPT$\t$HOSTSTATETYPE$\t$HOSTEXECUTIONTIME$\t$HOSTOUTPUT$\t$HOSTPERFDATA$\n" >> /usr/local/nagios/var/host-perfdata.out
            }

        # 'process-service-perfdata' command definition
        define command{
            command_name    process-service-perfdata
            command_line    /usr/bin/printf "%b" "$LASTSERVICECHECK$\t$HOSTNAME$\t$SERVICEDESC$\t$SERVICESTATE$\t$SERVICEATTEMPT$\t$SERVICESTATETYPE$\t$SERVICEEXECUTIONTIME$\t$SERVICELATENCY$\t$SERVICEOUTPUT$\t$SERVICEPERFDATA$\n" >> /usr/local/nagios/var/service-perfdata.out
            }

    contacts.cfg:

        define contact{
            contact_name                    supporto
            alias                           Supporto Clienti DEA
            service_notification_period     24x7
            host_notification_period        24x7
            service_notification_options    w,u,c,r
            host_notification_options       d,r
            service_notification_commands   notify-service-by-email
            host_notification_commands      notify-host-by-email
            email                           [email protected]
            }

        define contactgroup{
            contactgroup_name       admins
            alias                   Nagios Administrators
            members                 supporto
            }

    nagios.log:

        [1401871412] SERVICE ALERT: fileserver;Current Users;OK;SOFT;2;USERS OK - 1 users currently logged in
        [1401871953] SERVICE ALERT: backups;Nagios Status;WARNING;SOFT;1;NAGIOS WARNING: 36 processes, status log updated 541 seconds ago
        [1401872133] SERVICE ALERT: backups;Nagios Status;OK;SOFT;2;NAGIOS OK: 36 processes, status log updated 180 seconds ago
        [1401872321] SERVICE ALERT: posta;Swap Usage;CRITICAL;SOFT;1;CRITICAL - Plugin timed out after 10 seconds
        [1401872322] SERVICE ALERT: fileserver;Current Users;CRITICAL;SOFT;1;CRITICAL - Plugin timed out after 10 seconds
        [1401872420] SERVICE ALERT: archivio;Disk Space;CRITICAL;SOFT;1;CRITICAL - Plugin timed out after 10 seconds
        [1401872492] SERVICE ALERT: fileserver;Current Users;OK;SOFT;2;USERS OK - 1 users currently logged in
        [1401872492] SERVICE ALERT: posta;Swap Usage;OK;SOFT;2;SWAP OK: 100% free (1984 MB out of 1984 MB)
        [1401872590] SERVICE ALERT: archivio;Disk Space;OK;SOFT;2;DISK OK
        [1401872931] Auto-save of retention data completed successfully.
        [1401873333] SERVICE ALERT: backups;Nagios Status;WARNING;SOFT;1;NAGIOS WARNING: 36 processes, status log updated 402 seconds ago
        [1401873513] SERVICE ALERT: backups;Nagios Status;OK;SOFT;2;NAGIOS OK: 36 processes, status log updated 180 seconds ago

    mail.log (I think the problem is here but I don't know how to resolve it):

        Jun 4 10:00:01 backups sm-msp-queue[6109]: My unqualified host name (backups) unknown; sleeping for retry
        Jun 4 10:01:01 backups sm-msp-queue[6109]: unable to qualify my own domain name (backups) -- using short name
        Jun 4 10:20:01 backups sm-msp-queue[7247]: My unqualified host name (backups) unknown; sleeping for retry
        Jun 4 10:21:01 backups sm-msp-queue[7247]: unable to qualify my own domain name (backups) -- using short name
        Jun 4 10:40:01 backups sm-msp-queue[8327]: My unqualified host name (backups) unknown; sleeping for retry
        Jun 4 10:41:01 backups sm-msp-queue[8327]: unable to qualify my own domain name (backups) -- using short name
        Jun 4 11:00:01 backups sm-msp-queue[9549]: My unqualified host name (backups) unknown; sleeping for retry
        Jun 4 11:01:01 backups sm-msp-queue[9549]: unable to qualify my own domain name (backups) -- using short name
        Jun 4 11:20:01 backups sm-msp-queue[10678]: My unqualified host name (backups) unknown; sleeping for retry
        Jun 4 11:21:01 backups sm-msp-queue[10678]: unable to qualify my own domain name (backups) -- using short name

    I'm at the last step and I want to finish this Nagios Core setup! :) Any help is appreciated! :)

    Host definition (this host has its disk almost full and is in a hard state, but no notification is sent):

        define host{
            use                 generic-host    ; Name of host template to use
            host_name           posta
            alias               Server Posta ESA
            address             10.10.2.102
            parents             xen1, xen2
            icon_image          redhat.png
            statusmap_image     redhat.gd2
            }

    Service definition:

        define service{
            use                     generic-service
            host_name               xen1, maestro, xen2, posta, nas002, serv2, esasrvmi02, esaubuntumi
            service_description     Disk Space
            check_command           ssh_all_disks!10%!5%
            }

    "Notification is allowed for the contact definition you gave, but is it also allowed at the service level?" Sorry, but I don't understand this thing! :(
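    For what it's worth, those sm-msp-queue lines are the classic symptom of sendmail being unable to work out the machine's fully qualified domain name, which can leave locally submitted mail (including Nagios notifications) stuck in the queue. A minimal sketch of the usual fix, assuming the host is called backups and using example.com as a placeholder for the real domain:

        # /etc/hosts -- list the fully qualified name first, then the short name
        127.0.0.1    localhost
        127.0.1.1    backups.example.com    backups

        # then restart sendmail so it picks up the new name
        sudo service sendmail restart

    With the FQDN resolvable, the "unable to qualify my own domain name" warnings should stop and queued notifications should start flowing; it is also worth confirming that notifications_enabled and suitable notification_options are set on the generic-host and generic-service templates.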

    Read the article

  • Database Mirroring on SQL Server Express Edition

    - by Most Valuable Yak (Rob Volk)
    Like most SQL Server users I'm rather frustrated by Microsoft's insistence on making the really cool features only available in Enterprise Edition.  And it really doesn't help that they changed the licensing for SQL 2012 to be core-based, so now it's like 4 times as expensive!  It almost makes you want to go with Oracle.  That, and a desire to have Larry Ellison do things to your orifices. And since they've introduced Availability Groups, and marked database mirroring as deprecated, you'd think they'd make mirroring available in all editions.  Alas…they don't…officially anyway.  Thanks to my constant poking around in places I'm not "supposed" to, I've discovered the low-level code that implements database mirroring, and found that it's available in all editions! It turns out that the query processor in all SQL Server editions prepends a simple check before every edition-specific DDL statement:

        IF CAST(SERVERPROPERTY('Edition') as nvarchar(max)) NOT LIKE '%e%e%e% Edition%'
            print 'Lame'
        else
            print 'Cool'

    If that statement returns true, it fails. (The print statements are just placeholders.)  Go ahead and test it on Standard, Workgroup, and Express editions compared to an Enterprise or Developer edition instance (which support everything). Once again thanks to Argenis Fernandez (b | t) and his awesome sessions on using Sysinternals, I was able to watch the exact process SQL Server performs when setting up a mirror.  Surprisingly, it's not actually implemented in SQL Server!  Some of it is, but that's something of a smokescreen; the real meat of it is simple filesystem primitives. The NTFS filesystem supports links, both hard and symbolic, so that you can create two entries for the same file in different directories and/or under different names.  You can create them using the MKLINK command in a command prompt:

        mklink /D D:\SkyDrive\Data D:\Data
        mklink /D D:\SkyDrive\Log D:\Log

    This creates a symbolic link from my data and log folders to my SkyDrive folder.  Any file saved in either location will instantly appear in the other.  And since my SkyDrive will be automatically synchronized with the cloud, any changes I make will be copied instantly (depending on my internet bandwidth of course). So what does this have to do with database mirroring?  Well, it seems that the mirroring endpoint that you have to create between mirror and principal servers is really nothing more than a SkyDrive link.  Although it doesn't actually use SkyDrive, it performs the same function.  So in effect, the following statement:

        ALTER DATABASE Mir SET PARTNER='TCP://MyOtherServer.domain.com:5022'

    is turned into:

        mklink /D "D:\Data" "\\MyOtherServer.domain.com\5022$"

    The 5022$ "port" is actually a hidden system directory on the principal and mirror servers. I haven't quite figured out how the log files are included in this, or why you have to SET PARTNER on both principal and mirror servers, except maybe that mklink has to do something special when linking across servers.  I couldn't get the above statement to work correctly, but found that doing mklink to a local SkyDrive folder gave me similar functionality.
    To wrap this up, all you have to do is the following:

    Install SkyDrive on both SQL Servers (principal and mirror) and set the local SkyDrive folder (D:\SkyDrive in these examples).
    On the principal server, run mklink /D on the data and log folders to point to SkyDrive: mklink /D D:\SkyDrive\Data D:\Data
    On the mirror server, run the complementary linking: mklink /D D:\Data D:\SkyDrive\Data
    Create your database and make sure the files map to the principal data and log folders (D:\Data and D:\Log).

    Voilà! Your databases are kept in sync on multiple servers! One wrinkle you will encounter is that the mirror server will show the data and log files, but you won't be able to attach them to the mirror SQL instance while they are attached to the principal. I think this is a bug in SkyDrive, but as it turns out that's fine: you can't access a mirror while it's hosted on the principal either.  So you don't quite get automatic failover, but you can attach the files to the mirror if the principal goes offline.  It's also not exactly synchronous, but it's better than nothing, and easier than either replication or log shipping, with a lot less latency. I will end this with the obvious "not supported by Microsoft" and "Don't do this in production without an updated resume" spiel that you should by now assume with every one of my blog posts, especially considering the date.

    Read the article

< Previous Page | 459 460 461 462 463 464 465 466 467 468 469 470  | Next Page >