Search Results



  • Windows Azure Recipe: Consumer Portal

    - by Clint Edmonson
    Nearly every company on the internet has a web presence. Many are merely using theirs for informational purposes. More sophisticated portals allow customers to register their contact information and provide some level of interaction or customer support. But as our understanding of how consumers use the web increases, the more progressive companies are taking advantage of the social web and rich media delivery to connect at a deeper level with the consumers of their goods and services.
    Drivers: cost reduction, scalability, global distribution, time to market.
    Solution: Here’s a sketch of how a Windows Azure Consumer Portal might be built out.
    Ingredients
    Web Role – this will host the core of the solution. Each web role is a virtual machine hosting an application written in ASP.NET (or optionally PHP or node.js). The number of web roles can be scaled up or down as needed to handle peak and non-peak traffic loads.
    Database – every modern web application needs to store data. SQL Azure databases look and act exactly like their on-premise siblings but are fault tolerant and have data redundancy built in.
    Access Control (optional) – if identity needs to be tracked within the solution, the access control service combined with the Windows Identity Foundation framework provides out-of-the-box support for several social media platforms including Windows LiveID, Google, Yahoo!, and Facebook. It also has a provider model to allow integration with other platforms as well.
    Caching (optional) – for sites with high traffic and lots of read-only data and lists, the distributed in-memory caching service can be used to cache and serve up static data at higher scale and speed than direct database requests. It can also be used to manage user session state.
    Blob Storage (optional) – for sites that serve up unstructured data such as documents, video, audio, device drivers, and more. The data is highly available and stored redundantly across data centers. Each entry in blob storage is provided with its own unique URL for direct access by the browser.
    Content Delivery Network (CDN) (optional) – for sites that serve users around the globe, the CDN is an extension to blob storage that, when enabled, will automatically cache frequently accessed blobs and static site content at edge data centers around the world. The data can be delivered statically or streamed in the case of rich media content.
    Training Labs
    These links point to online Windows Azure training labs where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.)
    Windows Azure (16 labs) – Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services which can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.
    SQL Azure (7 labs) – Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
    Windows Azure Services (9 labs) – As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.
    See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.
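    As a rough illustration of the blob storage ingredient, here is a minimal C# sketch (not from the original recipe) that uploads a file and returns its unique URL. It assumes the 1.x StorageClient library that shipped with the Windows Azure SDK and a storage connection string supplied by the caller; names like MediaPublisher and the "downloads" container are invented for the example.
        using System;
        using Microsoft.WindowsAzure;
        using Microsoft.WindowsAzure.StorageClient;

        public static class MediaPublisher
        {
            // Uploads a local file into a public container and returns the blob's
            // unique URL, which the browser (or the CDN) can request directly.
            public static Uri PublishDownload(string connectionString, string localPath, string blobName)
            {
                CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
                CloudBlobClient client = account.CreateCloudBlobClient();

                CloudBlobContainer container = client.GetContainerReference("downloads");
                container.CreateIfNotExist();
                container.SetPermissions(new BlobContainerPermissions
                {
                    PublicAccess = BlobContainerPublicAccessType.Blob // blobs readable by anyone with the URL
                });

                CloudBlob blob = container.GetBlobReference(blobName);
                blob.UploadFile(localPath);
                return blob.Uri;
            }
        }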

    Read the article

  • Create a Remote Git Repository from an Existing XCode Repository

    - by codeWithoutFear
    Introduction
    Distributed version control systems (VCSs), like Git, provide a rich set of features for managing source code.  Many development tools, including XCode, provide built-in support for various VCSs.  These tools provide simple configuration with limited customization to get you up and running quickly while still providing the safety net of basic version control.
    I hate losing (and re-doing) work.  I have OCD when it comes to saving and versioning source code.  Save early, save often, and commit to the VCS often.  I also hate merging code.  Smaller and more frequent commits enable me to minimize merge time and effort as well.
    The workflow I prefer even for personal exploratory projects is:
    1. Make small local changes to the codebase to create an incrementally improved (and working) system.
    2. Commit these changes to the local repository.  Local repositories are quick to access, function even while offline, and provide the confidence to continue making bold changes to the system.  After all, I can easily recover to a recent working state.
    3. Repeat 1 & 2 until the codebase contains “significant” functionality and I have connectivity to the remote repository.
    4. Push the accumulated changes to the remote repository.  The smaller the change set, the less likely extensive merging will be required.  Smaller is better, IMHO.
    The remote repository typically has a greater degree of fault tolerance and active management dedicated to it.  This can be as simple as a network share that is backed up nightly or as complex as dedicated hardware with specialized server-side processing and significant administrative monitoring.
    XCode’s out-of-the-box Git integration enables steps 1 and 2 above.  Time Machine backups of the local repository add an additional degree of fault tolerance, but do not support collaboration or take advantage of managed infrastructure such as on-premises or cloud-based storage.
    Creating a Remote Repository
    These are the steps I use to enable the full workflow identified above.  For simplicity the “remote” repository is created on the local file system.  This location could easily be on a mounted network volume.
    Create a Test Project
    My project is called HelloGit and is located at /Users/Don/Dev/HelloGit.  Be sure to commit all outstanding changes.  XCode always leaves a single changed file for me after the project is created and the initial commit is submitted.
    Clone the Local Repository
    We want to clone the XCode-created Git repository to the location where the remote repository will reside.  In this case it will be /Users/Don/Dev/RemoteHelloGit.
    Open the Terminal application.
    Clone the local repository to the remote repository location: git clone /Users/Don/Dev/HelloGit /Users/Don/Dev/RemoteHelloGit
    Convert the Remote Repository to a Bare Repository
    The remote repository only needs to contain the Git database.  It does not need a checked out branch or local files.
    Go to the remote repository folder: cd /Users/Don/Dev/RemoteHelloGit
    Indicate the repository is “bare”: git config --bool core.bare true
    Remove files, leaving the .git folder: rm -R *
    Remove the “origin” remote: git remote rm origin
    Configure the Local Repository
    The local repository should reference the remote repository.  The remote name “origin” is used by convention to indicate the originating repository.  This is set automatically when a repository is cloned.  We will use the “origin” name here to reflect that relationship.
    Go to the local repository folder: cd /Users/Don/Dev/HelloGit
    Add the remote: git remote add origin /Users/Don/Dev/RemoteHelloGit
    Test Connectivity
    Any changes made to the local Git repository can be pushed to the remote repository subject to the merging rules Git enforces.
    Create a new local file: date > date.txt
    Add the new file to the local index: git add date.txt
    Commit the change to the local repository: git commit -m "New file: date.txt"
    Push the change to the remote repository: git push origin master
    Now you can save, commit, and push/pull to your OCD heart’s content! Code without fear!
    --Don

    Read the article

  • Retrieving a list of eBay categories using the .NET SDK and GetCategoriesCall

    - by Bill Osuch
    eBay offers a .Net SDK for its Trading API - this post will show you the basics of making an API call and retrieving a list of current categories. You'll need the category ID(s) for any apps that post or search eBay. To start, download the latest SDK from https://www.x.com/developers/ebay/documentation-tools/sdks/dotnet and create a new console app project. Add a reference to the eBay.Service DLL, and a few using statements: using eBay.Service.Call; using eBay.Service.Core.Sdk; using eBay.Service.Core.Soap; I'm assuming at this point you've already joined the eBay Developer Network and gotten your app IDs and user tokens. If not: Join the developer program Generate tokens Next, add an app.config file that looks like this: <?xml version="1.0"?> <configuration>   <appSettings>     <add key="Environment.ApiServerUrl" value="https://api.ebay.com/wsapi"/>     <add key="UserAccount.ApiToken" value="YourBigLongToken"/>   </appSettings> </configuration> And then add the code to get the xml list of categories: ApiContext apiContext = GetApiContext(); GetCategoriesCall apiCall = new GetCategoriesCall(apiContext); apiCall.CategorySiteID = "0"; //Leave this commented out to retrieve all category levels (all the way down): //apiCall.LevelLimit = 4; //Uncomment this to begin at a specific parent category: //StringCollection parentCategories = new StringCollection(); //parentCategories.Add("63"); //apiCall.CategoryParent = parentCategories; apiCall.DetailLevelList.Add(DetailLevelCodeType.ReturnAll); CategoryTypeCollection cats = apiCall.GetCategories(); using (StreamWriter outfile = new StreamWriter(@"C:\Temp\EbayCategories.xml")) {    outfile.Write(apiCall.SoapResponse); } GetApiContext() (provided in the sample apps in the SDK) is required for any call:         static ApiContext GetApiContext()         {             //apiContext is a singleton,             //to avoid duplicate configuration reading             if (apiContext != null)             {                 return apiContext;             }             else             {                 apiContext = new ApiContext();                 //set Api Server Url                 apiContext.SoapApiServerUrl = ConfigurationManager.AppSettings["Environment.ApiServerUrl"];                 //set Api Token to access eBay Api Server                 ApiCredential apiCredential = new ApiCredential();                 apiCredential.eBayToken = ConfigurationManager.AppSettings["UserAccount.ApiToken"];                 apiContext.ApiCredential = apiCredential;                 //set eBay Site target to US                 apiContext.Site = SiteCodeType.US;                 return apiContext;             }         } Running this will give you a large (4 or 5 megs) XML file that looks something like this: <soapenv:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">    <soapenv:Body>       <GetCategoriesResponse >          <Timestamp>2012-06-06T16:03:46.158Z</Timestamp>          <Ack>Success</Ack>          <CorrelationID>d02dd9e3-295a-4268-9ea5-554eeb2e0e18</CorrelationID>          <Version>775</Version>          <Build>E775_CORE_BUNDLED_14891042_R1</Build> -          <CategoryArray>             <Category>                <BestOfferEnabled>true</BestOfferEnabled>                <AutoPayEnabled>true</AutoPayEnabled>                <CategoryID>20081</CategoryID>                <CategoryLevel>1</CategoryLevel>                <CategoryName>Antiques</CategoryName>   
                <CategoryParentID>20081</CategoryParentID>
             </Category>
             <Category>
                <BestOfferEnabled>true</BestOfferEnabled>
                <AutoPayEnabled>true</AutoPayEnabled>
                <CategoryID>37903</CategoryID>
                <CategoryLevel>2</CategoryLevel>
                <CategoryName>Antiquities</CategoryName>
                <CategoryParentID>20081</CategoryParentID>
             </Category>
    (etc.)
    You could work with this, but I wanted a nicely nested view, like this:
    <CategoryArray>
       <Category Name='Antiques' ID='20081' Level='1'>
          <Category Name='Antiquities' ID='37903' Level='2'/>
       </Category>
    </CategoryArray>
    ...so I transformed the XML:
    private void TransformXML(CategoryTypeCollection cats)
    {
        XmlElement topLevelElement = null;
        XmlElement childLevelElement = null;
        XmlNode parentNode = null;
        string categoryString = "";
        XmlDocument returnDoc = new XmlDocument();
        XmlElement root = returnDoc.CreateElement("CategoryArray");
        returnDoc.AppendChild(root);
        XmlNode rootNode = returnDoc.SelectSingleNode("/CategoryArray");
        //Loop through CategoryTypeCollection
        foreach (CategoryType category in cats)
        {
            if (category.CategoryLevel == 1)
            {
                //Top-level category, so we know we can just add it
                topLevelElement = returnDoc.CreateElement("Category");
                topLevelElement.SetAttribute("Name", category.CategoryName);
                topLevelElement.SetAttribute("ID", category.CategoryID);
                rootNode.AppendChild(topLevelElement);
            }
            else
            {
                // Level number will determine how many Category nodes we are deep
                categoryString = "";
                for (int x = 1; x < category.CategoryLevel; x++)
                {
                    categoryString += "/Category";
                }
                parentNode = returnDoc.SelectSingleNode("/CategoryArray" + categoryString + "[@ID='" + category.CategoryParentID[0] + "']");
                childLevelElement = returnDoc.CreateElement("Category");
                childLevelElement.SetAttribute("Name", category.CategoryName);
                childLevelElement.SetAttribute("ID", category.CategoryID);
                parentNode.AppendChild(childLevelElement);
            }
        }
        returnDoc.Save(@"C:\Temp\EbayCategories-Modified.xml");
    }
    Yes, there are probably much cleaner ways of dealing with it, but I'm not an XML expert… Keep in mind, eBay categories do not change on a regular basis, so you should be able to cache this data (either in a file or database) for some time. The XML returns a CategoryVersion node that you can use to determine if the category list has changed. Technorati Tags: Csharp, eBay
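    To act on that caching note, here is a small illustrative sketch (not from the original post) that checks whether a previously saved dump is still current before re-downloading the full tree. The file name and the idea of reading CategoryVersion back out of the saved SOAP response are assumptions made for the example.
        using System.IO;
        using System.Xml;

        static bool CategoryCacheIsCurrent(string cachedFile, string latestVersion)
        {
            // No cache yet, so a full download is needed.
            if (!File.Exists(cachedFile))
                return false;

            XmlDocument doc = new XmlDocument();
            doc.Load(cachedFile);

            // Use local-name() so the SOAP namespaces in the saved response can be ignored.
            XmlNode versionNode = doc.SelectSingleNode("//*[local-name()='CategoryVersion']");
            return versionNode != null && versionNode.InnerText == latestVersion;
        }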

    Read the article

  • The Dispose Pattern (and FxCop warnings)

    - by Scott Dorman
    [This is actually a response to Bill’s blog post, but since it isn’t possible to leave this as a comment on his blog it’s a post here.]
    There are many different ways to implement the Dispose pattern correctly. Some are (in my opinion) better than others. In Bill’s blog post he presents a particular pattern, which is an excerpt from his book (Effective C#). The issue centers around the fact that a reader took the code sample presented in the book and ran FxCop (Code Analysis) on it, which generated a warning: “Ensure that base.Dispose() is always called.” The “lesson learned” that Bill presents is that “tools are there to help us, not control us.” While I completely agree with the belief that tools are there to help us, I think it’s important to understand why FxCop is raising this particular warning. The code presented in Bill’s book looks like:
    // Have its own disposed flag.
    private bool disposed = false;
    protected override void Dispose(bool isDisposing)
    {
        // Don't dispose more than once.
        if (disposed) return;
        if (isDisposing)
        {
            // TODO: free managed resources here.
        }
        // TODO: free unmanaged resources here.
        // Let the base class free its resources.
        // Base class is responsible for calling
        // GC.SuppressFinalize( )
        base.Dispose(isDisposing);
        // Set derived class disposed flag:
        disposed = true;
    }
    This code does follow all of the guidelines for implementing the Dispose pattern. In this case, it’s presumably part of a larger example showing how to implement the pattern as part of a base class. The reason FxCop is warning you about this code is the first if statement in the Dispose method, which will cause the method to exit if disposed is true. The problem here is that there is the possibility that if the disposed flag is true, the call to base.Dispose() will never be executed. As Bill points out, it is possible for some other code elsewhere in the class to set this flag. He states that this is an “unlikely occurrence.” While that is probably true, it can be a potentially dangerous assumption to make and is one that can be easily corrected. By changing the code slightly you can remove this assumption and correct the FxCop violation.
    private bool disposed = false;
    protected override void Dispose(bool disposing)
    {
        if (!disposed)
        {
            if (disposing)
            {
                // Dispose managed resources.
            }
            // Dispose unmanaged resources.
            disposed = true;
        }
        base.Dispose(disposing);
    }
    Using this implementation allows the call to base.Dispose() to always occur, which ensures that the disposal chain is always properly followed. Technorati Tags: .NET,C#,Dispose Pattern
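    For context, the base class such an override plugs into could look something like the following minimal sketch (the type name is invented and this is only one arrangement consistent with the comments in the excerpt, not the exact code from Bill’s book): the base class is the one that suppresses finalization, which is why a derived override must always let the call reach base.Dispose(disposing).
        using System;

        public class ResourceHolderBase : IDisposable
        {
            private bool disposed = false;

            public void Dispose()
            {
                Dispose(true);
            }

            protected virtual void Dispose(bool disposing)
            {
                if (!disposed)
                {
                    if (disposing)
                    {
                        // Free managed resources owned by the base class.
                    }
                    // Free unmanaged resources owned by the base class.
                    disposed = true;
                }
                if (disposing)
                {
                    // The base class suppresses finalization; a derived override that
                    // returns early never reaches this line.
                    GC.SuppressFinalize(this);
                }
            }

            ~ResourceHolderBase()
            {
                Dispose(false);
            }
        }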

    Read the article

  • Your personal backlog

    - by johndoucette
    Whenever I start a new project or come in during a hectic time to help salvage a deliverable – there is always a backlog. Generating the backlog can be a daunting exercise, but worth the effort. Once I have a backlog, I feel in control and the chaos begins to quell. In your everyday life, you too should keep a backlog. Here is how I do it:
    1. Always carry a notebook.
    2. Start each day by marking a new page with today’s date.
    3. Flip to yesterday’s notes and copy every task with an empty checkbox next to it to the new empty page (today).
    4. As the day progresses and you go to meetings, do your work, or get interrupted to do something… jot it down in today’s page and put an empty checkbox next to it.
    If you get it done during the day, awesome. Mark it complete. Keep carrying and writing every task to each new day until it is complete. Maybe one day, you will have an empty backlog and your sprint will be complete!

    Read the article

  • Announcing the Winnipeg Code Camp!

    - by D'Arcy Lussier
    Alright, the event website will be up this weekend, but I wanted to get the word out sooner than later. For the third year, we’re holding the Winnipeg Code Camp! When: Saturday, February 26th Where: Red River College, Princess Street Campus Cost: FREE!!! Continental Breakfast and Hot Lunch Provided! We’re going to have four rooms worth of sessions going on like last year, so lots of great sessions and great content. To register for the event (we need to know numbers for ordering the food), please do so here: http://www.eventbrite.com/event/1180632303 If you’re interested in speaking at the Code Camp, please contact me at darcy.lussier at gmail.com. Note that we have only 20 slots for the day, so contact me sooner than later! D

    Read the article

  • DRY and SRP

    - by Timothy Klenke
    Originally posted on: http://geekswithblogs.net/TimothyK/archive/2014/06/11/dry-and-srp.aspx
    Kent Beck’s XP Simplicity Rules (aka Four Rules of Simple Design) are a prioritized list of rules that when applied to your code generally yield a great design.  As you’ll see from the above link the list has slightly evolved over time.  I find today they are usually listed as:
    1. All Tests Pass
    2. Don’t Repeat Yourself (DRY)
    3. Express Intent
    4. Minimalistic
    These are prioritized.  If your code doesn’t work (rule 1) then everything else is forfeit.  Go back to rule one and get the code working before worrying about anything else.
    Over the years the community has debated whether the priority of rules 2 and 3 should be reversed.  Some say a little duplication in the code is OK as long as it helps express intent.  I’ve debated it myself.  This recent post got me thinking about this again, hence this post.
    I don’t think it is fair to compare “Expressing Intent” against “DRY”.  This is a comparison of apples to oranges.  “Expressing Intent” is a principle of code quality.  “Repeating Yourself” is a code smell.  A code smell is merely an indicator that there might be something wrong with the code.  It takes further investigation to determine if a violation of an underlying principle of code quality has actually occurred.
    For example “using nouns for method names”, “using verbs for property names”, or “using Booleans for parameters” are all code smells that indicate that code probably isn’t doing a good job at expressing intent.  They are usually very good indicators.  But what principle is the code smell of Duplication pointing to and how good of an indicator is it?
    Duplication in the code base is bad for a couple of reasons.  If you need to make a change and that change needs to be made in a number of locations, it is difficult to know if you have caught all of them.  This can lead to bugs if/when one of those locations is overlooked.  By refactoring the code to remove all duplication you will be left with only one place to change, thereby eliminating this problem.
    With most projects the code becomes the single source of truth for a project.  If a production code base is inconsistent with a five year old requirements or design document, the production code that people are currently living with is usually declared as the current reality (or truth).  Requirement or design documents at this age in a project life cycle are usually of little value.
    Although comparing production code to external documentation is usually straightforward, duplication within the code base muddles this declaration of truth.  When code is duplicated, small discrepancies will creep in between the two copies over time.  The question then becomes which copy is correct?  As different factions debate how the software should work, trust in the software and the team behind it erodes.
    The code smell of Duplication points to a violation of the “Single Source of Truth” principle.  Let me define that as: A stakeholder’s requirement for a software change should never cause more than one class to change.
    Violation of the Single Source of Truth principle will always result in duplication in the code.  However, the inverse is not always true.  Duplication in the code does not necessarily indicate that there is a violation of the Single Source of Truth principle.
    To illustrate this, let’s look at a retail system where the system will (1) send a transaction to a bank and (2) print a receipt for the customer.
    Although these are two separate features of the system, they are closely related.  The reason for printing the receipt is usually to provide an audit trail back to the bank transaction.  Both features use the same data:  amount charged, account number, transaction date, customer name, retail store name, and etcetera.  Because both features use much of the same data, there is likely to be a lot of duplication between them.  This duplication can be removed by making both features use the same data access layer.
    Then start coming the divergent requirements.  The receipt stakeholder wants a change so that the account number has the last few digits masked out to protect the customer’s privacy.  That can be solved with a small IF statement whilst still eliminating all duplication in the system.  Then the bank wants to take a picture of the customer as well as capture their signature and/or PIN number for enhanced security.  Then the receipt owner wants to pull data from a completely different system to report the customer’s loyalty program point total.
    After a while you realize that the two stakeholders have somewhat similar, but ultimately different responsibilities.  They have their own reasons for pulling the data access layer in different directions.  Then it dawns on you, the Single Responsibility Principle: There should never be more than one reason for a class to change.
    In this example we have two stakeholders giving two separate reasons for the data access class to change.  It is a clear violation of the Single Responsibility Principle.  That’s a problem because it can often lead to the project owner pitting the two stakeholders against each other in a vain attempt to get them to work out a mutual single source of truth.  But that doesn’t exist.  There are two completely valid truths that the developers need to support.  How is this to be supported while honouring the Single Responsibility Principle?  The solution is to duplicate the data access layer and let each stakeholder control their own copy, as sketched below.
    The Single Source of Truth and Single Responsibility Principles are very closely related.  SST tells you when to remove duplication; SRP tells you when to introduce it.  They may seem to be fighting each other, but really they are not.  The key is to clearly identify the different responsibilities (or sources of truth) over a system.  Sometimes there is a single person with that responsibility, other times there are many.  This can be especially difficult if the same person has dual responsibilities.  They might not even realize they are wearing multiple hats.
    In my opinion Single Source of Truth should be listed as the second rule of simple design with Express Intent at number three.  Investigation of the DRY code smell should yield the proper application of SST, without violating SRP.  When necessary leave duplication in the system and let the class names express the different people that are responsible for controlling them.  Knowing all the people with responsibilities over a system is the higher priority because you’ll need to know this before you can express it.  Although it may be a code smell when there is duplication in the code, it does not necessarily mean that the coder has chosen to be expressive over DRY or that the code is bad.
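    As a tiny sketch of that split (the type and property names are invented here, not taken from the post): each stakeholder gets a data class it alone controls, so a receipt-only change never forces the bank transaction code to change.
        using System;

        // Owned by the bank transaction stakeholder.
        public class BankTransactionRecord
        {
            public decimal Amount { get; set; }
            public string AccountNumber { get; set; }        // full number; photo/PIN data added later
            public DateTime TransactionDate { get; set; }
        }

        // Owned by the receipt stakeholder.
        public class ReceiptRecord
        {
            public decimal Amount { get; set; }
            public string MaskedAccountNumber { get; set; }  // last digits masked for privacy
            public DateTime TransactionDate { get; set; }
            public int LoyaltyPoints { get; set; }           // pulled from a separate system
        }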

    Read the article

  • Cooking with Wessty: WordPress and HTML 5

    - by David Wesst
    WordPress is easily one of the most popular, if not the most popular, blogging platforms on the web. With the release of WordPress 3.x, the potential for what you can do with this open source software is limitless. This technique intends to show you how to get your WordPress wielding the power of the future web, that being HTML 5.
    ---
    Ingredients
    WordPress 3.x
    Your favourite HTML 5 compliant browser (e.g. Internet Explorer 9)
    Directions
    1. Set up WordPress on your server or host. Note: You can set up a WordPress.com account, but you will require a paid add-on to really take advantage of this technique.
    2. Login to the administration section of your blog, using your web browser.
    3. On the left side of the page, click the Appearance heading. Then, click on Themes.
    4. At the top of the page, select the Install Themes tab.
    5. In the search box, type “toolbox” and click search.
    6. In the search results, you should see a theme called Toolbox. Click the Install link in the Toolbox item.
    7. A dialog window should appear with a sample picture of what the theme looks like. Click on the Install Now button in the bottom right corner.
    Et voila! Once the installation is done, you are ready to bring your blog into the future of the web. Try previewing your blog in HTML 5 by clicking the preview link.
    Now, you are probably thinking “Man…HTML 5 looks like junk”. To that, I respond: “HTML was never why your site looked good in the first place. It was the CSS.” Now you have an un-stylized theme that uses HTML 5 elements throughout your WordPress site. If you want to learn how to apply CSS to your WordPress blog, you should check out the WordPress codex that pretty much covers everything there is to cover about WordPress development.
    Now, remember how we noted earlier that your free WordPress.com account wouldn’t take advantage of this technique? That is because, as of the time of this writing, you need to pay a fee to use custom CSS.
    Remember now, this only gives you the foundation to create your own HTML 5 WordPress site. There are some HTML 5 themes out there that already look good, and were built using this as the foundation with some CSS 3 added to really spice them up. Looking forward to seeing more HTML 5 WordPress sites! Enjoy developing the future of the web.
    Resources
    Toolbox Theme
    JustCSS Theme
    WordPress Installation Tutorial
    WordPress Theme Development Tutorial
    This post also appears at http://david.wes.st

    Read the article

  • 10 Weeks of Gift Ideas – All Offers Good Through January 19, 2012

    - by TATWORTH
    O'Reilly are offering a series of good deals through to Jan 19, 2012. The main page is at http://shop.oreilly.com/category/deals/hd-10-weeks.do
    Already available are:
    JavaScript Path to Mastery set at http://shop.oreilly.com/category/deals/hd-javascript-path.do - I recommend JavaScript: The Definitive Guide, 6th Edition; the PDF is 50% off at http://shop.oreilly.com/product/9780596805531.do
    HTML 5 Programming set at http://shop.oreilly.com/category/deals/hd-html5.do - Again the PDFs are 50% off.

    Read the article

  • Planning an Event – SPS NYC

    - by MOSSLover
    I bet some of you were wondering why I am not going to any events for the most part in June and July (aside from volunteering at SPS Chicago).  Well I basically have no life for the next 2 months.  We are approaching the 11th hour of SharePoint Saturday New York City.  The event is slated to have 350 to 400 attendees.  This is the second year in a row I’ve helped run it with Jason Gallicchio.  It’s amazingly crazy how much effort this event requires versus Kansas City.  It’s literally 2x the volume of attendees, speakers, and sponsors, plus don’t even get me started about volunteers.  So here is a bit of the breakdown…
    We have 30+ volunteers that Tasha Scott from the Hampton Roads Area will be managing the day of the event to do things like timing the speakers, handing out food, making sure people don’t walk into the event that did not sign up until we get a count for fire code, registering people, watching the sharpies, watching the prizes, making sure attendees get to the right place, opening and closing the partition in the big room, moving chairs, moving furniture, etc… Then there is Jason, Greg, and I who will be making sure that the speakers, sponsors, and everything is going smoothly in the background.  We need to make sure that everything is set up properly and in the right spot.  We also need to make sure signs are printed, schedules are created, and bags are stuffed with sponsor material.  Plus we need to send out emails to sponsors reminding them to send us the right information to post on the site for charity sessions, send us boxes with material to stuff bags, and we need to make sure that Michael Lotter gets their information for invoicing.  We also need to check that Michael has invoiced everyone and who has paid, because we can’t order anything for the event without money.  Once people have paid we need to set up food orders, speaker and volunteer dinners, buy prizes, buy bags, buy speaker/volunteer/organizer shirts, etc…
    During this process we need all the abstracts from speakers, all the bios, pictures, shirt sizes, and other items so we can create schedules and order items.  We also need to keep track of who is attending the dinner the night before for volunteers and speakers and make sure we don’t hit capacity.  Then there is attendee tracking and making sure that we don’t hit too many attendees.  We need to make sure that attendees know where to go and what to do.  We have to make all kinds of random supply lists during this process and keep on track with a variety of lists and emails plus conference calls.  All in all it’s a lot of work and I am trying to keep track of it all so that we don’t duplicate anything or miss anything.  So basically, all in all, if you don’t see me around for another month don’t worry, I do exist.
    Right now if you look at what I’m doing I am traveling every Monday morning and Thursday night back and forth to Washington DC from New Jersey.  Every night I am working on organizational stuff for SharePoint Saturday New York City.  Every Tuesday night we are running an event conference call.  Every weekend I am either with family or my boyfriend and cat trying hard not to touch the event.  So all my time is pretty much work, event, and family/boyfriend.  I have 0 bandwidth for anything in the community.  If you compound that with my severe allergy problems in DC and a doctor’s appointment every month plus a new med once a week, I’m lucky I am still standing and walking.
    So basically once July 30th hits, hopefully Jason Gallicchio, Greg Hurlman, and I will be able to breathe a little easier.  If I forget to do this later: thank you Greg and Jason.  Thank you Tom Daly.  Thank you Michael Lotter.  Thank you Tasha Scott.  Thank you Kevin Griffin.  Thank you to all the volunteers, speakers, sponsors, and attendees who will and have made this event a success.  Hopefully, we have enough time until next year to regroup, recharge, and make the event grow bigger in a different venue.  Awesome job everyone, we sold out within 3 days of registration and we still have several weeks to go.  Right now the waitlist is at 49 people with room to grow.  If you attend the event, thank all these guys I mentioned above for making it possible.  It’s going to be awesome, I know it, but I probably won’t remember half of it due to the blur of things that we will all be taking care of the day of the event.  Catch you all at the end of July/early August, where I will attempt to post something useful and clever, possibly while wearing a fez. Technorati Tags: SPS NYC,SharePoint Saturday,SharePoint Saturday New York City

    Read the article

  • Real World Nuget

    - by JoshReuben
    Why Nuget
    A higher level of granularity for managing references. When you have solutions of many projects that depend on solutions of many projects etc., it lets you escape from Solution Hell.
    Links
    · Using A GUI (Package Explorer) to build packages - http://docs.nuget.org/docs/creating-packages/using-a-gui-to-build-packages
    · Creating a Nuspec File - http://msdn.microsoft.com/en-us/vs2010trainingcourse_aspnetmvcnuget_topic2.aspx
    · consuming a Nuget Package - http://msdn.microsoft.com/en-us/vs2010trainingcourse_aspnetmvcnuget_topic3
    · Nuspec reference - http://docs.nuget.org/docs/reference/nuspec-reference
    · updating packages - http://nuget.codeplex.com/wikipage?title=Updating%20All%20Packages
    · versioning - http://docs.nuget.org/docs/reference/versioning
    POC Folder Structure
    POC Setup Steps
    · Install Package Explorer
    · Source
    o Create a source solution – configure output directory for projects (Project > Properties > Build > Output Path)
    · Package
    o Add assemblies to package from output directory (D&D) - add net folder
    o File > Export – save .nuspec files and lib contents
    <?xml version="1.0" encoding="utf-16"?>
    <package>
      <metadata>
        <id>MyPackage</id>
        <version>1.0.0.3</version>
        <title />
        <authors>josh-r</authors>
        <owners />
        <requireLicenseAcceptance>false</requireLicenseAcceptance>
        <description>My package description.</description>
        <summary />
      </metadata>
    </package>
    o File > Save – saves .nupkg file
    · Create Target Solution
    o In Tools > Options: Configure package source & Add package
    Select projects: Output from package manager (powershell console)
    ------- Installing...MyPackage 1.0.0 -------
    Added file 'NugetSource.AssemblyA.dll' to folder 'MyPackage.1.0.0\lib'.
    Added file 'NugetSource.AssemblyA.pdb' to folder 'MyPackage.1.0.0\lib'.
    Added file 'NugetSource.AssemblyB.dll' to folder 'MyPackage.1.0.0\lib'.
    Added file 'NugetSource.AssemblyB.pdb' to folder 'MyPackage.1.0.0\lib'.
    Added file 'MyPackage.1.0.0.nupkg' to folder 'MyPackage.1.0.0'.
    Successfully installed 'MyPackage 1.0.0'.
    Added reference 'NugetSource.AssemblyA' to project 'AssemblyX'
    Added reference 'NugetSource.AssemblyB' to project 'AssemblyX'
    Added file 'packages.config'.
    Added file 'packages.config' to project 'AssemblyX'
    Added file 'repositories.config'.
    Successfully added 'MyPackage 1.0.0' to AssemblyX.
    ==============================
    o Packages folder created at solution level
    o Packages.config file generated in each project:
    <?xml version="1.0" encoding="utf-8"?>
    <packages>
      <package id="MyPackage" version="1.0.0" targetFramework="net40" />
    </packages>
    A local Packages folder is created for package versions installed: each folder contains the downloaded .nupkg file and its unpacked contents – e.g. the dlls that the project references. Note: this folder is not checked in.
    UpdatePackages
    o Configure Package Manager to automatically check for updates
    o Browse packages - It automatically picked up the updates
    Update Procedure
    · Modify source
    · Change source version in assembly info
    · Build source
    · Open last package in package explorer
    · Increment package version number and re-add assemblies
    · Save package with new version number and export its definition
    · In target solution – Tools > Manage Nuget Packages – click on All to trigger refresh, then click on recent packages to see updates
    · If problematic, delete packages folder
    Versioning
    uninstall-package mypackage
    install-package mypackage -version 1.0.0.3
    uninstall-package mypackage
    install-package mypackage -version 1.0.0.4
    Dependencies
    <?xml version="1.0" encoding="utf-16"?>
    <package xmlns="http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd">
      <metadata>
        <id>MyDependentPackage</id>
        <version>1.0.0</version>
        <title />
        <authors>josh-r</authors>
        <owners />
        <requireLicenseAcceptance>false</requireLicenseAcceptance>
        <description>My package description.</description>
        <dependencies>
          <group targetFramework=".NETFramework4.0">
            <dependency id="MyPackage" version="1.0.0.4" />
          </group>
        </dependencies>
      </metadata>
    </package>
    Using NuGet without committing packages to source control
    http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages
    Right click on the Solution node in Solution Explorer and select Enable NuGet Package Restore. — Recall that the packages folder is not part of the solution.
    If you get "downloading package 'Nuget.build' failed", configure the proxy to support the certificate for https://nuget.org/api/v2/ & allow unrestricted access to packages.nuget.org
    To test connectivity: get-package -listavailable
    To test Nuget Package Restore – delete the packages folder and open VS as admin.
    In the nuget msbuild: <Import Project="$(SolutionDir)\.nuget\nuget.targets" />
    TFSBuild Integration
    Modify Nuget.Targets file
    <RestorePackages Condition="  '$(RestorePackages)' == '' ">
      True
    </RestorePackages>
    …
    <PackageSource Include="\\IL-CV-004-W7D\Packages" />
    Add System Environment variable EnableNuGetPackageRestore=true & restart the “visual studio team foundation build service host” service.
Important: Ensure Network Service has access to Packages folder Nugetter TFS Build integration Add Nugetter build process templates to TFS source control For Build Controller - Specify location of custom assemblies Generate .nuspec file from Package Explorer: File > Export Edit the file elements – remove path info from src and target attributes <?xml version="1.0" encoding="utf-16"?> <package xmlns="http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd">     <metadata>         <id>Common</id>         <version>1.0.0</version>         <title />         <authors>josh-r</authors>         <owners />         <requireLicenseAcceptance>false</requireLicenseAcceptance>         <description>My package description.</description>         <dependencies>             <group targetFramework=".NETFramework3.5" />         </dependencies>     </metadata>     <files>         <file src="CommonTypes.dll" target="CommonTypes.dll" />         <file src="CommonTypes.pdb" target="CommonTypes.pdb" /> … Add .nuspec file to solution so that it is available for build: Dev\NovaNuget\Common\NuSpec\common.1.0.0.nuspec Add a Build Process Definition based on the Nugetter build process template: Configure the build process – specify: · .sln to build · Base path (output directory) · Nuget.exe file path · .nuspec file path Copy DLLs to a binary folder 1) Set copy local for an assembly reference to false 2)  MSBuild Copy Task – modify .csproj file: http://msdn.microsoft.com/en-us/library/3e54c37h.aspx <ItemGroup>     <MySourceFiles Include="$(MSBuildProjectDirectory)\..\SourceAssemblies\**\*.*" />   </ItemGroup>     <Target Name="BeforeBuild">     <Copy SourceFiles="@(MySourceFiles)" DestinationFolder="bin\debug\SourceAssemblies" />   </Target> 3) Set Probing assembly search path from app.config - http://msdn.microsoft.com/en-us/library/823z9h8w(v=vs.80).aspx -                 <?xml version="1.0" encoding="utf-8" ?> <configuration>   <runtime>     <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">       <probing privatePath="SourceAssemblies"/>     </assemblyBinding>   </runtime> </configuration> Forcing 'copy local = false' The following generic powershell script was added to the packages install.ps1: param($installPath, $toolsPath, $package, $project) if( $project.Object.Project.Name -ne "CopyPackages") { $asms = $package.AssemblyReferences | %{$_.Name} foreach ($reference in $project.Object.References) { if ($asms -contains $reference.Name + ".dll") { $reference.CopyLocal = $false; } } } An empty project named "CopyPackages" was added to the solution - it references all the packages and is the only one set to CopyLocal="true". No MSBuild knowledge required.

    Read the article

  • Windows Azure SDK 1.2 Available - .NET 4.0 Support

    - by Shaun
    The Windows Azure team has just announced the release of the latest version of its tools and SDK (v1.2) at TechEd 2010 New Orleans. You can download it here. The biggest new feature/improvement of this version of the SDK is Visual Studio 2010 RTM and .NET 4.0 support. It gives us the facilities to build our Azure-based applications on top of .NET 3.5 and 4.0 as well. So those of us, like me, who are working on (or going to be working on) .NET 4 would be better off having this SDK installed, I think. Also, there is some other information about the evolution of Windows Azure from this TechEd session, which you can find here.
    Hope this helps,
    Shaun
    All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • crunchbang: it takes up *how* much memory?!?!

    - by Theo Moore
    I've been trying many distros of Linux lately, trying to find something I like for my netbook. I started out with Ubuntu, and I can tell you I am a big fan. Ubuntu is now fast to install, much simpler to administer, and pretty light resource-wise. My original install was the standard 32 bit version of 9.04. I tried the netbook remix version of this release, but it was very, very slow. Even the full-blown version used only about 200mb. Much better than the almost 800 that the recommended Windows version took. Once the newest release of Ubuntu was out, I decided to try the netbook remix of 10.04. It used even less RAM; only about 150mb. I thought I'd found my OS. I certainly settled in and prepared to use it forever.
    Then, someone I know suggested I try crunchbang. It is the most minimalistic UI I've ever seen, using Openbox rather than Gnome or KDE. Very slick, simple and clean. Since I am using the alpha of the most recent version (based on Debian Squeeze), the apps provided for you are few...although more will be provided soon. You do have a word processor, etc., although not the OpenOffice you would normally get in Ubuntu. But the best part? 48MB. That's it. 48mb fully loaded, supporting what I call "hotel services". It's fast, boots quick, and believe it or not, I can even do Java-based development....on my netbook! Pretty slick.
    More on it as I use it.

    Read the article

  • 45-minute video on introduction to Windows Azure and running Ruby on Rails in the cloud

    - by Eric Nelson
    Last week I presented at Cloud and Grid Exchange 2010. I did an introduction to Windows Azure and a demo of Ruby on Rails running on Azure. My slides and links can be found here – but just spotted that the excellent Skills Matter folks have already published the video. Watch the video at http://skillsmatter.com/podcast/cloud-grid/looking-at-the-clouds-through-dirty-windows  P.S. I really need to shed a few pounds!

    Read the article

  • The Great Divorce

    - by BlackRabbitCoder
    I have a confession to make: I've been in an abusive relationship for more than 17 years now.  Yes, I am not ashamed to admit it, but I'm finally doing something about it. I met her in college, she was new and sexy and amazingly fast -- and I'd never met anything like her before.  Her style and her power captivated me and I couldn't wait to learn more about her.  I took a chance on her, and though I learned a lot from her -- and will always be grateful for my time with her -- I think it's time to move on. Her name was C++, and she so outshone my previous love, C, that any thoughts of going back evaporated in the heat of this new romance.  She promised me she'd be gentle and not hurt me the way C did.  She promised me she'd clean-up after herself better than C did.  She promised me she'd be less enigmatic and easier to keep happy than C was.  But I was deceived.  Oh sure, as far as truth goes, it wasn't a complete lie.  To some extent she was more fun, more powerful, safer, and easier to maintain.  But it just wasn't good enough -- or at least it's not good enough now. I loved C++, some part of me still does, it's my first-love of programming languages and I recognize its raw power, its blazing speed, and its improvements over its predecessor.  But with today's hardware, at speeds we could only dream to conceive of twenty years ago, that need for speed -- at the cost of all else -- has died, and that has left my feelings for C++ moribund. If I ever need to write an operating system or a device driver, then I might need that speed.  But 99% of the time I don't.  I'm a business-type programmer and chances are 90% of you are too, and even the ones who need speed at all costs may be surprised by how much you sacrifice for that.   That's not to say that I don't want my software to perform, and it's not to say that in the business world we don't care about speed or that our job is somehow less difficult or technical.  There's many times we write programs to handle millions of real-time updates or handle thousands of financial transactions or tracking trading algorithms where every second counts.  But if I choose to write my code in C++ purely for speed chances are I'll never notice the speed increase -- and equally true chances are it will be far more prone to crash and far less easy to maintain.  Nearly without fail, it's the macro-optimizations you need, not the micro-optimizations.  If I choose to write a O(n2) algorithm when I could have used a O(n) algorithm -- that can kill me.  If I choose to go to the database to load a piece of unchanging data every time instead of caching it on first load -- that too can kill me.  And if I cross the network multiple times for pieces of data instead of getting it all at once -- yes that can also kill me.  But choosing an overly powerful and dangerous mid-level language to squeeze out every last drop of performance will realistically not make stock orders process any faster, and more likely than not open up the system to more risk of crashes and resource leaks. And that's when my love for C++ began to die.  When I noticed that I didn't need that speed anymore.  That that speed was really kind of a lie.  Sure, I can be super efficient and pack bits in a byte instead of using separate boolean values.  Sure, I can use an unsigned char instead of an int.  But in the grand scheme of things it doesn't matter as much as you think it does.  The key is maintainability, and that's where C++ failed me.  
    I like to tell the other developers I work with that there are two levels of correctness in coding:
    1. Is it immediately correct?
    2. Will it stay correct?
    That is, you can hack together any piece of code and make it correct to satisfy a task at hand, but if a new developer can't come in tomorrow and make a fairly significant change to it without jeopardizing that correctness, it won't stay correct.
    Some people laugh at me when I say I now prefer maintainability over speed.  But that is exactly the point.  If you focus solely on speed you tend to produce code that is much harder to maintain over the long haul, and that's a load of technical debt most shops can't afford to carry, so they end up completely scrapping code before its time.  When good code is written well for maintainability, though, it can be correct both now and in the future.
    And you know the best part is?  My new love is nearly as fast as C++, and in some cases even faster -- and better than that, I know C# will treat me right.  Her creators have poured hundreds of thousands of hours of time into making her the sexy beast she is today.  They made her easy to understand and not an enigmatic mess.  They made her consistent and not moody and amorphous.  And they made her perform as fast as I care to go by optimizing her both at compile time and at run-time.
    Her code is so elegant and easy on the eyes that I'm not worried where she will run to or what she'll pull behind my back.  She is powerful enough to handle all my tasks, fast enough to execute them with blazing speed, maintainable enough so that I can rely on even fairly new peers to modify my work, and rich enough to allow me to satisfy any need.  C# doesn't ask me to clean up her messes!  She cleans up after herself and she tries to make my life easier for me by taking on most of those optimization tasks C++ asked me to take upon myself.
    Now, there are many of you who would say that I am the cause of my own grief, that it was my fault C++ didn't behave because I didn't pay enough attention to her.  That I alone caused the pain she inflicted on me.  And to some extent, you have a point.  But she was so high maintenance, requiring me to know every twist and turn of her vast and unrestrained power, that any wrong turn or bout of forgetfulness was met with painful reminders that she wasn't going to watch my back when I made a mistake.  But C#, she loves me when I'm good, and she loves me when I'm bad, and together we make beautiful code that is both fast and safe.
    So that's why I'm leaving C++ behind.  She says she's changing for me, but I have no interest in what C++0x may bring.  Oh, I'll still keep in touch, and maybe I'll see her now and again when she brings her problems to my door and asks for some attention -- for I always have a soft spot for her, you see.  But she's out of my house now.  I have three kids and a dog and a cat, and all require me to clean up after them; why should I have to clean up after my programming language as well?
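    As a small aside on the macro-optimization point made earlier (caching unchanging data on first load instead of hitting the database every time), here is a minimal C# sketch of the idea; the lookup type and the query it stands in for are invented purely for illustration.
        using System;
        using System.Collections.Generic;

        public static class CountryLookup
        {
            // Lazy<T> defers the database hit until first use and runs it only once.
            private static readonly Lazy<IList<string>> countries =
                new Lazy<IList<string>>(LoadCountriesFromDatabase);

            public static IList<string> Countries
            {
                get { return countries.Value; }   // every later caller gets the cached list
            }

            private static IList<string> LoadCountriesFromDatabase()
            {
                // One round trip instead of one per request; stand-in for a real query.
                return new List<string> { "Canada", "United States", "Mexico" };
            }
        }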

    Read the article

  • Speaking At The Chicago Code Camp

    - by Tim Murphy
    I just got news that my talk on Office Open XML has been accepted for the Chicago Code Camp.  I hear that they will be announcing the full schedule of sessions soon.  Be sure to register and join us.  As a bonus the guys from .NET Rocks will be there. http://www.chicagocodecamp.com del.icio.us Tags: .NET Rocks,Chicago Code Camp,Speaking,OOXML SDK 2.0,OOXML,Office Open XML,PSC Group

    Read the article

  • Caveat utilitor - Can I run two versions of Microsoft Project side-by-side?

    - by Martin Hinshelwood
    A number of our customers have asked if there are any problems in installing and running multiple versions of Microsoft Project on a single client. Although this is a case of Caveat utilitor (Let the user beware), as long as the user understands and accepts the issues that can occur then they can do this. Although Microsoft provide the ability to leave old versions of Office products (except Outlook) on your client when you are installing a new version of the product, they certainly do not endorse doing so.
    Figure: For Project you can choose to keep the old stuff
    That being the case I would have preferred that they put a “(NOT RECOMMENDED)” after the options to impart that knowledge to the rest of us, but they did not. The default and recommended behaviour is for the newer version installer to remove the older versions. Of course this does not apply in reverse. There are no forward compatibility packs for Office. There are a number of negative behaviours (or bugs) that can occur in this configuration:
    There is only one MS Project
    In Windows a file extension can only be associated with a single program.  In this case, MPP files can be associated with only one version of winproj.exe.  The executables are in different folders so if a user double-clicks a Project file on the desktop, file explorer, or Outlook email, Windows will launch the winproj.exe associated with MPP and then load the MPP file.  There are problems associated with this situation and in some cases workarounds. The user double-clicks on a Project 2010 file, Project 2007 launches but is unable to open the file because it is a newer version.  The workaround is for the user to launch Project 2010 from the Start menu then open the file.  If the file is attached to an email they will need to first drag the file to the desktop.
    All your linked MS Project files need to be of the same version
    There are a number of problems that occur when people use Microsoft’s Object Linking and Embedding (OLE) technology.  The three common uses of OLE are:
    inserted projects, where a Master project contains sub-projects and each sub-project resides in its own MPP file
    shared resource pools, where multiple MPP files share a common resource pool kept in a single MPP file
    cross-project links, where a task or milestone in one MPP file has a predecessor/successor relationship with a task or milestone in a different MPP file
    What I’ve seen happen before is that if you are running a version of Project that is not associated with the MPP extension and then try to activate an OLE link, Project tries to launch the other version of Project.  Things start getting very confused since different MPP files are being controlled by different versions of Project running at the same time.  I haven’t tried this in a while so I can’t give you exact symptoms, but I suspect that if Project 2010 is involved the symptoms will be different than in a Project 2003/2007 scenario.  I’ve noticed that Project 2010 gives different error messages for the exact same problem when it occurs in Project 2003 or 2007.  -Anonymous
    The recommendation would be either not to use this feature if you have to have multiple versions of Project installed, or to use only a single version of Project. You may get unexpected negative behaviours if you are using shared resource pools even when you are not running multiple versions, as I have found that they can get broken very easily.
    If you need these things then it is probably best to use Project Server, as it was created to solve many of these specific issues.
    Note: I would not even allow multiple people to access a network copy of a Project file because of the way Windows locks files in write mode. This can cause write-locks that get so bad a server restart is required. I’ve seen users’ files get write-locked to the point where the only resolution is to reboot the server.
    Changing the default version to run for an extension
    So what if you want to change the default association from Project 2007 to Project 2010?
    Figure: “Control Panel | Folder Options | Change the file associated with a file extension”
    Windows normally only lists the last version installed for a particular extension. You can select a specific version by selecting the program you want to change and clicking “Change program… | Browse…” and then selecting the .exe you want to use on the file system.
    Figure: You will need to select the exact version of “winproj.exe” that you want to run
    Conclusion
    Although it is possible to run multiple versions of Project on one system, in the main it does not really make sense.

    Read the article

  • Convert MP3 to AAC,FLAC to AAC (.NET/C#) FREE :)

    - by PearlFactory
    So I was tasked with looking at converting 10 million tracks from mp3 320k to AAC, and also converting from mp3 320k to mp3 128k. After a bit of hunting around, the tool you need to use is FFMPEG (download the x64 Windows build). Also, for the best results, get the Nero AAC Encoder.

Now the command lines.

STEP 1 (from FLAC):
ffmpeg -i input.flac -f wav - | neroAacEnc -ignorelength -q 0.5 -if - -of output.m4a
or (from mp3):
ffmpeg -i input.mp3 -f wav - | neroAacEnc -ignorelength -q 0.5 -if - -of output.m4a

The output.m4a is an intermediate state that we now put an AAC wrapper on via FFMPEG.

STEP 2:
ffmpeg -i output.m4a -vn -acodec copy final.aac

Done :) There are a couple of options with the FFMPEG library, in that we can look at importing the libraries and manipulating the API directly; FFMPEG has this support. You can get the relevant libraries from their site, and they even have the source if you are that keen :-)

In this case I am going to wrap the command lines in C# external process calls. (For the app that I am building to convert the 10 million tracks there is a complex multithreaded application around this code.)

// Arrange metadata about the call
// inputfile and outputfile are assumed to be set elsewhere
Process myProcess = new Process();
ProcessStartInfo p = new ProcessStartInfo();
string sArgs = string.Format(" -i {0} -f wav - | neroAacEnc -ignorelength -q 0.5 -if - -of {1}", inputfile, outputfile);
p.FileName = "ffmpeg.exe";
p.CreateNoWindow = true;
p.RedirectStandardOutput = true;
//p.WindowStyle = ProcessWindowStyle.Normal;
p.UseShellExecute = false;
// Execute
p.Arguments = sArgs;
myProcess.StartInfo = p;
myProcess.Start();
myProcess.WaitForExit();
// Write details about the call
myProcess.StandardOutput.ReadToEnd();

Now in this case we would execute a second call using the same code but with different sArgs to put the AAC wrapper on the m4a file. One caveat: because sArgs contains a shell pipe ("|") and UseShellExecute is false, the pipe is passed to ffmpeg.exe as a literal argument, so in practice the whole command line needs to be run through cmd.exe /C (or the two programs chained manually) - see the sketch below. That's it. So if you need to do some conversions of any kind for your ASP.NET sites/apps this is a great start and super fast. With conversion times of around 2-3 seconds, all of this can be done on the fly :-)
Justin Oehlmann
ref: StackOverflow.com
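Here is a minimal, self-contained sketch (mine, not the original author's) of the full two-step conversion, running each command line through cmd.exe /C so the pipe between ffmpeg and neroAacEnc is honoured. It assumes ffmpeg.exe and neroAacEnc.exe are on the PATH; the file names are just placeholders.

using System;
using System.Diagnostics;

class AacConverter
{
    // Runs a single command line through cmd.exe so that the pipe ("|") works.
    static void RunShell(string commandLine)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "cmd.exe",
            Arguments = "/C " + commandLine,
            CreateNoWindow = true,
            UseShellExecute = false
        };
        using (var proc = Process.Start(psi))
        {
            proc.WaitForExit();
        }
    }

    // Step 1: decode the source (mp3 or flac) to WAV and pipe it into the Nero AAC encoder.
    // Step 2: copy the encoded audio into an .aac container with ffmpeg.
    static void ConvertToAac(string inputFile, string intermediateM4a, string finalAac)
    {
        RunShell(string.Format(
            "ffmpeg -i \"{0}\" -f wav - | neroAacEnc -ignorelength -q 0.5 -if - -of \"{1}\"",
            inputFile, intermediateM4a));

        RunShell(string.Format(
            "ffmpeg -i \"{0}\" -vn -acodec copy \"{1}\"",
            intermediateM4a, finalAac));
    }

    static void Main()
    {
        ConvertToAac("input.mp3", "output.m4a", "final.aac");
    }
}

For a bulk job like the 10-million-track conversion described above, ConvertToAac is the unit of work you would hand to whatever multithreading scheme the surrounding application uses.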

    Read the article

  • Imaginet Resources acquires Notion Solutions

    - by Aaron Kowall
    Huge news for my company, and for me especially. http://www.imaginets.com/news--events/imaginet_acquisition_notion.html With the acquisition we become a very significant player in the Microsoft ALM space. This increases our scale significantly, and also our knowledge base. We now have 2 Regional Directors and a pile of MS MVPs. The timing couldn't be more perfect, since the launch of Visual Studio 2010 and Team Foundation Server 2010 is TODAY!! Oh, and we aren't done with announcements today… More later. Technorati Tags: VS 2010, TFS 2010, Notion, Imaginet

    Read the article

  • #iPad at One Week: A Great Device Made with a Heavy Hand

    - by andrewbrust
    I have now had my iPad for a little over a week. In that time, Apple introduced the world to its iPhone OS 4 (and the SDK agreement's draconian new section 3.3.1), HP introduced its Slate, and Microsoft got ready to launch Visual Studio 2010 and .NET 4.0. And through it all I have used my iPad. I've used it for email, calendar, controlling my Sonos, and writing an essay. I've used it for getting on TripIt and Twitter, and surfing the Web. I've used it for online banking, and online ordering and delivery of food. And the verdict? Honestly? I think it's a great device and I thoroughly enjoy using it. The screen is bright and vibrant. I am surprisingly fast and accurate when I type on it. The touch screen's responsiveness is nearly flawless. The software, including a number of third-party applications, includes pleasing animations and use of color that make it fun to get work done. And speaking of work, the Exchange integration is, dare I say it, robust. Not as full-featured as on a PC or Windows Mobile device, but still offering core functionality and, so far at least, without bugs. The UI is intuitive, not just to me, but also to my 5 1/2-year-old, and also to my nearly-3-year-old son. They picked it up and, with just a few pointers from me, they almost immediately knew what to do, whether they were looking at photos (and swiping/flicking along as they did so), using a drawing program, playing a game, or watching YouTube videos. The younger of the two even tried to get up on a chair and grab the thing today. He dropped it, from about 4 feet off the ground. And it's still fine. (Meanwhile, I'll be keeping it on a higher shelf.) I cannot fully describe yet what makes this form factor and this product so appealing. Maybe it's that it's an always-on device. Maybe it's just being able to hold such a nice, relatively large display so close. Maybe it's the design sensibility that seems to pervade the app ecosystem. Or maybe it's that one's fingers, and not pens or mice, are the software's preferred input device. Whatever the attraction, it's strong. And no matter how much I tend to root for Microsoft and against Apple, Cupertino has, in my mind, scored big. Can Microsoft compete? Yes, but not with the Windows 7 standard UI (nor with individual OEMs' own UIs on top). I hope Microsoft builds a variant of Windows Phone 7 specifically for tablet devices. And I hope they make it clear that all developers, and programming languages, are welcome on the platform. Once that's established, the OEMs have to build great hardware with fast, responsive touch screens, under Microsoft's watchful eye. That may be the hardest part of getting this right. No matter what, Microsoft's got a fight on its hands. I don't know if it can count on winning that fight, either. But Silverlight and Live Tiles could certainly help. And so can treating developers like adults. Apple seems intent on treating their devs like kids, and then giving the kids a curfew. For that, dev-friendly Microsoft may one day give thanks.

    Read the article

  • Death March

    - by Nick Harrison
    It is a horrible sight to watch a project fail. There are few things as bad. Watching a project fail, regardless of the reason, is almost like sitting in a room with a "Dementor" from Harry Potter. It will literally suck all of the life and joy out of the room. Nearly every project that I have seen fail has failed because of political challenges or management challenges. Sometimes there are technical challenges that bring a project to its knees, but usually projects fail for less technical reasons. Here are a few observations about projects failing for political reasons. Both the client and the consultants have to be committed to seeing the project succeed. Put simply, you cannot solve a problem when the primary stakeholders do not truly want it solved. This could come from a consultant being more interested in extending the engagement. It could come from a client being afraid of what will happen to them once the problem is solved. It could come from disenfranchised stakeholders. Sometimes a project is beset on all sides. When you find yourself working on a project that has this kind of threat, do all that you can to constrain the disruptive influences of the bad apples. If their influence cannot be constrained, you truly have no choice but to move on to a new project. Tough choices have to be made to make a project successful. These choices will affect everyone involved in the project. They may involve users not getting a change request through that they want. Developers may not get to use the tools that they want. Everyone may have to put in more hours than they originally planned. Steps may be skipped. Compromises will be made, but if everyone stays committed to the end goal, you can still be successful. If individuals start feeling disgruntled or resentful of the compromises reached, the project can easily be derailed. When everyone is not working towards a common goal, it is like driving with one foot on the brake and one foot on the accelerator. Not only will you not get to where you are planning, you will also damage the car and possibly the passengers as well. It is important to always keep the end result in mind. Regardless of the development methodology being followed, the end goal is not comprehensive documentation. In all cases, it is working software. Comprehensive documentation is nice but useless if the software doesn't work. You can never get so distracted by the next goal that you fail to meet the current goal. Most projects are ultimately marathons. This means that the pace must be sustainable. Regardless of the temptations, you cannot burn the team alive. Processes will fail. Technology will get outdated. Requirements will change, but your people will adapt and learn and grow. If everyone on the team, from the most senior analyst to the most junior recruit, trusts and respects each other, there is no challenge that they cannot overcome. When everyone involved faces challenges with the attitude "This is my project and I will not let it fail" and "You are my teammate and I will not let you fail", you will in fact not fail. When you find a team that embraces this attitude, protect it at all cost. Edward Yourdon wrote a book called Death March. In it, he included a graph for categorizing Death March project types based on the happiness of the team and the chances of success. Chances are we have all worked on Death March projects, and we will most likely work on more of them in the future.
To a certain extent they seem to be inevitable, but they should never be "suicide" or "ugly" projects. Ideally, they can all be "Mission Impossible", where everyone works hard, has fun, and knows that there is a good chance that they will succeed. If you are ever lucky enough to work on such a project, you will know the sense of pride that comes from the eventual success, and you will recognize a profound bond with the team that you worked with. Chances are it will change your life, or at least your outlook on life. If you have not already read this book, get a copy and study it closely. It will help you survive and make the most out of your next Death March project.

    Read the article

  • Automatically create bug resolution task using the TFS 2010 API

    - by Bob Hardister
    My customer requires bug resolution to be approved and tracked. To minimize the overhead for developers, I implemented a TFS 2010 server-side plug-in that automatically creates a child resolution task for the bug when the "CCB" field is set to Approved. The CCB field is a custom field. I also added the story points field to the bug WIT for sizing purposes. Redundant tasks will not be created unless the bug title is changed or the prior task is closed. The program writes an audit trail to a log file visible in the TFS Admin Console Log view. Here's the code.

BugAutoTask.cs

/* SPECIFICATION
 * When the CCB field on the bug is set to approved, create a child task where the task:
 * name = Resolve bug [ID] - [Title of bug]
 * assigned to = same as assigned to field on the bug
 * same area path
 * same iteration path
 * activity = Bug Resolution
 * original estimate = bug points
 *
 * The source code is used to build a dll (Ows.TeamFoundation.BugAutoTaskCreation.PlugIns.dll),
 * which needs to be copied to
 * C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services\bin\Plugins
 * on ALL TFS application-tier servers.
 *
 * Author: Bob Hardister.
 */
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml;
using System.Text;
using System.Diagnostics;
using System.Linq;
using Microsoft.TeamFoundation.Common;
using Microsoft.TeamFoundation.Framework.Server;
using Microsoft.TeamFoundation.WorkItemTracking.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Server;
using Microsoft.TeamFoundation.Client;
using System.Collections;

namespace BugAutoTaskCreation
{
    public class BugAutoTask : ISubscriber
    {
        public EventNotificationStatus ProcessEvent(
            TeamFoundationRequestContext requestContext,
            NotificationType notificationType,
            object notificationEventArgs,
            out int statusCode,
            out string statusMessage,
            out ExceptionPropertyCollection properties)
        {
            statusCode = 0;
            properties = null;
            statusMessage = String.Empty;

            // Error message for tracing last code executed and optional fields
            string lastStep = "No field values found or set ";
            try
            {
                if ((notificationType == NotificationType.Notification) &&
                    (notificationEventArgs.GetType() == typeof(WorkItemChangedEvent)))
                {
                    WorkItemChangedEvent workItemChange = (WorkItemChangedEvent)notificationEventArgs;

                    // See ConnectToTFS() method below to select which TFS instance/collection to connect to
                    TfsTeamProjectCollection tfs = ConnectToTFS();
                    WorkItemStore wiStore = tfs.GetService<WorkItemStore>();
                    lastStep = lastStep + ": connection to TFS successful ";

                    // Get the work item that was just changed by the user.
                    WorkItem witem = wiStore.GetWorkItem(workItemChange.CoreFields.IntegerFields[0].NewValue);
                    lastStep = lastStep + ": retrieved changed work item, ID:" + witem.Id + " ";

                    // Filter for Bug work items only
                    if (witem.Type.Name == "Bug")
                    {
                        // DEBUG
                        lastStep = lastStep + ": changed work item is a bug ";

                        // Filter for CCB (i.e. Baseline Status) field set to approved only
                        bool BaselineStatusChange = false;
                        if (workItemChange.ChangedFields != null)
                        {
                            ProcessBugRevision(ref lastStep, workItemChange, wiStore, ref witem, ref BaselineStatusChange);
                        }
                    }
                }
            }
            catch (Exception e)
            {
                Trace.WriteLine(e.Message);
                Logger log = new Logger();
                log.WriteLineToLog(MsgLevel.Error, "Application error: " + lastStep + " - " + e.Message + " - " + e.InnerException);
            }
            statusCode = 1;
            statusMessage = "Bug Auto Task Evaluation Completed";
            properties = null;
            return EventNotificationStatus.ActionApproved;
        }

        // PRIVATE METHODS
        private static void ProcessBugRevision(ref string lastStep, WorkItemChangedEvent workItemChange, WorkItemStore wiStore, ref WorkItem witem, ref bool BaselineStatusChange)
        {
            foreach (StringField field in workItemChange.ChangedFields.StringFields)
            {
                // DEBUG
                lastStep = lastStep + ": last changed field is - " + field.Name + " ";
                if (field.Name == "Baseline Status")
                {
                    lastStep = lastStep + ": retrieved bug baseline status field value, bug ID:" + witem.Id + " ";
                    BaselineStatusChange = (field.NewValue != field.OldValue);
                    if ((BaselineStatusChange) && (field.NewValue == "Approved"))
                    {
                        // Instantiate logger
                        Logger log = new Logger();

                        // *** Create resolution task for this bug ***
                        // *******************************************

                        // Get the team project and selected field values of the bug work item
                        Project teamProject = witem.Project;
                        int bugID = witem.Id;
                        string bugTitle = witem.Fields["System.Title"].Value.ToString();
                        string bugAssignedTo = witem.Fields["System.AssignedTo"].Value.ToString();
                        string bugAreaPath = witem.Fields["System.AreaPath"].Value.ToString();
                        string bugIterationPath = witem.Fields["System.IterationPath"].Value.ToString();
                        string bugChangedBy = witem.Fields["System.ChangedBy"].OriginalValue.ToString();
                        string bugTeamProject = witem.Project.Name;
                        lastStep = lastStep + ": all mandatory bug field values found ";

                        // Optional fields
                        Field bugPoints = witem.Fields["Microsoft.VSTS.Scheduling.StoryPoints"];
                        if (bugPoints.Value != null)
                        {
                            lastStep = lastStep + ": all mandatory and optional bug field values found ";
                        }

                        // Initialize child resolution task title
                        string childTaskTitle = "Resolve bug " + bugID + " - " + bugTitle;

                        // At this point I can check if a resolution task (of the same name) for the bug already exists.
                        // If so, do not create a new resolution task.
                        bool createResolutionTask = true;
                        WorkItem parentBug = wiStore.GetWorkItem(bugID);
                        WorkItemLinkCollection links = parentBug.WorkItemLinks;
                        foreach (WorkItemLink wil in links)
                        {
                            if (wil.LinkTypeEnd.Name == "Child")
                            {
                                WorkItem childTask = wiStore.GetWorkItem(wil.TargetId);
                                if ((childTask.Title == childTaskTitle) && (childTask.State != "Closed"))
                                {
                                    createResolutionTask = false;
                                    log.WriteLineToLog(MsgLevel.Info, "Team project " + bugTeamProject + ": " + bugChangedBy + " - set the CCB field to \"Approved\" for bug, ID: " + bugID + ". Task not created as open one of the same name already exist, ID:" + childTask.Id);
                                }
                            }
                        }

                        if (createResolutionTask)
                        {
                            // Define the work item type of the new work item
                            WorkItemTypeCollection workItemTypes = wiStore.Projects[teamProject.Name].WorkItemTypes;
                            WorkItemType wiType = workItemTypes["Task"];

                            // Set up the new task and assign field values
                            witem = new WorkItem(wiType);
                            witem.Fields["System.Title"].Value = "Resolve bug " + bugID + " - " + bugTitle;
                            witem.Fields["System.AssignedTo"].Value = bugAssignedTo;
                            witem.Fields["System.AreaPath"].Value = bugAreaPath;
                            witem.Fields["System.IterationPath"].Value = bugIterationPath;
                            witem.Fields["Microsoft.VSTS.Common.Activity"].Value = "Bug Resolution";
                            lastStep = lastStep + ": all mandatory task field values set ";

                            // Optional fields
                            if (bugPoints.Value != null)
                            {
                                witem.Fields["Microsoft.VSTS.Scheduling.OriginalEstimate"].Value = bugPoints.Value;
                                lastStep = lastStep + ": all mandatory and optional task field values set ";
                            }

                            // Check for validation errors before saving the new task and linking it to the bug
                            ArrayList validationErrors = witem.Validate();
                            if (validationErrors.Count == 0)
                            {
                                witem.Save();

                                // Link the new task (child) to the bug (parent)
                                var linkType = wiStore.WorkItemLinkTypes[CoreLinkTypeReferenceNames.Hierarchy];

                                // Fetch the work items to be linked
                                var parentWorkItem = wiStore.GetWorkItem(bugID);
                                int taskID = witem.Id;
                                var childWorkItem = wiStore.GetWorkItem(taskID);

                                // Add a new link to the parent relating the child and save it
                                parentWorkItem.Links.Add(new WorkItemLink(linkType.ForwardEnd, childWorkItem.Id));
                                parentWorkItem.Save();
                                log.WriteLineToLog(MsgLevel.Info, "Team project " + bugTeamProject + ": " + bugChangedBy + " - set the CCB field to \"Approved\" for bug, ID:" + bugID + ", which automatically created child resolution task, ID:" + taskID);
                            }
                            else
                            {
                                log.WriteLineToLog(MsgLevel.Error, "Error in creating bug resolution child task for bug ID:" + bugID);
                                foreach (Field taskField in validationErrors)
                                {
                                    log.WriteLineToLog(MsgLevel.Error, " - Validation Error in task field: " + taskField.ReferenceName);
                                }
                            }
                        }
                    }
                }
            }
        }

        private TfsTeamProjectCollection ConnectToTFS()
        {
            // Connect to TFS
            string tfsUri = string.Empty;

            // Production TFS instance production collection
            tfsUri = @"xxxx";
            // Production TFS instance admin collection
            //tfsUri = @"xxxxx";
            // Local TFS testing instance default collection
            //tfsUri = @"xxxxx";

            TfsTeamProjectCollection tfs = new TfsTeamProjectCollection(new System.Uri(tfsUri));
            tfs.EnsureAuthenticated();
            return tfs;
        }

        // HELPERS
        public string Name
        {
            get { return "Bug Auto Task Creation Event Handler"; }
        }

        public SubscriberPriority Priority
        {
            get { return SubscriberPriority.Normal; }
        }

        public enum MsgLevel { Info, Warning, Error };

        public Type[] SubscribedTypes()
        {
            return new Type[1] { typeof(WorkItemChangedEvent) };
        }
    }
}

Logger.cs

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace BugAutoTaskCreation
{
    class Logger
    {
        // fields
        private string _ApplicationDirectory = @"C:\ProgramData\Microsoft\Team Foundation\Server Configuration\Logs";
        private string _LogFileName = @"\CFG_ACCT_AT_OWS_BugAutoTaskCreation.log";
        private string _LogFile;
        private string _LogTimestamp = DateTime.Now.ToString("MM/dd/yyyy HH:mm:ss");
        private string _MsgLevelText = string.Empty;

        // default constructor
        public Logger()
        {
            // check for a prior log file
            FileInfo logFile = new FileInfo(_ApplicationDirectory + _LogFileName);
            if (!logFile.Exists)
            {
                CreateNewLogFile(ref logFile);
            }
        }

        // properties
        public string ApplicationDirectory
        {
            get { return _ApplicationDirectory; }
            set { _ApplicationDirectory = value; }
        }

        public string LogFile
        {
            get
            {
                _LogFile = _ApplicationDirectory + _LogFileName;
                return _LogFile;
            }
            set { _LogFile = value; }
        }

        // PUBLIC METHODS
        public void WriteLineToLog(BugAutoTask.MsgLevel msgLevel, string logRecord)
        {
            try
            {
                // set msgLevel text
                if (msgLevel == BugAutoTask.MsgLevel.Info)
                {
                    _MsgLevelText = "[Info @" + MsgTimeStamp() + "] ";
                }
                else if (msgLevel == BugAutoTask.MsgLevel.Warning)
                {
                    _MsgLevelText = "[Warning @" + MsgTimeStamp() + "] ";
                }
                else if (msgLevel == BugAutoTask.MsgLevel.Error)
                {
                    _MsgLevelText = "[Error @" + MsgTimeStamp() + "] ";
                }
                else
                {
                    _MsgLevelText = "[Error: unsupported message level @" + MsgTimeStamp() + "] ";
                }

                // write a line to the log file
                StreamWriter logFile = new StreamWriter(_ApplicationDirectory + _LogFileName, true);
                logFile.WriteLine(_MsgLevelText + logRecord);
                logFile.Close();
            }
            catch (Exception)
            {
                throw;
            }
        }

        // PRIVATE METHODS
        private void CreateNewLogFile(ref FileInfo logFile)
        {
            try
            {
                string logFilePath = logFile.FullName;

                // write the log file header
                _MsgLevelText = "[Info @" + MsgTimeStamp() + "] ";
                string cpu = string.Empty;
                if (Environment.Is64BitOperatingSystem)
                {
                    cpu = " (x64)";
                }
                StreamWriter newLog = new StreamWriter(logFilePath, false);
                newLog.Flush();
                newLog.WriteLine(_MsgLevelText + "====================================================================");
                newLog.WriteLine(_MsgLevelText + "Team Foundation Server Administration Log");
                newLog.WriteLine(_MsgLevelText + "Version : " + "1.0.0 Author: Bob Hardister");
                newLog.WriteLine(_MsgLevelText + "DateTime : " + _LogTimestamp);
                newLog.WriteLine(_MsgLevelText + "Type : " + "OWS Custom TFS API Plug-in");
                newLog.WriteLine(_MsgLevelText + "Activity : " + "Bug Auto Task Creation for CCB Approved Bugs");
                newLog.WriteLine(_MsgLevelText + "Area : " + "Build Explorer");
                newLog.WriteLine(_MsgLevelText + "Assembly : " + "Ows.TeamFoundation.BugAutoTaskCreation.PlugIns.dll");
                newLog.WriteLine(_MsgLevelText + "Location : " + @"C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services\bin\Plugins");
                newLog.WriteLine(_MsgLevelText + "User : " + Environment.UserDomainName + @"\" + Environment.UserName);
                newLog.WriteLine(_MsgLevelText + "Machine : " + Environment.MachineName);
                newLog.WriteLine(_MsgLevelText + "System : " + Environment.OSVersion + cpu);
                newLog.WriteLine(_MsgLevelText + "====================================================================");
                newLog.WriteLine(_MsgLevelText);
                newLog.Close();
            }
            catch (Exception)
            {
                throw;
            }
        }

        private string MsgTimeStamp()
        {
            string msgTimestamp = string.Empty;
            return msgTimestamp = DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss:fff");
        }
    }
}

    Read the article

  • Windows Phone SDK 7.1 Beta2

    - by Nikita Polyakov
    It's here: the brand new Windows Phone SDK 7.1 Beta2. This time it gives you the ability to flash your developer-unlocked phone to the Mango Beta! How awesome is that? Mega-Ultra-Mango-Awesome!!

The Windows Phone SDK includes the following:
Windows Phone SDK 7.1 (Beta2)
Windows Phone Emulator (Beta2)
Windows Phone SDK 7.1 Assemblies (Beta2)
Silverlight 4 SDK and DRT
Windows Phone SDK 7.1 Extensions for XNA Game Studio 4.0
Microsoft Expression Blend SDK Preview for Windows Phone 7.1
WCF Data Services Client for Windows Phone 7.1
Microsoft Advertising SDK for Windows Phone 7

The direct download link is: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=26648
Official details and instructions: http://create.msdn.com/en-US/news/Mango_Beta
Official blog post: Windows Phone 7 Developer Blog - Developers Get Goody Basket Full of Mangos
"If you're registered for Windows Phone Marketplace, you'll receive an invitation from Microsoft Connect that will provide access to a firmware update for your retail Windows Phone device"

#ProTips:
- Uninstall any Mango 7.1 Windows Phone 7 Beta tools if you already had them installed.
- If you have Visual Studio 2010 RTM installed, you must install Service Pack 1 RTM before you install Windows Phone SDK 7.1 Beta 2. Please refer to the Service Pack 1 release notes for installation issues. Visual Studio 2010 SP1 RTM
- If you installed Visual Basic for Windows Phone Developer Tools 7.0, you must uninstall it before installing Windows Phone SDK 7.1 Beta 2. Uninstall the item Visual Basic for Windows Phone Developer Tools - RTW from the programs list on your computer. Visual Basic is now fully integrated into Windows Phone SDK 7.1 Beta 2; you do not need to install it separately.

Follow the instructions very, very closely. Updating your phone yourself is serious business and should not be done while not paying attention!

    Read the article

  • Good SLA

    - by PointsToShare
    © 2011 Dov Trietsch What is a good SLA? I have frequently pondered Service Level Agreements (SLAs). Yesterday, after ordering and while waiting, and waiting, and waiting for the food to arrive, I passed the time reading and re-reading the restaurant menu (again and again...) until I noticed their very interesting SLA. Because (as promised) we had to wait even longer, and the conversation around me was mostly in Russian, I ended up doodling some of my thoughts about the menu, on the menu. People are both providers and consumers of services. As a service consumer, maybe the SLA above sucks; though to be honest, had the service been better, I would not have noticed it and you, the reader, would have been spared this rambling monograph. As a provider, I think it's great! Because I provide services in the form of business software, I extend the idea to the following principles of design: 1: Wygiwyg. You guessed it. What You Get Is What You Get. 2: Ugiwugi. U Get It When U Get It. How's this for a developer-friendly SLA? I'll never be off the spec, or late. And BTW, the food was good, so when I finally got what I got, I liked it. That's All Folks!!

    Read the article

  • Apress Deal of the day - 23/Feb/2011 - Ultra-Fast ASP.NET: Building Ultra-Fast and Ultra-Scalable Websites Using ASP.NET and SQL Server

    - by TATWORTH
    Today's $10 deal of the day at http://www.apress.com/info/dailydeal is Ultra-Fast ASP.NET: Building Ultra-Fast and Ultra-Scalable Websites Using ASP.NET and SQL Server by Rick Kiessig - ISBN 978-1-4302-2383-2. I won a copy of this book at 101 Books. Rick Kiessig is an all-star member of forums.asp.net - see http://forums.asp.net/members/RickNZ.aspx. This book has been featured before as deal of the day; if you did not get a copy then, I suggest getting it today. "Ultra-Fast ASP.NET provides a practical guide to building extremely fast and scalable web sites using ASP.NET and SQL Server. It strikes a balance between imparting usable advice and backing that advice up with supporting background information. $49.99 | Published Nov 2009 | Rick Kiessig"

    Read the article
