Search Results

Search found 464 results on 19 pages for 'laurie young'.

Page 19/19 | < Previous Page | 15 16 17 18 19 

  • Code Contracts: Unit testing contracted code

    - by DigiMortal
    Code contracts and unit tests are not replacements for each other. They both have different purpose and different nature. It does not matter if you are using code contracts or not – you still have to write tests for your code. In this posting I will show you how to unit test code with contracts. In my previous posting about code contracts I showed how to avoid ContractExceptions that are defined in code contracts runtime and that are not accessible for us in design time. This was one step further to make my randomizer testable. In this posting I will complete the mission. Problems with current code This is my current code. public class Randomizer {     public static int GetRandomFromRangeContracted(int min, int max)     {         Contract.Requires<ArgumentOutOfRangeException>(             min < max,             "Min must be less than max"         );           Contract.Ensures(             Contract.Result<int>() >= min &&             Contract.Result<int>() <= max,             "Return value is out of range"         );           var rnd = new Random();         return rnd.Next(min, max);     } } As you can see this code has some problems: randomizer class is static and cannot be instantiated. We cannot move this class between components if we need to, GetRandomFromRangeContracted() is not fully testable because we cannot currently affect random number generator output and therefore we cannot test post-contract. Now let’s solve these problems. Making randomizer testable As a first thing I made Randomizer to be class that must be instantiated. This is simple thing to do. Now let’s solve the problem with Random class. To make Randomizer testable I define IRandomGenerator interface and RandomGenerator class. The public constructor of Randomizer accepts IRandomGenerator as argument. public interface IRandomGenerator {     int Next(int min, int max); }   public class RandomGenerator : IRandomGenerator {     private Random _random = new Random();       public int Next(int min, int max)     {         return _random.Next(min, max);     } } And here is our Randomizer after total make-over. public class Randomizer {     private IRandomGenerator _generator;       private Randomizer()     {         _generator = new RandomGenerator();     }       public Randomizer(IRandomGenerator generator)     {         _generator = generator;     }       public int GetRandomFromRangeContracted(int min, int max)     {         Contract.Requires<ArgumentOutOfRangeException>(             min < max,             "Min must be less than max"         );           Contract.Ensures(             Contract.Result<int>() >= min &&             Contract.Result<int>() <= max,             "Return value is out of range"         );           return _generator.Next(min, max);     } } It seems to be inconvenient to instantiate Randomizer now but you can always use DI/IoC containers and break compiled dependencies between the components of your system. Writing tests for randomizer IRandomGenerator solved problem with testing post-condition. Now it is time to write tests for Randomizer class. Writing tests for contracted code is not easy. The main problem is still ContractException that we are not able to access. Still it is the main exception we get as soon as contracts fail. Although pre-conditions are able to throw exceptions with type we want we cannot do much when post-conditions will fail. We have to use Contract.ContractFailed event and this event is called for every contract failure. This way we find ourselves in situation where supporting well input interface makes it impossible to support output interface well and vice versa. ContractFailed is nasty hack and it works pretty weird way. Although documentation sais that ContractFailed is good choice for testing contracts it is still pretty painful. As a last chance I got tests working almost normally when I wrapped them up. Can you remember similar solution from the times of Visual Studio 2008 unit tests? Cannot understand how Microsoft was able to mess up testing again. [TestClass] public class RandomizerTest {     private Mock<IRandomGenerator> _randomMock;     private Randomizer _randomizer;     private string _lastContractError;       public TestContext TestContext { get; set; }       public RandomizerTest()     {         Contract.ContractFailed += (sender, e) =>         {             e.SetHandled();             e.SetUnwind();               throw new Exception(e.FailureKind + ": " + e.Message);         };     }       [TestInitialize()]     public void RandomizerTestInitialize()     {         _randomMock = new Mock<IRandomGenerator>();         _randomizer = new Randomizer(_randomMock.Object);         _lastContractError = string.Empty;     }       #region InputInterfaceTests     [TestMethod]     [ExpectedException(typeof(Exception))]     public void GetRandomFromRangeContracted_should_throw_exception_when_min_is_not_less_than_max()     {         try         {             _randomizer.GetRandomFromRangeContracted(100, 10);         }         catch (Exception ex)         {             throw new Exception(string.Empty, ex);         }     }       [TestMethod]     [ExpectedException(typeof(Exception))]     public void GetRandomFromRangeContracted_should_throw_exception_when_min_is_equal_to_max()     {         try         {             _randomizer.GetRandomFromRangeContracted(10, 10);         }         catch (Exception ex)         {             throw new Exception(string.Empty, ex);         }     }       [TestMethod]     public void GetRandomFromRangeContracted_should_work_when_min_is_less_than_max()     {         int minValue = 10;         int maxValue = 100;         int returnValue = 50;           _randomMock.Setup(r => r.Next(minValue, maxValue))             .Returns(returnValue)             .Verifiable();           var result = _randomizer.GetRandomFromRangeContracted(minValue, maxValue);           _randomMock.Verify();         Assert.AreEqual<int>(returnValue, result);     }     #endregion       #region OutputInterfaceTests     [TestMethod]     [ExpectedException(typeof(Exception))]     public void GetRandomFromRangeContracted_should_throw_exception_when_return_value_is_less_than_min()     {         int minValue = 10;         int maxValue = 100;         int returnValue = 7;           _randomMock.Setup(r => r.Next(10, 100))             .Returns(returnValue)             .Verifiable();           try         {             _randomizer.GetRandomFromRangeContracted(minValue, maxValue);         }         catch (Exception ex)         {             throw new Exception(string.Empty, ex);         }           _randomMock.Verify();     }       [TestMethod]     [ExpectedException(typeof(Exception))]     public void GetRandomFromRangeContracted_should_throw_exception_when_return_value_is_more_than_max()     {         int minValue = 10;         int maxValue = 100;         int returnValue = 102;           _randomMock.Setup(r => r.Next(10, 100))             .Returns(returnValue)             .Verifiable();           try         {             _randomizer.GetRandomFromRangeContracted(minValue, maxValue);         }         catch (Exception ex)         {             throw new Exception(string.Empty, ex);         }           _randomMock.Verify();     }     #endregion        } Although these tests are pretty awful and contain hacks we are at least able now to make sure that our code works as expected. Here is the test list after running these tests. Conclusion Code contracts are very new stuff in Visual Studio world and as young technology it has some problems – like all other new bits and bytes in the world. As you saw then making our contracted code testable is easy only to the point when pre-conditions are considered. When we start dealing with post-conditions we will end up with hacked tests. I hope that future versions of code contracts will solve error handling issues the way that testing of contracted code will be easier than it is right now.

    Read the article

  • CodePlex Daily Summary for Friday, June 03, 2011

    CodePlex Daily Summary for Friday, June 03, 2011Popular ReleasesMedia Companion: MC 3.404b Weekly: Extract the entire archive to a folder which has user access rights, eg desktop, documents etc. Refer to the documentation on this site for the Installation & Setup Guide Important! *** Due to an issue where the date added & the full genre information was not being read into the Movie cache, it is highly recommended that you perform a Rebuild Movies when first running this latest version. This will read in the information from the nfo's & repopulate the cache used by MC during operation. Fi...Terraria Map Generator: TerrariaMapTool 1.0.0.4 Beta: 1) Fixed the generated map.html file so that the file:/// is included in the base path. 2) Added the ability to use parallelization during generation. This will cause the program to use as many threads as there are physical cores. 3) Fixed some background overdraw.DotRas: DotRas v1.2 (Version 1.2.4168): This release includes compiled (and signed) versions of the binaries, PDBs, CHM help documentation, along with both C# and VB.NET examples. Please don't forget to rate the release! If you find a bug, please open a work item and give as much description as possible. Stack traces, which operating system(s) you're targeting, and build type is also very helpful for bug hunting. If you find something you believe to be a bug but are not sure, create a new discussion on the discussions board. Thank...Ajax Minifier: Microsoft Ajax Minifier 4.21: Fixed issues #15949 and #15868. Tweaked output in multi-line (pretty-print) mode so that var-statements in the initializer of for-statements break multiple variables onto multiple lines. Added TreeModification flag (and can now therefore turn off) the behavior of removing quotes around object literal property names if the names can be valid identifiers. Convert true literals to !0 and false literals to !1. Added a Visitor class approach to iterating over a generated abstract syntax tree to he...Caliburn Micro: WPF, Silverlight and WP7 made easy.: Caliburn.Micro v1.1 RTW: Download ContentsDebug and Release Assemblies Samples Changes.txt License.txt Release Highlights For WP7A new Tombstoning API based on ideas from Fluent NHibernate A new Launcher/Chooser API Strongly typed Navigation SimpleContainer included The full phone lifecycle is made easy to work with ie. we figure out whether your app is actually Resurrecting or just Continuing for you For WPFSupport for the Client Profile Better support for WinForms integration All PlatformsA power...VidCoder: 0.9.1: Added color coding to the Log window. Errors are highlighted in red, HandBrake logs are in black and VidCoder logs are in dark blue. Moved enqueue button to the right with the other control buttons. Added logic to report failures when errors are logged during the encode or when the encode finishes prematurely. Added Copy button to Log window. Adjusted audio track selection box to always show the full track name. Changed encode job progress bar to also be colored yellow when the enco...AutoLoL: AutoLoL v2.0.1: - Fixed a small bug in Auto Login - Fixed the updaterEPPlus-Create advanced Excel 2007 spreadsheets on the server: EPPlus 2.9.0.1: EPPlus-Create advanced Excel 2007 spreadsheets on the server This version has been updated to .Net Framework 3.5 New Features Data Validation. PivotTables (Basic functionalliy...) Support for worksheet data sources. Column, Row, Page and Data fields. Date and Numeric grouping Build in styles. ...and more And some minor new features... Ranges Text-Property|Get the formated value AutofitColumns-method to set the column width from the content of the range LoadFromCollection-metho...jQuery ASP.Net MVC Controls: Version 1.4.0.0: Version 1.4.0.0 contains the following additions: Upgraded to MVC 3.0 Upgraded to jQuery 1.6.1 (Though the project supports all jQuery version from 1.4.x onwards) Upgraded to jqGrid 3.8 Better Razor View-Engine support Better Pager support, includes support for custom pagers Added jqGrid toolbar buttons support Search module refactored, with full suport for multiple filters and ordering And Code cleanup, bug-fixes and better controller configuration support.Nearforums - ASP.NET MVC forum engine: Nearforums v6.0: Version 6.0 of Nearforums, the ASP.NET MVC Forum Engine, containing new features: Authentication using Membership Provider for SQL Server and MySql Spam prevention: Flood Control Moderation: Flag messages Content management: Pages: Create pages (about us/contact/texts) through web administration Allow nearforums to run as an IIS subapp Migrated Facebook Connect to OAuth 2.0 Visit the project Roadmap for more details.NetOffice - The easiest way to use Office in .NET: NetOffice Release 0.8b: Changes: - fix critical issue 15922(AccessViolationException) once and for all ...update is strongly recommended Known Issues: - some addin ribbon examples has a COM Register function with missing codebase entry(win32 registry) ...the problem is only affected to c# examples. fixed in latest source code. NetOffice\ReleaseTags\NetOffice Release 0.8.rar Includes: - Runtime Binaries and Source Code for .NET Framework:......v2.0, v3.0, v3.5, v4.0 - Tutorials in C# and VB.Net:...................Facebook Graph Toolkit: Facebook Graph Toolkit 1.5.4186: Updates the API in response to Facebook's recent change of policy: All Graph Api accessing feeds or posts must provide a AccessToken.Serviio for Windows Home Server: Beta Release 0.5.2.0: Ready for widespread beta. Synchronized build number to Serviio version to avoid confusion.AcDown????? - Anime&Comic Downloader: AcDown????? v3.0 Beta4: ??AcDown?????????????,??????????????,????、????。?????Acfun????? ????32??64? Windows XP/Vista/7 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??v3.0 Beta4 2011-5-31?? ???Bilibili.us????? ???? ?? ???"????" ???Bilibili.us??? ??????? ?? ??????? ?? ???????? ?? ?? ???Bilibili.us?????(??????????????????) ??????(6.cn)?????(????) ?? ?????Acfun?????????? ?????????????? ???QQ???????? ????????????Discussion...CodeCopy Auto Code Converter: Code Copy v0.1: Full add-in, setup project source code and setup fileWordpress Theme 'Windows Metro': v1.00.0017: v1.00.0017 Dies ist eine Vorab-Version aus der Entwickler-Phase. This is a preview version from the development phase.TerrariViewer: TerrariViewer v2.4.1: Added Piggy Bank editor and fixed some minor bugs.Kooboo CMS: Kooboo CMS 3.02: Updated the Kooboo_CMS.zip at 2011-06-02 11:44 GMT 1.Fixed: Adding data rule issue on page. 2.Fixed: Caching issue for higher performance. Updated the Kooboo_CMS.zip at 2011-06-01 10:00 GMT 1. Fixed the published package throwed a compile error under WebSite mode. 2. Fixed the ContentHelper.NewMediaFolderObject return TextFolder issue. 3. Shorten the name of ContentHelper API. NewMediaFolderObject=>MediaFolder, NewTextFolderObject=> TextFolder, NewSchemaObject=>Schema. Also update the C...mojoPortal: 2.3.6.6: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2366-released Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.Terraria World Creator: Terraria World Creator: Version 1.01 Fixed a bug that would cause the application to crash. Re-named the Application.New Projects"Background Transfer Service Sample" From WP7 Sample: This project is an extension to existing Windows Phone7 sample available from MS sight ("Background Transfer Service Sample"). This project extends functionality to display the background downloaded item to verify download conten. Agile project and development management dashboard - Meter Console: Agile project and development management dashboard application. The focus of this application to bridge the gap between the management, clients and developer and chive best possible transparency. Ajax Multi-ListBox: Ajax Multi-ListBox is .NET C# written user control with a dual listbox that uses ajax to move items from left to right and vice versa.aLOG : Asynchronous logging (Logging Library): aLog is an asynchronous logging component built on Microsoft Enterprise Exception handling and Logging Component block. It allows application to asynchronously log the data in multiple data sources such as file system, database , XML. Application Base Component Library: Application Base Component LibraryConcise Service Bus: A compact but comprehensive WCF service bus framework.Contour provider for Umbraco Courier: Transfer Umbraco Contour forms between Umbraco instances with Courier.CoveyNet MidWest: CoveyNet MidWestCSharp Samples: This contain C#- wpf, wcf, linqtosql, some basic oops stuff etc.DotNet1: DotNet1FireWeb: fefaesfrafasGeoCoder: Basic server side GeoCoder which accesses Google, YAHOO etc. This is developed in C# and is a mix of language styles, just depended what I was doing that week. All the libraries include tests which can be used to check if the libraries still work, it's worth doing this as the consumed webservices may change from time to time. GrandReward: GrandReward is designed to help young and old people to learn by answer questions. It is easy to use and the community can design own topic files for own questions to learn. The project is develop in C# and contains some WPF views. The project homepage is: http://grand-reward.com/helferlein: The helferlein library contains some web controls and tools I used in other projects.helferlein_BabelFish: helferlein_BabelFish is a DNN module to support multilingual features for other helferlein modules. It is developed in C#.helferlein_Form: helferlein_Form is an easy-to-use module for DotNetNuke that allows creating forms. The submissions can be sent by e-mail and/or stored in a database. It's developed in C#.ILSpy/Silverlight — Proof Of Concept: ILSpy refactoring (dire hacking) to make it work in Silverlight.ISGProject: ISGProjectMarcelo Rocha: Trabalhos acadêmicos, Apresentações, Projetos e outras Invencionices de Marcelo Rocha.MosquitoOnline: MosquitoOnline ProjectNET.Studio: Ein kleines Visual Studio für die SchuleOmegle Desktop Client: A desktop client for the anonymous chat site, Omegle. Designed to combat some of the shortcomings of the webpage, the desktop client will have no adverts, and be able to flash the taskbar window, and play a sound on message received. Online PHP IDE: Free Open Source online editor for working with files on FTP. Free and easy deployment.Open JGL: A utillity library using OpenGL, which focus' on graphics and vertex control.Open Tokenization Server: Open Tokenization Server is a simplistic implementation of a tokenization data security service. It allows users to create and retrieve tokens that represent sensitive information.OurCodeNights: Just some small projects we "work" on at our codenights. A good excuse to meet and have a good time ;)Raple: Raple is simple programming languageSea: Set of C# Extension libraries that make life easier.Shai Chi: Shai ChiSharePoint Sandbox Logging: SharePoint Sandbox Logging enables developers and solution designers to easily log events to logging list in your site collection, allowing you to easily trace errors on custom solutions within your site collection, with no need for server access. Perfect for sandbox / Office365. This project contains one sandbox solution with a site collection logging feature. When feature is active - it creates the list and logs. If it is not active - it does not log. Simple. It also contains a sampl...SharePoint Stock Ticker: SharePoint Stock Ticker is a SharePoint 2010 project which consists of a SharePoint Stock Ticker Web Part driven off of Google's Stock API. The width of this Web Part was developed to fit within the default SharePoint 2010 Quick Launch navigation. This has also been tested to run over http and https SharePoint 2010 farms and will not throw a mixed content warning with Internet Explorer.SharpLinx: SharpLinx is an open source social networking application on ASP.Net.Single.Net - less Copy & Paste,less code lines,more easier: make your code life easierSqlMyAdmin: A projecto to create a PHPMyAdmin clone app that access Sql Server.TDB: TDB is a NOSQL Key-value store entirely designed to run in windows environment. It's directly inspired by Redis project and use Redis' command sematincs and Redis' communication protocol to maintain a base compatibility with it.Tomasulo: A simulator of Tomasulo CDB dynamic instruction scheduling algorithm in Java. Authors: Mengyu Zhou @J85 Zhenyao Zhu @J85 Jun Li @J81 @Tsinghua UniversityTransport Broker Management System: Transport Management system will serve the small freight brokers manage the carriers and orders. It has features like manage truck owners, carriers, source, destination, orders. Verification system for MCBS: Verification system for mobile communication base station.VietCard: VietCardVkontakte File converter: ????????? ?????? ??? ???????? ?? ? "?????????" vkontakte.ru. VSGesture for Visual Studio: VSGesture can execute command via mouse gestures within Visual Studio2010. If you have any feedback, please send me an email to powerumc at gmail.com.WCF-Silverlight Helper: Class library to generate WCF DataContractSerializer-serializable and Silverlight (SL) -compatible classes. Useful, for example, if you have an assembly with classes that fail to serialize with DataContractSerializer. T4 Templates are used to auto-generate code.Windows Phone eBook Samples: Windows Phone eBook Samples is a collection of samples associated with the Windows Phone eBoom community initiative.WinPreciousMetal: WinPreciousMetal helps people know the precious metal's states. Because I am a Chinese, my soft mainly focus at the price of metal in RMB.

    Read the article

  • How to access members of an rdf list with rdflib (or plain sparql)

    - by tjb
    What is the best way to access the members of an rdf list? I'm using rdflib (python) but an answer given in plain SPARQL is also ok (this type of answer can be used through rdfextras, a rdflib helper library). I'm trying to access the authors of a particular journal article in rdf produced by Zotero (some fields have been removed for brevity): <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:z="http://www.zotero.org/namespaces/export#" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:bib="http://purl.org/net/biblio#" xmlns:foaf="http://xmlns.com/foaf/0.1/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:prism="http://prismstandard.org/namespaces/1.2/basic/" xmlns:link="http://purl.org/rss/1.0/modules/link/"> <bib:Article rdf:about="http://www.ncbi.nlm.nih.gov/pubmed/18273724"> <z:itemType>journalArticle</z:itemType> <dcterms:isPartOf rdf:resource="urn:issn:0954-6634"/> <bib:authors> <rdf:Seq> <rdf:li> <foaf:Person> <foaf:surname>Lee</foaf:surname> <foaf:givenname>Hyoun Seung</foaf:givenname> </foaf:Person> </rdf:li> <rdf:li> <foaf:Person> <foaf:surname>Lee</foaf:surname> <foaf:givenname>Jong Hee</foaf:givenname> </foaf:Person> </rdf:li> <rdf:li> <foaf:Person> <foaf:surname>Ahn</foaf:surname> <foaf:givenname>Gun Young</foaf:givenname> </foaf:Person> </rdf:li> <rdf:li> <foaf:Person> <foaf:surname>Lee</foaf:surname> <foaf:givenname>Dong Hun</foaf:givenname> </foaf:Person> </rdf:li> <rdf:li> <foaf:Person> <foaf:surname>Shin</foaf:surname> <foaf:givenname>Jung Won</foaf:givenname> </foaf:Person> </rdf:li> <rdf:li> <foaf:Person> <foaf:surname>Kim</foaf:surname> <foaf:givenname>Dong Hyun</foaf:givenname> </foaf:Person> </rdf:li> <rdf:li> <foaf:Person> <foaf:surname>Chung</foaf:surname> <foaf:givenname>Jin Ho</foaf:givenname> </foaf:Person> </rdf:li> </rdf:Seq> </bib:authors> <dc:title>Fractional photothermolysis for the treatment of acne scars: a report of 27 Korean patients</dc:title> <dcterms:abstract>OBJECTIVES: Atrophic post-acne scarring remains a therapeutically challe *CUT*, erythema and edema. CONCLUSIONS: The 1550-nm erbium-doped FP is associated with significant patient-reported improvement in the appearance of acne scars, with minimal downtime.</dcterms:abstract> <bib:pages>45-49</bib:pages> <dc:date>2008</dc:date> <z:shortTitle>Fractional photothermolysis for the treatment of acne scars</z:shortTitle> <dc:identifier> <dcterms:URI> <rdf:value>http://www.ncbi.nlm.nih.gov/pubmed/18273724</rdf:value> </dcterms:URI> </dc:identifier> <dcterms:dateSubmitted>2010-12-06 11:36:52</dcterms:dateSubmitted> <z:libraryCatalog>NCBI PubMed</z:libraryCatalog> <dc:description>PMID: 18273724</dc:description> </bib:Article> <bib:Journal rdf:about="urn:issn:0954-6634"> <dc:title>The Journal of Dermatological Treatment</dc:title> <prism:volume>19</prism:volume> <prism:number>1</prism:number> <dcterms:alternative>J Dermatolog Treat</dcterms:alternative> <dc:identifier>DOI 10.1080/09546630701691244</dc:identifier> <dc:identifier>ISSN 0954-6634</dc:identifier> </bib:Journal>

    Read the article

  • Very simple, terse and easy GUI programming “frameworks”

    - by jetxee
    Please list GUI programming libraries, toolkits, frameworks which allow to write GUI apps quickly. I mean in such a way, that GUI is described entirely in a human-readable (and human-writable) plain text file (code) code is terse (1 or 2 lines of code per widget/event pair), suitable for scripting structure and operation of the GUI is evident from the code (nesting of widgets and flow of events) details about how to build the GUI are hidden (things like mainloop, attaching event listeners, etc.) auto-layouts are supported (vboxes, hboxes, etc.) As answers suggest, this may be defined as declarative GUI programming, but it is not necessarily such. Any approach is OK if it works, is easy to use and terse. There are some GUI libraries/toolkits like this. They are listed below. Please extend the list if you see a qualifying toolkit missing. Indicate if the project is crossplatform, mature, active, and give an example if possible. Please use this wiki to discuss only Open Source projects. This is the list so far (in alphabetical order): Fudgets Fudgets is a Haskell library. Platform: Unix. Status: Experimental, but still maintained. An example: import Fudgets main = fudlogue (shellF "Hello" (labelF "Hello, world!" >+< quitButtonF)) GNUstep Renaissance Renaissance allows to describe GUI in simple XML. Platforms: OSX/GNUstep. Status: part of GNUstep. An example below: <window title="Example"> <vbox> <label font="big"> Click the button below to quit the application </label> <button title="Quit" action="terminate:"/> </vbox> </window> HTML HTML-based GUI (HTML + JS). Crossplatform, mature. Can be used entirely on the client side. Looking for a nice “helloworld” example. JavaFX JavaFX is usable for standalone (desktop) apps as well as for web applications. Not completely crossplatform, not yet completely open source. Status: 1.0 release. An example: Frame { content: Button { text: "Press Me" action: operation() { System.out.println("You pressed me"); } } visible: true } Screenshot is needed. Phooey Phooey is another Haskell library. Crossplatform (wxWidgets), HTML+JS backend planned. Mature and active. An example (a little more than a helloworld): ui1 :: UI () ui1 = title "Shopping List" $ do a <- title "apples" $ islider (0,10) 3 b <- title "bananas" $ islider (0,10) 7 title "total" $ showDisplay (liftA2 (+) a b) PythonCard PythonCard describes GUI in a Python dictionary. Crossplatform (wxWidgets). Some apps use it, but the project seems stalled. There is an active fork. I skip PythonCard example because it is too verbose for the contest. Shoes Shoes for Ruby. Platforms: Win/OSX/GTK+. Status: Young but active. A minimal app looks like this: Shoes.app { @push = button "Push me" @note = para "Nothing pushed so far" @push.click { @note.replace "Aha! Click!" } } Tcl/Tk Tcl/Tk. Crossplatform (its own widget set). Mature (probably even dated) and active. An example: #!/usr/bin/env wish button .hello -text "Hello, World!" -command { exit } pack .hello tkwait window . tekUI tekUI for Lua (and C). Platforms: X11, DirectFB. Status: Alpha (usable, but API still evolves). An example: #/usr/bin/env lua ui = require "tek.ui" ui.Application:new { Children = { ui.Window:new { Title = "Hello", Children = { ui.Text:new { Text = "_Hello, World!", Style = "button", Mode = "button", }, }, }, }, }:run() Treethon Treethon for Python. It describes GUI in a YAML file (Python in a YAML tree). Platform: GTK+. Status: work in proress. A simple app looks like this: _import: gtk view: gtk.Window() add: - view: gtk.Button('Hello World') on clicked: print view.get_label() Yet unnamed Python library by Richard Jones: This one is not released yet. The idea is to use Python context managers (with keyword) to structure GUI code. See Richard Jones' blog for details. with gui.vertical: text = gui.label('hello!') items = gui.selection(['one', 'two', 'three']) with gui.button('click me!'): def on_click(): text.value = items.value text.foreground = red XUL XUL + Javascript may be used to create stand-alone desktop apps with XULRunner as well as Mozilla extensions. Mature, open source, crossplatform. <?xml version="1.0"?> <?xml-stylesheet href="chrome://global/skin/" type="text/css"?> <window id="main" title="My App" width="300" height="300" xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul"> <caption label="Hello World"/> </window> Thank your for contributions!

    Read the article

  • urgent help needed to convert arabic html to pdf

    - by Mariam
    <div> <table border="1" width="500px"> <tr> <td colspan="2"> aspdotnetcodebook ????? ???????</td> </tr> <tr> <td> cell1 </td> <td> cell2 </td> </tr> <tr> <td colspan="2"> <asp:Label ID="lblLabel" runat="server" Text=""></asp:Label> <img alt="" src="logo.gif" style="width: 174px; height: 40px" /></td> </tr> <tr> <td colspan="2" dir="rtl"> <h1> <img alt="" height="168" src="http://a.cksource.com/c/1/inc/img/demo-little-red.jpg" style="margin-left: 10px; margin-right: 10px; float: left;" width="120" />????? ????? ??? ??? ?? ?? ??</h1> <p> &quot;<b>Little Red Riding Hood</b>&quot; is a famous <a href="http://en.wikipedia.org/wiki/Fairy_tale" title="Fairy tale">fairy tale</a> about a young girl&#39;s encounter with a wolf. The story has been changed considerably in its history and subject to numerous modern adaptations and readings.</p> <table align="right" border="1" cellpadding="1" cellspacing="1" style="width: 200px;"> <caption> <strong>International Names</strong></caption> <tr> <td> ????? ???????</td> <td> &nbsp;</td> </tr> <tr> <td> Italian</td> <td> <i>Cappuccetto Rosso</i></td> </tr> <tr> <td> Spanish</td> <td> <i>Caperucita Roja</i></td> </tr> </table> <p> The version most widely known today is based on the <a href="http://en.wikipedia.org/wiki/Brothers_Grimm" title="Brothers Grimm"> Brothers Grimm</a> variant. It is about a girl called Little Red Riding Hood, after the red <a href="http://en.wikipedia.org/wiki/Hood_(headgear%2529" title="Hood (headgear)">hooded</a> <a href="http://en.wikipedia.org/wiki/Cape" title="Cape">cape</a> or <a href="http://en.wikipedia.org/wiki/Cloak" title="Cloak">cloak</a> she wears. The girl walks through the woods to deliver food to her sick grandmother.</p> <p> A wolf wants to eat the girl but is afraid to do so in public. He approaches the girl, and she naïvely tells him where she is going. He suggests the girl pick some flowers, which she does. In the meantime, he goes to the grandmother&#39;s house and gains entry by pretending to be the girl. He swallows the grandmother whole, and waits for the girl, disguised as the grandmother.</p> <p> When the girl arrives, she notices he looks very strange to be her grandma. In most retellings, this eventually culminates with Little Red Riding Hood saying, &quot;My, what big teeth you have!&quot;<br /> To which the wolf replies, &quot;The better to eat you with,&quot; and swallows her whole, too.</p> <p> A <a href="http://en.wikipedia.org/wiki/Hunter" title="Hunter">hunter</a>, however, comes to the rescue and cuts the wolf open. Little Red Riding Hood and her grandmother emerge unharmed. They fill the wolf&#39;s body with heavy stones, which drown him when he falls into a well. Other versions of the story have had the grandmother shut in the closet instead of eaten, and some have Little Red Riding Hood saved by the hunter as the wolf advances on her rather than after she is eaten.</p> <p> The tale makes the clearest contrast between the safe world of the village and the dangers of the <a href="http://en.wikipedia.org/wiki/Enchanted_forest" title="Enchanted forest">forest</a>, conventional antitheses that are essentially medieval, though no written versions are as old as that.</p> </td> </tr> </table> </div> i use itextsharp to convert this content which is stored in DB to pdf file to be downloaded to the user i cant achieve this

    Read the article

  • Array help Index out of range exeption was unhandled

    - by Michael Quiles
    I am trying to populate combo boxes from a text file using comma as a delimiter everything was working fine, but now when I debug I get the "Index out of range exeption was unhandled" warning. I guess I need a fresh pair of eyes to see where I went wrong, I commented on the line that gets the error //Fname = fields[1]; using System; using System.Collections.Generic; using System.ComponentModel; using System.Data; using System.Drawing; using System.Drawing.Printing; using System.Linq; using System.Text; using System.Windows.Forms; using System.IO; namespace Sullivan_Payroll { public partial class xEmpForm : Form { bool complete = false; public xEmpForm() { InitializeComponent(); } private void xEmpForm_Resize(object sender, EventArgs e) { this.xCenterPanel.Left = Convert.ToInt16((this.Width - this.xCenterPanel.Width) / 2); this.xCenterPanel.Top = Convert.ToInt16((this.Height - this.xCenterPanel.Height) / 2); Refresh(); } private void exitToolStripMenuItem_Click(object sender, EventArgs e) { //Exits the application this.Close(); } private void xEmpForm_FormClosing(object sender, FormClosingEventArgs e) //use this on xtrip calculator { DialogResult Response; if (complete == true) { Application.Exit(); } else { Response = MessageBox.Show("Are you sure you want to Exit?", "Exit", MessageBoxButtons.YesNo, MessageBoxIcon.Question, MessageBoxDefaultButton.Button2); if (Response == DialogResult.No) { complete = false; e.Cancel = true; } else { complete = true; Application.Exit(); } } } private void xEmpForm_Load(object sender, EventArgs e) { //file sources string fileDept = "source\\Department.txt"; string fileSex = "source\\Sex.txt"; string fileStatus = "source\\Status.txt"; if (File.Exists(fileDept)) { using (System.IO.StreamReader sr = System.IO.File.OpenText(fileDept)) { string dept = ""; while ((dept = sr.ReadLine()) != null) { this.xDeptComboBox.Items.Add(dept); } } } else { MessageBox.Show("The Department file can not be found.", "Error", MessageBoxButtons.OK, MessageBoxIcon.Error); } if (File.Exists(fileSex)) { using (System.IO.StreamReader sr = System.IO.File.OpenText(fileSex)) { string sex = ""; while ((sex = sr.ReadLine()) != null) { this.xSexComboBox.Items.Add(sex); } } } else { MessageBox.Show("The Sex file can not be found.", "Error", MessageBoxButtons.OK, MessageBoxIcon.Error); } if (File.Exists(fileStatus)) { using (System.IO.StreamReader sr = System.IO.File.OpenText(fileStatus)) { string status = ""; while ((status = sr.ReadLine()) != null) { this.xStatusComboBox.Items.Add(status); } } } else { MessageBox.Show("The Status file can not be found.", "Error", MessageBoxButtons.OK, MessageBoxIcon.Error); } } private void xFileSaveMenuItem_Click(object sender, EventArgs e) { { const string fileNew = "source\\New Staff.txt"; string recordIn; FileStream outFile = new FileStream(fileNew, FileMode.Create, FileAccess.Write); StreamWriter writer = new StreamWriter(outFile); for (int count = 0; count <= this.xEmployeeListBox.Items.Count - 1; count++) { this.xEmployeeListBox.SelectedIndex = count; recordIn = this.xEmployeeListBox.SelectedItem.ToString(); writer.WriteLine(recordIn); } writer.Close(); outFile.Close(); this.xDeptComboBox.SelectedIndex = -1; this.xStatusComboBox.SelectedIndex = -1; this.xSexComboBox.SelectedIndex = -1; MessageBox.Show("your file is saved"); } } private void xViewFacultyMenuItem_Click(object sender, EventArgs e) { const string fileStaff = "source\\Staff.txt"; const char DELIM = ','; string Lname, Fname, Depart, Stat, Sex, Salary, cDept, cStat, cSex; double Gtotal; string recordIn; string[] fields; cDept = this.xDeptComboBox.SelectedItem.ToString(); cStat = this.xStatusComboBox.SelectedItem.ToString(); cSex = this.xSexComboBox.SelectedItem.ToString(); FileStream inFile = new FileStream(fileStaff, FileMode.Open, FileAccess.Read); StreamReader reader = new StreamReader(inFile); recordIn = reader.ReadLine(); while (recordIn != null) { fields = recordIn.Split(DELIM); Lname = fields[0]; Fname = fields[1]; // this is where the error appears Depart = fields[2]; Stat = fields[3]; Sex = fields[4]; Salary = fields[5]; Fname = fields[1].TrimStart(null); Depart = fields[2].TrimStart(null); Stat = fields[3].TrimStart(null); Sex = fields[4].TrimStart(null); Salary = fields[5].TrimStart(null); Gtotal = double.Parse(Salary); if (Depart == cDept && cStat == Stat && cSex == Sex) { this.xEmployeeListBox.Items.Add(recordIn); } recordIn = reader.ReadLine(); } reader.Close(); inFile.Close(); if (this.xEmployeeListBox.Items.Count >= 1) { this.xFileSaveMenuItem.Enabled = true; this.xFilePrintMenuItem.Enabled = true; this.xEditClearMenuItem.Enabled = true; } else { this.xFileSaveMenuItem.Enabled = false; this.xFilePrintMenuItem.Enabled = false; this.xEditClearMenuItem.Enabled = false; MessageBox.Show("Records not found"); } } private void xEditClearMenuItem_Click(object sender, EventArgs e) { this.xEmployeeListBox.Items.Clear(); this.xDeptComboBox.SelectedIndex = -1; this.xStatusComboBox.SelectedIndex = -1; this.xSexComboBox.SelectedIndex = -1; this.xFileSaveMenuItem.Enabled = false; this.xFilePrintMenuItem.Enabled = false; this.xEditClearMenuItem.Enabled = false; } } } Source file -- Anderson, Kristen, Accounting, Assistant, Female, 43155 Ball, Robin, Accounting, Instructor, Female, 42723 Chin, Roger, Accounting, Full, Male,59281 Coats, William, Accounting, Assistant, Male, 45371 Doepke, Cheryl, Accounting, Full, Female, 52105 Downs, Clifton, Accounting, Associate, Male, 46887 Garafano, Karen, Finance, Associate, Female, 49000 Hill, Trevor, Management, Instructor, Male, 38590 Jackson, Carole, Accounting, Instructor, Female, 38781 Jacobson, Andrew, Management, Full, Male, 56281 Lewis, Karl, Management, Associate, Male, 48387 Mack, Kevin, Management, Assistant, Male, 45000 McKaye, Susan, Management, Instructor, Female, 43979 Nelsen, Beth, Finance, Full, Female, 52339 Nelson, Dale, Accounting, Full, Male, 54578 Palermo, Sheryl, Accounting, Associate, Female, 45617 Rais, Mary, Finance, Instructor, Female, 27000 Scheib, Earl, Management, Instructor, Male, 37389 Smith, Tom, Finance, Full, Male, 57167 Smythe, Janice, Management, Associate, Female, 46887 True, David, Accounting, Full, Male, 53181 Young, Jeff, Management, Assistant, Male, 43513

    Read the article

  • SQLAuthority News – TechEd India – April 12-14, 2010 Bangalore – An Unforgettable Experience – An Op

    - by pinaldave
    TechEd India was one of the largest Technology events in India led by Microsoft. This event was attended by more than 3,000 technology enthusiasts, making it one of the most well-organized events of the year. Though I attempted to attend almost all the technology events here, I have not seen any bigger or better event in Indian subcontinents other than this. There are 21 Technical Tracks at Tech·Ed India 2010 that span more than 745 learning opportunities. I was fortunate enough to be a part of this whole event as a speaker and a delegate, as well. TechEd India Speaker Badge and A Token of Lifetime Hotel Selection I presented three different sessions at TechEd India and was also a part of panel discussion. (The details of the sessions are given at the end of this blog post.) Due to extensive traveling, I stay away from my family occasionally. For this reason, I took my wife – Nupur and daughter Shaivi (8 months old) to the event along with me. We stayed at the same hotel where the event was organized so as to maximize my time bonding with my family and to have more time in networking with technology community, at the same time. The hotel Lalit Ashok is the largest and most luxurious venue one can find in Bangalore, located in the middle of the city. The cost of the hotel was a bit pricey, but looking at all the advantages, I had decided to ask for a booking there. Hotel Lalit Ashok Nupur Dave and Shaivi Dave Arrival Day – DAY 0 – April 11, 2010 I reached the event a day earlier, and that was one wise decision for I was able to relax a bit and go over my presentation for the next day’s course. I am a kind of person who likes to get everything ready ahead of time. I was also able to enjoy a pleasant evening with several Microsoft employees and my family friends. I even checked out the location where I would be doing presentations the next day. I was fortunate enough to meet Bijoy Singhal from Microsoft who helped me out with a few of the logistics issues that occured the day before. I was not aware of the fact that the very next day he was going to be “The Man” of the TechEd 2010 event. Vinod Kumar from Microsoft was really very kind as he talked to me regarding my subsequent session. He gave me some suggestions which were really helpful that I was able to incorporate them during my presentation. Finally, I was able to meet Abhishek Kant from Microsoft; his valuable suggestions and unlimited passion have inspired many people like me to work with the Community. Pradipta from Microsoft was also around, being extremely busy with logistics; however, in those busy times, he did find some good spare time to have a chat with me and the other Community leaders. I also met Harish Ranganathan and Sachin Rathi, both from Microsoft. It was so interesting to listen to both of them talking about SharePoint. I just have no words to express my overwhelmed spirit because of all these passionate young guys - Pradipta,Vinod, Bijoy, Harish, Sachin and Ahishek (of course!). Map of TechEd India 2010 Event Day 1 – April 12, 2010 From morning until night time, today was truly a very busy day for me. I had two presentations and one panel discussion for the day. Needless to say, I had a few meetings to attend as well. The day started with a keynote from S. Somaseger where he announced the launch of Visual Studio 2010. The keynote area was really eye-catching because of the very large, bigger-than- life uniform screen. This was truly one to show. The title music of the keynote was very interesting and it featured Bijoy Singhal as the model. It was interesting to talk to him afterwards, when we laughed at jokes together about his modeling assignment. TechEd India Keynote Opening Featuring Bijoy TechEd India 2010 Keynote – S. Somasegar Time: 11:15pm – 11:45pm Session 1: True Lies of SQL Server – SQL Myth Buster Following the excellent keynote, I had my very first session on the subject of SQL Server Myth Buster. At first, I was a bit nervous as right after the keynote, for this was my very first session and during my presentation I saw lots of Microsoft Product Team members. Well, it really went well and I had a really good discussion with attendees of the session. I felt that a well begin was half-done and my confidence was regained. Right after the session, I met a few of my Community friends and had meaningful discussions with them on many subjects. The abstract of the session is as follows: In this 30-minute demo session, I am going to briefly demonstrate few SQL Server Myths and their resolutions as I back them up with some demo. This demo presentation is a must-attend for all developers and administrators who would come to the event. This is going to be a very quick yet fun session. Pinal Presenting session at TechEd India 2010 Time: 1:00 PM – 2:00 PM Lunch with Somasegar After the session I went to see my daughter, and then I headed right away to the lunch with S. Somasegar – the keynote speaker and senior vice president of the Developer Division at Microsoft. I really thank to Abhishek who made it possible for us. Because of his efforts, all the MVPs had the opportunity to meet such a legendary person and had to talk with them on Microsoft Technology. Though Somasegar is currently holding such a high position in Microsoft, he is very polite and a real gentleman, and how I wish that everybody in industry is like him. Believe me, if you spread love and kindness, then that is what you will receive back. As soon as lunch time was over, I ran to the session hall as my second presentation was about to start. Time: 2:30pm – 3:30pm Session 2: Master Data Services in Microsoft SQL Server 2008 R2 Business Intelligence is a subject which was widely talked about at TechEd. Everybody was interested in this subject, and I did not excuse myself from this great concept as well. I consider myself fortunate as I was presenting on the subject of Master Data Services at TechEd. When I had initially learned this subject, I had a bit of confusion about the usage of this tool. Later on, I decided that I would tackle about how we all developers and DBAs are not able to understand something so simple such as this, and even worst, creating confusion about the technology. During system designing, it is very important to have a reference material or master lookup tables. Well, I talked about the same subject and presented the session keeping that as my center talk. The session went very well and I received lots of interesting questions. I got many compliments for talking about this subject on the real-life scenario. I really thank Rushabh Mehta (CEO, Solid Quality Mentors India) for his supportive suggestions that helped me prepare the slide deck, as well as the subject. Pinal Presenting session at TechEd India 2010 The abstract of the session is as follows: SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft’s platform appeal. This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables consistent decision-making process by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, MDS – Master Data-hub which is a vital component, helps ensure the consistency of reporting across systems and deliver faster and more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise. Pinal Presenting session at TechEd India 2010 The day was still not over for me. I had ran into several friends but we were not able keep our enthusiasm under control about all the rumors saying that SQL Server 2008 R2 was about to be launched tomorrow in the keynote. I then ran to my third and final technical event for the day- a panel discussion with the top technologies of India. Time: 5:00pm – 6:00pm Panel Discussion: Harness the power of Web – SEO and Technical Blogging As I have delivered two technical sessions by this time, I was a bit tired but  not less enthusiastic when I had to talk about Blog and Technology. We discussed many different topics there. I told them that the most important aspect for any blog is its content. We discussed in depth the issues with plagiarism and how to avoid it. Another topic of discussion was how we technology bloggers can create awareness in the Community about what the right kind of blogging is and what morally and technically wrong acts are. A couple of questions were raised about what type of liberty a person can have in terms of writing blogs. Well, it was generically agreed that a blog is mainly a representation of our ideas and thoughts; it should not be governed by external entities. As long as one is writing what they really want to say, but not providing incorrect information or not practicing plagiarism, a blogger should be allowed to express himself. This panel discussion was supposed to be over in an hour, but the interest of the participants was remarkable and so it was extended for 30 minutes more. Finally, we decided to bring to a close the discussion and agreed that we will continue the topic next year. TechEd India Panel Discussion on Web, Technology and SEO Surprisingly, the day was just beginning after doing all of these. By this time, I have almost met all the MVP who arrived at the event, as well as many Microsoft employees. There were lots of Community folks present, too. I decided that I would go to meet several friends from the Community and continue to communicate with me on SQLAuthority.com. I also met Abhishek Baxi and had a good talk with him regarding Win Mobile and Twitter. He also took a very quick video of me wherein I spoke in my mother’s tongue, Gujarati. It was funny that I talked in Gujarati almost all the day, but when I was talking in the interview I could not find the right Gujarati words to speak. I think we all think in English when we think about Technology, so as to address universality. After meeting them, I headed towards the Speakers’ Dinner. Time: 8:00 PM – onwards Speakers Dinner The Speakers’ dinner was indeed a wonderful opportunity for all the speakers to get together and relax. We talked so many different things, from XBOX to Hindi Movies, and from SQL to Samosas. I just could not express how much fun I had. After a long evening, when I returned tmy room and met Shaivi, I just felt instantly relaxed. Kids are really gifts from God. Today was a really long but exciting day. So many things happened in just one day: Visual Studio Lanch, lunch with Somasegar, 2 technical sessions, 1 panel discussion, community leaders meeting, speakers dinner and, last but not leas,t playing with my child! A perfect day! Day 2 – April 13, 2010 Today started with a bang with the excellent keynote by Kamal Hathi who launched SQL Server 2008 R2 in India and demonstrated the power of PowerPivot to all of us. 101 Million Rows in Excel brought lots of applause from the audience. Kamal Hathi Presenting Keynote at TechEd India 2010 The day was a bit easier one for me. I had no sessions today and no events planned. I had a few meetings planned for the second day of the event. I sat in the speaker’s lounge for half a day and met many people there. I attended nearly 9 different meetings today. The subjects of the meetings were very different. Here is a list of the topics of the Community-related meetings: SQL PASS and its involvement in India and subcontinents How to start community blogging Forums and developing aptitude towards technology Ahmedabad/Gandhinagar User Groups and their developments SharePoint and SQL Business Meeting – a client meeting Business Meeting – a potential performance tuning project Business Meeting – Solid Quality Mentors (SolidQ) And family friends Pinal Dave at TechEd India The day passed by so quickly during this meeting. In the evening, I headed to Partners Expo with friends and checked out few of the booths. I really wanted to talk about some of the products, but due to the freebies there was so much crowd that I finally decided to just take the contact details of the partner. I will now start sending them with my queries and, hopefully, I will have my questions answered. Nupur and Shaivi had also one meeting to attend; it was with our family friend Vijay Raj. Vijay is also a person who loves Technology and loves it more than anybody. I see him growing and learning every day, but still remaining as a ‘human’. I believe that if someone acquires as much knowledge as him, that person will become either a computer or cyborg. Here, Vijay is still a kind gentleman and is able to stay as our close family friend. Shaivi was really happy to play with Uncle Vijay. Pinal Dave and Vijay Raj Renuka Prasad, a Microsoft MVP, impressed me with his passion and knowledge of SQL. Every time he gives me credit for his success, I believe that he is very humble. He has way more certifications than me and has worked many more years with SQL compared to me. He is an excellent photographer as well. Most of the photos in this blog post have been taken by him. I told him if ever he wants to do a part time job, he can do the photography very well. Pinal Dave and Renuka Prasad I also met L Srividya from Microsoft, whom I was looking forward to meet. She is a bundle of knowledge that everyone would surely learn a lot from her. I was able to get a few minutes from her and well, I felt confident. She enlightened me with SQL Server BI concepts, domain management and SQL Server security and few other interesting details. I also had a wonderful time talking about SharePoint with fellow Solid Quality Mentor Joy Rathnayake. He is very passionate about SharePoint but when you talk .NET and SQL with him, he is still overwhelmingly knowledgeable. In fact, while talking to him, I figured out that the recent training he delivered was on SQL Server 2008 R2. I told him a joke that it hurts my ego as he is more popular now in SQL training and consulting than me. I am sure all of you agree that working with good people is a gift from God. I am fortunate enough to work with the best of the best Industry experts. It was a great pleasure to hang out with my Community friends – Ahswin Kini, HimaBindu Vejella, Vasudev G, Suprotim Agrawal, Dhananjay, Vikram Pendse, Mahesh Dhola, Mahesh Mitkari,  Manu Zacharia, Shobhan, Hardik Shah, Ashish Mohta, Manan, Subodh Sohani and Sanjay Shetty (of course!) .  (Please let me know if I have met you at the event and forgot your name to list here). Time: 8:00 PM – onwards Community Leaders Dinner After lots of meetings, I headed towards the Community Leaders dinner meeting and met almost all the folks I met in morning. The discussion was almost the same but the real good thing was that we were enjoying it. The food was really good. Nupur was invited in the event, but Shaivi could not come. When Nupur tried to enter the event, she was stopped as Shaivi did not have the pass to enter the dinner. Nupur expressed that Shaivi is only 8 months old and does not eat outside food as well and could not stay by herself at this age, but the door keeper did not agree and asked that without the entry details Shaivi could not go in, but Nupur could. Nupur called me on phone and asked me to help her out. By the time, I was outside; the organizer of the event reached to the door and happily approved Shaivi to join the party. Once in the party, Shaivi had lots of fun meeting so many people. Shaivi Dave and Abhishek Kant Dean Guida (Infragistics President and CEO) and Pinal Dave (SQLAuthority.com) Day 3 – April 14, 2010 Though, it was last day, I was very much excited today as I was about to present my very favorite session. Query Optimization and Performance Tuning is my domain expertise and I make my leaving by consulting and training the same. Today’s session was on the same subject and as an additional twist, another subject about Spatial Database was presented. I was always intrigued with Spatial Database and I have enjoyed learning about it; however, I have never thought about Spatial Indexing before it was decided that I will do this session. I really thank Solid Quality Mentor Dr. Greg Low for his assistance in helping me prepare the slide deck and also review the content. Furthermore, today was really what I call my ‘learning day’ . So far I had not attended any session in TechEd and I felt a bit down for that. Everybody spends their valuable time & money to learn something new and exciting in TechEd and I had not attended a single session at the moment thinking that it was already last day of the event. I did have a plan for the day and I attended two technical sessions before my session of spatial database. I attended 2 sessions of Vinod Kumar. Vinod is a natural storyteller and there was no doubt that his sessions would be jam-packed. People attended his sessions simply because Vinod is syhe speaker. He did not have a single time disappointed audience; he is truly a good speaker. He knows his stuff very well. I personally do not think that in India he can be compared to anyone for SQL. Time: 12:30pm-1:30pm SQL Server Query Optimization, Execution and Debugging Query Performance I really had a fun time attending this session. Vinod made this session very interactive. The entire audience really got into the presentation and started participating in the event. Vinod was presenting a small problem with Query Tuning, which any developer would have encountered and solved with their help in such a fashion that a developer feels he or she have already resolved it. In one question, I was the only one who was ready to answer and Vinod told me in a light tone that I am now allowed to answer it! The audience really found it very amusing. There was a huge crowd around Vinod after the session. Vinod – A master storyteller! Time: 3:45pm-4:45pm Data Recovery / consistency with CheckDB This session was much heavier than the earlier one, and I must say this is my most favorite session I EVER attended in India. In this TechEd I have only attended two sessions, but in my career, I have attended numerous technical sessions not only in India, but all over the world. This session had taken my breath away. One by one, Vinod took the different databases, and started to corrupt them in different ways. Each database has some unique ways to get corrupted. Once that was done, Vinod started to show the DBCC CEHCKDB and demonstrated how it can solve your problem. He finally fixed all the databases with this single tool. I do have a good knowledge of this subject, but let me honestly admit that I have learned a lot from this session. I enjoyed and cheered during this session along with other attendees. I had total satisfaction that, just like everyone, I took advantage of the event and learned something. I am now TECHnically EDucated. Pinal Dave and Vinod Kumar After two very interactive and informative SQL Sessions from Vinod Kumar, the next turn me presenting on Spatial Database and Indexing. I got once again nervous but Vinod told me to stay natural and do my presentation. Well, once I got a huge stage with a total of four projectors and a large crowd, I felt better. Time: 5:00pm-6:00pm Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing Pinal Presenting session at TechEd India 2010 Pinal Presenting session at TechEd India 2010 I kicked off this session with Michael J Swart‘s beautiful spatial image. This session was the last one for the day but, to my surprise, I had more than 200+ attendees. Slowly, the rain was starting outside and I was worried that the hall would not be full; despite this, there was not a single seat available in the first five minutes of the session. Thanks to all of you for attending my presentation. I had demonstrated the map of world (and India) and quickly explained what  Geographic and Geometry data types in Spatial Database are. This session had interesting story of Indexing and Comparison, as well as how different traditional indexes are from spatial indexing. Pinal Presenting session at TechEd India 2010 Due to the heavy rain during this event, the power went off for about 22 minutes (just an accident – nobodies fault). During these minutes, there were no audio, no video and no light. I continued to address the mass of 200+ people without any audio device and PowerPoint. I must thank the audience because not a single person left from the session. They all stayed in their place, some moved closure to listen to me properly. I noticed that the curiosity and eagerness to learn new things was at the peak even though it was the very last session of the TechEd. Everybody wanted get the maximum knowledge out of this whole event. I was touched by the support from audience. They listened and participated in my session even without any kinds of technology (no ppt, no mike, no AC, nothing). During these 22 minutes, I had completed my theory verbally. Pinal Presenting session at TechEd India 2010 After a while, we got the projector back online and we continued with some exciting demos. Many thanks to Microsoft people who worked energetically in background to get the backup power for project up. I had a very interesting demo wherein I overlaid Bangalore and Hyderabad on the India Map and find their aerial distance between them. After finding the aerial distance, we browsed online and found that SQL Server estimates the exact aerial distance between these two cities, as compared to the factual distance. There was a huge applause from the crowd on the subject that SQL Server takes into the count of the curvature of the earth and finds the precise distances based on details. During the process of finding the distance, I demonstrated a few examples of the indexes where I expressed how one can use those indexes to find these distances and how they can improve the performance of similar query. I also demonstrated few examples wherein we were able to see in which data type the Index is most useful. We finished the demos with a few more internal stuff. Pinal Presenting session at TechEd India 2010 Despite all issues, I was mostly satisfied with my presentation. I think it was the best session I have ever presented at any conference. There was no help from Technology for a while, but I still got lots of appreciation at the end. When we ended the session, the applause from the audience was so loud that for a moment, the rain was not audible. I was truly moved by the dedication of the Technology enthusiasts. Pinal Dave After Presenting session at TechEd India 2010 The abstract of the session is as follows: The Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use spatial functionality in next version of SQL Server to build and optimize spatial queries. This session outlines the new geography data type to store geodetic spatial data and perform operations on it, use the new geometry data type to store planar spatial data and perform operations on it, take advantage of new spatial indexes for high performance queries, use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications and much more. Time: 8:00 PM – onwards Dinner by Sponsors After the lively session during the day, there was another dinner party courtesy of one of the sponsors of TechEd. All the MVPs and several Community leaders were present at the dinner. I would like to express my gratitude to Abhishek Kant for organizing this wonderful event for us. It was a blast and really relaxing in all angles. We all stayed there for a long time and talked about our sweet and unforgettable memories of the event. Pinal Dave and Bijoy Singhal It was really one wonderful event. After writing this much, I say that I have no words to express about how much I enjoyed TechEd. However, it is true that I shared with you only 1% of the total activities I have done at the event. There were so many people I have met, yet were not mentioned here although I wanted to write their names here, too . Anyway, I have learned so many things and up until now, I am not able to get over all the fun I had in this event. Pinal Dave at TechEd India 2010 The Next Days – April 15, 2010 – till today I am still not able to get my mind out of the whole experience I had at TechEd India 2010. It was like a whole Microsoft Family working together to celebrate a happy occasion. TechEd India – Truly An Unforgettable Experience! Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, SQLServer, T SQL, Technology Tagged: TechEd, TechEdIn

    Read the article

  • SQLAuthority News – TechEd India – April 12-14, 2010 Bangalore – An Unforgettable Experience – An Op

    - by pinaldave
    TechEd India was one of the largest Technology events in India led by Microsoft. This event was attended by more than 3,000 technology enthusiasts, making it one of the most well-organized events of the year. Though I attempted to attend almost all the technology events here, I have not seen any bigger or better event in Indian subcontinents other than this. There are 21 Technical Tracks at Tech·Ed India 2010 that span more than 745 learning opportunities. I was fortunate enough to be a part of this whole event as a speaker and a delegate, as well. TechEd India Speaker Badge and A Token of Lifetime Hotel Selection I presented three different sessions at TechEd India and was also a part of panel discussion. (The details of the sessions are given at the end of this blog post.) Due to extensive traveling, I stay away from my family occasionally. For this reason, I took my wife – Nupur and daughter Shaivi (8 months old) to the event along with me. We stayed at the same hotel where the event was organized so as to maximize my time bonding with my family and to have more time in networking with technology community, at the same time. The hotel Lalit Ashok is the largest and most luxurious venue one can find in Bangalore, located in the middle of the city. The cost of the hotel was a bit pricey, but looking at all the advantages, I had decided to ask for a booking there. Hotel Lalit Ashok Nupur Dave and Shaivi Dave Arrival Day – DAY 0 – April 11, 2010 I reached the event a day earlier, and that was one wise decision for I was able to relax a bit and go over my presentation for the next day’s course. I am a kind of person who likes to get everything ready ahead of time. I was also able to enjoy a pleasant evening with several Microsoft employees and my family friends. I even checked out the location where I would be doing presentations the next day. I was fortunate enough to meet Bijoy Singhal from Microsoft who helped me out with a few of the logistics issues that occured the day before. I was not aware of the fact that the very next day he was going to be “The Man” of the TechEd 2010 event. Vinod Kumar from Microsoft was really very kind as he talked to me regarding my subsequent session. He gave me some suggestions which were really helpful that I was able to incorporate them during my presentation. Finally, I was able to meet Abhishek Kant from Microsoft; his valuable suggestions and unlimited passion have inspired many people like me to work with the Community. Pradipta from Microsoft was also around, being extremely busy with logistics; however, in those busy times, he did find some good spare time to have a chat with me and the other Community leaders. I also met Harish Ranganathan and Sachin Rathi, both from Microsoft. It was so interesting to listen to both of them talking about SharePoint. I just have no words to express my overwhelmed spirit because of all these passionate young guys - Pradipta,Vinod, Bijoy, Harish, Sachin and Ahishek (of course!). Map of TechEd India 2010 Event Day 1 – April 12, 2010 From morning until night time, today was truly a very busy day for me. I had two presentations and one panel discussion for the day. Needless to say, I had a few meetings to attend as well. The day started with a keynote from S. Somaseger where he announced the launch of Visual Studio 2010. The keynote area was really eye-catching because of the very large, bigger-than- life uniform screen. This was truly one to show. The title music of the keynote was very interesting and it featured Bijoy Singhal as the model. It was interesting to talk to him afterwards, when we laughed at jokes together about his modeling assignment. TechEd India Keynote Opening Featuring Bijoy TechEd India 2010 Keynote – S. Somasegar Time: 11:15pm – 11:45pm Session 1: True Lies of SQL Server – SQL Myth Buster Following the excellent keynote, I had my very first session on the subject of SQL Server Myth Buster. At first, I was a bit nervous as right after the keynote, for this was my very first session and during my presentation I saw lots of Microsoft Product Team members. Well, it really went well and I had a really good discussion with attendees of the session. I felt that a well begin was half-done and my confidence was regained. Right after the session, I met a few of my Community friends and had meaningful discussions with them on many subjects. The abstract of the session is as follows: In this 30-minute demo session, I am going to briefly demonstrate few SQL Server Myths and their resolutions as I back them up with some demo. This demo presentation is a must-attend for all developers and administrators who would come to the event. This is going to be a very quick yet fun session. Pinal Presenting session at TechEd India 2010 Time: 1:00 PM – 2:00 PM Lunch with Somasegar After the session I went to see my daughter, and then I headed right away to the lunch with S. Somasegar – the keynote speaker and senior vice president of the Developer Division at Microsoft. I really thank to Abhishek who made it possible for us. Because of his efforts, all the MVPs had the opportunity to meet such a legendary person and had to talk with them on Microsoft Technology. Though Somasegar is currently holding such a high position in Microsoft, he is very polite and a real gentleman, and how I wish that everybody in industry is like him. Believe me, if you spread love and kindness, then that is what you will receive back. As soon as lunch time was over, I ran to the session hall as my second presentation was about to start. Time: 2:30pm – 3:30pm Session 2: Master Data Services in Microsoft SQL Server 2008 R2 Business Intelligence is a subject which was widely talked about at TechEd. Everybody was interested in this subject, and I did not excuse myself from this great concept as well. I consider myself fortunate as I was presenting on the subject of Master Data Services at TechEd. When I had initially learned this subject, I had a bit of confusion about the usage of this tool. Later on, I decided that I would tackle about how we all developers and DBAs are not able to understand something so simple such as this, and even worst, creating confusion about the technology. During system designing, it is very important to have a reference material or master lookup tables. Well, I talked about the same subject and presented the session keeping that as my center talk. The session went very well and I received lots of interesting questions. I got many compliments for talking about this subject on the real-life scenario. I really thank Rushabh Mehta (CEO, Solid Quality Mentors India) for his supportive suggestions that helped me prepare the slide deck, as well as the subject. Pinal Presenting session at TechEd India 2010 The abstract of the session is as follows: SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft’s platform appeal. This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables consistent decision-making process by allowing you to create, manage and propagate changes from a single master view of your business entities. Also, MDS – Master Data-hub which is a vital component, helps ensure the consistency of reporting across systems and deliver faster and more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise. Pinal Presenting session at TechEd India 2010 The day was still not over for me. I had ran into several friends but we were not able keep our enthusiasm under control about all the rumors saying that SQL Server 2008 R2 was about to be launched tomorrow in the keynote. I then ran to my third and final technical event for the day- a panel discussion with the top technologies of India. Time: 5:00pm – 6:00pm Panel Discussion: Harness the power of Web – SEO and Technical Blogging As I have delivered two technical sessions by this time, I was a bit tired but  not less enthusiastic when I had to talk about Blog and Technology. We discussed many different topics there. I told them that the most important aspect for any blog is its content. We discussed in depth the issues with plagiarism and how to avoid it. Another topic of discussion was how we technology bloggers can create awareness in the Community about what the right kind of blogging is and what morally and technically wrong acts are. A couple of questions were raised about what type of liberty a person can have in terms of writing blogs. Well, it was generically agreed that a blog is mainly a representation of our ideas and thoughts; it should not be governed by external entities. As long as one is writing what they really want to say, but not providing incorrect information or not practicing plagiarism, a blogger should be allowed to express himself. This panel discussion was supposed to be over in an hour, but the interest of the participants was remarkable and so it was extended for 30 minutes more. Finally, we decided to bring to a close the discussion and agreed that we will continue the topic next year. TechEd India Panel Discussion on Web, Technology and SEO Surprisingly, the day was just beginning after doing all of these. By this time, I have almost met all the MVP who arrived at the event, as well as many Microsoft employees. There were lots of Community folks present, too. I decided that I would go to meet several friends from the Community and continue to communicate with me on SQLAuthority.com. I also met Abhishek Baxi and had a good talk with him regarding Win Mobile and Twitter. He also took a very quick video of me wherein I spoke in my mother’s tongue, Gujarati. It was funny that I talked in Gujarati almost all the day, but when I was talking in the interview I could not find the right Gujarati words to speak. I think we all think in English when we think about Technology, so as to address universality. After meeting them, I headed towards the Speakers’ Dinner. Time: 8:00 PM – onwards Speakers Dinner The Speakers’ dinner was indeed a wonderful opportunity for all the speakers to get together and relax. We talked so many different things, from XBOX to Hindi Movies, and from SQL to Samosas. I just could not express how much fun I had. After a long evening, when I returned tmy room and met Shaivi, I just felt instantly relaxed. Kids are really gifts from God. Today was a really long but exciting day. So many things happened in just one day: Visual Studio Lanch, lunch with Somasegar, 2 technical sessions, 1 panel discussion, community leaders meeting, speakers dinner and, last but not leas,t playing with my child! A perfect day! Day 2 – April 13, 2010 Today started with a bang with the excellent keynote by Kamal Hathi who launched SQL Server 2008 R2 in India and demonstrated the power of PowerPivot to all of us. 101 Million Rows in Excel brought lots of applause from the audience. Kamal Hathi Presenting Keynote at TechEd India 2010 The day was a bit easier one for me. I had no sessions today and no events planned. I had a few meetings planned for the second day of the event. I sat in the speaker’s lounge for half a day and met many people there. I attended nearly 9 different meetings today. The subjects of the meetings were very different. Here is a list of the topics of the Community-related meetings: SQL PASS and its involvement in India and subcontinents How to start community blogging Forums and developing aptitude towards technology Ahmedabad/Gandhinagar User Groups and their developments SharePoint and SQL Business Meeting – a client meeting Business Meeting – a potential performance tuning project Business Meeting – Solid Quality Mentors (SolidQ) And family friends Pinal Dave at TechEd India The day passed by so quickly during this meeting. In the evening, I headed to Partners Expo with friends and checked out few of the booths. I really wanted to talk about some of the products, but due to the freebies there was so much crowd that I finally decided to just take the contact details of the partner. I will now start sending them with my queries and, hopefully, I will have my questions answered. Nupur and Shaivi had also one meeting to attend; it was with our family friend Vijay Raj. Vijay is also a person who loves Technology and loves it more than anybody. I see him growing and learning every day, but still remaining as a ‘human’. I believe that if someone acquires as much knowledge as him, that person will become either a computer or cyborg. Here, Vijay is still a kind gentleman and is able to stay as our close family friend. Shaivi was really happy to play with Uncle Vijay. Pinal Dave and Vijay Raj Renuka Prasad, a Microsoft MVP, impressed me with his passion and knowledge of SQL. Every time he gives me credit for his success, I believe that he is very humble. He has way more certifications than me and has worked many more years with SQL compared to me. He is an excellent photographer as well. Most of the photos in this blog post have been taken by him. I told him if ever he wants to do a part time job, he can do the photography very well. Pinal Dave and Renuka Prasad I also met L Srividya from Microsoft, whom I was looking forward to meet. She is a bundle of knowledge that everyone would surely learn a lot from her. I was able to get a few minutes from her and well, I felt confident. She enlightened me with SQL Server BI concepts, domain management and SQL Server security and few other interesting details. I also had a wonderful time talking about SharePoint with fellow Solid Quality Mentor Joy Rathnayake. He is very passionate about SharePoint but when you talk .NET and SQL with him, he is still overwhelmingly knowledgeable. In fact, while talking to him, I figured out that the recent training he delivered was on SQL Server 2008 R2. I told him a joke that it hurts my ego as he is more popular now in SQL training and consulting than me. I am sure all of you agree that working with good people is a gift from God. I am fortunate enough to work with the best of the best Industry experts. It was a great pleasure to hang out with my Community friends – Ahswin Kini, HimaBindu Vejella, Vasudev G, Suprotim Agrawal, Dhananjay, Vikram Pendse, Mahesh Dhola, Mahesh Mitkari,  Manu Zacharia, Shobhan, Hardik Shah, Ashish Mohta, Manan, Subodh Sohani and Sanjay Shetty (of course!) .  (Please let me know if I have met you at the event and forgot your name to list here). Time: 8:00 PM – onwards Community Leaders Dinner After lots of meetings, I headed towards the Community Leaders dinner meeting and met almost all the folks I met in morning. The discussion was almost the same but the real good thing was that we were enjoying it. The food was really good. Nupur was invited in the event, but Shaivi could not come. When Nupur tried to enter the event, she was stopped as Shaivi did not have the pass to enter the dinner. Nupur expressed that Shaivi is only 8 months old and does not eat outside food as well and could not stay by herself at this age, but the door keeper did not agree and asked that without the entry details Shaivi could not go in, but Nupur could. Nupur called me on phone and asked me to help her out. By the time, I was outside; the organizer of the event reached to the door and happily approved Shaivi to join the party. Once in the party, Shaivi had lots of fun meeting so many people. Shaivi Dave and Abhishek Kant Dean Guida (Infragistics President and CEO) and Pinal Dave (SQLAuthority.com) Day 3 – April 14, 2010 Though, it was last day, I was very much excited today as I was about to present my very favorite session. Query Optimization and Performance Tuning is my domain expertise and I make my leaving by consulting and training the same. Today’s session was on the same subject and as an additional twist, another subject about Spatial Database was presented. I was always intrigued with Spatial Database and I have enjoyed learning about it; however, I have never thought about Spatial Indexing before it was decided that I will do this session. I really thank Solid Quality Mentor Dr. Greg Low for his assistance in helping me prepare the slide deck and also review the content. Furthermore, today was really what I call my ‘learning day’ . So far I had not attended any session in TechEd and I felt a bit down for that. Everybody spends their valuable time & money to learn something new and exciting in TechEd and I had not attended a single session at the moment thinking that it was already last day of the event. I did have a plan for the day and I attended two technical sessions before my session of spatial database. I attended 2 sessions of Vinod Kumar. Vinod is a natural storyteller and there was no doubt that his sessions would be jam-packed. People attended his sessions simply because Vinod is syhe speaker. He did not have a single time disappointed audience; he is truly a good speaker. He knows his stuff very well. I personally do not think that in India he can be compared to anyone for SQL. Time: 12:30pm-1:30pm SQL Server Query Optimization, Execution and Debugging Query Performance I really had a fun time attending this session. Vinod made this session very interactive. The entire audience really got into the presentation and started participating in the event. Vinod was presenting a small problem with Query Tuning, which any developer would have encountered and solved with their help in such a fashion that a developer feels he or she have already resolved it. In one question, I was the only one who was ready to answer and Vinod told me in a light tone that I am now allowed to answer it! The audience really found it very amusing. There was a huge crowd around Vinod after the session. Vinod – A master storyteller! Time: 3:45pm-4:45pm Data Recovery / consistency with CheckDB This session was much heavier than the earlier one, and I must say this is my most favorite session I EVER attended in India. In this TechEd I have only attended two sessions, but in my career, I have attended numerous technical sessions not only in India, but all over the world. This session had taken my breath away. One by one, Vinod took the different databases, and started to corrupt them in different ways. Each database has some unique ways to get corrupted. Once that was done, Vinod started to show the DBCC CEHCKDB and demonstrated how it can solve your problem. He finally fixed all the databases with this single tool. I do have a good knowledge of this subject, but let me honestly admit that I have learned a lot from this session. I enjoyed and cheered during this session along with other attendees. I had total satisfaction that, just like everyone, I took advantage of the event and learned something. I am now TECHnically EDucated. Pinal Dave and Vinod Kumar After two very interactive and informative SQL Sessions from Vinod Kumar, the next turn me presenting on Spatial Database and Indexing. I got once again nervous but Vinod told me to stay natural and do my presentation. Well, once I got a huge stage with a total of four projectors and a large crowd, I felt better. Time: 5:00pm-6:00pm Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing Pinal Presenting session at TechEd India 2010 Pinal Presenting session at TechEd India 2010 I kicked off this session with Michael J Swart‘s beautiful spatial image. This session was the last one for the day but, to my surprise, I had more than 200+ attendees. Slowly, the rain was starting outside and I was worried that the hall would not be full; despite this, there was not a single seat available in the first five minutes of the session. Thanks to all of you for attending my presentation. I had demonstrated the map of world (and India) and quickly explained what  Geographic and Geometry data types in Spatial Database are. This session had interesting story of Indexing and Comparison, as well as how different traditional indexes are from spatial indexing. Pinal Presenting session at TechEd India 2010 Due to the heavy rain during this event, the power went off for about 22 minutes (just an accident – nobodies fault). During these minutes, there were no audio, no video and no light. I continued to address the mass of 200+ people without any audio device and PowerPoint. I must thank the audience because not a single person left from the session. They all stayed in their place, some moved closure to listen to me properly. I noticed that the curiosity and eagerness to learn new things was at the peak even though it was the very last session of the TechEd. Everybody wanted get the maximum knowledge out of this whole event. I was touched by the support from audience. They listened and participated in my session even without any kinds of technology (no ppt, no mike, no AC, nothing). During these 22 minutes, I had completed my theory verbally. Pinal Presenting session at TechEd India 2010 After a while, we got the projector back online and we continued with some exciting demos. Many thanks to Microsoft people who worked energetically in background to get the backup power for project up. I had a very interesting demo wherein I overlaid Bangalore and Hyderabad on the India Map and find their aerial distance between them. After finding the aerial distance, we browsed online and found that SQL Server estimates the exact aerial distance between these two cities, as compared to the factual distance. There was a huge applause from the crowd on the subject that SQL Server takes into the count of the curvature of the earth and finds the precise distances based on details. During the process of finding the distance, I demonstrated a few examples of the indexes where I expressed how one can use those indexes to find these distances and how they can improve the performance of similar query. I also demonstrated few examples wherein we were able to see in which data type the Index is most useful. We finished the demos with a few more internal stuff. Pinal Presenting session at TechEd India 2010 Despite all issues, I was mostly satisfied with my presentation. I think it was the best session I have ever presented at any conference. There was no help from Technology for a while, but I still got lots of appreciation at the end. When we ended the session, the applause from the audience was so loud that for a moment, the rain was not audible. I was truly moved by the dedication of the Technology enthusiasts. Pinal Dave After Presenting session at TechEd India 2010 The abstract of the session is as follows: The Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use spatial functionality in next version of SQL Server to build and optimize spatial queries. This session outlines the new geography data type to store geodetic spatial data and perform operations on it, use the new geometry data type to store planar spatial data and perform operations on it, take advantage of new spatial indexes for high performance queries, use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications and much more. Time: 8:00 PM – onwards Dinner by Sponsors After the lively session during the day, there was another dinner party courtesy of one of the sponsors of TechEd. All the MVPs and several Community leaders were present at the dinner. I would like to express my gratitude to Abhishek Kant for organizing this wonderful event for us. It was a blast and really relaxing in all angles. We all stayed there for a long time and talked about our sweet and unforgettable memories of the event. Pinal Dave and Bijoy Singhal It was really one wonderful event. After writing this much, I say that I have no words to express about how much I enjoyed TechEd. However, it is true that I shared with you only 1% of the total activities I have done at the event. There were so many people I have met, yet were not mentioned here although I wanted to write their names here, too . Anyway, I have learned so many things and up until now, I am not able to get over all the fun I had in this event. Pinal Dave at TechEd India 2010 The Next Days – April 15, 2010 – till today I am still not able to get my mind out of the whole experience I had at TechEd India 2010. It was like a whole Microsoft Family working together to celebrate a happy occasion. TechEd India – Truly An Unforgettable Experience! Reference : Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, SQLServer, T SQL, Technology Tagged: TechEd, TechEdIn

    Read the article

  • A way of doing real-world test-driven development (and some thoughts about it)

    - by Thomas Weller
    Lately, I exchanged some arguments with Derick Bailey about some details of the red-green-refactor cycle of the Test-driven development process. In short, the issue revolved around the fact that it’s not enough to have a test red or green, but it’s also important to have it red or green for the right reasons. While for me, it’s sufficient to initially have a NotImplementedException in place, Derick argues that this is not totally correct (see these two posts: Red/Green/Refactor, For The Right Reasons and Red For The Right Reason: Fail By Assertion, Not By Anything Else). And he’s right. But on the other hand, I had no idea how his insights could have any practical consequence for my own individual interpretation of the red-green-refactor cycle (which is not really red-green-refactor, at least not in its pure sense, see the rest of this article). This made me think deeply for some days now. In the end I found out that the ‘right reason’ changes in my understanding depending on what development phase I’m in. To make this clear (at least I hope it becomes clear…) I started to describe my way of working in some detail, and then something strange happened: The scope of the article slightly shifted from focusing ‘only’ on the ‘right reason’ issue to something more general, which you might describe as something like  'Doing real-world TDD in .NET , with massive use of third-party add-ins’. This is because I feel that there is a more general statement about Test-driven development to make:  It’s high time to speak about the ‘How’ of TDD, not always only the ‘Why’. Much has been said about this, and me myself also contributed to that (see here: TDD is not about testing, it's about how we develop software). But always justifying what you do is very unsatisfying in the long run, it is inherently defensive, and it costs time and effort that could be used for better and more important things. And frankly: I’m somewhat sick and tired of repeating time and again that the test-driven way of software development is highly preferable for many reasons - I don’t want to spent my time exclusively on stating the obvious… So, again, let’s say it clearly: TDD is programming, and programming is TDD. Other ways of programming (code-first, sometimes called cowboy-coding) are exceptional and need justification. – I know that there are many people out there who will disagree with this radical statement, and I also know that it’s not a description of the real world but more of a mission statement or something. But nevertheless I’m absolutely sure that in some years this statement will be nothing but a platitude. Side note: Some parts of this post read as if I were paid by Jetbrains (the manufacturer of the ReSharper add-in – R#), but I swear I’m not. Rather I think that Visual Studio is just not production-complete without it, and I wouldn’t even consider to do professional work without having this add-in installed... The three parts of a software component Before I go into some details, I first should describe my understanding of what belongs to a software component (assembly, type, or method) during the production process (i.e. the coding phase). Roughly, I come up with the three parts shown below:   First, we need to have some initial sort of requirement. This can be a multi-page formal document, a vague idea in some programmer’s brain of what might be needed, or anything in between. In either way, there has to be some sort of requirement, be it explicit or not. – At the C# micro-level, the best way that I found to formulate that is to define interfaces for just about everything, even for internal classes, and to provide them with exhaustive xml comments. The next step then is to re-formulate these requirements in an executable form. This is specific to the respective programming language. - For C#/.NET, the Gallio framework (which includes MbUnit) in conjunction with the ReSharper add-in for Visual Studio is my toolset of choice. The third part then finally is the production code itself. It’s development is entirely driven by the requirements and their executable formulation. This is the delivery, the two other parts are ‘only’ there to make its production possible, to give it a decent quality and reliability, and to significantly reduce related costs down the maintenance timeline. So while the first two parts are not really relevant for the customer, they are very important for the developer. The customer (or in Scrum terms: the Product Owner) is not interested at all in how  the product is developed, he is only interested in the fact that it is developed as cost-effective as possible, and that it meets his functional and non-functional requirements. The rest is solely a matter of the developer’s craftsmanship, and this is what I want to talk about during the remainder of this article… An example To demonstrate my way of doing real-world TDD, I decided to show the development of a (very) simple Calculator component. The example is deliberately trivial and silly, as examples always are. I am totally aware of the fact that real life is never that simple, but I only want to show some development principles here… The requirement As already said above, I start with writing down some words on the initial requirement, and I normally use interfaces for that, even for internal classes - the typical question “intf or not” doesn’t even come to mind. I need them for my usual workflow and using them automatically produces high componentized and testable code anyway. To think about their usage in every single situation would slow down the production process unnecessarily. So this is what I begin with: namespace Calculator {     /// <summary>     /// Defines a very simple calculator component for demo purposes.     /// </summary>     public interface ICalculator     {         /// <summary>         /// Gets the result of the last successful operation.         /// </summary>         /// <value>The last result.</value>         /// <remarks>         /// Will be <see langword="null" /> before the first successful operation.         /// </remarks>         double? LastResult { get; }       } // interface ICalculator   } // namespace Calculator So, I’m not beginning with a test, but with a sort of code declaration - and still I insist on being 100% test-driven. There are three important things here: Starting this way gives me a method signature, which allows to use IntelliSense and AutoCompletion and thus eliminates the danger of typos - one of the most regular, annoying, time-consuming, and therefore expensive sources of error in the development process. In my understanding, the interface definition as a whole is more of a readable requirement document and technical documentation than anything else. So this is at least as much about documentation than about coding. The documentation must completely describe the behavior of the documented element. I normally use an IoC container or some sort of self-written provider-like model in my architecture. In either case, I need my components defined via service interfaces anyway. - I will use the LinFu IoC framework here, for no other reason as that is is very simple to use. The ‘Red’ (pt. 1)   First I create a folder for the project’s third-party libraries and put the LinFu.Core dll there. Then I set up a test project (via a Gallio project template), and add references to the Calculator project and the LinFu dll. Finally I’m ready to write the first test, which will look like the following: namespace Calculator.Test {     [TestFixture]     public class CalculatorTest     {         private readonly ServiceContainer container = new ServiceContainer();           [Test]         public void CalculatorLastResultIsInitiallyNull()         {             ICalculator calculator = container.GetService<ICalculator>();               Assert.IsNull(calculator.LastResult);         }       } // class CalculatorTest   } // namespace Calculator.Test       This is basically the executable formulation of what the interface definition states (part of). Side note: There’s one principle of TDD that is just plain wrong in my eyes: I’m talking about the Red is 'does not compile' thing. How could a compiler error ever be interpreted as a valid test outcome? I never understood that, it just makes no sense to me. (Or, in Derick’s terms: this reason is as wrong as a reason ever could be…) A compiler error tells me: Your code is incorrect, but nothing more.  Instead, the ‘Red’ part of the red-green-refactor cycle has a clearly defined meaning to me: It means that the test works as intended and fails only if its assumptions are not met for some reason. Back to our Calculator. When I execute the above test with R#, the Gallio plugin will give me this output: So this tells me that the test is red for the wrong reason: There’s no implementation that the IoC-container could load, of course. So let’s fix that. With R#, this is very easy: First, create an ICalculator - derived type:        Next, implement the interface members: And finally, move the new class to its own file: So far my ‘work’ was six mouse clicks long, the only thing that’s left to do manually here, is to add the Ioc-specific wiring-declaration and also to make the respective class non-public, which I regularly do to force my components to communicate exclusively via interfaces: This is what my Calculator class looks like as of now: using System; using LinFu.IoC.Configuration;   namespace Calculator {     [Implements(typeof(ICalculator))]     internal class Calculator : ICalculator     {         public double? LastResult         {             get             {                 throw new NotImplementedException();             }         }     } } Back to the test fixture, we have to put our IoC container to work: [TestFixture] public class CalculatorTest {     #region Fields       private readonly ServiceContainer container = new ServiceContainer();       #endregion // Fields       #region Setup/TearDown       [FixtureSetUp]     public void FixtureSetUp()     {        container.LoadFrom(AppDomain.CurrentDomain.BaseDirectory, "Calculator.dll");     }       ... Because I have a R# live template defined for the setup/teardown method skeleton as well, the only manual coding here again is the IoC-specific stuff: two lines, not more… The ‘Red’ (pt. 2) Now, the execution of the above test gives the following result: This time, the test outcome tells me that the method under test is called. And this is the point, where Derick and I seem to have somewhat different views on the subject: Of course, the test still is worthless regarding the red/green outcome (or: it’s still red for the wrong reasons, in that it gives a false negative). But as far as I am concerned, I’m not really interested in the test outcome at this point of the red-green-refactor cycle. Rather, I only want to assert that my test actually calls the right method. If that’s the case, I will happily go on to the ‘Green’ part… The ‘Green’ Making the test green is quite trivial. Just make LastResult an automatic property:     [Implements(typeof(ICalculator))]     internal class Calculator : ICalculator     {         public double? LastResult { get; private set; }     }         One more round… Now on to something slightly more demanding (cough…). Let’s state that our Calculator exposes an Add() method:         ...   /// <summary>         /// Adds the specified operands.         /// </summary>         /// <param name="operand1">The operand1.</param>         /// <param name="operand2">The operand2.</param>         /// <returns>The result of the additon.</returns>         /// <exception cref="ArgumentException">         /// Argument <paramref name="operand1"/> is &lt; 0.<br/>         /// -- or --<br/>         /// Argument <paramref name="operand2"/> is &lt; 0.         /// </exception>         double Add(double operand1, double operand2);       } // interface ICalculator A remark: I sometimes hear the complaint that xml comment stuff like the above is hard to read. That’s certainly true, but irrelevant to me, because I read xml code comments with the CR_Documentor tool window. And using that, it looks like this:   Apart from that, I’m heavily using xml code comments (see e.g. here for a detailed guide) because there is the possibility of automating help generation with nightly CI builds (using MS Sandcastle and the Sandcastle Help File Builder), and then publishing the results to some intranet location.  This way, a team always has first class, up-to-date technical documentation at hand about the current codebase. (And, also very important for speeding up things and avoiding typos: You have IntelliSense/AutoCompletion and R# support, and the comments are subject to compiler checking…).     Back to our Calculator again: Two more R# – clicks implement the Add() skeleton:         ...           public double Add(double operand1, double operand2)         {             throw new NotImplementedException();         }       } // class Calculator As we have stated in the interface definition (which actually serves as our requirement document!), the operands are not allowed to be negative. So let’s start implementing that. Here’s the test: [Test] [Row(-0.5, 2)] public void AddThrowsOnNegativeOperands(double operand1, double operand2) {     ICalculator calculator = container.GetService<ICalculator>();       Assert.Throws<ArgumentException>(() => calculator.Add(operand1, operand2)); } As you can see, I’m using a data-driven unit test method here, mainly for these two reasons: Because I know that I will have to do the same test for the second operand in a few seconds, I save myself from implementing another test method for this purpose. Rather, I only will have to add another Row attribute to the existing one. From the test report below, you can see that the argument values are explicitly printed out. This can be a valuable documentation feature even when everything is green: One can quickly review what values were tested exactly - the complete Gallio HTML-report (as it will be produced by the Continuous Integration runs) shows these values in a quite clear format (see below for an example). Back to our Calculator development again, this is what the test result tells us at the moment: So we’re red again, because there is not yet an implementation… Next we go on and implement the necessary parameter verification to become green again, and then we do the same thing for the second operand. To make a long story short, here’s the test and the method implementation at the end of the second cycle: // in CalculatorTest:   [Test] [Row(-0.5, 2)] [Row(295, -123)] public void AddThrowsOnNegativeOperands(double operand1, double operand2) {     ICalculator calculator = container.GetService<ICalculator>();       Assert.Throws<ArgumentException>(() => calculator.Add(operand1, operand2)); }   // in Calculator: public double Add(double operand1, double operand2) {     if (operand1 < 0.0)     {         throw new ArgumentException("Value must not be negative.", "operand1");     }     if (operand2 < 0.0)     {         throw new ArgumentException("Value must not be negative.", "operand2");     }     throw new NotImplementedException(); } So far, we have sheltered our method from unwanted input, and now we can safely operate on the parameters without further caring about their validity (this is my interpretation of the Fail Fast principle, which is regarded here in more detail). Now we can think about the method’s successful outcomes. First let’s write another test for that: [Test] [Row(1, 1, 2)] public void TestAdd(double operand1, double operand2, double expectedResult) {     ICalculator calculator = container.GetService<ICalculator>();       double result = calculator.Add(operand1, operand2);       Assert.AreEqual(expectedResult, result); } Again, I’m regularly using row based test methods for these kinds of unit tests. The above shown pattern proved to be extremely helpful for my development work, I call it the Defined-Input/Expected-Output test idiom: You define your input arguments together with the expected method result. There are two major benefits from that way of testing: In the course of refining a method, it’s very likely to come up with additional test cases. In our case, we might add tests for some edge cases like ‘one of the operands is zero’ or ‘the sum of the two operands causes an overflow’, or maybe there’s an external test protocol that has to be fulfilled (e.g. an ISO norm for medical software), and this results in the need of testing against additional values. In all these scenarios we only have to add another Row attribute to the test. Remember that the argument values are written to the test report, so as a side-effect this produces valuable documentation. (This can become especially important if the fulfillment of some sort of external requirements has to be proven). So your test method might look something like that in the end: [Test, Description("Arguments: operand1, operand2, expectedResult")] [Row(1, 1, 2)] [Row(0, 999999999, 999999999)] [Row(0, 0, 0)] [Row(0, double.MaxValue, double.MaxValue)] [Row(4, double.MaxValue - 2.5, double.MaxValue)] public void TestAdd(double operand1, double operand2, double expectedResult) {     ICalculator calculator = container.GetService<ICalculator>();       double result = calculator.Add(operand1, operand2);       Assert.AreEqual(expectedResult, result); } And this will produce the following HTML report (with Gallio):   Not bad for the amount of work we invested in it, huh? - There might be scenarios where reports like that can be useful for demonstration purposes during a Scrum sprint review… The last requirement to fulfill is that the LastResult property is expected to store the result of the last operation. I don’t show this here, it’s trivial enough and brings nothing new… And finally: Refactor (for the right reasons) To demonstrate my way of going through the refactoring portion of the red-green-refactor cycle, I added another method to our Calculator component, namely Subtract(). Here’s the code (tests and production): // CalculatorTest.cs:   [Test, Description("Arguments: operand1, operand2, expectedResult")] [Row(1, 1, 0)] [Row(0, 999999999, -999999999)] [Row(0, 0, 0)] [Row(0, double.MaxValue, -double.MaxValue)] [Row(4, double.MaxValue - 2.5, -double.MaxValue)] public void TestSubtract(double operand1, double operand2, double expectedResult) {     ICalculator calculator = container.GetService<ICalculator>();       double result = calculator.Subtract(operand1, operand2);       Assert.AreEqual(expectedResult, result); }   [Test, Description("Arguments: operand1, operand2, expectedResult")] [Row(1, 1, 0)] [Row(0, 999999999, -999999999)] [Row(0, 0, 0)] [Row(0, double.MaxValue, -double.MaxValue)] [Row(4, double.MaxValue - 2.5, -double.MaxValue)] public void TestSubtractGivesExpectedLastResult(double operand1, double operand2, double expectedResult) {     ICalculator calculator = container.GetService<ICalculator>();       calculator.Subtract(operand1, operand2);       Assert.AreEqual(expectedResult, calculator.LastResult); }   ...   // ICalculator.cs: /// <summary> /// Subtracts the specified operands. /// </summary> /// <param name="operand1">The operand1.</param> /// <param name="operand2">The operand2.</param> /// <returns>The result of the subtraction.</returns> /// <exception cref="ArgumentException"> /// Argument <paramref name="operand1"/> is &lt; 0.<br/> /// -- or --<br/> /// Argument <paramref name="operand2"/> is &lt; 0. /// </exception> double Subtract(double operand1, double operand2);   ...   // Calculator.cs:   public double Subtract(double operand1, double operand2) {     if (operand1 < 0.0)     {         throw new ArgumentException("Value must not be negative.", "operand1");     }       if (operand2 < 0.0)     {         throw new ArgumentException("Value must not be negative.", "operand2");     }       return (this.LastResult = operand1 - operand2).Value; }   Obviously, the argument validation stuff that was produced during the red-green part of our cycle duplicates the code from the previous Add() method. So, to avoid code duplication and minimize the number of code lines of the production code, we do an Extract Method refactoring. One more time, this is only a matter of a few mouse clicks (and giving the new method a name) with R#: Having done that, our production code finally looks like that: using System; using LinFu.IoC.Configuration;   namespace Calculator {     [Implements(typeof(ICalculator))]     internal class Calculator : ICalculator     {         #region ICalculator           public double? LastResult { get; private set; }           public double Add(double operand1, double operand2)         {             ThrowIfOneOperandIsInvalid(operand1, operand2);               return (this.LastResult = operand1 + operand2).Value;         }           public double Subtract(double operand1, double operand2)         {             ThrowIfOneOperandIsInvalid(operand1, operand2);               return (this.LastResult = operand1 - operand2).Value;         }           #endregion // ICalculator           #region Implementation (Helper)           private static void ThrowIfOneOperandIsInvalid(double operand1, double operand2)         {             if (operand1 < 0.0)             {                 throw new ArgumentException("Value must not be negative.", "operand1");             }               if (operand2 < 0.0)             {                 throw new ArgumentException("Value must not be negative.", "operand2");             }         }           #endregion // Implementation (Helper)       } // class Calculator   } // namespace Calculator But is the above worth the effort at all? It’s obviously trivial and not very impressive. All our tests were green (for the right reasons), and refactoring the code did not change anything. It’s not immediately clear how this refactoring work adds value to the project. Derick puts it like this: STOP! Hold on a second… before you go any further and before you even think about refactoring what you just wrote to make your test pass, you need to understand something: if your done with your requirements after making the test green, you are not required to refactor the code. I know… I’m speaking heresy, here. Toss me to the wolves, I’ve gone over to the dark side! Seriously, though… if your test is passing for the right reasons, and you do not need to write any test or any more code for you class at this point, what value does refactoring add? Derick immediately answers his own question: So why should you follow the refactor portion of red/green/refactor? When you have added code that makes the system less readable, less understandable, less expressive of the domain or concern’s intentions, less architecturally sound, less DRY, etc, then you should refactor it. I couldn’t state it more precise. From my personal perspective, I’d add the following: You have to keep in mind that real-world software systems are usually quite large and there are dozens or even hundreds of occasions where micro-refactorings like the above can be applied. It’s the sum of them all that counts. And to have a good overall quality of the system (e.g. in terms of the Code Duplication Percentage metric) you have to be pedantic on the individual, seemingly trivial cases. My job regularly requires the reading and understanding of ‘foreign’ code. So code quality/readability really makes a HUGE difference for me – sometimes it can be even the difference between project success and failure… Conclusions The above described development process emerged over the years, and there were mainly two things that guided its evolution (you might call it eternal principles, personal beliefs, or anything in between): Test-driven development is the normal, natural way of writing software, code-first is exceptional. So ‘doing TDD or not’ is not a question. And good, stable code can only reliably be produced by doing TDD (yes, I know: many will strongly disagree here again, but I’ve never seen high-quality code – and high-quality code is code that stood the test of time and causes low maintenance costs – that was produced code-first…) It’s the production code that pays our bills in the end. (Though I have seen customers these days who demand an acceptance test battery as part of the final delivery. Things seem to go into the right direction…). The test code serves ‘only’ to make the production code work. But it’s the number of delivered features which solely counts at the end of the day - no matter how much test code you wrote or how good it is. With these two things in mind, I tried to optimize my coding process for coding speed – or, in business terms: productivity - without sacrificing the principles of TDD (more than I’d do either way…).  As a result, I consider a ratio of about 3-5/1 for test code vs. production code as normal and desirable. In other words: roughly 60-80% of my code is test code (This might sound heavy, but that is mainly due to the fact that software development standards only begin to evolve. The entire software development profession is very young, historically seen; only at the very beginning, and there are no viable standards yet. If you think about software development as a kind of casting process, where the test code is the mold and the resulting production code is the final product, then the above ratio sounds no longer extraordinary…) Although the above might look like very much unnecessary work at first sight, it’s not. With the aid of the mentioned add-ins, doing all the above is a matter of minutes, sometimes seconds (while writing this post took hours and days…). The most important thing is to have the right tools at hand. Slow developer machines or the lack of a tool or something like that - for ‘saving’ a few 100 bucks -  is just not acceptable and a very bad decision in business terms (though I quite some times have seen and heard that…). Production of high-quality products needs the usage of high-quality tools. This is a platitude that every craftsman knows… The here described round-trip will take me about five to ten minutes in my real-world development practice. I guess it’s about 30% more time compared to developing the ‘traditional’ (code-first) way. But the so manufactured ‘product’ is of much higher quality and massively reduces maintenance costs, which is by far the single biggest cost factor, as I showed in this previous post: It's the maintenance, stupid! (or: Something is rotten in developerland.). In the end, this is a highly cost-effective way of software development… But on the other hand, there clearly is a trade-off here: coding speed vs. code quality/later maintenance costs. The here described development method might be a perfect fit for the overwhelming majority of software projects, but there certainly are some scenarios where it’s not - e.g. if time-to-market is crucial for a software project. So this is a business decision in the end. It’s just that you have to know what you’re doing and what consequences this might have… Some last words First, I’d like to thank Derick Bailey again. His two aforementioned posts (which I strongly recommend for reading) inspired me to think deeply about my own personal way of doing TDD and to clarify my thoughts about it. I wouldn’t have done that without this inspiration. I really enjoy that kind of discussions… I agree with him in all respects. But I don’t know (yet?) how to bring his insights into the described production process without slowing things down. The above described method proved to be very “good enough” in my practical experience. But of course, I’m open to suggestions here… My rationale for now is: If the test is initially red during the red-green-refactor cycle, the ‘right reason’ is: it actually calls the right method, but this method is not yet operational. Later on, when the cycle is finished and the tests become part of the regular, automated Continuous Integration process, ‘red’ certainly must occur for the ‘right reason’: in this phase, ‘red’ MUST mean nothing but an unfulfilled assertion - Fail By Assertion, Not By Anything Else!

    Read the article

  • Node.js Adventure - Storage Services and Service Runtime

    - by Shaun
    When I described on how to host a Node.js application on Windows Azure, one of questions might be raised about how to consume the vary Windows Azure services, such as the storage, service bus, access control, etc.. Interact with windows azure services is available in Node.js through the Windows Azure Node.js SDK, which is a module available in NPM. In this post I would like to describe on how to use Windows Azure Storage (a.k.a. WAS) as well as the service runtime.   Consume Windows Azure Storage Let’s firstly have a look on how to consume WAS through Node.js. As we know in the previous post we can host Node.js application on Windows Azure Web Site (a.k.a. WAWS) as well as Windows Azure Cloud Service (a.k.a. WACS). In theory, WAWS is also built on top of WACS worker roles with some more features. Hence in this post I will only demonstrate for hosting in WACS worker role. The Node.js code can be used when consuming WAS when hosted on WAWS. But since there’s no roles in WAWS, the code for consuming service runtime mentioned in the next section cannot be used for WAWS node application. We can use the solution that I created in my last post. Alternatively we can create a new windows azure project in Visual Studio with a worker role, add the “node.exe” and “index.js” and install “express” and “node-sqlserver” modules, make all files as “Copy always”. In order to use windows azure services we need to have Windows Azure Node.js SDK, as knows as a module named “azure” which can be installed through NPM. Once we downloaded and installed, we need to include them in our worker role project and make them as “Copy always”. You can use my “Copy all always” tool mentioned in my last post to update the currently worker role project file. You can also find the source code of this tool here. The source code of Windows Azure SDK for Node.js can be found in its GitHub page. It contains two parts. One is a CLI tool which provides a cross platform command line package for Mac and Linux to manage WAWS and Windows Azure Virtual Machines (a.k.a. WAVM). The other is a library for managing and consuming vary windows azure services includes tables, blobs, queues, service bus and the service runtime. I will not cover all of them but will only demonstrate on how to use tables and service runtime information in this post. You can find the full document of this SDK here. Back to Visual Studio and open the “index.js”, let’s continue our application from the last post, which was working against Windows Azure SQL Database (a.k.a. WASD). The code should looks like this. 1: var express = require("express"); 2: var sql = require("node-sqlserver"); 3:  4: var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;"; 5: var port = 80; 6:  7: var app = express(); 8:  9: app.configure(function () { 10: app.use(express.bodyParser()); 11: }); 12:  13: app.get("/", function (req, res) { 14: sql.open(connectionString, function (err, conn) { 15: if (err) { 16: console.log(err); 17: res.send(500, "Cannot open connection."); 18: } 19: else { 20: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 21: if (err) { 22: console.log(err); 23: res.send(500, "Cannot retrieve records."); 24: } 25: else { 26: res.json(results); 27: } 28: }); 29: } 30: }); 31: }); 32:  33: app.get("/text/:key/:culture", function (req, res) { 34: sql.open(connectionString, function (err, conn) { 35: if (err) { 36: console.log(err); 37: res.send(500, "Cannot open connection."); 38: } 39: else { 40: var key = req.params.key; 41: var culture = req.params.culture; 42: var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'"; 43: conn.queryRaw(command, function (err, results) { 44: if (err) { 45: console.log(err); 46: res.send(500, "Cannot retrieve records."); 47: } 48: else { 49: res.json(results); 50: } 51: }); 52: } 53: }); 54: }); 55:  56: app.get("/sproc/:key/:culture", function (req, res) { 57: sql.open(connectionString, function (err, conn) { 58: if (err) { 59: console.log(err); 60: res.send(500, "Cannot open connection."); 61: } 62: else { 63: var key = req.params.key; 64: var culture = req.params.culture; 65: var command = "EXEC GetItem '" + key + "', '" + culture + "'"; 66: conn.queryRaw(command, function (err, results) { 67: if (err) { 68: console.log(err); 69: res.send(500, "Cannot retrieve records."); 70: } 71: else { 72: res.json(results); 73: } 74: }); 75: } 76: }); 77: }); 78:  79: app.post("/new", function (req, res) { 80: var key = req.body.key; 81: var culture = req.body.culture; 82: var val = req.body.val; 83:  84: sql.open(connectionString, function (err, conn) { 85: if (err) { 86: console.log(err); 87: res.send(500, "Cannot open connection."); 88: } 89: else { 90: var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')"; 91: conn.queryRaw(command, function (err, results) { 92: if (err) { 93: console.log(err); 94: res.send(500, "Cannot retrieve records."); 95: } 96: else { 97: res.send(200, "Inserted Successful"); 98: } 99: }); 100: } 101: }); 102: }); 103:  104: app.listen(port); Now let’s create a new function, copy the records from WASD to table service. 1. Delete the table named “resource”. 2. Create a new table named “resource”. These 2 steps ensures that we have an empty table. 3. Load all records from the “resource” table in WASD. 4. For each records loaded from WASD, insert them into the table one by one. 5. Prompt to user when finished. In order to use table service we need the storage account and key, which can be found from the developer portal. Just select the storage account and click the Manage Keys button. Then create two local variants in our Node.js application for the storage account name and key. Since we need to use WAS we need to import the azure module. Also I created another variant stored the table name. In order to work with table service I need to create the storage client for table service. This is very similar as the Windows Azure SDK for .NET. As the code below I created a new variant named “client” and use “createTableService”, specified my storage account name and key. 1: var azure = require("azure"); 2: var storageAccountName = "synctile"; 3: var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw=="; 4: var tableName = "resource"; 5: var client = azure.createTableService(storageAccountName, storageAccountKey); Now create a new function for URL “/was/init” so that we can trigger it through browser. Then in this function we will firstly load all records from WASD. 1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: } 18: } 19: }); 20: } 21: }); 22: }); When we succeed loaded all records we can start to transform them into table service. First I need to recreate the table in table service. This can be done by deleting and creating the table through table client I had just created previously. 1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: // recreate the table named 'resource' 18: client.deleteTable(tableName, function (error) { 19: client.createTableIfNotExists(tableName, function (error) { 20: if (error) { 21: error["target"] = "createTableIfNotExists"; 22: res.send(500, error); 23: } 24: else { 25: // transform the records 26: } 27: }); 28: }); 29: } 30: } 31: }); 32: } 33: }); 34: }); As you can see, the azure SDK provide its methods in callback pattern. In fact, almost all modules in Node.js use the callback pattern. For example, when I deleted a table I invoked “deleteTable” method, provided the name of the table and a callback function which will be performed when the table had been deleted or failed. Underlying, the azure module will perform the table deletion operation in POSIX async threads pool asynchronously. And once it’s done the callback function will be performed. This is the reason we need to nest the table creation code inside the deletion function. If we perform the table creation code after the deletion code then they will be invoked in parallel. Next, for each records in WASD I created an entity and then insert into the table service. Finally I send the response to the browser. Can you find a bug in the code below? I will describe it later in this post. 1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: // recreate the table named 'resource' 18: client.deleteTable(tableName, function (error) { 19: client.createTableIfNotExists(tableName, function (error) { 20: if (error) { 21: error["target"] = "createTableIfNotExists"; 22: res.send(500, error); 23: } 24: else { 25: // transform the records 26: for (var i = 0; i < results.rows.length; i++) { 27: var entity = { 28: "PartitionKey": results.rows[i][1], 29: "RowKey": results.rows[i][0], 30: "Value": results.rows[i][2] 31: }; 32: client.insertEntity(tableName, entity, function (error) { 33: if (error) { 34: error["target"] = "insertEntity"; 35: res.send(500, error); 36: } 37: else { 38: console.log("entity inserted"); 39: } 40: }); 41: } 42: // send the 43: console.log("all done"); 44: res.send(200, "All done!"); 45: } 46: }); 47: }); 48: } 49: } 50: }); 51: } 52: }); 53: }); Now we can publish it to the cloud and have a try. But normally we’d better test it at the local emulator first. In Node.js SDK there are three build-in properties which provides the account name, key and host address for local storage emulator. We can use them to initialize our table service client. We also need to change the SQL connection string to let it use my local database. The code will be changed as below. 1: // windows azure sql database 2: //var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd=eszqu94XZY;Encrypt=yes;Connection Timeout=30;"; 3: // sql server 4: var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};"; 5:  6: var azure = require("azure"); 7: var storageAccountName = "synctile"; 8: var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw=="; 9: var tableName = "resource"; 10: // windows azure storage 11: //var client = azure.createTableService(storageAccountName, storageAccountKey); 12: // local storage emulator 13: var client = azure.createTableService(azure.ServiceClient.DEVSTORE_STORAGE_ACCOUNT, azure.ServiceClient.DEVSTORE_STORAGE_ACCESS_KEY, azure.ServiceClient.DEVSTORE_TABLE_HOST); Now let’s run the application and navigate to “localhost:12345/was/init” as I hosted it on port 12345. We can find it transformed the data from my local database to local table service. Everything looks fine. But there is a bug in my code. If we have a look on the Node.js command window we will find that it sent response before all records had been inserted, which is not what I expected. The reason is that, as I mentioned before, Node.js perform all IO operations in non-blocking model. When we inserted the records we executed the table service insert method in parallel, and the operation of sending response was also executed in parallel, even though I wrote it at the end of my logic. The correct logic should be, when all entities had been copied to table service with no error, then I will send response to the browser, otherwise I should send error message to the browser. To do so I need to import another module named “async”, which helps us to coordinate our asynchronous code. Install the module and import it at the beginning of the code. Then we can use its “forEach” method for the asynchronous code of inserting table entities. The first argument of “forEach” is the array that will be performed. The second argument is the operation for each items in the array. And the third argument will be invoked then all items had been performed or any errors occurred. Here we can send our response to browser. 1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: // recreate the table named 'resource' 18: client.deleteTable(tableName, function (error) { 19: client.createTableIfNotExists(tableName, function (error) { 20: if (error) { 21: error["target"] = "createTableIfNotExists"; 22: res.send(500, error); 23: } 24: else { 25: async.forEach(results.rows, 26: // transform the records 27: function (row, callback) { 28: var entity = { 29: "PartitionKey": row[1], 30: "RowKey": row[0], 31: "Value": row[2] 32: }; 33: client.insertEntity(tableName, entity, function (error) { 34: if (error) { 35: callback(error); 36: } 37: else { 38: console.log("entity inserted."); 39: callback(null); 40: } 41: }); 42: }, 43: // send reponse 44: function (error) { 45: if (error) { 46: error["target"] = "insertEntity"; 47: res.send(500, error); 48: } 49: else { 50: console.log("all done"); 51: res.send(200, "All done!"); 52: } 53: } 54: ); 55: } 56: }); 57: }); 58: } 59: } 60: }); 61: } 62: }); 63: }); Run it locally and now we can find the response was sent after all entities had been inserted. Query entities against table service is simple as well. Just use the “queryEntity” method from the table service client and providing the partition key and row key. We can also provide a complex query criteria as well, for example the code here. In the code below I queried an entity by the partition key and row key, and return the proper localization value in response. 1: app.get("/was/:key/:culture", function (req, res) { 2: var key = req.params.key; 3: var culture = req.params.culture; 4: client.queryEntity(tableName, culture, key, function (error, entity) { 5: if (error) { 6: res.send(500, error); 7: } 8: else { 9: res.json(entity); 10: } 11: }); 12: }); And then tested it on local emulator. Finally if we want to publish this application to the cloud we should change the database connection string and storage account. For more information about how to consume blob and queue service, as well as the service bus please refer to the MSDN page.   Consume Service Runtime As I mentioned above, before we published our application to the cloud we need to change the connection string and account information in our code. But if you had played with WACS you should have known that the service runtime provides the ability to retrieve configuration settings, endpoints and local resource information at runtime. Which means we can have these values defined in CSCFG and CSDEF files and then the runtime should be able to retrieve the proper values. For example we can add some role settings though the property window of the role, specify the connection string and storage account for cloud and local. And the can also use the endpoint which defined in role environment to our Node.js application. In Node.js SDK we can get an object from “azure.RoleEnvironment”, which provides the functionalities to retrieve the configuration settings and endpoints, etc.. In the code below I defined the connection string variants and then use the SDK to retrieve and initialize the table client. 1: var connectionString = ""; 2: var storageAccountName = ""; 3: var storageAccountKey = ""; 4: var tableName = ""; 5: var client; 6:  7: azure.RoleEnvironment.getConfigurationSettings(function (error, settings) { 8: if (error) { 9: console.log("ERROR: getConfigurationSettings"); 10: console.log(JSON.stringify(error)); 11: } 12: else { 13: console.log(JSON.stringify(settings)); 14: connectionString = settings["SqlConnectionString"]; 15: storageAccountName = settings["StorageAccountName"]; 16: storageAccountKey = settings["StorageAccountKey"]; 17: tableName = settings["TableName"]; 18:  19: console.log("connectionString = %s", connectionString); 20: console.log("storageAccountName = %s", storageAccountName); 21: console.log("storageAccountKey = %s", storageAccountKey); 22: console.log("tableName = %s", tableName); 23:  24: client = azure.createTableService(storageAccountName, storageAccountKey); 25: } 26: }); In this way we don’t need to amend the code for the configurations between local and cloud environment since the service runtime will take care of it. At the end of the code we will listen the application on the port retrieved from SDK as well. 1: azure.RoleEnvironment.getCurrentRoleInstance(function (error, instance) { 2: if (error) { 3: console.log("ERROR: getCurrentRoleInstance"); 4: console.log(JSON.stringify(error)); 5: } 6: else { 7: console.log(JSON.stringify(instance)); 8: if (instance["endpoints"] && instance["endpoints"]["nodejs"]) { 9: var endpoint = instance["endpoints"]["nodejs"]; 10: app.listen(endpoint["port"]); 11: } 12: else { 13: app.listen(8080); 14: } 15: } 16: }); But if we tested the application right now we will find that it cannot retrieve any values from service runtime. This is because by default, the entry point of this role was defined to the worker role class. In windows azure environment the service runtime will open a named pipeline to the entry point instance, so that it can connect to the runtime and retrieve values. But in this case, since the entry point was worker role and the Node.js was opened inside the role, the named pipeline was established between our worker role class and service runtime, so our Node.js application cannot use it. To fix this problem we need to open the CSDEF file under the azure project, add a new element named Runtime. Then add an element named EntryPoint which specify the Node.js command line. So that the Node.js application will have the connection to service runtime, then it’s able to read the configurations. Start the Node.js at local emulator we can find it retrieved the connections, storage account for local. And if we publish our application to azure then it works with WASD and storage service through the configurations for cloud.   Summary In this post I demonstrated how to use Windows Azure SDK for Node.js to interact with storage service, especially the table service. I also demonstrated on how to use WACS service runtime, how to retrieve the configuration settings and the endpoint information. And in order to make the service runtime available to my Node.js application I need to create an entry point element in CSDEF file and set “node.exe” as the entry point. I used five posts to introduce and demonstrate on how to run a Node.js application on Windows platform, how to use Windows Azure Web Site and Windows Azure Cloud Service worker role to host our Node.js application. I also described how to work with other services provided by Windows Azure platform through Windows Azure SDK for Node.js. Node.js is a very new and young network application platform. But since it’s very simple and easy to learn and deploy, as well as, it utilizes single thread non-blocking IO model, Node.js became more and more popular on web application and web service development especially for those IO sensitive projects. And as Node.js is very good at scaling-out, it’s more useful on cloud computing platform. Use Node.js on Windows platform is new, too. The modules for SQL database and Windows Azure SDK are still under development and enhancement. It doesn’t support SQL parameter in “node-sqlserver”. It does support using storage connection string to create the storage client in “azure”. But Microsoft is working on make them easier to use, working on add more features and functionalities.   PS, you can download the source code here. You can download the source code of my “Copy all always” tool here.   Hope this helps, Shaun All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Returning a list from a function in Python

    - by Jasper
    Hi, I'm creating a game for my sister, and I want a function to return a list variable, so I can pass it to another variable. The relevant code is as follows: def startNewGame(): while 1: #Introduction: print print """Hello, You will now be guided through the setup process. There are 7 steps to this. You can cancel setup at any time by typing 'cancelSetup' Thankyou""" #Step 1 (Name): print print """Step 1 of 7: Type in a name for your PotatoHead: """ inputPHName = raw_input('|Enter Name:|') if inputPHName == 'cancelSetup': sys.exit() #Step 2 (Gender): print print """Step 2 of 7: Choose the gender of your PotatoHead: input either 'm' or 'f' """ inputPHGender = raw_input('|Enter Gender:|') if inputPHGender == 'cancelSetup': sys.exit() #Step 3 (Colour): print print """Step 3 of 7: Choose the colour your PotatoHead will be: Only Red, Blue, Green and Yellow are currently supported """ inputPHColour = raw_input('|Enter Colour:|') if inputPHColour == 'cancelSetup': sys.exit() #Step 4 (Favourite Thing): print print """Step 4 of 7: Type your PotatoHead's favourite thing: """ inputPHFavThing = raw_input('|Enter Favourite Thing:|') if inputPHFavThing == 'cancelSetup': sys.exit() # Step 5 (First Toy): print print """Step 5 of 7: Choose a first toy for your PotatoHead: """ inputPHFirstToy = raw_input('|Enter First Toy:|') if inputPHFirstToy == 'cancelSetup': sys.exit() #Step 6 (Check stats): while 1: print print """Step 6 of 7: Check the following details to make sure that they are correct: """ print print """Name:\t\t\t""" + inputPHName + """ Gender:\t\t\t""" + inputPHGender + """ Colour:\t\t\t""" + inputPHColour + """ Favourite Thing:\t""" + inputPHFavThing + """ First Toy:\t\t""" + inputPHFirstToy + """ """ print print "Enter 'y' or 'n'" inputMCheckStats = raw_input('|Is this information correct?|') if inputMCheckStats == 'cancelSetup': sys.exit() elif inputMCheckStats == 'y': break elif inputMCheckStats == 'n': print "Re-enter info: ..." print break else: "The value you entered was incorrect, please re-enter your choice" if inputMCheckStats == 'y': break #Step 7 (Define variables for the creation of the PotatoHead): MFCreatePH = [] print print """Step 7 of 7: Your PotatoHead will now be created... Creating variables... """ MFCreatePH = [inputPHName, inputPHGender, inputPHColour, inputPHFavThing, inputPHFirstToy] time.sleep(1) print "inputPHName" print time.sleep(1) print "inputPHFirstToy" print return MFCreatePH print "Your PotatoHead varibles have been successfully created!" Then it is passed to another function that was imported from another module from potatohead import * ... welcomeMessage() MCreatePH = startGame() myPotatoHead = PotatoHead(MCreatePH) the code for the PotatoHead object is in the potatohead.py module which was imported above, and is as follows: class PotatoHead: #Initialise the PotatoHead object: def __init__(self, data): self.data = data #Takes the data from the start new game function - see main.py #Defines the PotatoHead starting attributes: self.name = data[0] self.gender = data[1] self.colour = data[2] self.favouriteThing = data[3] self.firstToy = data[4] self.age = '0.0' self.education = [self.eduScience, self.eduEnglish, self.eduMaths] = '0.0', '0.0', '0.0' self.fitness = '0.0' self.happiness = '10.0' self.health = '10.0' self.hunger = '0.0' self.tiredness = 'Not in this version' self.toys = [] self.toys.append(self.firstToy) self.time = '0' #Sets data lists for saving, loading and general use: self.phData = (self.name, self.gender, self.colour, self.favouriteThing, self.firstToy) self.phAdvData = (self.name, self.gender, self.colour, self.favouriteThing, self.firstToy, self.age, self.education, self.fitness, self.happiness, self.health, self.hunger, self.tiredness, self.toys) However, when I run the program this error appears: Traceback (most recent call last): File "/Users/Jasper/Documents/Programming/Potato Head Game/Current/main.py", line 158, in <module> myPotatoHead = PotatoHead(MCreatePH) File "/Users/Jasper/Documents/Programming/Potato Head Game/Current/potatohead.py", line 15, in __init__ self.name = data[0] TypeError: 'NoneType' object is unsubscriptable What am i doing wrong? -----EDIT----- The program finishes as so: Step 7 of 7: Your PotatoHead will now be created... Creating variables... inputPHName inputPHFirstToy Then it goes to the Tracback -----EDIT2----- This is the EXACT code I'm running in its entirety: #+--------------------------------------+# #| main.py |# #| A main module for the Potato Head |# #| Game to pull the other modules |# #| together and control through user |# #| input |# #| Author: |# #| Date Created / Modified: |# #| 3/2/10 | 20/2/10 |# #+--------------------------------------+# Tested: No #Import the required modules: import time import random import sys from potatohead import * from toy import * #Start the Game: def welcomeMessage(): print "----- START NEW GAME -----------------------" print "==Print Welcome Message==" print "loading... \t loading... \t loading..." time.sleep(1) print "loading..." time.sleep(1) print "LOADED..." print; print; print; print """Hello, Welcome to the Potato Head Game. In this game you can create a Potato Head, and look after it, like a Virtual Pet. This game is constantly being updated and expanded. Please look out for updates. """ #Choose whether to start a new game or load a previously saved game: def startGame(): while 1: print "--------------------" print """ Choose an option: New_Game or Load_Game """ startGameInput = raw_input('>>> >') if startGameInput == 'New_Game': startNewGame() break elif startGameInput == 'Load_Game': print "This function is not yet supported" print "Try Again" print else: print "You must have mistyped the command: Type either 'New_Game' or 'Load_Game'" print #Set the new game up: def startNewGame(): while 1: #Introduction: print print """Hello, You will now be guided through the setup process. There are 7 steps to this. You can cancel setup at any time by typing 'cancelSetup' Thankyou""" #Step 1 (Name): print print """Step 1 of 7: Type in a name for your PotatoHead: """ inputPHName = raw_input('|Enter Name:|') if inputPHName == 'cancelSetup': sys.exit() #Step 2 (Gender): print print """Step 2 of 7: Choose the gender of your PotatoHead: input either 'm' or 'f' """ inputPHGender = raw_input('|Enter Gender:|') if inputPHGender == 'cancelSetup': sys.exit() #Step 3 (Colour): print print """Step 3 of 7: Choose the colour your PotatoHead will be: Only Red, Blue, Green and Yellow are currently supported """ inputPHColour = raw_input('|Enter Colour:|') if inputPHColour == 'cancelSetup': sys.exit() #Step 4 (Favourite Thing): print print """Step 4 of 7: Type your PotatoHead's favourite thing: """ inputPHFavThing = raw_input('|Enter Favourite Thing:|') if inputPHFavThing == 'cancelSetup': sys.exit() # Step 5 (First Toy): print print """Step 5 of 7: Choose a first toy for your PotatoHead: """ inputPHFirstToy = raw_input('|Enter First Toy:|') if inputPHFirstToy == 'cancelSetup': sys.exit() #Step 6 (Check stats): while 1: print print """Step 6 of 7: Check the following details to make sure that they are correct: """ print print """Name:\t\t\t""" + inputPHName + """ Gender:\t\t\t""" + inputPHGender + """ Colour:\t\t\t""" + inputPHColour + """ Favourite Thing:\t""" + inputPHFavThing + """ First Toy:\t\t""" + inputPHFirstToy + """ """ print print "Enter 'y' or 'n'" inputMCheckStats = raw_input('|Is this information correct?|') if inputMCheckStats == 'cancelSetup': sys.exit() elif inputMCheckStats == 'y': break elif inputMCheckStats == 'n': print "Re-enter info: ..." print break else: "The value you entered was incorrect, please re-enter your choice" if inputMCheckStats == 'y': break #Step 7 (Define variables for the creation of the PotatoHead): MFCreatePH = [] print print """Step 7 of 7: Your PotatoHead will now be created... Creating variables... """ MFCreatePH = [inputPHName, inputPHGender, inputPHColour, inputPHFavThing, inputPHFirstToy] time.sleep(1) print "inputPHName" print time.sleep(1) print "inputPHFirstToy" print return MFCreatePH print "Your PotatoHead varibles have been successfully created!" #Run Program: welcomeMessage() MCreatePH = startGame() myPotatoHead = PotatoHead(MCreatePH) The potatohead.py module is as follows: #+--------------------------------------+# #| potatohead.py |# #| A module for the Potato Head Game |# #| Author: |# #| Date Created / Modified: |# #| 24/1/10 | 24/1/10 |# #+--------------------------------------+# Tested: Yes (24/1/10) #Create the PotatoHead class: class PotatoHead: #Initialise the PotatoHead object: def __init__(self, data): self.data = data #Takes the data from the start new game function - see main.py #Defines the PotatoHead starting attributes: self.name = data[0] self.gender = data[1] self.colour = data[2] self.favouriteThing = data[3] self.firstToy = data[4] self.age = '0.0' self.education = [self.eduScience, self.eduEnglish, self.eduMaths] = '0.0', '0.0', '0.0' self.fitness = '0.0' self.happiness = '10.0' self.health = '10.0' self.hunger = '0.0' self.tiredness = 'Not in this version' self.toys = [] self.toys.append(self.firstToy) self.time = '0' #Sets data lists for saving, loading and general use: self.phData = (self.name, self.gender, self.colour, self.favouriteThing, self.firstToy) self.phAdvData = (self.name, self.gender, self.colour, self.favouriteThing, self.firstToy, self.age, self.education, self.fitness, self.happiness, self.health, self.hunger, self.tiredness, self.toys) #Define the phStats variable, enabling easy display of PotatoHead attributes: def phDefStats(self): self.phStats = """Your Potato Head's Stats are as follows: ---------------------------------------- Name: \t\t""" + self.name + """ Gender: \t\t""" + self.gender + """ Colour: \t\t""" + self.colour + """ Favourite Thing: \t""" + self.favouriteThing + """ First Toy: \t""" + self.firstToy + """ Age: \t\t""" + self.age + """ Education: \t""" + str(float(self.eduScience) + float(self.eduEnglish) + float(self.eduMaths)) + """ -> Science: \t""" + self.eduScience + """ -> English: \t""" + self.eduEnglish + """ -> Maths: \t""" + self.eduMaths + """ Fitness: \t""" + self.fitness + """ Happiness: \t""" + self.happiness + """ Health: \t""" + self.health + """ Hunger: \t""" + self.hunger + """ Tiredness: \t""" + self.tiredness + """ Toys: \t\t""" + str(self.toys) + """ Time: \t\t""" + self.time + """ """ #Change the PotatoHead's favourite thing: def phChangeFavouriteThing(self, newFavouriteThing): self.favouriteThing = newFavouriteThing phChangeFavouriteThingMsg = "Your Potato Head's favourite thing is " + self.favouriteThing + "." #"Feed" the Potato Head i.e. Reduce the 'self.hunger' attribute's value: def phFeed(self): if float(self.hunger) >=3.0: self.hunger = str(float(self.hunger) - 3.0) elif float(self.hunger) < 3.0: self.hunger = '0.0' self.time = str(int(self.time) + 1) #Pass time #"Exercise" the Potato Head if between the ages of 5 and 25: def phExercise(self): if float(self.age) < 5.1 or float(self.age) > 25.1: print "This Potato Head is either too young or too old for this activity!" else: if float(self.fitness) <= 8.0: self.fitness = str(float(self.fitness) + 2.0) elif float(self.fitness) > 8.0: self.fitness = '10.0' self.time = str(int(self.time) + 1) #Pass time #"Teach" the Potato Head: def phTeach(self, subject): if subject == 'Science': if float(self.eduScience) <= 9.0: self.eduScience = str(float(self.eduScience) + 1.0) elif float(self.eduScience) > 9.0 and float(self.eduScience) < 10.0: self.eduScience = '10.0' elif float(self.eduScience) == 10.0: print "Your Potato Head has gained the highest level of qualifications in this subject! It cannot learn any more!" elif subject == 'English': if float(self.eduEnglish) <= 9.0: self.eduEnglish = str(float(self.eduEnglish) + 1.0) elif float(self.eduEnglish) > 9.0 and float(self.eduEnglish) < 10.0: self.eduEnglish = '10.0' elif float(self.eduEnglish) == 10.0: print "Your Potato Head has gained the highest level of qualifications in this subject! It cannot learn any more!" elif subject == 'Maths': if float(self.eduMaths) <= 9.0: self.eduMaths = str(float(self.eduMaths) + 1.0) elif float(self.eduMaths) > 9.0 and float(self.eduMaths) < 10.0: self.eduMaths = '10.0' elif float(self.eduMaths) == 10.0: print "Your Potato Head has gained the highest level of qualifications in this subject! It cannot learn any more!" else: print "That subject is not an option..." print "Please choose either Science, English or Maths" self.time = str(int(self.time) + 1) #Pass time #Increase Health: def phGoToDoctor(self): self.health = '10.0' self.time = str(int(self.time) + 1) #Pass time #Sleep: Age, change stats: #(Time Passes) def phSleep(self): self.time = '0' #Resets time for next 'day' (can do more things next day) #Increase hunger: if float(self.hunger) <= 5.0: self.hunger = str(float(self.hunger) + 5.0) elif float(self.hunger) > 5.0: self.hunger = '10.0' #Lower Fitness: if float(self.fitness) >= 0.5: self.fitness = str(float(self.fitness) - 0.5) elif float(self.fitness) < 0.5: self.fitness = '0.0' #Lower Health: if float(self.health) >= 0.5: self.health = str(float(self.health) - 0.5) elif float(self.health) < 0.5: self.health = '0.0' #Lower Happiness: if float(self.happiness) >= 2.0: self.happiness = str(float(self.happiness) - 2.0) elif float(self.happiness) < 2.0: self.happiness = '0.0' #Increase the Potato Head's age: self.age = str(float(self.age) + 0.1) The game is still under development - There may be parts of modules that aren't complete, but I don't think they're causing the problem

    Read the article

  • Check on every page to ensure user has validated as being over 18

    - by liquilife
    Hi Guys and Girls. I'm working on a website (tobacco related) that requires all visitors to validate they are over 18 before they can view the site. I have a form in place that validates the age but I'm at a dead end. How can I use this to store a cookie that they've passed the test and do a check on all pages to see if this check has already been passed or not? Any suggestions and help would be hugely appreciated! Below is my validation form: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> <title>Validate</title> <script type="text/javascript" src="http://jqueryjs.googlecode.com/files/jquery-1.3.2.js"></script> <script language="javascript"> function checkAge() { var min_age = 18; var year = parseInt(document.forms["age_form"]["year"].value); var month = parseInt(document.forms["age_form"]["month"].value) - 1; var day = parseInt(document.forms["age_form"]["day"].value); var theirDate = new Date((year + min_age), month, day); var today = new Date; if ( (today.getTime() - theirDate.getTime()) < 0) { alert("You are too young to enter this site!"); return false; } else { return true; } } </script> </head> <body> <form action="index.html" name="age_form" method="get" id="age_form"> <select name="day" id="day"> <option value="0" selected>DAY</option> <option value="1">1</option> <option value="2">2</option> <option value="3">3</option> <option value="4">4</option> <option value="5">5</option> <option value="6">6</option> <option value="7">7</option> <option value="8">8</option> <option value="9">9</option> <option value="10">10</option> <option value="11">11</option> <option value="12">12</option> <option value="13">13</option> <option value="14">14</option> <option value="15">15</option> <option value="16">16</option> <option value="17">17</option> <option value="18">18</option> <option value="19">19</option> <option value="20">20</option> <option value="21">21</option> <option value="22">22</option> <option value="23">23</option> <option value="24">24</option> <option value="25">25</option> <option value="26">26</option> <option value="27">27</option> <option value="28">28</option> <option value="29">29</option> <option value="30">30</option> <option value="31">31</option> </select> <select name="month" id="month"> <option value="0" selected>MONTH</option> <option value="1">January</option> <option value="2">February</option> <option value="3">March</option> <option value="4">April</option> <option value="5">May</option> <option value="6">June</option> <option value="7">July</option> <option value="8">August</option> <option value="9">September</option> <option value="10">October</option> <option value="11">November</option> <option value="12">December</option> </select> <select name="year" id="year"> <option value="0" selected>YEAR</option> <option value="1920">1920</option> <option value="1921">1921</option> <option value="1922">1922</option> <option value="1923">1923</option> <option value="1924">1924</option> <option value="1925">1925</option> <option value="1926">1926</option> <option value="1927">1927</option> <option value="1928">1928</option> <option value="1929">1929</option> <option value="1930">1930</option> <option value="1931">1931</option> <option value="1932">1932</option> <option value="1933">1933</option> <option value="1934">1934</option> <option value="1935">1935</option> <option value="1936">1936</option> <option value="1937">1937</option> <option value="1938">1938</option> <option value="1939">1939</option> <option value="1940">1940</option> <option value="1941">1941</option> <option value="1942">1942</option> <option value="1943">1943</option> <option value="1944">1944</option> <option value="1945">1945</option> <option value="1946">1946</option> <option value="1947">1947</option> <option value="1948">1948</option> <option value="1949">1949</option> <option value="1950">1950</option> <option value="1951">1951</option> <option value="1952">1952</option> <option value="1953">1953</option> <option value="1954">1954</option> <option value="1955">1955</option> <option value="1956">1956</option> <option value="1957">1957</option> <option value="1958">1958</option> <option value="1959">1959</option> <option value="1960">1960</option> <option value="1961">1961</option> <option value="1962">1962</option> <option value="1963">1963</option> <option value="1964">1964</option> <option value="1965">1965</option> <option value="1966">1966</option> <option value="1967">1967</option> <option value="1968">1968</option> <option value="1969">1969</option> <option value="1970">1970</option> <option value="1971">1971</option> <option value="1972">1972</option> <option value="1973">1973</option> <option value="1974">1974</option> <option value="1975">1975</option> <option value="1976">1976</option> <option value="1977">1977</option> <option value="1978">1978</option> <option value="1979">1979</option> <option value="1980">1980</option> <option value="1981">1981</option> <option value="1982">1982</option> <option value="1983">1983</option> <option value="1984">1984</option> <option value="1985">1985</option> <option value="1986">1986</option> <option value="1987">1987</option> <option value="1988">1988</option> <option value="1989">1989</option> <option value="1990">1990</option> <option value="1991">1991</option> <option value="1992">1992</option> <option value="1993">1993</option> <option value="1994">1994</option> <option value="1995">1995</option> <option value="1996">1996</option> <option value="1997">1997</option> <option value="1998">1998</option> <option value="1999">1999</option> </select> </form> </body> </html>

    Read the article

  • What are good design practices when working with Entity Framework

    - by AD
    This will apply mostly for an asp.net application where the data is not accessed via soa. Meaning that you get access to the objects loaded from the framework, not Transfer Objects, although some recommendation still apply. This is a community post, so please add to it as you see fit. Applies to: Entity Framework 1.0 shipped with Visual Studio 2008 sp1. Why pick EF in the first place? Considering it is a young technology with plenty of problems (see below), it may be a hard sell to get on the EF bandwagon for your project. However, it is the technology Microsoft is pushing (at the expense of Linq2Sql, which is a subset of EF). In addition, you may not be satisfied with NHibernate or other solutions out there. Whatever the reasons, there are people out there (including me) working with EF and life is not bad.make you think. EF and inheritance The first big subject is inheritance. EF does support mapping for inherited classes that are persisted in 2 ways: table per class and table the hierarchy. The modeling is easy and there are no programming issues with that part. (The following applies to table per class model as I don't have experience with table per hierarchy, which is, anyway, limited.) The real problem comes when you are trying to run queries that include one or many objects that are part of an inheritance tree: the generated sql is incredibly awful, takes a long time to get parsed by the EF and takes a long time to execute as well. This is a real show stopper. Enough that EF should probably not be used with inheritance or as little as possible. Here is an example of how bad it was. My EF model had ~30 classes, ~10 of which were part of an inheritance tree. On running a query to get one item from the Base class, something as simple as Base.Get(id), the generated SQL was over 50,000 characters. Then when you are trying to return some Associations, it degenerates even more, going as far as throwing SQL exceptions about not being able to query more than 256 tables at once. Ok, this is bad, EF concept is to allow you to create your object structure without (or with as little as possible) consideration on the actual database implementation of your table. It completely fails at this. So, recommendations? Avoid inheritance if you can, the performance will be so much better. Use it sparingly where you have to. In my opinion, this makes EF a glorified sql-generation tool for querying, but there are still advantages to using it. And ways to implement mechanism that are similar to inheritance. Bypassing inheritance with Interfaces First thing to know with trying to get some kind of inheritance going with EF is that you cannot assign a non-EF-modeled class a base class. Don't even try it, it will get overwritten by the modeler. So what to do? You can use interfaces to enforce that classes implement some functionality. For example here is a IEntity interface that allow you to define Associations between EF entities where you don't know at design time what the type of the entity would be. public enum EntityTypes{ Unknown = -1, Dog = 0, Cat } public interface IEntity { int EntityID { get; } string Name { get; } Type EntityType { get; } } public partial class Dog : IEntity { // implement EntityID and Name which could actually be fields // from your EF model Type EntityType{ get{ return EntityTypes.Dog; } } } Using this IEntity, you can then work with undefined associations in other classes // lets take a class that you defined in your model. // that class has a mapping to the columns: PetID, PetType public partial class Person { public IEntity GetPet() { return IEntityController.Get(PetID,PetType); } } which makes use of some extension functions: public class IEntityController { static public IEntity Get(int id, EntityTypes type) { switch (type) { case EntityTypes.Dog: return Dog.Get(id); case EntityTypes.Cat: return Cat.Get(id); default: throw new Exception("Invalid EntityType"); } } } Not as neat as having plain inheritance, particularly considering you have to store the PetType in an extra database field, but considering the performance gains, I would not look back. It also cannot model one-to-many, many-to-many relationship, but with creative uses of 'Union' it could be made to work. Finally, it creates the side effet of loading data in a property/function of the object, which you need to be careful about. Using a clear naming convention like GetXYZ() helps in that regards. Compiled Queries Entity Framework performance is not as good as direct database access with ADO (obviously) or Linq2SQL. There are ways to improve it however, one of which is compiling your queries. The performance of a compiled query is similar to Linq2Sql. What is a compiled query? It is simply a query for which you tell the framework to keep the parsed tree in memory so it doesn't need to be regenerated the next time you run it. So the next run, you will save the time it takes to parse the tree. Do not discount that as it is a very costly operation that gets even worse with more complex queries. There are 2 ways to compile a query: creating an ObjectQuery with EntitySQL and using CompiledQuery.Compile() function. (Note that by using an EntityDataSource in your page, you will in fact be using ObjectQuery with EntitySQL, so that gets compiled and cached). An aside here in case you don't know what EntitySQL is. It is a string-based way of writing queries against the EF. Here is an example: "select value dog from Entities.DogSet as dog where dog.ID = @ID". The syntax is pretty similar to SQL syntax. You can also do pretty complex object manipulation, which is well explained [here][1]. Ok, so here is how to do it using ObjectQuery< string query = "select value dog " + "from Entities.DogSet as dog " + "where dog.ID = @ID"; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>(query, EntityContext.Instance)); oQuery.Parameters.Add(new ObjectParameter("ID", id)); oQuery.EnablePlanCaching = true; return oQuery.FirstOrDefault(); The first time you run this query, the framework will generate the expression tree and keep it in memory. So the next time it gets executed, you will save on that costly step. In that example EnablePlanCaching = true, which is unnecessary since that is the default option. The other way to compile a query for later use is the CompiledQuery.Compile method. This uses a delegate: static readonly Func<Entities, int, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, Dog>((ctx, id) => ctx.DogSet.FirstOrDefault(it => it.ID == id)); or using linq static readonly Func<Entities, int, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, Dog>((ctx, id) => (from dog in ctx.DogSet where dog.ID == id select dog).FirstOrDefault()); to call the query: query_GetDog.Invoke( YourContext, id ); The advantage of CompiledQuery is that the syntax of your query is checked at compile time, where as EntitySQL is not. However, there are other consideration... Includes Lets say you want to have the data for the dog owner to be returned by the query to avoid making 2 calls to the database. Easy to do, right? EntitySQL string query = "select value dog " + "from Entities.DogSet as dog " + "where dog.ID = @ID"; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>(query, EntityContext.Instance)).Include("Owner"); oQuery.Parameters.Add(new ObjectParameter("ID", id)); oQuery.EnablePlanCaching = true; return oQuery.FirstOrDefault(); CompiledQuery static readonly Func<Entities, int, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, Dog>((ctx, id) => (from dog in ctx.DogSet.Include("Owner") where dog.ID == id select dog).FirstOrDefault()); Now, what if you want to have the Include parametrized? What I mean is that you want to have a single Get() function that is called from different pages that care about different relationships for the dog. One cares about the Owner, another about his FavoriteFood, another about his FavotireToy and so on. Basicly, you want to tell the query which associations to load. It is easy to do with EntitySQL public Dog Get(int id, string include) { string query = "select value dog " + "from Entities.DogSet as dog " + "where dog.ID = @ID"; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>(query, EntityContext.Instance)) .IncludeMany(include); oQuery.Parameters.Add(new ObjectParameter("ID", id)); oQuery.EnablePlanCaching = true; return oQuery.FirstOrDefault(); } The include simply uses the passed string. Easy enough. Note that it is possible to improve on the Include(string) function (that accepts only a single path) with an IncludeMany(string) that will let you pass a string of comma-separated associations to load. Look further in the extension section for this function. If we try to do it with CompiledQuery however, we run into numerous problems: The obvious static readonly Func<Entities, int, string, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, string, Dog>((ctx, id, include) => (from dog in ctx.DogSet.Include(include) where dog.ID == id select dog).FirstOrDefault()); will choke when called with: query_GetDog.Invoke( YourContext, id, "Owner,FavoriteFood" ); Because, as mentionned above, Include() only wants to see a single path in the string and here we are giving it 2: "Owner" and "FavoriteFood" (which is not to be confused with "Owner.FavoriteFood"!). Then, let's use IncludeMany(), which is an extension function static readonly Func<Entities, int, string, Dog> query_GetDog = CompiledQuery.Compile<Entities, int, string, Dog>((ctx, id, include) => (from dog in ctx.DogSet.IncludeMany(include) where dog.ID == id select dog).FirstOrDefault()); Wrong again, this time it is because the EF cannot parse IncludeMany because it is not part of the functions that is recognizes: it is an extension. Ok, so you want to pass an arbitrary number of paths to your function and Includes() only takes a single one. What to do? You could decide that you will never ever need more than, say 20 Includes, and pass each separated strings in a struct to CompiledQuery. But now the query looks like this: from dog in ctx.DogSet.Include(include1).Include(include2).Include(include3) .Include(include4).Include(include5).Include(include6) .[...].Include(include19).Include(include20) where dog.ID == id select dog which is awful as well. Ok, then, but wait a minute. Can't we return an ObjectQuery< with CompiledQuery? Then set the includes on that? Well, that what I would have thought so as well: static readonly Func<Entities, int, ObjectQuery<Dog>> query_GetDog = CompiledQuery.Compile<Entities, int, string, ObjectQuery<Dog>>((ctx, id) => (ObjectQuery<Dog>)(from dog in ctx.DogSet where dog.ID == id select dog)); public Dog GetDog( int id, string include ) { ObjectQuery<Dog> oQuery = query_GetDog(id); oQuery = oQuery.IncludeMany(include); return oQuery.FirstOrDefault; } That should have worked, except that when you call IncludeMany (or Include, Where, OrderBy...) you invalidate the cached compiled query because it is an entirely new one now! So, the expression tree needs to be reparsed and you get that performance hit again. So what is the solution? You simply cannot use CompiledQueries with parametrized Includes. Use EntitySQL instead. This doesn't mean that there aren't uses for CompiledQueries. It is great for localized queries that will always be called in the same context. Ideally CompiledQuery should always be used because the syntax is checked at compile time, but due to limitation, that's not possible. An example of use would be: you may want to have a page that queries which two dogs have the same favorite food, which is a bit narrow for a BusinessLayer function, so you put it in your page and know exactly what type of includes are required. Passing more than 3 parameters to a CompiledQuery Func is limited to 5 parameters, of which the last one is the return type and the first one is your Entities object from the model. So that leaves you with 3 parameters. A pitance, but it can be improved on very easily. public struct MyParams { public string param1; public int param2; public DateTime param3; } static readonly Func<Entities, MyParams, IEnumerable<Dog>> query_GetDog = CompiledQuery.Compile<Entities, MyParams, IEnumerable<Dog>>((ctx, myParams) => from dog in ctx.DogSet where dog.Age == myParams.param2 && dog.Name == myParams.param1 and dog.BirthDate > myParams.param3 select dog); public List<Dog> GetSomeDogs( int age, string Name, DateTime birthDate ) { MyParams myParams = new MyParams(); myParams.param1 = name; myParams.param2 = age; myParams.param3 = birthDate; return query_GetDog(YourContext,myParams).ToList(); } Return Types (this does not apply to EntitySQL queries as they aren't compiled at the same time during execution as the CompiledQuery method) Working with Linq, you usually don't force the execution of the query until the very last moment, in case some other functions downstream wants to change the query in some way: static readonly Func<Entities, int, string, IEnumerable<Dog>> query_GetDog = CompiledQuery.Compile<Entities, int, string, IEnumerable<Dog>>((ctx, age, name) => from dog in ctx.DogSet where dog.Age == age && dog.Name == name select dog); public IEnumerable<Dog> GetSomeDogs( int age, string name ) { return query_GetDog(YourContext,age,name); } public void DataBindStuff() { IEnumerable<Dog> dogs = GetSomeDogs(4,"Bud"); // but I want the dogs ordered by BirthDate gridView.DataSource = dogs.OrderBy( it => it.BirthDate ); } What is going to happen here? By still playing with the original ObjectQuery (that is the actual return type of the Linq statement, which implements IEnumerable), it will invalidate the compiled query and be force to re-parse. So, the rule of thumb is to return a List< of objects instead. static readonly Func<Entities, int, string, IEnumerable<Dog>> query_GetDog = CompiledQuery.Compile<Entities, int, string, IEnumerable<Dog>>((ctx, age, name) => from dog in ctx.DogSet where dog.Age == age && dog.Name == name select dog); public List<Dog> GetSomeDogs( int age, string name ) { return query_GetDog(YourContext,age,name).ToList(); //<== change here } public void DataBindStuff() { List<Dog> dogs = GetSomeDogs(4,"Bud"); // but I want the dogs ordered by BirthDate gridView.DataSource = dogs.OrderBy( it => it.BirthDate ); } When you call ToList(), the query gets executed as per the compiled query and then, later, the OrderBy is executed against the objects in memory. It may be a little bit slower, but I'm not even sure. One sure thing is that you have no worries about mis-handling the ObjectQuery and invalidating the compiled query plan. Once again, that is not a blanket statement. ToList() is a defensive programming trick, but if you have a valid reason not to use ToList(), go ahead. There are many cases in which you would want to refine the query before executing it. Performance What is the performance impact of compiling a query? It can actually be fairly large. A rule of thumb is that compiling and caching the query for reuse takes at least double the time of simply executing it without caching. For complex queries (read inherirante), I have seen upwards to 10 seconds. So, the first time a pre-compiled query gets called, you get a performance hit. After that first hit, performance is noticeably better than the same non-pre-compiled query. Practically the same as Linq2Sql When you load a page with pre-compiled queries the first time you will get a hit. It will load in maybe 5-15 seconds (obviously more than one pre-compiled queries will end up being called), while subsequent loads will take less than 300ms. Dramatic difference, and it is up to you to decide if it is ok for your first user to take a hit or you want a script to call your pages to force a compilation of the queries. Can this query be cached? { Dog dog = from dog in YourContext.DogSet where dog.ID == id select dog; } No, ad-hoc Linq queries are not cached and you will incur the cost of generating the tree every single time you call it. Parametrized Queries Most search capabilities involve heavily parametrized queries. There are even libraries available that will let you build a parametrized query out of lamba expressions. The problem is that you cannot use pre-compiled queries with those. One way around that is to map out all the possible criteria in the query and flag which one you want to use: public struct MyParams { public string name; public bool checkName; public int age; public bool checkAge; } static readonly Func<Entities, MyParams, IEnumerable<Dog>> query_GetDog = CompiledQuery.Compile<Entities, MyParams, IEnumerable<Dog>>((ctx, myParams) => from dog in ctx.DogSet where (myParams.checkAge == true && dog.Age == myParams.age) && (myParams.checkName == true && dog.Name == myParams.name ) select dog); protected List<Dog> GetSomeDogs() { MyParams myParams = new MyParams(); myParams.name = "Bud"; myParams.checkName = true; myParams.age = 0; myParams.checkAge = false; return query_GetDog(YourContext,myParams).ToList(); } The advantage here is that you get all the benifits of a pre-compiled quert. The disadvantages are that you most likely will end up with a where clause that is pretty difficult to maintain, that you will incur a bigger penalty for pre-compiling the query and that each query you run is not as efficient as it could be (particularly with joins thrown in). Another way is to build an EntitySQL query piece by piece, like we all did with SQL. protected List<Dod> GetSomeDogs( string name, int age) { string query = "select value dog from Entities.DogSet where 1 = 1 "; if( !String.IsNullOrEmpty(name) ) query = query + " and dog.Name == @Name "; if( age > 0 ) query = query + " and dog.Age == @Age "; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>( query, YourContext ); if( !String.IsNullOrEmpty(name) ) oQuery.Parameters.Add( new ObjectParameter( "Name", name ) ); if( age > 0 ) oQuery.Parameters.Add( new ObjectParameter( "Age", age ) ); return oQuery.ToList(); } Here the problems are: - there is no syntax checking during compilation - each different combination of parameters generate a different query which will need to be pre-compiled when it is first run. In this case, there are only 4 different possible queries (no params, age-only, name-only and both params), but you can see that there can be way more with a normal world search. - Noone likes to concatenate strings! Another option is to query a large subset of the data and then narrow it down in memory. This is particularly useful if you are working with a definite subset of the data, like all the dogs in a city. You know there are a lot but you also know there aren't that many... so your CityDog search page can load all the dogs for the city in memory, which is a single pre-compiled query and then refine the results protected List<Dod> GetSomeDogs( string name, int age, string city) { string query = "select value dog from Entities.DogSet where dog.Owner.Address.City == @City "; ObjectQuery<Dog> oQuery = new ObjectQuery<Dog>( query, YourContext ); oQuery.Parameters.Add( new ObjectParameter( "City", city ) ); List<Dog> dogs = oQuery.ToList(); if( !String.IsNullOrEmpty(name) ) dogs = dogs.Where( it => it.Name == name ); if( age > 0 ) dogs = dogs.Where( it => it.Age == age ); return dogs; } It is particularly useful when you start displaying all the data then allow for filtering. Problems: - Could lead to serious data transfer if you are not careful about your subset. - You can only filter on the data that you returned. It means that if you don't return the Dog.Owner association, you will not be able to filter on the Dog.Owner.Name So what is the best solution? There isn't any. You need to pick the solution that works best for you and your problem: - Use lambda-based query building when you don't care about pre-compiling your queries. - Use fully-defined pre-compiled Linq query when your object structure is not too complex. - Use EntitySQL/string concatenation when the structure could be complex and when the possible number of different resulting queries are small (which means fewer pre-compilation hits). - Use in-memory filtering when you are working with a smallish subset of the data or when you had to fetch all of the data on the data at first anyway (if the performance is fine with all the data, then filtering in memory will not cause any time to be spent in the db). Singleton access The best way to deal with your context and entities accross all your pages is to use the singleton pattern: public sealed class YourContext { private const string instanceKey = "On3GoModelKey"; YourContext(){} public static YourEntities Instance { get { HttpContext context = HttpContext.Current; if( context == null ) return Nested.instance; if (context.Items[instanceKey] == null) { On3GoEntities entity = new On3GoEntities(); context.Items[instanceKey] = entity; } return (YourEntities)context.Items[instanceKey]; } } class Nested { // Explicit static constructor to tell C# compiler // not to mark type as beforefieldinit static Nested() { } internal static readonly YourEntities instance = new YourEntities(); } } NoTracking, is it worth it? When executing a query, you can tell the framework to track the objects it will return or not. What does it mean? With tracking enabled (the default option), the framework will track what is going on with the object (has it been modified? Created? Deleted?) and will also link objects together, when further queries are made from the database, which is what is of interest here. For example, lets assume that Dog with ID == 2 has an owner which ID == 10. Dog dog = (from dog in YourContext.DogSet where dog.ID == 2 select dog).FirstOrDefault(); //dog.OwnerReference.IsLoaded == false; Person owner = (from o in YourContext.PersonSet where o.ID == 10 select dog).FirstOrDefault(); //dog.OwnerReference.IsLoaded == true; If we were to do the same with no tracking, the result would be different. ObjectQuery<Dog> oDogQuery = (ObjectQuery<Dog>) (from dog in YourContext.DogSet where dog.ID == 2 select dog); oDogQuery.MergeOption = MergeOption.NoTracking; Dog dog = oDogQuery.FirstOrDefault(); //dog.OwnerReference.IsLoaded == false; ObjectQuery<Person> oPersonQuery = (ObjectQuery<Person>) (from o in YourContext.PersonSet where o.ID == 10 select o); oPersonQuery.MergeOption = MergeOption.NoTracking; Owner owner = oPersonQuery.FirstOrDefault(); //dog.OwnerReference.IsLoaded == false; Tracking is very useful and in a perfect world without performance issue, it would always be on. But in this world, there is a price for it, in terms of performance. So, should you use NoTracking to speed things up? It depends on what you are planning to use the data for. Is there any chance that the data your query with NoTracking can be used to make update/insert/delete in the database? If so, don't use NoTracking because associations are not tracked and will causes exceptions to be thrown. In a page where there are absolutly no updates to the database, you can use NoTracking. Mixing tracking and NoTracking is possible, but it requires you to be extra careful with updates/inserts/deletes. The problem is that if you mix then you risk having the framework trying to Attach() a NoTracking object to the context where another copy of the same object exist with tracking on. Basicly, what I am saying is that Dog dog1 = (from dog in YourContext.DogSet where dog.ID == 2).FirstOrDefault(); ObjectQuery<Dog> oDogQuery = (ObjectQuery<Dog>) (from dog in YourContext.DogSet where dog.ID == 2 select dog); oDogQuery.MergeOption = MergeOption.NoTracking; Dog dog2 = oDogQuery.FirstOrDefault(); dog1 and dog2 are 2 different objects, one tracked and one not. Using the detached object in an update/insert will force an Attach() that will say "Wait a minute, I do already have an object here with the same database key. Fail". And when you Attach() one object, all of its hierarchy gets attached as well, causing problems everywhere. Be extra careful. How much faster is it with NoTracking It depends on the queries. Some are much more succeptible to tracking than other. I don't have a fast an easy rule for it, but it helps. So I should use NoTracking everywhere then? Not exactly. There are some advantages to tracking object. The first one is that the object is cached, so subsequent call for that object will not hit the database. That cache is only valid for the lifetime of the YourEntities object, which, if you use the singleton code above, is the same as the page lifetime. One page request == one YourEntity object. So for multiple calls for the same object, it will load only once per page request. (Other caching mechanism could extend that). What happens when you are using NoTracking and try to load the same object multiple times? The database will be queried each time, so there is an impact there. How often do/should you call for the same object during a single page request? As little as possible of course, but it does happens. Also remember the piece above about having the associations connected automatically for your? You don't have that with NoTracking, so if you load your data in multiple batches, you will not have a link to between them: ObjectQuery<Dog> oDogQuery = (ObjectQuery<Dog>)(from dog in YourContext.DogSet select dog); oDogQuery.MergeOption = MergeOption.NoTracking; List<Dog> dogs = oDogQuery.ToList(); ObjectQuery<Person> oPersonQuery = (ObjectQuery<Person>)(from o in YourContext.PersonSet select o); oPersonQuery.MergeOption = MergeOption.NoTracking; List<Person> owners = oPersonQuery.ToList(); In this case, no dog will have its .Owner property set. Some things to keep in mind when you are trying to optimize the performance. No lazy loading, what am I to do? This can be seen as a blessing in disguise. Of course it is annoying to load everything manually. However, it decreases the number of calls to the db and forces you to think about when you should load data. The more you can load in one database call the better. That was always true, but it is enforced now with this 'feature' of EF. Of course, you can call if( !ObjectReference.IsLoaded ) ObjectReference.Load(); if you want to, but a better practice is to force the framework to load the objects you know you will need in one shot. This is where the discussion about parametrized Includes begins to make sense. Lets say you have you Dog object public class Dog { public Dog Get(int id) { return YourContext.DogSet.FirstOrDefault(it => it.ID == id ); } } This is the type of function you work with all the time. It gets called from all over the place and once you have that Dog object, you will do very different things to it in different functions. First, it should be pre-compiled, because you will call that very often. Second, each different pages will want to have access to a different subset of the Dog data. Some will want the Owner, some the FavoriteToy, etc. Of course, you could call Load() for each reference you need anytime you need one. But that will generate a call to the database each time. Bad idea. So instead, each page will ask for the data it wants to see when it first request for the Dog object: static public Dog Get(int id) { return GetDog(entity,"");} static public Dog Get(int id, string includePath) { string query = "select value o " + " from YourEntities.DogSet as o " +

    Read the article

  • How to group using XSLT

    - by AdRock
    I'm having trouble grouping a set of nodes. I've found an article that does work with grouping and i have tested it and it works on a small test stylesheet i have I now need to use it in my stylesheet where I only want to select node sets that have a specific value. What I want to do in my stylesheet is select all users who have a userlevel of 2 then to group them by the volunteer region. What happens at the minute is that it gets the right amount of users with userlevel 2 but doesn't print them. It just repeats the first user in the xml file. <?xml version="1.0" encoding="ISO-8859-1"?> <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"> <xsl:key name="volunteers-by-region" match="volunteer" use="region" /> <xsl:template name="hoo" match="/"> <html> <head> <title>Registered Volunteers</title> <link rel="stylesheet" type="text/css" href="volunteer.css" /> </head> <body> <h1>Registered Volunteers</h1> <h3>Ordered by the username ascending</h3> <xsl:for-each select="folktask/member[user/account/userlevel='2']"> <xsl:for-each select="volunteer[count(. | key('volunteers-by-region', region)[1]) = 1]"> <xsl:sort select="region" /> <xsl:for-each select="key('volunteers-by-region', region)"> <xsl:sort select="folktask/member/user/personal/name" /> <div class="userdiv"> <xsl:call-template name="member_userid"> <xsl:with-param name="myid" select="/folktask/member/user/@id" /> </xsl:call-template> <xsl:call-template name="volunteer_volid"> <xsl:with-param name="volid" select="/folktask/member/volunteer/@id" /> </xsl:call-template> <xsl:call-template name="volunteer_role"> <xsl:with-param name="volrole" select="/folktask/member/volunteer/roles" /> </xsl:call-template> <xsl:call-template name="volunteer_region"> <xsl:with-param name="volloc" select="/folktask/member/volunteer/region" /> </xsl:call-template> </div> </xsl:for-each> </xsl:for-each> </xsl:for-each> <xsl:if test="position()=last()"> <div class="count"><h2>Total number of volunteers: <xsl:value-of select="count(/folktask/member/user/account/userlevel[text()=2])"/></h2></div> </xsl:if> </body> </html> </xsl:template> <xsl:template name="member_userid"> <xsl:param name="myid" select="'Not Available'" /> <div class="heading bold"><h2>USER ID: <xsl:value-of select="$myid" /></h2></div> </xsl:template> <xsl:template name="volunteer_volid"> <xsl:param name="volid" select="'Not Available'" /> <div class="heading2 bold"><h2>VOLUNTEER ID: <xsl:value-of select="$volid" /></h2></div> </xsl:template> <xsl:template name="volunteer_role"> <xsl:param name="volrole" select="'Not Available'" /> <div class="small bold">ROLES:</div> <div class="large"> <xsl:choose> <xsl:when test="string-length($volrole)!=0"> <xsl:value-of select="$volrole" /> </xsl:when> <xsl:otherwise> <xsl:text> </xsl:text> </xsl:otherwise> </xsl:choose> </div> </xsl:template> <xsl:template name="volunteer_region"> <xsl:param name="volloc" select="'Not Available'" /> <div class="small bold">REGION:</div> <div class="large"><xsl:value-of select="$volloc" /></div> </xsl:template> </xsl:stylesheet> here is my full xml file <?xml version="1.0" encoding="ISO-8859-1" ?> <?xml-stylesheet type="text/xsl" href="volunteers.xsl"?> <folktask xmlns:xs="http://www.w3.org/2001/XMLSchema-instance" xs:noNamespaceSchemaLocation="folktask.xsd"> <member> <user id="1"> <personal> <name>Abbie Hunt</name> <sex>Female</sex> <address1>108 Access Road</address1> <address2></address2> <city>Wells</city> <county>Somerset</county> <postcode>BA5 8GH</postcode> <telephone>01528927616</telephone> <mobile>07085252492</mobile> <email>[email protected]</email> </personal> <account> <username>AdRock</username> <password>269eb625e2f0cf6fae9a29434c12a89f</password> <userlevel>4</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="1"> <roles></roles> <region>South West</region> </volunteer> </member> <member> <user id="2"> <personal> <name>Aidan Harris</name> <sex>Male</sex> <address1>103 Aiken Street</address1> <address2></address2> <city>Chichester</city> <county>Sussex</county> <postcode>PO19 4DS</postcode> <telephone>01905149894</telephone> <mobile>07784467941</mobile> <email>[email protected]</email> </personal> <account> <username>AmbientExpert</username> <password>8e64214160e9dd14ae2a6d9f700004a6</password> <userlevel>2</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="2"> <roles>Van Driver,gas Fitter</roles> <region>South Central</region> </volunteer> </member> <member> <user id="3"> <personal> <name>Skye Saunders</name> <sex>Female</sex> <address1>31 Anns Court</address1> <address2></address2> <city>Cirencester</city> <county>Gloucestershire</county> <postcode>GL7 1JG</postcode> <telephone>01958303514</telephone> <mobile>07260491667</mobile> <email>[email protected]</email> </personal> <account> <username>BigUndecided</username> <password>ea297847f80e046ca24a8621f4068594</password> <userlevel>2</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="3"> <roles>Scaffold Erector</roles> <region>South West</region> </volunteer> </member> <member> <user id="4"> <personal> <name>Connor Lawson</name> <sex>Male</sex> <address1>12 Ash Way</address1> <address2></address2> <city>Swindon</city> <county>Wiltshire</county> <postcode>SN3 6GS</postcode> <telephone>01791928119</telephone> <mobile>07338695664</mobile> <email>[email protected]</email> </personal> <account> <username>iTuneStinker</username> <password>3a1f5fda21a07bfff20c41272bae7192</password> <userlevel>3</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <festival id="1"> <event> <eventname>Oxford Folk Festival</eventname> <url>http://www.oxfordfolkfestival.com/</url> <datefrom>2010-04-07</datefrom> <dateto>2010-04-09</dateto> <location>Oxford</location> <eventpostcode>OX1 9BE</eventpostcode> <additional>Oxford Folk Festival is going into it's third year in 2006. As well as needing volunteers to steward for the event on the weekend itself, we would be delighted to hear from people willing to help in year round festival work such as stuffing envelopes for mailings, poster and leaflet distribution, and stewarding duties at festival pre-events.</additional> <coords> <lat>51.735640</lat> <lng>-1.276136</lng> </coords> </event> <contact> <conname>Stuart Vincent</conname> <conaddress1>P.O. Box 642</conaddress1> <conaddress2></conaddress2> <concity>Oxford</concity> <concounty>Bedfordshire</concounty> <conpostcode>OX1 3BY</conpostcode> <contelephone>01865 79073</contelephone> <conmobile></conmobile> <fax></fax> <conemail>[email protected]</conemail> </contact> </festival> </member> <member> <user id="5"> <personal> <name>Lewis King</name> <sex>Male</sex> <address1>67 Arbors Way</address1> <address2></address2> <city>Sherborne</city> <county>Dorset</county> <postcode>DT9 0GS</postcode> <telephone>01446139701</telephone> <mobile>07292614033</mobile> <email>[email protected]</email> </personal> <account> <username>Runninglife</username> <password>98fab0a27c34ddb2b0618bc184d4331d</password> <userlevel>2</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="4"> <roles>Van Driver</roles> <region>South West</region> </volunteer> </member> <member> <user id="6"> <personal> <name>Cameron Lee</name> <sex>Male</sex> <address1>77 Arrington Road</address1> <address2></address2> <city>Solihull</city> <county>Warwickshire</county> <postcode>B90 6FG</postcode> <telephone>01435158624</telephone> <mobile>07789503179</mobile> <email>[email protected]</email> </personal> <account> <username>love2Mixer</username> <password>1df752d54876928639cea07ce036a9c0</password> <userlevel>2</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="5"> <roles>Fire Warden</roles> <region>Midlands</region> </volunteer> </member> <member> <user id="7"> <personal> <name>Lexie Dean</name> <sex>Female</sex> <address1>38 Bloomfield Court</address1> <address2></address2> <city>Windermere</city> <county>Westmorland</county> <postcode>LA23 8BM</postcode> <telephone>01781207188</telephone> <mobile>07127461231</mobile> <email>[email protected]</email> </personal> <account> <username>MailNetworker</username> <password>0e070701839e612bf46af4421db4f44b</password> <userlevel>3</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <festival id="2"> <event> <eventname>Middlewich Folk And Boat Festival</eventname> <url>http://midfest.org.uk/mfab/</url> <datefrom>2010-06-16</datefrom> <dateto>2010-06-18</dateto> <location>Middlewich</location> <eventpostcode>CW10 9BX</eventpostcode> <additional>We welcome stewards staying on campsite or boats.</additional> <coords> <lat>53.190562</lat> <lng>-2.444926</lng> </coords> </event> <contact> <conname>Festival Committee</conname> <conaddress1>PO Box 141</conaddress1> <conaddress2></conaddress2> <concity>Winsford</concity> <concounty>Cheshire</concounty> <conpostcode>CW10 9WB</conpostcode> <contelephone>07092 39050</contelephone> <conmobile>07092 39050</conmobile> <fax></fax> <conemail>[email protected]</conemail> </contact> </festival> </member> <member> <user id="8"> <personal> <name>Liam Chapman</name> <sex>Male</sex> <address1>99 Black Water Drive</address1> <address2></address2> <city>St.Austell</city> <county>Cornwall</county> <postcode>PL25 3GF</postcode> <telephone>01835629418</telephone> <mobile>07695179069</mobile> <email>[email protected]</email> </personal> <account> <username>GreenWimp</username> <password>1fe3df99a841dc4f723d21af89e0990f</password> <userlevel>1</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> </member> <member> <user id="9"> <personal> <name>Brandon Harrison</name> <sex>Male</sex> <address1>41 Arlington Way</address1> <address2></address2> <city>Dorchester</city> <county>Dorset</county> <postcode>DT1 3JS</postcode> <telephone>01293626735</telephone> <mobile>07277145760</mobile> <email>[email protected]</email> </personal> <account> <username>LovelyStar</username> <password>8b53b66f323aa5e6a083edb4fd44456b</password> <userlevel>1</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> </member> <member> <user id="10"> <personal> <name>Samuel Young</name> <sex>Male</sex> <address1>102 Bailey Hill Road</address1> <address2></address2> <city>Wolverhampton</city> <county>Staffordshire</county> <postcode>WV7 8HS</postcode> <telephone>01594531382</telephone> <mobile>07544663654</mobile> <email>[email protected]</email> </personal> <account> <username>GuruSassy</username> <password>00da02da6c143c3d136bf60b8bfcf43e</password> <userlevel>2</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="6"> <roles>Fire Warden</roles> <region>Midlands</region> </volunteer> </member> <member> <user id="11"> <personal> <name>Alexander Harris</name> <sex>Male</sex> <address1>93 Beguine Drive</address1> <address2></address2> <city>Winchester</city> <county>Hampshire</county> <postcode>S23 2FD</postcode> <telephone>01452496582</telephone> <mobile>07353867291</mobile> <email>[email protected]</email> </personal> <account> <username>GuitarExpert</username> <password>0102ad3740028e155925e9918ead3bde</password> <userlevel>2</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="7"> <roles>Scaffold Erector</roles> <region>North East</region> </volunteer> </member> <member> <user id="12"> <personal> <name>Tyler Mcdonald</name> <sex>Male</sex> <address1>44 Baker Road</address1> <address2></address2> <city>Bromley</city> <county>Kent</county> <postcode>BR1 2GD</postcode> <telephone>01918704546</telephone> <mobile>07314062451</mobile> <email>[email protected]</email> </personal> <account> <username>WildWish</username> <password>073220bb5e9a12ad202bb7d94dcc86f7</password> <userlevel>1</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> </member> <member> <user id="13"> <personal> <name>Skye Mason</name> <sex>Female</sex> <address1>56 Cedar Creek Church Road</address1> <address2></address2> <city>Bracknell</city> <county>Berkshire</county> <postcode>RG12 1AQ</postcode> <telephone>01787607618</telephone> <mobile>07540218868</mobile> <email>[email protected]</email> </personal> <account> <username>PizzaDork</username> <password>74c54937ee7051ee7f4ebc11296ed531</password> <userlevel>1</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> </member> <member> <user id="14"> <personal> <name>Maryam Rose</name> <sex>Female</sex> <address1>98 Baptist Circle</address1> <address2></address2> <city>Newbury</city> <county>Berkshire</county> <postcode>RG14 8DF</postcode> <telephone>01691317999</telephone> <mobile>07212477154</mobile> <email>[email protected]</email> </personal> <account> <username>SexTech</username> <password>f1c21f9f1e999da97d7dc460bb876fcf</password> <userlevel>3</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <festival id="3"> <event> <eventname>Birdsedge Village Festival</eventname> <url>http://www.birdsedge.co.uk/</url> <datefrom>2010-07-08</datefrom> <dateto>2010-07-09</dateto> <location>Birdsedge</location> <eventpostcode>HD8 8XT</eventpostcode> <additional></additional> <coords> <lat>53.565644</lat> <lng>-1.696196</lng> </coords> </event> <contact> <conname>Jacey Bedford</conname> <conaddress1>Penistone Road</conaddress1> <conaddress2>Birdsedge</conaddress2> <concity>Huddersfield</concity> <concounty>West Yorkshire</concounty> <conpostcode>HD8 8XT</conpostcode> <contelephone>01484 60623</contelephone> <conmobile></conmobile> <fax></fax> <conemail>[email protected]</conemail> </contact> </festival> </member> <member> <user id="15"> <personal> <name>Lexie Rogers</name> <sex>Female</sex> <address1>38 Bishop Road</address1> <address2></address2> <city>Matlock</city> <county>Derbyshire</county> <postcode>DE4 1BX</postcode> <telephone>01961168823</telephone> <mobile>07170855351</mobile> <email>[email protected]</email> </personal> <account> <username>ShipBurglar</username> <password>cc190488a95667cb117e20bc6c7c330e</password> <userlevel>2</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="8"> <roles>Gas Fitter</roles> <region>Midlands</region> </volunteer> </member> <member> <user id="16"> <personal> <name>Noah Parker</name> <sex>Male</sex> <address1>112 Canty Road</address1> <address2></address2> <city>Keswick</city> <county>Cumberland</county> <postcode>CA12 4TR</postcode> <telephone>01931272522</telephone> <mobile>07610026576</mobile> <email>[email protected]</email> </personal> <account> <username>AwsomeMoon</username> <password>50b770539bdf08543f15778fc7a6f188</password> <userlevel>2</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <volunteer id="9"> <roles>Van Driver</roles> <region>North West</region> </volunteer> </member> <member> <user id="17"> <personal> <name>Elliot Mitchell</name> <sex>Male</sex> <address1>102 Brown Loop</address1> <address2></address2> <city>Grimsby</city> <county>Lincolnshire</county> <postcode>OX16 4QP</postcode> <telephone>01212971319</telephone> <mobile>07544663654</mobile> <email>[email protected]</email> </personal> <account> <username>msBasher</username> <password>c38fad85badcdff6e3559ef38656305d</password> <userlevel>1</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> </member> <member> <user id="18"> <personal> <name>Scarlett Rose</name> <sex>Female</sex> <address1>93 Cedar Lane</address1> <address2></address2> <city>Stourbridge</city> <county>Warminster</county> <postcode>DY8 4NX</postcode> <telephone>01537477435</telephone> <mobile>07353867291</mobile> <email>[email protected]</email> </personal> <account> <username>MakeupWimp</username> <password>16a9b7910fc34304c1d1a6a1b0c31502</password> <userlevel>1</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> </member> <member> <user id="19"> <personal> <name>Katie Butler</name> <sex>Female</sex> <address1>44 Boulder Crest Road</address1> <address2></address2> <city>Bungay</city> <county>Suffolk</county> <postcode>NR35 1LT</postcode> <telephone>01419124094</telephone> <mobile>07314062451</mobile> <email>[email protected]</email> </personal> <account> <username>TomatoCrunch</username> <password>d7eba53443ec4ddcee69ed71b2023fc0</password> <userlevel>1</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> </member> <member> <user id="20"> <personal> <name>Jayden Richards</name> <sex>Male</sex> <address1>56 Corson Trail</address1> <address2></address2> <city>Sandy</city> <county>Bedfordshire</county> <postcode>SG19 6DF</postcode> <telephone>01882134438</telephone> <mobile>07540218868</mobile> <email>[email protected]</email> </personal> <account> <username>nightmareTwig</username> <password>8a9c08c7b6473493e8a5da15dd541025</password> <userlevel>3</userlevel> <signupdate>2010-03-26T09:23:50</signupdate> </account> </user> <festival id="4"> <event> <eventname>East Barnet Festival</eventname> <url>http://www.eastbarnetfestival.org.uk</url> <datefrom>2010-07-01</datefrom> <dateto>2010-07-03</dateto> <location>East Barnet</location> <eventpostcode>EN4 8TB</eventpostcode> <additional></additional> <coords> <lat>51.641556</lat> <lng>-0.163018</lng> </coords> </event> <contact> <conname>East Barnet Festival Commitee</conname> <conaddress1>Oak Hill Park</conaddress1> <conaddress2>Church Hill Road</conaddress2> <concity>East Barnet</concity> <concounty>Hertfordshire</concounty> <conpostcode>EN4 8TB</conpostcode> <contelephone>07071781745</contelephone> <conmobile>07071781745</conmobile> <fax></fax> <conemail>[email protected]</conemail> </contact> </festival> </member> <member> <user id="21"> <personal> <name>Abbie Jackson</name> <sex>Female</sex> <address1>98 Briarwood Lane</address1> <address2></address2> <city>Weymouth</city> <county>Dorset</county> <postcode>DT3 6TS</postcode> <telephone>01575629969</telephone> <mobile>07212477154</mobile> <email>[email protected]</email> </personal> <account> <username>CrazyBlockhead</username> <password>4ce56fb13d043be605037ace4fbd9fa5</password> <userlevel>2</u

    Read the article

< Previous Page | 15 16 17 18 19