Search Results

Search found 13104 results on 525 pages for 'malcolm box'.


  • CodePlex Daily Summary for Thursday, September 20, 2012

    CodePlex Daily Summary for Thursday, September 20, 2012Popular ReleasesSiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.2020.421): New features: Disable a specific part of SiteMap to keep the data without displaying them in the CRM application. It simply comments XML part of the sitemap (thanks to rboyers for this feature request) Right click an item and click on "Disable" to disable it Items disabled are greyed and a suffix "- disabled" is added Right click an item and click on "Enable" to enable it Refresh list of web resources in the web resources pickerWPF Animated GIF: WPF Animated GIF 1.2.1: Bug fixes 1275: fixed rendering issues when DisposalMethod = 2 or 3AJAX Control Toolkit: September 2012 Release: AJAX Control Toolkit Release Notes - September 2012 Release Version 60919September 2012 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4.5 – AJAX Control Toolkit for .NET 4.5 and sample site (Recommended). AJAX Control Toolkit .NET 4 – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - The current version of the AJAX Control Toolkit is not compatible with ...Lib.Web.Mvc & Yet another developer blog: Lib.Web.Mvc 6.1.0: Lib.Web.Mvc is a library which contains some helper classes for ASP.NET MVC such as strongly typed jqGrid helper, XSL transformation HtmlHelper/ActionResult, FileResult with range request support, custom attributes and more. Release contains: Lib.Web.Mvc.dll with xml documentation file Standalone documentation in chm file and change log Library source code Sample application for strongly typed jqGrid helper is available here. Sample application for XSL transformation HtmlHelper/ActionRe...Sense/Net CMS - Enterprise Content Management: SenseNet 6.1.2 Community Edition: Sense/Net 6.1.2 Community EditionMain new featuresOur current release brings a lot of bugfixes, including the resolution of js/css editing cache issues, xlsx file handling from Office, expense claim demo workspace fixes and much more. Besides fixes 6.1.2 introduces workflow start options and other minor features like a reusable Reject client button for approval scenarios and resource editor enhancements. We have also fixed an issue with our install package to bring you a flawless installation...WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.3: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...Python Tools for Visual Studio: 1.5 RC: PTVS 1.5RC Available! We’re pleased to announce the release of Python Tools for Visual Studio 1.5 RC. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, Edit/Intellisense/Debug/Profile, Cloud, HPC, IPython, etc. support. The primary new feature for the 1.5 release is Django including Azure support! 
The http://www.djangoproject.com is a pop...Launchbar: Lanchbar 4.0.0: First public release.AssaultCube Reloaded: 2.5.4 -: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we try to package for those OSes. Try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included) Changelog: New logo Improved airstrike! Reset nukes...Extended WPF Toolkit: Extended WPF Toolkit - 1.7.0: Want an easier way to install the Extended WPF Toolkit?The Extended WPF Toolkit is available on Nuget. What's new in the 1.7.0 Release?New controls Zoombox Pie New features / bug fixes PropertyGrid.ShowTitle property added to allow showing/hiding the PropertyGrid title. Modifications to the PropertyGrid.EditorDefinitions collection will now automatically be applied to the PropertyGrid. Modifications to the PropertyGrid.PropertyDefinitions collection will now be reflected automaticaly...JayData - The cross-platform HTML5 data-management library for JavaScript: JayData 1.2: JayData is a unified data access library for JavaScript to CRUD + Query data from different sources like OData, MongoDB, WebSQL, SqLite, Facebook or YQL. The library can be integrated with Knockout.js or Sencha Touch 2 and can be used on Node.js as well. See it in action in this 6 minutes video Sencha Touch 2 example app using JayData: Netflix browser. What's new in JayData 1.2 For detailed release notes check the release notes. JayData core: all async operations now support promises JayDa...????????API for .Net SDK: SDK for .Net ??? Release 4: 2012?9?17??? ?????,???????????????。 ?????Release 3??????,???????,???,??? ??????????????????SDK,????????。 ??,??????? That's all.VidCoder: 1.4.0 Beta: First Beta release! Catches up to HandBrake nightlies with SVN 4937. Added PGS (Blu-ray) subtitle support. Additional framerates available: 30, 50, 59.94, 60 Additional sample rates available: 8, 11.025, 12 and 16 kHz Additional higher bitrates available for audio. Same as Source Constant Framerate available. Added Apple TV 3 preset. Added new Bob deinterlacing option. Introduced process isolation for encodes. Now if HandBrake crashes, VidCoder will keep running and continue pro...DNN Metro7 style Skin package: Metro7 style Skin for DotNetNuke 06.02.01: Stabilization release fixed this issues: Links not worked on FF, Chrome and Safari Modified packaging with own manifest file for install and source package. Moved the user Image on the Login to the left side. Moved h2 font-size to 24px. Note : This release Comes w/o source package about we still work an a solution. Who Needs the Visual Studio source files please go to source and download it from there. Known 16 CSS issues that related to the skin.css. All others are DNN default o...Visual Studio Icon Patcher: Version 1.5.1: This fixes a bug in the 1.5 release where it would crash when no language packs were installed for VS2010.VFPX: Desktop Alerts 1.0.2: This update for the Desktop Alerts contains changes to behavior for setting custom sounds for alerts. I have removed ALERTWAV.TXT from the project, and also removed DA_DEFAULTSOUND from the VFPALERT.H file. 
The AlertManager class and Alert class both have a "default" cSound of ADDBS(JUSTPATH(_VFP.ServerName))+"alert.wav" --- so, as long as you distribute a sound file with the file name "alert.wav" along with the EXE, that file will be used. You can set your own sound file globally by setti...MCEBuddy 2.x: MCEBuddy 2.2.15: Changelog for 2.2.15 (32bit and 64bit) 1. Added support for %originalfilepath% to get the source file full path. Used for custom commands only. 2. Added support for better parsing of Media Portal XML files to extract ShowName and Episode Name and download additional details from TVDB (like Season No, Episode No etc). 3. Added support for TVDB seriesID in metadata 4. Added support for eMail non blocking UI testEmmaClient - Liveresults for Orienteering: EmmaClient 2012-09-13: Minor release with a small fix for producing OS2012 results (and status of runners in the forest)Multiple Image choice custom field type: MultipleImageUpload V1.0: This is the Custom field type which allows the users to choose image as a choice field. This custom field type is SharePoint 2010, install the WSP thru powershell or Stsadm tool and enjoy the functionality...MDS Administration: Version 1.1.3: Fixed Rename issueNew Projects3dxia: bug3dxiaBitbucket Issue Tracker: A simple issue-tracking Windows client for your projects hosted on bitbucket.org.C++ thread-safe logging: Visual Studio C++ log library project: add to your project for thread-safe logging capabilities.Caddies GeoNote: The work started from making a vision for a neighbourhood communication platform, and ended up in creating the version 1.0 of a mobile application – GeoNotes – CodePlexGitHookForAzure: TestCommerce Server Pipeline Log Analyzer: This tool read and analyze pipeline logs under one selected folder. It applies to Microsoft Commerce Server 2002, 2007, 2009 and 2009 R2 Pipeline logs.Contrib.Mod.ResetPassword: Send reset link as a shapeContrib.Taxonomies.ViewExtension: Orchard module that adds a filter box to the taxonomies selector.EasierRdp: This is a remote desktop session management tool which provides an easy way to maintain multiple users and servers' connectionEconomic news grabber: WCF service for get news from rss, news sites and etc. WPF client for presentation this data for end users.Eticaret Sitesi: eee ticFacebook Graph API SDK Helper Class Library: Facebook C# Graph API SDK Helper Class Under developmentfxch01v14: helloKarned 2: Karned est un carnet de pêche informatique. Ce logiciel permet de noter vos prises de pêche à des fins d'analyse, ou simplement pour le souvenir...lixotrash: SandBox and POCs collections, not interesting hereLoggerLib: The project is a "Tracing Library" developed in a Borland C++ enviroment. 
Il progetto consiste in una libreria di tracciamento, sviluppata in ambiente Borland.LyncTalker: A simple tray application which will speak incoming Lync instant messages.MicroFrameWork: MicroFrameWorkNuzzle: 2.6.5 Dofus EmulatorPDF Merge: PDF Merge is a simple user-friendly application that allows you to merge multiple PDF documents including scanned / imported documents and images into 1 PDF.Pipeline: A library of several lightweight pipeline implementations ("pipes and filters" pattern).Prime Calculator: PrimeCalculator factorizes a number or a math expression into its prime factors or if prime display its prime type [Unit, Prime, Additive, Pure].Racing: not ready yetRuntime DataSet/DataTable viewer: This component basically allows you to inspect the contents of any Data Set or a Data Table at runtime without breaking into the debugger again and again.Service billing: Student group work for the College of West Anglia UCWA. Snake!: A Snake game written in C#SoccerBot: this is just a test projectSQL Server Trace File Import Utility: Command-line utility to import trace files into a data warehouse type structure. Currently it only handles Login events.testscenairo7onv14: helloToQueryString: Serialize any object in C# to a query string with the .ToQueryString() extension method. Supports primitives, strings, arrays and collections.tyajz: tyajz projectWindows Azure Table Storage: you can find all the details in my blog: hhaggan.wordpress.com and if you do have any question or inquiries feel free to contact me at hhaggan@hotmail.comwtcms: wtcms


  • Remove Box2D bodies after collision detection in Android?

    - by jubin
    Can any one explain me how to destroy box2d body when collide i have tried but my application crashed.First i have checked al collisions then add all the bodies in array who i want to destroy.I am trying to learning this tutorial My all the bodies are falling i want these bodies should destroy when these bodies will collide my actor monkey but when it collide it destroy but my aplication crashed.I have googled and from google i got the application crash reasons we should not destroy body in step funtion but i am removing body in the last of tick method. could any one help me or provide me code aur check my code why i am getting this prblem or how can i destroy box2d bodies. This is my code what i am doing. Please could any one check my code and tell me what is i am doing wrong for removing bodies. The code is for multiple box2d objects falling on my actor monkey it should be destroy when it will fall on the monkey.It is destroing but my application crahes. static class Box2DLayer extends CCLayer { protected static final float PTM_RATIO = 32.0f; protected static final float WALK_FACTOR = 3.0f; protected static final float MAX_WALK_IMPULSE = 0.2f; protected static final float ANIM_SPEED = 0.3f; int isLeft=0; String dir=""; int x =0; float direction; CCColorLayer objectHint; // protected static final float PTM_RATIO = 32.0f; protected World _world; protected static Body spriteBody; CGSize winSize = CCDirector.sharedDirector().winSize(); private static int count = 200; protected static Body monkey_body; private static Body bodies; CCSprite monkey; float animDelay; int animPhase; CCSpriteSheet danceSheet = CCSpriteSheet.spriteSheet("phases.png"); CCSprite _block; List<Body> toDestroy = new ArrayList<Body>(); //CCSpriteSheet _spriteSheet; private static MyContactListener _contactListener = new MyContactListener(); public Box2DLayer() { this.setIsAccelerometerEnabled(true); CCSprite bg = CCSprite.sprite("jungle.png"); addChild(bg,0); bg.setAnchorPoint(0,0); bg.setPosition(0,0); CGSize s = CCDirector.sharedDirector().winSize(); // Use scaled width and height so that our boundaries always match the current screen float scaledWidth = s.width/PTM_RATIO; float scaledHeight = s.height/PTM_RATIO; Vector2 gravity = new Vector2(0.0f, -30.0f); boolean doSleep = false; _world = new World(gravity, doSleep); // Create edges around the entire screen // Define the ground body. BodyDef bxGroundBodyDef = new BodyDef(); bxGroundBodyDef.position.set(0.0f, 0.0f); // The body is also added to the world. Body groundBody = _world.createBody(bxGroundBodyDef); // Register our contact listener // Define the ground box shape. 
PolygonShape groundBox = new PolygonShape(); Vector2 bottomLeft = new Vector2(0f,0f); Vector2 topLeft = new Vector2(0f,scaledHeight); Vector2 topRight = new Vector2(scaledWidth,scaledHeight); Vector2 bottomRight = new Vector2(scaledWidth,0f); // bottom groundBox.setAsEdge(bottomLeft, bottomRight); groundBody.createFixture(groundBox,0); // top groundBox.setAsEdge(topLeft, topRight); groundBody.createFixture(groundBox,0); // left groundBox.setAsEdge(topLeft, bottomLeft); groundBody.createFixture(groundBox,0); // right groundBox.setAsEdge(topRight, bottomRight); groundBody.createFixture(groundBox,0); CCSprite floorbg = CCSprite.sprite("grassbehind.png"); addChild(floorbg,1); floorbg.setAnchorPoint(0,0); floorbg.setPosition(0,0); CCSprite floorfront = CCSprite.sprite("grassfront.png"); floorfront.setTag(2); this.addBoxBodyForSprite(floorfront); addChild(floorfront,3); floorfront.setAnchorPoint(0,0); floorfront.setPosition(0,0); addChild(danceSheet); //CCSprite monkey = CCSprite.sprite(danceSheet, CGRect.make(0, 0, 48, 73)); //addChild(danceSprite); monkey = CCSprite.sprite("arms_up.png"); monkey.setTag(2); monkey.setPosition(200,100); BodyDef spriteBodyDef = new BodyDef(); spriteBodyDef.type = BodyType.DynamicBody; spriteBodyDef.bullet=true; spriteBodyDef.position.set(200 / PTM_RATIO, 300 / PTM_RATIO); monkey_body = _world.createBody(spriteBodyDef); monkey_body.setUserData(monkey); PolygonShape spriteShape = new PolygonShape(); spriteShape.setAsBox(monkey.getContentSize().width/PTM_RATIO/2, monkey.getContentSize().height/PTM_RATIO/2); FixtureDef spriteShapeDef = new FixtureDef(); spriteShapeDef.shape = spriteShape; spriteShapeDef.density = 2.0f; spriteShapeDef.friction = 0.70f; spriteShapeDef.restitution = 0.0f; monkey_body.createFixture(spriteShapeDef); //Vector2 force = new Vector2(10, 10); //monkey_body.applyLinearImpulse(force, spriteBodyDef.position); addChild(monkey,10000); this.schedule(tickCallback); this.schedule(createobjects, 2.0f); objectHint = CCColorLayer.node(ccColor4B.ccc4(255,0,0,128), 200f, 100f); addChild(objectHint, 15000); objectHint.setVisible(false); _world.setContactListener(_contactListener); } private UpdateCallback tickCallback = new UpdateCallback() { public void update(float d) { tick(d); } }; private UpdateCallback createobjects = new UpdateCallback() { public void update(float d) { secondUpdate(d); } }; private void secondUpdate(float dt) { this.addNewSprite(); } public void addBoxBodyForSprite(CCSprite sprite) { BodyDef spriteBodyDef = new BodyDef(); spriteBodyDef.type = BodyType.StaticBody; //spriteBodyDef.bullet=true; spriteBodyDef.position.set(sprite.getPosition().x / PTM_RATIO, sprite.getPosition().y / PTM_RATIO); spriteBody = _world.createBody(spriteBodyDef); spriteBody.setUserData(sprite); Vector2 verts[] = { new Vector2(-11.8f / PTM_RATIO, -24.5f / PTM_RATIO), new Vector2(11.7f / PTM_RATIO, -24.0f / PTM_RATIO), new Vector2(29.2f / PTM_RATIO, -14.0f / PTM_RATIO), new Vector2(28.7f / PTM_RATIO, -0.7f / PTM_RATIO), new Vector2(8.0f / PTM_RATIO, 18.2f / PTM_RATIO), new Vector2(-29.0f / PTM_RATIO, 18.7f / PTM_RATIO), new Vector2(-26.3f / PTM_RATIO, -12.2f / PTM_RATIO) }; PolygonShape spriteShape = new PolygonShape(); spriteShape.set(verts); //spriteShape.setAsBox(sprite.getContentSize().width/PTM_RATIO/2, //sprite.getContentSize().height/PTM_RATIO/2); FixtureDef spriteShapeDef = new FixtureDef(); spriteShapeDef.shape = spriteShape; spriteShapeDef.density = 2.0f; spriteShapeDef.friction = 0.70f; spriteShapeDef.restitution = 0.0f; spriteShapeDef.isSensor=true; 
spriteBody.createFixture(spriteShapeDef); } public void addNewSprite() { count=0; Random rand = new Random(); int Number = rand.nextInt(10); switch(Number) { case 0: _block = CCSprite.sprite("banana.png"); break; case 1: _block = CCSprite.sprite("backpack.png");break; case 2: _block = CCSprite.sprite("statue.png");break; case 3: _block = CCSprite.sprite("pineapple.png");break; case 4: _block = CCSprite.sprite("bananabunch.png");break; case 5: _block = CCSprite.sprite("hat.png");break; case 6: _block = CCSprite.sprite("canteen.png");break; case 7: _block = CCSprite.sprite("banana.png");break; case 8: _block = CCSprite.sprite("statue.png");break; case 9: _block = CCSprite.sprite("hat.png");break; } int padding=20; //_block.setPosition(CGPoint.make(100, 100)); // Determine where to spawn the target along the Y axis CGSize winSize = CCDirector.sharedDirector().displaySize(); int minY = (int)(_block.getContentSize().width / 2.0f); int maxY = (int)(winSize.width - _block.getContentSize().width / 2.0f); int rangeY = maxY - minY; int actualY = rand.nextInt(rangeY) + minY; // Create block and add it to the layer float xOffset = padding+_block.getContentSize().width/2+((_block.getContentSize().width+padding)*count); _block.setPosition(CGPoint.make(actualY, 750)); _block.setTag(1); float w = _block.getContentSize().width; objectHint.setVisible(true); objectHint.changeWidth(w); objectHint.setPosition(actualY-w/2, 460); this.addChild(_block,10000); // Create ball body and shape BodyDef ballBodyDef1 = new BodyDef(); ballBodyDef1.type = BodyType.DynamicBody; ballBodyDef1.position.set(actualY/PTM_RATIO, 480/PTM_RATIO); bodies = _world.createBody(ballBodyDef1); bodies.setUserData(_block); PolygonShape circle1 = new PolygonShape(); Vector2 verts[] = { new Vector2(-11.8f / PTM_RATIO, -24.5f / PTM_RATIO), new Vector2(11.7f / PTM_RATIO, -24.0f / PTM_RATIO), new Vector2(29.2f / PTM_RATIO, -14.0f / PTM_RATIO), new Vector2(28.7f / PTM_RATIO, -0.7f / PTM_RATIO), new Vector2(8.0f / PTM_RATIO, 18.2f / PTM_RATIO), new Vector2(-29.0f / PTM_RATIO, 18.7f / PTM_RATIO), new Vector2(-26.3f / PTM_RATIO, -12.2f / PTM_RATIO) }; circle1.set(verts); FixtureDef ballShapeDef1 = new FixtureDef(); ballShapeDef1.shape = circle1; ballShapeDef1.density = 10.0f; ballShapeDef1.friction = 0.0f; ballShapeDef1.restitution = 0.1f; bodies.createFixture(ballShapeDef1); count++; //Remove(); } @Override public void ccAccelerometerChanged(float accelX, float accelY, float accelZ) { //Apply the directional impulse /*float impulse = monkey_body.getMass()*accelY*WALK_FACTOR; Vector2 force = new Vector2(impulse, 0); monkey_body.applyLinearImpulse(force, monkey_body.getWorldCenter());*/ walk(accelY); //Remove(); } private void walk(float accelY) { // TODO Auto-generated method stub direction = accelY; } private void Remove() { for (Iterator<MyContact> it1 = _contactListener.mContacts.iterator(); it1.hasNext();) { MyContact contact = it1.next(); Body bodyA = contact.fixtureA.getBody(); Body bodyB = contact.fixtureB.getBody(); // See if there's any user data attached to the Box2D body // There should be, since we set it in addBoxBodyForSprite if (bodyA.getUserData() != null && bodyB.getUserData() != null) { CCSprite spriteA = (CCSprite) bodyA.getUserData(); CCSprite spriteB = (CCSprite) bodyB.getUserData(); // Is sprite A a cat and sprite B a car? If so, push the cat // on a list to be destroyed... 
if (spriteA.getTag() == 1 && spriteB.getTag() == 2) { //Log.v("dsfds", "dsfsd"+bodyA); //_world.destroyBody(bodyA); // removeChild(spriteA, true); toDestroy.add(bodyA); } // Is sprite A a car and sprite B a cat? If so, push the cat // on a list to be destroyed... else if (spriteA.getTag() == 2 && spriteB.getTag() == 1) { //Log.v("dsfds", "dsfsd"+bodyB); toDestroy.add(bodyB); } } } // Loop through all of the box2d bodies we want to destroy... for (Iterator<Body> it1 = toDestroy.iterator(); it1.hasNext();) { Body body = it1.next(); // See if there's any user data attached to the Box2D body // There should be, since we set it in addBoxBodyForSprite if (body.getUserData() != null) { // We know that the user data is a sprite since we set // it that way, so cast it... CCSprite sprite = (CCSprite) body.getUserData(); // Remove the sprite from the scene _world.destroyBody(body); removeChild(sprite, true); } // Destroy the Box2D body as well // _contactListener.mContacts.remove(0); } } public synchronized void tick(float delta) { synchronized (_world) { _world.step(delta, 8, 3); //_world.clearForces(); //addNewSprite(); } CCAnimation danceAnimation = CCAnimation.animation("dance", 1.0f); // Iterate over the bodies in the physics world Iterator<Body> it = _world.getBodies(); while(it.hasNext()) { Body b = it.next(); Object userData = b.getUserData(); if (userData != null && userData instanceof CCSprite) { //Synchronize the Sprites position and rotation with the corresponding body CCSprite sprite = (CCSprite)userData; if(sprite.getTag()==1) { //b.applyLinearImpulse(force, pos); sprite.setPosition(b.getPosition().x * PTM_RATIO, b.getPosition().y * PTM_RATIO); sprite.setRotation(-1.0f * ccMacros.CC_RADIANS_TO_DEGREES(b.getAngle())); } else { //Apply the directional impulse float impulse = monkey_body.getMass()*direction*WALK_FACTOR; Vector2 force = new Vector2(impulse, 0); b.applyLinearImpulse(force, b.getWorldCenter()); sprite.setPosition(b.getPosition().x * PTM_RATIO, b.getPosition().y * PTM_RATIO); animDelay -= 1.0f/60.0f; if(animDelay <= 0) { animDelay = ANIM_SPEED; animPhase++; if(animPhase > 2) { animPhase = 1; } } if(direction < 0 ) { isLeft=1; } else { isLeft=0; } if(isLeft==1) { dir = "left"; } else { dir = "right"; } float standingLimit = (float) 0.1f; float vX = monkey_body.getLinearVelocity().x; if((vX > -standingLimit)&& (vX < standingLimit)) { // Log.v("sasd", "standing"); } else { } } } } Remove(); } } Sorry for my english. Thanks in advance.
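    For reference, the usual fix for this kind of crash is to never call destroyBody() from inside a contact callback or while the world is stepping, and to clear the destruction queue (and the saved contact list) after flushing it so no body is ever destroyed twice. The code in the question uses the Java/Android port of Box2D, but the pattern is the same in any binding; below is a minimal C++ sketch against the native Box2D API, with the tag/predicate logic invented purely for illustration.

        #include <Box2D/Box2D.h>   // older 2.x header path; Box2D 2.4+ uses <box2d/box2d.h>
        #include <set>

        // Contact listener that only records which bodies should go away.
        // Destroying a body inside BeginContact (or anywhere during b2World::Step)
        // corrupts the solver's internal lists, which is the usual cause of this crash.
        class DeferredDestroyListener : public b2ContactListener {
        public:
            std::set<b2Body*> pending;                     // a set queues each body at most once

            void BeginContact(b2Contact* contact) override {
                b2Body* a = contact->GetFixtureA()->GetBody();
                b2Body* b = contact->GetFixtureB()->GetBody();
                // The question tags falling props as 1 and the monkey as 2; here a
                // placeholder predicate stands in for that game-specific check.
                if (ShouldDestroy(a)) pending.insert(a);
                if (ShouldDestroy(b)) pending.insert(b);
            }

        private:
            static bool ShouldDestroy(b2Body* body) {
                return body->GetType() == b2_dynamicBody;  // illustrative rule only
            }
        };

        // Per-frame tick: step first, then flush the queue while the world is unlocked.
        void Tick(b2World& world, DeferredDestroyListener& listener, float dt) {
            world.Step(dt, 8, 3);
            for (b2Body* body : listener.pending) {
                // Also detach/remove the sprite stored in the body's user data here.
                world.DestroyBody(body);
            }
            listener.pending.clear();                      // critical: never destroy a body twice
        }

    One detail worth checking in the code above: Remove() iterates a toDestroy list and a saved contact list that are never cleared, so a body destroyed on one frame can be handed to destroyBody() again on a later frame, which is one likely source of the crash even when destruction happens outside the step.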


  • CodePlex Daily Summary for Sunday, November 28, 2010

    CodePlex Daily Summary for Sunday, November 28, 2010Popular ReleasesMDownloader: MDownloader-0.15.25.7002: Fixed updater Fixed FileServe Fixed LetItBitNotepad.NET: Notepad.NET 0.7 Preview 1: Whats New?* Optimized Code Generation: Which means it will run significantly faster. * Preview of Syntax Highlighting: Only VB.NET highlighting is supported, C# and Ruby will come in Preview 2. * Improved Editing Updates (when the line number, etc updates) to be more graceful. * Recent Documents works! * Images can be inserted but they're extremely large. Known Bugs* The Update Process hangs: This is a bug apparently spawning since 0.5. It will be fixed in Preview 2. Until then, perform a fr...Flickr Schedulr: Schedulr v2.2.3984: This is v2.2 of Flickr Schedulr, a Windows desktop application that automatically uploads pictures and videos to Flickr based on a schedule (e.g. to post a new picture every day at a certain time). What's new in this release? Videos are now supported as well. The application can now automatically add pictures to the queue when it starts up, from specific monitored folders. Sets and Groups can now be displayed as text only (without icons) to speed up the application. Fixed issue that ta...Cropper: 1.9.4: Mostly fixes for issues with a few feature requests. Fixed Issues 2730 & 3638 & 14467 11044 11447 11448 11449 14665 Implemented Features 6123 11581PFC: PFC for PB 11.5: This is just a migration from the 11.0 code. No changes have been made yet (and they are needed) for it to work properly with 11.5.PDF Rider: PDF Rider 0.5: This release does not add any new feature for pdf manipulation, but enables automatic updates checking, so it is reccomended to install it in order to stay updated with next releases. Prerequisites * Microsoft Windows Operating Systems (XP - Vista - 7) * Microsoft .NET Framework 3.5 runtime * A PDF rendering software (i.e. Adobe Reader) that can be opened inside Internet Explorer. Installation instructionsChoose one of the following methods: 1. Download and run the "pdfRider0...XamlQuery/WPF - The Write Less, Do More, WPF Library: XamlQuery-WPF v1.2 (Runtime, Source): This is the first release of popular XamlQuery library for WPF. XamlQuery has already gained recognition among Silverlight developers.Math.NET Numerics: Beta 1: First beta of Math.NET Numerics. Only contains the managed linear algebra provider. Beta 2 will include the native linear algebra providers along with better documentation and examples.Minecraft GPS: Minecraft GPS 1.1: 1.1 Release New Features Compass! New style. Set opacity on main window to allow overlay of Minecraft.Microsoft All-In-One Code Framework: Visual Studio 2010 Code Samples 2010-11-25: Code samples for Visual Studio 2010Wii Backup Fusion: Wii Backup Fusion 0.8.5 Beta: - WBFS repair (default) options fixed - Transfer to image fixed - Settings ui widget names fixed - Some little bug fixes You need to reset the settings! Delete WiiBaFu's config file or registry entries on windows: Linux: ~/.config/WiiBaFu/wiibafu.conf Windows: HKEY_CURRENT_USER\Software\WiiBaFu\wiibafu Mac OS X: ~/Library/Preferences/com.wiibafu.wiibafu.plist Caution: This is a BETA version! Errors, crashes and data loss not impossible! Use in test environments only, not on productive syste...Minemapper: Minemapper v0.1.3: Added process count and world size calculation progress to the status bar. Added View->'Status Bar' menu item to show/hide the status bar. Status bar is automatically shown when loading a world. 
Added a prompt, when loading a world, to use or clear cached images.Sexy Select: sexy select v0.4: Changes in v0.4 Added method : elements. This returns all the option elements that are currently added to the select list Added method : selectOption. This method accepts two values, the element to be modified and the selected state. (true/false)Deep Zoom for WPF: First Release: This first release of the Deep Zoom control has the same source code, binaries and demos as the CodeProject article (http://www.codeproject.com/KB/WPF/DeepZoom.aspx).BlogEngine.NET: BlogEngine.NET 2.0 RC: This is a Release Candidate version for BlogEngine.NET 2.0. The most current, stable version of BlogEngine.NET is version 1.6. Find out more about the BlogEngine.NET 2.0 RC here. If you want to extend or modify BlogEngine.NET, you should download the source code. To get started, be sure to check out our installation documentation and the installation screencast. If you are upgrading from a previous version, please take a look at the Upgrading to BlogEngine.NET 2.0 instructions. As this ...NodeXL: Network Overview, Discovery and Exploration for Excel: NodeXL Excel Template, version 1.0.1.156: The NodeXL Excel template displays a network graph using edge and vertex lists stored in an Excel 2007 or Excel 2010 workbook. What's NewThis release adds a feature for aggregating the overall metrics in a folder full of NodeXL workbooks, adds geographical coordinates to the Twitter import features, and fixes a memory-related bug. See the Complete NodeXL Release History for details. Please Note: There is a new option in the setup program to install for "Just Me" or "Everyone." Most people...VFPX: FoxBarcode v.0.11: FoxBarcode v.0.11 - Released 2010.11.22 FoxBarcode is a 100% Visual FoxPro class that provides a tool for generating images with different bar code symbologies to be used in VFP forms and reports, or exported to other applications. Its use and distribution is free for all Visual FoxPro Community. Whats is new? Added a third parameter to the BarcodeImage() method Fixed some minor bugs History FoxBarcode v.0.10 - Released 2010.11.19 - 85 Downloads Project page: FoxBarcodeDotNetAge -a lightweight Mvc jQuery CMS: DotNetAge 1.1.0.5: What is new in DotNetAge 1.1.0.5 ?Document Library features and template added. Resolve issues of templates Improving publishing service performance Opml support added. What is new in DotNetAge 1.1 ? D.N.A Core updatesImprove runtime performance , more stabilize. The DNA core objects model added. Personalization features added that allows users create the personal website, manage their resources, store personal data DynamicUIFixed the PageManager could not move page node bug. ...ASP.NET MVC Project Awesome (jQuery Ajax helpers): 1.3.1 and demos: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form and Pager tested on mozilla, safari, chrome, opera, ie 9b/8/7/6DotSpatial: DotSpatial 11-21-2010: This release introduces the following Fixed bugs related to dispose, which caused issues when reordering layers in the legend Fixed bugs related to assigning categories where NULL values are in the fields New fast-acting resize using a bitmap "prediction" of what the final resize content will look like. ImageData.ReadBlock, ImageData.WriteBlock These allow direct file access for reading or writing a rectangular window. 
Bitmaps are used for holding the values. Removed the need to stor...New ProjectsAC: Just a simple AC ProjectAvailability Manager Project (CMPE290): Availability Manager Project (CMPE290)BCLExtensions: BCLExtensions is an extension method library for .NET 3.5 or higher which offers various extension methods for classes in the .NET Base Class Library (BCL)Command Browser 2.0 for Visual Studio: The Command Browser is a visual studio add-in that Lets you browse, and execute Visual Studio commands on the fly. It’s very handy for people that don’t tend to use bindings on daily basis and simply wants to learn new commands out of interest.EDM And T4: Generate a WCF Service from your edmxEmperor of the Multiverse: Szoftverarchitekturak rocks :)KINECT CALL: Kinect CallMultiSafepay for nopCommerce: This is MultiSafepay shop plugin voor nopCommerce. The current version supported is nopCommerce 1.60.MyGadgebydeleted: MyGadgebydeletedNHashcash: NHashcash is a .NET implementation of the Hashcash mechanism for paying for e-mail with burnt CPU cycles.OhYeah!: WP7 Note Demo ApplicationOMax - Online Market and Exchange for Real Estate: OMax Project - Online Real Estate Management System OMax stands for Online Market and eXchange and is a Real Estate management system focusing on both individuals (brokers, agents) and organizations (real estate companies) looking for online property management solutions.Opus - Silverlight UI Framework: The opus framework provides a simple way for creating UIs for Silverlight by leveraging your current MVVM structure. Outliner, the edge detection utility: Outliner is a vectorizer of the edges in the raster pictures. The new edge detection algorithm is used. It is developed in Visual C++ 2010 Express Edition, using WinAPI and STL.SharpDropBox Client for .NET: SharpDropBox Client is a DropBox client for (at first) WP7. It caches files/directories in ISO storage, and also uses Sterling DB to metadata info so that it can easily do hash checks against DropBox making it download only when there are changes. It can also be used offline.SicunSimpleDiary: ???????(????)Silverlight Navigation With Caliburn.Micro: Bring the power of the Cabliurn.Micro MVVM framework to Siliverlight 4 navigation applications. This sample application for CM shows how it can be used to create MVVM friendly Navigation applications which formerly required a view-first approach. SilverTrace - a tracer and logger for Silverlight applications: SilverTrace - a tracer and logger for Silverlight applications.StackDeck: StackDeck is a StackOverflow analysis tool written in WPFSVNHelper -- a small clean tool for visual studio: A small tool that used to clean your bin direcotry and temporary directory for the whole solutions. I usually use it before commit soure to a SVN Server. System.Xml.Serialization: Expressions.Compiler is a lightweight xml serialization framework. It is an addition to the standart .NET XML serialization. It is designed to support out-of-the-box features: * interface-driven Xml serialization * class-driven Xml serialization * runtime serializationVoXign: - Proposer une formation introductive à la langue des signes, en e-Learning. - autonomie de gestion de l’emploi du temps, du rythme - apprentissage d’un lexique de « première nécessité » - introduction à la culture sourde et approche des spécificités linguistiques de la LSF Weyland: A .NET metaprogramming framework.


  • rsync over ssh is not working anymore, while ssh itself is working fine (Write failed: broken pipe)

    - by brazorf
    This issue started happening after i changed router. This is the scenario: Windows7 Host Ubuntu 10.04 Guest (VirtualBox) Ubuntu 10.04 remote server What i used to do is run a very basic rsync command: rsync -avz --delete /local/path/ username@host:/path/to/remote/directory This worked perfect until i did change adsl provider, and i changed router aswell: now, this happens: rsync on Ubuntu Guest is not working anymore (to any random server), if using this new router rsync on Ubuntu Guest is WORKING, if i switch back to old router i tried a new virtual box ubuntu install, and the command is WORKING with both the routers So, the not-working-combo is oldUbuntu + newRouter. To get things worst, i can state that (on the not-working ubuntu) i ping the remote host plain ssh connection to the remote host is working fine (i can auth, connect, and do stuff on the remote host) scp is NOT working (this is just a further thing i tried) This is the console output of the execution, with ssh verbose set to vvvv: root@client:~# rsync -ae 'ssh -vvvv' /root/test-rsync/ {username}@{hostname}:/home/{username}/test/ OpenSSH_5.3p1 Debian-3ubuntu7, OpenSSL 0.9.8k 25 Mar 2009 debug1: Reading configuration data /root/.ssh/config debug1: Applying options for {hostname} debug1: Reading configuration data /etc/ssh/ssh_config debug1: Applying options for * debug2: ssh_connect: needpriv 0 debug1: Connecting to {hostname} [{ip.add.re.ss}] port 22. debug1: Connection established. debug1: permanently_set_uid: 0/0 debug3: Not a RSA1 key file /root/.ssh/{private_key}. debug2: key_type_from_name: unknown key type '-----BEGIN' debug3: key_read: missing keytype debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug3: key_read: missing whitespace debug2: key_type_from_name: unknown key type '-----END' debug3: key_read: missing keytype debug1: identity file /root/.ssh/{private_key} type 1 debug1: Checking blacklist file /usr/share/ssh/blacklist.RSA-2048 debug1: Checking blacklist file /etc/ssh/blacklist.RSA-2048 debug1: Remote protocol version 2.0, remote software version OpenSSH_5.3p1 Debian-3ubuntu7 debug1: match: OpenSSH_5.3p1 Debian-3ubuntu7 pat OpenSSH* debug1: Enabling compatibility mode for protocol 2.0 debug1: Local version string SSH-2.0-OpenSSH_5.3p1 Debian-3ubuntu7 debug2: fd 3 setting O_NONBLOCK debug1: SSH2_MSG_KEXINIT sent debug3: Wrote 792 bytes for a total of 831 debug1: SSH2_MSG_KEXINIT received debug2: kex_parse_kexinit: diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1 debug2: kex_parse_kexinit: ssh-rsa,ssh-dss debug2: kex_parse_kexinit: 
aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,[email protected] debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,[email protected] debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,[email protected],hmac-ripemd160,[email protected],hmac-sha1-96,hmac-md5-96 debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,[email protected],hmac-ripemd160,[email protected],hmac-sha1-96,hmac-md5-96 debug2: kex_parse_kexinit: [email protected],zlib,none debug2: kex_parse_kexinit: [email protected],zlib,none debug2: kex_parse_kexinit: debug2: kex_parse_kexinit: debug2: kex_parse_kexinit: first_kex_follows 0 debug2: kex_parse_kexinit: reserved 0 debug2: kex_parse_kexinit: diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1 debug2: kex_parse_kexinit: ssh-rsa,ssh-dss debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,[email protected] debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,[email protected] debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,[email protected],hmac-ripemd160,[email protected],hmac-sha1-96,hmac-md5-96 debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,[email protected],hmac-ripemd160,[email protected],hmac-sha1-96,hmac-md5-96 debug2: kex_parse_kexinit: none,[email protected] debug2: kex_parse_kexinit: none,[email protected] debug2: kex_parse_kexinit: debug2: kex_parse_kexinit: debug2: kex_parse_kexinit: first_kex_follows 0 debug2: kex_parse_kexinit: reserved 0 debug2: mac_setup: found hmac-md5 debug1: kex: server->client aes128-ctr hmac-md5 [email protected] debug2: mac_setup: found hmac-md5 debug1: kex: client->server aes128-ctr hmac-md5 [email protected] debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP debug3: Wrote 24 bytes for a total of 855 debug2: dh_gen_key: priv key bits set: 125/256 debug2: bits set: 525/1024 debug1: SSH2_MSG_KEX_DH_GEX_INIT sent debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY debug3: Wrote 144 bytes for a total of 999 debug3: check_host_in_hostfile: filename /root/.ssh/known_hosts debug3: check_host_in_hostfile: match line 4 debug3: check_host_in_hostfile: filename /root/.ssh/known_hosts debug3: check_host_in_hostfile: match line 5 debug1: Host '{hostname}' is known and matches the RSA host key. 
debug1: Found key in /root/.ssh/known_hosts:4 debug2: bits set: 512/1024 debug1: ssh_rsa_verify: signature correct debug2: kex_derive_keys debug2: set_newkeys: mode 1 debug1: SSH2_MSG_NEWKEYS sent debug1: expecting SSH2_MSG_NEWKEYS debug3: Wrote 16 bytes for a total of 1015 debug2: set_newkeys: mode 0 debug1: SSH2_MSG_NEWKEYS received debug1: SSH2_MSG_SERVICE_REQUEST sent debug3: Wrote 48 bytes for a total of 1063 debug2: service_accept: ssh-userauth debug1: SSH2_MSG_SERVICE_ACCEPT received debug2: key: /root/.ssh/{private_key} (0x7f3ad0e7f9b0) debug3: Wrote 80 bytes for a total of 1143 debug1: Authentications that can continue: publickey,password debug3: start over, passed a different list publickey,password debug3: preferred gssapi-keyex,gssapi-with-mic,gssapi,publickey,keyboard-interactive,password debug3: authmethod_lookup publickey debug3: remaining preferred: keyboard-interactive,password debug3: authmethod_is_enabled publickey debug1: Next authentication method: publickey debug1: Offering public key: /root/.ssh/{private_key} debug3: send_pubkey_test debug2: we sent a publickey packet, wait for reply debug3: Wrote 368 bytes for a total of 1511 debug1: Server accepts key: pkalg ssh-rsa blen 277 debug2: input_userauth_pk_ok: fp 1b:65:36:92:59:b3:12:3e:8c:c6:03:28:d4:81:09:dc debug3: sign_and_send_pubkey debug1: read PEM private key done: type RSA debug3: Wrote 656 bytes for a total of 2167 debug1: Enabling compression at level 6. debug1: Authentication succeeded (publickey). debug2: fd 4 setting O_NONBLOCK debug3: fd 5 is O_NONBLOCK debug1: channel 0: new [client-session] debug3: ssh_session2_open: channel_new: 0 debug2: channel 0: send open debug1: Requesting [email protected] debug1: Entering interactive session. debug3: Wrote 112 bytes for a total of 2279 debug2: callback start debug2: client_session2_setup: id 0 debug1: Sending environment. debug3: Ignored env TERM debug3: Ignored env SHELL debug3: Ignored env SSH_CLIENT debug3: Ignored env SSH_TTY debug1: Sending env LC_ALL = en_US.UTF-8 debug2: channel 0: request env confirm 0 debug3: Ignored env USER debug3: Ignored env LS_COLORS debug3: Ignored env MAIL debug3: Ignored env PATH debug3: Ignored env PWD debug1: Sending env LANG = en_US.UTF-8 debug2: channel 0: request env confirm 0 debug3: Ignored env SHLVL debug3: Ignored env HOME debug3: Ignored env LANGUAGE debug3: Ignored env LOGNAME debug3: Ignored env SSH_CONNECTION debug3: Ignored env LESSOPEN debug3: Ignored env LESSCLOSE debug3: Ignored env _ debug1: Sending command: rsync --server -logDtpre.iLsf . /home/{username}/test/ debug2: channel 0: request exec confirm 1 debug2: fd 3 setting TCP_NODELAY debug2: callback done debug2: channel 0: open confirm rwindow 0 rmax 32768 debug3: Wrote 208 bytes for a total of 2487 At this point everything freeze for lots of minutes, ending in Write failed: Broken pipe rsync: connection unexpectedly closed (0 bytes received so far) [sender] rsync error: unexplained error (code 255) at io.c(601) [sender=3.0.7] Any suggestion? Thank You F. Edit 2012/09/13: i am changing title and issue definition, since i made some TINY step ahead and i think i can give more detailed clues.


  • Problems with SAT Collision Detection

    - by DJ AzKai
    I'm doing a project in one of my modules for college in C++ with SFML and I was hoping someone may be able to help me. I'm using a vector of squares and triangles and I am using the SAT collision detection method to see if objects collide and to make the objects respond to the collision appropriately using the MTV(minimum translation vector) Below is my code: //from the main method int main(){ // Create the main window sf::RenderWindow App(sf::VideoMode(800, 600, 32), "SFML OpenGL"); // Create a clock for measuring time elapsed sf::Clock Clock; srand(time(0)); //prepare OpenGL surface for HSR glClearDepth(1.f); glClearColor(0.3f, 0.3f, 0.3f, 0.f); //background colour glEnable(GL_DEPTH_TEST); glDepthMask(GL_TRUE); //// Setup a perspective projection & Camera position glMatrixMode(GL_PROJECTION); glLoadIdentity(); //set up a 3D Perspective View volume //gluPerspective(90.f, 1.f, 1.f, 300.0f);//fov, aspect, zNear, zFar //set up a orthographic projection same size as window //this mease the vertex coordinates are in pixel space glOrtho(0,800,0,600,0,1); // use pixel coordinates // Finally, display rendered frame on screen vector<BouncingThing*> triangles; for(int i = 0; i < 10; i++) { //instantiate each triangle; triangles.push_back(new BouncingTriangle(Vector2f(rand() % 700, rand() % 500), 3)); } vector<BouncingThing*> boxes; for(int i = 0; i < 10; i++) { //instantiate each box; boxes.push_back(new BouncingBox(Vector2f(rand() % 700, rand() % 500), 4)); } CollisionDetection * b = new CollisionDetection(); // Start game loop while (App.isOpen()) { // Process events sf::Event Event; while (App.pollEvent(Event)) { // Close window : exit if (Event.type == sf::Event::Closed) App.close(); // Escape key : exit if ((Event.type == sf::Event::KeyPressed) && (Event.key.code == sf::Keyboard::Escape)) App.close(); } //Prepare for drawing // Clear color and depth buffer glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Apply some transformations glMatrixMode(GL_MODELVIEW); glLoadIdentity(); for(int i = 0; i < 10; i++) { triangles[i]->draw(); boxes[i]->draw(); triangles[i]->update(Vector2f(800,600)); boxes[i]->draw(); boxes[i]->update(Vector2f(800,600)); } for(int j = 0; j < 10; j++) { for(int i = 0; i < 10; i++) { triangles[j]->setCollision(b->CheckCollision(*(triangles[j]),*(boxes[i]))); } } for(int j = 0; j < 10; j++) { for(int i = 0; i < 10; i++) { boxes[j]->setCollision(b->CheckCollision(*(boxes[j]),*(triangles[i]))); } } for(int i = 0; i < triangles.size(); i++) { for(int j = i + 1; j < triangles.size(); j ++) { triangles[j]->setCollision(b->CheckCollision(*(triangles[j]),*(triangles[i]))); } } for(int i = 0; i < triangles.size(); i++) { for(int j = i + 1; j < triangles.size(); j ++) { boxes[j]->setCollision(b->CheckCollision(*(boxes[j]),*(boxes[i]))); } } App.display(); } return EXIT_SUCCESS; } (ignore this line) //from the BouncingThing.cpp BouncingThing::BouncingThing(Vector2f position, int noSides) : pos(position), pi(3.14), radius(3.14), nSides(noSides) { collided = false; if(nSides ==3) { Vector2f vert1 = Vector2f(-12.0f,-12.0f); Vector2f vert2 = Vector2f(0.0f, 12.0f); Vector2f vert3 = Vector2f(12.0f,-12.0f); verts.push_back(vert1); verts.push_back(vert2); verts.push_back(vert3); } else if(nSides == 4) { Vector2f vert1 = Vector2f(-12.0f,12.0f); Vector2f vert2 = Vector2f(12.0f, 12.0f); Vector2f vert3 = Vector2f(12.0f,-12.0f); Vector2f vert4 = Vector2f(-12.0f, -12.0f); verts.push_back(vert1); verts.push_back(vert2); verts.push_back(vert3); verts.push_back(vert4); } velocity.x = ((rand() % 
5 + 1) / 3) + 1; velocity.y = ((rand() % 5 + 1) / 3 ) +1; } void BouncingThing::update(Vector2f screenSize) { Transform t; t.rotate(0); for(int i=0;i< verts.size(); i++) { verts[i]=t.transformPoint(verts[i]); } if(pos.x >= screenSize.x || pos.x <= 0) { velocity.x *= -1; } if(pos.y >= screenSize.y || pos.y <= 0) { velocity.y *= -1; } if(collided) { //velocity.x *= -1; //velocity.y *= -1; collided = false; } pos += velocity; } void BouncingThing::setCollision(bool x){ collided = x; } void BouncingThing::draw() { glBegin(GL_POLYGON); glColor3f(0,1,0); for(int i = 0; i < verts.size(); i++) { glVertex2f(pos.x + verts[i].x,pos.y + verts[i].y); } glEnd(); } vector<Vector2f> BouncingThing::getNormals() { vector<Vector2f> normalVerts; if(nSides == 3) { Vector2f ab = Vector2f((verts[1].x + pos.x) - (verts[0].x + pos.x), (verts[1].y + pos.y) - (verts[0].y + pos.y)); ab = flip(ab); ab.x *= -1; normalVerts.push_back(ab); Vector2f bc = Vector2f((verts[2].x + pos.x) - (verts[1].x + pos.x), (verts[2].y + pos.y) - (verts[1].y + pos.y)); bc = flip(bc); bc.x *= -1; normalVerts.push_back(bc); Vector2f ac = Vector2f((verts[2].x + pos.x) - (verts[0].x + pos.x), (verts[2].y + pos.y) - (verts[0].y + pos.y)); ac = flip(ac); ac.x *= -1; normalVerts.push_back(ac); return normalVerts; } if(nSides ==4) { Vector2f ab = Vector2f((verts[1].x + pos.x) - (verts[0].x + pos.x), (verts[1].y + pos.y) - (verts[0].y + pos.y)); ab = flip(ab); ab.x *= -1; normalVerts.push_back(ab); Vector2f bc = Vector2f((verts[2].x + pos.x) - (verts[1].x + pos.x), (verts[2].y + pos.y) - (verts[1].y + pos.y)); bc = flip(bc); bc.x *= -1; normalVerts.push_back(bc); return normalVerts; } } Vector2f BouncingThing::flip(Vector2f v){ float vyTemp = v.x; float vxTemp = v.y * -1; return Vector2f(vxTemp, vyTemp); } (Ignore this line) CollisionDetection::CollisionDetection() { } vector<float> CollisionDetection::bubbleSort(vector<float> w) { int temp; bool finished = false; while (!finished) { finished = true; for (int i = 0; i < w.size()-1; i++) { if (w[i] > w[i+1]) { temp = w[i]; w[i] = w[i+1]; w[i+1] = temp; finished=false; } } } return w; } class Vector{ public: //static int dp_count; static float dot(sf::Vector2f a,sf::Vector2f b){ //dp_count++; return a.x*b.x+a.y*b.y; } static float length(sf::Vector2f a){ return sqrt(a.x*a.x+a.y*a.y); } static Vector2f add(Vector2f a, Vector2f b) { return Vector2f(a.x + b.y, a.y + b.y); } static sf::Vector2f getNormal(sf::Vector2f a,sf::Vector2f b){ sf::Vector2f n; n=a-b; n/=Vector::length(n);//normalise float x=n.x; n.x=n.y; n.y=-x; return n; } }; bool CollisionDetection::CheckCollision(BouncingThing & x, BouncingThing & y) { vector<Vector2f> xVerts = x.getVerts(); vector<Vector2f> yVerts = y.getVerts(); vector<Vector2f> xNormals = x.getNormals(); vector<Vector2f> yNormals = y.getNormals(); int size; vector<float> xRange; vector<float> yRange; for(int j = 0; j < xNormals.size(); j++) { Vector p; for(int i = 0; i < xVerts.size(); i++) { xRange.push_back(p.dot(xNormals[j], Vector2f(xVerts[i].x, xVerts[i].x))); } for(int i = 0; i < yVerts.size(); i++) { yRange.push_back(p.dot(xNormals[j], Vector2f(yVerts[i].x , yVerts[i].y))); } yRange = bubbleSort(yRange); xRange = bubbleSort(xRange); if(xRange[xRange.size() - 1] < yRange[0] || yRange[yRange.size() - 1] < xRange[0]) { return false; } float x3 = Min(xRange[0], yRange[0]); float y3 = Max(xRange[xRange.size() - 1], yRange[yRange.size() - 1]); float length = Max(x3, y3) - Min(x3, y3); } for(int j = 0; j < yNormals.size(); j++) { Vector p; for(int i = 0; i < xVerts.size(); 
i++) { xRange.push_back(p.dot(yNormals[j], xVerts[i])); } for(int i = 0; i < yVerts.size(); i++) { yRange.push_back(p.dot(yNormals[j], yVerts[i])); } yRange = bubbleSort(yRange); xRange = bubbleSort(xRange); if(xRange[xRange.size() - 1] < yRange[0] || yRange[yRange.size() - 1] < xRange[0]) { return false; } } return true; } float CollisionDetection::Min(float min, float max) { if(max < min) { min = max; } else return min; } float CollisionDetection::Max(float min, float max) { if(min > max) { max = min; } else return min; } On the screen the objects will freeze for a small amount of time before moving off again. However, the problem is that when this happens there are no collisions actually happening, and I would really love to find out where the flaw is in the code. If you need any more information or code, please don't hesitate to ask and I'll reply as soon as possible. Regards, AzKai
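    For comparison, here is a compact, self-contained sketch of one SAT test in plain C++ (no SFML types; Vec2 and all function names are made up for the example). Three details differ from the code above and are worth checking there: the projection ranges are recomputed from scratch for every axis instead of accumulating in xRange/yRange across iterations, each projection uses both the x and y of a vertex (the first loop above projects Vector2f(xVerts[i].x, xVerts[i].x)), and the min/max helpers return a value on every path.

        #include <algorithm>
        #include <cmath>
        #include <limits>
        #include <vector>

        struct Vec2 { float x, y; };

        static float Dot(const Vec2& a, const Vec2& b) { return a.x * b.x + a.y * b.y; }

        // Unit-length edge normals of a convex polygon given as world-space vertices in order.
        static std::vector<Vec2> EdgeNormals(const std::vector<Vec2>& poly) {
            std::vector<Vec2> normals;
            for (std::size_t i = 0; i < poly.size(); ++i) {
                const Vec2& a = poly[i];
                const Vec2& b = poly[(i + 1) % poly.size()];
                Vec2 edge{b.x - a.x, b.y - a.y};
                float len = std::sqrt(edge.x * edge.x + edge.y * edge.y);
                normals.push_back({-edge.y / len, edge.x / len});   // perpendicular to the edge
            }
            return normals;
        }

        // Project a polygon onto an axis; the result is the [min, max] interval of the dot products.
        static void Project(const std::vector<Vec2>& poly, const Vec2& axis,
                            float& outMin, float& outMax) {
            outMin = outMax = Dot(poly[0], axis);
            for (std::size_t i = 1; i < poly.size(); ++i) {
                float p = Dot(poly[i], axis);
                outMin = std::min(outMin, p);
                outMax = std::max(outMax, p);
            }
        }

        // SAT: the polygons overlap iff their projections overlap on every candidate axis.
        // On overlap, mtv receives the axis scaled by the smallest overlap (its direction
        // still has to be chosen from the relative positions of the two shapes).
        static bool SatCollide(const std::vector<Vec2>& a, const std::vector<Vec2>& b, Vec2& mtv) {
            std::vector<Vec2> axes = EdgeNormals(a);
            const std::vector<Vec2> bAxes = EdgeNormals(b);
            axes.insert(axes.end(), bAxes.begin(), bAxes.end());

            float bestOverlap = std::numeric_limits<float>::max();
            for (const Vec2& axis : axes) {
                float minA, maxA, minB, maxB;                   // fresh ranges for every axis
                Project(a, axis, minA, maxA);
                Project(b, axis, minB, maxB);
                if (maxA < minB || maxB < minA) return false;   // separating axis found
                float overlap = std::min(maxA, maxB) - std::max(minA, minB);
                if (overlap < bestOverlap) {
                    bestOverlap = overlap;
                    mtv = {axis.x * overlap, axis.y * overlap};
                }
            }
            return true;
        }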


  • Why do ICMP Redirect Host messages happen?

    - by El Barto
    I'm setting up a Debian box as a router for 4 subnets. For that I have defined 4 virtual interfaces on the NIC where the LAN is connected (eth1). eth1 Link encap:Ethernet HWaddr 94:0c:6d:82:0d:98 inet addr:10.1.1.1 Bcast:10.1.1.255 Mask:255.255.255.0 inet6 addr: fe80::960c:6dff:fe82:d98/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:6026521 errors:0 dropped:0 overruns:0 frame:0 TX packets:35331299 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:673201397 (642.0 MiB) TX bytes:177276932 (169.0 MiB) Interrupt:19 Base address:0x6000 eth1:0 Link encap:Ethernet HWaddr 94:0c:6d:82:0d:98 inet addr:10.1.2.1 Bcast:10.1.2.255 Mask:255.255.255.0 UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 Interrupt:19 Base address:0x6000 eth1:1 Link encap:Ethernet HWaddr 94:0c:6d:82:0d:98 inet addr:10.1.3.1 Bcast:10.1.3.255 Mask:255.255.255.0 UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 Interrupt:19 Base address:0x6000 eth1:2 Link encap:Ethernet HWaddr 94:0c:6d:82:0d:98 inet addr:10.1.4.1 Bcast:10.1.4.255 Mask:255.255.255.0 UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 Interrupt:19 Base address:0x6000 eth2 Link encap:Ethernet HWaddr 6c:f0:49:a4:47:38 inet addr:192.168.1.10 Bcast:192.168.1.255 Mask:255.255.255.0 inet6 addr: fe80::6ef0:49ff:fea4:4738/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:199809345 errors:0 dropped:0 overruns:0 frame:0 TX packets:158362936 errors:0 dropped:0 overruns:0 carrier:1 collisions:0 txqueuelen:1000 RX bytes:3656983762 (3.4 GiB) TX bytes:1715848473 (1.5 GiB) Interrupt:27 eth3 Link encap:Ethernet HWaddr 94:0c:6d:82:c8:72 inet addr:192.168.2.5 Bcast:192.168.2.255 Mask:255.255.255.0 inet6 addr: fe80::960c:6dff:fe82:c872/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:110814 errors:0 dropped:0 overruns:0 frame:0 TX packets:73386 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:16044901 (15.3 MiB) TX bytes:42125647 (40.1 MiB) Interrupt:20 Base address:0x2000 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:22351 errors:0 dropped:0 overruns:0 frame:0 TX packets:22351 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:2625143 (2.5 MiB) TX bytes:2625143 (2.5 MiB) tun0 Link encap:UNSPEC HWaddr 00-00-00-00-00-00-00-00-00-00-00-00-00-00-00-00 inet addr:10.8.0.1 P-t-P:10.8.0.2 Mask:255.255.255.255 UP POINTOPOINT RUNNING NOARP MULTICAST MTU:1500 Metric:1 RX packets:41358924 errors:0 dropped:0 overruns:0 frame:0 TX packets:23116350 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:100 RX bytes:3065505744 (2.8 GiB) TX bytes:1324358330 (1.2 GiB) I have two other computers connected to this network. One has IP 10.1.1.12 (subnet mask 255.255.255.0) and the other one 10.1.2.20 (subnet mask 255.255.255.0). I want to be able to reach 10.1.1.12 from 10.1.2.20. Since packet forwarding is enabled in the router and the policy of the FORWARD chain is ACCEPT (and there are no other rules), I understand that there should be no problem to ping from 10.1.2.20 to 10.1.1.12 going through the router. 
However, this is what I get: $ ping -c15 10.1.1.12 PING 10.1.1.12 (10.1.1.12): 56 data bytes Request timeout for icmp_seq 0 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 81d4 0 0000 3f 01 e2b3 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 1 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 899b 0 0000 3f 01 daec 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 2 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 78fe 0 0000 3f 01 eb89 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 3 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 14b8 0 0000 3f 01 4fd0 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 4 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 8ef7 0 0000 3f 01 d590 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 5 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 ec9d 0 0000 3f 01 77ea 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 6 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 70e6 0 0000 3f 01 f3a1 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 7 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 b0d2 0 0000 3f 01 b3b5 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 8 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 f8b4 0 0000 3f 01 6bd3 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 9 Request timeout for icmp_seq 10 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 1c95 0 0000 3f 01 47f3 10.1.2.20 10.1.1.12 Request timeout for icmp_seq 11 Request timeout for icmp_seq 12 Request timeout for icmp_seq 13 92 bytes from router2.mydomain.com (10.1.2.1): Redirect Host(New addr: 10.1.1.12) Vr HL TOS Len ID Flg off TTL Pro cks Src Dst 4 5 00 0054 62bc 0 0000 3f 01 01cc 10.1.2.20 10.1.1.12 Why does this happen? From what I've read the Redirect Host response has something to do with the fact that the two hosts are in the same network and there being a shorter route (or so I understood). They are in fact in the same physical network, but why would there be a better route if they are not on the same subnet (they can't see each other)? What am I missing? 
    Some extra info you might want to see:

    # route -n
    Kernel IP routing table
    Destination     Gateway         Genmask         Flags Metric Ref Use Iface
    10.8.0.2        0.0.0.0         255.255.255.255 UH    0      0   0   tun0
    127.0.0.1       0.0.0.0         255.255.255.255 UH    0      0   0   lo
    192.168.2.0     0.0.0.0         255.255.255.0   U     0      0   0   eth3
    10.8.0.0        10.8.0.2        255.255.255.0   UG    0      0   0   tun0
    192.168.1.0     0.0.0.0         255.255.255.0   U     1      0   0   eth2
    10.1.4.0        0.0.0.0         255.255.255.0   U     0      0   0   eth1
    10.1.1.0        0.0.0.0         255.255.255.0   U     0      0   0   eth1
    10.1.2.0        0.0.0.0         255.255.255.0   U     0      0   0   eth1
    10.1.3.0        0.0.0.0         255.255.255.0   U     0      0   0   eth1
    0.0.0.0         192.168.1.1     0.0.0.0         UG    0      0   0   eth2
    0.0.0.0         192.168.2.1     0.0.0.0         UG    100    0   0   eth3

    # iptables -L -n
    Chain INPUT (policy ACCEPT)
    target prot opt source destination

    Chain FORWARD (policy ACCEPT)
    target prot opt source destination

    Chain OUTPUT (policy ACCEPT)
    target prot opt source destination

    # iptables -L -n -t nat
    Chain PREROUTING (policy ACCEPT)
    target prot opt source destination

    Chain POSTROUTING (policy ACCEPT)
    target prot opt source destination
    MASQUERADE all -- !10.0.0.0/8 10.0.0.0/8
    MASQUERADE all -- 10.0.0.0/8 !10.0.0.0/8

    Chain OUTPUT (policy ACCEPT)
    target prot opt source destination

    Read the article

  • CodePlex Daily Summary for Saturday, November 03, 2012

    CodePlex Daily Summary for Saturday, November 03, 2012Popular ReleasesZXMAK2: Version 2.6.8.2: fix save to SZX snapshot for +3 model fix +3 ULA timingsLaunchbar: Launchbar 4.2.2.0: This release is the first step in cleaning up the code and using all the latest features of .NET 4.5 Changes 4.2.2 (2012-11-02) Improved handling of left clicks 4.1.0 (2012-10-17) Removed tray icon Assembly renamed and signed with strong name Note When you upgrade, Launchbar will start with the default settings. You can import your previous settings by following these steps: Run Launchbar and just save the settings without configuring anything Shutdown Launchbar Go to the folder %LOCA...CommonLibrary.NET: CommonLibrary.NET 0.9.8.8: Releases notes for FluentScript located at http://fluentscript.codeplex.com/wikipage?title=Release%20Notes&referringTitle=Documentation Fluentscript - 0.9.8.8 - Final ReleaseApplication: FluentScript Version: 0.9.8.8 Build: 0.9.8.8 Changeset: 77368 ( CommonLibrary.NET ) Release date: November 2nd, 2012 Binaries: CommonLibrary.dll Namespace: ComLib.Lang Project site: http://fluentscript.codeplex.com/ Download: http://commonlibrarynet.codeplex.com/releases/view/90426 Source code: http://common...Mouse Jiggler: MouseJiggle-1.3: This adds the much-requested minimize-to-tray feature to Mouse Jiggler.Umbraco CMS: Umbraco 4.10.0 Release Candidate: This is a Release Candidate, which means that if we do not find any major issues in the next week, we will release this version as the final release of 4.10.0 on November 9th, 2012. The documentation for the MVC bits still lives in the Github version of the docs for now and will be updated on our.umbraco.org with the final release of 4.10.0. Browse the documentation here: https://github.com/umbraco/Umbraco4Docs/tree/4.8.0/Documentation/Reference/Mvc If you want to do only MVC then make sur...Skype Auto Recorder: SkypeAutoRecorder 1.3.4: New icon and images. Reworked settings window. Implemented high-quality sound encoding. Implemented a possibility to produce stereo records. Added buttons with system-wide hot keys for manual starting and canceling of recording. Added buttons for opening folder with records. Added Help button. Fixed an issue when recording is continuing after call end. Fixed an issue when recording doesn't start. Fixed several bugs and improved stability. Major refactoring and optimization...Access 2010 Application Platform - Build Your Own Database: Application Platform - 0.0.2: Release 0.0.2 Created two new users. One belongs to the Administrators group and the other to the Public group. User: admin Pass: admin User: guest Pass: guest Initial Release This is the first version of the database. At the moment is all contained in one file to make development easier, but the obvious idea would be to split it into Front and Back End for a production version of the tool. The features it contains at the moment are the "Core" features.Python Tools for Visual Studio: Python Tools for Visual Studio 1.5: We’re pleased to announce the release of Python Tools for Visual Studio 1.5 RTM. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, Edit/Intellisense/Debug/Profile, Cloud, HPC, IPython, etc. support. 
For a quick overview of the general IDE experience, please watch this video There are a number of exciting improvement in this release comp...BCF.Net: BCF.Net: BCF.Net-20121024 source codeAssaultCube Reloaded: 2.5.5: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we try to package for those OSes. Try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included) Changelog: Fixed potential bot bugs: Map change, OpenAL...Edi: Edi 1.0 with DarkExpression: Added DarkExpression theme (dialogs and message boxes are not completely themed, yet)DirectX Tool Kit: October 30, 2012 (add WP8 support): October 30, 2012 Added project files for Windows Phone 8MCEBuddy 2.x: MCEBuddy 2.3.6: Changelog for 2.3.6 (32bit and 64bit) 1. Fixed a bug in multichannel audio conversion failure. AAC does not support 6 channel audio, MCEBuddy now checks for it and force the output to 2 channel if AAC codec is specified 2. Fixed a bug in Original Broadcast Date and Time. Original Broadcast Date and Time is reported in UTC timezone in WTV metadata. TVDB and MovieDB dates are reported in network timezone. It is assumed the video is recorded and converted on the same machine, i.e. local timezone...MVVM Light Toolkit: MVVM Light Toolkit V4.1 for Visual Studio 2012: This version only supports Visual Studio 2012 (and all Express editions too). If you use Visual Studio 2010, please stay tuned, we will publish an update in a few days with support for VS10. V4.1 supports: Windows Phone 8 Windows 8 (Windows RT) Silverlight 5 Silverlight 4 WPF 4.5 WPF 4 WPF 3.5 And the following development environments: Visual Studio 2012 (Pro, Premium, Ultimate) Visual Studio 2012 Express for Windows 8 Visual Studio 2012 Express for Windows Phone 8 Visual...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.73: Fix issue in Discussion #401101 (unreferenced var in a for-in statement was getting removed). add the grouping operator to the parsed output so that unminified parsed code is closer to the original. Will still strip unneeded parens later, if minifying. more cleaning of references as they are minified out of the code.RiP-Ripper & PG-Ripper: PG-Ripper 1.4.03: changes NEW: Added Support for the phun.org forum FIXED: Kitty-Kats new Forum UrlLiberty: v3.4.0.1 Release 28th October 2012: Change Log -Fixed -H4 Fixed the save verification screen showing incorrect mission and difficulty information for some saves -H4 Hopefully fixed the issue where progress did not save between missions and saves would not revert correctly -H3 Fixed crashes that occurred when trying to load player information -Proper exception dialogs will now show in place of crashesPlayer Framework by Microsoft: Player Framework for Windows 8 (Preview 7): This release is compatible with the version of the Smooth Streaming SDK released today (10/26). Release 1 of the player framework is expected to be available next week. IMPROVEMENTS & FIXESIMPORTANT: List of breaking changes from preview 6 Support for the latest smooth streaming SDK. Xaml only: Support for moving any of the UI elements outside the MediaPlayer (e.g. into the appbar). Note: Equivelent changes to the JS version due in coming week. 
Support for localizing all text used in t...Send multiple SMS via Way2SMS C#: SMS 1.1: Added support for 160by2Quick Launch: Quick Launch 1.0: A Lightweight and Fast Way to Manage and Launch Thousands of Tools and ApplicationsPress Win+Q and start to search and run. http://www.codeplex.com/Download?ProjectName=quicklaunch&DownloadId=523536New Projectsappnitiren: This project's main objective is to deepen my knowledge on developing apps for windows 8 and windows phone and Buddhist philosophy of Nichiren Daishonin. Aspect - TypeScript AOP Framework: A simple AOP framework for TypeScript.AWF's PowerPoint Tab: A collection of little powerpoint utilities, including "resize all shapes".Base64 Encoder-Decoder: (Base64 Encoder/Decoder using C#)BombaJob-WF: Official Windows desktop client for BombaJob.bgBungie Test: !CCAudio: A project to create an audio recording/mixing and processing framework in C#/C++ with a look to creating a Digital Audio Workstation. EvaluationSyarem: for yuanguangexporttfschangesets: Small utility that runs agains tftp.exe(tfs power tools) and exports a given range of changesets.fjfdszjtfs: fjfdszjtfsKladionica: Ovo je mala web aplikacija koja je specijalizovana za f1 kladionicu na forumu KrstaricaListViewItem Float Test: WPF testing ground for development of a "floating" listviewitem (preeettty)Macconnell: This is my test projectNWebsec Demo site: This is the NWebsec demo website project.Ocean Flattened: Summary of this projectQuagmire: QUAGMIRE aims to make distributed data processing and visualisation simple and flexible.Scheduled SQL Jobs for DotNetNuke by IowaComputerGurus Inc.: Keeping a DotNetNuke site database clean can be a real nightmare for those hosting sites on shared hosting providers.Secretary Tool: Tool for secretaries of congregations of Jehovah's Witnesses.Secure My Install for DotNetNuke by IowaComputerGurus: This module is a single-use per DotNetNuke installation utility module that will make a number of security enhancing modifications to your "out of the box" DotNServStop: ServStop is a .NET application that makes it easy to stop several system services at once. Now you don't have to change startup types or stop them one at a time. It has a simple list-based interface with the ability to save and load lists of user services to stop. Written in C#.Simple File List for DotNetNuke by IowaComputerGurus Inc.: The Simple File List module is a Free DotNetNuke module that will list all files within a specific folder under the specific portal folder within DNN for users SOLUS3 Config Discovery: An open-source community developed tool for generating a simple text file detailing your local SIMS server settings to simplify new SOLUS3 configurations.Sort Project: Project Name: Sort Project Description: This project holds a class that extends objects type of IEnumerable by enabling them using sorting algorithms other than the buil in one within the microsoft staff SQL Azure Data Protector: This tool leverages the currently available options that Azure platform has to provide an integrated and automated solution to back up the SQL Azure databases.Throttling Suite for ASP.NET Applications: The Throttling Suite provides throttling control capabilities to the ASP.NET applications. 
It is highly customizable; including "log-only" mode.TidyBackups: Allows for automation of deletion of Microsoft SQL backup files based on file type (.bak) creation age as well as compression using standard ZIP technology.TrainingData: TrainingData is a set of libraries for reading and modifying training data files for Garmin and Polar heart rate monitors.Virtual eCommerce: ??? ? ?????? ????????? ??????. ??????: ??????????? ?????????, ?????????? ?????????? ?? ????????????? ?? Microsoft - ASP Web Pages with Razor Engine.WebNext: My personal web page / blog with cms on asp mvc 3Windows 8 Mouse Unsticky: Makes Windows 8 top right/left hot corners unsticky.WPForms: WPForms is simple framework for Form-driven Silverlight Windows Phone applications. Using the framework allows developers to easily create and display forms.

    Read the article

  • Why is Varnish not caching?

    - by Justin
    I am troubleshooting the setup of Varnish 3.x on my Ubuntu server. I'm running Drupal 7 on two sites set up on the box, via named-based vhosts. Before trying to get Varnish to play nice with Drupal I'm trying to just get Varnish to a PNG from cache. Here are the headers I get from a curl -I request of the PNG file: HTTP/1.1 200 OK Server: Apache/2.2.22 (Ubuntu) Last-Modified: Sun, 07 Oct 2012 21:18:59 GMT ETag: "a57c2-3850-4cb7ea73db6c0" Accept-Ranges: bytes Content-Length: 14416 Cache-Control: max-age=1209600 Expires: Thu, 25 Oct 2012 22:55:14 GMT Content-Type: image/png Accept-Ranges: bytes Date: Thu, 11 Oct 2012 22:55:14 GMT X-Varnish: 1766703058 Age: 0 Via: 1.1 varnish Connection: keep-alive X-Varnish-Cache: MISS Here is the Varnish VCL file I'm using (It's a default VCL configuration designed for Drupal): # Default backend definition. Set this to point to your content # server. # backend default { .host = "127.0.0.1"; .port = "8080"; } # Respond to incoming requests. sub vcl_recv { # Use anonymous, cached pages if all backends are down. if (!req.backend.healthy) { unset req.http.Cookie; } # Allow the backend to serve up stale content if it is responding slowly. set req.grace = 6h; # Pipe these paths directly to Apache for streaming. #if (req.url ~ "^/admin/content/backup_migrate/export") { # return (pipe); #} # Do not cache these paths. if (req.url ~ "^/status\.php$" || req.url ~ "^/update\.php$" || req.url ~ "^/admin$" || req.url ~ "^/admin/.*$" || req.url ~ "^/flag/.*$" || req.url ~ "^.*/ajax/.*$" || req.url ~ "^.*/ahah/.*$") { return (pass); } # Do not allow outside access to cron.php or install.php. #if (req.url ~ "^/(cron|install)\.php$" && !client.ip ~ internal) { # Have Varnish throw the error directly. # error 404 "Page not found."; # Use a custom error page that you've defined in Drupal at the path "404". # set req.url = "/404"; #} # Always cache the following file types for all users. This list of extensions # appears twice, once here and again in vcl_fetch so make sure you edit both # and keep them equal. if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") { unset req.http.Cookie; } # Remove all cookies that Drupal doesn't need to know about. We explicitly # list the ones that Drupal does need, the SESS and NO_CACHE. If, after # running this code we find that either of these two cookies remains, we # will pass as the page cannot be cached. if (req.http.Cookie) { # 1. Append a semi-colon to the front of the cookie string. # 2. Remove all spaces that appear after semi-colons. # 3. Match the cookies we want to keep, adding the space we removed # previously back. (\1) is first matching group in the regsuball. # 4. Remove all other cookies, identifying them by the fact that they have # no space after the preceding semi-colon. # 5. Remove all spaces and semi-colons from the beginning and end of the # cookie string. set req.http.Cookie = ";" + req.http.Cookie; set req.http.Cookie = regsuball(req.http.Cookie, "; +", ";"); set req.http.Cookie = regsuball(req.http.Cookie, ";(SESS[a-z0-9]+|SSESS[a-z0-9]+|NO_CACHE)=", "; \1="); set req.http.Cookie = regsuball(req.http.Cookie, ";[^ ][^;]*", ""); set req.http.Cookie = regsuball(req.http.Cookie, "^[; ]+|[; ]+$", ""); if (req.http.Cookie == "") { # If there are no remaining cookies, remove the cookie header. If there # aren't any cookie headers, Varnish's default behavior will be to cache # the page. 
unset req.http.Cookie; } else { # If there is any cookies left (a session or NO_CACHE cookie), do not # cache the page. Pass it on to Apache directly. return (pass); } } } # Set a header to track a cache HIT/MISS. sub vcl_deliver { if (obj.hits > 0) { set resp.http.X-Varnish-Cache = "HIT"; } else { set resp.http.X-Varnish-Cache = "MISS"; } } # Code determining what to do when serving items from the Apache servers. # beresp == Back-end response from the web server. sub vcl_fetch { # We need this to cache 404s, 301s, 500s. Otherwise, depending on backend but # definitely in Drupal's case these responses are not cacheable by default. if (beresp.status == 404 || beresp.status == 301 || beresp.status == 500) { set beresp.ttl = 10m; } # Don't allow static files to set cookies. # (?i) denotes case insensitive in PCRE (perl compatible regular expressions). # This list of extensions appears twice, once here and again in vcl_recv so # make sure you edit both and keep them equal. if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") { unset beresp.http.set-cookie; } # Allow items to be stale if needed. set beresp.grace = 6h; } # In the event of an error, show friendlier messages. sub vcl_error { # Redirect to some other URL in the case of a homepage failure. #if (req.url ~ "^/?$") { # set obj.status = 302; # set obj.http.Location = "http://backup.example.com/"; #} # Otherwise redirect to the homepage, which will likely be in the cache. set obj.http.Content-Type = "text/html; charset=utf-8"; synthetic {" <html> <head> <title>Page Unavailable</title> <style> body { background: #303030; text-align: center; color: white; } #page { border: 1px solid #CCC; width: 500px; margin: 100px auto 0; padding: 30px; background: #323232; } a, a:link, a:visited { color: #CCC; } .error { color: #222; } </style> </head> <body onload="setTimeout(function() { window.location = '/' }, 5000)"> <div id="page"> <h1 class="title">Page Unavailable</h1> <p>The page you requested is temporarily unavailable.</p> <p>We're redirecting you to the <a href="/">homepage</a> in 5 seconds.</p> <div class="error">(Error "} + obj.status + " " + obj.response + {")</div> </div> </body> </html> "}; return (deliver); } I'm getting a MISS and age 0 every time. If I'm understanding correctly, this means the file isn't being returned from Varnish's cache. Is there a problem with my Varnish config?

    Read the article

  • Unable to connect to Linux (Virtual OS-vmware) through Putty on Windows

    - by RBA
    Hi, I want to access my Linux box (Virtual OS) through Putty on Windows using the Run command: putty -ssh -P 22 192.168.171.130, but it returns an error message and cannot connect. A few days back I was able to connect, but not today. Why?

    Windows IP Configuration
    Host Name . . . . . . . . . . . . : rba7791fd466
    Primary Dns Suffix  . . . . . . . :
    Node Type . . . . . . . . . . . . : Unknown
    IP Routing Enabled. . . . . . . . : No
    WINS Proxy Enabled. . . . . . . . : No

    Ethernet adapter VMware Network Adapter VMnet1:
    Connection-specific DNS Suffix . :
    Description . . . . . . . . . . . : VMware Virtual Ethernet Adapter for VMnet1
    Physical Address. . . . . . . . . : 00-50-56-C0-00-01
    Dhcp Enabled. . . . . . . . . . . : No
    IP Address. . . . . . . . . . . . : 192.168.234.1
    Subnet Mask . . . . . . . . . . . : 255.255.255.0
    Default Gateway . . . . . . . . . :

    Ethernet adapter Wireless Network Connection:
    Connection-specific DNS Suffix . :
    Description . . . . . . . . . . . : Dell Wireless 1395 WLAN Mini-Card
    Physical Address. . . . . . . . . : 00-24-2B-60-A0-88
    Dhcp Enabled. . . . . . . . . . . : Yes
    Autoconfiguration Enabled . . . . : Yes
    IP Address. . . . . . . . . . . . : 10.0.0.2
    Subnet Mask . . . . . . . . . . . : 255.255.255.0
    Default Gateway . . . . . . . . . : 10.0.0.1
    DHCP Server . . . . . . . . . . . : 10.0.0.1
    DNS Servers . . . . . . . . . . . : 10.0.0.1
    Lease Obtained. . . . . . . . . . : Friday, August 28, 2009 4:11:09 AM
    Lease Expires . . . . . . . . . . : Saturday, August 29, 2009 4:11:09 AM

    Ubuntu Configuration
    eth0 inet addr:192.168.171.130

    Read the article

  • CodePlex Daily Summary for Thursday, October 11, 2012

    CodePlex Daily Summary for Thursday, October 11, 2012Popular ReleasesOstrivDB: OstrivDB 0.1: - Storage configuration: objects serialization (Xml, Json), storage file compressing, data block size. - Caching for Select queries. - Indexing. - Batch of queries. - No special query language (LINQ used). - Integrated sorting and paging. - Multithreaded data processing.Mido: Mido v0.7: Mido is the simplest utility that helps to make watermarks on images and resize them. It has a very thin installer. The program has beta mark but it is able to draw watermark text, watermark images, resize pictures. Change list: + Opacity option + Stroke option + Bold, italic, underline options + Show progress during loading of images + Allow rotate watermart + Allow write multiline text as watermark + Add text aligment if text contains several lines + Add button 'clear custom position' + A...D3 Loot Tracker: 1.5.4: Fixed a bug where the server ip was not logged properly in the stats file.Captcha MVC: Captcha Mvc 2.1.2: v 2.1.2: Fixed problem with serialization. Made all classes from a namespace Jetbrains.Annotaions as the internal. Added autocomplete attribute and autocorrect attribute for captcha input element. Minor changes. v 2.1.1: Fixed problem with serialization. Minor changes. v 2.1: Added support for storing captcha in the session or cookie. See the updated example. Updated example. Minor changes. v 2.0.1: Added support for a partial captcha. Now you can easily customize the layout, s...DotNetNuke® Community Edition CMS: 06.02.04: Major Highlights Fixed issue where the module printing function was only visible to administrators Fixed issue where pane level skinning was being assigned to a default container for any content pane Fixed issue when using password aging and FB / Google authentication Fixed issue that was causing the DateEditControl to not load the assigned value Fixed issue that stopped additional profile properties to be displayed in the member directory after modifying the template Fixed er...Advanced DataGridView with Excel-like auto filter: 1.0.0.0: ?????? ??????Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.69: Fix for issue #18766: build task should not build the output if it's newer than all the input files. Fix for Issue #18764: build taks -res switch not working. update build task to concatenate input source and then minify, rather than minify and then concatenate. include resource string-replacement root name in the assumed globals list. Stop replacing new Date().getTime() with +new Date -- the latter is smaller, but turns out it executes up to 45% slower. add CSS support for single-...WinRT XAML Toolkit: WinRT XAML Toolkit - 1.3.3: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features Attachable Behaviors AwaitableUI extensions Controls Converters Debugging helpers Extension methods Imaging helpers IO helpers VisualTree helpers Samples Recent changes NOTE:...VidCoder: 1.4.4 Beta: Fixed inability to create new presets with "Save As".MCEBuddy 2.x: MCEBuddy 2.3.2: Changelog for 2.3.2 (32bit and 64bit) 1. Added support for generating XBMC XML NFO files for files in the conversion queue (store it along with the source video with source video name.nfo). Right click on the file in queue and select generate XML 2. 
UI bugifx, start and end trim box locations interchanged 3. Added support for removing commercials from non DVRMS/WTV files (MP4, AVI etc) 4. Now checking for Firewall port status before enabling (might help with some firewall problems) 5. User In...Sandcastle Help File Builder: SHFB v1.9.5.0 with Visual Studio Package: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This release supports the Sandcastle October 2012 Release (v2.7.1.0). It includes full support for generating, installing, and removing MS Help Viewer files. This new release suppor...ClosedXML - The easy way to OpenXML: ClosedXML 0.68.0: ClosedXML now resolves formulas! Yes it finally happened. If you call cell.Value and it has a formula the library will try to evaluate the formula and give you the result. For example: var wb = new XLWorkbook(); var ws = wb.AddWorksheet("Sheet1"); ws.Cell("A1").SetValue(1).CellBelow().SetValue(1); ws.Cell("B1").SetValue(1).CellBelow().SetValue(1); ws.Cell("C1").FormulaA1 = "\"The total value is: \" & SUM(A1:B2)"; var...Json.NET: Json.NET 4.5 Release 10: New feature - Added Portable build to NuGet package New feature - Added GetValue and TryGetValue with StringComparison to JObject Change - Improved duplicate object reference id error message Fix - Fixed error when comparing empty JObjects Fix - Fixed SecAnnotate warnings Fix - Fixed error when comparing DateTime JValue with a DateTimeOffset JValue Fix - Fixed serializer sometimes not using DateParseHandling setting Fix - Fixed error in JsonWriter.WriteToken when writing a DateT...Readable Passphrase Generator: KeePass Plugin 0.7.2: Changes: Tested against KeePass 2.20.1 Tested under Ubuntu 12.10 (and KeePass 2.20) Added GenerateAsUtf8 method returning the encrypted passphrase as a UTF8 byte array.TelerikMvcGridCustomBindingHelper: Version 1.0.15.279-RC3: TelerikMvcGridCustomBindingHelper 1.0.15.279 RC3 Release notes: This is a RC version (hopefully the last one), please test and report any error or problem you encounter. Configurable null handling when filtering (AcceptNullValuesWhenFiltering, NullSubstitutes and NullAliases) Internal DynamicWhereClause improvments GridGridCustomBindingHelper.UseProjections method now acept ProjectionsOptions parameter for easely determine which properties should be projected Isolate and hide some are...JSLint for Visual Studio 2010: 1.4.2: 1.4.2patterns & practices: Prism: Prism for .NET 4.5: This is a release does not include any functionality changes over Prism 4.1 Desktop. These assemblies target .NET 4.5. These assemblies also were compiled against updated dependencies: Unity 3.0 and Common Service Locator (Portable Class Library).Snoop, the WPF Spy Utility: Snoop 2.8.0: Snoop 2.8.0Announcing Snoop 2.8.0! It's been exactly six months since the last release, and this one has a bunch of goodies in it. In particular, there is now a PowerShell scripting tab, compliments of Bailey Ling. With this tab, the possibilities are limitless. It basically lets you automate/script the application that you are Snooping. Bailey has a couple blog posts (one and two) on his tab already, and I am sure more is to come. Please note that if you do not have PowerShell installed, y...Z3: Z3 4.1.2: Minor fixes. 
Now, z3 compiles with gcc 4.7.x.NET Micro Framework: .NET MF 4.3 (Beta): This is the 4.3 Beta version of the .NET Micro Framework. Feature List for v4.3 Support for Visual Studio 2012 (including the Windows Desktop Express version) All v4.2 QFEs features and bug fixes (PWM enhancements, lwIP and network driver reliability improvements, Analog Output, WinUSB and latest GCC support) Improved diagnostic information for deployment Decreased boot time Bug fixes Work Item 1736 - Create link for MFDeploy under start menu Work Item 1504 - Customizing lwIP o...New Projects.NET 4.0 Object/Function DLL interface: A Visual Basic .NET 4.0 Object / Function interface for quick and easy updating.AzureDirectory Library for Lucene.Net: This project allows you to create Lucene Indexes via a Lucene Directory object which uses Windows Azure BlobStorage for persistent storage.Building Modern Mobile Web Apps: This project provides guidance on building mobile web experiences using HTML5, CSS3, and JavaScript. Developing web apps for mobile browsers can be less forgiviC# Singleton Base-Class: C# Singleton Base Class is a single, simple C# class used to implement the Singleton pattern through inheritance.C++ Unit Test Library for Windows Store apps with Async Helper: This Visual Studio extension will install a project template for C++ Unit Test Library for Windows Store apps that contains async helper.Citrix Mobile Application SDK Samples: This site contains our latest prototype samples for the Citrix Mobile Application SDK for you to play with.eBrain Engine: eBrain Engine is a software to administer neuropsychological tests by means of advanced I/F devices like BCI P300 and eye-tracking systemsFairycake: A game about a little witch. Java. FedEx Connector for AbleCommerce Gold: This plugin provides FedEx rating and tracking services for the AbleCommerce shopping cart.Free - Simple Phone Book (SimPB): An alternative to backup your telephone contacts. Portable, easy to use, multilingual.GPXLocalTime2UTC: Time Format Converter, from Local to UTC, for GPX Files A small utility made to open GPX files (XML), search for the "time" tag, then transform the local time Grid Solutions Framework: Core classes to manage, analyze, and visualize real-time and historical data. This project combines the time-series framework and TVA code library projects.HireMe: HireMe is a HR application that works in conjunction with the HiremeMagazine.com website. The app will run in Android, iOS and WebOS.my-sim-asdf: my greate toolOdeToFoodMvc4: Source code for "Building Applications with ASP.NET MVC 4"OrgCharts for SharePoint: Months of planning, design, development and testing have gone into a truly phenomenal Org Chart experiencePDC BW2: A pkm editor application that tries to make easier legal pkm Packed with Legality Analysis, Event downloader and more...PI Payroll system: This is a payroll system for People Index, LLC.Project13251011: fsdQuickWick.NET - Agile Pproject Management: QuickWick.NET is a simple and effective web-based solution for agile project management.SharePoint 2010 Top Nav: This project was created when a project I was on required a Navigation scheme to manage site collections. 
SisGAC: Sistema de Gerenciamento de Artigos para CongressosTriDes_project_3AO: encryptohideVB6Doc: Visual Basic 6 Documentor (VB6 Doc)VideoChat: Simple VideoChat web-application on ASP.NET MVC4, SignalR, Wowza Media Server/Windows Updater Open: This is an alternative to the Windows Update software on all versions of Windows OS and the WSUS provided for corporations to update multiple machines.WOWAddonsUpdater: WPF App to update lua addons for the Blizzard World of warcraft gameyygua: email:huliang@yahoo.cnZJUDTS2: ????Silverlight?????????

    Read the article

  • From HttpRuntime.Cache to Windows Azure Caching (Preview)

    - by Jeff
    I don’t know about you, but the announcement of Windows Azure Caching (Preview) (yes, the parentheses are apparently part of the interim name) made me a lot more excited about using Azure. Why? Because one of the great performance tricks of any Web app is to cache frequently used data in memory, so it doesn’t have to hit the database, a service, or whatever. When you run your Web app on one box, HttpRuntime.Cache is a sweet and stupid-simple solution. Somewhere in the data fetching pieces of your app, you can see if an object is available in cache, and return that instead of hitting the data store. I did this quite a bit in POP Forums, and it dramatically cuts down on the database chatter. The problem is that it falls apart if you run the app on many servers, in a Web farm, where one server may initiate a change to that data, and the others will have no knowledge of the change, making it stale. Of course, if you have the infrastructure to do so, you can use something like memcached or AppFabric to do a distributed cache, and achieve the caching flavor you desire. You could do the same thing in Azure before, but it would cost more because you’d need to pay for another role or VM or something to host the cache. Now, you can use a portion of the memory from each instance of a Web role to act as that cache, with no additional cost. That’s huge. So if you’re using a percentage of memory that comes out to 100 MB, and you have three instances running, that’s 300 MB available for caching. For the uninitiated, a Web role in Azure is essentially a VM that runs a Web app (worker roles are the same idea, only without the IIS part). You can spin up many instances of the role, and traffic is load balanced to the various instances. It’s like adding or removing servers to a Web farm all willy-nilly and at your discretion, and it’s what the cloud is all about. I’d say it’s my favorite thing about Windows Azure. The slightly annoying thing about developing for a Web role in Azure is that the local emulator that’s launched by Visual Studio is a little on the slow side. If you’re used to using the built-in Web server, you’re used to building and then alt-tabbing to your browser and refreshing a page. If you’re just changing an MVC view, you’re not even doing the building part. Spinning up the simulated Azure environment is too slow for this, but ideally you want to code your app to use this fantastic distributed cache mechanism. So first off, here’s the link to the page showing how to code using the caching feature. If you’re used to using HttpRuntime.Cache, this should be pretty familiar to you. Let’s say that you want to use the Azure cache preview when you’re running in Azure, but HttpRuntime.Cache if you’re running local, or in a regular IIS server environment. Through the magic of dependency injection, we can get there pretty quickly. First, design an interface to handle the cache insertion, fetching and removal. 
    Mine looks like this:

    public interface ICacheProvider
    {
        void Add(string key, object item, int duration);
        T Get<T>(string key) where T : class;
        void Remove(string key);
    }

    Now we’ll create two implementations of this interface… one for Azure cache, one for HttpRuntime:

    public class AzureCacheProvider : ICacheProvider
    {
        public AzureCacheProvider()
        {
            _cache = new DataCache("default"); // in Microsoft.ApplicationServer.Caching, see how-to
        }

        private readonly DataCache _cache;

        public void Add(string key, object item, int duration)
        {
            _cache.Add(key, item, new TimeSpan(0, 0, 0, 0, duration));
        }

        public T Get<T>(string key) where T : class
        {
            return _cache.Get(key) as T;
        }

        public void Remove(string key)
        {
            _cache.Remove(key);
        }
    }

    public class LocalCacheProvider : ICacheProvider
    {
        public LocalCacheProvider()
        {
            _cache = HttpRuntime.Cache;
        }

        private readonly System.Web.Caching.Cache _cache;

        public void Add(string key, object item, int duration)
        {
            _cache.Insert(key, item, null, DateTime.UtcNow.AddMilliseconds(duration), System.Web.Caching.Cache.NoSlidingExpiration);
        }

        public T Get<T>(string key) where T : class
        {
            return _cache[key] as T;
        }

        public void Remove(string key)
        {
            _cache.Remove(key);
        }
    }

    Feel free to expand these to use whatever cache features you want. I’m not going to go over dependency injection here, but I assume that if you’re using ASP.NET MVC, you’re using it. Somewhere in your app, you set up the DI container that resolves interfaces to concrete implementations (Ninject calls it a “kernel” instead of a container). For this example, I’ll show you how StructureMap does it. It uses a convention-based scheme, where if you need to get an instance of IFoo, it looks for a class named Foo. You can also do this mapping explicitly. The initialization of the container looks something like this:

    ObjectFactory.Initialize(x =>
    {
        x.Scan(scan =>
        {
            scan.AssembliesFromApplicationBaseDirectory();
            scan.WithDefaultConventions();
        });
        if (Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.IsAvailable)
            x.For<ICacheProvider>().Use<AzureCacheProvider>();
        else
            x.For<ICacheProvider>().Use<LocalCacheProvider>();
    });

    If you use Ninject or Windsor or something else, that’s OK. Conceptually they’re all about the same. The important part is the conditional statement that checks to see if the app is running in Azure. If it is, it maps ICacheProvider to AzureCacheProvider; otherwise it maps to LocalCacheProvider. Now when a request comes into your MVC app, and the chain of dependency resolution occurs, you can see to it that the right caching code is called. A typical design may have a call stack that goes: Controller –> BusinessLogicClass –> Repository.
    Let’s say your repository class looks like this:

    public class MyRepo : IMyRepo
    {
        public MyRepo(ICacheProvider cacheProvider)
        {
            _context = new MyDataContext();
            _cache = cacheProvider;
        }

        private readonly MyDataContext _context;
        private readonly ICacheProvider _cache;

        public SomeType Get(int someTypeID)
        {
            var key = "somename-" + someTypeID;
            var cachedObject = _cache.Get<SomeType>(key);
            if (cachedObject != null)
            {
                _context.SomeTypes.Attach(cachedObject);
                return cachedObject;
            }
            var someType = _context.SomeTypes.SingleOrDefault(p => p.SomeTypeID == someTypeID);
            _cache.Add(key, someType, 60000);
            return someType;
        }

        // ... more stuff to update, delete or whatever, being sure to remove
        // from cache when you do so
    }

    When the DI container gets an instance of the repo, it passes an instance of ICacheProvider to the constructor, which in this case will be whatever implementation was specified when the container was initialized. The Get method first tries to hit the cache, and of course doesn’t care what the underlying implementation is: Azure, HttpRuntime, or otherwise. If it finds the object, it returns it right then. If not, it hits the database (this example is using Entity Framework), and inserts the object into the cache before returning it. The important thing not pictured here is that other methods in the repo class will construct the key for the cached object, in this case “somename-” plus the ID of the object, and then remove it from cache, in any method that alters or deletes the object. That way, no matter what instance of the role is processing the request, it won’t find the object if it has been made stale, that is, updated or outright deleted, forcing it to attempt to hit the database. So is this a good technique? Well, sort of. It depends on how you use it, and what your testing looks like around it. Because of differences in behavior and execution of the two caching providers, you could see some strange errors. For example, I immediately got an error indicating there was no parameterless constructor for an MVC controller, because the DI resolver failed to create instances for the dependencies it had. In reality, the NuGet-packaged DI resolver for StructureMap was eating an exception thrown by the Azure components that said my configuration, outlined in that how-to article, was wrong. That error wouldn’t occur when using the HttpRuntime. That’s something a lot of people debate when it comes to using different components like that, and how you configure them. I kinda hate XML config files, and like the idea of the code-based approach above, but you should be darn sure that your unit and integration testing can account for the differences.
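    Since the post closes on how much the two providers can differ at test time, here is a minimal sketch of a third, purely in-memory implementation of the ICacheProvider interface shown above that can stand in for both during unit tests. The class name FakeCacheProvider and its expiration bookkeeping are my own additions, not part of the original article.

    using System;
    using System.Collections.Generic;

    public class FakeCacheProvider : ICacheProvider
    {
        // Backing store: key -> (cached item, UTC expiration time).
        private readonly Dictionary<string, Tuple<object, DateTime>> _items =
            new Dictionary<string, Tuple<object, DateTime>>();

        public void Add(string key, object item, int duration)
        {
            // Mirror the providers above: duration is expressed in milliseconds.
            _items[key] = Tuple.Create(item, DateTime.UtcNow.AddMilliseconds(duration));
        }

        public T Get<T>(string key) where T : class
        {
            Tuple<object, DateTime> entry;
            if (!_items.TryGetValue(key, out entry))
                return null;
            if (entry.Item2 < DateTime.UtcNow)
            {
                // Expired entries behave like cache misses.
                _items.Remove(key);
                return null;
            }
            return entry.Item1 as T;
        }

        public void Remove(string key)
        {
            _items.Remove(key);
        }
    }

    In a test you could register this type with the container instead of the conditional mapping, or simply new it up and pass it to MyRepo’s constructor, so unit tests never touch HttpRuntime or the Azure cache emulator.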

    Read the article

  • Parsing Concerns

    - by Jesse
    If you’ve ever written an application that accepts date and/or time inputs from an external source (a person, an uploaded file, posted XML, etc.) then you’ve no doubt had to deal with parsing some text representing a date into a data structure that a computer can understand. Similarly, you’ve probably also had to take values from those same data structures and turn them back into their original formats. Most (all?) suitably modern development platforms expose some kind of parsing and formatting functionality for turning text into dates and vice versa. In .NET, the DateTime data structure exposes ‘Parse’ and ‘ToString’ methods for this purpose. This post will focus mostly on parsing, though most of the examples and suggestions below can also be applied to the ToString method. The DateTime.Parse method is pretty permissive in the values that it will accept (though apparently not as permissive as some other languages), which makes it pretty easy to take some text provided by a user and turn it into a proper DateTime instance. Here are some examples (note that the resulting DateTime values are shown using the RFC1123 format):

    DateTime.Parse("3/12/2010"); //Fri, 12 Mar 2010 00:00:00 GMT
    DateTime.Parse("2:00 AM"); //Sat, 01 Jan 2011 02:00:00 GMT (took today's date as date portion)
    DateTime.Parse("5-15/2010"); //Sat, 15 May 2010 00:00:00 GMT
    DateTime.Parse("7/8"); //Fri, 08 Jul 2011 00:00:00 GMT
    DateTime.Parse("Thursday, July 1, 2010"); //Thu, 01 Jul 2010 00:00:00 GMT

    Dealing With Inaccuracy

    While the DateTime struct has the ability to store a date and time value accurate down to the millisecond, most date strings provided by a user are not going to specify values with that much precision. In each of the above examples, the Parse method was provided a partial value from which to construct a proper DateTime. This means it had to go ahead and assume what you meant and fill in the missing parts of the date and time for you. This is a good thing, especially when we’re talking about taking input from a user. We can’t expect every person using our software to provide a year, day, month, hour, minute, second, and millisecond every time they need to express a date. That said, it’s important for developers to understand what assumptions the software might be making and plan accordingly. I think the assumptions that were made in each of the above examples were pretty reasonable, though if we dig into this method a little bit deeper we’ll find that there are a lot more assumptions being made under the covers than you might have previously known. One of the biggest assumptions that the DateTime.Parse method has to make relates to the format of the date represented by the provided string. Let’s consider this example input string: ‘10-02-15’. To some people, that might look like ‘15-Feb-2010’. To others, it might be ‘02-Oct-2015’. Like many things, it depends on where you’re from.

    This Is America!

    Most cultures around the world have adopted either “little-endian” or “big-endian” formats. (Source: Date And Time Notation By Country) In this context, a “little-endian” date format would list the date parts with the least significant first while the “big-endian” date format would list them with the most significant first. For example, a “little-endian” date would be “day-month-year” and “big-endian” would be “year-month-day”. It’s worth noting here that ISO 8601 defines a “big-endian” format as the international standard.
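    To make the ambiguity concrete, here is a minimal sketch (mine, not part of the original post) that feeds the same ‘10-02-15’ string to DateTime.Parse under a month-first and a day-first culture. The class and variable names are made up for illustration; the CultureInfo overload is the same mechanism the post goes on to describe.

    using System;
    using System.Globalization;

    class DateAmbiguityDemo
    {
        static void Main()
        {
            // The same text parsed under a month/day/year culture and a
            // day/month/year culture yields two different dates.
            var text = "10-02-15";

            var us = DateTime.Parse(text, new CultureInfo("en-US")); // October 2, 2015
            var uk = DateTime.Parse(text, new CultureInfo("en-GB")); // February 10, 2015

            Console.WriteLine(us.ToString("yyyy-MM-dd")); // 2015-10-02
            Console.WriteLine(uk.ToString("yyyy-MM-dd")); // 2015-02-10
        }
    }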
While I personally prefer “big-endian” style date formats, I think both styles make sense in that they follow some logical standard with respect to ordering the date parts by their significance. Here in the United States, however, we buck that trend by using what is, in comparison, a completely nonsensical format of “month/day/year”. Almost no other country in the world uses this format. I’ve been fortunate in my life to have done some international travel, so I’ve been aware of this difference for many years, but never really thought much about it. Until recently, I had been developing software for exclusively US-based audiences and remained blissfully ignorant of the different date formats employed by other countries around the world. The web application I work on is being rolled out to users in different countries, so I was recently tasked with updating it to support different date formats. As it turns out, .NET has a great mechanism for dealing with different date formats right out of the box. Supporting date formats for different cultures is actually pretty easy once you understand this mechanism. Pulling the Curtain Back On the Parse Method Have you ever taken a look at the different flavors (read: overloads) that the DateTime.Parse method comes in? In it’s simplest form, it takes a single string parameter and returns the corresponding DateTime value (if it can divine what the date value should be). You can optionally provide two additional parameters to this method: an ‘System.IFormatProvider’ and a ‘System.Globalization.DateTimeStyles’. Both of these optional parameters have some bearing on the assumptions that get made while parsing a date, but for the purposes of this article I’m going to focus on the ‘System.IFormatProvider’ parameter. The IFormatProvider exposes a single method called ‘GetFormat’ that returns an object to be used for determining the proper format for displaying and parsing things like numbers and dates. This interface plays a big role in the globalization capabilities that are built into the .NET Framework. The cornerstone of these globalization capabilities can be found in the ‘System.Globalization.CultureInfo’ class. To put it simply, the CultureInfo class is used to encapsulate information related to things like language, writing system, and date formats for a certain culture. Support for many cultures are “baked in” to the .NET Framework and there is capacity for defining custom cultures if needed (thought I’ve never delved into that). While the details of the CultureInfo class are beyond the scope of this post, so for now let me just point out that the CultureInfo class implements the IFormatInfo interface. This means that a CultureInfo instance created for a given culture can be provided to the DateTime.Parse method in order to tell it what date formats it should expect. So what happens when you don’t provide this value? Let’s crack this method open in Reflector: When no IFormatInfo parameter is provided (i.e. we use the simple DateTime.Parse(string) overload), the ‘DateTimeFormatInfo.CurrentInfo’ is used instead. Drilling down a bit further we can see the implementation of the DateTimeFormatInfo.CurrentInfo property: From this property we can determine that, in the absence of an IFormatProvider being specified, the DateTime.Parse method will assume that the provided date should be treated as if it were in the format defined by the CultureInfo object that is attached to the current thread. 
The culture specified by the CultureInfo instance on the current thread can vary depending on several factors, but if you’re writing an application where a single instance might be used by people from different cultures (i.e. a web application with an international user base), it’s important to know what this value is. Having a solid strategy for setting the current thread’s culture for each incoming request in an internationally used ASP .NET application is obviously important, and might make a good topic for a future post. For now, let’s think about what the implications of not having the correct culture set on the current thread. Let’s say you’re running an ASP .NET application on a server in the United States. The server was setup by English speakers in the United States, so it’s configured for US English. It exposes a web page where users can enter order data, one piece of which is an anticipated order delivery date. Most users are in the US, and therefore enter dates in a ‘month/day/year’ format. The application is using the DateTime.Parse(string) method to turn the values provided by the user into actual DateTime instances that can be stored in the database. This all works fine, because your users and your server both think of dates in the same way. Now you need to support some users in South America, where a ‘day/month/year’ format is used. The best case scenario at this point is a user will enter March 13, 2011 as ‘25/03/2011’. This would cause the call to DateTime.Parse to blow up since that value doesn’t look like a valid date in the US English culture (Note: In all likelihood you might be using the DateTime.TryParse(string) method here instead, but that method behaves the same way with regard to date formats). “But wait a minute”, you might be saying to yourself, “I thought you said that this was the best case scenario?” This scenario would prevent users from entering orders in the system, which is bad, but it could be worse! What if the order needs to be delivered a day earlier than that, on March 12, 2011? Now the user enters ‘12/03/2011’. Now the call to DateTime.Parse sees what it thinks is a valid date, but there’s just one problem: it’s not the right date. Now this order won’t get delivered until December 3, 2011. In my opinion, that kind of data corruption is a much bigger problem than having the Parse call fail. What To Do? My order entry example is a bit contrived, but I think it serves to illustrate the potential issues with accepting date input from users. There are some approaches you can take to make this easier on you and your users: Eliminate ambiguity by using a graphical date input control. I’m personally a fan of a jQuery UI Datepicker widget. It’s pretty easy to setup, can be themed to match the look and feel of your site, and has support for multiple languages and cultures. Be sure you have a way to track the culture preference of each user in your system. For a web application this could be done using something like a cookie or session state variable. Ensure that the current user’s culture is being applied correctly to DateTime formatting and parsing code. This can be accomplished by ensuring that each request has the handling thread’s CultureInfo set properly, or by using the Format and Parse method overloads that accept an IFormatProvider instance where the provided value is a CultureInfo object constructed using the current user’s culture preference. When in doubt, favor formats that are internationally recognizable. 
    The string ‘2010-03-05’ is likely to be recognized as March 5, 2010 by users from most (if not all) cultures. Favor standard date format strings over custom ones. So far we’ve only talked about turning a string into a DateTime, but most of the same “gotchas” apply when doing the opposite. Consider this code:

    someDateValue.ToString("MM/dd/yyyy");

    This will output the same string regardless of what the current thread’s culture is set to (with the exception of some cultures that don’t use the Gregorian calendar system, but that’s another issue altogether). For displaying dates to users, it would be better to do this:

    someDateValue.ToString("d");

    This standard format string of “d” will use the “short date format” as defined by the culture attached to the current thread (or provided in the IFormatProvider instance in the proper method overload). This means that it will honor the proper month/day/year, year/month/day, or day/month/year format for the culture.

    Knowing Your Audience

    The examples and suggestions shown above can go a long way toward getting an application in shape for dealing with date inputs from users in multiple cultures. There are some instances, however, where taking approaches like these would not be appropriate. In some cases, the provider or consumer of date values that pass through your application are not people, but other applications (or other portions of your own application). For example, if your site has a page that accepts a date as a query string parameter, you’ll probably want to format that date using an invariant date format. Otherwise, the same URL could end up evaluating to a different page depending on the user that is viewing it. In addition, if your application exports data for consumption by other systems, it’s best to have an agreed-upon format that all systems can use and that will not vary depending upon whether or not the users of the systems on either side prefer a month/day/year or day/month/year format. I’ll look more at some approaches for dealing with these situations in a future post. If you take away one thing from this post, make it an understanding of the importance of knowing where the dates that pass through your system come from and are going to. You will likely want to vary your parsing and formatting approach depending on your audience.
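    As a rough illustration of the advice above (my own sketch, not code from the post), assume the application has already looked up the current user’s preferred CultureInfo somewhere; user-facing text can then use the culture-aware "d" format while machine-facing values such as query string parameters stick to a fixed, ISO 8601-style pattern:

    using System;
    using System.Globalization;

    class AudienceAwareDates
    {
        // 'userCulture' stands in for whatever culture preference your
        // application has stored for the current user (an assumption here).
        static string ForUser(DateTime value, CultureInfo userCulture)
        {
            // "d" honors the culture's own short date pattern
            // (month/day/year, day/month/year, or year/month/day).
            return value.ToString("d", userCulture);
        }

        static string ForQueryString(DateTime value)
        {
            // A fixed, culture-independent format so the same URL means the
            // same date for every user.
            return value.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture);
        }

        static DateTime FromQueryString(string text)
        {
            return DateTime.ParseExact(text, "yyyy-MM-dd", CultureInfo.InvariantCulture);
        }

        static void Main()
        {
            var date = new DateTime(2011, 3, 12);
            Console.WriteLine(ForUser(date, new CultureInfo("en-US"))); // 3/12/2011
            Console.WriteLine(ForUser(date, new CultureInfo("en-GB"))); // 12/03/2011
            Console.WriteLine(ForQueryString(date));                    // 2011-03-12
            Console.WriteLine(FromQueryString("2011-03-12"));           // parses back unambiguously
        }
    }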

    Read the article

  • SSL / HTTP / No Response to Curl

    - by Alex McHale
    I am trying to send commands to a SOAP service, and getting nothing in reply. The SOAP service is at a completely separate site from either server I am testing with. I have written a dummy script with the SOAP XML embedded. When I run it at my local site, on any of three machines -- OSX, Ubuntu, or CentOS 5.3 -- it completes successfully with a good response. I then sent the script to our public host at Slicehost, where I fail to get the response back from the SOAP service. It accepts the TCP socket and proceeds with the SSL handshake. I do not however receive any valid HTTP response. This is the case whether I use my script or curl on the command line. I have rewritten the script using SOAP4R, Net::HTTP and Curb. All of which work at my local site, none of which work at the Slicehost site. I have tried to assemble the CentOS box as closely to match my Slicehost server as possible. I rebuilt the Slice to be a stock CentOS 5.3 and stock CentOS 5.4 with the same results. When I look at a tcpdump of the bad sessions on Slicehost, I see my script or curl send the XML to the remote server, and nothing comes back. When I look at the tcpdump at my local site, I see the response just fine. I have entirely disabled iptables on the Slice. Does anyone have any ideas what could be causing these results? Please let me know what additional information I can furnish. Thank you! Below is a wire trace of a sample session. The IP that starts with 173 is my server while the IP that starts with 12 is the SOAP server's. No. Time Source Destination Protocol Info 1 0.000000 173.45.x.x 12.36.x.x TCP 36872 > https [SYN] Seq=0 Win=5840 Len=0 MSS=1460 TSV=137633469 TSER=0 WS=6 Frame 1 (74 bytes on wire, 74 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 0, Len: 0 No. Time Source Destination Protocol Info 2 0.040000 12.36.x.x 173.45.x.x TCP https > 36872 [SYN, ACK] Seq=0 Ack=1 Win=8760 Len=0 MSS=1460 Frame 2 (62 bytes on wire, 62 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 0, Ack: 1, Len: 0 No. Time Source Destination Protocol Info 3 0.040000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=1 Ack=1 Win=5840 Len=0 Frame 3 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 1, Ack: 1, Len: 0 No. Time Source Destination Protocol Info 4 0.050000 173.45.x.x 12.36.x.x SSLv2 Client Hello Frame 4 (156 bytes on wire, 156 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 1, Ack: 1, Len: 102 Secure Socket Layer No. 
Time Source Destination Protocol Info 5 0.130000 12.36.x.x 173.45.x.x TCP [TCP segment of a reassembled PDU] Frame 5 (1434 bytes on wire, 1434 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 1, Ack: 103, Len: 1380 Secure Socket Layer No. Time Source Destination Protocol Info 6 0.130000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=103 Ack=1381 Win=8280 Len=0 Frame 6 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 1381, Len: 0 No. Time Source Destination Protocol Info 7 0.130000 12.36.x.x 173.45.x.x TLSv1 Server Hello, Certificate, Server Hello Done Frame 7 (1280 bytes on wire, 1280 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 1381, Ack: 103, Len: 1226 [Reassembled TCP Segments (2606 bytes): #5(1380), #7(1226)] Secure Socket Layer No. Time Source Destination Protocol Info 8 0.130000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=103 Ack=2607 Win=11040 Len=0 Frame 8 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 2607, Len: 0 No. Time Source Destination Protocol Info 9 0.130000 173.45.x.x 12.36.x.x TLSv1 Client Key Exchange, Change Cipher Spec, Encrypted Handshake Message Frame 9 (236 bytes on wire, 236 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 2607, Len: 182 Secure Socket Layer No. Time Source Destination Protocol Info 10 0.190000 12.36.x.x 173.45.x.x TLSv1 Change Cipher Spec, Encrypted Handshake Message Frame 10 (97 bytes on wire, 97 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2607, Ack: 285, Len: 43 Secure Socket Layer No. Time Source Destination Protocol Info 11 0.190000 173.45.x.x 12.36.x.x TLSv1 Application Data Frame 11 (347 bytes on wire, 347 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 285, Ack: 2650, Len: 293 Secure Socket Layer No. 
Time Source Destination Protocol Info 12 0.190000 173.45.x.x 12.36.x.x TCP [TCP segment of a reassembled PDU] Frame 12 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 13 0.450000 12.36.x.x 173.45.x.x TCP https > 36872 [ACK] Seq=2650 Ack=578 Win=64958 Len=0 Frame 13 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2650, Ack: 578, Len: 0 No. Time Source Destination Protocol Info 14 0.450000 173.45.x.x 12.36.x.x TCP [TCP segment of a reassembled PDU] Frame 14 (206 bytes on wire, 206 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 2038, Ack: 2650, Len: 152 No. Time Source Destination Protocol Info 15 0.510000 12.36.x.x 173.45.x.x TCP [TCP Dup ACK 13#1] https > 36872 [ACK] Seq=2650 Ack=578 Win=64958 Len=0 Frame 15 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2650, Ack: 578, Len: 0 No. Time Source Destination Protocol Info 16 0.850000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 16 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 17 1.650000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 17 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 18 3.250000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 18 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. 
Time Source Destination Protocol Info 19 6.450000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 19 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer

    Read the article

  • Error when reloading supervisord: unix:///tmp/supervisor.sock no such file

    - by Yarin
    I'm running supervisord on my CentOS 6 box like so:

/usr/bin/supervisord -c /etc/supervisord.conf

When I launch supervisorctl, all process statuses are fine, but if I try to reload using supervisorctl I get:

unix:///tmp/supervisor.sock no such file

I'm using the same config file I've used successfully on other boxes, and I'm running everything as root. I can't understand what the problem is... Config file:

; Sample supervisor config file.

[unix_http_server]
file=/tmp/supervisor.sock ; (the path to the socket file)
;chmod=0700 ; socket file mode (default 0700)
;chown=nobody:nogroup ; socket file uid:gid owner
;username=user ; (default is no username (open server))
;password=123 ; (default is no password (open server))

;[inet_http_server] ; inet (TCP) server disabled by default
;port=127.0.0.1:9001 ; (ip_address:port specifier, *:port for all iface)
;username=user ; (default is no username (open server))
;password=123 ; (default is no password (open server))

[supervisord]
logfile=/tmp/supervisord.log ; (main log file;default $CWD/supervisord.log)
logfile_maxbytes=50MB ; (max main logfile bytes b4 rotation;default 50MB)
logfile_backups=10 ; (num of main logfile rotation backups;default 10)
loglevel=info ; (log level;default info; others: debug,warn,trace)
pidfile=/tmp/supervisord.pid ; (supervisord pidfile;default supervisord.pid)
nodaemon=false ; (start in foreground if true;default false)
minfds=1024 ; (min. avail startup file descriptors;default 1024)
minprocs=200 ; (min. avail process descriptors;default 200)
;umask=022 ; (process file creation umask;default 022)
;user=chrism ; (default is current user, required if root)
;identifier=supervisor ; (supervisord identifier, default is 'supervisor')
;directory=/tmp ; (default is not to cd during start)
;nocleanup=true ; (don't clean up tempfiles at start;default false)
;childlogdir=/tmp ; ('AUTO' child log dir, default $TEMP)
;environment=KEY=value ; (key value pairs to add to environment)
;strip_ansi=false ; (strip ansi escape codes in logs; def. false)

; the below section must remain in the config file for RPC
; (supervisorctl/web interface) to work, additional interfaces may be
; added by defining them in separate rpcinterface: sections
[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface

[supervisorctl]
serverurl=unix:///tmp/supervisor.sock ; use a unix:// URL for a unix socket
;serverurl=http://127.0.0.1:9001 ; use an http:// url to specify an inet socket
;username=chris ; should be same as http_username if set
;password=123 ; should be same as http_password if set
;prompt=mysupervisor ; cmd line prompt (default "supervisor")
;history_file=~/.sc_history ; use readline history if available

; The below sample program section shows all possible program subsection values,
; create one or more 'real' program: sections to be able to control them under
; supervisor.
;[program:foo] ;command=/bin/cat [program:embed_scheduler] command=/opt/web-apps/mywebsite/custom_process.py process_name=%(program_name)s_%(process_num)d numprocs=3 ;[program:theprogramname] ;command=/bin/cat ; the program (relative uses PATH, can take args) ;process_name=%(program_name)s ; process_name expr (default %(program_name)s) ;numprocs=1 ; number of processes copies to start (def 1) ;directory=/tmp ; directory to cwd to before exec (def no cwd) ;umask=022 ; umask for process (default None) ;priority=999 ; the relative start priority (default 999) ;autostart=true ; start at supervisord start (default: true) ;autorestart=unexpected ; whether/when to restart (default: unexpected) ;startsecs=1 ; number of secs prog must stay running (def. 1) ;startretries=3 ; max # of serial start failures (default 3) ;exitcodes=0,2 ; 'expected' exit codes for process (default 0,2) ;stopsignal=QUIT ; signal used to kill process (default TERM) ;stopwaitsecs=10 ; max num secs to wait b4 SIGKILL (default 10) ;killasgroup=false ; SIGKILL the UNIX process group (def false) ;user=chrism ; setuid to this UNIX account to run the program ;redirect_stderr=true ; redirect proc stderr to stdout (default false) ;stdout_logfile=/a/path ; stdout log path, NONE for none; default AUTO ;stdout_logfile_maxbytes=1MB ; max # logfile bytes b4 rotation (default 50MB) ;stdout_logfile_backups=10 ; # of stdout logfile backups (default 10) ;stdout_capture_maxbytes=1MB ; number of bytes in 'capturemode' (default 0) ;stdout_events_enabled=false ; emit events on stdout writes (default false) ;stderr_logfile=/a/path ; stderr log path, NONE for none; default AUTO ;stderr_logfile_maxbytes=1MB ; max # logfile bytes b4 rotation (default 50MB) ;stderr_logfile_backups=10 ; # of stderr logfile backups (default 10) ;stderr_capture_maxbytes=1MB ; number of bytes in 'capturemode' (default 0) ;stderr_events_enabled=false ; emit events on stderr writes (default false) ;environment=A=1,B=2 ; process environment additions (def no adds) ;serverurl=AUTO ; override serverurl computation (childutils) ; The below sample eventlistener section shows all possible ; eventlistener subsection values, create one or more 'real' ; eventlistener: sections to be able to handle event notifications ; sent by supervisor. ;[eventlistener:theeventlistenername] ;command=/bin/eventlistener ; the program (relative uses PATH, can take args) ;process_name=%(program_name)s ; process_name expr (default %(program_name)s) ;numprocs=1 ; number of processes copies to start (def 1) ;events=EVENT ; event notif. types to subscribe to (req'd) ;buffer_size=10 ; event buffer queue size (default 10) ;directory=/tmp ; directory to cwd to before exec (def no cwd) ;umask=022 ; umask for process (default None) ;priority=-1 ; the relative start priority (default -1) ;autostart=true ; start at supervisord start (default: true) ;autorestart=unexpected ; whether/when to restart (default: unexpected) ;startsecs=1 ; number of secs prog must stay running (def. 
1) ;startretries=3 ; max # of serial start failures (default 3) ;exitcodes=0,2 ; 'expected' exit codes for process (default 0,2) ;stopsignal=QUIT ; signal used to kill process (default TERM) ;stopwaitsecs=10 ; max num secs to wait b4 SIGKILL (default 10) ;killasgroup=false ; SIGKILL the UNIX process group (def false) ;user=chrism ; setuid to this UNIX account to run the program ;redirect_stderr=true ; redirect proc stderr to stdout (default false) ;stdout_logfile=/a/path ; stdout log path, NONE for none; default AUTO ;stdout_logfile_maxbytes=1MB ; max # logfile bytes b4 rotation (default 50MB) ;stdout_logfile_backups=10 ; # of stdout logfile backups (default 10) ;stdout_events_enabled=false ; emit events on stdout writes (default false) ;stderr_logfile=/a/path ; stderr log path, NONE for none; default AUTO ;stderr_logfile_maxbytes=1MB ; max # logfile bytes b4 rotation (default 50MB) ;stderr_logfile_backups ; # of stderr logfile backups (default 10) ;stderr_events_enabled=false ; emit events on stderr writes (default false) ;environment=A=1,B=2 ; process environment additions ;serverurl=AUTO ; override serverurl computation (childutils) ; The below sample group section shows all possible group values, ; create one or more 'real' group: sections to create "heterogeneous" ; process groups. ;[group:thegroupname] ;programs=progname1,progname2 ; each refers to 'x' in [program:x] definitions ;priority=999 ; the relative start priority (default 999) ; The [include] section can just contain the "files" setting. This ; setting can list multiple files (separated by whitespace or ; newlines). It can also contain wildcards. The filenames are ; interpreted as relative to this file. Included files *cannot* ; include files themselves. ;[include] ;files = relative/directory/*.ini

    Read the article

  • UAC being turned off once a day on Windows 7

    - by Mehper C. Palavuzlar
    I have strange problem on my HP laptop. This began to happen recently. Whenever I start my machine, Windows 7 Action Center displays the following warning: You need to restart your computer for UAC to be turned off. Actually, this does not happen if it happened once on a specific day. For example, when I start the machine in the morning, it shows up; but it never shows up in the subsequent restarts within that day. On the next day, the same thing happens again. I never disable UAC, but obviously some rootkit or virus causes this. As soon as I get this warning, I head for the UAC settings, and re-enable UAC to dismiss this warning. This is a bothersome situation as I can't fix it. First, I have run a full scan on the computer for any probable virus and malware/rootkit activity, but TrendMicro OfficeScan said that no viruses have been found. I went to an old Restore Point using Windows System Restore, but the problem was not solved. What I have tried so far (which couldn't find the rootkit): TrendMicro OfficeScan Antivirus AVAST Malwarebytes' Anti-malware Ad-Aware Vipre Antivirus GMER TDSSKiller (Kaspersky Labs) HiJackThis RegRuns UnHackMe SuperAntiSpyware Portable Tizer Rootkit Razor (*) Sophos Anti-Rootkit SpyHunter 4 There are no other strange activities on the machine. Everything works fine except this bizarre incident. What could be the name of this annoying rootkit? How can I detect and remove it? EDIT: Below is the log file generated by HijackThis: Logfile of Trend Micro HijackThis v2.0.4 Scan saved at 13:07:04, on 17.01.2011 Platform: Windows 7 (WinNT 6.00.3504) MSIE: Internet Explorer v8.00 (8.00.7600.16700) Boot mode: Normal Running processes: C:\Windows\system32\taskhost.exe C:\Windows\system32\Dwm.exe C:\Windows\Explorer.EXE C:\Program Files\CheckPoint\SecuRemote\bin\SR_GUI.Exe C:\Windows\System32\igfxtray.exe C:\Windows\System32\hkcmd.exe C:\Windows\system32\igfxsrvc.exe C:\Windows\System32\igfxpers.exe C:\Program Files\Hewlett-Packard\HP Wireless Assistant\HPWAMain.exe C:\Program Files\Synaptics\SynTP\SynTPEnh.exe C:\Program Files\Hewlett-Packard\HP Quick Launch Buttons\QLBCTRL.exe C:\Program Files\Analog Devices\Core\smax4pnp.exe C:\Program Files\Hewlett-Packard\HP Quick Launch Buttons\VolCtrl.exe C:\Program Files\LightningFAX\LFclient\lfsndmng.exe C:\Program Files\Common Files\Java\Java Update\jusched.exe C:\Program Files\Microsoft Office Communicator\communicator.exe C:\Program Files\Iron Mountain\Connected BackupPC\Agent.exe C:\Program Files\Trend Micro\OfficeScan Client\PccNTMon.exe C:\Program Files\Microsoft LifeCam\LifeExp.exe C:\Program Files\Hewlett-Packard\Shared\HpqToaster.exe C:\Program Files\Windows Sidebar\sidebar.exe C:\Program Files\mimio\mimio Studio\system\aps_tablet\atwtusb.exe C:\Program Files\Microsoft Office\Office12\OUTLOOK.EXE C:\Program Files\Babylon\Babylon-Pro\Babylon.exe C:\Program Files\Mozilla Firefox\firefox.exe C:\Users\userx\Desktop\HijackThis.exe R1 - HKCU\Software\Microsoft\Internet Explorer\Main,Search Page = http://go.microsoft.com/fwlink/?LinkId=54896 R0 - HKCU\Software\Microsoft\Internet Explorer\Main,Start Page = about:blank R1 - HKLM\Software\Microsoft\Internet Explorer\Main,Default_Page_URL = http://go.microsoft.com/fwlink/?LinkId=69157 R1 - HKLM\Software\Microsoft\Internet Explorer\Main,Default_Search_URL = http://go.microsoft.com/fwlink/?LinkId=54896 R1 - HKLM\Software\Microsoft\Internet Explorer\Main,Search Page = http://go.microsoft.com/fwlink/?LinkId=54896 R0 - HKLM\Software\Microsoft\Internet Explorer\Main,Start Page = 
http://go.microsoft.com/fwlink/?LinkId=69157 R0 - HKLM\Software\Microsoft\Internet Explorer\Search,SearchAssistant = R0 - HKLM\Software\Microsoft\Internet Explorer\Search,CustomizeSearch = R1 - HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings,AutoConfigURL = http://www.yaysat.com.tr/proxy/proxy.pac R0 - HKCU\Software\Microsoft\Internet Explorer\Toolbar,LinksFolderName = O2 - BHO: AcroIEHelperStub - {18DF081C-E8AD-4283-A596-FA578C2EBDC3} - C:\Program Files\Common Files\Adobe\Acrobat\ActiveX\AcroIEHelperShim.dll O2 - BHO: Babylon IE plugin - {9CFACCB6-2F3F-4177-94EA-0D2B72D384C1} - C:\Program Files\Babylon\Babylon-Pro\Utils\BabylonIEPI.dll O2 - BHO: Java(tm) Plug-In 2 SSV Helper - {DBC80044-A445-435b-BC74-9C25C1C588A9} - C:\Program Files\Java\jre6\bin\jp2ssv.dll O4 - HKLM\..\Run: [IgfxTray] C:\Windows\system32\igfxtray.exe O4 - HKLM\..\Run: [HotKeysCmds] C:\Windows\system32\hkcmd.exe O4 - HKLM\..\Run: [Persistence] C:\Windows\system32\igfxpers.exe O4 - HKLM\..\Run: [hpWirelessAssistant] C:\Program Files\Hewlett-Packard\HP Wireless Assistant\HPWAMain.exe O4 - HKLM\..\Run: [SynTPEnh] C:\Program Files\Synaptics\SynTP\SynTPEnh.exe O4 - HKLM\..\Run: [QlbCtrl.exe] C:\Program Files\Hewlett-Packard\HP Quick Launch Buttons\QlbCtrl.exe /Start O4 - HKLM\..\Run: [SoundMAXPnP] C:\Program Files\Analog Devices\Core\smax4pnp.exe O4 - HKLM\..\Run: [Adobe Reader Speed Launcher] "C:\Program Files\Adobe\Reader 9.0\Reader\Reader_sl.exe" O4 - HKLM\..\Run: [Adobe ARM] "C:\Program Files\Common Files\Adobe\ARM\1.0\AdobeARM.exe" O4 - HKLM\..\Run: [lfsndmng] C:\Program Files\LightningFAX\LFclient\LFSNDMNG.EXE O4 - HKLM\..\Run: [SunJavaUpdateSched] "C:\Program Files\Common Files\Java\Java Update\jusched.exe" O4 - HKLM\..\Run: [Communicator] "C:\Program Files\Microsoft Office Communicator\communicator.exe" /fromrunkey O4 - HKLM\..\Run: [AgentUiRunKey] "C:\Program Files\Iron Mountain\Connected BackupPC\Agent.exe" -ni -sss -e http://localhost:16386/ O4 - HKLM\..\Run: [OfficeScanNT Monitor] "C:\Program Files\Trend Micro\OfficeScan Client\pccntmon.exe" -HideWindow O4 - HKLM\..\Run: [Babylon Client] C:\Program Files\Babylon\Babylon-Pro\Babylon.exe -AutoStart O4 - HKLM\..\Run: [LifeCam] "C:\Program Files\Microsoft LifeCam\LifeExp.exe" O4 - HKCU\..\Run: [Sidebar] C:\Program Files\Windows Sidebar\sidebar.exe /autoRun O4 - Global Startup: mimio Studio.lnk = C:\Program Files\mimio\mimio Studio\mimiosys.exe O8 - Extra context menu item: Microsoft Excel'e &Ver - res://C:\PROGRA~1\MICROS~1\Office12\EXCEL.EXE/3000 O8 - Extra context menu item: Translate this web page with Babylon - res://C:\Program Files\Babylon\Babylon-Pro\Utils\BabylonIEPI.dll/ActionTU.htm O8 - Extra context menu item: Translate with Babylon - res://C:\Program Files\Babylon\Babylon-Pro\Utils\BabylonIEPI.dll/Action.htm O9 - Extra button: Research - {92780B25-18CC-41C8-B9BE-3C9C571A8263} - C:\PROGRA~1\MICROS~1\Office12\REFIEBAR.DLL O9 - Extra button: Translate this web page with Babylon - {F72841F0-4EF1-4df5-BCE5-B3AC8ACF5478} - C:\Program Files\Babylon\Babylon-Pro\Utils\BabylonIEPI.dll O9 - Extra 'Tools' menuitem: Translate this web page with Babylon - {F72841F0-4EF1-4df5-BCE5-B3AC8ACF5478} - C:\Program Files\Babylon\Babylon-Pro\Utils\BabylonIEPI.dll O16 - DPF: {00134F72-5284-44F7-95A8-52A619F70751} (ObjWinNTCheck Class) - https://172.20.12.103:4343/officescan/console/html/ClientInstall/WinNTChk.cab O16 - DPF: {08D75BC1-D2B5-11D1-88FC-0080C859833B} (OfficeScan Corp Edition Web-Deployment SetupCtrl Class) - 
https://172.20.12.103:4343/officescan/console/html/ClientInstall/setup.cab O17 - HKLM\System\CCS\Services\Tcpip\Parameters: Domain = yaysat.com O17 - HKLM\Software\..\Telephony: DomainName = yaysat.com O17 - HKLM\System\CS1\Services\Tcpip\Parameters: Domain = yaysat.com O17 - HKLM\System\CS2\Services\Tcpip\Parameters: Domain = yaysat.com O18 - Protocol: qcom - {B8DBD265-42C3-43E6-B439-E968C71984C6} - C:\Program Files\Common Files\Quest Shared\CodeXpert\qcom.dll O22 - SharedTaskScheduler: FencesShellExt - {1984DD45-52CF-49cd-AB77-18F378FEA264} - C:\Program Files\Stardock\Fences\FencesMenu.dll O23 - Service: Andrea ADI Filters Service (AEADIFilters) - Andrea Electronics Corporation - C:\Windows\system32\AEADISRV.EXE O23 - Service: AgentService - Iron Mountain Incorporated - C:\Program Files\Iron Mountain\Connected BackupPC\AgentService.exe O23 - Service: Agere Modem Call Progress Audio (AgereModemAudio) - LSI Corporation - C:\Program Files\LSI SoftModem\agrsmsvc.exe O23 - Service: BMFMySQL - Unknown owner - C:\Program Files\Quest Software\Benchmark Factory for Databases\Repository\MySQL\bin\mysqld-max-nt.exe O23 - Service: Com4QLBEx - Hewlett-Packard Development Company, L.P. - C:\Program Files\Hewlett-Packard\HP Quick Launch Buttons\Com4QLBEx.exe O23 - Service: hpqwmiex - Hewlett-Packard Development Company, L.P. - C:\Program Files\Hewlett-Packard\Shared\hpqwmiex.exe O23 - Service: OfficeScanNT RealTime Scan (ntrtscan) - Trend Micro Inc. - C:\Program Files\Trend Micro\OfficeScan Client\ntrtscan.exe O23 - Service: SMS Task Sequence Agent (smstsmgr) - Unknown owner - C:\Windows\system32\CCM\TSManager.exe O23 - Service: Check Point VPN-1 Securemote service (SR_Service) - Check Point Software Technologies - C:\Program Files\CheckPoint\SecuRemote\bin\SR_Service.exe O23 - Service: Check Point VPN-1 Securemote watchdog (SR_Watchdog) - Check Point Software Technologies - C:\Program Files\CheckPoint\SecuRemote\bin\SR_Watchdog.exe O23 - Service: Trend Micro Unauthorized Change Prevention Service (TMBMServer) - Trend Micro Inc. - C:\Program Files\Trend Micro\OfficeScan Client\..\BM\TMBMSRV.exe O23 - Service: OfficeScan NT Listener (tmlisten) - Trend Micro Inc. - C:\Program Files\Trend Micro\OfficeScan Client\tmlisten.exe O23 - Service: OfficeScan NT Proxy Service (TmProxy) - Trend Micro Inc. - C:\Program Files\Trend Micro\OfficeScan Client\TmProxy.exe O23 - Service: VNC Server Version 4 (WinVNC4) - RealVNC Ltd. - C:\Program Files\RealVNC\VNC4\WinVNC4.exe -- End of file - 8204 bytes As suggested in this very similar question, I have run full scans (+boot time scans) with RegRun and UnHackMe, but they also did not find anything. I have carefully examined all entries in the Event Viewer, but there's nothing wrong. Now I know that there is a hidden trojan (rootkit) on my machine which seems to disguise itself quite successfully. Note that I don't have the chance to remove the HDD, or reinstall the OS as this is a work machine subjected to certain IT policies on a company domain. Despite all my attempts, the problem still remains. I strictly need a to-the-point method or a pukka rootkit remover to remove whatever it is. I don't want to monkey with the system settings, i.e. disabling auto runs one by one, messing the registry, etc. EDIT 2: I have found an article which is closely related to my trouble: Malware can turn off UAC in Windows 7; “By design” says Microsoft. Special thanks(!) to Microsoft. 
In the article, a VBScript snippet is given that disables UAC automatically:

'// 1337H4x Written by _____________
'// (12 year old)
Set WshShell = WScript.CreateObject("WScript.Shell")
'// Toggle Start menu
WshShell.SendKeys("^{ESC}")
WScript.Sleep(500)
'// Search for UAC applet
WshShell.SendKeys("change uac")
WScript.Sleep(2000)
'// Open the applet (assuming second result)
WshShell.SendKeys("{DOWN}")
WshShell.SendKeys("{DOWN}")
WshShell.SendKeys("{ENTER}")
WScript.Sleep(2000)
'// Set UAC level to lowest (assuming out-of-box Default setting)
WshShell.SendKeys("{TAB}")
WshShell.SendKeys("{DOWN}")
WshShell.SendKeys("{DOWN}")
WshShell.SendKeys("{DOWN}")
'// Save our changes
WshShell.SendKeys("{TAB}")
WshShell.SendKeys("{ENTER}")
'// TODO: Add code to handle installation of rebound
'// process to continue exploitation, i.e. place something
'// evil in Startup folder
'// Reboot the system
'// WshShell.Run "shutdown /r /f"

Unfortunately, that doesn't tell me how I can get rid of the malicious code running on my system.

EDIT 3: Last night, I left the laptop running because of a long SQL task. When I came in in the morning, UAC was turned off again. So I suspect the problem is not related to startup; it happens once a day, whether or not the machine is rebooted.

    Read the article

  • FreeBSD 8.1 unstable network connection

    - by frankcheong
    I have three FreeBSD 8.1 servers running on three different pieces of hardware, and therefore with three different network adapters (bce, bge and igb). The network connection is unstable: when I scp a 10MB file, the transfer does not always complete successfully. I checked with my network admin and he claims the problem is caused by the network driver not coping with the load; when he pings with a huge packet size (around 15k), my server drops packets consistently at regular intervals. I don't think that explanation holds, since the three servers use three different network adapters and therefore three different drivers, and it seems very unlikely that the same problem would be caused by all three. Since then I have tried to tune performance by playing with the /etc/sysctl.conf settings, with no luck:

kern.ipc.somaxconn=1024
kern.ipc.shmall=3276800
kern.ipc.shmmax=1638400000
# Security
net.inet.ip.redirect=0
net.inet.ip.sourceroute=0
net.inet.ip.accept_sourceroute=0
net.inet.icmp.maskrepl=0
net.inet.icmp.log_redirect=0
net.inet.icmp.drop_redirect=1
net.inet.tcp.drop_synfin=1
# Security
net.inet.udp.blackhole=1
net.inet.tcp.blackhole=2
# Required by pf
net.inet.ip.forwarding=1
#Network Performance Tuning
kern.ipc.maxsockbuf=16777216
net.inet.tcp.rfc1323=1
net.inet.tcp.sendbuf_max=16777216
net.inet.tcp.recvbuf_max=16777216
# Setting specifically for 1 or even 10Gbps network
net.local.stream.sendspace=262144
net.local.stream.recvspace=262144
net.inet.tcp.local_slowstart_flightsize=10
net.inet.tcp.nolocaltimewait=1
net.inet.tcp.mssdflt=1460
net.inet.tcp.sendbuf_auto=1
net.inet.tcp.sendbuf_inc=16384
net.inet.tcp.recvbuf_auto=1
net.inet.tcp.recvbuf_inc=524288
net.inet.tcp.sendspace=262144
net.inet.tcp.recvspace=262144
net.inet.udp.recvspace=262144
kern.ipc.maxsockbuf=16777216
kern.ipc.nmbclusters=32768
net.inet.tcp.delayed_ack=1
net.inet.tcp.delacktime=100
net.inet.tcp.slowstart_flightsize=179
net.inet.tcp.inflight.enable=1
net.inet.tcp.inflight.min=6144
# Reduce the cache size of slow start connection
net.inet.tcp.hostcache.expire=1

Our network admin also claims they see a lot of link up/down events in their Cisco switch logs, while I cannot find any up/down messages in dmesg. I have also checked netstat -s but can't draw a concrete conclusion from it:

tcp: 133695291 packets sent 39408539 data packets (3358837321 bytes) 61868 data packets (89472844 bytes) retransmitted 24 data packets unnecessarily retransmitted 0 resends initiated by MTU discovery 50756141 ack-only packets (2148 delayed) 0 URG only packets 0 window probe packets 4372385 window update packets 39781869 control packets 134898031 packets received 72339403 acks (for 3357601899 bytes) 190712 duplicate acks 0 acks for unsent data 59339201 packets (3647021974 bytes) received in-sequence 114 completely duplicate packets (135202 bytes) 27 old duplicate packets 0 packets with some dup. 
data (0 bytes duped) 42090 out-of-order packets (60817889 bytes) 0 packets (0 bytes) of data after window 0 window probes 3953896 window update packets 64181 packets received after close 0 discarded for bad checksums 0 discarded for bad header offset fields 0 discarded because packet too short 45192 discarded due to memory problems 19945391 connection requests 1323420 connection accepts 0 bad connection attempts 0 listen queue overflows 0 ignored RSTs in the windows 21133581 connections established (including accepts) 21268724 connections closed (including 32737 drops) 207874 connections updated cached RTT on close 207874 connections updated cached RTT variance on close 132439 connections updated cached ssthresh on close 42392 embryonic connections dropped 72339338 segments updated rtt (of 69477829 attempts) 390871 retransmit timeouts 0 connections dropped by rexmit timeout 0 persist timeouts 0 connections dropped by persist timeout 0 Connections (fin_wait_2) dropped because of timeout 13990 keepalive timeouts 2 keepalive probes sent 13988 connections dropped by keepalive 173044 correct ACK header predictions 36947371 correct data packet header predictions 1323420 syncache entries added 0 retransmitted 0 dupsyn 0 dropped 1323420 completed 0 bucket overflow 0 cache overflow 0 reset 0 stale 0 aborted 0 badack 0 unreach 0 zone failures 1323420 cookies sent 0 cookies received 1864 SACK recovery episodes 18005 segment rexmits in SACK recovery episodes 26066896 byte rexmits in SACK recovery episodes 147327 SACK options (SACK blocks) received 87473 SACK options (SACK blocks) sent 0 SACK scoreboard overflow 0 packets with ECN CE bit set 0 packets with ECN ECT(0) bit set 0 packets with ECN ECT(1) bit set 0 successful ECN handshakes 0 times ECN reduced the congestion window udp: 5141258 datagrams received 0 with incomplete header 0 with bad data length field 0 with bad checksum 1 with no checksum 0 dropped due to no socket 129616 broadcast/multicast datagrams undelivered 0 dropped due to full socket buffers 0 not for hashed pcb 5011642 delivered 5016050 datagrams output 0 times multicast source filter matched sctp: 0 input packets 0 datagrams 0 packets that had data 0 input SACK chunks 0 input DATA chunks 0 duplicate DATA chunks 0 input HB chunks 0 HB-ACK chunks 0 input ECNE chunks 0 input AUTH chunks 0 chunks missing AUTH 0 invalid HMAC ids received 0 invalid secret ids received 0 auth failed 0 fast path receives all one chunk 0 fast path multi-part data 0 output packets 0 output SACKs 0 output DATA chunks 0 retransmitted DATA chunks 0 fast retransmitted DATA chunks 0 FR's that happened more than once to same chunk 0 intput HB chunks 0 output ECNE chunks 0 output AUTH chunks 0 ip_output error counter Packet drop statistics: 0 from middle box 0 from end host 0 with data 0 non-data, non-endhost 0 non-endhost, bandwidth rep only 0 not enough for chunk header 0 not enough data to confirm 0 where process_chunk_drop said break 0 failed to find TSN 0 attempt reverse TSN lookup 0 e-host confirms zero-rwnd 0 midbox confirms no space 0 data did not match TSN 0 TSN's marked for Fast Retran Timeouts: 0 iterator timers fired 0 T3 data time outs 0 window probe (T3) timers fired 0 INIT timers fired 0 sack timers fired 0 shutdown timers fired 0 heartbeat timers fired 0 a cookie timeout fired 0 an endpoint changed its cookiesecret 0 PMTU timers fired 0 shutdown ack timers fired 0 shutdown guard timers fired 0 stream reset timers fired 0 early FR timers fired 0 an asconf timer fired 0 auto close timer fired 0 asoc 
free timers expired 0 inp free timers expired 0 packet shorter than header 0 checksum error 0 no endpoint for port 0 bad v-tag 0 bad SID 0 no memory 0 number of multiple FR in a RTT window 0 RFC813 allowed sending 0 RFC813 does not allow sending 0 times max burst prohibited sending 0 look ahead tells us no memory in interface 0 numbers of window probes sent 0 times an output error to clamp down on next user send 0 times sctp_senderrors were caused from a user 0 number of in data drops due to chunk limit reached 0 number of in data drops due to rwnd limit reached 0 times a ECN reduced the cwnd 0 used express lookup via vtag 0 collision in express lookup 0 times the sender ran dry of user data on primary 0 same for above 0 sacks the slow way 0 window update only sacks sent 0 sends with sinfo_flags !=0 0 unordered sends 0 sends with EOF flag set 0 sends with ABORT flag set 0 times protocol drain called 0 times we did a protocol drain 0 times recv was called with peek 0 cached chunks used 0 cached stream oq's used 0 unread messages abandonded by close 0 send burst avoidance, already max burst inflight to net 0 send cwnd full avoidance, already max burst inflight to net 0 number of map array over-runs via fwd-tsn's ip: 137814085 total packets received 0 bad header checksums 0 with size smaller than minimum 0 with data size < data length 0 with ip length > max ip packet size 0 with header length < data size 0 with data length < header length 0 with bad options 0 with incorrect version number 1200 fragments received 0 fragments dropped (dup or out of space) 0 fragments dropped after timeout 300 packets reassembled ok 137813009 packets for this host 530 packets for unknown/unsupported protocol 0 packets forwarded (0 packets fast forwarded) 61 packets not forwardable 0 packets received for unknown multicast group 0 redirects sent 137234598 packets sent from this host 0 packets sent with fabricated ip header 685307 output packets dropped due to no bufs, etc. 
52 output packets discarded due to no route 300 output datagrams fragmented 1200 fragments created 0 datagrams that can't be fragmented 0 tunneling packets that can't find gif 0 datagrams with bad address in header icmp: 0 calls to icmp_error 0 errors not generated in response to an icmp message Output histogram: echo reply: 305 0 messages with bad code fields 0 messages less than the minimum length 0 messages with bad checksum 0 messages with bad length 0 multicast echo requests ignored 0 multicast timestamp requests ignored Input histogram: destination unreachable: 530 echo: 305 305 message responses generated 0 invalid return addresses 0 no return routes ICMP address mask responses are disabled igmp: 0 messages received 0 messages received with too few bytes 0 messages received with wrong TTL 0 messages received with bad checksum 0 V1/V2 membership queries received 0 V3 membership queries received 0 membership queries received with invalid field(s) 0 general queries received 0 group queries received 0 group-source queries received 0 group-source queries dropped 0 membership reports received 0 membership reports received with invalid field(s) 0 membership reports received for groups to which we belong 0 V3 reports received without Router Alert 0 membership reports sent arp: 376748 ARP requests sent 3207 ARP replies sent 245245 ARP requests received 80845 ARP replies received 326090 ARP packets received 267712 total packets dropped due to no ARP entry 108876 ARP entrys timed out 0 Duplicate IPs seen ip6: 2226633 total packets received 0 with size smaller than minimum 0 with data size < data length 0 with bad options 0 with incorrect version number 0 fragments received 0 fragments dropped (dup or out of space) 0 fragments dropped after timeout 0 fragments that exceeded limit 0 packets reassembled ok 2226633 packets for this host 0 packets forwarded 0 packets not forwardable 0 redirects sent 2226633 packets sent from this host 0 packets sent with fabricated ip header 0 output packets dropped due to no bufs, etc. 
8 output packets discarded due to no route 0 output datagrams fragmented 0 fragments created 0 datagrams that can't be fragmented 0 packets that violated scope rules 0 multicast packets which we don't join Input histogram: UDP: 2226633 Mbuf statistics: 962679 one mbuf 1263954 one ext mbuf 0 two or more ext mbuf 0 packets whose headers are not continuous 0 tunneling packets that can't find gif 0 packets discarded because of too many headers 0 failures of source address selection Source addresses selection rule applied: icmp6: 0 calls to icmp6_error 0 errors not generated in response to an icmp6 message 0 errors not generated because of rate limitation 0 messages with bad code fields 0 messages < minimum length 0 bad checksums 0 messages with bad length Histogram of error messages to be generated: 0 no route 0 administratively prohibited 0 beyond scope 0 address unreachable 0 port unreachable 0 packet too big 0 time exceed transit 0 time exceed reassembly 0 erroneous header field 0 unrecognized next header 0 unrecognized option 0 redirect 0 unknown 0 message responses generated 0 messages with too many ND options 0 messages with bad ND options 0 bad neighbor solicitation messages 0 bad neighbor advertisement messages 0 bad router solicitation messages 0 bad router advertisement messages 0 bad redirect messages 0 path MTU changes rip6: 0 messages received 0 checksum calculations on inbound 0 messages with bad checksum 0 messages dropped due to no socket 0 multicast messages dropped due to no socket 0 messages dropped due to full socket buffers 0 delivered 0 datagrams output netstat -m 516/5124/5640 mbufs in use (current/cache/total) 512/1634/2146/32768 mbuf clusters in use (current/cache/total/max) 512/1536 mbuf+clusters out of packet secondary zone in use (current/cache) 0/1303/1303/12800 4k (page size) jumbo clusters in use (current/cache/total/max) 0/0/0/6400 9k jumbo clusters in use (current/cache/total/max) 0/0/0/3200 16k jumbo clusters in use (current/cache/total/max) 1153K/9761K/10914K bytes allocated to network (current/cache/total) 0/0/0 requests for mbufs denied (mbufs/clusters/mbuf+clusters) 0/0/0 requests for jumbo clusters denied (4k/9k/16k) 0/8/6656 sfbufs in use (current/peak/max) 0 requests for sfbufs denied 0 requests for sfbufs delayed 0 requests for I/O initiated by sendfile 0 calls to protocol drain routines Anyone got an idea what might be the possible cause?

    Read the article

  • Building applications with WPF, MVVM and Prism (aka CAG)

    - by skjagini
    In this article I am going to walk through an application using WPF and Prism (aka Composite Application Guidance, CAG) which simulates engaging a taxi (cab). The rules are simple; the app has 3 screens:

- A login screen to authenticate the user
- An information screen
- A screen to engage the cab, roam around and calculate the total fare

Metered Rate of Fare
- The meter is required to be engaged when a cab is occupied by anyone
- $3.00 upon entry
- $0.35 for each additional unit
- The unit fare is: one-fifth of a mile when the cab is traveling at 6 miles an hour or more; or 60 seconds when not in motion or traveling at less than 12 miles per hour
- Night surcharge of $0.50 after 8:00 PM & before 6:00 AM
- Peak hour weekday surcharge of $1.00 Monday - Friday after 4:00 PM & before 8:00 PM
- New York State tax surcharge of $0.50 per ride

Example: Friday (2010-10-08) 5:30pm, start at Lexington Ave & E 57th St, end at Irving Pl & E 15th St
- Start = $3.00
- Travels 2 miles at less than 6 mph for 15 minutes = $3.50
- Travels at more than 12 mph for 5 minutes = $1.75
- Peak hour weekday surcharge = $1.00 (ride started at 5:30 pm)
- New York State tax surcharge = $0.50
- Total fare = $9.75

Before we dive into the app, I would like to give a brief description of the frameworks. If you want to jump straight to the source code, scroll all the way to the end of the post.

MVVM
The MVVM pattern is in no way tied to the use of Prism in your application and is worth considering whenever you are using WPF, with or without Prism. If you are not familiar with MVVM: a typical UI is built by dropping some controls such as text boxes and a button onto a window, double-clicking the button to generate an event handler, calling a method from the business layer and updating the user interface. That works well enough for small-scale applications, but business logic ends up wrapped inside UI-specific code, which makes it hard to unit test and mock; MVVM solves exactly that problem. MVVM stands for Model (M) – View (V) – ViewModel (VM); based on how the three parties interact it could just as well have been called VVMM (View – ViewModel – Model), but MVVM echoes MVC (Model-View-Controller), hence the name. WPF lets you build user interfaces in XAML, and MVVM takes this further by allowing complete separation of the user interface from the business logic. In WPF every view has a DataContext property; when it is set to an instance of a class (which happens to be your view model), it supplies the data the view is interested in, i.e. the view interacts with the view model through DataContext. Sujith, if the view and the view model interact directly with each other, how does MVVM help me with separation of concerns? The catch is that DataContext is of type Object: because it is just an object, the view does not know the concrete type of the view model, which keeps views and view models loosely coupled. View models aggregate data from models (data access layer, services, etc.) and expose it to views through properties, methods, etc., i.e. view models interact with models. A minimal code sketch of this pattern appears just after the Prism overview below.

PRISM
Prism is provided by the Microsoft Patterns and Practices team; the source code can be downloaded from CodePlex, with samples and documentation on MSDN. The name composite implies composing the user interface from different modules (views) without direct dependencies on each other, again allowing loosely coupled development.
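To make the View, DataContext and ViewModel relationship concrete before moving on to Prism, here is a minimal sketch of the shape described above. It is illustrative only and not taken from the taxi sample; PersonViewModel and FullName are hypothetical names, and the taxi application uses its own ObservableBase base class for the same purpose.

    using System.ComponentModel;

    // A minimal view model: the view binds to FullName through its DataContext and is
    // notified whenever the property changes. No UI types are referenced, so the class
    // can be unit tested without instantiating any WPF controls.
    public class PersonViewModel : INotifyPropertyChanged
    {
        private string _fullName;

        public string FullName
        {
            get { return _fullName; }
            set { _fullName = value; RaisePropertyChanged("FullName"); }
        }

        public event PropertyChangedEventHandler PropertyChanged;

        protected void RaisePropertyChanged(string propertyName)
        {
            var handler = PropertyChanged;
            if (handler != null)
                handler(this, new PropertyChangedEventArgs(propertyName));
        }
    }

With this in place, setting the view's DataContext to a PersonViewModel instance is all that is needed for a binding such as Text="{Binding FullName}" to pick up changes.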
Well Sujith, I can already do that with user controls, why shall I learn another framework?  That’s correct, you can decouple using user controls, but you still have to manage some amount of coupling, like how to do you communicate between the controls, how do you subscribe/unsubscribe, loading/unloading views dynamically. Prism is not a replacement for user controls, provides the following features which greatly help in designing the composite applications. Dependency Injection (DI)/ Inversion of Control (IoC) Modules Regions Event Aggregator  Commands Simply put, MVVM helps building a single view and Prism helps building an application using the views There are other open source alternatives to Prism, like MVVMLight, Cinch, take a look at them as well. Lets dig into the source code.  1. Solution The solution is made of the following projects Framework: Holds the common functionality in building applications using WPF and Prism TaxiClient: Start up project, boot strapping and app styling TaxiCommon: Helps with the business logic TaxiModules: Holds the meat of the application with views and view models TaxiTests: To test the application 2. DI / IoC Dependency Injection (DI) as the name implies refers to injecting dependencies and Inversion of Control (IoC) means the calling code has no direct control on the dependencies, opposite of normal way of programming where dependencies are passed by caller, i.e inversion; aside from some differences in terminology the concept is same in both the cases. The idea behind DI/IoC pattern is to reduce the amount of direct coupling between different components of the application, the higher the dependency the more tightly coupled the application resulting in code which is hard to modify, unit test and mock.  Initializing Dependency Injection through BootStrapper TaxiClient is the starting project of the solution and App (App.xaml)  is the starting class that gets called when you run the application. From the App’s OnStartup method we will invoke BootStrapper.   namespace TaxiClient { /// <summary> /// Interaction logic for App.xaml /// </summary> public partial class App : Application { protected override void OnStartup(StartupEventArgs e) { base.OnStartup(e);   (new BootStrapper()).Run(); } } } BootStrapper is your contact point for initializing the application including dependency injection, creating Shell and other frameworks. We are going to use Unity for DI and there are lot of open source DI frameworks like Spring.Net, StructureMap etc with different feature set  and you can choose a framework based on your preferences. Note that Prism comes with in built support for Unity, for example we are deriving from UnityBootStrapper in our case and for any other DI framework you have to extend the Prism appropriately   namespace TaxiClient { public class BootStrapper: UnityBootstrapper { protected override IModuleCatalog CreateModuleCatalog() { return new ConfigurationModuleCatalog(); } protected override DependencyObject CreateShell() { Framework.FrameworkBootStrapper.Run(Container, Application.Current.Dispatcher);   Shell shell = new Shell(); shell.ResizeMode = ResizeMode.NoResize; shell.Show();   return shell; } } } Lets take a look into  FrameworkBootStrapper to check out how to register with unity container. 
namespace Framework { public class FrameworkBootStrapper { public static void Run(IUnityContainer container, Dispatcher dispatcher) { UIDispatcher uiDispatcher = new UIDispatcher(dispatcher); container.RegisterInstance<IDispatcherService>(uiDispatcher);   container.RegisterType<IInjectSingleViewService, InjectSingleViewService>( new ContainerControlledLifetimeManager());   . . . } } } In the above code we are registering two components with unity container. You shall observe that we are following two different approaches, RegisterInstance and RegisterType.  With RegisterInstance we are registering an existing instance and the same instance will be returned for every request made for IDispatcherService   and with RegisterType we are requesting unity container to create an instance for us when required, i.e., when I request for an instance for IInjectSingleViewService, unity will create/return an instance of InjectSingleViewService class and with RegisterType we can configure the life time of the instance being created. With ContaienrControllerLifetimeManager, the unity container caches the instance and reuses for any subsequent requests, without recreating a new instance. Lets take a look into FareViewModel.cs and it’s constructor. The constructor takes one parameter IEventAggregator and if you try to find all references in your solution for IEventAggregator, you will not find a single location where an instance of EventAggregator is passed directly to the constructor. The compiler still finds an instance and works fine because Prism is already configured when used with Unity container to return an instance of EventAggregator when requested for IEventAggregator and in this particular case it is called constructor injection. public class FareViewModel:ObservableBase, IDataErrorInfo { ... private IEventAggregator _eventAggregator;   public FareViewModel(IEventAggregator eventAggregator) { _eventAggregator = eventAggregator; InitializePropertyNames(); InitializeModel(); PropertyChanged += OnPropertyChanged; } ... 3. Shell Shells are very similar in operation to Master Pages in asp.net or MDI in Windows Forms. And shells contain regions which display the views, you can have as many regions as you wish in a given view. You can also nest regions. i.e, one region can load a view which in itself may contain other regions. We have to create a shell at the start of the application and are doing it by overriding CreateShell method from BootStrapper From the following Shell.xaml you shall notice that we have two content controls with Region names as ‘MenuRegion’ and ‘MainRegion’.  The idea here is that you can inject any user controls into the regions dynamically, i.e., a Menu User Control for MenuRegion and based on the user action you can load appropriate view into MainRegion.    
<Window x:Class="TaxiClient.Shell" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:Regions="clr-namespace:Microsoft.Practices.Prism.Regions;assembly=Microsoft.Practices.Prism" Title="Taxi" Height="370" Width="800"> <Grid Margin="2"> <ContentControl Regions:RegionManager.RegionName="MenuRegion" HorizontalAlignment="Stretch" VerticalAlignment="Stretch" HorizontalContentAlignment="Stretch" VerticalContentAlignment="Stretch" />   <ContentControl Grid.Row="1" Regions:RegionManager.RegionName="MainRegion" HorizontalAlignment="Stretch" VerticalAlignment="Stretch" HorizontalContentAlignment="Stretch" VerticalContentAlignment="Stretch" /> <!--<Border Grid.ColumnSpan="2" BorderThickness="2" CornerRadius="3" BorderBrush="LightBlue" />-->   </Grid> </Window> 4. Modules Prism provides the ability to build composite applications and modules play an important role in it. For example if you are building a Mortgage Loan Processor application with 3 components, i.e. customer’s credit history,  existing mortgages, new home/loan information; and consider that the customer’s credit history component involves gathering data about his/her address, background information, job details etc. The idea here using Prism modules is to separate the implementation of these 3 components into their own visual studio projects allowing to build components with no dependency on each other and independently. If we need to add another component to the application, the component can be developed by in house team or some other team in the organization by starting with a new Visual Studio project and adding to the solution at the run time with very little knowledge about the application. Prism modules are defined by implementing the IModule interface and each visual studio project to be considered as a module should implement the IModule interface.  From the BootStrapper.cs you shall observe that we are overriding the method by returning a ConfiguratingModuleCatalog which returns the modules that are registered for the application using the app.config file  and you can also add module using code. Lets take a look into configuration file.   <?xml version="1.0"?> <configuration> <configSections> <section name="modules" type="Microsoft.Practices.Prism.Modularity.ModulesConfigurationSection, Microsoft.Practices.Prism"/> </configSections> <modules> <module assemblyFile="TaxiModules.dll" moduleType="TaxiModules.ModuleInitializer, TaxiModules" moduleName="TaxiModules"/> </modules> </configuration> Here we are adding TaxiModules project to our solution and TaxiModules.ModuleInitializer implements IModule interface   5. Module Mapper With Prism modules you can dynamically add or remove modules from the regions, apart from that Prism also provides API to control adding/removing the views from a region within the same module. Taxi Information Screen: Engage the Taxi Screen: The sample application has two screens, ‘Taxi Information’ and ‘Engage the Taxi’ and they both reside in same module, TaxiModules. ‘Engage the Taxi’ is again made of two user controls, FareView on the left and TotalView on the right. We have created a Shell with two regions, MenuRegion and MainRegion with menu loaded into MenuRegion. We can create a wrapper user control called EngageTheTaxi made of FareView and TotalView and load either TaxiInfo or EngageTheTaxi into MainRegion based on the user action. 
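The config entry above points at TaxiModules.ModuleInitializer, whose body is not shown at this point in the article. As a rough, hypothetical sketch (not the sample's actual code), a Prism 4 module implements IModule and typically maps its views to named regions in Initialize; the MenuView type below is a placeholder of my own.

    using Microsoft.Practices.Prism.Modularity;
    using Microsoft.Practices.Prism.Regions;

    // Placeholder view so the sketch stands on its own; a real module would register
    // the views it actually defines.
    public class MenuView : System.Windows.Controls.UserControl { }

    // Hypothetical module initializer: Prism calls Initialize() after loading the
    // module, which is the usual place to map views to the region names the shell
    // declares (MenuRegion, MainRegion).
    public class ModuleInitializer : IModule
    {
        private readonly IRegionManager _regionManager;

        // IRegionManager is constructor-injected by the Unity container.
        public ModuleInitializer(IRegionManager regionManager)
        {
            _regionManager = regionManager;
        }

        public void Initialize()
        {
            _regionManager.RegisterViewWithRegion("MenuRegion", typeof(MenuView));
        }
    }

Registering views this way is what lets the shell stay ignorant of the concrete modules: it only declares region names, and each module decides what to put in them.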
Though it will work it tightly binds the user controls and for every combination of user controls, we need to create a dummy wrapper control to contain them. Instead we can apply the principles we learned so far from Shell/regions and introduce another template (LeftAndRightRegionView.xaml) made of two regions Region1 (left) and Region2 (right) and load  FareView and TotalView dynamically.  To help with loading of the views dynamically I have introduce an helper an interface, IInjectSingleViewService,  idea suggested by Mike Taulty, a must read blog for .Net developers. using System; using System.Collections.Generic; using System.ComponentModel;   namespace Framework.PresentationUtility.Navigation {   public interface IInjectSingleViewService : INotifyPropertyChanged { IEnumerable<CommandViewDefinition> Commands { get; } IEnumerable<ModuleViewDefinition> Modules { get; }   void RegisterViewForRegion(string commandName, string viewName, string regionName, Type viewType); void ClearViewFromRegion(string viewName, string regionName); void RegisterModule(string moduleName, IList<ModuleMapper> moduleMappers); } } The Interface declares three methods to work with views: RegisterViewForRegion: Registers a view with a particular region. You can register multiple views and their regions under one command.  When this particular command is invoked all the views registered under it will be loaded into their regions. ClearViewFromRegion: To unload a specific view from a region. RegisterModule: The idea is when a command is invoked you can load the UI with set of controls in their default position and based on the user interaction, you can load different contols in to different regions on the fly.  And it is supported ModuleViewDefinition and ModuleMappers as shown below. namespace Framework.PresentationUtility.Navigation { public class ModuleViewDefinition { public string ModuleName { get; set; } public IList<ModuleMapper> ModuleMappers; public ICommand Command { get; set; } }   public class ModuleMapper { public string ViewName { get; set; } public string RegionName { get; set; } public Type ViewType { get; set; } } } 6. Event Aggregator Prism event aggregator enables messaging between components as in Observable pattern, Notifier notifies the Observer which receives notification it is interested in. When it comes to Observable pattern, Observer has to unsubscribes for notifications when it no longer interested in notifications, which allows the Notifier to remove the Observer’s reference from it’s local cache. Though .Net has managed garbage collection it cannot remove inactive the instances referenced by an active instance resulting in memory leak, keeping the Observers in memory as long as Notifier stays in memory.  Developers have to be very careful to unsubscribe when necessary and it often gets overlooked, to overcome these problems Prism Event Aggregator uses weak references to cache the reference (Observer in this case)  and releases the reference (memory) once the instance goes out of scope. Using event aggregator is very simple, declare a generic type of CompositePresenationEvent by inheriting from it. using Microsoft.Practices.Prism.Events; using TaxiCommon.BAO;   namespace TaxiCommon.CompositeEvents { public class TaxiOnMoveEvent:CompositePresentationEvent<TaxiOnMove> { } }   TaxiOnMove.cs includes the properties which we want to exchange between the parties, FareView and TotalView. 
using System;   namespace TaxiCommon.BAO { public class TaxiOnMove { public TimeSpan MinutesAtTweleveMPH { get; set; } public double MilesAtSixMPH { get; set; } } }   Lets take a look into FareViewodel (Notifier) and how it raises the event.  Here we are raising the event by getting the event through GetEvent<..>() and publishing it with the payload private void OnAddMinutes(object obj) { TaxiOnMove payload = new TaxiOnMove(); if(MilesAtSixMPH != null) payload.MilesAtSixMPH = MilesAtSixMPH.Value; if(MinutesAtTweleveMPH != null) payload.MinutesAtTweleveMPH = new TimeSpan(0,0,MinutesAtTweleveMPH.Value,0);   _eventAggregator.GetEvent<TaxiOnMoveEvent>().Publish(payload); ResetMinutesAndMiles(); } And TotalViewModel(Observer) subscribes to notifications by getting the event through GetEvent<..>() namespace TaxiModules.ViewModels { public class TotalViewModel:ObservableBase { .... private IEventAggregator _eventAggregator;   public TotalViewModel(IEventAggregator eventAggregator) { _eventAggregator = eventAggregator; ... }   private void SubscribeToEvents() { _eventAggregator.GetEvent<TaxiStartedEvent>() .Subscribe(OnTaxiStarted, ThreadOption.UIThread,false,(filter) => true); _eventAggregator.GetEvent<TaxiOnMoveEvent>() .Subscribe(OnTaxiMove, ThreadOption.UIThread, false, (filter) => true); _eventAggregator.GetEvent<TaxiResetEvent>() .Subscribe(OnTaxiReset, ThreadOption.UIThread, false, (filter) => true); }   ... private void OnTaxiMove(TaxiOnMove taxiOnMove) { OnMoveFare fare = new OnMoveFare(taxiOnMove); Fares.Add(fare); SetTotalFare(new []{fare}); }   .... 7. MVVM through example In this section we are going to look into MVVM implementation through example.  I have all the modules declared in a single project, TaxiModules, again it is not necessary to have them into one project. Once the user logs into the application, will be greeted with the ‘Engage the Taxi’ screen which is made of two user controls, FareView.xaml and TotalView.Xaml. As you can see from the solution explorer, each of them have their own code behind files and  ViewModel classes, FareViewMode.cs, TotalViewModel.cs Lets take a look in to the FareView and how it interacts with FareViewModel using MVVM implementation. FareView.xaml acts as a view and FareViewMode.cs is it’s view model. The FareView code behind class   namespace TaxiModules.Views { /// <summary> /// Interaction logic for FareView.xaml /// </summary> public partial class FareView : UserControl { public FareView(FareViewModel viewModel) { InitializeComponent(); this.Loaded += (s, e) => { this.DataContext = viewModel; }; } } } The FareView is bound to FareViewModel through the data context  and you shall observe that DataContext is of type Object, i.e. the FareView doesn’t really know the type of ViewModel (FareViewModel). This helps separation of View and ViewModel as View and ViewModel are independent of each other, you can bind FareView to FareViewModel2 as well and the application compiles just fine. Lets take a look into FareView xaml file  <UserControl x:Class="TaxiModules.Views.FareView" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:Toolkit="clr-namespace:Microsoft.Windows.Controls;assembly=WPFToolkit" xmlns:Commands="clr-namespace:Microsoft.Practices.Prism.Commands;assembly=Microsoft.Practices.Prism"> <Grid Margin="10" > ....   
<Border Style="{DynamicResource innerBorder}" Grid.Row="0" Grid.Column="0" Grid.RowSpan="11" Grid.ColumnSpan="2" Panel.ZIndex="1"/>   <Label Grid.Row="0" Content="Engage the Taxi" Style="{DynamicResource innerHeader}"/> <Label Grid.Row="1" Content="Select the State"/> <ComboBox Grid.Row="1" Grid.Column="1" ItemsSource="{Binding States}" Height="auto"> <ComboBox.ItemTemplate> <DataTemplate> <TextBlock Text="{Binding Name}"/> </DataTemplate> </ComboBox.ItemTemplate> <ComboBox.SelectedItem> <Binding Path="SelectedState" Mode="TwoWay"/> </ComboBox.SelectedItem> </ComboBox> <Label Grid.Row="2" Content="Select the Date of Entry"/> <Toolkit:DatePicker Grid.Row="2" Grid.Column="1" SelectedDate="{Binding DateOfEntry, ValidatesOnDataErrors=true}" /> <Label Grid.Row="3" Content="Enter time 24hr format"/> <TextBox Grid.Row="3" Grid.Column="1" Text="{Binding TimeOfEntry, TargetNullValue=''}"/> <Button Grid.Row="4" Grid.Column="1" Content="Start the Meter" Commands:Click.Command="{Binding StartMeterCommand}" />   <Label Grid.Row="5" Content="Run the Taxi" Style="{DynamicResource innerHeader}"/> <Label Grid.Row="6" Content="Number of Miles &lt;@6mph"/> <TextBox Grid.Row="6" Grid.Column="1" Text="{Binding MilesAtSixMPH, TargetNullValue='', ValidatesOnDataErrors=true}"/> <Label Grid.Row="7" Content="Number of Minutes @12mph"/> <TextBox Grid.Row="7" Grid.Column="1" Text="{Binding MinutesAtTweleveMPH, TargetNullValue=''}"/> <Button Grid.Row="8" Grid.Column="1" Content="Add Minutes and Miles " Commands:Click.Command="{Binding AddMinutesCommand}"/> <Label Grid.Row="9" Content="Other Operations" Style="{DynamicResource innerHeader}"/> <Button Grid.Row="10" Grid.Column="1" Content="Reset the Meter" Commands:Click.Command="{Binding ResetCommand}"/>   </Grid> </UserControl> The highlighted code from the above code shows data binding, for example ComboBox which displays list of states has it’s ItemsSource bound to States property, with DataTemplate bound to Name and SelectedItem  to SelectedState. You might be wondering what are all these properties and how it is able to bind to them.  The answer lies in data context, i.e., when you bound a control, WPF looks for data context on the root object (Grid in this case) and if it can’t find data context it will look into root’s root, i.e. FareView UserControl and it is bound to FareViewModel.  Each of those properties have be declared on the ViewModel for the View to bind correctly. To put simply, View is bound to ViewModel through data context of type object and every control that is bound on the View actually binds to the public property on the ViewModel. Lets look into the ViewModel code (the following code is not an exact copy of FareViewMode.cs, pasted relevant code for this section)   namespace TaxiModules.ViewModels { public class FareViewModel:ObservableBase, IDataErrorInfo { public List<USState> States { get { return USStates.StateList; } }   public USState SelectedState { get { return _selectedState; } set { _selectedState = value; RaisePropertyChanged(_selectedStatePropertyName); } }   public DateTime? DateOfEntry { get { return _dateOfEntry; } set { _dateOfEntry = value; RaisePropertyChanged(_dateOfEntryPropertyName); } }   public TimeSpan? TimeOfEntry { get { return _timeOfEntry; } set { _timeOfEntry = value; RaisePropertyChanged(_timeOfEntryPropertyName); } }   public double? MilesAtSixMPH { get { return _milesAtSixMPH; } set { _milesAtSixMPH = value; RaisePropertyChanged(_distanceAtSixMPHPropertyName); } }   public int? 
MinutesAtTweleveMPH { get { return _minutesAtTweleveMPH; } set { _minutesAtTweleveMPH = value; RaisePropertyChanged(_minutesAtTweleveMPHPropertyName); } }   public ICommand StartMeterCommand { get { if(_startMeterCommand == null) { _startMeterCommand = new DelegateCommand<object>(OnStartMeter, CanStartMeter); } return _startMeterCommand; } }   public ICommand AddMinutesCommand { get { if(_addMinutesCommand == null) { _addMinutesCommand = new DelegateCommand<object>(OnAddMinutes, CanAddMinutes); } return _addMinutesCommand; } }   public ICommand ResetCommand { get { if(_resetCommand == null) { _resetCommand = new DelegateCommand<object>(OnResetCommand); } return _resetCommand; } }   } private void OnStartMeter(object obj) { _eventAggregator.GetEvent<TaxiStartedEvent>().Publish( new TaxiStarted() { EngagedOn = DateOfEntry.Value.Date + TimeOfEntry.Value, EngagedState = SelectedState.Value });   _isMeterStarted = true; OnPropertyChanged(this,null); } And views communicate user actions like button clicks, tree view item selections, etc using commands. When user clicks on ‘Start the Meter’ button it invokes the method StartMeterCommand, which calls the method OnStartMeter which publishes the event to TotalViewModel using event aggregator  and TaxiStartedEvent. namespace TaxiModules.ViewModels { public class TotalViewModel:ObservableBase { ... private IEventAggregator _eventAggregator;   public TotalViewModel(IEventAggregator eventAggregator) { _eventAggregator = eventAggregator;   InitializePropertyNames(); InitializeModel(); SubscribeToEvents(); }   public decimal? TotalFare { get { return _totalFare; } set { _totalFare = value; RaisePropertyChanged(_totalFarePropertyName); } } .... private void SubscribeToEvents() { _eventAggregator.GetEvent<TaxiStartedEvent>().Subscribe(OnTaxiStarted, ThreadOption.UIThread,false,(filter) => true); _eventAggregator.GetEvent<TaxiOnMoveEvent>().Subscribe(OnTaxiMove, ThreadOption.UIThread, false, (filter) => true); _eventAggregator.GetEvent<TaxiResetEvent>().Subscribe(OnTaxiReset, ThreadOption.UIThread, false, (filter) => true); }   private void OnTaxiStarted(TaxiStarted taxiStarted) { Fares.Add(new EntryFare()); Fares.Add(new StateTaxFare(taxiStarted)); Fares.Add(new NightSurchargeFare(taxiStarted)); Fares.Add(new PeakHourWeekdayFare(taxiStarted));   SetTotalFare(Fares); }   private void SetTotalFare(IEnumerable<IFare> fares) { TotalFare = (_totalFare ?? 0) + TaxiFareHelper.GetTotalFare(fares); } ....   } }   TotalViewModel subscribes to events, TaxiStartedEvent and rest. When TaxiStartedEvent gets invoked it calls the OnTaxiStarted method which sets the total fare which includes entry fee, state tax, nightly surcharge, peak hour weekday fare.   Note that TotalViewModel derives from ObservableBase which implements the method RaisePropertyChanged which we are invoking in Set of TotalFare property, i.e, once we update the TotalFare property it raises an the event that  allows the TotalFare text box to fetch the new value through the data context. ViewModel is communicating with View through data context and it has no knowledge about View, helping in loose coupling of ViewModel and View.   I have attached the source code (.Net 4.0, Prism 4.0, VS 2010) , download and play with it and don’t forget to leave your comments.  
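As a point of reference, ObservableBase - which both view models derive from but which isn't listed above - can be as small as the following sketch. This is only an assumption of its shape, not the exact class from the download:

    using System.ComponentModel;

    namespace TaxiModules.ViewModels
    {
        // Minimal INotifyPropertyChanged base class; the real ObservableBase in the
        // sample may expose additional helpers (e.g. an OnPropertyChanged overload).
        public abstract class ObservableBase : INotifyPropertyChanged
        {
            public event PropertyChangedEventHandler PropertyChanged;

            // Raises PropertyChanged so bound controls (such as the TotalFare TextBox)
            // re-read the property value through the data context.
            protected void RaisePropertyChanged(string propertyName)
            {
                var handler = PropertyChanged;
                if (handler != null)
                    handler(this, new PropertyChangedEventArgs(propertyName));
            }
        }
    }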

    Read the article

  • Using the West Wind Web Toolkit to set up AJAX and REST Services

    - by Rick Strahl
    I frequently get questions about which option to use for creating AJAX and REST backends for ASP.NET applications. There are many solutions out there to do this actually, but when I have a choice - not surprisingly - I fall back to my own tools in the West Wind West Wind Web Toolkit. I've talked a bunch about the 'in-the-box' solutions in the past so for a change in this post I'll talk about the tools that I use in my own and customer applications to handle AJAX and REST based access to service resources using the West Wind West Wind Web Toolkit. Let me preface this by saying that I like things to be easy. Yes flexible is very important as well but not at the expense of over-complexity. The goal I've had with my tools is make it drop dead easy, with good performance while providing the core features that I'm after, which are: Easy AJAX/JSON Callbacks Ability to return any kind of non JSON content (string, stream, byte[], images) Ability to work with both XML and JSON interchangeably for input/output Access endpoints via POST data, RPC JSON calls, GET QueryString values or Routing interface Easy to use generic JavaScript client to make RPC calls (same syntax, just what you need) Ability to create clean URLS with Routing Ability to use standard ASP.NET HTTP Stack for HTTP semantics It's all about options! In this post I'll demonstrate most of these features (except XML) in a few simple and short samples which you can download. So let's take a look and see how you can build an AJAX callback solution with the West Wind Web Toolkit. Installing the Toolkit Assemblies The easiest and leanest way of using the Toolkit in your Web project is to grab it via NuGet: West Wind Web and AJAX Utilities (Westwind.Web) and drop it into the project by right clicking in your Project and choosing Manage NuGet Packages from anywhere in the Project.   When done you end up with your project looking like this: What just happened? Nuget added two assemblies - Westwind.Web and Westwind.Utilities and the client ww.jquery.js library. It also added a couple of references into web.config: The default namespaces so they can be accessed in pages/views and a ScriptCompressionModule that the toolkit optionally uses to compress script resources served from within the assembly (namely ww.jquery.js and optionally jquery.js). Creating a new Service The West Wind Web Toolkit supports several ways of creating and accessing AJAX services, but for this post I'll stick to the lower level approach that works from any plain HTML page or of course MVC, WebForms, WebPages. There's also a WebForms specific control that makes this even easier but I'll leave that for another post. So, to create a new standalone AJAX/REST service we can create a new HttpHandler in the new project either as a pure class based handler or as a generic .ASHX handler. Both work equally well, but generic handlers don't require any web.config configuration so I'll use that here. In the root of the project add a Generic Handler. I'm going to call this one StockService.ashx. Once the handler has been created, edit the code and remove all of the handler body code. Then change the base class to CallbackHandler and add methods that have a [CallbackMethod] attribute. 
Here's the modified base handler implementation now looks like with an added HelloWorld method: using System; using Westwind.Web; namespace WestWindWebAjax { /// <summary> /// Handler implements CallbackHandler to provide REST/AJAX services /// </summary> public class SampleService : CallbackHandler { [CallbackMethod] public string HelloWorld(string name) { return "Hello " + name + ". Time is: " + DateTime.Now.ToString(); } } } Notice that the class inherits from CallbackHandler and that the HelloWorld service method is marked up with [CallbackMethod]. We're done here. Services Urlbased Syntax Once you compile, the 'service' is live can respond to requests. All CallbackHandlers support input in GET and POST formats, and can return results as JSON or XML. To check our fancy HelloWorld method we can now access the service like this: http://localhost/WestWindWebAjax/StockService.ashx?Method=HelloWorld&name=Rick which produces a default JSON response - in this case a string (wrapped in quotes as it's JSON): (note by default JSON will be downloaded by most browsers not displayed - various options are available to view JSON right in the browser) If I want to return the same data as XML I can tack on a &format=xml at the end of the querystring which produces: <string>Hello Rick. Time is: 11/1/2011 12:11:13 PM</string> Cleaner URLs with Routing Syntax If you want cleaner URLs for each operation you can also configure custom routes on a per URL basis similar to the way that WCF REST does. To do this you need to add a new RouteHandler to your application's startup code in global.asax.cs one for each CallbackHandler based service you create: protected void Application_Start(object sender, EventArgs e) { CallbackHandlerRouteHandler.RegisterRoutes<StockService>(RouteTable.Routes); } With this code in place you can now add RouteUrl properties to any of your service methods. For the HelloWorld method that doesn't make a ton of sense but here is what a routed clean URL might look like in definition: [CallbackMethod(RouteUrl="stocks/HelloWorld/{name}")] public string HelloWorld(string name) { return "Hello " + name + ". Time is: " + DateTime.Now.ToString(); } The same URL I previously used now becomes a bit shorter and more readable with: http://localhost/WestWindWebAjax/HelloWorld/Rick It's an easy way to create cleaner URLs and still get the same functionality. Calling the Service with $.getJSON() Since the result produced is JSON you can now easily consume this data using jQuery's getJSON method. First we need a couple of scripts - jquery.js and ww.jquery.js in the page: <!DOCTYPE html> <html> <head> <link href="Css/Westwind.css" rel="stylesheet" type="text/css" /> <script src="scripts/jquery.min.js" type="text/javascript"></script> <script src="scripts/ww.jquery.min.js" type="text/javascript"></script> </head> <body> Next let's add a small HelloWorld example form (what else) that has a single textbox to type a name, a button and a div tag to receive the result: <fieldset> <legend>Hello World</legend> Please enter a name: <input type="text" name="txtHello" id="txtHello" value="" /> <input type="button" id="btnSayHello" value="Say Hello (POST)" /> <input type="button" id="btnSayHelloGet" value="Say Hello (GET)" /> <div id="divHelloMessage" class="errordisplay" style="display:none;width: 450px;" > </div> </fieldset> Then to call the HelloWorld method a little jQuery is used to hook the document startup and the button click followed by the $.getJSON call to retrieve the data from the server. 
<script type="text/javascript"> $(document).ready(function () { $("#btnSayHelloGet").click(function () { $.getJSON("SampleService.ashx", { Method: "HelloWorld", name: $("#txtHello").val() }, function (result) { $("#divHelloMessage") .text(result) .fadeIn(1000); }); });</script> .getJSON() expects a full URL to the endpoint of our service, which is the ASHX file. We can either provide a full URL (SampleService.ashx?Method=HelloWorld&name=Rick) or we can just provide the base URL and an object that encodes the query string parameters for us using an object map that has a property that matches each parameter for the server method. We can also use the clean URL routing syntax, but using the object parameter encoding actually is safer as the parameters will get properly encoded by jQuery. The result returned is whatever the result on the server method is - in this case a string. The string is applied to the divHelloMessage element and we're done. Obviously this is a trivial example, but it demonstrates the basics of getting a JSON response back to the browser. AJAX Post Syntax - using ajaxCallMethod() The previous example allows you basic control over the data that you send to the server via querystring parameters. This works OK for simple values like short strings, numbers and boolean values, but doesn't really work if you need to pass something more complex like an object or an array back up to the server. To handle traditional RPC type messaging where the idea is to map server side functions and results to a client side invokation, POST operations can be used. The easiest way to use this functionality is to use ww.jquery.js and the ajaxCallMethod() function. ww.jquery wraps jQuery's AJAX functions and knows implicitly how to call a CallbackServer method with parameters and parse the result. Let's look at another simple example that posts a simple value but returns something more interesting. Let's start with the service method: [CallbackMethod(RouteUrl="stocks/{symbol}")] public StockQuote GetStockQuote(string symbol) { Response.Cache.SetExpires(DateTime.UtcNow.Add(new TimeSpan(0, 2, 0))); StockServer server = new StockServer(); var quote = server.GetStockQuote(symbol); if (quote == null) throw new ApplicationException("Invalid Symbol passed."); return quote; } This sample utilizes a small StockServer helper class (included in the sample) that downloads a stock quote from Yahoo's financial site via plain HTTP GET requests and formats it into a StockQuote object. 
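The StockQuote type itself isn't listed in the post (it ships with the sample download). Judging from the JSON payload shown below, a minimal sketch of it might look like this - the property types are assumptions:

    using System;

    namespace WestWindWebAjax
    {
        // Sketch of the quote DTO returned by GetStockQuote(); the real class in the
        // sample download may carry additional members.
        public class StockQuote
        {
            public string Symbol { get; set; }
            public string Company { get; set; }
            public decimal OpenPrice { get; set; }
            public decimal LastPrice { get; set; }
            public decimal NetChange { get; set; }
            public DateTime LastQuoteTime { get; set; }
            // String form of the quote time, since JSON parsers return dates as strings by default.
            public string LastQuoteTimeString { get; set; }
        }
    }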
Lets create a small HTML block that lets us query for the quote and display it: <fieldset> <legend>Single Stock Quote</legend> Please enter a stock symbol: <input type="text" name="txtSymbol" id="txtSymbol" value="msft" /> <input type="button" id="btnStockQuote" value="Get Quote" /> <div id="divStockDisplay" class="errordisplay" style="display:none; width: 450px;"> <div class="label-left">Company:</div> <div id="stockCompany"></div> <div class="label-left">Last Price:</div> <div id="stockLastPrice"></div> <div class="label-left">Quote Time:</div> <div id="stockQuoteTime"></div> </div> </fieldset> The final result looks something like this:   Let's hook up the button handler to fire the request and fill in the data as shown: $("#btnStockQuote").click(function () { ajaxCallMethod("SampleService.ashx", "GetStockQuote", [$("#txtSymbol").val()], function (quote) { $("#divStockDisplay").show().fadeIn(1000); $("#stockCompany").text(quote.Company + " (" + quote.Symbol + ")"); $("#stockLastPrice").text(quote.LastPrice); $("#stockQuoteTime").text(quote.LastQuoteTime.formatDate("MMM dd, HH:mm EST")); }, onPageError); }); So we point at SampleService.ashx and the GetStockQuote method, passing a single parameter of the input symbol value. Then there are two handlers for success and failure callbacks.  The success handler is the interesting part - it receives the stock quote as a result and assigns its values to various 'holes' in the stock display elements. The data that comes back over the wire is JSON and it looks like this: { "Symbol":"MSFT", "Company":"Microsoft Corpora", "OpenPrice":26.11, "LastPrice":26.01, "NetChange":0.02, "LastQuoteTime":"2011-11-03T02:00:00Z", "LastQuoteTimeString":"Nov. 11, 2011 4:20pm" } which is an object representation of the data. JavaScript can evaluate this JSON string back into an object easily and that's the reslut that gets passed to the success function. The quote data is then applied to existing page content by manually selecting items and applying them. There are other ways to do this more elegantly like using templates, but here we're only interested in seeing how the data is returned. The data in the object is typed - LastPrice is a number and QuoteTime is a date. Note about the date value: JavaScript doesn't have a date literal although the JSON embedded ISO string format used above  ("2011-11-03T02:00:00Z") is becoming fairly standard for JSON serializers. However, JSON parsers don't deserialize dates by default and return them by string. This is why the StockQuote actually returns a string value of LastQuoteTimeString for the same date. ajaxMethodCallback always converts dates properly into 'real' dates and the example above uses the real date value along with a .formatDate() data extension (also in ww.jquery.js) to display the raw date properly. Errors and Exceptions So what happens if your code fails? For example if I pass an invalid stock symbol to the GetStockQuote() method you notice that the code does this: if (quote == null) throw new ApplicationException("Invalid Symbol passed."); CallbackHandler automatically pushes the exception message back to the client so it's easy to pick up the error message. Regardless of what kind of error occurs: Server side, client side, protocol errors - any error will fire the failure handler with an error object parameter. The error is returned to the client via a JSON response in the error callback. 
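CallbackHandler produces that JSON error response internally. Purely to illustrate the idea, a handler could catch exceptions and serialize a small error object along these lines - the class and member names here are assumptions that mirror the error properties described below, not the toolkit's actual implementation:

    using System;
    using System.Web;
    using System.Web.Script.Serialization;

    namespace WestWindWebAjax
    {
        // Illustrative error payload matching the shape described below
        // (isCallbackError, message, stackTrace).
        public class CallbackError
        {
            public bool isCallbackError { get; set; }
            public string message { get; set; }
            // The real toolkit only fills this in Debug mode.
            public string stackTrace { get; set; }
            public CallbackError() { isCallbackError = true; }
        }

        public class ErrorHandlingSketchHandler : IHttpHandler
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                try
                {
                    // ... invoke the requested method and write its JSON result ...
                }
                catch (Exception ex)
                {
                    // Serialize the error object so the client's failure callback
                    // receives a consistent JSON structure.
                    context.Response.ContentType = "application/json";
                    var error = new CallbackError { message = ex.Message, stackTrace = ex.StackTrace };
                    context.Response.Write(new JavaScriptSerializer().Serialize(error));
                }
            }
        }
    }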
In the previous examples I called onPageError which is a generic routine in ww.jquery that displays a status message on the bottom of the screen. But of course you can also take over the error handling yourself: $("#btnStockQuote").click(function () { ajaxCallMethod("SampleService.ashx", "GetStockQuote", [$("#txtSymbol").val()], function (quote) { $("#divStockDisplay").fadeIn(1000); $("#stockCompany").text(quote.Company + " (" + quote.Symbol + ")"); $("#stockLastPrice").text(quote.LastPrice); $("#stockQuoteTime").text(quote.LastQuoteTime.formatDate("MMM dd, hh:mmt")); }, function (error, xhr) { $("#divErrorDisplay").text(error.message).fadeIn(1000); }); }); The error object has a isCallbackError, message and  stackTrace properties, the latter of which is only populated when running in Debug mode, and this object is returned for all errors: Client side, transport and server side errors. Regardless of which type of error you get the same object passed (as well as the XHR instance optionally) which makes for a consistent error retrieval mechanism. Specifying HttpVerbs You can also specify HTTP Verbs that are allowed using the AllowedHttpVerbs option on the CallbackMethod attribute: [CallbackMethod(AllowedHttpVerbs=HttpVerbs.GET | HttpVerbs.POST)] public string HelloWorld(string name) { … } If you're building REST style API's this might be useful to force certain request semantics onto the client calling. For the above if call with a non-allowed HttpVerb the request returns a 405 error response along with a JSON (or XML) error object result. The default behavior is to allow all verbs access (HttpVerbs.All). Passing in object Parameters Up to now the parameters I passed were very simple. But what if you need to send something more complex like an object or an array? Let's look at another example now that passes an object from the client to the server. Keeping with the Stock theme here lets add a method called BuyOrder that lets us buy some shares for a stock. Consider the following service method that receives an StockBuyOrder object as a parameter: [CallbackMethod] public string BuyStock(StockBuyOrder buyOrder) { var server = new StockServer(); var quote = server.GetStockQuote(buyOrder.Symbol); if (quote == null) throw new ApplicationException("Invalid or missing stock symbol."); return string.Format("You're buying {0} shares of {1} ({2}) stock at {3} for a total of {4} on {5}.", buyOrder.Quantity, quote.Company, quote.Symbol, quote.LastPrice.ToString("c"), (quote.LastPrice * buyOrder.Quantity).ToString("c"), buyOrder.BuyOn.ToString("MMM d")); } public class StockBuyOrder { public string Symbol { get; set; } public int Quantity { get; set; } public DateTime BuyOn { get; set; } public StockBuyOrder() { BuyOn = DateTime.Now; } } This is a contrived do-nothing example that simply echoes back what was passed in, but it demonstrates how you can pass complex data to a callback method. 
On the client side we now have a very simple form that captures the three values on a form: <fieldset> <legend>Post a Stock Buy Order</legend> Enter a symbol: <input type="text" name="txtBuySymbol" id="txtBuySymbol" value="GLD" />&nbsp;&nbsp; Qty: <input type="text" name="txtBuyQty" id="txtBuyQty" value="10" style="width: 50px" />&nbsp;&nbsp; Buy on: <input type="text" name="txtBuyOn" id="txtBuyOn" value="<%= DateTime.Now.ToString("d") %>" style="width: 70px;" /> <input type="button" id="btnBuyStock" value="Buy Stock" /> <div id="divStockBuyMessage" class="errordisplay" style="display:none"></div> </fieldset> The completed form and demo then looks something like this:   The client side code that picks up the input values and assigns them to object properties and sends the AJAX request looks like this: $("#btnBuyStock").click(function () { // create an object map that matches StockBuyOrder signature var buyOrder = { Symbol: $("#txtBuySymbol").val(), Quantity: $("#txtBuyQty").val() * 1, // number Entered: new Date() } ajaxCallMethod("SampleService.ashx", "BuyStock", [buyOrder], function (result) { $("#divStockBuyMessage").text(result).fadeIn(1000); }, onPageError); }); The code creates an object and attaches the properties that match the server side object passed to the BuyStock method. Each property that you want to update needs to be included and the type must match (ie. string, number, date in this case). Any missing properties will not be set but also not cause any errors. Pass POST data instead of Objects In the last example I collected a bunch of values from form variables and stuffed them into object variables in JavaScript code. While that works, often times this isn't really helping - I end up converting my types on the client and then doing another conversion on the server. If lots of input controls are on a page and you just want to pick up the values on the server via plain POST variables - that can be done too - and it makes sense especially if you're creating and filling the client side object only to push data to the server. Let's add another method to the server that once again lets us buy a stock. But this time let's not accept a parameter but rather send POST data to the server. Here's the server method receiving POST data: [CallbackMethod] public string BuyStockPost() { StockBuyOrder buyOrder = new StockBuyOrder(); buyOrder.Symbol = Request.Form["txtBuySymbol"]; ; int qty; int.TryParse(Request.Form["txtBuyQuantity"], out qty); buyOrder.Quantity = qty; DateTime time; DateTime.TryParse(Request.Form["txtBuyBuyOn"], out time); buyOrder.BuyOn = time; // Or easier way yet //FormVariableBinder.Unbind(buyOrder,null,"txtBuy"); var server = new StockServer(); var quote = server.GetStockQuote(buyOrder.Symbol); if (quote == null) throw new ApplicationException("Invalid or missing stock symbol."); return string.Format("You're buying {0} shares of {1} ({2}) stock at {3} for a total of {4} on {5}.", buyOrder.Quantity, quote.Company, quote.Symbol, quote.LastPrice.ToString("c"), (quote.LastPrice * buyOrder.Quantity).ToString("c"), buyOrder.BuyOn.ToString("MMM d")); } Clearly we've made this server method take more code than it did with the object parameter. We've basically moved the parameter assignment logic from the client to the server. As a result the client code to call this method is now a bit shorter since there's no client side shuffling of values from the controls to an object. 
$("#btnBuyStockPost").click(function () { ajaxCallMethod("SampleService.ashx", "BuyStockPost", [], // Note: No parameters - function (result) { $("#divStockBuyMessage").text(result).fadeIn(1000); }, onPageError, // Force all page Form Variables to be posted { postbackMode: "Post" }); }); The client simply calls the BuyStockQuote method and pushes all the form variables from the page up to the server which parses them instead. The feature that makes this work is one of the options you can pass to the ajaxCallMethod() function: { postbackMode: "Post" }); which directs the function to include form variable POST data when making the service call. Other options include PostNoViewState (for WebForms to strip out WebForms crap vars), PostParametersOnly (default), None. If you pass parameters those are always posted to the server except when None is set. The above code can be simplified a bit by using the FormVariableBinder helper, which can unbind form variables directly into an object: FormVariableBinder.Unbind(buyOrder,null,"txtBuy"); which replaces the manual Request.Form[] reading code. It receives the object to unbind into, a string of properties to skip, and an optional prefix which is stripped off form variables to match property names. The component is similar to the MVC model binder but it's independent of MVC. Returning non-JSON Data CallbackHandler also supports returning non-JSON/XML data via special return types. You can return raw non-JSON encoded strings like this: [CallbackMethod(ReturnAsRawString=true,ContentType="text/plain")] public string HelloWorldNoJSON(string name) { return "Hello " + name + ". Time is: " + DateTime.Now.ToString(); } Calling this method results in just a plain string - no JSON encoding with quotes around the result. This can be useful if your server handling code needs to return a string or HTML result that doesn't fit well for a page or other UI component. Any string output can be returned. You can also return binary data. Stream, byte[] and Bitmap/Image results are automatically streamed back to the client. Notice that you should set the ContentType of the request either on the CallbackMethod attribute or using Response.ContentType. This ensures the Web Server knows how to display your binary response. Using a stream response makes it possible to return any of data. Streamed data can be pretty handy to return bitmap data from a method. The following is a method that returns a stock history graph for a particular stock over a provided number of years: [CallbackMethod(ContentType="image/png",RouteUrl="stocks/history/graph/{symbol}/{years}")] public Stream GetStockHistoryGraph(string symbol, int years = 2,int width = 500, int height=350) { if (width == 0) width = 500; if (height == 0) height = 350; StockServer server = new StockServer(); return server.GetStockHistoryGraph(symbol,"Stock History for " + symbol,width,height,years); } I can now hook this up into the JavaScript code when I get a stock quote. At the end of the process I can assign the URL to the service that returns the image into the src property and so force the image to display. 
Here's the changed code: $("#btnStockQuote").click(function () { var symbol = $("#txtSymbol").val(); ajaxCallMethod("SampleService.ashx", "GetStockQuote", [symbol], function (quote) { $("#divStockDisplay").fadeIn(1000); $("#stockCompany").text(quote.Company + " (" + quote.Symbol + ")"); $("#stockLastPrice").text(quote.LastPrice); $("#stockQuoteTime").text(quote.LastQuoteTime.formatDate("MMM dd, hh:mmt")); // display a stock chart $("#imgStockHistory").attr("src", "stocks/history/graph/" + symbol + "/2"); },onPageError); }); The resulting output then looks like this: The charting code uses the new ASP.NET 4.0 Chart components via code to display a bar chart of the 2 year stock data as part of the StockServer class which you can find in the sample download. The ability to return arbitrary data from a service is useful as you can see - in this case the chart is clearly associated with the service and it's nice that the graph generation can happen off a handler rather than through a page. Images are common resources, but output can also be PDF reports, zip files for downloads etc. which is becoming increasingly more common to be returned from REST endpoints and other applications. Why reinvent? Obviously the examples I've shown here are pretty basic in terms of functionality. But I hope they demonstrate the core features of AJAX callbacks that you need to work through in most applications which is simple: return data, send back data and potentially retrieve data in various formats. While there are other solutions when it comes down to making AJAX callbacks and servicing REST like requests, I like the flexibility my home grown solution provides. Simply put it's still the easiest solution that I've found that addresses my common use cases: AJAX JSON RPC style callbacks Url based access XML and JSON Output from single method endpoint XML and JSON POST support, querystring input, routing parameter mapping UrlEncoded POST data support on callbacks Ability to return stream/raw string data Essentially ability to return ANYTHING from Service and pass anything All these features are available in various solutions but not together in one place. I've been using this code base for over 4 years now in a number of projects both for myself and commercial work and it's served me extremely well. Besides the AJAX functionality CallbackHandler provides, it's also an easy way to create any kind of output endpoint I need to create. Need to create a few simple routines that spit back some data, but don't want to create a Page or View or full blown handler for it? Create a CallbackHandler and add a method or multiple methods and you have your generic endpoints.  It's a quick and easy way to add small code pieces that are pretty efficient as they're running through a pretty small handler implementation. I can have this up and running in a couple of minutes literally without any setup and returning just about any kind of data. Resources Download the Sample NuGet: Westwind Web and AJAX Utilities (Westwind.Web) ajaxCallMethod() Documentation Using the AjaxMethodCallback WebForms Control West Wind Web Toolkit Home Page West Wind Web Toolkit Source Code © Rick Strahl, West Wind Technologies, 2005-2011Posted in ASP.NET  jQuery  AJAX   Tweet (function() { var po = document.createElement('script'); po.type = 'text/javascript'; po.async = true; po.src = 'https://apis.google.com/js/plusone.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(po, s); })();

    Read the article

  • Custom ASP.NET Routing to an HttpHandler

    - by Rick Strahl
    As of version 4.0 ASP.NET natively supports routing via the now built-in System.Web.Routing namespace. Routing features are automatically integrated into the HtttpRuntime via a few custom interfaces. New Web Forms Routing Support In ASP.NET 4.0 there are a host of improvements including routing support baked into Web Forms via a RouteData property available on the Page class and RouteCollection.MapPageRoute() route handler that makes it easy to route to Web forms. To map ASP.NET Page routes is as simple as setting up the routes with MapPageRoute:protected void Application_Start(object sender, EventArgs e) { RegisterRoutes(RouteTable.Routes); } void RegisterRoutes(RouteCollection routes) { routes.MapPageRoute("StockQuote", "StockQuote/{symbol}", "StockQuote.aspx"); routes.MapPageRoute("StockQuotes", "StockQuotes/{symbolList}", "StockQuotes.aspx"); } and then accessing the route data in the page you can then use the new Page class RouteData property to retrieve the dynamic route data information:public partial class StockQuote1 : System.Web.UI.Page { protected StockQuote Quote = null; protected void Page_Load(object sender, EventArgs e) { string symbol = RouteData.Values["symbol"] as string; StockServer server = new StockServer(); Quote = server.GetStockQuote(symbol); // display stock data in Page View } } Simple, quick and doesn’t require much explanation. If you’re using WebForms most of your routing needs should be served just fine by this simple mechanism. Kudos to the ASP.NET team for putting this in the box and making it easy! How Routing Works To handle Routing in ASP.NET involves these steps: Registering Routes Creating a custom RouteHandler to retrieve an HttpHandler Attaching RouteData to your HttpHandler Picking up Route Information in your Request code Registering routes makes ASP.NET aware of the Routes you want to handle via the static RouteTable.Routes collection. You basically add routes to this collection to let ASP.NET know which URL patterns it should watch for. You typically hook up routes off a RegisterRoutes method that fires in Application_Start as I did in the example above to ensure routes are added only once when the application first starts up. When you create a route, you pass in a RouteHandler instance which ASP.NET caches and reuses as routes are matched. Once registered ASP.NET monitors the routes and if a match is found just prior to the HttpHandler instantiation, ASP.NET uses the RouteHandler registered for the route and calls GetHandler() on it to retrieve an HttpHandler instance. The RouteHandler.GetHandler() method is responsible for creating an instance of an HttpHandler that is to handle the request and – if necessary – to assign any additional custom data to the handler. At minimum you probably want to pass the RouteData to the handler so the handler can identify the request based on the route data available. To do this you typically add  a RouteData property to your handler and then assign the property from the RouteHandlers request context. This is essentially how Page.RouteData comes into being and this approach should work well for any custom handler implementation that requires RouteData. It’s a shame that ASP.NET doesn’t have a top level intrinsic object that’s accessible off the HttpContext object to provide route data more generically, but since RouteData is directly tied to HttpHandlers and not all handlers support it it might cause some confusion of when it’s actually available. 
Bottom line is that if you want to hold on to RouteData you have to assign it to a custom property of the handler or else pass it to the handler via Context.Items[] object that can be retrieved on an as needed basis. It’s important to understand that routing is hooked up via RouteHandlers that are responsible for loading HttpHandler instances. RouteHandlers are invoked for every request that matches a route and through this RouteHandler instance the Handler gains access to the current RouteData. Because of this logic it’s important to understand that Routing is really tied to HttpHandlers and not available prior to handler instantiation, which is pretty late in the HttpRuntime’s request pipeline. IOW, Routing works with Handlers but not with earlier in the pipeline within Modules. Specifically ASP.NET calls RouteHandler.GetHandler() from the PostResolveRequestCache HttpRuntime pipeline event. Here’s the call stack at the beginning of the GetHandler() call: which fires just before handler resolution. Non-Page Routing – You need to build custom RouteHandlers If you need to route to a custom Http Handler or other non-Page (and non-MVC) endpoint in the HttpRuntime, there is no generic mapping support available. You need to create a custom RouteHandler that can manage creating an instance of an HttpHandler that is fired in response to a routed request. Depending on what you are doing this process can be simple or fairly involved as your code is responsible based on the route data provided which handler to instantiate, and more importantly how to pass the route data on to the Handler. Luckily creating a RouteHandler is easy by implementing the IRouteHandler interface which has only a single GetHttpHandler(RequestContext context) method. In this method you can pick up the requestContext.RouteData, instantiate the HttpHandler of choice, and assign the RouteData to it. Then pass back the handler and you’re done.Here’s a simple example of GetHttpHandler() method that dynamically creates a handler based on a passed in Handler type./// <summary> /// Retrieves an Http Handler based on the type specified in the constructor /// </summary> /// <param name="requestContext"></param> /// <returns></returns> IHttpHandler IRouteHandler.GetHttpHandler(RequestContext requestContext) { IHttpHandler handler = Activator.CreateInstance(CallbackHandlerType) as IHttpHandler; // If we're dealing with a Callback Handler // pass the RouteData for this route to the Handler if (handler is CallbackHandler) ((CallbackHandler)handler).RouteData = requestContext.RouteData; return handler; } Note that this code checks for a specific type of handler and if it matches assigns the RouteData to this handler. This is optional but quite a common scenario if you want to work with RouteData. 
If the handler you need to instantiate isn’t under your control but you still need to pass RouteData to Handler code, an alternative is to pass the RouteData via the HttpContext.Items collection:IHttpHandler IRouteHandler.GetHttpHandler(RequestContext requestContext) { IHttpHandler handler = Activator.CreateInstance(CallbackHandlerType) as IHttpHandler; requestContext.HttpContext.Items["RouteData"] = requestContext.RouteData; return handler; } The code in the handler implementation can then pick up the RouteData from the context collection as needed:RouteData routeData = HttpContext.Current.Items["RouteData"] as RouteData This isn’t as clean as having an explicit RouteData property, but it does have the advantage that the route data is visible anywhere in the Handler’s code chain. It’s definitely preferable to create a custom property on your handler, but the Context work-around works in a pinch when you don’t’ own the handler code and have dynamic code executing as part of the handler execution. An Example of a Custom RouteHandler: Attribute Based Route Implementation In this post I’m going to discuss a custom routine implementation I built for my CallbackHandler class in the West Wind Web & Ajax Toolkit. CallbackHandler can be very easily used for creating AJAX, REST and POX requests following RPC style method mapping. You can pass parameters via URL query string, POST data or raw data structures, and you can retrieve results as JSON, XML or raw string/binary data. It’s a quick and easy way to build service interfaces with no fuss. As a quick review here’s how CallbackHandler works: You create an Http Handler that derives from CallbackHandler You implement methods that have a [CallbackMethod] Attribute and that’s it. Here’s an example of an CallbackHandler implementation in an ashx.cs based handler:// RestService.ashx.cs public class RestService : CallbackHandler { [CallbackMethod] public StockQuote GetStockQuote(string symbol) { StockServer server = new StockServer(); return server.GetStockQuote(symbol); } [CallbackMethod] public StockQuote[] GetStockQuotes(string symbolList) { StockServer server = new StockServer(); string[] symbols = symbolList.Split(new char[2] { ',',';' },StringSplitOptions.RemoveEmptyEntries); return server.GetStockQuotes(symbols); } } CallbackHandler makes it super easy to create a method on the server, pass data to it via POST, QueryString or raw JSON/XML data, and then retrieve the results easily back in various formats. This works wonderful and I’ve used these tools in many projects for myself and with clients. But one thing missing has been the ability to create clean URLs. Typical URLs looked like this: http://www.west-wind.com/WestwindWebToolkit/samples/Rest/StockService.ashx?Method=GetStockQuote&symbol=msfthttp://www.west-wind.com/WestwindWebToolkit/samples/Rest/StockService.ashx?Method=GetStockQuotes&symbolList=msft,intc,gld,slw,mwe&format=xml which works and is clear enough, but also clearly very ugly. It would be much nicer if URLs could look like this: http://www.west-wind.com//WestwindWebtoolkit/Samples/StockQuote/msfthttp://www.west-wind.com/WestwindWebtoolkit/Samples/StockQuotes/msft,intc,gld,slw?format=xml (the Virtual Root in this sample is WestWindWebToolkit/Samples and StockQuote/{symbol} is the route)(If you use FireFox try using the JSONView plug-in make it easier to view JSON content) So, taking a clue from the WCF REST tools that use RouteUrls I set out to create a way to specify RouteUrls for each of the endpoints. 
The change made basically allows changing the above to: [CallbackMethod(RouteUrl="RestService/StockQuote/{symbol}")] public StockQuote GetStockQuote(string symbol) { StockServer server = new StockServer(); return server.GetStockQuote(symbol); } [CallbackMethod(RouteUrl = "RestService/StockQuotes/{symbolList}")] public StockQuote[] GetStockQuotes(string symbolList) { StockServer server = new StockServer(); string[] symbols = symbolList.Split(new char[2] { ',',';' },StringSplitOptions.RemoveEmptyEntries); return server.GetStockQuotes(symbols); } where a RouteUrl is specified as part of the Callback attribute. And with the changes made with RouteUrls I can now get URLs like the second set shown earlier. So how does that work? Let’s find out… How to Create Custom Routes As mentioned earlier Routing is made up of several steps: Creating a custom RouteHandler to create HttpHandler instances Mapping the actual Routes to the RouteHandler Retrieving the RouteData and actually doing something useful with it in the HttpHandler In the CallbackHandler routing example above this works out to something like this: Create a custom RouteHandler that includes a property to track the method to call Set up the routes using Reflection against the class Looking for any RouteUrls in the CallbackMethod attribute Add a RouteData property to the CallbackHandler so we can access the RouteData in the code of the handler Creating a Custom Route Handler To make the above work I created a custom RouteHandler class that includes the actual IRouteHandler implementation as well as a generic and static method to automatically register all routes marked with the [CallbackMethod(RouteUrl="…")] attribute. Here’s the code:/// <summary> /// Route handler that can create instances of CallbackHandler derived /// callback classes. The route handler tracks the method name and /// creates an instance of the service in a predictable manner /// </summary> /// <typeparam name="TCallbackHandler">CallbackHandler type</typeparam> public class CallbackHandlerRouteHandler : IRouteHandler { /// <summary> /// Method name that is to be called on this route. /// Set by the automatically generated RegisterRoutes /// invokation. /// </summary> public string MethodName { get; set; } /// <summary> /// The type of the handler we're going to instantiate. /// Needed so we can semi-generically instantiate the /// handler and call the method on it. /// </summary> public Type CallbackHandlerType { get; set; } /// <summary> /// Constructor to pass in the two required components we /// need to create an instance of our handler. 
/// </summary> /// <param name="methodName"></param> /// <param name="callbackHandlerType"></param> public CallbackHandlerRouteHandler(string methodName, Type callbackHandlerType) { MethodName = methodName; CallbackHandlerType = callbackHandlerType; } /// <summary> /// Retrieves an Http Handler based on the type specified in the constructor /// </summary> /// <param name="requestContext"></param> /// <returns></returns> IHttpHandler IRouteHandler.GetHttpHandler(RequestContext requestContext) { IHttpHandler handler = Activator.CreateInstance(CallbackHandlerType) as IHttpHandler; // If we're dealing with a Callback Handler // pass the RouteData for this route to the Handler if (handler is CallbackHandler) ((CallbackHandler)handler).RouteData = requestContext.RouteData; return handler; } /// <summary> /// Generic method to register all routes from a CallbackHandler /// that have RouteUrls defined on the [CallbackMethod] attribute /// </summary> /// <typeparam name="TCallbackHandler">CallbackHandler Type</typeparam> /// <param name="routes"></param> public static void RegisterRoutes<TCallbackHandler>(RouteCollection routes) { // find all methods var methods = typeof(TCallbackHandler).GetMethods(BindingFlags.Instance | BindingFlags.Public); foreach (var method in methods) { var attrs = method.GetCustomAttributes(typeof(CallbackMethodAttribute), false); if (attrs.Length < 1) continue; CallbackMethodAttribute attr = attrs[0] as CallbackMethodAttribute; if (string.IsNullOrEmpty(attr.RouteUrl)) continue; // Add the route routes.Add(method.Name, new Route(attr.RouteUrl, new CallbackHandlerRouteHandler(method.Name, typeof(TCallbackHandler)))); } } } The RouteHandler implements IRouteHandler, and its responsibility via the GetHandler method is to create an HttpHandler based on the route data. When ASP.NET calls GetHandler it passes a requestContext parameter which includes a requestContext.RouteData property. This parameter holds the current request’s route data as well as an instance of the current RouteHandler. If you look at GetHttpHandler() you can see that the code creates an instance of the handler we are interested in and then sets the RouteData property on the handler. This is how you can pass the current request’s RouteData to the handler. The RouteData object also has a  RouteData.RouteHandler property that is also available to the Handler later, which is useful in order to get additional information about the current route. In our case here the RouteHandler includes a MethodName property that identifies the method to execute in the handler since that value no longer comes from the URL so we need to figure out the method name some other way. The method name is mapped explicitly when the RouteHandler is created and here the static method that auto-registers all CallbackMethods with RouteUrls sets the method name when it creates the routes while reflecting over the methods (more on this in a minute). The important point here is that you can attach additional properties to the RouteHandler and you can then later access the RouteHandler and its properties later in the Handler to pick up these custom values. This is a crucial feature in that the RouteHandler serves in passing additional context to the handler so it knows what actions to perform. The automatic route registration is handled by the static RegisterRoutes<TCallbackHandler> method. This method is generic and totally reusable for any CallbackHandler type handler. 
To register a CallbackHandler and any RouteUrls it has defined you simple use code like this in Application_Start (or other application startup code):protected void Application_Start(object sender, EventArgs e) { // Register Routes for RestService CallbackHandlerRouteHandler.RegisterRoutes<RestService>(RouteTable.Routes); } If you have multiple CallbackHandler style services you can make multiple calls to RegisterRoutes for each of the service types. RegisterRoutes internally uses reflection to run through all the methods of the Handler, looking for CallbackMethod attributes and whether a RouteUrl is specified. If it is a new instance of a CallbackHandlerRouteHandler is created and the name of the method and the type are set. routes.Add(method.Name,           new Route(attr.RouteUrl, new CallbackHandlerRouteHandler(method.Name, typeof(TCallbackHandler) )) ); While the routing with CallbackHandlerRouteHandler is set up automatically for all methods that use the RouteUrl attribute, you can also use code to hook up those routes manually and skip using the attribute. The code for this is straightforward and just requires that you manually map each individual route to each method you want a routed: protected void Application_Start(objectsender, EventArgs e){    RegisterRoutes(RouteTable.Routes);}void RegisterRoutes(RouteCollection routes) { routes.Add("StockQuote Route",new Route("StockQuote/{symbol}",                     new CallbackHandlerRouteHandler("GetStockQuote",typeof(RestService) ) ) );     routes.Add("StockQuotes Route",new Route("StockQuotes/{symbolList}",                     new CallbackHandlerRouteHandler("GetStockQuotes",typeof(RestService) ) ) );}I think it’s clearly easier to have CallbackHandlerRouteHandler.RegisterRoutes() do this automatically for you based on RouteUrl attributes, but some people have a real aversion to attaching logic via attributes. Just realize that the option to manually create your routes is available as well. Using the RouteData in the Handler A RouteHandler’s responsibility is to create an HttpHandler and as mentioned earlier, natively IHttpHandler doesn’t have any support for RouteData. In order to utilize RouteData in your handler code you have to pass the RouteData to the handler. In my CallbackHandlerRouteHandler when it creates the HttpHandler instance it creates the instance and then assigns the custom RouteData property on the handler:IHttpHandler handler = Activator.CreateInstance(CallbackHandlerType) as IHttpHandler; if (handler is CallbackHandler) ((CallbackHandler)handler).RouteData = requestContext.RouteData; return handler; Again this only works if you actually add a RouteData property to your handler explicitly as I did in my CallbackHandler implementation:/// <summary> /// Optionally store RouteData on this handler /// so we can access it internally /// </summary> public RouteData RouteData {get; set; } and the RouteHandler needs to set it when it creates the handler instance. Once you have the route data in your handler you can access Route Keys and Values and also the RouteHandler. 
Since my RouteHandler has a custom property for the MethodName to retrieve it from within the handler I can do something like this now to retrieve the MethodName (this example is actually not in the handler but target is an instance pass to the processor): // check for Route Data method name if (target is CallbackHandler) { var routeData = ((CallbackHandler)target).RouteData; if (routeData != null) methodToCall = ((CallbackHandlerRouteHandler)routeData.RouteHandler).MethodName; } When I need to access the dynamic values in the route ( symbol in StockQuote/{symbol}) I can retrieve it easily with the Values collection (RouteData.Values["symbol"]). In my CallbackHandler processing logic I’m basically looking for matching parameter names to Route parameters: // look for parameters in the routeif(routeData != null){    string parmString = routeData.Values[parameter.Name] as string;    adjustedParms[parmCounter] = ReflectionUtils.StringToTypedValue(parmString, parameter.ParameterType);} And with that we’ve come full circle. We’ve created a custom RouteHandler() that passes the RouteData to the handler it creates. We’ve registered our routes to use the RouteHandler, and we’ve utilized the route data in our handler. For completeness sake here’s the routine that executes a method call based on the parameters passed in and one of the options is to retrieve the inbound parameters off RouteData (as well as from POST data or QueryString parameters):internal object ExecuteMethod(string method, object target, string[] parameters, CallbackMethodParameterType paramType, ref CallbackMethodAttribute callbackMethodAttribute) { HttpRequest Request = HttpContext.Current.Request; object Result = null; // Stores parsed parameters (from string JSON or QUeryString Values) object[] adjustedParms = null; Type PageType = target.GetType(); MethodInfo MI = PageType.GetMethod(method, BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic); if (MI == null) throw new InvalidOperationException("Invalid Server Method."); object[] methods = MI.GetCustomAttributes(typeof(CallbackMethodAttribute), false); if (methods.Length < 1) throw new InvalidOperationException("Server method is not accessible due to missing CallbackMethod attribute"); if (callbackMethodAttribute != null) callbackMethodAttribute = methods[0] as CallbackMethodAttribute; ParameterInfo[] parms = MI.GetParameters(); JSONSerializer serializer = new JSONSerializer(); RouteData routeData = null; if (target is CallbackHandler) routeData = ((CallbackHandler)target).RouteData; int parmCounter = 0; adjustedParms = new object[parms.Length]; foreach (ParameterInfo parameter in parms) { // Retrieve parameters out of QueryString or POST buffer if (parameters == null) { // look for parameters in the route if (routeData != null) { string parmString = routeData.Values[parameter.Name] as string; adjustedParms[parmCounter] = ReflectionUtils.StringToTypedValue(parmString, parameter.ParameterType); } // GET parameter are parsed as plain string values - no JSON encoding else if (HttpContext.Current.Request.HttpMethod == "GET") { // Look up the parameter by name string parmString = Request.QueryString[parameter.Name]; adjustedParms[parmCounter] = ReflectionUtils.StringToTypedValue(parmString, parameter.ParameterType); } // POST parameters are treated as methodParameters that are JSON encoded else if (paramType == CallbackMethodParameterType.Json) //string newVariable = methodParameters.GetValue(parmCounter) as string; adjustedParms[parmCounter] = 
serializer.Deserialize(Request.Params["parm" + (parmCounter + 1).ToString()], parameter.ParameterType); else adjustedParms[parmCounter] = SerializationUtils.DeSerializeObject( Request.Params["parm" + (parmCounter + 1).ToString()], parameter.ParameterType); } else if (paramType == CallbackMethodParameterType.Json) adjustedParms[parmCounter] = serializer.Deserialize(parameters[parmCounter], parameter.ParameterType); else adjustedParms[parmCounter] = SerializationUtils.DeSerializeObject(parameters[parmCounter], parameter.ParameterType); parmCounter++; } Result = MI.Invoke(target, adjustedParms); return Result; } The code basically uses Reflection to loop through all the parameters available on the method and tries to assign the parameters from RouteData, QueryString or POST variables. The parameters are converted into their appropriate types and then used to eventually make a Reflection based method call. What’s sweet is that the RouteData retrieval is just another option for dealing with the inbound data in this scenario and it adds exactly two lines of code plus the code to retrieve the MethodName I showed previously – a seriously low impact addition that adds a lot of extra value to this endpoint callback processing implementation. Debugging your Routes If you create a lot of routes it’s easy to run into Route conflicts where multiple routes have the same path and overlap with each other. This can be difficult to debug especially if you are using automatically generated routes like the routes created by CallbackHandlerRouteHandler.RegisterRoutes. Luckily there’s a tool that can help you out with this nicely. Phill Haack created a RouteDebugging tool you can download and add to your project. The easiest way to do this is to grab and add this to your project is to use NuGet (Add Library Package from your Project’s Reference Nodes):   which adds a RouteDebug assembly to your project. Once installed you can easily debug your routes with this simple line of code which needs to be installed at application startup:protected void Application_Start(object sender, EventArgs e) { CallbackHandlerRouteHandler.RegisterRoutes<StockService>(RouteTable.Routes); // Debug your routes RouteDebug.RouteDebugger.RewriteRoutesForTesting(RouteTable.Routes); } Any routed URL then displays something like this: The screen shows you your current route data and all the routes that are mapped along with a flag that displays which route was actually matched. This is useful – if you have any overlap of routes you will be able to see which routes are triggered – the first one in the sequence wins. This tool has saved my ass on a few occasions – and with NuGet now it’s easy to add it to your project in a few seconds and then remove it when you’re done. Routing Around Custom routing seems slightly complicated on first blush due to its disconnected components of RouteHandler, route registration and mapping of custom handlers. But once you understand the relationship between a RouteHandler, the RouteData and how to pass it to a handler, utilizing of Routing becomes a lot easier as you can easily pass context from the registration to the RouteHandler and through to the HttpHandler. The most important thing to understand when building custom routing solutions is to figure out how to map URLs in such a way that the handler can figure out all the pieces it needs to process the request. 
This can be via URL routing parameters and as I did in my example by passing additional context information as part of the RouteHandler instance that provides the proper execution context. In my case this ‘context’ was the method name, but it could be an actual static value like an enum identifying an operation or category in an application. Basically user supplied data comes in through the url and static application internal data can be passed via RouteHandler property values. Routing can make your application URLs easier to read by non-techie types regardless of whether you’re building Service type or REST applications, or full on Web interfaces. Routing in ASP.NET 4.0 makes it possible to create just about any extensionless URLs you can dream up and custom RouteHanmdler References Sample ProjectIncludes the sample CallbackHandler service discussed here along with compiled versionsof the Westwind.Web and Westwind.Utilities assemblies.  (requires .NET 4.0/VS 2010) West Wind Web Toolkit includes full implementation of CallbackHandler and the Routing Handler West Wind Web Toolkit Source CodeContains the full source code to the Westwind.Web and Westwind.Utilities assemblies usedin these samples. Includes the source described in the post.(Latest build in the Subversion Repository) CallbackHandler Source(Relevant code to this article tree in Westwind.Web assembly) JSONView FireFoxPluginA simple FireFox Plugin to easily view JSON data natively in FireFox.For IE you can use a registry hack to display JSON as raw text.© Rick Strahl, West Wind Technologies, 2005-2011Posted in ASP.NET  AJAX  HTTP  
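As a recap of the moving parts described above, a compact, self-contained version of the pattern looks roughly like this. The names are illustrative, not the toolkit's code:

    using System;
    using System.Web;
    using System.Web.Routing;

    namespace RoutingRecap
    {
        // Handler that carries the RouteData it was created for.
        public class QuoteHandler : IHttpHandler
        {
            public RouteData RouteData { get; set; }

            // Not reusable because each instance carries per-request RouteData.
            public bool IsReusable { get { return false; } }

            public void ProcessRequest(HttpContext context)
            {
                string symbol = RouteData != null ? RouteData.Values["symbol"] as string : null;
                context.Response.ContentType = "text/plain";
                context.Response.Write("Quote requested for: " + symbol);
            }
        }

        // RouteHandler whose only job is to create the handler and hand it the RouteData.
        public class QuoteRouteHandler : IRouteHandler
        {
            public IHttpHandler GetHttpHandler(RequestContext requestContext)
            {
                return new QuoteHandler { RouteData = requestContext.RouteData };
            }
        }

        // Route registration at application startup (Global.asax code-behind).
        public class Global : HttpApplication
        {
            protected void Application_Start(object sender, EventArgs e)
            {
                RouteTable.Routes.Add("QuoteRoute",
                    new Route("quote/{symbol}", new QuoteRouteHandler()));
            }
        }
    }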

    Read the article

  • AngularJS on top of ASP.NET: Moving the MVC framework out to the browser

    - by Varun Chatterji
    Heavily drawing inspiration from Ruby on Rails, MVC4’s convention over configuration model of development soon became the Holy Grail of .NET web development. The MVC model brought with it the goodness of proper separation of concerns between business logic, data, and the presentation logic. However, the MVC paradigm, was still one in which server side .NET code could be mixed with presentation code. The Razor templating engine, though cleaner than its predecessors, still encouraged and allowed you to mix .NET server side code with presentation logic. Thus, for example, if the developer required a certain <div> tag to be shown if a particular variable ShowDiv was true in the View’s model, the code could look like the following: Fig 1: To show a div or not. Server side .NET code is used in the View Mixing .NET code with HTML in views can soon get very messy. Wouldn’t it be nice if the presentation layer (HTML) could be pure HTML? Also, in the ASP.NET MVC model, some of the business logic invariably resides in the controller. It is tempting to use an anti­pattern like the one shown above to control whether a div should be shown or not. However, best practice would indicate that the Controller should not be aware of the div. The ShowDiv variable in the model should not exist. A controller should ideally, only be used to do the plumbing of getting the data populated in the model and nothing else. The view (ideally pure HTML) should render the presentation layer based on the model. In this article we will see how Angular JS, a new JavaScript framework by Google can be used effectively to build web applications where: 1. Views are pure HTML 2. Controllers (in the server sense) are pure REST based API calls 3. The presentation layer is loaded as needed from partial HTML only files. What is MVVM? MVVM short for Model View View Model is a new paradigm in web development. In this paradigm, the Model and View stuff exists on the client side through javascript instead of being processed on the server through postbacks. These frameworks are JavaScript frameworks that facilitate the clear separation of the “frontend” or the data rendering logic from the “backend” which is typically just a REST based API that loads and processes data through a resource model. The frameworks are called MVVM as a change to the Model (through javascript) gets reflected in the view immediately i.e. Model > View. Also, a change on the view (through manual input) gets reflected in the model immediately i.e. View > Model. The following figure shows this conceptually (comments are shown in red): Fig 2: Demonstration of MVVM in action In Fig 2, two text boxes are bound to the same variable model.myInt. Thus, changing the view manually (changing one text box through keyboard input) also changes the other textbox in real time demonstrating V > M property of a MVVM framework. Furthermore, clicking the button adds 1 to the value of model.myInt thus changing the model through JavaScript. This immediately updates the view (the value in the two textboxes) thus demonstrating the M > V property of a MVVM framework. Thus we see that the model in a MVVM JavaScript framework can be regarded as “the single source of truth“. This is an important concept. Angular is one such MVVM framework. We shall use it to build a simple app that sends SMS messages to a particular number. Application, Routes, Views, Controllers, Scope and Models Angular can be used in many ways to construct web applications. 
For this article, we shall only focus on building Single Page Applications (SPAs). Many of the approaches we will follow in this article have alternatives. It is beyond the scope of this article to explain every nuance in detail but we shall try to touch upon the basic concepts and end up with a working application that can be used to send SMS messages using Sent.ly Plus (a service that is itself built using Angular). Before you read on, we would like to urge you to forget what you know about Models, Views, Controllers and Routes in the ASP.NET MVC4 framework. All these words have different meanings in the Angular world. Whenever these words are used in this article, they will refer to Angular concepts and not ASP.NET MVC4 concepts. The following figure shows the skeleton of the root page of an SPA: Fig 3: The skeleton of a SPA The skeleton of the application is based on the Bootstrap starter template which can be found at: http://getbootstrap.com/examples/starter­template/ Apart from loading the Angular, jQuery and Bootstrap JavaScript libraries, it also loads our custom scripts /app/js/controllers.js /app/js/app.js These scripts define the routes, views and controllers which we shall come to in a moment. Application Notice that the body tag (Fig. 3) has an extra attribute: ng­app=”smsApp” Providing this tag “bootstraps” our single page application. It tells Angular to load a “module” called smsApp. This “module” is defined /app/js/app.js angular.module('smsApp', ['smsApp.controllers', function () {}]) Fig 4: The definition of our application module The line shows above, declares a module called smsApp. It also declares that this module “depends” on another module called “smsApp.controllers”. The smsApp.controllers module will contain all the controllers for our SPA. Routing and Views Notice that in the Navbar (in Fig 3) we have included two hyperlinks to: “#/app” “#/help” This is how Angular handles routing. Since the URLs start with “#”, they are actually just bookmarks (and not server side resources). However, our route definition (in /app/js/app.js) gives these URLs a special meaning within the Angular framework. angular.module('smsApp', ['smsApp.controllers', function () { }]) //Configure the routes .config(['$routeProvider', function ($routeProvider) { $routeProvider.when('/binding', { templateUrl: '/app/partials/bindingexample.html', controller: 'BindingController' }); }]); Fig 5: The definition of a route with an associated partial view and controller As we can see from the previous code sample, we are using the $routeProvider object in the configuration of our smsApp module. Notice how the code “asks for” the $routeProvider object by specifying it as a dependency in the [] braces and then defining a function that accepts it as a parameter. This is known as dependency injection. Please refer to the following link if you want to delve into this topic: http://docs.angularjs.org/guide/di What the above code snippet is doing is that it is telling Angular that when the URL is “#/binding”, then it should load the HTML snippet (“partial view”) found at /app/partials/bindingexample.html. Also, for this URL, Angular should load the controller called “BindingController”. We have also marked the div with the class “container” (in Fig 3) with the ng­view attribute. This attribute tells Angular that views (partial HTML pages) defined in the routes will be loaded within this div. 
You can see that the Angular JavaScript framework, unlike many other frameworks, works purely by extending HTML tags and attributes. It also allows you to extend HTML with your own tags and attributes (through directives) if you so desire, you can find out more about directives at the following URL: http://www.codeproject.com/Articles/607873/Extending­HTML­with­AngularJS­Directives Controllers and Models We have seen how we define what views and controllers should be loaded for a particular route. Let us now consider how controllers are defined. Our controllers are defined in the file /app/js/controllers.js. The following snippet shows the definition of the “BindingController” which is loaded when we hit the URL http://localhost:port/index.html#/binding (as we have defined in the route earlier as shown in Fig 5). Remember that we had defined that our application module “smsApp” depends on the “smsApp.controllers” module (see Fig 4). The code snippet below shows how the “BindingController” defined in the route shown in Fig 5 is defined in the module smsApp.controllers: angular.module('smsApp.controllers', [function () { }]) .controller('BindingController', ['$scope', function ($scope) { $scope.model = {}; $scope.model.myInt = 6; $scope.addOne = function () { $scope.model.myInt++; } }]); Fig 6: The definition of a controller in the “smsApp.controllers” module. The pieces are falling in place! Remember Fig.2? That was the code of a partial view that was loaded within the container div of the skeleton SPA shown in Fig 3. The route definition shown in Fig 5 also defined that the controller called “BindingController” (shown in Fig 6.) was loaded when we loaded the URL: http://localhost:22544/index.html#/binding The button in Fig 2 was marked with the attribute ng­click=”addOne()” which added 1 to the value of model.myInt. In Fig 6, we can see that this function is actually defined in the “BindingController”. Scope We can see from Fig 6, that in the definition of “BindingController”, we defined a dependency on $scope and then, as usual, defined a function which “asks for” $scope as per the dependency injection pattern. So what is $scope? Any guesses? As you might have guessed a scope is a particular “address space” where variables and functions may be defined. This has a similar meaning to scope in a programming language like C#. Model: The Scope is not the Model It is tempting to assign variables in the scope directly. For example, we could have defined myInt as $scope.myInt = 6 in Fig 6 instead of $scope.model.myInt = 6. The reason why this is a bad idea is that scope in hierarchical in Angular. Thus if we were to define a controller which was defined within the another controller (nested controllers), then the inner controller would inherit the scope of the parent controller. This inheritance would follow JavaScript prototypal inheritance. Let’s say the parent controller defined a variable through $scope.myInt = 6. The child controller would inherit the scope through java prototypical inheritance. This basically means that the child scope has a variable myInt that points to the parent scopes myInt variable. Now if we assigned the value of myInt in the parent, the child scope would be updated with the same value as the child scope’s myInt variable points to the parent scope’s myInt variable. 
However, if we were to assign the value of the myInt variable in the child scope, then the link of that variable to the parent scope would be broken as the variable myInt in the child scope now points to the value 6 and not to the parent scope’s myInt variable. But, if we defined a variable model in the parent scope, then the child scope will also have a variable model that points to the model variable in the parent scope. Updating the value of $scope.model.myInt in the parent scope would change the model variable in the child scope too as the variable is pointed to the model variable in the parent scope. Now changing the value of $scope.model.myInt in the child scope would ALSO change the value in the parent scope. This is because the model reference in the child scope is pointed to the scope variable in the parent. We did no new assignment to the model variable in the child scope. We only changed an attribute of the model variable. Since the model variable (in the child scope) points to the model variable in the parent scope, we have successfully changed the value of myInt in the parent scope. Thus the value of $scope.model.myInt in the parent scope becomes the “single source of truth“. This is a tricky concept, thus it is considered good practice to NOT use scope inheritance. More info on prototypal inheritance in Angular can be found in the “JavaScript Prototypal Inheritance” section at the following URL: https://github.com/angular/angular.js/wiki/Understanding­Scopes. Building It: An Angular JS application using a .NET Web API Backend Now that we have a perspective on the basic components of an MVVM application built using Angular, let’s build something useful. We will build an application that can be used to send out SMS messages to a given phone number. The following diagram describes the architecture of the application we are going to build: Fig 7: Broad application architecture We are going to add an HTML Partial to our project. This partial will contain the form fields that will accept the phone number and message that needs to be sent as an SMS. It will also display all the messages that have previously been sent. All the executable code that is run on the occurrence of events (button clicks etc.) in the view resides in the controller. The controller interacts with the ASP.NET WebAPI to get a history of SMS messages, add a message etc. through a REST based API. For the purposes of simplicity, we will use an in memory data structure for the purposes of creating this application. Thus, the tasks ahead of us are: Creating the REST WebApi with GET, PUT, POST, DELETE methods. Creating the SmsView.html partial Creating the SmsController controller with methods that are called from the SmsView.html partial Add a new route that loads the controller and the partial. 1. Creating the REST WebAPI This is a simple task that should be quite straightforward to any .NET developer. 
The following listing shows our ApiController: public class SmsMessage { public string to { get; set; } public string message { get; set; } } public class SmsResource : SmsMessage { public int smsId { get; set; } } public class SmsResourceController : ApiController { public static Dictionary<int, SmsResource> messages = new Dictionary<int, SmsResource>(); public static int currentId = 0; // GET api/<controller> public List<SmsResource> Get() { List<SmsResource> result = new List<SmsResource>(); foreach (int key in messages.Keys) { result.Add(messages[key]); } return result; } // GET api/<controller>/5 public SmsResource Get(int id) { if (messages.ContainsKey(id)) return messages[id]; return null; } // POST api/<controller> public List<SmsResource> Post([FromBody] SmsMessage value) { //Synchronize on messages so we don't have id collisions lock (messages) { //Copy the posted message into a new SmsResource (a direct cast from the bound SmsMessage would fail at runtime) SmsResource res = new SmsResource { to = value.to, message = value.message }; res.smsId = currentId++; messages.Add(res.smsId, res); //SentlyPlusSmsSender.SendMessage(value.to, value.message); return Get(); } } // PUT api/<controller>/5 public List<SmsResource> Put(int id, [FromBody] SmsMessage value) { //Synchronize on messages so we don't have id collisions lock (messages) { if (messages.ContainsKey(id)) { //Update the message messages[id].message = value.message; messages[id].to = value.to; } return Get(); } } // DELETE api/<controller>/5 public List<SmsResource> Delete(int id) { if (messages.ContainsKey(id)) { messages.Remove(id); } return Get(); } } Once this class is defined, we should be able to access the WebAPI by a simple GET request using the browser: http://localhost:port/api/SmsResource Notice the commented line: //SentlyPlusSmsSender.SendMessage The SentlyPlusSmsSender class is defined in the attached solution. We have shown this line as commented as we want to explain the core Angular concepts. If you load the attached solution, this line is uncommented in the source and an actual SMS will be sent! By default, the API returns XML. For consumption of the API in Angular, we would like it to return JSON. To change the default to JSON, we make the following change to the WebApiConfig.cs file located in the App_Start folder. public static class WebApiConfig { public static void Register(HttpConfiguration config) { config.Routes.MapHttpRoute( name: "DefaultApi", routeTemplate: "api/{controller}/{id}", defaults: new { id = RouteParameter.Optional } ); var appXmlType = config.Formatters.XmlFormatter. SupportedMediaTypes. FirstOrDefault( t => t.MediaType == "application/xml"); config.Formatters.XmlFormatter.SupportedMediaTypes.Remove(appXmlType); } } We now have our backend REST Api which we can consume from Angular! 2. Creating the SmsView.html partial This simple partial will define two fields: the destination phone number (international format starting with a +) and the message. These fields will be bound to model.phoneNumber and model.message. We will also add a button that we shall hook up to sendMessage() in the controller. A list of all previously sent messages (bound to model.allMessages) will also be displayed below the form input. 
The following code shows the code for the partial: <!-- If model.errorMessage is defined, then render the error div --> <div class="alert alert-danger alert-dismissable" style="margin-top: 30px;" ng-show="model.errorMessage != undefined"> <button type="button" class="close" data-dismiss="alert" aria-hidden="true">&times;</button> <strong>Error!</strong> <br /> {{ model.errorMessage }} </div> <!-- The input fields bound to the model --> <div class="well" style="margin-top: 30px;"> <table style="width: 100%;"> <tr> <td style="width: 45%; text-align: center;"> <input type="text" placeholder="Phone number (eg: +44 7778 609466)" ng-model="model.phoneNumber" class="form-control" style="width: 90%" onkeypress="return checkPhoneInput();" /> </td> <td style="width: 45%; text-align: center;"> <input type="text" placeholder="Message" ng-model="model.message" class="form-control" style="width: 90%" /> </td> <td style="text-align: center;"> <button class="btn btn-danger" ng-click="sendMessage();" ng-disabled="model.isAjaxInProgress" style="margin-right: 5px;">Send</button> <img src="/Content/ajax-loader.gif" ng-show="model.isAjaxInProgress" /> </td> </tr> </table> </div> <!-- The past messages --> <div style="margin-top: 30px;"> <!-- The following div is shown if there are no past messages --> <div ng-show="model.allMessages.length == 0"> No messages have been sent yet! </div> <!-- The following div is shown if there are some past messages --> <div ng-show="model.allMessages.length > 0"> <table style="width: 100%;" class="table table-striped"> <tr> <td>Phone Number</td> <td>Message</td> <td></td> </tr> <!-- The ng-repeat directive is like the repeater control in .NET, but as you can see this partial is pure HTML which is much cleaner --> <tr ng-repeat="message in model.allMessages"> <td>{{ message.to }}</td> <td>{{ message.message }}</td> <td> <button class="btn btn-danger" ng-click="delete(message.smsId);" ng-disabled="model.isAjaxInProgress">Delete</button> </td> </tr> </table> </div> </div> The above code is commented and should be self explanatory. Conditional rendering is achieved by using the ng-show="condition" attribute on various div tags. Input fields are bound to the model and the send button is bound to the sendMessage() function in the controller through the ng-click="sendMessage()" attribute defined on the button tag. While AJAX calls are taking place, the controller sets model.isAjaxInProgress to true. Based on this variable, buttons are disabled through the ng-disabled directive which is added as an attribute to the buttons. The ng-repeat directive added as an attribute to the tr tag causes the table row to be rendered multiple times much like an ASP.NET repeater. 3. Creating the SmsController controller The penultimate piece of our application is the controller which responds to events from our view and interacts with our MVC4 REST WebAPI. The following listing shows the code we need to add to /app/js/controllers.js. Note that controller definitions can be chained. Also note that this controller "asks for" the $http service. The $http service is a simple way in Angular to do AJAX. So far we have only encountered modules, controllers, views and directives in Angular. The $http is a new kind of entity in Angular called a service. More information on Angular services can be found at the following URL: http://docs.angularjs.org/guide/dev_guide.services.understanding_services. 
.controller('SmsController', ['$scope', '$http', function ($scope, $http) { //We define the model $scope.model = {}; //We define the allMessages array in the model //that will contain all the messages sent so far $scope.model.allMessages = []; //The error if any $scope.model.errorMessage = undefined; //We initially load data so set the isAjaxInProgress = true; $scope.model.isAjaxInProgress = true; //Load all the messages $http({ url: '/api/smsresource', method: "GET" }). success(function (data, status, headers, config) { // this callback will be called asynchronously // when the response is available $scope.model.allMessages = data; //We are done with AJAX loading $scope.model.isAjaxInProgress = false; }). error(function (data, status, headers, config) { //called asynchronously if an error occurs //or server returns response with an error status. $scope.model.errorMessage = "Error occurred status:" + status; //We are done with AJAX loading $scope.model.isAjaxInProgress = false; }); $scope.delete = function (id) { //We are making an ajax call so we set this to true $scope.model.isAjaxInProgress = true; $http({ url: '/api/smsresource/' + id, method: "DELETE" }). success(function (data, status, headers, config) { // this callback will be called asynchronously // when the response is available $scope.model.allMessages = data; //We are done with AJAX loading $scope.model.isAjaxInProgress = false; }). error(function (data, status, headers, config) { // called asynchronously if an error occurs // or server returns response with an error status. $scope.model.errorMessage = "Error occurred status:" + status; //We are done with AJAX loading $scope.model.isAjaxInProgress = false; }); } $scope.sendMessage = function () { $scope.model.errorMessage = undefined; var message = ''; if($scope.model.message != undefined) message = $scope.model.message.trim(); if ($scope.model.phoneNumber == undefined || $scope.model.phoneNumber == '' || $scope.model.phoneNumber.length < 10 || $scope.model.phoneNumber[0] != '+') { $scope.model.errorMessage = "You must enter a valid phone number in international format. Eg: +44 7778 609466"; return; } if (message.length == 0) { $scope.model.errorMessage = "You must specify a message!"; return; } //We are making an ajax call so we set this to true $scope.model.isAjaxInProgress = true; $http({ url: '/api/smsresource', method: "POST", data: { to: $scope.model.phoneNumber, message: $scope.model.message } }). success(function (data, status, headers, config) { // this callback will be called asynchronously // when the response is available $scope.model.allMessages = data; //We are done with AJAX loading $scope.model.isAjaxInProgress = false; }). error(function (data, status, headers, config) { // called asynchronously if an error occurs // or server returns response with an error status. $scope.model.errorMessage = "Error occurred status:" + status; // We are done with AJAX loading $scope.model.isAjaxInProgress = false; }); } }]); We can see from the previous listing how the functions that are called from the view are defined in the controller. It should also be evident how easy it is to make AJAX calls to consume our MVC4 REST WebAPI. Now we are left with the final piece. We need to define a route that associates a particular path with the view we have defined and the controller we have defined. 4. Add a new route that loads the controller and the partial This is the easiest part of the puzzle. 
We simply define another route in the /app/js/app.js file: $routeProvider.when('/sms', { templateUrl: '/app/partials/smsview.html', controller: 'SmsController' }); Conclusion In this article we have seen how much of the server side functionality in the MVC4 framework can be moved to the browser thus delivering a snappy and fast user interface. We have seen how we can build client side HTML only views that avoid the messy syntax offered by server side Razor views. We have built a functioning app from the ground up. The significant advantage of this approach to building web apps is that the front end can be completely platform independent. Even though we used ASP.NET to create our REST API, we could just easily have used any other language such as Node.js, Ruby etc without changing a single line of our front end code. Angular is a rich framework and we have only touched on basic functionality required to create a SPA. For readers who wish to delve further into the Angular framework, we would recommend the following URL as a starting point: http://docs.angularjs.org/misc/started. To get started with the code for this project: Sign up for an account at http://plus.sent.ly (free) Add your phone number Go to the “My Identies Page” Note Down your Sender ID, Consumer Key and Consumer Secret Download the code for this article at: https://docs.google.com/file/d/0BzjEWqSE31yoZjZlV0d0R2Y3eW8/edit?usp=sharing Change the values of Sender Id, Consumer Key and Consumer Secret in the web.config file Run the project through Visual Studio!
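As a closing aside, the Web API backend from step 1 can also be exercised directly from C# code, for example in a quick unit test, since the controller is just a plain class. The following is a hedged sketch that is not part of the original article; it only uses members shown in the SmsResourceController listing above and assumes the static message dictionary starts out empty:

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class SmsResourceControllerTests
    {
        [TestMethod]
        public void PostGetDeleteSmokeTest()
        {
            var controller = new SmsResourceController();

            // POST adds a message and returns the full list
            var afterPost = controller.Post(new SmsMessage { to = "+44 7778 609466", message = "Hello from a test" });
            Assert.AreEqual(1, afterPost.Count);

            // GET by id returns the resource that was just created
            var single = controller.Get(afterPost[0].smsId);
            Assert.AreEqual("+44 7778 609466", single.to);

            // DELETE removes it again and returns the remaining list
            var afterDelete = controller.Delete(single.smsId);
            Assert.AreEqual(0, afterDelete.Count);
        }
    }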

    Read the article

  • Windows Azure: Import/Export Hard Drives, VM ACLs, Web Sockets, Remote Debugging, Continuous Delivery, New Relic, Billing Alerts and More

    - by ScottGu
    Two weeks ago we released a giant set of improvements to Windows Azure, as well as a significant update of the Windows Azure SDK. This morning we released another massive set of enhancements to Windows Azure.  Today’s new capabilities include: Storage: Import/Export Hard Disk Drives to your Storage Accounts HDInsight: General Availability of our Hadoop Service in the cloud Virtual Machines: New VM Gallery, ACL support for VIPs Web Sites: WebSocket and Remote Debugging Support Notification Hubs: Segmented customer push notification support with tag expressions TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services Developer Analytics: New Relic support for Web Sites + Mobile Services Service Bus: Support for partitioned queues and topics Billing: New Billing Alert Service that sends emails notifications when your bill hits a threshold you define All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them. Storage: Import/Export Hard Disk Drives to Windows Azure I am excited to announce the preview of our new Windows Azure Import/Export Service! The Windows Azure Import/Export Service enables you to move large amounts of on-premises data into and out of your Windows Azure Storage accounts. It does this by enabling you to securely ship hard disk drives directly to our Windows Azure data centers. Once we receive the drives we’ll automatically transfer the data to or from your Windows Azure Storage account.  This enables you to import or export massive amounts of data more quickly and cost effectively (and not be constrained by available network bandwidth). Encrypted Transport Our Import/Export service provides built-in support for BitLocker disk encryption – which enables you to securely encrypt data on the hard drives before you send it, and not have to worry about it being compromised even if the disk is lost/stolen in transit (since the content on the transported hard drives is completely encrypted and you are the only one who has the key to it).  The drive preparation tool we are shipping today makes setting up bitlocker encryption on these hard drives easy. How to Import/Export your first Hard Drive of Data You can read our Getting Started Guide to learn more about how to begin using the import/export service.  You can create import and export jobs via the Windows Azure Management Portal as well as programmatically using our Server Management APIs. It is really easy to create a new import or export job using the Windows Azure Management Portal.  Simply navigate to a Windows Azure storage account, and then click the new Import/Export tab now available within it (note: if you don’t have this tab make sure to sign-up for the Import/Export preview): Then click the “Create Import Job” or “Create Export Job” commands at the bottom of it.  This will launch a wizard that easily walks you through the steps required: For more comprehensive information about Import/Export, refer to Windows Azure Storage team blog.  You can also send questions and comments to the [email protected] email address. We think you’ll find this new service makes it much easier to move data into and out of Windows Azure, and it will dramatically cut down the network bandwidth required when working on large data migration projects.  We hope you like it. HDInsight: 100% Compatible Hadoop Service in the Cloud Last week we announced the general availability release of Windows Azure HDInsight. 
HDInsight is a 100% compatible Hadoop service that allows you to easily provision and manage Hadoop clusters for big data processing in Windows Azure.  This release is now live in production, backed by an enterprise SLA, supported 24x7 by Microsoft Support, and is ready to use for production scenarios. HDInsight allows you to use Apache Hadoop tools, such as Pig and Hive, to process large amounts of data in Windows Azure Blob Storage. Because data is stored in Windows Azure Blob Storage, you can choose to dynamically create Hadoop clusters only when you need them, and then shut them down when they are no longer required (since you pay only for the time the Hadoop cluster instances are running this provides a super cost effective way to use them).  You can create Hadoop clusters using either the Windows Azure Management Portal (see below) or using our PowerShell and Cross Platform Command line tools: The import/export hard drive support that came out today is a perfect companion service to use with HDInsight – the combination allows you to easily ingest, process and optionally export a limitless amount of data.  We’ve also integrated HDInsight with our Business Intelligence tools, so users can leverage familiar tools like Excel in order to analyze the output of jobs.  You can find out more about how to get started with HDInsight here. Virtual Machines: VM Gallery Enhancements Today’s update of Windows Azure brings with it a new Virtual Machine gallery that you can use to create new VMs in the cloud.  You can launch the gallery by doing New->Compute->Virtual Machine->From Gallery within the Windows Azure Management Portal: The new Virtual Machine Gallery includes some nice enhancements that make it even easier to use: Search: You can now easily search and filter images using the search box in the top-right of the dialog.  For example, simply type “SQL” and we’ll filter to show those images in the gallery that contain that substring. Category Tree-view: Each month we add more built-in VM images to the gallery.  You can continue to browse these using the “All” view within the VM Gallery – or now quickly filter them using the category tree-view on the left-hand side of the dialog.  For example, by selecting “Oracle” in the tree-view you can now quickly filter to see the official Oracle supplied images. MSDN and Supported checkboxes: With today’s update we are also introducing filters that makes it easy to filter out types of images that you may not be interested in. The first checkbox is MSDN: using this filter you can exclude any image that is not part of the Windows Azure benefits for MSDN subscribers (which have highly discounted pricing - you can learn more about the MSDN pricing here). The second checkbox is Supported: this filter will exclude any image that contains prerelease software, so you can feel confident that the software you choose to deploy is fully supported by Windows Azure and our partners. Sort options: We sort gallery images by what we think customers are most interested in, but sometimes you might want to sort using different views. So we’re providing some additional sort options, like “Newest,” to customize the image list for what suits you best. Pricing information: We now provide additional pricing information about images and options on how to cost effectively run them directly within the VM Gallery. The above improvements make it even easier to use the VM Gallery and quickly create launch and run Virtual Machines in the cloud. 
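Going back to the HDInsight point above: since HDInsight clusters read their input straight from Windows Azure Blob Storage, loading data for a Hadoop job is just an ordinary blob upload. A minimal hedged sketch using the .NET storage client library (the connection string, container and file names are placeholders):

    using System.IO;
    using Microsoft.WindowsAzure.Storage;

    var account = CloudStorageAccount.Parse(storageConnectionString);
    var blobClient = account.CreateCloudBlobClient();

    // A Hive or Pig job on the HDInsight cluster can later point at this container
    var container = blobClient.GetContainerReference("hdinsight-input");
    container.CreateIfNotExists();

    var blob = container.GetBlockBlobReference("logs/2013-10-28.log");
    using (var stream = File.OpenRead(@"C:\data\2013-10-28.log"))
    {
        blob.UploadFromStream(stream);
    }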
Virtual Machines: ACL Support for VIPs A few months ago we exposed the ability to configure Access Control Lists (ACLs) for Virtual Machines using Windows PowerShell cmdlets and our Service Management API. With today’s release, you can now configure VM ACLs using the Windows Azure Management Portal as well. You can now do this by clicking the new Manage ACL command in the Endpoints tab of a virtual machine instance: This will enable you to configure an ordered list of permit and deny rules to scope the traffic that can access your VM’s network endpoints. For example, if you were on a virtual network, you could limit RDP access to a Windows Azure virtual machine to only a few computers attached to your enterprise. Or if you weren’t on a virtual network you could alternatively limit traffic from public IPs that can access your workloads: Here is the default behaviors for ACLs in Windows Azure: By default (i.e. no rules specified), all traffic is permitted. When using only Permit rules, all other traffic is denied. When using only Deny rules, all other traffic is permitted. When there is a combination of Permit and Deny rules, all other traffic is denied. Lastly, remember that configuring endpoints does not automatically configure them within the VM if it also has firewall rules enabled at the OS level.  So if you create an endpoint using the Windows Azure Management Portal, Windows PowerShell, or REST API, be sure to also configure your guest VM firewall appropriately as well. Web Sites: Web Sockets Support With today’s release you can now use Web Sockets with Windows Azure Web Sites.  This feature enables you to easily integrate real-time communication scenarios within your web based applications, and is available at no extra charge (it even works with the free tier).  Higher level programming libraries like SignalR and socket.io are also now supported with it. You can enable Web Sockets support on a web site by navigating to the Configure tab of a Web Site, and by toggling Web Sockets support to “on”: Once Web Sockets is enabled you can start to integrate some really cool scenarios into your web applications.  Check out the new SignalR documentation hub on www.asp.net to learn more about some of the awesome scenarios you can do with it. Web Sites: Remote Debugging Support The Windows Azure SDK 2.2 we released two weeks ago introduced remote debugging support for Windows Azure Cloud Services. With today’s Windows Azure release we are extending this remote debugging support to also work with Windows Azure Web Sites. With live, remote debugging support inside of Visual Studio, you are able to have more visibility than ever before into how your code is operating live in Windows Azure. It is now super easy to attach the debugger and quickly see what is going on with your application in the cloud. Remote Debugging of a Windows Azure Web Site using VS 2013 Enabling the remote debugging of a Windows Azure Web Site using VS 2013 is really easy.  Start by opening up your web application’s project within Visual Studio. Then navigate to the “Server Explorer” tab within Visual Studio, and click on the deployed web-site you want to debug that is running within Windows Azure using the Windows Azure->Web Sites node in the Server Explorer.  Then right-click and choose the “Attach Debugger” option on it: When you do this Visual Studio will remotely attach the debugger to the Web Site running within Windows Azure.  
The debugger will then stop the web site’s execution when it hits any break points that you have set within your web application’s project inside Visual Studio.  For example, below I set a breakpoint on the “ViewBag.Message” assignment statement within the HomeController of the standard ASP.NET MVC project template.  When I hit refresh on the “About” page of the web site within the browser, the breakpoint was triggered and I am now able to debug the app remotely using Visual Studio: Note above how we can debug variables (including autos/watchlist/etc), as well as use the Immediate and Command Windows. In the debug session above I used the Immediate Window to explore some of the request object state, as well as to dynamically change the ViewBag.Message property.  When we click the the “Continue” button (or press F5) the app will continue execution and the Web Site will render the content back to the browser.  This makes it super easy to debug web apps remotely. Tips for Better Debugging To get the best experience while debugging, we recommend publishing your site using the Debug configuration within Visual Studio’s Web Publish dialog. This will ensure that debug symbol information is uploaded to the Web Site which will enable a richer debug experience within Visual Studio.  You can find this option on the Web Publish dialog on the Settings tab: When you ultimately deploy/run the application in production we recommend using the “Release” configuration setting – the release configuration is memory optimized and will provide the best production performance.  To learn more about diagnosing and debugging Windows Azure Web Sites read our new Troubleshooting Windows Azure Web Sites in Visual Studio guide. Notification Hubs: Segmented Push Notification support with tag expressions In August we announced the General Availability of Windows Azure Notification Hubs - a powerful Mobile Push Notifications service that makes it easy to send high volume push notifications with low latency from any mobile app back-end.  Notification hubs can be used with any mobile app back-end (including ones built using our Mobile Services capability) and can also be used with back-ends that run in the cloud as well as on-premises. Beginning with the initial release, Notification Hubs allowed developers to send personalized push notifications to both individual users as well as groups of users by interest, by associating their devices with tags representing the logical target of the notification. For example, by registering all devices of customers interested in a favorite MLB team with a corresponding tag, it is possible to broadcast one message to millions of Boston Red Sox fans and another message to millions of St. Louis Cardinals fans with a single API call respectively. New support for using tag expressions to enable advanced customer segmentation With today’s release we are adding support for even more advanced customer targeting.  You can now identify customers that you want to send push notifications to by defining rich tag expressions. With tag expressions, you can now not only broadcast notifications to Boston Red Sox fans, but take that segmenting a step farther and reach more granular segments. This opens up a variety of scenarios, for example: Offers based on multiple preferences—e.g. send a game day vegetarian special to users tagged as both a Boston Red Sox fan AND a vegetarian Push content to multiple segments in a single message—e.g. 
rain delay information only to users who are tagged as either a Boston Red Sox fan OR a St. Louis Cardinal fan Avoid presenting subsets of a segment with irrelevant content—e.g. season ticket availability reminder to users who are tagged as a Boston Red Sox fan but NOT also a season ticket holder To illustrate with code, consider a restaurant chain app that sends an offer related to a Red Sox vs Cardinals game for users in Boston. Devices can be tagged by your app with location tags (e.g. “Loc:Boston”) and interest tags (e.g. “Follows:RedSox”, “Follows:Cardinals”), and then a notification can be sent by your back-end to “(Follows:RedSox || Follows:Cardinals) && Loc:Boston” in order to deliver an offer to all devices in Boston that follow either the RedSox or the Cardinals. This can be done directly in your server backend send logic using the code below: var notification = new WindowsNotification(messagePayload); hub.SendNotificationAsync(notification, "(Follows:RedSox || Follows:Cardinals) && Loc:Boston"); In your expressions you can use all Boolean operators: AND (&&), OR (||), and NOT (!).  Some other cool use cases for tag expressions that are now supported include: Social: To “all my group except me” - group:id && !user:id Events: Touchdown event is sent to everybody following either team or any of the players involved in the action: Followteam:A || Followteam:B || followplayer:1 || followplayer:2 … Hours: Send notifications at specific times. E.g. Tag devices with time zone and when it is 12pm in Seattle send to: GMT8 && follows:thaifood Versions and platforms: Send a reminder to people still using your first version for Android - version:1.0 && platform:Android For help on getting started with Notification Hubs, visit the Notification Hub documentation center.  Then download the latest NuGet package (or use the Notification Hubs REST APIs directly) to start sending push notifications using tag expressions.  They are really powerful and enable a bunch of great new scenarios. TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services With today’s Windows Azure release we are making it really easy to enable continuous delivery support with Windows Azure and Team Foundation Services.  Team Foundation Services is a cloud based offering from Microsoft that provides integrated source control (with both TFS and Git support), build server, test execution, collaboration tools, and agile planning support.  It makes it really easy to setup a team project (complete with automated builds and test runners) in the cloud, and it has really rich integration with Visual Studio. With today’s Windows Azure release it is now really easy to enable continuous delivery support with both TFS and Git based repositories hosted using Team Foundation Services.  This enables a workflow where when code is checked in, built successfully on an automated build server, and all tests pass on it – I can automatically have the app deployed on Windows Azure with zero manual intervention or work required. The below screen-shots demonstrate how to quickly setup a continuous delivery workflow to Windows Azure with a Git-based ASP.NET MVC project hosted using Team Foundation Services. Enabling Continuous Delivery to Windows Azure with Team Foundation Services The project I’m going to enable continuous delivery with is a simple ASP.NET MVC project whose source code I’m hosting using Team Foundation Services.  
I did this by creating a “SimpleContinuousDeploymentTest” repository there using Git – and then used the new built-in Git tooling support within Visual Studio 2013 to push the source code to it.  Below is a screen-shot of the Git repository hosted within Team Foundation Services: I can access the repository within Visual Studio 2013 and easily make commits with it (as well as branch, merge and do other tasks).  Using VS 2013 I can also setup automated builds to take place in the cloud using Team Foundation Services every time someone checks in code to the repository: The cool thing about this is that I don’t have to buy or rent my own build server – Team Foundation Services automatically maintains its own build server farm and can automatically queue up a build for me (for free) every time someone checks in code using the above settings.  This build server (and automated testing) support now works with both TFS and Git based source control repositories. Connecting a Team Foundation Services project to Windows Azure Once I have a source repository hosted in Team Foundation Services with Automated Builds and Testing set up, I can then go even further and set it up so that it will be automatically deployed to Windows Azure when a source code commit is made to the repository (assuming the Build + Tests pass).  Enabling this is now really easy.  To set this up with a Windows Azure Web Site simply use the New->Compute->Web Site->Custom Create command inside the Windows Azure Management Portal.  This will create a dialog like below.  I gave the web site a name and then made sure the “Publish from source control” checkbox was selected: When we click next we’ll be prompted for the location of the source repository.  We’ll select “Team Foundation Services”: Once we do this we’ll be prompted for our Team Foundation Services account that our source repository is hosted under (in this case my TFS account is “scottguthrie”): When we click the “Authorize Now” button we’ll be prompted to give Windows Azure permissions to connect to the Team Foundation Services account.  Once we do this we’ll be prompted to pick the source repository we want to connect to.  Starting with today’s Windows Azure release you can now connect to both TFS and Git based source repositories.  This new support allows me to connect to the “SimpleContinuousDeploymentTest” respository we created earlier: Clicking the finish button will then create the Web Site with the continuous delivery hooks setup with Team Foundation Services.  Now every time someone pushes source control to the repository in Team Foundation Services, it will kick off an automated build, run all of the unit tests in the solution , and if they pass the app will be automatically deployed to our Web Site in Windows Azure.  You can monitor the history and status of these automated deployments using the Deployments tab within the Web Site: This enables a really slick continuous delivery workflow, and enables you to build and deploy apps in a really nice way. Developer Analytics: New Relic support for Web Sites + Mobile Services With today’s Windows Azure release we are making it really easy to enable Developer Analytics and Monitoring support with both Windows Azure Web Site and Windows Azure Mobile Services.  We are partnering with New Relic, who provide a great dev analytics and app performance monitoring offering, to enable this - and we have updated the Windows Azure Management Portal to make it really easy to configure. 
Enabling New Relic with a Windows Azure Web Site Enabling New Relic support with a Windows Azure Web Site is now really easy.  Simply navigate to the Configure tab of a Web Site and scroll down to the “developer analytics” section that is now within it: Clicking the “add-on” button will display some additional UI.  If you don’t already have a New Relic subscription, you can click the “view windows azure store” button to obtain a subscription (note: New Relic has a perpetually free tier so you can enable it even without paying anything): Clicking the “view windows azure store” button will launch the integrated Windows Azure Store experience we have within the Windows Azure Management Portal.  You can use this to browse from a variety of great add-on services – including New Relic: Select “New Relic” within the dialog above, then click the next button, and you’ll be able to choose which type of New Relic subscription you wish to purchase.  For this demo we’ll simply select the “Free Standard Version” – which does not cost anything and can be used forever:  Once we’ve signed-up for our New Relic subscription and added it to our Windows Azure account, we can go back to the Web Site’s configuration tab and choose to use the New Relic add-on with our Windows Azure Web Site.  We can do this by simply selecting it from the “add-on” dropdown (it is automatically populated within it once we have a New Relic subscription in our account): Clicking the “Save” button will then cause the Windows Azure Management Portal to automatically populate all of the needed New Relic configuration settings to our Web Site: Deploying the New Relic Agent as part of a Web Site The final step to enable developer analytics using New Relic is to add the New Relic runtime agent to our web app.  We can do this within Visual Studio by right-clicking on our web project and selecting the “Manage NuGet Packages” context menu: This will bring up the NuGet package manager.  You can search for “New Relic” within it to find the New Relic agent.  Note that there is both a 32-bit and 64-bit edition of it – make sure to install the version that matches how your Web Site is running within Windows Azure (note: you can configure your Web Site to run in either 32-bit or 64-bit mode using the Web Site’s “Configuration” tab within the Windows Azure Management Portal): Once we install the NuGet package we are all set to go.  We’ll simply re-publish the web site again to Windows Azure and New Relic will now automatically start monitoring the application Monitoring a Web Site using New Relic Now that the application has developer analytics support with New Relic enabled, we can launch the New Relic monitoring portal to start monitoring the health of it.  We can do this by clicking on the “Add Ons” tab in the left-hand side of the Windows Azure Management Portal.  Then select the New Relic add-on we signed-up for within it.  The Windows Azure Management Portal will provide some default information about the add-on when we do this.  Clicking the “Manage” button in the tray at the bottom will launch a new browser tab and single-sign us into the New Relic monitoring portal associated with our account: When we do this a new browser tab will launch with the New Relic admin tool loaded within it: We can now see insights into how our app is performing – without having to have written a single line of monitoring code.  
The New Relic service provides a ton of great built-in monitoring features allowing us to quickly see: Performance times (including browser rendering speed) for the overall site and individual pages.  You can optionally set alert thresholds to trigger if the speed does not meet a threshold you specify. Information about where in the world your customers are hitting the site from (and how performance varies by region) Details on the latency performance of external services your web apps are using (for example: SQL, Storage, Twitter, etc) Error information including call stack details for exceptions that have occurred at runtime SQL Server profiling information – including which queries executed against your database and what their performance was And a whole bunch more… The cool thing about New Relic is that you don’t need to write monitoring code within your application to get all of the above reports (plus a lot more).  The New Relic agent automatically enables the CLR profiler within applications and automatically captures the information necessary to identify these.  This makes it super easy to get started and immediately have a rich developer analytics view for your solutions with very little effort. If you haven’t tried New Relic out yet with Windows Azure I recommend you do so – I think you’ll find it helps you build even better cloud applications.  Following the above steps will help you get started and deliver you a really good application monitoring solution in only minutes. Service Bus: Support for partitioned queues and topics With today’s release, we are enabling support within Service Bus for partitioned queues and topics. Enabling partitioning enables you to achieve a higher message throughput and better availability from your queues and topics. Higher message throughput is achieved by implementing multiple message brokers for each partitioned queue and topic.  The  multiple messaging stores will also provide higher availability. You can create a partitioned queue or topic by simply checking the Enable Partitioning option in the custom create wizard for a Queue or Topic: Read this article to learn more about partitioned queues and topics and how to take advantage of them today. Billing: New Billing Alert Service Today’s Windows Azure update enables a new Billing Alert Service Preview that enables you to get proactive email notifications when your Windows Azure bill goes above a certain monetary threshold that you configure.  This makes it easier to manage your bill and avoid potential surprises at the end of the month. With the Billing Alert Service Preview, you can now create email alerts to monitor and manage your monetary credits or your current bill total.  To set up an alert first sign-up for the free Billing Alert Service Preview.  Then visit the account management page, click on a subscription you have setup, and then navigate to the new Alerts tab that is available: The alerts tab allows you to setup email alerts that will be sent automatically once a certain threshold is hit.  For example, by clicking the “add alert” button above I can setup a rule to send myself email anytime my Windows Azure bill goes above $100 for the month: The Billing Alert Service will evolve to support additional aspects of your bill as well as support multiple forms of alerts such as SMS.  Try out the new Billing Alert Service Preview today and give us feedback. 
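Tying back to the Service Bus section above: the same partitioned queues and topics can also be created from code with the Service Bus .NET SDK rather than through the portal checkbox. A minimal hedged sketch (the connection string and entity names are placeholders):

    using Microsoft.ServiceBus;
    using Microsoft.ServiceBus.Messaging;

    var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

    // EnablePartitioning spreads the entity across multiple message brokers and
    // messaging stores, which is what provides the extra throughput and availability
    if (!namespaceManager.QueueExists("orders"))
        namespaceManager.CreateQueue(new QueueDescription("orders") { EnablePartitioning = true });

    if (!namespaceManager.TopicExists("ordertopic"))
        namespaceManager.CreateTopic(new TopicDescription("ordertopic") { EnablePartitioning = true });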
Summary Today’s Windows Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier. If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • An Xml Serializable PropertyBag Dictionary Class for .NET

    - by Rick Strahl
    I don't know about you but I frequently need property bags in my applications to store and possibly cache arbitrary data. Dictionary<T,V> works well for this although I always seem to be hunting for a more specific generic type that provides a string key based dictionary. There's string dictionary, but it only works with strings. There's Hashset<T> but it uses the actual values as keys. In most key value pair situations for me string is key value to work off. Dictionary<T,V> works well enough, but there are some issues with serialization of dictionaries in .NET. The .NET framework doesn't do well serializing IDictionary objects out of the box. The XmlSerializer doesn't support serialization of IDictionary via it's default serialization, and while the DataContractSerializer does support IDictionary serialization it produces some pretty atrocious XML. What doesn't work? First off Dictionary serialization with the Xml Serializer doesn't work so the following fails: [TestMethod] public void DictionaryXmlSerializerTest() { var bag = new Dictionary<string, object>(); bag.Add("key", "Value"); bag.Add("Key2", 100.10M); bag.Add("Key3", Guid.NewGuid()); bag.Add("Key4", DateTime.Now); bag.Add("Key5", true); bag.Add("Key7", new byte[3] { 42, 45, 66 }); TestContext.WriteLine(this.ToXml(bag)); } public string ToXml(object obj) { if (obj == null) return null; StringWriter sw = new StringWriter(); XmlSerializer ser = new XmlSerializer(obj.GetType()); ser.Serialize(sw, obj); return sw.ToString(); } The error you get with this is: System.NotSupportedException: The type System.Collections.Generic.Dictionary`2[[System.String, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089],[System.Object, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]] is not supported because it implements IDictionary. Got it! BTW, the same is true with binary serialization. 
Running the same code above against the DataContractSerializer does work: [TestMethod] public void DictionaryDataContextSerializerTest() { var bag = new Dictionary<string, object>(); bag.Add("key", "Value"); bag.Add("Key2", 100.10M); bag.Add("Key3", Guid.NewGuid()); bag.Add("Key4", DateTime.Now); bag.Add("Key5", true); bag.Add("Key7", new byte[3] { 42, 45, 66 }); TestContext.WriteLine(this.ToXmlDcs(bag)); } public string ToXmlDcs(object value, bool throwExceptions = false) { var ser = new DataContractSerializer(value.GetType(), null, int.MaxValue, true, false, null); MemoryStream ms = new MemoryStream(); ser.WriteObject(ms, value); return Encoding.UTF8.GetString(ms.ToArray(), 0, (int)ms.Length); } This DOES work but produces some pretty heinous XML (formatted with line breaks and indentation here): <ArrayOfKeyValueOfstringanyType xmlns="http://schemas.microsoft.com/2003/10/Serialization/Arrays" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"> <KeyValueOfstringanyType> <Key>key</Key> <Value i:type="a:string" xmlns:a="http://www.w3.org/2001/XMLSchema">Value</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key2</Key> <Value i:type="a:decimal" xmlns:a="http://www.w3.org/2001/XMLSchema">100.10</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key3</Key> <Value i:type="a:guid" xmlns:a="http://schemas.microsoft.com/2003/10/Serialization/">2cd46d2a-a636-4af4-979b-e834d39b6d37</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key4</Key> <Value i:type="a:dateTime" xmlns:a="http://www.w3.org/2001/XMLSchema">2011-09-19T17:17:05.4406999-07:00</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key5</Key> <Value i:type="a:boolean" xmlns:a="http://www.w3.org/2001/XMLSchema">true</Value> </KeyValueOfstringanyType> <KeyValueOfstringanyType> <Key>Key7</Key> <Value i:type="a:base64Binary" xmlns:a="http://www.w3.org/2001/XMLSchema">Ki1C</Value> </KeyValueOfstringanyType> </ArrayOfKeyValueOfstringanyType> Ouch! That seriously hurts the eye! :-) Worse though it's extremely verbose with all those repetitive namespace declarations. It's good to know that it works in a pinch, but for a human readable/editable solution or something lightweight to store in a database it's not quite ideal. Why should I care? As a little background, in one of my applications I have a need for a flexible property bag that is used on a free form database field on an otherwise static entity. Basically what I have is a standard database record to which arbitrary properties can be added in an XML based string field. I intend to expose those arbitrary properties as a collection from field data stored in XML. The concept is pretty simple: When loading write the data to the collection, when the data is saved serialize the data into an XML string and store it into the database. When reading the data pick up the XML and if the collection on the entity is accessed automatically deserialize the XML into the Dictionary. (I'll talk more about this in another post). While the DataContext Serializer would work, it's verbosity is problematic both for size of the generated XML strings and the fact that users can manually edit this XML based property data in an advanced mode. A clean(er) layout certainly would be preferable and more user friendly. Custom XMLSerialization with a PropertyBag Class So… after a bunch of experimentation with different serialization formats I decided to create a custom PropertyBag class that provides for a serializable Dictionary. 
It's basically a custom Dictionary<string,TValue> implementation where the key is always a string. The results are PropertyBag<TValue> and PropertyBag (which defaults to the object type for values). The PropertyBag<TValue> and PropertyBag classes provide these features: they subclass Dictionary<string,TValue>; implement IXmlSerializable with a cleanish XML format; expose ToXml() and FromXml() methods to export and import to and from XML strings; and offer a static CreateFromXml() method to create an instance from an XML string. It's simple enough, as it's merely a Dictionary subclass, but it supports serialization to a (what I think at least) cleaner XML format. The class is super simple to use: [TestMethod] public void PropertyBagTwoWayObjectSerializationTest() { var bag = new PropertyBag(); bag.Add("key", "Value"); bag.Add("Key2", 100.10M); bag.Add("Key3", Guid.NewGuid()); bag.Add("Key4", DateTime.Now); bag.Add("Key5", true); bag.Add("Key7", new byte[3] { 42,45,66 } ); bag.Add("Key8", null); bag.Add("Key9", new ComplexObject() { Name = "Rick", Entered = DateTime.Now, Count = 10 }); string xml = bag.ToXml(); TestContext.WriteLine(bag.ToXml()); bag.Clear(); bag.FromXml(xml); Assert.IsTrue(bag["key"] as string == "Value"); Assert.IsInstanceOfType( bag["Key3"], typeof(Guid)); Assert.IsNull(bag["Key8"]); //Assert.IsNull(bag["Key10"]); Assert.IsInstanceOfType(bag["Key9"], typeof(ComplexObject)); } This uses the PropertyBag class, which derives from PropertyBag<object> - which means it returns untyped values of type object. I suspect for me this will be the most common scenario, as I'd want to store arbitrary values in the PropertyBag rather than one specific type. The same code with a strongly typed PropertyBag<decimal> looks like this: [TestMethod] public void PropertyBagTwoWayValueTypeSerializationTest() { var bag = new PropertyBag<decimal>(); bag.Add("key", 10M); bag.Add("Key1", 100.10M); bag.Add("Key2", 200.10M); bag.Add("Key3", 300.10M); string xml = bag.ToXml(); TestContext.WriteLine(bag.ToXml()); bag.Clear(); bag.FromXml(xml); Assert.IsTrue(bag.Get("Key1") == 100.10M); Assert.IsTrue(bag.Get("Key3") == 300.10M); } and produces typed results of type decimal. The values can be either value or reference types, a combination that actually proved to be a little trickier than anticipated due to the null and string-specific checks required - getting the generic typing right required use of default(T) and Convert.ChangeType() to trick the compiler into playing nice. Of course the whole raison d'être for this class is the XML serialization. You can see in the code above that we're doing a .ToXml() and .FromXml() to serialize to and from string.
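Before looking at the generated XML, here's a quick illustration of the default(T)/Convert.ChangeType() trick mentioned above. This is a simplified sketch of the idea, not the actual Westwind implementation:

    // Simplified sketch (not the Westwind code) of handling both value and
    // reference types in a generic string-keyed bag using default(T) and
    // Convert.ChangeType().
    using System;
    using System.Collections.Generic;

    public class TypedBagSketch<TValue> : Dictionary<string, TValue>
    {
        // Missing keys come back as default(TValue): null for reference types,
        // zero/false/empty for value types.
        public TValue Get(string key)
        {
            TValue value;
            return TryGetValue(key, out value) ? value : default(TValue);
        }

        // Raw string content (as read from XML) is coerced into TValue.
        // Convert.ChangeType() covers the simple IConvertible types such as
        // decimal, bool and DateTime.
        public void AddFromString(string key, string raw)
        {
            this[key] = raw == null
                ? default(TValue)
                : (TValue)Convert.ChangeType(raw, typeof(TValue));
        }
    }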
The XML produced for the first example looks like this:
<?xml version="1.0" encoding="utf-8"?>
<properties>
  <item>
    <key>key</key>
    <value>Value</value>
  </item>
  <item>
    <key>Key2</key>
    <value type="decimal">100.10</value>
  </item>
  <item>
    <key>Key3</key>
    <value type="___System.Guid">
      <guid>f7a92032-0c6d-4e9d-9950-b15ff7cd207d</guid>
    </value>
  </item>
  <item>
    <key>Key4</key>
    <value type="datetime">2011-09-26T17:45:58.5789578-10:00</value>
  </item>
  <item>
    <key>Key5</key>
    <value type="boolean">true</value>
  </item>
  <item>
    <key>Key7</key>
    <value type="base64Binary">Ki1C</value>
  </item>
  <item>
    <key>Key8</key>
    <value type="nil" />
  </item>
  <item>
    <key>Key9</key>
    <value type="___Westwind.Tools.Tests.PropertyBagTest+ComplexObject">
      <ComplexObject>
        <Name>Rick</Name>
        <Entered>2011-09-26T17:45:58.5789578-10:00</Entered>
        <Count>10</Count>
      </ComplexObject>
    </value>
  </item>
</properties>
The format is a bit cleaner than the DataContractSerializer's. Each item is serialized into <key>/<value> pairs. If the value is a string, no type information is written; since string tends to be the most common type, this saves space and serialization processing. All other types carry a type attribute. Simple types are mapped to XML types, so things like decimal, datetime, boolean and base64Binary are encoded using their XML type names. All other types are embedded with a hokey format that describes the .NET type, prefixed with three underscores, and are then encoded using the XmlSerializer. You can see this best above in the ComplexObject encoding. For custom types this isn't pretty either, but it's more concise than the DataContractSerializer output and it works, as long as you're serializing back and forth between .NET clients at least. The XML generated from the second example that uses PropertyBag<decimal> looks like this:
<?xml version="1.0" encoding="utf-8"?>
<properties>
  <item>
    <key>key</key>
    <value type="decimal">10</value>
  </item>
  <item>
    <key>Key1</key>
    <value type="decimal">100.10</value>
  </item>
  <item>
    <key>Key2</key>
    <value type="decimal">200.10</value>
  </item>
  <item>
    <key>Key3</key>
    <value type="decimal">300.10</value>
  </item>
</properties>
How does it work?
As I mentioned, there's nothing fancy about this solution - it's little more than a subclass of Dictionary<T,V> that implements custom XML serialization and a couple of helper methods that facilitate getting the XML in and out of the class more easily. But it's proven very handy for a number of projects for me where dynamic data storage is required. Here's the code:
/// <summary> /// Creates a serializable string/object dictionary that is XML serializable /// Encodes keys as element names and values as simple values with a type /// attribute that contains an XML type name. Complex names encode the type /// name with type='___namespace.classname' format followed by a standard xml /// serialized format. The latter serialization can be slow so it's not recommended /// to pass complex types if performance is critical. /// </summary> [XmlRoot("properties")] public class PropertyBag : PropertyBag<object> { /// <summary> /// Creates an instance of a propertybag from an Xml string /// </summary> /// <param name="xml">Serialize</param> /// <returns></returns> public static PropertyBag CreateFromXml(string xml) { var bag = new PropertyBag(); bag.FromXml(xml); return bag; } } /// <summary> /// Creates a serializable string for generic types that is XML serializable. /// /// Encodes keys as element names and values as simple values with a type /// attribute that contains an XML type name.
Complex names encode the type /// name with type='___namespace.classname' format followed by a standard xml /// serialized format. The latter serialization can be slow so it's not recommended /// to pass complex types if performance is critical. /// </summary> /// <typeparam name="TValue">Must be a reference type. For value types use type object</typeparam> [XmlRoot("properties")] public class PropertyBag<TValue> : Dictionary<string, TValue>, IXmlSerializable { /// <summary> /// Not implemented - this means no schema information is passed /// so this won't work with ASMX/WCF services. /// </summary> /// <returns></returns> public System.Xml.Schema.XmlSchema GetSchema() { return null; } /// <summary> /// Serializes the dictionary to XML. Keys are /// serialized to element names and values as /// element values. An xml type attribute is embedded /// for each serialized element - a .NET type /// element is embedded for each complex type and /// prefixed with three underscores. /// </summary> /// <param name="writer"></param> public void WriteXml(System.Xml.XmlWriter writer) { foreach (string key in this.Keys) { TValue value = this[key]; Type type = null; if (value != null) type = value.GetType(); writer.WriteStartElement("item"); writer.WriteStartElement("key"); writer.WriteString(key as string); writer.WriteEndElement(); writer.WriteStartElement("value"); string xmlType = XmlUtils.MapTypeToXmlType(type); bool isCustom = false; // Type information attribute if not string if (value == null) { writer.WriteAttributeString("type", "nil"); } else if (!string.IsNullOrEmpty(xmlType)) { if (xmlType != "string") { writer.WriteStartAttribute("type"); writer.WriteString(xmlType); writer.WriteEndAttribute(); } } else { isCustom = true; xmlType = "___" + value.GetType().FullName; writer.WriteStartAttribute("type"); writer.WriteString(xmlType); writer.WriteEndAttribute(); } // Actual deserialization if (!isCustom) { if (value != null) writer.WriteValue(value); } else { XmlSerializer ser = new XmlSerializer(value.GetType()); ser.Serialize(writer, value); } writer.WriteEndElement(); // value writer.WriteEndElement(); // item } } /// <summary> /// Reads the custom serialized format /// </summary> /// <param name="reader"></param> public void ReadXml(System.Xml.XmlReader reader) { this.Clear(); while (reader.Read()) { if (reader.NodeType == XmlNodeType.Element && reader.Name == "key") { string xmlType = null; string name = reader.ReadElementContentAsString(); // item element reader.ReadToNextSibling("value"); if (reader.MoveToNextAttribute()) xmlType = reader.Value; reader.MoveToContent(); TValue value; if (xmlType == "nil") value = default(TValue); // null else if (string.IsNullOrEmpty(xmlType)) { // value is a string or object and we can assign TValue to value string strval = reader.ReadElementContentAsString(); value = (TValue) Convert.ChangeType(strval, typeof(TValue)); } else if (xmlType.StartsWith("___")) { while (reader.Read() && reader.NodeType != XmlNodeType.Element) { } Type type = ReflectionUtils.GetTypeFromName(xmlType.Substring(3)); //value = reader.ReadElementContentAs(type,null); XmlSerializer ser = new XmlSerializer(type); value = (TValue)ser.Deserialize(reader); } else value = (TValue)reader.ReadElementContentAs(XmlUtils.MapXmlTypeToType(xmlType), null); this.Add(name, value); } } } /// <summary> /// Serializes this dictionary to an XML string /// </summary> /// <returns>XML String or Null if it fails</returns> public string ToXml() { string xml = null; SerializationUtils.SerializeObject(this, 
out xml); return xml; } /// <summary> /// Deserializes from an XML string /// </summary> /// <param name="xml"></param> /// <returns>true or false</returns> public bool FromXml(string xml) { this.Clear(); // if xml string is empty we return an empty dictionary if (string.IsNullOrEmpty(xml)) return true; var result = SerializationUtils.DeSerializeObject(xml, this.GetType()) as PropertyBag<TValue>; if (result != null) { foreach (var item in result) { this.Add(item.Key, item.Value); } } else // null is a failure return false; return true; } /// <summary> /// Creates an instance of a propertybag from an Xml string /// </summary> /// <param name="xml"></param> /// <returns></returns> public static PropertyBag<TValue> CreateFromXml(string xml) { var bag = new PropertyBag<TValue>(); bag.FromXml(xml); return bag; } } }
The code uses a couple of small helper classes, SerializationUtils and XmlUtils, for mapping XML types to and from .NET; both come from the Westwind.Utilities project (which is the same project where PropertyBag lives) in the West Wind Web Toolkit. The code implements ReadXml and WriteXml for the IXmlSerializable implementation using old-school XmlReaders and XmlWriters (because it's pretty simple stuff - no need for XLinq here). Then there are two helper methods, .ToXml() and .FromXml(), that basically allow your code to easily convert between XML and a PropertyBag object. In my code that's what I use to actually persist to and from the entity XML property during .Load() and .Save() operations. It's sweet to be able to have a string-key dictionary and then be able to turn around and persist the whole thing to XML and back with one line of code. Hopefully some of you will find this class as useful as I've found it. It's a simple solution to a common requirement in my applications and I've used the hell out of it in the short time since I created it.
Resources
You can find the complete code for the two classes plus the helpers in the Subversion repository for Westwind.Utilities. You can grab the source files from there or download the whole project. You can also grab the full Westwind.Utilities assembly from NuGet and add it to your project if that's easier for you.
PropertyBag Source Code
SerializationUtils and XmlUtils
Westwind.Utilities Assembly on NuGet (add from Visual Studio)
© Rick Strahl, West Wind Technologies, 2005-2011. Posted in .NET, CSharp
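To round out the walkthrough, here's roughly what the .Load()/.Save() usage described above might look like on an entity with a free-form XML column. The entity and its member names are invented for this sketch; only ToXml() and CreateFromXml() come from the PropertyBag class shown in the article:

    // Hypothetical entity wiring a PropertyBag to an XML string column.
    // CustomerEntity, PropertiesXml, Load() and Save() are invented names.
    using Westwind.Utilities;   // assumed namespace of the PropertyBag class

    public class CustomerEntity
    {
        // XML string as persisted in the database
        public string PropertiesXml { get; set; }

        // Lazily materialized property bag
        private PropertyBag _properties;
        public PropertyBag Properties
        {
            get
            {
                if (_properties == null)
                    _properties = PropertyBag.CreateFromXml(PropertiesXml);
                return _properties;
            }
        }

        public void Load(string xmlFromDatabase)
        {
            PropertiesXml = xmlFromDatabase;
            _properties = null;   // re-materialize on next access
        }

        public void Save()
        {
            // One call turns the whole bag back into XML for storage
            if (_properties != null)
                PropertiesXml = _properties.ToXml();
        }
    }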

    Read the article

  • actionscript3: reflect-class applied on rotationY

    - by algro
    Hi, I'm using a class which applies a visual reflection-effect to defined movieclips. I use a reflection-class from here: link to source. It works like a charm except when I apply a rotation to the movieclip. In my case the reflection is still visible but only a part of it. What am I doing wrong? How could I pass/include the rotation to the Reflection-Class ? Thanks in advance! This is how you apply the Reflection Class to your movieclip: var ref_mc:MovieClip = new MoviClip(); addChild(ref_mc); var r1:Reflect = new Reflect({mc:ref_mc, alpha:50, ratio:50,distance:0, updateTime:0,reflectionDropoff:1}); Now I apply a rotation to my movieclip: ref_mc.rotationY = 30; And Here the Reflect-Class: package com.pixelfumes.reflect{ import flash.display.MovieClip; import flash.display.DisplayObject; import flash.display.BitmapData; import flash.display.Bitmap; import flash.geom.Matrix; import flash.display.GradientType; import flash.display.SpreadMethod; import flash.utils.setInterval; import flash.utils.clearInterval; public class Reflect extends MovieClip{ //Created By Ben Pritchard of Pixelfumes 2007 //Thanks to Mim, Jasper, Jason Merrill and all the others who //have contributed to the improvement of this class //static var for the version of this class private static var VERSION:String = "4.0"; //reference to the movie clip we are reflecting private var mc:MovieClip; //the BitmapData object that will hold a visual copy of the mc private var mcBMP:BitmapData; //the BitmapData object that will hold the reflected image private var reflectionBMP:Bitmap; //the clip that will act as out gradient mask private var gradientMask_mc:MovieClip; //how often the reflection should update (if it is video or animated) private var updateInt:Number; //the size the reflection is allowed to reflect within private var bounds:Object; //the distance the reflection is vertically from the mc private var distance:Number = 0; function Reflect(args:Object){ /*the args object passes in the following variables /we set the values of our internal vars to math the args*/ //the clip being reflected mc = args.mc; //the alpha level of the reflection clip var alpha:Number = args.alpha/100; //the ratio opaque color used in the gradient mask var ratio:Number = args.ratio; //update time interval var updateTime:Number = args.updateTime; //the distance at which the reflection visually drops off at var reflectionDropoff:Number = args.reflectionDropoff; //the distance the reflection starts from the bottom of the mc var distance:Number = args.distance; //store width and height of the clip var mcHeight = mc.height; var mcWidth = mc.width; //store the bounds of the reflection bounds = new Object(); bounds.width = mcWidth; bounds.height = mcHeight; //create the BitmapData that will hold a snapshot of the movie clip mcBMP = new BitmapData(bounds.width, bounds.height, true, 0xFFFFFF); mcBMP.draw(mc); //create the BitmapData the will hold the reflection reflectionBMP = new Bitmap(mcBMP); //flip the reflection upside down reflectionBMP.scaleY = -1; //move the reflection to the bottom of the movie clip reflectionBMP.y = (bounds.height*2) + distance; //add the reflection to the movie clip's Display Stack var reflectionBMPRef:DisplayObject = mc.addChild(reflectionBMP); reflectionBMPRef.name = "reflectionBMP"; //add a blank movie clip to hold our gradient mask var gradientMaskRef:DisplayObject = mc.addChild(new MovieClip()); gradientMaskRef.name = "gradientMask_mc"; //get a reference to the movie clip - cast the DisplayObject that is returned as a 
MovieClip gradientMask_mc = mc.getChildByName("gradientMask_mc") as MovieClip; //set the values for the gradient fill var fillType:String = GradientType.LINEAR; var colors:Array = [0xFFFFFF, 0xFFFFFF]; var alphas:Array = [alpha, 0]; var ratios:Array = [0, ratio]; var spreadMethod:String = SpreadMethod.PAD; //create the Matrix and create the gradient box var matr:Matrix = new Matrix(); //set the height of the Matrix used for the gradient mask var matrixHeight:Number; if (reflectionDropoff<=0) { matrixHeight = bounds.height; } else { matrixHeight = bounds.height/reflectionDropoff; } matr.createGradientBox(bounds.width, matrixHeight, (90/180)*Math.PI, 0, 0); //create the gradient fill gradientMask_mc.graphics.beginGradientFill(fillType, colors, alphas, ratios, matr, spreadMethod); gradientMask_mc.graphics.drawRect(0,0,bounds.width,bounds.height); //position the mask over the reflection clip gradientMask_mc.y = mc.getChildByName("reflectionBMP").y - mc.getChildByName("reflectionBMP").height; //cache clip as a bitmap so that the gradient mask will function gradientMask_mc.cacheAsBitmap = true; mc.getChildByName("reflectionBMP").cacheAsBitmap = true; //set the mask for the reflection as the gradient mask mc.getChildByName("reflectionBMP").mask = gradientMask_mc; //if we are updating the reflection for a video or animation do so here if(updateTime > -1){ updateInt = setInterval(update, updateTime, mc); } } public function setBounds(w:Number,h:Number):void{ //allows the user to set the area that the reflection is allowed //this is useful for clips that move within themselves bounds.width = w; bounds.height = h; gradientMask_mc.width = bounds.width; redrawBMP(mc); } public function redrawBMP(mc:MovieClip):void { // redraws the bitmap reflection - Mim Gamiet [2006] mcBMP.dispose(); mcBMP = new BitmapData(bounds.width, bounds.height, true, 0xFFFFFF); mcBMP.draw(mc); } private function update(mc):void { //updates the reflection to visually match the movie clip mcBMP = new BitmapData(bounds.width, bounds.height, true, 0xFFFFFF); mcBMP.draw(mc); reflectionBMP.bitmapData = mcBMP; } public function destroy():void{ //provides a method to remove the reflection mc.removeChild(mc.getChildByName("reflectionBMP")); reflectionBMP = null; mcBMP.dispose(); clearInterval(updateInt); mc.removeChild(mc.getChildByName("gradientMask_mc")); } } }

    Read the article

  • "Exception has been thrown by the target of an invocation" Running Tests - VS2008 SP1

    - by omatrot
    I'm using Visual Studio 2008 Team Suite and I'm unable to run tests and display the Test/Windows/Test Result Window. The result is a dialog box with the following content : "Exception has been thrown by the target of an invocation". Team Explorer has been installed after Visual Studio 2008 SP1. So I have re-apllied the service pack. Searching the web I found that this error is pretty common but unfortunately, the proposed solutions does not work for me. The problem was never analysed so I decided to give it a try : I reproduced the problem on a computer, attached the process with windbg and start with the basic investigations. Following are the first results : 0:000>!dumpstack OS Thread Id: 0xdb0 (0) Current frame: USER32!NtUserWaitMessage+0x15 ChildEBP RetAddr Caller,Callee 003fec94 75a32674 USER32!DialogBox2+0x222, calling USER32!NtUserWaitMessage 003fecd0 75a3288a USER32!InternalDialogBox+0xe5, calling USER32!DialogBox2 003fecfc 75a6f8d0 USER32!SoftModalMessageBox+0x757, calling USER32!InternalDialogBox 003fed3c 6eb61996 mscorwks!Thread::ReverseLeaveRuntime+0x95, calling mscorwks!_EH_epilog3 003fedb0 75a6fbac USER32!MessageBoxWorker+0x269, calling USER32!SoftModalMessageBox 003fede0 6ea559c3 mscorwks!SetupThreadNoThrow+0x19a, calling mscorwks!_EH_epilog3_catch_GS 003fee24 6eb61d8a mscorwks!HasIllegalReentrancy+0xac, calling mscorwks!_EH_epilog3 003fee30 6ea89796 mscorwks!SimpleComCallWrapper::Release+0x2e, calling mscorwks!CompareExchangeMP 003fee38 6ea0da05 mscorwks!CLRException::HandlerState::CleanupTry+0x16, calling mscorwks!GetCurrentSEHRecord 003fee44 6ea0c9c0 mscorwks!Thread::EnablePreemptiveGC+0xf, calling mscorwks!Thread::CatchAtSafePoint 003fee4c 6ea8a241 mscorwks!Unknown_Release_Internal+0x24d, calling mscorwks!GCHolder<1,0,0>::Pop 003fee50 6ea0c86c mscorwks!_EH_epilog3_catch_GS+0xa, calling mscorwks!__security_check_cookie 003fee54 6ea8a24c mscorwks!Unknown_Release_Internal+0x258, calling mscorwks!_EH_epilog3_catch_GS 003fee7c 75a16941 USER32!UserCallWinProcCheckWow+0x13d, calling ntdll!RtlDeactivateActivationContextUnsafeFast 003feed8 7082119e msenv!ATL::CComCritSecLock<ATL::CComCriticalSection>::Lock+0xd, calling ntdll!RtlEnterCriticalSection 003fef08 75a6fe5b USER32!MessageBoxIndirectW+0x2e, calling USER32!MessageBoxWorker 003fef7c 70a1e367 msenv!MessageBoxPVoidW+0xda 003fefd4 70a1db60 msenv!VBDialogCover2+0x11b 003ff01c 70a1e4c0 msenv!VBMessageBox2W+0xf0, calling msenv!VBDialogCover2 003ff044 7087246b msenv!main_GetAppNameW+0xa, calling msenv!GetAppNameInternal 003ff04c 70a1e4f2 msenv!VBMessageBox3W+0x1c, calling msenv!VBMessageBox2W 003ff064 70a1d6d7 msenv!_IdMsgShow+0x362, calling msenv!VBMessageBox3W 003ff0cc 70951841 msenv!TaskDialogCallback+0x7e0, calling msenv!_IdMsgShow 003ff118 6eb20da4 mscorwks!Unknown_QueryInterface+0x230, calling mscorwks!_EH_epilog3_catch_GS 003ff14c 6eb20c43 mscorwks!Unknown_QueryInterface_Internal+0x3d8, calling mscorwks!_EH_epilog3_catch_GS 003ff168 02006ec4 02006ec4, calling 0247a1e8 003ff16c 6ea0c86c mscorwks!_EH_epilog3_catch_GS+0xa, calling mscorwks!__security_check_cookie 003ff198 6eb20562 mscorwks!COMToCLRWorker+0xb34, calling mscorwks!_EH_epilog3_catch_GS 003ff19c 0247a235 0247a235, calling mscorwks!COMToCLRWorker 003ff1c4 7083249f msenv!CVSCommandTarget::ExecCmd+0x937 003ff1e4 7086d5c8 msenv!VsReportErrorInfo+0x11, calling msenv!TaskDialogCallback+0xd8 003ff1f8 7093e65b msenv!CVSCommandTarget::ExecCmd+0x945, calling msenv!VsReportErrorInfo 003ff25c 7081f53a msenv!ATL::CComPtr<IVsLanguageInfo>::~CComPtr<IVsLanguageInfo>+0x24, 
calling msenv!_EH_epilog3 003ff260 70b18d72 msenv!LogCommand+0x4c, calling msenv!ATL::CComPtr<IVsCodePageSelection>::~CComPtr<IVsCodePageSelection> 003ff264 70b18d77 msenv!LogCommand+0x51, calling msenv!_EH_epilog3 003ff280 70a4fd0e msenv!CMsoButtonUser::FClick+0x1d1, calling msenv!CVSCommandTarget::ExecCmd 003ff2f4 70823a87 msenv!CTLSITE::QueryInterface+0x16 003ff31c 70cb7d4d msenv!TBCB::FNotifyFocus+0x204 003ff35c 70ce5fda msenv!TB::NotifyControl+0x101 003ff3bc 709910f6 msenv!TB::FRequestFocus+0x4ed, calling msenv!TB::NotifyControl 003ff414 708254ba msenv!CMsoButtonUser::FEnabled+0x3d, calling msenv!GetQueryStatusFlags 003ff428 7086222a msenv!TBC::FAutoEnabled+0x24 003ff43c 7098e1eb msenv!TB::LProcessInputMsg+0xdb4 003ff458 6bec1c49 (MethodDesc 0x6bcd7f54 +0x89 System.Windows.Forms.Form.DefWndProc(System.Windows.Forms.Message ByRef)), calling 6be3b738 003ff50c 70823ab0 msenv!FPtbFromSite+0x16 003ff520 70991c43 msenv!TB::PtbParent+0x25, calling msenv!FPtbFromSite 003ff52c 708dda49 msenv!TBWndProc+0x2da 003ff588 0203d770 0203d770, calling 0247a1e8 003ff598 70822a70 msenv!CPaneFrame::Release+0x118, calling msenv!_EH_epilog3 003ff5b0 75a16238 USER32!InternalCallWinProc+0x23 003ff5dc 75a168ea USER32!UserCallWinProcCheckWow+0x109, calling USER32!InternalCallWinProc 003ff620 75a16899 USER32!UserCallWinProcCheckWow+0x6a, calling ntdll!RtlActivateActivationContextUnsafeFast 003ff654 75a17d31 USER32!DispatchMessageWorker+0x3bc, calling USER32!UserCallWinProcCheckWow 003ff688 70847f2b msenv!CMsoComponent::FPreTranslateMessage+0x72, calling msenv!MainFTranslateMessage 003ff6b4 75a17dfa USER32!DispatchMessageW+0xf, calling USER32!DispatchMessageWorker 003ff6c4 70831553 msenv!EnvironmentMsgLoop+0x1ea, calling USER32!DispatchMessageW 003ff6f8 708eb9bd msenv!CMsoCMHandler::FPushMessageLoop+0x86, calling msenv!EnvironmentMsgLoop 003ff724 708eb94d msenv!SCM::FPushMessageLoop+0xb7 003ff74c 708eb8e9 msenv!SCM_MsoCompMgr::FPushMessageLoop+0x28, calling msenv!SCM::FPushMessageLoop 003ff768 708eb8b8 msenv!CMsoComponent::PushMsgLoop+0x28 003ff788 708ebe4e msenv!VStudioMainLogged+0x482, calling msenv!CMsoComponent::PushMsgLoop 003ff7ac 70882afe msenv!CVsActivityLogSingleton::Instance+0xdf, calling msenv!_EH_epilog3 003ff7d8 70882afe msenv!CVsActivityLogSingleton::Instance+0xdf, calling msenv!_EH_epilog3 003ff7dc 707e4e31 msenv!VActivityLogStartupEntries+0x42 003ff7f4 7081f63b msenv!ATL::CComPtr<IClassFactory>::~CComPtr<IClassFactory>+0x24, calling msenv!_EH_epilog3 003ff7f8 708b250f msenv!ATL::CComQIPtr<IUnknown,&IID_IUnknown>::~CComQIPtr<IUnknown,&IID_IUnknown>+0x1d, calling msenv!_EH_epilog3 003ff820 708e7561 msenv!VStudioMain+0xc1, calling msenv!VStudioMainLogged 003ff84c 2f32aabc devenv!util_CallVsMain+0xff 003ff878 2f3278f2 devenv!CDevEnvAppId::Run+0x11fd, calling devenv!util_CallVsMain 003ff97c 77533b23 ntdll!RtlpAllocateHeap+0xe73, calling ntdll!_SEH_epilog4 003ff9f0 77536cd7 ntdll!RtlpLowFragHeapAllocFromContext+0x882, calling ntdll!RtlpSubSegmentInitialize 003ffa10 7753609f ntdll!RtlNtStatusToDosError+0x3b, calling ntdll!RtlNtStatusToDosErrorNoTeb 003ffa14 775360a4 ntdll!RtlNtStatusToDosError+0x40, calling ntdll!_SEH_epilog4 003ffa40 775360a4 ntdll!RtlNtStatusToDosError+0x40, calling ntdll!_SEH_epilog4 003ffa44 75bd2736 kernel32!LocalBaseRegOpenKey+0x159, calling ntdll!RtlNtStatusToDosError 003ffa48 75bd2762 kernel32!LocalBaseRegOpenKey+0x22a, calling kernel32!_SEH_epilog4 003ffac4 75bd2762 kernel32!LocalBaseRegOpenKey+0x22a, calling kernel32!_SEH_epilog4 003ffac8 75bd28c9 
kernel32!RegOpenKeyExInternalW+0x130, calling kernel32!LocalBaseRegOpenKey 003ffad8 75bd28de kernel32!RegOpenKeyExInternalW+0x211 003ffae0 75bd28e5 kernel32!RegOpenKeyExInternalW+0x21d, calling kernel32!_SEH_epilog4 003ffb04 6f282e2b MSVCR90!_unlock+0x15, calling ntdll!RtlLeaveCriticalSection 003ffb14 75bd2642 kernel32!BaseRegCloseKeyInternal+0x41, calling ntdll!NtClose 003ffb28 75bd25d0 kernel32!RegCloseKey+0xd4, calling kernel32!_SEH_epilog4 003ffb5c 75bd25d0 kernel32!RegCloseKey+0xd4, calling kernel32!_SEH_epilog4 003ffb60 2f321ea4 devenv!DwInitSyncObjects+0x340 003ffb90 2f327bf4 devenv!WinMain+0x74, calling devenv!CDevEnvAppId::Run 003ffbac 2f327c68 devenv!License::GetPID+0x258, calling devenv!WinMain 003ffc3c 75bd3677 kernel32!BaseThreadInitThunk+0xe 003ffc48 77539d72 ntdll!__RtlUserThreadStart+0x70 003ffc88 77539d45 ntdll!_RtlUserThreadStart+0x1b, calling ntdll!__RtlUserThreadStart 0:000> !pe -nested Exception object: 050aae9c Exception type: System.Reflection.TargetInvocationException Message: Exception has been thrown by the target of an invocation. InnerException: System.NullReferenceException, use !PrintException 050aac64 to see more StackTrace (generated): SP IP Function 003FEC2C 6D2700F7 mscorlib_ni!System.RuntimeType.CreateInstanceSlow(Boolean, Boolean)+0x57 003FEC5C 6D270067 mscorlib_ni!System.RuntimeType.CreateInstanceImpl(Boolean, Boolean, Boolean)+0xe7 003FEC94 6D270264 mscorlib_ni!System.Activator.CreateInstance(System.Type, Boolean)+0x44 003FECA4 6AD02DAF Microsoft_VisualStudio_Shell_9_0_ni!Microsoft.VisualStudio.Shell.Package.CreateToolWindow(System.Type, Int32, Microsoft.VisualStudio.Shell.ProvideToolWindowAttribute)+0x67 003FED30 6AD0311B Microsoft_VisualStudio_Shell_9_0_ni!Microsoft.VisualStudio.Shell.Package.CreateToolWindow(System.Type, Int32)+0xb7 003FED58 6AD02D12 Microsoft_VisualStudio_Shell_9_0_ni!Microsoft.VisualStudio.Shell.Package.FindToolWindow(System.Type, Int32, Boolean, Microsoft.VisualStudio.Shell.ProvideToolWindowAttribute)+0x7a 003FED88 6AD02D39 Microsoft_VisualStudio_Shell_9_0_ni!Microsoft.VisualStudio.Shell.Package.FindToolWindow(System.Type, Int32, Boolean)+0x11 003FED94 02585E30 Microsoft_VisualStudio_QualityTools_TestCaseManagement!Microsoft.VisualStudio.TestTools.TestCaseManagement.QualityToolsPackage.InitToolWindowVariable[[System.__Canon, mscorlib]](System.__Canon ByRef, System.String, Boolean)+0x58 003FEDD0 02585DBE Microsoft_VisualStudio_QualityTools_TestCaseManagement!Microsoft.VisualStudio.TestTools.TestCaseManagement.QualityToolsPackage.InitToolWindowVariable[[System.__Canon, mscorlib]](System.__Canon ByRef, System.String)+0x36 003FEDE4 02585D32 Microsoft_VisualStudio_QualityTools_TestCaseManagement!Microsoft.VisualStudio.TestTools.TestCaseManagement.QualityToolsPackage.ShowToolWindow[[System.__Canon, mscorlib]](System.__Canon ByRef, System.String, Boolean)+0x3a 003FEE00 02585AB4 Microsoft_VisualStudio_QualityTools_TestCaseManagement!Microsoft.VisualStudio.TestTools.TestCaseManagement.QualityToolsPackage.OpenTestResultsToolWindow()+0x2c 003FEE10 02585A6E Microsoft_VisualStudio_QualityTools_TestCaseManagement!Microsoft.VisualStudio.TestTools.TestCaseManagement.QualityToolsPackage.OnMenuViewTestResults(System.Object, System.EventArgs)+0x6 003FEE18 6CD4F993 System_ni!System.ComponentModel.Design.MenuCommand.Invoke()+0x43 003FEE40 6CD4F9D4 System_ni!System.ComponentModel.Design.MenuCommand.Invoke(System.Object)+0x8 003FEE48 6AD000FA 
Microsoft_VisualStudio_Shell_9_0_ni!Microsoft.VisualStudio.Shell.OleMenuCommandService.Microsoft.VisualStudio.OLE.Interop.IOleCommandTarget.Exec(System.Guid ByRef, UInt32, UInt32, IntPtr, IntPtr)+0x11a 003FEEA0 6AD03FB8 Microsoft_VisualStudio_Shell_9_0_ni!Microsoft.VisualStudio.Shell.Package.Microsoft.VisualStudio.OLE.Interop.IOleCommandTarget.Exec(System.Guid ByRef, UInt32, UInt32, IntPtr, IntPtr)+0x44 StackTraceString: <none> HResult: 80131604 0:000> !PrintException 050aac64 Exception object: 050aac64 Exception type: System.NullReferenceException Message: Object reference not set to an instance of an object. InnerException: <none> StackTrace (generated): SP IP Function 003FE660 078E60BE Microsoft_VisualStudio_TeamSystem_Integration!Microsoft.VisualStudio.TeamSystem.Integration.TcmResultsPublishManager..ctor(Microsoft.VisualStudio.TeamSystem.Integration.ResultsPublishManager)+0xc6 003FE674 078E5C91 Microsoft_VisualStudio_TeamSystem_Integration!Microsoft.VisualStudio.TeamSystem.Integration.ResultsPublishManager..ctor(Microsoft.VisualStudio.TeamSystem.Integration.TeamFoundationHostHelper)+0x59 003FE684 078E2FA0 Microsoft_VisualStudio_TeamSystem_Integration!Microsoft.VisualStudio.TeamSystem.Integration.VsetServerHelper..ctor(System.IServiceProvider)+0x50 003FE6A4 078E2E90 Microsoft_VisualStudio_TeamSystem_Common!Microsoft.VisualStudio.TeamSystem.Integration.Client.VsetHelper.InitializeThrow(System.IServiceProvider)+0x20 003FE6B8 078E2E2A Microsoft_VisualStudio_TeamSystem_Common!Microsoft.VisualStudio.TeamSystem.Integration.Client.VsetHelper.InitializeHelper(System.IServiceProvider)+0x22 003FE6E0 078E2DEC Microsoft_VisualStudio_TeamSystem_Common!Microsoft.VisualStudio.TeamSystem.Integration.Client.VsetHelper.CreateVsetHelper(System.IServiceProvider)+0x1c 003FE6F0 078E2DAC Microsoft_VisualStudio_QualityTools_TestCaseManagement!Microsoft.VisualStudio.TestTools.TestCaseManagement.QualityToolsPackage.get_VsetHelper()+0x14 003FE6F8 02586BBE Microsoft_VisualStudio_QualityTools_TestCaseManagement!Microsoft.VisualStudio.TestTools.TestCaseManagement.ResultsToolWindow..ctor()+0x9f6 003FE798 02585F8A Microsoft_VisualStudio_QualityTools_TestCaseManagement!Microsoft.VisualStudio.TestTools.TestCaseManagement.ResultToolWindowHost..ctor()+0x1a StackTraceString: <none> HResult: 80004003 In order to be able to continue the analysis, we need to get the parameters to see what is going on. I also tried to run devenv.exe with the /log switch. No error in the log after reproducing the problem. Finally, If Team Explorer is removed from the system, the problem goes away. Any help appreciated. TIA. Olivier.

    Read the article

  • CSS, Centering an absolute <div> withing a 100% width <div>

    - by blackessej
    This shouldn't be hard...I have a menu and some content wrapped in a centered, liquid div. The content is positioned absolute. All I want to do is center #content-container. What am I missing here? #wrapper { max-width:100%; min-width:600px; min-height:100%; margin:0 auto; } #header { -moz-background-clip:border; -moz-background-origin:padding; -moz-background-size:auto auto; background-attachment:scroll; background-color:transparent; background-image:url(images/KMIAFS_banner.jpg); background-position:center top; background-repeat:no-repeat; height:150px; } #menu { margin-top:150px; clear:left; float:left; padding:0; border-top:6px solid #336699; width:100%; overflow:hidden; } #menu ul { float:left; margin:0; padding:0; list-style:none; position:relative; left:50%; text-align:center; } #menu ul li { display:block; float:left; list-style:none; margin:0; padding:0; position:relative; right:50%; } #menu ul li a { display:block; float:left; margin:0 1px 0 0; padding:30px 10px 6px 10px; background:#336699; text-decoration:none; color:#fff; } #menu ul li a:hover { padding:35px 10px 6px 10px; } #menu ul li.active a, #menu ul li.active a:hover { padding:40px 10px 6px 10px; } #content-container { top:225px; position:absolute; margin:0 auto; width:1000px; background-color:#fff; } #content { clear:left; float:left; width:610px; padding:20px 0; margin:0 0 0 30px; display:inline; } #content h2 { margin:0; } #aside { float:right; width:290px; padding:20px 0; margin:0 20px 0 0; display:inline; } #aside h3 { margin:0; } <div id="wrapper"> <div id="header"> <a id="box-link" href="index.html"></a> <div id="menu"> <ul> <li><a href="" title="Link01">Link01/a></li> <li><a href="" title="Link02">Link02</a></li> <li><a href="" title="Link03">Link03</a></li> <li><a href="" title="Link04">Link04</a></li> <li><a href="" title="Link05">Link05</a></li> <li><a href="" title="Link06">Link06</a></li> <li><a href="" title="Link07">Link07</a></li> <li><a href="" title="Link08">Link08</a></li> </ul> </div> <div id="content-container"> <div id="content"> <h2> Page heading </h2> <p> Lorem ipsum dolor sit amet consect etuer adipi scing elit sed diam nonummy nibh euismod tinunt ut laoreet dolore magna aliquam erat volut. Ut wisi enim ad minim veniam, quis nostrud exerci tation ullamcorper suscipit lobortis nisl ut aliquip ex ea commodo consequat. Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molestie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te feugait nulla facilisi. </p> <p> Ut wisi enim ad minim veniam, quis nostrud exerci tation ullamcorper suscipit lobortis nisl ut aliquip ex ea commodo consequat. Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molestie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te feugait nulla facilisi. Lorem ipsum dolor sit amet, consectetuer adipiscing elit, sed diam nonummy nibh euismod tincidunt ut laoreet dolore magna aliquam erat volutpat. </p> <p> Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molestie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et accumsan et iusto odio dignissim qui blandit praesent luptatum zzril delenit augue duis dolore te feugait nulla facilisi. 
Lorem ipsum dolor sit amet, consectetuer adipiscing elit, sed diam nonummy nibh euismod tincidunt ut laoreet dolore magna aliquam erat volutpat. Ut wisi enim ad minim veniam, quis nostrud exerci tation ullamcorper suscipit lobortis nisl ut aliquip ex ea commodo consequat. </p> </div> <div id="aside"> <h3> Aside heading </h3> <p> Duis autem vel eum iriure dolor in hendrerit in vulputate velit esse molestie consequat, vel illum dolore eu feugiat nulla facilisis at vero eros et accumsan. </p> </div> </div>

    Read the article

< Previous Page | 518 519 520 521 522 523 524 525  | Next Page >