Search Results

Search found 11901 results on 477 pages for 'triple store'.


  • CodePlex Daily Summary for Tuesday, December 07, 2010

    CodePlex Daily Summary for Tuesday, December 07, 2010Popular ReleasesMy Web Pages Starter Kit: 1.3.1 Production Release (Security HOTFIX): Due to a critical security issue, it's strongly advised to update the My Web Pages Starter Kit to this version. Possible attackers could misuse the image upload to transmit any type of file to the website. If you already have a running version of My Web Pages Starter Kit 1.3.0, you can just replace the ftb.imagegallery.aspx file in the root directory with the one attached to this release.ASP.NET MVC Project Awesome (jQuery Ajax helpers): 1.4: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager new stuff: popup WhiteSpaceFilterAttribute tested on mozilla, safari, chrome, opera, ie 9b/8/7/6nopCommerce. ASP.NET open source shopping cart: nopCommerce 1.90: To see the full list of fixes and changes please visit the release notes page (http://www.nopCommerce.com/releasenotes.aspx).Aura: Aura Preview 1: Rewritten from scratch. This release supports getting color only from icon of foreground window.myCollections: Version 1.2: New in version 1.2: Big performance improvement. New Design (Added Outlook style View, New detail view, New Groub By...) Added Sort by Media Added Manage Movie Studio Zoom preference is now saved. Media name are now editable. Added Portuguese version You can now Hide details panel Add support for FLAC tags You can now imports books from BibTex Xml file BugFixingmytrip.mvc (CMS & e-Commerce): mytrip.mvc 1.0.49.0 beta: mytrip.mvc 1.0.49.0 beta web Web for install hosting System Requirements: NET 4.0, MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) mytrip.mvc 1.0.49.0 beta src System Requirements: Visual Studio 2010 or Web Deweloper 2010 MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) Connector/Net 6.3.4, MVC3 RC WARNING For run and debug mytrip.mvc 1.0.49.0 beta src download and ...Menu and Context Menu for Silverlight 4.0: Silverlight Menu and Context Menu v2.3 Beta: - Added keyboard navigation support with access keys - Shortcuts like Ctrl-Alt-A are now supported(where the browser permits it) - The PopupMenuSeparator is now completely based on the PopupMenuItem class - Moved item manipulation code to a partial class in PopupMenuItemsControl.cs - Moved menu management and keyboard navigation code to the new PopupMenuManager class - Simplified the layout by removing the RootGrid element(all content is now placed in OverlayCanvas and is accessed by the new ...SubtitleTools: SubtitleTools 1.0: First public releaseMiniTwitter: 1.62: MiniTwitter 1.62 ???? ?? ??????????????????????????????????????? 140 ?????????????????????????? ???????????????????????????????? ?? ??????????????????????????????????Phalanger - The PHP Language Compiler for the .NET Framework: 2.0 (December 2010): The release is targetted for stable daily use. With improved performance and enhanced compatibility with several latest PHP open source applications; it makes this release perfect replacement of your old PHP runtime. Changes made within this release include following and much more: Performance improvements based on real-world applications experience. We determined biggest bottlenecks and we found and removed overheads causing performance problems in many PHP applications. 
Reimplemented nat...Chronos WPF: Chronos v2.0 Beta 3: Release notes: Updated introduction document. Updated Visual Studio 2010 Extension (vsix) package. Added horizontal scrolling to the main window TaskBar. Added new styles for ListView, ListViewItem, GridViewColumnHeader, ... Added a new WindowViewModel class (allowing to fetch data). Added a new Navigate method (with several overloads) to the NavigationViewModel class (protected). Reimplemented Task usage for the WorkspaceViewModel.OnDelete method. Removed the reflection effect...MDownloader: MDownloader-0.15.26.7024: Fixed updater; Fixed MegauploadDJ - jQuery WebControls for ASP.NET: DJ 1.2: What is new? Update to support jQuery 1.4.2 Update to support jQuery ui 1.8.6 Update to Visual Studio 2010 New WebControls with samples added Autocomplete WebControl Button WebControl ToggleButt WebControl The example web site is including in source code project.LateBindingApi.Excel: LateBindingApi.Excel Release 0.7g: Unterschiede zur Vorgängerversion: - Zusätzliche Interior Properties - Group / Ungroup Methoden für Range - Bugfix COM Reference Handling für Application Objekt in einigen Klassen Release+Samples V0.7g: - Enthält Laufzeit DLL und Beispielprojekte Beispielprojekte: COMAddinExample - Demonstriert ein versionslos angebundenes COMAddin Example01 - Background Colors und Borders für Cells Example02 - Font Attributes undAlignment für Cells Example03 - Numberformats Example04 - Shapes, WordArts, P...ESRI ArcGIS Silverlight Toolkit: November 2010 - v2.1: ESRI ArcGIS Silverlight Toolkit v2.1 Added Windows Phone 7 build. New controls added: InfoWindow ChildPage (Windows Phone 7 only) See what's new here full details for : http://help.arcgis.com/en/webapi/silverlight/help/#/What_s_new_in_2_1/016600000025000000/ Note: Requires Visual Studio 2010, .NET 4.0 and Silverlight 4.0.ASP .NET MVC CMS (Content Management System): Atomic CMS 2.1.1: Atomic CMS 2.1.1 release notes Atomic CMS installation guide Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.5 beta Released: Hi, Today we are releasing Visifire 3.6.5 beta with the following new feature: New property AutoFitToPlotArea has been introduced in DataSeries. AutoFitToPlotArea will bring bubbles inside the PlotArea in order to avoid clipping of bubbles in bubble chart. Also this release includes few bug fixes: AxisXLabel label were getting clipped if angle was set for AxisLabels and ScrollingEnabled was not set in Chart. If LabelStyle property was set as 'Inside', size of the Pie was not proper. Yo...AI: Initial 0.0.1: It’s simply just one code file; it simulates AI and machine in a simulated world. The AI has a little understanding of its body machine and parts, and able to use its feet to do actions just start and stop walking. The world is all of white with nothing but just the machine on a white planet. Colors, odors and position information make no sense. 
I’m previous C# programmer and I’m learning F# during this project, although I’m still not a good F# programmer, in this project I learning to prog...NKinect: NKinect Preview: Build features: Accelerometer reading Motor serial number property Realtime image update Realtime depth calculation Export to PLY (On demand) Control motor LED Control Kinect tiltMicrosoft - Domain Oriented N-Layered .NET 4.0 App Sample (Microsoft Spain): V1.0 - N-Layer DDD Sample App .NET 4.0: Required Software (Microsoft Base Software needed for Development environment) Visual Studio 2010 RTM & .NET 4.0 RTM (Final Versions) Expression Blend 4 SQL Server 2008 R2 Express/Standard/Enterprise Unity Application Block 2.0 - Published May 5th 2010 http://www.microsoft.com/downloads/en/details.aspx?FamilyID=2D24F179-E0A6-49D7-89C4-5B67D939F91B&displaylang=en http://unity.codeplex.com/releases/view/31277 PEX & MOLES 0.94.51023.0, 29/Oct/2010 - Visual Studio 2010 Power Tools http://re...New ProjectsAcorn: Little acorns lead to mighty oaks.Algorithmia: Algorithm and data-structure library for .NET 3.5 and up. Algorithmia contains sophisticated algorithms and data-structures like graphs, priority queues, command, undo-redo and more. Base Station Verification system: Base Station Verification systemBase Station Verification systemBase Station Verification systemBase Station Verification systemBase Station Verification systemBase Station Verification systemBase Station Verification systemBase Station Verification systemBase Station VerificatioBlueAd: Simple app to broadcast messages to bluetooth enabled devicesBuiltWith Fiddler Integration: Project Description BuiltWithFiddler adds BuildWith functionality to the HTTP Debugging Proxy Fiddler. It helps to determine the underlying technologies used in HTTP responses. www.builtwith.com www.fiddler2.com It is written in C# by Andy at Bare Web BVCMS.app: The Bellevue Church Management System is a complete Web-based application for managing your church. This iPhone app provides tools to connect to bvcms so that users can search, check-in members, and other actions.coffeeGreet: CoffeeGreet is a WordPress plug-in that will greet your visitors with coffee depending on the hour of the day, by displaying images using the Flickr API.DCEL data structure: Doubly-connected edge list data structure implementation in C#.El Bruno ClickOnce Demo: Demo de ClickOnce en CodePlexFiren's Laboratory: NothingFunCam: A fun application for playing with your webcam. Experiment with different overlays and exciting effects. Save the images when you want, or on a timer. Great fun for parties! (WPF/C#) Uses WPF Media Kit for webcam integration, and Shazzam for the great shader effects.GammaJul LgLcd: A .NET wrapper around the Logitech SDK for G15/G19 keyboard screens. Supports raw byte sending, GDI+ drawing and rendering WPF elements onto the screen.Getting Started CodePlex: This is a demo for using TFS in CodePlexGPUG (Dynamics GP User Group): The location for GPUG members to share code.HPMC: DemoImageOfMeLocator: Team Boarders Platform: WordPress Objectives: 1. Create a plugin for WordPress. 2. Create a plugin that allows users to browse images uploaded on their Flickr Account and use them as overlays for store locations on a large map. 3. Create a plugjDepot: jQuery ajax, jquery UI and ASP.NET MVC based online store application. This software will let a user manage their product inventory by exposiing CRUD operations through the UI. Customers can buy these products and track each shipment separately. 
It is developed in C#.JQuery Cycle Carousel for DotNetNuke®: DNN Module JQuery Cycle Carousel This module will show images as a carousel using the cycle JQuery plugin. You can easyly change Cycle effect and other settings in the module.Local Movie DB in C#: C# WPF project. Will create local movie database where users can create their own DB of the movies they own/seen/liked ... etcLocation Framework for Windows Phone 7 and Windows Azure: A framework to build location based applications with Windows Phone 7 and Windows Azure.OraLibs: Collection of useful PL/SQL procedures, which contain methods for working with arrays, strings, numbers, dates.Phyo: License managementRepositório de Monografias: O Repositório de Monografias terá como função: - Salvar em um repositório todas as monografias postadas no período pelos os alunos da FACISA/FCM/ESAC. - O administrador do sistema, fará uma avaliação de acordo com ABNT e retornará para o aluno as nescessárias correções.Secure SharePoint Silverlight Web Part - Silverlight Security & Auditing: The Secure Silverlight WebPart provides both builtin security using default SharePoint security mechanisms as well as site collection specific auditing to record an event a Silverlight file is newly hosted in the SharePoint environment. SilverlightColorPicker: Photoshop like ColorPicker built in silverlight from scratchSparrow.Net Connect: This is a passport system.Sparrow.NET TaskMe: TaskMe is a project management web application.Written using Sparrow.Net frameworkSQLiteWrapper: A light c# wrapper around the sqlite library's functionsSuperMarioBros.Net: A .Net Super Mario Bros clon.Virtualizing Tree View: Tree View for large amount of itemsWindows Forms GUI based Trace Listener: Gives a simple UI based Trace Listener to debug / Trace information . No need to look at EventLog / Xml file etc. This code Library helps you View the Trace and debug entries. Can plug in to your WinForms App as well.WP Socially Related: Automatically include related posts from Twitter, WordPress.com and Bing Search into each of your blog posts


  • CodePlex Daily Summary for Thursday, February 24, 2011

    CodePlex Daily Summary for Thursday, February 24, 2011Popular ReleasesInstant Feature Builder for Visual Studio 2010: Instant Feature Builder 1.2: This is the binary version of the Instant Feature Builder. Once downloaded, double click to install into Visual Studio. Version 1.2 fixes: Rename FX to IFB to shorten path lengths Fix issue in ExecuteCommand Fix issue to workaround problem with VS Template writer The Instant Feature Builder is a tool which enables you, via drag-and-drop, to build a specific type of Visual Studio extension (VSIX) known as a Feature Extension. A Feature Extension packages project and/or item templates,...DirectQ: Release 1.8.7 (Beta): Beta release of 1.8.7 to get feedback on what works well, what doesn't work well, and what doesn't work at all. D3D9 hardware with ps2.0 a must. Faster, more streamlined and more integrated rendering capabilities with additional MP features and support.Smartkernel: Smartkernel: ????,??????Chiave File Encryption: Chiave 0.9.1: Application for file encryption and decryption using 512 Bit rijndael encyrption algorithm with simple to use UI. Its written in C# and compiled in .Net version 3.5. It incorporates features of Windows 7 like Jumplists, Taskbar progress and Aero Glass. Change Log from 0.9 Beta to 0.9.1: ======================= >Added option for system shutdown, sleep, hibernate after operation completed. >Minor Changes to the UI. >Numerous Bug fixes. Feedbacks are Welcome!....DotNetNuke® Store: 03.00.00: What's New in this release? IMPORTANT: this version requires DotNetNuke 04.06.02 or higher! DO NOT REPORT BUGS HERE IN THE ISSUE TRACKER, INSTEAD USE THE DotNetNuke Store Forum! This version is the same code base as the version 02.01.51 RC, just some cleaning and source code release before submition to the release tracker for "official" release.ClosedXML - The easy way to OpenXML: ClosedXML 0.45.2: New on this release: 1) Added data validation. See Data Validation 2) Deleting or clearing cells deletes the hyperlinks too. New on v0.45.1 1) Fixed issues 6237, 6240 New on v0.45.2 1) Fixed issues 6257, 6266 New Examples Data ValidationOMEGA CMS: OMEGA CMA - Alpha 0.2: A few fixes for OMEGA Framework (DLL) A few tweeks for OMEGA CMSCoding4Fun Tools: Coding4Fun.Phone.Toolkit v1.2: New control, Toast Prompt! Removed progress bar since Silverlight Toolkit Feb 2010 has it.Umbraco CMS: Umbraco 4.7: Service release fixing 31 issues. A full changelog will be available with the final stable release of 4.7 Important when upgradingUpgrade as if it was a patch release (update /bin, /umbraco and /umbraco_client). For general upgrade information follow the guide found at http://our.umbraco.org/wiki/install-and-setup/upgrading-an-umbraco-installation 4.7 requires the .NET 4.0 framework Web.Config changes Update the web web.config to include the 4 changes found in (they're clearly marked in...HubbleDotNet - Open source full-text search engine: V1.1.0.0: Add Sqlite3 DBAdapter Add App Report when Query Cache is Collecting. Improve the performance of index through Synchronize. Add top 0 feature so that we can only get count of the result. Improve the score calculating algorithm of match. Let the score of the record that match all items large then others. Add MySql DBAdapter Improve performance for multi-fields sort . Using hash table to access the Payload data. The version before used bin search. Using heap sort instead of qui...Xen: Graphics API for XNA: Xen 2.0: This is the final release of Xen; Xen 2.0. 
Xen 2.0 supports PC and Xbox 360 running XNA 4. The documentation download is coming soon Due to restrictions in XNA 4, Building Xen requires a DirectX 10 capable video card (Xen applications can still run on Windows Xp and DirectX 9 video cards)Silverlight????[???]: silverlight????[???]2.0: ???????,?????,????????silverlight??????。DBSourceTools: DBSourceTools_1.3.0.0: Release 1.3.0.0 Changed editors from FireEdit to ICSharpCode.TextEditor. Complete re-vamp of Intellisense ( further testing needed). Hightlight Field and Table Names in sql scripts. Added field dropdown on all tables and views in DBExplorer. Added data option for viewing data in Tables. Fixed comment / uncomment bug as reported by tareq. Included Synonyms in scripting engine ( nickt_ch ).IronPython: 2.7 Release Candidate 1: We are pleased to announce the first Release Candidate for IronPython 2.7. This release contains over two dozen bugs fixed in preparation for 2.7 Final. See the release notes for 60193 for details and what has already been fixed in the earlier 2.7 prereleases. - IronPython TeamCaliburn Micro: A Micro-Framework for WPF, Silverlight and WP7: Caliburn.Micro 1.0 RC: This is the official Release Candicate for Caliburn.Micro 1.0. The download contains the binaries, samples and VS templates. VS Templates The templates included are designed for situations where the Caliburn.Micro source needs to be embedded within a single project solution. This was targeted at government and other organizations that expressed specific requirements around using an open source project like this. NuGet This release does not have a corresponding NuGet package. The NuGet pack...Caliburn: A Client Framework for WPF and Silverlight: Caliburn 2.0 RC: This is the official Release Candidate for Caliburn 2.0. It contains all binaries, samples and generated code docs.Rawr: Rawr 4.0.20 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Beta Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a pre-alpha release of the WPF version, there are likely to be a lot of issues. If you have a problem, please follow the Posting Guidelines and put it into the Issue Trac...MiniTwitter: 1.66: MiniTwitter 1.66 ???? ?? ?????????? 2 ??????????????????? User Streams ?????????Windows Phone 7 Isolated Storage Explorer: WP7 Isolated Storage Explorer v1.0 Beta: Current release features:WPF desktop explorer client Visual Studio integrated tool window explorer client (Visual Studio 2010 Professional and above) Supported operations: Refresh (isolated storage information), Add Folder, Add Existing Item, Download File, Delete Folder, Delete File Explorer supports operations running on multiple remote applications at the same time Explorer detects application disconnect (1-2 second delay) Explorer confirms operation completed status Explorer d...Silverlight Toolkit: Silverlight for Windows Phone Toolkit - Feb 2011: Silverlight for Windows Phone Toolkit OverviewSilverlight for Windows Phone Toolkit offers developers additional controls for Windows Phone application development, designed to match the rich user experience of the Windows Phone 7. Suggestions? Features? Questions? Ask questions in the Create.msdn.com forum. Add bugs or feature requests to the Issue Tracker. Help us shape the Silverlight Toolkit with your feedback! 
Please clearly indicate that the work items and issues are for the phone t...New ProjectsCompetition Management Platform: Paragliding Competition Management PlatformCS424 B2 Group - Car Management: CS424 B2 Group - Car ManagementEMS: The main aim of the system to perform environment inventorization.EVVA: EVVA ist ein Softwareprojekt zur Unterstützung privater Arbeitsvermittler FileSocialVB: A library to work with the filesocial.com web site and API through VB.NET. This is actually an offshoot of the http://twittervb.codeplex.com project. Though smaller, this will lead to a more compact library.Hash Crack - IGProgram: Hash Crack is a software program for hashes and passwords cracking. Hash Crack use dictionary or set of symbols for hashes cracking, and also support pwdump file format for Windows passwords cracking NTHash MD4. MD2, MD4, MD5, SHA1, SHA256, SHA384, SHA512Imail Spammer Killer: Bots seem to love picking off the weak passwords of IpSwitch IMail v10 users and using SMTP auth to spam the world with their new found account access. This project is a windows service to isolate and stop this activity by disabling the violated user account as it occurs.JVS2USB: Montage permettant de relier une IO ( capcom / Sega ) sur un PC via l'USB !!Library Reminder: Library ReminderMAVI: mobile application for the visually impaired: bill recognition & tag and recognize objects based on a specific stickerMessage splliting without envelope in Biztalk 2009: Message splitting without envelope in Biztalk 2009. The project contains: - Source Code - Examples Article describing how to make it:Microsoft translator: Language translator designed to test Microsoft Translator web service API In Windows Phone 7 developed using Visual Studio 2010 in C#mrmuffin: Mr Muffin WP7 game for childrenNetwork Monitor: Simple application with a vu-meter style display of recent incoming network traffic. - Requires .NET 4, - Requires WinPCap (http://www.winpcap.org/) - Only tested to run under Windows 7NPhysics: NPhysics - Physical Data Types for .NETOpen SimRacing Results: Open SimRacing Results aims to provide an open standard for race results of PC SimRacing games (like iRacing, rFactor, NR2003, ...), allowing developers of league management systems to use a unique interface to get the results from, regardless of the simulation used.Orchard Image Gallery: Orchard Image Gallery project is intented to provide a Image Gallery content part and/or widget for the Orchard Project.Planning Poker for Windows Phone 7: Play planning poker on your Windows Phone 7.Publishing and consuming WCF in Biztalk 2009 and Visual Studio 2008: The file contains: Biztalk 2009 Project, C# Console Project, Example. Push Notification for Windows Phone 7 in php: WindowsPhonePushNotification enables you to use Microsoft Push Notification Service in phpQuksace Agjke: For more information, please visit <http://students.cs.tamu.edu/abe/IS_and_R/HW3/quksace%20agjke.html>. http://students.cs.tamu.edu/abe/IS_and_R/HW3/quksace%20agjke.html Read: a "GNU Make"-like utility for viewing Readme files: Read is a simple program to easily load and read README files on a UNIX-compliant system. 
It works in a similar way to GNU Make by searching a directory for a compatible file, in this case a Readme file, and loading it for reading using a text editor or viewer.SQL CE 3.5 persistence mapper for ECO IV: May need some adjustments for ECO V/VI, definitely needs some rewrite to support SQL CE 4.0 fullySQL Server Job Failure Notification System: A simple means of ensuring you know when SQLAgent jobs fail.SuperQuery: SuperQuery makes it easy to run the same batch of SQL across several databases on different SQL servers. SuperQuery supports all editions and versions of SQL Server from 2000 onwards. It is developed in C# using .NET 4 Client Profile.System.Net.Mail Extended: An Extension to the System.Net.Mail Namespace, adding a POP3 Client, Enhanced SMTP Client, IMAP Client, POP3 Server, SMTP Server, and IMAP Serve, all written in Visual C#.wow-combatlogs: A PowerShell module for working with combat logs generated by World of Warcraft.WPF UndoManager: WPF UndoManager provides a simple Undomanager on the base of WPF's CommandPattern. It use an implementation of the ICommand-interface to manage a history of actions.XO / TicTacToe game: Dynamic sized XO / TicTacToe game for Windows Phone 7 Build using Visual Studio 2010 and C#


  • CLSF & CLK 2013 Trip Report by Jeff Liu

    - by jamesmorris
    This is a contributed post from Jeff Liu, lead XFS developer for the Oracle mainline Linux kernel team. Recently, I attended both the China Linux Storage and Filesystem workshop (CLSF) and the China Linux Kernel conference (CLK), which were held in Shanghai. Here are the highlights of both events.

    CLSF - 17th October

    XFS update (led by Jeff Liu)

    XFS keeps making rapid progress with a lot of changes, focused especially on infrastructure/performance improvements as well as new feature development. This is reflected in a sample statistic comparing XFS, Ext4+JBD2 and Btrfs:

        # git diff --stat --minimal -C -M v3.7..v3.12-rc4 -- fs/xfs|fs/ext4+fs/jbd2|fs/btrfs
        XFS:       141 files changed, 27598 insertions(+), 19113 deletions(-)
        Ext4+JBD2:  39 files changed, 10487 insertions(+),  5454 deletions(-)
        Btrfs:      70 files changed, 19875 insertions(+),  8130 deletions(-)

    What made up those changes in XFS?
    - Self-describing metadata (CRC32c). This is a new feature and it contributed about 70% of the code changes; it can be enabled via `mkfs.xfs -m crc=1 /dev/xxx` for the v5 superblock.
    - Transaction log space reservation improvements. With this change, the log space reservation is calculated at mount time rather than at runtime, reducing CPU overhead.
    - User namespace support. Both XFS and USERNS can be enabled in the kernel configuration beginning with Linux 3.10. Thanks to Dwight Engen for his efforts on this.
    - Split project/group quota inodes. Originally, project quota could not be enabled together with group quota because they shared the same quota file inode; now it works, but only for the v5 superblock, i.e. with CRC enabled.
    - CONFIG_XFS_WARN, a new lightweight runtime debug mode which can be deployed in production environments.
    - Readahead during log object recovery; this change speeds up log replay significantly.
    - Speculative preallocation inode tracking, clearing and throttling. The main purpose is to deal with inodes that have post-EOF space due to speculative preallocation, and to support improved quota management that frees up a significant amount of unwritten space when at or near EDQUOT. It supports background scanning, which occurs on a longish interval (5 minutes by default, tunable), and on-demand scanning/trimming via ioctl(2).

    Bitter arguments ensued in this session, especially around the comparison between Ext4 and Btrfs in different areas; I had to spend the whole morning of the first day answering those questions. We basically agreed that XFS is the best choice on Linux nowadays because:
    - Stability: XFS has a good stability record over the past 10 years. Fengguang Wu, who leads the 0-day kernel test project, also said that he has observed fewer errors than in other filesystems over the past year or more. I attribute that to the XFS upstream code reviewers, who always perform serious code review as well as testing.
    - Good performance for both large and small files; "XFS does not work well for small files" has been an outdated story for years.
    - Best choice (maybe) for distributed PB-scale filesystems; e.g. Ceph recommends deploying OSD daemons on XFS because Ext4 has a limited xattr size.
    - Best choice for large storage (>16TB). Ext4 does not support a single file larger than roughly 15.95TB.
    - Scalability: any objection to XFS being best on this point? :)
    - XFS handles transaction concurrency better than Ext4. Why? The maximum size of the log in XFS is 2038MB, compared to 128MB in Ext4.
    - Misc: Ext4 is widely used and has proved fast and stable under various loads and scenarios; XFS just needs more customers, and Btrfs is still on the road to maturity.
    Ceph Introduction (led by Li Wang)

    This was a hot topic. Li gave us a nice introduction to the design as well as their current work. The Ceph client has actually been included in the Linux kernel since 2.6.34 and supported by OpenStack since Folsom, but it seems it has not yet been widely deployed in production environments. Their major work focuses on inline data support: because metadata and data storage are separate, a file access needs two round trips, fetching the metadata from the MDS and then getting the data from an OSD, so small file access is limited by network latency. The solution is that, for small files, they would like to store the data together with the metadata so that when accessing a small file the metadata server can push both metadata and data to the client at the same time. In this way they can reduce the overhead of calculating the data offset and save the communication with the OSD. For this feature they have only run some small-scale testing, but they really saw noticeable improvements. Test environment: Intel 2 CPU 12 Core, 64GB RAM, Ubuntu 12.04, Ceph 0.56.6 with 200GB SATA disk, 15 OSD, 1 MDS, 1 MON. Sequential read performance for 1K-size files improved by about 50%. I asked Li and Zheng Yan (the core developer of Ceph, who also worked on Btrfs) whether Ceph is really stable enough to be deployed in production for large-scale, PB-level storage, but they could not give a positive answer; it looks like Ceph is not even rolled out across Dreamhost (subject to confirmation). According to Li, they have only deployed Ceph for small-scale storage (32 nodes), although they would like to try 6000 nodes in the future.

    Improve Linux swap for flash storage (led by Shaohua Li)

    Because of its high density, low power and low price, flash storage (SSD) is a good candidate to partially replace DRAM. A quick answer to this is using SSD as swap. But Linux swap is designed for slow hard disk storage, so there are a lot of challenges in using SSD efficiently for swap.

    SWAPOUT:
    - swap_map scan: swap_map is the in-memory data structure that tracks swap disk usage, but scanning it is a slow linear operation. It becomes a bottleneck when finding many adjacent pages for use with SSD. Shaohua Li changed it to a cluster (128K) list, resulting in an O(1) algorithm. However, this approach needs restrictive cluster alignment and is only enabled for SSD.
    - IO pattern: In most cases swap IO has an interleaved pattern, because of multiple reclaimers or a free cluster being shared by all reclaimers. Even though the block layer can merge interleaved IO to some extent, we cannot count on it completely. Hence a per-CPU cluster was added on top of the previous change; it helps each reclaimer do sequential IO, and the block layer can merge IO more easily.
    - TLB flush: If we are reclaiming one active page, we first move the page from the active LRU list to the inactive LRU list, and then reclaim the page from the inactive LRU to swap it out. During this process we need to clear the PTE twice: first the 'A' (ACCESS) bit, then the 'P' (PRESENT) bit. Processors need to send lots of IPIs, which makes the TLB flush really expensive. Some work has been done to improve this, including reworking smp_call_function_many() or removing the first TLB flush on x86, but there are still some arguments here and only part of the work has been pushed to mainline.

    SWAPIN: A page fault does iodepth=1 synchronous IO, but it is a bit wasteful to issue only a page-sized IO. The obvious solution is swap readahead.
    But the current in-kernel swap readahead is arbitrary (always 8 pages), and it does not perform well for either random or sequential access workloads. Shaohua introduced a new flag for madvise (MADV_WILLNEED) to do swap prefetch, so the changes happen in the userspace API and leave the in-kernel readahead unchanged (though I think some improvement could also be done there).

    SWAP discard: As we know, discard is important for SSD write throughput, but the current swap discard implementation is synchronous. He changed it to asynchronous discard, which allows discard and write to run at the same time. Meanwhile, the unit of discard was also optimized to a cluster.

    Misc, lock contention: With many concurrent swapouts and swapins, contention on locks such as anon_vma or swap_lock is high, so he changed swap_lock to a per-swap-device lock. But there is still some lock contention on very high speed SSDs because of the swapcache address_space lock.

    Zproject (led by Bob Liu)

    Bob gave us a very nice introduction to the current state of memory compression. There are now three projects (zswap/zram/zcache) which all aim to smooth out swap IO storms and improve performance, but they each have their own pros and cons.
    - ZSWAP: It is implemented on top of the frontswap API and uses a dynamic allocator named zbud to allocate free pages. Zbud means pairs of zpages are "buddied": it can store at most two compressed pages in one page frame, so the maximum compression ratio is 50%. Each page frame is LRU-linked and can be shrunk under memory pressure. If the compressed memory pool reaches its limit, shrinking or reclaim happens: a page frame is decompressed into two newly allocated pages which are then written to the real swap device, but this can fail when allocating the two pages.
    - ZRAM: Acts as a compressed ramdisk used as a swap device, and it uses zsmalloc as its allocator, which has high density but may have fragmentation issues. Besides, page reclaim is hard, since more pages are needed to decompress and free just one page. ZRAM is preferred by embedded systems which may not have any real swap device. Both ZRAM and ZSWAP are currently in the drivers/staging tree, and in the mm community there are discussions about merging ZRAM into ZSWAP or vice versa, but no agreement yet.
    - ZCACHE: Handles file page compression, but it was removed from staging recently.

    From industry (led by Tang Jie, LSI)

    An LSI engineer introduced several new products to us. The first is raid5/6 cards that use full stripe writes to improve performance. The second one he introduced is the SandForce flash controller, which can understand data file types (data entropy) to reduce write amplification (WA) for nearly all writes; it is called DuraWrite and the typical WA is 0.5. What's more, if its Dynamic Logical Capacity function module is enabled, the controller can do data compression that is transparent to the upper layers. LSI testing shows that with this virtual capacity enabled, a 1x TB drive can support up to 2x TB capacity, but the application must monitor free flash space to maintain optimal performance and to guard against free flash space exhaustion. He said the most useful application for this is databases. Another thing I think is worth mentioning is the NV-DRAM memory in NMR/Raptor, which is directly exposed to the host system. Applications can directly access the NV-DRAM via a memory address, using the standard system call mmap(). He said that it is very useful for database logging now.
    This kind of NVM product has begun to appear in recent years, and it is said that Samsung is building a research center in China for related products. IMHO, NVM will have an effect on the current OS layers, especially on file systems; e.g. journaling may need to be redesigned to fully utilize this non-volatile memory.

    OCFS2 (led by Canquan Shen)

    Without a doubt, Huawei has been the biggest contributor to OCFS2 over the past two years. They have posted 46 upstream patches and 39 patches have been merged. Their current project is based on 32/64-node clusters, but they have also tried 128 nodes at the experimental stage. The major work they are doing is supporting ATS (atomic test and set), which can work with DLM at the same time. It looks like this idea was inspired by VMware VMFS locking, i.e. http://blogs.vmware.com/vsphere/2012/05/vmfs-locking-uncovered.html

    CLK - 18th October 2013

    Improving Linux Development with Better Tools (Andi Kleen)

    This talk focused on how to find and fix bugs as the complexity of Linux keeps growing. Generally, we can do this with the following kinds of tools:
    - Static code checkers, e.g. sparse, smatch, coccinelle, clang checker, checkpatch, gcc -W/LTO, stanse. These can check for a lot of things, from simple mistakes to complex problems, but the challenges are: some are very slow, false positives, and it may take a concentrated effort to get the false positive rate down. In particular, no static checker I have found can follow indirect calls ("OO in C", common in the kernel): struct foo_ops { int (*do_foo)(struct foo *obj); } foo->do_foo(foo);
    - Dynamic runtime checkers, e.g. thread checkers, kmemcheck, lockdep. Ideally all kernel code would come with a test suite; then someone could run all the dynamic checkers.
    - Fuzzers/test suites. E.g. Trinity is a great tool; it finds many bugs, but needs a manual model for each syscall. Modern fuzzers use automatic feedback, but not for the kernel yet: http://taviso.decsystem.org/making_software_dumber.pdf
    - Debuggers/tracers to understand code, e.g. ftrace, which can dump on events/oops/custom triggers, but in many cases still has too much overhead to run at all times during debugging.
    - Tools to read/understand source, e.g. grep/cscope, which work great for many cases but do not understand indirect pointers (the OO-in-C model used in the kernel), as in "give me all do_foo instances": struct foo_ops { int (*do_foo)(struct foo *obj); } = { .do_foo = my_foo }; foo->do_foo(foo); It would be great to have a cscope-like tool that understands this based on types/initializers.

    XFS: The High Performance Enterprise File System (Jeff Liu) [slides]

    I gave a talk introducing the disk layout and unique features, as well as the recent changes. The slides include some charts comparing performance between XFS/Btrfs/Ext4 for small files. About a dozen users raised their hands when I asked who has experience with XFS. I remember that when I asked the same question at LinuxCon Japan, only 3 people raised their hands, and they were Chris Mason, Ric Wheeler, and one other attendee. The audience questions were mainly focused on stability and comparison with other file systems.

    Linux Containers (Feng Gao)

    The speaker introduced the purpose of the various kinds of namespaces, including mount/UTS/IPC/Network/Pid/User, as well as the system API/ABI. For the userspace tools, he mainly focused on Libvirt LXC rather than LXC itself.
    Libvirt LXC is another userspace container management tool, implemented as one type of libvirt driver; it can manage containers, create namespaces, create a private filesystem layout for a container, create devices for a container, and set up resource controllers via cgroups. In this talk, Feng also mentioned two more possible new namespaces for the future: the first is audit, though it is not clear whether it should be assigned to the user namespace or not; the other is syslog, but the question is, do we really need it?

    In-memory Compression (Bob Liu)

    Same as at CLSF, a nice introduction which I have already covered above.

    Misc

    There were some other talks related to ACPI-based memory hotplug, smart wake-affinity in the scheduler, etc., but my head is not big enough to record all of those things.

    -- Jeff Liu


  • CodePlex Daily Summary for Friday, February 11, 2011

    CodePlex Daily Summary for Friday, February 11, 2011Popular ReleasesSnoop, the WPF Spy Utility: Snoop 2.6.1: This release is a bug fixing release. Most importantly, issues have been seen around WPF 4.0 applications not always showing up in the app chooser. Hopefully, they are fixed now. I thought this issue warranted a minor release since more and more people are going WPF 4.0 and I don't want anyone to have any problems. Dan Hanan also contributes again with several usability features. Thanks Dan! Happy Snooping! p.s. By request, I am also attaching a .zip file ... so that people can install it ...RIBA - Rich Internet Business Application for Silverlight: Preview of MVVM Framework Source + Tutorials: This is a first public release of the MVVM Framework which is part of the final RIBA application. The complete RIBA example LOB application has yet to be published. Further Documentation on the MVVM part can be found on the Blog, http://www.SilverlightBlog.Net and in the downloadable source ( mvvm/doc/ ). Please post all issues and suggestions in the issue tracker.SharePoint Learning Kit: 1.5: SharePoint Learning Kit 1.5 has the following new functionality: *Support for SharePoint 2010 *E-Learning Actions can be localised *Two New Document Library Edit Options *Automatically add the Assignment List Web Part to the Web Part Gallery *Various Bug Fixes for the Drop Box There are 2 downloads for this release SLK-1.5-2010.zip for SharePoint 2010 SLK-1.5-2007.zip for SharePoint 2007 (WSS3 & MOSS 2007)Facebook C# SDK: 5.0.3 (BETA): This is fourth BETA release of the version 5 branch of the Facebook C# SDK. Remember this is a BETA build. Some things may change or not work exactly as planned. We are absolutely looking for feedback on this release to help us improve the final 5.X.X release. For more information about this release see the following blog posts: Facebook C# SDK - Writing your first Facebook Application Facebook C# SDK v5 Beta Internals Facebook C# SDK V5.0.0 (BETA) Released We have spend time trying ...NodeXL: Network Overview, Discovery and Exploration for Excel: NodeXL Excel Template, version 1.0.1.161: The NodeXL Excel template displays a network graph using edge and vertex lists stored in an Excel 2007 or Excel 2010 workbook. What's NewThis release adds a new Twitter List network importer, makes some minor feature improvements, and fixes a few bugs. See the Complete NodeXL Release History for details. Installation StepsFollow these steps to install and use the template: Download the Zip file. Unzip it into any folder. Use WinZip or a similar program, or just right-click the Zip file...WatchersNET.TagCloud: WatchersNET.TagCloud 01.09.03: Whats NewAdded New Skin TagTastic http://www.watchersnet.de/Portals/0/screenshots/dnn/TagCloud-TagTastic-Skin.jpg Added New Skin RoundedButton http://www.watchersnet.de/Portals/0/screenshots/dnn/TagCloud-RoundedButton-Skin.jpg changes Tag Count fixed on Tag Source Referrals Fixed Tag Count when multiple Tag Sources are usedExtremeML: ExtremeML v1.0 Beta 3: VS solution source code updated for compatibility with VS2010 (accommodates VS2010 breaking changes in T4 template support).Finestra Virtual Desktops: 1.1: This release adds a few more performance and graphical enhancements to 1.0. Switching desktops is now about as fast as you can blink. 
Desktop switching optimizations New welcome wizard for Vista/7 Fixed a few minor bugs Added a few more options to the options dialog (including ability to disable the taskbar switching)WCF Data Services Toolkit: WCF Data Services Toolkit: The source code and binary releases of the WCF Data Services Toolkit. For simplicity, the source code download doesn't include any of the MSTest files. If you want those, you can pull the code down via MercurialyoutubeFisher: youtubeFisher 3.0 [beta]: What's new: Video capturing improved Supports YouTube's new layout (january 2011) Internal refactoringNearforums - ASP.NET MVC forum engine: Nearforums v5.0: Version 5.0 of the ASP.NET MVC Forum Engine, containing the following improvements: .NET 4.0 as target framework using ASP.NET MVC 3. All views migrated to Razor for cleaner markup. Alternate template (Layout file) for mobile devices 4 Bug Fixes since Version 4.1 Visit the project Roadmap for more details. Webdeploy package sha1 checksum: 28785b7248052465ea0738a7775e8e8744d84c27fuv: 1.0 release, codename Chopper Joe: features: search/replace :o to open file :s to save file :q to quitASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.7: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager html generation optimized new features for the lookup (add additional search data ) live demo went aeroEnhSim: EnhSim 2.3.6 BETA: 2.3.6 BETAThis release supports WoW patch 4.06 at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 Changes since 2.3.0 ...TestApi - a library of Test APIs: TestApi v0.6: TestApi v0.6 comes with the following changes: TestApi code development has been moved to Codeplex: Moved TestApi soluton to VS 2010; Moved all source code to Codeplex. All development work is done there now. Fault Injection API: Integrated the unmanaged FaultInjectionEngine.dll COM component in the build; Cleaned up FaultInjectionEngine.dll to build at warning level 4; Implemented “FaultScope” which allows for in-process fault injection; Added automation scripts & sample program; ...AutoLoL: AutoLoL v1.5.5: AutoChat now allows up to 6 items. Items with nr. 7-0 will be removed! News page url's are now opened in the default browser Added a context menu to the system tray icon (thanks to Alex Banagos) AutoChat now allows configuring the Chat Keys and the Modifier Key The recent files list now supports compact and full mode Fix: Swapped mouse buttons are now properly detected Fix: Sometimes the Play button was pressed while still greyed out Champion: Karma Note: You can also run the u...mojoPortal: 2.3.6.2: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2362-released.aspx Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. 
To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.Rawr: Rawr 4.0.19 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Beta Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a pre-alpha release of the WPF version, there are likely to be a lot of issues. If you have a problem, please follow the Posting Guidelines and put it into the Issue Trac...IronRuby: 1.1.2: IronRuby 1.1.2 is a servicing release that keeps on improving compatibility with Ruby 1.9.2 and includes IronRuby integration to Visual Studio 2010. We decided to drop 1.8.6 compatibility mode in all post-1.0 releases. We recommend using IronRuby 1.0 if you need 1.8.6 compatibility. In this release we fixed several major issues: - problems that blocked Gem installation in certain cases - regex syntax: the parser was replaced with a new one that is much more compatible with Ruby 1.9.2 - cras...MVVM Light Toolkit: MVVM Light Toolkit V3 SP1 (4): There was a small issue with the previous release that caused errors when installing the templates in VS10 Express. This release corrects the error. Only use this if you encountered issues when installing the previous release. No changes in the binaries.New ProjectsAlchemySearch: A middleware application that allows the searching of Open Text's Alchemy application from a command line or from a calling application. C# .NET 4alltuan: all tuan infoasp.net mvc 2.0 ??????: asp.net mvc 2.0 ??????Basic Text File Generator: This is an investigatory development that creates text files from a windows forms app.bisolu_luc: nothing yetCriteria Pack for EPiServer CMS: This is a collection of useful criteria for extending the personalization mechanism, Visitor Groups, in EPiServer CMS 6 R2 (or later).CTRNN.NET: CTRNN.NET is a C# implementation of Continuous Time Recurrent Neural Networks.DXperience Toolkit: Complimentary controls and libraries from DevExpress.Excel Library for Small Basic: This library extend Small Basic to allows you to read or write contents of a cell in the Excel file from Small Basic program. ???????? Small Basic ????、Small Basic ??? Excel ???????????????????。HealthVault Eventing Sample: A sample application demonstrating the new HealthVault Eventing feature.Hitcents Blog - Example Code: Contains various source code examples from articles at http://www.hitcents.com/blog. Brought to you by Hitcents.INI Modifier: A proprietary application to amend users ini files for JBAe automatically from a central control file.issueIT.net - ”I just found the last bug”: issueIT is the best open source web-based issue tracker for .NET. It’s easy to customize to suit your organization. This project is not finished yet - it is only at a very early stage Leage of Legends Masteries Tool: -Small, portable and fast tool to help you set your masteries. -No Install. -Masteries saved in the same folder: "lolMasterSet.dat" (text file) How to use: - Run and then go to masteries tab in League of Legends luncher. You should see the overlay. Memory View controls: Memory View controls are a pair of controls for real time and static viewing and editing of a memory space. 
MemoryHexEditor is a hex editor that supports viewing and editing memory, and MemoryHeatMap is a time-based graphical view that shows how memory is read and written to.Moq Contrib: Community contributions to Moq.Operating System Basic: Sistema operacional desenvolvido em QuickBasic, durante minha adolecência.PAiRS - A WPF Memory Card Game: PAiRS is an implementation of a card matching game in which you are given an even number of cards face down in a grid, and you try to flip over 2 cards at a time to create a match until all cards are matched. PAiRS is built using C#, .Net 3.5+ and WPF.RIBA - Rich Internet Business Application for Silverlight: RIBA is an example LOB application with many features to show how to build distributed and secure Line of Business applications based on Silverlight 4. You can find the official blog on www.silverlightblog.netSearcher: Searcher is a Windows Phone project which allows you to quickly search using multiple search engines. SharePoint Term Store Powershell Utilities: An assortment of PowerShell script-based utilities to perform actions on the SharePoint Managed Metadata Term Store. PowerShell script-based utilities have been chosen rather than cmdlets as they are more suitable for environments that have strict installation policies.SmartHotelFoundation: Smart Hotel FoundationSumer Filemanager: This project ensured fully ajax based file management.The Open Source PMU: The Open Source PMU Project provides resources that enable you to build your own SynchroPhasor sensor for use with the openPDC project, research, development, or electric grid observation.Trail Blazer: Movie Trailer Fetcher - Downloads trailers for movies in a collection and saves them in movie folder. It is being developed as a companion to Meta Data Fetchers for the Media Browser project. It is being developed as a standalone C# app but a frontend in MCML is being considered.University Project Tracker: University Project TrackerVideo Backup Fusion: The complete and simply to use backup solution for video portalsWPF ShellFactory: WPF ShellFactory is an easy-to-use application framework for WPF, using a system derived from M-V-VM for its views. It supports multiple modules, either statically or dynamically loaded (i.e. plugins). It also contains mechanisms for application-wide services.


  • Windows Phone 7: Building a simple dictionary web client

    - by TechTwaddle
    Like I mentioned in this post a while back, I came across a dictionary web service called Aonaware that serves up word definitions from various dictionaries and is really easy to use. The services page on their website, http://services.aonaware.com/DictService/DictService.asmx, lists all the operations supported by the dictionary service. Here they are:

    Word Dictionary Web Service - the following operations are supported (for a formal definition, please review the Service Description):
    - Define: define the given word, returning definitions from all dictionaries
    - DefineInDict: define the given word, returning definitions from the specified dictionary
    - DictionaryInfo: show information about the specified dictionary
    - DictionaryList: return a list of available dictionaries
    - DictionaryListExtended: return a list of advanced dictionaries (e.g. translating dictionaries)
    - Match: look for matching words in all dictionaries using the given strategy
    - MatchInDict: look for matching words in the specified dictionary using the given strategy
    - ServerInfo: show remote server information
    - StrategyList: return a list of all available strategies on the server

    Follow the links above to get more information on each API. In this post we will build a simple Windows Phone 7 client which uses this service to get word definitions for words entered by the user. The application will also allow the user to select a dictionary from all the available ones and look up the word definition in that dictionary. So of all the APIs above we will be using only two: DictionaryList() to get a list of all supported dictionaries, and DefineInDict() to get the word definition from a particular dictionary.

    Before we get started, a note to you all: I would have liked to implement this application using concepts from data binding, item templates, data templates etc. I have a basic understanding of what they are but, being a beginner, I am not very comfortable with those topics yet, so I didn't use them. I thought I'd get this version out of the way and maybe in the next version I could give those a try.

    Here is a somewhat scary mock-up of what the final application will look like. Select Dictionary is a list picker control from the Silverlight Toolkit (you need to download and install the toolkit if you haven't already). Below it is a textbox where the user can enter words to look up, and a button beside it to fetch the word definition when clicked. Finally we have a textblock which occupies the remaining area and displays the word definition from the selected dictionary.

    Create a Silverlight application for Windows Phone 7, AonawareDictionaryClient, and add references to the Silverlight Toolkit and the web service. From the Solution Explorer, right-click on References and select Microsoft.Phone.Controls.Toolkit from under the .NET tab. Next, add a reference to the web service. Again right-click on References and this time select Add Service Reference. In the resulting dialog, paste the service url in the Address field and press Go (url -> http://services.aonaware.com/DictService/DictService.asmx). Once the service is discovered, provide a name for the namespace; in this case I've called it AonawareDictionaryService. Press OK. You can now use the classes and functions that are generated in the AonawareDictionaryClient.AonawareDictionaryService namespace. Before we touch the UI, the short sketch below condenses the only two calls we will make against this generated proxy.
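    This is only an orientation sketch, not code from the original walkthrough: it strings together the proxy members the post uses later (DictServiceSoapClient, DictionaryListAsync/DictionaryListCompleted, DefineInDictAsync/DefineInDictCompleted, and the generated Dictionary type with its Id and Name properties). Windows Phone 7 only exposes the event-based asynchronous pattern for service calls, so both operations report back through their *Completed events.

        using System.Diagnostics;
        using AonawareDictionaryClient.AonawareDictionaryService;

        // Orientation sketch: the two service operations this client relies on.
        // WP7 only allows asynchronous service calls, so results arrive via events.
        public class DictServiceOverview
        {
            public void Run()
            {
                DictServiceSoapClient client = new DictServiceSoapClient();

                // 1) DictionaryList: e.Result is an array of Dictionary objects
                //    (Id + Name) which the app later feeds into the list picker.
                client.DictionaryListCompleted += (s, e) =>
                {
                    if (e.Error == null)
                    {
                        foreach (Dictionary dict in e.Result)
                        {
                            Debug.WriteLine(dict.Id + " : " + dict.Name);
                        }
                    }
                };
                client.DictionaryListAsync();

                // 2) DefineInDict: takes a dictionary id (e.g. "wn" for WordNet) and
                //    the word to define; definitions arrive in DefineInDictCompleted.
                client.DefineInDictCompleted += (s, e) =>
                {
                    if (e.Error == null)
                    {
                        // e.Result carries the definitions; displaying them is
                        // covered at the end of the post.
                    }
                };
                client.DefineInDictAsync("wn", "example");
            }
        }

    With that outline in mind, let's get the UI done now.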
In MainPage.xaml add a namespace declaration to use the toolkit controls, xmlns:toolkit="clr-namespace:Microsoft.Phone.Controls;assembly=Microsoft.Phone.Controls.Toolkit" the content of LayoutRoot is changed as follows, (sorry, no syntax highlighting in this post) <StackPanel x:Name="TitlePanel" Grid.Row="0" Margin="12,5,0,5">     <TextBlock x:Name="ApplicationTitle" Text="AONAWARE DICTIONARY CLIENT" Style="{StaticResource PhoneTextNormalStyle}"/>     <!--<TextBlock x:Name="PageTitle" Text="page name" Margin="9,-7,0,0" Style="{StaticResource PhoneTextTitle1Style}"/>--> </StackPanel> <!--ContentPanel - place additional content here--> <Grid x:Name="ContentPanel" Grid.Row="1" Margin="12,0,12,0">     <Grid.RowDefinitions>         <RowDefinition Height="Auto"/>         <RowDefinition Height="Auto"/>         <RowDefinition Height="*"/>     </Grid.RowDefinitions>     <toolkit:ListPicker Grid.Column="1" x:Name="listPickerDictionaryList"                         Header="Select Dictionary :">     </toolkit:ListPicker>     <Grid Grid.Row="1" Margin="0,5,0,0">         <Grid.ColumnDefinitions>             <ColumnDefinition Width="*"/>             <ColumnDefinition Width="Auto" />         </Grid.ColumnDefinitions>         <TextBox x:Name="txtboxInputWord" Grid.Column="0" GotFocus="OnTextboxInputWordGotFocus" />         <Button x:Name="btnGo" Grid.Column="1" Click="OnButtonGoClick" >             <Button.Content>                 <Image Source="/images/button-go.png"/>             </Button.Content>         </Button>     </Grid>     <ScrollViewer Grid.Row="2" x:Name="scrollViewer">         <TextBlock  Margin="12,5,12,5"  x:Name="txtBlockWordMeaning" HorizontalAlignment="Stretch"                    VerticalAlignment="Stretch" TextWrapping="Wrap"                    FontSize="26" />     </ScrollViewer> </Grid> I have commented out the PageTitle as it occupies too much valuable space, and the ContentPanel is changed to contain three rows. First row contains the list picker control, second row contains the textbox and the button, and the third row contains a textblock within a scroll viewer. The designer will now be showing the final ui, Now go to MainPage.xaml.cs, and add the following namespace declarations, using Microsoft.Phone.Controls; using AonawareDictionaryClient.AonawareDictionaryService; using System.IO.IsolatedStorage; A class called DictServiceSoapClient would have been created for you in the background when you added a reference to the web service. This class functions as a wrapper to the services exported by the web service. All the web service functions that we saw at the start can be access through this class, or more precisely through an object of this class. Create a data member of type DictServiceSoapClient in the Mainpage class, and a function which initializes it, DictServiceSoapClient DictSvcClient = null; private DictServiceSoapClient GetDictServiceSoapClient() {     if (null == DictSvcClient)     {         DictSvcClient = new DictServiceSoapClient();     }     return DictSvcClient; } We have two major tasks remaining. First, when the application loads we need to populate the list picker with all the supported dictionaries and second, when the user enters a word and clicks on the arrow button we need to fetch the word’s meaning. Populating the List Picker In the OnNavigatingTo event of the MainPage, we call the DictionaryList() api. This can also be done in the OnLoading event handler of the MainPage; not sure if one has an advantage over the other. 
Here’s the code for OnNavigatedTo, protected override void OnNavigatedTo(System.Windows.Navigation.NavigationEventArgs e) {     DictServiceSoapClient client = GetDictServiceSoapClient();     client.DictionaryListCompleted += new EventHandler<DictionaryListCompletedEventArgs>(OnGetDictionaryListCompleted);     client.DictionaryListAsync();     base.OnNavigatedTo(e); } Windows Phone 7 supports only async calls to web services. When we added a reference to the dictionary service, asynchronous versions of all the functions were generated automatically. So in the above function we register a handler to the DictionaryListCompleted event which will occur when the call to DictionaryList() gets a response from the server. Then we call the DictionaryListAsynch() function which is the async version of the DictionaryList() api. The result of this api will be sent to the handler OnGetDictionaryListCompleted(), void OnGetDictionaryListCompleted(object sender, DictionaryListCompletedEventArgs e) {     IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;     Dictionary[] listOfDictionaries;     if (e.Error == null)     {         listOfDictionaries = e.Result;         PopulateListPicker(listOfDictionaries, settings);     }     else if (settings.Contains("SavedDictionaryList"))     {         listOfDictionaries = settings["SavedDictionaryList"] as Dictionary[];         PopulateListPicker(listOfDictionaries, settings);     }     else     {         MessageBoxResult res = MessageBox.Show("An error occured while retrieving dictionary list, do you want to try again?", "Error", MessageBoxButton.OKCancel);         if (MessageBoxResult.OK == res)         {             GetDictServiceSoapClient().DictionaryListAsync();         }     }     settings.Save(); } I have used IsolatedStorageSettings to store a few things; the entire dictionary list and the dictionary that is selected when the user exits the application, so that the next time when the user starts the application the current dictionary is set to the last selected value. First we check if the api returned any error, if the error object is null e.Result will contain the list (actually array) of Dictionary type objects. If there was an error, we check the isolated storage settings to see if there is a dictionary list stored from a previous instance of the application and if so, we populate the list picker based on this saved list. Note that in this case there are chances that the dictionary list might be out of date if there have been changes on the server. Finally, if none of these cases are true, we display an error message to the user and try to fetch the list again. 
PopulateListPicker() is passed the array of Dictionary objects and the settings object as well, void PopulateListPicker(Dictionary[] listOfDictionaries, IsolatedStorageSettings settings) {     listPickerDictionaryList.Items.Clear();     foreach (Dictionary dictionary in listOfDictionaries)     {         listPickerDictionaryList.Items.Add(dictionary.Name);     }     settings["SavedDictionaryList"] = listOfDictionaries;     string savedDictionaryName;     if (settings.Contains("SavedDictionary"))     {         savedDictionaryName = settings["SavedDictionary"] as string;     }     else     {         savedDictionaryName = "WordNet (r) 2.0"; //default dictionary, wordnet     }     foreach (string dictName in listPickerDictionaryList.Items)     {         if (dictName == savedDictionaryName)         {             listPickerDictionaryList.SelectedItem = dictName;             break;         }     }     settings["SavedDictionary"] = listPickerDictionaryList.SelectedItem as string; } We first clear all the items from the list picker, add the dictionary names from the array and then create a key in the settings called SavedDictionaryList and store the dictionary list in it. We then check if there is saved dictionary available from a previous instance, if there is, we set it as the selected item in the list picker. And if not, we set “WordNet ® 2.0” as the default dictionary. Before returning, we save the selected dictionary in the “SavedDictionary” key of the isolated storage settings. Fetching word definitions Getting this part done is very similar to the above code. We get the input word from the textbox, call into DefineInDictAsync() to fetch the definition and when DefineInDictAsync completes, we get the result and display it in the textblock. Here is the handler for the button click, private void OnButtonGoClick(object sender, RoutedEventArgs e) {     txtBlockWordMeaning.Text = "Please wait..";     IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;     if (txtboxInputWord.Text.Trim().Length <= 0)     {         MessageBox.Show("Please enter a word in the textbox and press 'Go'");     }     else     {         Dictionary[] listOfDictionaries = settings["SavedDictionaryList"] as Dictionary[];         string selectedDictionary = listPickerDictionaryList.SelectedItem.ToString();         string dictId = "wn"; //default dictionary is wordnet (wn is the dict id)         foreach (Dictionary dict in listOfDictionaries)         {             if (dict.Name == selectedDictionary)             {                 dictId = dict.Id;                 break;             }         }         DictServiceSoapClient client = GetDictServiceSoapClient();         client.DefineInDictCompleted += new EventHandler<DefineInDictCompletedEventArgs>(OnDefineInDictCompleted);         client.DefineInDictAsync(dictId, txtboxInputWord.Text.Trim());     } } We validate the input and then select the dictionary id based on the currently selected dictionary. We need the dictionary id because the api DefineInDict() expects the dictionary identifier and not the dictionary name. We could very well have stored the dictionary id in isolated storage settings too. Again, same as before, we register a event handler for the DefineInDictCompleted event and call the DefineInDictAsync() method passing in the dictionary id and the input word. 
void OnDefineInDictCompleted(object sender, DefineInDictCompletedEventArgs e) {     WordDefinition wd = e.Result;     scrollViewer.ScrollToVerticalOffset(0.0f);     if (wd.Definitions.Length == 0)     {         txtBlockWordMeaning.Text = String.Format("No definitions were found for '{0}' in '{1}'", txtboxInputWord.Text.Trim(), listPickerDictionaryList.SelectedItem.ToString().Trim());     }     else     {         foreach (Definition def in wd.Definitions)         {             string str = def.WordDefinition;             str = str.Replace("  ", " "); //some formatting             txtBlockWordMeaning.Text = str;         }     } } When the api completes, e.Result will contain a WordDefnition object. This class is also generated in the background while adding the service reference. We check the word definitions within this class to see if any results were returned, if not, we display a message to the user in the textblock. If a definition was found the text on the textblock is set to display the definition of the word. Adding final touches, we now need to save the current dictionary when the application exits. A small but useful thing is selecting the entire word in the input textbox when the user selects it. This makes sure that if the user has looked up a definition for a really long word, he doesn’t have to press ‘clear’ too many times to enter the next word, protected override void OnNavigatingFrom(System.Windows.Navigation.NavigatingCancelEventArgs e) {     IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;     settings["SavedDictionary"] = listPickerDictionaryList.SelectedItem as string;     settings.Save();     base.OnNavigatingFrom(e); } private void OnTextboxInputWordGotFocus(object sender, RoutedEventArgs e) {     TextBox txtbox = sender as TextBox;     if (txtbox.Text.Trim().Length > 0)     {         txtbox.SelectionStart = 0;         txtbox.SelectionLength = txtbox.Text.Length;     } } OnNavigatingFrom() is called whenever you navigate away from the MainPage, since our application contains only one page that would mean that it is exiting. I leave you with a short video of the application in action, but before that if you have any suggestions on how to make the code better and improve it please do leave a comment. Until next time…
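One small robustness note before wrapping up (this is my own suggestion, not code from the post): unlike OnGetDictionaryListCompleted, the OnDefineInDictCompleted handler above never checks e.Error, so a failed lookup would throw when e.Result is read. A guarded variant, sketched along the same lines as the author's dictionary-list handler, might look like this:
void OnDefineInDictCompleted(object sender, DefineInDictCompletedEventArgs e)
{
    // Sketch only: mirror the error handling used in OnGetDictionaryListCompleted.
    if (e.Error != null)
    {
        txtBlockWordMeaning.Text = "Sorry, the definition could not be retrieved. Please try again.";
        return;
    }
    WordDefinition wd = e.Result;
    scrollViewer.ScrollToVerticalOffset(0.0f);
    if (wd.Definitions.Length == 0)
    {
        txtBlockWordMeaning.Text = String.Format("No definitions were found for '{0}' in '{1}'",
            txtboxInputWord.Text.Trim(), listPickerDictionaryList.SelectedItem.ToString().Trim());
    }
    else
    {
        foreach (Definition def in wd.Definitions)
        {
            txtBlockWordMeaning.Text = def.WordDefinition.Replace("  ", " "); // same formatting as before
        }
    }
}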

    Read the article

  • CodePlex Daily Summary for Tuesday, May 31, 2011

    CodePlex Daily Summary for Tuesday, May 31, 2011Popular ReleasesNearforums - ASP.NET MVC forum engine: Nearforums v6.0: Version 6.0 of Nearforums, the ASP.NET MVC Forum Engine, containing new features: Authentication using Membership Provider for SQL Server and MySql Spam prevention: Flood Control Moderation: Flag messages Content management: Pages: Create pages (about us/contact/texts) through web administration Allow nearforums to run as an IIS subapp Migrated Facebook Connect to OAuth 2.0 Visit the project Roadmap for more details.NetOffice - The easiest way to use Office in .NET: NetOffice Release 0.8b: Changes: - fix critical issue 15922(AccessViolationException) once and for all update is strongly recommended Includes: - Runtime Binaries and Source Code for .NET Framework:......v2.0, v3.0, v3.5, v4.0 - Tutorials in C# and VB.Net:..............................................................COM Proxy Management, Events, etc. - Examples in C# and VB.Net:............................................................Excel, Word, Outlook, PowerPoint, Access - COMAddin Examples in C# and VB....Facebook Graph Toolkit: Facebook Graph Toolkit 1.5.4186: Updates the API in response to Facebook's recent change of policy: All Graph Api accessing feeds or posts must provide a AccessToken.SharePoint Farm Poster: SharePoint Farm Poster: SharePoint Farm Poster is generated by a PowerShell Script. Run this script under the Farm Admin Account. After downloading, unblock the file in the Property Window. Current version is beta : v0.3.0VCC: Latest build, v2.1.40530.0: Automatic drop of latest buildServiio for Windows Home Server: Beta Release 0.5.2.0: Ready for widespread beta. Synchronized build number to Serviio version to avoid confusion.AcDown????? - Anime&Comic Downloader: AcDown????? v3.0 Beta4: ??AcDown?????????????,??????????????,????、????。?????Acfun????? ????32??64? Windows XP/Vista/7 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??v3.0 Beta4 2011-5-31?? ???Bilibili.us????? ???? ?? ???"????" ???Bilibili.us??? ??????? ?? ??????? ?? ???????? ?? ?? ???Bilibili.us?????(??????????????????) ??????(6.cn)?????(????) ?? ?????Acfun?????????? ?????????????? ???QQ???????? ????????????Discussion...Terraria Map Generator: TerrariaMapTool 1.0.0.2 Beta: Version 1.0.0.2 Beta Release - Now has a Gui - Draws backgrounds (May still not be exact) - Hopefully fixed support on DirectX 9 machine.CodeCopy Auto Code Converter: Code Copy v0.1: Full add-in, setup project source code and setup fileEnhSim: EnhSim 2.4.5 ALPHA: 2.4.5 ALPHAThis release supports WoW patch 4.1 at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Added in the T12 s...TerrariViewer: TerrariViewer v2.4.1: Added Piggy Bank editor and fixed some minor bugs.Kooboo CMS: Kooboo CMS 3.02: What is new in kooboo cms 3.02 The most important updates of this version is the Kooboo site builder, an unique and creative web design tool, design an professional website and export to Kooboo CMS. See: http://www.sitekin.com Add Version contorl on View, Layout and other elements. 
Add user CMS language selection, user can select a language to use on their CMS backend. Add User profile provider, you can use now stop website user information on a SQL database. Previously it stored on XML...mojoPortal: 2.3.6.6: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2366-released Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.Terraria World Creator: Terraria World Creator: Version 1.01 Fixed a bug that would cause the application to crash. Re-named the Application.VidCoder: 0.9.0: New startup UI for one-click scanning of discs or opening a file/folder. New seek bar on the preview window to make switching previews easier (you can click anywhere on the bar). Added gradient backgrounds to the main window to visually group the sections. Added Open Video File and Open Video Folder options to the File menu. Moved preview button to be in line with the other control buttons. Fixed settings getting in a weird state if they were saved without an output folder being chos...General Media Access WebService: 0.2.0.0 Beta: Updated GMA release with sorting/ordering mechanisms. Several bug fixes.Microsoft All-In-One Code Framework - a centralized code sample library: All-In-One Code Framework 2011-05-26: Alternatively, you can install Sample Browser or Sample Browser VS extension, and download the code samples from Sample Browser. Improved and Newly Added Examples:For an up-to-date code sample index, please refer to All-In-One Code Framework Sample Catalog. NEW Samples for Dynamics Sample Description Owner CSDynamicsNAVWebServices The code sample shows syntax for calling Dynamics NAV Web Services. Lars Lohndorf-Larsen NEW Samples for WPF Sample Description Owner CSWPFDataGridCustomS...Terraria World Viewer: Version 1.1: Update May 26th Added Chest Filtering, this allows chests only containing certain items to have their symbol drawn. (Its under advanced settings tab) GUI elements (checkboxes/etc) are persistant between uses of the application Beta Worlds (i.e. Release #38) will work properly Symbols can be enabled or disabled on a per symbol basis Chest Information tab which is just a dump of the current chest information Meterorite is now visible as a bright magenta pink Application defaults to ...MVC Controls Toolkit: Mvc Controls Toolkit 1.1 RC: *Added: Compatibility with jQuery 1.6.1 Rendering of enumerables with images and/or customizable strings improved the client side tempate engine added new parameters to the template definition binding all new knockout bindings helpers have been fully implemented added a new overload for defining the client-side ViewModel The SetTme method has the option to store the theme in a permanent cookie If no CSS class is provided for the watermark of a TypedTextBox the watermark class of the current t...patterns & practices: Project Silk: Project Silk - Documentation Only Drop - May 24: To get the latest code, please see the previous drop here. Guidance Chapters Ready for Review The following chapters (provided in CHM or PDF format) are ready for community review. Our team very much appreciates your feedback and technical review. 
All documentation feedback should be posted in the Issue Tracker; if required, a document can be attached along with the feedback. Architecture jQuery UI Widgets Server-Side Implementation Security Unit Testing Web Applications Widget Q...New Projects#liveDB: liveDB is an in-memory database engine for Microsoft .NET providing full ACID support, lightning fast performance and offering a significant reduction of development and operational costs. liveDB is built on Live Domain Technology(TM).8 hours: 8hours Private studyABox2d: A port of Box 2d game engine doing it has an exercise to study how the game engine work.ADempiere.NET: If I have enough time and support I we will translate this into .NETAlmonaster: Almonaster is a turn-based multi-player war game. It is free for all players and comes with absolutely no warranty. The game is fully web-based and requires no downloads, Javascript, Java or ActiveX controls. ASPone API: ASPone partnerské API (aplikacní programové rozhraní) je rozhraní pro vytvorené a urcené pro partnery spolecnosti ASPone, s.r.o. Pomocí tohoto aplikacního rozhraní mužete zautomatizovat radu úkonu, které by pomocí webového rozhraní mohly být casove nárocnejší nebo vyžadují interakci cloveka. API umožnuje zautomatizovat radu úkonu souvisejících se správou domén, doménových kontaktu, webhostingu, databází, serveru a mnoha dalších. Pro zjednodušení práce s API jsou již pripraveni dva ukázkový...CodeCopy Auto Code Converter: This add-in project converts c# and vb.net codes in visual studio.drms: Data Resource Management SystemDrop Down CheckBoxList control (DropDownCheckBoxes): DropDownCheckBoxes is an ASP.NET server control directly inheriting standard ASP.NET CheckBoxList control and fully it supports parent's API (except members responsible for rendering and styling). Thus in most cases CheckBoxList control can be simply replaced with DropDownCheckBoxes with no need to change any data binding code or event handlers. In normal state the control is displayed as a select (DropDownList) control. Clicking the expand button shows a list with check boxes. When the se...Extended Registration module for Orchard CMS: This project has a dependency on the Contrib.Profile module. With this module enabled, users must fill out any parts you add to the User ContentItem in the Registration page. Ideal if you require additional information from your users.GreenWay: Car navigation softwareHost Profiles: Host Profiles is small tool to control, switch and management the hosts file of the computer. The hosts file is located in "c:\windows\system32\drivers\etc\hosts".HRM System MVVM sample code: This is the sample WPF MVVM application that i've described in my blog posts. I hope to give you a clear view of mvvm and other commonly used patterns.Mi Game Library: Ever wanted to store all the games you own into one place that you, could then later come see and search also with your own personal wish list!Micorrhiza: Micorrhiza is a client-server solution written in C# for voice- and video-communications between users in local and global networks.MPlayer.NET for Windows Forms & WPF: MPlayer.NET is a wrapper around MPlayer executable. It's developed on .NET platform and includes visual controls for both Windows Forms and WPF applications.MyGet - NuGet-as-a-Service: This project is the source for http://myget.org. MyGet offers you the possibility to create your own, private, filtered NuGet feed for use in the Visual Studio Package Manager. 
It can contain packages from the official NuGet feed as well as your private packages, hosted on MyGet.MZExtensions: A collection of handy C# Extension Methods.NCAds: NCadsNetSync: Universal file synchronization agent.OLE 1C7.7: OLE 1C7.7 ?????????? ??????? ??? ??????? ? 1?7.7 ????????? OLE ??????????.Pear 2.5: Pear 2.5 is a web browser which has MetroUI which is also known for WP7. Pear 2.5's graphics is totally made up with MetroUI and looks stunning when browse. This version has 3 builds - 2 alpha builds and 1 gamma delta (beta) build. It's developed in VB.NET which is the easiest.ProjectOne: ProjectOne is a Open Community Information Sharing Website regarding Realty as its primary source.russomi: russomiSopaco Server Foundation 1.x: The one earlier version of my server infrastructure(SSF, Sopaco Server Foundation 1.x, owned by ??)。 Network Layer Based On MINA, message meta in 1.x is hard coded to 6bytes message header like this struct NetworkMessageHeader { short msgId; int msgLength; } struct NetworkMTray Timer: A simple timer/stopwatch which runs fromt he system tray. I started it as a hobby learning project to understand the Win32 API. Now open sourcing it to get more inputs about the same, and at the same time it may prove helpful to othersVENSOFT DIPERCAX: Proyecto Final del Curso de Proyectos II de la Universidad Privada del NorteWindows Phone Blog Menu: A Silverlight navigation control that looks like a Windows Phone 7. The live tiles are links to websites. Use this control on your blog or website to show your love for WP7. It is a creative way to link to external sites you are interested in.

    Read the article

  • C#/.NET Fundamentals: Choosing the Right Collection Class

    - by James Michael Hare
    The .NET Base Class Library (BCL) has a wide array of collection classes at your disposal which make it easy to manage collections of objects. While it's great to have so many classes available, it can be daunting to choose the right collection to use for any given situation. As hard as it may be, choosing the right collection can be absolutely key to the performance and maintainability of your application! This post will look at breaking down any confusion between each collection and the situations in which they excel. We will be spending most of our time looking at the System.Collections.Generic namespace, which is the recommended set of collections.
The Generic Collections: System.Collections.Generic namespace
The generic collections were introduced in .NET 2.0 in the System.Collections.Generic namespace. This is the main body of collections you should tend to focus on first, as they will tend to suit 99% of your needs right up front. It is important to note that the generic collections are unsynchronized. This decision was made for performance reasons because, depending on how you are using the collections, it's completely possible that synchronization may not be required or may be needed on a higher level than simple method-level synchronization. Furthermore, concurrent read access (all writes done at beginning and never again) is always safe, but for concurrent mixed access you should either synchronize the collection or use one of the concurrent collections. So let's look at each of the collections in turn and their various pros and cons; at the end we'll summarize with a table to help make it easier to compare and contrast the different collections.
The Associative Collection Classes
Associative collections store a value in the collection by providing a key that is used to add/remove/lookup the item. Hence, the container associates the value with the key. These collections are most useful when you need to lookup/manipulate a collection using a key value. For example, if you wanted to look up an order in a collection of orders by an order id, you might have an associative collection where the key is the order id and the value is the order. The Dictionary<TKey,TValue> is probably the most used associative container class. The Dictionary<TKey,TValue> is the fastest class for associative lookups/inserts/deletes because it uses a hash table under the covers. Because the keys are hashed, the key type should correctly implement GetHashCode() and Equals() appropriately or you should provide an external IEqualityComparer to the dictionary on construction. The insert/delete/lookup time of items in the dictionary is amortized constant time - O(1) - which means no matter how big the dictionary gets, the time it takes to find something remains relatively constant. This is highly desirable for high-speed lookups. The only downside is that the dictionary, by nature of using a hash table, is unordered, so you cannot easily traverse the items in a Dictionary in order. The SortedDictionary<TKey,TValue> is similar to the Dictionary<TKey,TValue> in usage but very different in implementation. The SortedDictionary<TKey,TValue> uses a binary tree under the covers to maintain the items in order by the key. As a consequence of sorting, the type used for the key must correctly implement IComparable<TKey> so that the keys can be correctly sorted. 
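To make the key requirements above concrete, here is a minimal sketch of a dictionary keyed by order id, the scenario mentioned earlier (the Order type and sample ids are hypothetical, not from the original article):
using System;
using System.Collections.Generic;

class Order
{
    public string Id { get; set; }
    public decimal Total { get; set; }
}

class Program
{
    static void Main()
    {
        // string keys already implement GetHashCode()/Equals() correctly,
        // so lookups are amortized O(1) no matter how many orders we add.
        var ordersById = new Dictionary<string, Order>();
        ordersById["A-1001"] = new Order { Id = "A-1001", Total = 25.00m };
        ordersById["A-1002"] = new Order { Id = "A-1002", Total = 99.95m };

        Order order;
        if (ordersById.TryGetValue("A-1002", out order))
        {
            Console.WriteLine(order.Total); // 99.95
        }

        // A SortedDictionary<string, Order> would instead keep the keys in sorted order,
        // which works because string also implements IComparable<string>.
    }
}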
The sorted dictionary trades a little bit of lookup time for the ability to maintain the items in order, thus insert/delete/lookup times in a sorted dictionary are logarithmic - O(log n). Generally speaking, with logarithmic time, you can double the size of the collection and it only has to perform one extra comparison to find the item. Use the SortedDictionary<TKey,TValue> when you want fast lookups but also want to be able to maintain the collection in order by the key. The SortedList<TKey,TValue> is the other ordered associative container class in the generic containers. Once again SortedList<TKey,TValue>, like SortedDictionary<TKey,TValue>, uses a key to sort key-value pairs. Unlike SortedDictionary, however, items in a SortedList are stored as an ordered array of items. This means that insertions and deletions are linear - O(n) - because deleting or adding an item may involve shifting all items up or down in the list. Lookup time, however is O(log n) because the SortedList can use a binary search to find any item in the list by its key. So why would you ever want to do this? Well, the answer is that if you are going to load the SortedList up-front, the insertions will be slower, but because array indexing is faster than following object links, lookups are marginally faster than a SortedDictionary. Once again I'd use this in situations where you want fast lookups and want to maintain the collection in order by the key, and where insertions and deletions are rare. The Non-Associative Containers The other container classes are non-associative. They don't use keys to manipulate the collection but rely on the object itself being stored or some other means (such as index) to manipulate the collection. The List<T> is a basic contiguous storage container. Some people may call this a vector or dynamic array. Essentially it is an array of items that grow once its current capacity is exceeded. Because the items are stored contiguously as an array, you can access items in the List<T> by index very quickly. However inserting and removing in the beginning or middle of the List<T> are very costly because you must shift all the items up or down as you delete or insert respectively. However, adding and removing at the end of a List<T> is an amortized constant operation - O(1). Typically List<T> is the standard go-to collection when you don't have any other constraints, and typically we favor a List<T> even over arrays unless we are sure the size will remain absolutely fixed. The LinkedList<T> is a basic implementation of a doubly-linked list. This means that you can add or remove items in the middle of a linked list very quickly (because there's no items to move up or down in contiguous memory), but you also lose the ability to index items by position quickly. Most of the time we tend to favor List<T> over LinkedList<T> unless you are doing a lot of adding and removing from the collection, in which case a LinkedList<T> may make more sense. The HashSet<T> is an unordered collection of unique items. This means that the collection cannot have duplicates and no order is maintained. Logically, this is very similar to having a Dictionary<TKey,TValue> where the TKey and TValue both refer to the same object. This collection is very useful for maintaining a collection of items you wish to check membership against. For example, if you receive an order for a given vendor code, you may want to check to make sure the vendor code belongs to the set of vendor codes you handle. 
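Before moving on, here is a minimal sketch of that vendor-code membership check (the vendor codes themselves are made up purely for illustration):
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        // The set of vendor codes we handle; duplicates are ignored and no order is kept.
        var knownVendors = new HashSet<string> { "ACME", "GLOBEX", "INITECH" };

        string incomingVendorCode = "GLOBEX";

        // Contains is a hash lookup - O(1) - which is the whole point of HashSet<T>.
        if (knownVendors.Contains(incomingVendorCode))
        {
            Console.WriteLine("Vendor is handled, process the order.");
        }
        else
        {
            Console.WriteLine("Unknown vendor, reject the order.");
        }
    }
}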
In these cases a HashSet<T> is useful for super-quick lookups where order is not important. Once again, like in Dictionary, the type T should have a valid implementation of GetHashCode() and Equals(), or you should provide an appropriate IEqualityComparer<T> to the HashSet<T> on construction. The SortedSet<T> is to HashSet<T> what the SortedDictionary<TKey,TValue> is to Dictionary<TKey,TValue>. That is, the SortedSet<T> is a binary tree where the key and value are the same object. This once again means that adding/removing/lookups are logarithmic - O(log n) - but you gain the ability to iterate over the items in order. For this collection to be effective, type T must implement IComparable<T> or you need to supply an external IComparer<T>. Finally, the Stack<T> and Queue<T> are two very specific collections that allow you to handle a sequential collection of objects in very specific ways. The Stack<T> is a last-in-first-out (LIFO) container where items are added and removed from the top of the stack. Typically this is useful in situations where you want to stack actions and then be able to undo those actions in reverse order as needed. The Queue<T> on the other hand is a first-in-first-out container which adds items at the end of the queue and removes items from the front. This is useful for situations where you need to process items in the order in which they came, such as a print spooler or waiting lines. So that's the basic collections. Let's summarize what we've learned in a quick reference table.
Collection | Ordered? | Contiguous Storage? | Direct Access? | Lookup Efficiency | Manipulate Efficiency | Notes
Dictionary | No | Yes | Via Key | Key: O(1) | O(1) | Best for high performance lookups.
SortedDictionary | Yes | No | Via Key | Key: O(log n) | O(log n) | Compromise of Dictionary speed and ordering, uses binary search tree.
SortedList | Yes | Yes | Via Key | Key: O(log n) | O(n) | Very similar to SortedDictionary, except tree is implemented in an array, so has faster lookup on preloaded data, but slower loads.
List | No | Yes | Via Index | Index: O(1), Value: O(n) | O(n) | Best for smaller lists where direct access required and no ordering.
LinkedList | No | No | No | Value: O(n) | O(1) | Best for lists where inserting/deleting in middle is common and no direct access required.
HashSet | No | Yes | Via Key | Key: O(1) | O(1) | Unique unordered collection, like a Dictionary except key and value are same object.
SortedSet | Yes | No | Via Key | Key: O(log n) | O(log n) | Unique ordered collection, like SortedDictionary except key and value are same object.
Stack | No | Yes | Only Top | Top: O(1) | O(1)* | Essentially same as List<T> except only process as LIFO
Queue | No | Yes | Only Front | Front: O(1) | O(1) | Essentially same as List<T> except only process as FIFO
The Original Collections: System.Collections namespace
The original collection classes are largely considered deprecated by developers and by Microsoft itself. In fact they indicate that for the most part you should always favor the generic or concurrent collections, and only use the original collections when you are dealing with legacy .NET code. Because these collections are out of vogue, let's just briefly mention the original collections and their generic equivalents:
ArrayList - A dynamic, contiguous collection of objects. Favor the generic collection List<T> instead.
Hashtable - Associative, unordered collection of key-value pairs of objects. Favor the generic collection Dictionary<TKey,TValue> instead.
Queue - First-in-first-out (FIFO) collection of objects. Favor the generic collection Queue<T> instead. 
SortedList - Associative, ordered collection of key-value pairs of objects. Favor the generic collection SortedList<TKey,TValue> instead.
Stack - Last-in-first-out (LIFO) collection of objects. Favor the generic collection Stack<T> instead.
In general, the older collections are non-type-safe and in some cases less performant than their generic counterparts. Once again, the only reason you should fall back on these older collections is for backward compatibility with legacy code and libraries only.
The Concurrent Collections: System.Collections.Concurrent namespace
The concurrent collections are new as of .NET 4.0 and are included in the System.Collections.Concurrent namespace. These collections are optimized for use in situations where multi-threaded read and write access of a collection is desired. The concurrent queue, stack, and dictionary work much as you'd expect. The bag and blocking collection are more unique. Below is the summary of each with a link to a blog post I did on each of them.
ConcurrentQueue - Thread-safe version of a queue (FIFO). For more information see: C#/.NET Little Wonders: The ConcurrentStack and ConcurrentQueue
ConcurrentStack - Thread-safe version of a stack (LIFO). For more information see: C#/.NET Little Wonders: The ConcurrentStack and ConcurrentQueue
ConcurrentBag - Thread-safe unordered collection of objects. Optimized for situations where a thread may be both reader and writer. For more information see: C#/.NET Little Wonders: The ConcurrentBag and BlockingCollection
ConcurrentDictionary - Thread-safe version of a dictionary. Optimized for multiple readers (allows multiple readers under same lock). For more information see C#/.NET Little Wonders: The ConcurrentDictionary
BlockingCollection - Wrapper collection that implements the producer/consumer paradigm. Readers can block until items are available to read. Writers can block until space is available to write (if bounded). For more information see C#/.NET Little Wonders: The ConcurrentBag and BlockingCollection
Summary
The .NET BCL has lots of collections built in to help you store and manipulate collections of data. Understanding how these collections work and knowing in which situations each container is best is one of the key skills necessary to build more performant code. Choosing the wrong collection for the job can make your code much slower or even harder to maintain if you choose one that doesn’t perform as well or otherwise doesn’t exactly fit the situation. Remember to avoid the original collections and stick with the generic collections. If you need concurrent access, you can use the generic collections if the data is read-only, or consider the concurrent collections for mixed-access if you are running on .NET 4.0 or higher.
Technorati Tags: C#,.NET,Collections,Generic,Concurrent,Dictionary,List,Stack,Queue,SortedList,SortedDictionary,HashSet,SortedSet
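As a small addendum to the concurrent collections summary above (this example is mine, not from the article), here is a minimal producer/consumer sketch using BlockingCollection<T>:
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // Bounded to 100 items: the producer blocks on Add when the buffer is full.
        var queue = new BlockingCollection<int>(boundedCapacity: 100);

        var producer = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 10; i++)
            {
                queue.Add(i);
            }
            queue.CompleteAdding(); // signal the consumer that no more items are coming
        });

        var consumer = Task.Factory.StartNew(() =>
        {
            // GetConsumingEnumerable blocks until an item is available and ends
            // once CompleteAdding has been called and the queue is drained.
            foreach (int item in queue.GetConsumingEnumerable())
            {
                Console.WriteLine(item);
            }
        });

        Task.WaitAll(producer, consumer);
    }
}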

    Read the article

  • Dual booting on separate hard drives

    - by tornadorider
    I have windows XP professional installed on 1 hard drive and Ubuntu 10.10 on my second hard drive. On start up the computer completely skips the grub menu and boots straight into 10.10. I have tried running os-prober with the windows hard drive mounted and then updating grub but it didnt work. Any ideas? I have changed the boot order so that the HDD with xp on it is first however the computer still booted into linux. I tried running grub-install /dev/sda and got this /usr/sbin/grub-setup: warn: Sector 32 is already in use by FlexNet; avoiding it. This software may cause boot or other problems in future. Please ask its authors not to store data in the boot track.. /usr/sbin/grub-setup: warn: Sector 33 is already in use by FlexNet; avoiding it. This software may cause boot or other problems in future. Please ask its authors not to store data in the boot track.. Installation finished. No error reported I checked using disk utility and the code for my xp hard drive is sdb so i ran the camand grub-install /dev/sdb shich gave me this Installation finished. No error reported. So i rebooted but it still didnt work. Any other ideas? Additional info gedit /boot/grub/grub.cfg: # # DO NOT EDIT THIS FILE # # It is automatically generated by grub-mkconfig using templates # from /etc/grub.d and settings from /etc/default/grub # ### BEGIN /etc/grub.d/00_header ### if [ -s $prefix/grubenv ]; then set have_grubenv=true load_env fi set default="0" if [ "${prev_saved_entry}" ]; then set saved_entry="${prev_saved_entry}" save_env saved_entry set prev_saved_entry= save_env prev_saved_entry set boot_once=true fi function savedefault { if [ -z "${boot_once}" ]; then saved_entry="${chosen}" save_env saved_entry fi } function recordfail { set recordfail=1 if [ -n "${have_grubenv}" ]; then if [ -z "${boot_once}" ]; then save_env recordfail; fi; fi } function load_video { insmod vbe insmod vga } insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c if loadfont /usr/share/grub/unicode.pf2 ; then set gfxmode=640x480 load_video insmod gfxterm fi terminal_output gfxterm insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c set locale_dir=($root)/boot/grub/locale set lang=en insmod gettext if [ "${recordfail}" = 1 ]; then set timeout=-1 else set timeout=10 fi ### END /etc/grub.d/00_header ### ### BEGIN /etc/grub.d/05_debian_theme ### set menu_color_normal=white/black set menu_color_highlight=black/light-gray ### END /etc/grub.d/05_debian_theme ### ### BEGIN /etc/grub.d/10_linux ### menuentry 'Ubuntu, with Linux 2.6.35-28-generic' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c linux /boot/vmlinuz-2.6.35-28-generic root=UUID=d682c9bd-dd89-4827-9802-a1f921ebe21c ro quiet splash initrd /boot/initrd.img-2.6.35-28-generic } menuentry 'Ubuntu, with Linux 2.6.35-28-generic (recovery mode)' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c echo 'Loading Linux 2.6.35-28-generic ...' linux /boot/vmlinuz-2.6.35-28-generic root=UUID=d682c9bd-dd89-4827-9802-a1f921ebe21c ro single echo 'Loading initial ramdisk ...' 
initrd /boot/initrd.img-2.6.35-28-generic } menuentry 'Ubuntu, with Linux 2.6.35-22-generic' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c linux /boot/vmlinuz-2.6.35-22-generic root=UUID=d682c9bd-dd89-4827-9802-a1f921ebe21c ro quiet splash initrd /boot/initrd.img-2.6.35-22-generic } menuentry 'Ubuntu, with Linux 2.6.35-22-generic (recovery mode)' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c echo 'Loading Linux 2.6.35-22-generic ...' linux /boot/vmlinuz-2.6.35-22-generic root=UUID=d682c9bd-dd89-4827-9802-a1f921ebe21c ro single echo 'Loading initial ramdisk ...' initrd /boot/initrd.img-2.6.35-22-generic } ### END /etc/grub.d/10_linux ### ### BEGIN /etc/grub.d/20_linux_xen ### ### END /etc/grub.d/20_linux_xen ### ### BEGIN /etc/grub.d/20_memtest86+ ### menuentry "Memory test (memtest86+)" { insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c linux16 /boot/memtest86+.bin } menuentry "Memory test (memtest86+, serial console 115200)" { insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c linux16 /boot/memtest86+.bin console=ttyS0,115200n8 } ### END /etc/grub.d/20_memtest86+ ### ### BEGIN /etc/grub.d/30_os-prober ### if [ "x${timeout}" != "x-1" ]; then if keystatus; then if keystatus --shift; then set timeout=-1 else set timeout=0 fi else if sleep --interruptible 3 ; then set timeout=0 fi fi fi ### END /etc/grub.d/30_os-prober ### ### BEGIN /etc/grub.d/40_custom ### # This file provides an easy way to add custom menu entries. Simply type the # menu entries you want to add after this comment. Be careful not to change # the 'exec tail' line above. ### END /etc/grub.d/40_custom ### ### BEGIN /etc/grub.d/41_custom ### if [ -f $prefix/custom.cfg ]; then source $prefix/custom.cfg; fi ### END /etc/grub.d/41_custom ### sudo fdisk -l: Disk /dev/sda: 80.1 GB, 80060424192 bytes 255 heads, 63 sectors/track, 9733 cylinders Units = cylinders of 16065 * 512 = 8225280 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x0008a483 Device Boot Start End Blocks Id System /dev/sda1 * 1 9352 75112448 83 Linux /dev/sda2 9352 9734 3068929 5 Extended /dev/sda5 9352 9734 3068928 82 Linux swap / Solaris Disk /dev/sdb: 500.1 GB, 500107862016 bytes 255 heads, 63 sectors/track, 60801 cylinders Units = cylinders of 16065 * 512 = 8225280 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0xc5d6c5d6 Device Boot Start End Blocks Id System /dev/sdb1 1 60800 488375968+ 7 HPFS/NTFS sudo blkid /dev/sda1: UUID="d682c9bd-dd89-4827-9802-a1f921ebe21c" TYPE="ext4" /dev/sda5: UUID="09e9c2cb-d903-4f0b-a181-536951845231" TYPE="swap" /dev/sdb1: UUID="B21844EB1844AFE1" TYPE="ntfs" sudo os-prober (nothing) Boot Info Script 0.55 dated February 15th, 2010 ============================= Boot Info Summary: ============================== => Grub 2 is installed in the MBR of /dev/sda and looks on the same drive in partition #1 for (,msdos1)/boot/grub. => Grub 2 is installed in the MBR of /dev/sdb and looks on the same drive in partition #1 for (,msdos1)/boot/grub. 
sda1: _________________________________________________________________________ File system: ext4 Boot sector type: - Boot sector info: Operating System: Ubuntu 10.10 Boot files/dirs: /boot/grub/grub.cfg /etc/fstab /boot/grub/core.img sda2: _________________________________________________________________________ File system: Extended Partition Boot sector type: Unknown Boot sector info: sda5: _________________________________________________________________________ File system: swap Boot sector type: - Boot sector info: sdb1: _________________________________________________________________________ File system: ntfs Boot sector type: Windows XP Boot sector info: No errors found in the Boot Parameter Block. Operating System: Windows XP Boot files/dirs: =========================== Drive/Partition Info: ============================= Drive: sda ___________________ _____________________________________________________ Disk /dev/sda: 80.1 GB, 80060424192 bytes 255 heads, 63 sectors/track, 9733 cylinders, total 156368016 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes Partition Boot Start End Size Id System /dev/sda1 * 2,048 150,226,943 150,224,896 83 Linux /dev/sda2 150,228,990 156,366,847 6,137,858 5 Extended /dev/sda5 150,228,992 156,366,847 6,137,856 82 Linux swap / Solaris Drive: sdb ___________________ _____________________________________________________ Disk /dev/sdb: 500.1 GB, 500107862016 bytes 255 heads, 63 sectors/track, 60801 cylinders, total 976773168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes Partition Boot Start End Size Id System /dev/sdb1 * 63 976,751,999 976,751,937 7 HPFS/NTFS blkid -c /dev/null: ____________________________________________________________ Device UUID TYPE LABEL /dev/sda1 d682c9bd-dd89-4827-9802-a1f921ebe21c ext4 /dev/sda2: PTTYPE="dos" /dev/sda5 09e9c2cb-d903-4f0b-a181-536951845231 swap /dev/sda: PTTYPE="dos" /dev/sdb1 B21844EB1844AFE1 ntfs /dev/sdb: PTTYPE="dos" ============================ "mount | grep ^/dev output: =========================== Device Mount_Point Type Options /dev/sda1 / ext4 (rw,errors=remount-ro,commit=0) =========================== sda1/boot/grub/grub.cfg: =========================== # # DO NOT EDIT THIS FILE # # It is automatically generated by grub-mkconfig using templates # from /etc/grub.d and settings from /etc/default/grub # ### BEGIN /etc/grub.d/00_header ### if [ -s $prefix/grubenv ]; then set have_grubenv=true load_env fi set default="0" if [ "${prev_saved_entry}" ]; then set saved_entry="${prev_saved_entry}" save_env saved_entry set prev_saved_entry= save_env prev_saved_entry set boot_once=true fi function savedefault { if [ -z "${boot_once}" ]; then saved_entry="${chosen}" save_env saved_entry fi } function recordfail { set recordfail=1 if [ -n "${have_grubenv}" ]; then if [ -z "${boot_once}" ]; then save_env recordfail; fi; fi } function load_video { insmod vbe insmod vga } insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c if loadfont /usr/share/grub/unicode.pf2 ; then set gfxmode=640x480 load_video insmod gfxterm fi terminal_output gfxterm insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c set locale_dir=($root)/boot/grub/locale set lang=en insmod gettext if [ "${recordfail}" = 1 ]; then set timeout=-1 else set timeout=10 fi ### END /etc/grub.d/00_header ### ### 
BEGIN /etc/grub.d/05_debian_theme ### set menu_color_normal=white/black set menu_color_highlight=black/light-gray ### END /etc/grub.d/05_debian_theme ### ### BEGIN /etc/grub.d/10_linux ### menuentry 'Ubuntu, with Linux 2.6.35-28-generic' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c linux /boot/vmlinuz-2.6.35-28-generic root=UUID=d682c9bd-dd89-4827-9802-a1f921ebe21c ro quiet splash initrd /boot/initrd.img-2.6.35-28-generic } menuentry 'Ubuntu, with Linux 2.6.35-28-generic (recovery mode)' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c echo 'Loading Linux 2.6.35-28-generic ...' linux /boot/vmlinuz-2.6.35-28-generic root=UUID=d682c9bd-dd89-4827-9802-a1f921ebe21c ro single echo 'Loading initial ramdisk ...' initrd /boot/initrd.img-2.6.35-28-generic } menuentry 'Ubuntu, with Linux 2.6.35-22-generic' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c linux /boot/vmlinuz-2.6.35-22-generic root=UUID=d682c9bd-dd89-4827-9802-a1f921ebe21c ro quiet splash initrd /boot/initrd.img-2.6.35-22-generic } menuentry 'Ubuntu, with Linux 2.6.35-22-generic (recovery mode)' --class ubuntu --class gnu-linux --class gnu --class os { recordfail insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c echo 'Loading Linux 2.6.35-22-generic ...' linux /boot/vmlinuz-2.6.35-22-generic root=UUID=d682c9bd-dd89-4827-9802-a1f921ebe21c ro single echo 'Loading initial ramdisk ...' initrd /boot/initrd.img-2.6.35-22-generic } ### END /etc/grub.d/10_linux ### ### BEGIN /etc/grub.d/20_linux_xen ### ### END /etc/grub.d/20_linux_xen ### ### BEGIN /etc/grub.d/20_memtest86+ ### menuentry "Memory test (memtest86+)" { insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c linux16 /boot/memtest86+.bin } menuentry "Memory test (memtest86+, serial console 115200)" { insmod part_msdos insmod ext2 set root='(hd0,msdos1)' search --no-floppy --fs-uuid --set d682c9bd-dd89-4827-9802-a1f921ebe21c linux16 /boot/memtest86+.bin console=ttyS0,115200n8 } ### END /etc/grub.d/20_memtest86+ ### ### BEGIN /etc/grub.d/30_os-prober ### if [ "x${timeout}" != "x-1" ]; then if keystatus; then if keystatus --shift; then set timeout=-1 else set timeout=0 fi else if sleep --interruptible 3 ; then set timeout=0 fi fi fi ### END /etc/grub.d/30_os-prober ### ### BEGIN /etc/grub.d/40_custom ### # This file provides an easy way to add custom menu entries. Simply type the # menu entries you want to add after this comment. Be careful not to change # the 'exec tail' line above. menuentry "Windows XP" { set root=(hd1,1) chainloader (hd1,1)+1 } ### END /etc/grub.d/40_custom ### ### BEGIN /etc/grub.d/41_custom ### if [ -f $prefix/custom.cfg ]; then source $prefix/custom.cfg; fi ### END /etc/grub.d/41_custom ### =============================== sda1/etc/fstab: =============================== # /etc/fstab: static file system information. 
# # Use 'blkid -o value -s UUID' to print the universally unique identifier # for a device; this may be used with UUID= as a more robust way to name # devices that works even if disks are added and removed. See fstab(5). # # <file system> <mount point> <type> <options> <dump> <pass> proc /proc proc nodev,noexec,nosuid 0 0 /dev/sda1 / ext4 errors=remount-ro 0 1 # swap was on /dev/sda5 during installation UUID=09e9c2cb-d903-4f0b-a181-536951845231 none swap sw 0 0 =================== sda1: Location of files loaded by Grub: =================== 51.7GB: boot/grub/core.img 58.5GB: boot/grub/grub.cfg 1.2GB: boot/initrd.img-2.6.35-22-generic 1.3GB: boot/initrd.img-2.6.35-28-generic 58.2GB: boot/vmlinuz-2.6.35-22-generic 51.7GB: boot/vmlinuz-2.6.35-28-generic 1.3GB: initrd.img 1.2GB: initrd.img.old 51.7GB: vmlinuz 58.2GB: vmlinuz.old =========================== Unknown MBRs/Boot Sectors/etc ======================= Unknown BootLoader on sda2 00000000 d9 ed 13 ab ff a8 33 8c 01 b2 47 99 e1 4a b1 f1 |......3...G..J..| 00000010 69 5f a7 29 a4 1a 03 9e 31 b9 45 02 71 e6 58 78 |i_.)....1.E.q.Xx| 00000020 3d f6 ee 7b 3e 33 1b 82 c6 7d cf 1a c8 e7 bc 2f |=..{>3...}...../| 00000030 b9 e1 70 75 cf 18 aa e7 d5 7e 3c f1 b4 e7 9e 3a |..pu.....~<....:| 00000040 55 38 f1 b4 ee 78 59 0b 5e f7 3c 4c 57 73 9c 2a |U8...xY.^.<LWs.*| 00000050 28 f1 19 ed 11 9c b2 19 e2 80 92 1c 7b 84 ee 0b |(...........{...| 00000060 e2 c0 ac af 0a 50 42 b9 cf 0c dc 2c 20 77 85 dc |.....PB...., w..| 00000070 8f 70 5f 7b 84 9b a1 f7 8c 2d ee 70 5c ae f7 39 |.p_{.....-.p\..9| 00000080 63 f7 09 8a ec 79 4c ed 9f cc ad 3c f8 1b 47 7d |c....yL....<..G}| 00000090 3f 97 d5 16 cb 29 45 38 25 61 36 08 de 10 93 0f |?....)E8%a6.....| 000000a0 95 4f ea 54 f9 89 ff f1 bf 9a cc bb fd b6 22 b1 |.O.T..........".| 000000b0 65 08 05 21 78 19 46 b0 24 7e fb de d4 b3 ba d6 |e..!x.F.$~......| 000000c0 ec 11 65 82 ee 10 1d 12 04 91 da 6d 67 47 ea 9b |..e........mgG..| 000000d0 6f b0 aa fb cb 67 10 64 86 e8 26 85 fb f9 50 77 |o....g.d..&...Pw| 000000e0 9d 13 9b 9e d9 11 f3 a1 50 1b 11 b7 93 79 9f ab |........P....y..| 000000f0 c1 b6 86 0f 35 ed d4 9f dc f8 db bd ed 45 3a 68 |....5........E:h| 00000100 54 68 4a 1d d1 fc b8 c9 72 b4 d7 7b 60 e7 39 2f |ThJ.....r..{`.9/| 00000110 2a 0a 4e 52 72 52 c6 e2 2a 55 6a 2a e1 82 40 71 |*.NRrR..*Uj*..@q| 00000120 11 11 e0 53 d6 ff 1b a9 c6 65 df 1e b7 15 6f a2 |...S.....e....o.| 00000130 15 02 a4 6d 19 b7 78 57 a6 ee 9e 36 08 7d 6f 7c |...m..xW...6.}o|| 00000140 fd f7 7c d5 40 ff 0f c7 97 dc aa 00 ce 8b bb dc |..|.@...........| 00000150 e2 eb 1c 50 74 d8 14 cc 9a d6 5c a2 ab f2 67 f9 |...Pt.....\...g.| 00000160 58 ed 43 79 0e 78 7a 5c a6 f8 7b e8 05 4e 62 8a |X.Cy.xz\..{..Nb.| 00000170 0a 5f 22 ee a6 38 b9 e1 32 45 97 08 cc 75 66 c6 |._"..8..2E...uf.| 00000180 b3 a2 2d 89 a1 e9 95 21 28 53 fd dd be b1 b2 a2 |..-....!(S......| 00000190 78 3f a3 c9 3d e3 31 54 88 cf 78 0d e1 21 a8 74 |x?..=.1T..x..!.t| 000001a0 06 60 9d 21 c6 7a 24 e1 cc 28 f8 98 e0 99 e3 fc |.`.!.z$..(......| 000001b0 fa 8b eb d5 56 03 20 b8 54 ba c6 ee 9f 57 00 fe |....V. .T....W..| 000001c0 ff ff 82 fe ff ff 02 00 00 00 00 a8 5d 00 00 00 |............]...| 000001d0 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................| * 000001f0 00 00 00 00 00 00 00 00 00 00 00 00 00 00 55 aa |..............U.| 00000200

    Read the article

  • MVVM in Task-It

    As I'm gearing up to write a post about dynamic XAP loading with MEF, I'd like to first talk a bit about MVVM, the Model-View-ViewModel pattern, as I will be leveraging this pattern in my future posts. Download Source Code Why MVVM? Your first question may be, "why do I need this pattern? I've been using a code-behind approach for years and it works fine." Well, you really don't have to make the switch to MVVM, but let me first explain some of the benefits I see for doing so. MVVM Benefits Testability - This is the one you'll probably hear the most about when it comes to MVVM. Moving most of the code from your code-behind to a separate view model class means you can now write unit tests against the view model without any knowledge of a view (UserControl). Multiple UIs - Let's just say that you've created a killer app, it's running in the browser, and maybe you've even made it run out-of-browser. Now what if your boss comes to you and says, "I heard about this new Windows Phone 7 device that is coming out later this year. Can you start porting the app to that device?". Well, now you have to create a new UI (UserControls, etc.) because you have a lot less screen real estate to work with. So what do you do, copy all of your existing UserControls, paste them, rename them, and then start changing the code? Hmm, that doesn't sound so good. But wait, if most of the code that makes your browser-based app tick lives in view model classes, now you can create new view (UserControls) for Windows Phone 7 that reference the same view model classes as your browser-based app. Page state - In Silverlight you're at some point going to be faced with the same issue you dealt with for years in ASP.NET, maintaining page state. Let's say a user hits your Products page, does some stuff (filters record, etc.), then leaves the page and comes back later. It would be best if the Products page was in the same state as when they left it right? Well, if you've thrown away your view (UserControl or Page) and moved off to another part of the UI, when you come back to Products you're probably going to re-instantiate your view...which will put it right back in the state it was when it started. Hmm, not good. Well, with a little help from MEF you can store the state in your view model class, MEF will keep that view model instance hanging around in memory, and then you simply rebind your view to the view model class. I made that sound easy, but it's actually a bit of work to properly store and restore the state. At least it can be done though, which will make your users a lot happier! I'll talk more about this in an upcoming blog post. No event handlers? Another nice thing about MVVM is that you can bind your UserControls to the view model, which may eliminate the need for event handlers in your code-behind. So instead of having a Click handler on a Button (or RadMenuItem), for example, you can now bind your control's Command property to a DelegateCommand in your view model (I'll talk more about Commands in an upcoming post). Instead of having a SelectionChanged event handler on your RadGridView you can now bind its SelectedItem property to a property in your view model, and each time the user clicks a row, the view model property's setter will be called. 
Now through the magic of binding we can eliminate the need for traditional code-behind based event handlers on our user interface controls, and the best thing is that the view model knows about everything that's going on...which means we can test things without a user interface. The brains of the operation So what we're seeing here is that the view is now just a dumb layer that binds to the view model, and that the view model is in control of just about everything, like what happens when a RadGridView row is selected, or when a RadComboBoxItem is selected, or when a RadMenuItem is clicked. It is also responsible for loading data when the page is hit, as well as kicking off data inserts, updates and deletions. Once again, all of this stuff can be tested without the need for a user interface. If the test works, then it'll work regardless of whether the user is hitting the browser-based version of your app, or the Windows Phone 7 version. Nice! The database Before running the code for this app you will need to create the database. First, create a database called MVVMProject in SQL Server, then run MVVMProject.sql in the MVVMProject/Database directory of your downloaded .zip file. This should give you a Task table with 3 records in it. When you fire up the solution you will also need to update the connection string in web.config to point to your database instead of IBM12\SQLSERVER2008. The code One note about this code is that it runs against the latest Silverlight 4 RC and WCF RIA Services code. Please see my first blog post about updating to the RC bits. Beta to RC - Part 1 At the top of this post is a link to a sample project that demonstrates a sample application with a Tasks page that uses the MVVM pattern. This is a simplified version of how I have implemented the Tasks page in the Task-It application. Youll notice that Tasks.xaml has very little code to it. Just a TextBlock that displays the page title and a ContentControl. <StackPanel>     <TextBlock Text="Tasks" Style="{StaticResource PageTitleStyle}"/>     <Rectangle Style="{StaticResource StandardSpacerStyle}"/>     <ContentControl x:Name="ContentControl1"/> </StackPanel> In List.xaml we have a RadGridView. Notice that the ItemsSource is bound to a property in the view model class call Tasks, SelectedItem is bound to a property in the view model called SelectedItem, and IsBusy is bound to a property in the view model called IsLoading. <Grid>     <telerikGridView:RadGridView ItemsSource="{Binding Tasks}" SelectedItem="{Binding SelectedItem, Mode=TwoWay}"                                  IsBusy="{Binding IsLoading}" AutoGenerateColumns="False" IsReadOnly="True" RowIndicatorVisibility="Collapsed"                IsFilteringAllowed="False" ShowGroupPanel="False">         <telerikGridView:RadGridView.Columns>             <telerikGridView:GridViewDataColumn Header="Name" DataMemberBinding="{Binding Name}" Width="3*"/>             <telerikGridView:GridViewDataColumn Header="Due" DataMemberBinding="{Binding DueDate}" DataFormatString="{}{0:d}" Width="*"/>         </telerikGridView:RadGridView.Columns>     </telerikGridView:RadGridView> </Grid> In Details.xaml we have a Save button that is bound to a property called SaveCommand in our view model. We also have a simple form (Im using a couple of controls here from Silverlight.FX for the form layout, FormPanel and Label simply because they make for a clean XAML layout). Notice that the FormPanel is also bound to the SelectedItem in the view model (the same one that the RadGridView is). 
The two form controls (the TextBox and RadDatePicker) are bound to the SelectedItem's Name and DueDate properties. These are properties of the Task object that WCF RIA Services creates.

<StackPanel>
    <Button Content="Save" Command="{Binding SaveCommand}" HorizontalAlignment="Left"/>
    <Rectangle Style="{StaticResource StandardSpacerStyle}"/>
    <fxui:FormPanel DataContext="{Binding SelectedItem}" Style="{StaticResource FormContainerStyle}">
        <fxui:Label Text="Name:"/>
        <TextBox Text="{Binding Name, Mode=TwoWay}"/>
        <fxui:Label Text="Due:"/>
        <telerikInput:RadDatePicker SelectedDate="{Binding DueDate, Mode=TwoWay}"/>
    </fxui:FormPanel>
</StackPanel>

In the code-behind of the Tasks control, Tasks.xaml.cs, I created an instance of the view model class (TasksViewModel) in the constructor and set it as the DataContext for the control. The Tasks page will load one of two child UserControls depending on whether you are viewing the list of tasks (List.xaml) or the form for editing a task (Details.xaml).

// Set the DataContext to an instance of the view model class
var viewModel = new TasksViewModel();
DataContext = viewModel;

// Child user controls (inherit DataContext from this user control)
List = new List();       // RadGridView
Details = new Details(); // Form

When the page first loads, the List is loaded into the ContentControl.

// Show the RadGridView first
ContentControl1.Content = List;

In the code-behind we also listen for a couple of the view model's events. The ItemSelected event will be fired when the user clicks on a record in the RadGridView in the List control. The SaveCompleted event will be fired when the user clicks Save in the Details control (the form). Here the view model is in control, and is letting the view know when something needs to change.

// Listeners for the view model's events
viewModel.ItemSelected += OnItemSelected;
viewModel.SaveCompleted += OnSaveCompleted;

The event handlers toggle the view between the RadGridView (List) and the form (Details).

void OnItemSelected(object sender, RoutedEventArgs e)
{
    // Show the form
    ContentControl1.Content = Details;
}

void OnSaveCompleted(object sender, RoutedEventArgs e)
{
    // Show the RadGridView
    ContentControl1.Content = List;
}

In TasksViewModel, we instantiate a DataContext object and a SaveCommand in the constructor. DataContext is a WCF RIA Services object that we'll use to retrieve the list of Tasks and to save any changes to a task. I'll talk more about this and Commands in a future post, but for now think of the SaveCommand as an event handler that is called when the Save button in the form is clicked.

DataContext = new DataContext();
SaveCommand = new DelegateCommand(OnSave);

When the TasksViewModel constructor is called we also make a call to LoadTasks. This sets IsLoading to true (which causes the RadGridView's busy indicator to appear) and retrieves the records via WCF RIA Services.

public LoadOperation<Task> LoadTasks()
{
    // Show the loading message
    IsLoading = true;

    // Get the data via WCF RIA Services. When the call has returned, call OnTasksLoaded.
    return DataContext.Load(DataContext.GetTasksQuery(), OnTasksLoaded, false);
}

When the data is returned, OnTasksLoaded is called. This sets IsLoading to false (which hides the RadGridView's busy indicator), and fires property changed notifications to the UI to let it know that the IsLoading and Tasks properties have changed.
This property changed notification basically tells the UI to rebind.

void OnTasksLoaded(LoadOperation<Task> lo)
{
    // Hide the loading message
    IsLoading = false;

    // Notify the UI that Tasks and IsLoading properties have changed
    this.OnPropertyChanged(p => p.Tasks);
    this.OnPropertyChanged(p => p.IsLoading);
}

Next let's look at the view model's SelectedItem property. This is the one that's bound to both the RadGridView and the form. When the user clicks a record in the RadGridView its setter gets called (set a breakpoint and see what I mean). The other code in the setter lets the UI know that the SelectedItem has changed (so the form displays the correct data), and fires the event that notifies the UI that a selection has occurred (which tells the UI to switch from List to Details).

public Task SelectedItem
{
    get { return _selectedItem; }
    set
    {
        _selectedItem = value;

        // Let the UI know that the SelectedItem has changed (forces it to re-bind)
        this.OnPropertyChanged(p => p.SelectedItem);

        // Notify the UI, so it can switch to the Details (form) page
        NotifyItemSelected();
    }
}

One last thing, saving the data. When the Save button in the form is clicked it fires the SaveCommand, which calls the OnSave method in the view model (once again, set a breakpoint to see it in action).

public void OnSave()
{
    // Save the changes via WCF RIA Services. When the save is complete, call OnSaveCompleted.
    DataContext.SubmitChanges(OnSaveCompleted, null);
}

In OnSave, we tell WCF RIA Services to submit any changes, which there will be if you changed either the Name or the Due Date in the form. When the save is completed, it calls OnSaveCompleted. This method fires a notification back to the UI that the save is completed, which causes the RadGridView (List) to show again.

public virtual void OnSaveCompleted(SubmitOperation so)
{
    // Clear the item that is selected in the grid (in case we want to select it again)
    SelectedItem = null;

    // Notify the UI, so it can switch back to the List (RadGridView) page
    NotifySaveCompleted();
}
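The post defers a full discussion of Commands to a later article, so the DelegateCommand type used for SaveCommand above is never shown. For readers who want to follow along now, here is a minimal sketch of what such a class might look like. This is my addition, not the author's implementation; the DelegateCommand found in Prism and similar libraries is richer (generic command parameters, CanExecute plumbing, and so on).

using System;
using System.Windows.Input;

// Minimal ICommand wrapper around a plain delegate.
public class DelegateCommand : ICommand
{
    private readonly Action _execute;
    private readonly Func<bool> _canExecute;

    public DelegateCommand(Action execute, Func<bool> canExecute = null)
    {
        if (execute == null) throw new ArgumentNullException("execute");
        _execute = execute;
        _canExecute = canExecute;
    }

    public event EventHandler CanExecuteChanged;

    public bool CanExecute(object parameter)
    {
        return _canExecute == null || _canExecute();
    }

    public void Execute(object parameter)
    {
        _execute();
    }

    // Call this when the view should re-evaluate CanExecute (e.g. enable/disable the Save button).
    public void RaiseCanExecuteChanged()
    {
        var handler = CanExecuteChanged;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}

With a class like this, the view model's SaveCommand = new DelegateCommand(OnSave); line and the Button's Command="{Binding SaveCommand}" binding are all that is needed to route the click to OnSave.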

    Read the article

  • Azure WNS to Win8 - Push Notifications for Metro Apps

    - by JoshReuben
    Background The Windows Azure Toolkit for Windows 8 allows you to build a Windows Azure Cloud Service that can send Push Notifications to registered Metro apps via Windows Notification Service (WNS). Some configuration is required - you need to: Register the Metro app for Windows Live Application Management Provide Package SID & Client Secret to WNS Modify the Azure Cloud App cscfg file and the Metro app package.appxmanifest file to contain matching Metro package name, SID and client secret. The Mechanism: These notifications take the form of XAML Tile, Toast, Raw or Badge UI notifications. The core engine is provided via the WNS nuget recipe, which exposes an API for constructing payloads and posting notifications to WNS. An application receives push notifications by requesting a notification channel from WNS, which returns a channel URI that the application then registers with a cloud service. In the cloud service, A WnsAccessTokenProvider authenticates with WNS by providing its credentials, the package SID and secret key, and receives in return an access token that the provider caches and can reuse for multiple notification requests. The cloud service constructs a notification request by filling out a template class that contains the information that will be sent with the notification, including text and image references. Using the channel URI of a registered client, the cloud service can then send a notification whenever it has an update for the user. The package contains the NotificationSendUtils class for submitting notifications. The Windows Azure Toolkit for Windows 8 (WAT) provides the PNWorker sample pair of solutions - The Azure server side contains a WebRole & a WorkerRole. The WebRole allows submission of new push notifications into an Azure Queue which the WorkerRole extracts and processes. Further background resources: http://watwindows8.codeplex.com/ - Windows Azure Toolkit for Windows 8 http://watwindows8.codeplex.com/wikipage?title=Push%20Notification%20Worker%20Sample - WAT WNS sample setup http://watwindows8.codeplex.com/wikipage?title=Using%20the%20Windows%208%20Cloud%20Application%20Services%20Application – using Windows 8 with Cloud Application Services A bit of Configuration Register the Metro apps for Windows Live Application Management From the current app manifest of your metro app Publish tab, copy the Package Display Name and the Publisher From: https://manage.dev.live.com/Build/ Package name: <-- we need to change this Client secret: keep this Package Security Identifier (SID): keep this Verify the app here: https://manage.dev.live.com/Applications/Index - so this step is done "If you wish to send push notifications in your application, provide your Package Security Identifier (SID) and client secret to WNS." Provide Package SID & Client Secret to WNS http://msdn.microsoft.com/en-us/library/windows/apps/hh465407.aspx - How to authenticate with WNS https://appdev.microsoft.com/StorePortals/en-us/Account/Signup/PurchaseSubscription - register app with dashboard - need registration code or register a new account & pay $170 shekels http://msdn.microsoft.com/en-us/library/windows/apps/hh868184.aspx - Registering for a Windows Store developer account http://msdn.microsoft.com/en-us/library/windows/apps/hh868187.aspx - Picking a Microsoft account for the Windows Store The WNS Nuget Recipe The WNS Recipe is a nuget package that provides an API for authenticating against WNS, constructing payloads and posting notifications to WNS. 
After installing this package, a WnsRecipe assembly is added to project references. To send notifications using WNS, first register the application at the Windows Push Notifications & Live Connect portal to obtain Package Security Identifier (SID) and a secret key that your cloud service uses to authenticate with WNS. An application receives push notifications by requesting a notification channel from WNS, which returns a channel URI that the application then registers with a cloud service. In the cloud service, the WnsAccessTokenProvider authenticates with WNS by providing its credentials, the package SID and secret key, and receives in return an access token that the provider caches and can reuse for multiple notification requests. The cloud service constructs a notification request by filling out a template class that contains the information that will be sent with the notification, including text and image references.Using the channel URI of a registered client, the cloud service can then send a notification whenever it has an update for the user. var provider = new WnsAccessTokenProvider(clientId, clientSecret); var notification = new ToastNotification(provider) {     ToastType = ToastType.ToastText02,     Text = new List<string> { "blah"} }; notification.Send(channelUri); the WNS Recipe is instrumented to write trace information via a trace listener – configuratively or programmatically from Application_Start(): WnsDiagnostics.Enable(); WnsDiagnostics.TraceSource.Listeners.Add(new DiagnosticMonitorTraceListener()); WnsDiagnostics.TraceSource.Switch.Level = SourceLevels.Verbose; The WAT PNWorker Sample The Azure server side contains a WebRole & a WorkerRole. The WebRole allows submission of new push notifications into an Azure Queue which the WorkerRole extracts and processes. Overview of Push Notification Worker Sample The toolkit includes a sample application based on the same solution structure as the one created by theWindows 8 Cloud Application Services project template. The sample demonstrates how to off-load the job of sending Windows Push Notifications using a Windows Azure worker role. You can find the source code in theSamples\PNWorker folder. This folder contains a full version of the sample application showing how to use Windows Push Notifications using ASP.NET Membership as the authentication mechanism. The sample contains two different solution files: WATWindows.Azure.sln: This solution must be opened with Visual Studio 2010 and contains the projects related to the Windows Azure web and worker roles. WATWindows.Client.sln: This solution must be opened with Visual Studio 11 and contains the Windows Metro style application project. Only Visual Studio 2010 supports Windows Azure cloud projects so you currently need to use this edition to launch the server application. This will change in a future release of the Windows Azure tools when support for Visual Studio 11 is enabled. Important: Setting up the PNWorker Sample Before running the PNWorker sample, you need to register the application and configure it: 1. Register the app: To register your application, go to the Windows Live Application Management site for Metro style apps at https://manage.dev.live.com/build and sign in with your Windows Live ID. In the Windows Push Notifications & Live Connect page, enter the following information. Package Display Name PNWorker.Sample Publisher CN=127.0.0.1, O=TESTING ONLY, OU=Windows Azure DevFabric 2. 3. 
Once you register the application, make a note of the values shown in the portal for Client Secret,Package Name and Package SID. 4. Configure the app - double-click the SetupSample.cmd file located inside the Samples\PNWorker folder to launch a tool that will guide you through the process of configuring the sample. setup runs a PowerShell script that requires running with administration privileges to allow the scripts to execute in your machine. When prompted, enter the Client Secret, Package Name, and Package Security Identifier you obtained previously and wait until the tool finishes configuring your sample. Running the PNWorker Sample To run this sample, you must run both the client and the server application projects. 1. Open Visual Studio 2010 as an administrator. Open the WATWindows.Azure.sln solution. Set the start-up project of the solution as the cloud project. Run the app in the dev fabric to test. 2. Open Visual Studio 11 and open the WATWindows.Client.sln solution. Run the Metro client application. In the client application, click Reopen channel and send to server. à the application opens the channel and registers it with the cloud application, & the Output area shows the channel URI. 3. Refresh the WebRole's Push Notifications page to see the UI list the newly registered client. 4. Send notifications to the client application by clicking the Send Notification button. Setup 3 command files + 1 powershell script: SetupSample.cmd –> SetupWPNS.vbs –> SetupWPNS.cmd –> SetupWPNS.UpdateWPNSCredentialsInServiceConfiguration.ps1 appears to set PackageName – from manifest Client Id package security id (SID) – from registration Client Secret – from registration The following configs are modified: WATWindows\ServiceConfiguration.Cloud.cscfg WATWindows\ServiceConfiguration.Local.cscfg WATWindows.Client\package.appxmanifest WatWindows.Notifications A class library – it references the following WNS DLL: C:\WorkDev\CountdownValue\AzureToolkits\WATWindows8\Samples\PNWorker\packages\WnsRecipe.0.0.3.0\lib\net40\WnsRecipe.dll NotificationJobRequest A DataContract for triggering notifications:     using System.Runtime.Serialization; using Microsoft.Windows.Samples.Notifications;     [DataContract]     [KnownType(typeof(WnsAccessTokenProvider))] public class NotificationJobRequest     {               [DataMember] public bool ProcessAsync { get; set; }          [DataMember] public string Payload { get; set; }         [DataMember] public string ChannelUrl { get; set; }         [DataMember] public NotificationType NotificationType { get; set; }         [DataMember] public IAccessTokenProvider AccessTokenProvider { get; set; }         [DataMember] public NotificationSendOptions NotificationSendOptions{ get; set; }     } Investigated these types: WnsAccessTokenProvider – a DataContract that contains the client Id and client secret NotificationType – an enum that can be: Tile, Toast, badge, Raw IAccessTokenProvider – get or reset the access token NotificationSendOptions – SecondsTTL, NotificationPriority (enum), isCache, isRequestForStatus, Tag   There is also a NotificationJobSerializer class which basically wraps a DataContractSerializer serialization / deserialization of NotificationJobRequest The WNSNotificationJobProcessor class This class wraps the NotificationSendUtils API – it periodically extracts any NotificationJobRequest objects from a CloudQueue and submits them to WNS. 
The ProcessJobMessageRequest method – this is the punchline: it will deserialize a CloudQueueMessage into a NotificationJobRequest & send pass its contents to NotificationUtils to SendAsynchronously / SendSynchronously, (and then dequeue the message).     public override void ProcessJobMessageRequest(CloudQueueMessage notificationJobMessageRequest)         { Trace.WriteLine("Processing a new Notification Job Request", "Information"); NotificationJobRequest pushNotificationJob =                 NotificationJobSerializer.Deserialize(notificationJobMessageRequest.AsString); if (pushNotificationJob != null)             { if (pushNotificationJob.ProcessAsync)                 { Trace.WriteLine("Sending the notification asynchronously", "Information"); NotificationSendUtils.SendAsynchronously( new Uri(pushNotificationJob.ChannelUrl),                         pushNotificationJob.AccessTokenProvider,                         pushNotificationJob.Payload,                         result => this.ProcessSendResult(pushNotificationJob, result),                         result => this.ProcessSendResultError(pushNotificationJob, result),                         pushNotificationJob.NotificationType,                         pushNotificationJob.NotificationSendOptions);                 } else                 { Trace.WriteLine("Sending the notification synchronously", "Information"); NotificationSendResult result = NotificationSendUtils.Send( new Uri(pushNotificationJob.ChannelUrl),                         pushNotificationJob.AccessTokenProvider,                         pushNotificationJob.Payload,                         pushNotificationJob.NotificationType,                         pushNotificationJob.NotificationSendOptions); this.ProcessSendResult(pushNotificationJob, result);                 }             } else             { Trace.WriteLine("Could not deserialize the notification job", "Error");             } this.queue.DeleteMessage(notificationJobMessageRequest);         } Investigation of NotificationSendUtils class - This is the engine – it exposes Send and a SendAsyncronously overloads that take the following params from the NotificationJobRequest: Channel Uri AccessTokenProvider Payload NotificationType NotificationSendOptions WebRole WebRole is a large MVC project – it references WatWindows.Notifications as well as the following WNS DLL: \AzureToolkits\WATWindows8\Samples\PNWorker\packages\WnsRecipe.0.0.3.0\lib\net40\NotificationsExtensions.dll Controllers\PushNotificationController.cs Notification related namespaces:     using Notifications;     using NotificationsExtensions;     using NotificationsExtensions.BadgeContent;     using NotificationsExtensions.RawContent;     using NotificationsExtensions.TileContent;     using NotificationsExtensions.ToastContent;     using Windows.Samples.Notifications; TokenProvider – initialized from the Azure RoleEnvironment:   IAccessTokenProvider tokenProvider = new WnsAccessTokenProvider(         RoleEnvironment.GetConfigurationSettingValue("WNSPackageSID"),         RoleEnvironment.GetConfigurationSettingValue("WNSClientSecret")); SendNotification method – calls QueuePushMessage method to create and serialize a NotificationJobRequest and enqueue it in a CloudQueue [HttpPost]         public ActionResult SendNotification(             [ModelBinder(typeof(NotificationTemplateModelBinder))] INotificationContent notification,             string channelUrl,             NotificationPriority priority = NotificationPriority.Normal)         {             var payload = 
notification.GetContent();             var options = new NotificationSendOptions()             {                 Priority = priority             };             var notificationType =                 notification is IBadgeNotificationContent ? NotificationType.Badge :                 notification is IRawNotificationContent ? NotificationType.Raw :                 notification is ITileNotificationContent ? NotificationType.Tile :                 NotificationType.Toast;             this.QueuePushMessage(payload, channelUrl, notificationType, options);             object response = new             {                 Status = "Queued for delivery to WNS"             };             return this.Json(response);         } GetSendTemplate method: Create the cshtml partial rendering based on the notification type     [HttpPost]         public ActionResult GetSendTemplate(NotificationTemplateViewModel templateOptions)         {             PartialViewResult result = null;             switch (templateOptions.NotificationType)             {                 case "Badge":                     templateOptions.BadgeGlyphValueContent = Enum.GetNames(typeof( GlyphValue));                     ViewBag.ViewData = templateOptions;                     result = PartialView("_" + templateOptions.NotificationTemplateType);                     break;                 case "Raw":                     ViewBag.ViewData = templateOptions;                     result = PartialView("_Raw");                     break;                 case "Toast":                     templateOptions.TileImages = this.blobClient.GetAllBlobsInContainer(ConfigReader.GetConfigValue("TileImagesContainer")).OrderBy(i => i.FileName).ToList();                     templateOptions.ToastAudioContent = Enum.GetNames(typeof( ToastAudioContent));                     templateOptions.Priorities = Enum.GetNames(typeof( NotificationPriority));                     ViewBag.ViewData = templateOptions;                     result = PartialView("_" + templateOptions.NotificationTemplateType);                     break;                 case "Tile":                     templateOptions.TileImages = this.blobClient.GetAllBlobsInContainer(ConfigReader.GetConfigValue("TileImagesContainer")).OrderBy(i => i.FileName).ToList();                     ViewBag.ViewData = templateOptions;                     result = PartialView("_" + templateOptions.NotificationTemplateType);                     break;             }             return result;         } Investigated these types: ToastAudioContent – an enum of different Win8 sound effects for toast notifications GlyphValue – an enum of different Win8 icons for badge notifications · Infrastructure\NotificationTemplateModelBinder.cs WNS Namespace references     using NotificationsExtensions.BadgeContent;     using NotificationsExtensions.RawContent;     using NotificationsExtensions.TileContent;     using NotificationsExtensions.ToastContent; Various NotificationFactory derived types can server as bindable models in MVC for creating INotificationContent types. Default values are also set for IWideTileNotificationContent & IToastNotificationContent. 
Type factoryType = null;             switch (notificationType)             {                 case "Badge":                     factoryType = typeof(BadgeContentFactory);                     break;                 case "Tile":                     factoryType = typeof(TileContentFactory);                     break;                 case "Toast":                     factoryType = typeof(ToastContentFactory);                     break;                 case "Raw":                     factoryType = typeof(RawContentFactory);                     break;             } Investigated these types: BadgeContentFactory – CreateBadgeGlyph, CreateBadgeNumeric (???) TileContentFactory – many notification content creation methods , apparently one for every tile layout type ToastContentFactory – many notification content creation methods , apparently one for every toast layout type RawContentFactory – passing strings WorkerRole WNS Namespace references using Notifications; using Notifications.WNS; using Windows.Samples.Notifications; OnStart() Method – on Worker Role startup, initialize the NotificationJobSerializer, the CloudQueue, and the WNSNotificationJobProcessor _notificationJobSerializer = new NotificationJobSerializer(); _cloudQueueClient = this.account.CreateCloudQueueClient(); _pushNotificationRequestsQueue = _cloudQueueClient.GetQueueReference(ConfigReader.GetConfigValue("RequestQueueName")); _processor = new WNSNotificationJobProcessor(_notificationJobSerializer, _pushNotificationRequestsQueue); Run() Method – poll the Azure Queue for NotificationJobRequest messages & process them:   while (true)             { Trace.WriteLine("Checking for Messages", "Information"); try                 { Parallel.ForEach( this.pushNotificationRequestsQueue.GetMessages(this.batchSize), this.processor.ProcessJobMessageRequest);                 } catch (Exception e)                 { Trace.WriteLine(e.ToString(), "Error");                 } Trace.WriteLine(string.Format("Sleeping for {0} seconds", this.pollIntervalMiliseconds / 1000)); Thread.Sleep(this.pollIntervalMiliseconds);                                            } How I learned to appreciate Win8 There is really only one application architecture for Windows 8 apps: Metro client side and Azure backend – and that is a good thing. With WNS, tier integration is so automated that you don’t even have to leverage a HTTP push API such as SignalR. This is a pretty powerful development paradigm, and has changed the way I look at Windows 8 for RAD business apps. When I originally looked at Win8 and the WinRT API, my first opinion on Win8 dev was as follows – GOOD:WinRT, WRL, C++/CX, WinJS, XAML (& ease of Direct3D integration); BAD: low projected market penetration,.NET lobotomized (Only 8% of .NET 4.5 classes can be used in Win8 non-desktop apps - http://bit.ly/HRuJr7); UGLY:Metro pascal tiles! Perhaps my 80s teenage years gave me a punk reactionary sense of revulsion towards the Partridge Family 70s style that Metro UX seems to have appropriated: On second thought though, it simplifies UI dev to a single paradigm (although UX guys will need to change career) – you will not find an easier app dev environment. Speculation: If LightSwitch is going to support HTML5 client app generation, then its a safe guess to say that vnext will support Win8 Metro XAML - a much easier port from Silverlight XAML. 
Given the VS2012 LightSwitch integration as a thumbs up from the powers that be at MS, and given that Win8 C#/XAML Metro apps tend towards a streamlined 'golden straight-jacket' cookie-cutter app dev style with an Azure back-end supporting Win8 push notifications... it's easy to extrapolate that LightSwitch vnext could well be the Win8 Metro XAML to Azure RAD tool of choice! The hook is already there - :) Why else have the space next to the HTML Client box? This high level of application development abstraction will facilitate rapid, cookie-cutter architecture and infrastructure frameworks for wrapping any app. This will allow me to avoid too much XAML code-monkeying around and focus on my area of interest: Technical Computing.
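One piece the walkthrough above describes but never shows is the Metro client's side of the handshake: requesting a channel from WNS and handing the channel URI to the cloud service. Here is a rough sketch of that step (my own, not part of the WAT sample); the endpoint URL and JSON payload shape are hypothetical placeholders, since the actual sample registers the channel through its own generated service client.

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Windows.Networking.PushNotifications;

public static class ChannelRegistration
{
    public static async Task RegisterChannelAsync()
    {
        // Ask WNS for a channel; the returned Uri identifies this app on this device.
        PushNotificationChannel channel =
            await PushNotificationChannelManager.CreatePushNotificationChannelForApplicationAsync();

        // Hand the channel URI to the cloud service so it can target this client later.
        // The endpoint and payload below are placeholders - adapt to whatever your web role exposes.
        using (var client = new HttpClient())
        {
            var payload = new StringContent("{ \"channelUri\": \"" + channel.Uri + "\" }");
            await client.PostAsync(new Uri("https://myservice.cloudapp.net/api/register"), payload);
        }
    }
}

Channel URIs expire, so in practice the client re-requests the channel on each launch and re-registers it if it has changed.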

    Read the article

  • Unable to connect to Samba printer

    - by user127236
    I have a headless Ubuntu 12.04 server for files and printers. It shares files via Samba just fine. However, the HP PSC-750xi connected to the server via USB is not accessible from my Ubuntu 12.04 laptop. I can browse for it in the Printing control panel, but any attempt to authenticate my ID to the printer with my user credentials results in the error "This print share is not accessible". I have included the Samba smb.conf file below. Any help appreciated. Thanks... JGB # # Sample configuration file for the Samba suite for Debian GNU/Linux. # # # This is the main Samba configuration file. You should read the # smb.conf(5) manual page in order to understand the options listed # here. Samba has a huge number of configurable options most of which # are not shown in this example # # Some options that are often worth tuning have been included as # commented-out examples in this file. # - When such options are commented with ";", the proposed setting # differs from the default Samba behaviour # - When commented with "#", the proposed setting is the default # behaviour of Samba but the option is considered important # enough to be mentioned here # # NOTE: Whenever you modify this file you should run the command # "testparm" to check that you have not made any basic syntactic # errors. # A well-established practice is to name the original file # "smb.conf.master" and create the "real" config file with # testparm -s smb.conf.master >smb.conf # This minimizes the size of the really used smb.conf file # which, according to the Samba Team, impacts performance # However, use this with caution if your smb.conf file contains nested # "include" statements. See Debian bug #483187 for a case # where using a master file is not a good idea. # #======================= Global Settings ======================= [global] log file = /var/log/samba/log.%m passwd chat = *Enter\snew\s*\spassword:* %n\n *Retype\snew\s*\spassword:* %n\n *password\supdated\ssuccessfully* . obey pam restrictions = yes map to guest = bad user encrypt passwords = true passwd program = /usr/bin/passwd %u passdb backend = tdbsam dns proxy = no writeable = yes server string = %h server (Samba, Ubuntu) unix password sync = yes workgroup = WORKGROUP syslog = 0 panic action = /usr/share/samba/panic-action %d usershare allow guests = yes max log size = 1000 pam password change = yes ## Browsing/Identification ### # Change this to the workgroup/NT-domain name your Samba server will part of # server string is the equivalent of the NT Description field # Windows Internet Name Serving Support Section: # WINS Support - Tells the NMBD component of Samba to enable its WINS Server # wins support = no # WINS Server - Tells the NMBD components of Samba to be a WINS Client # Note: Samba can be either a WINS Server, or a WINS Client, but NOT both ; wins server = w.x.y.z # This will prevent nmbd to search for NetBIOS names through DNS. # What naming service and in what order should we use to resolve host names # to IP addresses ; name resolve order = lmhosts host wins bcast #### Networking #### # The specific set of interfaces / networks to bind to # This can be either the interface name or an IP address/netmask; # interface names are normally preferred ; interfaces = 127.0.0.0/8 eth0 # Only bind to the named interfaces and/or networks; you must use the # 'interfaces' option above to use this. # It is recommended that you enable this feature if your Samba machine is # not protected by a firewall or is a firewall itself. 
However, this # option cannot handle dynamic or non-broadcast interfaces correctly. ; bind interfaces only = yes #### Debugging/Accounting #### # This tells Samba to use a separate log file for each machine # that connects # Cap the size of the individual log files (in KiB). # If you want Samba to only log through syslog then set the following # parameter to 'yes'. # syslog only = no # We want Samba to log a minimum amount of information to syslog. Everything # should go to /var/log/samba/log.{smbd,nmbd} instead. If you want to log # through syslog you should set the following parameter to something higher. # Do something sensible when Samba crashes: mail the admin a backtrace ####### Authentication ####### # "security = user" is always a good idea. This will require a Unix account # in this server for every user accessing the server. See # /usr/share/doc/samba-doc/htmldocs/Samba3-HOWTO/ServerType.html # in the samba-doc package for details. # security = user # You may wish to use password encryption. See the section on # 'encrypt passwords' in the smb.conf(5) manpage before enabling. # If you are using encrypted passwords, Samba will need to know what # password database type you are using. # This boolean parameter controls whether Samba attempts to sync the Unix # password with the SMB password when the encrypted SMB password in the # passdb is changed. # For Unix password sync to work on a Debian GNU/Linux system, the following # parameters must be set (thanks to Ian Kahan <<[email protected]> for # sending the correct chat script for the passwd program in Debian Sarge). # This boolean controls whether PAM will be used for password changes # when requested by an SMB client instead of the program listed in # 'passwd program'. The default is 'no'. # This option controls how unsuccessful authentication attempts are mapped # to anonymous connections ########## Domains ########### # Is this machine able to authenticate users. Both PDC and BDC # must have this setting enabled. If you are the BDC you must # change the 'domain master' setting to no # ; domain logons = yes # # The following setting only takes effect if 'domain logons' is set # It specifies the location of the user's profile directory # from the client point of view) # The following required a [profiles] share to be setup on the # samba server (see below) ; logon path = \\%N\profiles\%U # Another common choice is storing the profile in the user's home directory # (this is Samba's default) # logon path = \\%N\%U\profile # The following setting only takes effect if 'domain logons' is set # It specifies the location of a user's home directory (from the client # point of view) ; logon drive = H: # logon home = \\%N\%U # The following setting only takes effect if 'domain logons' is set # It specifies the script to run during logon. The script must be stored # in the [netlogon] share # NOTE: Must be store in 'DOS' file format convention ; logon script = logon.cmd # This allows Unix users to be created on the domain controller via the SAMR # RPC pipe. The example command creates a user account with a disabled Unix # password; please adapt to your needs ; add user script = /usr/sbin/adduser --quiet --disabled-password --gecos "" %u # This allows machine accounts to be created on the domain controller via the # SAMR RPC pipe. 
# The following assumes a "machines" group exists on the system ; add machine script = /usr/sbin/useradd -g machines -c "%u machine account" -d /var/lib/samba -s /bin/false %u # This allows Unix groups to be created on the domain controller via the SAMR # RPC pipe. ; add group script = /usr/sbin/addgroup --force-badname %g ########## Printing ########## # If you want to automatically load your printer list rather # than setting them up individually then you'll need this # load printers = yes # lpr(ng) printing. You may wish to override the location of the # printcap file ; printing = bsd ; printcap name = /etc/printcap # CUPS printing. See also the cupsaddsmb(8) manpage in the # cupsys-client package. ; printing = cups ; printcap name = cups ############ Misc ############ # Using the following line enables you to customise your configuration # on a per machine basis. The %m gets replaced with the netbios name # of the machine that is connecting ; include = /home/samba/etc/smb.conf.%m # Most people will find that this option gives better performance. # See smb.conf(5) and /usr/share/doc/samba-doc/htmldocs/Samba3-HOWTO/speed.html # for details # You may want to add the following on a Linux system: # SO_RCVBUF=8192 SO_SNDBUF=8192 # socket options = TCP_NODELAY # The following parameter is useful only if you have the linpopup package # installed. The samba maintainer and the linpopup maintainer are # working to ease installation and configuration of linpopup and samba. ; message command = /bin/sh -c '/usr/bin/linpopup "%f" "%m" %s; rm %s' & # Domain Master specifies Samba to be the Domain Master Browser. If this # machine will be configured as a BDC (a secondary logon server), you # must set this to 'no'; otherwise, the default behavior is recommended. # domain master = auto # Some defaults for winbind (make sure you're not using the ranges # for something else.) ; idmap uid = 10000-20000 ; idmap gid = 10000-20000 ; template shell = /bin/bash # The following was the default behaviour in sarge, # but samba upstream reverted the default because it might induce # performance issues in large organizations. # See Debian bug #368251 for some of the consequences of *not* # having this setting and smb.conf(5) for details. ; winbind enum groups = yes ; winbind enum users = yes # Setup usershare options to enable non-root users to share folders # with the net usershare command. # Maximum number of usershare. 0 (default) means that usershare is disabled. ; usershare max shares = 100 # Allow users who've been granted usershare privileges to create # public shares, not just authenticated ones #======================= Share Definitions ======================= # Un-comment the following (and tweak the other settings below to suit) # to enable the default home directory shares. This will share each # user's home director as \\server\username ;[homes] ; comment = Home Directories ; browseable = no # By default, the home directories are exported read-only. Change the # next parameter to 'no' if you want to be able to write to them. ; read only = yes # File creation mask is set to 0700 for security reasons. If you want to # create files with group=rw permissions, set next parameter to 0775. ; create mask = 0700 # Directory creation mask is set to 0700 for security reasons. If you want to # create dirs. with group=rw permissions, set next parameter to 0775. ; directory mask = 0700 # By default, \\server\username shares can be connected to by anyone # with access to the samba server. 
Un-comment the following parameter # to make sure that only "username" can connect to \\server\username # The following parameter makes sure that only "username" can connect # # This might need tweaking when using external authentication schemes ; valid users = %S # Un-comment the following and create the netlogon directory for Domain Logons # (you need to configure Samba to act as a domain controller too.) ;[netlogon] ; comment = Network Logon Service ; path = /home/samba/netlogon ; guest ok = yes ; read only = yes # Un-comment the following and create the profiles directory to store # users profiles (see the "logon path" option above) # (you need to configure Samba to act as a domain controller too.) # The path below should be writable by all users so that their # profile directory may be created the first time they log on ;[profiles] ; comment = Users profiles ; path = /home/samba/profiles ; guest ok = no ; browseable = no ; create mask = 0600 ; directory mask = 0700 [printers] comment = All Printers browseable = no path = /var/spool/samba printable = yes guest ok = no read only = yes create mask = 0700 # Windows clients look for this share name as a source of downloadable # printer drivers [print$] comment = Printer Drivers browseable = yes writeable = no path = /var/lib/samba/printers # Uncomment to allow remote administration of Windows print drivers. # You may need to replace 'lpadmin' with the name of the group your # admin users are members of. # Please note that you also need to set appropriate Unix permissions # to the drivers directory for these users to have write rights in it ; write list = root, @lpadmin # A sample share for sharing your CD-ROM with others. ;[cdrom] ; comment = Samba server's CD-ROM ; read only = yes ; locking = no ; path = /cdrom ; guest ok = yes # The next two parameters show how to auto-mount a CD-ROM when the # cdrom share is accesed. For this to work /etc/fstab must contain # an entry like this: # # /dev/scd0 /cdrom iso9660 defaults,noauto,ro,user 0 0 # # The CD-ROM gets unmounted automatically after the connection to the # # If you don't want to use auto-mounting/unmounting make sure the CD # is mounted on /cdrom # ; preexec = /bin/mount /cdrom ; postexec = /bin/umount /cdrom [mediafiles] path = /media/multimedia/

    Read the article

  • CodePlex Daily Summary for Tuesday, November 01, 2011

    CodePlex Daily Summary for Tuesday, November 01, 2011Popular ReleasesAgFx Windows Phone Application Data Caching Framework: AgFx November Update: This update is primarily a bug fix release, which provides a number of stability and performance enhancements to AgFx. Available as a NuGet package here. Release Notes Fix IsoStore exceptions during shutdown (Changelist 81729) Thread safety and sequencings issues (Changelists 78682,79252, 80707) Fix landscape layout issue in auth sample Fix async keyword collision Supporting overlapping/multiple notifications for Load requestsiTuner - The iTunes Companion: iTuner 1.4.4322: Added German (unverified, apologies if incorrect) Properly source invariant resources with correct resIDs Replaced obsolete lyric providers with working providers Fix Pseudolater to correctly morph every third char Fix null reference in CatalogBaseDevpad: 4.6: Whats new for Devpad 4.6: New Recent Files New Run in Safari Minor Bug Fix's, improvements and speed upsWindows Workflow Foundation on Codeplex: Microsoft.Activities v1.8.8: Microsoft.Activities Overview How do I install Microsoft.Activities? Updates in this release9318Technical Analysis Engine for .NET: Technical Analysis Engine 1.25: What's new in the 1.25 release (2011-11-01): - Added Williams %R indicator - Added Moving Average Envelopes indicatorBF3Rcon.NET: BF3Rcon 3.0: This release is targeted for RCON documentation based on R3. Everything should be beta stable, but it's alpha because I haven't been able to fully test it. When a stable release is ready, a proper changelog will be kept. Important Edit: There's one method that will keep this from working in Mono. GeneratePasswordHash uses void HashAlgorithm.Dispose(), which isn't in Mono. This will have to be changed to Clear() in the next release. If anyone needs a Mono version of this immediately, you can...BoxWorld: BoxWorld_2011.10.30: BoxWorld - 8.0.1110.30 This is the initial release of BoxWorld. I'd recommend downloading the installer as it contains the compiled code and everything all nicely contained. By default, you end up with this directory structure: C:\Program Files\ViperWorks\BoxWorld C:\Program Files\ViperWorks\BoxWorld\Data C:\Program Files\ViperWorks\BoxWorld\Interface C:\Program Files\ViperWorks\BoxWorld\Source In the root you have the compiled EXE files, one for the main release, one for the LITE release ...VidCoder: 1.2.1: Fixed a couple regressions: video encoder was blank in queue and crashes with the High Profile preset when opening the Settings window. Fixed problem with auto-update introduced in 1.2.0. If you have 1.2.0 you will need to update manually to get this.AssaultCube Reloaded: Release 2.3: THE RELEASE YOU'VE ALL BEEN WAITING FOR! IT CAN NOW BE CONSIDERED STABLE Linux has Debian 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, download the Linux package. The server pack is ready for both Windows and Linux, but you might need to compile your own for linux (source included) If you are using Windows and require the source code, download the source package!Koober: Koober - The Ebook Creator 0.2: The official release of Koober as Open source. Koober is a ebook creator for Windows, and Koob Reader is the reader.A Microblog API (SINA weibo.com open API in C#, .Net???????API): AMicroblogAPI v1.0: AMicroBlogAPI is a C# implementation, a strong-typed deep encapsulation, a easy-to-use .net wrapper of SINA microblog API v1.0. 
App developers no longer need to parse various HTTP responses -- all responses are parsed into strong types; No longer need to construt the request query strings -- just simply gives the values of parameters; No longer need to implement the OAuth -- a single call could logs the user on. In this release (AMicroblogAPI v1.0), all basic data APIs are implemented. For ...patterns & practices: Enterprise Library Contrib: Enterprise Library Contrib - 5.0 (Oct 2011): This release of Enterprise Library Contrib is based on the Microsoft patterns & practices Enterprise Library 5.0 core and contains the following: Common extensionsTypeConfigurationElement<T> - A Polymorphic Configuration Element without having to be part of a PolymorphicConfigurationElementCollection. AnonymousConfigurationElement - A Configuration element that can be uniquely identified without having to define its name explicitly. Data Access Application Block extensionsMySql Provider - ...Network Monitor Open Source Parsers: Network Monitor Parsers 3.4.2748: The Network Monitor Parsers packages contain parsers for more than 400 network protocols, including RFC based public protocols and protocols for Microsoft products defined in the Microsoft Open Specifications for Windows and SQL Server. NetworkMonitor_Parsers.msi is the base parser package which defines parsers for commonly used public protocols and protocols for Microsoft Windows. In this release, NetowrkMonitor_Parsers.msi continues to improve quality and fix bugs. It has included the fo...Duckworth Lewis Professional Edition Calculator: DLcalc 3.0: DLcalc 3.0 can perform Duckworth/Lewis Professional Edition calculations 100% accurately. It also produces over-by-over and ball-by-ball PAR score tables.Media Companion: MC 3.420b Weekly: Ensure .NET 4.0 Full Framework is installed. (Available from http://www.microsoft.com/download/en/details.aspx?id=17718) Ensure the NFO ID fix is applied when transitioning from versions prior to 3.416b. (Details here) Movies Fixed: Fanart and poster scraping issues TV Shows (Re)Added: Rebuild single show Fixed: Issue when shows are moved from original location Ability to handle " for actor nicknames Crash when episode name contains "<" (does not scrape yet) Clears fanart when switch...patterns & practices - Unity: Unity 3.0 for .NET4.5 Preview: The Unity 3.0.1026.0 Preview enables Unity to work on .NET 4.5 with both the WinRT and desktop profiles. The major changes include: Unity projects updated to target .NET 4.5. Dynamic build plans modified to use compiled lambda expressions instead of Reflection.Emit Converting reflection to use the new TypeInfo for reflection. Projects updated to work with the Microsoft Visual Studio 2011 Preview Notes/Known Issues: The Microsoft.Practices.Unity.UnityServiceLocator class cannot be use...Managed Extensibility Framework: MEF 2 Preview 4: Detailed information on this release is available on the BCL team blog.AcDown????? - Anime&Comic Downloader: AcDown????? v3.6: ?? ● AcDown??????????、??????,??????????????????????,???????Acfun、Bilibili、???、???、???、Tucao.cc、SF???、?????80????,???????????、?????????。 ● AcDown???????????????????????????,???,???????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??????????????,??????????: ??"AcDown?????"????????? ?? v3.6?? 
??“????”...MVC Controls Toolkit: Mvc Controls Toolkit 1.5.0: Added: The new Client Blocks feaure of Views A new "move" js method for the TreeViews The NewHtmlCreated js event to the DataGrid Improved the ChoiceList structure that now allows also the selection list of a dropdown to be chosen with a lambda expression Improved the AcceptViewHintAttribute controller filter. Now a client can specify not only the name of a View or Partial View it prefers, but also to receive just the rough data in Json format. Now the the SMinimum and SMaximum par...Free SharePoint Master Pages: Buried Alive (Halloween) Theme: Release Notes *Created for Halloween, you will find theme file, custom css file and images. *Created by Al Roome @AlstarRoome Features: Custom styling for web part Custom background *Screenshot https://s3.amazonaws.com/kkhipple/post/sharepoint-showcase-halloween.pngNew ProjectsarlTH: "Airport Link Thailand" Application for Windows Phone 7.1Audio Tags Editor For Excel: The purpose of the project is to create a tool to edit tags of audio files through the excel application. The program is an Excel add-in.AW Load Tester: AW Load Tester is a simple and easy to use website load tester that very closely mimics a regular users browsing session on your website. Many popular load tester only download HTML. AW Load Tester downloads all the related items on a page simultaneously. AzMan - Be Good Do Right !: The AzMan is a collection of reusable software components designed to assist software developers with common enterprise development challenges. http://bbs.fengshupo.com/BestWayWP7UI: The User interface of best wat wp7 applicationCheckout Subsystem Optimizer: Checkout Subsystem OptimizerDBScripter - Library for scripting SQL Server database objects: This project is library that allows users to script SQL Server database objects. Library uses dynamic management views for extracting data about databases objects. This library could be used in various situations. The most interesting areas are comparing database objects and generation database documentation. For both cases examples have been prepared.Dependency Checker: Dependency Checker is a utility for verifying that a set of prerequisites is present on a machine. The set of dependencies is defined in a configuration file. For more information, see http://dependencychecker.codeplex.com/documentation DibTer: DibTer is a brand new CMS for ASP.NET Mainly features are NHibernate MVC JQuery HTML5Dynamics CRM Entity Based Stress Tool: Entity Based Stress Tool, is command line program that creates configurable server agents in order to stress a defined entity at a defined Message/Stage. This tool allows CRM Application Stressing, Plugin Stressing, and enlaborate an execution report.Edinger: Edinger is a simple text editor.FreeboxHDVideoPlayer: FreeboxHDVideoPlayer makes it easier for free.fr french ISP users to play and manage videos stored on the HDD of the Freebox HD. FreeboxHDVideoPlayer simplifie, pour les utilisateurs de free.fr, la lecture et la gestion des vidéos stockées sur le disque du boitier Freebox HD.guglex - google docs xtraz: my studying project for asp.net mvc 3. now displaying spreadsheet rows in card viewIP-Camera MJPEG HTTP Web-Server Emulator: Turns web-camera into IP-camera with mjpeg streaming and access it over network. Usage of i-Lids (Imagery Library for Intelligent Detection Systems) and other data for surveillance systems development. 
Sends WMV, folder with images, web-camera data over the wire.LIM: LIM is .net software for archive manage.Luminji.CorWeb: luminji's company web.Map Generator: Map Generator for your Civ/Col/Roguelike games in VB.NET.Microformat Parsers for .NET: Microformat's Parsers for .NET helps you to collect information you run into on the web, that is stored by means of microformats. It's written in C# 3.0.Mini Rx: Mini Rx provides LINQ like queries over events for C# and F# using extension methods, somewhat like the Reactive Extensions (Rx) but in a lightweight open source package with one non-strong named assembly.MVC Music Store by NNT: Demo Music Store Follow tut from Microsoft. Team Explorer VS 2010 .NET 4PHP MySQL Web Site Creator: PHP MySQL Web Site Creator helps you create a basic template for your web site using a UI form. I creates all PHP, CSS, HTML basic pages to connect to MySQL database.PQLRD: Paco KLk!Project Web And Application: Project MobileStore in ASP use ASP.NET MVC 3RoboZip: RoboZip combines MS Robocopy and Zip-functionality. Create a job and RoboZip will zip all files (from a list of filetypes) within a time span in the past. The zip file name can contain date and time values of the time period it covers. It's developed in C#. The zip component is the DotNetZip Library from http://dotnetzip.codeplex.com. The logging is done using Apache log4net from http://logging.apache.org/log4net. The idea here is not to only present the code, but also to document how the de...Scriptster Gearbox: Scriptster Gearbox is a full C# scripting system based on Scriptster (also on CodePlex). It is aimed at Visual Studio developers and power users who need to automate the sorts of things that DOS is generally good at, but which using C# can make more powerful and more familiar.Technical Analysis Engine for .NET: Technical Analysis Engine is a .NET based library for technical analysis. It is fast, free, easy to use and provides functionality for financial market analysis.ValidationDSL: A Fluent validation engine with ease to use and reusable around the multi-tenant applicationsWebFormsUtilities: WebFormsUtilities is a collection of tools that assist in using MVC/MVP functionality to webforms in an unobtrusive manner. This differs from a hybrid MVC/WebForms page in that you can use methods like UpdateModel() or TryValidateModel() in a WebForms Page class.Wen - WPF 3D Navigator: Control the camera and the lights of a WPF 3D application without any code modification.WindStudios: WindStudiosWrite My Expect Statement: Have Rhino Mocks automatically generate your expect() and return() statements for you.zion xtraz: some basic helpful classes for use in every project

    Read the article

  • C#/.NET Little Wonders: ConcurrentBag and BlockingCollection

    - by James Michael Hare
In the first week of concurrent collections, we began with a general introduction and discussed the ConcurrentStack<T> and ConcurrentQueue<T>.  The last post discussed the ConcurrentDictionary<T>.  Finally this week, we shall close with a discussion of the ConcurrentBag<T> and BlockingCollection<T>. For more of the "Little Wonders" posts, see C#/.NET Little Wonders: A Redux.

Recap As you'll recall from the previous posts, the original collections were object-based containers that accomplished synchronization through a Synchronized member.  With the advent of .NET 2.0, the original collections were succeeded by the generic collections which are fully type-safe, but eschew automatic synchronization.  With .NET 4.0, a new breed of collections was born in the System.Collections.Concurrent namespace.  Of these, the final concurrent collection we will examine is the ConcurrentBag and a very useful wrapper class called the BlockingCollection. For some excellent information on the performance of the concurrent collections and how they perform compared to a traditional brute-force locking strategy, see this informative whitepaper by the Microsoft Parallel Computing Platform team here.

ConcurrentBag<T> – Thread-safe unordered collection. Unlike the other concurrent collections, the ConcurrentBag<T> has no non-concurrent counterpart in the .NET collections libraries.  Items can be added and removed from a bag just like any other collection, but unlike the other collections, the items are not maintained in any order.  This makes the bag handy for those cases when all you care about is that the data be consumed eventually, without regard for order of consumption or even fairness – that is, it’s possible new items could be consumed before older items given the right circumstances for a period of time. So why would you ever want a container that can be unfair?  Well, to look at it another way, you can use a ConcurrentQueue and get the fairness, but it comes at a cost in that the ordering rules and synchronization required to maintain that ordering can affect scalability a bit.  Thus sometimes the bag is great when you want the fastest way to get the next item to process, and don’t care what item it is or how long it's been waiting. The way that the ConcurrentBag works is to take advantage of the new ThreadLocal<T> type (new in System.Threading for .NET 4.0) so that each thread using the bag has a list local to just that thread.  This means that adding or removing to a thread-local list requires very low synchronization.  The problem comes in where a thread goes to consume an item but its local list is empty.  In this case the bag performs “work-stealing” where it will rob an item from another thread that has items in its list.  This requires a higher level of synchronization which adds a bit of overhead to the take operation. So, as you can imagine, this makes the ConcurrentBag good for situations where each thread both produces and consumes items from the bag, but it would be less-than-ideal in situations where some threads are dedicated producers and the other threads are dedicated consumers because the work-stealing synchronization would outweigh the thread-local optimization for a thread taking its own items.
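Since the bag's per-thread behaviour is built on ThreadLocal<T>, which the article mentions but does not show, here is a tiny illustration of that primitive on its own (my addition, not from the article): each thread that touches the value gets its own independent copy, so no locking is needed.

using System;
using System.Threading;
using System.Threading.Tasks;

class ThreadLocalDemo
{
    // Each thread that reads Counter.Value gets its own instance, created by the factory.
    static readonly ThreadLocal<int> Counter = new ThreadLocal<int>(() => 0);

    static void Main()
    {
        Parallel.For(0, 4, i =>
        {
            // Each worker thread increments only its own private counter.
            Counter.Value++;
            Console.WriteLine("Thread {0} sees counter = {1}",
                Thread.CurrentThread.ManagedThreadId, Counter.Value);
        });
    }
}

This is the same idea the bag applies to its per-thread item lists: the common case touches only the calling thread's own storage.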
Like the other concurrent collections, there are some curiosities to keep in mind: IsEmpty(), Count, ToArray(), and GetEnumerator() lock collection Each of these needs to take a snapshot of whole bag to determine if empty, thus they tend to be more expensive and cause Add() and Take() operations to block. ToArray() and GetEnumerator() are static snapshots Because it is based on a snapshot, will not show subsequent updates after snapshot. Add() is lightweight Since adding to the thread-local list, there is very little overhead on Add. TryTake() is lightweight if items in thread-local list As long as items are in the thread-local list, TryTake() is very lightweight, much more so than ConcurrentStack() and ConcurrentQueue(), however if the local thread list is empty, it must steal work from another thread, which is more expensive. Remember, a bag is not ideal for all situations, it is mainly ideal for situations where a process consumes an item and either decomposes it into more items to be processed, or handles the item partially and places it back to be processed again until some point when it will complete.  The main point is that the bag works best when each thread both takes and adds items. For example, we could create a totally contrived example where perhaps we want to see the largest power of a number before it crosses a certain threshold.  Yes, obviously we could easily do this with a log function, but bare with me while I use this contrived example for simplicity. So let’s say we have a work function that will take a Tuple out of a bag, this Tuple will contain two ints.  The first int is the original number, and the second int is the last multiple of that number.  So we could load our bag with the initial values (let’s say we want to know the last multiple of each of 2, 3, 5, and 7 under 100. 1: var bag = new ConcurrentBag<Tuple<int, int>> 2: { 3: Tuple.Create(2, 1), 4: Tuple.Create(3, 1), 5: Tuple.Create(5, 1), 6: Tuple.Create(7, 1) 7: }; Then we can create a method that given the bag, will take out an item, apply the multiplier again, 1: public static void FindHighestPowerUnder(ConcurrentBag<Tuple<int,int>> bag, int threshold) 2: { 3: Tuple<int,int> pair; 4:  5: // while there are items to take, this will prefer local first, then steal if no local 6: while (bag.TryTake(out pair)) 7: { 8: // look at next power 9: var result = Math.Pow(pair.Item1, pair.Item2 + 1); 10:  11: if (result < threshold) 12: { 13: // if smaller than threshold bump power by 1 14: bag.Add(Tuple.Create(pair.Item1, pair.Item2 + 1)); 15: } 16: else 17: { 18: // otherwise, we're done 19: Console.WriteLine("Highest power of {0} under {3} is {0}^{1} = {2}.", 20: pair.Item1, pair.Item2, Math.Pow(pair.Item1, pair.Item2), threshold); 21: } 22: } 23: } Now that we have this, we can load up this method as an Action into our Tasks and run it: 1: // create array of tasks, start all, wait for all 2: var tasks = new[] 3: { 4: new Task(() => FindHighestPowerUnder(bag, 100)), 5: new Task(() => FindHighestPowerUnder(bag, 100)), 6: }; 7:  8: Array.ForEach(tasks, t => t.Start()); 9:  10: Task.WaitAll(tasks); Totally contrived, I know, but keep in mind the main point!  When you have a thread or task that operates on an item, and then puts it back for further consumption – or decomposes an item into further sub-items to be processed – you should consider a ConcurrentBag as the thread-local lists will allow for quick processing.  
However, if you need ordering or if your processes are dedicated producers or consumers, this collection is not ideal.  As with anything, you should performance test as your mileage will vary depending on your situation! BlockingCollection<T> – A producers & consumers pattern collection The BlockingCollection<T> can be treated like a collection in its own right, but in reality it adds a producers and consumers paradigm to any collection that implements the interface IProducerConsumerCollection<T>.  If you don’t specify one at the time of construction, it will use a ConcurrentQueue<T> as its underlying store. If you don’t want to use the ConcurrentQueue, the ConcurrentStack and ConcurrentBag also implement the interface (though ConcurrentDictionary does not).  In addition, you are of course free to create your own implementation of the interface. So, for those who don’t remember the producers and consumers classical computer-science problem, the gist of it is that you have one (or more) processes that are creating items (producers) and one (or more) processes that are consuming these items (consumers).  Now, the crux of the problem is that there is a bin (queue) where the produced items are placed, and typically that bin has a limited size.  Thus if a producer creates an item, but there is no space to store it, it must wait until an item is consumed.  Also if a consumer goes to consume an item and none exists, it must wait until an item is produced. The BlockingCollection makes it trivial to implement any standard producers/consumers process set by providing that “bin” where the items can be produced into and consumed from with the appropriate blocking operations.  In addition, you can specify whether the bin should have a limited size or can be (theoretically) unbounded, and you can specify timeouts on the blocking operations. As far as your choice of “bin”, for the most part the ConcurrentQueue is the right choice because it is fairly light and maximizes fairness by ordering items so that they are consumed in the same order they are produced.  You can use the concurrent bag or stack, of course, but your ordering would be random-ish in the case of the former and LIFO in the case of the latter. So let’s look at some of the methods of note in BlockingCollection: BoundedCapacity returns capacity of the “bin” If the bin is unbounded, the capacity is int.MaxValue. Count returns an internally-kept count of items This makes it O(1), but if you modify underlying collection directly (not recommended) it is unreliable. CompleteAdding() is used to cut off further adds. This sets IsAddingCompleted and begins to wind down consumers once empty. IsAddingCompleted is true when producers are “done”. Once you are done producing, should complete the add process to alert consumers. IsCompleted is true when producers are “done” and “bin” is empty. Once you mark the producers done, and all items removed, this will be true. Add() is a blocking add to collection. If bin is full, will wait till space frees up Take() is a blocking remove from collection. If bin is empty, will wait until item is produced or adding is completed. GetConsumingEnumerable() is used to iterate and consume items. Unlike the standard enumerator, this one consumes the items instead of iteration. TryAdd() attempts add but does not block completely If adding would block, returns false instead, can specify TimeSpan to wait before stopping. 
TryTake() attempts to take but does not block completely Like TryAdd(), if taking would block, returns false instead, can specify TimeSpan to wait. Note the use of CompleteAdding() to signal the BlockingCollection that nothing else should be added.  This means that any attempts to TryAdd() or Add() after marked completed will throw an InvalidOperationException.  In addition, once adding is complete you can still continue to TryTake() and Take() until the bin is empty, and then Take() will throw the InvalidOperationException and TryTake() will return false. So let’s create a simple program to try this out.  Let’s say that you have one process that will be producing items, but a slower consumer process that handles them.  This gives us a chance to peek inside what happens when the bin is bounded (by default, the bin is NOT bounded). 1: var bin = new BlockingCollection<int>(5); Now, we create a method to produce items: 1: public static void ProduceItems(BlockingCollection<int> bin, int numToProduce) 2: { 3: for (int i = 0; i < numToProduce; i++) 4: { 5: // try for 10 ms to add an item 6: while (!bin.TryAdd(i, TimeSpan.FromMilliseconds(10))) 7: { 8: Console.WriteLine("Bin is full, retrying..."); 9: } 10: } 11:  12: // once done producing, call CompleteAdding() 13: Console.WriteLine("Adding is completed."); 14: bin.CompleteAdding(); 15: } And one to consume them: 1: public static void ConsumeItems(BlockingCollection<int> bin) 2: { 3: // This will only be true if CompleteAdding() was called AND the bin is empty. 4: while (!bin.IsCompleted) 5: { 6: int item; 7:  8: if (!bin.TryTake(out item, TimeSpan.FromMilliseconds(10))) 9: { 10: Console.WriteLine("Bin is empty, retrying..."); 11: } 12: else 13: { 14: Console.WriteLine("Consuming item {0}.", item); 15: Thread.Sleep(TimeSpan.FromMilliseconds(20)); 16: } 17: } 18: } Then we can fire them off: 1: // create one producer and two consumers 2: var tasks = new[] 3: { 4: new Task(() => ProduceItems(bin, 20)), 5: new Task(() => ConsumeItems(bin)), 6: new Task(() => ConsumeItems(bin)), 7: }; 8:  9: Array.ForEach(tasks, t => t.Start()); 10:  11: Task.WaitAll(tasks); Notice that the producer is faster than the consumer, thus it should be hitting a full bin often and displaying the message after it times out on TryAdd(). 1: Consuming item 0. 2: Consuming item 1. 3: Bin is full, retrying... 4: Bin is full, retrying... 5: Consuming item 3. 6: Consuming item 2. 7: Bin is full, retrying... 8: Consuming item 4. 9: Consuming item 5. 10: Bin is full, retrying... 11: Consuming item 6. 12: Consuming item 7. 13: Bin is full, retrying... 14: Consuming item 8. 15: Consuming item 9. 16: Bin is full, retrying... 17: Consuming item 10. 18: Consuming item 11. 19: Bin is full, retrying... 20: Consuming item 12. 21: Consuming item 13. 22: Bin is full, retrying... 23: Bin is full, retrying... 24: Consuming item 14. 25: Adding is completed. 26: Consuming item 15. 27: Consuming item 16. 28: Consuming item 17. 29: Consuming item 19. 30: Consuming item 18. Also notice that once CompleteAdding() is called and the bin is empty, the IsCompleted property returns true, and the consumers will exit. Summary The ConcurrentBag is an interesting collection that can be used to optimize concurrency scenarios where tasks or threads both produce and consume items.  In this way, it will choose to consume its own work if available, and then steal if not.  
However, in situations where you want fair consumption or ordering, or where the producers and consumers are distinct processes, the bag is not optimal. The BlockingCollection is a great wrapper around any of the concurrent queue, stack, and bag that allows you to add producer and consumer semantics easily, including blocking when the bin is full or empty. That's the end of my dive into the concurrent collections.  I'd also strongly recommend, once again, that you read the excellent Microsoft white paper that goes into much greater detail on the efficiencies you can gain by using these collections judiciously (here). Technorati Tags: C#,.NET,Concurrent Collections,Little Wonders
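One method from the BlockingCollection list above that the sample consumer doesn't exercise is GetConsumingEnumerable().  As a small addendum of my own (a sketch, not part of the original post), a consumer written with it needs no explicit IsCompleted/TryTake() loop – the foreach blocks while the bin is empty and simply ends once CompleteAdding() has been called and the bin has drained:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class ConsumingEnumerableSketch
{
    public static void Main()
    {
        var bin = new BlockingCollection<int>(5);   // bounded bin of 5 items

        var producer = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 20; i++)
            {
                bin.Add(i);             // blocks while the bin is full
            }
            bin.CompleteAdding();       // no more items will be produced
        });

        var consumer = Task.Factory.StartNew(() =>
        {
            // Blocks waiting for items, and ends automatically once adding
            // is completed and the bin is empty.
            foreach (var item in bin.GetConsumingEnumerable())
            {
                Console.WriteLine("Consuming item {0}.", item);
            }
        });

        Task.WaitAll(producer, consumer);
    }
}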

    Read the article

  • How to Visualize your Audit Data with BI Publisher?

    - by kanichiro.nishida
      Do you know how many reports on your BI Publisher server were accessed yesterday? Or how many users accessed the reports yesterday, or what the average number of users accessing the reports is during the week vs. the weekend, or in the morning vs. the afternoon? With BI Publisher 11G you can now audit your users' report access and understand the state of the reporting environment at the server, user, or report level. In the previous post I talked about BI Publisher's auditing functionality and how to enable it so that BI Publisher can start collecting such data (How to Audit and Monitor BI Publisher Reports Access?). Now, how can you visualize such auditing data to get a better understanding and gain more insights? With the Fusion Middleware Audit Framework you have the option to store the auditing data in a database instead of a log file, which is the default. Once you enable the database storage option, you have your auditing data (that is, user report access data) in your database tables, so it's a no-brainer: you can start visualizing the data, creating reports, analyzing, and sharing with BI Publisher. So, first, let's take a look at how to enable the database storage option for the auditing data.

How to Feed the Auditing Data into the Database

First you need to create a database schema for the Fusion Middleware Audit Framework with RCU (Repository Creation Utility). If you have already installed BI Publisher 11G you should be familiar with RCU; it creates the database schemas necessary to run any Fusion Middleware product, including the BI products. You can use the same RCU that you used for your BI or BI Publisher installation to create this audit schema.

Create Audit Schema with RCU

Here are the steps:
Go to $RCU_HOME/bin and execute the 'rcu' command.
Choose Create at the starting screen and click Next.
Enter your database details and click Next.
Choose the option to create a new prefix, for example 'BIP', 'KAN', etc.
Select 'Audit Services' from the list of schemas.
Click Next and accept the tablespace creation.
Click Finish to start the process.
After this, the following three audit-related schemas should be created in your database:
<prefix>_IAU (e.g. KAN_IAU)
<prefix>_IAU_APPEND (e.g. KAN_IAU_APPEND)
<prefix>_IAU_VIEWER (e.g. KAN_IAU_VIEWER)

Setup Datasource at WebLogic

After you create the database schema for your auditing data, you need to create a JDBC connection on your WebLogic Server so that the Audit Framework can access the schema that was created with RCU in the previous step.
Connect to the Oracle WebLogic Server administration console: http://hostname:port/console (e.g. http://report.oracle.com:7001/console).
Under Services, click the Data Sources link.
Click 'Lock & Edit' so that you can make changes.
Click New –> 'Generic Datasource' to create a new data source.
Enter the following details for the new data source:  Name: Enter a name such as Audit Data Source-0.  JNDI Name: jdbc/AuditDB.  Database Type: Oracle.
Click Next and select 'Oracle's Driver (Thin XA) Versions: 9.0.1 or later' as the Database Driver (if you're using an Oracle database), and click Next.
The Connection Properties page appears. Enter the following information: Database Name: Enter the name of the database (SID) to which you will connect. Host Name: Enter the hostname of the database.  Port: Enter the database port.  Database User Name: This is the name of the audit schema that you created in RCU. The suffix is always IAU for the audit schema. 
For example, if you gave the prefix as 'BIP', then the schema name would be 'BIP_IAU'.  Password: This is the password for the audit schema that you created in RCU.
Click Next.
Accept the defaults, and click Test Configuration to verify the connection. Click Next.
Check the listed servers where you want to make this JDBC connection available.
Click 'Finish'!
After that, make sure you click 'Activate Changes' at the top left-hand side so that the new JDBC connection takes effect.

Register your Audit Data Storing Database to your Domain

Finally, you can register the JNDI/JDBC datasource as your auditing data storage with Fusion Middleware Control (EM). Here are the steps:
1. Login to Fusion Middleware Control.
2. Navigate to WebLogic Domain, right-click on 'bifoundation…..', select Security, then Audit Store.
3. Click the searchlight icon next to the Datasource JNDI Name field.
4. Select the audit JNDI/JDBC datasource you created in the previous step in the pop-up window and click OK.
5. Click Apply to continue.
6. Restart all the WebLogic Servers in the domain.
After this, BI Publisher should start feeding all the auditing data into the database table called 'IAU_BASE'. Try logging in to BI Publisher and opening a couple of reports; you should see the activity audited in the 'IAU_BASE' table. If it's not working, you might want to check the log file, which is located at $BI_HOME/user_projects/domains/bifoundation_domain/servers/AdminServer/logs/AdminServer-diagnostic.log, to see if there is any error. Once you have the data in the database table, it's time to visualize it with BI Publisher reports!

Create a First BI Publisher Auditing Report

Register Auditing Datasource as JNDI datasource

The first thing you need to do is register the audit datasource (the JNDI/JDBC connection) you created in the previous step as a JNDI data source in BI Publisher. Because it is a JDBC connection registered as JNDI, you don't need to create a new JDBC connection by typing the connection URL, username/password, etc. You can just register it using the JNDI name (e.g. jdbc/AuditDB).
Login to BI Publisher as Administrator (e.g. weblogic).
Go to the Administration Page.
Click 'JNDI Connection' under Data Sources and click 'New'.
Type the Data Source Name and JNDI Name. The JNDI Name is the one you created in the WebLogic Console as the auditing datasource (e.g. jdbc/AuditDB).
Click 'Test Connection' to make sure the datasource connection works.
Provide appropriate roles so that report developers or viewers can share this data source to view reports.
Click 'Apply' to save.

Create Data Model

Select Data Model from the tool bar menu 'New'.
Set 'Default Data Source' to the audit JNDI data source you created in the previous step.
Select 'SQL Query' for your data set.
Use Query Builder to build a query, or just type a SQL query. Either way, the table you want to report against is 'IAU_BASE'. This IAU_BASE table also contains the auditing data for other products running on the WebLogic Server, such as JPS, OID, etc. So if you care only about BI Publisher, you want to filter using the 'IAU_COMPONENTTYPE' column, which contains the product name (e.g. 'xmlpserver' for BI Publisher). Here is my sample SQL query. 
select
    "IAU_BASE"."IAU_COMPONENTTYPE" as "IAU_COMPONENTTYPE",
    "IAU_BASE"."IAU_EVENTTYPE" as "IAU_EVENTTYPE",
    "IAU_BASE"."IAU_EVENTCATEGORY" as "IAU_EVENTCATEGORY",
    "IAU_BASE"."IAU_TSTZORIGINATING" as "IAU_TSTZORIGINATING",
    to_char("IAU_TSTZORIGINATING", 'YYYY-MM-DD') IAU_DATE,
    to_char("IAU_TSTZORIGINATING", 'DAY') as IAU_DAY,
    to_char("IAU_TSTZORIGINATING", 'HH24') as IAU_HH24,
    to_char("IAU_TSTZORIGINATING", 'WW') as IAU_WEEK_OF_YEAR,
    "IAU_BASE"."IAU_INITIATOR" as "IAU_INITIATOR",
    "IAU_BASE"."IAU_RESOURCE" as "IAU_RESOURCE",
    "IAU_BASE"."IAU_TARGET" as "IAU_TARGET",
    "IAU_BASE"."IAU_MESSAGETEXT" as "IAU_MESSAGETEXT",
    "IAU_BASE"."IAU_FAILURECODE" as "IAU_FAILURECODE",
    "IAU_BASE"."IAU_REMOTEIP" as "IAU_REMOTEIP"
from
    "KAN3_IAU"."IAU_BASE" "IAU_BASE"
where "IAU_BASE"."IAU_COMPONENTTYPE" = 'xmlpserver'

Once you have saved a sample XML for this data model, you can create a report with it.

Create Report

Now you can use one of BI Publisher's layout options to design the report layout and visualize the auditing data. I'm a big fan of the Online Layout Editor; it's just so easy and simple to create reports, and on top of that, all reports created with the Online Layout Editor get the Interactive View with automatic data linking and filtering, without any settings or coding. If you haven't checked out the Interactive View or the Online Layout Editor, you might want to look at these previous blog posts (Interactive Reporting with BI Publisher 11G, Interactive Master Detail Report Just A Few Clicks Away!). But of course, you can use other layout design options such as the RTF template. Here are some sample screenshots of my report design with the Online Layout Editor.

Visualize and Gain More Insights about your Customers (Users)!

Now you can visualize your auditing data to get a better understanding of, and more insights into, the reporting environment you manage. It has actually been helping me personally to answer questions like these:
How many reports were accessed or opened yesterday, today, last week?
Who is accessing which report at what time?
What are the time windows when most of the report access is happening?
What are the most viewed reports?
Who are the active users?
What is the trend of report accesses or user accesses for the last month, last 6 months, last 12 months, etc.?
I was talking with one of the best concierges in the world at a hotel the other day, and he was telling me that the best concierges know their customers inside out, and can therefore provide a very personal service that is customized to meet each customer's specific needs. Well, the same is true when it comes to administering and managing your reporting environment, right? The best way to serve your customers (report users, including both viewers and developers) is to understand how they use it, what they use, and when they use it. Auditing is not just about compliance; it's a way to improve customer service. The BI Publisher 11G auditing feature enables just that, helping you understand your customers better. Happy customer service, be the best reporting concierge!
p.s. Please share with us what other information would be helpful to you for auditing! As always, any feedback is of great value and an inspiration to us!  

    Read the article

  • Windows Azure Service Bus Splitter and Aggregator

    - by Alan Smith
    This article will cover basic implementations of the Splitter and Aggregator patterns using the Windows Azure Service Bus. The content will be included in the next release of the "Windows Azure Service Bus Developer Guide", along with some other patterns I am working on. I've taken the pattern descriptions from the book "Enterprise Integration Patterns" by Gregor Hohpe. I bought a copy of the book in 2004, and recently dusted it off when I started to look at implementing the patterns on the Windows Azure Service Bus. Gregor also presented a session in 2011, "Enterprise Integration Patterns: Past, Present and Future", which is well worth a look. I'll be covering more patterns in the coming weeks; I'm currently working on Wire-Tap and Scatter-Gather. There will no doubt be a section on implementing these patterns in my "SOA, Connectivity and Integration using the Windows Azure Service Bus" course.

There are a number of scenarios where a message needs to be divided into a number of sub messages, and also where a number of sub messages need to be combined to form one message. The splitter and aggregator patterns provide a definition of how this can be achieved. This section will focus on the implementation of basic splitter and aggregator patterns using the Windows Azure Service Bus direct programming model. In BizTalk Server, receive pipelines are typically used to implement the splitter pattern, with sequential convoy orchestrations often used to aggregate messages. In the current release of the Service Bus, there is no functionality in the direct programming model that implements these patterns, so it is up to the developer to implement them in the applications that send and receive messages.

Splitter

A message splitter takes a message and splits it into a number of sub messages. As there are different scenarios for how a message can be split into sub messages, message splitters are implemented using different algorithms. The Enterprise Integration Patterns book describes the Splitter pattern as follows: How can we process a message if it contains multiple elements, each of which may have to be processed in a different way? Use a Splitter to break out the composite message into a series of individual messages, each containing data related to one item. The Enterprise Integration Patterns website provides a description of the Splitter pattern here. In some scenarios a batch message could be split into the sub messages that are contained in the batch. The splitting of a message could be based on the message type of the sub message, or on the trading partner that the sub message is to be sent to.

Aggregator

An aggregator takes a stream of related messages and combines them together to form one message. The Enterprise Integration Patterns book describes the Aggregator pattern as follows: How do we combine the results of individual, but related messages so that they can be processed as a whole? Use a stateful filter, an Aggregator, to collect and store individual messages until a complete set of related messages has been received. Then, the Aggregator publishes a single message distilled from the individual messages. The Enterprise Integration Patterns website provides a description of the Aggregator pattern here. A common example of the need for an aggregator is in scenarios where a stream of messages needs to be combined into a daily batch to be sent to a legacy line-of-business application. 
The BizTalk Server EDI functionality provides support for batching messages in this way using a sequential convoy orchestration. Scenario The scenario for this implementation of the splitter and aggregator patterns is the sending and receiving of large messages using a Service Bus queue. In the current release, the Windows Azure Service Bus currently supports a maximum message size of 256 KB, with a maximum header size of 64 KB. This leaves a safe maximum body size of 192 KB. The BrokeredMessage class will support messages larger than 256 KB; in fact the Size property is of type long, implying that very large messages may be supported at some point in the future. The 256 KB size restriction is set in the service bus components that are deployed in the Windows Azure data centers. One of the ways of working around this size restriction is to split large messages into a sequence of smaller sub messages in the sending application, send them via a queue, and then reassemble them in the receiving application. This scenario will be used to demonstrate the pattern implementations. Implementation The splitter and aggregator will be used to provide functionality to send and receive large messages over the Windows Azure Service Bus. In order to make the implementations generic and reusable they will be implemented as a class library. The splitter will be implemented in the LargeMessageSender class and the aggregator in the LargeMessageReceiver class. A class diagram showing the two classes is shown below. Implementing the Splitter The splitter will take a large brokered message, and split the messages into a sequence of smaller sub-messages that can be transmitted over the service bus messaging entities. The LargeMessageSender class provides a Send method that takes a large brokered message as a parameter. The implementation of the class is shown below; console output has been added to provide details of the splitting operation. public class LargeMessageSender {     private static int SubMessageBodySize = 192 * 1024;     private QueueClient m_QueueClient;       public LargeMessageSender(QueueClient queueClient)     {         m_QueueClient = queueClient;     }       public void Send(BrokeredMessage message)     {         // Calculate the number of sub messages required.         long messageBodySize = message.Size;         int nrSubMessages = (int)(messageBodySize / SubMessageBodySize);         if (messageBodySize % SubMessageBodySize != 0)         {             nrSubMessages++;         }           // Create a unique session Id.         string sessionId = Guid.NewGuid().ToString();         Console.WriteLine("Message session Id: " + sessionId);         Console.Write("Sending {0} sub-messages", nrSubMessages);           Stream bodyStream = message.GetBody<Stream>();         for (int streamOffest = 0; streamOffest < messageBodySize;             streamOffest += SubMessageBodySize)         {                                     // Get the stream chunk from the large message             long arraySize = (messageBodySize - streamOffest) > SubMessageBodySize                 ? 
SubMessageBodySize : messageBodySize - streamOffest;             byte[] subMessageBytes = new byte[arraySize];             int result = bodyStream.Read(subMessageBytes, 0, (int)arraySize);             MemoryStream subMessageStream = new MemoryStream(subMessageBytes);               // Create a new message             BrokeredMessage subMessage = new BrokeredMessage(subMessageStream, true);             subMessage.SessionId = sessionId;               // Send the message             m_QueueClient.Send(subMessage);             Console.Write(".");         }         Console.WriteLine("Done!");     }} The LargeMessageSender class is initialized with a QueueClient that is created by the sending application. When the large message is sent, the number of sub messages is calculated based on the size of the body of the large message. A unique session Id is created to allow the sub messages to be sent as a message session, this session Id will be used for correlation in the aggregator. A for loop in then used to create the sequence of sub messages by creating chunks of data from the stream of the large message. The sub messages are then sent to the queue using the QueueClient. As sessions are used to correlate the messages, the queue used for message exchange must be created with the RequiresSession property set to true. Implementing the Aggregator The aggregator will receive the sub messages in the message session that was created by the splitter, and combine them to form a single, large message. The aggregator is implemented in the LargeMessageReceiver class, with a Receive method that returns a BrokeredMessage. The implementation of the class is shown below; console output has been added to provide details of the splitting operation.   public class LargeMessageReceiver {     private QueueClient m_QueueClient;       public LargeMessageReceiver(QueueClient queueClient)     {         m_QueueClient = queueClient;     }       public BrokeredMessage Receive()     {         // Create a memory stream to store the large message body.         MemoryStream largeMessageStream = new MemoryStream();           // Accept a message session from the queue.         MessageSession session = m_QueueClient.AcceptMessageSession();         Console.WriteLine("Message session Id: " + session.SessionId);         Console.Write("Receiving sub messages");           while (true)         {             // Receive a sub message             BrokeredMessage subMessage = session.Receive(TimeSpan.FromSeconds(5));               if (subMessage != null)             {                 // Copy the sub message body to the large message stream.                 Stream subMessageStream = subMessage.GetBody<Stream>();                 subMessageStream.CopyTo(largeMessageStream);                   // Mark the message as complete.                 subMessage.Complete();                 Console.Write(".");             }             else             {                 // The last message in the sequence is our completeness criteria.                 Console.WriteLine("Done!");                 break;             }         }                     // Create an aggregated message from the large message stream.         BrokeredMessage largeMessage = new BrokeredMessage(largeMessageStream, true);         return largeMessage;     } }   The LargeMessageReceiver initialized using a QueueClient that is created by the receiving application. The receive method creates a memory stream that will be used to aggregate the large message body. 
The AcceptMessageSession method on the QueueClient is then called, which will wait for the first message in a message session to become available on the queue. As the AcceptMessageSession can throw a timeout exception if no message is available on the queue after 60 seconds, a real-world implementation should handle this accordingly. Once the message session as accepted, the sub messages in the session are received, and their message body streams copied to the memory stream. Once all the messages have been received, the memory stream is used to create a large message, that is then returned to the receiving application. Testing the Implementation The splitter and aggregator are tested by creating a message sender and message receiver application. The payload for the large message will be one of the webcast video files from http://www.cloudcasts.net/, the file size is 9,697 KB, well over the 256 KB threshold imposed by the Service Bus. As the splitter and aggregator are implemented in a separate class library, the code used in the sender and receiver console is fairly basic. The implementation of the main method of the sending application is shown below.   static void Main(string[] args) {     // Create a token provider with the relevant credentials.     TokenProvider credentials =         TokenProvider.CreateSharedSecretTokenProvider         (AccountDetails.Name, AccountDetails.Key);       // Create a URI for the serivce bus.     Uri serviceBusUri = ServiceBusEnvironment.CreateServiceUri         ("sb", AccountDetails.Namespace, string.Empty);       // Create the MessagingFactory     MessagingFactory factory = MessagingFactory.Create(serviceBusUri, credentials);       // Use the MessagingFactory to create a queue client     QueueClient queueClient = factory.CreateQueueClient(AccountDetails.QueueName);       // Open the input file.     FileStream fileStream = new FileStream(AccountDetails.TestFile, FileMode.Open);       // Create a BrokeredMessage for the file.     BrokeredMessage largeMessage = new BrokeredMessage(fileStream, true);       Console.WriteLine("Sending: " + AccountDetails.TestFile);     Console.WriteLine("Message body size: " + largeMessage.Size);     Console.WriteLine();         // Send the message with a LargeMessageSender     LargeMessageSender sender = new LargeMessageSender(queueClient);     sender.Send(largeMessage);       // Close the messaging facory.     factory.Close();  } The implementation of the main method of the receiving application is shown below. static void Main(string[] args) {       // Create a token provider with the relevant credentials.     TokenProvider credentials =         TokenProvider.CreateSharedSecretTokenProvider         (AccountDetails.Name, AccountDetails.Key);       // Create a URI for the serivce bus.     Uri serviceBusUri = ServiceBusEnvironment.CreateServiceUri         ("sb", AccountDetails.Namespace, string.Empty);       // Create the MessagingFactory     MessagingFactory factory = MessagingFactory.Create(serviceBusUri, credentials);       // Use the MessagingFactory to create a queue client     QueueClient queueClient = factory.CreateQueueClient(AccountDetails.QueueName);       // Create a LargeMessageReceiver and receive the message.     
LargeMessageReceiver receiver = new LargeMessageReceiver(queueClient);     BrokeredMessage largeMessage = receiver.Receive();       Console.WriteLine("Received message");     Console.WriteLine("Message body size: " + largeMessage.Size);       string testFile = AccountDetails.TestFile.Replace(@"\In\", @"\Out\");     Console.WriteLine("Saving file: " + testFile);       // Save the message body as a file.     Stream largeMessageStream = largeMessage.GetBody<Stream>();     largeMessageStream.Seek(0, SeekOrigin.Begin);     FileStream fileOut = new FileStream(testFile, FileMode.Create);     largeMessageStream.CopyTo(fileOut);     fileOut.Close();       Console.WriteLine("Done!"); } In order to test the application, the sending application is executed, which will use the LargeMessageSender class to split the message and place it on the queue. The output of the sender console is shown below. The console shows that the body size of the large message was 9,929,365 bytes, and the message was sent as a sequence of 51 sub messages. When the receiving application is executed the results are shown below. The console application shows that the aggregator has received the 51 messages from the message sequence that was creating in the sending application. The messages have been aggregated to form a massage with a body of 9,929,365 bytes, which is the same as the original large message. The message body is then saved as a file. Improvements to the Implementation The splitter and aggregator patterns in this implementation were created in order to show the usage of the patterns in a demo, which they do quite well. When implementing these patterns in a real-world scenario there are a number of improvements that could be made to the design. Copying Message Header Properties When sending a large message using these classes, it would be great if the message header properties in the message that was received were copied from the message that was sent. The sending application may well add information to the message context that will be required in the receiving application. When the sub messages are created in the splitter, the header properties in the first message could be set to the values in the original large message. The aggregator could then used the values from this first sub message to set the properties in the message header of the large message during the aggregation process. Using Asynchronous Methods The current implementation uses the synchronous send and receive methods of the QueueClient class. It would be much more performant to use the asynchronous methods, however doing so may well affect the sequence in which the sub messages are enqueued, which would require the implementation of a resequencer in the aggregator to restore the correct message sequence. Handling Exceptions In order to keep the code readable no exception handling was added to the implementations. In a real-world scenario exceptions should be handled accordingly.
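As a rough sketch of the "Copying Message Header Properties" improvement described above (my own illustration, not part of the guide – the CopyProperties helper is hypothetical), the splitter could stamp the custom properties of the original large message onto the first sub message, and the aggregator could copy them from that first sub message back onto the reassembled message:

using Microsoft.ServiceBus.Messaging;

public static class MessagePropertyCopier
{
    // Copies the user-defined properties from one BrokeredMessage to another.
    public static void CopyProperties(BrokeredMessage source, BrokeredMessage destination)
    {
        foreach (var property in source.Properties)
        {
            destination.Properties[property.Key] = property.Value;
        }
    }
}

In LargeMessageSender.Send() this could be called for the first sub message only (when streamOffest is 0), and in LargeMessageReceiver.Receive() the properties of the first sub message received would be copied onto the aggregated message before it is returned.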

    Read the article

  • CodePlex Daily Summary for Wednesday, January 05, 2011

    CodePlex Daily Summary for Wednesday, January 05, 2011Popular ReleasesBloodSim: BloodSim - 1.3.2.0: - Simulation Log is now automatically disabled and hidden when running 10 or more iterations - Hit and Expertise are now entered by Rating, and include option for a Racial Expertise bonus - Added option for boss to use a periodic magic ability (Dragon Breath) - Added option for boss to periodically Enrage, gaining a Damage/Attack Speed buffTemporary Data Storage Folder: Temporary Data Storage Folder 0.3 Stable: With lots of features and bug fixes we are releasing stable 0.3 version. Just download it and check it out. Added Features: 1. Rename TDS Folder with a name of your choice 2. Control what to do on program termination 3. Move TDS Folder to a new locationASP.NET MVC CMS ( Using CommonLibrary.NET ): CommonLibrary.NET CMS 0.9.5 Alpha: CommonLibrary CMSA simple yet powerful CMS system in ASP.NET MVC 2 using C# 4.0. ActiveRecord based components for Blogs, Widgets, Pages, Parts, Events, Feedback, BlogRolls, Links Includes several widgets ( tag cloud, archives, recent, user cloud, links twitter, blog roll and more ) Built using the http://commonlibrarynet.codeplex.com framework. ( Uses TDD, DDD, Models/Entities, Code Generation ) Can run w/ In-Memory Repositories or Sql Server Database See Documentation tab for Ins...AllNewsManager.NET: AllNewsManager.NET 1.2.1: AllNewsManager.NET 1.2.1 It is a minor update from version 1.2Mini Memory Dump Diagnosis using Asp.Net: MiniDump HealthMonitor: Enable developers to generate mini memory dumps in case of unhandled exceptions or for specific exception scenarios without using any external tools , only using Asp.Net 2.0 and above. Many times in production , QA Servers the issues require post-mortem or low-level system debugging efforts. This Memory dump generator will help in those scenarios.EnhSim: EnhSim 2.2.9 BETA: 2.2.9 BETAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Added in the Gobl...xUnit.net - Unit Testing for .NET: xUnit.net 1.7 Beta: xUnit.net release 1.7 betaBuild #1533 Important notes for Resharper users: Resharper support has been moved to the xUnit.net Contrib project. Important note for TestDriven.net users: If you are having issues running xUnit.net tests in TestDriven.net, especially on 64-bit Windows, we strongly recommend you upgrade to TD.NET version 3.0 or later. 
This release adds the following new features: Added support for ASP.NET MVC 3 Added Assert.Equal(double expected, double actual, int precision)...Json.NET: Json.NET 4.0 Release 1: New feature - Added Windows Phone 7 project New feature - Added dynamic support to LINQ to JSON New feature - Added dynamic support to serializer New feature - Added INotifyCollectionChanged to JContainer in .NET 4 build New feature - Added ReadAsDateTimeOffset to JsonReader New feature - Added ReadAsDecimal to JsonReader New feature - Added covariance to IJEnumerable type parameter New feature - Added XmlSerializer style Specified property support New feature - Added ...StyleCop for ReSharper: StyleCop for ReSharper 5.1.14977.000: Prerequisites: ============== o Visual Studio 2008 / Visual Studio 2010 o ReSharper 5.1.1753.4 o StyleCop 4.4.1.2 Preview This release adds no new features, has bug fixes around performance and unhandled errors reported on YouTrack.DbDocument: DbDoc Initial Version: DbDoc Initial versionASP .NET MVC CMS (Content Management System): Atomic CMS 2.1.2: Atomic CMS 2.1.2 release notes Atomic CMS installation guide Kind Of Magic MSBuild Task: Beta 4: Update to keep up with latest bug fixes. To those who don't like Magic/NoMagic attributes, you may change these names in KindOfMagic.targets file: Change this line: <MagicTask Assembly="@(IntermediateAssembly)" References="@(ReferencePath)"/> to something like this: <MagicTask Assembly="@(IntermediateAssembly)" References="@(ReferencePath)" MagicAttribute="MyMagicAttribute" NoMagicAttribute="MyNoMagicAttribute"/>N2 CMS: 2.1: N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. Major Changes Support for auto-implemented properties ({get;set;}, based on contribution by And Poulsen) All-round improvements and bugfixes File manager improvements (multiple file upload, resize images to fit) New image gallery Infinite scroll paging on news Content templates First time with N2? Try the demo site Download one of the template packs (above) and open the proj...Wii Backup Fusion: Wii Backup Fusion 1.0: - Norwegian translation - French translation - German translation - WBFS dump for analysis - Scalable full HQ cover - Support for log file - Load game images improved - Support for image splitting - Diff for images after transfer - Support for scrubbing modes - Search functionality for log - Recurse depth for Files/Load - Show progress while downloading game cover - Supports more databases for cover download - Game cover loading routines improvedAutoLoL: AutoLoL v1.5.1: Fix: Fixed a bug where pressing Save As would not select the Mastery Directory by default Unexpected errors are now always reported to the user before closing AutoLoL down.* Extracted champion data to Data directory** Added disclaimer to notify users this application has nothing to do with Riot Games Inc. Updated Codeplex image * An error report will be shown to the user which can help the developers to find out what caused the error, this should improve support ** We are working on ...TortoiseHg: TortoiseHg 1.1.8: TortoiseHg 1.1.8 is a minor bug fix release, with minor improvementsBlogEngine.NET: BlogEngine.NET 2.0: Get DotNetBlogEngine for 3 Months Free! Click Here for More Info 3 Months FREE – BlogEngine.NET Hosting – Click Here! If you want to set up and start using BlogEngine.NET right away, you should download the Web project. If you want to extend or modify BlogEngine.NET, you should download the source code. 
If you are upgrading from a previous version of BlogEngine.NET, please take a look at the Upgrading to BlogEngine.NET 2.0 instructions. To get started, be sure to check out our installatio...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.6 Released: Hi, Today we are releasing final version of Visifire, v3.6.6 with the following new feature: * TextDecorations property is implemented in Title for Chart. * TitleTextDecorations property is implemented in Axis. * MinPointHeight property is now applicable for Column and Bar Charts. Also this release includes few bug fixes: * ToolTipText property of DataSeries was not getting applied from Style. * Chart threw exception if IndicatorEnabled property was set to true and Too...StyleCop Compliant Visual Studio Code Snippets: Visual Studio Code Snippets - January 2011: StyleCop Compliant Visual Studio Code Snippets Visual Studio 2010 provides C# developers with 38 code snippets, enhancing developer productivty and increasing the consistency of the code. Within this project the original code snippets have been refactored to provide StyleCop compliant versions of the original code snippets while also adding many new code snippets. Within the January 2011 release you'll find 82 code snippets to make you more productive and the code you write more consistent!...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.2: Version: 2.0.0.2 (Milestone 2): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Remark The sample applications are using Microsoft’s IoC container MEF. However, the WPF Application Framework (WAF) doesn’t force you to use the same IoC container in your application. You can use ...New Projectsbsp4airplay: BSP files support for Airplay SDK. This project target is to support: - Quake 1 - Quake 2 - Quake 3 - HL 1 - HL 2 ClassyBlog - Personal Blogging Engine in ASP.NET MVC: A modern blogging engine for classy people.Cybernux Linux® Bulid Script: Cybernux Linux® Build Script is a set of bash scripts designed for usage within the Cybernux Linux® Project. Doing things like settup a repository, building and rebuilding packages for Cybernux Linux®, which uses Debian and portions of Debian repos combined with the Cybernux onesCybernux Linux® Multimedia Bulid Script: Cybernux Linux® Multia Build Script is a set of bash scripts designed for usage within the Cybernux Linux® Project. Doing things like settup a repository, building and rebuilding Multimedia packages for Cybernux Linux® Multimedia ExceptionMessageBox: ExceptionMessageBox is development of messagebox to show exception detail in your app like .NET unhandled exception dialog. More and better features and view, similar to Microsoft.ExceptionMessageBox from SQL Server, shall be added by contributors. This is developed in C# WPF.FractalZoom: FractalZoom is a fun little application that allows people to explore the fractal landscape of Julia sets: http://en.wikipedia.org/wiki/Julia_set FSharpJump: Visual Studio 2010 Extension for F# source files. Pops up a window in the F# editor which outlines the namespaces, 'modules, types and "let bindings" in the current document. Pressing enter key on the selected list item will jump the caret to that line. 
GoJan: A web application that introduce deep infomations about travelling to japan, using DNN framework.List2Code: List2Code is very Simple Application that take a comma delimited list and runs it trough a template to create code that can be copied and pasted in your application. Its Developed in CSharp using WPF for the GUI.MicroNET Framework UAV: El proyecto UAV se realiza para controlar una avion de radio control a traves de .NET MicroFramework y permitir un vuelo autonomo. Mini Memory Dump Diagnosis using Asp.Net: Enable developers to generate mini memory dumps in case of unhandled exceptions or for specific exception scenarios without using any external tools , only using Asp.Net 2.0 and above. Many times in production , QA Servers the issues require detailed system debugging dumps.MultiConvert: console based utility primary to convert solid edge files into data exchange formats like *.stp and *.dxf. It's using the API of the original software and can't be run alone. The goal of this project is creating routines with desired pre-instructions for batch conversionsPDI 2009: Ferramenta para processamento digital de imagens.RoC: RoC is a mvc 3 razor based cms system. Also RoC use Composition, Policy Injection, EF 4 Model First, Windows Identity Foundation technologies. Now RoC is draft project and it'll be more strong in the futureShot In The Dark: Shot In The Dark is a multiplayer top-down 2D shooter framework developed in Flash/AS3.0, and uses the Nonoba Multiplayer API.SilveR - Online Statistics Application: SilveR is a Silverlight based online statistical analysis application written in c# with a WCF backend exposing an R serviceSimple Scheduler: Simple task scheduling application for windows built on top of Quartz.NETSj's Personal Arena: This project project hosting space is for all my personal projects I have been working on and will be working on. For any further details you can contact me at mukhs18[at]gmail[dot]comskymet: JAlquerqueThe Electric Clam: A web console for PowerShell. Features include: - near real-time web based console - multiple consoles and multiple users - shared or exclusive consoles - latest technology (jQuery, WCF Web Sockets, ASP.Net MVC 3 with Razor) - more...UCI Protocol Sniffer [UCIPlug]: UCIPlug allows dumping the UCI messages exchanged between a UCI compliant GUI (for ex., Chessbase) and an UCI engine (for ex., Rybka 2.2). UCIPlug is written in C# and is targeted at .NET 3.5 (though, can be recompiled for .NET 2.0). Umbraco for Windows Phone: Umbraco for Windows PhoneYetAnotherErp: This is a 2 years of work building yet another erp system that integrates CRM, SCM, complex inventory management, EDI. This software is powered using XAF Application Framework from Developer Express and Expand Framework. YetAnotherErp is a fully integrated solution.Your Store: “Your Store” is beta e-commerce webapplication created using ASP.NET MVC 2

    Read the article

  • Prepping a conference

    - by Laurent Bugnion
    I have had the chance to talk at many conferences these past few years, and came up with a way to prepare them which works really well for me. Most importantly, it would make it quite easy to overcome an emergency (for example if my laptop would suddenly lose data). The whole code as well as the slides and other documents are in the cloud. I also use source control for my demos, so that I always have the latest and the greatest, but also a history of changes I made to my demos. Finally I have a system of code snippets which works great, and I often had very positive remarks from the audience regarding that. Putting everything in the cloud The one thing I used to be the most scared of was a sudden crash of my laptop, and being unable to restore in time for a conference. Most conferences ask speakers to send slides a few days (or weeks…) in advance, but let's face it, we all have last minute changes to our talks and I always come in the conference with updated slides that I pass to the management team. The answer to that dilemma used to be working off memory sticks, and that worked not bad. However last year I started putting all the documents relating to a conference in a DropBox folder, and that works great too. Obviously DropBox works only if you have connectivity, so if I for instance update slides while on an international flight, I cannot save to the cloud. The obvious answer to that is to backup everything on a memory stick… but I have to admit, I have been trusting my luck and working off my laptop HD and then synching everything to the cloud after landing. Of course on some US national flights you get WiFi on board, so in that case it is even simpler :) Usually after the conference is done, I remove the files from DropBox and copy them to their "final destination". They are backed up from there to BackBlaze, the great online backup service I am using routinely (I currently have about 90GB of data in BackBlaze). Outlining the presentations I like to have a written outline of my presentations written somewhere. I keep it simple, just write the various sections of the presentation with timing. I guess it is a remnant of the time when I was a private pilot, and using checklists for flight preparation. For example: Demo about designability 15' (0:37) Switch to Blend Open MainPage.xaml Create a DataTemplate ... Here I can immediately see during the presentation if I am taking too much time for my demo (0:37 is where I need to be when I am done with this section of the presentation, and 15' is the time that this particular section takes). I keep these sections reasonable, I don't detail every step of the preparation. Typically I have one such section for every 10-15 minutes of my talks. Yes, I am timing my presentations. I keep adjusting these numbers when I rehearse, and this really helps to feel more confident during the presentations. This is especially important for presentations that are long, like my MIX11 demo which clocked at 57 minutes (I had a lot of stuff to show…). Such presentations are risky, because if anything goes wrong, you will have to cut stuff, so the answer to that is: Rehearse, rehearse and when you're done rehearsing, rehearse a little more. I also have a "Preparation" section where I outline what I need to do before a presentation. For instance: Preparation Reboot in VHD Make sure MSN and Twitter are not running. Open VS10 and load demo Open Blend and load demo Run the WP7 emulator ... 
I typically start preparing my laptop an hour before the talk, starting everything I need to start and then putting my laptop to sleep. Saving and printing the outline, Timing Printing is a real problem because it is really hard to find a printer at most conference venues, and also quite hard in hotels. To solve that, I simply write everything in OneNote (synched to the cloud, now you start to know what I like ;) and then I print it to a PDF (I use CutePDFWriter) that I save to my Kindle. During the presentation, I read the outline off the Kindle (I mostly just need a quick check to see how I am timing). For timing during the presentation, I use the free tool ChronoGPS on my Windows Phone 7, but of course any phone these days has a clock/chrono application. In some conferences, they even have timers that the presenters can see, but they tend to count down and I prefer to count up… so I just use my own :) Source control for demos For demos, I create a separate folder and use Mercurial as source control. Mercurial has the huge advantage (over SVN or TFS) to work offline too, so I can commit while on a plane, and all the history is saved. Then when I have connectivity I push everything to the cloud (I am using the fantastic Trunksapp.com for my private repositories). Here too the obvious downside is the risk of losing my last changes if my laptop crashes before I can push to the cloud, and here too the obvious answer would be to work from a memory stick… though I have to admit I didn't do that lately (except when I was writing Silverlight 4 Unleashed, where I was really paranoid…) And code snippets? I am one of these presenters who hates to type in front of an audience. I can type really fast (writing two books has this advantage, it really teaches you to touch type and be fast at it) but in the context of an audience, on a stage where it is often damn cold (an issue I had a lot in past conferences, air conditioning can freeze your fingers and make it really hard to type), it doesn't work as well. I don't know for you, but I really dislike seeing a presentation where the speaker uses the backspace key more often than others ;) To solve that, I like to have my code ready in snippets, and drag them to the screen. Then I can spend time explaining each code snippet, while highlighting portions of the code (always highlight what you talk about, the audience often doesn't even see the cursor and doesn't know where you are on the screen!) Over the years I have used various solutions for code snippets, and now I have one which works really well… if you take a few precautions! I use the Visual Studio Toolbox. Preparing the code snippets You can store code snippets in the Toolbox for anything, XAML, C# etc. I arrange the snippets in the order in which I need them, which is a great way to remember what comes next in the presentation. I also separate them by topic, to make it easier to find them, for example when I switch to the slides and then back to the code. Remember that no matter how experienced you are, you will feel more nervous on stage than while you are preparing, so any way to make it easier for you is going to be beneficial to the audience. To store a code snippet, I do the following: Open the final demo that you want to show to the audience in Visual Studio. In your code, select a snippet of code that you want to explain in particular. Make sure that the Visual Studio Toolbox is open (menu View, Toolbox or Ctrl-Alt-X). Drag the selected snippet from the code window to the toolbox. 
(if needed) drag the snippet to the correct location (for example between two other code snippets so that you can access it as you speak through the demo). Right click on the snippet and select Rename Item from the context menu. Select a meaningful name. For me I use the following conventions: If it is a method, I use the method's name. If it is not a whole method, I use a descriptive name. If it is the content of a method (i.e. the body only, without the method's signature), I use "-> MethodName". This reminds me during the presentation that this is only the body, and that I need to insert that into an existing signature. This is the case, for instance, when I use Visual Studio to automatically generate the members of an interface’s implementation; then I only need to insert my snippet inside the generated method body. Saving the snippets This is the most important!! It happened to me a few times that VS10 lost its settings. When that happens, the snippets are lost too! Yeah that really sucks, especially (as it happened once) when this is the case about an hour before a talk… Stress and sweat follows, not good conditions to start a talk in front of an audience believe me. Thankfully, saving snippets is really easy with the following steps: Select the menu Tools, Import and Export Settings. Select Export selected environment settings and press Next. Uncheck All Settings. Then expand General Settings and select Toolbox (only!). Press Next. Select your source control folder and save under a meaningful name (for instance Snippets.vssettings). Commit to source control and push to the cloud. By the way, this also has the advantage of applying source control to the snippets file (which is an XML file), so you get history for free on that file! Reimporting the snippets If VS loses its settings and you need to reimport the snippets, this can be done super easily and very fast. Make sure that the Toolbox is empty. When you import snippets, they are merged with existing ones, they do not replace the content of the Toolbox. Unless merging is really what you want, make sure that your Toolbox is clean before you import, it is really easier. Select the menu Tools, Import and Export Settings. Select Import selected environment settings and press Next. Select No, just import new settings and press Next. Press Browse and select the Snippets.vssettings file. Press Finish. Et voila, all your snippets appear again in the Toolbox. Whew, the worst was averted and you can start your demo without sweating! (I had to do that once literally 5 minutes before the start of a demo, while my laptop was already hooked to the projector, and it went just fine). What about special tools? When using special tools (for example beta versions of tools you have an early access to), or a special configuration of your laptop, things can get tricky because you cannot really be sure that you will get a laptop with the same tools and the same configuration at the conference. To solve that, I use the following precautions: I make my demos from a Virtual Hard Disk. The great John Papa made a very easy-to-follow web page where he explains how to create a VHD and install Win7 to it. This gives you the full power of your laptop (as fast as booting from the metal). For me, I have a basic configuration that I saved on a USB harddrive (Win7 plus drivers, basic settings for desktop, folder options, taskbar etc) and Visual Studio 2010 SP1 on it. When preparing, I start by copying this "basis VHD" to my laptop. 
Then I install the additional tools and configurations, and save the VHD back to the USB hard drive in a different folder. This allows me to reinstall my demo environment quite fast, for example in case of hard drive failure: replace the drive, copy the VHD to it, configure the BCD, and you can start. Unfortunately this only works if the laptop itself still works. For the worst case of total failure, my safety net is to back up all the installers: the installers I use are synched across all my laptops and backed up to BackBlaze. If the worst happens and my laptop is completely broken, I can download the installers from BackBlaze and install on another laptop. This of course takes some time, and if it happens 5 minutes before a presentation, well… I don't have an answer to that, except crossing my fingers. Still, all of that gives me additional security.

Conclusion

Remember folks, talking to an audience, large or small, will make you nervous. Just ask Scott Hanselman :) The goal here is to create the best possible conditions for yourself, and to create an environment where everything is saved and easy to restore, where everything is well known and gives you additional confidence. The cooler you feel before the presentation (and during ;)), the better your presentation will be. Here too, the goal is to provide the best user experience you can, which in turn will make it more enjoyable for your audience! Happy presenting :)

Laurent

Laurent Bugnion (GalaSoft) Subscribe | Twitter | Facebook | Flickr | LinkedIn
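To make the "-> MethodName" snippet convention above a bit more concrete, here is a minimal, hypothetical C# sketch (the IGreeter interface and all names are invented for illustration and do not come from the original demos): the Toolbox item holds only the method body, which replaces the stub that Visual Studio generates when implementing the interface.

// Hypothetical demo interface, used only to illustrate the naming convention.
public interface IGreeter
{
    string Greet(string name);
}

public class ConsoleGreeter : IGreeter
{
    // Visual Studio's "Implement interface" command generates a stub whose body
    // is just: throw new NotImplementedException();
    // The Toolbox snippet named "-> Greet" contains only the two lines below,
    // which are dragged into that generated body during the demo.
    public string Greet(string name)
    {
        var message = "Hello, " + name + "!";
        return message;
    }
}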

    Read the article

  • obiee memory usage

    - by user554629
Heap memory is a frequent customer topic. Here's the quick refresher, oriented towards AIX, but the principles apply to other unix implementations.

1. 32-bit processes have a maximum addressability of 4GB, with a usable application heap size of 2-3GB. On AIX it is controlled by an environment variable:
   export LDR_CNTRL=....=MAXDATA=0x080000000   # 2GB (the leading zero is deliberate, not required)
   1a. It is possible to get a 3.25GB heap size for a 32-bit process using @DSA (Discontiguous Segment Allocation):
       export LDR_CNTRL=MAXDATA=0xd0000000@DSA  # 3.25GB, 32-bit only
       One side effect of using AIX segments "c" and "d" is that shared libraries will be loaded privately, and not shared. If you need the additional heap space, this is worth the trade-off. This option is frequently used for 32-bit Java.
   1b. 64-bit processes have no need for the @DSA option.

2. 64-bit processes can double the 32-bit heap size to 4GB using:
   export LDR_CNTRL=....=MAXDATA=0x100000000  # 1 with 8 zeros
   2a. But this setting would place the same memory limitations on obiee as a 32-bit process.
   2b. The major benefit of 64-bit is to break the binds of 32-bit addressing. At a minimum, use 8GB:
       export LDR_CNTRL=....=MAXDATA=0x200000000  # 2 with 8 zeros
   2c. Many large customers provide extra safety to their servers by using 16GB:
       export LDR_CNTRL=....=MAXDATA=0x400000000  # 4 with 8 zeros
   There is no performance penalty for providing virtual memory allocations larger than required by the application. If the server only uses 2GB of space in 64-bit, specifying 16GB just provides an upper-bound cushion: when an unexpected user query causes a sudden memory surge, the extra memory keeps the server running.

3. The next benefit of 64-bit is that you can provide huge thread stack sizes for strange queries that might otherwise crash the server. nqsserver uses fast recursive algorithms to traverse complicated control structures, which means lots of thread space to hold the stack frames.
   3a. Stack frames mostly contain register values; 64-bit registers are twice as large as 32-bit ones. At a minimum you should quadruple the size of the server thread stacks in NQSConfig.INI when migrating from 32- to 64-bit, to prevent a rogue query from crashing the server. Allocate more than is normally necessary, for safety.
   3b. There is no penalty for allocating more stack size than you need; it is just virtual memory, and no real resources are consumed until the extra space is needed.
   3c. Increasing thread stack sizes may require the process heap size (MAXDATA) to be increased. Heap space is used for dynamic memory requests and for thread stacks. There is no performance penalty to running with large heap and thread stack sizes. In a 32-bit world, this safety would require careful planning to avoid exceeding the 2GB of usable storage.
   3d. Increasing the number of threads may also require additional heap storage. Most thread stack frames on obiee are allocated when the server is started, and the real memory usage increases as threads run work.

Does 2.8GB sound like a lot of memory for an AIX application server? I guess it is, compared to what you are accustomed to seeing from "grandpa's applications".
- One of the primary design goals of obiee is to trade memory for services (db, query caches, etc).
- 2.8GB is still well under the 4GB heap size allocated with MAXDATA=0x100000000.
- A 2.8GB process size is also possible even for 32-bit Windows applications.
- It is not unusual to receive a sudden request for 30MB of contiguous storage on obiee. This is not a memory leak; eventually the nqsserver storage will stabilize, but it may take days to do so.

vmstat is the tool of choice to observe memory usage. On AIX, vmstat will show something that may be startling to some people: the available free memory (the 2nd memory column) is always trending toward zero ... no available free memory. Some customers have concluded that "nearly zero memory free" means it is time to upgrade the server with more real memory. After the upgrade, the server again shows very little free memory available. Should you be concerned about this? Many customers are!! Here is what is happening:

- AIX filesystems are built on a paging model. If you read/write a filesystem block, it is paged into memory (no read/write system calls).
- This filesystem "page" has its own "backing store" on disk: the original filesystem block. When the system needs the real memory page holding the file block, there is no need to "page out". The page can be stolen immediately, because the original is still on disk in the filesystem.
- The filesystem pages tend to collect; every filesystem block that was ever seen since system boot is available in memory. If another application needs the file block, it is retrieved with no physical I/O.

What happens if the system does need the memory ... to satisfy a 30MB heap request by nqsserver, for example? Since the filesystem blocks have their own backing store (not on a paging device), the kernel can just steal any filesystem block, on a least-recently-used basis, to satisfy a new real memory request for "computation pages". No cause for alarm: vmstat is accurately displaying that all the filesystem blocks that have been touched now reside in memory.

Back to nqsserver: when should you be worried about its memory footprint? Answer: almost never. Stop monitoring it ... stop fussing over it ... stop trying to optimize it. This is a production application, and nqsserver uses the memory it requires to accomplish the job, based on demand. C'mon ... never worry? I'm from New York ... worry is what we do best. Ok, here is the metric you should be watching, using vmstat: are you paging? There are several columns of vmstat output:

bash-2.04$ vmstat 3 3

System configuration: lcpu=4 mem=4096MB

kthr    memory              page                     faults        cpu
----- ------------ ------------------------ ------------ -----------
 r  b    avm   fre  re  pi  po  fr   sr  cy  in   sy  cs us sy id wa
 0  0 208492  2600   0   0   0   0    0   0  13   45  73  0  0 99  0
 0  0 208492  2600   0   0   0   0    0   0   9   12  77  0  0 99  0
 0  0 208492  2600   0   0   0   0    0   0   9   40  86  0  0 99  0

- fre is the "available free memory" indicator that trends toward zero (avm, the first memory column, is the active virtual memory in use).
- re is "re-page": the kernel steals a real memory page from one process and immediately repages it back to the original process.
- pi is "page in": a process memory page previously paged out is paged back in because the process needs it.
- po is "page out": a process memory block was paged out because the memory was needed by some other process.

Light paging activity (re, pi, po) is not a cause for worry.
Processes get started, need some memory, and go away. Sustained paging activity is cause for concern: obiee users are having a terrible day if these counters are always changing.

Hang on ... if nqsserver needs that memory and I reduce MAXDATA to keep the process under control, won't the nqsserver process crash when the memory is needed? Yes, it will. It means that nqsserver is configured to require too much memory, and there are lots of options to reduce the real memory requirement:
- number of threads
- size of the query cache
- size of the sort

But I need nqsserver to keep running. Then real memory is over-committed. Many things can cause this:
- Running all application processes on a single server ... DB server, web servers, WebLogic/WebSphere, sawserver, nqsserver, etc. You could move some of those to another host machine and communicate over the network; the need for real memory doesn't go away, it's just distributed to other host machines.
- The AIX LPAR is configured with too little memory. The AIX admin needs to provide more real memory to the LPAR running obiee.
- More memory for this LPAR affects other partitions. Then it's time to visit your friendly IBM rep and buy more memory.

    Read the article

  • Developing custom MBeans to manage J2EE Applications (Part III)

    - by philippe Le Mouel
This is the third and final part in a series of blog posts that demonstrates how to add management capability to your own application using JMX MBeans.

In Part I we saw:
- How to implement a custom MBean to manage configuration associated with an application.
- How to package the resulting code and configuration as part of the application's ear file.
- How to register MBeans upon application startup, and unregister them upon application stop (or undeployment).
- How to use generic JMX clients such as JConsole to browse and edit our application's MBean.

In Part II we saw:
- How to add localized descriptions to our MBean, MBean attributes, MBean operations and MBean operation parameters.
- How to specify meaningful names for our MBean operation parameters.

We also touched on future enhancements that will simplify how we can implement localized MBeans. In this third and last part, we will re-write our MBean to simplify how we added localized descriptions. To do so we will take advantage of the functionality we already described in Part II and that is now part of WebLogic 10.3.3.0. We will show how to take advantage of WebLogic's localization support to localize our MBeans based on the client's Locale, independently of the server's Locale: each client will see MBean descriptions localized based on his/her own Locale. We will show how to achieve this using JConsole, and also using a sample programmatic JMX Java client. The complete code sample and associated build files for Part III are available as a zip file. The code has been tested against WebLogic Server 10.3.3.0 and JDK 6. To build and deploy our sample application, please follow the instructions provided in Part I, as they also apply to Part III's code and associated zip file.

Providing custom descriptions, take II

In Part II we localized our MBean descriptions by extending the StandardMBean class and overriding its many getDescription methods. WebLogic 10.3.3.0, similarly to JDK 7, can automatically localize MBean descriptions as long as those are specified according to the following conventions. Description resource bundle keys are named as follows:
- MBean description: <MBeanInterfaceClass>.mbean
- MBean attribute description: <MBeanInterfaceClass>.attribute.<AttributeName>
- MBean operation description: <MBeanInterfaceClass>.operation.<OperationName>
- MBean operation parameter description: <MBeanInterfaceClass>.operation.<OperationName>.<ParameterName>
- MBean constructor description: <MBeanInterfaceClass>.constructor.<ConstructorName>
- MBean constructor parameter description: <MBeanInterfaceClass>.constructor.<ConstructorName>.<ParameterName>

We also purposely named our resource bundle class MBeanDescriptions and included it as part of the same package as our MBean.
We already followed the above conventions when creating our resource bundle in part II, and our default resource bundle class with English descriptions looks like: package blog.wls.jmx.appmbean; import java.util.ListResourceBundle; public class MBeanDescriptions extends ListResourceBundle { protected Object[][] getContents() { return new Object[][] { {"PropertyConfigMXBean.mbean", "MBean used to manage persistent application properties"}, {"PropertyConfigMXBean.attribute.Properties", "Properties associated with the running application"}, {"PropertyConfigMXBean.operation.setProperty", "Create a new property, or change the value of an existing property"}, {"PropertyConfigMXBean.operation.setProperty.key", "Name that identify the property to set."}, {"PropertyConfigMXBean.operation.setProperty.value", "Value for the property being set"}, {"PropertyConfigMXBean.operation.getProperty", "Get the value for an existing property"}, {"PropertyConfigMXBean.operation.getProperty.key", "Name that identify the property to be retrieved"} }; } } We have now also added a resource bundle with French localized descriptions: package blog.wls.jmx.appmbean; import java.util.ListResourceBundle; public class MBeanDescriptions_fr extends ListResourceBundle { protected Object[][] getContents() { return new Object[][] { {"PropertyConfigMXBean.mbean", "Manage proprietes sauvegarde dans un fichier disque."}, {"PropertyConfigMXBean.attribute.Properties", "Proprietes associee avec l'application en cour d'execution"}, {"PropertyConfigMXBean.operation.setProperty", "Construit une nouvelle proprietee, ou change la valeur d'une proprietee existante."}, {"PropertyConfigMXBean.operation.setProperty.key", "Nom de la propriete dont la valeur est change."}, {"PropertyConfigMXBean.operation.setProperty.value", "Nouvelle valeur"}, {"PropertyConfigMXBean.operation.getProperty", "Retourne la valeur d'une propriete existante."}, {"PropertyConfigMXBean.operation.getProperty.key", "Nom de la propriete a retrouver."} }; } } So now we can just remove the many getDescriptions methods from our MBean code, and have a much cleaner: package blog.wls.jmx.appmbean; import java.io.IOException; import java.io.InputStream; import java.io.OutputStream; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.File; import java.net.URL; import java.util.Map; import java.util.HashMap; import java.util.Properties; import javax.management.MBeanServer; import javax.management.ObjectName; import javax.management.MBeanRegistration; import javax.management.StandardMBean; import javax.management.MBeanOperationInfo; import javax.management.MBeanParameterInfo; public class PropertyConfig extends StandardMBean implements PropertyConfigMXBean, MBeanRegistration { private String relativePath_ = null; private Properties props_ = null; private File resource_ = null; private static Map operationsParamNames_ = null; static { operationsParamNames_ = new HashMap(); operationsParamNames_.put("setProperty", new String[] {"key", "value"}); operationsParamNames_.put("getProperty", new String[] {"key"}); } public PropertyConfig(String relativePath) throws Exception { super(PropertyConfigMXBean.class , true); props_ = new Properties(); relativePath_ = relativePath; } public String setProperty(String key, String value) throws IOException { String oldValue = null; if (value == null) { oldValue = String.class.cast(props_.remove(key)); } else { oldValue = String.class.cast(props_.setProperty(key, value)); } save(); return oldValue; } public String 
getProperty(String key) { return props_.getProperty(key); } public Map getProperties() { return (Map) props_; } private void load() throws IOException { InputStream is = new FileInputStream(resource_); try { props_.load(is); } finally { is.close(); } } private void save() throws IOException { OutputStream os = new FileOutputStream(resource_); try { props_.store(os, null); } finally { os.close(); } } public ObjectName preRegister(MBeanServer server, ObjectName name) throws Exception { // MBean must be registered from an application thread // to have access to the application ClassLoader ClassLoader cl = Thread.currentThread().getContextClassLoader(); URL resourceUrl = cl.getResource(relativePath_); resource_ = new File(resourceUrl.toURI()); load(); return name; } public void postRegister(Boolean registrationDone) { } public void preDeregister() throws Exception {} public void postDeregister() {} protected String getParameterName(MBeanOperationInfo op, MBeanParameterInfo param, int sequence) { return operationsParamNames_.get(op.getName())[sequence]; } } The only reason we are still extending the StandardMBean class, is to override the default values for our operations parameters name. If this isn't a concern, then one could just write the following code: package blog.wls.jmx.appmbean; import java.io.IOException; import java.io.InputStream; import java.io.OutputStream; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.File; import java.net.URL; import java.util.Properties; import javax.management.MBeanServer; import javax.management.ObjectName; import javax.management.MBeanRegistration; import javax.management.StandardMBean; import javax.management.MBeanOperationInfo; import javax.management.MBeanParameterInfo; public class PropertyConfig implements PropertyConfigMXBean, MBeanRegistration { private String relativePath_ = null; private Properties props_ = null; private File resource_ = null; public PropertyConfig(String relativePath) throws Exception { props_ = new Properties(); relativePath_ = relativePath; } public String setProperty(String key, String value) throws IOException { String oldValue = null; if (value == null) { oldValue = String.class.cast(props_.remove(key)); } else { oldValue = String.class.cast(props_.setProperty(key, value)); } save(); return oldValue; } public String getProperty(String key) { return props_.getProperty(key); } public Map getProperties() { return (Map) props_; } private void load() throws IOException { InputStream is = new FileInputStream(resource_); try { props_.load(is); } finally { is.close(); } } private void save() throws IOException { OutputStream os = new FileOutputStream(resource_); try { props_.store(os, null); } finally { os.close(); } } public ObjectName preRegister(MBeanServer server, ObjectName name) throws Exception { // MBean must be registered from an application thread // to have access to the application ClassLoader ClassLoader cl = Thread.currentThread().getContextClassLoader(); URL resourceUrl = cl.getResource(relativePath_); resource_ = new File(resourceUrl.toURI()); load(); return name; } public void postRegister(Boolean registrationDone) { } public void preDeregister() throws Exception {} public void postDeregister() {} } Note: The above would also require changing the operations parameters name in the resource bundle classes. 
For instance: PropertyConfigMXBean.operation.setProperty.key would become: PropertyConfigMXBean.operation.setProperty.p0 Client based localization When accessing our MBean using JConsole started with the following command line: jconsole -J-Djava.class.path=$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/tools.jar: $WL_HOME/server/lib/wljmxclient.jar -J-Djmx.remote.protocol.provider.pkgs=weblogic.management.remote -debug We see that our MBean descriptions are localized according to the WebLogic's server Locale. English in this case: Note: Consult Part I for information on how to use JConsole to browse/edit our MBean. Now if we specify the client's Locale as part of the JConsole command line as follow: jconsole -J-Djava.class.path=$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/tools.jar: $WL_HOME/server/lib/wljmxclient.jar -J-Djmx.remote.protocol.provider.pkgs=weblogic.management.remote -J-Dweblogic.management.remote.locale=fr-FR -debug We see that our MBean descriptions are now localized according to the specified client's Locale. French in this case: We use the weblogic.management.remote.locale system property to specify the Locale that should be associated with the cient's JMX connections. The value is composed of the client's language code and its country code separated by the - character. The country code is not required, and can be omitted. For instance: -Dweblogic.management.remote.locale=fr We can also specify the client's Locale using a programmatic client as demonstrated below: package blog.wls.jmx.appmbean.client; import javax.management.MBeanServerConnection; import javax.management.ObjectName; import javax.management.MBeanInfo; import javax.management.remote.JMXConnector; import javax.management.remote.JMXServiceURL; import javax.management.remote.JMXConnectorFactory; import java.util.Hashtable; import java.util.Set; import java.util.Locale; public class JMXClient { public static void main(String[] args) throws Exception { JMXConnector jmxCon = null; try { JMXServiceURL serviceUrl = new JMXServiceURL( "service:jmx:iiop://127.0.0.1:7001/jndi/weblogic.management.mbeanservers.runtime"); System.out.println("Connecting to: " + serviceUrl); // properties associated with the connection Hashtable env = new Hashtable(); env.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES, "weblogic.management.remote"); String[] credentials = new String[2]; credentials[0] = "weblogic"; credentials[1] = "weblogic"; env.put(JMXConnector.CREDENTIALS, credentials); // specifies the client's Locale env.put("weblogic.management.remote.locale", Locale.FRENCH); jmxCon = JMXConnectorFactory.newJMXConnector(serviceUrl, env); jmxCon.connect(); MBeanServerConnection con = jmxCon.getMBeanServerConnection(); Set mbeans = con.queryNames( new ObjectName( "blog.wls.jmx.appmbean:name=myAppProperties,type=PropertyConfig,*"), null); for (ObjectName mbeanName : mbeans) { System.out.println("\n\nMBEAN: " + mbeanName); MBeanInfo minfo = con.getMBeanInfo(mbeanName); System.out.println("MBean Description: "+minfo.getDescription()); System.out.println("\n"); } } finally { // release the connection if (jmxCon != null) jmxCon.close(); } } } The above client code is part of the zip file associated with this blog, and can be run using the provided client.sh script. 
The resulting output is shown below:

$ ./client.sh
Connecting to: service:jmx:iiop://127.0.0.1:7001/jndi/weblogic.management.mbeanservers.runtime

MBEAN: blog.wls.jmx.appmbean:type=PropertyConfig,name=myAppProperties
MBean Description: Manage proprietes sauvegarde dans un fichier disque.
$

Miscellaneous

Using the Description annotation to specify MBean descriptions

Earlier we saw how to name our MBean description resource keys so that WebLogic 10.3.3.0 automatically uses them to localize our MBean. In some cases we might want to explicitly specify the resource key and resource bundle, for instance when operations are overloaded and the operation name is no longer sufficient to uniquely identify a single operation. In this case we can use the Description annotation provided by WebLogic as follows:

import weblogic.management.utils.Description;

@Description(resourceKey="myapp.resources.TestMXBean.description",
             resourceBundleBaseName="myapp.resources.MBeanResources")
public interface TestMXBean {

    @Description(resourceKey="myapp.resources.TestMXBean.threshold.description",
                 resourceBundleBaseName="myapp.resources.MBeanResources")
    public int getthreshold();

    @Description(resourceKey="myapp.resources.TestMXBean.reset.description",
                 resourceBundleBaseName="myapp.resources.MBeanResources")
    public int reset(
        @Description(resourceKey="myapp.resources.TestMXBean.reset.id.description",
                     resourceBundleBaseName="myapp.resources.MBeanResources",
                     displayNameKey="myapp.resources.TestMXBean.reset.id.displayName.description")
        int id);
}

The Description annotation should be applied to the MBean interface. It can be used to specify MBean, MBean attribute, MBean operation, and MBean operation parameter descriptions, as demonstrated above.

Retrieving the Locale associated with a JMX operation from the MBean code

There are several cases where it is necessary to retrieve the Locale associated with a JMX call from the MBean implementation; for instance, this can be useful when localizing exception messages. This can be done as follows:

import weblogic.management.mbeanservers.JMXContextUtil;
......
// some MBean method implementation
public String setProperty(String key, String value) throws IOException {
    Locale callersLocale = JMXContextUtil.getLocale();
    // use callersLocale to localize Exception messages or
    // potentially some return values such as a Date
    ....
}

Conclusion

With this last part we conclude our three-part series on how to write MBeans to manage J2EE applications. We are far from having exhausted this particular topic, but we have come a long way and are now able to take advantage of the latest functionality provided by WebLogic's application server to write user-friendly MBeans.

    Read the article

  • CodePlex Daily Summary for Friday, October 05, 2012

    CodePlex Daily Summary for Friday, October 05, 2012Popular ReleasesConfiguration Manager 2012 Automation: Beta Code (v0.1): Beta codefastJSON: v2.0.7: 2.0.7 - bug fix missing comma with single property and extension enabledWinRT XAML Toolkit: WinRT XAML Toolkit - 1.3.2: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...Snoop, the WPF Spy Utility: Snoop 2.8.0: Snoop 2.8.0Announcing Snoop 2.8.0! It's been exactly six months since the last release, and this one has a bunch of goodies in it. In particular, there is now a PowerShell scripting tab, compliments of Bailey Ling. With this tab, the possibilities are limitless. It basically lets you automate/script the application that you are Snooping. Bailey has a couple blog posts (one and two) on his tab already, and I am sure more is to come. Please note that if you do not have PowerShell installed, y....NET Micro Framework: .NET MF 4.3 (Beta): This is the 4.3 Beta version of the .NET Micro Framework. Feature List for v4.3 Support for Visual Studio 2012 (including the Windows Desktop Express version) All v4.2 QFEs features and bug fixes (PWM enhancements, lwIP and network driver reliability improvements, Analog Output, WinUSB and latest GCC support) Improved diagnostic information for deployment Decreased boot time Bug fixes Work Item 1736 - Create link for MFDeploy under start menu Work Item 1504 - Customizing lwIP o...MCEBuddy 2.x: MCEBuddy 2.3.1: 2.3.1All new Remote Client Server architecture. Reccomended Download. The Remote Client Installation is OPTIONAL, you can extract the files from the zip archive into a local folder and run MCEBuddy.GUI directly. 2.2.15 was the last standalone release. Changelog for 2.3.1 (32bit and 64bit) 1. All remote MCEBuddy Client Server architecture (GUI runs remotely/independently from engine now) 2. Fixed bug in Audio Offset 3. Added support for remote MediaInfo (right click on file in queue to get ...D3 Loot Tracker: 1.5: Support for session upload to website. Support for theme change through general settings. Time played counter will now also display a count for days. Tome of secrets are no longer logged as items.NTCPMSG: V1.2.0.0: Allocate an identify cableid for each single connection cable. * Server can asend to specified cableid directly.Team Foundation Server Word Add-in: Version 1.0.12.0622: Welcome to the Visual Studio Team Foundation Server Word Add-in Supported Environments Microsoft Office Word 2007 and 2010 X86 (32-bit) Team Foundation Server 2010 Object Model TFS 2010, 2012 and TFS Service supported, using TFS OM / Explorer 2010. Quality-Bar Details Tool has been reviewed by Visual Studio ALM Rangers Tool has been through an independent technical and quality review All critical bugs have been resolved Known Issues / Bugs WI#43553 - The Acceptance criteria is not pu...Korean String Extension for .NET: ?? ??? ??? ????(v0.2.3.0): ? ?? ?? ?? ???? - string.KExtract() ?? ???? - string.AppendJosa(...) AppendJosa(...)? ?? ???? KAppendJosa(...)? ??? ?????UMD????? 
- PC?: UMDEditor?????V2.7: ??:http://jianyun.org/archives/948.html =============================================================================== UMD??? ???? =============================================================================== 2.7.0 (2012-10-3) ???????“UMD???.exe”??“UMDEditor.exe” ?????????;????????,??????。??????,????! ??64????,??????????????bug ?????????????,???? ???????????????? ???????????????,??????????bug ------------------------------------------------------- ?? reg.bat ????????????。 ????,??????txt/u...SharePoint Column & View Permission: SharePoint Column and View Permission v1.5: Version 1.5 of this project. If you will find any bugs please let me know at enti@zoznam.sk or post your findings in Issue TrackerUntangler: Untangler 1.0.0: Add a missing file from first releaseDirectX Tool Kit: October 2012: October 2, 2012 Added ScreenGrab module Added CreateGeoSphere for drawing a geodesic sphere Put DDSTextureLoader and WICTextureLoader into the DirectX C++ namespace Renamed project files for better naming consistency Updated WICTextureLoader for Windows 8 96bpp floating-point formats Win32 desktop projects updated to use Windows Vista (0x0600) rather than Windows 7 (0x0601) APIs Tweaked SpriteBatch.cpp to workaround ARM NEON compiler codegen bugHome Access Plus+: v8.1: HAP+ Web v8.1.1003.000079318 Fixed: Issue with the Help Desk and updating a ticket as an admin 79319 Fixed: formatting issue with the booking system admin header 79321 Moved to using the arrow with a circle symbol on the homepage instead of the > and < 79541 Added: 480px wide mobile theme to login page 79541 Added: 480px wide mobile theme to home page 79541 Added: slide events for homepage 79553 Fixed: Booking System Multiple Lesson Bug 79553 Fixed: IE Error Message 79684 Fixed: jQuery issue ...CRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.3.1002.3): Visual Ribbon Editor 1.3.1002.3 What's New: Multi-language support for Labels/Tooltips for custom buttons and groups Support for base language other than English (1033) Connect dialog will not require organization name for ADFS / IFD connections Automatic creation of missing labels for all provisioned languages Minor connection issues fixed Notes: Before saving the ribbon to CRM server, editor will check Ribbon XML for any missing <Title> elements inside existing <LocLabel> elements...SubExtractor: Release 1029: Feature: Added option to make i and ¡ characters movie-specific for improved OCR on Spanish subs (Special Characters tab in Options) Feature: Allow switch to Word Spacing dialog directly from Spell Check dialog Fix: Added more default word spacings for accented characters Fix: Changed Word Spacing dialog to show all OCR'd characters in current sub Fix: Removed application focus grab during OCR Fix: Tightened HD subs fuzzy logic to reduce false matches in small characters Fix: Improved Arrow k...WallSwitch: WallSwitch 1.0.6: Version 1.0.6 Changes: Added hotkeys to perform a variety of operations (next/previous image, pause, clear history, etc.) Added color effects for grayscale, sepia and intense color. Various fixes.Readable Passphrase Generator: KeePass Plugin 0.7.1: See the KeePass Plugin Step By Step Guide for instructions on how to install the plugin. Changes Built against KeePass 2.20Windows 8 Toolkit - Charts and More: Beta 1.0: The First Compiled Version of my LibraryNew Projects3DGL: Bot de liga con comandos basicos para un pvpgn, necesita ser refactorizado y pasado en limpio pero esta totalmente funcional. 
Actualmente se usa OmbuServerADCMS: oneAptech.eProject.Batch59B - Online Book Store: eProject of Batch59B (SOFTECH - APTECH)AutoconsultaUB: this is a testAzure Active Directory Expense Demo Application: This application has been written to provide you with a quick and easy way to set up your first application that connects seamlessly to Azure Active DirectoryCentral: This a WPF project that implements PRISM with MEF as the IoC. This is an IDE with a modern UI where developers can run and centralize all their different apps.DataWay Connectors: ?????????? DataWay ??? ?????????? ?????????? ??? ????????? DataWay, ?????????? ?? ?????????????? ? ?????????? ??????-????????DLock - Open source distributed lock: DLock is a open source project based on .net framework that can provide the distributed lock. DoubleKeyDictionary: Project is just a single file, tests to be added as we go.Embedded Excel OLE Orphan Preventer: Prevents orphaned instances of embedded Excel workbooks from handing around even after the parent container document has been closed.File Duplicate Utility: This project will quickly detect duplicate files in a directory, then allow some minimal post processing on those duplicate files (i.e. move them to a new dir).Geoprocessing: A set of tools for processing Sea ice concentrations Google oAuth2 Service Account .NET: Google oAuth2 Service Account .NETHelperStart: NotingiDeal: C# library for connecting with iDealJCI-System: This project doesn't have a summaryLayoutting a long string: Takes a long string and splits it according a displacing enum layoutMicrosoft Dynamics CRM VB.Net SoapLogger: The VB.NET SoapLogger for VB.NET makes it easy to see what the SOAP looks like behind a Dynamics CRM 2011 call. Similiary to the C# version included in the SDKMinecraftRepository: Minecraft data repositoryModBUSLib.NET: TestMood Tracker - Decisions: A application to help you track your mood changes.Relying Party Federation Metadata Editor: This is a federation metadata editor for relying party trust applications. RPs can be created on any platform (as long as it's based on the oasis standart).Rest MVC: Personal project to play around with MVC APISharepoint 2010 Image carousel Sandboxed: This is a Image Slider Sandboxed solution web part for SharePoint 2010. Hope this helps. Cheers!Synapse - Micro Framework for: IoCC, AoP, Messages, Pipe and Filter Pattern: Micro Framework for: Inversion of Control Container, Aspect oriented Programming, Messages Pattern, Pipe and Filter Pattern.testdd10042012tfs01: b testnewgit1004201201: cWikimentation: A very simple Wiki to integrate in existing Asp.Net MVC4-Sites, implemented as an Area. Use it for a simple OnePage-Project-Documentation.Windows 8 AppBox: Este aplicativo vai ajuda-los a começar a desenvolver Aplicativos para a Windows 8 Store, comece desenvolvendo uma App simples, baseada somente em conteúdo :)Windows Phone Stateful Framework: A framework to correctly implement the Windows Phone tombstoned state.

    Read the article

  • CodePlex Daily Summary for Saturday, June 11, 2011

    CodePlex Daily Summary for Saturday, June 11, 2011Popular ReleasesSimplePlanner: v2.0b: For 2011-2012 Sem 1 ???2011-2012 ????Econ NetVert: 2.4.2.14 Production: Many thanks to paulhickman for testing and reporting issues. - this release has lots of bugfixes in codeconversion when using NRefactory 4.0 (local) and VisualStudio Projects conversionVisual Studio 2010 Help Downloader: 1.0.0.3: Domain name support for proxy Cleanup old packages bug Writing to EventLog with UAC enabled bug Small fixes & RefactoringMedia Companion: MC 3.406b weekly: With this version change a movie rebuild is required when first run -else MC will lock up on exit. Extract the entire archive to a folder which has user access rights, eg desktop, documents etc. Refer to the documentation on this site for the Installation & Setup Guide Important! If you find MC not displaying movie data properly, please try a 'movie rebuild' to reload the data from the nfo's into MC's cache. Fixes Movies Readded movie preference to rename invalid or scene nfo's to info ext...SCCM Client Actions Tool: SCCM Client Actions Tool v0.5.1: SCCM Client Actions Tool v0.5.1 is currently the most stable version and includes all of the functionality requested so far. It comes with following changes since last version: Fixed an incorrect path to x64 client setup folder. It comes as a ZIP file that contains three files: ClientActionsTool.hta – The tool itself. Cmdkey.exe – command line tool for managing cached credentials. This is needed for alternate credentials feature when running the HTA on Windows XP. Cmdkey.exe is natively a...Windows Azure VM Assistant: AzureVMAssist V1.0.0.5: AzureVMAssist V1.0.0.5 (Debug) - Test Release VersionSizeOnDisk: 1.8.0.3: Fix (issue 317): Main window icon loading error on windows server 2003 32bit runing on x86 cpu. Bypass of the Microsoft windows comexception.NetOffice - The easiest way to use Office in .NET: NetOffice Release 0.9: Changes: - fix examples (include issue 16026) - add new examples - 32Bit/64Bit Walkthrough is now available in technical Documentation. Includes: - Runtime Binaries and Source Code for .NET Framework:......v2.0, v3.0, v3.5, v4.0 - Tutorials in C# and VB.Net:..............................................................COM Proxy Management, Events, etc. - Examples in C# and VB.Net:............................................................Excel, Word, Outlook, PowerPoint, Access - COMAddi...Reusable Library: V1.1.3: A collection of reusable abstractions for enterprise application developerClosedXML - The easy way to OpenXML: ClosedXML 0.54.0: New on this release: 1) Mayor performance improvements. 2) AdjustToContents now take into account the text rotation. 3) Fixed issues 6782, 6784, 6788HTML-IDEx: HTML-IDEx .15 ALPHA: This release fixes line counting a little bit and adds the masshighlight() sub, which highlights pasted and inserted code.AutoLoL: AutoLoL v2.0.3: - Improved summoner spells are now displayed - Fixed some of the startup errors people got - Double clicking an item selects it - Some usability changes that make using AutoLoL just a little easier - Bug fixes AutoLoL v2 is not an update, but an entirely new version! Please install to a different directory than AutoLoL v1Host Profiles: Host Profiles 1.0: Host Profiles 1.0 Release Quickly modify host file Automatically flush dnsVidCoder: 0.9.2: Updated to HandBrake 4024svn. This fixes problems with mpeg2 sources: corrupted previews, incorrect progress indicators and encodes that incorrectly report as failed. 
Fixed a problem that prevented target sizes above 2048 MB.SharePoint Search XSL Samples: SharePoint 2010 Samples: I have updated some of the samples from the 2007 release. These all work in SharePoint 2010. I removed the Pivot on File Extension because SharePoint 2010 search has refiners that perform the same function.AcDown????? - Anime&Comic Downloader: AcDown????? v3.0 Beta5: ??AcDown?????????????,??????????????,????、????。?????Acfun????? ????32??64? Windows XP/Vista/7 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??v3.0 Beta5 ?????????? ???? ?? ???????? ???"????????"?? ????????????? ????????/???? ?? ???"????"??? ?? ??????????? ?? ?? ??????????? ?? ?????????????????? ??????????????????? ???????????????? ????????????Discussions???????? ????AcDown??????????????VFPX: GoFish 4 Beta 1: Current beta is Build 144 (released 2011-06-07 ) See the GoFish4 info page for details and video link: http://vfpx.codeplex.com/wikipage?title=GoFishShowUI: Write-UI -in PowerShell: ShowUI: ShowUI is a PowerShell module to help you write rich user interfaces in script.SharePoint 2010 FBA Pack: SharePoint 2010 FBA Pack 1.0.3: Fixed User Management screen when "RequiresQuestionAndAnswer" set to true Reply to Email Address can now be customized User Management page now only displays users that reside in the membership database Web parts have been changed to inherit from System.Web.UI.WebControls.WebParts.WebPart, so that they will display on anonymous application pages For installation and configuration steps see here.TerrariViewer: TerrariViewer v2.5: Added new items associated with Terraria v1.0.3 to the character editor. Fixed multiple bugs with Piggy Bank EditoryNew Projects.NET MVC Base by OST: The OST .NET MVC project is currently in infancy but contains html helper extensions, model binders and other extensions to the mvc 3 framework. It also contains a separate project for easily consuming JQGrid requests and return the correct JSON to the jQuery plugin.Asterisk AMI Proxy suite: The AMI Proxy suite project will bridge the gap between your asterisk phone system and your windows based applications. The AMI Proxy is a service that can be used to simply proxy AMI connections to your asterisk server but also adds call stat generation and line monitors.Augusta Developers Guild: The web site for the user group Augusta Developers GuildAzure Blob Pusher: File uploader allows bulk upload of files from client to Azure blob storage. FileSystemWatcher notices new files in configured directory. SQL Express database tracks files as they are pushed up to blob storage and then processed. Local Directory is cleaned up after acknowledgement.Basic Class Library: A collection of functions created to eased up the programmers life in order to solved basic task like validation, AmountTowords, Encrypt & Decryption it also has capability to Connect and retrieve sql data just in one line of code, and some other Basic but very Important task.Basic RTS Game using RolePlayingGame Tool kit based XNA: this is a basic RTS Game. i can make it easy by using RolePlayingGame Took Kit based XNA.Batch Scheduler using .Net 4, MEF and Entity framework 4.1 (Magic Unicorn): Simple Batch Architecture written on C#. Uses the .NET 4, MEF and Entity Framework 4.1 Code First. If you dont need a scheduler, just use the sample code. Agents can be scheduled through a central database table. Plug-ins (or jobs) are launched through MEF. 
DropBox Linker: The goal of DropBox Linker project is to improve DropBox service client usability. It intellectually automates copying URLs to clipboard after publishing a file. Multiple files at once is supported. Additionally, it corrects URL on file rename after its link been copied.Exchange Spigot for NodeXL: Exchange Spigot for NodeXL enables Microsoft Excel plugin NodeXL to collect messaging data from the Microsoft Exchange Server and display that data as a graph. It is developed in C#.G - Low Level Graphics API: A low level graphics api, in similar style to OpenGL, but more OO. Can drastically speed up OpenGL based apps. Makes full use of Net4.0 features such as lambada expressions where useful. (C#, OpenTK powered. This is not for C++) GpgAPI - A C# Api for Gpg: GpgAPI is a C# API for Gpg. Gpg is a command line software to encrypt, decrypt files with a symmetric or assymetric key. You can also managed public keys, import keys from the web, export your public keys, etc. GpgAPI is an interface to Gpg. In C#, with a few lines, you can execute gpg to encrypt, decrypt, etc. The license of this project is "GPL v3"; it is NOT GPL v2. GPL v3 is not present in the licenses list so I had to choose GPL v2. So if use GpgAPI, you must accept the GPL v...iDreamBoard: iDreamBoard is an application to ease the process of creating and editing dreamboard themes for iOS devices. UNDER DEVELOPMENTIETouch: Use IE Touch to access multitouch events from JavaScript in IE. This project uses some of the code from Windows 7 Multitouch .NET Interop Sample Library available in http://archive.msdn.microsoft.com/WindowsTouch/Release/ProjectReleases.aspx?ReleaseId=2127iMaker-Lion: Applications post installation pour Mac OS X sur PC.Informatic System Ma Chung University Alumni Center: Informatic System Ma Chung University Alumni Center is a website for alumni at Information System Faculty at Ma Chung University at Malang, IndonesiaJasc (just another script compressor): Just another script compressor, but that's by far the easiest to use! Use Jasc to merge compress your javascript (JS) and style sheets (CSS). Just drop the executable in your javascript / CSS folder and run it. Voyla! It will handle everything by itself. Compression is based on Microsoft Ajax Minifier. Jasc can handle single or multiple files, will accept regular expressions to remove specific lines from the source code, and can also optimize the results by shrinking variable names and twe...LINQ Extensions Library: A library of useful LINQ extensions allowing for arithmetic manipulation of sequences, statistical analysis, sequence generation, sequence manipulation, pattern detection and more.Mail2FotoFrame: Pulls emails from a POP3-Account, extracts picture-attachments and saves them to a SD-card, so you can view them on a fotoframe.Make web (or sharepoint) pages screenshots with Powershell: From a text file containing url, make screenshot of each page. This powershell script use iecapt.exe project. You can manage width format and delay for each captureModé Internationalé (E-Commerce Site): Mode International is a fictional clothing store, and this website project is meant to be the e-Commerce site for this store. 
So far, the project has been developed using Java Enterprise Edition and Sockets, but I am considering making the use of C# in ASP .NET in the future.Music House: Site dedicado a musica e afiliadosNiphor's ToolBox: ?????????Nordic nRF24L01+ .NET Micro Framework Driver: Library that enables .NET Micro Framework developers to use Nordic 2.4 GHz wireless tranceivers. This library can be used on any net mf device with SPI interface.Petey's Game Engine: Petey's Game Engine is a XNA 4.0 game engine with a featured blog describing each step of the development. This is a learning project so people may follow and learn as my project evolves.ReportGenerator: ReportGenerator converts XML reports generated by PartCover or NCover into a readable HTML report.Rise of the rabbits: Rise of the RabbitsSapaFinance: SapaFinanceSCSM Copy Object: SCSM Copy Object is a CodePlex project that enables users to select objects in the System Center Service Manager console, click a task in the task pane and create a copy of the selected object(s). Objects can also be copied from command line executables.SharePoint 2010 Language Pack Downloader: SharePoint 2010 Language Pack Downloader is a tool to quickly download multiple language pack files for the SharePoint 2010 Server platform.SharePoint XQuery Reporting Solution: My company's project needs this technology because of the current company has a lot of data is stored in SharePoint, through Infopath data collection, data storage and not all InfoPath field values ​​are saved into a database, a considerable number of data in XML File, which caused difficulties in query and print a report, need a solution. Simply explained the thinking: First step:Put SharePoint, InfoPath XML file later (I did a small tool XMLToDB) or through the Event Handle sync sav...The Pacman: A simple single player game similar to Pacman. Developed using XNA game framework and C#. Game demonstrates usage of fuzzy logic for controlling the AI ghosts. This game is developed to test the effect of fuzzy logic AI in Pacman game.TodoStrike: Todo Strike is a simple todo list keeper.WallpaperDeleter: Simple program to delete the current background wallpaper image.Windows Phone 7 Continuous Integration Testing Framework: WP7 CI is a port of the Silverlight Toolkit Test Framework with added CI support, allowing tests to be run in the emulator from the command lineWolfpack - Distributed System Monitoring: Wolfpack is an extensible, .Net windows service framework for running jobs to monitor your software, system & business KPI's. The data collected can be sent directly to Growl, Geckoboard, WCF, SQL, SQLite, NServiceBus. It comes loaded with some tasks but it's simple to implement your own!

    Read the article

  • C#/.NET Little Wonders: Tuples and Tuple Factory Methods

    - by James Michael Hare
Once again, in this series of posts I look at the parts of the .NET Framework that may seem trivial, but can really help improve your code by making it easier to write and maintain. This week, we look at the System.Tuple class and the handy factory methods for creating a Tuple by inferring the types.

What is a Tuple?

The System.Tuple is a class that tends to inspire a reaction in one of two ways: love or hate. Simply put, a Tuple is a data structure that holds a specific number of items of a specific type in a specific order. That is, a Tuple<int, string, int> is a tuple that contains exactly three items: an int, followed by a string, followed by an int. The sequence is important not only to distinguish between two members of the tuple with the same type, but also for comparisons between tuples.

Some people tend to love tuples because they give you a quick way to combine multiple values into one result. This can be handy for returning more than one value from a method (without using out or ref parameters), or for creating a compound key for a Dictionary, or any other purpose you can think of. They can be especially handy when passing a series of items into a call that only takes one object parameter, such as passing an argument to a thread's startup routine. In these cases, you do not need to define a class; simply create a tuple containing the types you wish to return, and you are ready to go!

On the other hand, there are some people who see tuples as a crutch in object-oriented design. They may view the tuple as a very watered-down class with very little inherent semantic meaning. As an example, what if you saw this in a piece of code:

var x = new Tuple<int, int>(2, 5);

What are the contents of this tuple? If the tuple isn't named appropriately, and if the contents of each member are not self-evident from the type, this can be a confusing question. The people who tend to be against tuples would rather you explicitly code a class to contain the values, such as:

public sealed class RetrySettings
{
    public int TimeoutSeconds { get; set; }
    public int MaxRetries { get; set; }
}

Here, the meaning of each int in the class is much clearer, but it's a bit more work to create the class and it can clutter a solution with extra classes.

So, what's the correct way to go? That's a tough call. You will have people who argue quite well for one or the other. For me, I consider the Tuple a tool that makes it easy to collect values together. There are times when I just need to combine items for a key or a result, in which case the tuple is short-lived, so the meaning isn't easily lost and I feel this is a good compromise. If the scope of the collection of items, though, is more application-wide, I tend to favor creating a full class.

Finally, it should be noted that tuples are immutable. That means they are assigned a value at construction, and that value cannot be changed. Of course, if the tuple contains an item of a reference type, this means that the reference is immutable and not the item referred to.

Tuples from 1 to N

Tuples come in all sizes: you can have as few as one element in your tuple, or as many as you like. However, since C# generics can't have an infinite generic type parameter list, any items after 7 have to be collapsed into another tuple, as we'll show shortly.
So when you declare your tuple from sizes 1 (a 1-tuple or singleton) to 7 (a 7-tuple or septuple), simply include the appropriate number of type arguments:

// a singleton tuple of integer
Tuple<int> x;

// or more
Tuple<int, double> y;

// up to seven
Tuple<int, double, char, double, int, string, uint> z;

Anything eight and above, and we have to nest tuples inside of tuples. The last element of the 8-tuple is the generic type parameter Rest; this is special in that the Tuple checks at runtime to make sure that the type is a Tuple. This means that a simple 8-tuple must nest a singleton tuple (one of the good uses for a singleton tuple, by the way) for the Rest property.

// an 8-tuple
Tuple<int, int, int, int, int, double, char, Tuple<string>> t8;

// a 9-tuple
Tuple<int, int, int, int, double, int, char, Tuple<string, DateTime>> t9;

// a 16-tuple
Tuple<int, int, int, int, int, int, int, Tuple<int, int, int, int, int, int, int, Tuple<int,int>>> t16;

Notice that for the 16-tuple we had to have a nested tuple inside the nested tuple. Since the tuple can only support up to seven items, and then a rest element, if the nested tuple needs more than seven items you must nest in it as well.

Constructing tuples

Constructing tuples is just as straightforward as declaring them. That said, you have two distinct ways to do it. The first is to construct the tuple explicitly yourself:

var t3 = new Tuple<int, string, double>(1, "Hello", 3.1415927);

This creates a triple that has an int, string, and double and assigns the values 1, "Hello", and 3.1415927 respectively. Make sure the order of the arguments supplied matches the order of the types! Also notice that we can't half-assign a tuple or create a default tuple. Tuples are immutable (you can't change the values once constructed), so you must provide all values at construction time.

Another way to easily create tuples is to do it implicitly using the System.Tuple static class's Create() factory methods. These methods (much like C++'s std::make_pair) will infer the types from the method call so you don't have to type them in. This can dramatically reduce the amount of typing required, especially for complex tuples!

// this 4-tuple is typed Tuple<int, double, string, char>
var t4 = Tuple.Create(42, 3.1415927, "Love", 'X');

Notice how much easier it is to use the factory methods and infer the types? This can cut down on typing quite a bit when constructing tuples. The Create() factory method can construct anything from a 1-tuple (singleton) to an 8-tuple (octuple), which of course will be an octuple where the last item is a singleton, as we described before in nested tuples.

Accessing tuple members

Accessing a tuple's members is simplicity itself… mostly. The properties for accessing up to the first seven items are Item1, Item2, …, Item7. If you have an octuple or beyond, the final property is Rest, which will give you the nested tuple, which you can then access in a similar manner. Once again, keep in mind that these are read-only properties and cannot be changed.
// for septuples and below, use the Item properties
var t1 = Tuple.Create(42, 3.14);

Console.WriteLine("First item is {0} and second is {1}",
    t1.Item1, t1.Item2);

// for octuples and above, use Rest to retrieve the nested tuple
var t9 = new Tuple<int, int, int, int, int, int, int,
    Tuple<int, int>>(1, 2, 3, 4, 5, 6, 7, Tuple.Create(8, 9));

Console.WriteLine("The 8th item is {0}", t9.Rest.Item1);

Tuples are IStructuralComparable and IStructuralEquatable

Most of you know about IComparable and IEquatable; what you may not know is that there are two sister interfaces to these that were added in .NET 4.0 to help support tuples. These, IStructuralComparable and IStructuralEquatable, make it easy to compare two tuples for equality and ordering. This is invaluable for sorting, and makes it easy to use tuples as a compound key to a dictionary (one of my favorite uses)!

Why is this so important? Remember when we said that some folks think tuples are too generic and you should define a custom class? This is all well and good, but if you want to design a custom class that can automatically order itself based on its members and build a hash code for itself based on its members, it is no longer a trivial task! Thankfully the tuple does all this for you through the explicit implementations of these interfaces.

For equality, two tuples are equal if all elements are equal between the two tuples, that is if t1.Item1 == t2.Item1 and t1.Item2 == t2.Item2, and so on. For ordering, it's a little more complex in that it compares the two tuples one item at a time starting at Item1, and sees which one has the smaller Item1. If one has a smaller Item1, it is the smaller tuple. However, if both Item1 are the same, it compares Item2, and so on. For example:

var t1 = Tuple.Create(1, 3.14, "Hi");
var t2 = Tuple.Create(1, 3.14, "Hi");
var t3 = Tuple.Create(2, 2.72, "Bye");

// true, t1 == t2 because all items are ==
Console.WriteLine("t1 == t2 : " + t1.Equals(t2));

// false, t2 != t3 because at least one item is different
Console.WriteLine("t2 == t3 : " + t2.Equals(t3));

The actual implementation of IComparable, IEquatable, IStructuralComparable, and IStructuralEquatable is explicit, so if you want to invoke the methods defined there you'll have to manually cast to the appropriate interface:

// true because t1.Item1 < t3.Item1; had they been the same it would check Item2 and so on
Console.WriteLine("t1 < t3 : " + (((IComparable)t1).CompareTo(t3) < 0));

So, as I mentioned, the fact that tuples are automatically equatable and comparable (provided the types you use define equality and comparability as needed) means that we can use tuples for compound keys in hashing and ordering containers like Dictionary and SortedList:

var tupleDict = new Dictionary<Tuple<int, double, string>, string>();

tupleDict.Add(t1, "First tuple");
tupleDict.Add(t3, "Third tuple");

// note: adding t2 as a key would throw, since t2 equals t1 and produces the same hash code

Because IStructuralEquatable defines GetHashCode(), and Tuple's implementation creates this hash code by combining the hash codes of the members, this makes using the tuple as a complex key quite easy!
For example, let's say you are creating account charts for a financial application, and you want to cache those charts in a Dictionary based on the account number and the number of days of chart data (for example, a 1-day chart, 1-week chart, etc.): 1: // the account number (string) and number of days (int) are the key to the cached chart 2: var chartCache = new Dictionary<Tuple<string, int>, IChart>(); Summary The System.Tuple, like any tool, is best used where it will achieve a clear benefit.  I wouldn't advise overusing them on objects with a large scope, or your code can become difficult to maintain.  However, when used properly in a well-defined scope they can make your code cleaner and easier to maintain by removing the need for extraneous POCOs and custom property hashing and ordering. They are especially useful in defining compound keys to IDictionary implementations and for returning multiple values from methods (a short sketch of that pattern follows below), or passing multiple values to a single object parameter. Technorati Tags: C#,.NET,Tuple,Little Wonders
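As a quick follow-up to the summary's point about returning multiple values from a method, here is a small sketch of that pattern; the MinMax method and its sample input are hypothetical, chosen only to illustrate the idea:

using System;
using System.Collections.Generic;
using System.Linq;

public static class MultiReturnDemo
{
    // returns both the minimum and maximum of a sequence in one call,
    // without defining a dedicated result class or using out parameters
    public static Tuple<int, int> MinMax(IEnumerable<int> values)
    {
        return Tuple.Create(values.Min(), values.Max());
    }

    public static void Main()
    {
        var result = MinMax(new[] { 7, 2, 9, 4 });
        Console.WriteLine("Min = {0}, Max = {1}", result.Item1, result.Item2);   // Min = 2, Max = 9
    }
}

The trade-off is the usual one: the caller sees Item1 and Item2 rather than named properties, so this works best in a small, well-defined scope.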


  • e-interview: SunSpace to WebCenter migration

    - by me
I had the pleasure to do an e-interview with Ana Neves around the SunSpace to WebCenter migration project.  Below is the English version of the interview.  Enjoy.   Peter, you joined Oracle in 2009 through the acquisition of Sun. Becoming a part of Oracle meant many changes. The internal collaboration platform was one of them, as per a post you wrote back in 2011. Sun had SunSpace. How would you describe SunSpace? SunSpace was the internal Community and Social Collaboration platform for Sun's Global Sales and Services Organization. SunSpace served around 600 communities with a main focus on technology, products and services. SunSpace was a big success. Within 3 months of its launch SunSpace had over 20,000 users and it won the Atlassian "Not just another wiki" Award for the best use of Confluence (https://blogs.oracle.com/peterreiser/entry/goodbye_sunspace_hello_webcenter). What made SunSpace so special? 1. People-centric versus Web-centric: The main concept of SunSpace put the person in the middle of everything. All relevant information, resources, etc. were dynamically pushed to a person's myProfile (a Facebook-like interface) based on the person's interests and needs.  2. Easy to use: SunSpace was really easy to use. We spent a lot of time on social interaction design to optimize the user experience.  We also integrated some sophisticated technology to hide complexity from the user. As an example, when a user added a document to SunSpace, we analyzed the content of the document and suggested related metadata and tags to the user based on a sophisticated algorithm which was integrated with the corporate taxonomy. Based on this metadata the document was automatically shared with the relevant communities.  3. Easy to find: One of the main use cases for SunSpace was that a user could quickly find the content and information they needed for their job.  The search implementation was based on an optimized search engine algorithm using social-value-based ranking enhancements, community-facilitated search optimization, and faceted search which recommended highly relevant content like products, communities and experts. 4. Social adoption - how to build vibrant communities: You can deploy the coolest social technology, but what if the users are not using it?   To drive user adoption we implemented two complementary models: 4.1 Community Methodology: We developed a set of best practices on how to create, run and sustain communities, including community structure and types (e.g. Community of Practice, Community of Interest, etc.) and tips and tricks on how to build "vibrant" communities, community health checks, etc.  These best practices were constantly tuned and updated by the community of community drivers. 4.2 Social Value System: To drive user adoption there is ONE key question you have to answer for each individual user: What's In It For Me (WIIFM)? We developed a Social Value System called Community Equity which measures the social value flow between People, Content and Metadata. Based on this technology we added "Gamification" techniques (although at that time this term did not exist) to SunSpace to honor people for their active contribution and participation.  As an example, all social credentials a user earned through active community participation were dynamically displayed on her/his myProfile. How would you describe WebCenter? Oracle WebCenter (@oraclewebcenter) is Oracle's user engagement platform for social business. 
It helps people work together more efficiently through contextual collaboration tools that optimize connections between people, information, and applications and ensures users have access to the right information in the context of the business process in which they are engaged. Oracle WebCenter can help your organization deliver contextual and targeted Web experiences to users and enable employees to access information and applications through intuitive portals, composite applications, and mash-ups. How does it compare to SunSpace in terms of functionality? Before I answer this question, I would like to point out some limitation we started to see with the current SunSpace implementation. Due to the massive growth of the user population (>20,000 users), we experienced  performance and scalability challenges with the current technology. Also at the time - Sun Internal Communications and SunIT planned to replace the entire Sun Intranet with SunSpace. We  kicked-off a project to evaluate the enterprise level technology which eventually would replace the good old static Intranet.  And then Oracle acquired Sun. We already had defined the functional requirements for the Intranet replacement with a Social Enterprise Stack and we just needed to evaluate the functional requirements against WebCenter   Below are the summary of this evaluation  MyProfile SunSpace WebCenter How WebCenter Works Home MyProfile: to access, click on your name at the top of any WebCenter page Your name, title, and reporting line are displayed.  Sub-tabs show your activity stream (Activities); people in your network (Connections); files you have uploaded (Documents); your contact information (Organization); and any personal information you wish to share (About).   Files MyFiles Allows you to upload, download and store documents or wiki pages within folders and subfolders.  The WebDav interface allows you to download / upload files / folders with a simple drag and drop to / from your local machine.  Tagging is supported and recommended. Network HomeMyConnections Home: displays the activity stream of individuals in your network.MyConnections: shows individuals in your network.  Click on a person's name to see their contact info and link to their profile. Status Updates MyProfle > Activties Add and displays  your recent activties and status updates. Watches Preferences > Subscriptions > Current Subscriptions Receive email notifications when  pages / spaces you watch are modified. Drafts N/A WebCenter does not support Drafts Settings Preferences: to access, click on 'Preferences' at the top of any WebCenter page Set your general preferences, as well as your WebCenter messaging, search and mail settings. MyCommunities MySpaces: to access, click on 'Spaces' at the top of any WebCenter page Displays MySpaces (communities you are a member of); and Recent Spaces (communities you have recently visited). Community SunSpace Webcenter How Webcenter Works Home Home Displays a community introduction and activity stream.  Members can add messages, links or documents via the Community Message Board. No Top Contributors widget. People Members Lists members of the community. The Mail All Members feature allows moderators and participants to send a message to all members of the community. 
Membership Management can be found under > Manage > Members. News News Members can post and access the latest community news and they can subscribe to news using an RSS reader. Documents Documents Allows community members to upload, download and store documents or wiki pages within folders and subfolders.  The WebDav interface allows participants to download / upload files / folders with a simple drag and drop to / from your local machine.  Tagging is supported and recommended. Wiki Wiki Allows community members to create and update web pages with a WYSIWYG editor.  Note: WebCenter does not support macros or portlet embedding. Forum Forum Post community forum topics. Contribute to community forum conversations.  N/A Calendar Update and/or view the Community Calendar. N/A Analytics Displays detailed analytics data (views, downloads, unique users etc.) for Pages, Wiki, Documents, and Forum in a given community space. What is the adoption of WebCenter at Oracle? The entire Intranet, serving around 100,000 users, is running on WebCenter Content.  For professional communities we use WebCenter Portal and Spaces. Currently we have around 6,000 community spaces with around 40,000 members.  Does Oracle have any metrics to assess usage and impact of WebCenter? Can you give us some examples? Sure - we have a lot of metrics.   For the Intranet we use traditional metrics like pageviews, monthly unique visitors and unique visits.  For Communities we use the WebCenter Portal/Spaces analytics service which gives us a wealth of data. The key metrics we track are: Space traffic (PageViews, Unique Users); Wiki and Documents (views, downloads etc.); Forum (users, views, posts etc.); and Registered members over time.  Depending on the community we can filter/segment the metrics by User Properties, e.g. Country, Organization, Job Role etc. What are you doing to improve usage and impact? 1. We are integrating the WebCenter social services/fabric into all main business applications. As an example, the Fusion CRM deployment is seamlessly integrated with Oracle Social Network (OSN) and all conversation around an opportunity or customer engagement is done in OSN (see YouTube video). 2. We drive Social Best Practice through a program called the "Social Networking & Business Collaboration (SNBC) program". You worked both with WebCenter and SunSpace. Knowing what you know today, if you had the chance to choose between the two, which one would you choose? Why? That's a tricky question.   In the early days of the Social Enterprise implementation (we started SunSpace in 2006), we needed an agile and easy-to-deploy technology to keep up with the users' requirements. Sometimes we pushed two releases per day and we were in a perpetual beta mode - SunSpace was perfect for that.  After the social implementation matured over time, community-generated content became business-critical and we saw a change in the requirements from agility to stability, scalability and reliability of the infrastructure.  WebCenter is the right choice for such an enterprise-level deployment.  You are a WebCenter Evangelist at Oracle. What do you do as part of that role? Our role is to help position Oracle as one of the key thought leaders and solution providers for Social Business. In addition we drive social innovation through our Oracle AppsLab team. Is that a full-time role? Yes.  How many other Evangelists are there in Oracle? 
We are currently 5 people in the WebCenter evangelist team (@webcentervoices): Christian Finn (@cfinn) leads the team - Christian came from the Microsoft SharePoint product management team and is a recognized expert in Social Business and Enterprise Collaboration. Noël Jaffré (@noeljaffre) is our Web Experience Management (WEM) guru and came to Oracle via the FatWire acquisition (now WebCenter Sites). Jake Kuramoto (@theapplab) is part of the Oracle AppsLab innovation team - Jake is well known as the driving force behind http://theappslab.com, a blog around social and innovation.  Noel Portugal (@noelportugal) is a developer in the Oracle AppsLab innovation team - he is the inventor of OraTweet, Oracle's internal tweeting platform.  Peter Reiser (@peterreiser) is a Social Business guru and the inventor of SunSpace and Community Equity.  What area of the business do you and the rest of the Evangelists sit in? What area of the organisation is responsible for WebCenter? We are part of the WebCenter product management organization.  Is WebCenter part of the Knowledge Management strategy? Oracle WebCenter is Oracle's user engagement platform for social business. It brings together the most complete portfolio of portal, web experience management, content, social and collaboration technologies into a single product suite and is the product foundation of the Oracle Knowledge Management strategy.  I am aware Oracle also uses Beehive internally. How would you describe Beehive? Oracle Beehive provides an integrated set of communication and collaboration services built on a single scalable, secure, enterprise-class platform. Beehive is used internally for enterprise-wide mail, calendar and real-time collaboration (Web conferencing) services.  Are Beehive and WebCenter connected? Historically Beehive and WebCenter Portal & Content had some overlap in functionality. (Hey - if a company has an acquisition strategy to strengthen its product offering and accelerate innovation, it's pretty normal that functional overlap exists :-)) A key objective of the WebCenter strategy is to combine all social and collaboration offerings under the WebCenter product family. That means that certain Beehive components will be integrated into the overall WebCenter product offering.  Are there any other internal collaboration tools at Oracle? Which ones? There are two other main social tools which are widely used at Oracle.  Oracle Connect was the first social tool the Oracle AppsLab team created in 2007 (see Jake's blog post for details). It is still extensively used. ... and as a former Sun guy I like this quote from the blog post:  "Traffic to Connect peaked right after the Sun merger in 2010, when it served several hundred thousand pageviews each month; since then, traffic has subsided, but still averages tens of thousands of pageviews to several thousand users each month." OraTweet, Oracle's internal microblogging platform, has been in use since June 2008 and is still growing.  It's entirely written in Oracle Application Express (APEX), which is a rapid web application development tool for the Oracle database. Wanna try it out? Here you can download the code.  What is Oracle's strategy regarding (all these) collaboration tools? Pretty straightforward. The strategy is to seamlessly integrate the WebCenter social & collaboration services into all Business Applications to help customers socialize their enterprise. 


  • C#/.NET Little Wonders: The Useful But Overlooked Sets

    - by James Michael Hare
    Once again we consider some of the lesser known classes and keywords of C#.  Today we will be looking at two set implementations in the System.Collections.Generic namespace: HashSet<T> and SortedSet<T>.  Even though most people think of sets as mathematical constructs, they are actually very useful classes that can be used to help make your application more performant if used appropriately. A Background From Math In mathematical terms, a set is an unordered collection of unique items.  In other words, the set {2,3,5} is identical to the set {3,5,2}.  In addition, the set {2, 2, 4, 1} would be invalid because it would have a duplicate item (2).  In addition, you can perform set arithmetic on sets such as: Intersections: The intersection of two sets is the collection of elements common to both.  Example: The intersection of {1,2,5} and {2,4,9} is the set {2}. Unions: The union of two sets is the collection of unique items present in either or both set.  Example: The union of {1,2,5} and {2,4,9} is {1,2,4,5,9}. Differences: The difference of two sets is the removal of all items from the first set that are common between the sets.  Example: The difference of {1,2,5} and {2,4,9} is {1,5}. Supersets: One set is a superset of a second set if it contains all elements that are in the second set. Example: The set {1,2,5} is a superset of {1,5}. Subsets: One set is a subset of a second set if all the elements of that set are contained in the first set. Example: The set {1,5} is a subset of {1,2,5}. If We’re Not Doing Math, Why Do We Care? Now, you may be thinking: why bother with the set classes in C# if you have no need for mathematical set manipulation?  The answer is simple: they are extremely efficient ways to determine ownership in a collection. For example, let’s say you are designing an order system that tracks the price of a particular equity, and once it reaches a certain point will trigger an order.  Now, since there’s tens of thousands of equities on the markets, you don’t want to track market data for every ticker as that would be a waste of time and processing power for symbols you don’t have orders for.  Thus, we just want to subscribe to the stock symbol for an equity order only if it is a symbol we are not already subscribed to. Every time a new order comes in, we will check the list of subscriptions to see if the new order’s stock symbol is in that list.  If it is, great, we already have that market data feed!  If not, then and only then should we subscribe to the feed for that symbol. So far so good, we have a collection of symbols and we want to see if a symbol is present in that collection and if not, add it.  This really is the essence of set processing, but for the sake of comparison, let’s say you do a list instead: 1: // class that handles are order processing service 2: public sealed class OrderProcessor 3: { 4: // contains list of all symbols we are currently subscribed to 5: private readonly List<string> _subscriptions = new List<string>(); 6:  7: ... 8: } Now whenever you are adding a new order, it would look something like: 1: public PlaceOrderResponse PlaceOrder(Order newOrder) 2: { 3: // do some validation, of course... 4:  5: // check to see if already subscribed, if not add a subscription 6: if (!_subscriptions.Contains(newOrder.Symbol)) 7: { 8: // add the symbol to the list 9: _subscriptions.Add(newOrder.Symbol); 10: 11: // do whatever magic is needed to start a subscription for the symbol 12: } 13:  14: // place the order logic! 15: } What’s wrong with this?  
In short: performance!  Finding an item inside a List<T> is a linear - O(n) – operation, which is not a very performant way to find if an item exists in a collection. (I used to teach algorithms and data structures in my spare time at a local university, and when you began talking about big-O notation you could immediately begin to see eyes glossing over as if it was pure, useless theory that would not apply in the real world, but I did and still do believe it is something worth understanding well to make the best choices in computer science). Let’s think about this: a linear operation means that as the number of items increases, the time that it takes to perform the operation tends to increase in a linear fashion.  Put crudely, this means if you double the collection size, you might expect the operation to take something like the order of twice as long.  Linear operations tend to be bad for performance because they mean that to perform some operation on a collection, you must potentially “visit” every item in the collection.  Consider finding an item in a List<T>: if you want to see if the list has an item, you must potentially check every item in the list before you find it or determine it’s not found. Now, we could of course sort our list and then perform a binary search on it, but sorting is typically a linear-logarithmic complexity – O(n * log n) - and could involve temporary storage.  So performing a sort after each add would probably add more time.  As an alternative, we could use a SortedList<TKey, TValue> which sorts the list on every Add(), but this has a similar level of complexity to move the items and also requires a key and value, and in our case the key is the value. This is why sets tend to be the best choice for this type of processing: they don’t rely on separate keys and values for ordering – so they save space – and they typically don’t care about ordering – so they tend to be extremely performant.  The .NET BCL (Base Class Library) has had the HashSet<T> since .NET 3.5, but at that time it did not implement the ISet<T> interface.  As of .NET 4.0, HashSet<T> implements ISet<T> and a new set, the SortedSet<T> was added that gives you a set with ordering. HashSet<T> – For Unordered Storage of Sets When used right, HashSet<T> is a beautiful collection, you can think of it as a simplified Dictionary<T,T>.  That is, a Dictionary where the TKey and TValue refer to the same object.  This is really an oversimplification, but logically it makes sense.  I’ve actually seen people code a Dictionary<T,T> where they store the same thing in the key and the value, and that’s just inefficient because of the extra storage to hold both the key and the value. As it’s name implies, the HashSet<T> uses a hashing algorithm to find the items in the set, which means it does take up some additional space, but it has lightning fast lookups!  Compare the times below between HashSet<T> and List<T>: Operation HashSet<T> List<T> Add() O(1) O(1) at end O(n) in middle Remove() O(1) O(n) Contains() O(1) O(n)   Now, these times are amortized and represent the typical case.  In the very worst case, the operations could be linear if they involve a resizing of the collection – but this is true for both the List and HashSet so that’s a less of an issue when comparing the two. The key thing to note is that in the general case, HashSet is constant time for adds, removes, and contains!  
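To make that difference tangible, here is a rough, illustrative timing sketch; the collection size, symbol names, and single-run Stopwatch measurement are assumptions of mine, and absolute numbers will vary by machine:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

public static class ContainsTimingDemo
{
    public static void Main()
    {
        // build a list and a hash set holding the same one million symbols
        var list = Enumerable.Range(0, 1000000).Select(i => "SYM" + i).ToList();
        var set = new HashSet<string>(list);

        var sw = Stopwatch.StartNew();
        bool inList = list.Contains("SYM999999");   // O(n): may scan the whole list
        sw.Stop();
        Console.WriteLine("List.Contains returned {0} in {1} ticks", inList, sw.ElapsedTicks);

        sw.Restart();
        bool inSet = set.Contains("SYM999999");     // O(1): a single hash lookup
        sw.Stop();
        Console.WriteLine("HashSet.Contains returned {0} in {1} ticks", inSet, sw.ElapsedTicks);
    }
}

The exact figures are not the point; the gap between the list scan and the hash lookup widens quickly as the collection grows.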
This means that no matter how large the collection is, it takes roughly the exact same amount of time to find an item or determine if it’s not in the collection.  Compare this to the List where almost any add or remove must rearrange potentially all the elements!  And to find an item in the list (if unsorted) you must search every item in the List. So as you can see, if you want to create an unordered collection and have very fast lookup and manipulation, the HashSet is a great collection. And since HashSet<T> implements ICollection<T> and IEnumerable<T>, it supports nearly all the same basic operations as the List<T> and can use the System.Linq extension methods as well. All we have to do to switch from a List<T> to a HashSet<T>  is change our declaration.  Since List and HashSet support many of the same members, chances are we won’t need to change much else. 1: public sealed class OrderProcessor 2: { 3: private readonly HashSet<string> _subscriptions = new HashSet<string>(); 4:  5: // ... 6:  7: public PlaceOrderResponse PlaceOrder(Order newOrder) 8: { 9: // do some validation, of course... 10: 11: // check to see if already subscribed, if not add a subscription 12: if (!_subscriptions.Contains(newOrder.Symbol)) 13: { 14: // add the symbol to the list 15: _subscriptions.Add(newOrder.Symbol); 16: 17: // do whatever magic is needed to start a subscription for the symbol 18: } 19: 20: // place the order logic! 21: } 22:  23: // ... 24: } 25: Notice, we didn’t change any code other than the declaration for _subscriptions to be a HashSet<T>.  Thus, we can pick up the performance improvements in this case with minimal code changes. SortedSet<T> – Ordered Storage of Sets Just like HashSet<T> is logically similar to Dictionary<T,T>, the SortedSet<T> is logically similar to the SortedDictionary<T,T>. The SortedSet can be used when you want to do set operations on a collection, but you want to maintain that collection in sorted order.  Now, this is not necessarily mathematically relevant, but if your collection needs do include order, this is the set to use. So the SortedSet seems to be implemented as a binary tree (possibly a red-black tree) internally.  Since binary trees are dynamic structures and non-contiguous (unlike List and SortedList) this means that inserts and deletes do not involve rearranging elements, or changing the linking of the nodes.  There is some overhead in keeping the nodes in order, but it is much smaller than a contiguous storage collection like a List<T>.  Let’s compare the three: Operation HashSet<T> SortedSet<T> List<T> Add() O(1) O(log n) O(1) at end O(n) in middle Remove() O(1) O(log n) O(n) Contains() O(1) O(log n) O(n)   The MSDN documentation seems to indicate that operations on SortedSet are O(1), but this seems to be inconsistent with its implementation and seems to be a documentation error.  There’s actually a separate MSDN document (here) on SortedSet that indicates that it is, in fact, logarithmic in complexity.  Let’s put it in layman’s terms: logarithmic means you can double the collection size and typically you only add a single extra “visit” to an item in the collection.  Take that in contrast to List<T>’s linear operation where if you double the size of the collection you double the “visits” to items in the collection.  This is very good performance!  It’s still not as performant as HashSet<T> where it always just visits one item (amortized), but for the addition of sorting this is a good thing. 
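As a quick illustration of what that extra O(log n) cost buys you, here is a minimal sketch (the ticker symbols are made up) showing that a SortedSet<T> always enumerates in sorted order, while a HashSet<T> makes no ordering promise:

using System;
using System.Collections.Generic;

public static class SortedSetDemo
{
    public static void Main()
    {
        // same four symbols in both collections
        var sorted = new SortedSet<string> { "MSFT", "AAPL", "ORCL", "GOOG" };
        var hashed = new HashSet<string> { "MSFT", "AAPL", "ORCL", "GOOG" };

        // SortedSet enumerates in sorted order: AAPL, GOOG, MSFT, ORCL
        Console.WriteLine(string.Join(", ", sorted));

        // HashSet enumeration order is undefined and can change as items are added
        Console.WriteLine(string.Join(", ", hashed));

        // both are still sets, so duplicates are rejected
        bool added = sorted.Add("AAPL");
        Console.WriteLine(added);   // False, "AAPL" is already present
    }
}

The price of that ordering is exactly the logarithmic add/remove/contains cost discussed above.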
Consider the following table, now this is just illustrative data of the relative complexities, but it’s enough to get the point: Collection Size O(1) Visits O(log n) Visits O(n) Visits 1 1 1 1 10 1 4 10 100 1 7 100 1000 1 10 1000   Notice that the logarithmic – O(log n) – visit count goes up very slowly compare to the linear – O(n) – visit count.  This is because since the list is sorted, it can do one check in the middle of the list, determine which half of the collection the data is in, and discard the other half (binary search).  So, if you need your set to be sorted, you can use the SortedSet<T> just like the HashSet<T> and gain sorting for a small performance hit, but it’s still faster than a List<T>. Unique Set Operations Now, if you do want to perform more set-like operations, both implementations of ISet<T> support the following, which play back towards the mathematical set operations described before: IntersectWith() – Performs the set intersection of two sets.  Modifies the current set so that it only contains elements also in the second set. UnionWith() – Performs a set union of two sets.  Modifies the current set so it contains all elements present both in the current set and the second set. ExceptWith() – Performs a set difference of two sets.  Modifies the current set so that it removes all elements present in the second set. IsSupersetOf() – Checks if the current set is a superset of the second set. IsSubsetOf() – Checks if the current set is a subset of the second set. For more information on the set operations themselves, see the MSDN description of ISet<T> (here). What Sets Don’t Do Don’t get me wrong, sets are not silver bullets.  You don’t really want to use a set when you want separate key to value lookups, that’s what the IDictionary implementations are best for. Also sets don’t store temporal add-order.  That is, if you are adding items to the end of a list all the time, your list is ordered in terms of when items were added to it.  This is something the sets don’t do naturally (though you could use a SortedSet with an IComparer with a DateTime but that’s overkill) but List<T> can. Also, List<T> allows indexing which is a blazingly fast way to iterate through items in the collection.  Iterating over all the items in a List<T> is generally much, much faster than iterating over a set. Summary Sets are an excellent tool for maintaining a lookup table where the item is both the key and the value.  In addition, if you have need for the mathematical set operations, the C# sets support those as well.  The HashSet<T> is the set of choice if you want the fastest possible lookups but don’t care about order.  In contrast the SortedSet<T> will give you a sorted collection at a slight reduction in performance.   Technorati Tags: C#,.Net,Little Wonders,BlackRabbitCoder,ISet,HashSet,SortedSet
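To round out the set-arithmetic operations listed above, here is a small sketch of the ISet<T> methods in action; it reuses the {1,2,5} and {2,4,9} examples from the math background earlier in the article, and the copies are my addition since IntersectWith, UnionWith, and ExceptWith all modify the set in place:

using System;
using System.Collections.Generic;

public static class SetOperationsDemo
{
    public static void Main()
    {
        var a = new HashSet<int> { 1, 2, 5 };
        var b = new HashSet<int> { 2, 4, 9 };

        // work on copies, because these methods mutate the set they are called on
        var intersection = new HashSet<int>(a);
        intersection.IntersectWith(b);               // { 2 }

        var union = new HashSet<int>(a);
        union.UnionWith(b);                          // { 1, 2, 4, 5, 9 }

        var difference = new HashSet<int>(a);
        difference.ExceptWith(b);                    // { 1, 5 }

        // subset / superset checks do not modify anything
        var sub = new HashSet<int> { 1, 5 };
        Console.WriteLine(sub.IsSubsetOf(a));        // True
        Console.WriteLine(a.IsSupersetOf(sub));      // True

        Console.WriteLine(string.Join(", ", union)); // element order is not guaranteed for a HashSet
    }
}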


  • CodePlex Daily Summary for Monday, May 21, 2012

    CodePlex Daily Summary for Monday, May 21, 2012Popular ReleasesMetadata Document Generator for Microsoft Dynamics CRM 2011: Metadata Document Generator (2.0.0.0): New UI Metro style New features Save and load settings to/from file Export only OptionSet attributes Use of Gembox Spreadsheet to generate Excel (makes application lighter : 1,5MB instead of 7MB)Audio Pitch & Shift: Audio Pitch And Shift 4.2.0: Backward / Forward buttons Improved features for encoding, streaming, menu Bug fixesState Machine .netmf: State Machine Example: First release.... Contains 3 state machines running on separate threads. Event driven button to change the states. StateMachineEngine to support the Machines Message class with the type of data to send between statesSilverlight socket component: Smark.NetDisk: Smark.NetDisk?????Silverlight ?.net???????????,???????????????????????。Smark.NetDisk??????????,????.net???????????????????????tcp??;???????Silverlight??????????????????????ZXMAK2: Version 2.6.1.9: added WAV serializer for tape devicecallisto: callisto 2.0.28: Update log: - Extended Scribble protocol. - Updated HTML5 client code - now supports the latest versions of Google Chrome.ExtAspNet: ExtAspNet v3.1.6: ExtAspNet - ?? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ?????????? ExtAspNet ????? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ??????????。 ExtAspNet ??????? JavaScript,?? CSS,?? UpdatePanel,?? ViewState,?? WebServices ???????。 ??????: IE 7.0, Firefox 3.6, Chrome 3.0, Opera 10.5, Safari 3.0+ ????:Apache License 2.0 (Apache) ??:http://bbs.extasp.net/ ??:http://demo.extasp.net/ ??:http://doc.extasp.net/ ??:http://extaspnet.codeplex.com/ ??:http://sanshi.cnblogs.com/ ????: +2012-05-20 v3.1.6 -??RowD...Dynamics XRM Tools: Dynamics XRM Tools BETA 1.0: The Dynamics XRM Tools 1.0 BETA is now available Seperate downloads are available for On Premise and Online as certain features are only available On Premise. This is a BETA build and may not resemble the final release. Many enhancements are in development and will be made available soon. Please provide feedback so that we may learn and discover how to make these tools better.WatchersNET CKEditor™ Provider for DotNetNuke®: CKEditor Provider 1.14.05: Whats New Added New Editor Skin "BootstrapCK-Skin" Added New Editor Skin "Slick" Added Dnn Pages Drop Down to the Link Dialog (to quickly link to a portal tab) changes Fixed Issue #6956 Localization issue with some languages Fixed Issue #6930 Folder Tree view was not working in some cases Changed the user folder from User name to User id User Folder is now used when using Upload Function and User Folder is enabled File-Browser Fixed Resizer Preview Image Optimized the oEmbed Pl...PHPExcel: PHPExcel 1.7.7: See Change Log for details of the new features and bugfixes included in this release. BREAKING CHANGE! From PHPExcel 1.7.8 onwards, the 3rd-party tcPDF library will no longer be bundled with PHPExcel for rendering PDF files through the PDF Writer. The PDF Writer is being rewritten to allow a choice of 3rd party PDF libraries (tcPDF, mPDF, and domPDF initially), none of which will be bundled with PHPExcel, but which can be downloaded seperately from the appropriate sites.GhostBuster: GhostBuster Setup (91520): Added WMI based RestorePoint support Removed test code from program.cs Improved counting. Changed color of ghosted but unfiltered devices. Changed HwEntries into an ObservableCollection. Added Properties Form. Added Properties MenuItem to Context Menu. Added Hide Unfiltered Devices to Context Menu. 
If you like this tool, leave me a note, rate this project or write a review or Donate to Ghostbuster. Donate to GhostbusterC#??????EXCEL??、??、????????:DataPie(??MSSQL 2008、ORACLE、ACCESS 2007): DataPie_V3.2: V3.2, 2012?5?19? ????ORACLE??????。AvalonDock: AvalonDock 2.0.0795: Welcome to the Beta release of AvalonDock 2.0 After 4 months of hard work I'm ready to upload the beta version of AvalonDock 2.0. This new version boosts a lot of new features and now is stable enough to be deployed in production scenarios. For this reason I encourage everyone is using AD 1.3 or earlier to upgrade soon to this new version. The final version is scheduled for the end of June. What is included in Beta: 1) Stability! thanks to all users contribution I’ve corrected a lot of issues...myCollections: Version 2.1.0.0: New in this version : Improved UI New Metro Skin Improved Performance Added Proxy Settings New Music and Books Artist detail Lot of Bug FixingAspxCommerce: AspxCommerce1.1: AspxCommerce - 'Flexible and easy eCommerce platform' offers a complete e-Commerce solution that allows you to build and run your fully functional online store in minutes. You can create your storefront; manage the products through categories and subcategories, accept payments through credit cards and ship the ordered products to the customers. We have everything set up for you, so that you can only focus on building your own online store. Note: To login as a superuser, the username and pass...SiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.1616.403): BUG FIX Hide save button when Titles or Descriptions element is selectedDotSpatial: DotSpatial 1.2: This is a Minor Release. See the changes in the issue tracker. Minimal -- includes DotSpatial core and essential extensions Extended -- includes debugging symbols and additional extensions Tutorials are available. Just want to run the software? End user (non-programmer) version available branded as MapWindow Want to add your own feature? Develop a plugin, using the template and contribute to the extension feed (you can also write extensions that you distribute in other ways). Components ...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.52: Make preprocessor comment-statements nestable; add the ///#IFNDEF statement. (Discussion #355785) Don't throw an error for old-school JScript event handlers, and don't rename them if they aren't global functions.DotNetNuke® Events: 06.00.00: This is a serious release of Events. DNN 6 form pattern - We have take the full route towards DNN6: most notably the incorporation of the DNN6 form pattern with streamlined UX/UI. We have also tried to change all formatting to a div based structure. A daunting task, since the Events module contains a lot of forms. Roger has done a splendid job by going through all the forms in great detail, replacing all table style layouts into the new DNN6 div class="dnnForm XXX" type of layout with chang...LogicCircuit: LogicCircuit 2.12.5.15: Logic Circuit - is educational software for designing and simulating logic circuits. Intuitive graphical user interface, allows you to create unrestricted circuit hierarchy with multi bit buses, debug circuits behavior with oscilloscope, and navigate running circuits hierarchy. Changes of this versionThis release is fixing one but nasty bug. Two functions XOR and XNOR when used with 3 or more inputs were incorrectly evaluating their results. 
If you have a circuit that is using these functions...New ProjectsAdvanced CRM 2011 Auto Number: Advanced CRM 2011 AutoNumber is the most advanced auto numbering solution for CRM 2011 Online, On-Premise and Partner Hosted (IFD). You can virtually add any auto number to both system entities and custom entities. Here are all the features: * Supports CRM 2011 Online, On-Premise, Partner Hosted (IFD) for both Sandbox and Non-Sandbox mode. Also supports load balancing CRM instalations. * Guaranteed 100% unique autogenerated number * Supports both system entities and custom entit...alberguedeblm: Nós, membros do Departamento de Desenvolvimento Belém, hospedamos nossos projetos aqui.ChevonChristieCodeAndTools: This repo contains WP7 helper code and related Utilities, among other things, that I have accumulated across my projects. Simply open the solution in VS2010. All code is provided AS-IS and there is NO warranty for anything is this repo. If you would like to check out some of the applications this code came from head over to here: http://binaryred.com/portfolioCodeDom Assistant: Generating CodeDom Code By Parsing C# or VB ??C#??VB??CodeDom??ColaBBS: ColaBBS for .NET FrameworkDragonDen: DragonDEN project consists of SIMPLE SHARK PROMPT and DOS operating systems. For help in DOS type in '?' after logging in with username and password. Username:Drago Password:dospasswordDynamics XRM Tools: Dynamics XRM Tools brings you a quality range of applications that provide a useful set of features to enhance your experience while using and developing against Microsoft Dynamics CRM 2011. Currently available applications include OData Query Designer Metadata Browser CRM 4 to CRM 2011 JavaScript Converter Trace Tool Statistics About Dynamics XRM Tools The Dynamics XRM Tools project provides a modular framework for hosting Silverlight applications within a single shel...Employee Info System: Employee Info SystemFile upload control - MS CRM 2011: File upload Plug-in in MS CRM 2011 The objective of this plug-in is to provide functionality of using file upload feature inside the MS CRM 2011 forms. Also using this plug-in provide file extension control, multiple file uploads in one form, configurable labels and much more. It is very easy to use this. Proper steps are provided in the documentations and plug-in is provided for download. GadgeteerCookbook: A collection of projects for Microsoft .NET GadgeteerGraboo: Grabooo Imagine Cup App! ;)How High is It: A tool used by windows phone to calculate the height of a building or something else.H-Share: H-Share is my personal sharing tool try-out projectInternals Viewer (updated) for SQL Server 2008 R2.: Internals viewer is a tool for looking into the SQL Server storage engine and seeing how data is physically allocated, organised and stored. All sorts of tasks performed by a DBA or developer can benefit greatly from knowledge of what the storage engine is doing and how it works. This version is for SSMS 2008 R2.MetaWeblog Utility: A .NET library for Meta Weblog operations. To make it easy to use, library includes top blog sites' proxy/agent.MetroRate: MetroRate is a control for displaying ratings with stars in Windows 8 WinRT Metro XAML apps.Ph?n m?m qu?n lý ký túc xá sinh viên ÐH Tây Nguyên: Ph?n m?m qu?n lý ký túc xá sinh viên ÐH Tây Nguyên Liên h? d? 
bi?t thêm chi ti?t nhá!ProceXSS: ProceXSS is a Asp.NET Http module for detecting and ignoring xss attacks.RFIDCashier: Its RFID scanner tool SaharMediaPlayer: gggScienceLP: ScienceLPsilverlight4Stu: ????Stackr Programming Language: A stack-based programming language, initially targeted for trans-compilation to DCPU-16 assembly.Timeline example VB.net: timeline vbTKLSite: not for youVolta AutoParts: Volta AutoParts is light weight website, used for let more and more people know Volta's Auto accessories and parts. We are developers on Microsoft.NET, this project is also an opportunity for us on cooperation.????????: good

