Search Results

Search found 14772 results on 591 pages for 'full trust'.


  • .NET HTML Sanitation for rich HTML Input

    - by Rick Strahl
    Recently I was working on updating a legacy application to MVC 4 that included free form text input. When I set up the new site my initial approach was to not allow any rich HTML input, only simple text formatting that would respect a few simple HTML commands for bold, lists etc. and automatically handle line break processing for new lines and paragraphs. This is typical for what I do with most multi-line text input in my apps and it works very well with very little development effort involved. Then the client sprung another note: Oh by the way we have a bunch of customers (real estate agents) who need to post complete HTML documents. Uh oh! There goes the simple theory. After some discussion and pleading on my part (<snicker>) to try and avoid this type of raw HTML input because of potential XSS issues, the client decided to go ahead and allow raw HTML input anyway.

    There have been lots of discussions on this subject on StackOverflow (and here and here) but after reading through some of the solutions I didn't really find anything that would work even closely for what I needed. Specifically we need to be able to allow just about any HTML markup, with the exception of script code. Remote CSS and Images need to be loaded, links need to work and so on. While the 'legit' HTML posted by these agents is basic in nature it does span most of the gamut of HTML (4). Most of the XSS prevention/sanitizer solutions I found were way too aggressive and rendered the posted output unusable, mostly because they tend to strip any externally loaded content. In short I needed a custom solution.

    I thought the best solution to this would be to use an HTML parser - in this case the Html Agility Pack - and then to run through all the HTML markup provided and remove any of the blacklisted tags and a number of attributes that are prone to JavaScript injection. There's much discussion on whether to use blacklists vs. whitelists in the discussions mentioned above, but I found that whitelists can make sense in simple scenarios where you might allow manual HTML input, but when you need to allow a larger array of HTML functionality a blacklist is probably easier to manage as the vast majority of elements and attributes could be allowed. Also, whitelisting gets a bit more complex with HTML5 and the proliferation of new HTML tags, and most new tags generally don't affect XSS issues directly. Pure whitelisting based on elements and attributes also doesn't capture many edge cases (see some of the XSS cheat sheets listed below) so even with a whitelist, custom logic is still required to handle many of those edge cases.

    The Microsoft Web Protection Library (AntiXSS)

    My first thought was to check out the Microsoft AntiXSS library. Microsoft has an HTML Encoding and Sanitation library in the Microsoft Web Protection Library (formerly AntiXSS Library) on CodePlex, which provides stricter functions for whitelist encoding and sanitation. Initially I thought the Sanitation class and its static members would do the trick for me, but I found that this library is way too restrictive for my needs. Specifically the Sanitation class strips out images and links which rendered the full HTML from our real estate clients completely useless. I didn't spend much time with it, but apparently I'm not alone in feeling this library is not really useful without some way to configure its operation.
    To give you an example of what didn't work for me with the library, here's a small and simple HTML fragment that includes script, img and anchor tags. I would expect the script to be stripped and everything else to be left intact. Here's the original HTML:

    var value = "<b>Here</b> <script>alert('hello')</script> we go. Visit the " + "<a href='http://west-wind.com'>West Wind</a> site. " + "<img src='http://west-wind.com/images/new.gif' /> " ;

    and the code to sanitize it with the AntiXSS Sanitize class:

    @Html.Raw(Microsoft.Security.Application.Sanitizer.GetSafeHtmlFragment(value))

    This produced a not so useful sanitized string:

    Here we go. Visit the <a>West Wind</a> site.

    While it removed the <script> tag (good) it also removed the href from the link and the image tag altogether (bad). In some situations this might be useful, but for most tasks I doubt this is the desired behavior. While links can contain javascript: references and images can 'broadcast' information to a server, without configuration to tell the library what to restrict this becomes useless to me. I couldn't find any way to customize the whitelist, nor is there code available in this 'open source' library on CodePlex.

    Using Html Agility Pack for HTML Parsing

    The WPL library wasn't going to cut it. After doing a bit of research I decided the best approach for a custom solution would be to use an HTML parser and inspect the HTML fragment/document I'm trying to import. I've used the HTML Agility Pack before for a number of apps where I needed an HTML parser without requiring an instance of a full browser like the Internet Explorer Application object, which is inadequate in Web apps. In case you haven't checked out the Html Agility Pack before, it's a powerful HTML parser library that you can use from your .NET code. It provides a simple, parsable HTML DOM model for full HTML documents or HTML fragments that lets you walk through each of the elements in your document. If you've used the HTML or XML DOM in a browser before you'll feel right at home with the Agility Pack.

    Blacklist based HTML Parsing to strip XSS Code

    For my purposes of HTML sanitation, the process involved is to walk the HTML document one element at a time and then check each element and attribute against a blacklist. There's quite a bit of argument about what's better: a whitelist of allowed items or a blacklist of denied items. While whitelists tend to be more secure, they also require a lot more configuration. In the case of HTML5 a whitelist could be very extensive. For what I need, I only want to ensure that no JavaScript is executed, so a blacklist includes the obvious <script> tag plus any tag that allows loading of external content including <iframe>, <object>, <embed> and <link> etc. <form> is also excluded to avoid posting content to a different location. I also disallow <head> and <meta> tags in particular for my case, since I'm only allowing posting of HTML fragments. There is also some internal logic to exclude certain attributes, or attributes that include references to JavaScript or CSS expressions. The default tag blacklist reflects my use case, but is customizable and can be added to.
    Here's my HtmlSanitizer implementation:

    using System.Collections.Generic;
    using System.IO;
    using System.Xml;
    using HtmlAgilityPack;

    namespace Westwind.Web.Utilities
    {
        public class HtmlSanitizer
        {
            public HashSet<string> BlackList = new HashSet<string>()
            {
                { "script" },
                { "iframe" },
                { "form" },
                { "object" },
                { "embed" },
                { "link" },
                { "head" },
                { "meta" }
            };

            /// <summary>
            /// Cleans up an HTML string and removes HTML tags in blacklist
            /// </summary>
            /// <param name="html"></param>
            /// <returns></returns>
            public static string SanitizeHtml(string html, params string[] blackList)
            {
                var sanitizer = new HtmlSanitizer();
                if (blackList != null && blackList.Length > 0)
                {
                    sanitizer.BlackList.Clear();
                    foreach (string item in blackList)
                        sanitizer.BlackList.Add(item);
                }
                return sanitizer.Sanitize(html);
            }

            /// <summary>
            /// Cleans up an HTML string by removing elements
            /// on the blacklist and all elements that start
            /// with onXXX.
            /// </summary>
            /// <param name="html"></param>
            /// <returns></returns>
            public string Sanitize(string html)
            {
                var doc = new HtmlDocument();
                doc.LoadHtml(html);
                SanitizeHtmlNode(doc.DocumentNode);

                //return doc.DocumentNode.WriteTo();

                string output = null;

                // Use an XmlTextWriter to create self-closing tags
                using (StringWriter sw = new StringWriter())
                {
                    XmlWriter writer = new XmlTextWriter(sw);
                    doc.DocumentNode.WriteTo(writer);
                    output = sw.ToString();

                    // strip off XML doc header
                    if (!string.IsNullOrEmpty(output))
                    {
                        int at = output.IndexOf("?>");
                        output = output.Substring(at + 2);
                    }

                    writer.Close();
                }
                doc = null;

                return output;
            }

            private void SanitizeHtmlNode(HtmlNode node)
            {
                if (node.NodeType == HtmlNodeType.Element)
                {
                    // check for blacklist items and remove
                    if (BlackList.Contains(node.Name))
                    {
                        node.Remove();
                        return;
                    }

                    // remove CSS Expressions and embedded script links
                    if (node.Name == "style")
                    {
                        if (string.IsNullOrEmpty(node.InnerText))
                        {
                            if (node.InnerHtml.Contains("expression") || node.InnerHtml.Contains("javascript:"))
                                node.ParentNode.RemoveChild(node);
                        }
                    }

                    // remove script attributes
                    if (node.HasAttributes)
                    {
                        for (int i = node.Attributes.Count - 1; i >= 0; i--)
                        {
                            HtmlAttribute currentAttribute = node.Attributes[i];

                            var attr = currentAttribute.Name.ToLower();
                            var val = currentAttribute.Value.ToLower();

                            // remove event handlers
                            if (attr.StartsWith("on"))
                                node.Attributes.Remove(currentAttribute);

                            // remove script links
                            else if (
                                //(attr == "href" || attr== "src" || attr == "dynsrc" || attr == "lowsrc") &&
                                val != null && val.Contains("javascript:"))
                                node.Attributes.Remove(currentAttribute);

                            // Remove CSS Expressions
                            else if ((attr == "style" && val != null && val.Contains("expression")) ||
                                     val.Contains("javascript:") || val.Contains("vbscript:"))
                                node.Attributes.Remove(currentAttribute);
                        }
                    }
                }

                // Look through child nodes recursively
                if (node.HasChildNodes)
                {
                    for (int i = node.ChildNodes.Count - 1; i >= 0; i--)
                    {
                        SanitizeHtmlNode(node.ChildNodes[i]);
                    }
                }
            }
        }
    }

    Please note: Use this as a starting point only for your own parsing and review the code for your specific use case! If your needs are less lenient than mine were you can make this much stricter by not allowing src and href attributes or CSS links if your HTML doesn't allow it. You can also check links for external URLs and disallow those - lots of options. The code is simple enough to make it easy to extend to fit your use cases more specifically. It's also quite easy to make this code work using a WhiteList approach if you want to go that route.
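    To make the entry points clearer, here is a small usage sketch (not part of the original post) showing how the class above might be called; the sample HTML string and the custom blacklist values are purely illustrative:

    // Usage sketch only - the input HTML and the custom blacklist below are illustrative.
    using Westwind.Web.Utilities;

    public class HtmlInputExample
    {
        public static void Run()
        {
            string userHtml = "<b>Here</b> <script>alert('hello')</script> " +
                              "<a href='http://west-wind.com'>West Wind</a>";

            // Static helper using the default blacklist (script, iframe, form, object, embed, link, head, meta)
            string safeHtml = HtmlSanitizer.SanitizeHtml(userHtml);

            // Or pass a custom blacklist - here only script and iframe elements are stripped
            string customHtml = HtmlSanitizer.SanitizeHtml(userHtml, "script", "iframe");

            // An instance can also be reused, with its BlackList property adjusted directly
            var sanitizer = new HtmlSanitizer();
            sanitizer.BlackList.Add("video");
            string result = sanitizer.Sanitize(userHtml);
        }
    }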
    The HtmlSanitizer code above is semi-generic for allowing full featured HTML fragments that only disallow script related content. The Sanitize method walks through each node of the document and then recursively drills into all of its children until the entire document has been traversed. Note that the code here uses an XmlTextWriter to write output - this is done to preserve XHTML style self-closing tags which are otherwise left as non-self-closing tags. The sanitizer code scans for blacklist elements and removes those elements not allowed. Note that the blacklist is configurable either in the instance class as a property or in the static method via the string parameter list. Additionally the code goes through each element's attributes and looks for a host of rules gleaned from some of the XSS cheat sheets listed at the end of the post. Clearly there are a lot more XSS vulnerabilities, but a lot of them apply to ancient browsers (IE6 and versions of Netscape) - many of these glaring holes (like CSS expressions - WTF IE?) have been removed in modern browsers.

    What a Pain

    To be honest this is NOT a piece of code that I wanted to write. I think building anything related to XSS is better left to people who have far more knowledge of the topic than I do. Unfortunately, I was unable to find a tool that worked even closely for me, or even provided a working base. For the project I was working on I had no choice and I'm sharing the code here merely as a base line to start with and potentially expand on for specific needs. It's sad that the Microsoft Web Protection Library is currently such a train wreck - this is really something that should come from Microsoft as the systems vendor or possibly a third party that provides security tools. Luckily for my application we are dealing with authenticated and validated users so the user base is fairly well known, and relatively small - this is not a wide open Internet application that's directly public facing.

    As I mentioned earlier in the post, if I had my way I would simply not allow this type of raw HTML input in the first place, and instead rely on a more controlled HTML input mechanism like Markdown or even a good HTML edit control that can provide some limits on what types of input are allowed. Alas in this case I was overridden and we had to go forward and allow *any* raw HTML posted. Sometimes I really feel sad that it's come this far - how many good applications and tools have been thwarted by fear of XSS (or worse) attacks? So many things that could be done *if* we had a more secure browser experience and didn't have to deal with every little script twerp trying to hack into Web pages and obscure browser bugs.
    So much time wasted building secure apps, so much time wasted by others trying to hack apps… We're a funny species - no other species manages to waste as much time, effort and resources as we humans do :-)

    Resources

    Code on GitHub
    Html Agility Pack
    XSS Cheat Sheet
    XSS Prevention Cheat Sheet
    Microsoft Web Protection Library (AntiXss)
    StackOverflow Links:
    http://stackoverflow.com/questions/341872/html-sanitizer-for-net
    http://blog.stackoverflow.com/2008/06/safe-html-and-xss/
    http://code.google.com/p/subsonicforums/source/browse/trunk/SubSonic.Forums.Data/HtmlScrubber.cs?r=61

    © Rick Strahl, West Wind Technologies, 2005-2012
    Posted in Security  HTML  ASP.NET  JavaScript

    Read the article

  • Jolicloud is a Nifty New OS for Your Netbook

    - by Matthew Guay
    Want to breathe new life into your netbook?  Here’s a quick look at Jolicloud, a unique new Linux based OS that lets you use your netbook in a whole new way. Netbooks have been an interesting category of computers.  When they were first released, most netbooks came with a stripped down Linux based operating system designed to let you easily access the internet first and foremost.  Consumers wanted more from their netbooks, so full OSes such as Windows XP and Ubuntu became the standard on netbooks.  Microsoft worked hard to get Windows 7 working great on netbooks, and today most netbooks run Windows 7 great.  But the Linux community hasn’t stood still either, and Jolicloud is proof of that.  Jolicloud is a unique OS designed to bring the best of both webapps and standard programs to your netbook.   Keep reading to see if this is the perfect netbook OS for you. Getting Started Installing Jolicloud on your netbook is easy thanks to a the Jolicloud Express installer for Windows.  Since many netbooks run Windows by default, this makes it easy to install Jolicloud.  Plus, your Windows install is left untouched, so you can still easily access all your Windows files and programs. Download and run the roughly 700Mb installer (link below) just as a normal installer in Windows. This will first extract the needed files. Click Get started to install Jolicloud on your netbook. Enter a username, password, and nickname for your computer.  Please note that the username must be all lowercase, and the nickname should not contain spaces or special characters.   Now you can review the default installation settings.  By default it will take up 39Gb and install on your C:\ drive in English.  If you wish to change this, click Change. We chose to install it on the D: drive on this netbook, as its harddrive was already partitioned into two parts.  Click Save when your settings are all correct, and then click Next in the previous window. Jolicloud will prepare for the installation.  This took about 5 minutes in our test.  Click Next when this is finished. Click Restart now to install and run Jolicloud. When your netbook reboots, it will initialize the Jolicloud setup. It will then automatically finish the installation.  Just sit back and wait; there’s nothing for you to do right now.  The installation took about 20 minutes in our test. Jolicloud will automatically reboot when the setup is finished. Once it’s rebooted, you’re ready to go!  Enter the username, then the password, that you chose earlier when you were installing Jolicloud from Windows. Welcome to your Jolicloud desktop! Hardware Support We installed Jolicloud on a Samsung N150 netbook with an Atom N450 processor, 1Gb Ram, 250Gb harddrive, and WiFi b/g/n with Bluetooth.  Amazingly, once Jolicloud was installed, everything was ready to use.  No drivers to install, no settings to hassle with, it was all installed and set up perfectly.  Power settings worked great, and closing the netbook put it to sleep just like in Windows. WiFi drivers have typically been difficult to find and install on Linux, but Jolicloud had our netbook’s wifi working immediately.  To get online, simply click the Wireless icon on the top right, and select the wireless network you want to connect to. Jolicloud will let you know when it is signed on. Wired Lan networking was also seamless; simply connect your cable and you’re ready to go.  The webcam and touchpad also worked perfectly directly.  
    The only thing missing was multitouch; this touchpad has two finger scroll, pinch zoom, and other nice multitouch features in Windows, but in Jolicloud it only functioned as a standard touchpad.  It did have tap to click activated by default, as well as right-side scrolling, which is nice. Jolicloud also supported our video card without any extra work.  The native resolution was already selected, and the only problem we had with the screen was that there was no apparent way to change the brightness.  This is not a major problem, but would be nice to have.  The Samsung N150 has Intel GMA3150 integrated graphics, and Jolicloud promises 1080p HD video on it.  It did play back 720p H.264 video flawlessly without installing anything extra, but it stuttered on full 1080p HD (which is the exact same as this netbook’s video playback in Windows 7 – 720p works great, but it stutters on 1080p).  We would be excited to see full HD on this netbook, but 720p is definitely fine for most stuff.   Jolicloud supports a wide range of netbooks, and based on our experience we would expect it to work as well on any supported hardware.  Check out the list of supported netbooks to see if your netbook is supported; if not, it still may work but you may have to install special drivers. Jolicloud’s performance was very similar to Windows 7 on our netbook.  It boots in about 30 seconds, and apps load fairly quickly.  In general, we couldn’t tell much difference in performance between Jolicloud and Windows 7, though this isn’t a problem since Windows 7 runs great on the current generation of netbooks.

    Using Jolicloud

    Ready to start putting Jolicloud to use?  With a fresh Jolicloud install you can run several built-in apps, such as Firefox, a calculator, and the chat client Pidgin.  It also has a media player and file viewer installed, so you can play MP3s or MPG videos, or read PDF ebooks without installing anything extra.  It also has Flash player installed so you can watch videos online easily. You can also directly access all of your files from the right side of your home screen.  You can even access your Windows files; in our test, the 116.9 GB Media was C: from Windows.  Select it to browse and open any file you had saved in Windows. You may need to enter your password to access it. Once you’ve authenticated, you’ll see all of your Windows files and folders.  Your User files (Documents, Music, Videos, etc.) will be in the Users folder. And, you can easily add files from removable media such as USB flash drives and memory cards.  Jolicloud recognized a flash drive we tested with no trouble at all.

    Add new apps

    But, the best part about Jolicloud is that it makes it very easy to install new apps.  Click the Get Started button on your homescreen. You’ll first need to create an account.  You can then use this same account on another netbook if you wish, and your settings will automatically be synced between the two. You can either sign up using your Facebook account, …or you can sign up the traditional way with your email address, name, and password.  If you sign up this way, you will need to confirm your email address before your account will be finished. Now, choose your netbook model from the list, and enter a name for your computer. And that’s it!  You’ll now see the Jolicloud dashboard, which will show you updates and notifications from friends who also use Jolicloud. Click the App directory to find new apps for your netbook.
    Here you will find a variety of webapps, such as Gmail, along with native applications, such as Skype, that you can install on your netbook.  Simply click the Install button on the right to add the app to your netbook. You will be prompted to enter your system password, and then the app will install without any further input.   Once an app is installed, a check mark will appear beside its name.  You can remove it by clicking the Remove button, and it will uninstall seamlessly. Webapps, such as Gmail, actually run in a Chrome-powered window that lets the webapp run full screen.  This gives the webapps a native feel, but actually they’re just running the same as they would in a standard web browser.

    The Jolicloud Interface

    Most apps run maximized, and there is no way to run them smaller.  This in general works well, since with small screens most apps need to run full-screen anyhow. Smaller apps, such as a calculator or the Pidgin chat client, run in a window just like they do on other operating systems. You can switch to another app that’s running by selecting its icon on the top left, or you can go back to the home screen by clicking the home button.  If you’re finished with a program, simply click the red X button on the top right of the window when you’re running it. Or, you can switch between programs using standard keyboard shortcuts such as Alt-tab. The default page on the home screen is the favorites page, and all of your other programs are organized in their own sections on the left hand side.  But, if you want to add one of these to your favorites page, simply right-click on it and select Add to Favorites. When you’re done for the day, you can simply close your netbook to put it to sleep.  Or, if you want to shut down, just press the Quit button on the bottom right of the home screen and then select Shut Down.

    Booting Jolicloud

    When you install Jolicloud, it will set itself as the default operating system.  Now, when you boot your netbook, it will show you a list of installed operating systems.  You can select either Windows or Jolicloud, but if you don’t make a selection it will boot into Jolicloud after waiting 10 seconds. If you’d prefer to boot into Windows by default, you can easily change this.  First, boot your netbook into Windows.  Open the start menu, right-click on the Computer button, and select Properties.   Click the “Advanced system settings” link on the left side. Click the Settings button in the Startup and Recovery section. Now, select Windows as the default operating system, and click Ok.  Your netbook will now boot into Windows by default, but will give you 10 seconds to choose to boot into Jolicloud when you start your computer. Or, if you decided you don’t want Jolicloud, you can easily uninstall it from within Windows. Please note that this will also remove any files you may have saved in Jolicloud, so be sure to copy them to your Windows drive before uninstalling. To uninstall Jolicloud from within Windows, open Control Panel, and select Uninstall a Program. Scroll down to select Jolicloud, and click Uninstall/Change. Click Yes to confirm that you want to uninstall Jolicloud. After a few moments, it will let you know that Jolicloud has been uninstalled.  Your netbook is now back the same as it was before you installed Jolicloud, with only Windows installed.

    Closing

    Whether you’re wanting to replace your current OS on your netbook or would simply like to try out a fresh new Linux version on your netbook, Jolicloud is a great option for you.
    We were very impressed by its solid hardware support and the ease of installing new apps in Jolicloud.  Rather than simply giving us a standard OS, Jolicloud offers a unique way to use your netbook with native programs and webapps.  And whether you’re an IT pro or are a new computer user, Jolicloud was easy enough to use that anyone can do it.  Give it a try, and let us know what your favorite netbook OS is!

    Link
    Download Jolicloud for your netbook

    Read the article

  • Using an alternate search platform in Commerce Server 2009

    - by Lewis Benge
    Although Microsoft Commerce Server 2009's architecture is built upon Microsoft SQL Server, and has the full power of the SQL Full Text Indexing Search Platform, there are times, however, when you may require a richer or alternate search platform. One of these scenarios is when you want to implement a faceted (refinement) search into your site, which provides dynamic refinements based on the search results dataset. Faceted search is becoming popular in most online retail environments as a way of providing an enhanced user experience when browsing a larger catalogue. This is powerful for two reasons. Firstly, with a traditional search it is down to a user to think of a search term suitable for the product they are trying to find. This typically will not return similar products or help in any way to refine a larger dataset. Faceted searches on the other hand provide a comprehensive list of product properties, grouped together by similarity to help the user narrow down the results returned; as the user progressively restricts the search criteria by selecting additional criteria to search again, these facets need to continually refresh. The whole experience allows users to explore alternate brands, price-ranges, or find products they hadn't initially thought of or were looking for in a bid to enhance cross sell in the retail environment. The second advantage of this type of search from a business perspective is also to harvest the search result to start to profile your user. Even though anonymous users may routinely visit your site, and will not necessarily register or complete a transaction to build up marketing data-profiling, you can still achieve the same result by recording search facets used within the search sequence. Below is a faceted search scenario generated from eBay using the search term "server". By creating a search profile of clicking through Computer & Networking -> Servers -> Dell -> New and recording this information against my user profile you can start to predict with a lot more certainty what types of products I am interested in. This will allow you to apply shopping-cart analysis against your search data and provide great cross-sale or advertising opportunity, or personalise the user experience based on your prediction of what the user may be interested in. This type of search is extremely beneficial in e-Commerce environments but achieving it out of the box with Commerce Server and SQL Full Text indexing can be challenging. In many deployments it is often easier to use an alternate search platform such as Microsoft's FAST, Apache SOLR, or Endeca, however you still want these products to integrate natively into Commerce Server to ensure that up-to-date inventory information is presented, profile information is generated, and you provide a consistent API. To do so we make the most of the Commerce Server extensibility points called operation sequence components. In this example I will be talking about Apache Solr hosted on Apache Tomcat; in this specific example I have used the SolrNet C# library to interface to the Java platform. Also I am not going to talk about Solr configuration of indexing – but in a production environment this would typically happen by using Powershell to call the Commerce Server management webservice to export your catalog as XML, apply an XSLT transform to the file to make it conform to SOLR and use a simple HTTP Post to send it to the search engine for indexing.
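    To make that indexing flow a little more concrete, here is a rough sketch (not from the original article) of transforming an exported catalog file and posting it to Solr's XML update handler; the file names, the XSLT stylesheet and the Solr URL are all assumptions:

    // Sketch only: paths, the stylesheet and the Solr core URL are placeholders, not the article's values.
    using System.IO;
    using System.Net;
    using System.Xml.Xsl;

    class CatalogIndexer
    {
        static void Main()
        {
            // 1. Transform the exported Commerce Server catalog XML into Solr's <add><doc> format
            var xslt = new XslCompiledTransform();
            xslt.Load("CatalogToSolr.xslt");                      // hypothetical stylesheet
            xslt.Transform("CatalogExport.xml", "SolrDocs.xml");

            // 2. Post the transformed documents to the Solr update handler, then commit
            using (var client = new WebClient())
            {
                string solrUpdateUrl = "http://localhost:8983/solr/update";   // assumed URL

                client.Headers[HttpRequestHeader.ContentType] = "text/xml";
                client.UploadData(solrUpdateUrl, "POST", File.ReadAllBytes("SolrDocs.xml"));

                client.Headers[HttpRequestHeader.ContentType] = "text/xml";
                client.UploadString(solrUpdateUrl, "POST", "<commit/>");
            }
        }
    }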
    Essentially a sequence component is a step in a serial workflow used to call a data repository (which in most cases is usually the Commerce Server pipelines or databases) and map to and from a Commerce Entity object whilst enforcing any business rules. So the first step in the process is to add a new class library to your existing Commerce Server site. You will need to use a new library as Sequence Components will need to be strongly named to be deployed. Once you are inside of your new project, add a new class file and add a reference to the Microsoft.Commerce.Providers, Microsoft.Commerce.Contracts and the Microsoft.Commerce.Broker assemblies. Now make your new class derive from the base object Microsoft.Commerce.Providers.Components.OperationSequenceComponent and override the ExecuteQuery method. Your screen will then look something similar to this: As all we are doing on this component is conducting a search we are only interested in the ExecuteQuery method. This method accepts three arguments, queryOperation, operationCache, and response. The queryOperation will be the object in which we receive our search parameters, the cache allows access to the Commerce Server cache allowing us to store regularly accessed information, and the response object is the object which we will return the result of our search upon. Inside this method is simply where we are going to inject our logic for our third party search platform. As I am not going to explain the inner-workings of actually making a SOLR call, I'll simply provide the sample code here. I would highly recommend however looking at the SolrNet wiki as they have some great explanations of how the API works. What you will find however is that there are some further extensions required when attempting to integrate a custom search provider. Firstly, out of the box the CommerceQueryOperation you will receive into the method when conducting a search against a catalog is specifically geared towards a SQL Full Text Search with properties such as a Where clause. To make the operation you receive more relevant you will need to create another class, this time derived from Microsoft.Commerce.Contract.Messages.CommerceSearchCriteria, and within this you need to detail the properties you will require to allow you to submit as parameters to the SOLR search API. My example looks like this: [DataContract(Namespace = "http://schemas.microsoft.com/microsoft-multi-channel-commerce-foundation/types/2008/03")] public class CommerceCatalogSolrSearch : CommerceSearchCriteria { private Dictionary<string, string> _facetQueries;   public CommerceCatalogSolrSearch() { _facetQueries = new Dictionary<String, String>();   }     public Dictionary<String, String> FacetQueries { get { return _facetQueries; } set { _facetQueries = value; } }   public String SearchPhrase{ get; set; } public int PageIndex { get; set; } public int PageSize { get; set; } public IEnumerable<String> Facets { get; set; }   public string Sort { get; set; }   public new int FirstItemIndex { get { return (PageIndex-1)*PageSize; } }   public int LastItemIndex { get { return FirstItemIndex + PageSize; } } }  To allow you to construct a CommerceQueryOperation call within the API you will also need to construct another class, this time derived from Microsoft.Commerce.Common.MessageBuilders.CommerceSearchCriteriaBuilder, which is simply used to construct an instance of the CommerceQueryOperation you have just created and expose the properties you want set.
    My Message builder looks like this: public class CommerceCatalogSolrSearchBuilder : CommerceSearchCriteriaBuilder { private CommerceCatalogSolrSearch _solrSearch;   public CommerceCatalogSolrSearchBuilder() { _solrSearch = new CommerceCatalogSolrSearch(); }   public String SearchPhrase { get { return _solrSearch.SearchPhrase; } set { _solrSearch.SearchPhrase = value; } }   public int PageIndex { get { return _solrSearch.PageIndex; } set { _solrSearch.PageIndex = value; } }   public int PageSize { get { return _solrSearch.PageSize; } set { _solrSearch.PageSize = value; } }   public Dictionary<String,String> FacetQueries { get { return _solrSearch.FacetQueries; } set { _solrSearch.FacetQueries = value; } }   public String[] Facets { get { return _solrSearch.Facets.ToArray(); } set { _solrSearch.Facets = value; } } public override CommerceSearchCriteria ToSearchCriteria() { return _solrSearch; } }  Once you have these two classes in place you can now safely cast the CommerceOperation you receive as an argument of the overridden ExecuteQuery method in the SequenceComponent to the CommerceCatalogSolrSearch operation you have just created, e.g. public CommerceCatalogSolrSearch TryGetSearchCriteria(CommerceOperation operation) { var searchCriteria = operation as CommerceQueryOperation; if (searchCriteria == null) throw new Exception("No search criteria present");   var local = (CommerceCatalogSolrSearch) searchCriteria.SearchCriteria; if (local == null) throw new Exception("Unexpected Search Criteria in Operation");   return local; }  Now that you have all of your search parameters present, you can go off and call the external search platform API. You will of course get proprietary objects returned, so the next step in the process is to convert the results being returned back into CommerceEntities. You do this via another extensibility point within the Commerce Server API called translators. A translator is another separate class, this time inheriting the interface Microsoft.Commerce.Providers.Translators.IToCommerceEntityTranslator. As you can imagine this interface is specific to the conversion of the object TO a CommerceEntity; you will need to implement a separate interface if you also need to go in the opposite direction. If you implement the required method for the interface you will get a single Translate method which has a source object, a destination CommerceEntity, and a collection of properties as arguments. For simplicity's sake in this example I have hard-coded the mappings, however best practice would dictate you map the objects using your MetadataDefinitions.xml file.
    Once complete your translator would look something like the following: public class SolrEntityTranslator : IToCommerceEntityTranslator { #region IToCommerceEntityTranslator Members   public void Translate(object source, CommerceEntity destinationCommerceEntity, CommercePropertyCollection propertiesToReturn) { if (source.GetType().Equals(typeof (SearchProduct))) { var searchResult = (SearchProduct) source;   destinationCommerceEntity.Id = searchResult.ProductId; destinationCommerceEntity.SetPropertyValue("DisplayName", searchResult.Title); destinationCommerceEntity.ModelName = "Product";   } }  Once you have a translator in place you can then safely map the results of your search platform into Commerce Entities and attach them on to the CommerceResponse object in a fashion similar to this: foreach (SearchProduct result in matchingProducts) { var destinationEntity = new CommerceEntity(_returnModelName);   Translator.ToCommerceEntity(result, destinationEntity, _queryOperation.Model.Properties); response.CommerceEntities.Add(destinationEntity); }  In SOLR I actually have two objects being returned – a product, and a collection of facets, so I have an additional translator for facets (which maps to a custom facet CommerceEntity) and my facet response from SOLR is passed into the Translator helper class separately. When all of this is pieced together you have successfully completed the extensibility point coding. You would have created a new OperationSequenceComponent, a custom SearchCriteria object and message builder class, and translators to convert the objects into Commerce Entities. Now you simply need to configure them, and can start calling them in your code. Make sure you sign your assembly, compile it and identify its signature. Next you need to put a reference to your new assembly into the Channel.Config configuration file replacing that of the existing SQL Full Text component: You will also need to add your translators to the Translators node of your Channel.Config too: Lastly add any custom CommerceEntities you have developed to your MetadataDefinitions.xml file. Your configuration is now complete, and you should now be able to happily make a call to the Commerce Foundation API, which will act as a proxy to your third party search platform and return back CommerceEntities of your search results. If you require data to be enriched, or logged, or any other logic applied then simply add further sequence components into the OperationSequence (obviously keeping the search response first) to the node of your Channel.Config file. Now to call your code you simply request it as per any other CommerceQuery operation, but taking into account you may be receiving multiple types of CommerceEntity returned: public KeyValuePair<FacetCollection ,List<Product>> DoFacetedProductQuerySearch(string searchPhrase, string orderKey, string sortOrder, int recordIndex, int recordsPerPage, Dictionary<string, string> facetQueries, out int totalItemCount) { var products = new List<Product>(); var query = new CommerceQuery<CatalogEntity, CommerceCatalogSolrSearchBuilder>();   query.SearchCriteria.PageIndex = recordIndex; query.SearchCriteria.PageSize = recordsPerPage; query.SearchCriteria.SearchPhrase = searchPhrase; query.SearchCriteria.FacetQueries = facetQueries;     totalItemCount = 0; CommerceResponse response = SiteContext.ProcessRequest(query.ToRequest()); var queryResponse = response.OperationResponses[0] as CommerceQueryOperationResponse;   // No results.
    Return the empty list if (queryResponse != null && queryResponse.CommerceEntities.Count == 0) return new KeyValuePair<FacetCollection, List<Product>>();   totalItemCount = (int)queryResponse.TotalItemCount;   // Prepare a multi-operation to retrieve the product variants var multiOperation = new CommerceMultiOperation();     //Add products to results foreach (Product product in queryResponse.CommerceEntities.Where(x => x.ModelName == "Product")) { var productQuery = new CommerceQuery<Product>(Product.ModelNameDefinition); productQuery.SearchCriteria.Model.Id = product.Id; productQuery.SearchCriteria.Model.CatalogId = product.CatalogId;   var variantQuery = new CommerceQueryRelatedItem<Variant>(Product.RelationshipName.Variants);   productQuery.RelatedOperations.Add(variantQuery);   multiOperation.Add(productQuery); }   CommerceResponse variantsResponse = SiteContext.ProcessRequest(multiOperation.ToRequest()); foreach (CommerceQueryOperationResponse queryOpResponse in variantsResponse.OperationResponses) { if (queryOpResponse.CommerceEntities.Count() > 0) products.Add(queryOpResponse.CommerceEntities[0]); }   //Get facet collection FacetCollection facetCollection = queryResponse.CommerceEntities.Where(x => x.ModelName == "FacetCollection").FirstOrDefault();     return new KeyValuePair<FacetCollection, List<Product>>(facetCollection, products); }    ...And that is it – simply a few classes and some configuration will allow you to extend the Commerce Server query operations to call a third party search platform, whilst still maintaining a unified API in the remainder of your code. This logic stands for any extensibility within Commerce Server which requires execution in a serial fashion, such as calls to LOB systems or web services to validate or enrich data. Feel free to use this example on other applications, and if you have any questions please feel free to e-mail and I'll help out where I can!
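    As a final recap (this skeleton is not from the article), the pieces above end up inside the overridden ExecuteQuery of the sequence component roughly like this; the parameter type names and the QuerySolr helper are my assumptions:

    // Condensed skeleton only - parameter types and QuerySolr are assumed, not the article's code.
    public class SolrCatalogSearchComponent : OperationSequenceComponent
    {
        public override void ExecuteQuery(CommerceQueryOperation queryOperation,
                                          OperationCacheDictionary operationCache,
                                          CommerceQueryOperationResponse response)
        {
            // Pull the strongly typed criteria out of the incoming operation
            CommerceCatalogSolrSearch criteria = TryGetSearchCriteria(queryOperation);

            // Call the external search platform (for example via SolrNet); details omitted here
            var solrResults = QuerySolr(criteria);   // hypothetical helper wrapping the SolrNet call

            // Convert the proprietary results into CommerceEntities and attach them to the response
            foreach (var result in solrResults)
            {
                var destinationEntity = new CommerceEntity("Product");
                Translator.ToCommerceEntity(result, destinationEntity, queryOperation.Model.Properties);
                response.CommerceEntities.Add(destinationEntity);
            }
        }

        // Hypothetical placeholder - the real component would call SolrNet and return its result objects
        private System.Collections.Generic.IEnumerable<object> QuerySolr(CommerceCatalogSolrSearch criteria)
        {
            return new System.Collections.Generic.List<object>();
        }
    }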

    Read the article

  • Connecting to DB2 from SSIS

    - by Christopher House
    The project I'm currently working on involves moving various pieces of data from a legacy DB2 environment to some SQL Server and flat file locations.  Most of the data flows are real time, so they were a natural fit for the client's MQSeries on their iSeries servers and BizTalk to handle the messaging.  Some of the data flows, however, are daily batch type transmissions.  For the daily batch transmissions, it was decided that we'd use SSIS to pull the data direct from DB2 to either a SQL Server or flat file.  I'm not at all an SSIS guy, I've done a bit here and there, but mainly for situations where we needed to move data from a dev environment to QA, mostly informal stuff like that.  And, as much as I'm not an SSIS guy, I'm even less a DB2/iSeries guy.  Prior to this engagement, my knowledge of DB2 was limited to the fact that it's an IBM product and that it was probably a DBMS platform (that's what the DB in DB2 means, right?).   One of my first goals when I came onto this project was to develop a POC SSIS package to pull some data from DB2 and dump it to a flat file.  It sounded like a pretty straightforward task.  As always, the devil is in the details.  Configuring the DB2 connection manager took a bit of trial and error.  As such, I thought I'd post my experiences here in hopes that they might save someone the efforts I went through.  That being said, please keep in mind, as I pointed out, I'm not at all a DB2 guy, so my terminology and explanations may not be 100% spot on. Before you get started, you need to figure out how you're going to connect to DB2.  From the research I did, it looks like there are a few options.  IBM has both an OLE DB and .Net data provider which can be found here.  I installed their client access tools and tried to use both the .Net and OLE DB providers but I received an error message from both when attempting to connect to the iSeries that indicated I needed a license for a product called DB2 Connect.  I inquired with one of my client's iSeries resources about a license for this product and it appears they didn't have one, so that meant the IBM drivers were out.  The other option that I found quite a bit of discussion around was Microsoft's OLE DB Provider for DB2.  This driver is part of the feature pack for SQL Server 2008 Enterprise Edition and can be downloaded here. As it turns out, I already had Microsoft's driver installed on my dev VM, which struck me as odd since I hadn't installed it.  I discovered that the driver is installed with the BizTalk adapter pack for host systems, which was also installed on my VM.  However, it looks like the version used by the adapter pack is newer than the version provided in the SQL Server feature pack.   Once you get the driver installed, create a connection manager in your package just like you normally would and select the Microsoft OLE DB Provider for DB2 from the list of available drivers. After you select the driver, you'll need to enter in your host name, login credentials and initial catalog. A couple of things to note here.  First, the Initial catalog needs to be the same as your host name.  Not sure why that is, but trust me, it just does.  Second, for credentials, in my environment, we're using what the client's iSeries people refer to as "profiles".  I guess this is similar to SQL auth in the SQL Server world.  In other words, they've given me a username and password for connecting to DB2, so I've entered it here. Next, click the Data Links button.
    On the Data Links screen, enter your package collection on the first tab. Package collection is one of those DB2 concepts I'm still trying to figure out.  From the little bit I've read, packages are used to control SQL compilation and each DB2 connection needs one.  The package collection, I believe, controls where your package is created.  One of the iSeries folks I've been working with told me that I should always use QGPL for my package collection, as QGPL is "general purpose" and doesn't require any additional authority. Next click the ellipsis next to the Network drop-down.  Here you'll want to enter your host name again. Again, not sure why you need to do this, but trust me, my connection wouldn't work until I entered my hostname here. Finally, go to the Advanced tab, select your DBMS platform and check Process binary as character. My environment is DB2 on the iSeries and iSeries is the replacement for AS/400, so I selected DB2/AS400 for my platform.  Process binary as character was necessary to handle some of the DB2 data types.  I had a few columns that showed all their data as "System.Byte[]".  Checking Process binary as character resolved this. At this point, you should be good to go.  You can go back to the Connection tab on the Data Links dialog to perform a couple of tests to validate your configuration.  The Test Connection button is obvious; this just verifies you can connect to the host using the configuration data you've entered.  The Packages button will attempt to connect to the host and create the packages required to execute queries. This isn't meant to be a comprehensive look at SSIS and DB2; these are just some of the notes I've come up with since I've started working with DB2 and SSIS.  I'm sure as I continue developing my packages, I'll find more quirks and will post them here.
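    As a footnote, the settings described above usually end up serialized into an OLE DB connection string along these lines. The property names here are approximate (from memory, not from this article), so verify them against the provider's documentation before relying on them:

    // Approximate shape of a Microsoft OLE DB Provider for DB2 connection string - names may differ slightly.
    string db2ConnectionString =
        "Provider=DB2OLEDB;" +
        "Network Transport Library=TCPIP;" +
        "Network Address=MYISERIESHOST;" +          // host name
        "Initial Catalog=MYISERIESHOST;" +          // same value as the host name, as noted above
        "Package Collection=QGPL;" +
        "Default Schema=MYLIB;" +                   // hypothetical library/schema
        "User ID=myprofile;Password=mypassword;" +
        "Process Binary as Character=True;" +
        "DBMS Platform=DB2/AS400;";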

    Read the article

  • Great library of ASP.NET videos – Pluralsight!

    - by hajan
    I have been subscribed to the Pluralsight website and of course, since ASP.NET is my favorite development technology, I have gone through a few series of videos related to ASP.NET. There is a list of ASP.NET galleries from Fundamentals to Advanced topics, including the latest features of ASP.NET 4.0, ASP.NET Ajax, ASP.NET MVC etc. Most of the speakers are either Microsoft MVPs or known technology experts! I was really curious to see the way they have organized the course materials, and trust me, I was quite amazed. I watched the ASP.NET 4.0 video series to confirm my knowledge, and some other video series regarding general software development concepts, design patterns etc. I would like to point out that if any of you are interested in getting a FREE 1-week .NET training pass to the Pluralsight library, please CONTACT ME: write your name and email and include the purpose of the message in the content. I hope you will find this useful. Regards, Hajan

    Read the article

  • CRM@Oracle Series: Forecasting

    - by tony.berk
    What do you trust more: the weather forecast or your sales forecast? I hope the answer is your sales forecast! Either way, would your sales forecast be more accurate if sales management had visibility into what the sales reps are forecasting and what has changed since the last forecast? What if management could adjust forecasts for accuracy based on analytic tools? Today's slidecast discusses sales forecasting and how Oracle implemented forecasting in our global implementation of Siebel CRM, including the steps involved to roll up the forecast. CRM@Oracle - Forecasting Click here to learn more about Oracle CRM products and here to learn about other customers using Oracle CRM. Are you enjoying the CRM@Oracle Series? If you have a particular CRM area or function which you'd like to hear how Oracle implemented it internally, let us know and we'll get it on our list.

    Read the article

  • Thomas Kurian's COLLABORATE Keynote: Process not Product

    - by Aaron Lazenby
    Right off the bat, Oracle's Senior Vice President of Server Technologies Development made his purpose very clear: demonstrate how the elements of the Oracle product stack are evolving (and integrating) together. There are some great details about the new functionality of each Oracle application line and how the different products sync and interact. The lifecycle charts in Kurian's presentation illustrate how data can flow from Oracle Demantra into Oracle E-Business Suite and back out to an Oracle Agile system to support value chain planning. With so many products at play in the enterprise, Kurian shows that if you trust that your systems can work together, IT strategy becomes much more about managing business processes than managing software products.

    Read the article

  • Introducing SQLPeople - the Blog Series!

    - by andyleonard
    Introduction The first 50.5 weeks of 2010 have been interesting, to say the least. My experiences in 2010 can best be summed up in a single word: educational. I've learned a lot this year! One important thread wove its way through my 2010 experiences... Relationships Are Everything How we interact defines community. Relationships define community. Our community is more than the sum of our members. Trust and respect are the capital of community. And just like money, this capital can be invested, exchanged,...(read more)

    Read the article

  • Don&rsquo;t break that sandbox

    - by Sahil Malik
    SharePoint 2010 Training: more information

    Hmm .. I hear that some soldiers are spreading rumors that it is OKAY to edit the WSS_Sandbox trust level inside of SharePoint. After all, it is just .NET code, right? And it’s just a CAS policy, so why not make that tempting little tweak, and well – all I wanna do is call web services! Ummmm ..   DON’T DO IT!   Yes I know it’s just .NET code! But Microsoft has spent a great deal of time, resources, and thought in crafting up the boundary of what a sandbox solution can do, and what it cannot do. As soon as you make that tiny little tweak to allow calling web services, you just opened a bunch of security holes in your SharePoint installation. Not to mention, you broke the first cardinal rule of your SharePoint solutions, which is, “No Microsoft files were hurt in the building of this solution”. Read full article ....

    Read the article

  • How to Convert a PFX Certificate into a JKS Certificate to configure it on WebLogic

    - by adejuanc
    To convert a pfx cert file to a jks file, please follow these instructions: 1. Set up the environment for the domain, by executing the setDomainEnv.sh script, typically located at $DOMAIN_HOME/bin. $ . ./setDomainEnv.sh 2. Use OpenSSL to check the pfx certificate's content. $ openssl pkcs12 -in <certificate.pfx> -out KEYSTORE.pem -nodesAt this point, a password for the pfx file will be requested. Expected output: $ openssl pkcs12 -in <certificate.pfx> -out KEYSTORE.pem -nodesEnter Import Password:MAC verified OK3. Open KEYSTORE.pem file, from step 2. This should look similar to this:You will find three certificates on it and the private key: Bag Attributes Microsoft Local Key set: <No Values> localKeyID: 01 00 00 00 friendlyName: le-36c42c6e-ec49-413c-891e-591f7e3dd306 Microsoft CSP Name: Microsoft RSA SChannel Cryptographic ProviderKey Attributes X509v3 Key Usage: 10-----BEGIN RSA PRIVATE KEY-----MIIEpQIBAAKCAQEAtPwoO3eOwSyOapzZgcDnQOH27cOaaejHtNh921Pd+U4N+dlm...EDITING...R5rsB00Yk1/2W9UqD9Nn7cDuMdilS8g9CUqnnSlDkSG0AX67auKUAcI=-----END RSA PRIVATE KEY-----Bag Attributes localKeyID: 01 00 00 00 friendlyName: *.something.comsubject=/serialNumber=sj6QjpTjKcpQGZ9QqWO-pFvsakS1t8MV/C=US/ST=Missouri/L=CHESTERFIELD/O=Oracle_Corp, Inc./OU=Oracle/CN=*.something.comissuer=/C=US/O=GeoTrust, Inc./CN=GeoTrust SSL CA-----BEGIN CERTIFICATE-----MIIErzCCA5egAwIBAgIDAIH6MA0GCSqGSIb3DQEBBQUAMEAxCzAJBgNVBAYTAlVT...EDITING...wA5JxaU55teoWkuiAaYRQpuLepJfzw+qMk5i5FpMRbVMMfkcBusGtdW5OrAoYDL94rgR-----END CERTIFICATE-----Bag Attributes friendlyName: GeoTrust Global CAsubject=/C=US/O=GeoTrust Inc./CN=GeoTrust Global CAissuer=/C=US/O=GeoTrust Inc./CN=GeoTrust Global CA-----BEGIN CERTIFICATE-----MIIDVDCCAjygAwIBAgIDAjRWMA0GCSqGSIb3DQEBBQUAMEIxCzAJBgNVBAYTAlVT...EDITING...5fEWCRE11azbJHFwLJhWC9kXtNHjUStedejV0NxPNO3CBWaAocvmMw==-----END CERTIFICATE-----Bag Attributes: <Empty Attributes>subject=/C=US/O=GeoTrust, Inc./CN=GeoTrust SSL CAissuer=/C=US/O=GeoTrust Inc./CN=GeoTrust Global CA-----BEGIN CERTIFICATE-----MIID2TCCAsGgAwIBAgIDAjbQMA0GCSqGSIb3DQEBBQUAMEIxCzAJBgNVBAYTAlVT...EDITING...TpnKXKBuervdo5AaRTPvvz7SBMS24CqFZUE+ENQ=-----END CERTIFICATE-----4. Identify and store contents from KEYSTORE.pem certificate, to proceed and create jks files:At this point, you will find three certificates on KEYSTORE.pem and the private key. 4.1 Private Key.To identify the private key, look for the following headings: -----BEGIN RSA PRIVATE KEY----------END RSA PRIVATE KEY-----Both above mentioned tags will be surrounded the private key. Go ahead and save the content of it into a file called: my_key_pk.pem. This has to include the headings. Expected file: -----BEGIN RSA PRIVATE KEY-----MIIEpQIBAAKCAQEAtPwoO3eOwSyOapzZgcDnQOH27cOaaejHtNh921Pd+U4N+dlm...EDIT...Y4ZrW12PRa9/EOBGTG5teKAEada/K4yKReTyQQAGq6j5RjErmuuKkKgPGMSCjvMSR5rsB00Yk1/2W9UqD9Nn7cDuMdilS8g9CUqnnSlDkSG0AX67auKUAcI=-----END RSA PRIVATE KEY-----4.2 Root Certificate.To identify the Root Certificate, look for the following headings: subject=/C=US/O=GeoTrust Inc./CN=GeoTrust Global CA issuer=/C=US/O=GeoTrust Inc./CN=GeoTrust Global CA Subject and issuer must be the same. Go ahead and save the content of it into a file called: my_key_root.pem. Include all the content from BEGIN CERTIFICATE TO END CERTIFICATE, both included.4.3 Intermediate Certificate.To identify an Intermediate Certificate, look for the following heading: subject=/C=US/O=GeoTrust, Inc./CN=GeoTrust SSL CAissuer=/C=US/O=GeoTrust Inc./CN=GeoTrust Global CA Subject and issuer are different only on the CN. 
Go ahead and save the content of it into a file called: my_key_intermediate.pem. Include all the content from BEGIN CERTIFICATE TO END CERTIFICATE, both included. NOTE: This certificate is optional and there are some cases where it'll not be present. If this is the case, go ahead and skip this step. In any other case, this needs to be added to the identity keystore jks file. 4.4 Server Certificate. To identify a Server Certificate, look for the following heading: friendlyName: some.thing.comsubject=/serialNumber=sj6QjpTjKcpQGZ9QqWO-pFvsakS1t8MV/C=US/ST=Missouri/L=CHESTERFIELD/O=Oracle_Corp, Inc./OU=Oracle/CN=some.thing.com        A server certificate includes a heading called Friendly Name. Go ahead and save the content of it into a file called: my_key_crt.pem. Include all the content from BEGIN CERTIFICATE TO END CERTIFICATE, both included.5. Create a Trust Keystore and import the Root certificate into it. $ keytool -import -trustcacerts -file my_key_root.pem -alias my_key_root -keystore my_key_trust.jks -storepass <store_pass> -keypass <key_pass>Expected Output: Certificate already exists in system-wide CA keystore under alias <geotrustglobalca> Do you still want to add it to your own keystore? [no]: yes Certificate was added to keystore6. Generate an Identity Keystore and import Server into it. $java utils.ImportPrivateKey -keystore my_key_identity.jks -storepass <store_pass> -storetype JKS -keypass <key_pass> -alias server_identity -certfile my_key_crt.pem -keyfile my_key_pk.pem -keyfilepass <pfx_password> With these instructions, two jks files will be produced: my_key_identity.jks my_key_trust.jks With both files, the next step is to configure Custom Identity and Custom Trust on WebLogic Server.
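    Before moving on to the WebLogic configuration it can be worth sanity-checking both stores. This verification step is an addition to the instructions above, with the aliases taken from steps 5 and 6:

    $ keytool -list -v -keystore my_key_identity.jks -storepass <store_pass>
    $ keytool -list -v -keystore my_key_trust.jks -storepass <store_pass>

    The identity keystore should show a PrivateKeyEntry under the alias server_identity, and the trust keystore a trustedCertEntry under the alias my_key_root.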

    Read the article

  • Opinion on LastPass's security for the Average Joe [closed]

    - by Rook
    This is borderline on objective/subjective, but I'm posting it here since I'm more interested in objective facts, without going into too much technical details, than I am in user reviews of LastPass. I've always used offline ways for (password / sensitive data) storage, but lately I keep hearing good things about LastPass. Indeed, it is more practical having it always accessible from every computer you're using without syncing and related problems, but the security aspect still troubles me. How (in a nutshell for dummies) does LastPass keep your data secure / can their employees see your data, and what is your opinion for such storage of more than usual keeping of sensitive data (bank PIN codes, some financial / business related stuff and so on - you know, the things that would practically hurt if lost / phished)? What are your opinions of it, and do you trust it for such? Any bad experiences? If someone for example is sniffing your wifi network, would such data be easier than usual to sniff out?

    Read the article

  • 256 Windows Azure Worker Roles, Windows Kinect and a 90's Text-Based Ray-Tracer

    - by Alan Smith
    For a couple of years I have been demoing a simple render farm hosted in Windows Azure using worker roles and the Azure Storage service. At the start of the presentation I deploy an Azure application that uses 16 worker roles to render a 1,500 frame 3D ray-traced animation. At the end of the presentation, when the animation was complete, I would play the animation and delete the Azure deployment. The standing joke with the audience was that it was a “$2 demo”, as the compute charges for running the 16 instances for an hour were $1.92; factor in the bandwidth charges and it’s a couple of dollars. The point of the demo is that it highlights one of the great benefits of cloud computing: you pay for what you use, and if you need massive compute power for a short period of time, using Windows Azure can work out very cost effective. The “$2 demo” was great for presenting at user groups and conferences in that it could be deployed to Azure, used to render an animation, and then removed in a one hour session. I have always had the idea of doing something a bit more impressive with the demo, and scaling it from a “$2 demo” to a “$30 demo”. The challenge was to create a visually appealing animation in high definition format and keep the demo time down to one hour. This article will take a run through how I achieved this.
Ray Tracing
Ray tracing, a technique for generating high quality photorealistic images, gained popularity in the 90’s with companies like Pixar creating feature length computer animations, and also the emergence of shareware text-based ray tracers that could run on a home PC. In order to render a ray traced image, the ray of light passing from the view point must be tracked until it intersects with an object. At the intersection, the color, reflectiveness, transparency, and refractive index of the object are used to calculate if the ray will be reflected or refracted. Each pixel may require thousands of calculations to determine what color it will be in the rendered image.
Pin-Board Toys
Having very little artistic talent and a basic understanding of maths, I decided to focus on an animation that could be modeled fairly easily and would look visually impressive. I’ve always liked the pin-board desktop toys that became popular in the 80’s, and when I was working as a 3D animator back in the 90’s I always had the idea of creating a 3D ray-traced animation of a pin-board, but never found the energy to do it. Even if I had had a go at it, the render time to produce an animation that would look respectable on a 486 would have been measured in months.
PolyRay
Back in 1995 I landed my first real job, after spending three years being a beach-ski-climbing-paragliding-bum, and was employed to create 3D ray-traced animations for a CD-ROM that school kids would use to learn physics. I had got into the strange and wonderful world of text-based ray tracing, and was using a shareware ray-tracer called PolyRay. PolyRay takes a text file describing a scene as input and, after a few hours processing on a 486, produces a high quality ray-traced image. The following is an example of a basic PolyRay scene file.
background Midnight_Blue   static define matte surface { ambient 0.1 diffuse 0.7 } define matte_white texture { matte { color white } } define matte_black texture { matte { color dark_slate_gray } } define position_cylindrical 3 define lookup_sawtooth 1 define light_wood <0.6, 0.24, 0.1> define median_wood <0.3, 0.12, 0.03> define dark_wood <0.05, 0.01, 0.005>     define wooden texture { noise surface { ambient 0.2  diffuse 0.7  specular white, 0.5 microfacet Reitz 10 position_fn position_cylindrical position_scale 1  lookup_fn lookup_sawtooth octaves 1 turbulence 1 color_map( [0.0, 0.2, light_wood, light_wood] [0.2, 0.3, light_wood, median_wood] [0.3, 0.4, median_wood, light_wood] [0.4, 0.7, light_wood, light_wood] [0.7, 0.8, light_wood, median_wood] [0.8, 0.9, median_wood, light_wood] [0.9, 1.0, light_wood, dark_wood]) } } define glass texture { surface { ambient 0 diffuse 0 specular 0.2 reflection white, 0.1 transmission white, 1, 1.5 }} define shiny surface { ambient 0.1 diffuse 0.6 specular white, 0.6 microfacet Phong 7  } define steely_blue texture { shiny { color black } } define chrome texture { surface { color white ambient 0.0 diffuse 0.2 specular 0.4 microfacet Phong 10 reflection 0.8 } }   viewpoint {     from <4.000, -1.000, 1.000> at <0.000, 0.000, 0.000> up <0, 1, 0> angle 60     resolution 640, 480 aspect 1.6 image_format 0 }       light <-10, 30, 20> light <-10, 30, -20>   object { disc <0, -2, 0>, <0, 1, 0>, 30 wooden }   object { sphere <0.000, 0.000, 0.000>, 1.00 chrome } object { cylinder <0.000, 0.000, 0.000>, <0.000, 0.000, -4.000>, 0.50 chrome }   After setting up the background and defining colors and textures, the viewpoint is specified. The “camera” is located at a point in 3D space, and it looks towards another point. The angle, image resolution, and aspect ratio are specified. Two lights are present in the image at defined coordinates. The three objects in the image are a wooden disc to represent a table top, and a sphere and cylinder that intersect to form a pin that will be used for the pin board toy in the final animation. When the image is rendered, the following image is produced. The pins are modeled with a chrome surface, so they reflect the environment around them. Note that the scale of the pin shaft is not correct, this will be fixed later. Modeling the Pin Board The frame of the pin-board is made up of three boxes, and six cylinders, the front box is modeled using a clear, slightly reflective solid, with the same refractive index of glass. The other shapes are modeled as metal. object { box <-5.5, -1.5, 1>, <5.5, 5.5, 1.2> glass } object { box <-5.5, -1.5, -0.04>, <5.5, 5.5, -0.09> steely_blue } object { box <-5.5, -1.5, -0.52>, <5.5, 5.5, -0.59> steely_blue } object { cylinder <-5.2, -1.2, 1.4>, <-5.2, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <5.2, -1.2, 1.4>, <5.2, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <-5.2, 5.2, 1.4>, <-5.2, 5.2, -0.74>, 0.2 steely_blue } object { cylinder <5.2, 5.2, 1.4>, <5.2, 5.2, -0.74>, 0.2 steely_blue } object { cylinder <0, -1.2, 1.4>, <0, -1.2, -0.74>, 0.2 steely_blue } object { cylinder <0, 5.2, 1.4>, <0, 5.2, -0.74>, 0.2 steely_blue }   In order to create the matrix of pins that make up the pin board I used a basic console application with a few nested loops to create two intersecting matrixes of pins, which models the layout used in the pin boards. The resulting image is shown below. The pin board contains 11,481 pins, with the scene file containing 23,709 lines of code. 
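The console application that generated the pin matrix is not listed in the article, but the idea is straightforward: nested loops that emit one sphere/cylinder pair per pin into the PolyRay scene file. The sketch below is only an illustration of that approach, not the original tool; the grid dimensions, spacing, pin radii and output file name are assumptions, and the Z value is a constant here (in the finished animation it is driven by the depth image described in the next part of the article).
using System.Globalization;
using System.IO;
class PinBoardSceneWriter
{
    static void Main()
    {
        const double spacing = 0.2; // assumed distance between pin centres
        const double z = 0.0;       // later replaced by a value derived from the depth image
        using (StreamWriter writer = new StreamWriter("pinboard_objects.pi"))
        {
            // First matrix of pins.
            for (int row = 0; row < 76; row++)
                for (int col = 0; col < 76; col++)
                    WritePin(writer, col * spacing, row * spacing, z);
            // Second matrix, offset by half the spacing, sits between the pins of the first,
            // giving roughly 11,400 pins in total (close to the 11,481 quoted above).
            for (int row = 0; row < 75; row++)
                for (int col = 0; col < 75; col++)
                    WritePin(writer, (col + 0.5) * spacing, (row + 0.5) * spacing, z);
        }
    }
    static void WritePin(TextWriter writer, double x, double y, double z)
    {
        // Each pin is a chrome sphere for the head plus a cylinder for the shaft,
        // using the same object syntax as the hand-written scene above (radii are guesses).
        writer.WriteLine(string.Format(CultureInfo.InvariantCulture,
            "object {{ sphere <{0:0.000}, {1:0.000}, {2:0.000}>, 0.10 chrome }}", x, y, z));
        writer.WriteLine(string.Format(CultureInfo.InvariantCulture,
            "object {{ cylinder <{0:0.000}, {1:0.000}, {2:0.000}>, <{0:0.000}, {1:0.000}, {3:0.000}>, 0.05 chrome }}",
            x, y, z, z - 0.6));
    }
}
Generating the scene text rather than modeling by hand is what makes 2,000 per-frame scene files practical: only the Z coordinates change from frame to frame.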
For the complete animation 2,000 scene files will be created, which is over 47 million lines of code. Each pin in the pin-board will slide out a specific distance when an object is pressed into the back of the board. This is easily modeled by setting the Z coordinate of the pin to a specific value. In order to set all of the pins in the pin-board to the correct position, a bitmap image can be used. The position of the pin can be set based on the color of the pixel at the appropriate position in the image. When the Windows Azure logo is used to set the Z coordinate of the pins, the following image is generated. The challenge now was to make a cool animation. The Azure Logo is fine, but it is static. Using a normal video to animate the pins would not work; the colors in the video would not be the same as the depth of the objects from the camera. In order to simulate the pin board accurately a series of frames from a depth camera could be used.
Windows Kinect
The Kinect controllers for the X-Box 360 and Windows feature a depth camera. The Kinect SDK for Windows provides a programming interface for Kinect, providing easy access for .NET developers to the Kinect sensors. The Kinect Explorer provided with the Kinect SDK is a great starting point for exploring Kinect from a developer's perspective. Both the X-Box 360 Kinect and the Windows Kinect will work with the Kinect SDK; the Windows Kinect is required for commercial applications, but the X-Box Kinect can be used for hobby projects. The Windows Kinect has the advantage of providing a mode to allow depth capture with objects closer to the camera, which makes for a more accurate depth image for setting the pin positions.
Creating a Depth Field Animation
The depth field animation used to set the positions of the pins in the pin board was created using a modified version of the Kinect Explorer sample application. In order to simulate the pin board accurately, a small section of the depth range from the depth sensor will be used. Any part of the object in front of the depth range will result in a white pixel; anything behind the depth range will be black. Within the depth range the pixels in the image will be set to RGB values from 0,0,0 to 255,255,255. A screen shot of the modified Kinect Explorer application is shown below. The Kinect Explorer sample application was modified to include slider controls that are used to set the depth range that forms the image from the depth stream. This allows the fine tuning of the depth image that is required for simulating the position of the pins in the pin board. The Kinect Explorer was also modified to record a series of images from the depth camera and save them as a sequence of JPEG files that will be used to animate the pins in the animation; the Start and Stop buttons are used to start and stop the image recording. An example of one of the depth images is shown below. Once a series of 2,000 depth images has been captured, the task of creating the animation can begin.
Rendering a Test Frame
In order to test the creation of frames and get an approximation of the time required to render each frame, a test frame was rendered on-premise using PolyRay. The output of the rendering process is shown below. The test frame contained 23,629 primitive shapes, most of which are the spheres and cylinders that are used for the 11,800 or so pins in the pin board.
The 1280x720 image contains 921,600 pixels, but as anti-aliasing was used the number of rays that were calculated was 4,235,777, with 3,478,754,073 object boundaries checked. The test frame of the pin board with the depth field image applied is shown below. The tracing time for the test frame was 4 minutes 27 seconds, which means rendering the 2,000 frames in the animation would take over 148 hours, or a little over 6 days. Although this is much faster than an old 486, waiting almost a week to see the results of an animation would make it challenging for animators to create, view, and refine their animations. It would be much better if the animation could be rendered in less than one hour.
Windows Azure Worker Roles
The cost of creating an on-premise render farm to render animations increases in proportion to the number of servers. The table below shows the cost of servers for creating a render farm, assuming a cost of $500 per server.
Number of Servers - Cost
1 - $500
16 - $8,000
256 - $128,000
As well as the cost of the servers, there would be additional costs for networking, racks etc. Hosting an environment of 256 servers on-premise would require a server room with cooling, and some pretty hefty power cabling. The Windows Azure compute services provide worker roles, which are ideal for performing processor intensive compute tasks. With the scalability available in Windows Azure, a job that takes 256 hours to complete could be performed using different numbers of worker roles. The time and cost of using 1, 16 or 256 worker roles is shown below.
Number of Worker Roles - Render Time - Cost
1 - 256 hours - $30.72
16 - 16 hours - $30.72
256 - 1 hour - $30.72
Using worker roles in Windows Azure provides the same cost for the 256 hour job, irrespective of the number of worker roles used. Provided the compute task can be broken down into many small units, and the worker role compute power can be used effectively, it makes sense to scale the application so that the task is completed quickly, making the results available in a timely fashion. The task of rendering 2,000 frames in an animation is one that can easily be broken down into 2,000 individual pieces, which can be performed by a number of worker roles.
Creating a Render Farm in Windows Azure
The architecture of the render farm is shown in the following diagram. The render farm is a hybrid application with the following components:
· On-Premise
o Windows Kinect – Used in combination with the Kinect Explorer to create a stream of depth images.
o Animation Creator – This application uses the depth images from the Kinect sensor to create scene description files for PolyRay. These files are then uploaded to the jobs blob container, and job messages added to the jobs queue.
o Process Monitor – This application queries the role instance lifecycle table and displays statistics about the render farm environment and render process.
o Image Downloader – This application polls the image queue and downloads the rendered animation files once they are complete.
· Windows Azure
o Azure Storage – Queues and blobs are used for the scene description files and completed frames. A table is used to store the statistics about the rendering environment.
The architecture of each worker role is shown below. The worker role is configured to use local storage, which provides file storage on the worker role instance that can be used by the applications to render the image and transform the format of the image.
The service definition for the worker role, with the local storage configuration highlighted, is shown below.
<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="CloudRay" >
  <WorkerRole name="CloudRayWorkerRole" vmsize="Small">
    <Imports>
    </Imports>
    <ConfigurationSettings>
      <Setting name="DataConnectionString" />
    </ConfigurationSettings>
    <LocalResources>
      <LocalStorage name="RayFolder" cleanOnRoleRecycle="true" />
    </LocalResources>
  </WorkerRole>
</ServiceDefinition>
The two executable programs, PolyRay.exe and DTA.exe, are included in the Azure project with Copy Always set as the property. PolyRay will take the scene description file and render it to a Truevision TGA file. As the TGA format has not seen much use since the mid 90’s, it is converted to a JPG image using Dave's Targa Animator, another shareware application from the 90’s. Each worker role will use the following process to render the animation frames.
1. The worker process polls the job queue; if a job is available the scene description file is downloaded from blob storage to local storage.
2. PolyRay.exe is started in a process with the appropriate command line arguments to render the image as a TGA file.
3. DTA.exe is started in a process with the appropriate command line arguments to convert the TGA file to a JPG file.
4. The JPG file is uploaded from local storage to the images blob container.
5. A message is placed on the images queue to indicate a new image is available for download.
6. The job message is deleted from the job queue.
7. The role instance lifecycle table is updated with statistics on the number of frames rendered by the worker role instance, and the CPU time used.
The code for this is shown below.
public override void Run()
{
    // Set environment variables
    string polyRayPath = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), PolyRayLocation);
    string dtaPath = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), DTALocation);
    LocalResource rayStorage = RoleEnvironment.GetLocalResource("RayFolder");
    string localStorageRootPath = rayStorage.RootPath;
    JobQueue jobQueue = new JobQueue("renderjobs");
    JobQueue downloadQueue = new JobQueue("renderimagedownloadjobs");
    CloudRayBlob sceneBlob = new CloudRayBlob("scenes");
    CloudRayBlob imageBlob = new CloudRayBlob("images");
    RoleLifecycleDataSource roleLifecycleDataSource = new RoleLifecycleDataSource();
    Frames = 0;
    while (true)
    {
        // Get the render job from the queue
        CloudQueueMessage jobMsg = jobQueue.Get();
        if (jobMsg != null)
        {
            // Get the file details
            string sceneFile = jobMsg.AsString;
            string tgaFile = sceneFile.Replace(".pi", ".tga");
            string jpgFile = sceneFile.Replace(".pi", ".jpg");
            string sceneFilePath = Path.Combine(localStorageRootPath, sceneFile);
            string tgaFilePath = Path.Combine(localStorageRootPath, tgaFile);
            string jpgFilePath = Path.Combine(localStorageRootPath, jpgFile);
            // Copy the scene file to local storage
            sceneBlob.DownloadFile(sceneFilePath);
            // Run the ray tracer.
            string polyrayArguments =
                string.Format("\"{0}\" -o \"{1}\" -a 2", sceneFilePath, tgaFilePath);
            Process polyRayProcess = new Process();
            polyRayProcess.StartInfo.FileName =
                Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), polyRayPath);
            polyRayProcess.StartInfo.Arguments = polyrayArguments;
            polyRayProcess.Start();
            polyRayProcess.WaitForExit();
            // Convert the image
            string dtaArguments =
                string.Format(" {0} /FJ /P{1}", tgaFilePath, Path.GetDirectoryName(jpgFilePath));
            Process dtaProcess = new Process();
            dtaProcess.StartInfo.FileName =
                Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), dtaPath);
            dtaProcess.StartInfo.Arguments = dtaArguments;
            dtaProcess.Start();
            dtaProcess.WaitForExit();
            // Upload the image to blob storage
            imageBlob.UploadFile(jpgFilePath);
            // Add a download job.
            downloadQueue.Add(jpgFile);
            // Delete the render job message
            jobQueue.Delete(jobMsg);
            Frames++;
        }
        else
        {
            Thread.Sleep(1000);
        }
        // Log the worker role activity.
        roleLifecycleDataSource.Alive
            ("CloudRayWorker", RoleLifecycleDataSource.RoleLifecycleId, Frames);
    }
}
Monitoring Worker Role Instance Lifecycle
In order to get more accurate statistics about the lifecycle of the worker role instances used to render the animation, data was tracked in an Azure storage table. The following class was used to track the worker role lifecycles in Azure storage.
public class RoleLifecycle : TableServiceEntity
{
    public string ServerName { get; set; }
    public string Status { get; set; }
    public DateTime StartTime { get; set; }
    public DateTime EndTime { get; set; }
    public long SecondsRunning { get; set; }
    public DateTime LastActiveTime { get; set; }
    public int Frames { get; set; }
    public string Comment { get; set; }
    public RoleLifecycle()
    {
    }
    public RoleLifecycle(string roleName)
    {
        PartitionKey = roleName;
        RowKey = Utils.GetAscendingRowKey();
        Status = "Started";
        StartTime = DateTime.UtcNow;
        LastActiveTime = StartTime;
        EndTime = StartTime;
        SecondsRunning = 0;
        Frames = 0;
    }
}
A new instance of this class is created and added to the storage table when the role starts. It is then updated each time the worker renders a frame to record the total number of frames rendered and the total processing time. These statistics are used by the monitoring application to determine the effectiveness of the use of resources in the render farm.
Rendering the Animation
The Azure solution was deployed to Windows Azure with the service configuration set to 16 worker role instances. This allows for the application to be tested in the cloud environment, and the performance of the application determined. When I demo the application at conferences and user groups I often start with 16 instances, and then scale up the application to the full 256 instances. The configuration to run 16 instances is shown below.
<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="CloudRay" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">
  <Role name="CloudRayWorkerRole">
    <Instances count="16" />
    <ConfigurationSettings>
      <Setting name="DataConnectionString"
        value="DefaultEndpointsProtocol=https;AccountName=cloudraydata;AccountKey=..." />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
About six minutes after deploying the application the first worker roles become active and start to render the first frames of the animation. The CloudRay Monitor application displays an icon for each worker role instance, with a number indicating the number of frames that the worker role has rendered. The statistics on the left show the number of active worker roles and statistics about the render process. The render time is the time since the first worker role became active; the CPU time is the total amount of processing time used by all worker role instances to render the frames. Five minutes after the first worker role became active, the last of the 16 worker roles activated. By this time the first seven worker roles had each rendered one frame of the animation. With 16 worker roles up and running it can be seen that one hour and 45 minutes of CPU time has been used to render 32 frames, with a render time of just under 10 minutes. At this rate it would take over 10 hours to render the 2,000 frames of the full animation. In order to complete the animation in under an hour, more processing power will be required. Scaling the render farm from 16 instances to 256 instances is easy using the new management portal. The slider is set to 256 instances, and the configuration saved. We do not need to re-deploy the application, and the 16 instances that are up and running will not be affected. Alternatively, the configuration file for the Azure service could be modified to specify 256 instances.
<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="CloudRay" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">
  <Role name="CloudRayWorkerRole">
    <Instances count="256" />
    <ConfigurationSettings>
      <Setting name="DataConnectionString"
        value="DefaultEndpointsProtocol=https;AccountName=cloudraydata;AccountKey=..." />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
Six minutes after the new configuration has been applied, 75 new worker roles have activated and are processing their first frames. Five minutes later the full configuration of 256 worker roles is up and running. We can see that the average rate of frame rendering has increased from 3 to 12 frames per minute, and that over 17 hours of CPU time has been utilized in 23 minutes. In this test the time to provision 140 worker roles was about 11 minutes, which works out at about one every five seconds. We are now half way through the rendering, with 1,000 frames complete. This has utilized just under three days of CPU time in a little over 35 minutes. The animation is now complete, with 2,000 frames rendered in a little over 52 minutes. The CPU time used by the 256 worker roles is 6 days, 7 hours and 22 minutes, with an average frame rate of 38 frames per minute. The rendering of the last 1,000 frames took 16 minutes 27 seconds, which works out at a rendering rate of 60 frames per minute.
The frame counts in the server instances indicate that the use of a queue to distribute the workload has been very effective in distributing the load across the 256 worker role instances. The 16 instances that were deployed first have rendered between 11 and 13 frames each, whilst the 240 instances that were added when the application was scaled have rendered between 6 and 9 frames each.
Completed Animation
I’ve uploaded the completed animation to YouTube; a low resolution preview is shown below. Pin Board Animation Created using Windows Kinect and 256 Windows Azure Worker Roles. The animation can be viewed in 1280x720 resolution at the following link: http://www.youtube.com/watch?v=n5jy6bvSxWc
Effective Use of Resources
According to the CloudRay monitor statistics the animation took 6 days, 7 hours and 22 minutes of CPU time to render; this works out at 152 hours of compute time, rounded up to the nearest hour. As the worker role instances are billed for the full hour, it may have been possible to render the animation using fewer than 256 worker roles. When deciding the optimal usage of resources, the time required to provision and start the worker roles must also be considered. In the demo I started with 16 worker roles, and then scaled the application to 256 worker roles. It would have been more optimal to start the application with maybe 200 worker roles, and utilize the full hour that I was being billed for. This would, however, have prevented showing the ease of scalability of the application. The new management portal displays the CPU usage across the worker roles in the deployment. The average CPU usage across all instances is 93.27%, with over 99% used when all the instances are up and running. This shows that the worker role resources are being used very effectively.
Grid Computing Scenarios
Although I am using this scenario for a hobby project, there are many scenarios where a large amount of compute power is required for a short period of time. Windows Azure provides a great platform for developing these types of grid computing applications, and can work out very cost effective.
· Windows Azure can provide massive compute power, on demand, in a matter of minutes.
· The use of queues to manage the load balancing of jobs between role instances is a simple and effective solution.
· Using a cloud-computing platform like Windows Azure allows proof-of-concept scenarios to be tested and evaluated on a very low budget.
· No charges for inbound data transfer make the uploading of large data sets to Windows Azure Storage services cost effective. (Transaction charges still apply.)
Tips for using Windows Azure for Grid Computing Scenarios
I found the render farm a fairly simple scenario to implement using Windows Azure. I was impressed by the ease of scalability that Azure provides, and by the short time that the application took to scale from 16 to 256 worker role instances. In this case it was around 13 minutes; in other tests it took between 10 and 20 minutes. The following tips may be useful when implementing a grid computing project in Windows Azure.
· Using an Azure Storage queue to load-balance the units of work across multiple worker roles is simple and very effective; a minimal sketch of such a queue wrapper is shown after this list. The design I have used in this scenario could easily scale to many thousands of worker role instances.
· Windows Azure accounts are typically limited to 20 cores. If you need to use more than this, a call to support and a credit card check will be required.
· Be aware of how the billing model works. You will be charged for worker role instances for the full clock hour in which the instance is deployed. Schedule the workload to start just after the clock hour has started.
· Monitor the utilization of the resources you are provisioning, and ensure that you are not paying for worker roles that are idle.
· If you are deploying third party applications to worker roles, you may well run into licensing issues. Purchasing software licenses on a per-processor basis when using hundreds of processors for a short time period would not be cost effective.
· Third party software may also require installation onto the worker roles, which can be accomplished using start-up tasks. Bear in mind that adding a startup task and a possible re-boot will add to the time required for the worker role instance to start and activate. An alternative may be to use a prepared VM and use VM roles.
· Consider using the Windows Azure Autoscaling Application Block (WASABi) to autoscale the worker roles in your application. When using a large number of worker roles, the utilization must be carefully monitored; if the scaling algorithms are not optimal it could get very expensive!
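The JobQueue class used in the Run() method above is the author's own wrapper and its implementation is not shown in the article. As an illustration of the queue-based load balancing tip, a minimal wrapper along these lines could be built on the Windows Azure storage client library of that era; the class shape and the DataConnectionString setting name are assumptions based on the code and configuration shown above, not the original source.
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;
// Minimal sketch of a queue wrapper for distributing render jobs to worker roles.
public class JobQueue
{
    private readonly CloudQueue queue;
    public JobQueue(string queueName)
    {
        // DataConnectionString is the setting name used in the service configuration above.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
        queue = account.CreateCloudQueueClient().GetQueueReference(queueName);
        queue.CreateIfNotExist();
    }
    // Add a unit of work, e.g. the name of a scene file to render.
    public void Add(string message)
    {
        queue.AddMessage(new CloudQueueMessage(message));
    }
    // Get the next unit of work, or null if the queue is currently empty.
    public CloudQueueMessage Get()
    {
        return queue.GetMessage();
    }
    // Remove a message once the job has completed successfully.
    public void Delete(CloudQueueMessage message)
    {
        queue.DeleteMessage(message);
    }
}
Because each message represents an independent frame, any number of worker role instances can drain the same queue, which is what gives the render farm its near-linear scalability.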

    Read the article

  • CodePlex Daily Summary for Friday, June 03, 2011

    CodePlex Daily Summary for Friday, June 03, 2011Popular ReleasesMedia Companion: MC 3.404b Weekly: Extract the entire archive to a folder which has user access rights, eg desktop, documents etc. Refer to the documentation on this site for the Installation & Setup Guide Important! *** Due to an issue where the date added & the full genre information was not being read into the Movie cache, it is highly recommended that you perform a Rebuild Movies when first running this latest version. This will read in the information from the nfo's & repopulate the cache used by MC during operation. Fi...Terraria Map Generator: TerrariaMapTool 1.0.0.4 Beta: 1) Fixed the generated map.html file so that the file:/// is included in the base path. 2) Added the ability to use parallelization during generation. This will cause the program to use as many threads as there are physical cores. 3) Fixed some background overdraw.DotRas: DotRas v1.2 (Version 1.2.4168): This release includes compiled (and signed) versions of the binaries, PDBs, CHM help documentation, along with both C# and VB.NET examples. Please don't forget to rate the release! If you find a bug, please open a work item and give as much description as possible. Stack traces, which operating system(s) you're targeting, and build type is also very helpful for bug hunting. If you find something you believe to be a bug but are not sure, create a new discussion on the discussions board. Thank...Ajax Minifier: Microsoft Ajax Minifier 4.21: Fixed issues #15949 and #15868. Tweaked output in multi-line (pretty-print) mode so that var-statements in the initializer of for-statements break multiple variables onto multiple lines. Added TreeModification flag (and can now therefore turn off) the behavior of removing quotes around object literal property names if the names can be valid identifiers. Convert true literals to !0 and false literals to !1. Added a Visitor class approach to iterating over a generated abstract syntax tree to he...Caliburn Micro: WPF, Silverlight and WP7 made easy.: Caliburn.Micro v1.1 RTW: Download ContentsDebug and Release Assemblies Samples Changes.txt License.txt Release Highlights For WP7A new Tombstoning API based on ideas from Fluent NHibernate A new Launcher/Chooser API Strongly typed Navigation SimpleContainer included The full phone lifecycle is made easy to work with ie. we figure out whether your app is actually Resurrecting or just Continuing for you For WPFSupport for the Client Profile Better support for WinForms integration All PlatformsA power...VidCoder: 0.9.1: Added color coding to the Log window. Errors are highlighted in red, HandBrake logs are in black and VidCoder logs are in dark blue. Moved enqueue button to the right with the other control buttons. Added logic to report failures when errors are logged during the encode or when the encode finishes prematurely. Added Copy button to Log window. Adjusted audio track selection box to always show the full track name. Changed encode job progress bar to also be colored yellow when the enco...AutoLoL: AutoLoL v2.0.1: - Fixed a small bug in Auto Login - Fixed the updaterEPPlus-Create advanced Excel 2007 spreadsheets on the server: EPPlus 2.9.0.1: EPPlus-Create advanced Excel 2007 spreadsheets on the server This version has been updated to .Net Framework 3.5 New Features Data Validation. PivotTables (Basic functionalliy...) Support for worksheet data sources. Column, Row, Page and Data fields. Date and Numeric grouping Build in styles. 
...and more And some minor new features... Ranges Text-Property|Get the formated value AutofitColumns-method to set the column width from the content of the range LoadFromCollection-metho...jQuery ASP.Net MVC Controls: Version 1.4.0.0: Version 1.4.0.0 contains the following additions: Upgraded to MVC 3.0 Upgraded to jQuery 1.6.1 (Though the project supports all jQuery version from 1.4.x onwards) Upgraded to jqGrid 3.8 Better Razor View-Engine support Better Pager support, includes support for custom pagers Added jqGrid toolbar buttons support Search module refactored, with full suport for multiple filters and ordering And Code cleanup, bug-fixes and better controller configuration support.Nearforums - ASP.NET MVC forum engine: Nearforums v6.0: Version 6.0 of Nearforums, the ASP.NET MVC Forum Engine, containing new features: Authentication using Membership Provider for SQL Server and MySql Spam prevention: Flood Control Moderation: Flag messages Content management: Pages: Create pages (about us/contact/texts) through web administration Allow nearforums to run as an IIS subapp Migrated Facebook Connect to OAuth 2.0 Visit the project Roadmap for more details.NetOffice - The easiest way to use Office in .NET: NetOffice Release 0.8b: Changes: - fix critical issue 15922(AccessViolationException) once and for all ...update is strongly recommended Known Issues: - some addin ribbon examples has a COM Register function with missing codebase entry(win32 registry) ...the problem is only affected to c# examples. fixed in latest source code. NetOffice\ReleaseTags\NetOffice Release 0.8.rar Includes: - Runtime Binaries and Source Code for .NET Framework:......v2.0, v3.0, v3.5, v4.0 - Tutorials in C# and VB.Net:...................Facebook Graph Toolkit: Facebook Graph Toolkit 1.5.4186: Updates the API in response to Facebook's recent change of policy: All Graph Api accessing feeds or posts must provide a AccessToken.Serviio for Windows Home Server: Beta Release 0.5.2.0: Ready for widespread beta. Synchronized build number to Serviio version to avoid confusion.AcDown????? - Anime&Comic Downloader: AcDown????? v3.0 Beta4: ??AcDown?????????????,??????????????,????、????。?????Acfun????? ????32??64? Windows XP/Vista/7 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??v3.0 Beta4 2011-5-31?? ???Bilibili.us????? ???? ?? ???"????" ???Bilibili.us??? ??????? ?? ??????? ?? ???????? ?? ?? ???Bilibili.us?????(??????????????????) ??????(6.cn)?????(????) ?? ?????Acfun?????????? ?????????????? ???QQ???????? ????????????Discussion...CodeCopy Auto Code Converter: Code Copy v0.1: Full add-in, setup project source code and setup fileWordpress Theme 'Windows Metro': v1.00.0017: v1.00.0017 Dies ist eine Vorab-Version aus der Entwickler-Phase. This is a preview version from the development phase.TerrariViewer: TerrariViewer v2.4.1: Added Piggy Bank editor and fixed some minor bugs.Kooboo CMS: Kooboo CMS 3.02: Updated the Kooboo_CMS.zip at 2011-06-02 11:44 GMT 1.Fixed: Adding data rule issue on page. 2.Fixed: Caching issue for higher performance. Updated the Kooboo_CMS.zip at 2011-06-01 10:00 GMT 1. Fixed the published package throwed a compile error under WebSite mode. 2. Fixed the ContentHelper.NewMediaFolderObject return TextFolder issue. 3. Shorten the name of ContentHelper API. NewMediaFolderObject=>MediaFolder, NewTextFolderObject=> TextFolder, NewSchemaObject=>Schema. 
Also update the C...mojoPortal: 2.3.6.6: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2366-released Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.Terraria World Creator: Terraria World Creator: Version 1.01 Fixed a bug that would cause the application to crash. Re-named the Application.New Projects"Background Transfer Service Sample" From WP7 Sample: This project is an extension to existing Windows Phone7 sample available from MS sight ("Background Transfer Service Sample"). This project extends functionality to display the background downloaded item to verify download conten. Agile project and development management dashboard - Meter Console: Agile project and development management dashboard application. The focus of this application to bridge the gap between the management, clients and developer and chive best possible transparency. Ajax Multi-ListBox: Ajax Multi-ListBox is .NET C# written user control with a dual listbox that uses ajax to move items from left to right and vice versa.aLOG : Asynchronous logging (Logging Library): aLog is an asynchronous logging component built on Microsoft Enterprise Exception handling and Logging Component block. It allows application to asynchronously log the data in multiple data sources such as file system, database , XML. Application Base Component Library: Application Base Component LibraryConcise Service Bus: A compact but comprehensive WCF service bus framework.Contour provider for Umbraco Courier: Transfer Umbraco Contour forms between Umbraco instances with Courier.CoveyNet MidWest: CoveyNet MidWestCSharp Samples: This contain C#- wpf, wcf, linqtosql, some basic oops stuff etc.DotNet1: DotNet1FireWeb: fefaesfrafasGeoCoder: Basic server side GeoCoder which accesses Google, YAHOO etc. This is developed in C# and is a mix of language styles, just depended what I was doing that week. All the libraries include tests which can be used to check if the libraries still work, it's worth doing this as the consumed webservices may change from time to time. GrandReward: GrandReward is designed to help young and old people to learn by answer questions. It is easy to use and the community can design own topic files for own questions to learn. The project is develop in C# and contains some WPF views. The project homepage is: http://grand-reward.com/helferlein: The helferlein library contains some web controls and tools I used in other projects.helferlein_BabelFish: helferlein_BabelFish is a DNN module to support multilingual features for other helferlein modules. It is developed in C#.helferlein_Form: helferlein_Form is an easy-to-use module for DotNetNuke that allows creating forms. The submissions can be sent by e-mail and/or stored in a database. It's developed in C#.ILSpy/Silverlight — Proof Of Concept: ILSpy refactoring (dire hacking) to make it work in Silverlight.ISGProject: ISGProjectMarcelo Rocha: Trabalhos acadêmicos, Apresentações, Projetos e outras Invencionices de Marcelo Rocha.MosquitoOnline: MosquitoOnline ProjectNET.Studio: Ein kleines Visual Studio für die SchuleOmegle Desktop Client: A desktop client for the anonymous chat site, Omegle. 
Designed to combat some of the shortcomings of the webpage, the desktop client will have no adverts, and be able to flash the taskbar window, and play a sound on message received. Online PHP IDE: Free Open Source online editor for working with files on FTP. Free and easy deployment.Open JGL: A utillity library using OpenGL, which focus' on graphics and vertex control.Open Tokenization Server: Open Tokenization Server is a simplistic implementation of a tokenization data security service. It allows users to create and retrieve tokens that represent sensitive information.OurCodeNights: Just some small projects we "work" on at our codenights. A good excuse to meet and have a good time ;)Raple: Raple is simple programming languageSea: Set of C# Extension libraries that make life easier.Shai Chi: Shai ChiSharePoint Sandbox Logging: SharePoint Sandbox Logging enables developers and solution designers to easily log events to logging list in your site collection, allowing you to easily trace errors on custom solutions within your site collection, with no need for server access. Perfect for sandbox / Office365. This project contains one sandbox solution with a site collection logging feature. When feature is active - it creates the list and logs. If it is not active - it does not log. Simple. It also contains a sampl...SharePoint Stock Ticker: SharePoint Stock Ticker is a SharePoint 2010 project which consists of a SharePoint Stock Ticker Web Part driven off of Google's Stock API. The width of this Web Part was developed to fit within the default SharePoint 2010 Quick Launch navigation. This has also been tested to run over http and https SharePoint 2010 farms and will not throw a mixed content warning with Internet Explorer.SharpLinx: SharpLinx is an open source social networking application on ASP.Net.Single.Net - less Copy & Paste,less code lines,more easier: make your code life easierSqlMyAdmin: A projecto to create a PHPMyAdmin clone app that access Sql Server.TDB: TDB is a NOSQL Key-value store entirely designed to run in windows environment. It's directly inspired by Redis project and use Redis' command sematincs and Redis' communication protocol to maintain a base compatibility with it.Tomasulo: A simulator of Tomasulo CDB dynamic instruction scheduling algorithm in Java. Authors: Mengyu Zhou @J85 Zhenyao Zhu @J85 Jun Li @J81 @Tsinghua UniversityTransport Broker Management System: Transport Management system will serve the small freight brokers manage the carriers and orders. It has features like manage truck owners, carriers, source, destination, orders. Verification system for MCBS: Verification system for mobile communication base station.VietCard: VietCardVkontakte File converter: ????????? ?????? ??? ???????? ?? ? "?????????" vkontakte.ru. VSGesture for Visual Studio: VSGesture can execute command via mouse gestures within Visual Studio2010. If you have any feedback, please send me an email to powerumc at gmail.com.WCF-Silverlight Helper: Class library to generate WCF DataContractSerializer-serializable and Silverlight (SL) -compatible classes. Useful, for example, if you have an assembly with classes that fail to serialize with DataContractSerializer. T4 Templates are used to auto-generate code.Windows Phone eBook Samples: Windows Phone eBook Samples is a collection of samples associated with the Windows Phone eBoom community initiative.WinPreciousMetal: WinPreciousMetal helps people know the precious metal's states. 
Because I am a Chinese, my soft mainly focus at the price of metal in RMB.

    Read the article

  • File Watcher Task

    The task will detect changes to existing files as well as new files; both actions will cause the file to be found when it is available. A file is available when the task can open it exclusively. This is important for files that take a long time to be written, such as large files, or those that are just written slowly or delivered via a slow network link. It can also be set to look for existing files first (1.2.4.55). The full path of the found file is returned in up to three ways:
The ExecValueVariable of the task. This can be set to any String variable.
The OutputVariableName when specified. This can be set to any String variable.
The FullPath variable within OnFileFoundEvent. This is a File Watcher Task specific event.
Advance warning of a file having been detected, but not yet available, is returned through the OnFileWatcherEvent. This event does not always coincide with the completion of the task, as completion and the OnFileFoundEvent are delayed until the file is ready for use. This event indicates that a file has been detected, and that file will now be monitored until it becomes available. The task will only detect and report on the first file that is created or changes; any subsequent changes will be ignored. Task properties and their usage are documented below:
Filter (String) - Default filter *.* will watch all files. Standard Windows wildcards and patterns can be used to restrict the files monitored.
FindExistingFiles (Boolean) - Indicates whether the task should check for any existing files that match the path and filter criteria before starting the file watcher.
IncludeSubdirectories (Boolean) - Indicates whether changes in subdirectories are accepted or ignored.
OutputVariableName (String) - The name of the variable into which the full file path found will be written on completion of the task. The variable specified should be of type string.
Path (String) - Path to watch for new files or changes to existing files. The path is a directory, not a full filename. For a specific file, enter the file name in the Filter property and the directory in the Path property.
PathInputType (FileWatcherTask.InputType) - Three input types are supported for the path: Connection - a File connection manager, of type existing folder; Direct Input - type the path directly into the UI or set it on the property as a literal string; Variable - the name of the variable which contains the path.
Timeout (Integer) - Time in minutes to wait for a file. If no files are detected within the timeout period the task will fail. The default value of 0 means infinite, and will not expire.
TimeoutAsWarning (Boolean) - The default behaviour is to raise an error and fail the task on timeout. This property allows you to suppress the error on timeout; a warning event is raised instead, and the task succeeds. The default value is false.
Installation
The task is provided as an MSI file which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache as per Microsoft’s recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. For 2005/2008 only - finally you will have to add the task to the Visual Studio toolbox manually. Right-click the toolbox, and select Choose Items....
Select the SSIS Control Flow Items tab, and then check the File Watcher Task in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component? We recommend you follow best practice and apply the current Microsoft SQL Server Service Pack to your SQL Server servers and workstations.
Downloads
The File Watcher Task is available for SQL Server 2005, SQL Server 2008 (includes R2) and SQL Server 2012. Please choose the version to match your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed. File Watcher Task for SQL Server 2005 File Watcher Task for SQL Server 2008 File Watcher Task for SQL Server 2012
Version History
SQL Server 2012
Version 3.0.0.16 - SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)
SQL Server 2008
Version 2.0.0.14 - Fixed user interface bug. A migration problem caused the UI type editors to reference an old SQL 2005 assembly. (17 Nov 2008)
Version 2.0.0.7 - SQL Server 2008 release. (20 Oct 2008)
SQL Server 2005
Version 1.2.6.100 - Fixed UI bug with TimeoutAsWarning property not saving correctly. Improved expression support in UI. File availability detection changed to use a read-only lock, allowing reduced permissions to be used. Corrected install issue which prevented installation on 64-bit machines with SSIS runtime only components. (18 Mar 2007)
Version 1.2.5.73 - Added TimeoutAsWarning property. Gives the ability to suppress the error on timeout; a warning event is raised instead, and the task succeeds. (Task Version 3) (27 Sep 2006)
Version 1.2.4.61 - Fixed a bug which could cause a loop condition with an unexpected exception such as incorrect file permissions. (20 Sep 2006)
Version 1.2.4.55 - Added FindExistingFiles property. When true, the task will check for an existing file before the file watcher itself actually starts. (Task Version 2) (8 Sep 2006)
Version 1.2.3.39 - SQL Server 2005 RTM Refresh. SP1 compatibility testing. Property type validation improved. (12 Jun 2006)
Version 1.2.1.0 - SQL Server 2005 IDW 16 Sept CTP. Further UI enhancements, including expression indicator. Fixed bug caused by execution within a loop, where subsequent iterations detected the same file as the first iteration. Added IncludeSubdirectories property. Fixed bug when changes were made in subdirectories and a folder change was detected, causing task failure. (Task Version 1) (6 Oct 2005)
Version 1.2.0.0 - SQL Server 2005 IDW 15 June CTP. Changes made include an enhanced UI, the PathInputType property for greater flexibility with path input, the OutputVariableName property, and the new OnFileFoundEvent event. (7 Sep 2005)
Version 1.1.2 - Public Release (16 Nov 2004)
Screenshots
Troubleshooting
Make sure you have downloaded the version that matches your version of SQL Server. We offer separate downloads for SQL Server 2005 and SQL Server 2008. If you get an error when you try to use the task along the lines of The task with the name "File Watcher Task" and the creation name ... is not registered for use on this computer, this usually indicates that the internal cache of SSIS components needs to be updated. This cache is held by the SSIS service, so you need to restart the SQL Server Integration Services service. You can do this from the Services applet in Control Panel or Administrative Tools in Windows. You can also restart the computer if you prefer.
You may also need to restart any current instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. The full error message is shown below for reference: TITLE: Microsoft Visual Studio ------------------------------ The task with the name "File Watcher Task" and the creation name "Konesans.Dts.Tasks.FileWatcherTask.FileWatcherTask, Konesans.Dts.Tasks.FileWatcherTask, Version=1.2.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b" is not registered for use on this computer. Contact Information: File Watcher Task A similar error message can be shown when trying to edit the task if the Microsoft Exception Message Box is not installed. This useful component is installed as part of the SQL Server Management Studio tools but occasionally due to the custom options chosen during SQL Server 2005 setup it may be absent. If you get an error like Could not load file or assembly 'Microsoft.ExceptionMessageBox.. you can manually download and install the missing component. It is available as part of the Feature Pack for SQL Server 2005 release. The feature packs are occasionally updated by Microsoft so you may like to check for a more recent edition, but you can find the Microsoft Exception Message Box download links here - Feature Pack for Microsoft SQL Server 2005 - April 2006 If you encounter this problem on SQL Server 2008, please check that you have installed the SQL Server client components. The component is no longer available as a separate download for SQL Server 2008  as noted in the Microsoft documentation for Deploying an Exception Message Box Application The full error message is shown below for reference, although note that the Version will change between SQL Server 2005 and SQL Server 2008: TITLE: Microsoft Visual Studio ------------------------------ Cannot show the editor for this task. ------------------------------ ADDITIONAL INFORMATION: Could not load file or assembly 'Microsoft.ExceptionMessageBox, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. (Konesans.Dts.Tasks.FileWatcherTask) Once installation is complete you need to manually add the task to the toolbox before you will see it and to be able add it to packages - How do I install a task or transform component? If you are still having issues then contact us, but please provide as much detail as possible about error, as well as which version of the the task you are using and details of the SSIS tools installed. Sample Code If you wanted to use the task programmatically then here is some sample code for creating a basic package and configuring the task. It uses a variable to supply the path to watch, and also sets a variable for the OutputVariableName. Once execution is complete it writes out the file found to the console. 
/// <summary> /// Create a package with an File Watcher Task /// </summary> public void FileWatcherTaskBasic() { // Create the package Package package = new Package(); package.Name = "FileWatcherTaskBasic"; // Add variable for input path, the folder to look in package.Variables.Add("InputPath", false, "User", @"C:\Temp\"); // Add variable for the file found, to be used on OutputVariableName property package.Variables.Add("FileFound", false, "User", "EMPTY"); // Add the Task package.Executables.Add("Konesans.Dts.Tasks.FileWatcherTask.FileWatcherTask, " + "Konesans.Dts.Tasks.FileWatcherTask, Version=1.2.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b"); // Get the task host wrapper TaskHost taskHost = package.Executables[0] as TaskHost; // Set basic properties taskHost.Properties["PathInputType"].SetValue(taskHost, 1); // InputType.Variable taskHost.Properties["Path"].SetValue(taskHost, "User::InputPath"); taskHost.Properties["OutputVariableName"].SetValue(taskHost, "User::FileFound"); #if DEBUG // Save package to disk, DEBUG only new Application().SaveToXml(String.Format(@"C:\Temp\{0}.dtsx", package.Name), package, null); #endif // Display variable value before execution to check EMPTY Console.WriteLine("Result Variable: {0}", package.Variables["User::FileFound"].Value); // Execute package package.Execute(); // Display variable value after execution, e.g. C:\Temp\File.txt Console.WriteLine("Result Variable: {0}", package.Variables["User::FileFound"].Value); // Perform simple check for execution errors if (package.Errors.Count > 0) foreach (DtsError error in package.Errors) { Console.WriteLine("ErrorCode : {0}", error.ErrorCode); Console.WriteLine(" SubComponent : {0}", error.SubComponent); Console.WriteLine(" Description : {0}", error.Description); } else Console.WriteLine("Success - {0}", package.Name); // Clean-up package.Dispose(); } (Updated installation and troubleshooting sections, and added sample code July 2009)
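The task's definition of a file being available (the task can open it exclusively) is easy to reproduce outside SSIS if you ever need the same behaviour in plain .NET code. The following is a minimal sketch of that idea only, not part of the task itself; the path, retry interval and timeout value are arbitrary placeholders.
using System;
using System.IO;
using System.Threading;
class FileAvailabilityCheck
{
    // Returns true once the file can be opened exclusively, i.e. no other process
    // is still writing to it - the same test the File Watcher Task uses to decide
    // that a detected file is ready for use.
    static bool WaitUntilAvailable(string path, TimeSpan timeout)
    {
        DateTime deadline = DateTime.UtcNow + timeout;
        while (DateTime.UtcNow < deadline)
        {
            try
            {
                // FileShare.None requests exclusive access; the open fails while any
                // other process still has the file open for writing (or reading).
                using (new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None))
                {
                    return true; // exclusive open succeeded, the file is available
                }
            }
            catch (IOException)
            {
                // Still locked or not yet present; wait and try again.
                Thread.Sleep(1000);
            }
        }
        return false;
    }
    static void Main()
    {
        bool ready = WaitUntilAvailable(@"C:\Temp\File.txt", TimeSpan.FromMinutes(5));
        Console.WriteLine(ready ? "File is available" : "Timed out waiting for file");
    }
}
Opening with read access only mirrors the behaviour introduced in version 1.2.6.100, where availability detection was changed to use a read-only lock so that reduced permissions can be used.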

    Read the article

  • Cisco ASA not forwarding traffic from one interface to another

    - by Antoine Benkemoun
    Hello ServerFault, I need help with the configuration of my Cisco ASA 5510. I have set up 4 Cisco ASAs interconnected via a big LAN. Each Cisco ASA has 3 or 4 LANs attached to it. The IP routing part is taken care of by OSPF. My problem is on another level. A computer connected to one of the LANs attached to an ASA has no problem communicating with the outside world, the outside world being anything "after" the ASA. My problem is that I am completely unable to have them communicate with another LAN connected to the same ASA. To rephrase this, I am unable to send traffic from one interface of a given ASA to another interface of the same ASA. My configuration is the following:
!
hostname Fuji
!
interface Ethernet0/0
 speed 100
 duplex full
 nameif outside
 security-level 0
 ip address 10.0.0.2 255.255.255.0
 no shutdown
!
interface Ethernet0/1
 speed 100
 duplex full
 nameif cs4
 no shutdown
 security-level 100
 ip address 10.1.4.1 255.255.255.0
!
interface Ethernet0/2
 speed 100
 duplex full
 no shutdown
!
interface Ethernet0/2.15
 vlan 15
 nameif cs5
 security-level 100
 ip address 10.1.5.1 255.255.255.0
!
interface Ethernet0/2.16
 vlan 16
 nameif cs6
 security-level 100
 ip address 10.1.6.1 255.255.255.0
!
interface Management0/0
 speed 100
 duplex full
 nameif management
 security-level 100
 ip address 10.6.0.252 255.255.255.0
!
access-list nat_cs4 extended permit ip 10.1.4.0 255.255.255.0 any
access-list acl_cs4 extended permit ip 10.1.4.0 255.255.255.0 any
access-list nat_cs5 extended permit ip 10.1.5.0 255.255.255.0 any
access-list acl_cs5 extended permit ip 10.1.5.0 255.255.255.0 any
access-list nat_cs6 extended permit ip 10.1.6.0 255.255.255.0 any
access-list acl_cs6 extended permit ip 10.1.6.0 255.255.255.0 any
!
access-list nat_outside extended permit ip any any
access-list acl_outside extended permit ip any 10.1.4.0 255.255.255.0
access-list acl_outside extended permit ip any 10.1.5.0 255.255.255.0
access-list acl_outside extended permit ip any 10.1.6.0 255.255.255.0
!
nat (outside) 0 access-list nat_outside
nat (cs4) 0 access-list nat_cs4
nat (cs5) 0 access-list nat_cs5
nat (cs6) 0 access-list nat_cs6
!
static (outside,cs4) 0.0.0.0 0.0.0.0 netmask 0.0.0.0
static (outside,cs5) 0.0.0.0 0.0.0.0 netmask 0.0.0.0
static (outside,cs6) 0.0.0.0 0.0.0.0 netmask 0.0.0.0
!
static (cs4,outside) 10.1.4.0 10.1.4.0 netmask 255.255.255.0
static (cs4,cs5) 10.1.4.0 10.1.4.0 netmask 255.255.255.0
static (cs4,cs6) 10.1.4.0 10.1.4.0 netmask 255.255.255.0
!
static (cs5,outside) 10.1.5.0 10.1.5.0 netmask 255.255.255.0
static (cs5,cs4) 10.1.5.0 10.1.5.0 netmask 255.255.255.0
static (cs5,cs6) 10.1.5.0 10.1.5.0 netmask 255.255.255.0
!
static (cs6,outside) 10.1.6.0 10.1.6.0 netmask 255.255.255.0
static (cs6,cs4) 10.1.6.0 10.1.6.0 netmask 255.255.255.0
static (cs6,cs5) 10.1.6.0 10.1.6.0 netmask 255.255.255.0
!
access-group acl_outside in interface outside
access-group acl_cs4 in interface cs4
access-group acl_cs5 in interface cs5
access-group acl_cs6 in interface cs6
!
router ospf 1
 network 10.0.0.0 255.255.255.0 area 1
 network 10.1.4.0 255.255.255.0 area 1
 network 10.1.5.0 255.255.255.0 area 1
 network 10.1.6.0 255.255.255.0 area 1
 log-adj-changes
!
There is nothing really complicated in this configuration. It just NATs from one interface to another and that's it. I have tried enabling same-security-traffic permit inter-interface but that doesn't help. I therefore must be missing something a little bit more complicated. Does anyone know why I cannot forward traffic from one interface to another?
Thank you in advance for your help, Antoine
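
    As background for the question above, traffic between two ASA interfaces at the same security level normally hinges on two things: the global same-security-traffic setting and identity NAT between the interface pair. The lines below are only an illustrative sketch in the pre-8.3 syntax used by the configuration above, together with the packet-tracer command usually used to see where such traffic is dropped; they are not a confirmed fix for this particular setup, and the host addresses are examples.

        ! Permit traffic between interfaces that share the same security level
        same-security-traffic permit inter-interface
        !
        ! Trace a hypothetical ICMP echo from a host on cs4 to a host on cs5
        ! to see which ACL, NAT or routing step drops it
        packet-tracer input cs4 icmp 10.1.4.100 8 0 10.1.5.100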

    Read the article

  • CodePlex Daily Summary for Saturday, June 04, 2011

    CodePlex Daily Summary for Saturday, June 04, 2011Popular ReleasesSublightCmd: SublightCmd 1.1.0: -added guessTitle switch -improved console output -updated Sublight.Lib -updated SharpZipLib library -added new command: batch (same as downloadbatch) -added waitForExit switch for batch/downloadbatch commandpatterns & practices: Project Silk: Project Silk Community Drop 10 - June 3, 2011: Changes from previous drop: Many code changes: please see the readme.mht for details. New "Application Notifications" chapter. Updated "Server-Side Implementation" chapter. Guidance Chapters Ready for Review The Word documents for the chapters are included with the source code in addition to the CHM to help you provide feedback. The PDF is provided as a separate download for your convenience. Installation Overview To install and run the reference implementation, you must perform the fol...Claims Based Identity & Access Control Guide: Release Candidate: Highlights of this release This is the release candidate drop of the new "Claims Identity Guide" edition. In this release you will find: All code samples, including all ACS v2: ACS as a Federation Provider - Showing authentication with LiveID, Google, etc. ACS as a FP with Multiple Business Partners. ACS and REST endpoints. Using a WP7 client with REST endpoints. All ACS specific chapters. Two new chapters on SharePoint (SSO and Federation) All revised v1 chapters We are now ...Media Companion: MC 3.404b Weekly: Extract the entire archive to a folder which has user access rights, eg desktop, documents etc. Refer to the documentation on this site for the Installation & Setup Guide Important! *** Due to an issue where the date added & the full genre information was not being read into the Movie cache, it is highly recommended that you perform a Rebuild Movies when first running this latest version. This will read in the information from the nfo's & repopulate the cache used by MC during operation. Fi...Terraria Map Generator: TerrariaMapTool 1.0.0.4 Beta: 1) Fixed the generated map.html file so that the file:/// is included in the base path. 2) Added the ability to use parallelization during generation. This will cause the program to use as many threads as there are physical cores. 3) Fixed some background overdraw.DotRas: DotRas v1.2 (Version 1.2.4168): This release includes compiled (and signed) versions of the binaries, PDBs, CHM help documentation, along with both C# and VB.NET examples. Please don't forget to rate the release! If you find a bug, please open a work item and give as much description as possible. Stack traces, which operating system(s) you're targeting, and build type is also very helpful for bug hunting. If you find something you believe to be a bug but are not sure, create a new discussion on the discussions board. Thank...Caliburn Micro: WPF, Silverlight and WP7 made easy.: Caliburn.Micro v1.1 RTW: Download ContentsDebug and Release Assemblies Samples Changes.txt License.txt Release Highlights For WP7A new Tombstoning API based on ideas from Fluent NHibernate A new Launcher/Chooser API Strongly typed Navigation SimpleContainer included The full phone lifecycle is made easy to work with ie. we figure out whether your app is actually Resurrecting or just Continuing for you For WPFSupport for the Client Profile Better support for WinForms integration All PlatformsA power...VidCoder: 0.9.1: Added color coding to the Log window. Errors are highlighted in red, HandBrake logs are in black and VidCoder logs are in dark blue. 
Moved enqueue button to the right with the other control buttons. Added logic to report failures when errors are logged during the encode or when the encode finishes prematurely. Added Copy button to Log window. Adjusted audio track selection box to always show the full track name. Changed encode job progress bar to also be colored yellow when the enco...AutoLoL: AutoLoL v2.0.1: - Fixed a small bug in Auto Login - Fixed the updaterEPPlus-Create advanced Excel 2007 spreadsheets on the server: EPPlus 2.9.0.1: EPPlus-Create advanced Excel 2007 spreadsheets on the server This version has been updated to .Net Framework 3.5 New Features Data Validation. PivotTables (Basic functionalliy...) Support for worksheet data sources. Column, Row, Page and Data fields. Date and Numeric grouping Build in styles. ...and more And some minor new features... Ranges Text-Property|Get the formated value AutofitColumns-method to set the column width from the content of the range LoadFromCollection-metho...jQuery ASP.Net MVC Controls: Version 1.4.0.0: Version 1.4.0.0 contains the following additions: Upgraded to MVC 3.0 Upgraded to jQuery 1.6.1 (Though the project supports all jQuery version from 1.4.x onwards) Upgraded to jqGrid 3.8 Better Razor View-Engine support Better Pager support, includes support for custom pagers Added jqGrid toolbar buttons support Search module refactored, with full suport for multiple filters and ordering And Code cleanup, bug-fixes and better controller configuration support.Nearforums - ASP.NET MVC forum engine: Nearforums v6.0: Version 6.0 of Nearforums, the ASP.NET MVC Forum Engine, containing new features: Authentication using Membership Provider for SQL Server and MySql Spam prevention: Flood Control Moderation: Flag messages Content management: Pages: Create pages (about us/contact/texts) through web administration Allow nearforums to run as an IIS subapp Migrated Facebook Connect to OAuth 2.0 Visit the project Roadmap for more details.NetOffice - The easiest way to use Office in .NET: NetOffice Release 0.8b: Changes: - fix critical issue 15922(AccessViolationException) once and for all ...update is strongly recommended Known Issues: - some addin ribbon examples has a COM Register function with missing codebase entry(win32 registry) ...the problem is only affected to c# examples. fixed in latest source code. NetOffice\ReleaseTags\NetOffice Release 0.8.rar Includes: - Runtime Binaries and Source Code for .NET Framework:......v2.0, v3.0, v3.5, v4.0 - Tutorials in C# and VB.Net:...................Facebook Graph Toolkit: Facebook Graph Toolkit 1.5.4186: Updates the API in response to Facebook's recent change of policy: All Graph Api accessing feeds or posts must provide a AccessToken.Serviio for Windows Home Server: Beta Release 0.5.2.0: Ready for widespread beta. Synchronized build number to Serviio version to avoid confusion.AcDown????? - Anime&Comic Downloader: AcDown????? v3.0 Beta4: ??AcDown?????????????,??????????????,????、????。?????Acfun????? ????32??64? Windows XP/Vista/7 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86)?.NET Framework 2.0???(x64),?????"?????????"??? ??v3.0 Beta4 2011-5-31?? ???Bilibili.us????? ???? ?? ???"????" ???Bilibili.us??? ??????? ?? ??????? ?? ???????? ?? ?? ???Bilibili.us?????(??????????????????) ??????(6.cn)?????(????) ?? ?????Acfun?????????? ?????????????? ???QQ???????? 
????????????Discussion...CodeCopy Auto Code Converter: Code Copy v0.1: Full add-in, setup project source code and setup fileTerrariViewer: TerrariViewer v2.4.1: Added Piggy Bank editor and fixed some minor bugs.Kooboo CMS: Kooboo CMS 3.02: Updated the Kooboo_CMS.zip at 2011-06-02 11:44 GMT 1.Fixed: Adding data rule issue on page. 2.Fixed: Caching issue for higher performance. Updated the Kooboo_CMS.zip at 2011-06-01 10:00 GMT 1. Fixed the published package throwed a compile error under WebSite mode. 2. Fixed the ContentHelper.NewMediaFolderObject return TextFolder issue. 3. Shorten the name of ContentHelper API. NewMediaFolderObject=>MediaFolder, NewTextFolderObject=> TextFolder, NewSchemaObject=>Schema. Also update the C...mojoPortal: 2.3.6.6: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2366-released Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.New ProjectsCampaign Portfolio Manager: This is a light-weight organizer for GMs and Players of Table-Top RPGs. It allows quick access to basic notes regarding PCs, NPCs, and planned Stories. Written in C#eMarketplace: eMarketPlace is a website project for buyer and sellers for marketing any type of goodsEstudo de Realidade Aumentada usando Balder: Estudo de Realidade Aumentada usando desenhos 3D e Processamento de Imagem.Hex Vitals Calculator: Hex Vitals will help game masters understand their hex maps better. If you have one of seven measurements for a hex, you can use it to derive the other six. This program was written in C#.Internet Programming with Asp.NET 4: This project based on lesson in my college , Ma Chung University which is located at Malang, Indonesia. We (team = 2person) created this with Microsoft Visual Studio 2010 ( Visual Basic.NET). This project site about cooking club (www.hostingmm.com) at my university.kekkoooLibs: my tests.Moira Project: Moira project is a configurable, extensible and pluggable software for file management. Its core is a component that watches the file system and is capable of running different tasks based on files properties analysis. OOP: OOP is a C++ framework, which provides many utilities to make easier for C++ programmers to develop C++ programOsProject: this project is a sample of company routine to call technical. PowerShell Patch Audit/Installation GUI: PoshPAIG allows you to easily audit and install patches on your servers in the network by providing a graphical interface to select which servers to audit/install and to generate reports for the systems.RiordanWebSite: ??web??SelvinListSyncSample: My sample for Microsoft Sync Framework 4.0 CTP on Android deviceSharePoint 2010 XML Managed Metadata Migrator: The SP2010 Managed Metadata Migrator allows the export of a metadata service configuration from a source SP2010 farm, then re-import into a target SP2010 farm with options to re-associate your content type definitions with the defined term sets. 
Developed in C#SharePoint Foundation QueryString Filter WebPart: SharePoint foundation web part to provide out of the box components the capability to filter information based on a url querystring parameterSitePounder: Send infinite requests to specified URL target; join with others to wage distributed denial of service attacks against the deserving and overly self-serving.Snaky: A Simple Snake Game in C++ using SFML 2.0SOL Polar Explorer: A performance polar viewer and data explorer for sailonline.org polar files.SPEventReceiverWebpart: A SharePoint Webpart which allows you to add or delete EventReceivers to or from Lists in the current web.VS.NET Deployment Project Updater: VDProjectUpdater is a command line utility that will update a VS.NET setup project (vdproj) version and product/package/upgrade/module identifiers. This allows the setup projects to be integrated into automated builds that generate versioned setup outputs.?????? For Silverlight 5: ?????Silverlight ?RIA Service ???????????,??????EsaySL???,????????????,??????Silverlight

    Read the article

  • New release of the Windows Azure SDK and Tools (March CTP)

    - by kaleidoscope
    From now on, you only have to download the Windows Azure Tools for Microsoft Visual Studio and the SDK will be installed as part of that package.
    What's new in the Windows Azure SDK:
    - Support for developing Managed Full Trust applications. It also provides support for native code via P/Invoke and spawning native processes.
    - Support for developing FastCGI applications, including support for rewrite rules via the URL Rewrite Module.
    - Improved support for the integration of development storage with Visual Studio, including enhanced performance and support for SQL Server (local instance only).
    What's new in the Windows Azure Tools for Visual Studio:
    - Combined installer includes the Windows Azure SDK.
    - Addressed top customer bugs.
    - Native code debugging.
    - Update notification of future releases.
    - FastCGI template.
    http://blogs.msdn.com/jnak/archive/2009/03/18/now-available-march-ctp-of-the-windows-azure-tools-and-sdk.aspx
    Ritesh, D
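
    The "Managed Full Trust" item is the one that opens the door to native interop. As a rough, hypothetical illustration (this sample is not part of the announcement), a full-trust role could call a Win32 API through P/Invoke and spawn a native process along these lines:

        using System;
        using System.Diagnostics;
        using System.Runtime.InteropServices;

        class NativeInteropSample
        {
            // P/Invoke into a Win32 API; this only works when the code runs with full trust.
            [DllImport("kernel32.dll")]
            static extern uint GetCurrentProcessId();

            static void Main()
            {
                // Call the native function directly from managed code.
                Console.WriteLine("Native process id: {0}", GetCurrentProcessId());

                // Spawn a native process, the other capability mentioned above.
                var psi = new ProcessStartInfo("cmd.exe", "/c echo hello from a native process")
                {
                    UseShellExecute = false,
                    RedirectStandardOutput = true
                };
                using (var p = Process.Start(psi))
                {
                    Console.WriteLine(p.StandardOutput.ReadToEnd());
                    p.WaitForExit();
                }
            }
        }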

    Read the article

  • website address point to localhost

    - by munir ahmad
    Today, while surfing the internet in Firefox, I opened a website and it warned "This site may harm your computer" and asked whether I wanted to add the site to an exception list. I added the website to the exception list and trusted it. Since then, whenever I open this website it always points to localhost while the internet is connected. I have set up localhost through Apache (XAMPP server). If the internet is not connected the website does not open anything, but localhost still works as usual. How can I fix this so that the website no longer points to localhost? I am using Windows XP SP3, and the same problem now appears in all browsers too.
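
    One place worth checking in a situation like this (an assumption on my part, since the question does not say what caused the redirect) is the Windows hosts file, which maps host names to IP addresses before DNS is consulted. A hijacked or leftover entry there looks something like the sketch below; removing the offending line and clearing the browser cache would stop the name resolving to 127.0.0.1. The domain shown is a placeholder.

        # C:\Windows\System32\drivers\etc\hosts
        127.0.0.1    localhost
        127.0.0.1    www.example-site.com    # an entry like this forces the site to localhost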

    Read the article

  • CodePlex Daily Summary for Tuesday, May 15, 2012

    CodePlex Daily Summary for Tuesday, May 15, 2012Popular Releases51Degrees.mobi - Mobile Device Detection and Redirection: 2.1.4.9: One Click Install from NuGet Data ChangesIncludes 42 new browser properties in both the Lite and Premium data sets. Premium Data includes many new devices including Nokia Lumia 900, BlackBerry 9220 and HTC One, the Samsung Galaxy Tab 2 range and Samsung Galaxy S III. Lite data includes devices released in January 2012. Changes to Version 2.1.4.91. Added Microsoft.Web.Infrastructure.DynamicModuleHelper back into Activator.cs to ensure redirection works when .NET 4 PreApplicationStart use...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.52: Make preprocessor comment-statements nestable; add the ///#IFNDEF statement. (Discussion #355785) Don't throw an error for old-school JScript event handlers, and don't rename them if they aren't global functions.DotNetNuke® Events: 06.00.00: This is a serious release of Events. DNN 6 form pattern - We have take the full route towards DNN6: most notably the incorporation of the DNN6 form pattern with streamlined UX/UI. We have also tried to change all formatting to a div based structure. A daunting task, since the Events module contains a lot of forms. Roger has done a splendid job by going through all the forms in great detail, replacing all table style layouts into the new DNN6 div class="dnnForm XXX" type of layout with change...LogicCircuit: LogicCircuit 2.12.5.15: Logic Circuit - is educational software for designing and simulating logic circuits. Intuitive graphical user interface, allows you to create unrestricted circuit hierarchy with multi bit buses, debug circuits behavior with oscilloscope, and navigate running circuits hierarchy. Changes of this versionThis release is fixing one but nasty bug. Two functions XOR and XNOR when used with 3 or more inputs were incorrectly evaluating their results. If you have a circuit that is using these functions...SharpCompress - a fully native C# library for RAR, 7Zip, Zip, Tar, GZip, BZip2: SharpCompress 0.8.1: Two fixes: Rar Decompression bug fixed. Error only occurred on some files Rar Decompression will throw an exception when another volume isn't found but one is expected.?????????? - ????????: All-In-One Code Framework ??? 2012-05-14: http://download.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1codechs&DownloadId=216140 ???OneCode??????,??????????6????Microsoft OneCode Sample,????2?Data Platform Sample?4?WPF Sample。???????????。 ????,?????。http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=128165 Data Platform Sample CSUseADO CppUseADO WPF Sample CSWPFMasterDetailBinding VBWPFMasterDetailBinding CSWPFThreading VBWPFThreading ....... ???????????blog: ??,??????MSD...ZXMAK2: Version 2.6.1.7: - fix tape bug: cannot select block 0 to playLINQ to Twitter: LINQ to Twitter Beta v2.0.25: Supports .NET 3.5, .NET 4.0, Silverlight 4.0, Windows Phone 7.1, Client Profile, and Windows 8. 100% Twitter API coverage. Also available via NuGet! Follow @JoeMayo.ASP.net MVC HTML5 Helpers Toolkit: ASP.net MVC HTML5 Toolkit: 14th May 2012 MVC HTML5 Helpers Toolkit .NET 4 - Binary – Source code and sample site Update 14/05/2012 - Updated demo project to use MVC4 and Twitter Bootstrap. Password input type has also been added to the list.GAC Explorer: GACExplorer_x86_Setup: Version 1.0 Features -> Copy assembly(s) to clipboard. -> Copy assembly(s) to local folder. -> Open assembly(s) folder location. 
-> Support Shortcut keysBlogEngine.NET: BlogEngine.NET 2.6: Get DotNetBlogEngine for 3 Months Free! Click Here for More Info BlogEngine.NET Hosting - 3 months free! Cheap ASP.NET Hosting - $4.95/Month - Click Here!! Click Here for More Info Cheap ASP.NET Hosting - $4.95/Month - Click Here! If you want to set up and start using BlogEngine.NET right away, you should download the Web project. If you want to extend or modify BlogEngine.NET, you should download the source code. If you are upgrading from a previous version of BlogEngine.NET, please take...BlackJumboDog: Ver5.6.2: 2012.05.07 Ver5.6.2 (1) Web???????、????????·????????? (2) Web???????、?????????? COMSPEC PATHEXT WINDIR SERVERADDR SERVERPORT DOCUMENTROOT SERVERADMIN REMOTE_PORT HTTPACCEPTCHRSET HTTPACCEPTLANGUAGE HTTPACCEPTEXCODINGGardens Point Parser Generator: Gardens Point Parser Generator version 1.5.0: ChangesVersion 1.5.0 contains a number of changes. Error messages are now MSBuild and VS-friendly. The default encoding of the *.y file is Unicode, with an automatic fallback to the previous raw-byte interpretation. The /report option has been improved, as has the automaton tracing facility. New facilities are included that allow multiple parsers to share a common token type. A complete change-log is available as a separate documentation file. The source project has been upgraded to Visual...Media Companion: Media Companion 3.502b: It has been a slow week, but this release addresses a couple of recent bugs: Movies Multi-part Movies - Existing .nfo files that differed in name from the first part, were missed and scraped again. Trailers - MC attempted to scrape info for existing trailers. TV Shows Show Scraping - shows available only in the non-default language would not show up in the main browser. The correct language can now be selected using the TV Show Selector for a single show. General Will no longer prompt for ...NewLife XCode ??????: XCode v8.5.2012.0508、XCoder v4.7.2012.0320: X????: 1,????For .Net 4.0?? XCoder????: 1,???????,????X????,?????? XCode????: 1,Insert/Update/Delete???????????????,???SQL???? 2,IEntityOperate?????? 3,????????IEntityTree 4,????????????????? 5,?????????? 6,??????????????Google Book Downloader: Google Books Downloader Lite 1.0: Google Books Downloader Lite 1.0Python Tools for Visual Studio: 1.5 Alpha: We’re pleased to announce the release of Python Tools for Visual Studio 1.5 Alpha. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including: • Supports Cpython, IronPython, Jython and Pypy • Python editor with advanced member, signature intellisense and refactoring • Code navigation: “Find all refs”, goto definition, and object browser • Local and remote debugging...AD Gallery: AD Gallery 1.2.7: NewsFixed a bug which caused the current thumbnail not to be highlighted Added a hook to take complete control over how descriptions are handled, take a look under Documentation for more info Added removeAllImages()WebsitePanel: 1.2.2: This build is for Beta Testing only. DO NOT USE IN PRODUCTION. The following work items has been fixed/closed in WebsitePanel 1.2.2.1: 225 135 59 96 23 29 191 72 48 240 245 244 160 16 65 7 156AcDown????? - Anime&Comic Downloader: AcDown????? v3.11.6: ?? 
●AcDown??????????、??、??????,????1M,????,????,?????????????????????????。???????????Acfun、????(Bilibili)、??、??、YouTube、??、???、??????、SF????、????????????。??????AcPlay?????,??????、????????????????。 ● AcDown???????????????????????????,???,???????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7/8 ????????????? ??:????????Windows XP???,?????????.NET Framework 2.0???(x86),?????"?????????"??? ??????????????,??????????: ??"AcDo...New ProjectsAspose.Words for Java Examples: This project contains example code for Aspose.Words for Java. Aspose.Words is a class library for generating, converting and rendering wordprocessing documents. Aspose.Words supports DOC, OOXML, RTF, HTML, MHTML, TXT, OpenDocument, PDF, XPS, EPUB, SWF, SVG, Image, printing and other formats.Bilingual Text Matching: This project is to extract bilingual sentence pairs from text. As an important basic module, it is widely applied in many different tasks in natural language process field(NLP), such as machine translation, search engine, language study and so on.bitboxx bbnews: The bitboxx bbnews module is a DNN module for collecting and providing news on your portal. It is able to collect news from a RSS/Atom feeds or from twitter. Alternativeliy you can write your own news. News display is full templated and could be provided as RSS feed too.BlackCat: Easy-to-use tool to check, create and generate encryption data. You can: - generate RSA Keypairs for keysize 1024, 2048, 4096, 8192 bytes. - generate MD5 String Hash - compare MD5 String Hash Checksums - generate MD5 and SHA1 File Hash - compare MD5 and SHA1 File Checksums That's it!BoardSpace.net Hive Games Reviewer (Ultimate Edition): BoardSpace.net Hive Games Reviewer (Ultimate Edition)Consistent Hash: Enterprise Consistent Hash (ECH) is a consistent hashing library written in C#. It is tested to be used at production level and supports N number of nodes (servers); while most of the hashing implement out there only support limited number of nodes in the hash space to achieve the required performance speed. ESH provide the following features: 1. High performance, it implements Fowler–Noll–Vo hash algorithm by overcoming all its weakness defined in the algorithm. This implementation might...Data Mining Add-Ins for Excel Sample Data (includes Call Center): Data Mining Add-Ins for Excel Sample Data (includes Call Center) sample workbook is an update to the sample workbook that is installed when you download and install the Data Mining Add-ins. The update includes additional data that supports the Goal Seek Analysis tool, and caElectronic Diary: Electronic DiaryFileZilla Server Config File Editor: A simple program for editing FileZilla Server config file. A co-worker manages a Filezilla FTP server on Windows Server 2008. He would like to give a few people the ability to add/delete users and to grant rights for various download subfolders. He showed me the steps he has to go through to perform the tasks listed above, and felt it could be error-prone for others. So he gave me the config file ("FileZilla Server.xml") as a template to build this application for him.FIM Ultimate File Connector: Project providing an Extensible Connectivity 2.0 (ECMA) File Connector (previously Management Agent). 
Just the basic File Connector supporting the following OOB file formats: *Attribute Value Pair (AVP) *Delimited *Directory Services Markup Language (DSML) *Fixed *LDAP Data Interchange Format (LDIF) But has the following extra functionality: *Full Export that before ECMA had to be handled externally from FIM/ILM/MIIS *Files can be managed at FTP, FTPS, SFTP, SCP and File System ...GAC Explorer: This application can be used by DotNet Developers to download assembly(s) from Global Assembly Cache (GAC). It contains features like Copy Assembly(s) to Clipboard or Copy to some Folder in Local Machine. Best part it supports DotNet 4.0 GAC structure.GPX.NET: GPX.NET provides a set of C# classes for the GPX (GPS eXchange Format) standard. It offers a full implementation of the V1.1 standard in a clean and straightforward way using the native .NET XmlSerializer. Reading, writing and a programmatic document object model are supported.gscirc: IRC client for windows. Hard-ceded string finder: This project can help to C# developers move hard-coded string to resources file (in existing or newly generated by this program). Also it has ability to search duplicated string s in resources and source file.Image Popup Module dotnetnuke: Image Pop-up Module is a module to show image light-box pop ups in dotnetnuke websites Please Follow the steps to use this module 1 Install the module and drop on your page where you want to show the pop up 2 In your HTML module editor add the token "{imagepopup}" 3 In your HTML module editor add class="popup-img" in your images which you want to show in popup.Intercom: Intercom is a comprehensive C# API wrapper library for accessing the Intercom.IO APILakana - WPF Navigation Framework: A lightweight and powerful navigation framework for WPF.Mageris: MagerisMaLoRTLib: raytracer library used in the MaLoRT.SGP - SISTEMA DE GESTÃO PAROQUIAL: SGP - SISTEMA DE GESTÃO PAROQUIAL - VERSION 1.0Silva PeerChannel: This is a simple project using the new "Peer Channel" Technology provided by Microsoft it is an ideal project & sample for developers who wants to start developing in this area (Network) Some feature of this project: 1.Chat with unlimited clients (Chat room) 2.Send/Receive unlimited file between unlimited clients. (TCP) 3.Download files from internet 4.Search and select files to download (P2P) All the source of project is full of comments to understand every single line of code...Silver Desktop: SDSimple Hit Counter WebPart SharePoint 2010: HitCounterWebPart.wsp HitCounterWebPart Source CodeSimpleRX: SimpleRX is a educational project that shows a possible implementation of many rx commands. SimpleRX should stimulate a study with the reactive extensions. It is also a guard to extend the reactive extensions with custom commands. A introduction to simpleRX can be found at: netmatze.wordpress.com Skill Studio: Skill Studio is a Visual Code Generator for Unity. (http://unity3d.com/unity/) By now it can generate BehaviorTree and AnimationTree visually.TaskMgr2: TestTeam Foundation Server overview: This application basically just allows you to open several Team Foundation Server windows on your secondary monitor that you use a lot, so that you can enjoy task management the easy way.TongjiXuanke: An Hacker-programme for xuanke Platform of Tongji University (Shanghai,China) Visual C++ 2010 Directories Editor: Path editor of include, library, source and etc. 
foldersVisual Coder: VSCWirelessNetworkDetection: Project to find all your wireless access pointswith connected clients.X Window System for COSMOS: This is a project meant to provide a GUI for COSMOS. It is built upon version 89858 of COSMOS kernel and provided in .dll form that expose the most common methods to create dialog/modal windows with drag&drop, re-size, open/close facilities. This solution bases on a double-linked list principle, recursively parsing the hierarchy of windows in both ways. This allows dynamic allocation of memory for an infinite number of windows, screen refresh and active window sensitivity.

    Read the article

  • CodePlex Daily Summary for Monday, March 12, 2012

    CodePlex Daily Summary for Monday, March 12, 2012Popular ReleasesAvalonDock: AvalonDock 2.0.0345: Welcome to early alpha release of AvalonDock 2.0 I've completely rewritten AvalonDock in order to take full advantage of the MVVM pattern. New version also boost a lot of new features: 1) Deep separation between model and layout. 2) Full WPF binding support thanks to unified logical tree between main docking manager, auto-hide windows and floating windows. 3) Support for Aero semi-maximized windows feature. 4) Support for multiple panes in the same floating windows. For a short list of new f...Windows Azure PowerShell Cmdlets: Windows Azure PowerShell Cmdlets 2.2.2: Changes Added Start Menu Item for Easy Startup Added Link to Getting Started Document Added Ability to Persist Subscription Data to Disk Fixed Get-Deployment to not throw on empty slot Simplified numerous default values for cmdlets Breaking Changes: -SubscriptionName is now mandatory in Set-Subscription. -DefaultStorageAccountName and -DefaultStorageAccountKey parameters were removed from Set-Subscription. Instead, when adding multiple accounts to a subscription, each one needs to be added ...IronPython: 2.7.2: On behalf of the IronPython team, I'm happy to announce the final release IronPython 2.7.2. This release includes everything from IronPython 54498 and 62475 as well. Like all IronPython 2.7-series releases, .NET 4 is required to install it. Installing this release will replace any existing IronPython 2.7-series installation. Unlike previous releases, the assemblies for all supported platforms are included in the installer as well as the zip package, in the "Platforms" directory. IronPython 2...Kooboo CMS: Kooboo CMS 3.2.0.0: Breaking changes: When upgrade from previous versions, MUST reset the all the content type templates, otherwise the content manager might get a compile error. New features Integrate with Windows azure. See: http://wiki.kooboo.com/?wiki=Kooboo CMS on Azure Complete solution to deploy on load balance servers. See: http://wiki.kooboo.com/?wiki=Kooboo CMS load balance Update Jquery and Jquery ui to the lastest version(Jquery 1.71, Jquery UI 1.8.16). Tree style text content editing. See:h...FluentData -Micro ORM with a fluent API that makes it simple to query a database: FluentData version 2.0: New features: - Support for events: OnConnectionClosed, OnConnectionOpened, OnConnectionOpening, OnError, OnExecuted, OnExecuting - Added a CommandTimeout method on the Context. This allows you to set the time out for all the commands. - QueryValues support has been added for Stored Procedures. 
Changes to existing features: - IgnoreProperty has been moved from a separate property to be a parameter in the AutoMap method.Home Access Plus+: v7.10: Don't forget to add your location to the list: http://www.nbdev.co.uk/projects/hap/locations.aspx Changes: Added: CompressJS controls to the Help Desk & Booking System (reduces page size) Fixed: Debug/Release mode detection in CompressJS control Added: Older Browsers will use an iframe and the old uploadh.aspx page (works better than the current implementation on older browsers) Added: Permalinks for my files, you can give out links that redirect to the correct location when you log i...SubExtractor: Release 1026: Fix: multi-colored bluray subs will no longer result in black blob for OCR Fix: dvds with no language specified will not cause exception in name creation of subtitle files Fix: Root directory Dvds will use volume label as their directory nameExtensions for Reactive Extensions (Rxx): Rxx 1.3: Please read the latest release notes for details about what's new. Related Work Items Content SummaryRxx provides the following features. See the Documentation for details. Many IObservable<T> extension methods and IEnumerable<T> extension methods. Many wrappers that convert asynchronous Framework Class Library APIs into observables. Many useful types such as ListSubject<T>, DictionarySubject<T>, CommandSubject, ViewModel, ObservableDynamicObject, Either<TLeft, TRight>, Maybe<T>, Scala...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.47: Properly output escaped characters in CSS identifiers throw an EOF error when parsing a CSS selector that doesn't end in a declaration block chased down a stack-overflow issue with really large JS sources. Needed to flatten out the AST tree for adjacent expression statements that the application merges into a single expression statement, or that already contain large, comma-separated expressions in the original source. fix issue #17569: tie together the -debug switch with the DEBUG defi...Player Framework by Microsoft: Player Framework for Windows 8 Metro (BETA): Player Framework for HTML/JavaScript and XAML/C# Metro Style Applications.WPF Application Framework (WAF): WAF for .NET 4.5 (Experimental): Version: 2.5.0.440 (Experimental): This is an experimental release! It can be used to investigate the new .NET Framework 4.5 features. The ideas shown in this release might come in a future release (after 2.5) of the WPF Application Framework (WAF). More information can be found in this dicussion post. Requirements .NET Framework 4.5 (The package contains a solution file for Visual Studio 11) The unit test projects require Visual Studio 11 Professional Changelog All: Upgrade all proje...SSH.NET Library: 2012.3.9: There are still few outstanding issues I wanted to include in this release but since its been a while and there are few new features already I decided to create a new release now. New Features Add SOCKS4, SOCKS5 and HTTP Proxy support when connecting to remote server. For silverlight only IP address can be used for server address when using proxy. Add dynamic port forwarding support using ForwardedPortDynamic class. Add new ShellStream class to work with SSH Shell. Add supports for mu...Test Case Import Utilities for Visual Studio 2010 and Visual Studio 11 Beta: V1.2 RTM: This release (V1.2 RTM) includes: Support for connecting to Hosted Team Foundation Server Preview. Support for connecting to Team Foundation Server 11 Beta. 
Fix to issue with read-only attribute being set for LinksMapping-ReportFile which may have led to problems when saving the report file. Fix to issue with “related links” not being set properly in certain conditions. Fix to ensure that tool works fine when the Excel file contained rich text data. Note: Data is still imported in pl...DotNetNuke® Community Edition CMS: 06.01.04: Major Highlights Fixed issue with loading the splash page skin in the login, privacy and terms of use pages Fixed issue when searching for words with special characters in them Fixed redirection issue when the user does not have permissions to access a resource Fixed issue when clearing the cache using the ClearHostCache() function Fixed issue when displaying the site structure in the link to page feature Fixed issue when inline editing the title of modules Fixed issue with ...Mayhem: Mayhem Developer Preview: This is the developer preview of Mayhem. Enjoy!Magelia WebStore Open-source Ecommerce software: Magelia WebStore 1.2: Medium trust compliant lot of small change for medium trust compliance full refactoring of user management refactoring of Client Refactoring of user management Magelia.WebStore.Client no longer reference Magelia.WebStore.Services.Contract Refactoring page category multi parent category added copy category feature added Refactoring page catalog copy catalog feature added variant management improvement ability to define a default variant for a variable product ability to ord...PDFsharp - A .NET library for processing PDF: PDFsharp and MigraDoc Foundation 1.32: PDFsharp and MigraDoc Foundation 1.32 is a stable version that fixes a few bugs that were found with version 1.31. Version 1.32 includes solutions for Visual Studio 2010 only (but it should be possible to add the project files to existing solutions for VS 2005 or VS 2008). Users of VS 2005 or VS 2008 can still download version 1.31 with the solutions for those versions that allow them to easily try the samples that are included. While it may create smaller PDF files than version 1.30 because...Terminals: Version 2.0 - Release: Changes since version 1.9a:New art works New usability in Organize favorites window Improved usability of imports/exports and scans Large number of fixes Improvements in single instance mode Comparing November beta 4, this corrects: New application icons Doesn't show Logon error codes Fixed command line arguments exception for single instance mode Fixed detaching of tabs improved usability in detached window Fixed option settings for Capture manager Fixed system tray noti...MFCMAPI: March 2012 Release: Build: 15.0.0.1032 Full release notes at SGriffin's blog. If you just want to run the MFCMAPI or MrMAPI, get the executables. If you want to debug them, get the symbol files and the source. The 64 bit builds will only work on a machine with Outlook 2010 64 bit installed. All other machines should use the 32 bit builds, regardless of the operating system. Facebook BadgeTortoiseHg: TortoiseHg 2.3.1: bugfix releaseNew ProjectsBurrow.NET: Burrow is a simple library created based on some EasyNetQ ideas, it's a thin wrapper of RabbitMQ.Client for .NET. Basically, if you just need to put your message or subscribe messages from RabbitMQ server, you found the right place. 
With Burrow.NET, you can easily customize almost everything start with exchange and queue name, changing the way to serialize your object, inject custom error handling strategies, etc.C# Base Media File Format Library: Parses ISO Base Media File Format files including QuickTime (.mov, .mp4, .m4v, .m4a), Microsoft Smooth Streaming (.ismv, .isma, .ismc), JPEG2000 (.jp2, .jpf, .jpx), Motion JPEG2000 (.mj2, .mjp2), 3GPP/3GPP2 (.3gp, .3g2) and other conforming format extensions.devtm.Aop: Aspect Oriented Programming with Mono.Cecil (on build time and runtime)Dynamics CRM 2011 Script# Xrm.Page Library: This is a Script# (scriptsharp) import library that you can use to write Dynamics CRM 2011 web resources easily and efficiently. This library provides access to all functions currently documented under MSDN Xrm.Page.EntityUI: EntityUI is basically an idea to be able to create User Interface in ASP .Net applications using Code First approcah. Flurr: Flurr is the ultimate open source API wrapper library for different social networks such as Tumblr, Twitter and more! With it, you can easily connect to social networks in your desktop or web applications, by simply importing a .dll file.GerenciadorPacotes: Gerenciador de PacotesInfo Bandung: Woyyy, Orang Bandung kita bagi2 Info yu disini, smua tentang Bandung boleh tempat makanan, tempat gaul, trend, tempat murah,,, apapun yang asik-asik :-) jstring Multilingual Class Library: jstring is a small library that provides multilingual string support. The jstring class provides programmatic support for projects that require the ability to change languages on-the-fly. Kinect: Tower Defence: Kinect: Tower Defence is a 2D tower defence game programmed in C# using the XNA Framework, played on the PC. It will make use of the Kinect hardware and motion tracking to add more fun to how the game is controlled and played.Kuick -- Application Framework: An Application Framework. Kuick Data -- ORM Framework: An ORM FrameworkMelorin Radio: This is a test project for radioNorthwind-projekt: Projekt oparty na bazie Northwindolaf: olaf makes it easier for manual qa testers to use selenium web-driver by defining their test case flows in excel spreadsheets.OpenSOCKS (Open Shared Objective Collaborative Kernel System): OpenSOCKS is the best of two fantastic C# -> OS kernel compilers (MOSA and COSMOS) We are open source and aim to make our kernel simple and full of features. OpenSOCKS (Open Shared Objective Collaborative Kernel System) - by the makers of PearOSpelotas: okRadaCode.SwissKnife: SwissKnife is a RadaCode's collection of C# classes that facilitate the overall development and help with stuff like HTML removal, random name and number generation, etc. Simple AutoUpdater: This project is a simple updater.Simple TFS Tool: Simple TFS Tool for getting source from TFSSISAP: SISAPSnippet Compiler Tool Window: Snippet Compiler adds a tool window to the Visual Studio 2010 and 11 Beta where you can type/paste code snippets and try to compile them to see if they workTagomatique: Permet la gestion de fichiers multimédia sur principe des tagstsi2012: Proyecto de TS1 año 2012wbgj: this is weibo projectWindows Phone 7 Text Style Picker: WP7TextStylePicker was created to fill the gaping whole in the SDK: surprisingly, there is no control that would allow setting text properties (Color, Font Family, Font Size, Bold, Italic) - even though this sounds like a very basic task that many applications would need.WPLiveEdu: Windows phone application for browsing Live@Edu calendar

    Read the article

  • Over a million COBOL programmers in the world?

    - by Lucas McCoy
    I think I heard on a previous StackOverflow podcast that COBOL was used as the programming language for traffic lights (or something like that), which got me interested. I did a quick Google search and found this little article: Today, Cobol is everywhere, yet largely unheard of by millions of people who interact with it daily when using the ATM, stopping at traffic lights or buying a product online. The statistics on Cobol attest to its huge influence on the business world: There are over 220 billion lines of Cobol in existence, a figure which equates to about 80 per cent of the world's actively used code. There are over a million Cobol programmers in the world. There are 200 times as many Cobol transactions taking place each day as there are Google searches. I don't really trust the source, seeing as it's on some random phpBB forum. So how accurate are these figures? Are there really 220 billion lines of COBOL? I assume a few people and companies still use COBOL, but how many?

    Read the article

  • Where can I safely learn about computer security?

    - by Ammar Ahmed
    I find it really hard to find resources about computer security. I have asked questions on message boards about key loggers and viruses, and I got negative reactions from people assuming the worst. Also, I don't think I can trust random message boards. I know it is a broad topic, but are there any good websites, aimed at beginners and with some samples, that I can follow and learn from? I am a developer (or at least want to be one) and I have a CS degree, if that helps.

    Read the article

  • Customer Centric Wealth Management

    - by michael.seback
    While the world continues to search its way out of the recent financial turmoil and recession, the crisis has without doubt exposed the inherent faults in the wealth management industry and the larger financial system. To counter these apprehensions, wealth management firms are now actively seeking and evaluating avenues to rebuild the lost trust. They are looking at engaging their customers in managing their investments in a more collaborative and transparent manner. At the same time, wealth managers are also seeking to equip themselves with complete and comprehensive customer information in order to provide the best advice and the best solution at the right time. Read your copy of this new global White Paper on Wealth Management.

    Read the article

  • Ubuntu Lagging even LXDE freezes

    - by Anas Ismail Khan
    Laptop, Core i3, 2 GB of RAM, running Ubuntu 14.04 LTS, and it lags badly. If I open more than four tabs in Chrome it freezes, and often I have no choice but to restart; multi-tasking is difficult and at times impossible. There is a whole story about Lubuntu and LXDE being supposed to be super-fast, so I installed LXDE (mind, not lubuntu-desktop, just LXDE) and it too freezes every now and then, and trust me, when it freezes it does so worse than Unity, ESPECIALLY when I start PCManFM and mount a disk or two. Any ideas as to why this is happening? The minimum requirement for Unity is supposed to be 1 GB of RAM, and people are running it fine even on 512 MB.
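
    Before blaming Unity or LXDE, it is worth confirming whether the machine is simply exhausting its 2 GB of RAM and thrashing swap when several Chrome tabs are open. A minimal diagnostic sketch with generic commands (nothing here is specific to this laptop) might be:

        # Memory and swap usage in megabytes
        free -m
        # Active swap devices (swapon -s is the older syntax available on 14.04)
        swapon -s
        # Memory, swap-in/out (si/so) and CPU once per second while reproducing the freeze
        vmstat 1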

    Read the article

  • Do the "Contact us" and "Privacy policy" pages affect SEO?

    - by Gkhan14
    Just like the title says, what are the effects of having a "Contact us" and a "Privacy policy" page on your site? I've read that it could build up your trust with Google; is this true? I've also read that some people say you should add a noindex tag to your "Privacy policy" page; would this be a good idea? I ask because many websites have similar privacy policies and I don't want any duplicate content issues (for example, many people could be using the same WordPress privacy policy generator). I'm wondering the same things for the "Contact us" page as well.
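
    For reference, the "noindex tag" mentioned above usually means a robots meta element in the page's head (the same effect can be achieved with an X-Robots-Tag HTTP header). A minimal illustration, not tied to any particular CMS:

        <!-- In the <head> of the privacy policy page: keep it out of the index but still let crawlers follow its links -->
        <meta name="robots" content="noindex, follow">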

    Read the article

< Previous Page | 86 87 88 89 90 91 92 93 94 95 96 97  | Next Page >