Search Results

Search found 2032 results on 82 pages for 'drupal extensibility'.


  • Microsoft releases Visual Studio 2010 SP1

    - by brian_ritchie
    Microsoft has been beta testing SP1 since December of last year. Today, it was released to MSDN subscribers and will be available for public download on March 10, 2011. The service pack includes a slew of fixes, and a number of new features:

    • Silverlight 4 support
    • Basic Unit Testing support for the .NET Framework 3.5
    • Performance Wizard for Silverlight
    • IntelliTrace for 64-bit and SharePoint
    • IIS Express support
    • SQL CE 4 support
    • Razor support
    • HTML5 and CSS3 support (IntelliSense and validation)
    • WCF RIA Services V1 SP1 included
    • Visual Basic Runtime embedding
    • ALM improvements

    Of all the improvements, IIS Express probably has the largest impact on web developer productivity. According to Scott Gu, it provides the following:

    • It's lightweight and easy to install (less than 10MB download and a super quick install)
    • It does not require an administrator account to run/debug applications from Visual Studio
    • It enables a full web-server feature set - including SSL, URL Rewrite, Media Support, and all other IIS 7.x modules
    • It supports and enables the same extensibility model and web.config file settings that IIS 7.x supports
    • It can be installed side-by-side with the full IIS web server as well as the ASP.NET Development Server (they do not conflict at all)
    • It works on Windows XP and higher operating systems - giving you a full IIS 7.x developer feature-set on all OS platforms

    IIS Express (like the ASP.NET Development Server) can be quickly launched to run a site from a directory on disk. It does not require any registration/configuration steps, which makes it really easy to launch and run for development scenarios. Good stuff indeed. This will make our lives much easier. Thanks Microsoft...we're feeling the love!

    Read the article

  • Permit Communicator to display a web site with an ActiveX control in Vista or Windows 7

    - by dmo
    I have a UCMA service that sends a URL to load in a Communicator extensibility tab, as mentioned here: http://blogs.claritycon.com/blogs/michael_greenlee/archive/2009/02/19/context-windows-in-communicator-using-ucma-v2-0.aspx http://social.msdn.microsoft.com/Forums/en-US/ucclientsdk/thread/4406e412-01f1-466f-a593-3b83652dcdb1 This works fine with Office Communicator 2007 R2 for most trusted links, and on Windows XP and 2003 it works fine with a page containing an ActiveX control. However, the ActiveX content is not displayed on Vista or Windows 7. I've tried relaxing the security settings to no avail. Any suggestions or guidance would be much appreciated.

    Read the article

  • Using visitor pattern with large object hierarchy

    - by T. Fabre
    Context: I've been using a "pseudo" visitor pattern (pseudo, as in it does not use double dispatch) with a hierarchy of objects (an expression tree):

    public interface MyInterface
    {
        void Accept(SomeClass operationClass);
    }

    public class MyImpl : MyInterface
    {
        public void Accept(SomeClass operationClass)
        {
            operationClass.DoSomething();
            operationClass.DoSomethingElse();
            // ... and so on ...
        }
    }

    This design, however questionable, was pretty comfortable, since the number of implementations of MyInterface is significant (~50 or more) and I didn't need to add extra operations. Each implementation is unique (it's a different expression or operator), and some are composites (i.e., operator nodes that contain other operator/leaf nodes). Traversal is currently performed by calling the Accept operation on the root node of the tree, which in turn calls Accept on each of its child nodes, which in turn... and so on. But the time has come when I need to add a new operation, such as pretty printing:

    public class MyImpl : MyInterface
    {
        // Property does not come from MyInterface
        public string SomeProperty { get; set; }

        public void Accept(SomeClass operationClass)
        {
            operationClass.DoSomething();
            operationClass.DoSomethingElse();
            // ... and so on ...
        }

        public void Accept(SomePrettyPrinter printer)
        {
            printer.PrettyPrint(this.SomeProperty);
        }
    }

    I basically see two options:

    • Keep the same design, adding a new method for my operation to each derived class, at the expense of maintainability (not an option, IMHO).
    • Use the "true" Visitor pattern, at the expense of extensibility (not an option, as I expect to have more implementations coming along the way...), with about 50+ overloads of the Visit method, each one matching a specific implementation.

    Question: Would you recommend using the Visitor pattern? Is there any other pattern that could help solve this issue?
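    For reference, here is a minimal, self-contained C# sketch of what the "true" Visitor (the second option) looks like with double dispatch. The node names ConstantNode and AddNode are hypothetical stand-ins for the question's ~50 implementations:

    using System;

    public interface INode
    {
        void Accept(IVisitor visitor);
    }

    // One Visit overload per concrete node type - this is the part that grows
    // to ~50 overloads and hurts extensibility whenever a node type is added.
    public interface IVisitor
    {
        void Visit(ConstantNode node);
        void Visit(AddNode node);
    }

    public class ConstantNode : INode
    {
        public double Value { get; set; }
        // Double dispatch: the node itself selects the correct Visit overload.
        public void Accept(IVisitor visitor) { visitor.Visit(this); }
    }

    public class AddNode : INode
    {
        public INode Left { get; set; }
        public INode Right { get; set; }
        public void Accept(IVisitor visitor)
        {
            Left.Accept(visitor);   // traverse the left child
            visitor.Visit(this);    // then this operator node (infix order)
            Right.Accept(visitor);  // then the right child
        }
    }

    // Adding a new operation now means adding a class, not touching the nodes.
    public class PrettyPrinter : IVisitor
    {
        public void Visit(ConstantNode node) { Console.Write(node.Value); }
        public void Visit(AddNode node) { Console.Write(" + "); }
    }

    The trade-off the question describes is visible here: new operations become cheap (one new IVisitor implementation), but every new node type forces a change to IVisitor and all of its implementations.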

    Read the article

  • Customizing The Fusion Applications Simplified UI (aka FUSE)

    - by Richard Bingham
    Everyone who has seen it is impressed by the new Simplified UI, providing self-service workers the ease-of-use that they expect from a modern web application. As always, people want and need to make small adjustments, especially in the Cloud, and thankfully even in its first release there is good support for this. The main features are:

    • Configuring the branding and look-and-feel (known as a theme) of the Simplified UI.
    • Adding your own custom announcements to the Simplified UI homepage.
    • Using Page Composer to edit component properties, such as re-labelling text or hiding unwanted fields.
    • Using MDS Sandboxes to manage groups of related customizations.
    • Using Application Composer to adjust the fields available in certain Simplified UI pages.

    These are demonstrated in a video available as part of our YouTube channel, as well as being documented in the extensibility guide. As mentioned, this is the first release of this capability, so if there is something you're stuck on please use our forum and we'll try to help, or if you have a requirement for a new Simplified UI customization, please add a comment below.

    Read the article

  • When NOT to use a framework

    - by Chris
    Today, one can find a framework for just about any language, to suit just about any project. Most modern frameworks are fairly robust (generally speaking), with hour upon hour of testing, peer-reviewed code, and great extensibility. However, I think there is a downside to ANY framework in that programmers, as a community, may become so reliant upon their chosen frameworks that they no longer understand the underlying workings, or in the case of newer programmers, never learn the underlying workings to begin with. It is easy to become specialized to a degree that you are no longer a 'PHP programmer' (for example), but a "Drupal programmer", to the exclusion of anything else. Who cares, right? We have the framework! We don't need to know how to "do it by hand"! Right? The result of this loss of basic skills (sometimes to the extent that programmers who don't use frameworks are viewed as "outdated") is that it becomes common practice to use a framework where it is not required or appropriate. The features the framework facilitates wind up confused with what the base language is capable of. Developers start using frameworks to accomplish even the most basic of tasks, so that what was once considered a rudimentary process now involves large libraries with their own quirks, bugs, and dependencies. What was once accomplished in 20 lines is now accomplished by including a 20,000-line framework AND writing 20 lines to use the framework. Conversely, one does not want to reinvent the wheel. If I'm writing code to accomplish some basic, common little task, I might feel like I am wasting my time when I know that framework XYZ offers all the features I am after, and a whole lot more. The "whole lot more" part still has me worried, but it doesn't seem that many even consider it anymore. There has to be a good metric to determine when it is appropriate to use a framework. What do you consider the threshold to be? How do you decide when to use a framework, and when not to?

    Read the article

  • Trying to install PHP 5.2 on IIS/Win 2008 - Error 500

    - by Razor
    I have a fresh install of IIS 7 - I just added the Web Platform Installer, and PHP 5.2 through that. However, when trying to access a simple test.php file (it just has phpinfo() in it), I get the following list of errors:

    • IIS was not able to access the web.config file for the Web site or application. This can occur if the NTFS permissions are set incorrectly.
    • IIS was not able to process configuration for the Web site or application.
    • The authenticated user does not have permission to use this DLL.
    • The request is mapped to a managed handler but the .NET Extensibility Feature is not installed.

    The domain was created with DotNetPanel, but I don't think that has to do with this problem, unless maybe it uses a specific user? Maybe I need to add PHP through DotNetPanel? Any idea of what I'm doing wrong here?

    Read the article

  • Why not XHTML5?

    - by eegg
    So, HTML5 is the Big Step Forward, I'm told. The last step forward we took that I'm aware of was the introduction of XHTML. The advantages were obvious: simplicity, strictness, the ability to use standard XML parsers and generators to work with web pages, and so on. How strange and frustrating, then, that HTML5 rolls all that back: once again we're working with a non-standard syntax; once again, we have to deal with historical baggage and parsing complexity; once again we can't use our standard XML libraries, parsers, generators, or transformers; and all the advantages introduced by XML (extensibility, namespaces, standardization, and so on), that the W3C spent a decade pushing for good reasons, are lost. Fine, we have XHTML5, but it seems like it has not gained popularity like the HTML5 encoding has. See this SO question, for example. Even the HTML5 specification says that HTML5, not XHTML5, "is the format suggested for most authors." Do I have my facts wrong? Otherwise, why am I the only one that feels this way? Why are people choosing HTML5 over XHTML5?

    Read the article

  • Writing a code example

    - by Stefano Borini
    I would like to have your feedback regarding code examples. One of the most frustrating experiences I sometimes have when learning a new technology is finding useless examples. I think of an example as the most precious thing that comes with a new library, language, or technology. It must be a starting point, a wise and unadulterated explanation of how to achieve a given result. A perfect example must have the following characteristics:

    • Self-contained: it should be small enough to be compiled or executed as a single program, without dependencies or complex makefiles. An example is also a strong functional test of whether you correctly installed the new technology. The more issues that could arise, the more likely it is that something goes wrong, and the more difficult it is to debug and solve the situation.
    • Pertinent: it should demonstrate one, and only one, specific feature of your software/library, involving the minimal additional behavior from external libraries.
    • Helpful: the code should bring you forward, step by step, using comments or self-documenting code.
    • Extensible: the example code should be a small "framework" or blueprint for additional tinkering. A learner can start by adding features to this blueprint.
    • Recyclable: it should be possible to extract parts of the example to use in your own code.
    • Easy: an example is not the place to show your code-fu skillz. Keep it easy.

    A helpful acronym: SPHERE. Prototypical examples of violations of those rules are the following:

    • Violation of self-containedness: an example spanning multiple files without any real need for it. If your example is a Python program, keep everything in a single module file. Don't sub-modularize it. In Java, try to keep everything in a single class, unless you really must partition some entity into a meaningful object you need to pass around (and Java mandates one class per file, if I remember correctly).
    • Violation of pertinence: when showing how many different shapes you can draw, adding radio buttons and complex controls with all the possible choices for point shapes is a bad idea. You de-focus your example code by introducing code for event handling, control initialization, etc.; this is not part of the feature you want to demonstrate, and it is unnecessary noise in the understanding of the crucial mechanisms providing the feature.
    • Violation of helpfulness: code containing dubious naming, wrong comments, hacks, and functions longer than one page of code.
    • Violation of extensibility: badly factored code that has everything in a single function, with potentially swappable entities embedded within the code. Example: if an example reads data from a file and displays it, create a method getData() returning a useful entity, instead of opening the file raw and plotting the stuff. This way, if the user of the library needs to read data from an HTTP server instead, he just has to modify getData() and use the example almost as-is (see the sketch below). Another violation of extensibility occurs if the example code is not under a fully liberal (e.g. MIT or BSD) license.
    • Violation of recyclability: when the code layout is so intermingled that it is difficult to copy and paste parts of it and recycle them into another program. Again, licensing is also a factor.
    • Violation of easiness: yes, you are a functional-programming nerd and want to show how cool you are by doing everything on a single line of map, filter and so on, but that may not be helpful to someone else, who is already under pressure to understand your library and now has to understand your code as well.

    And in general, the final rule: if it takes more than 10 minutes to compile the code, run it, read the source, and understand it fully, the example is not a good one. Please let me know your opinion, either positive or negative, or your experience in this regard.
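    As a minimal C# illustration of the getData() seam described in the extensibility point above (the file name and member names are hypothetical):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Linq;

    public static class PlotExample
    {
        // The swappable seam: returns a useful entity (a sequence of numbers)
        // instead of exposing raw file handles to the rest of the example.
        static IEnumerable<double> GetData()
        {
            return File.ReadLines("data.txt").Select(double.Parse);
        }

        public static void Main()
        {
            // A crude text "plot" - one row of stars per value.
            foreach (var value in GetData())
                Console.WriteLine(new string('*', (int)value));
        }
    }

    A reader who needs to pull the data from an HTTP server instead only has to replace the body of GetData(); Main() and the rest of the example work unchanged.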

    Read the article

  • Magento Onepage Success Conversion Tracking Design Pattern

    - by user1734954
    My intent is to track conversions through multiple channels by inserting third-party JavaScript (for example Google Analytics, Optimizely, PriceGrabber, etc.) into the footer of onepage success. I've accomplished this by adding a block to the footer reference inside of the checkout success node within local.xml, and everything works appropriately. My questions are more about efficiency and extensibility. It occurred to me that it would be better to combine all of the blocks into a single block reference, and then use various methods acting on a single call to the various related models to provide the data needed for insertion into the JavaScript for each of the conversion tracking scripts. Some examples of the common data that conversion tracking may rely on (pseudo): Order ID, Order Total, Order.LineItem.Name (foreach), and so on. Currently, for each of the scripts, I've made a call to the appropriate model, passing the customer's last order id as the load value, then calling a get(), assigning the return value to a variable, and iterating through the data to match the values with the expectations of the given third-party service. All of the data should be pulled once when checkout is complete; each third-party service may expect different data in different formats. Here is an example of one of the conversion tracking template files, which loads at the footer of checkout success:

    <?php
    $order = Mage::getModel('sales/order')->loadByIncrementId(Mage::getSingleton('checkout/session')->getLastRealOrderId());
    $amount = number_format($order->getGrandTotal(), 2);
    $customer = Mage::helper('customer')->getCustomer()->getData();
    ?>
    <script type="text/javascript">
    popup_email = '<?php echo($customer['email']);?>';
    popup_order_number = '<?php echo $this->getOrderId() ?>';
    </script>
    <!-- PriceGrabber Merchant Evaluation Code -->
    <script type="text/javascript" charset="UTF-8" src="https://www.pricegrabber.com/rating_merchrevpopjs.php?retid=<something>"></script>
    <noscript><a href="http://www.pricegrabber.com/rating_merchrev.php?retid=<something>" target=_blank>
    <img src="https://images.pricegrabber.com/images/mr_noprize.jpg" border="0" width="272" height="238" alt="Merchant Evaluation"></a></noscript>
    <!-- End PriceGrabber Code -->

    Having just a single piece of code like this is not that big of a deal, but we are doing similar things with a number of different third-party services. PriceGrabber is one of the simpler examples. A more sophisticated tracking service expects a comma-separated list of all of the product names, ids, prices, categories, order id, and so on. I would like to make it all more manageable, so my idea is to do the following:

    • Combine all of the template files into a single file
    • Develop a helper class or library to deliver the data to the conversion template

    Goals include:

    • Extensibility
    • Minimal model calls
    • Minimal method calls

    The questions:

    1. Is a Mage helper the best route to take?
    2. Is there any design pattern you may recommend for the "helper" class?
    3. Why would the design pattern you've chosen be best for this instance?

    Read the article

  • Using JSON.NET for dynamic JSON parsing

    - by Rick Strahl
    With the release of ASP.NET Web API as part of .NET 4.5 and MVC 4.0, JSON.NET has effectively pushed out the .NET native serializers to become the default serializer for Web API. JSON.NET is vastly more flexible than the built-in DataContractJsonSerializer or the older JavaScript serializer. The DataContractSerializer in particular has been very problematic in the past because it can't deal with untyped objects for serialization - like values of type object, or anonymous types, which are quite common these days. The JavaScript Serializer that came before it actually does support non-typed objects for serialization, but it can't do anything with untyped data coming in from JavaScript, and its overall model of extensibility was pretty limited (JavaScript Serializer is what MVC uses for JSON responses). JSON.NET provides a robust JSON serializer that has both high-level and low-level components, supports binary JSON, JSON contracts, XML to JSON conversion, LINQ to JSON and many, many more features than either of the built-in serializers. ASP.NET Web API now uses JSON.NET as its default serializer, and it is now pulled in as a NuGet dependency into Web API projects, which is great.

    Dynamic JSON Parsing

    One of the features that I think is getting ever more important is the ability to serialize and deserialize arbitrary JSON content dynamically - that is, without mapping the JSON captured directly into a .NET type as the DataContractSerializer or the JavaScript serializers do. Sometimes it isn't possible to map types due to the differences in languages (think collections, dictionaries etc.), and other times you simply don't have the structures in place or don't want to create them to actually import the data. If this topic sounds familiar - you're right! I wrote about dynamic JSON parsing a few months back, before JSON.NET was added to Web API and when Web API and the System.Net HttpClient libraries included the System.Json classes like JsonObject and JsonArray. With the inclusion of JSON.NET in Web API these classes are now obsolete and didn't ship with Web API or the client libraries. I re-linked my original post to this one. In this post I'll discuss JToken, JObject and JArray, which are the dynamic JSON objects that make it very easy to create and retrieve JSON content on the fly without underlying types.

    Why Dynamic JSON?

    So, why dynamic JSON parsing rather than strongly typed parsing? Since applications are interacting more and more with third-party services, it becomes ever more important to have easy access to those services with easy JSON parsing. Sometimes it just makes a lot of sense to pull just a small amount of data out of a large JSON document received from a service, because the third-party service isn't directly related to your application's logic most of the time - and it makes little sense to map the entire service structure in your application. For example, recently I worked with the Google Maps Places API to return information about businesses close to my (or rather the app's) location. The Google API returns a ton of information that my application had no interest in - all I needed was a few values out of the data. Dynamic JSON parsing makes it possible to map this data without having to map the entire API to a C# data structure. Instead I could pull out the three or four values I needed from the API and directly store them on the business entities that needed to receive the data - no need to map the entire Maps API structure.
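    To make that scenario concrete, here is a minimal sketch that pulls just a few values out of a larger payload with JSON.NET's dynamic JSON objects. The payload shape is a hypothetical, heavily trimmed stand-in for a Places-style response:

    using System;
    using Newtonsoft.Json.Linq;

    class PlacesExample
    {
        static void Main()
        {
            // A tiny stand-in for a large third-party payload; the real
            // response contains far more fields than we care about.
            string json = @"{
                ""results"": [ {
                    ""name"": ""Joe's Diner"",
                    ""rating"": 4.5,
                    ""geometry"": { ""location"": { ""lat"": 21.3, ""lng"": -157.8 } }
                } ]
            }";

            // Pull out only the values we need - no C# types for the rest.
            JObject response = JObject.Parse(json);
            JToken first = response["results"][0];

            string name = first.Value<string>("name");
            double rating = first.Value<double>("rating");
            double lat = (double)first["geometry"]["location"]["lat"];

            Console.WriteLine("{0} ({1}) at {2}", name, rating, lat);
        }
    }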
    Getting JSON.NET

    The easiest way to use JSON.NET is to grab it via NuGet and add it as a reference to your project. You can add it to your project with:

    PM> Install-Package Newtonsoft.Json

    from the Package Manager Console, or by using Manage NuGet Packages in your project References. As mentioned, if you're using ASP.NET Web API or MVC 4, JSON.NET will be automatically added to your project. Alternately you can also go to the CodePlex site and download the latest version including source code: http://json.codeplex.com/

    Creating JSON on the fly with JObject and JArray

    Let's start with creating some JSON on the fly. It's super easy to create a dynamic object structure with any of the JToken-derived JSON.NET objects. The most common JToken-derived classes you are likely to use are JObject and JArray. JToken implements IDynamicMetaObjectProvider and so uses the dynamic keyword extensively to make it intuitive to create object structures and turn them into JSON via dynamic object syntax. Here's an example of creating a music album structure with child songs, using JObject for the base object and the songs, and JArray for the actual collection of songs:

    [TestMethod]
    public void JObjectOutputTest()
    {
        // strong typed instance
        var jsonObject = new JObject();

        // you can explicitly add values here using class interface
        jsonObject.Add("Entered", DateTime.Now);

        // or cast to dynamic to dynamically add/read properties
        dynamic album = jsonObject;
        album.AlbumName = "Dirty Deeds Done Dirt Cheap";
        album.Artist = "AC/DC";
        album.YearReleased = 1976;
        album.Songs = new JArray() as dynamic;

        dynamic song = new JObject();
        song.SongName = "Dirty Deeds Done Dirt Cheap";
        song.SongLength = "4:11";
        album.Songs.Add(song);

        song = new JObject();
        song.SongName = "Love at First Feel";
        song.SongLength = "3:10";
        album.Songs.Add(song);

        Console.WriteLine(album.ToString());
    }

    This produces a complete JSON structure:

    {
      "Entered": "2012-08-18T13:26:37.7137482-10:00",
      "AlbumName": "Dirty Deeds Done Dirt Cheap",
      "Artist": "AC/DC",
      "YearReleased": 1976,
      "Songs": [
        {
          "SongName": "Dirty Deeds Done Dirt Cheap",
          "SongLength": "4:11"
        },
        {
          "SongName": "Love at First Feel",
          "SongLength": "3:10"
        }
      ]
    }

    Notice that JSON.NET does a nice job formatting the JSON, so it's easy to read and paste into blog posts :-). JSON.NET includes a bunch of configuration options that control how JSON is generated. Typically the defaults are just fine, but you can override them with the JsonSerializerSettings object for most operations. The important thing about this code is that there's no explicit type used for holding the values to serialize to JSON. Rather, the JSON.NET objects are the containers that receive the data as I build up my JSON structure dynamically, simply by adding properties. This means this code can be entirely driven at runtime, without compile-time constraints on the structure of the JSON output. Here I use JObject to create an album 'object' and immediately cast it to dynamic. JObject() is kind of similar in behavior to ExpandoObject in that it allows you to add properties by simply assigning to them. Internally, JObject values are stored in pseudo collections of key/value pairs that are exposed as properties through the IDynamicMetaObjectProvider interface implemented in JSON.NET's JToken base class. For simple values the syntax is very clean - you add typed values as properties. For objects and arrays you have to explicitly create a new JObject or JArray, cast it to dynamic, and then add properties and items to it.
    Always remember, though, that these values are dynamic - which means no Intellisense and no compiler type checking. It's up to you to ensure that the names and values you create are accessed consistently and without typos in your code. Note that you can also access the JObject instance directly (not as dynamic) and get access to the underlying JObject type. This means you can assign properties by string, which can be useful for fully data-driven JSON generation from other structures. Below you can see both styles of access next to each other:

    // strong type instance
    var jsonObject = new JObject();

    // you can explicitly add values here
    jsonObject.Add("Entered", DateTime.Now);

    // expando style instance you can just 'use' properties
    dynamic album = jsonObject;
    album.AlbumName = "Dirty Deeds Done Dirt Cheap";

    JContainer (the base class for JObject and JArray) is a collection, so you can also iterate over the properties at runtime easily:

    foreach (var item in jsonObject)
    {
        Console.WriteLine(item.Key + " " + item.Value.ToString());
    }

    The functionality of the JSON objects is very similar to .NET's ExpandoObject, and if you have used it before, you're already familiar with how the dynamic interfaces to the JSON objects work.

    Importing JSON with JObject.Parse() and JArray.Parse()

    The JValue structure supports importing JSON via the Parse() and Load() methods, which can read JSON data from a string or various streams respectively. Essentially JValue includes the core JSON parsing to turn a JSON string into a collection of JToken objects that can then be referenced using familiar dynamic object syntax. Here's a simple example:

    public void JValueParsingTest()
    {
        var jsonString = @"{""Name"":""Rick"",""Company"":""West Wind"",
                            ""Entered"":""2012-03-16T00:03:33.245-10:00""}";

        dynamic json = JValue.Parse(jsonString);

        // values require casting
        string name = json.Name;
        string company = json.Company;
        DateTime entered = json.Entered;

        Assert.AreEqual(name, "Rick");
        Assert.AreEqual(company, "West Wind");
    }

    The JSON string represents an object with three properties, which is parsed into a JObject class and cast to dynamic. Once cast to dynamic, I can then go ahead and access the object using familiar object syntax. Note that the actual values - json.Name, json.Company, json.Entered - are actually of type JToken, and I have to cast them to their appropriate types first before I can do type comparisons, as in the Asserts at the end of the test method. This is required because of the way that dynamic types work, which can't determine the type based on the method signature of the Assert.AreEqual(object, object) method. I have to either assign the dynamic value to a variable as I did above, or explicitly cast ((string) json.Name) in the actual method call. The JSON structure can be much more complex than this simple example.
    Here's another example of an array of albums serialized to JSON and then parsed with JArray.Parse():

    [TestMethod]
    public void JsonArrayParsingTest()
    {
        var jsonString = @"[
        {
            ""Id"": ""b3ec4e5c"",
            ""AlbumName"": ""Dirty Deeds Done Dirt Cheap"",
            ""Artist"": ""AC/DC"",
            ""YearReleased"": 1976,
            ""Entered"": ""2012-03-16T00:13:12.2810521-10:00"",
            ""AlbumImageUrl"": ""http://ecx.images-amazon.com/images/I/61kTaH-uZBL._AA115_.jpg"",
            ""AmazonUrl"": ""http://www.amazon.com/gp/product/…ASIN=B00008BXJ4"",
            ""Songs"": [
                { ""AlbumId"": ""b3ec4e5c"", ""SongName"": ""Dirty Deeds Done Dirt Cheap"", ""SongLength"": ""4:11"" },
                { ""AlbumId"": ""b3ec4e5c"", ""SongName"": ""Love at First Feel"", ""SongLength"": ""3:10"" },
                { ""AlbumId"": ""b3ec4e5c"", ""SongName"": ""Big Balls"", ""SongLength"": ""2:38"" }
            ]
        },
        {
            ""Id"": ""7b919432"",
            ""AlbumName"": ""End of the Silence"",
            ""Artist"": ""Henry Rollins Band"",
            ""YearReleased"": 1992,
            ""Entered"": ""2012-03-16T00:13:12.2800521-10:00"",
            ""AlbumImageUrl"": ""http://ecx.images-amazon.com/images/I/51FO3rb1tuL._SL160_AA160_.jpg"",
            ""AmazonUrl"": ""http://www.amazon.com/End-Silence-Rollins-Band/dp/B0000040OX/ref=sr_1_5?ie=UTF8&qid=1302232195&sr=8-5"",
            ""Songs"": [
                { ""AlbumId"": ""7b919432"", ""SongName"": ""Low Self Opinion"", ""SongLength"": ""5:24"" },
                { ""AlbumId"": ""7b919432"", ""SongName"": ""Grip"", ""SongLength"": ""4:51"" }
            ]
        }
        ]";

        JArray jsonVal = JArray.Parse(jsonString) as JArray;
        dynamic albums = jsonVal;

        foreach (dynamic album in albums)
        {
            Console.WriteLine(album.AlbumName + " (" + album.YearReleased.ToString() + ")");
            foreach (dynamic song in album.Songs)
            {
                Console.WriteLine("\t" + song.SongName);
            }
        }

        Console.WriteLine(albums[0].AlbumName);
        Console.WriteLine(albums[0].Songs[1].SongName);
    }

    JObject and JArray in ASP.NET Web API

    Of course these types also work in ASP.NET Web API controller methods. If you want, you can accept parameters using these objects or return them from the server. The following contrived example receives dynamic JSON input, and then creates a new dynamic JSON object and returns it based on data from the first:

    [HttpPost]
    public JObject PostAlbumJObject(JObject jAlbum)
    {
        // dynamic input from inbound JSON
        dynamic album = jAlbum;

        // create a new JSON object to write out
        dynamic newAlbum = new JObject();

        // Create properties on the new instance
        // with values from the first
        newAlbum.AlbumName = album.AlbumName + " New";
        newAlbum.NewProperty = "something new";
        newAlbum.Songs = new JArray();

        foreach (dynamic song in album.Songs)
        {
            song.SongName = song.SongName + " New";
            newAlbum.Songs.Add(song);
        }

        return newAlbum;
    }

    The raw POST request to the server looks something like this:

    POST http://localhost/aspnetwebapi/samples/PostAlbumJObject HTTP/1.1
    User-Agent: Fiddler
    Content-type: application/json
    Host: localhost
    Content-Length: 88

    {AlbumName: "Dirty Deeds",Songs:[ { SongName: "Problem Child"},{ SongName: "Squealer"}]}

    and the output that comes back looks like this:

    {
      "AlbumName": "Dirty Deeds New",
      "NewProperty": "something new",
      "Songs": [
        {
          "SongName": "Problem Child New"
        },
        {
          "SongName": "Squealer New"
        }
      ]
    }

    The original values are echoed back with something extra appended to demonstrate that we're working with a new object.
    When you receive or return a JObject, JValue, JToken or JArray instance in a Web API method, Web API ignores normal content negotiation and assumes your content is going to be received and returned as JSON - so effectively the parameter and result type explicitly determines the input and output format, which is nice.

    Dynamic to Strong Type Mapping

    You can also map JObject and JArray instances to a strongly typed object, so you can mix dynamic and static typing in the same piece of code. Using the two-album jsonString shown earlier, the code below takes an array of albums, picks out a single album, and casts that album to a static Album instance.

    [TestMethod]
    public void JsonParseToStrongTypeTest()
    {
        JArray albums = JArray.Parse(jsonString) as JArray;

        // pick out one album
        JObject jalbum = albums[0] as JObject;

        // Copy to a static Album instance
        Album album = jalbum.ToObject<Album>();

        Assert.IsNotNull(album);
        Assert.AreEqual(album.AlbumName, jalbum.Value<string>("AlbumName"));
        Assert.IsTrue(album.Songs.Count > 0);
    }

    This is pretty damn useful for the scenario I mentioned earlier - you can read a large chunk of JSON and dynamically walk the property hierarchy down to the item you want to access, and then either access the specific item dynamically (as shown earlier) or map a part of the JSON to a strongly typed object. That's very powerful if you think about it - it leaves you in total control to decide what's dynamic and what's static.

    Strongly typed JSON Parsing

    With all this talk of dynamic, let's not forget that JSON.NET of course also does strongly typed serialization, which is drop dead easy. Here's a simple example of how to serialize and deserialize an object with JSON.NET:

    [TestMethod]
    public void StronglyTypedSerializationTest()
    {
        // Build a strongly typed object graph to serialize
        var album = new Album()
        {
            AlbumName = "Dirty Deeds Done Dirt Cheap",
            Artist = "AC/DC",
            Entered = DateTime.Now,
            YearReleased = 1976,
            Songs = new List<Song>()
            {
                new Song() { SongName = "Dirty Deeds Done Dirt Cheap", SongLength = "4:11" },
                new Song() { SongName = "Love at First Feel", SongLength = "3:10" }
            }
        };

        // serialize to string
        string json2 = JsonConvert.SerializeObject(album, Formatting.Indented);
        Console.WriteLine(json2);

        // make sure we can deserialize back
        var album2 = JsonConvert.DeserializeObject<Album>(json2);
        Assert.IsNotNull(album2);
        Assert.IsTrue(album2.AlbumName == "Dirty Deeds Done Dirt Cheap");
        Assert.IsTrue(album2.Songs.Count == 2);
    }

    JsonConvert is a high-level static class that wraps lower-level functionality, but you can also use the JsonSerializer class, which allows you to serialize/parse to and from streams. It's a little more work, but gives you a bit more control. The functionality available is easy to discover with Intellisense, and that's good because there's not a lot in the way of documentation that's actually useful.

    Summary

    JSON.NET is a pretty complete JSON implementation with lots of different choices for JSON parsing, from dynamic parsing to static serialization, to complex querying of JSON objects using LINQ. It's good to see this open source library getting integrated into .NET, and pushing out the old and tired stock .NET parsers so that we finally have a bit more flexibility - and extensibility - in our JSON parsing. Good to go!
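    As a companion sketch to the JsonSerializer class mentioned above - a minimal example reusing the Album type from the earlier samples; the file path is an arbitrary assumption - serializing to and from a stream looks roughly like this:

    using System.IO;
    using Newtonsoft.Json;

    public static class AlbumJsonIo
    {
        // Serialize an Album to a file using the lower-level JsonSerializer,
        // which works against streams rather than strings.
        public static void SaveAlbum(Album album, string path)
        {
            var serializer = new JsonSerializer { Formatting = Formatting.Indented };
            using (var file = File.CreateText(path))
            using (var writer = new JsonTextWriter(file))
            {
                serializer.Serialize(writer, album);
            }
        }

        // Read the Album back from the same file.
        public static Album LoadAlbum(string path)
        {
            var serializer = new JsonSerializer();
            using (var file = File.OpenText(path))
            using (var reader = new JsonTextReader(file))
            {
                return serializer.Deserialize<Album>(reader);
            }
        }
    }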
    Resources: Sample Test Project, http://json.codeplex.com/

    © Rick Strahl, West Wind Technologies, 2005-2012. Posted in .NET, Web API, AJAX

    Read the article

  • Visual Studio 2010 and .NET 4 Released

    - by ScottGu
    The final release of Visual Studio 2010 and .NET 4 is now available.

    Download and Install Today

    MSDN subscribers, as well as WebsiteSpark/BizSpark/DreamSpark members, can now download the final releases of Visual Studio 2010 and TFS 2010 through the MSDN subscribers download center. If you are not an MSDN subscriber, you can download free 90-day trial editions of Visual Studio 2010. Or you can download the free Visual Studio Express editions of Visual Web Developer 2010, Visual Basic 2010, Visual C# 2010 and Visual C++. These Express editions are available completely for free (and never time out). If you are looking for an easy way to set up a new machine for web development, you can automate installing ASP.NET 4, ASP.NET MVC 2, IIS, SQL Server Express and Visual Web Developer 2010 Express really quickly with the Microsoft Web Platform Installer (just click the install button on the page).

    What is new with VS 2010 and .NET 4

    Today's release is a big one - and brings with it a ton of new features and capabilities. One of the things we tried hard to focus on with this release was to invest heavily in making existing applications, projects and developer experiences better. What this means is that you don't need to read 1000+ page books or spend time learning major new concepts in order to take advantage of the release. There are literally thousands of improvements (both big and small) that make you more productive and successful without having to learn big new concepts in order to start using them. Below is just a small sampling of some of the improvements with this release:

    Visual Studio 2010 IDE

    Visual Studio 2010 now supports multiple monitors (enabling much better use of screen real-estate). It has new code Intellisense support that makes it easier to find and use classes and methods. It has improved code navigation support for searching code-bases and seeing how code is called and used. It has new code visualization support that allows you to see the relationships across projects and classes within projects, as well as to automatically generate sequence diagrams to chart execution flow. The editor now supports HTML and JavaScript snippet support as well as improved JavaScript Intellisense. The VS 2010 debugger and profiling support is now much, much richer, and enables new features like IntelliTrace (aka historical debugging), debugging of crash/dump files, and better parallel debugging. VS 2010's multi-targeting support is now much richer, and enables you to use VS 2010 to target .NET 2, .NET 3, .NET 3.5 and .NET 4 applications. And the infamous Add Reference dialog now loads much faster. TFS 2010 is now easy to set up (you can now install the server in under 10 minutes) and enables great source-control, bug/work-item tracking, and continuous integration support. Testing (both automated and manual) is now much, much richer. And VS 2010 Premium and Ultimate provide much richer architecture and design tooling support.

    VB and C# Language Features

    VB and C# in VS 2010 both contain a bunch of new features and capabilities. VB adds new support for automatic properties, collection initializers, and implicit line continuation support, among many other features. C# adds support for optional parameters and named arguments, a new dynamic keyword, and co-variance and contra-variance, among many other features.

    ASP.NET 4 and ASP.NET MVC 2

    With ASP.NET 4, Web Forms controls now render clean, semantically correct, and CSS friendly HTML markup.
    Built-in URL routing functionality allows you to expose clean, search engine friendly URLs and increase the traffic to your Website. ViewState within applications can now be more easily controlled and made smaller. ASP.NET Dynamic Data support has been expanded. More controls, including rich charting and data controls, are now built into ASP.NET 4 and enable you to build applications even faster. New starter project templates now make it easier to get going with new projects. SEO enhancements make it easier to drive traffic to your public facing sites. And web.config files are now clean and simple. ASP.NET MVC 2 is now built into VS 2010 and ASP.NET 4, and provides a great way to build web sites and applications using a model-view-controller based pattern. ASP.NET MVC 2 adds features to easily enable client and server validation logic, and provides new strongly-typed HTML and UI-scaffolding helper methods. It also enables more modular/reusable applications. The new <%: %> syntax in ASP.NET makes it easier to HTML encode output. Visual Studio 2010 also now includes better tooling support for unit testing and TDD. In particular, "consume first intellisense" and "generate from usage" support within VS 2010 make it easier to write your unit tests first, and then drive your implementation from them. Deploying ASP.NET applications gets a lot easier with this release. You can now publish your Websites and applications to a staging or production server from within Visual Studio itself. Visual Studio 2010 makes it easy to transfer all your files, code, configuration, database schema and data in one complete package. VS 2010 also makes it easy to manage separate web.config configuration file settings depending upon whether you are in debug, release, staging or production modes.

    WPF 4 and Silverlight 4

    WPF 4 includes a ton of new improvements and capabilities, including more built-in controls, richer graphics features (cached composition, pixel shader 3 support, layout rounding, and animation easing functions), and a much improved text stack (with crisper text rendering, custom dictionary support, and selection and caret brush options). WPF 4 also includes a bunch of support to enable you to take advantage of new Windows 7 features - including multi-touch and Windows 7 shell integration. Silverlight 4 will launch this week as well. You can watch my Silverlight 4 launch keynote streamed live Tuesday (April 13th) at 8am Pacific Time. Silverlight 4 includes a ton of new capabilities - including a bunch for making it possible to build great business applications and out-of-the-browser applications. I'll be doing a separate blog post later this week (once it is live on the web) that talks more about its capabilities. Visual Studio 2010 now includes great tooling support for both WPF and Silverlight. The new VS 2010 WPF and Silverlight designer makes it much easier to build client applications and great line-of-business solutions, as well as to integrate and bind with data. Tooling support for Silverlight 4 with the final release of Visual Studio 2010 will be available when Silverlight 4 releases to the web this week.

    SharePoint and Azure

    Visual Studio 2010 now includes built-in support for building SharePoint applications. You can now create, edit, build, and debug SharePoint applications directly within Visual Studio 2010. You can also now use SharePoint with TFS 2010.
    Support for creating Azure-hosted applications is also now included with VS 2010 - allowing you to build ASP.NET and WCF based applications and host them within the cloud.

    Data Access

    Data access has a lot of improvements coming to it with .NET 4. Entity Framework 4 includes a ton of new features and capabilities - including support for model-first and POCO development, default support for lazy loading, built-in support for pluralization/singularization of table/property names within the VS 2010 designer, full support for all the LINQ operators, the ability to optionally expose foreign keys on model objects (useful for some stateless web scenarios), disconnected API support to better handle N-Tier and stateless web scenarios, and T4 template customization support within VS 2010 to allow you to customize and automate how code is generated for you by the data designer. In addition to improvements with the Entity Framework, LINQ to SQL with .NET 4 also includes a bunch of nice improvements.

    WCF and Workflow

    WCF includes a bunch of great new capabilities - including better REST, activation and configuration support. WCF Data Services (formerly known as Astoria) and WCF RIA Services also now enable you to easily expose and work with data from remote clients. Windows Workflow is now much faster, includes flowchart services, and now makes it easier to build custom services than before. More details can be found here.

    CLR and Core .NET Library Improvements

    .NET 4 includes the new CLR 4 engine - which includes a lot of nice performance and feature improvements. The CLR 4 engine now runs side-by-side in-process with older versions of the CLR - allowing you to use two different versions of .NET within the same process. It also includes improved COM interop support. The .NET 4 base class libraries (BCL) include a bunch of nice additions and refinements. In particular, the .NET 4 BCL now includes new parallel programming support that makes it much easier to build applications that take advantage of multiple CPUs and cores on a computer. This work dove-tails nicely with the new VS 2010 parallel debugger (making it much easier to debug parallel applications), as well as the new F# functional language support now included in the VS 2010 IDE. .NET 4 also now has the Dynamic Language Runtime (DLR) library built in - which makes it easier to use dynamic language functionality with .NET. MEF - a really cool library that enables rich extensibility - is also now built into .NET 4 and included as part of the base class libraries.

    .NET 4 Client Profile

    The download size of the .NET 4 redist is now much smaller than it was before (the x86 full .NET 4 package is about 36MB). We also now have a .NET 4 Client Profile package, which is a pure sub-set of the full .NET that can be used to streamline client application installs.

    C++

    VS 2010 includes a bunch of great improvements for C++ development. This includes better C++ Intellisense support, MSBuild support for projects, improved parallel debugging and profiler support, MFC improvements, and a number of language features and compiler optimizations.

    My VS 2010 and .NET 4 Blog Series

    I've been cranking away on a blog series the last few months that highlights many of the new VS 2010 and .NET 4 improvements. The good news is that I have about 20 in-depth posts already written. The bad news (for me) is that I have about 200 more to go until I'm done!
    I'm going to try and keep adding a few more each week over the next few months to discuss the new improvements and how best to take advantage of them. Below is a list of the already written ones that you can check out today:

    • Clean Web.Config Files
    • Starter Project Templates
    • Multi-targeting
    • Multiple Monitor Support
    • New Code Focused Web Profile Option
    • HTML / ASP.NET / JavaScript Code Snippets
    • Auto-Start ASP.NET Applications
    • URL Routing with ASP.NET 4 Web Forms
    • Searching and Navigating Code in VS 2010
    • VS 2010 Code Intellisense Improvements
    • WPF 4
    • Add Reference Dialog Improvements
    • SEO Improvements with ASP.NET 4
    • Output Cache Extensibility with ASP.NET 4
    • Built-in Charting Controls for ASP.NET and Windows Forms
    • Cleaner HTML Markup with ASP.NET 4 - Client IDs
    • Optional Parameters and Named Arguments in C# 4 - and a cool scenario with ASP.NET MVC 2
    • Automatic Properties, Collection Initializers and Implicit Line Continuation Support with VB 2010
    • New <%: %> Syntax for HTML Encoding Output using ASP.NET 4
    • JavaScript Intellisense Improvements with VS 2010

    Stay tuned to my blog as I post more. Also check out this page, which links to a bunch of great articles and videos done by others.

    VS 2010 Installation Notes

    If you have installed a previous version of VS 2010 on your machine (either the beta or the RC) you must first uninstall it before installing the final VS 2010 release. I also recommend uninstalling the .NET 4 betas (including both the client and full .NET 4 installs) as well as the other installs that come with VS 2010 (e.g. ASP.NET MVC 2 preview builds, etc). The uninstalls of the betas/RCs will clean up all the old state on your machine - after which you can install the final VS 2010 version and should have everything just work (this is what I've done on all of my machines and I haven't had any problems). The VS 2010 and .NET 4 installs add a bunch of new managed assemblies to your machine. Some of these will be "NGEN'd" to native code during the actual install process (making them run fast). To avoid adding too much time to VS setup, though, we don't NGEN all assemblies immediately - and instead will NGEN the rest in the background when your machine is idle. Until it finishes NGENing the assemblies, they will be JIT'd to native code the first time they are used in a process - which for large assemblies can sometimes cause a slight performance hit. If you run into this you can manually force all assemblies to be NGEN'd to native code immediately (and not just wait till the machine is idle) by launching the Visual Studio command line prompt from the Windows Start Menu (Microsoft Visual Studio 2010->Visual Studio Tools->Visual Studio Command Prompt). Within the command prompt type "Ngen executequeueditems" - this will cause everything to be NGEN'd immediately.

    How to Buy Visual Studio 2010

    You can download and use the free Visual Studio Express editions of Visual Web Developer 2010, Visual Basic 2010, Visual C# 2010 and Visual C++. These Express editions are available completely for free (and never time out). You can buy a new copy of VS 2010 Professional that includes a 1-year subscription to MSDN Essentials for $799. MSDN Essentials includes a developer license of Windows 7 Ultimate, Windows Server 2008 R2 Enterprise, SQL Server 2008 DataCenter R2, and 20 hours of Azure hosting time. Subscribers also have access to MSDN's Online Concierge, and Priority Support in MSDN Forums. Upgrade prices from previous releases of Visual Studio are also available.
    Existing Visual Studio 2005/2008 Standard customers can upgrade to Visual Studio 2010 Professional for a special $299 retail price until October. You can take advantage of this VS Standard->Professional upgrade promotion here. Web developers who build applications for others, and who are either independent developers or who work for companies with fewer than 10 employees, can also optionally take advantage of the Microsoft WebSiteSpark program. This program gives you three copies of Visual Studio 2010 Professional, 1 copy of Expression Studio, and 4 CPU licenses of both Windows 2008 R2 Web Server and SQL 2008 Web Edition that you can use to both develop and deploy applications with, at no cost for 3 years. At the end of the 3 years there is no obligation to buy anything. You can sign up for WebSiteSpark today in under 5 minutes - and immediately have access to the products to download.

    Summary

    Today's release is a big one - and has a bunch of improvements for pretty much every developer. Thank you everyone who provided feedback, suggestions and reported bugs throughout the development process - we couldn't have delivered it without you. Hope this helps,

    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Using an alternate search platform in Commerce Server 2009

    - by Lewis Benge
    Although Microsoft Commerce Server 2009's architecture is built upon Microsoft SQL Server, and has the full power of the SQL Full Text Indexing search platform, there are times, however, when you may require a richer or alternate search platform. One of these scenarios is when you want to implement a faceted (refinement) search on your site, which provides dynamic refinements based on the search results dataset. Faceted search is becoming popular in most online retail environments as a way of providing an enhanced user experience when browsing a larger catalogue. This is powerful for two reasons. Firstly, with a traditional search it is down to the user to think of a search term suitable for the product they are trying to find; this typically will not return similar products or help in any way to refine a larger dataset. Faceted searches, on the other hand, provide a comprehensive list of product properties, grouped together by similarity, to help the user narrow down the results returned. As the user progressively restricts the search criteria by selecting additional facets, these facets need to continually refresh. The whole experience allows users to explore alternate brands, price ranges, or find products they hadn't initially thought of or were looking for, in a bid to enhance cross-sell in the retail environment. The second advantage of this type of search, from a business perspective, is the ability to harvest the search results to start to profile your user. Even though anonymous users may routinely visit your site, and will not necessarily register or complete a transaction to build up marketing profile data, you can still achieve the same result by recording the search facets used within the search sequence. Below is a faceted search scenario generated from eBay using the search term "server". By creating a search profile from clicking through Computer & Networking -> Servers -> Dell -> New, and recording this information against my user profile, you can start to predict with a lot more certainty what types of products I am interested in. This will allow you to apply shopping-cart analysis against your search data and provide great cross-sell or advertising opportunities, or personalise the user experience based on your prediction of what the user may be interested in. This type of search is extremely beneficial in e-commerce environments, but achieving it out of the box with Commerce Server and SQL Full Text Indexing can be challenging. In many deployments it is often easier to use an alternate search platform such as Microsoft's FAST, Apache Solr, or Endeca; however, you still want these products to integrate natively into Commerce Server to ensure that up-to-date inventory information is presented, profile information is generated, and you provide a consistent API. To do so we make the most of the Commerce Server extensibility points called operation sequence components. In this example I will be talking about Apache Solr hosted on Apache Tomcat; specifically, I have used the SolrNet C# library to interface with the Java platform. Also, I am not going to talk about Solr indexing configuration, but in a production environment this would typically happen by using PowerShell to call the Commerce Server management web service to export your catalog as XML, applying an XSLT transform to the file to make it conform to Solr, and using a simple HTTP POST to send it to the search engine for indexing (a rough sketch of that last step follows).
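    As a rough illustration of that indexing step - the URL and payload here are assumptions based on a default Solr install, not part of the original article - pushing a pre-transformed XML payload to Solr's update handler from C# can look like this:

    using System;
    using System.Net;

    class SolrIndexer
    {
        static void Main()
        {
            // The catalog export after the XSLT transform (hypothetical content).
            string transformedCatalogXml =
                "<add><doc><field name=\"id\">PROD-1</field>" +
                "<field name=\"title\">Example Product</field></doc></add>";

            // Default Solr update handler URL; adjust for your instance.
            string solrUpdateUrl = "http://localhost:8983/solr/update";

            using (var client = new WebClient())
            {
                client.Headers[HttpRequestHeader.ContentType] = "text/xml";
                client.UploadString(solrUpdateUrl, transformedCatalogXml); // index the documents
                client.UploadString(solrUpdateUrl, "<commit/>");           // make them searchable
            }
        }
    }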
    Essentially, a sequence component is a step in a serial workflow used to call a data repository (which in most cases is the Commerce Server pipelines or databases) and map to and from a Commerce Entity object whilst enforcing any business rules. So the first step in the process is to add a new class library to your existing Commerce Server site. You will need to use a new library, as sequence components need to be strongly named to be deployed. Once you are inside your new project, add a new class file and add references to the Microsoft.Commerce.Providers, Microsoft.Commerce.Contracts and Microsoft.Commerce.Broker assemblies. Now make your new class derive from the base class Microsoft.Commerce.Providers.Components.OperationSequenceComponent and override the ExecuteQuery method. Your class will then look something similar to this (screenshot omitted). As all we are doing in this component is conducting a search, we are only interested in the ExecuteQuery method. This method accepts three arguments: queryOperation, operationCache, and response. The queryOperation will be the object in which we receive our search parameters, the cache allows access to the Commerce Server cache (allowing us to store regularly accessed information), and the response object is the object on which we will return the result of our search. Inside this method is simply where we are going to inject the logic for our third-party search platform. As I am not going to explain the inner workings of actually making a Solr call, I'll simply provide the sample code here. I would highly recommend, however, looking at the SolrNet wiki, as it has some great explanations of how the API works. What you will find, however, is that there are some further extensions required when attempting to integrate a custom search provider. Firstly, out of the box, the CommerceQueryOperation you will receive into the method when conducting a search against a catalog is specifically geared towards a SQL Full Text search, with properties such as a Where clause. To make the operation you receive more relevant, you will need to create another class, this time derived from Microsoft.Commerce.Contracts.Messages.CommerceSearchCriteria, and within this you need to detail the properties you will require to submit as parameters to the Solr search API. My example looks like this:

    [DataContract(Namespace = "http://schemas.microsoft.com/microsoft-multi-channel-commerce-foundation/types/2008/03")]
    public class CommerceCatalogSolrSearch : CommerceSearchCriteria
    {
        private Dictionary<string, string> _facetQueries;

        public CommerceCatalogSolrSearch()
        {
            _facetQueries = new Dictionary<String, String>();
        }

        public Dictionary<String, String> FacetQueries
        {
            get { return _facetQueries; }
            set { _facetQueries = value; }
        }

        public String SearchPhrase { get; set; }
        public int PageIndex { get; set; }
        public int PageSize { get; set; }
        public IEnumerable<String> Facets { get; set; }
        public string Sort { get; set; }

        public new int FirstItemIndex
        {
            get { return (PageIndex - 1) * PageSize; }
        }

        public int LastItemIndex
        {
            get { return FirstItemIndex + PageSize; }
        }
    }

    To allow you to construct a CommerceQueryOperation call within the API, you will also need to construct another class, this time derived from Microsoft.Commerce.Common.MessageBuilders.CommerceSearchCriteriaBuilder, which is simply used to construct an instance of the search criteria class you have just created and expose the properties you want set.
    My message builder looks like this:

    public class CommerceCatalogSolrSearchBuilder : CommerceSearchCriteriaBuilder
    {
        private CommerceCatalogSolrSearch _solrSearch;

        public CommerceCatalogSolrSearchBuilder()
        {
            _solrSearch = new CommerceCatalogSolrSearch();
        }

        public String SearchPhrase
        {
            get { return _solrSearch.SearchPhrase; }
            set { _solrSearch.SearchPhrase = value; }
        }

        public int PageIndex
        {
            get { return _solrSearch.PageIndex; }
            set { _solrSearch.PageIndex = value; }
        }

        public int PageSize
        {
            get { return _solrSearch.PageSize; }
            set { _solrSearch.PageSize = value; }
        }

        public Dictionary<String, String> FacetQueries
        {
            get { return _solrSearch.FacetQueries; }
            set { _solrSearch.FacetQueries = value; }
        }

        public String[] Facets
        {
            get { return _solrSearch.Facets.ToArray(); }
            set { _solrSearch.Facets = value; }
        }

        public override CommerceSearchCriteria ToSearchCriteria()
        {
            return _solrSearch;
        }
    }

    Once you have these two classes in place, you can safely cast the CommerceOperation you receive as an argument of the overridden ExecuteQuery method in the sequence component to retrieve the CommerceCatalogSolrSearch criteria you have just created, e.g.:

    public CommerceCatalogSolrSearch TryGetSearchCriteria(CommerceOperation operation)
    {
        var searchCriteria = operation as CommerceQueryOperation;
        if (searchCriteria == null)
            throw new Exception("No search criteria present");

        var local = (CommerceCatalogSolrSearch) searchCriteria.SearchCriteria;
        if (local == null)
            throw new Exception("Unexpected Search Criteria in Operation");

        return local;
    }

    Now that you have all of your search parameters present, you can go off and call the external search platform API. You will of course get proprietary objects returned, so the next step in the process is to convert the results being returned back into Commerce Entities. You do this via another extensibility point within the Commerce Server API called translators. A translator is another separate class, this time implementing the interface Microsoft.Commerce.Providers.Translators.IToCommerceEntityTranslator. As you can imagine, this interface is specific to the conversion of an object TO a CommerceEntity; you will need to implement a separate interface if you also need to go in the opposite direction. If you implement the required method for the interface, you will get a single Translate method which takes a source object, a destination CommerceEntity, and a collection of properties as arguments. For simplicity's sake, in this example I have hard-coded the mappings; however, best practice would dictate that you map the objects using your MetaDataDefinitions.xml file.
Returning to the translation step: once complete, your translator would look something like the following:

public class SolrEntityTranslator : IToCommerceEntityTranslator
{
    #region IToCommerceEntityTranslator Members

    public void Translate(object source, CommerceEntity destinationCommerceEntity, CommercePropertyCollection propertiesToReturn)
    {
        if (source.GetType().Equals(typeof(SearchProduct)))
        {
            var searchResult = (SearchProduct)source;

            destinationCommerceEntity.Id = searchResult.ProductId;
            destinationCommerceEntity.SetPropertyValue("DisplayName", searchResult.Title);
            destinationCommerceEntity.ModelName = "Product";
        }
    }

    #endregion
}

Once you have a translator in place, you can safely map the results from your search platform into Commerce Entities and attach them to the CommerceResponse object in a fashion similar to this:

// _returnModelName and _queryOperation are fields set elsewhere in the component.
foreach (SearchProduct result in matchingProducts)
{
    var destinationEntity = new CommerceEntity(_returnModelName);

    Translator.ToCommerceEntity(result, destinationEntity, _queryOperation.Model.Properties);
    response.CommerceEntities.Add(destinationEntity);
}

In SOLR I actually have two objects being returned, a product and a collection of facets, so I have an additional translator for facets (which maps to a custom facet CommerceEntity), and my facet response from SOLR is passed to the translator helper class separately.

When all of this is pieced together, you have successfully completed the extensibility-point coding. You will have created a new OperationSequenceComponent, a custom SearchCriteria object with its message builder class, and translators to convert the search results into Commerce Entities. Now you simply need to configure them, and you can start calling them from your code. Make sure you sign your assembly, compile it and take note of its strong name. Next, add a reference to your new assembly to the Channel.Config configuration file, replacing that of the existing SQL full-text component, and add your translators to the Translators node of Channel.Config too (a rough sketch of both entries appears at the end of this post). Lastly, add any custom CommerceEntities you have developed to your MetadataDefinitions.xml file.

Your configuration is now complete, and you should be able to happily make a call to the Commerce Foundation API, which will act as a proxy to your third-party search platform and return CommerceEntities of your search results. If you require data to be enriched, logged, or any other logic applied, simply add further sequence components into the OperationSequence node of your Channel.Config file (obviously keeping the search component first). Now, to call your code, you simply request it as per any other CommerceQuery operation, but taking into account that you may receive multiple types of CommerceEntity back:

public KeyValuePair<FacetCollection, List<Product>> DoFacetedProductQuerySearch(string searchPhrase, string orderKey, string sortOrder, int recordIndex, int recordsPerPage, Dictionary<string, string> facetQueries, out int totalItemCount)
{
    var products = new List<Product>();
    var query = new CommerceQuery<CatalogEntity, CommerceCatalogSolrSearchBuilder>();

    query.SearchCriteria.PageIndex = recordIndex;
    query.SearchCriteria.PageSize = recordsPerPage;
    query.SearchCriteria.SearchPhrase = searchPhrase;
    query.SearchCriteria.FacetQueries = facetQueries;

    totalItemCount = 0;
    CommerceResponse response = SiteContext.ProcessRequest(query.ToRequest());
    var queryResponse = response.OperationResponses[0] as CommerceQueryOperationResponse;
    // No results: return an empty pair. (The null guard also covers an
    // unexpected response type, which would otherwise throw below.)
    if (queryResponse == null || queryResponse.CommerceEntities.Count == 0)
        return new KeyValuePair<FacetCollection, List<Product>>();

    totalItemCount = (int)queryResponse.TotalItemCount;

    // Prepare a multi-operation to retrieve the product variants.
    var multiOperation = new CommerceMultiOperation();

    // Add products to the results.
    foreach (Product product in queryResponse.CommerceEntities.Where(x => x.ModelName == "Product"))
    {
        var productQuery = new CommerceQuery<Product>(Product.ModelNameDefinition);
        productQuery.SearchCriteria.Model.Id = product.Id;
        productQuery.SearchCriteria.Model.CatalogId = product.CatalogId;

        var variantQuery = new CommerceQueryRelatedItem<Variant>(Product.RelationshipName.Variants);
        productQuery.RelatedOperations.Add(variantQuery);

        multiOperation.Add(productQuery);
    }

    CommerceResponse variantsResponse = SiteContext.ProcessRequest(multiOperation.ToRequest());
    foreach (CommerceQueryOperationResponse queryOpResponse in variantsResponse.OperationResponses)
    {
        if (queryOpResponse.CommerceEntities.Count() > 0)
            products.Add(queryOpResponse.CommerceEntities[0]);
    }

    // Get the facet collection.
    FacetCollection facetCollection = queryResponse.CommerceEntities.Where(x => x.ModelName == "FacetCollection").FirstOrDefault();

    return new KeyValuePair<FacetCollection, List<Product>>(facetCollection, products);
}

..And that is it: simply a few classes and some configuration allow you to extend the Commerce Server query operations to call a third-party search platform, whilst still maintaining a unified API in the remainder of your code. The same approach works for any extensibility within Commerce Server that requires execution in a serial fashion, such as calls to LOB systems or web services to validate or enrich data. Feel free to use this example in other applications, and if you have any questions please feel free to e-mail me and I'll help out where I can!
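As a footnote, the Channel.Config entries mentioned above were shown as screenshots in the original post. The fragment below is a hedged reconstruction for orientation only; every element name, attribute and type string here is an assumption and must be checked against the Channel.Config that ships with your Commerce Server site:

<!-- Illustrative only: swap the SQL full-text search component for the
     custom SOLR sequence component on the product query operation. -->
<MessageHandler name="CommerceQueryOperation_Product">
  <OperationSequence>
    <Component name="SOLR Catalog Search Processor"
               type="MySite.Search.SolrCatalogSearchComponent, MySite.Search" />
  </OperationSequence>
</MessageHandler>

<!-- Register the custom translator alongside the existing ones. -->
<Translators>
  <Translator type="MySite.Search.SolrEntityTranslator, MySite.Search" />
</Translators>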

    Read the article

  • CodePlex Daily Summary for Tuesday, March 01, 2011

Popular Releases

- DirectQ: Release 1.8.7 (Beta 3): Fixes some problems and adds some more enhancements.
- Sandcastle Help File Builder: SHFB v1.9.2.0 Release: NOTE TO 32-BIT WINDOWS XP USERS: There is a problem with a type converter that fails on 32-bit Windows XP due to how it searches for the framework versions. I'll issue an update later today that fixes the issue. This release supports the Sandcastle June 2010 Release (v2.6.10621.1). It includes full support for generating, installing, and removing MS Help Viewer files. This new release is compiled under .NET 4.0, supports Visual Studio 2010 solutions and projects as documentation sources, ...
- Network Monitor Open Source Parsers: Microsoft Network Monitor Parsers 3.4.2554: The Network Monitor Parsers packages contain parsers for more than 400 network protocols, including RFC-based public protocols and protocols for Microsoft products defined in the Microsoft Open Specifications for Windows and SQL Server. NetworkMonitor_Parsers.msi is the base parser package, which defines parsers for commonly used public protocols and protocols for Microsoft Windows. In this release, we have added 4 new protocol parsers and updated 79 existing parsers in the NetworkMonitor_Pa...
- Ajax Minifier: Microsoft Ajax Minifier 4.13: New features: switches and settings for turning off Conditional Compilation comment processing; for adding variable and/or function names that should not be renamed automatically; for adding manual renaming of variables/functions/properties; and for automatic evaluation of certain literal expressions (but not all).
- Image Resizer for Windows: Image Resizer 3 Preview 1: Prepare to have your minds blown. This is the first preview of what will eventually become 39613. There are still a lot of rough edges and plenty of areas still under construction, but for your basic needs it should be relatively stable. Note: you will need the .NET Framework 4 installed to use this version. Below is a status report of where this release is in terms of the overall goal for version 3. If you're feeling a bit technically ambitious and want to check out some of the features th...
- JSON Toolkit: JSON Toolkit 1.1: Updated the GetAllJsonObjects() and GetAllProperties() methods to JsonObject and Properties properties.
- Facebook Graph Toolkit: Facebook Graph Toolkit 1.0: Refer to http://computerbeacon.net for documentation and a tutorial. New features: added FQL support; added an Expires property to the Api object; added support for publishing to a user's friend / Facebook Page; added support for posting and removing comments on posts; added support for adding and removing likes on posts and comments; added static methods for the Page class; added support for the Iframe Application Tab of a Facebook Page; added support for obtaining the user's country, locale and age in If...
- ASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.7.1: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager. Small improvements for some helpers; AjaxDropdown now has Data, like the Lookup, except that its value gets reset and the list refilled if any element from Data gets changed.
- Managed Extensibility Framework: MEF 2 Preview 3: This release targets .NET 4.0 and Silverlight 4.0; accordingly, there are two solution files. The assemblies are named System.ComponentModel.Composition.Codeplex.dll as a way to avoid clashing with the version shipped with the 4th version of the framework. Introduced CompositionOptions to container instantiation; CompositionOptions.DisableSilentRejection makes MEF throw an exception on composition errors (useful for diagnostics). Support for open generics. Support for attribute-less registr...
- PHPExcel: PHPExcel 1.7.6 Production: Donations: donate via PayPal; if you want, we can also add your name / company on our Donation Acknowledgements page. PEAR channel: we now also have a full PEAR channel! Here's how to use it. New installation: pear channel-discover pear.pearplex.net, then pear install pearplex/PHPExcel. Or, if you've already installed PHPExcel before: pear upgrade pearplex/PHPExcel. The official page can be found at http://pearplex.net. Want to contribute? Please refer to the Contribute page.
- WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.4: Version 2.0.0.4 (Milestone 4). This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements: .NET Framework 4.0 (the package contains a solution file for Visual Studio 2010); the unit test projects require Visual Studio 2010 Professional. Remark: the sample applications use Microsoft's IoC container MEF; however, the WPF Application Framework (WAF) doesn't force you to use the same IoC container in your application. You can use ...
- VidCoder: 0.8.2: Updated auto-naming to handle seconds and frames ranges as well. Deprecated the {chapters} token for auto-naming in favor of {range}. Allowed file drag to the preview window and enabled main window shortcut keys to work no matter which window is focused. Added an option in config to enable giving custom names to audio tracks. (Note that these names will only show up in certain players like iTunes or on the iPod; players that support custom track names normally may not show them.) Added tooltips ...
- SQL Server Compact Toolbox: Standalone version 2.0 for SQL Server Compact 4.0: Download the Visual Studio add-in for SQL Server Compact 4.0 and 3.5 from here. Standalone version of (most of) the same functionality as the add-in, for SQL Server Compact 4.0. Useful for anyone not having Visual Studio Professional or higher installed. Requires .NET 4.0. Any feedback much appreciated.
- Claims Based Identity & Access Control Guide: Drop 1 - Claims Identity Guide V2: Highlights of drop #1: this is the first drop of the new "Claims Identity Guide" edition. In this release you will find: all previous samples updated and enhanced; all code upgraded to .NET 4 and Visual Studio 2010; extensive cleanup; refactored simulated issuers (each solution now gets its own issuers, which results in much cleaner and simpler-to-understand code); added Single Sign Out support; added the first sample using ACS ("ACS as a Federation Provider"). This sample extends the ori...
- Simple Notify: Simple Notify Beta 2011-02-25: Features: host the service with a single click in a console; host the service as a Windows service; notification client application; push client application; push notifications from your PowerShell script; C# wrapper libraries for your applications.
- patterns & practices - Project Silk: Project Silk Community Drop 3 - 25 Feb 2011: Introduction: welcome to the third community drop of Project Silk. For this drop we are requesting feedback on overall application architecture, code review of the JavaScript Conductor and Widgets, and the general direction of the application. Project Silk provides guidance and sample implementations that describe and illustrate recommended practices for building modern web applications using technologies such as HTML5, jQuery, CSS3 and Internet Explorer 9. This guidance is intended for experien...
- Minemapper: Minemapper v0.1.5: Now supports the new Minecraft beta v1.3 map format, thanks to updated mcmap. Disabled biomes until Minecraft Biome Extractor supports the new format.
- HERB.IQ: HERB.IQ.NEW.INSTALL.0.6.0.zip
- Coding4Fun Tools: Coding4Fun.Phone.Toolkit v1.2: New control: Toast Prompt! Removed the progress bar, since the Silverlight Toolkit Feb 2010 release has it.
- HubbleDotNet - Open source full-text search engine: V1.1.0.0: Added a Sqlite3 DBAdapter. Added an app report when the query cache is collecting. Improved index performance through Synchronize. Added a "top 0" feature so that we can get only the count of the result. Improved the score-calculating algorithm of match, so that the score of a record matching all items is larger than the others. Added a MySql DBAdapter. Improved performance for multi-field sorts. Payload data is now accessed via a hash table (the version before used binary search). Using heap sort instead of qui...

New Projects

- Assembly Explorer: a developer utility that displays the namespaces, types, and members in an assembly. It also displays the MSIL or translated .NET language code.
- automated reporting system
- Custom XSLT with Group by in Biztalk 2009
- DotNet Repository: a simple-to-use generic repository using Linq to SQL or Linq to Objects.
- euler 28
- euler29: euler 29 problem
- FreeType for AirplaySDK: FreeType adaptation for the Airplay SDK.
- Icicle Framework: an in-the-works component-based game framework for XNA.
- Jogo dos Palitinhos: a game developed by students of the 4th evening programming cycle of the Systems Analysis and Information Technology course at Faculdade de Tecnologia de Carapicuíba. This is the "jogo dos palitinhos" (little-sticks game): a mix of logic, guessing and luck. It will be developed on the Java platform.
- karmencita: Karmencita is a high-level object query language for .NET. Its purpose is to allow easy querying of in-memory structured data.
- Libero Site 011
- MaLoRTLib: the raytracer library used in MaLoRT.
- MetroEdit: a WPF text editor based on the Metro UI design guidelines. Features: a clean and simple UI based on Metro, 32-bit and 64-bit support, tabbing, and syntax highlighting. NOTE: based on .NET Framework 4.0 and uses the MVVM Light Toolkit and AvalonEdit libraries.
- MiaSocks: a .NET SOCKS server implementation based on SuperSocket.
- microruntime: the MicroRuntime project is a .NET utility library.
- MVC Forum: a bulletin board system (like phpBB) running on ASP.NET MVC.
- newshehuishijianzhongxin
- Prism Extension: contains extensions for Prism to reinforce some functions.
- RInterfaces: an interface to pass data to and from R and execute R code from .NET.
- SharePoint 2007 Wiki Export: a very simple wiki export utility for SharePoint 2007. You can export a wiki library to the file system with the specified file extension, wrapped in the specified markup. Written in C#. The List service URL is set dynamically, so there is a dummy URL in the configuration.
- Simon Squared: a multi-player puzzle game for Windows Phone 7. It uses the XNA framework on the phone, and the WCF Http CTP on the server side to handle communication between phones. It's written in C#.
- Sitefinity Toolkit: a collection of enhancements to the Sitefinity Content Management System by Telerik. It currently supports Sitefinity version 3.7 (through SP4), and includes a number of tools to automate and simplify a number of actions and features.
- Slog: a blog engine like Wordpress, in Silverlight 4, that will have the same functionality to begin with and the same extensibility thanks to MEF. The server side will be WCF Data Services, Entity Framework 4 and SQL Server Compact 4.
- SnagL: Social Network Analysis Graph Live (SnagL) is a light-weight, pluggable application that operates from a web browser and works with existing applications and back-end data stores to provide a visual way to understand information and enhance analysis.
- SocialShare Starter Kit: a web application that illustrates a wide range of features needed to build a social site. This web application framework is written in C# on ASP.NET 4.0.
- Split Large XML file into small XML files: split a large XML file into a group of smaller XML files in sequential order. As posted to CodeProject: http://www.codeproject.com/KB/XML/SplitLargeXMLintoSmallFil.aspx
- SSIS SSH Components: SSIS control flow tasks for SFTP and executing shell commands, along with an SSH connection manager.
- StudioShell: a deeply integrated PowerShell module for Visual Studio 2010 and 2008. It will change the way you interact with your IDE and code by exposing the IDE extensibility features to PowerShell. What once took a binary can now be done in a one-liner.
- TBS: Tez Bilgi Sistemi (Turkish: "thesis information system").
- uTestingService: a web service with wrappers around Node and Document to allow for end-to-end testing of Umbraco.
- Website Panel: a Windows application to help you manage multiple DotNetNuke applications. Easy installation, backups and upgrades of DNN websites are just a few features of this application.
- Zinc: a utility library for ASP.NET web forms development. It has support for: utility methods for working more easily with controls; CSV exports; HttpModules for dealing with caching and path-based rights; custom controls. This library runs on .NET 2.0 and i would like to kee

    Read the article

  • Building a “real” extension for Expression Blend

    - by Timmy Kokke
    Last time I showed you how to get started building extensions for Expression Blend. Let's build a useful extension this time and go a bit deeper into Blend.

Source of project => here
Compiled dll => here (extract into the /extensions folder of Expression Blend)

The Extension

When working on large Xaml files in Blend, it's often hard to find a specific control in the "Objects and Timeline" pane. An extension that searches the active document and presents all elements that satisfy the query would be helpful. When the user starts typing a search query, a search will be performed and the results shown in a list. After the user selects an item in the results list, the corresponding control in the "Objects and Timeline" pane will be selected. The design is simple: a search box with a results list beneath it.

The Solution

Create a new WPF User Control project as shown in the "Configuring the extension project" section of the earlier tutorial, but name it AdvancedSearch this time. Delete the default UserControl1.xaml to clear the solution (a new user control will be added later, but adding a user control is easier than renaming one). Create the main entry point of the add-in by adding a new class to the solution named AdvancedSearchPackage. Add references to Microsoft.Expression.Extensibility and System.ComponentModel.Composition. Implement the IPackage interface and add the Export attribute from MEF to the class definition. While you're at it, also add references to Microsoft.Expression.DesignSurface, Microsoft.Expression.FrameWork and Microsoft.Expression.Markup; these will be used later.

The Load method of the IPackage interface is going to create a ViewModel for the UI to bind to. Add another class to the solution and name it AdvancedSearchViewModel. This class needs to implement the INotifyPropertyChanged interface to enable notifications to the view. Add a constructor to the class that takes an IServices interface as a parameter. Create a new instance of the AdvancedSearchViewModel in the Load method of the AdvancedSearchPackage class. The AdvancedSearchPackage class should look like this now:

using System.ComponentModel.Composition;
using Microsoft.Expression.Extensibility;

namespace AdvancedSearch
{
    [Export(typeof(IPackage))]
    public class AdvancedSearchPackage : IPackage
    {
        public void Load(IServices services)
        {
            new AdvancedSearchViewModel(services);
        }

        public void Unload()
        {
        }
    }
}

Add a new UserControl to the project and name it AdvancedSearchView. The View will be created by the ViewModel, which passes itself to the constructor of the view. Change the constructor of the View to take an AdvancedSearchViewModel object as a parameter. Add a private field to store the ViewModel and set this field in the constructor. Point the DataContext of the view to the ViewModel. The View will look something like this now:

namespace AdvancedSearch
{
    public partial class AdvancedSearchView : UserControl
    {
        private readonly AdvancedSearchViewModel _advancedSearchViewModel;

        public AdvancedSearchView(AdvancedSearchViewModel advancedSearchViewModel)
        {
            _advancedSearchViewModel = advancedSearchViewModel;
            InitializeComponent();
            this.DataContext = _advancedSearchViewModel;
        }
    }
}

The View is created in the constructor of the ViewModel and stored in a read-only property.
public FrameworkElement View { get; private set; }

public AdvancedSearchViewModel(IServices services)
{
    _services = services;
    View = new AdvancedSearchView(this);
}

The last thing the solution needs before we wire things up is a new class, PossibleNode. This class will be used later to store the search results. The solution now consists of the AdvancedSearchPackage, AdvancedSearchViewModel, AdvancedSearchView and PossibleNode files.

Adding UI to the UI

The extension should build and run now, although nothing shows up in Blend yet. To let the user perform a search, add a TextBox and a ListBox to the AdvancedSearchView.xaml file. I've also set the rows of the grid to make them look a little better. Add the TextChanged event to the TextBox and the SelectionChanged event to the ListBox; we'll need those later on.

<Grid>
    <Grid.RowDefinitions>
        <RowDefinition Height="32" />
        <RowDefinition Height="*" />
    </Grid.RowDefinitions>
    <TextBox TextChanged="SearchQueryTextChanged" HorizontalAlignment="Stretch" Margin="4"
             Name="SearchQuery" VerticalAlignment="Stretch" />
    <ListBox SelectionChanged="SearchResultSelectionChanged" HorizontalAlignment="Stretch" Margin="4"
             Name="SearchResult" VerticalAlignment="Stretch" Grid.Row="1" />
</Grid>

This creates a simple user interface: a search box above a results list. To make the View show up in Blend, it has to be registered with the window service. The GetService<T> method is used to get services from Blend; these are your entry points into Blend, and you will encounter this method very often when writing extensions. In this case we're asking for an IWindowService interface. The IWindowService interface serves events for changing windows and themes, is used for adding or removing resources, and is used for registering and unregistering palettes. All panes in Blend are palettes and are registered through the RegisterPalette method. The first parameter passed to this method is a string containing a unique ID for the palette; this ID can be used to get access to the palette later. The second parameter is the View. The third parameter is a title for the pane; this title is shown when the pane is visible, and also in the Window menu of Blend. The last parameter is a KeyBinding. I have chosen Ctrl+Shift+F to call the Advanced Search pane; this value is also shown in the Window menu of Blend.

services.GetService<IWindowService>().RegisterPalette(
    "AdvancedSearch",
    viewModel.View,
    "Advanced Search",
    new KeyBinding
    {
        Key = Key.F,
        Modifiers = ModifierKeys.Control | ModifierKeys.Shift
    }
);

You can compile and run now. After Blend starts, you can hit Ctrl+Shift+F or go to the Window menu to call the Advanced Search extension.

Searching for controls

The search has to be cleared on every change of the active document. The document service fires an event every time a new document is opened, a document is closed or another document view is selected. Add the following line to the constructor of the ViewModel to handle the ActiveDocumentChanged event:

_services.GetService<IDocumentService>().ActiveDocumentChanged += ActiveDocumentChanged;

And implement the ActiveDocumentChanged method:

private void ActiveDocumentChanged(object sender, DocumentChangedEventArgs e)
{
}

To get to the contents of the document, we first need access to the "Objects and Timeline" pane. This pane is registered in the PaletteRegistry in the same way this extension registered itself. The palettes are accessible through an associative array; all you need to provide is the identifier of the palette you want. The Id of the "Objects and Timeline" pane is "Designer_TimelinePane".
I've included a list of the other default panes at the bottom of this article. Each palette has a Content property, which can be cast to the type of the pane.

var timelinePane = (TimelinePane)_services.GetService<IWindowService>()
    .PaletteRegistry["Designer_TimelinePane"]
    .Content;

Add a private field to the top of the AdvancedSearchViewModel class to store the active SceneViewModel. The SceneViewModel is needed to set the current selection and to get the little icons for the type of control.

private SceneViewModel _activeSceneViewModel;

When the active SceneViewModel changes, it is stored in this field, the list of possible nodes is cleared, and a PropertyChanged event is fired for the list to notify the UI to clear it. This makes the event handler look like this:

private void ActiveDocumentChanged(object sender, DocumentChangedEventArgs e)
{
    var timelinePane = (TimelinePane)_services.GetService<IWindowService>()
        .PaletteRegistry["Designer_TimelinePane"].Content;

    _activeSceneViewModel = timelinePane.ActiveSceneViewModel;
    PossibleNodes = new List<PossibleNode>();
    InvokePropertyChanged("PossibleNodes");
}

The PossibleNode class is used to store information about the controls found by the search. It's a dumb data class with only three properties: the name of the control, the SceneNode, and a brush used for the little icon. The SceneNode is the base class for every possible object you can create in Blend, like Brushes, Controls, Annotations, ResourceDictionaries and VisualStates. The entire PossibleNode class looks like this:

using System.Windows.Media;
using Microsoft.Expression.DesignSurface.ViewModel;

namespace AdvancedSearch
{
    public class PossibleNode
    {
        public string Name { get; set; }
        public SceneNode SceneNode { get; set; }
        public DrawingBrush IconBrush { get; set; }
    }
}

Add these two methods to the AdvancedSearchViewModel class:

public void Search(string searchText)
{
}

public void SelectElement(PossibleNode node)
{
}

Both of these methods are going to be called from the view. The Search method performs the search and updates the PossibleNodes list. The controls in the active document can be accessed through the TimelineItemManager class, which contains a read-only collection of timeline items. Using a Linq query, the possible nodes are selected and placed in the PossibleNodes list:

var timelineItemManager = new TimelineItemManager(_activeSceneViewModel);
PossibleNodes = new List<PossibleNode>(
    (from d in timelineItemManager.ItemList
     where d.DisplayName.ToLowerInvariant().StartsWith(searchText.ToLowerInvariant())
     select new PossibleNode()
     {
         IconBrush = d.IconBrush,
         SceneNode = d.SceneNode,
         Name = d.DisplayName
     }).ToList()
);
InvokePropertyChanged("PossibleNodes");

The SelectElement method is pretty straightforward. It contains two lines: the first clears the selection (otherwise the selected element would be added to the current selection), and the second selects the node, passing a new array containing the node to be selected.

_activeSceneViewModel.ClearSelections();
_activeSceneViewModel.SelectNodes(new[] { node.SceneNode });

The last thing that needs to be done is to wire the whole thing to the View. The two event handlers just call the Search and SelectElement methods on the ViewModel, as shown right after the sketch below.
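One piece the article never shows is the INotifyPropertyChanged plumbing behind the InvokePropertyChanged calls above. A minimal sketch, assuming nothing beyond the standard System.ComponentModel interface, might look like this inside AdvancedSearchViewModel:

// Inside AdvancedSearchViewModel, which implements INotifyPropertyChanged
// (requires using System.ComponentModel;).
public event PropertyChangedEventHandler PropertyChanged;

// Raises PropertyChanged so bindings such as the results ListBox refresh.
private void InvokePropertyChanged(string propertyName)
{
    var handler = PropertyChanged;
    if (handler != null)
    {
        handler(this, new PropertyChangedEventArgs(propertyName));
    }
}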
Here are those two handlers:

private void SearchQueryTextChanged(object sender, TextChangedEventArgs e)
{
    _advancedSearchViewModel.Search(SearchQuery.Text);
}

private void SearchResultSelectionChanged(object sender, SelectionChangedEventArgs e)
{
    if (e.AddedItems.Count > 0)
    {
        _advancedSearchViewModel.SelectElement(e.AddedItems[0] as PossibleNode);
    }
}

The ListBox has to be bound to the PossibleNodes list, and a simple DataTemplate is added to show each result. The IconWithOverlay control can be found in the Microsoft.Expression.DesignSurface.UserInterface.Timeline.UI namespace in the Microsoft.Expression.DesignSurface assembly. The ListBox should look something like this:

<ListBox SelectionChanged="SearchResultSelectionChanged" HorizontalAlignment="Stretch" Margin="4"
         Name="SearchResult" VerticalAlignment="Stretch" Grid.Row="1"
         ItemsSource="{Binding PossibleNodes}">
    <ListBox.ItemTemplate>
        <DataTemplate>
            <StackPanel Orientation="Horizontal">
                <tlui:IconWithOverlay Margin="2,0,10,0" Width="12" Height="12"
                                      SourceBrush="{Binding Path=IconBrush, Mode=OneWay}" />
                <TextBlock Text="{Binding Name}" />
            </StackPanel>
        </DataTemplate>
    </ListBox.ItemTemplate>
</ListBox>

Compile and run; the extension now works as a searchable pane inside Blend.

What's Next

When you've got the extension running, try placing breakpoints in the code and see what else is in there. There's a lot to explore and build extensions on. I personally would love an extension to search for resources. Last but not least, you can download the source of the project here. If you have any questions, let me know. If you just want to use this extension, you can download the compiled dll here; just extract the .zip into the /extensions folder of Expression Blend.

Notes

Target framework: I ran into some issues when using the .NET Framework 4 Client Profile as the target framework. I got strange errors saying certain obvious namespaces could not be found (Microsoft.Expression in my case). If you run into something like this, try setting the target framework to .NET Framework 4 instead of the Client Profile.

Identifiers of default panes:

Identifier                   Type                 Title
Designer_TimelinePane        TimelinePane         Objects and Timeline
Designer_ToolPane            ToolPane             Tools
Designer_ProjectPane         ProjectPane          Projects
Designer_DataPane            DataPane             Data
Designer_ResourcePane        ResourcePane         Resources
Designer_PropertyInspector   PropertyInspector    Properties
Designer_TriggersPane        TriggersPane         Triggers
Interaction_Skin             SkinView             States
Designer_AssetPane           AssetPane            Assets
Interaction_Parts            PartsPane            Parts
Designer_ResultsPane         ResultsPane          Results
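As a usage note tying this table to the earlier PaletteRegistry code: any of these panes can be fetched the same way the timeline pane was. For example (the Results pane is chosen arbitrarily; the cast type is taken from the Type column above):

var resultsPane = (ResultsPane)_services.GetService<IWindowService>()
    .PaletteRegistry["Designer_ResultsPane"]
    .Content;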

    Read the article

  • best practices for setting development environment

    - by Sharique
    I use Linux as my primary OS. I need some suggestions on how to set up my desktop and development environment. I work mostly on .NET and Drupal, but sometimes on other LAMP products, and on C/C++ and Qt. I'm also interested in mobile (Android, etc.) and embedded development. Currently I install everything on my main OS, even things I use only a little, and I barely use VMs. Should I use a separate VM for each kind of development (like one for .NET/Mono, another for C++, one for mobile, one for the database only, one for xyz things, etc.), or keep the primary development environment on the main OS and move the others into VMs? My constraints: the main OS should not get messed up, things must stay easy to organize, and performance should be optimal.

    Read the article

  • Differences in memory consumption between two identical D7 sites?

    - by aendrew
    I'm running Drupal on a news site that has a lot of different View blocks on the front page (~5 total, all cached). In trying to reduce the memory footprint of the site, I've checked out source from SVN to a local development install to try to convert some of those blocks into more optimized code. Here's the weird thing: the Devel module lists memory consumption at 50 MB on the production site (running Nginx, PHP 5.2.17, XCache and Zend Optimizer) but only 14 MB on my development site (running Apache2, PHP 5.2.13 and XCache). These are nearly identical versions of the same site; frankly, the production site should use even less memory, as I've disabled some of the modules running on the dev site. Any idea why this might be the case?

    Read the article

  • Plesk file permissions - Apache/PHP conflicting with user accounts.

    - by hfidgen
    Hiya, I'm building a Drupal site which performs various automatic disk operations using the apache user (id=40). The problem is that the site was set up on a subdomain belonging to user ID 10001 (i.e. my main FTP account), so the filesystem belongs to that user ID. As a result I keep getting errors like this:

warning: move_uploaded_file() [function.move-uploaded-file]: SAFE MODE Restriction in effect. The script whose uid is 10001 is not allowed to access /var/www/vhosts/domain.com/httpdocs/sites/default/files/images/user owned by uid 48 in /var/www/vhosts/domain.com/httpdocs/includes/file.inc on line 579.

I've tried changing the apache group in httpd.conf to apache:psacln, psacln being the default group for all web users, but that hasn't helped. The situation now is:

..../files/images/     = 777 and chown = ftplogin:psacln
..../files/images/user = 775 and chown = apache:psacln
..../files/tmp         = 777 and chown = ftplogin:psacln

So apparently uid 40 and 10001 both have permissions to write to any of the three directories involved, but still can't. Am I missing something here? Can anyone help? Thanks!

    Read the article

  • New LAMP server, all links redirecting to localhost

    - by serilain
    I've got a very frustrating issue with what should be a bespoke install of Ubuntu 12.04, the LAMP config provided in apt-get install lamp-server^, and a web application called The Fascinator. After installing those three things and making no changes to any of them, I can access the application through a public IP (http://lib-hf1.lib.sfu.ca:9997 for the curious), but the domain of every link within that page is changed to localhost, including links to images and CSS, so nothing loads correctly and all of the links are broken. I've Googled around and found some people who appear to be having this issue with WP and Drupal, but nothing makes reference to a system-wide setting, and no one using the Fascinator seems to be having this issue. I have a faint memory that this might have something to do with mod_rewrite, but I'm pretty well stumped.

    Read the article

  • Interpreting Munin graphs showing available entropy and MySQL slow queries in sync

    - by user64204
    We're experiencing performance issues on our website, and after reviewing our Munin graphs, the only metrics we've found in sync are available entropy and MySQL slow queries, with the latter influenced by our number of logged-in users. Based on the Wikipedia entropy page, my understanding is that entropy is the amount of randomness (here measured in bytes) that the system can use for various tasks, mainly cryptography and functions that require random input. The peaks in available entropy and MySQL slow queries occur in sync and at regular intervals; the number of MySQL slow queries is proportional to our number of Drupal users, whereas the peaks in available entropy are much more constant and less proportional to those two metrics. We therefore think available entropy reflects a root cause which, combined with the traffic to our website, is causing those slow queries (and not the opposite, with slow queries influencing the entropy). Accordingly: Q: What underlying problem do you think could cause regular peaks in available entropy that could have an influence on MySQL's ability to process queries?

    Read the article

  • How to recover a www directory and included files in Ubuntu 9.04

    - by Al Mubarak
    Hi, I'm using Ubuntu 9.04 for Drupal development. This morning I accidentally removed the www folder in my directory. The folder holds a great many of my web development documents. I restarted my system after it happened, and I installed some recovery software like gpart. Is there any possibility of recovering my www directory and files? It includes most of my web development documents, so I'm very worried about this. Please let me know as soon as possible. Thanks very much in advance,

    Read the article

  • Browser pollution / Pop Up request

    - by PatrickS
    From time to time when I browse the internet, after entering a site's URL I get a warning message that a pop-up has been blocked, and then a question mark and a set of numbers are appended to the URL. It doesn't matter what site I navigate to; it can be a site I have developed, or a well-known site like drupal.org, for instance. Here's what happens:

enter URL: example.com
warning: pop-up blocked
URL changes to: example.com/?1347628900

After checking the pop-up (which of course turned out to be an unwanted ad), I deduced that the request and the pop-up were linked. This has never happened to me before, and I'm not sure how it got started. How can I get rid of this? Clearing the cache, cookies etc. doesn't solve the issue. I'm using a MacBook Pro with OS X 10.6.7 and Chrome 13.0.782.24 beta. Thanks in advance for any tips!

    Read the article

  • How to set up memcached to use unix socket?

    - by alfish
    While I can use memcached on Debian with the default port 11211, I've had great difficulty setting up a unix socket. From what I've read, I know that I need to create a memcache.socket and add

-s /path/to/memcache.socket -a 0766

to /etc/memcached.conf, and comment out the default connection port and IP, i.e.:

-p 11211
-l 127.0.0.1

However, when I restart memcached I get internal server errors on my Drupal site. I'm trying to use unix sockets to avoid TCP/IP overhead and boost overall memcached performance, though I'm not sure how much performance gain one can expect from this tweak. I'd appreciate your hints, or possibly configs, to resolve this.

    Read the article

  • My datacenter is unable to add PTR/rDNS record, then how can I prevent the mail outgoing from going to spam folder?

    - by gilzero
    I am having a problem where mail sent from my server all goes to recipients' spam folders. I am running Drupal sites on a Linux server (CentOS with cPanel). Our users cannot receive email properly because it lands in the spam folder (such as registration emails and contact-form emails). I was advised that I need a PTR/rDNS record added for my host. I then contacted my datacenter to have the PTR/rDNS record added; unfortunately, the datacenter said they are unable to do it. So what can I do? Are there any other ways I can fix the problem? Thank you!

    Read the article

  • How can I optimize ubuntu desktop to run my webserver

    - by Parry
    Hi, I am using the Ubuntu desktop edition to run my Drupal website on an intranet. I know that for running web servers the best thing to install is the Ubuntu server edition, but due to some constraints I am using the desktop edition. I installed XAMPP on my machine, and my website is up and running. I want to know how I can optimize this machine. Since I will use very few of the desktop edition's features, are there any things I can remove or stop to free up memory and CPU? And are there any packages I should install to increase the performance of my Ubuntu box?

    Read the article

  • Application to handle form approval

    - by ChrisMuench
    Hello, hopefully this is the right place for this question. I have done a fair amount of research and have yet to find anything that matches what I want. Let me know if any of you know of a program that will do the following. It must also be web-based.

- An anonymous user fills out a form; an email gets sent to the admin saying user xyz has filled out form abc, with links to approve/disapprove the request.
- The admin can also log in, edit the form, and resend the results to the original submitter.
- Once the admin approves/disapproves the request, the original submitter gets an approve/disapprove email.
- You can search by date submitted, by specific project/form, and by status of the request (submitted, approved, disapproved).

Any ideas at all on where I could find this? I started to look into Drupal with workflows and actions, but it just doesn't flow right for this.

    Read the article

< Previous Page | 60 61 62 63 64 65 66 67 68 69 70 71  | Next Page >