Search Results

Search found 58245 results on 2330 pages for 'asp net authentication'.


  • Part 5, Moving Forum threads from CommunityServer to DotNetNuke

    - by Chris Hammond
    This is the fifth post in a series of blog posts about converting from CommunityServer to DotNetNuke. A brief background: I had a number of websites running on CommunityServer 2.1, and I decided it was finally time to ditch CommunityServer due to the change in their licensing model and pricing that made it a poor fit for the small guy. This series of blog posts is about how to convert your CommunityServer-based sites to DotNetNuke. Previous Posts: Part 1: An Introduction Part 2: DotNetNuke Installation...(read more)

    Read the article

  • Working with Joins in LINQ

    - by vik20000in
    While working with data, most of the time we have to work with relations between different lists of data. Many a time we want to fetch data from both lists at once, which requires us to make different kinds of joins between the lists. LINQ supports several kinds of join.

    Inner join:

        List<Customer> customers = GetCustomerList();
        List<Supplier> suppliers = GetSupplierList();

        var custSupJoin =
            from sup in suppliers
            join cust in customers on sup.Country equals cust.Country
            select new { Country = sup.Country, SupplierName = sup.SupplierName, CustomerName = cust.CompanyName };

    Group join, where the joined dataset is also grouped:

        List<Customer> customers = GetCustomerList();
        List<Supplier> suppliers = GetSupplierList();

        var custSupQuery =
            from sup in suppliers
            join cust in customers on sup.Country equals cust.Country into cs
            select new { Key = sup.Country, Items = cs };

    We can also write a left outer join in LINQ, like this:

        List<Customer> customers = GetCustomerList();
        List<Supplier> suppliers = GetSupplierList();

        var supplierCusts =
            from sup in suppliers
            join cust in customers on sup.Country equals cust.Country into cs
            from c in cs.DefaultIfEmpty()  // DefaultIfEmpty preserves left-hand elements that have no matches on the right side
            orderby sup.SupplierName
            select new { Country = sup.Country,
                         CompanyName = c == null ? "(No customers)" : c.CompanyName,
                         SupplierName = sup.SupplierName };

    Vikram
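    As a self-contained illustration of the left outer join above (the original snippets depend on GetCustomerList()/GetSupplierList() helpers that are not shown), here is a minimal runnable sketch; the Customer/Supplier classes and the sample data are assumptions, not from the post:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class Customer { public string CompanyName; public string Country; }
        class Supplier { public string SupplierName; public string Country; }

        class Program
        {
            static void Main()
            {
                // Hypothetical stand-ins for GetCustomerList()/GetSupplierList()
                var customers = new List<Customer>
                {
                    new Customer { CompanyName = "Alfreds", Country = "Germany" },
                    new Customer { CompanyName = "Berglunds", Country = "Sweden" },
                };
                var suppliers = new List<Supplier>
                {
                    new Supplier { SupplierName = "Plutzer", Country = "Germany" },
                    new Supplier { SupplierName = "Exotic Liquids", Country = "UK" },
                };

                // Left outer join: the UK supplier has no matching customer,
                // so DefaultIfEmpty() supplies a null placeholder for it.
                var supplierCusts =
                    from sup in suppliers
                    join cust in customers on sup.Country equals cust.Country into cs
                    from c in cs.DefaultIfEmpty()
                    orderby sup.SupplierName
                    select new
                    {
                        sup.SupplierName,
                        CompanyName = c == null ? "(No customers)" : c.CompanyName
                    };

                foreach (var row in supplierCusts)
                    Console.WriteLine("{0} -> {1}", row.SupplierName, row.CompanyName);
                // Exotic Liquids -> (No customers)
                // Plutzer -> Alfreds
            }
        }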

    Read the article

  • LINQ and Aggregate function

    - by vik20000in
    LINQ also provides important aggregate functions. Aggregate functions are applied over a sequence and return a single value, like Average, Count, Sum, Max, etc. Below are some of the aggregate functions provided with LINQ and examples of their use.

    Count:

        int[] primeFactorsOf300 = { 2, 2, 3, 5, 5 };
        int uniqueFactors = primeFactorsOf300.Distinct().Count();

    The example below counts only the odd factors (note the == comparison in the predicate):

        int[] primeFactorsOf300 = { 2, 2, 3, 5, 5 };
        int oddFactors = primeFactorsOf300.Distinct().Count(n => n % 2 == 1);

    Sum:

        int[] numbers = { 5, 4, 1, 3, 9, 8, 6, 7, 2, 0 };
        double numSum = numbers.Sum();

    Minimum:

        int minNum = numbers.Min();

    Maximum:

        int maxNum = numbers.Max();

    Average:

        double averageNum = numbers.Average();

    Aggregate:

        double[] doubles = { 1.7, 2.3, 1.9, 4.1, 2.9 };
        double product = doubles.Aggregate((runningProduct, nextFactor) => runningProduct * nextFactor);

    Vikram
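    One overload the excerpt does not show: Aggregate can also take an explicit seed value for the accumulator. A tiny runnable sketch (the digits example is illustrative, not from the post):

        using System;
        using System.Linq;

        class Program
        {
            static void Main()
            {
                int[] digits = { 1, 2, 3, 4 };

                // Seeded Aggregate: start at 0 and fold each digit in,
                // i.e. ((((0*10)+1)*10+2)*10+3)*10+4 = 1234
                int number = digits.Aggregate(0, (acc, d) => acc * 10 + d);
                Console.WriteLine(number); // prints 1234
            }
        }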

    Read the article

  • Part 4, Getting the conversion tables ready for CS to DNN

    - by Chris Hammond
    This is the fourth post in a series of blog posts about converting from CommunityServer to DotNetNuke. A brief background: I had a number of websites running on CommunityServer 2.1, and I decided it was finally time to ditch CommunityServer due to the change in their licensing model and pricing that made it a poor fit for the small guy. This series of blog posts is about how to convert your CommunityServer-based sites to DotNetNuke. Previous Posts: Part 1: An Introduction Part 2: DotNetNuke Installation...(read more)

    Read the article

  • Writing the tests for FluentPath

    - by Bertrand Le Roy
    Writing the tests for FluentPath is a challenge. The library is a wrapper around a legacy API (System.IO) that wasn’t designed to be easily testable. If it were more testable, the sensible testing methodology would be to tell System.IO to act against a mock file system, which would enable me to verify that my code is doing the expected file system operations without having to manipulate the actual, physical file system: what we are testing here is FluentPath, not System.IO. Unfortunately, that is not an option, as nothing in System.IO enables us to plug a mock file system in. As a consequence, we are left with few options. A few people have suggested that I abstract my calls to System.IO away so that I could tell FluentPath – not System.IO – to use a mock instead of the real thing. That in turn is getting a little silly: FluentPath already is a thin abstraction around System.IO, so layering another abstraction between them would double the test surface while bringing little or no value. I would have to test that new abstraction layer, and that would bring us back to square one. Unless I’m missing something, the only option I have here is to bite the bullet and test against the real file system. Of course, the tests that do that can hardly be called unit tests. They are more integration tests, as they don’t test only bits of my code: they really test the successful integration of my code with the underlying System.IO.

    In order to write such tests, the techniques of BDD work particularly well, as they enable you to express scenarios in natural language, from which test code is generated. Integration tests are better expressed as scenarios orchestrating a few basic behaviors, so this is a nice fit. The Orchard team has been successfully using SpecFlow for integration tests for a while, and I thought it was pretty cool, so that’s what I decided to use. Consider for example the following scenario:

        Scenario: Change extension
        Given a clean test directory
        When I change the extension of bar\notes.txt to foo
        Then bar\notes.txt should not exist
        And bar\notes.foo should exist

    This is human-readable and tells you everything you need to know about what you’re testing, but it is also executable code. When SpecFlow compiles this scenario, it runs a bunch of regular expressions against the known Given (set-up), When (action) and Then (result assertion) steps to identify the code to run, and translates each step into a call to the appropriate method. Nothing magical.
    Here is the code generated by SpecFlow:

        [NUnit.Framework.TestAttribute()]
        [NUnit.Framework.DescriptionAttribute("Change extension")]
        public virtual void ChangeExtension()
        {
            TechTalk.SpecFlow.ScenarioInfo scenarioInfo =
                new TechTalk.SpecFlow.ScenarioInfo("Change extension", ((string[])(null)));
            #line 6
            this.ScenarioSetup(scenarioInfo);
            #line 7
            testRunner.Given("a clean test directory");
            #line 8
            testRunner.When("I change the extension of " + "bar\\notes.txt to foo");
            #line 9
            testRunner.Then("bar\\notes.txt should not exist");
            #line 10
            testRunner.And("bar\\notes.foo should exist");
            #line hidden
            testRunner.CollectScenarioErrors();
        }

    The #line directives are there to give clues to the debugger, because yes, you can put breakpoints into a scenario. The way you usually write tests with SpecFlow is that you write the scenario first and let it fail; then you write the translation of your Given, When and Then steps into code if they don’t already exist, which results in running but failing tests; and then you write the code that makes the tests pass (you implement the scenario). In the case of FluentPath, I built a simple Given method that builds a small file hierarchy in a temporary directory for all scenarios to work with:

        [Given("a clean test directory")]
        public void GivenACleanDirectory()
        {
            _path = new Path(SystemIO.Path.GetTempPath())
                .CreateSubDirectory("FluentPathSpecs")
                .MakeCurrent();
            _path.GetFileSystemEntries()
                .Delete(true);
            _path.CreateFile("foo.txt", "This is a text file named foo.");
            var bar = _path.CreateSubDirectory("bar");
            bar.CreateFile("baz.txt", "bar baz")
                .SetLastWriteTime(DateTime.Now.AddSeconds(-2));
            bar.CreateFile("notes.txt", "This is a text file containing notes.");
            var barbar = bar.CreateSubDirectory("bar");
            barbar.CreateFile("deep.txt", "Deep thoughts");
            var sub = _path.CreateSubDirectory("sub");
            sub.CreateSubDirectory("subsub");
            sub.CreateFile("baz.txt", "sub baz")
                .SetLastWriteTime(DateTime.Now);
            sub.CreateFile("binary.bin",
                new byte[] { 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0xFF });
        }

    Then, to implement the scenario above, I had to write the following When:

        [When("I change the extension of (.*) to (.*)")]
        public void WhenIChangeTheExtension(string path, string newExtension)
        {
            var oldPath = Path.Current.Combine(path.Split('\\'));
            oldPath.Move(p => p.ChangeExtension(newExtension));
        }

    As you can see, the When attribute specifies the regular expression that enables the SpecFlow engine to recognize which When method to call and how to map its parameters. For our scenario, “bar\notes.txt” gets mapped to the path parameter, and “foo” to the newExtension parameter. And of course, here is the code that verifies the assertions of the scenario:

        [Then("(.*) should exist")]
        public void ThenEntryShouldExist(string path)
        {
            Assert.IsTrue(_path.Combine(path.Split('\\')).Exists);
        }

        [Then("(.*) should not exist")]
        public void ThenEntryShouldNotExist(string path)
        {
            Assert.IsFalse(_path.Combine(path.Split('\\')).Exists);
        }

    These steps should be written with reusability in mind. They are building blocks for your scenarios, not implementations of a specific scenario. Think small and fine-grained. In the case of the above steps, I could reuse each of them in other scenarios. Those tests are easy to write and easier to read, which means that they also constitute a form of documentation. Oh, and SpecFlow is just one way to do this.
Rob wrote a long time ago about this sort of thing (but using a different framework) and I highly recommend this post if I somehow managed to pique your interest: http://blog.wekeroad.com/blog/make-bdd-your-bff-2/ And this screencast (Rob always makes excellent screencasts): http://blog.wekeroad.com/mvc-storefront/kona-3/ (click the “Download it here” link)

    Read the article

  • Writing the tests for FluentPath

    - by Latest Microsoft Blogs
    Writing the tests for FluentPath is a challenge. The library is a wrapper around a legacy API (System.IO) that wasn’t designed to be easily testable. If it were more testable, the sensible testing methodology would be to tell System.IO to act against...(read more)

    Read the article

  • LINQ and the use of Repeat and Range operator

    - by vik20000in
    LINQ is also very useful when it comes to generating a range of data or repeating data. We can generate a range of data with the help of the Range method:

        var numbers =
            from n in Enumerable.Range(100, 50)
            select new { Number = n, OddEven = n % 2 == 1 ? "odd" : "even" };

    The above query generates 50 records, starting at 100 and running through 149, and also determines whether each number is odd or even. If we want to repeat the same number multiple times, we can use the Repeat method:

        var numbers = Enumerable.Repeat(7, 10);

    The above query produces a list of 10 items, each having the value 7. Vikram
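    For reference, here is a self-contained console version of the two snippets (the range is shortened to 5 items so the output stays small; the Program wrapper is an addition):

        using System;
        using System.Linq;

        class Program
        {
            static void Main()
            {
                var numbers =
                    from n in Enumerable.Range(100, 5) // 100..104
                    select new { Number = n, OddEven = n % 2 == 1 ? "odd" : "even" };

                foreach (var x in numbers)
                    Console.WriteLine("{0} is {1}", x.Number, x.OddEven);
                // 100 is even, 101 is odd, 102 is even, 103 is odd, 104 is even

                foreach (int seven in Enumerable.Repeat(7, 3))
                    Console.Write("{0} ", seven); // 7 7 7
            }
        }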

    Read the article

  • Code refactoring with Visual Studio 2010-Part 3

    - by Jalpesh P. Vadgama
    I have been writing a few posts about the code refactoring features of Visual Studio 2010, and this blog post is one of them. In this post I am going to show you the Reorder Parameters feature of Visual Studio 2010. As a developer you might need to reorder the parameters of a method for better readability of the code, and doing this manually is a tedious job. The Visual Studio Reorder Parameters code refactoring feature can do this within a minute. So let’s see how it works. For this I have created the same simple console application that I used in earlier posts. Following is the code for that:

        using System;

        namespace CodeRefractoring
        {
            class Program
            {
                static void Main(string[] args)
                {
                    string firstName = "Jalpesh";
                    string lastName = "Vadgama";
                    PrintMyName(firstName, lastName);
                }

                private static void PrintMyName(string firstName, string lastName)
                {
                    Console.WriteLine(string.Format("FirstName:{0}", firstName));
                    Console.WriteLine(string.Format("LastName:{0}", lastName));
                    Console.ReadLine();
                }
            }
        }

    The above code is very simple: it just prints a first name and last name via the PrintMyName method. Now I want to reorder the firstName and lastName parameters of PrintMyName. For that, I first select the method and then click Refactor Menu -> Reorder Parameters. Once you click it, a dialog box appears with arrow navigation for moving parameters. I move the lastName parameter so it becomes the first parameter. Once you click OK, a preview is shown with the effects of the changes. Once I click Apply, my code is changed as follows:

        using System;

        namespace CodeRefractoring
        {
            class Program
            {
                static void Main(string[] args)
                {
                    string firstName = "Jalpesh";
                    string lastName = "Vadgama";
                    PrintMyName(lastName, firstName);
                }

                private static void PrintMyName(string lastName, string firstName)
                {
                    Console.WriteLine(string.Format("FirstName:{0}", firstName));
                    Console.WriteLine(string.Format("LastName:{0}", lastName));
                    Console.ReadLine();
                }
            }
        }

    As you can see, it’s very easy to use this feature. Hope you liked it. Stay tuned for more. Till then, happy programming.

    Read the article

  • Scrubbing a DotNetNuke Database for user info and passwords

    - by Chris Hammond
    If you’ve ever needed to send a backup of your DotNetNuke database to a developer for testing, you likely trust the developer enough to do so without scrubbing your data, but just to be safe it is probably best that you take the time to scrub. Before you do anything with the SQL below, make sure you have a backup of your website! I would recommend you do the following: 1) backup your existing production database; 2) restore the backup of your production database as a NEW database; 3) run the scripts below...(read more)

    Read the article

  • Adding Facebook Comments using Razor in DotNetNuke

    - by Chris Hammond
    The other day I posted on how to add the new Facebook Comments to your DotNetNuke website. This worked okay for basic modules that only have one content display, but for a module like DNNSimpleArticle it didn’t work well, as the URLs for the individual articles didn’t come across as individual URLs because of the way the Facebook code is formatted. I also wanted to show the Comments only on individual articles, not on the main article listing. There is actually a pretty easy fix though...(read more)

    Read the article

  • Using set operation in LINQ

    - by vik20000in
    There are many set operations that need to be performed while working with any kind of data, and these can be done very easily with the LINQ methods available for this purpose. Below are some examples of set operations with LINQ.

    Finding distinct values in a set of data: we can use the Distinct method to find the distinct values in a given list.

        int[] factorsOf300 = { 2, 2, 3, 5, 5 };
        var uniqueFactors = factorsOf300.Distinct();

    We can perform the set operation of UNION with the help of the Union method in LINQ. The Union method takes another collection as a parameter and returns the distinct union of values in both lists. Below is an example:

        int[] numbersA = { 0, 2, 4, 5, 6, 8, 9 };
        int[] numbersB = { 1, 3, 5, 7, 8 };
        var uniqueNumbers = numbersA.Union(numbersB);

    We can perform the set operation of INTERSECT with the help of the Intersect method. Below is an example:

        int[] numbersA = { 0, 2, 4, 5, 6, 8, 9 };
        int[] numbersB = { 1, 3, 5, 7, 8 };
        var commonNumbers = numbersA.Intersect(numbersB);

    We can also find the difference between two sets of data with the help of the Except method:

        int[] numbersA = { 0, 2, 4, 5, 6, 8, 9 };
        int[] numbersB = { 1, 3, 5, 7, 8 };
        IEnumerable<int> aOnlyNumbers = numbersA.Except(numbersB);

    Vikram

    Read the article

  • Assembly Resources Expression Builder

    - by João Angelo
    In ASP.NET you can tackle the internationalization requirement by taking advantage of the native support for local and global resources used with the ResourceExpressionBuilder. But with this approach you cannot access public resources defined in external assemblies referenced by your Web application. However, since you can extend the .NET resource provider mechanism and create new expression builders, you can work around this limitation and use external resources in your ASPX pages much like you use local or global resources. Finally, if you are thinking, “okay, this is all very nice, but where’s the code I can use?” – well, it was too much to publish directly here, so download it from Helpers.Web@Codeplex, where you will also find a sample showing how to configure and use it.
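    For readers who want the general shape before downloading, here is a minimal sketch of such an expression builder. This is not the Helpers.Web implementation: the class name, the expression format ("Full.Resources.TypeName, AssemblyName | ResourceKey") and the prefix below are illustrative assumptions.

        using System;
        using System.CodeDom;
        using System.Resources;
        using System.Web.Compilation;

        public class AssemblyResourceExpressionBuilder : ExpressionBuilder
        {
            public override CodeExpression GetCodeExpression(
                BoundPropertyEntry entry, object parsedData, ExpressionBuilderContext context)
            {
                // Compile the page expression into a call to GetResource("<expression>")
                return new CodeMethodInvokeExpression(
                    new CodeTypeReferenceExpression(typeof(AssemblyResourceExpressionBuilder)),
                    "GetResource",
                    new CodePrimitiveExpression(entry.Expression.Trim()));
            }

            // Assumed expression format: "Full.Resources.TypeName, AssemblyName | ResourceKey"
            public static object GetResource(string expression)
            {
                string[] parts = expression.Split('|');
                Type resourceType = Type.GetType(parts[0].Trim(), true);
                return new ResourceManager(resourceType).GetString(parts[1].Trim());
            }
        }

    Registered under <compilation><expressionBuilders> in web.config with a prefix such as AsmRes, it could then be used in a page as <%$ AsmRes: My.External.Resources.Labels, My.Resources | WelcomeText %>. The real download also deals with caching and culture, which this sketch skips.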

    Read the article

  • Visual Studio 2010 Service Pack 1 Beta Released

    - by shiju
    Microsoft has released the beta version of Visual Studio 2010 Service Pack 1. The Visual Studio 2010 Service Pack 1 beta comes with a go-live license. The following are the download links for Visual Studio 2010 Service Pack 1:
    .NET Framework 4 Update Beta
    VS 2010 SP1 Beta
    TFS 2010 SP1 Beta
    The SP1 Beta comes with bug fixes and also provides new features. The following are some of the new features that come with Visual Studio 2010 Service Pack 1:
    HTML5 Schema Support
    IIS Express Support
    SQL Compact Edition 4 Tooling
    Silverlight 4 Tools for Visual Studio 2010
    If you have ASP.NET MVC 3 RC installed, SP1 will break the IntelliSense feature in Razor views. This will be fixed in the ASP.NET MVC 3 RC 2 release, which will ship soon.

    Read the article

  • Branching and Merging with TortoiseSVN

    - by capgpilk
    For this example I am using Visual Studio 2010, TortoiseSVN 1.6.6, Subversion 1.6.6 and AnkhSVN 2.1.7819.411, so if you are using different versions, some of these screenshots may differ. This assumes you have your code checked in to the trunk directory and have a standard SVN structure of trunk, branches and tags. There are a number of developers who prefer to develop solely in a branch and never touch the trunk, but the process is generally the same, and you may be on a small team and prefer to work in the trunk and branch occasionally. There are three steps to successful branching: first you branch; then, when you are ready, you reintegrate any changes that other developers may have made to the trunk into your branch; finally, when your branch and the trunk are in sync, you merge the branch back into the trunk.

    Branch
    1. Right click the project root in Windows Explorer > TortoiseSVN > Branch/Tag
    2. Enter the branch label in the ‘To URL’ box, for example /branches/1.1
    3. Choose Head revision
    4. Check Switch working copy
    5. Click OK
    6. Make any changes to the branch
    7. Make any changes to the trunk
    8. Commit any changes
    For this example I copied the project to another location prior to branching and made changes to that copy using Notepad++. Then I committed it to SVN; as that directory is mapped to the trunk, the trunk is what gets updated.

    Merge Trunk with Branch
    1. Right click the project root in Windows Explorer > TortoiseSVN > Merge
    2. Choose ‘Merge a range of revisions’
    3. In ‘URL to merge from’ choose your trunk
    4. Click Next, then the ‘test merge’ button. This will highlight any conflicts. Here we have one conflict to resolve, because we made a change and checked it in to the trunk earlier.
    5. Click Merge. Now we have the opportunity to edit that conflict; this opens TortoiseMerge, which allows us to resolve the issue. In this case I want both changes.
    6. Perform an Update, then Commit
    Reloading the solution in Visual Studio shows we have all changes that were made to both trunk and branch.

    Merge Branch with Trunk
    1. Switch your working copy by right clicking the project root in Windows Explorer > TortoiseSVN > Switch, and switch to the trunk
    2. Right click the project root in Windows Explorer > TortoiseSVN > Merge
    3. Choose ‘Reintegrate a branch’
    4. In ‘From URL’ choose your branch, then click Next
    5. Click ‘Test merge’; this shouldn’t show any conflicts
    6. Click Merge
    7. Perform an Update, then Commit
    Open the project in Visual Studio; we now have all changes. So there we have it: we are connected back to the trunk and have all the updates merged.

    Read the article

  • Strange ASP.NET Queue Performance Counters Behavior?

    - by LemurTech
    We have an ASP.NET 2.0 site running in classic mode. I am seeing very strange behavior in the performance counter values. Perhaps these are bugs (I've been all over Google trying to verify this, without much luck), or perhaps it is just my inexperience with monitoring these things. This PerfMon graph (http://imgur.com/Jv5io5J) represents a load test where I add up to 350 virtual users to the site, at a rate of about 1/sec, performing relatively simple page browsing. At the end of the test, I gradually taper off the number of users. This is a 4 CPU server. The machine.config processModel settings are at the defaults.

    The solid blue line is ASP.NET Apps v2.x\Requests Executing for the application in question. The profile makes perfect sense, with a quick ramp-up to 32 executing requests (minWorkerThreads x 4 CPUs), followed by a slower ramp-up to 48 ((maxWorkerThreads - minWorkerThreads) x 4 CPUs). The solid yellow line is ASP.NET v2.x\Requests Queued. Again, this makes sense: after the initial 32 request threads are activated, the queue begins to build as new thread initialization can't keep pace with incoming requests. But as executing requests reaches its highest possible value of 48, the counter for ASP.NET Apps v2.x\Requests Queued (green solid line) suddenly springs to life and keeps step with the yellow counter. As far as I can tell, and with no other apps running on the server, these two counters should have had the same values from the start.

    One other odd thing: the counter for ASP.NET v2.x\Request Wait Time (dotted yellow line) also does not spring to life until executing requests reaches 48. Shouldn't I be seeing values here from the moment ASP.NET v2.x\Requests Queued begins to build? And likewise, why would ASP.NET Apps v2.x\Request Execution Time (dotted blue) increase significantly only after that peak of 48 is reached? Shouldn't it ramp up gradually along with queued requests?
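    (Editor's note: for anyone who wants to capture the same counters from code rather than PerfMon, a minimal sampler might look like the sketch below. The exact category names are version-specific, e.g. "ASP.NET v2.0.50727" and "ASP.NET Apps v2.0.50727" on an ASP.NET 2.0 box, so adjust them to whatever "typeperf -q" lists on the server.)

        using System;
        using System.Diagnostics;
        using System.Threading;

        class CounterSampler
        {
            static void Main()
            {
                // System-wide queue counter (category name assumed for ASP.NET 2.0)
                var queued = new PerformanceCounter("ASP.NET v2.0.50727", "Requests Queued");

                // Per-application counter, aggregated across all apps on the box
                var executing = new PerformanceCounter(
                    "ASP.NET Apps v2.0.50727", "Requests Executing", "__Total__");

                while (true)
                {
                    Console.WriteLine("{0:T}  queued={1}  executing={2}",
                        DateTime.Now, queued.NextValue(), executing.NextValue());
                    Thread.Sleep(1000); // one sample per second
                }
            }
        }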

    Read the article

  • Getting a WCF service hosted in IIS 7.0

    - by gregarobinson
    This was not as easy as I thought it would be... lots of errors. These links saved me:
    http://blah.winsmarts.com/2008-4-Host_a_WCF_Service_in_IIS_7_-and-amp;_Windows_2008_-_The_right_way.aspx
    http://blog.donnfelker.com/2007/03/26/iis-7-this-configuration-section-cannot-be-used-at-this-path/

    Read the article

  • DWR like .NET Comet Ajax in ASP.NET

    Easy DWR-like Comet in ASP.NET...
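    To make the "Comet" idea concrete before you click through: the usual trick is a long-polling endpoint that holds each HTTP request open until the server has something to push. Below is a minimal hand-rolled sketch of that pattern (an illustration, not the article's code; a production version would use an asynchronous handler so worker threads are not blocked):

        using System;
        using System.Threading;
        using System.Web;

        public class CometHandler : IHttpHandler
        {
            private static readonly object Gate = new object();
            private static string _latestMessage = "";

            // Server-side producers call this to push a message to all waiting clients.
            public static void Publish(string message)
            {
                lock (Gate)
                {
                    _latestMessage = message;
                    Monitor.PulseAll(Gate); // wake every held long-poll request
                }
            }

            public void ProcessRequest(HttpContext context)
            {
                string message;
                lock (Gate)
                {
                    // Hold the request open until a message arrives or 30 seconds pass;
                    // Monitor.Wait releases the lock while waiting.
                    bool signaled = Monitor.Wait(Gate, TimeSpan.FromSeconds(30));
                    message = signaled ? _latestMessage : "";
                }
                context.Response.ContentType = "text/plain";
                context.Response.Write(message); // empty body = timeout; the client simply re-polls
            }

            public bool IsReusable { get { return true; } }
        }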

    Read the article

  • Integrating Facebook Comments into your DotNetNuke Pages

    - by Chris Hammond
    Last week Facebook announced a new feature that websites can use to get Facebook Comments onto their web pages. I thought this was interesting, as I have a few car racing sites that are using Forums but also have the DNNSimpleArticle module for main page content. The forums are active, but the DNNSimpleArticle module doesn’t allow for comments as of right now (or in the foreseeable future), so I started to look into the Facebook comments a bit. From a quick read of their blog post/announcement it...(read more)

    Read the article

  • HTML5-MVC application using VS2010 SP1

    - by nmarun
    This is my first attempt at creating HTML5 pages. VS 2010 allows working with HTML5 now (you just need to make a small change after installing SP1), so my Razor view is now an HTML5 page. I call this application 5Commerce – (an over-simplified) HTML5 ECommerce site. Here’s the flow of the application:
    - the home page renders
    - the user enters first and last name, chooses a product and the quantity
    - the user can enter additional instructions for the order
    - the user places the order
    - the user is then taken to another page showing the order details

    Off to the details. This is what my page looks like in Google Chrome 10 beta (or later) soon after it renders. Here are some of the things to observe. Look a little closer and you’ll see a border around the first name textbox – this is ‘autofocus’ in action. I’ve set the autofocus attribute on this textbox, so as soon as the page loads, this control gets focus.

        <input type="text" autofocus id="firstName" class="inputWidth" data_minlength=""
               data_maxlength="" placeholder="first name" />

    See the partially grayed out ‘last name’ text in the second textbox? This is set using the placeholder attribute (see above). It gets wiped out on focus and improves the UI visuals in general. The quantity textbox is a numerical-only textbox:

        <input type="number" id="quantity" data_mincount="" class="inputWidth" />

    The last line is for additional instructions. It looks like a label, but its content is editable: just adding the ‘contenteditable’ attribute to the span allows the user to edit the text inside.

        <span contenteditable id="additionalInstructions" data_texttype="" class="editableContent">select text and edit</span>

    All of the above is just plain HTML (no lurking javascript acting in here), which makes it really clean and simple. Going more into the HTML, I saw that _Layout.cshtml was already using some HTML5 content. I created my project before installing SP1, hence my surprise:

        <!DOCTYPE html>

    This is the doctype declaration in HTML5, and it is supported even by IE6 (just take my word on IE6; don’t go install it to test, especially when MS is doing an IE6 countdown). That’s just amazing, extremely easy to read and remember, and a few less bytes on every call! I modified the rest of my _Layout.cshtml to the below:

        <!DOCTYPE html>
        <html>
        <head>
            <title>5Commerce - HTML 5 Ecommerce site</title>
            <link href="@Url.Content("~/Content/Site.css")" rel="stylesheet" type="text/css" />
            <script src="@Url.Content("~/Scripts/jquery-1.4.4.min.js")" type="text/javascript"></script>
            <script src="@Url.Content("~/Scripts/CustomScripts.js")" type="text/javascript"></script>
            <script type="text/javascript">
                $(document).ready(function () {
                    WireupEvents();
                });
            </script>
        </head>

        <body role="document" class="bodybackground">
            <header role="heading">
                <h2>5Commerce - HTML 5 Ecommerce site!</h2>
            </header>
            <section id="mainForm">
                @RenderBody()
            </section>
            <footer id="page_footer" role="siteBaseInfo">
                <p>&copy; 2011 5Commerce Inc!</p>
            </footer>
        </body>
        </html>

    I’m sure you’re seeing some of the new tags here. To give a brief intro about them:
    - <header>, <footer>: marks the header/footer region of a page or section
    - <section>: a logical grouping of content
    - role attribute: identifies the responsibility of an element. This attribute can be used by screen readers and can also be filtered through jQuery.

    SP1 also allows for some IntelliSense in HTML5.
    You see the other types of input fields – email, date, datetime, month, url – and there are others as well. So once my page loads, i.e., ‘on document ready’, I’m wiring up the events following the principles of unobtrusive javascript. In the snippet below, I’m controlling the behavior of the input controls for specific events:

        $("#productList").bind('change blur', function () {
            IsSelectedProductValid();
        });

        $("#quantity").bind('blur', function () {
            IsQuantityValid();
        });

        $("#placeOrderButton").click(
            function () {
                if (IsPageValid()) {
                    LoadProducts();
                }
            });

    This enables some client-side validation to occur before the data is sent to the server. These validation constraints are obtained through a JSON call to the WCF service and are set on the ‘data_’ attributes of the input controls. Have a look at the GetValidators() function below:

        function GetValidators() {
            // the post to your webservice or page
            $.ajax({
                type: "GET", // GET or POST or PUT or DELETE verb
                url: "http://localhost:14805/OrderService.svc/GetValidators", // location of the service
                data: "{}", // data sent to server
                contentType: "application/json; charset=utf-8", // content type sent to server
                dataType: "json", // expected data format from server
                processdata: true, // true or false
                success: function (result) { // on successful service call
                    if (result.length > 0) {
                        for (i = 0; i < result.length; i++) {
                            if (result[i].PropertyName == "FirstName") {
                                if (result[i].MinLength > 0) {
                                    $("#firstName").attr("data_minLength", result[i].MinLength);
                                }
                                if (result[i].MaxLength > 0) {
                                    $("#firstName").attr("data_maxLength", result[i].MaxLength);
                                }
                            }
                            else if (result[i].PropertyName == "LastName") {
                                if (result[i].MinLength > 0) {
                                    $("#lastName").attr("data_minLength", result[i].MinLength);
                                }
                                if (result[i].MaxLength > 0) {
                                    $("#lastName").attr("data_maxLength", result[i].MaxLength);
                                }
                            }
                            else if (result[i].PropertyName == "Quantity") {
                                if (result[i].MinCount > 0) {
                                    $("#quantity").attr("data_minCount", result[i].MinCount);
                                }
                            }
                            else if (result[i].PropertyName == "AdditionalInstructions") {
                                if (result[i].TextType.length > 0) {
                                    $("#additionalInstructions").attr("data_textType", result[i].TextType);
                                }
                            }
                        }
                    }
                },
                error: function (result) { // when service call fails
                    alert('Service call failed: ' + result.status + ' ' + result.statusText);
                }
            });

            //....
        }

    Just before the GetValidators() function runs and sets the validation constraints, this is what the html looks like (seen through the dev tools of Chrome). After the function executes, you see the values in the ‘data_’ attributes. As and when we enter valid data into these fields, the error messages disappear, since the validation is bound to the blur event of each control. There you see… no error messages (well, the catch here is that once you enter THAT name, all errors disappear automatically). Clicking on ‘Place Order!’ runs the SaveOrder function. You can see the JSON for the order object that is getting constructed and passed to the WCF service.
        function SaveOrder() {
            var addlInstructionsDefaultText = "select text and edit";
            var addlInstructions = $("span:first").text();
            if (addlInstructions == addlInstructionsDefaultText) {
                addlInstructions = '';
            }
            var orderJson = {
                AdditionalInstructions: addlInstructions,
                Customer: {
                    FirstName: $("#firstName").val(),
                    LastName: $("#lastName").val()
                },
                OrderedProduct: {
                    Id: $("#productList").val(),
                    Quantity: $("#quantity").val()
                }
            };

            // the post to your webservice or page
            $.ajax({
                type: "POST", // GET or POST or PUT or DELETE verb
                url: "http://localhost:14805/OrderService.svc/SaveOrder", // location of the service
                data: JSON.stringify(orderJson), // data sent to server
                contentType: "application/json; charset=utf-8", // content type sent to server
                dataType: "json", // expected data format from server
                processdata: false, // true or false
                success: function (result) { // on successful service call
                    window.location.href = "http://localhost:14805/home/ShowOrderDetail/" + result;
                },
                error: function (request, error) { // when service call fails
                    alert('Service call failed: ' + request.status + ' ' + request.statusText);
                }
            });
        }

    The service saves this order into an XML file and returns the order id (a guid). On success, I redirect to the ShowOrderDetail action method, passing the guid; this page shows all the details of the order. Although the back-end weightlifting is done by WCF, I did not show any of that plumbing work, as I wanted to concentrate more on HTML5 and its associates. However, you can see it all in the source here.

    I do have one issue with HTML5, and it is an existing issue with HTML4 as well. If you look at the snippet above where I declared a textbox for the first name, you’ll see the autofocus attribute just dangling by itself. It doesn’t follow the XML syntax of ‘key="value"’, allowing users to continue writing badly-formatted HTML even in the new version. You’ll see the same issue with the ‘contenteditable’ attribute as well. The work-around is that you can write ‘autofocus="true"’, which works fine and is well-formatted, but unless the standards enforce this, there will be people (me included) who’ll get by, by just typing the bare minimum! Hoping this will get fixed in the coming version-updates. Source code here.

    Verdict: I think it’s time for us to embrace the new HTML5. Thank you HTML4, and welcome HTML5.

    Read the article

  • Caching WCF javascript proxy on browser

    - by oazabir
    When you use WCF services from Javascript, you have to generate the Javascript proxies by hitting Service.svc/js. If you have five WCF services, that means five javascripts to download. As browsers download javascripts synchronously, one after another, this adds latency to page load and slows down page rendering. Moreover, the same WCF service proxy is downloaded on every page, because the generated javascript file is not cached by the browser. Here is a solution that ensures the generated Javascript proxies are cached on the browser, and that responds with HTTP 304 when there is a hit on the service but the Service.svc file has not changed.

    Here’s a Fiddler trace of a page that uses two WCF services. You can see there are two /js hits and they are sequential. Every visit to the same page, even within the same browser session, results in making those two hits to /js. The second time the same page is browsed, you can see everything else is cached except the WCF javascript proxies. They are never cached, because the WCF javascript proxy generator does not produce the caching headers necessary to cache the files on the browser.

    Here’s an HttpModule for IIS and IIS Express that intercepts calls to the WCF service proxy. It first checks whether the service has changed since the version cached on the browser. If it has not changed, it returns HTTP 304 and does not go through the service proxy generation process, saving some CPU on the server. But if the request is for the first time and there’s no cached copy on the browser, it delivers the proxy and also emits the proper cache headers to cache the response on the browser. http://www.codeproject.com/Articles/360437/Caching-WCF-javascript-proxy-on-browser Don’t forget to vote.
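    The module itself is in the linked article; as a rough approximation of the idea (a sketch, not the author's exact code), the interception could look like this:

        using System;
        using System.IO;
        using System.Web;

        public class WcfProxyCacheModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                app.BeginRequest += delegate
                {
                    HttpContext ctx = app.Context;
                    string path = ctx.Request.Path;
                    if (!path.EndsWith(".svc/js", StringComparison.OrdinalIgnoreCase))
                        return; // not a WCF javascript proxy request

                    // Last write time of the .svc file, truncated to whole seconds
                    // because HTTP dates carry no sub-second precision.
                    string svcFile = ctx.Server.MapPath(
                        path.Substring(0, path.Length - "/js".Length));
                    DateTime lastWrite = File.GetLastWriteTimeUtc(svcFile);
                    lastWrite = lastWrite.AddTicks(-(lastWrite.Ticks % TimeSpan.TicksPerSecond));

                    // If the browser's cached copy is still current, answer 304 and
                    // skip proxy generation entirely.
                    DateTime ifModifiedSince;
                    if (DateTime.TryParse(ctx.Request.Headers["If-Modified-Since"],
                            out ifModifiedSince)
                        && lastWrite <= ifModifiedSince.ToUniversalTime())
                    {
                        ctx.Response.StatusCode = 304; // Not Modified
                        ctx.Response.SuppressContent = true;
                        app.CompleteRequest();
                        return;
                    }

                    // First hit (or the service changed): let WCF generate the proxy,
                    // but emit cache headers so the browser can revalidate next time.
                    ctx.Response.Cache.SetCacheability(HttpCacheability.Public);
                    ctx.Response.Cache.SetLastModified(lastWrite.ToLocalTime());
                };
            }

            public void Dispose() { }
        }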

    Read the article

  • Izenda Reports 6.3 Top 10 Features

    - by gt0084e1
    Izenda 6.3 Top 10 New Features and Capabilities

    1. Izenda Maps Add-On
    The Izenda Maps add-on allows rapid visualization of geographic or geo-spatial data. It is fully integrated with the rest of the Izenda report package and adds a Maps tab which allows users to add interactive maps to their reports. Contact your representative or [email protected] for limited time discounts. Izenda Maps even has rich drill-down capabilities that allow you to dive deeper with a simple hover (also requires dashboards).

    2. Streamlined Pie Charts with "Other" Slices
    The advanced properties of the Pie Chart now allow you to combine the smaller slices into a single "Other" slice. This reduces the visual complexity without throwing off the scale of the chart. Compare the difference below.

    3. Combined Bar + Line Charts
    The bar chart now allows dual visualization of multiple metrics simultaneously by adding a line for secondary data. Enabled via AdHocSettings.AllowLineOnBar = true;

    4. Stacked Bar Charts
    The stacked bar chart lets you see a breakdown of a measure based on categorical data. It is enabled with the following code: AdHocSettings.AllowStackedBarChart = true;

    5. Self-Joining Data Sources
    The self-join feature allows parent-child relationships to be accessed from the Data Sources tab. The same table can be used as a secondary child table within the Report Designer.

    6. Report Design From Dashboard View
    Dashboards now sport both view and design icons to allow quick access to both.

    7. Field Arithmetic on Dates
    Differences between dates can now be used as measures with the arithmetic feature.

    8. Simplified Multi-Tenancy
    Integrating with multi-tenant systems is now easier than ever. The following APIs have been added to facilitate common scenarios:
    AdHocSettings.CurrentUserTenantId = value;
    AdHocSettings.SchedulerTenantID = value;
    AdHocSettings.SchedulerTenantField = "AccountID";

    9. Support for SQL 2008 R2 and SQL Azure
    Izenda now supports the latest version of Microsoft's database as well as the SQL Azure service.

    10. Enhanced Performance and Compatibility for Stored Procedures
    Izenda now supports more stored procedures than ever, and runs them faster too.

    Read the article

  • Repairing The Visual Studio 2012 UI

    - by Ken Cox [MVP]
    I have sympathy for ‘Softies who don’t like the controversial ‘Metro’ UI changes but are afraid to say so. After all, who wants to commit a CLM (Career-Limiting Move) by declaring that the Emperor has no clothes (or gradients) and that ALL CAPS IN MENUS ARE DUMB? Talk about power! Here’s a higher-up (anyone got a name?) who has enforced a flat, monochrome, uninteresting user interface in Visual Studio 2012  that has been damned with faint praise by consumers. The pushback must have been enormous. Some ‘Softies disengage from the raging debate with, “It’s not my decision” while others feebly point out that the addition of some colour pixels in the icons is a real improvement over the beta version. True, I guess. With the UI pretty much locked, its down to repairing the damage. Fortunately, some Empire dissident has leaked the news to a blogger that  those SHOUTING CAPs aren’t hardcoded afterall: How To Prevent Visual Studio 2012 ALL CAPS Menus And so it goes. By RTM, I’m sure there will be many more add-ons to help us ‘de-Metro’ VS 2012 and recreate our favourite Visual Studio 2010 themes for it.

    Read the article

  • The Importance of Using !important in CSS Styles

    - by Ken Cox [MVP]
    I just lost time trying to hide a CSS border in a third-party control, so it’s worth a quick post to help others avoid similar frustration. I could plainly see in the Internet Explorer 8 developer tool that an unwanted border in the Telerik RadRotator was coming from the style classes named .RadRotator_Web20 and .rrClipRegion. So, to remove the border, I inserted my own style in the page as an override:

        .RadRotator_Web20 .rrClipRegion
        {
            border: 0px;
        }

    No change! WTF??? Okay...(read more)

    Read the article

  • An innovative architect to develop .NET business web forms (1) - rather than ASP.NET and MVC

    The article introduces an innovative architecture for developing business web forms in enterprise software, one that offers better performance, higher productivity, more configurability and easier maintainability than traditional ASP.NET or MVC development.

    Read the article

  • SharePoint 2010 Service Pack 1

    - by Ricardo Peres
    Install the updates in this order:

    1. SharePoint Foundation 2010 SP1
    2. SharePoint Foundation 2010 Language Packs SP1
    3. SharePoint Server 2010 SP1
    4. SharePoint Server 2010 Language Packs SP1
    5. June 2011 Cumulative Update

    After installing any of these packages, run the SharePoint 2010 Products Configuration Wizard. Also, you may want to read these: Service Pack 1 (SP1) for Microsoft SharePoint Foundation 2010 and Microsoft SharePoint Server 2010 (white paper); Description of SharePoint Server 2010 SP1

    Read the article
