Search Results

Search found 6185 results on 248 pages for 'aspx'.


  • Forms Authentication Across Applications Stopped Working

    - by colleski
    Hi, I have a .NET 1.1 ASP.NET application (domain.com) which has a .NET 2.0 virtual directory (domain.com/v2) beneath it; both applications run in their own app pools on the same machine running IIS 6. The web.config files for both apps are set up for Forms Authentication as described here: http://msdn.microsoft.com/en-us/library/eb0zx8fc(v=VS.80).aspx. Users would be directed to the domain.com/v2/login.aspx page, which would authenticate for both applications. This configuration had been working fine for the last few years, until I installed one of the recent Windows 2003 security updates today. Now, after authenticating under /v2, users keep getting redirected back to domain.com/v2/Login.aspx because domain.com no longer sees them as authenticated. Any ideas as to which security update would have caused this, and whether it's possible to roll it back? I've looked at a few suggestions on this (e.g. Cross app on subdomain form authentication not working) and other sites, but no luck so far. Any help would be appreciated. Thanks
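    One thing worth checking (a guess based on the MSDN article linked above, not a confirmed diagnosis): authentication tickets can only be shared across the two applications when both use identical, explicit machineKey values, and the .NET 2.0 application must be forced to the 3DES decryption that .NET 1.1 uses. A sketch of the relevant section, with placeholder key values:

        <!-- In the .NET 2.0 application's web.config (/v2): -->
        <system.web>
          <machineKey
              validationKey="PASTE-YOUR-SHARED-VALIDATION-KEY-HERE"
              decryptionKey="PASTE-YOUR-SHARED-DECRYPTION-KEY-HERE"
              validation="SHA1"
              decryption="3DES" />
        </system.web>
        <!-- The .NET 1.1 application needs the same two key values, but without
             the decryption attribute, which only exists in .NET 2.0. -->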

    Read the article

  • Can't call page method from JQuery?

    - by user356787
    I have a page called AddNews.aspx, and in the code-behind a web method called AddNews(parameters). The AddNews.aspx page uses a master page, so the button sits inside a ContentPlaceHolder. I have a button whose ID is btnSave. Here is the jQuery code:

        $(function() {
            $("[id$='_btnSave']").click(function() {
                $.ajax({
                    type: "POST",
                    contentType: "application/json; charset=utf-8",
                    data: $.toJSON(veriler),
                    url: "AddNews.aspx/AddNews",
                    dataType: "json",
                    success: function(result) {
                        $("#result").html('News added');
                    },
                    error: function() {
                        alert('Problem');
                    }
                });
            });
        });

    The button click fires now, but it doesn't call the page method. What's the problem?
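    The usual culprit with page methods is the method declaration: ASP.NET only routes a request for AddNews.aspx/AddNews to a method that is public, static, and decorated with [WebMethod], and the property names in the posted JSON must match the method's parameter names exactly. A minimal sketch of a callable code-behind follows; the class name and parameters are hypothetical, and note that a method cannot share the name of its enclosing class:

        using System.Web.Services;

        public partial class AddNewsPage : System.Web.UI.Page
        {
            // Page methods must be public and static; an instance method
            // will never be matched by an AddNews.aspx/AddNews request.
            [WebMethod]
            public static string AddNews(string title, string body)
            {
                // ... persist the news item here ...
                return "News added";
            }
        }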

    Read the article

  • vs2003: identify the referrer page in asp.net

    - by dotnet-practitioner
    Hi, in VS2003 I am trying to find out the particular page a request is coming from; I want to identify the exact .aspx page name. Is there a way to get only the page name, or to somehow strip everything else out? Currently I am using the following instruction:

        string referencepage = HttpContext.Current.Request.UrlReferrer.ToString();

    and I get the following result:

        "http://localhost/MyPage123.aspx?myval1=3333&myval2=4444"

    I want to get the result back without any query string parameters and be able to identify the page MyPage123.aspx accurately. How do I do that? Thanks
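    One way to do this (a sketch using APIs already available in .NET 1.1): Uri.AbsolutePath returns the path portion without the query string, and Path.GetFileName extracts the last segment of that path.

        protected void Page_Load(object sender, System.EventArgs e)
        {
            System.Uri referrer = Request.UrlReferrer;
            if (referrer != null) // null when no referrer header was sent
            {
                // AbsolutePath is "/MyPage123.aspx"; the query string is excluded.
                string pageName = System.IO.Path.GetFileName(referrer.AbsolutePath);
                // pageName is now "MyPage123.aspx"
            }
        }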

    Read the article

  • Return from jQuery Ajax Method

    - by Lijo
    I have a button named searchReportButton. On its click, I need to check a session value on the server using an Ajax web method, "Error.aspx/CheckSessionExpiry". This is done in the checkSession JavaScript function. checkSession is not returning anything right now - instead it handles the required operation based on the result (redirecting to the error page). I need to return the result of the Ajax web method to the main caller. How can I do this?

    Main caller:

        searchReportButton.click(function () {
            checkSession();
            // Remaining code
        });

    Helper:

        function checkSession() {
            var win = 101;
            $.ajax({
                type: "POST",
                url: "Error.aspx/CheckSessionExpiry",
                data: '{"winNumber": "' + win + '"}',
                contentType: "application/json; charset=utf-8",
                dataType: "json",
                success: handleSessionResult
            });
        }

    Result helper:

        function handleSessionResult(result) {
            if (result.hasOwnProperty("d")) {
                result = result.d;
            }
            if (result == "ACTIVE") {
                window.location.href = "Error.aspx";
                return false;
            }
            //alert(result);
        }
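    Because $.ajax is asynchronous, checkSession cannot simply return the server's answer; the usual pattern is to give checkSession a callback parameter and invoke it from the success handler instead of hard-coding handleSessionResult. For completeness, the server side of this call is a page method; a minimal sketch of what Error.aspx.cs might contain (the session key is an assumption):

        using System.Web;
        using System.Web.Services;

        public partial class Error : System.Web.UI.Page
        {
            // EnableSession = true is required for a page method to read session state.
            [WebMethod(EnableSession = true)]
            public static string CheckSessionExpiry(string winNumber)
            {
                // "UserId" is a hypothetical session key; substitute whatever your app stores.
                return HttpContext.Current.Session["UserId"] == null ? "EXPIRED" : "ACTIVE";
            }
        }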

    Read the article

  • Package and Publish Web Sites with TFS 2010 Build Server

    - by jdanforth
    To package and publish web sites with TFS 2010 Build Server, you can use MSDeploy and some of the new MSBuild arguments. For example, the following does all the work for you:

        /p:DeployOnBuild=True
        /p:DeployTarget=MsDeployPublish
        /p:MSDeployPublishMethod=InProc
        /p:CreatePackageOnPublish=True
        /p:DeployIisAppPath="Default Web Site/WebApplication1"
        /p:MsDeployServiceUrl=localhost

    Unfortunately, these arguments are not very well documented yet, so please feel free to comment with pointers to good docs. You can enter these arguments when editing the Build Definition, under the Process tab in the Advanced section. If you're working with these things, I'm sure you've not missed the PDC 2009 presentation by Vishal Joshi about MSDeploy. A few links on the topic:

        http://stackoverflow.com/questions/2636153/where-is-the-documentation-for-msbuild-arguments-to-run-msdeploy
        http://blogs.msdn.com/aspnetue/archive/2010/03/05/automated-deployment-in-asp-net-4-frequently-asked-questions.aspx
        http://www.hanselman.com/blog/WebDeploymentMadeAwesomeIfYoureUsingXCopyYoureDoingItWrong.aspx

    Read the article

  • Microsoft® MVP (Most Valuable Professional) in XNA / DirectX.

    - by uditha
    I got an email from Microsoft saying that I have been selected as an MVP in the XNA/DirectX category. It was a really memorable moment for me, and I am proud to be a Microsoft MVP. There are now a total of 44 MVPs for XNA/DirectX.

    https://mvp.support.microsoft.com/communities/mvp.aspx?product=1&competency=XNA%2fDirectX

    About the MVP Program: https://mvp.support.microsoft.com/gp/aboutmvp
    My MVP Profile: https://mvp.support.microsoft.com/profile/Uditha
    Official SEA MVP blog: http://seamvpblogaholic.spaces.live.com/default.aspx

    Read the article

  • DirectX11 Swap Chain RGBA vs BGRA Format

    - by Nathan
    I was wondering if anyone could elaborate on something that's been bugging me. In DirectX 9 the main supported back buffer formats were D3DFMT_X8R8G8B8 and D3DFMT_A8R8G8B8 (both BGRA in layout): http://msdn.microsoft.com/en-us/library/windows/desktop/bb174314(v=vs.85).aspx. With the initial version of DirectX 10 there was no support for BGRA, and all the textbooks and online tutorials recommend DXGI_FORMAT_R8G8B8A8_UNORM (RGBA in layout). Now with DirectX 11 BGRA is supported again, and it seems as if Microsoft recommends using a BGRA format as the back buffer format: http://msdn.microsoft.com/en-us/library/windows/apps/hh465096.aspx. Are there any suggestions, or are there performance implications of using one or the other? (I assume not: by specifying the format of the underlying resource, the runtime knows what bits you're passing through and can then infer how to utilise them based on the format.)

    Read the article

  • Creating HTML5 Offline Web Applications with ASP.NET

    - by Stephen Walther
    The goal of this blog entry is to describe how you can create HTML5 Offline Web Applications when building ASP.NET web applications. I describe the method that I used to create an offline web application when building the JavaScript Reference application.

    You can read about the HTML5 Offline Web Application standard by visiting the following links:

        Offline Web Applications
        Firefox Offline Web Applications
        Safari Offline Web Applications

    Currently, the HTML5 Offline Web Applications feature works with all modern browsers with one important exception. You can use Offline Web Applications with Firefox, Chrome, and Safari (including iPhone Safari). Unfortunately, however, Internet Explorer does not support Offline Web Applications (not even IE 9).

    Why Build an HTML5 Offline Web Application?

    The official reason to build an Offline Web Application is so that you do not need to be connected to the Internet to use it. For example, you can use the JavaScript Reference application when flying in an airplane, riding a subway, or hiding in a cave in Borneo. The JavaScript Reference application works great on my iPhone even when I am completely disconnected from any network. The following screenshot shows the JavaScript Reference application running on my iPhone when airplane mode is enabled (notice the little orange airplane).

    Admittedly, it is becoming increasingly difficult to find locations where you can't get Internet access. A second, and possibly better, reason to create Offline Web Applications is speed. An Offline Web Application must be downloaded only once. After it gets downloaded, all of the files required by your web application (HTML, CSS, JavaScript, images) are stored persistently on your computer. Think of Offline Web Applications as providing you with a super browser cache.

    Normally, when you cache files in a browser, the files are cached on a file-by-file basis. For each HTML, CSS, image, or JavaScript file, you specify how long the file should remain in the cache by setting cache headers. Unlike the normal browser caching mechanism, the HTML5 Offline Web Application cache is used to specify a caching policy for an entire set of files. You use a manifest file to list the files that you want to cache, and these files are cached until the manifest is changed.

    Another advantage of using the HTML5 offline cache is that the HTML5 standard supports several JavaScript events and methods related to the offline cache. For example, you can be notified in your JavaScript code whenever the offline application has been updated. You can use JavaScript methods, such as the ApplicationCache.update() method, to update the cache programmatically.

    Creating the Manifest File

    The HTML5 Offline Cache uses a manifest file to determine the files that get cached.
    Here’s what the manifest file looks like for the JavaScript Reference application:

        CACHE MANIFEST
        # v30

        Default.aspx

        # Standard Script Libraries
        Scripts/jquery-1.4.4.min.js
        Scripts/jquery-ui-1.8.7.custom.min.js
        Scripts/jquery.tmpl.min.js
        Scripts/json2.js

        # App Scripts
        App_Scripts/combine.js
        App_Scripts/combine.debug.js

        # Content (CSS & images)
        Content/default.css
        Content/logo.png
        Content/ui-lightness/jquery-ui-1.8.7.custom.css
        Content/ui-lightness/images/ui-bg_glass_65_ffffff_1x400.png
        Content/ui-lightness/images/ui-bg_glass_100_f6f6f6_1x400.png
        Content/ui-lightness/images/ui-bg_highlight-soft_100_eeeeee_1x100.png
        Content/ui-lightness/images/ui-icons_222222_256x240.png
        Content/ui-lightness/images/ui-bg_glass_100_fdf5ce_1x400.png
        Content/ui-lightness/images/ui-bg_diagonals-thick_20_666666_40x40.png
        Content/ui-lightness/images/ui-bg_gloss-wave_35_f6a828_500x100.png
        Content/ui-lightness/images/ui-icons_ffffff_256x240.png
        Content/ui-lightness/images/ui-icons_ef8c08_256x240.png
        Content/browsers/c8.png
        Content/browsers/es3.png
        Content/browsers/es5.png
        Content/browsers/ff3_6.png
        Content/browsers/ie8.png
        Content/browsers/ie9.png
        Content/browsers/sf5.png

        NETWORK:
        Services/EntryService.svc
        http://superexpert.com/resources/JavaScriptReference/

    A cache manifest file always starts with the line of text CACHE MANIFEST. In the manifest above, all of the CSS, image, and JavaScript files required by the JavaScript Reference application are listed. For example, the Default.aspx ASP.NET page, the jQuery library, the jQuery UI library, and several images are listed. Notice that you can add comments to a manifest by starting a line with the hash character (#). I use comments in the manifest above to group JavaScript and image files. Finally, notice that there is a NETWORK: section of the manifest. You list any file that you do not want to cache (any file that requires network access) in this section. In the manifest above, the NETWORK: section includes the URL for a WCF service named EntryService.svc. This service is called to get the JavaScript entries displayed by the JavaScript Reference.

    There are two important things that you need to be aware of when using a manifest file. First, all relative URLs listed in a manifest are resolved relative to the manifest file. The URLs listed in the manifest above are all resolved relative to the root of the application because the manifest file is located in the application root.

    Second, whenever you make a change to the manifest file, browsers will download all of the files contained in the manifest (all of them). For example, if you add a new file to the manifest, then any browser that supports the Offline Cache standard will detect the change in the manifest and download all of the files listed in the manifest automatically. If you make changes to files in the manifest (for example, modify a JavaScript file), then you need to make a change in the manifest file in order for the new version of the file to be downloaded. The standard way of updating a manifest file is to include a comment with a version number. The manifest above includes a # v30 comment. If you make a change to a file, then you need to modify the comment to be # v31 in order for the new file to be downloaded.

    When Are Updated Files Downloaded?

    When you make changes to a manifest, the changes are not reflected the very next time you open the offline application in your web browser. Your web browser will download the updated files in the background.
    This can be very confusing when you are working with JavaScript files. If you make a change to a JavaScript file, and you have cached the application offline, then the changes to the JavaScript file won't appear when you reload the application.

    The HTML5 standard includes new JavaScript events and methods that you can use to track changes and make changes to the Application Cache. You can use the ApplicationCache.update() method to initiate an update to the application cache, and you can use the ApplicationCache.swapCache() method to switch to the latest version of a cached application.

    My heartfelt recommendation is that you do not enable your application for offline storage until after you finish writing your application code. Otherwise, debugging the application can become a very confusing experience.

    Offline Web Applications versus Local Storage

    Be careful not to confuse the HTML5 Offline Web Application feature and the HTML5 Local Storage (aka DOM Storage) feature. The JavaScript Reference application uses both features. HTML5 Local Storage enables you to store key/value pairs persistently. Think of Local Storage as a super cookie. I describe how the JavaScript Reference application uses Local Storage to store the database of JavaScript entries in a separate blog entry. Offline Web Applications enable you to store static files persistently. Think of Offline Web Applications as a super cache.

    Creating a Manifest File in an ASP.NET Application

    A manifest file must be served with the MIME type text/cache-manifest. In order to serve the JavaScript Reference manifest with the proper MIME type, I added two files to the JavaScript Reference application project:

        Manifest.txt - This text file contains the actual manifest file.
        Manifest.ashx - This generic handler sends the Manifest.txt file with the MIME type text/cache-manifest.

    Here's the code for the generic handler:

        using System.Web;

        namespace JavaScriptReference {

            public class Manifest : IHttpHandler {

                public void ProcessRequest(HttpContext context) {
                    context.Response.ContentType = "text/cache-manifest";
                    context.Response.WriteFile(context.Server.MapPath("Manifest.txt"));
                }

                public bool IsReusable {
                    get { return false; }
                }
            }
        }

    The Default.aspx file contains a reference to the manifest. The opening HTML tag in the Default.aspx file looks like this:

        <html manifest="Manifest.ashx">

    Notice that the HTML tag contains a manifest attribute that points to the Manifest.ashx generic handler. Internet Explorer simply ignores this attribute. Every other modern browser will download the manifest when the Default.aspx page is requested.

    Seeing the Offline Web Application in Action

    The experience of using an HTML5 offline web application is different in different browsers. When you first open the JavaScript Reference application with Firefox, you get a warning: notice that you are provided with the choice of whether you want to use the application offline or not. Browsers other than Firefox, such as Chrome and Safari, do not provide you with this choice; Chrome and Safari will create an offline cache automatically. If you click the Allow button, then Firefox will download all of the files listed in the manifest. You can view the files contained in the Firefox offline application cache by typing about:cache in the Firefox address bar, and you can view the actual items being cached by clicking the List Cache Entries link.

    The Offline Web Application experience is different in the case of Google Chrome.
    You can view the entries in the offline cache by opening the Developer Tools (hit Shift+CTRL+I), selecting the Storage tab, and selecting Application Cache. Notice that you can view the status of the Application Cache. In the screenshot above, the status is UNCACHED, which means that the files listed in the manifest have not been downloaded and cached yet. The different possible values for the status are included in the HTML5 Offline Web Application standard:

        UNCACHED - The Application Cache has not been initialized.
        IDLE - The Application Cache is not currently being updated.
        CHECKING - The Application Cache is being fetched and checked for updates.
        DOWNLOADING - The files in the Application Cache are being updated.
        UPDATEREADY - There is a new version of the application.
        OBSOLETE - The contents of the Application Cache are obsolete.

    Summary

    In this blog entry, I provided a description of how you can use the HTML5 Offline Web Application feature in the context of an ASP.NET application. I described how this feature is used with the JavaScript Reference application to store the entire application on a user's computer. By taking advantage of this new feature of the HTML5 standard, you can improve the performance of your ASP.NET web applications by requiring users of your web application to download your application once and only once. Furthermore, you can enable users to take advantage of your applications anywhere -- regardless of whether or not they are connected to the Internet.

    Read the article

  • What’s new in IIS8, Perf, Indexing Service-Week 49

    - by OWScott
    You can find this week's video here. After some delays in the publishing process, week 49 is finally live. This week I'm taking Q&A from viewers, starting with what's new in IIS 8, a question on enable32BitAppOnWin64, performance settings for ASP.NET, the ARR Helper, and Indexing Service.

    Starting this week, for the remaining four weeks of the 52-week series, I'll be taking questions and answers from the viewers. A number of questions have already come in. This week we look at five topics.

    Pre-topic: We take a look at the new features in IIS 8. Last week Internet Information Services (IIS) 8 Beta was released to the public, and this week's video touches on the upcoming features in the next version of IIS. Here's a link to the blog post which was mentioned in the video.

    Question 1: In a number of places (http://learn.iis.net/page.aspx/201/32-bit-mode-worker-processes/, http://channel9.msdn.com/Events/MIX/MIX08/T06), I've seen that enable32BitAppOnWin64 is recommended for performance reasons. I'm guessing it has to do with memory usage... but I never could find a detailed explanation of why this is recommended (even Microsoft books are vague on this topic - they just say do it, but provide no reason why it should be done). Do you have any insight into this? (Predrag Tomasevic)

    Question 2: Do you have any recommendations on modifying aspnet.config and machine.config to deliver better performance when it comes to a high number of concurrent connections? I've implemented recommendations for modifying machine.config from this article (http://www.codeproject.com/KB/aspnet/10ASPNetPerformance.aspx - ASP.NET Process Configuration Optimization section)... but I would gladly listen to more recommendations if you have them. (Predrag Tomasevic)

    Question 3: Could you share more of your experience with the ARR Helper? I'm specifically interested in configuring the ARR Helper - for example, how to accept X-Forwarded-For only from certain IPs (proxies you trust). (Predrag Tomasevic)

    Question 4: What is the replacement for Indexing Service to use in coding web search pages on a Windows 2008 R2 server? (Susan Williams) Here's the link that was mentioned: http://technet.microsoft.com/en-us/library/ee692804.aspx

    This is now week 49 of a 52-week series for the web pro. You can view past and future weeks here: http://dotnetslackers.com/projects/LearnIIS7/ You can find this week's video here.

    Read the article

  • Authorize.Net, Silent Posts, and URL Rewriting Don't Mix

    The too long, didn't read synopsis: if you use Authorize.Net and its silent post feature and it stops working, make sure that if your website uses URL rewriting to strip or add a www to the domain name, the URL you specify for the silent post matches the URL rewriting rule, because Authorize.Net's silent post feature won't resubmit the post request to the URL specified via the redirect response.

    I have a client that uses Authorize.Net to manage and bill customers. Like many payment gateways, Authorize.Net supports recurring payments. For example, a website may charge members a monthly fee to access their services. With Authorize.Net you can provide the billing amount and schedule, and at each interval Authorize.Net will automatically charge the customer's credit card and deposit the funds to your account.

    You may want to do something whenever Authorize.Net performs a recurring payment. For instance, if the recurring payment charge was a success you would extend the customer's service; if the transaction was denied then you would cancel their service (or whatever). To accommodate this, Authorize.Net offers a silent post feature. Properly configured, Authorize.Net will send an HTTP request that contains details of the recurring payment transaction to a URL that you specify. This URL could be an ASP.NET page on your server that then parses the data from Authorize.Net and updates the specified customer's account accordingly. (Of course, you can always view the history of recurring payments through the reporting interface on Authorize.Net's website; the silent post feature gives you a way to programmatically respond to a recurring payment.)

    Recently, this client of mine informed me that several paying customers were telling him that their access to the site had been cut off even though their credit cards had been recently billed. Looking through our logs, I noticed that we had not shown any recurring payment log activity for over a month. I figured one of two things must be going on: either Authorize.Net wasn't sending us the silent post requests anymore, or the page that was processing them wasn't doing so correctly.

    I started by verifying that our Authorize.Net account was properly set up to use the silent post feature and that it was pointing to the correct URL. Authorize.Net's site indicated the silent post was configured and that recurring payment transaction details were being sent to http://example.com/AuthorizeNetProcessingPage.aspx. Next, I wanted to determine what information was getting sent to that URL. The application was set up to log the parsed results of the Authorize.Net request, such as what customer the recurring payment applied to; however, we were not logging the actual HTTP request coming from Authorize.Net. I contacted Authorize.Net's support to inquire whether they logged the HTTP request sent via the silent post feature and was told that they did not. I decided to add a bit of code to log the incoming HTTP request, which you can do by using the Request object's SaveAs method. This allowed me to save every incoming HTTP request to the silent post page to a text file on the server.
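    The logging step described above takes only a couple of lines in the processing page; a minimal sketch (the file naming under App_Data is an assumption):

        using System;
        using System.Web.UI;

        public partial class AuthorizeNetProcessingPage : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                // Save the raw incoming request, headers included, to a text
                // file so we can see exactly what Authorize.Net is sending.
                string path = Server.MapPath("~/App_Data/silent-post-" + DateTime.Now.Ticks + ".txt");
                Request.SaveAs(path, true); // true = include the HTTP headers
            }
        }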
    Upon the next recurring payment, I was able to see the HTTP request being received by the page:

        GET /AuthorizeNetProcessingPage.aspx HTTP/1.1
        Connection: Close
        Accept: */*
        Host: www.example.com

    That was it. Two things alarmed me: first, the request was obviously a GET and not a POST; second, there was no POST body (obviously), which is where Authorize.Net passes along the details of the recurring payment transaction. What stuck out was the Host header, which differed slightly from the silent post URL configured in Authorize.Net. Specifically, the Host header in the above logged request pointed to www.example.com, whereas the Authorize.Net configuration used example.com (no www).

    About a month ago - the same time these recurring payment transaction details stopped being processed by our ASP.NET page - we had implemented IIS 7's URL rewriting feature to permanently redirect all traffic for example.com to www.example.com. Could that be the problem?

    I contacted Authorize.Net's support again and asked them if their silent post algorithm would follow the 301 HTTP response and repost the recurring payment transaction details. They said yes, the silent post would follow redirects. Their reports didn't jibe with my observations, so I went ahead and updated our Authorize.Net configuration to point to http://www.example.com/AuthorizeNetProcessingPage.aspx instead of http://example.com/AuthorizeNetProcessingPage.aspx. And, I'm happy to report, recurring payments are being processed correctly again!

    If you use Authorize.Net and the silent post feature, and you notice that your processing page is no longer working, make sure you are not using any URL rewriting rules that may conflict with the silent post URL configuration. Hope this saves someone the time it took me to get to the bottom of this. Happy Programming!

    Read the article

  • Taking advantage of Windows Azure CDN and Dynamic Pages in ASP.NET - Caching content from hosted services

    - by Shawn Cicoria
    With the updates to Windows Azure CDN announced this week [1], I wanted to illustrate the capability with a working sample that serves up dynamic content from an ASP.NET site hosted in a WebRole. First, to get a good overview of the capability, you can read the Overview of the Windows Azure CDN [2] on MSDN.

    When you set up the ability to cache content from a hosted service, the requirement is to provide a path to your role's DNS endpoint that ends in the path "/cdn", and you then map the CDN to that service. What Windows Azure CDN does is allow you to route requests through the CDN to your host: the CDN makes a request to your host on your client's behalf. The requirement is still that your client, and any URLs that are to be serviced through the CDN, have to use the CDN DNS name and not your host - no different from what the CDN does for Blob storage. The following two URLs are samples of how the client needs to issue the requests:

        Windows Azure hosted service URL: http://myHostedService.cloudapp.net/cdn/music.aspx - for regular "dynamic" content
        Windows Azure CDN URL: http://<identifier>.vo.msecnd.net/music.aspx - for CDN "cacheable" content

    The first URL paths the request directly to your host in the Azure datacenter. The second URL paths the request through the CDN infrastructure, where the CDN will make the determination to request the content on behalf of the client from the Azure datacenter and your host on the /cdn path.

    The big advantage here is that you can apply logic to your content creation. What's important is emitting CDN-friendly headers that allow the CDN to request and re-request only when you designate, based upon its rules of "staleness" as described in the overview page.

    With IIS 7.5 (which you get when running under OS Family "2" in your service configuration) there is an underlying issue when the managed module "OutputCache" is enabled: in order to emit a good header for your content, you need to remove it, as my sample does to provide CDN-friendly headers. By default, when the OutputCache managed module is loaded and you use HttpResponse.CachePolicy to set the HTTP headers for "max-age" with HttpCacheability set to "Public", you will NOT get the "max-age" emitted as part of the "Cache-Control:" header. Instead, the OutputCache module will remove "max-age" and just emit "public". (It works fine when cacheability is set to "private".) To work around the issue and ensure your code emits the full max-age along with the public option, remove the module as follows:

        <system.webServer>
          <modules runAllManagedModulesForAllRequests="true">
            <remove name="OutputCache"/>
          </modules>
        </system.webServer>

        Response.Cache.SetCacheability(HttpCacheability.Public);
        Response.Cache.SetMaxAge(TimeSpan.FromMinutes(rv));

    In the attached solution, the way I approached it was to have a VirtualApplication under the root site that has its own web.config - this VirtualApplication is the /cdn path of the site, and when deployed to Azure as a Web Role it surfaces as a distinct IIS Application, along with a separate AppDomain.

    The CDN sample is a simple Web Forms site whose default landing page contains three IFrames that host:

        1. Content direct from the host at http://xxxx.cloudapp.net/cdn
        2. Content via the CDN at http://azxxx.vo.msecnd.net
        3. A simple list of recent requests, showing where each request came from.
    When you run the sample and hit the page for the first time, both the host and the CDN cause two initial requests to hit the host. You won't see the first requests in the list because of timing - but if you refresh, you'll see that the list shows the two initial requests:

        1. sourced directly from the browser to the HOST
        2. sourced via the CDN

    The picture above shows the call-outs for each of those requests - green rows showing requests coming directly to the HOST, yellow showing the CDN request. The IP addresses of the green items are direct from the client, whereas the CDN request comes from the CDN data center.

    As you refresh the page (hit Ctrl+F5 to force a full refresh and avoid a "304 - Not Modified"), you'll see that the request to the HOST gets processed directly, but the request to the CDN endpoint is serviced directly from the CDN and doesn't incur any additional request back to the HOST.

    The following are the headers from the CDN response:

        HTTP/1.1 200 OK
        Age: 13
        Cache-Control: public, max-age=300
        Connection: keep-alive
        Content-Length: 6212
        Content-Type: image/jpeg; charset=utf-8
        Date: Fri, 11 Mar 2011 20:47:14 GMT
        Expires: Fri, 11 Mar 2011 20:52:01 GMT
        Last-Modified: Fri, 11 Mar 2011 20:47:02 GMT
        Server: Microsoft-IIS/7.5
        X-AspNet-Version: 4.0.30319
        X-Powered-By: ASP.NET

    The following are the headers from the HOST response:

        HTTP/1.1 200 OK
        Cache-Control: public, max-age=300
        Content-Length: 6189
        Content-Type: image/jpeg; charset=utf-8
        Date: Fri, 11 Mar 2011 20:47:15 GMT
        Last-Modified: Fri, 11 Mar 2011 20:47:02 GMT
        Server: Microsoft-IIS/7.5
        X-AspNet-Version: 4.0.30319
        X-Powered-By: ASP.NET

    You can see that with the CDN request, the countdown (Age) starts for aging the content.

    The full sample is located here: CDNSampleSite.zip

    [1] http://blogs.msdn.com/b/windowsazure/archive/2011/03/09/now-available-updated-windows-azure-sdk-and-windows-azure-management-portal.aspx
    [2] http://msdn.microsoft.com/en-us/library/ff919703.aspx

    Read the article

  • Management and Monitoring Tools for Windows Azure

    - by BuckWoody
    With such a large platform, Windows Azure has a lot of moving parts. We've done our best to keep the interface as simple as possible, while giving you the most control and visibility we can. However, as with most Microsoft products, there are multiple ways to do something - and I've always found that to be a good strength. Depending on the situation, I might want a graphical interface, a command-line interface, or just an API so I can incorporate the management into my own tools, or have third-party companies write other tools.

    While by no means exhaustive, I thought I might put together a quick list of a few tools you can use to manage and monitor Windows Azure components, from our IaaS, SaaS and PaaS offerings. Some of the products focus on one area more than another, but all are available today. I'll try to maintain this list to keep it current, but make sure you check the date of this post's update - if it's more than six months old, it's most likely out of date. Things move fast in the cloud.

    The Windows Azure Management Portal

    The primary tool for managing Windows Azure is our portal - most everything you need is there, from creating new services to querying a database. There are two versions as of this writing: a Silverlight client version, and a newer HTML5 version. The latter is being updated constantly to reach parity with the Silverlight client. There's a balance in this portal between simplicity and power - we're following the "less is more" approach, with increasing levels of detail as you work through the portal, rather than overwhelming you with a single, long "more is more" page. You can find the Portal here: http://windowsazure.com (then click "Log In" and then "Portal").

    Windows Azure Management API

    You can also use programming tools to either write your own interface, or simply provide management functions directly within your solution. You have two options: the more universal REST APIs, which are a bit more complex but work with any system that can write to them, or the more approachable .NET API calls in code.

        Reference for the APIs: http://msdn.microsoft.com/en-us/library/windowsazure/ee460799.aspx
        All class libraries, for each part of Windows Azure: http://msdn.microsoft.com/en-us/library/ee393295.aspx

    PowerShell Command-lets

    PowerShell is one of the most powerful scripting languages I've used with Windows - and it's baked into all of our products. When you need to work with multiple servers, scripting is really the only way to go, and the Windows Azure PowerShell Command-lets allow you to work across most any part of the platform - they can even be used within the services themselves. You can do everything with them, from creating a new IaaS, PaaS or SaaS service, to controlling them, and even working with security and more.

        More about the Command-lets: http://wappowershell.codeplex.com/documentation (older link, still works, and will point you to the new ones as well)
        Command-line utilities for other operating systems: https://www.windowsazure.com/en-us/manage/downloads/
        Video walkthrough of using the Command-lets: http://channel9.msdn.com/Events/BUILD/BUILD2011/SAC-859T

    System Center

    System Center is actually a suite of graphical tools you can use to manage, deploy, control, monitor and tune software from Microsoft and even other platforms.
    This will be the primary tool we recommend for managing a hybrid or contiguous management process - and as time goes on you'll see more and more features put into System Center for the entire Windows Azure suite of products. You can find the Management Pack and README for it here: http://www.microsoft.com/en-us/download/details.aspx?id=11324

    SQL Server Management Studio / Data Tools / Visual Studio

    SQL Server ships with built-in management and development tools, and since version 2008 R2, you can use them to manage Windows Azure Databases. Visual Studio also lets you connect to and manage portions of Windows Azure as well as Windows Azure Databases.

        More about Visual Studio: http://msdn.microsoft.com/en-us/library/windowsazure/ee405484
        More about the SQL tools: http://msdn.microsoft.com/en-us/library/windowsazure/ee621784.aspx

    Vendor-Provided Tools

    Microsoft does not suggest or endorse a specific third-party product. We do, however, use them, and see lots of other customers use them. You can browse to these sites to learn more, and chat with their folks directly on how they support Windows Azure.

        Cerebrata: Tools for managing from the command line, graphical diagnostics, graphical storage management - http://www.cerebrata.com/
        Quest Cloud Tools: Monitoring, storage management, and costing tools - http://communities.quest.com/community/cloud-tools
        Paraleap: Monitoring tool - http://www.paraleap.com/AzureWatch
        Cloudgraphs: Monitoring tool - http://www.cloudgraphs.com/
        Opstera: Monitoring for Windows Azure and a scale-out pattern manager - http://www.opstera.com/products/Azureops/
        Compuware: SaaS performance monitoring, load testing - http://www.compuware.com/application-performance-management/gomez-apm-products.html
        SOASTA: Penetration and security testing - http://www.soasta.com/cloudtest/enterprise/
        LoadStorm: Load-testing tool - http://loadstorm.com/windows-azure

    Open-Source Tools

    This is probably the most specific set of tools, and the list I'll have to maintain most often. Smaller projects have a way of coming and going, so I'll try to make sure this list is current.

        Windows Azure MMC (I actually use this one a lot): http://wapmmc.codeplex.com/
        Windows Azure Diagnostics Monitor: http://archive.msdn.microsoft.com/wazdmon
        Azure Application Monitor: http://azuremonitor.codeplex.com/
        Azure Web Log: http://www.xentrik.net/software/azure_web_log.html
        Cloud Ninja: Multi-tenant billing and performance monitor - http://cnmb.codeplex.com/
        Cloud Samurai: Multi-tenant management - http://cloudsamurai.codeplex.com/

    If you have additions to this list, please post them as a comment and I'll research and then add them. Thanks!

    Read the article

  • Windows Azure Use Case: Fast Acquisitions

    - by BuckWoody
    This is one in a series of posts on when and where to use a distributed architecture design in your organization's computing needs. You can find the main post here: http://blogs.msdn.com/b/buckwoody/archive/2011/01/18/windows-azure-and-sql-azure-use-cases.aspx

    Description: Many organizations absorb, take over, or merge with other organizations. In these cases, one of the most difficult parts of the process is the merging or changing of the IT systems that the employees use to do their work, process payments, and even get paid. Normally this means that the two companies have disparate systems, and several approaches can be used to bring the two organizations' technology together.

    An organization may choose to retain both systems and manage them separately. The advantage here is speed, and keeping the profit/loss sheets separate. Another choice is to slowly "sunset" or stop using one organization's system, cutting over to the other system immediately or at a later date. Although a popular choice, one of the most difficult methods is to extract data and processes from one system and import them into the other. Employees on the transitioning system have to be trained on the new one, the data must be examined and cleansed, and there is inevitable disruption when this happens. Still another option is to integrate the systems. This may prove to be as much work as a transitional strategy, but may have less impact on the users or the balance sheet.

    Implementation: A distributed computing paradigm can be a good strategic solution for most of these strategies. Retaining both systems is made simpler by allowing the users at the second organization immediate access to the new system, because security accounts can be created quickly inside an application. There is no need to set up a VPN or any connection other than to the Internet. Having the users stop using one system and start with the other is also simple in Windows Azure, for the same reason.

    Extracting data to Azure holds the same limitations as an on-premise system, and may even be more problematic because of the large data transfers that might be required. In a distributed environment, you pay for the data transfer, so a mixed migration strategy is not recommended. However, if the data is slowly migrated over time with a defined cutover, this can be an effective strategy.

    If done properly, an integration strategy works very well for a distributed computing environment like Windows Azure. If the Azure code is architected as a series of services, then endpoints can expose the services into and out of not only the Azure platform, but internally as well. This is a form of the Hybrid Application use case documented here.

    References:

        Designing for Cloud-Optimized Architecture: http://blogs.msdn.com/b/dachou/archive/2011/01/23/designing-for-cloud-optimized-architecture.aspx
        5 Enterprise Steps for Adopting a Platform as a Service: http://blogs.msdn.com/b/davidmcg/archive/2010/12/02/5-enterprise-steps-for-adopting-a-platform-as-a-service.aspx?wa=wsignin1.0

    Read the article

  • HPC Server Dynamic Job Scheduling: when jobs spawn jobs

    - by JoshReuben
    HPC Job Types

    HPC has three types of jobs (http://technet.microsoft.com/en-us/library/cc972750(v=ws.10).aspx):

        Task Flow - a vanilla sequence
        Parametric Sweep - concurrently run multiple instances of the same program, each with a different work unit input
        MPI - message passing between master & slave tasks

    But when you try to go outside the box - job tasks that spawn jobs, blocking the parent task - you run the risk of resource starvation, deadlocks, and recursive, non-converging or exponential blow-up. The solution to this is to write some performance monitoring and job scheduling code. You can do this in two ways:

        Manually control scheduling - allocate/de-allocate resources, change job priorities, pause & resume tasks, restrict long-running tasks to specific compute clusters
        Semi-automatically - set threshold params for scheduling

    How - Control Job Scheduling

    In order to manage the tasks and resources that are associated with a job, you will need to access the ISchedulerJob interface (http://msdn.microsoft.com/en-us/library/microsoft.hpc.scheduler.ischedulerjob_members(v=vs.85).aspx). This allows you to control how a job is run - you can access and tweak the following features (a code sketch follows this list):

        max/min resource values
        whether job resources can grow/shrink, and whether jobs can be pre-empted
        whether the job is exclusive per node
        the creator process id & the job pool
        timestamp of job creation & completion
        job priority, hold time & run time limit
        re-queue count
        job progress
        max/min number of cores, nodes, sockets, RAM
        dynamic task list - can add/cancel jobs on the fly
        job counters

    When - Poll Perf Counters

    Tweaking the job scheduler should be done on the basis of resource utilization according to PerfMon counters - HPC exposes two Perf objects, Compute Clusters and Compute Nodes (http://technet.microsoft.com/en-us/library/cc720058(v=ws.10).aspx). You can monitor running jobs according to dynamic thresholds - use your own discretion:

        Percentage processor time
        Number of running jobs
        Number of running tasks
        Total number of processors
        Number of processors in use
        Number of processors idle
        Number of serial tasks
        Number of parallel tasks

    Design Your Algorithms Correctly

    Finally, don't assume you have unlimited compute resources in your cluster - design your algorithms with the following factors in mind:

        Branching factor (http://en.wikipedia.org/wiki/Branching_factor) - dynamically optimize the number of children per node
        Cutoffs to prevent explosions (http://en.wikipedia.org/wiki/Limit_of_a_sequence) - not all functions converge after n attempts; you also need a threshold of "good enough" and an eye on diminishing returns
        Heuristic shortcuts (http://en.wikipedia.org/wiki/Heuristic) - sometimes an exhaustive search is impractical and shortcuts are suitable
        Pruning (http://en.wikipedia.org/wiki/Pruning_(algorithm)) - remove or de-prioritize unnecessary tree branches
        Avoid local minima/maxima (http://en.wikipedia.org/wiki/Local_minima) - sometimes an algorithm can't converge because it gets stuck in a local saddle; try simulated annealing, hill climbing or genetic algorithms to get out of these ruts
        Watch out for rounding errors (http://en.wikipedia.org/wiki/Round-off_error) - multiple parallel iterations can quickly amplify and blow up your algo! Use an epsilon; avoid floating point errors, truncations, and approximations

    Happy Coding!
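    As a sketch of the manual approach, the HPC Pack scheduler API can be driven from C# roughly as follows; the head node name, job id, and threshold values are assumptions, so adjust them to your cluster:

        using Microsoft.Hpc.Scheduler;
        using Microsoft.Hpc.Scheduler.Properties;

        class JobTuner
        {
            static void Main()
            {
                IScheduler scheduler = new Scheduler();
                scheduler.Connect("MyHeadNode");           // hypothetical head node name

                // Open a job that is spawning children too aggressively
                ISchedulerJob job = scheduler.OpenJob(42); // hypothetical job id

                // Throttle its resource envelope and drop its priority
                job.MaximumNumberOfCores = 16;
                job.CanGrow = false;
                job.Priority = JobPriority.BelowNormal;

                job.Commit();                              // push the changes to the scheduler
            }
        }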

    Read the article

  • Asp.net session on browser close

    - by budugu
    Note: Cross posted from Vijay Kodali's Blog. Permalink

    How do you capture the logoff time when a user closes the browser? Or how do you end a user's session when the browser is closed? These are some of the frequently asked questions in ASP.NET forums. In this post I'll show you how to do this when you're building an ASP.NET web application.

    Before we start, one fact: there is no foolproof technique to catch the browser close event 100% of the time. The trouble lies in the stateless nature of HTTP. The web server is out of the picture as soon as it finishes sending the page content to the client. After that, all you can rely on is a client-side script. Unfortunately, there is no reliable client-side event for browser close.

    Solution: The first thing you need to do is create the web service. I've added a web service and named it AsynchronousSave.asmx. Make this web service accessible from script by qualifying the class with the ScriptService attribute. Add a method (SaveLogOffTime) marked with the [WebMethod] attribute. This method simply accepts a UserId as a string variable and writes that value and the logoff time to a text file, but you can pass as many variables as required. You can then use this information for many purposes. To end the user's session, you can just call Session.Abandon() in the above web method.

    To enable the web service to be called from the page's client-side code, add a ScriptManager to the page; here I am adding it to the SessionTest.aspx page. When the user closes the browser, the onbeforeunload event fires on the client side. Our final step is attaching a JavaScript function to that event which makes the web service call. The code is simple but effective.

    My code: HTML (SessionTest.aspx) and C# (SessionTest.aspx.cs).

    That's it. Run the application and, after closing the browser, open the text file to see the logoff time. The above code works well in IE 7/8. If you have any questions, leave a comment.
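    The post's actual listings were published as images, so here is a rough sketch of the web method it describes; the log file path and details are assumptions:

        using System;
        using System.IO;
        using System.Web.Services;
        using System.Web.Script.Services;

        [WebService(Namespace = "http://tempuri.org/")]
        [ScriptService] // allows the method to be called from client-side script
        public class AsynchronousSave : WebService
        {
            [WebMethod(EnableSession = true)]
            public void SaveLogOffTime(string userId)
            {
                // Append the user id and logoff time to a text file.
                string path = Server.MapPath("~/App_Data/logoff.txt");
                File.AppendAllText(path, userId + " logged off at " + DateTime.Now + Environment.NewLine);

                // End the user's session, as described in the post.
                Session.Abandon();
            }
        }

    The client side then wires window.onbeforeunload to a function that calls SaveLogOffTime through the generated script proxy.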

    Read the article

  • Windows Azure Use Case: Infrastructure Limits

    - by BuckWoody
    This is one in a series of posts on when and where to use a distributed architecture design in your organization's computing needs. You can find the main post here: http://blogs.msdn.com/b/buckwoody/archive/2011/01/18/windows-azure-and-sql-azure-use-cases.aspx

    Description: Physical hardware components take up room, use electricity, create heat and therefore need cooling, and require wiring and special storage units. All of these requirements cost money to rent at a data center or to build out at a local facility. In some cases, this can be a catalyst for evaluating options to remove the infrastructure requirement entirely by moving to a distributed computing environment.

    Implementation: There are three main options for moving to a distributed computing environment.

    Infrastructure as a Service (IaaS): The first option is simply to virtualize the current hardware and move the VMs to a provider. You can do this with Microsoft's Hyper-V product or other software, build the systems, and host them locally on fewer physical machines. This is a good option for canned applications (where you have to type setup.exe) but not as useful for custom applications, as you still have to license and patch those servers, and there are hard limits on the VM sizes.

    Software as a Service (SaaS): If there is already software available that does what you need, it may make sense to purchase not only the software license but the use of it on the vendor's servers. Microsoft's Exchange Online is an example of simply using an offering from a vendor on their servers. If you do not need a great deal of customization, have no interest in owning or extending the source code, and need to implement a solution quickly, this is a good choice.

    Platform as a Service (PaaS): If you do need to write software for your environment, your next choice is a Platform as a Service such as Windows Azure. In this case you no longer manage physical or even virtual servers. You start at the code and data level of control and responsibility, and your focus is more on the design and maintenance of the application itself. In this case you own the source code and can extend or change it as you see fit. An interesting side benefit of using Windows Azure as a PaaS is that the Application Fabric component allows a hybrid approach, which gives you a basis for letting on-premise applications leverage distributed computing paradigms.

    No one solution fits every situation. It's common to see organizations pick a mixture of on-premise, IaaS, SaaS and PaaS components. In fact, that's a great advantage of this form of computing - choice.

    References:

        5 Enterprise Steps for Adopting a Platform as a Service: http://blogs.msdn.com/b/davidmcg/archive/2010/12/02/5-enterprise-steps-for-adopting-a-platform-as-a-service.aspx?wa=wsignin1.0
        Application Patterns for the Cloud: http://blogs.msdn.com/b/kashif/archive/2010/08/07/application-patterns-for-the-cloud.aspx

    Read the article

  • Integrate Microsoft Translator into your ASP.Net application

    - by sreejukg
    In this article I am going to explain how easily you can integrate the Microsoft Translator API into your ASP.Net application.

    Why do we need a translation API? Once you publish a website, you open a channel to a global audience, so making the web content available in only one language doesn't cover all of your audience. Especially when you are offering products or services, it is important to provide content in multiple languages; users are more comfortable when they see content in their native language. How do you achieve this? Hiring translators to translate the content into all of your users' languages will cost a lot of money, and it is not a one-time job - you need to keep translating content as it is added. The alternative is machine translation. Thankfully, there are translation engines that give you API-level access, so you can translate content automatically and display it to the user. The Microsoft Translator API is an excellent set of web service APIs that allows developers to use machine translation technology in their own applications.

    The Microsoft Translator API is offered through the Windows Azure Marketplace. In order to access the data services published in the Windows Azure Marketplace, you need to have an account. The registration process is simple, and it is common to all the services offered through the Marketplace. Last year I wrote an article about the Bing Search API, where I covered the registration process; you can refer to it here: http://weblogs.asp.net/sreejukg/archive/2012/07/04/integrate-bing-search-api-to-asp-net-application.aspx

    Once you have registered with the Windows Azure Marketplace, you will get your APP ID. Now you can visit the Microsoft Translator page and click on the sign up button: http://datamarket.azure.com/dataset/bing/microsofttranslator There are several subscription options available, including a free version. Click on the sign up button under the package that suits you; this brings up the sign up form, where you need to agree to the terms and conditions before proceeding. You need a Windows Live account in order to sign up for any service available in the Windows Azure Marketplace. Once you have signed up successfully, you will receive the thank you page. From there you can download the C# class library so that the integration can be done without writing much code; the file name is TranslatorContainer.cs. At any point, you can visit https://datamarket.azure.com/account/datasets to see the services you are subscribed to. Clicking the Use link next to each service will give you the details of that service; you need to note the primary account key and the URL of the service to use them in your application.

    Now let us start our ASP.Net project. I have created an empty ASP.Net web application using Visual Studio 2010 and named it Translator Sample (any name will work). Right-click the project, select Add -> Existing Item, and browse to TranslatorContainer.cs. Next, let us create a page where users enter some text and perform the translation. I have added a new web form to the project named Translate.aspx, with one textbox control for the user to type the text to translate, a dropdown list to select the target language, a label to display the translated text, and a button to perform the translation. The markup might look something like the sketch below.
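    This is only an illustration - the control IDs (txtSource, ddlLanguage, btnTranslate, lblResult) and the particular language subset are my own assumptions, not the original article's; the language codes come from the Microsoft Translator list linked below.

    <%-- Sketch of the Translate.aspx controls; IDs and language choices are illustrative. --%>
    <asp:TextBox ID="txtSource" runat="server" TextMode="MultiLine" Rows="4" />
    <asp:DropDownList ID="ddlLanguage" runat="server">
        <asp:ListItem Text="Arabic" Value="ar" />
        <asp:ListItem Text="French" Value="fr" />
        <asp:ListItem Text="German" Value="de" />
        <asp:ListItem Text="Spanish" Value="es" />
    </asp:DropDownList>
    <asp:Button ID="btnTranslate" runat="server" Text="Translate" OnClick="btnTranslate_Click" />
    <asp:Label ID="lblResult" runat="server" />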
    For the dropdown list I have selected some of the languages supported by Microsoft Translator. You can get all the supported languages with their codes from this link: http://msdn.microsoft.com/en-us/library/hh456380.aspx

    All the class libraries in the Windows Azure Marketplace require a reference to System.Data.Services.Client, so let us add that reference. You can find documentation on how to use the downloaded class library here: http://msdn.microsoft.com/en-us/library/gg312154.aspx Let us examine the TranslatorContainer.cs file; the code is self-explanatory. Note the namespace used (Microsoft) - you need to reference this namespace in your page.

    Next, add a click event handler for the translate button (a sketch follows at the end of this article). In it, you create an object of the TranslatorContainer class by passing the translation service URL, then set the credentials for the container object, which will be your account key. The TranslatorContainer supports a method that accepts a text input, a source language and a destination language, and returns a DataServiceQuery<Translation>.

    To see this working, I ran the application, entered "Good Morning" in the textbox, selected a target language, and got the translated output. It is easy to build great translator applications using the Microsoft Translator API, and there is a reasonable amount of translation you can perform in your application for free. For enterprises, you can subscribe to the appropriate package and make your application multi-lingual.
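    For reference, here is a minimal sketch of the button handler described above. It assumes the generated TranslatorContainer.cs (in the Microsoft namespace), the illustrative control IDs from the markup sketch earlier, and English ("en") as the source language; the account key is a placeholder.

    // Minimal sketch of the translate-button handler.
    // Requires: using System; using System.Linq; using System.Net;
    protected void btnTranslate_Click(object sender, EventArgs e)
    {
        // The service root URI shown on the service's "Use" page.
        var translator = new Microsoft.TranslatorContainer(
            new Uri("https://api.datamarket.azure.com/Bing/MicrosoftTranslator/"));

        // The Marketplace account key acts as both user name and password.
        string accountKey = "YOUR-ACCOUNT-KEY"; // placeholder
        translator.Credentials = new NetworkCredential(accountKey, accountKey);

        // Translate the typed text from English to the selected language.
        var query = translator.Translate(txtSource.Text, ddlLanguage.SelectedValue, "en");
        var result = query.Execute().FirstOrDefault();
        if (result != null)
        {
            lblResult.Text = result.Text;
        }
    }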

    Read the article

  • CodePlex Daily Summary for Sunday, December 26, 2010

    CodePlex Daily Summary for Sunday, December 26, 2010

    Popular Releases
    NoSimplerAccounting: NoSimplerAccounting 6.0: Fixed a bug in the expense category report.
    Temporary Data Storage Folder: TDS Folder source code: This is the latest version 0.2 beta code. To get the password, send a request to neutron.request@gmail.com
    NHibernate Mapping Generator: NHibernate Mapping Generator 2.0: Added support for Postgres (thanks to Angelo).
    NewLife XCode: XCode v6.5.2010.1223: Release notes (originally Chinese, mis-encoded in this copy) covering the NewLife.Core, NewLife.Net, XControl, XTemplate, XAgent, NewLife.CommonEnitty, XCoder and XCoder_Src components.
    Windows Workflow Foundation on Codeplex: Microsoft.Activities.UnitTesting 1.71: New in this release: Episode as Tasks using the Task Parallel Library; CHM help file now included in the release; StyleCop compliance is resulting in massive refactoring but should not cause breaking changes. Breaking change: DefaultTimeout is now a TimeSpan; removed default parameters using int timeout = DefaultTimeout and added overloads instead.
    MiniTwitter: 1.64: Change notes from 1.63 (originally Japanese, mis-encoded in this copy; they mention URL handling).
    Ajax ASP.Net Forum: InSeCla Forum Software v0.1.9: *VERSION: 0.1.9* Happy Christmas! Features added: features customizable per category level (customize on the ADMIN/Categories tab); allow anonymous threads and anonymous posts; virtual (friendly) URLs have finally been added, so some forums (categories) can use virtual URLs while others use normal URLs - check it, as this improves SEO indexing; moderation instant-on (delete thread, move thread) available to users who are moderators or administrators InstantO...
    VivoSocial: VivoSocial 7.4.0: Please see changes: http://support.vivoware.com/project/ChangeLog.aspx?PROJID=48
    Umbraco CMS: Umbraco 4.6 Beta - codename JUNO: The Umbraco 4.6 beta (codename JUNO) release contains many new features focusing on an improved installation experience and a number of robust developer features, and contains more than 89 bug fixes since the 4.5.2 release. Improved installer experience; updated starter kits (Simple, Blog, Personal, Business); beautiful, free, customizable skins included; skinning engine and skin customization (see the Skinning Documentation Kit); default dashboards on install with a hide option; updated login t...
    SSH.NET Library: 2010.12.23: This release includes some bug fixes and a few new features. Fixes: allow retrieval of big directory structures; the ssh-dss algorithm is fixed; populate sftp file attributes. New features: support for a passphrase when a private key is used; support added for the diffie-hellman-group14-sha1, diffie-hellman-group-exchange-sha256 and diffie-hellman-group-exchange-sha1 key exchange algorithms; allow providing multiple key files for authentication; added support for the "keyboard-interactive" authentication method...
    ASP.NET MVC SiteMap provider: MvcSiteMapProvider 2.3.0: Using NuGet? MvcSiteMapProvider is also listed in the NuGet feed. Learn more... Like the project? Consider a donation via PayPal. Release notes: this will be the last release targeting ASP.NET MVC 2 and .NET 3.5; MvcSiteMapProvider 3.0.0 will target ASP.NET MVC 3 and .NET 4. The web.config setting skipAssemblyScanOn has been deprecated in favor of excludeAssembliesForScan and includeAssembliesForScan. ISiteMapNodeUrlResolver is now completely responsible for generating th...
    SuperSocket, an extensible socket application framework: SuperSocket 1.3 beta 2: Compared with SuperSocket 1.3 beta 1, the following changes have been made: added support for .NET 3.5; replaced the Logging Application Block of EntLib with log4net; improved the logging code; fixed a bug in the QuickStart sample project; added IPv6 support.
    TibiaPinger: TibiaPinger v1.0: TibiaPinger v1.0
    Media Companion: Media Companion 3.400: Extract the entire archive to a folder which has user access rights, e.g. desktop or documents. A manual is included to get you started.
    Multicore Task Framework: MTF 1.0.1: Release 1.0.1 of the Multicore Task Framework.
    SQL Monitor - tracking sql server activities: SQL Monitor 3.0 alpha 7: 1. Added script save/load in the user query window. 2. Fixed a problem with the connection dialog when choosing Windows auth but still being asked for a user name. 3. Auto-open the user table when double-clicking a table node. 4. Improved alert messages; added a log-only method.
    EnhSim: EnhSim 2.2.6 ALPHA: This release supports WoW patch 4.03a at level 85. To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed; it can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed; it can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Fixing up some r...
    ASP.NET MVC Project Awesome (jQuery Ajax helpers): 1.4.3: Helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled web applications, including Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager. New: improvements for Confirm, Popup and Popup Form; a RenderView controller extension; the user experience for CRUD in the live demo has been substantially improved; added search; all the features are shown in the live demo.
    GanttPlanner: GanttPlanner V1.0: GanttPlanner V1.0 includes GanttPlanner.dll and a demo application.
    N2 CMS: 2.1 release candidate 3: * Web Platform Installer support available. N2 is a lightweight CMS framework for ASP.NET that helps you build great web sites that anyone can update. Major changes: support for auto-implemented properties ({get;set;}, based on a contribution by And Poulsen); a bunch of bugs were fixed; file manager improvements (multiple file upload, resize images to fit); new image gallery; infinite scroll paging on news; content templates. First time with N2? Try the demo site. Download one of the templ...

    New Projects
    A.I. Semantic Net Tests: A test using "semantic net" artificial intelligence, developed in C++.
    Awesome.ClientID: Awesome.ClientID is an HttpModule which serializes the ClientIDs of server controls and page/usercontrol properties to JSON, making it easier to work with JavaScript without having to output ClientIDs. It's a solution for clean ClientIDs for .NET 2.0/3.5.
    BitTweet: BitTweet is a fast, simple and easy-to-use Twitter client with tweeting, reading and writing functions included. It is known for being the most secure and fast Twitter client out there! It is written in Visual Basic .NET 2008. ~Tweet in a bit!
    CarRental001: car
    Crudo: CRUDO - The MCG (Model-Controller-Generator) CGF (Code Generation Framework). Visit the project homepage: http://adityayadav.com/CRUDO_The_MCG_Model_Controller_Generator_CGF_Code_Generation_Framework.aspx Licenses: 1) GPL v2 2) Commercial (contact us for information)
    Effeks: FX trading sample application using Prism.
    Enhanced WebPart: Enhanced WebPart is a webpart with an enhanced properties editor. It allows you to display properties of the most common types, and provides an easy way to add custom controls to display your own types in the webpart properties editor. Enhanced WebPart fully supports localization.
    Esteffin's graphic: Graphics exercises, 2010.
    farasun: Source code repository for farasun.wordpress.com
    GCTF: .NET challenge held at FACISA.
    Imgshow Plugin for Windows Live Writer: By using Imgshow Query you are able to publish multimedia content such as YouTube, Facebook video, PDF, and other Imgshow-supported services.
    Instant Messenger Using WPF and WCF: An instant messenger using WCF and WPF.
    jMenu: Create a DotNetNuke module that covers all kinds of jQuery menu plugins.
    Mcopy API: Copy all files and directories in the Windows shell. Supports long paths (less than 32,000 characters) and network paths (e.g. \\server\share or \\127.0.0.1\share).
    MOSSWebServices: This project contains code for interacting with the 21 web services provided by MOSS, targeted towards WSS developers. It currently contains code to interact with the "UserGroups" web service; I will keep adding code to interact with the other MOSS web services soon.
    Nest Hub - Twitter Client: Nest Hub is a free Twitter client for PC. It's developed in VB.NET.
    Neural Networks Library: Neural networks library by Sefnaj.
    Parallelism: Parallelism - Unified Parallel & Concurrency Framework. Visit the project homepage: http://adityayadav.com/Parallelism.aspx Licenses: 1) GPL v2 2) Commercial (contact us for information)
    PDF IRM Protector For SharePoint 2007 And 2010: The PDF IRM Protector controls the conversion of PDF documents to their encrypted, rights-managed format and the decryption of PDF documents from their rights-managed format back to their original format. The project targets both SharePoint 2007 and 2010.
    PHP Form Validator - Isis: Project Isis simplifies the process of form validation for PHP, offering a robust flag and function system that allows you to encode forms with the correct validation data and be done.
    Recycle Me: An open-source project to help and educate people regarding the reuse and recycling of everyday items. This is a social project, so we need funds for research and surveys. Volunteers are most welcome to contribute, and certifications for their participation are available. Thanks
    Sencha ExtJS Import Library for Script#: Script# import library for the Sencha Ext JS JavaScript framework.
    Separating Vertices: Given a graph as a collection of edges and vertices, identifies the separating vertices, biconnected components, and edges.
    SIG - SISTEMA DE GESTÃO INTERNA: Project built for MODEMTELECOM, a telecommunications consultancy.
    Smart Card COM: Smart Card COM library, useful for accessing smart cards from a web page in Internet Explorer.
    Social Hub: Social Hub
    Stock Research: Test project for stock research.
    taokeba: Description (originally Chinese, mis-encoded in this copy).
    Temporary Data Storage Folder: This software creates a folder that stores temporary data for temporary tasks, such as a file that should be deleted after some time; it automatically deletes those files on exit. Learn more at www.neutronsoftwares.weebly.com
    TukUI Updater: An open-source Windows updater tool for the World of Warcraft(R) addon TukUI. Developed in .NET 3.5. Discussion thread at the official TukUI forums: http://www.tukui.org/v2/forums/topic.php?id=4982
    WebUtil: This is a project for using all the features of the web.

    Read the article

  • Chart Control in ASP.Net 4 – Second Part

    - by sreejukg
    A couple of weeks ago, I wrote an introduction to the chart control available in the .Net framework, explaining its basic usage with a simple example. You can read that article at http://weblogs.asp.net/sreejukg/archive/2010/12/31/getting-started-with-chart-control-in-asp-net-4-0.aspx. In this article I am going to demonstrate how easily you can generate various types of charts using the ASP.Net chart control.

    Let us recollect the data sample we were working with in the previous article:

    id  SaleAmount  SalesPerson  SaleType     SaleDate    CompletionStatus (%)
    1   1000        Jack         Development  2010-01-01  100
    2   300         Mills        Consultancy  2010-04-14  90
    3   4000        Mills        Development  2010-05-15  80
    4   2500        Mike         eMarketting  2010-06-15  40
    5   1080        Jack         Development  2010-07-15  30
    6   6500        Mills        Consultancy  2010-08-24  65

    From this data, I am going to generate the following graphical reports with the help of the chart control:
    1. Share of sales by each salesperson
    2. Share of sales according to sale type
    3. Sales progress over a time period

    I am going to demonstrate how to bind the chart control programmatically. To facilitate this, I added a page named SalesAnalysis.aspx to my project with the following controls:
    1. A dropdown list, with id ddlAnalysisType - users use this to choose the type of chart they want to see.
    2. A button, with id btnSubmit - clicking this button shows the chart based on the dropdown selection.
    3. A label, with id lblMessage - displays a message asking the user to select an option and click the button.
    4. A chart control, with id chrtAnalysis - by default I set Visible = false, so the chart is hidden when the page first loads.

    Generating the chart for salesperson share
    From Visual Studio, I double-clicked the button, which created the event handler btnSubmit_Click. In this handler, I use a switch statement to execute the corresponding SQL statement and bind the result to the chart control. The code for generating the salesperson share chart as a pie chart follows.
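    This is a minimal sketch of such a handler rather than a verbatim listing: the GetData helper, the "MainArea"/"SalesShare" names and the dropdown values are illustrative assumptions, while the control IDs come from the list above.

    // Sketch of btnSubmit_Click; assumes "using System.Web.UI.DataVisualization.Charting;"
    // and a GetData(sql) helper that returns a System.Data.DataTable (assumed, not shown).
    protected void btnSubmit_Click(object sender, EventArgs e)
    {
        string sql;
        switch (ddlAnalysisType.SelectedValue)
        {
            case "SalesPerson":
                sql = "SELECT SUM(SaleAmount) amount, SalesPerson " +
                      "FROM SalesData GROUP BY SalesPerson";
                break;
            // The other analysis types swap the query, chart type
            // and bound columns following the same pattern.
            default:
                lblMessage.Text = "Please select an option and click the button.";
                return;
        }

        ChartArea c = new ChartArea("MainArea");
        chrtAnalysis.ChartAreas.Add(c);

        Series s = new Series("SalesShare");
        s.ChartType = SeriesChartType.Pie;
        s.ChartArea = c.Name;
        s.XValueMember = "SalesPerson";   // X values: who made the sale
        s.YValueMembers = "amount";       // Y values: the summed sale amount
        chrtAnalysis.Series.Add(s);

        chrtAnalysis.DataSource = GetData(sql); // assumed data-access helper
        chrtAnalysis.DataBind();
        chrtAnalysis.Visible = true;
    }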
    The steps for creating the chart can be summarized simply: you specify a chart area, then a series, and bind the chart to some X and Y values. That is it. If you want to control the chart size and position, you can set properties on the ChartArea.Position element. For example, in the previous code, setting the following after instantiating the chart area will give you a bigger pie chart:
    c.Position.Width = 100;
    c.Position.Height = 100;
    The width and height values are percentages, so with these values the chart uses the full width and height of the chart object.

    Generating the chart for sales type share
    To generate the chart according to sale type, you only need to change the SQL query and the X and Y values of the chart. The SQL query used is "SELECT SUM(saleAmount) amount, SaleType FROM SalesData GROUP BY SaleType", with SaleType as the X value and amount as the Y value:
    s.XValueMember = "SaleType";
    s.YValueMembers = "amount";
    After modifying the code with these values, the sales type share chart is generated.

    Generating the chart for sales progress over a time period
    For charting the progress of sales over a period, a line chart is the ideal tool; set the chart type to System.Web.UI.DataVisualization.Charting.SeriesChartType.Line. We also need to retrieve the amount and sale date from the data source, so I used the query "SELECT SaleAmount, SaleDate FROM SalesData", binding SaleDate as the X value and SaleAmount as the Y value.

    Now you have seen how easily you can build various types of charts. The chart control is an excellent tool for bringing business intelligence into your applications, and what I demonstrated is only a small part of what it can do. Refer to http://msdn.microsoft.com/en-us/library/dd456632.aspx for further reading. If you want to get the project files in zip format, post your email below. Hope you enjoyed reading this article.

    Read the article
