Search Results

Search found 16126 results on 646 pages for 'wcf performance'.


  • BizTalk Schema Validation

    - by Christopher House
    Perhaps this one should be filed under: Obvious. Yesterday I created a new schema that is going to be used for a WCF receive.  The schema has a bunch of restrictions in it, with the intention that we'd validate incoming messages against the schema.  I'd never done message validation with BizTalk but I knew the XmlDisassembler component had an option for validating, so I figured it would be a piece of cake.  Sadly, that was not to be the case.  I deployed my artifacts and configured my receive location's XmlDisassembler with what I thought to be the correct document spec name.  I entered My.Project.Name.SchemaTypeName for the document spec and started running unit tests.  All of them failed with the following error logged in the event log: "WcfReceivePort_BizTalkWcfService/PurchaseOrderService" URI: "/BizTalkWcfService/PurchaseOrderService.svc" Reason: No Disassemble stage components can recognize the data. I went to the receive port and turned on tracking, submitted another message, then went to the admin console and saved the message.  It looked correct, but just to be sure, I manually validated it against the schema in my project.  As expected, it validated correctly. After a bit of thinking on this, I realized that I probably needed to fully qualify my document spec name, meaning, include the assembly name as well as the type name.  So, I went back to the receive location and changed the document spec to: My.Project.Name.SchemaTypeName, My.Project.Name, Version=1.0.0.0, Culture=neutral, PublicKeyToken=xxxxxxxxx I re-ran my unit tests and everything was working as expected.  So, note to self: remember to include the assembly name when setting the document spec.  If you need an easy way to determine your schema name and assembly name, find your schema in the admin console and go to its properties.  On the property screen, look at the Name and Assembly properties.  Your document spec will be "SchemaName, AssemblyName"
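
    If you ever need to generate that fully qualified string programmatically, one option (a minimal sketch, reusing the My.Project.Name.SchemaTypeName placeholder from the post) is to read it straight off the schema type:

        using System;

        class PrintDocumentSpec
        {
            static void Main()
            {
                // AssemblyQualifiedName returns "Namespace.Type, Assembly, Version=..., Culture=..., PublicKeyToken=...",
                // which is the same format the XmlDisassembler expects for its document spec name.
                Console.WriteLine(typeof(My.Project.Name.SchemaTypeName).AssemblyQualifiedName);
            }
        }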

    Read the article

  • Silverlight Developer needs ASP.NET MVC Training [on hold]

    - by Peet Brits
    With Silverlight on the way out, our company wants to embrace HTML5 and related technologies. Background: Our Silverlight project did everything from generating its own models (or data contracts), sending them over WCF, and tracking changes, with a great deal of back-end code to make the ride smoother, but often also cluttered and more complex. Most of the original developers for this project are gone, and we want to embrace something new for future projects. Having done this very useful MVC Jump Start course at Microsoft Virtual Academy, we are all fired up for the next project. The problem is that we have very little in-depth knowledge of all the many different components. The most important of these is probably Entity Framework, and (for later) Web API. I suppose the best place to start is at the Microsoft ASP.NET websites. Are there any other suggestions for learning from more experienced developers? I am a senior developer, but my knowledge of ASP.NET MVC (and related) is very limited. PS: We have a project deadline at the end of this month.

    Read the article

  • Hello!

    - by barryoreilly
    After many months of deliberating I have finally gotten around to starting this blog! The reason for doing this is the large number of half-finished articles lying around on my hard disk, unpublished and unloved. These articles have been of huge benefit to me, and have been written in an attempt to consolidate my own thinking, in order to help me structure my thoughts and ideas as I have tried to digest new ideas and understand abstract theories. It is my hope that by tidying up these articles and publishing them here I can continue this learning process by getting feedback on the ideas from within the developer community. I have worked with .NET for 8 years now, and have worked with ASP.NET, SQL Server, Windows programming as well as general network administration. Since 2004 my focus has been on integration, web services, and more often than not BizTalk Server. The last two years have seen me focus on SOA and WCF, and the Managed Services Engine, so this is probably where the main focus of the blog will be to start with, but there are so many fun things to play with these days that I have no idea where it will end up.....   Barry

    Read the article

  • When is it too late to go back to coding from a management role? [closed]

    - by LeoLambrettra
    Problem solving keeps the mind sharp and if you are like me then it makes you happy. But what if you went from coding up to Team Lead and then to Project Manager? I have a team of 12 and am on a good salary, but lately I have been thinking that the politics and admin tasks of being middle-level management in an Investment Bank is not the right path to happiness. I used to be able to design and code as well as manage, but lately it's all budgets, admin tasks and people problems. At 39 is it too late to go be a senior developer again? Basically:
    - Team Lead in a flat structure with good people rocks. But if half your team is offshore then it loses something.
    - There's a lot of politics in Project Management and so many meetings that even if you want to code you start letting your team down by missing deadlines, and you end up only suited for small units of work.
    - The coding skills haven't gone, so to pick up WCF services it just takes a bit of reading and then playing around.
    I reckon I could switch to a Hedge Fund and go back to developing and be far happier and get more money. My 2 doubts though are: 1. Mid-life crisis, in that I'd get bored with coding again. 2. Or maybe I'd like it, but there aren't many dev jobs for 40+, so I'd be throwing away a high-level management role that took 7 years at the one bank to get to. Anybody else made the switch back and survived?

    Read the article

  • Correct way to inject dependencies in Business logic service?

    - by Sri Harsha Velicheti
    Currently the structure of my application is as below: Web App -- WCF Service (just a facade) -- Business Logic Services -- Repository -- Entity Framework DataContext. Now each of my business logic services is dependent on more than 5 repositories (I have interfaces defined for all the repos) and I am doing constructor injection right now (poor man's DI instead of using a proper IoC container, as it was determined that it would be overkill for our project). Repositories have references to EF datacontexts. Now some of the methods in the business logic service require only one of the 5 repositories, so if I need to call that method I would end up instantiating a service which will instantiate all 5 repositories, which is a waste. An example: public class SomeService : ISomeService { public SomeService(IFirstRepository repo1, ISecondRepository repo2, IThirdRepository repo3) {} // My DoSomething method depends only on repo1 and doesn't use repo2 and repo3 public void DoSomething() { //uses repo1 to do some stuff, doesn't use repo2 and repo3 } public void DoSomething2() { //uses repo2 and repo3 to do something, doesn't require repo1 } public void DoSomething3() { //uses repo3 to do something, doesn't require repo1 and repo2 } } Now if I have to use the DoSomething method on SomeService I end up creating IFirstRepository, ISecondRepository and IThirdRepository but using only IFirstRepository. This is bugging me; I can't seem to accept that I am unnecessarily creating repositories and not using them. Is this a correct design? Are there any better alternatives? Should I be looking at lazy instantiation with Lazy<T>?
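
    One direction worth sketching (not the asker's code; the concrete FirstRepository/SecondRepository/ThirdRepository classes, the dataContext variable and the GetAll call are assumptions for illustration) is to inject Lazy<T> wrappers, so a repository is only constructed when a method actually dereferences it:

        using System;

        public class SomeService : ISomeService
        {
            private readonly Lazy<IFirstRepository> repo1;
            private readonly Lazy<ISecondRepository> repo2;
            private readonly Lazy<IThirdRepository> repo3;

            public SomeService(Lazy<IFirstRepository> repo1,
                               Lazy<ISecondRepository> repo2,
                               Lazy<IThirdRepository> repo3)
            {
                this.repo1 = repo1;
                this.repo2 = repo2;
                this.repo3 = repo3;
            }

            public void DoSomething()
            {
                // Only IFirstRepository is built here; repo2 and repo3 are never constructed.
                var data = repo1.Value.GetAll();
            }
        }

        // Poor man's DI at the composition root (hypothetical concrete types):
        // var service = new SomeService(
        //     new Lazy<IFirstRepository>(() => new FirstRepository(dataContext)),
        //     new Lazy<ISecondRepository>(() => new SecondRepository(dataContext)),
        //     new Lazy<IThirdRepository>(() => new ThirdRepository(dataContext)));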

    Read the article

  • How to setup users for desktop app with SQL Azure as backend?

    - by Manuel
    I'm considering SQL Azure as the DB for a new application I'm developing. The reason I want to go with Azure is because I don't want to have to maintain yet another database and I want my users to be able to access the data from anywhere. The problem is that I'm not clear regarding how the users will connect. The application is a basic CRUD type of Windows app. I've read that you need to add your IP to SQL Azure's firewall to connect to it, but I don't know if that's only for administration purposes. Can anyone clarify if anyone (anywhere) can access the data with the proper credentials? Which of the following scenarios would work best (if at all)?
    A) Add each user to SQL Azure and have the app connect directly to Azure as if it was connecting to SQL Server (a connection sketch follows below).
    B) Add an anonymous user to SQL Azure and pass the real user's password/hash with every call so the Azure database can service the requests accordingly.
    C) Put a WCF service in between so that it handles the authentication stuff. The service will only serve the appropriate information to the user given his/her authentication, and SQL Azure would be open to the service exclusively.
    D) - ideas are welcomed -
    This is confusing because all Azure examples I see are for websites. I have a hard time believing SQL Azure wouldn't handle the case of desktop apps connecting to it. So what's the best practice?
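
    For context on option A, a direct connection from the desktop app looks like any other ADO.NET connection; a minimal sketch (server name, database and credentials are placeholders, and the client's outbound IP still has to be allowed by the SQL Azure firewall rules):

        using System.Data.SqlClient;

        class DirectConnectionDemo
        {
            static void Main()
            {
                // SQL Azure requires TCP on port 1433, encryption, and the user@server login form.
                const string connectionString =
                    "Server=tcp:myserver.database.windows.net,1433;" +
                    "Database=MyAppDb;User ID=appuser@myserver;Password=<password>;" +
                    "Encrypt=True;TrustServerCertificate=False;";

                using (var connection = new SqlConnection(connectionString))
                {
                    connection.Open();
                    // Run the app's CRUD commands here, exactly as against an on-premises SQL Server.
                }
            }
        }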

    Read the article

  • Task-It Source Code

    Download Source Code I've received many questions about when the source code for the Task-It application will be released. Well, the time has finally come. I haven't been able to release this sooner due to the flurry of releases that have been coming out lately: Silverlight 4, WCF RIA Services, and even our Q1 RadControls. Each time I got the latest bits I ran into issues (either bugs or visual issues) in the Task-It app that needed to be fixed. Having said that, the app is far from perfect. There are still some bugs lurking and things that need to be fixed up visually (especially the RadGridView filtering popup), but the main purpose of this app is to show the RadControls for Silverlight 4 in the context of a real-world application, and I don't want to keep delaying the release of the source code. Minimum requirements: To run the app you will need the latest Silverlight bits. Silverlight 4 RTM, VS2010 and the ...

    Read the article

  • Visual Studio 2013 now available!

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/10/17/visual-studio-2013-now-available.aspx
    Visual Studio 2013 is now available for download! I will attach the beginning of their web page announcement. You should note that web projects may now readily be a combination of Web Forms, MVC and Web API.
    We are excited to announce that Visual Studio 2013 is now available to you as an MSDN subscriber! For developers and development teams, Visual Studio 2013 easily delivers applications across all Microsoft devices, cloud, desktop, server and game console platforms by providing a consistent development experience, hybrid collaboration options, and state-of-the-art tools, services, and resources. Below are just a few of the highlights in this release:
    •   Innovative features for greater developer productivity: Visual Studio 2013 includes many user interface improvements; there are more than 400 modified icons with greater differentiation and increased use of color, a redesigned Start page, and other design changes.
    •   Support for Windows 8.1 app development: Visual Studio 2013 provides the ideal toolset for building modern applications that leverage the next wave in Windows platform innovation (Windows 8.1), while supporting devices and services across all Microsoft platforms. Support for Windows Store app development in Windows 8.1 includes updates to the tools, controls and templates, new Coded UI test support for XAML apps, UI Responsiveness Analyzer and Energy Consumption profiler for XAML & HTML apps, enhanced memory profiling tools for HTML apps, and improved integration with the Windows Store.
    •   Web development advances: Creating websites or services on the Microsoft platform provides you with many options, including ASP.NET WebForms, ASP.NET MVC, WCF or Web API services, and more. Previously, working with each of these approaches meant working with separate project types and tooling isolated to that project’s capabilities. The One ASP.NET vision unifies your web project experience in Visual Studio 2013 so that you can create ASP.NET web applications using your preference of ASP.NET component frameworks in a single project. Now you can mix and match the right tools for the job within your web projects, giving you increased flexibility and productivity.

    Read the article

  • How to leverage the internal HTTP endpoint available on Azure web roles?

    - by adelsors
    Imagine you have a Web application using an in-memory collection that changes occasionally, loading it from storage on the Application_Start global.asax event and updating it whenever it changes. If you want to deploy this application on Azure you need to keep in mind that more than one instance of the application can be running at any time, and therefore you need to provide some mechanism to keep all instances informed of the latest changes. Because communication through internal endpoints between Azure role instances is free of charge, a good solution can be maintaining the information in Azure Storage Tables, reading its contents on the Application_Start event and propagating its changes to all instances using the internal HTTP port available on Azure Web Roles. You need to follow these steps to leverage the internal HTTP endpoint available on Azure web roles:
    1. Define an internal HTTP endpoint in the Web Role properties, for example InternalHttpEndpoint.
    2. Add a new WCF service to the Web Role, for example NotificationServices.svc.
    3. Add a method on the new service to receive notifications from other role instances.
    4. Declare a class that inherits from System.ServiceModel.Activation.ServiceHostFactory and override the method CreateServiceHost to host the internal endpoint. Note that you can use SecurityMode.None because the internal endpoint is private to the instances of the service; this is provided by the platform.
    5. Edit the markup of the service by right-clicking the svc file and selecting "View markup" to add the new factory as the factory to be used to create the service.
    6. Now you can notify changes to other instances from your code (see the sketch below).
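
    A rough sketch of steps 4 and 6 (the service contract INotificationService, its NotifyChanged operation, and the endpoint name InternalHttpEndpoint are assumptions; the real service shape depends on your application):

        using System;
        using System.ServiceModel;
        using System.ServiceModel.Activation;
        using Microsoft.WindowsAzure.ServiceRuntime;

        // Step 4: host the WCF service on the instance's internal endpoint.
        public class InternalServiceHostFactory : ServiceHostFactory
        {
            protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
            {
                RoleInstanceEndpoint endpoint =
                    RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["InternalHttpEndpoint"];
                var address = new Uri(string.Format("http://{0}/NotificationServices.svc", endpoint.IPEndpoint));

                var host = new ServiceHost(serviceType, address);
                // SecurityMode.None is acceptable here: internal endpoints are only reachable
                // by instances of the same deployment.
                host.AddServiceEndpoint(typeof(INotificationService),
                    new BasicHttpBinding(BasicHttpSecurityMode.None), address);
                return host;
            }
        }

        // Step 6: call every other instance's internal endpoint when the collection changes.
        public static class ChangeNotifier
        {
            public static void NotifyOtherInstances()
            {
                foreach (RoleInstance instance in RoleEnvironment.CurrentRoleInstance.Role.Instances)
                {
                    if (instance.Id == RoleEnvironment.CurrentRoleInstance.Id) continue;

                    var endpoint = instance.InstanceEndpoints["InternalHttpEndpoint"];
                    var address = new EndpointAddress(
                        string.Format("http://{0}/NotificationServices.svc", endpoint.IPEndpoint));
                    var factory = new ChannelFactory<INotificationService>(
                        new BasicHttpBinding(BasicHttpSecurityMode.None), address);
                    factory.CreateChannel().NotifyChanged();
                }
            }
        }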

    Read the article

  • Are these 2 strings equal?

    - by Shawn Cicoria
    I spent way too many hours on this one. I was going through full configuration of ADFS v2 with WCF active client scenarios and using self-generated certificates, and had all things lined up perfectly.  Using the certificate snap-in I just copied the thumbprint into the IdentityModel section (trusted issuers) in my service config.  var one = "?ecb8fd950978d94ae21d4f073227fdc2718bdb96"; var two = "ecb8fd950978d94ae21d4f073227fdc2718bdb96"; What it ended up being is that in the first one there’s a buried nonprintable series of characters (‎ – or E2 80 8E in 0x format). Two lessons: turn on tracing sooner, and don’t trust Copy & Paste – all the time.  I ended up creating a quick Issuer Name Registry class so I could debug and finally saw the issue. namespace MyService { public class IssuerValidator : ConfigurationBasedIssuerNameRegistry { public IssuerValidator() :base() { } public IssuerValidator(XmlNodeList xml) : base(xml) { } public override string GetIssuerName(System.IdentityModel.Tokens.SecurityToken securityToken) { X509SecurityToken token = securityToken as X509SecurityToken; if (token == null) { return "who cares"; } else { return token.Certificate.Thumbprint; } } } } I do have a utility I wrote to navigate the cert store and emit the thumbprint to avoid these issues, I just didn’t have it available on my machine at the time.
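
    A small helper along these lines (a sketch, not the author's utility) can strip the invisible characters before a thumbprint is compared or dropped into config:

        using System;
        using System.Linq;

        static class ThumbprintHelper
        {
            // Certificate MMC copies often carry a hidden left-to-right mark (U+200E, bytes E2 80 8E).
            // Keeping only hex digits removes that mark and any stray whitespace.
            public static string Clean(string rawThumbprint)
            {
                return new string(rawThumbprint.Where(Uri.IsHexDigit).ToArray()).ToUpperInvariant();
            }
        }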

    Read the article

  • Learning project Custom c# Cms [closed]

    - by user313378
    I want to start a new project, a custom CMS, because I think it's a good starting point to apply my collected knowledge of C#, DDD, NHibernate, MVC3 and JavaScript. It would be great to hear some guidelines from experienced users here. I will use C#, ASP.NET MVC3 and the Razor view engine. I was also thinking of the NHibernate ORM, though I don't know if using NHibernate will hurt performance. Initially MSSQL 2008 will be used, but using an ORM layer means I can switch to some other database with no pain. I was thinking of creating a News entity which will have the properties Id, Name, Created, Updated, IntroText, Content, Title, Author and ListPhotos (see the sketch below). Every input will be validated with unobtrusive JavaScript on the view, and it will be validated at the database level as well. Maybe the best approach is to create some interface which will be implemented by my CMS client entities, like NewsEntity. This interface would include everything that could be requested by my clients in the future, at least some things which are not included in the entity right now: data consumed via RSS feed, WCF, etc. So basically, share anything you think is a good idea, from documenting the project to coding. Everyone is welcome to brainstorm about the custom CMS.
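
    A starting point for the entity described above might look like the following minimal sketch (property types are assumptions, and Photo is a hypothetical related class):

        using System;
        using System.Collections.Generic;

        public class News
        {
            public int Id { get; set; }
            public string Name { get; set; }
            public string Title { get; set; }
            public string IntroText { get; set; }
            public string Content { get; set; }
            public string Author { get; set; }
            public DateTime Created { get; set; }
            public DateTime? Updated { get; set; }

            // Photos attached to the news item; Photo is a placeholder entity.
            public IList<Photo> Photos { get; set; }
        }

        public class Photo
        {
            public int Id { get; set; }
            public string Url { get; set; }
        }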

    Read the article

  • Windows 7 Phone Database Rapid Repository – V2.0 Beta Released

    - by SeanMcAlinden
    Hi All, A V2.0 beta has been released for the Windows 7 Phone database Rapid Repository; it can be downloaded at the following: http://rapidrepository.codeplex.com/ Along with the new View feature which greatly enhances querying and performance, various bugs have been fixed, including a more serious bug with the caching that caused the GetAll() method to sometimes return inconsistent results (I’m a little bit embarrassed by this bug). If you are currently using V1.0 in development, I would recommend swapping in the beta immediately. A full release will be available very shortly; I just need a few more days of testing and some input from other users/testers.   *Breaking Changes* The only real change is that RapidContext has moved under the main RapidRepository namespace. Various internal methods have also been made ‘internal’ and replaced with a more friendly API (I imagine not many users will notice this change). Hope you like it. Kind regards, Sean McAlinden

    Read the article

  • Developer Dashboard in SharePoint 2010

    - by jcortez
    Introducing the Developer Dashboard As a SharePoint developer (or IT Professional), how many times have you had the pleasure of figuring out why a particular page on your site is taking too long to render? I'm sure one of the techniques you have employed in troubleshooting is the process of elimination - removing individual web parts from the page hoping to identify which web part is misbehaving. One of the new features of SharePoint 2010 is the Developer Dashboard. This dashboard provides tracing and performance information that can be useful when you are trying to troubleshoot pages that are loading too slowly. The Developer Dashboard is turned off by default and I'll go over 3 different ways to display it. Here is a screenshot of what the Developer Dashboard looks like when displayed at the bottom of the page: You can see on the left side the different events that fired during the page processing pipeline and how long these events took. This is where you will see individual web parts being processed and how long it took to complete (obviously the kind of processing depends on what the web part does). On the right side you would see the different database calls issued through the SharePoint Object Model to process the page. You will notice that each of these database queries is actually a hyperlink, and clicking on it displays a pop-up window that shows the actual SQL Query Text, the Call Stack that triggered the database call, and the IO statistics of that query.
    Enabling the Developer Dashboard
    Option 1: Managed Code (a sketch follows below). The Developer Dashboard is a farm-wide setting, and this code won't work if it is used within a web part hosted on any non-Central Admin site. The SPDeveloperDashboardLevel enum has three possible values: On, Off, and OnDemand. Setting it to On will always display the Developer Dashboard at the bottom of the page. Setting it to Off will hide the Developer Dashboard. Setting it to OnDemand will add an icon at the top right corner of the page (see screenshot below) where a Site Collection Admin can toggle the display of the Developer Dashboard for a particular site collection. In my opinion, OnDemand is the best setting when troubleshooting a page or during development, since a Site Collection Admin can turn it on or off, and for a particular site collection only. The first cool thing about this is that the Site Collection Admin that turned it on will be the only one to see the Developer Dashboard output. Everyday users won't see the Developer Dashboard output even if it was turned on by a Site Collection Admin. If you need more flexibility on who gets to see the Developer Dashboard output, you can set SPDeveloperDashboardSettings.RequiredPermissions to control which group of users will have the permission to see the output.
    Option 2: Using stsadm. Using stsadm, you can run the following command to configure the Developer Dashboard: STSADM –o setproperty –pn developer-dashboard –pv OnDemand To successfully execute this command, be sure that you are running as a Farm Admin.
    Option 3: Using PowerShell. For all scripts in SharePoint 2010, I prefer writing them as PowerShell scripts. Though the stsadm command is less verbose, the PowerShell equivalent is pretty straightforward and uses the SharePoint Object Model. You can of course parameterize the value that gets assigned to the DisplayLevel property so you can turn it On, Off or OnDemand depending on the parameter.
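
    As a sketch of what the Option 1 managed-code approach looks like (a console app run under an account with farm rights; the class name is arbitrary):

        using Microsoft.SharePoint.Administration;

        class EnableDeveloperDashboard
        {
            static void Main()
            {
                // Farm-wide setting; OnDemand adds the toggle icon for Site Collection Admins.
                SPDeveloperDashboardSettings settings =
                    SPWebService.ContentService.DeveloperDashboardSettings;
                settings.DisplayLevel = SPDeveloperDashboardLevel.OnDemand;
                settings.Update();
            }
        }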
    Events and the Developer Dashboard  Now, don't assume that all the code inside your web part or page will show up in the Developer Dashboard complete with all the great troubleshooting information. Only a finite set of events are monitored by default (for a web part it will be the events in the base web part class). Let's say you have a click event that could take some time, for example a web service call, and you want to include troubleshooting information for this event in the Developer Dashboard. Enter SPMonitoredScope, which is also a new feature in SharePoint 2010. In SharePoint 2010, everything is executed within a "Monitored Scope", and each scope has a set of "Monitors" that measure and count calls and timings, which appear in the Developer Dashboard. You can get your custom code included in the Developer Dashboard by wrapping it inside a new monitored scope (see the sketch below); that would include your new scope "My long web service call" in the Developer Dashboard and would log the time it took to complete processing. In my opinion, wrapping your custom code in an SPMonitoredScope is a SharePoint development best practice since it provides you visibility into, and a better understanding of, the performance of your components.
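
    A minimal sketch of that pattern (the event handler and the slow web service call are placeholders; the scope name comes from the text above):

        using System;
        using Microsoft.SharePoint.Utilities;

        public class MyWebPart : System.Web.UI.WebControls.WebParts.WebPart
        {
            protected void Button_Click(object sender, EventArgs e)
            {
                // Everything inside this scope is timed and listed in the Developer Dashboard.
                using (new SPMonitoredScope("My long web service call"))
                {
                    CallSlowWebService(); // placeholder for the actual web service call
                }
            }

            private void CallSlowWebService()
            {
                // ...
            }
        }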

    Read the article

  • West Wind WebSurge - an easy way to Load Test Web Applications

    - by Rick Strahl
    A few months ago on a project the subject of load testing came up. We were having some serious issues with a Web application that would start spewing SQL lock errors under somewhat heavy load. These sorts of errors can be tough to catch, precisely because they only occur under load and not during typical development testing. To replicate this error more reliably we needed to put a load on the application and run it for a while before these SQL errors would flare up. It’s been a while since I’d looked at load testing tools, so I spent a bit of time looking at different tools and frankly didn’t really find anything that was a good fit. A lot of tools were either a pain to use, didn’t have the basic features I needed, or were extravagantly expensive. In the end I got frustrated enough to build an initially small custom load test solution that then morphed into a more generic library, then gained a console front end and eventually turned into a full-blown Web load testing tool that is now called West Wind WebSurge. I got seriously frustrated looking for tools every time I needed some quick and dirty load testing for an application. If my aim is to just put an application under heavy enough load to find a scalability problem in code, or to simply try and push an application to its limits on the hardware it’s running on, I shouldn’t have to struggle to set up tests. It should be easy enough to get going in a few minutes, so that testing can be set up quickly and done on a regular basis without a lot of hassle. And that was the goal when I started to build out my initial custom load tester into a more widely usable tool. If you’re in a hurry and you want to check it out, you can find more information and download links here:
    West Wind WebSurge Product Page
    Walk through Video
    Download link (zip)
    Install from Chocolatey
    Source on GitHub
    For a more detailed discussion of the why’s and how’s and some background continue reading. How did I get here? When I started out on this path, I wasn’t planning on building a tool like this myself – but I got frustrated enough looking at what’s out there to think that I can do better than what’s available for the most common simple load testing scenarios. When we ran into the SQL lock problems I mentioned, I started looking around at what’s available for Web load testing solutions that would work for our whole team, which consisted of a few developers and a couple of IT guys, both of whom needed to be able to run the tests. It had been a while since I looked at tools and I figured that by now there should be some good solutions out there, but as it turns out I didn’t really find anything that fit our relatively simple needs without costing an arm and a leg… I spent the better part of a day installing and trying various load testing tools and to be frank most of them were either terrible at what they do, incredibly unfriendly to use, used some terminology I couldn’t even parse, or were extremely expensive (and I mean in the ‘sell your liver’ range of expensive). Pick your poison. There are also a number of online solutions for load testing and they actually looked more promising, but those wouldn’t work well for our scenario as the application is running inside of a private VPN with no outside access into the VPN. Most of those online solutions also ended up being very pricey as well – presumably because the bandwidth required to test over the open Web can be enormous. 
    When I asked around on Twitter what people were using – I got mostly… crickets. Several people mentioned Visual Studio Load Test, and most other suggestions pointed to online solutions. I did get a bunch of responses though with people asking to let them know what I found – apparently I’m not alone when it comes to finding load testing tools that are effective and easy to use. As to Visual Studio, the higher-end SKUs of Visual Studio and the test edition include a Web load testing tool, which is quite powerful, but there are a number of issues with that: First it’s tied to Visual Studio so it’s not very portable – you need a VS install. I also find the test setup and terminology used by the VS test runner extremely confusing. Heck, it’s complicated enough that there’s even a Pluralsight course on using the Visual Studio Web test from Steve Smith. And of course you need to have one of the high-end Visual Studio SKUs, and those are mucho Dinero ($$$) – just for load testing, that’s rarely an option. Some of the tools are ultra-extensive and let you run analysis tools on the target servers, which is useful, but in most cases – just plain overkill and only distracts from what I tend to be ultimately interested in: Reproducing problems that occur at high load, and finding the upper limits and ‘what if’ scenarios as load is ramped up increasingly against a site. Yes it’s useful to have Web app instrumentation, but often that’s not what you’re interested in. I still fondly remember the early days of Web testing when Microsoft had the WAST (Web Application Stress Tool) tool, which was rather simple – and also somewhat limited – but easily allowed you to create stress tests very quickly. It had some serious limitations (mainly that it didn’t work with SSL),  but the idea behind it was excellent: Create tests quickly and easily and provide a decent engine to run them locally with minimal setup. You could get set up and run tests within a few minutes. Unfortunately, that tool died a quiet death, like so many of Microsoft’s tools that were probably built by an intern and then abandoned, even though there was a lot of potential and it was actually fairly widely used. Eventually the tool was no longer downloadable and now it simply doesn’t work anymore on higher end hardware. West Wind Web Surge – Making Load Testing Quick and Easy So I ended up creating West Wind WebSurge out of rebellious frustration… The goal of WebSurge is to make it drop dead simple to create load tests. It’s super easy to capture sessions either using the built in capture tool (big props to Eric Lawrence, Telerik and FiddlerCore which made that piece a snap), using the full version of Fiddler and exporting sessions, or by manually or programmatically creating text files based on plain HTTP headers to create requests. I’ve been using this tool for 4 months now on a regular basis on various projects as a reality check for performance and scalability and it’s worked extremely well for finding small performance issues. I also use it regularly as a simple URL tester, as it allows me to quickly enter a URL plus headers and content and test that URL and its results along with the ability to easily save one or more of those URLs. A few weeks back I made a walk through video that goes over most of the features of WebSurge in some detail: Note that the UI has slightly changed since then, so there are some UI improvements. 
    Most notably the test results screen has been updated recently to a different layout and to provide more information about each URL in a session at a glance. The video and the main WebSurge site have a lot of info on basic operations. For the rest of this post I’ll talk about a few deeper aspects that may be of interest while also giving a glance at how WebSurge works. Session Capturing As you would expect, WebSurge works with Sessions of Urls that are played back under load. Here’s what the main Session View looks like: You can create session entries manually by individually adding URLs to test (on the Request tab on the right) and saving them, or you can capture output from Web Browsers, Windows Desktop applications that call services, or your own applications using the built-in Capture tool. With this tool you can capture anything HTTP – SSL requests and content from Web pages, AJAX calls, SOAP or REST services – again anything that uses Windows or .NET HTTP APIs. Behind the scenes the capture tool uses FiddlerCore, so basically anything you can capture with Fiddler you can also capture with the WebSurge Session capture tool. Alternately you can actually use Fiddler as well, and then export the captured Fiddler trace to a file, which can then be imported into WebSurge. This is a nice way to let somebody capture a session without having to actually install WebSurge, or for your customers to provide an exact playback scenario for a given set of URLs that cause a problem perhaps. Note that not all applications work with Fiddler’s proxy unless you configure a proxy. For example, .NET Web applications that make HTTP calls usually don’t show up in Fiddler by default. For those .NET applications you can explicitly override proxy settings to capture those requests to service calls. The capture tool also has handy optional filters that allow you to filter by domain, to help block out noise that you typically don’t want to include in your requests. For example, if your pages include links to CDNs, or Google Analytics or social links you typically don’t want to include those in your load test, so by capturing just from a specific domain you are guaranteed content from only that one domain. Additionally you can provide url filters in the configuration file – filters allow you to provide filter strings that if contained in a url will cause requests to be ignored. Again this is useful if you don’t filter by domain but you want to filter out things like static image, css and script files etc. Often you’re not interested in the load characteristics of these static and usually cached resources as they just add noise to tests and often skew the overall url performance results. In my testing I tend to care only about my dynamic requests. SSL Captures require Fiddler Note that in order to capture SSL requests you’ll have to install Fiddler’s SSL certificate. The easiest way to do this is to install Fiddler and use its SSL configuration options to get the certificate into the local certificate store. There’s a document on the Telerik site that provides the exact steps to get SSL captures to work with Fiddler and therefore with WebSurge. Session Storage A group of URLs entered or captured make up a Session. Sessions can be saved and restored easily as they use a very simple text format that is simply stored on disk. The format is slightly customized HTTP header traces separated by a separator line. 
    The headers are standard HTTP headers except that the full URL instead of just the domain relative path is stored as part of the 1st HTTP header line for easier parsing. Because it’s just text and uses the same format that Fiddler uses for exports, it’s super easy to create Sessions by hand or under program control, writing out to a simple text file. You can see what this format looks like in the Capture window figure above – the raw captured format is also what’s stored to disk and what WebSurge parses from. The only ‘custom’ part of these headers is that the 1st line contains the full URL instead of the domain relative path and Host: header. The rest of the headers are just plain standard HTTP headers with each individual URL isolated by a separator line. The format used here also uses what Fiddler produces for exports, so it’s easy to exchange or view data either in Fiddler or WebSurge. Urls can also be edited interactively so you can modify the headers easily as well: Again – it’s just plain HTTP headers so anything you can do with HTTP can be added here. Use it for single URL Testing Incidentally I’ve also found this form to be an excellent way to test and replay individual URLs for simple non-load testing purposes. Because you can capture a single or many URLs and store them on disk, this also provides a nice HTTP playground where you can record URLs with their headers, and fire them one at a time or as a session and see results immediately. It’s an easy way to do REST presentations, and I find the simple UI flow easier than using Fiddler natively. Finally you can save one or more URLs as a session for later retrieval. I’m using this more and more for simple URL checks. Overriding Cookies and Domains Speaking of HTTP headers – you can also overwrite cookies used as part of the options. One thing that happens with modern Web applications is that you have session cookies in use for authorization. These cookies tend to expire at some point which would invalidate a test. Using the Options dialog you can actually override the cookie: which replaces the cookie for all requests with the cookie value specified here. You can capture a valid cookie from a manual HTTP request in your browser and then paste it into the cookie field, to replace the existing Cookie with the new one that is now valid. Likewise you can easily replace the domain so if you captured urls on west-wind.com and now you want to test on localhost you can do that easily as well. You could even do something like capture on store.west-wind.com and then test on localhost/store which would also work. Running Load Tests Once you’ve created a Session you can specify the length of the test in seconds, and specify the number of simultaneous threads to run each session on. Sessions run through each of the URLs in the session sequentially by default. One option in the options list above is that you can also randomize the URLs so each thread runs requests in a different order. This avoids bunching up URLs initially when tests start as all threads run the same requests simultaneously which can sometimes skew the results of the first few minutes of a test. While sessions run some progress information is displayed: By default there’s a live view of requests displayed in a Console-like window. On the bottom of the window there’s a running total summary that displays where you’re at in the test, how many requests have been processed and what the requests per second count is currently for all requests. 
    Note that for tests that run over a thousand requests a second it’s a good idea to turn off the console display. While the console display is nice to see that something is happening and also gives you a slight idea of what’s happening with actual requests, once a lot of requests are processed, this UI updating actually adds a lot of CPU overhead to the application which may cause the actual load generated to be reduced. If you are running 1000 requests a second there’s not much to see as requests roll by way too fast to see individual lines anyway. If you look on the options panel, there is a NoProgressEvents option that disables the console display. Note that the summary display is still updated approximately once a second so you can always tell that the test is still running. Test Results When the test is done you get a simple Results display: On the right you get an overall summary as well as a breakdown by each URL in the session. Both successes and failures are highlighted so it’s easy to see what’s breaking in your load test. The report can be printed or you can also open the HTML document in your default Web Browser for printing to PDF or saving the HTML document to disk. The list on the right shows you a partial list of the URLs that were fired so you can look in detail at the request and response data. The list can be filtered by success and failure requests. Each list is partial only (at the moment) and limited to a max of 1000 items in order to render reasonably quickly. Each item in the list can be clicked to see the full request and response data: This is particularly useful for errors so you can quickly see and copy what request data was used, and in the case of a GET request you can also just click the link to quickly jump to the page. For non-GET requests you can find the URL in the Session list, and use the context menu to Test the URL as configured including any HTTP content data to send. You get to see the full HTTP request and response as well as a link in the Request header to go visit the actual page. Not so useful for a POST as above, but definitely useful for GET requests. Finally you can also get a few charts. The most useful one is probably the Request per Second chart which can be accessed from the Charts menu or shortcut. Here’s what it looks like:   Results can also be exported to JSON, XML and HTML. Keep in mind that these files can get very large rather quickly though, so exports can end up taking a while to complete. Command Line Interface WebSurge runs with a small core load engine and this engine is plugged into the front end application I’ve shown so far. There’s also a command line interface available to run WebSurge from the Windows command prompt. Using the command line you can run tests for either an individual URL (similar to AB.exe for example) or a full Session file. By default when it runs WebSurgeCli shows progress every second showing total request count, failures and the requests per second for the entire test. A silent option can turn off this progress display and display only the results. The command line interface can be useful for build integration which allows checking for failures perhaps or hitting a specific requests per second count etc. It’s also nice to use this as a quick and dirty URL test facility similar to the way you’d use Apache Bench (ab.exe). Unlike ab.exe though, WebSurgeCli supports SSL and makes it much easier to create multi-URL tests using either manual editing or the WebSurge UI. 
    Current Status: Currently West Wind WebSurge is still in beta. I’m still adding small new features and tweaking the UI in an attempt to make it as easy and self-explanatory as possible to run. Documentation for the UI and specialty features is also still a work in progress. I plan on open-sourcing this product, but it won’t be free. There’s a free version available that provides a limited number of threads and request URLs to run. A relatively low-cost license removes the thread and request limitations. Pricing info can be found on the Web site – there’s an introductory price which is $99 at the moment, which I think is reasonable compared to most other for-pay solutions out there that are exorbitant by comparison… The reason the code is not available yet is – well, the UI portion of the app is a bit embarrassing in its current monolithic state. The UI started as a very simple interface originally that later got a lot more complex – yeah, that never happens, right? Unless there’s a lot of interest I don’t foresee re-writing the UI entirely (which would be ideal), but in the meantime at least some cleanup is required before I dare to publish it :-). The code will likely be released with version 1.0. I’m very interested in feedback. Do you think this could be useful to you and provide value over other tools you may or may not have used before? I hope so – it already has provided a ton of value for me and the work I do that made the development worthwhile at this point. You can leave a comment below, or for more extensive discussions you can post a message on the West Wind Message Board in the WebSurge section. Microsoft MVPs and Insiders get a free license: If you’re a Microsoft MVP or a Microsoft Insider you can get a full license for free. Send me a link to your current, official Microsoft profile and I’ll send you a not-for-resale license. Send any messages to [email protected]. Resources: For more info on WebSurge and to download it to try it out, use the following links.
    West Wind WebSurge Home
    Download West Wind WebSurge
    Getting Started with West Wind WebSurge Video
    © Rick Strahl, West Wind Technologies, 2005-2014. Posted in ASP.NET.

    Read the article

  • Understanding and Controlling Parallel Query Processing in SQL Server

    Data warehousing and general reporting applications tend to be CPU intensive because they need to read and process a large number of rows. To facilitate quick data processing for queries that touch a large amount of data, Microsoft SQL Server exploits the power of multiple logical processors to provide parallel query processing operations such as parallel scans. Through extensive testing, we have learned that, for most large queries that are executed in a parallel fashion, SQL Server can deliver linear or nearly linear response time speedup as the number of logical processors increases. However, some queries in high parallelism scenarios perform suboptimally. There are also some parallelism issues that can occur in a multi-user parallel query workload. This white paper describes parallel performance problems you might encounter when you run such queries and workloads, and it explains why these issues occur. In addition, it presents how data warehouse developers can detect these issues, and how they can work around them or mitigate them.

    Read the article

  • SQLAuthority News – Best Practices for Data Warehousing with SQL Server 2008 R2

    - by pinaldave
    An integral part of any BI system is the data warehouse—a central repository of data that is regularly refreshed from the source systems. The new data is transferred at regular intervals by extract, transform, and load (ETL) processes. This whitepaper talks about best practices for data warehousing. It discusses ETL, analysis and reporting, as well as the relational database. The main focus of the whitepaper is on ‘architecture’ and ‘performance’. Download Best Practices for Data Warehousing with SQL Server 2008 R2 Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Best Practices, Data Warehousing, PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Oracle Communications Data Model

    - by jean-pierre.dijcks
    I've mentioned OCDM in previous posts but found the following (see end of the post) podcast on the topic and figured it is worthwhile to spread the news some more. ORetailDM and OCommunicationsDM are the two data models currently available from Oracle. Both are intended to capture:
    - Business best practices and industry knowledge
    - Pre-built advanced analytics intended to predict future events before they happen (like the Churn model shown below)
    - Oracle technology best practices to ensure optimal performance of the model
    All of this typically comes with a reduced time to implementation, or as the marketing slogan goes, reduced time to value. Here are the links: Podcast on OCDM OTN pages for OCDM and ORDM

    Read the article

  • OpenGL ES 2/3 vs OpenGL 3 (and 4)

    - by Martin Perry
    I have migrated my code from OpenGL ES 2/3 to OpenGL 3 (I added a bunch of defines and abstract classes to encapsulate both versions, so I have both in one project and compile only one or the other). All I needed to change was context initialization and glClearDepth. I don't have any errors. This was kind of strange to me. Even shaders are working correctly (some of them are GL ES 3 - with #version 300 es in their header). Is this a good solution, or should I rewrite something more before I start adding more functionality like geometry shaders, performance tools, etc.?

    Read the article

  • SQLAuthority News – A Successful Community TechDays at Ahmedabad – December 11, 2010

    - by pinaldave
    We recently had one of the best community events in Ahmedabad. We were fortunate that we had SQL experts from around the world present at this event. This gathering was very special because besides Jacob Sebastian and myself, we had two other speakers traveling all the way from Florida (Rushabh Mehta) and Bangalore (Vinod Kumar). There were a total of nearly 170 attendees and the event was a blast. Here are the details of the event. Pinal Dave Presenting at Community Tech Days On the day of the event, it seemed to be the coldest day in Ahmedabad, but I was glad to see hundreds of people waiting for the doors to be opened some hours before. We started the day with hot coffee and cookies. Yes, food first; and it was right after my keynote. I could clearly see that the coffee did some magic right away; the hall was almost full after the coffee break. Jacob Sebastian Presenting at Community Tech Days Jacob Sebastian, an SQL Server MVP and a close friend of mine, had the unusual job of surprising everybody with an innovative topic accompanied with lots of question-and-answer portions. That’s definitely one thing to love about Jacob, that is, the novelty of the subject. His presentation was entitled “Best Database Practices for the .Net”; it really created magic on the crowd. Pinal Dave Presenting at Community Tech Days After Jacob Sebastian, I presented “Best Database Practices for the SharePoint”. It was really fun to present the database from the perspective of the database itself. The main highlight of my presentation was when I talked about how one can speed up the database performance by 40% for SharePoint in just 40 seconds. It was fun because the most important thing was to convince people to use the recommendation as soon as they walked out of the session. It was really amusing and the response of the participants was remarkable. Pinal Dave Presenting at Community Tech Days My session was followed by the most-awaited session of the day: that of Rushabh Mehta. He is an international BI expert who traveled all the way from Florida to present the “Self Service BI” session. This session was funny and truly interesting. In fact, no one knew BI could be this entertaining and fascinating. Rushabh has an appealing style of presenting; he instantly got a lot of interaction from the audience. Rushabh Mehta Presenting at Community Tech Days We had a networking lunch break in-between, when we talked about many different topics. It is always interesting to get in touch with the Community and feel a part of it. I had a wonderful time during the break. Vinod Kumar Presenting at Community Tech Days After lunch is apparently the most difficult slot for the presenter, as during this time many people start to fall asleep and get dizzy. This spot was requested by Microsoft SQL Server Evangelist Vinod Kumar himself. During our discussion he suggested that if he got this slot he would make sure people were up and more interactive than during the morning session. Just like always, this session was one of the best sessions ever. Vinod was true to his word as he presented the subject of “Time Management for Developers”. This session was the biggest hit of the event because the subject stuck in the mind of every participant. Vinod Kumar Presenting at Community Tech Days Vinod’s session was followed by his own small session. Due to “insistent public demand”, he presented an interesting subject, “Tricks and Tips of SQL Server“. 
    In 20 minutes he did another awesome job and all attendees wanted more of the tricks. Just as usual he promised to do that next time for us. Vinod’s session was succeeded by Prabhjot Singh Bakshi’s session. He presented an appealing Silverlight concept. Just the same, he did a great job and people cheered him. Prabhjot Presenting at Community Tech Days We had a special invited speaker, Dhananjay Kumar, traveling all the way from Pune. He always supports our cause to help the Community in empowering participants. He presented a topic about Win7 Mobile and SharePoint integration. This was something many did not even expect to be possible. Kudos to Dhananjay for doing a great job. Dhananjay Kumar Presenting at Community Tech Days All in all, this event was one of the best in the Community Tech Days series in Ahmedabad. We were fortunate that legends from all over the world were present here to present to the Community. I’d say never underestimate the power of the Community and its influence over the direction of the technology. Vinod Kumar Presenting trophy to Pinal Dave This event was a very special gathering to me personally because of your support of the vibrant Community. The following awards were won for last year’s performance:
    - Ahmedabad SQL Server User Group (President: Jacob Sebastian; Leader: Pinal Dave) – Best Tier 2 User Group
    - Best Development Community Individual Contributor – Pinal Dave
    Speakers I was very glad to receive the award for our entire Community. Attendees at Community Tech Days I want to say thanks to Rushabh Mehta, Vinod Kumar and Dhananjay Kumar for visiting the city and presenting various technology topics at Community Tech Days. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: MVP, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology

    Read the article

  • Setting up a Reverse Proxy using IIS, URL Rewrite and ARR

    Today there was a question in the IIS.net Forums asking how to expose two different Internet sites from another site, making them look as if they were subdirectories of the main site. So for example the goal was to have a site www.site.com expose a www.site.com/company1 and a www.site.com/company2, and have the content from www.company1.com served for the first one and www.company2.com served for the second one. Furthermore we would like to have the responses cached in the server for performance...
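
    A rough idea of the URL Rewrite rule involved (a sketch only; it assumes ARR is installed with proxy mode enabled, and company2 would get a matching second rule):

        <system.webServer>
          <rewrite>
            <rules>
              <rule name="Company1ReverseProxy" stopProcessing="true">
                <!-- Requests to www.site.com/company1/... are proxied to www.company1.com -->
                <match url="^company1/(.*)" />
                <action type="Rewrite" url="http://www.company1.com/{R:1}" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>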

    Read the article

  • SQLAuthority News – Storage and SQL Server Capacity Planning and configuration – SharePoint Server 2

    - by pinaldave
    Just a day ago, I was asked how to plan SQL Server storage capacity. Here is the excellent article published by Microsoft regarding SQL Server capacity planning for SharePoint 2010. This article touches on all the vital areas of this subject. Here are the bullet points for the same:
    - Gather storage and SQL Server space and I/O requirements
    - Choose SQL Server version and edition
    - Design storage architecture based on capacity and IO requirements
    - Determine memory requirements
    - Understand network topology requirements
    - Configure SQL Server
    - Validate storage performance and reliability
    Read the original article published by Microsoft here: Storage and SQL Server Capacity Planning and configuration – SharePoint Server 2010. A question to all the SharePoint developers and administrators: do you use the whitepapers and articles to decide on capacity, or do you just start with the application and plan the storage as you progress? Please let me know your opinion. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Data Storage, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, T SQL, Technology Tagged: SharePoint

    Read the article

  • Migration from Exchange to BPOS - Microsoft Assessment and Planning (MAP) Toolkit Link

    - by Harish Pavithran
    The Microsoft Assessment and Planning (MAP) Toolkit is an agentless toolkit that finds computers on a network and performs a detailed inventory of the computers using Windows Management Instrumentation (WMI) and the Remote Registry Service. The data and analysis provided by this toolkit can significantly simplify the planning process for migrating to Windows® 7, Windows Vista®, Microsoft Office 2007, Windows Server® 2008 R2, Windows Server 2008, Hyper-V, Microsoft Application Virtualization, Microsoft SQL Server 2008, and Forefront® Client Security and Network Access Protection. Assessments for Windows Server 2008 R2, Windows Server 2008, Windows 7, and Windows Vista include device driver availability as well as recommendations for hardware upgrades. If you are interested in server virtualization planning, MAP provides the ability to gather performance metrics from computers you are considering for virtualization and a feature to model a library of potential host hardware and storage configurations. This information can be used to quickly perform "what-if" analysis using Hyper-V and Microsoft Virtual Server 2005 R2 as virtualization platforms. http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=67240b76-3148-4e49-943d-4d9ea7f77730

    Read the article

  • MySQL 5.5 is GA!

    - by rob.young(at)oracle.com
    It is my pleasure to announce that MySQL 5.5 is now GA and ready for production deployment.  You can read Oracle's official press release here. I am excited about 5.5 because of the performance and scalability gains, new replication enhancements and overall improved technical efficiencies.  Congratulations and a sincere "Thanks!" go out to the entire MySQL Community and product engineering teams for making 5.5 the best release of MySQL to date.Please join us for today's MySQL Technology Update webcast where Tomas Ulin and I will cover what's new in MySQL 5.5 and provide an update on the other technologies we are working on. You can download MySQL 5.5 here.  All of the documentation and what's new information is here.  There is also a great article on MySQL 5.5 and the MySQL community here.Thanks for reading, and as always, THANKS for your support of MySQL!

    Read the article

  • EXALYTICS - Oracle® Essbase 11.1.2.1.000 Patch Set (PS): 11.1.2.2.000

    - by Ahmed Awan
    Who should apply this patch: This PS contains defect fixes and changes that are specific to the Oracle Exalytics In-Memory Machine. You should install this PS only in the following circumstances:
    - You are installing Essbase on the Exalytics In-Memory Machine, or
    - There is an urgent need for a defect fix that is included in this PS.
    Customers considering this PS for a platform other than the Exalytics In-Memory Machine should carefully review the list of fixed defects. If there is not a truly urgent need for a defect fix included in this PS, Oracle recommends customers install the upcoming Enterprise Performance Management (EPM) 11.1.2.2.000 release, which will contain an update, instead of this patch set. Reference: http://docs.oracle.com/cd/E26232_01/doc.11122/readme/esb_11122000_readme.html

    Read the article

  • Interview with Tomas Ulin at the MySQL Innovation Day

    - by Monica Kumar
    MySQL Innovation Day held on June 5, 2012 was a great event for the MySQL engineers, users and customers to gather, share and network. I was able to get a few minutes with Tomas Ulin, Vice President of MySQL Engineering at Oracle, to ask him some questions. Here are the highlights of my interview with Tomas. Monica: This was the first MySQL Innovation Day, correct?  Why now, what was the strategy behind hosting this kind of event? Tomas: In the last year, we have rolled out an incredible number of MySQL events worldwide – some targeted at developers that are new to MySQL and others for the MySQL savvy. At the MySQL Innovation Day, our first event of this kind, we had a number of our key engineers presenting lightning talks delivering previews of key new features as well as discussing roadmap. Our goal is to keep an open dialogue with the MySQL community. In fact, we are hosting a two-day conference, another first, for the MySQL community called MySQL Connect on Sept. 29-30 in San Francisco. If you attended the MySQL Innovation Day and liked what we did, you are going to love MySQL Connect. We’ll have a lot more of our engineers and many users and community members presenting hour long sessions and hands on labs. Our engineers will be presenting new MySQL features as well as offering previews of upcoming enhancements. Monica: What's the big take-away from today's MySQL Innovation Day? Tomas: I hope the most important takeaway for attendees was to see that Oracle has been driving, and continues to drive, MySQL innovation with a steady stream of new great GA and Development Milestone releases. Monica: What were attendees most interested in? What feedback did they have? Tomas: Feedback from attendees was incredibly positive and encouraging. In particular, they liked the interaction with the MySQL engineers and were also excited about the new early access features in MySQL 5.6 and MySQL Cluster 7.3. In addition, sessions delivered by MySQL users like Facebook, Pinterest and Twitter were very well received. For example, Pinterest talked about using MySQL to scale from 0 to billions of page views/month, Twitter talked about “Scaling twitter with MySQL” and Facebook discussed the many options to implement MySQL master failover solutions. The presentations are already available for download while some of the session videos will be made available on the MySQL Innovation Day web page shortly. Monica: How would you distinguish the use of MySQL vs. Oracle Database? What key factors should customers consider? Tomas: MySQL and Oracle Database complement each other. They are very different products, best suited to different use cases. Customers can choose world-class solutions from Oracle to fulfill a variety of needs. MySQL is a great choice for enterprise web-based, custom and embedded apps. Oracle Database is the leading choice for enterprise packaged applications such as ERP, CRM as well as high-end data warehousing and business intelligence applications. Monica: What are the highlights of the current MySQL 5.6 Development Milestone Release and early access features for MySQL Cluster 7.3? Tomas: The MySQL 5.6 development milestone release builds on MySQL 5.5 by improving:
    - Optimizer, for better performance and scalability
    - Performance Schema, for better instrumentation
    - InnoDB, for better transactional throughput
    - Replication, for higher availability and data integrity
    - NoSQL options, for more flexibility
    We announced some new early access features in MySQL 5.6, including binary log group commit. 
    We also announced early access features in MySQL Cluster 7.3, including support for foreign key constraints. Monica: How do people get these releases? Tomas: You can access development milestone releases by going to: http://dev.mysql.com/downloads/mysql and selecting the “Development Release” tab. The MySQL Cluster 7.3 and other early access features can be downloaded at: http://labs.mysql.com Monica: What's coming up next for MySQL? Tomas: Our development team is working in overdrive, cranking out new features with community feedback. Don’t miss the MySQL Connect conference being held in San Francisco on Sept. 29 and 30th. My team and I will be there. I hope you can join us! Monica: Thank you for your time, Tomas. I look forward to seeing you at the MySQL Connect conference. To our followers, I hope you found this interview informative. I welcome your comments. Please stay tuned here for more updates on MySQL. Note: Monica Kumar is Senior Director of product marketing for Linux, Virtualization and MySQL at Oracle.

    Read the article
