Search Results


  • Load and Web Performance Testing using Visual Studio Ultimate 2010-Part 3

    - by Tarun Arora
    Welcome back once again. In Part 1 of Load and Web Performance Testing using Visual Studio 2010 I talked about why performance testing an application is important, the test tools available in Visual Studio Ultimate 2010 and various test rig topologies. In Part 2 I discussed the details of web performance and load tests, as well as why it's important to follow a goal-based pattern while performance testing your application. In Part 3 I'll be discussing test result analysis, test result drill-through, test report generation, test run comparison, the ASP.NET Profiler and some closing thoughts.

    Test Results – I see some creepy worms!

    In Part 2 we put together a web performance test and a load test; let's run the load test to see how the web site responds to the load simulation. While the load test is running you will be able to see close to real-time analysis in the Load Test Analyser window. You can use the Load Test Analyser to conduct load test analysis in three ways:

    Monitor a running load test - A condensed set of the performance counter data is maintained in memory. To prevent the results' memory requirements from growing unbounded, up to 200 samples for each performance counter are maintained. This includes 100 evenly spaced samples that span the current elapsed time of the run and the most recent 100 samples.

    After the load test run is completed - The test controller spools all collected performance counter data to a database while the test is running. Additional data, such as timing details and error details, is loaded into the database when the test completes. The performance data for a completed test is loaded from the database and analysed by the Load Test Analyser. Below you can see a screen shot of the summary view, which provides key results in a format that is compact and easy to read. You can also print the load test summary; this is generated after the test has completed or been stopped.

    Analyse the load test results of a previously run load test - We'll see this in the section where I discuss comparison between two test runs.

    The performance counters can be plotted on the graphs. You also have the option to highlight a selected part of the test and view details, and drill down to the user activity chart where you can hover over to see more details of the test run.

    Generate Report => Test Run Comparisons

    The level of reports you can generate using the Load Test Analyser is astonishing. You have the option to create Excel reports and conduct side-by-side analysis of two test results, or to track trend analysis. The tool also allows you to export the graph data either to MS Excel or to a CSV file. You can view the ASP.NET Profiler report to conduct further analysis as well. View Data and Diagnostic Attachments opens the Choose Diagnostic Data Adapter Attachment dialog box to select an adapter to analyse the result type. For example, you can select an IntelliTrace adapter, click OK and open the IntelliTrace summary for the test agent that was used in the load test.

    Compare results

    This creates a set of reports that compares the data from two load test results using tables and bar charts. I have taken these screen shots from the MSDN documentation; I would highly recommend exploring the wealth of knowledge available on MSDN.
    Leaving Thoughts

    While load testing the application with an excessive load for a long duration, I managed to bring IIS to its knees by piling up a huge queue of requests waiting to be processed. This clearly means that IIS had run out of threads, as all the threads were busy processing existing requests. One easy way of fixing this is to increase the default number of allocated threads, but this might escalate the problem. The better suggestion is to try and drill down to the actual root cause of the problem.

    Whenever the garbage collector runs it stops processing any pages, so all requests that come in during that period are queued up; realistically, though, garbage collection completes in a fraction of a second. To understand this better let's look at the .NET heap. It is divided into the large object heap and the small object heap: anything greater than 85 KB in size will be allocated on the large object heap. The large object heap is non-compacting, and remember, large objects are expensive to move around, so if you are allocating something on the large object heap, make sure that you really need it! The small object heap, on the other hand, is divided into generations: objects that are supposed to be short-lived live in Gen-0, and the long-lived objects eventually move to Gen-2 as garbage collection goes through.

    As you can see in the picture below, all objects smaller than 85 KB are first assigned to Gen-0. When Gen-0 fills up and a new object comes in and finds Gen-0 full, the garbage collection process starts: it checks for all the dead objects, marks them as valid candidates for deletion to free up memory, and promotes all the remaining objects in Gen-0 to Gen-1. So in the future, whenever you clean up Gen-1 you have to clean up Gen-0 as well. When you fill up Gen-0 again, the dead objects in Gen-1 are collected, the rest are moved to Gen-2, and the Gen-0 objects are moved to Gen-1 to free up Gen-0; but by this time your garbage collection process has started to take much more time than it usually takes. Now, as I mentioned earlier, when garbage collection is being run all page requests that come in during that period are queued up. Does this explain why page requests are possibly getting queued up? Apart from this, it could also be the case that you are waiting for a long-running database process to complete. (A small sketch of generation promotion follows at the end of this section.)

    Let's explore the heap a bit more… What is really a case of crisis is when objects live long enough to make it to Gen-2 and then die; this is definitely a high-cost operation. But sometimes you need objects in memory: for example, when you cache data you hold on to the objects because you need to use them right across the user session, which is acceptable. But if you wanted to see what extreme caching can do to your server, write a simple application that chucks a lot of data into the cache and run a load test over it for about 10-15 minutes, forcing a lot of data into memory and causing the heap to run out of memory. If you get to such a state where you start running out of memory, IIS, as a mode of recovery, restarts the worker process. It is a great way to free up all your memory in the heap, but it also clears the cache. The problem with this is that if a customer had 10 items in their shopping basket and that data was stored in the application cache, the basket will now be empty, forcing them either to get frustrated and go to a competitor's website or, if the customer is really patient, to give it another try!
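    To make the generation story above concrete, here is a minimal C# sketch (my own illustration, not from the original post) that uses GC.GetGeneration to watch a surviving object get promoted, and shows a large allocation landing on the large object heap:

        using System;

        class GcGenerationsDemo
        {
            static void Main()
            {
                var survivor = new byte[1024];                  // small object, allocated in Gen-0
                Console.WriteLine(GC.GetGeneration(survivor));  // 0

                GC.Collect();                                   // survives a collection, promoted to Gen-1
                Console.WriteLine(GC.GetGeneration(survivor));  // 1

                GC.Collect();                                   // survives again, promoted to Gen-2
                Console.WriteLine(GC.GetGeneration(survivor));  // 2

                var large = new byte[100 * 1024];               // > 85 KB, so it goes to the large object heap
                Console.WriteLine(GC.GetGeneration(large));     // reported as 2 (the LOH is collected with Gen-2)
            }
        }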
    How can you address the worker-process recycling described above? Well, there are two ways:

    1. Workaround – An x86 processor only allows a maximum of 4 GB of RAM, which means the machine effectively has around 3.4 GB of RAM available. The OS needs about 1.5 GB of RAM to run efficiently, and IIS and the .NET framework also need their share of memory, leaving you a heap of around 800 MB to play with. Because team builds by default build your application as 'Any CPU', the application is built such that it will run in x86 mode on an x86 processor and in x64 mode on an x64 processor. The problem with this is that not all applications are really x64 compatible, especially if you are using COM objects or external libraries. So, as a quick win, if you compile your application in x86 mode, by changing the 'Any CPU' selection to x86 in the team build, you will be able to run your application on an x64 machine in x86 mode (WOW64, running Windows on Windows), which means you could use 8 GB+ worth of RAM; if you take away everything else, your application will roughly get a heap size of at least 4 GB to play with, which is immense. If you need a heap size of more than 4 GB you have either built software for NASA or there is something fundamentally wrong in your application.

    2. Solution – Now that you have put a workaround in place, IIS will not restart the worker process as regularly, which means you can take a breather and start working to get to the root cause of this memory leak. But this begs the question: "How do I identify possible memory leaks in my application?" Well, I won't say that there is one single tool that can tell you where the memory leak is, but trust me, performance profiling is a great starting point; it definitely gets you started in the right direction. Let's have a look at how.

    Performance Wizard - Start the Performance Wizard and select Instrumentation; this lets you measure function call counts and timings. Before running the performance session, right-click the performance session settings and choose Properties from the context menu to bring up the Performance session properties page and, as shown in the screen shot below, check the check boxes in the group '.NET memory profiling collection', namely 'Collect .NET object allocation information' and 'Also collect the .NET object lifetime information'.

    Now if you fire off the profiling session on your pages you will notice that the results allow you to view 'Object Lifetime', which shows you the number of objects that made it to Gen-0, Gen-1, Gen-2, the large heap, etc. Another great feature of the profiler is that if your application has more than 5% of cases where objects die right after making it to Gen-2 storage, a threshold alert is generated to warn you. You also have the option to view the most expensive methods, and by capturing IntelliTrace data you can drill in to narrow down to the line of code that is the root cause of the problem. So now we have seen how crucial memory management is, and how easy Visual Studio Ultimate 2010 makes it for us to identify and reproduce the problem with best-of-breed tools in the product.

    Caching

    One of the main ways to improve performance is caching, which basically means you tell the web server that instead of going to the database for each request, you keep the data on the web server, and when the user asks for it you serve it from the web server itself. BUT that can have consequences!
    Let's look at some code; trust me, caching code is not very intuitive. I define a cache key for almost all searches made through the common search page and cache the results. The approach works fine: the first time I get the data from the database, and the second time the data is served from the cache, a significant performance improvement, EXCEPT when two users try to do the same operation and run into each other. But it is easy to handle this by adding a lock, as you can see in the sketch after this section. So, as long as a user comes in and finds that the cache is empty, the user takes the lock and starts to build the cache: no more concurrency issues. But let's say you are processing 10 requests per second. By the time I have locked the operation to get the results from the database, 9 other users have come in and found that the cache key is null, so after I have come out and populated the cache they will still go in to get the results again. The application will still be faster, because the next set of 10 users, and so on, would continue to get data from the cache. BUT if we added another null check, after taking the lock and before the actual call to the database, then the 9 users who follow me would not make the extra trip to the database at all, and that would really increase the performance. But didn't I say that the code won't be very intuitive? Maybe you should leave a comment; you don't want another developer to come in and think "what a fresher, why is he checking for the cache key null twice!?"

    The downside of caching is that you are storing the data outside of the database, and the data could be wrong, because updates applied to the database would make the data cached at the web server out of sync. So, how do you invalidate the cache? Well, if you only had one way of updating the data, say only one entry point to the data update, you could write some logic to say that every time new data is entered, set the cache object to null. But this approach will not work as soon as you have several ways of feeding data into the system, or your system is scaled out across a farm of web servers. The perfect solution to this is micro caching, which means you cache the query for a set time duration and invalidate the cache after that duration. The advantage is that every time the user queries for that data within the time span for which you have cached the results, there are no calls made to the database and the data is served right from the server, which makes the response immensely quick. Figuring out the appropriate time span for which you micro cache the query results really depends on the application. Let's say your website gets 10 requests per second: if you retain the cached results for even 1 minute you will have immense performance gains, reducing 90% of the hits to the database for searching.

    Ever wondered why, when you go to ebookers.com or expedia.com or yatra.com to book a flight and you click on the book button because the fare seems too exciting, you get an error message telling you that the fare is not valid any more? Yes, exactly: that is a cache failure! These travel sites or price-comparison engines are not going to hit the database every time you hit the compare button; instead the results will be served from the cache, because the query results are micro cached. It's a perfect trade-off: by micro caching the results the site gains enormous performance benefits, but every once in a while annoys a customer because the fare has expired.
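    Here is a minimal C# sketch of the double-checked locking plus micro-caching pattern described above. This is my own illustration (the original post's snippet was an image); the cache key scheme, the one-minute expiry and the QueryDatabase helper are all hypothetical:

        using System;
        using System.Collections.Generic;
        using System.Web;
        using System.Web.Caching;

        public static class SearchCache
        {
            private static readonly object CacheLock = new object();

            public static IList<string> GetSearchResults(string query)
            {
                string cacheKey = "search:" + query;                 // hypothetical key scheme

                var results = HttpRuntime.Cache[cacheKey] as IList<string>;
                if (results == null)                                 // first check, without the lock
                {
                    lock (CacheLock)
                    {
                        // Second check: another request may have populated the cache
                        // while we were waiting for the lock. This is the
                        // "not very intuitive" double null check from the post.
                        results = HttpRuntime.Cache[cacheKey] as IList<string>;
                        if (results == null)
                        {
                            results = QueryDatabase(query);          // the expensive trip to the database

                            // Micro caching: keep the results for a short, fixed
                            // window so stale data invalidates itself.
                            HttpRuntime.Cache.Insert(
                                cacheKey, results, null,
                                DateTime.UtcNow.AddMinutes(1),       // absolute expiry, tune per application
                                Cache.NoSlidingExpiration);
                        }
                    }
                }
                return results;
            }

            private static IList<string> QueryDatabase(string query)
            {
                // Placeholder for the real data access code.
                return new List<string>();
            }
        }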
    Coming back to the travel sites: the trade-off works in their favour, as they are still able to process 30+ page requests per second, which means they cater to the site traffic while maybe losing one customer every once in a while to a competitor who is probably using a similar caching technique anyway; what are the odds that the user will not come back to their site sooner or later?

    Resources

    Below are some key resources you might like to review. I would highly recommend the documentation, walkthroughs and videos available on MSDN. You can always make use of Fiddler to debug web performance tests. Some community test extensions and plug-ins available on CodePlex might also be of interest to you.

    The Road Ahead

    Thank you for taking the time out to read this blog post; you may also want to read Part I and Part II if you haven't so far. If you enjoyed the post, remember to subscribe to http://feeds.feedburner.com/TarunArora. Questions, feedback, suggestions, etc.: please leave a comment. Next up is 'Load Testing in the Cloud', where I'll be exploring the possibilities of running test controllers/agents in the cloud. See you on the other side! Thank you!

  • Converting projects to use Automatic NuGet restore

    - by terje
    Originally posted on: http://geekswithblogs.net/terje/archive/2014/06/11/converting-projects-to-use-automatic-nuget-restore.aspx

    Download tool

    In version 2.7 of NuGet, automatic NuGet restore was introduced, meaning you no longer need to distort your MSBuild project files with NuGet target information. Visual Studio and TFS 2013 build have this enabled by default. However, if your project was created before this was introduced, and/or if you have used the "Enable NuGet Package Restore" function afterwards, you now have a series of unwanted things in your projects, and a series of project files that have been modified, which you no longer either want or need! You might also run into some unwanted issues due to these modifications. This is an MSBuild modification that was needed only before NuGet 2.7. So: DON'T USE THIS FUNCTION!!!

    There is an issue (https://nuget.codeplex.com/workitem/4019) on the NuGet project site to get this function removed, renamed or at least moved farther away from the top level (please help vote it up!). The response seems to be that it WILL BE removed, around version 3.0. This function does nothing you need after the introduction of NuGet 2.7. What is also unfortunate is the naming of it: it implies that it is needed, it is not, and, what is worse, there is no corresponding function to remove what it does!

    So to fix this, use the tool named IFix, which will fix this issue for you, all free of course, and the code is open source. Also report issues there: https://github.com/OsirisTerje/IFix

    IFix information

    DOWNLOAD HERE

    This command line tool installs using an MSI and adds itself to the system path. If you work in a team, you will probably need to use the tool multiple times: anyone in the team may at any time use the "Enable NuGet Package Restore" function and mess up your project again. The IFix program can be run either in a check mode, where it does not write anything back but only checks if you have any issues, or in a fix mode, where it will also perform the necessary fixes for you. The IFix program is used like this:

        IFix <command> [-c/--check] [-f/--fix] [-v/--verbose]

    The command in this case is "nugetrestore". It will do a check from the location where it is being called, and run through all subfolders from that location. So "IFix nugetrestore --check" will do the check, and "IFix nugetrestore --fix" will perform the changes, for all files and folders below the current working directory. (Note that --check can be replaced with just -c, and --fix with -f, and so on.)

    BEWARE: when you run the fix option, all solutions to be affected must be closed in Visual Studio!

    So, if you just want to DO it, then: IFix nugetrestore --check to see if you have issues, then IFix nugetrestore --fix to fix them.

    How does it work

    IFix nugetrestore checks, and optionally fixes, four issues that the older enabling of NuGet restore caused. The issues are related to the MSBuild process, and are:

    1. Deleting the nuget.targets file.
    2. Deleting the nuget.exe that is located under the .nuget folder.
    3. Removing all references to nuget.targets in the solution file.
    4. Removing all properties and target imports of nuget.targets inside the csproj files.

    IFix fixes these issues in the same sequence. The first step, removing the nuget.targets file, is the most critical one: all instances of the nuget.targets file within the scope of a solution have to be removed, and in addition it has to be done with the solution closed in Visual Studio.
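    For reference, the kind of csproj content being removed in step 4 typically looks like the following. This is a sketch of the lines the old "Enable NuGet Package Restore" function injected; the exact condition attributes vary between NuGet versions:

        <PropertyGroup>
          <RestorePackages>true</RestorePackages>
        </PropertyGroup>
        ...
        <Import Project="$(SolutionDir)\.nuget\NuGet.targets"
                Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />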
    If Visual Studio finds a nuget.targets file, the csproj files will automatically get messed up again. This means the removal process above might need to be done multiple times, especially when you're working with a team and the solution context menu still has the "Enable NuGet Package Restore" function; someone on the team might inadvertently use it at any time. It can be a good idea to add this check to a check-in policy if you run TFS standard version control, but that will have no effect if you use TFS Git version control, of course. So, better be prepared to run the IFix check from time to time. Or, even better, install IFix on your build servers, and add a call to IFix nugetrestore --check in the TFS build script.

    How does it look

    As a first example I have run the IFix program from the top of a set of Git repositories, so it spans multiple repositories with multiple solutions. The result from the check option is as follows. We see four red lines, one for each of the four checks we talked about in the previous section; the fact that they are red means we have that particular issue.

    The first section (above the first red text line) is the nuget.targets section. Notice No. 1: it says it has found no paths to copy. What IFix does here is check whether there are any paths to other NuGet galleries defined. If there are, those are copied over to the nuget.config file, which is where they should be in version 2.7 and above. No. 2 says it has found the particular nuget.targets file, and No. 3 states it HAS found some other NuGet galleries defined in the targets file, which it would then like to copy to the config file.

    No. 4 is the section for nuget.exe files; it lists those it has found and which it would like to delete. No. 5 states it has found a reference to nuget.targets in the solution file. This reference comes from the fact that the .nuget folder is a solution folder, and the items within it are described in the solution file.

    It then checks the csproj files, and, as can be seen from the last red line, it has found issues in 96 out of 198 csproj files. There are two possible issues in a csproj file. No. 6 is the first one, the most common and most important one: an "Import Project" section. This is the section that calls the nuget.targets file. No. 7 is another issue, which seems to sometimes be there and sometimes not: a RestorePackages property, which should also go away.

    Now, if we run the IFix nugetrestore --fix command, and then the check again after that, the result is: all green!

  • Opening vbp Visual Basic Project

    - by Roman
    I have got some old sources written in Visual Basic. There are *.bas, *.cls, *.frm and *.vbp files. As I understand, vbp is a project file. But I cannot open it with my Visual Studio 2008. What version of VS should I install to open *.vbp file? Google says it is Visual Studio 6, but I am not sure and I cannot find Visual Studio 6 for downloading. Is there any publicly available free edition of Visual Studio 6 with Visual Basic? Thanks.

  • Some VS 2010 RC Updates (including patches for Intellisense and Web Designer fixes)

    - by ScottGu
    [In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu]

    We are continuing to make progress on shipping Visual Studio 2010. I'd like to say a big thank you to everyone who has downloaded and tried out the VS 2010 Release Candidate, and especially to those who have sent us feedback or reported issues with it. This data has been invaluable in helping us find and fix remaining bugs before we ship the final release.

    Last month I blogged about a patch we released for the VS 2010 RC that fixed a bad Intellisense crash issue. This past week we released two additional patches that you can download and apply to the VS 2010 RC to immediately fix two other common issues we've seen people run into:

    Patch that fixes crashes with Tooltip invocation and when hovering over identifiers

    The Visual Studio team recently released a second patch that fixes some crashes we've seen when tooltips are displayed, most commonly when hovering over an identifier to view a QuickInfo tooltip. You can learn more about this issue from this blog post, and download and apply the patch here.

    Patch that fixes issues with the Web Forms designer not correctly adding controls to the auto-generated designer files

    The Visual Web Developer team recently released a patch that fixes issues where web controls are not correctly added to the .designer.cs file associated with the .aspx file, which means they can't be programmed against in the code-behind file. This issue is most commonly described as "controls are not being recognized in the code-behind" or "editing existing .aspx files regenerates the .aspx.designer.(vb or cs) file and controls are now missing" or "I can't embed controls within the Ajax Control Toolkit TabContainer or the <asp:createuserwizard> control". You can learn more about the issue here, and download the patch that fixes it here.

    Common Cause of Intellisense and IDE sluggishness on Windows XP, Vista, Win Server 2003/2008 systems

    Over the last few months we've occasionally seen reports of people experiencing tremendous slowness when typing and using Intellisense within VS 2010, despite running on decent machines. It took us a while to track down the cause, but we have found that the common culprit seems to be that these machines don't have the latest version of the UIA (Windows Automation) component installed. UIA 3 ships with Windows 7, and is a recommended Windows Update patch on XP and Vista (which is why we didn't see the problem in our tests, since our machines are patched with all recommended updates). Many systems (especially on XP) don't automatically install recommended updates, though, and are running with older versions of UIA. This can cause significant performance slow-downs within the VS 2010 editor when large lists are displayed (for example: with Intellisense). If you are running on Windows XP, Vista, or Windows Server 2003 or 2008 and are seeing any performance issues with the editor or IDE, please install the free UIA 3 update that can be downloaded from this page. If you scroll down the page you'll find direct links to versions for each OS.

    Note that we are making improvements to the final release of VS 2010 so that we don't have big perf issues when UIA 3 isn't installed, and we are also adding a message within the IDE that will warn you if you don't have UIA 3 installed and accessibility is activated.

    Improved Text Rendering with WPF 4 and VS 2010

    We recently made some nice changes to WPF 4 which improve the text clarity and crispness over what was in the VS 2010/.NET 4 Release Candidate. In particular, these changes improve scenarios where you have a dark background with light text. You can learn more about these improvements in this WPF team blog post. These changes will be in the final release of VS 2010 and .NET 4.

    Hope this helps,

    Scott

  • How does Visual Studio decide the order in which stack variables should be allocated?

    - by Jason
    I'm trying to turn some of the programs in gera's Insecure Programming by example into client/server applications that could be used in capture-the-flag scenarios to teach exploit development. The problem I'm having is that I'm not sure how Visual Studio (I'm using 2005 Professional Edition) decides where to allocate variables on the stack. When I compile and run example 1:

        int main() {
            int cookie;
            char buf[80];

            printf("buf: %08x cookie: %08x\n", &buf, &cookie);
            gets(buf);

            if (cookie == 0x41424344)
                printf("you win!\n");
        }

    I get the following result:

        buf: 0012ff14 cookie: 0012ff64

    buf starts at an address eighty bytes lower than cookie, and any four bytes that are copied into buf after the first eighty will appear in cookie. The problem I'm having is when I place this code in some other function. When I compile and run the following code, I get a different result: buf appears at an address greater than cookie's.

        void ClientSocketHandler(SOCKET cs) {
            int cookie;
            char buf[80];
            char stringToSend[160];
            int numBytesRecved;
            int totalNumBytes;

            sprintf(stringToSend, "buf: %08x cookie: %08x\n", &buf, &cookie);
            send(cs, stringToSend, strlen(stringToSend), NULL);

    The result is:

        buf: 0012fd00 cookie: 0012fcfc

    Now there is no way to set cookie to arbitrary data via overwriting buf. Is there any way to tell Visual Studio to allocate cookie before buf? Is there any way to tell beforehand how the variables will be allocated? Thanks, Jason
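    (An aside, not part of the original question: the likely culprit is Visual C++ 2005's buffer security check, the /GS compiler option, which is on by default and deliberately places arrays at higher addresses than scalar locals so overruns hit the stack cookie rather than adjacent variables. A hedged sketch of building without it for lab exercises, assuming the MSVC command-line toolchain:)

        REM /GS (buffer security check) is on by default in VC++ 2005 and
        REM reorders buffers above other locals; /GS- disables it, restoring
        REM a more naive stack layout. Use only for teaching exercises.
        cl /GS- example1.c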

  • Why can't I debug from Visual Studio 2005 after installing IE8?

    - by tjrobinson
    I've just installed IE8 (final) and restarted. I can no longer debug Web Application Projects using Visual Studio 2005 on Windows Server 2003 Enterprise R2. I get the message "Internet Explorer cannot display the webpage", and then WebDev.WebServer.exe quits with no visible error message and nothing in the Event Viewer. Does anyone have any ideas?

    Things that haven't helped:

    - Adding localhost to trusted sites
    - Changing the port to 8080 or 80
    - Checking my hosts file (it's just got 127.0.0.1 localhost in it)

    Things that have helped a bit:

    - Running (not debugging) with CTRL-F5, which works fine (unless you need to debug)
    - Changing the default Visual Studio browser to Firefox, which allows me to debug

    My hosts file contains:

        # Copyright (c) 1993-1999 Microsoft Corp.
        #
        # This is a sample HOSTS file used by Microsoft TCP/IP for Windows.
        #
        # This file contains the mappings of IP addresses to host names. Each
        # entry should be kept on an individual line. The IP address should
        # be placed in the first column followed by the corresponding host name.
        # The IP address and the host name should be separated by at least one
        # space.
        #
        # Additionally, comments (such as these) may be inserted on individual
        # lines or following the machine name denoted by a '#' symbol.
        #
        # For example:
        #
        #      102.54.94.97     rhino.acme.com          # source server
        #       38.25.63.10     x.acme.com              # x client host

        127.0.0.1       localhost

  • How should I set up my Visual Studio projects/solutions in a Mercurial repository?

    - by Dave A
    At my company we have a few different web apps that each share some common libraries. The Visual Studio setup looks like this:

        Website 1 Solution
            Website 1
            Shared Library 1 Project
            Shared Library 2 Project
        Website 2 Solution
            Website 2
            Shared Library 1 Project
            Shared Library 2 Project
        Windows Service Solution
            Windows Service Project
            Shared Library 1 Project
            Shared Library 2 Project
        Shared Library Solution
            Shared Library 1 Project
            Shared Library 2 Project
        All Projects Solution
            Website 1
            Website 2
            Windows Service Project
            Shared Library 1 Project
            Shared Library 2 Project

    We want to start using Mercurial for source control, but I'm still not sure of the best way to do it. From what I've read you're supposed to use a separate repository for each project. No problem there, but where do the Visual Studio solution files (.sln) go? Should there be a separate repository with just an .sln file? Ideally the projects that use the shared libraries should all use the same version, and the "All Projects Solution" should build without errors, but sometimes we need to branch the shared libraries. What is the best way to do this, and how would the repositories be set up? How do I get a working copy of a certain branch/tag of the Website 1 solution when every project is in a separate repository? Do I have to pull each one separately, or write a script to do it all at once? Can TortoiseHg do that for me? Any other tips to make this process easier?
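    (Not part of the original question, but for illustration: one common way to stitch per-project repositories into a solution-level working copy is Mercurial subrepositories, where a repository holding just the .sln file lists its child repositories in an .hgsub file. A hypothetical sketch, with made-up paths and URLs:)

        Website1 = https://hg.example.com/Website1
        SharedLib1 = https://hg.example.com/SharedLib1
        SharedLib2 = https://hg.example.com/SharedLib2

    With this in place, cloning or updating the solution repository pulls the pinned revision of each subrepository, so a branch or tag of the solution can reference specific shared-library revisions.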

  • Why does the .NET tab in the 'Add Reference' dialog in Visual Studio not list the contents of the GAC?

    - by abroun
    Duplicate of: Getting assemblies to show in the .NET tab of Add Reference

    So, I'm using Visual C# 2008 Express Edition and I've just been on a bit of a detour, as I found out that my assumption that the .NET tab of the 'Add Reference' dialog lists the contents of the GAC was incorrect. This was a bit of a problem for me, as the assembly that I wanted to reference from my project was only available in the GAC. (It was Microsoft.XNA.Framework v2.0, obtained from the XNA 2.0 redistributable, and as far as I could see it installed only into the GAC.)

    I worked round the problem by setting the reference to Microsoft.XNA.Framework manually in the .csproj file and then getting a copy of the DLL out of the cache. I was then able to create a directory for the DLL, add it to Visual Studio's list of assembly directories in the registry and then, voila! I could see it in the .NET tab.

    This all seems like a bit of a faff to me, and I don't think that my initial assumption (that the .NET tab shows the contents of the GAC) was that unreasonable or would be that uncommon. Can someone who knows more than me tell me why the contents of the GAC aren't shown? The documentation just says that they are not, but is there a good reason? Is there actually a way to get the entire contents of the GAC to be listed? A tick box I've missed somewhere? Any info much appreciated.
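    (An aside on the registry step mentioned above: Visual Studio populates the .NET tab from a set of registered assembly folders rather than from the GAC. A hedged sketch of a .reg entry using the AssemblyFoldersEx mechanism, with a hypothetical key name and path; the exact key varies by framework version and 32/64-bit OS:)

        Windows Registry Editor Version 5.00

        [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v2.0.50727\AssemblyFoldersEx\MyXnaAssemblies]
        @="C:\\Libs\\XNA"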

  • Is MVC 2 client-side validation broken in Visual Studio 2010 RC?

    - by Will
    I can't seem to get client-side validation working with the version of MVC released with Visual Studio 2010 RC. I've tried it with two projects: one upgraded from 1.0, and one using the template that came with VS. I'd think the template version would work, but it doesn't. Added the following scripts:

        <script type="text/javascript" src="<%= Url.Content("~/Scripts/MicrosoftMvcValidation.js") %>"></script>
        <script type="text/javascript" src="<%= Url.Content("~/Scripts/jquery.validate.js") %>"></script>

    which are downloaded to the client correctly. Added the following to my form page:

        <% Html.EnableClientValidation(); %>
        <%-- yes, am aware of the EndForm() bug! --%>
        <% using (Html.BeginForm()) { %>
        <%-- snip --%>

    and I can see the client validation scripts have been added to the bottom of the form. But still, client validation never happens. What is worse is that in my upgraded project, the client validation scripts are never output in the page!

    PLEASE NOTE: I am specifically asking about the version of MVC2 that came with VS2010 RC. Also, I do know how to google; please don't waste anybody's time searching and answering if you aren't familiar with this issue in the release candidate of Visual Studio. Thanks.
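    (An observation rather than a confirmed fix for this question: out of the box, MVC 2's client validation emits hooks for the Microsoft Ajax validation scripts, so MicrosoftMvcValidation.js generally needs MicrosoftAjax.js loaded before it, and jquery.validate.js alone is not wired up by the default templates. A hedged sketch of the include order under that assumption:)

        <script type="text/javascript" src="<%= Url.Content("~/Scripts/MicrosoftAjax.js") %>"></script>
        <script type="text/javascript" src="<%= Url.Content("~/Scripts/MicrosoftMvcValidation.js") %>"></script>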

  • Visual Studio 2008 project file does not load because of an unexpected encoding change.

    - by Xenan
    In our team we have a database project in Visual Studio 2008 which is under source control in Team Foundation Server. Every two weeks or so, after one co-worker checks in, the project file won't load on the other developers' machines. The error message is:

        The project file could not be loaded. Data at the root level is invalid. Line 1, position 1.

    When I look at the project file in Notepad++, the file looks like this:

        ??<NUL?NULxNULmNULlNUL NULvNULeNULrNULsNULiNULoNULnNUL ...

    and so on (you can see <?xml version in this), whereas a normal project file looks like:

        <?xml version="1.0" encoding="utf-16"?>
        ...

    So probably something is wrong with the encoding of the file. This is a problem for us because it turns out to be impossible to get the file encoding correct again. The 'solution' is to throw away the project file and get the last known working version from source control. According to the file, the encoding should be UTF-16. According to Notepad++, the corrupted file is actually UTF-8.

    My questions are:

    - Why is Visual Studio messing up the encoding of the project file, apparently at random times and on random machines?
    - What should we do to prevent this?
    - When it has happened, is there a possibility to restore the current file in the correct encoding instead of pulling an older version from source control?

    As a last note: the problem is with one single project file; all other project files don't expose this problem.

  • Visual Studio embedded Crystal Report keeps prompting database login?

    - by phill
    I'm using Visual Studio 2005 to develop a form with a combobox passing a value into the parameter of an embedded Crystal Report. I'm trying to figure out why it keeps prompting me for a database login every single time you try to run the report with a different combobox selection. Here is my code:

        Private Sub Form1_load...
            Dim ConnName As String
            Dim ServerName As String
            Dim DBName As String
            Dim user As String
            Dim pass As String
            Dim gDBA As ADODB.Connection
            Dim records As ADODB.Recordset
            Dim datver As ADODB.Recordset
            Dim query As String

            '--- OPEN THE DATABASE CONNECTIONS
            gDBA = New ADODB.Connection ': gDBA.CursorLocation = adUseServer
            'Added to prevent time out error
            gDBA.CommandTimeout = 1000 : gDBA.ConnectionTimeout = 1000
            gDBA.ConnectionString = "Server=svr13;Database=subscribers;User ID=KViews;Password=Solution;Trusted_Connection=True;"
            gDBA.Open("Data Source=Kaseya;Initial Catalog=subscribers;User Id=KViews;Password=Solution;", "KViews", "Solution")

            records = New ADODB.Recordset
            query = "select distinct groupname from _v_k order by groupname desc"
            'records.ActiveConnection = gDBA.ConnectionString
            records.CursorType = CursorTypeEnum.adOpenForwardOnly
            records.LockType = LockTypeEnum.adLockReadOnly
            records.Open(query, gDBA)

            Do While Not records.EOF
                ComboBox1.Items.Add(records.Fields("groupname").Value)
                records.MoveNext()
            Loop
        End Sub

        Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click
            Dim selected As String
            selected = ComboBox1.Text

            Dim cryRpt As New ReportDocument
            cryRpt.Load("C:\Visual Studio 2005\Projects\WindowsApplication1\WindowsApplication1\CrystalReport1.rpt")
            cryRpt.SetDatabaseLogon("KViews", "Solutions", "svr13", "subscribers")
            cryRpt.SetParameterValue("companyname", selected)

            CrystalReportViewer1.ReportSource = cryRpt
            CrystalReportViewer1.Refresh()
        End Sub

    I looked at this previous posting: http://stackoverflow.com/questions/1132314/database-login-prompt-with-crystal-reports but it wasn't very helpful. I couldn't find where a CMC was to disable the prompt. Any ideas? Thanks in advance.

  • How do I run NUnit in debug mode from Visual Studio?

    - by Jon Cage
    I've recently been building a test framework for a bit of C# I've been working on. I have NUnit set up and a new project within my workspace to test the component. All works well if I load up my unit tests from NUnit (v2.4), but I've got to the point where it would be really useful to run in debug mode and set some breakpoints. I've tried the suggestions from several guides, which all suggest changing the 'Debug' properties of the test project:

        Start external program: C:\Program Files\NUnit 2.4.8\bin\nunit-console.exe
        Command line arguments: /assembly: <full-path-to-solution>\TestDSP\bin\Debug\TestDSP.dll

    I'm using the console version there, but have tried calling the GUI as well. Both give me the same error when I try to start debugging:

        Cannot start test project 'TestDSP' because the project does not contain any tests.

    Is this because I normally load \DSP.nunit into the NUnit GUI and that's where the tests are held? I'm beginning to think the problem may be that VS wants to run its own test framework and that's why it's failing to find the NUnit tests?

    [Edit] To those asking about test fixtures, one of my .cs files in the TestDSP project looks roughly like this:

        namespace Some.TestNamespace
        {
            // Testing framework includes
            using NUnit.Framework;

            [TestFixture]
            public class FirFilterTest
            {
                /// <summary>
                /// Tests that a FirFilter can be created
                /// </summary>
                [Test]
                public void Test01_ConstructorTest()
                {
                    ...some tests...
                }
            }
        }

    ...I'm pretty new to C# and the NUnit test framework, so it's entirely possible I've missed some crucial bit of information ;-)

    [FINAL SOLUTION] The big problem was the project type I'd used. If you pick Other Languages -> Visual C# -> Test -> Test Project when you're choosing the project type, Visual Studio will try to use its own testing framework, as far as I can tell. You should pick a normal C# class library project instead, and then the instructions in my selected answer will work.

  • Visual Studio 2010 randomly unable to debug WCF service.

    - by rossisdead
    I'm running Visual Studio 2010 on a Windows 7 x64 machine, and occasionally VS gives me the good old "The remote procedure could not be debugged. This usually indicates that debugging has not been enabled on the server" error that a lot of people ask about. My problem, though, is that it seems to do this randomly (it can happen anywhere from a few minutes to a few hours in), and after I've made plenty of successful calls to the service already. It doesn't prevent the service from working: it still returns values and doesn't throw any errors. The only difference is that the annoying dialog pops up every time I start to debug my application. I should mention that I'm connecting to the WCF service from a WPF application. If I launch the web site the service is part of, I don't get the dialog.

    A few of the things I've tried that do not work:

    - Killing and restarting the server.
    - Compiling the web server in x86.
    - Enabling tracing, but couldn't find any problems.

    Is this just a bug in Visual Studio 2010, or is there something I'm missing?

  • .NET MVC: How to fix Visual Studio's lack of awareness of CSS classes in partial views?

    - by Mega Matt
    Hi all, This has been sort of an annoyance for me for a while. I make pretty heavy use of partial views in MVC, and am using Visual Studio 2008 to develop. The problem is that when I give html elements a class in a partial view (<div class="someClass">), it will underline them in green like it doesn't know what they are. I realize this is because I'm in a partial view, and haven't put link tags anywhere in that file for it to know where the CSS is (the link tags are in the main view that renders the partial view). The CSS still works fine on my site because the browser will render all views as one long html page anyway, but it's really annoying to look through my partial views and see all of my classes underlined in green. Is there a way that I can still tell Visual Studio that those classes exist somewhere, from the partial view? I figured there has to be a way to let it know, but am not sure what it is. Maybe a way to import the stylesheets from the parent view? Thanks for your help.

  • Was Visual Studio 2008 or 2010 written to use multiple cores?

    - by Erx_VB.NExT.Coder
    Basically I want to know if the Visual Studio IDE and/or compiler in 2010 was written to make use of a multi-core environment (I understand we can target multi-core environments in '08 and '10, but that is not my question). I am trying to decide whether I should get a higher-clocked dual core or a lower-clocked quad core, as I want to figure out which processor will give me the absolute best possible experience with Visual Studio 2010 (IDE and background compiler).

    If the most important section (background compiler and other IDE tasks) runs in a single core, then that core will hit its limit sooner on a lower-clocked quad core, especially if the background compiler is the heaviest task. I would imagine this would be difficult to separate into more than one process, so even if VS uses multiple cores you might still be better off going for a higher-clocked CPU if the majority of the processing is still bound to occur in one core (i.e. the most significant part of the VS environment).

    I am a VB programmer; they've made great performance improvements in Beta 2, congrats, but I would love to be able to use VS seamlessly... anyone have any ideas? Thanks, erx

  • Creating an ASP.NET Database using MS SQL 2008 in Visual Web Developer 2008

    This article illustrates how to create a database in ASP.NET. We'll be using Microsoft SQL Server 2008 and developing in Visual Web Developer Express 2008. Given the importance of databases to most websites nowadays, you should find this information useful when building just about any website based on Microsoft technology.

  • Visual Studio 2010 released!

    - by Daniel Moth
    Visual Studio 2010 releases to the world today. Get the full story from Soma's blog post (including links to buy, try, etc.). Our team is very proud of what we have contributed to this release, and you can learn more about it through our content on the Parallel Computing MSDN home. Comments about this post are welcome at the original blog.

  • Visual Studio 2010 and .NET Framework 4 IDE Enhancements – Part 3

    In my previous article I explained some of the nice features related to the IDE. In continuation, I am going to explain the Add Reference enhancements for developers, Windows 7 support for developers, SharePoint 2010 enhancements, Office Business Application support, cloud development, Document Map Margin and Visual Studio 2010 tips.

  • [News] Visual Studio 2010 RC available

    Rumours were circulating yesterday about the imminent availability of VS 2010 RC and .NET v4; Microsoft announced it this morning: "Today I'm pleased to announce we have shipped the RC for Visual Studio 2010 / .NET Framework 4! MSDN subscribers can download the bits immediately from this location. The RC will be made available to the public on Wednesday February 10." Needless to say, this version is a major release in the history of .NET.

  • Visual Studio 2010 Released

    - by Latest Microsoft Blogs
    It's a big day at Microsoft today, as Visual Studio 2010 officially releases. There's a lot going on with this release, and I thought I'd do a big rollup post with lots of details and context to help you find your way to the information and... (read more)

  • Attach to Process in Visual Studio

    - by Daniel Moth
    One option for achieving step 1 in the live debugging process is attaching to an already running instance of the process that hosts your code, and this is a good place for me to talk about debug engines. You can attach to a process by selecting the "Debug" menu and then the "Attach To Process…" menu in Visual Studio 11 (Ctrl+Alt+P with my keyboard bindings), and you should see something like this screenshot:

    I am not going to explain this UI; besides being fairly intuitive, there is good documentation on MSDN for the Attach dialog. I do want to focus on the row of controls that starts with the "Attach to:" label and ends with the "Select..." button. Between them is the read-only textbox that indicates the debug engine that will be used for the selected process if you click the "Attach" button. If you haven't encountered that term before, read on MSDN about debug engines.

    Notice that the "Type" column shows the code type(s) that can be detected for the process. Typically each debug engine knows how to debug a specific code type (the two terms tend to be used interchangeably). If you click on a different process in the list with a different code type, the debug engine used will be different. However, note that this is the automatic behavior. If you believe you know best, or, more typically, you want to choose the debug engine for a process using more than one code type, you can do so by clicking the "Select..." button, which should yield a "Select Code Type" dialog like this one:

    In this dialog you can switch to the debug engine you want to use by checking the box in front of your desired one, then hitting "OK", then hitting "Attach" to use it. Notice that the dialog suggests that you can select more than one. Not all combinations work (you'll get an error if you select two incompatible debug engines), but some do. Also notice in the list of debug engines one of the new players in Visual Studio 11, the GPU debug engine; I will be covering that on the C++ AMP team blog (and no, it cannot be combined with any others in this release).

    Comments about this post by Daniel Moth are welcome at the original blog.

  • 2010 Visual C# MVP Award

    - by Reed
    I received a pleasant surprise today.  I was presented this morning with the 2010 Microsoft® MVP Award for Visual C#.  According to the award email, this “award is given to exceptional technical community leaders who actively share their high quality, real world expertise with others.” I feel honored and proud to receive this award, and hope that I can continue to be a valuable member of the community in the future.  Thank you to everyone who nominated me!
