Search Results

Search found 26167 results on 1047 pages for 'visual programming langua'.

Page 222/1047 | < Previous Page | 218 219 220 221 222 223 224 225 226 227 228 229  | Next Page >

  • Simple MSBuild Configuration: Updating Assemblies With A Version Number

    - by srkirkland
    When distributing a library you often run up against versioning problems, one facet of which is simply determining which version of that library your client is running.  Of course, each project in your solution has an AssemblyInfo.cs file which provides, among other things, the ability to set the assembly name and version number.  Unfortunately, setting the assembly version here would require not only changing the version manually for each build (depending on your schedule), but keeping it in sync across all projects.  There are many ways to solve this versioning problem, and in this blog post I'm going to try to explain what I think is the easiest and most flexible solution.  I will walk you through using MSBuild to create a simple build script, and I'll even show how to (optionally) integrate with a TeamCity build server.  All of the code from this post can be found at https://github.com/srkirkland/BuildVersion. Create CommonAssemblyInfo.cs The first step is to create a common location for the repeated assembly info that is spread across all of your projects.  Create a new solution-level file (I usually create a Build/ folder in the solution root, but anywhere reachable by all your projects will do) called CommonAssemblyInfo.cs.  In here you can put any information common to all your assemblies, including the version number.  An example CommonAssemblyInfo.cs is as follows: using System.Reflection; using System.Resources; using System.Runtime.InteropServices; [assembly: AssemblyCompany("University of California, Davis")] [assembly: AssemblyProduct("BuildVersionTest")] [assembly: AssemblyCopyright("Scott Kirkland & UC Regents")] [assembly: AssemblyConfiguration("")] [assembly: AssemblyTrademark("")] [assembly: ComVisible(false)] [assembly: AssemblyVersion("1.2.3.4")] //Will be replaced [assembly: NeutralResourcesLanguage("en-US")] Cleanup AssemblyInfo.cs & Link CommonAssemblyInfo.cs For each of your projects, you'll want to clean up your assembly info to contain only information that is unique to that assembly – everything else will go in the CommonAssemblyInfo.cs file.  For most of my projects, that just means setting the AssemblyTitle, though you may feel AssemblyDescription is warranted.
An example AssemblyInfo.cs file is as follows: using System.Reflection; [assembly: AssemblyTitle("BuildVersionTest")] Next, you need to "link" the CommonAssemblyInfo.cs file into your projects right beside your newly lean AssemblyInfo.cs file.  To do this, right click on your project and choose Add | Existing Item from the context menu.  Navigate to your CommonAssemblyInfo.cs file, but instead of clicking Add, click the little down-arrow next to Add and choose "Add as Link."  You should see a little link graphic similar to this: We've actually reduced complexity a lot already, because if you build, all of your assemblies will have the same common info, including the product name and our static (fake) assembly version.  Let's take this one step further and introduce a build script. Create an MSBuild file What we want from the build script (for now) is basically just to have the common assembly version number changed via a parameter (eventually to be passed in by the build server) and then for the project to build.  Also we'd like to have the flexibility to define which build configuration to use (debug, release, etc.). In order to find/replace the version number, we are going to use a regular expression to find and replace the text within your CommonAssemblyInfo.cs file.  There are many other ways to do this using community build task add-ins, but since we want to keep it simple let's just define the regular expression task manually in a new file, Build.tasks (this example taken from the NuGet build.tasks file).
<?xml version="1.0" encoding="utf-8"?> <Project ToolsVersion="4.0" DefaultTargets="Go" xmlns="http://schemas.microsoft.com/developer/msbuild/2003"> <UsingTask TaskName="RegexTransform" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll"> <ParameterGroup> <Items ParameterType="Microsoft.Build.Framework.ITaskItem[]" /> </ParameterGroup> <Task> <Using Namespace="System.IO" /> <Using Namespace="System.Text.RegularExpressions" /> <Using Namespace="Microsoft.Build.Framework" /> <Code Type="Fragment" Language="cs"> <![CDATA[ foreach(ITaskItem item in Items) { string fileName = item.GetMetadata("FullPath"); string find = item.GetMetadata("Find"); string replaceWith = item.GetMetadata("ReplaceWith"); if(!File.Exists(fileName)) { Log.LogError(null, null, null, null, 0, 0, 0, 0, String.Format("Could not find version file: {0}", fileName), new object[0]); } string content = File.ReadAllText(fileName); File.WriteAllText( fileName, Regex.Replace( content, find, replaceWith ) ); } ]]> </Code> </Task> </UsingTask> </Project> .csharpcode, .csharpcode pre { font-size: small; color: black; font-family: consolas, "Courier New", courier, monospace; background-color: #ffffff; /*white-space: pre;*/ } .csharpcode pre { margin: 0em; } .csharpcode .rem { color: #008000; } .csharpcode .kwrd { color: #0000ff; } .csharpcode .str { color: #006080; } .csharpcode .op { color: #0000c0; } .csharpcode .preproc { color: #cc6633; } .csharpcode .asp { background-color: #ffff00; } .csharpcode .html { color: #800000; } .csharpcode .attr { color: #ff0000; } .csharpcode .alt { background-color: #f4f4f4; width: 100%; margin: 0em; } .csharpcode .lnum { color: #606060; } If you glance at the code, you’ll see it’s really just going a Regex.Replace() on a given file, which is exactly what we need. Now we are ready to write our build file, called (by convention) Build.proj. <?xml version="1.0" encoding="utf-8"?> <Project ToolsVersion="4.0" DefaultTargets="Go" xmlns="http://schemas.microsoft.com/developer/msbuild/2003"> <Import Project="$(MSBuildProjectDirectory)\Build.tasks" /> <PropertyGroup> <Configuration Condition="'$(Configuration)' == ''">Debug</Configuration> <SolutionRoot>$(MSBuildProjectDirectory)</SolutionRoot> </PropertyGroup>   <ItemGroup> <RegexTransform Include="$(SolutionRoot)\CommonAssemblyInfo.cs"> <Find>(?&lt;major&gt;\d+)\.(?&lt;minor&gt;\d+)\.\d+\.(?&lt;revision&gt;\d+)</Find> <ReplaceWith>$(BUILD_NUMBER)</ReplaceWith> </RegexTransform> </ItemGroup>   <Target Name="Go" DependsOnTargets="UpdateAssemblyVersion; Build"> </Target>   <Target Name="UpdateAssemblyVersion" Condition="'$(BUILD_NUMBER)' != ''"> <RegexTransform Items="@(RegexTransform)" /> </Target>   <Target Name="Build"> <MSBuild Projects="$(SolutionRoot)\BuildVersionTest.sln" Targets="Build" /> </Target>   </Project> Reviewing this MSBuild file, we see that by default the “Go” target will be called, which in turn depends on “UpdateAssemblyVersion” and then “Build.”  We go ahead and import the Bulid.tasks file and then setup some handy properties for setting the build configuration and solution root (in this case, my build files are in the solution root, but we might want to create a Build/ directory later).  The rest of the file flows logically, we setup the RegexTransform to match version numbers such as <major>.<minor>.1.<revision> (1.2.3.4 in our example) and replace it with a $(BUILD_NUMBER) parameter which will be supplied externally.  
The first target, "UpdateAssemblyVersion", just runs the RegexTransform, and the second target, "Build", just runs the default MSBuild on our solution. Testing the MSBuild file locally Now we have a build file which can replace assembly version numbers and build, so let's set up a quick batch file to be able to build locally.  To do this you simply create a file called Build.cmd and have it call MSBuild on your Build.proj file.  I've added a bit more flexibility so you can specify the build configuration and version number, which makes your Build.cmd look as follows: set config=%1 if "%config%" == "" ( set config=debug ) set version=%2 if "%version%" == "" ( set version=2.3.4.5 ) %WINDIR%\Microsoft.NET\Framework\v4.0.30319\msbuild Build.proj /p:Configuration="%config%" /p:build_number="%version%" Now if you click on the Build.cmd file, you will get a default debug build using the version 2.3.4.5.  Let's run it in a command window with the parameters set for a release build version 2.0.1.453.   Excellent!  We can now run one simple command and govern the build configuration and version number of our entire solution.  Each DLL produced will have the same version number, making determining which version of a library you are running very simple and accurate. Configure the build server (TeamCity) Of course you are not really going to want to run a build command manually every time, and typing in incrementing version numbers will also not be ideal.  A good solution is to have a computer (or set of computers) act as a build server and build your code for you, providing you a consistent environment, excellent reporting, and much more.  One of the most popular build servers is JetBrains' TeamCity, and this last section will show you the few configuration parameters to use when setting up a build using your MSBuild file created earlier.  If you are using a different build server, the same principles should apply. First, when setting up the project you want to specify the "Build Number Format," often given in the form <major>.<minor>.<revision>.<build>.  In this case you will set major/minor manually, and optionally revision (or you can use your VCS revision number with %build.vcs.number%), and then build using the {0} wildcard.  Thus your build number format might look like this: 2.0.1.{0}.  During each build, this value will be created and passed into the $(BUILD_NUMBER) property of our Build.proj file, which then uses it to decorate your assemblies with the proper version. After setting up the build number, you must choose MSBuild as the Build Runner, then provide a path to your build file (Build.proj).  After specifying your MSBuild Version (equivalent to your .NET Framework Version), you have the option to specify targets (the default being "Go") and additional MSBuild parameters.
The one parameter that is often useful is manually setting the configuration property (/p:Configuration="Release") if you want something other than the default (which is Debug in our example).  Your resulting configuration will look something like this: [Under General Settings] [Build Runner Settings]   Now every time your build is run, a newly incremented build version number will be generated and passed to MSBuild, which will then version your assemblies and build your solution.   A Quick Review Our goal was to version our output assemblies in an automated way, and we accomplished it by performing a few quick steps: move the common assembly information, including version, into a linked CommonAssemblyInfo.cs file; create a simple MSBuild script to replace the common assembly version number and build your solution; and direct your build server to use the created MSBuild script. That's really all there is to it.  You can find all of the code from this post at https://github.com/srkirkland/BuildVersion. Enjoy!
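
    Once the build has stamped your assemblies, a quick way to confirm which version of the library a client is actually running (a small sketch of mine, not from the original post; "BuildVersionTest" is just the example assembly name used above) is to read the version back via reflection:

        using System;
        using System.Reflection;

        class VersionCheck
        {
            static void Main()
            {
                // Load the versioned library and read the AssemblyVersion the build stamped into it.
                Assembly lib = Assembly.Load("BuildVersionTest");
                Version version = lib.GetName().Version;

                Console.WriteLine("{0} is running version {1}", lib.GetName().Name, version);
            }
        }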

    Read the article

  • Best practices that you disagree with

    - by SnOrfus
    'Best practices' is a bit of a fuzzy term. Recently I've gone through another wave of self-improvement in my coding practices (mostly brought on by reading Clean Code), and I find that there are some things I disagree with. I'd hate to take things at face value and not think about them critically, but I wonder whether or not my thinking is wrong. So I wonder: what are some best practices that many of your peers seem to agree with but that you disagree with? For the time being, I'm speaking strictly of coding practices.

    Read the article

  • What are developers' problems with helpful error messages?

    - by Moo-Juice
    It continues to astound me that, in this day and age, products that have years of use under their belt, built by teams of professionals, still fail to provide helpful error messages to the user. In some cases, the addition of just a little piece of extra information could save a user hours of trouble. A program that generates an error generated it for a reason. It has everything at its disposal to inform the user as much as it can about why something failed. And yet it seems that providing information to aid the user is a low priority. I think this is a huge failing. One example is from SQL Server. When you try to restore a database that is in use, it quite rightly won't let you. SQL Server knows what processes and applications are accessing it. Why can't it include information about the process(es) that are using the database? I know not everyone passes an Application Name attribute on their connection string, but even a hint about the machine in question could be helpful. Another candidate, also SQL Server (and MySQL), is the lovely "string or binary data would be truncated" error message and its equivalents. A lot of the time, a simple perusal of the SQL statement that was generated and the table shows which column is the culprit. This isn't always the case, and if the database engine picked up on the error, why can't it save us that time and just tell us which damned column it was? On this example, you could argue that there may be a performance hit to checking it and that this would impede the write. Fine, I'll buy that. How about, once the database engine knows there is an error, it does a quick comparison after the fact between the values that were going to be stored and the column lengths, then displays that to the user? ASP.NET's horrid Table Adapters are also guilty. Queries can be executed and one can be given an error message saying that a constraint somewhere is being violated. Thanks for that. Time to compare my data model against the database, because the developers are too lazy to provide even a row number, or example data. (For the record, I'd never use this data-access method by choice; it's just a project I have inherited!) Whenever I throw an exception from my C# or C++ code, I provide everything I have at hand to the user. The decision has been made to throw it, so the more information I can give, the better. Why did my function throw an exception? What was passed in, and what was expected? It takes me just a little longer to put something meaningful in the body of an exception message. Hell, it does nothing but help me whilst I develop, because I know my code throws things that are meaningful. One could argue that complicated exception messages should not be displayed to the user. Whilst I disagree with that, it is an argument that can easily be appeased by having a different level of verbosity depending on your build. Even then, the users of ASP.NET and SQL Server are not your typical users, and would prefer something full of verbosity and yummy information because they can track down their problems faster. Why do developers think it is okay, in this day and age, to provide the bare minimum amount of information when an error occurs? It's 2011 guys, come on.
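
    A minimal sketch of the kind of context-rich exception the author is describing (the class, method, parameter names and message here are invented purely for illustration):

        using System;

        class DiscountCalculator
        {
            public decimal GetDiscount(string customerCode, decimal orderTotal)
            {
                if (orderTotal < 0)
                {
                    // Say what was passed in and what was expected, not just "invalid argument".
                    throw new ArgumentOutOfRangeException(
                        "orderTotal",
                        orderTotal,
                        string.Format(
                            "Order total must be non-negative; received {0} for customer '{1}'.",
                            orderTotal, customerCode));
                }

                return orderTotal > 100m ? orderTotal * 0.1m : 0m;
            }
        }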

    Read the article

  • On improving commit practices

    - by greengit
    I was thinking about ways of improving my commit practices. Is there any correlation between the number of source code lines and the number of commits? In a recent project that I was involved in, I was going at about 30 commits per 1000 lines. One typical file from the project has these stats: language: JavaScript; total commits that include this file: 32; total lines: 1408; source lines: 1140; comment lines: 98; function declarations: 28; other declarations: 8. Another file has these: language: Python; total commits that include this file: 17; total lines: 933; source lines: 730; comment lines: 80; classes: 1; methods: 10. I also think that the number of commits is more related to the number of features or the number of changes to the code, and less to the number of lines. The general git community motto is to make short commits and commit often. So, do you really think about your commit strategy before you start a project? For that matter, is there anything like a commit strategy? If so, what's yours?

    Read the article

  • Should a newcomer to Perl learn both Perl 5 and 6?

    - by M. Joanis
    Hello! I have started playing around with Perl 5 lately, and it seems very interesting. I would like to spend some time learning it more in depth when I can. My question, since Perl 6 is slowly taking shape (I believe...) and is said to break backward compatibility, is this: am I better off learning Perl 5 and then Perl 6, or is learning Perl 6 directly a better time investment in your opinion? If the changes from Perl 5 to 6 make it hard to understand Perl 5, I should certainly start with Perl 5 to be able to read "old" scripts, then check out Perl 6. There is also the "Perl 6 is not yet completely released" problem. I know there's an early-adopter implementation of Perl 6, but if Perl 6 isn't officially released for some more years, I'll stick with 5 for now. I would certainly like some insight on this. Feel free to discuss related stuff. My interest in scripting languages is fairly new. Thanks!

    Read the article

  • Is CUDA, cuBLAS or cuBLAS-XT the right place to start with for machine learning?

    - by Stefan R. Falk
    I am not sure if this is the right forum to post this question – but it surely is not a question for Stack Overflow. I am working on my bachelor thesis and am therefore implementing a so-called Echo State Network, which is basically an artificial neural network that has a large reservoir of randomly initialized neurons and just a few input and output neurons... but I think we can skip that. The thing is, there is a Python library called Theano which I am using for this implementation. It encapsulates the CUDA API and offers a quite "comfortable" way to access the power of an NVIDIA graphics card. Since CUDA 6.0 there is a sub-library called cuBLAS (Basic Linear Algebra Subprograms) for linear algebra operations, and also cuBLAS-XT, an extension which allows calculations to run on multiple graphics cards. My question at this point is whether it would make sense to start using cuBLAS and/or cuBLAS-XT right now, since the API is quite complex, or rather wait for libraries that will build on top of them (as Theano does on basic CUDA)? If you think this is the wrong place for this question please tell me which one is. Thank you.

    Read the article

  • Problem converting FBX file into XNB

    - by Dado
    I created a MonoGame Content Project to convert assets into XNB. For an FBX file without a texture there is no problem: the file is correctly converted, and when I load the XNB into my project everything is ok. The problem occurs when the FBX file has a texture map associated with it: in this case both the FBX and PNG files are converted to XNB, but when I try to load these XNB files into my project the following problem occurs: "ContentLoadException: Could not load Models/maze1 asset as a non-content file!" Note: maze1 is the XNB file that was converted from the FBX. How can I solve this problem? Thank you in advance
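
    For reference, a minimal loading sketch (mine, not from the question; it assumes the content project's output .xnb files end up under a Content folder and the model keeps the asset name Models/maze1). The "non-content file" error typically means the runtime found a raw copy of the file instead of the processor-built .xnb at that path:

        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        public class MazeGame : Game
        {
            GraphicsDeviceManager graphics;
            Model maze;

            public MazeGame()
            {
                graphics = new GraphicsDeviceManager(this);
                // Must point at the folder that holds the compiled .xnb files.
                Content.RootDirectory = "Content";
            }

            protected override void LoadContent()
            {
                // Loads Content/Models/maze1.xnb; the texture .xnb referenced by the
                // model is resolved automatically from its relative path.
                maze = Content.Load<Model>("Models/maze1");
            }
        }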

    Read the article

  • using ‘using’ and scope. Not try finally!

    - by Edward Boyle
    An object that implements IDisposable has, you guessed it, a Dispose() method. In the code you write, you should both declare and instantiate any object that implements IDisposable with the using statement. The using statement allows you to set the scope of an object, and when your code exits that scope, the object will be disposed of. Note that when an exception occurs, this will pull your code out of scope, so it still forces a Dispose() using (mObject o = new mObject()) { // do stuff } //<- out of Scope, object is disposed. // Note that you can also use multiple objects using // the using statement if of the same type: using (mObject o = new mObject(), o2 = new mObject(), o3 = new mObject()) { // do stuff } //<- out of Scope, objects are disposed. What about try{ }finally{}? It is not needed when you use the using statement. Additionally, using is preferred; Microsoft's own documentation puts it this way: As a rule, when you use an IDisposable object, you should declare and instantiate it in a using statement. When I started out in .NET I had a very bad habit of not using the using statement. As a result I ran into what many developers do: #region BAD CODE - DO NOT DO try { mObject o = new mObject(); //do stuff } finally { o.Dispose(); // error - o is out of scope, no such object. } // and here is what I find on blogs all over the place as a solution // pox upon them for creating bad habits. mObject o = new mObject(); try { //do stuff } finally { o.Dispose(); } #endregion So when should I use the using statement? Very simple rule: if an object implements IDisposable, use it. This of course does not apply if the object is going to be used as a global object outside of a method. If that is the case, don't forget to dispose of the object in code somewhere. It should be made clear that using the try{}finally{} code block is not going to break your code, nor cause memory leaks. It is perfectly acceptable coding practice, just not best coding practice in C#. This is how VB.NET developers must code, as there is no using equivalent for them to use.
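
    For reference, here is roughly what the compiler turns a using block into (a sketch of mine; mObject is the hypothetical disposable type from the post), which is why writing the try/finally by hand buys you nothing in C#:

        using System;

        class mObject : IDisposable
        {
            public void Dispose()
            {
                // release resources here
            }
        }

        class Demo
        {
            static void Main()
            {
                // What "using (mObject o = new mObject()) { /* do stuff */ }"
                // roughly expands to:
                mObject o = new mObject();
                try
                {
                    // do stuff
                }
                finally
                {
                    if (o != null)
                    {
                        ((IDisposable)o).Dispose();
                    }
                }
            }
        }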

    Read the article

  • Display ‘–Select–’ in an ASP.NET DropDownList

    - by Ken Cox [MVP]
    A purchaser of my book writes: “I would like a drop down list to display the text: "-Select-" initially instead of the first value of the data it is bound to.” Here you go…   <%@ Page Language="VB" %> <script runat="server">     Protected Sub Page_Load(ByVal sender As Object, _                            ...(read more)
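
    The excerpt cuts off before the answer, but the usual technique (a sketch of mine, in C# rather than the VB of the original, with a made-up data source) is to bind the list and then insert a placeholder item at the top:

        using System;
        using System.Collections.Generic;
        using System.Web.UI;
        using System.Web.UI.WebControls;

        public class SelectDemoPage : Page
        {
            // In a real page this control would be declared in the .aspx markup.
            protected DropDownList CategoryList = new DropDownList();

            protected void Page_Load(object sender, EventArgs e)
            {
                if (!IsPostBack)
                {
                    CategoryList.DataSource = new List<string> { "Books", "Music", "Games" };
                    CategoryList.DataBind();

                    // "-Select-" appears first; the empty value lets a
                    // RequiredFieldValidator (InitialValue="") flag "nothing chosen".
                    CategoryList.Items.Insert(0, new ListItem("-Select-", ""));
                }
            }
        }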

    Read the article

  • Microsoft’s Contribution to jQuery – Client Templating

    - by joelvarty
    I am interested to see the community's response to Microsoft's contributions to jQuery.  I have been using jTemplates on and off in my apps for a while, but I will certainly check out the new templating plugins put forth by MS and explained here by Scott Guthrie. It may be that some are against the very idea of a company like Microsoft being involved with jQuery, and Scott explains the process with the following: "jQuery has a fantastic developer community, and a very open way to propose suggestions and make contributions.  Microsoft is following the same process to contribute to jQuery as any other member of the community." I think we can take this in one of two ways: it's great that Microsoft sees themselves as part of a greater community that they can support, or it's the first step in Microsoft's attempt to usurp the community and have greater control over the web, its standards, and its developer community. Personally, I believe Microsoft sees the world (and the web) differently from how they did back when IE had more than 80% of the browser market.  Now, in order to keep its development products relevant, they are pushing ASP.NET (as they have been for a few years) towards a more open strategy that's more "web-like", in my opinion. These contributions to jQuery are a good thing, I think.  Now, let's go try out these new plug-ins and see if they stack up… more later - joel

    Read the article

  • Code snippets for ASP.NET MVC2 in VS 2010

    - by rajbk
    VS 2010 comes with ready-made snippets which help you save time while coding. You insert a snippet by typing the name of the code snippet and hitting the Tab key twice. You can also use the following method if you wish to see a listing of available snippets: press Ctrl + K, Ctrl + X, select ASP.NET MVC2 with the arrow keys and hit Enter to see a list of snippets available.   The MVC-related snippets you get out of the box (for C#) are listed below: HTML actionlink Markup snippet for an ASP.NET MVC action link helper <%= Html.ActionLink("linktext", "actionname") %>   beginformajaxcs Markup snippet for an ASP.NET MVC AJAX-enabled form helper in C# <% using (Ajax.BeginForm("actionname", new AjaxOptions {UpdateTargetId= "elementid" })) { %> <% } %>   beginformcs Markup snippet for an ASP.NET MVC form helper in C# <% using (Html.BeginForm()) { %> <% } %>   displayforcs Markup snippet for an ASP.NET MVC templated helper. <%= Html.DisplayFor(x => x.Property) %>   editorforcs Markup snippet for an ASP.NET MVC templated helper. <%= Html.EditorFor(x => x.Property) %>   foreachcs Markup snippet for an ASP.NET MVC foreach statement in C# <% foreach (var item in collection) { %> <% } %>   ifcs Markup snippet for a code-nugget if statement in C# <% if (true) { %> <% } %>   ifelsecs Markup snippet for a code-nugget if else statement in C# <% if (true) { %> <% } else { %> <% } %>   renderpartialcs Markup snippet for an ASP.NET MVC partial view rendering in C# <% Html.RenderPartial("viewname"); %>   textboxmvc Markup snippet for an ASP.NET MVC textbox helper <%= Html.TextBox("name") %>   validationsummarymvc Markup snippet for an ASP.NET MVC validation summary helper <%= Html.ValidationSummary() %> CS mvcaction Code snippet for an action. public ActionResult Action() {     return View(); }   mvcpostaction Code snippet for an action via HTTP post. [HttpPost] public ActionResult Action() {     return View(); }   Enjoy!

    Read the article

  • Reading data from an Entity Framework data model through a WCF Data Service

    - by nikolaosk
    This is going to be the fourth post in a series of posts regarding ASP.NET and the Entity Framework and how we can use Entity Framework to access our datastore. You can find the first one here, the second one here and the third one here. I have a post regarding ASP.NET and EntityDataSource; you can read it here. I have 3 more posts on profiling Entity Framework applications; you can have a look at them here, here and here. With the .NET 3.0 Framework, Microsoft introduced WCF. WCF is Microsoft's...(read more)

    Read the article

  • Daily tech links for .net and related technologies - Apr 26-28, 2010

    - by SanjeevAgarwal
    Daily tech links for .net and related technologies - Apr 26-28, 2010 Web Development MVC: Unit Testing Action Filters - Donn ASP.NET MVC 2: Ninja Black Belt Tips - Scott Hanselman Turn on Compile-time View Checking for ASP.NET MVC Projects in TFS Build 2010 - Jim Lamb Web Design List of 25+ New tags introduced in HTML 5 - techfreakstuff 15 CSS Habits to Develop for Frustration-Free Coding - noupe Silverlight, WPF & RIA Essential Silverlight and WPF Skills: The UI Thread, Dispatchers, Background...(read more)

    Read the article

  • Identifier for the “completed” stage of a process: 0, 99, something else?

    - by Arnold Sakhnov
    Say that you are handling a multi-step process (like a complex registration form, with a number of steps the user has to go through in order). You need to be able to save the current state of the process (e.g. so the user can come back to that registration form later and continue from the step where they left off). Obviously, you'll probably want to give each "step" an identifier you can refer to: 1, 2, 3, 4, etc. Your logic will check for this step_id (or whatever you call it) to render the appropriate data. The question: how would you identify the stage after the final step, like the completed registration state (say that you have to give that last "step" its own id; that's how your logic is structured)? Would it be 0, 999, a non-integer value, something else entirely?
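
    One way to sidestep the magic-number debate (a sketch of mine, not from the question) is to make the terminal state a named member of an enum, so code compares against the name and the underlying value stops mattering:

        // The numeric values are incidental; callers test against the named member,
        // so Completed could be 0, 99 or anything else without special-casing.
        public enum RegistrationStep
        {
            AccountDetails = 1,
            Profile = 2,
            Preferences = 3,
            Confirmation = 4,
            Completed = 99
        }

        public class RegistrationState
        {
            public RegistrationStep CurrentStep { get; set; }

            public bool IsFinished
            {
                get { return CurrentStep == RegistrationStep.Completed; }
            }
        }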

    Read the article

  • Speaking at Microsoft's Dutch DevDays

    - by gsusx
    Last week I had the pleasure of presenting two sessions at Microsoft's Dutch DevDays in The Hague. On Tuesday I presented a session about how to implement real-world RESTful services patterns using WCF, WCF Data Services and ASP.NET MVC2. During that session I showed a total of 15 small demos that highlighted how to implement key aspects of RESTful solutions such as Security, LowREST clients, URI modeling, Validation, Error Handling, etc. As part of those demos I used the OAuth implementation created...(read more)

    Read the article

  • Is Learning C++ Through The Qt Framework Really Learning C++

    - by user866190
    The problem I have is that most of the C++ books I read spend almost forever on syntax and the basics of the language, e.g. for and while loops, arrays, lists, pointers, etc. But they never seem to build anything that is simple enough to use for learning, yet practical enough to get you to understand the philosophy and power of the language. Then I stumbled upon Qt, which is an amazing library! But working through the demos they have, it seems like I am now in the reverse dilemma. I feel like the rich man's son driving around in a sports car subsidized by the father. Like I could build fantastic software, but have no clue what's going on under the hood. As an example of my dilemma, take the task of building a simple web browser. In pure C++, I wouldn't even know where to start, yet with the Qt library it can be done within a few lines of code. I am not complaining about this. I am just wondering how to fill the knowledge void between the basic structure of the language and the high-level interface that the Qt framework provides?

    Read the article

  • Free .NET Training at DevCares in Dallas...

    - by [email protected]
    Come take an early look at the debugging experience in VS 2010 this Friday (3/25/2010) at TekFocus in Dallas, at the InfoMart, at 9 AM. In this session, we'll … Dive deep into the new IntelliTrace (formerly historical debugging) feature, which enables you to step back in time within your debugging session and inspect or re-execute code, without having to restart your application See how to manage large numbers of breakpoints with labeling, searching and filtering Extend "data tips" by adding comments, notes and strategically "pinning" these resources to maintain their visibility throughout your session Demonstrate "collaborative debugging," by debugging a portion of an application and then exporting breakpoints and labeled data tips, so that others can leverage your effort without having to start over Leverage these new debugging features in applications built in earlier versions of the .NET Framework through the MultiTargeting features available in VS 2010 You'll walk away with a clear understanding of how you can use this upcoming technology to vastly increase your productivity and build better software. Register to attend ==> http://www.dallasdevcares.com/upcoming-sessions/ DevCares is a monthly series of FREE half-day events sponsored by TekFocus and Microsoft. Targeted specifically at developers, the content is presented by experts on a variety of .NET topics. These briefings include expert testimonials, working demos and sample code designed to help you get the most out of application development with .NET. Events are held on the last Friday of each month at the TekFocus offices in the Infomart near downtown Dallas. TekFocus is a full-service technology training provider with a core business delivering Microsoft-certified technical training and product skills enhancements to customers worldwide.

    Read the article

  • WPF vs WinForms for a VB programmer? [closed]

    - by Jeroen
    I have been asked by a client to develop an application that is basically a screen on which the user can choose several items to pass the time (used in holding cells in mental hospitals, for example). The basic idea is as follows: TV (choosing this will provide the user with a number of TV streams from the interweb) Radio (...) Games (several Flash games, also from the interweb) Music (play local music or streams) Draw something (not the game) Create an email Choose lighting settings for the room etc. etc. I am torn between WinForms and WPF for this project. It seems that WPF is the way to go since there is quite a bit of rich media involved, but I have a 15-year VB background. The project obviously has a deadline and a certain budget that I cannot cross, so if I can avoid starting from scratch that will be nice. Is WPF worth it in this particular case, or can I use WinForms with the incorporation of WPF controls? I would very much like to hear your thoughts/comments/suggestions!

    Read the article

  • Interface contracts – forcing code contracts through interfaces

    - by DigiMortal
    Sometimes we need a way to make different implementations of the same interface follow the same rules. One option is to duplicate the contracts in every implementation, but this is not a good option because then we have duplicated code. The other option is to force the contracts on all implementations at the interface level. In this posting I will show you how to do it using interface contracts and a contracts class. Using the code from my previous example about unit testing code with code contracts, I will go further and force the contracts at the interface level. Here is the code from the previous example. Take a careful look at it because I will talk about some modifications to this code soon. public interface IRandomGenerator {     int Next(int min, int max); }   public class RandomGenerator : IRandomGenerator {     private Random _random = new Random();       public int Next(int min, int max)     {         return _random.Next(min, max);     } }    public class Randomizer {     private IRandomGenerator _generator;       private Randomizer()     {         _generator = new RandomGenerator();     }       public Randomizer(IRandomGenerator generator)     {         _generator = generator;     }       public int GetRandomFromRangeContracted(int min, int max)     {         Contract.Requires<ArgumentOutOfRangeException>(             min < max,             "Min must be less than max"         );           Contract.Ensures(             Contract.Result<int>() >= min &&             Contract.Result<int>() <= max,             "Return value is out of range"         );           return _generator.Next(min, max);     } } If we look at the GetRandomFromRangeContracted() method we can see that the contracts set in this method are applicable to all implementations of the IRandomGenerator interface. Although we can write new implementations as we want, these implementations need exactly the same contracts. If we use the generators somewhere else, the code contracts no longer travel with them. To solve the problem we will force the code contracts at the interface level. NB! To make the following code work you must enable Contract Reference Assembly building in the project settings. Interface contracts and contracts class An interface contains no code – only definitions of the members that an implementing type must have. But code contracts must be defined in the body of the member they are part of. To get over this limitation, code contracts are defined in a separate contracts class. The interface is bound to this class by the ContractClass attribute, and the contracts class refers back to the interface through the ContractClassFor attribute. Here is IRandomGenerator with contracts and its contracts class. I also write a simple fake so we can test the contracts easily based only on an interface mock.
[ContractClass(typeof(RandomGeneratorContracts))] public interface IRandomGenerator {     int Next(int min, int max); }   [ContractClassFor(typeof(IRandomGenerator))] internal sealed class RandomGeneratorContracts : IRandomGenerator {     int IRandomGenerator.Next(int min, int max)     {         Contract.Requires<ArgumentOutOfRangeException>(                 min < max,                 "Min must be less than max"             );           Contract.Ensures(             Contract.Result<int>() >= min &&             Contract.Result<int>() <= max,             "Return value is out of range"         );           return default(int);     } }   public class RandomFake : IRandomGenerator {     private int _testValue;       public RandomFake(int testValue)     {         _testValue = testValue;     }       public int Next(int min, int max)     {         return _testValue;     } } To try out these changes use the following code. var gen = new RandomFake(3);   try {     gen.Next(10, 1); } catch(Exception ex) {     Debug.WriteLine(ex.Message); }   try {     gen.Next(5, 10); } catch(Exception ex) {     Debug.WriteLine(ex.Message); } Now we can force code contracts on all types that implement our IRandomGenerator interface, and we only have to test the interface to make sure that the contracts are defined correctly.

    Read the article

  • Deeper Integration with phpBB

    - by The Official Microsoft IIS Site
    More good news on the Interoperability front: the new phpBB release is now available for installation from the Windows Web Application Gallery and Web Platform Installer (Web PI) for Windows, IIS and SQL Server. Version 3.0.7-PL1 of phpBB takes advantage of a number of features for PHP applications on the Microsoft Web Platform with Windows, IIS and SQL Server, including SQL Server Driver for PHP 1.1 , which provides key interoperability for PHP applications to use SQL Server or SQL Azure for data...(read more)

    Read the article

  • Daily tech links for .net and related technologies - Mar 29-31, 2010

    - by SanjeevAgarwal
    Daily tech links for .net and related technologies - Mar 29-31, 2010 Web Development Querying the Future With Reactive Extensions - Phil Haack Creating an OData API for StackOverflow including XML and JSON in 30 minutes - Scott Hanselman MVC Automatic Menu - Nuri Halperin jqGrid for ASP.NET MVC - TriRand Team Foolproof Provides Contingent Data Annotation Validation for ASP.NET MVC 2 -Nick Riggs Using FubuMVC.UI in asp.net MVC : Getting started - Cannibal Coder Building A Custom ActionResult in MVC...(read more)

    Read the article

  • Introducing Data Annotations Extensions

    - by srkirkland
    Validation of user input is integral to building a modern web application, and ASP.NET MVC offers us a way to enforce business rules on both the client and server using Model Validation.  The recent release of ASP.NET MVC 3 has improved these offerings on the client side by introducing an unobtrusive validation library built on top of jquery.validation.  Out of the box MVC comes with support for Data Annotations (that is, System.ComponentModel.DataAnnotations) and can be extended to support other frameworks.  Data Annotations Validation is becoming more popular and is being baked into many other Microsoft offerings, including Entity Framework, though with MVC it only contains four validators: Range, Required, StringLength and Regular Expression.  The Data Annotations Extensions project attempts to augment these validators with additional attributes while maintaining the clean integration Data Annotations provides. A Quick Word About Data Annotations Extensions The Data Annotations Extensions project can be found at http://dataannotationsextensions.org/, and currently provides 11 additional validation attributes (ex: Email, EqualTo, Min/Max) on top of Data Annotations' original 4.  You can find a current list of the validation attributes on the aforementioned website. The core library provides server-side validation attributes that can be used in any .NET 4.0 project (no MVC dependency). There is also an easily pluggable client-side validation library which can be used in ASP.NET MVC 3 projects using unobtrusive jQuery validation (only the JavaScript files included with MVC 3 are required). On to the Preview Let's say you had the following "Customer" domain model (or view model, depending on your project structure) in an MVC 3 project: public class Customer { public string Email { get; set; } public int Age { get; set; } public string ProfilePictureLocation { get; set; } } When it comes time to create/edit this Customer, you will probably have a CustomerController and a simple form that just uses one of the Html.EditorFor() methods that the ASP.NET MVC tooling generates for you (or you can write yourself).  It should look something like this: With no validation, the customer can enter nonsense for an email address, and can even report their age as a negative number!  With the built-in Data Annotations validation, I could do a bit better by adding a Range to the age, adding a RegularExpression for email (yuck!), and adding some required attributes.  However, I'd still be able to report my age as 10.75 years old, and my profile picture could still be any string.
Let's use Data Annotations along with this project, Data Annotations Extensions, and see what we can get: public class Customer { [Email] [Required] public string Email { get; set; }   [Integer] [Min(1, ErrorMessage="Unless you are benjamin button you are lying.")] [Required] public int Age { get; set; }   [FileExtensions("png|jpg|jpeg|gif")] public string ProfilePictureLocation { get; set; } } Now let's try to put in some invalid values and see what happens: That is very nice validation, all done on the client side (it will also be validated on the server).  Also, the Customer class validation attributes are very easy to read and understand. Another bonus: since Data Annotations Extensions can integrate with MVC 3's unobtrusive validation, no additional scripts are required! Now that we've seen our target, let's take a look at how to get there within a new MVC 3 project. Adding Data Annotations Extensions To Your Project First we will File->New Project and create an ASP.NET MVC 3 project.  I am going to use Razor for these examples, but any view engine can be used in practice.  Now go into the NuGet Extension Manager (right click on references and select Add Library Package Reference) and search for "DataAnnotationsExtensions."  You should see the following two packages: The first package is for server-side validation scenarios, but since we are using MVC 3 and would like comprehensive server and client validation support, click on the DataAnnotationsExtensions.MVC3 package and then click Install.  This will install the Data Annotations Extensions server and client validation DLLs along with David Ebbo's WebActivator (which enables the validation attributes to be registered with MVC 3). Now that Data Annotations Extensions is installed you have all you need to start doing advanced model validation.  If you are already using Data Annotations in your project, just making use of the additional validation attributes will provide client and server validation automatically.  However, assuming you are starting with a blank project, I'll walk you through setting up a controller and model to test with. Creating Your Model In the Models folder, create a new User.cs file with a User class that you can use as a model.  To start with, I'll use the following class: public class User { public string Email { get; set; } public string Password { get; set; } public string PasswordConfirm { get; set; } public string HomePage { get; set; } public int Age { get; set; } } Next, create a simple controller with at least a Create method, and then a matching Create view (note: you can do all of this via the MVC built-in tooling).
Your files will look something like this: UserController.cs: public class UserController : Controller { public ActionResult Create() { return View(new User()); }   [HttpPost] public ActionResult Create(User user) { if (!ModelState.IsValid) { return View(user); }   return Content("User valid!"); } } Create.cshtml: @model NuGetValidationTester.Models.User   @{ ViewBag.Title = "Create"; }   <h2>Create</h2>   <script src="@Url.Content("~/Scripts/jquery.validate.min.js")" type="text/javascript"></script> <script src="@Url.Content("~/Scripts/jquery.validate.unobtrusive.min.js")" type="text/javascript"></script>   @using (Html.BeginForm()) { @Html.ValidationSummary(true) <fieldset> <legend>User</legend> @Html.EditorForModel() <p> <input type="submit" value="Create" /> </p> </fieldset> } In the Create.cshtml view, note that we are referencing jQuery validate and jQuery unobtrusive (jQuery itself is referenced in the layout page).  These MVC 3-included scripts are the only ones you need to enjoy both the basic Data Annotations validation as well as the validation additions available in Data Annotations Extensions.  These references are added by default when you use the MVC 3 "Add View" dialog on a modification template type. Now when we go to /User/Create we should see a form for editing a User. Since we haven't yet added any validation attributes, this form is valid as shown (including no password, no email and an age of 0).  With the built-in Data Annotations attributes we can make some of the fields required, and we could use a range validator of maybe 1 to 110 on Age (of course we don't want to leave out supercentenarians), but let's go further and validate our input comprehensively using Data Annotations Extensions.  Here is the new and improved User.cs model class:
public class User { [Required] [Email] public string Email { get; set; }   [Required] public string Password { get; set; }   [Required] [EqualTo("Password")] public string PasswordConfirm { get; set; }   [Url] public string HomePage { get; set; }   [Integer] [Min(1)] public int Age { get; set; } } Now let's re-run our form and try to use some invalid values: All of the validation errors you see above occurred on the client, without ever even hitting submit.  The validation is also checked on the server, which is a good practice since client validation is easily bypassed. That's all you need to do to start a new project and include Data Annotations Extensions, and of course you can integrate it into an existing project just as easily. Nitpickers Corner ASP.NET MVC 3 futures defines four new data annotations attributes which this project has as well: CreditCard, Email, Url and EqualTo.  Unfortunately, referencing MVC 3 futures necessitates taking a dependency on MVC 3 in your model layer, which may be inadvisable in a multi-tiered project.  Data Annotations Extensions keeps the server- and client-side libraries separate, so using the project's validation attributes doesn't require you to take any additional dependencies in your model layer, while still allowing for the rich client validation experience if you are using MVC 3. Custom Error Messages and Globalization: Since the Data Annotations Extensions are built on top of Data Annotations, you have the ability to define your own static error messages and even to use resource files for very customizable error messages. Available Validators: Please see the project site at http://dataannotationsextensions.org/ for an up-to-date list of the new validators included in this project.  As of this post, the following validators are available: CreditCard Date Digits Email EqualTo FileExtensions Integer Max Min Numeric Url Conclusion Hopefully I've illustrated how easy it is to add server and client validation to your MVC 3 projects, and how easily you can extend the available validation options to meet real-world needs. The Data Annotations Extensions project is fully open source under the BSD license.  Any feedback would be greatly appreciated.  More information than you require, along with links to the source code, is available at http://dataannotationsextensions.org/. Enjoy!
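
    A small illustration of the custom error message point above (a sketch of mine; it assumes the DataAnnotationsExtensions namespace from the NuGet package, and MyValidationMessages stands in for a resource class you would add yourself). The ErrorMessage and ErrorMessageResource* properties come from the standard ValidationAttribute base class, so they should work on the extension attributes the same way they do on the built-in ones:

        using System.ComponentModel.DataAnnotations;
        using DataAnnotationsExtensions;

        // Stand-in for a generated .resx resource class.
        public static class MyValidationMessages
        {
            public static string InvalidUrl { get { return "Please enter a valid URL."; } }
        }

        public class ContactForm
        {
            // Literal custom message.
            [Required]
            [Email(ErrorMessage = "Please enter a valid e-mail address.")]
            public string Email { get; set; }

            // Message pulled from a resource class, for localization.
            [Url(ErrorMessageResourceType = typeof(MyValidationMessages),
                 ErrorMessageResourceName = "InvalidUrl")]
            public string Website { get; set; }
        }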

    Read the article
