Search Results

Search found 9219 results on 369 pages for 'msbuild engine'.

Page 99/369 | < Previous Page | 95 96 97 98 99 100 101 102 103 104 105 106  | Next Page >

  • Why are a visual studio project's command-line settings stored per user? Is it OK to check-in (and

    - by DanO
    We're creating an application that understands some command-line parameters. There are some defaults we would like to supply on the command line when debugging, and these are easily set in the project settings as explained here. The thing is, Visual Studio stores these settings in a *.csproj.user file, and the default settings for integrated source control do not check in *.user files. We would like to just have these default command-line parameters in everyone's IDE when debugging this project. Often (but not always) when Visual Studio guides you into doing things a certain way it is for good reason. We probably don't want to just check in someone's .csproj.user file... right? This question has a few parts: Why does Visual Studio store this particular setting per user? Is there a way to alter this behavior? Would doing so bring bad juju? Under these circumstances is it OK to check in and share a .user file? Is there a better way to accomplish what we are trying to do here? Thank you -

    Read the article

  • NullReferenceException at Microsoft.Silverlight.Build.Tasks.CompileXaml.LoadAssemblies(ITaskItem[] R

    - by Eugene Larchick
    Hi, I updated my Visual Studio 2010 to version 10.0.30319.1 RTM and started getting the following exception during the build:

        System.NullReferenceException: Object reference not set to an instance of an object.
           at Microsoft.Silverlight.Build.Tasks.CompileXaml.LoadAssemblies(ITaskItem[] ReferenceAssemblies)
           at Microsoft.Silverlight.Build.Tasks.CompileXaml.get_GetXamlSchemaContext()
           at Microsoft.Silverlight.Build.Tasks.CompileXaml.GenerateCode(ITaskItem item, Boolean isApplication)
           at Microsoft.Silverlight.Build.Tasks.CompileXaml.Execute()
           at Bohr.Silverlight.BuildTasks.BohrCompileXaml.Execute()

    The code of BohrCompileXaml.Execute is the following:

        public override bool Execute()
        {
            List<TaskItem> pages = new List<TaskItem>();
            foreach (ITaskItem item in SilverlightPages)
            {
                string newFileName = getGeneratedName(item.ItemSpec);
                String content = File.ReadAllText(item.ItemSpec);
                String parentClassName = getParentClassName(content);
                if (null != parentClassName)
                {
                    content = content.Replace("<UserControl", "<" + parentClassName);
                    content = content.Replace("</UserControl>", "</" + parentClassName + ">");
                    content = content.Replace("bohr:ParentClass=\"" + parentClassName + "\"", "");
                }
                File.WriteAllText(newFileName, content);
                pages.Add(new TaskItem(newFileName));
            }
            if (null != SilverlightApplications)
            {
                foreach (ITaskItem item in SilverlightApplications)
                {
                    Log.LogMessage(MessageImportance.High, "Application: " + item.ToString());
                }
            }
            foreach (ITaskItem item in pages)
            {
                Log.LogMessage(MessageImportance.High, "newPage: " + item.ToString());
            }
            CompileXaml xamlCompiler = new CompileXaml();
            xamlCompiler.AssemblyName = AssemblyName;
            xamlCompiler.Language = Language;
            xamlCompiler.LanguageSourceExtension = LanguageSourceExtension;
            xamlCompiler.OutputPath = OutputPath;
            xamlCompiler.ProjectPath = ProjectPath;
            xamlCompiler.RootNamespace = RootNamespace;
            xamlCompiler.SilverlightApplications = SilverlightApplications;
            xamlCompiler.SilverlightPages = pages.ToArray();
            xamlCompiler.TargetFrameworkDirectory = TargetFrameworkDirectory;
            xamlCompiler.TargetFrameworkSDKDirectory = TargetFrameworkSDKDirectory;
            xamlCompiler.BuildEngine = BuildEngine;
            bool result = xamlCompiler.Execute(); // HERE we got the error!

    And the definition of the task:

        <BohrCompileXaml LanguageSourceExtension="$(DefaultLanguageSourceExtension)"
                         Language="$(Language)"
                         SilverlightPages="@(Page)"
                         SilverlightApplications="@(ApplicationDefinition)"
                         ProjectPath="$(MSBuildProjectFullPath)"
                         RootNamespace="$(RootNamespace)"
                         AssemblyName="$(AssemblyName)"
                         OutputPath="$(IntermediateOutputPath)"
                         TargetFrameworkDirectory="$(TargetFrameworkDirectory)"
                         TargetFrameworkSDKDirectory="$(TargetFrameworkSDKDirectory)">
          <Output ItemName="Compile" TaskParameter="GeneratedCodeFiles" />
          <!-- Add to the list of files written. It is used in Microsoft.Common.Targets to clean up for the next clean build -->
          <Output ItemName="FileWrites" TaskParameter="WrittenFiles" />
          <Output ItemName="_GeneratedCodeFiles" TaskParameter="GeneratedCodeFiles" />
        </BohrCompileXaml>

    What can be the reason? And how can I get more information about what is happening inside the CompileXaml class?

    Read the article

  • Can't select anything for build definition process tab

    - by Alexandru-Dan Maftei
    I am trying to create a build definition. I specified the build definition name inside the General tab, the trigger, the workspace, the build controller that I want to use, the drop folder as a network shared location, and the retention policy, but when I go to the Process tab I can't select anything. Does anyone know why I can't select anything inside the Process tab? It looks like it is not enabled; I can't press Show Details because it is not enabled. Thanks!

    Read the article

  • What /else/ causes this?

    - by Mordachai
    MFC Toolbox Library.lib(SimpleFileIO.obj) : error LNK2005: _wcsnlen already defined in libcmtd.lib(wcslen_s.obj)
    fatal error LNK1169: one or more multiply defined symbols found

    This is driving me nuts. Normally, one would get this if the various projects that are part of their solution do not agree on which CRT to use (single-threaded, multi-threaded, release or debug). However, I have been over this thing about 500 times now, and they all agree. Background: this is a VS 2010 project just converted from VS 2008. MFC Toolbox Library.lib is set to compile as a static library, using /MTd, as is the target .exe I am trying to compile in this solution. Further, the solution that this is being converted from (VS 2008) already compiles & links properly!!! So it's not like there is a disagreement between the two .vcproj's - or at least there wasn't before the conversion. Furthermore, the MFC Toolbox Library is used by about 25 other projects in another solution - and in that solution (Master Build English) it compiles & links against those other projects without complaint in both debug and release targets. I have just spent the last hour going over every single project property for this target project (Cimex Header Viewer) vs. several different target exe projects in the Master Build English solution - and I cannot find a difference. They appear to be identical, except that they have different names. I've tried doing a clean & build all. I'm simply out of ideas. Does anyone have a thought on what else I might investigate??? I think I'm ready to start chewing glass. :(

    Read the article

  • VS2008 project with Entity Framework model results in "always dirty" compile

    - by Jeremy Lew
    In VS 2008, I have a simple .csproj that contains an Entity Framework .edmx (V1) file. Every time I build the project, the output DLL is updated, even though nothing has changed. I have reproduced this in the simplest possible project (containing one ordinary .cs file and one edmx model). If I remove the edmx model and build repeatedly, the output assembly will not be touched. If I add the edmx model and build repeatedly, the output assembly is modified each time. This is a problem because the real project is a dependency of dozens of other projects and it is wreaking havoc with build times when working in higher layers of the application. Is this a known problem? Any way to fix it? Thanks!

    Read the article

  • Can I batch based on a Property (not just Items)?

    - by Josh Buedel
    I have a property group, like so:

        <PropertyGroup>
          <Platform>Win32;x64</Platform>
        </PropertyGroup>

    And I want to batch in an Exec task, like so:

        <Exec Command='devenv MySolution.sln /Build "Release|%(Platform)"' />

    But of course, as written I get an error:

        error MSB4095: The item metadata %(Platform) is being referenced without an item name. Specify the item name by using %(itemname.Platform).

    Can I batch tasks on properties that are lists? I suppose I could hack it by creating a placeholder ItemGroup with metadata and batch on that.

    Read the article

  • Best practices for parsing HTML from Wikipedia for iPhone viewing?

    - by ivanTheTerrible
    I am building an iPhone Wikipedia game app that requires modifying the default Wiki HTML a little bit (mostly simplifying the page). So far I am directly downloading the HTML output from en.wikipedia.org/wiki/Article_Foo to a Python Google App Engine app, then modifying its CSS and HTML structure, caching it, and finally outputting it to the iPhone. It works, but I find this method quite tedious; there must be a better way? Please note that I use App Engine not just for parsing the Wiki; the game also requires it to keep the stores, etc., so it is not overkill. Also, I would prefer doing all the work with Python on App Engine, to keep the iPhone client as thin and mobile as possible (XML parsing on the iPhone is no fun). Thanks a lot.

    Read the article

  • Visual Studio build and deploy ordering

    - by mthornal
    We have a VS 2010 solution that includes a few class library projects, a SQL Server 2008 database project and a Wix setup project. We are trying to get to a point where the following happens in the order specified:

    1. Build the class library projects and the database project.
    2. Deploy the database project to generate the deploy .sql script.
    3. Build the Wix setup project.

    The reason for the desired order is that the setup project requires the deployment .sql scripts, as it will use these to generate/update the database on the machine where the msi is run. It seems that there is no way within a Visual Studio solution file to create this type of build/deploy/build order. Is this correct? Thanks

    Read the article

  • How to Create C++ Project Filter/Folder in Visual Studio?

    - by BSalita
    Using Visual Studio 2012, I'd like to create a C++ project folder called "Include Files", which has the same characteristics as the well-known folder "Header Files". That is, the files in Include Files have a cpi extension and will be parsed for use with IntelliSense, and can also be precompiled. I'm able to create the folder, but files within it aren't parsed. I've tried setting the type to C++ Header file. Nothing seems to work. The files work fine when given an hpp extension and put into the Header Files folder.

    Read the article

  • Twitterbot/0.1,gzip(gfe),gzip(gfe)

    - by Pete
    I see a lot of this in my Google App Engine error log:

        05-18 06:44AM 00.897 /NTp9 405 17ms 0cpu_ms 0kb Twitterbot/0.1,gzip(gfe),gzip(gfe)
        128.242.241.133 - - [18/May/2010:06:44:00 -0700] "HEAD /NTp9 HTTP/1.1" 405 124 - "Twitterbot/0.1,gzip(gfe),gzip(gfe)"

    Should I do something about it?

    Read the article

  • Using multiple PaaS Vendors

    - by jpabluz
    I am developing a SaaS app, and I want to decide on a PaaS vendor. Since one of my biggest concerns is uptime, is there an application or service that allows me to use several PaaS vendors (like Azure, Google App Engine, Amazon Web Services, etc.)? I want my application to be able to fail over from one PaaS vendor to another almost instantly without any downtime, to take advantage of the redundancy this provides. This means that I need to be able to use the different services homogeneously.

    Read the article

  • How do you protect your <appid>.appspot.com domain from DDOS attack?

    - by jacob
    If I want to use CloudFlare to help protect my GAE app via its custom domain, I am still vulnerable to attacks directly on the .appspot.com domain. How do I mitigate that? I could force-redirect appspot.com host requests, such as discussed here: http://stackoverflow.com/questions/1364733/block-requests-from-appspot-com-and-force-custom-domain-in-google-app-engine/ But I would still suffer the load of processing the redirect in my app. Are there any other solutions?

    Read the article

  • Determining if you're running on the build server with MSBuild – Easy way

    - by ParadigmShift
    When you're customizing MSBuild in building a Visual Studio project, it often becomes important to determine whether the build is running on the build server or in your development environment. This information can change the way you set up path variables and other conditional tasks. I've found many different answers online. It seems like they all only worked under certain conditions, so none of them were guaranteed to be consistent. So here's the simplest way I've found that has not failed me yet.

        <PropertyGroup>
          <!-- Determine if the current build is running on the build server -->
          <IsBuildServer>false</IsBuildServer>
          <IsBuildServer Condition="'$(BuildUri)' != ''">true</IsBuildServer>
        </PropertyGroup>

    Shahzad Qureshi is a Software Engineer and Consultant in Salt Lake City, Utah, USA. His certifications include: Microsoft Certified System Engineer, 3CX Certified Partner, Global Information Assurance Certification – Secure Software Programmer – .NET. He is the owner of Utah VoIP Store at www.UtahVoIPStore.com and SWS Development at www.swsdev.com and publishes Windows apps under the name Blue Voice.

    Read the article

  • design pattern advice: graph -> computation

    - by csetzkorn
    I have a domain model, persisted in a database, which represents a graph. A graph consists of nodes (e.g. NodeTypeA, NodeTypeB) which are connected via branches. The two generic elements (nodes and branches) will have properties. A graph will be sent to a computation engine. To perform computations the engine has to be initialised like so (simplified pseudo code):

        Engine Engine = new Engine();
        Object ID1 = Engine.AddNodeTypeA(TypeA.Property1, TypeA.Property2, …, TypeA.Propertyn);
        Object ID2 = Engine.AddNodeTypeB(TypeB.Property1, TypeB.Property2, …, TypeB.Propertyn);
        Engine.AddBranch(ID1, ID2);

    Finally the computation is performed like this:

        Engine.DoSomeComputation();

    I am just wondering if there are any relevant design patterns out there which help to achieve the above using good design principles. I hope this makes sense. Any feedback would be very much appreciated.
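    A minimal C# sketch of one workable approach - a visitor/builder that walks the persisted graph and translates it into the engine calls above. The Engine API (AddNodeTypeA/B, AddBranch, DoSomeComputation) is taken from the question; every other type and member name is hypothetical, and the Engine class below is only a placeholder stub so the sketch compiles.

        using System.Collections.Generic;

        // Placeholder stub for the real computation engine described in the question.
        public class Engine
        {
            public object AddNodeTypeA(double p1, double p2) { return new object(); }
            public object AddNodeTypeB(string p1) { return new object(); }
            public void AddBranch(object fromId, object toId) { }
            public void DoSomeComputation() { }
        }

        // Hypothetical domain model: nodes accept a visitor (double dispatch).
        public interface INodeVisitor
        {
            object Visit(NodeTypeA node);   // returns the engine ID handed back by the Add call
            object Visit(NodeTypeB node);
        }

        public abstract class Node
        {
            public abstract object Accept(INodeVisitor visitor);
        }

        public class NodeTypeA : Node
        {
            public double Property1 { get; set; }
            public double Property2 { get; set; }
            public override object Accept(INodeVisitor visitor) { return visitor.Visit(this); }
        }

        public class NodeTypeB : Node
        {
            public string Property1 { get; set; }
            public override object Accept(INodeVisitor visitor) { return visitor.Visit(this); }
        }

        public class Branch
        {
            public Node From { get; set; }
            public Node To { get; set; }
        }

        // The builder keeps the node-to-engine-ID map so branches can be wired up afterwards.
        public class EngineGraphBuilder : INodeVisitor
        {
            private readonly Engine engine = new Engine();
            private readonly Dictionary<Node, object> ids = new Dictionary<Node, object>();

            public object Visit(NodeTypeA node) { return engine.AddNodeTypeA(node.Property1, node.Property2); }
            public object Visit(NodeTypeB node) { return engine.AddNodeTypeB(node.Property1); }

            public Engine Build(IEnumerable<Node> nodes, IEnumerable<Branch> branches)
            {
                foreach (Node node in nodes)
                    ids[node] = node.Accept(this);               // double dispatch picks the right Add* overload
                foreach (Branch branch in branches)
                    engine.AddBranch(ids[branch.From], ids[branch.To]);
                return engine;                                   // caller then runs engine.DoSomeComputation()
            }
        }

    The same idea is usually filed under the Builder or Visitor pattern; adding a new node type then means adding one Visit overload rather than touching the traversal code.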

    Read the article

  • Microsoft Introduces WebMatrix

    - by Rick Strahl
    originally published in CoDe Magazine Editorial

    Microsoft recently released the first CTP of a new development environment called WebMatrix, which along with some of its supporting technologies is squarely aimed at making the Microsoft Web Platform more approachable for first-time developers and hobbyists. But in the process, it also provides some updated technologies that can make life easier for existing .NET developers. Let's face it: ASP.NET development isn't exactly trivial unless you already have a fair bit of familiarity with sophisticated development practices. Stick a non-developer in front of Visual Studio .NET or even the Visual Web Developer Express edition and it's not likely that the person in front of the screen will be very productive or feel inspired. Yet other technologies like PHP and even classic ASP did provide the ability for non-developers and hobbyists to become reasonably proficient in creating basic web content quickly and efficiently. WebMatrix appears to be Microsoft's attempt to bring back some of that simplicity with a number of technologies and tools. The key is to provide a friendly and fully self-contained development environment that provides all the tools needed to build an application in one place, as well as tools that allow publishing of content and databases easily to the web server. WebMatrix is made up of several components and technologies:

    IIS Developer Express

    IIS Developer Express is a new, self-contained development web server that is fully compatible with IIS 7.5 and based on the same codebase that IIS 7.5 uses. This new development server replaces the much less compatible Cassini web server that's been used in Visual Studio and the Express editions. IIS Express addresses a few shortcomings of the Cassini server such as the inability to serve custom ISAPI extensions (i.e., things like PHP or ASP classic for example), as well as not supporting advanced authentication. IIS Developer Express provides most of the IIS 7.5 feature set, providing much better compatibility between development and live deployment scenarios.

    SQL Server Compact 4.0

    Database access is a key component for most web-driven applications, but on the Microsoft stack this has mostly meant you have to use SQL Server or SQL Server Express. SQL Server Compact is not new - it's been around for a few years, but it's been severely hobbled in the past by terrible tool support and the inability to support more than a single connection in Microsoft's attempt to avoid losing SQL Server licensing. The new release of SQL Server Compact 4.0 supports multiple connections and you can run it in ASP.NET web applications simply by installing an assembly into the bin folder of the web application. In effect, you don't have to install a special system configuration to run SQL Compact as it is a drop-in database engine: Copy the small assembly into your BIN folder (or from the GAC if installed fully), create a connection string against a local file-based database file, and then start firing SQL requests. Additionally WebMatrix includes nice tools to edit the database tables and files, along with tools to easily upsize (and hopefully downsize in the future) to full SQL Server. This is a big win, pending compatibility and performance limits. In my simple testing the data engine performed well enough for small data sets. This is not only useful for web applications, but also for desktop applications for which a fully installed SQL engine like SQL Server would be overkill.
    Having a local data store in those applications that can potentially be accessed by multiple users is a welcome feature.

    ASP.NET Razor View Engine

    What? Yet another native ASP.NET view engine? We already have Web Forms and various different flavors of using that view engine with Web Forms and MVC. Do we really need another? Microsoft thinks so, and Razor is an implementation of a lightweight, script-only view engine. Unlike the Web Forms view engine, Razor works only with inline code, snippets, and markup; therefore, it is more in line with current thinking of what a view engine should represent. There's no support for a "page model" or any of the other Web Forms features of the full-page framework, but just a lightweight scripting engine that works with plain markup plus embedded expressions and code. The markup syntax for Razor is geared for minimal typing, plus some progressive detection of where a script block/expression starts and ends. This results in a much leaner syntax than the typical ASP.NET Web Forms alligator (<% %>) tags. Razor uses the @ sign plus standard C# (or Visual Basic) block syntax to delineate code snippets and expressions. Here's a very simple example of what Razor markup looks like along with some comment annotations:

        <!DOCTYPE html>
        <html>
            <head>
                <title></title>
            </head>
            <body>
            <h1>Razor Test</h1>

            <!-- simple expressions -->
            @DateTime.Now
            <hr />

            <!-- method expressions -->
            @DateTime.Now.ToString("T")

            <!-- code blocks -->
            @{
                List<string> names = new List<string>();
                names.Add("Rick");
                names.Add("Markus");
                names.Add("Claudio");
                names.Add("Kevin");
            }

            <!-- structured block statements -->
            <ul>
            @foreach(string name in names){
                    <li>@name</li>
            }
            </ul>

            <!-- Conditional code -->
            @if(true) {
                <!-- Literal Text embedding in code -->
                <text>
                true
                </text>;
            }
            else
            {
                <!-- Literal Text embedding in code -->
                <text>
                false
                </text>;
            }
            </body>
        </html>

    Like the Web Forms view engine, Razor parses pages into code, and then executes that run-time compiled code. Effectively a "page" becomes a code file with markup becoming literal text written into the Response stream, code snippets becoming raw code, and expressions being written out with Response.Write(). The code generated from Razor doesn't look much different from similar Web Forms code that only uses script tags; so although the syntax may look different, the operational model is fairly similar to the Web Forms engine minus the overhead of the large Page object model. However, there are differences: Razor pages are based on a new base class, Microsoft.WebPages.WebPage, which is hosted in the Microsoft.WebPages assembly that houses all the Razor engine parsing and processing logic. Browsing through the assembly (in the generated ASP.NET Temporary Files folder or GAC) will give you a good idea of the functionality that Razor provides. If you look closely, a lot of the feature set matches ASP.NET MVC's view implementation as well as many of the helper classes found in MVC. It's not hard to guess the motivation for this sort of view engine: For beginning developers the simple markup syntax is easier to work with, although you obviously still need to have some understanding of the .NET Framework in order to create dynamic content.
    The syntax is easier to read and grok and much shorter to type than ASP.NET alligator tags (<% %>), and it is also easier to understand aesthetically what's happening in the markup code. Razor also is a better fit for Microsoft's vision of ASP.NET MVC: It's a new view engine without the baggage of Web Forms attached to it. The engine is more lightweight since it doesn't carry all the features and object model of Web Forms with it, and it can be instantiated directly outside of the HTTP environment, which has been rather tricky to do for the Web Forms view engine. Having a standalone script parser is a huge win for other applications as well – it makes it much easier to create script- or meta-driven output generators for many types of applications, from code/screen generators to simple form letters to data merging applications with user customizability. For me personally this is a very useful side effect, and who knows, maybe Microsoft will actually standardize their scripting engines (die T4 die!) on this engine. Razor also better fits the "view-based" approach where the view is supposed to be mostly a visual representation that doesn't hold much, if any, code. While you can still use code, the code you do write has to be self-contained. Overall I wouldn't be surprised if Razor becomes the new standard view engine for MVC in the future – and in fact there have been announcements recently that Razor will become the default script engine in ASP.NET MVC 3.0. Razor can also be used in existing Web Forms and MVC applications, although that's not working currently unless you manually configure the script mappings and add the appropriate assemblies. It's possible to do it, but it's probably better to wait until Microsoft releases official support for Razor scripts in Visual Studio. Once that happens, you can simply drop .cshtml and .vbhtml pages into an existing ASP.NET project and they will work side by side with classic ASP.NET pages.

    WebMatrix Development Environment

    To tie all of these three technologies together, Microsoft is shipping WebMatrix with an integrated development environment. An integrated gallery manager makes it easy to download and load existing projects, and then extend them with custom functionality. It seems to be a prominent goal to provide community-oriented content that can act as a starting point, be it via custom templates or a complete standard application. The IDE includes a project manager that works with a single project and provides an integrated IDE/editor for editing the .cshtml and .vbhtml pages. A run button allows you to quickly run pages in the project manager in a variety of browsers. There's no debugging support for code at this time. Note that Razor pages don't require explicit compilation, so making a change, saving, and then refreshing your page in the browser is all that's needed to see changes while testing an application locally. It's essentially using the auto-compiling Web Project that was introduced with .NET 2.0. All code is compiled during run time into dynamically created assemblies in the ASP.NET temp folder. WebMatrix also has PHP editing support with syntax highlighting. You can load various PHP-based applications from the WebMatrix Web Gallery directly into the IDE. Most of the Web Gallery applications are ready to install and run without further configuration, with wizards taking you through installation of tools, dependencies, and configuration of the database as needed.
    WebMatrix leverages the Web Platform Installer to pull the pieces down from websites in a tight integration of tools that worked nicely for the four or five applications I tried this out on. Click a couple of check boxes, fill in a few simple configuration options, and you end up with a running application that's ready to be customized. Nice! You can easily deploy completed applications via WebDeploy (to an IIS server) or FTP directly from within the development environment. The deploy tool also can handle automatically uploading and installing the database and all related assemblies required, making deployment a simple one-click install step.

    Simplified Database Access

    The IDE contains a database editor that can edit SQL Compact and SQL Server databases. There is also a Database helper class that facilitates database access by providing easy-to-use, high-level query execution and iteration methods:

        @{
            var db = Database.OpenFile("FirstApp.sdf");
            string sql = "select * from customers where Id > @0";
        }
        <ul>
        @foreach(var row in db.Query(sql,1)){
                <li>@row.FirstName @row.LastName</li>
        }
        </ul>

    The query function takes a SQL statement plus any number of positional (@0, @1, etc.) SQL parameters as simple values. The result is returned as a collection of rows, which in turn have a row object with dynamic properties for each of the columns, giving easy (though untyped) access to each of the fields. Likewise, Execute and ExecuteNonQuery allow execution of more complex queries using similar parameter passing schemes. Note these queries use string-based queries rather than LINQ or Entity Framework's strongly typed LINQ queries. While this may seem like a step back, it's also in line with the expectations of non-.NET script developers who are quite used to writing and using SQL strings in code rather than using OR/M frameworks. The only question is why something like this was not included in .NET from the beginning, rather than Microsoft making developers build custom implementations of these basic building blocks. The implementation looks a lot like a DataTable-style data access mechanism, but to be fair, this is a common approach in scripting languages. This type of syntax, which uses simple, static data object methods to perform simple data tasks with one line of code, is common in scripting languages and is a good match for folks working in PHP/Python, etc. It seems like Microsoft has taken great advantage of .NET 4.0's dynamic typing to provide this sort of interface for row iteration where each row has properties for each field. FWIW, all the examples demonstrate using local SQL Compact files - I was unable to get a SQL Server connection string to work with the Database class (the connection string wasn't accepted). However, since the code in the page is still plain old .NET, you can easily use standard ADO.NET code or even LINQ or Entity Framework models that are created outside of WebMatrix in separate assemblies as required.

    The good, the bad, the obnoxious - It's still .NET

    The beauty (or curse, depending on how you look at it :)) of Razor and the compilation model is that, behind it all, it's still .NET. Although the syntax may look foreign, it's still all .NET behind the scenes. You can easily access existing tools, helpers, and utilities simply by adding them to the project as references or to the bin folder. Razor automatically recognizes any assembly reference from assemblies in the bin folder.
    In the default configuration, Microsoft provides a host of helper functions in a Microsoft.WebPages assembly (check it out in the ASP.NET temp folder for your application), which includes a host of HTML helpers. If you've used ASP.NET MVC before, a lot of the helpers should look familiar. Documentation at the moment is sketchy - there's a very rough API reference you can check out here: http://www.asp.net/webmatrix/tutorials/asp-net-web-pages-api-reference

    Who needs WebMatrix? Uhm… good question

    Clearly Microsoft is trying hard to create an environment with WebMatrix that is easy to use for newbie developers. The goal seems to be simplicity in providing a minimal development environment and an easy-to-use script engine/language that makes it easy to get started with. There's also some focus on community features that can be used as starting points, such as Web Gallery applications and templates. The community features in particular are very nice and something that would be nice to eventually see in Visual Studio as well. The question is whether this is too little too late. Developers who have been clamoring for a simpler development environment on the .NET stack have mostly left for other simpler platforms like PHP or Python, which are catering to the down-and-dirty developer. Microsoft will be hard pressed to win those folks - and other hardcore PHP developers - back. Regardless of how much you dress up a script engine fronted by the .NET Framework, it's still the .NET Framework and all the complexity that drives it. While .NET is a fine solution in its breadth and features once you get a basic handle on the core features, the bar of entry to being productive with the .NET Framework is still pretty high. The MVC-style helpers Microsoft provides are a good step in the right direction, but I suspect it's not enough to shield new developers from having to delve much deeper into the Framework to get even basic applications built. Razor and its helpers are trying to make .NET more accessible, but the reality is that in order to do useful stuff that goes beyond the handful of simple helpers, you still are going to have to write some C# or VB or other .NET code. If the target is a hobby/amateur/non-programmer, the learning curve isn't made any easier by WebMatrix; it's just been shifted a tad bit further along in your development endeavor, to when you run out of canned components that are supplied either by Microsoft or the community. The database helpers are interesting, and actually I've heard a lot of discussion from various developers who've been resisting .NET for a really long time perking up at the prospect of easier data access in .NET than the ridiculous amount of code it takes to do even simple data access with raw ADO.NET. It seems sad that such a simple concept and implementation should trigger this sort of response (especially since it's practically trivial to create helpers like these or pick them up from countless libraries available), but there it is. It also shows that there are plenty of developers out there who are more interested in 'getting stuff done' easily than necessarily following the latest and greatest practices, which are overkill for many development scenarios. Sometimes it seems that all of .NET is focused on the big life-changing issues of development, rather than the bread-and-butter scenarios that many developers are interested in to get their work accomplished.
    And that, in the end, may be WebMatrix's main raison d'être: to bring some focus back at Microsoft to the fact that simpler and more high-level solutions are actually needed to appeal to non-high-end developers, as well as providing the necessary tools for the high-end developers who want to follow the latest and greatest trends. The current version of WebMatrix hits many sweet spots, but it also feels like it has a long way to go before it really can be a tool that a beginning developer or an accomplished developer can feel comfortable with. Although there are some really good ideas in the environment (like the gallery for downloading apps and components) which would be a great addition for Visual Studio as well, the rest of the development environment just feels like crippleware, with required functionality missing - especially debugging and IntelliSense, but also general editor support. It's not clear whether these are because the product is still in an early alpha release or whether it's simply designed that way to be a really limited development environment. While simple can be good, nobody wants to feel left out when it comes to necessary tool support, and WebMatrix just has that left-out feeling to it. If anything, WebMatrix's technology pieces (which are really independent of the WebMatrix product) are what are interesting to developers in general. The compact IIS implementation is a nice improvement for development scenarios, and SQL Compact 4.0 seems to address a lot of concerns that people have had and have complained about for some time with previous SQL Compact implementations. By far the most interesting and useful technology, though, seems to be the Razor view engine, for its lightweight implementation and its decoupling from the ASP.NET/HTTP pipeline to provide a standalone scripting/view engine that is pluggable. The first winner of this is going to be ASP.NET MVC, which can now have a cleaner view model that isn't inconsistent due to the baggage of non-implemented Web Forms features that don't work in MVC. But I expect that Razor will end up in many other applications as a scripting and code generation engine eventually. Visual Studio integration for Razor is currently missing, but is promised for a later release. The ASP.NET MVC team has already mentioned that Razor will eventually become the default MVC view engine, which will guarantee continued growth and development of this tool along those lines. And the Razor engine and support tools actually inherit many of the features that MVC pioneered, so there's some synergy flowing both ways between Razor and MVC. As an existing ASP.NET developer who's already familiar with Visual Studio and ASP.NET development, the WebMatrix IDE doesn't give you anything that you want. The tools provided are minimal and provide nothing that you can't get in Visual Studio today, except the minimal Razor syntax highlighting, so there's little need to take a step back. With Visual Studio integration coming later, there's little reason to look at WebMatrix for tooling. It's good to see that Microsoft is giving some thought to the ease of use of .NET as a platform. For so many years, we've been piling on more and more new features without trying to take a step back and see how complicated the development/configuration/deployment process has become. Sometimes it's good to take a step - or several steps - back and take another look and realize just how far we've come.
    WebMatrix is one of those reminders, and one that likely will result in some positive changes on the platform as a whole.

    © Rick Strahl, West Wind Technologies, 2005-2010. Posted in ASP.NET, IIS7.

    Read the article

  • Troubleshooting .NET "Fatal Execution Engine Error"

    - by JYelton
    Summary: I periodically get a .NET Fatal Execution Engine Error on an application which I cannot seem to debug. The dialog that comes up only offers to close the program or send information about the error to Microsoft. I've tried looking at the more detailed information but I don't know how to make use of it.

    Error: The error is visible in Event Viewer under Applications and is as follows:

        .NET Runtime version 2.0.50727.3607 - Fatal Execution Engine Error (7A09795E) (80131506)

    The computer running it is Windows XP Professional SP 3. (Intel Core2Quad Q6600 2.4GHz w/ 2.0 GB of RAM) Other .NET-based projects that lack multi-threaded downloading (see below) seem to run just fine.

    Application: The application is written in C#/.NET 3.5 using VS2008, and installed via a setup project. The app is multi-threaded and downloads data from multiple web servers using System.Net.HttpWebRequest and its methods. I've determined that the .NET error has something to do with either threading or HttpWebRequest but I haven't been able to get any closer as this particular error seems impossible to debug. I've tried handling errors on many levels, including the following in Program.cs:

        // handle UI thread exceptions
        Application.ThreadException += new ThreadExceptionEventHandler(Application_ThreadException);
        // handle non-UI thread exceptions
        AppDomain.CurrentDomain.UnhandledException += new UnhandledExceptionEventHandler(CurrentDomain_UnhandledException);
        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);
        // force all windows forms errors to go through our handler
        Application.SetUnhandledExceptionMode(UnhandledExceptionMode.CatchException);

    More Notes and What I've Tried... Installed Visual Studio 2008 on the target machine and tried running in debug mode, but the error still occurs, with no hint as to where in source code it occurred. When running the program from its installed version (Release) the error occurs more frequently, usually within minutes of launching the application. When running the program in debug mode inside of VS2008, it can run for hours or days before generating the error. Reinstalled .NET 3.5 and made sure all updates are applied. Broke random cubicle objects in frustration. Rewritten parts of code that deal with threading and downloading in attempts to catch and log exceptions, though logging seemed to aggravate the problem (and never provided any data).

    Question: What steps can I take to troubleshoot or debug this kind of error? Memory dumps and the like seem to be the next step, but I'm not experienced at interpreting them. Perhaps there's something more I can do in the code to try and catch errors... It would be nice if the "Fatal Execution Engine Error" was more informative, but internet searches have only told me that it's a common error for a lot of .NET-related items.
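    Not from the question, but for completeness: a minimal sketch of what the handler bodies registered above might log. Note that a genuine Fatal Execution Engine Error usually indicates heap corruption in native or interop code and often terminates the process before any managed handler fires; in that case, capturing a native crash dump (for example with DebugDiag or ADPlus) and analyzing it is the more promising route.

        using System;
        using System.IO;
        using System.Threading;

        static class CrashLog
        {
            // Wired up via AppDomain.CurrentDomain.UnhandledException as shown above.
            public static void CurrentDomain_UnhandledException(object sender, UnhandledExceptionEventArgs e)
            {
                Exception ex = e.ExceptionObject as Exception;
                File.AppendAllText("fatal.log",
                    DateTime.Now + " IsTerminating=" + e.IsTerminating + Environment.NewLine +
                    (ex != null ? ex.ToString() : e.ExceptionObject.ToString()) + Environment.NewLine);
            }

            // Wired up via Application.ThreadException as shown above.
            public static void Application_ThreadException(object sender, ThreadExceptionEventArgs e)
            {
                File.AppendAllText("fatal.log", DateTime.Now + " UI thread: " + e.Exception + Environment.NewLine);
            }
        }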

    Read the article

  • How to get the default audio format of a TTS Engine

    - by Itslava
    In Microsoft TTS 5.1 or newer, the SpVoice.AudioOutputStream property documentation says:

        The AudioOutputStream property gets and sets the current audio stream object used by the voice. Setting the voice's AudioOutputStream property may cause its audio output format to be automatically changed to match the text-to-speech (TTS) engine's preferred audio output format. If the voice's AllowAudioOutputFormatChangesOnNextSet property is True, the format change takes place; if False, the format remains unchanged. In order to set the AudioOutputStream property of a voice to a specific format, its AllowOutputFormatChangesOnNextSet should be False.

    This means an engine always has a preferred audio output format. So, how can I get it? I have not found any interface that exposes that attribute.
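    If the managed System.Speech wrapper is an option (rather than raw SAPI COM), each installed voice advertises the audio formats its engine supports, which is one way to inspect an engine's preferred output format. A minimal sketch, assuming .NET 3.0+ and a reference to the System.Speech assembly (the SupportedAudioFormats collection can be empty for some voices):

        using System;
        using System.Speech.Synthesis;

        class ListVoiceFormats
        {
            static void Main()
            {
                using (SpeechSynthesizer synth = new SpeechSynthesizer())
                {
                    foreach (InstalledVoice voice in synth.GetInstalledVoices())
                    {
                        Console.WriteLine(voice.VoiceInfo.Name);
                        // Formats the engine itself advertises; the first entry is
                        // typically the engine's preferred output format.
                        foreach (var fmt in voice.VoiceInfo.SupportedAudioFormats)
                        {
                            Console.WriteLine("  {0} Hz, {1}-bit, {2} channel(s)",
                                fmt.SamplesPerSecond, fmt.BitsPerSample, fmt.ChannelCount);
                        }
                    }
                }
            }
        }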

    Read the article

  • Writing a search engine

    - by wvd
    Hello all, the title might be a bit misleading, but I couldn't figure out a better one. I'm writing a simple search engine which will search several sites for a specific domain. To be concrete: I'm writing a search engine for hardstyle livesets/aftermovies/tracks. To do this I will search the sites that provide livesets, tracks, and such. The problem here is speed: I need to pass the search query to 5-7 sites, get the results, and then use my own algorithm to display the results in a sorted order. I could just "multithread" it, but it's easier said than done, so I have a few questions. What would be the best solution to this problem? Should I just multithread/multiprocess this application, so I get a bit of a speed-up? Are there any other solutions, or am I doing something really wrong? Thanks, William van Doorn
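    A minimal C# sketch of the fan-out/fan-in idea: fire all site queries concurrently, wait for all of them, then merge and rank. The question doesn't name a language, so this only illustrates the shape using .NET Task APIs; the URL templates, ParseResults and Rank below are hypothetical placeholders for your own parsing and ranking logic.

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Net.Http;
        using System.Threading.Tasks;

        class FederatedSearch
        {
            private static readonly HttpClient Http = new HttpClient();

            public static async Task<List<string>> SearchAsync(string query, IEnumerable<string> siteUrlTemplates)
            {
                // Fire all site queries at once; each template is assumed to contain a {0} for the query.
                List<Task<string>> downloads = siteUrlTemplates
                    .Select(t => Http.GetStringAsync(string.Format(t, Uri.EscapeDataString(query))))
                    .ToList();

                // Wait for every response, then merge and order with your own algorithm.
                string[] pages = await Task.WhenAll(downloads);
                return pages.SelectMany(ParseResults)
                            .OrderByDescending(Rank)
                            .ToList();
            }

            // Hypothetical: extract result entries from one site's HTML.
            static IEnumerable<string> ParseResults(string html) { yield break; }

            // Hypothetical: your custom ordering metric.
            static double Rank(string result) { return 0; }
        }

    In practice you would also want a per-request timeout (or Task.WhenAny-based cutoff) so one slow site doesn't hold up the merged result.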

    Read the article

  • Best Template Engine for ASP.NET MVC

    - by OnesimusUnbound
    I am exploring ASP.NET MVC and I wanted to add jQuery to make the site interactive. I used StringTemplate, ported to .NET, as my template engine to generate HTML and to send JSON. However, when I viewed the page, I could not see it. After debugging, I realized that the $ is used by StringTemplate to access properties, etc., and jQuery uses it too to manipulate the DOM. Gee, I've looked at other template engines and most of them use the dollar sign :(. Any alternative template engine for ASP.NET MVC? I wanted to retain jQuery because MSFT announced that it will be used in Visual Studio (2008?). Thanks in advance :)

    Read the article

  • Inference engine to calculate matching set according to internal rules

    - by Zecrates
    I have a set of objects with attributes and a bunch of rules that, when applied to the set of objects, provides a subset of those objects. To make this easier to understand I'll provide a concrete example. My objects are persons and each has three attributes: country of origin, gender and age group (all attributes are discrete). I have a bunch of rules, like "all males from the US", which correspond with subsets of this larger set of objects. I'm looking for either an existing Java "inference engine" or something similar, which will be able to map from the rules to a subset of persons, or advice on how to go about creating my own. I have read up on rule engines, but that term seems to be exclusively used for expert systems that externalize the business rules, and usually doesn't include any advanced form of inferencing. Here are some examples of the more complex scenarios I have to deal with:

    1. I need the conjunction of rules. So when presented with both "include all males" and "exclude all US persons in the 10 - 20 age group," I'm only interested in the males outside of the US, and the males within the US that are outside the 10 - 20 age group.
    2. Rules may have different priorities (explicitly defined). So a rule saying "exclude all males" will override a rule saying "include all US males."
    3. Rules may be conflicting. So I could have both an "include all males" and an "exclude all males" in which case the priorities will have to settle the issue.
    4. Rules are symmetric. So "include all males" is equivalent to "exclude all females."
    5. Rules (or rather subsets) may have meta rules (explicitly defined) associated with them. These meta rules will have to be applied in any case that the original rule is applied, or if the subset is reached via inferencing. So if a meta rule of "exclude the US" is attached to the rule "include all males", and I provide the engine with the rule "exclude all females," it should be able to inference that the "exclude all females" subset is equivalent to the "include all males" subset and as such apply the "exclude the US" rule additionally.

    I can in all likelihood live without item 5, but I do need all the other properties mentioned. Both my rules and objects are stored in a database and may be updated at any stage, so I'd need to instantiate the 'inference engine' when needed and destroy it afterward.
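    The question asks about Java; purely as an illustration of the conjunction-plus-priority part (items 1-3, ignoring symmetry and meta rules), here is a small C# sketch in keeping with the rest of this page - the structure translates directly to Java. All type and member names are hypothetical.

        using System;
        using System.Collections.Generic;
        using System.Linq;

        enum Gender { Male, Female }

        class Person
        {
            public string Country;
            public Gender Gender;
            public string AgeGroup;
        }

        class Rule
        {
            public Func<Person, bool> Matches;   // attribute predicate, e.g. p => p.Gender == Gender.Male
            public bool Include;                 // include or exclude the matching persons
            public int Priority;                 // higher priority wins on conflict
        }

        static class RuleSetEvaluator
        {
            // A person is in the result only if the highest-priority rule that matches
            // it is an include rule; persons matched by no rule at all are left out.
            public static List<Person> Apply(IEnumerable<Person> people, IEnumerable<Rule> rules)
            {
                List<Rule> ruleList = rules.ToList();
                return people.Where(p =>
                {
                    Rule decisive = ruleList.Where(r => r.Matches(p))
                                            .OrderByDescending(r => r.Priority)
                                            .FirstOrDefault();
                    return decisive != null && decisive.Include;
                }).ToList();
            }
        }

    Symmetry (item 4) could be layered on by normalizing each rule into an equivalent include form before evaluation; full meta-rule inferencing (item 5) is where a dedicated rule engine such as Drools starts to pay off.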

    Read the article
