Search Results

Search found 11009 results on 441 pages for 'third party integration'.

Page 158/441

  • In which cases Robolectric is a relevant solution?

    - by Francis Toth
    As you may know, Robolectric is a framework that provides stubs for Android objects in order to make tests runnable outside the Dalvik environment. My concern is that, by doing this, one can fake a third-party library, which is, I believe, not a good practice (it should be encapsulated instead). If you make assumptions about an interface you don't own, and that interface changes after your test has been written, you won't necessarily be notified of the modifications. This can lead to a mismatch between your implementations and the interface they depend on. In addition, Android mostly uses inheritance over interfaces, which limits contract testing. So here's my question: are there situations where Robolectric is the way to go? Here are some links you can check for further information: test-doubles-with-mockito in-brief-contract-tests
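    To illustrate the encapsulation point above, here is a minimal sketch of the alternative being described: depend on an interface you own, wrap the third-party API in a thin adapter, and fake only the owned interface in tests. It is written in C# for consistency with the other sketches in this listing (the Android/Java version has the same shape), and every type name here is hypothetical rather than taken from any real SDK.

    // The interface we own; production code depends only on this.
    public interface ILocationProvider
    {
        double GetLatitude();
    }

    // Thin adapter over the third-party/platform API. Changes in the vendor
    // API are absorbed here, in one place, instead of leaking into the tests.
    public class VendorLocationAdapter : ILocationProvider
    {
        private readonly VendorLocationClient _client = new VendorLocationClient();

        public double GetLatitude()
        {
            return _client.ReadFix().Latitude;
        }
    }

    // Hypothetical vendor types, included only so the sketch is self-contained.
    public class VendorLocationClient
    {
        public VendorFix ReadFix() { return new VendorFix { Latitude = 47.6 }; }
    }

    public class VendorFix
    {
        public double Latitude { get; set; }
    }

    // In tests we fake the interface we own, not the vendor's classes.
    public class FakeLocationProvider : ILocationProvider
    {
        public double GetLatitude() { return 10.0; }
    }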

    Read the article

  • Is this overkill? Using MDX queries and cubes instead of SQL stored procedures

    - by Jason Holland
    I am new to Microsoft's SQL Server Analysis Services cubes and MDX queries. Where I work, we have a daily sales table in SQL Server 2005 that already contains an aggregate of sales information per store per day. At this time it contains only 164,000+ rows. We have a sales cube dedicated to this table that about 15 reports are based on. I should also note that we generate reports based on our own fiscal-year criteria: a 13-period year (one month equals 28 days, etc.). Is this overkill? At what point is it justified to begin using SSAS cubes/MDX over plain old SQL Server stored procedures? Since I have always been using plain old SQL, am I tragically late to the MDX party?
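    For a sense of what the cube side of that trade-off looks like from client code, here is a hedged sketch using ADOMD.NET (Microsoft.AnalysisServices.AdomdClient). The connection string, cube, measure, and dimension names are hypothetical placeholders, not taken from the question.

    using System;
    using Microsoft.AnalysisServices.AdomdClient;

    class DailySalesReport
    {
        static void Main()
        {
            // Hypothetical server and catalog names.
            const string connectionString = "Data Source=localhost;Catalog=SalesAnalysis";

            // A simple MDX query: one measure on columns, store members on rows.
            const string mdx = @"
                SELECT [Measures].[Sales Amount] ON COLUMNS,
                       [Store].[Store Name].Members ON ROWS
                FROM   [Daily Sales]";

            using (var connection = new AdomdConnection(connectionString))
            {
                connection.Open();
                using (var command = new AdomdCommand(mdx, connection))
                using (var reader = command.ExecuteReader())
                {
                    // The flattened rowset returns the row-axis member captions
                    // followed by the cell values for each row.
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}: {1}", reader.GetValue(0), reader.GetValue(1));
                    }
                }
            }
        }
    }

    Whether this buys you anything over a stored procedure at 164,000 rows is exactly the question; a cube mainly pays off once users start slicing by arbitrary combinations of dimensions (store, fiscal period, product) without new procedures being written for each report.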

    Read the article

  • My First Post @ geekswithblogs

    - by sathya
    Dear friends, here is my first post on geekswithblogs. I am happy that I have got a separate space here to blog. I am an MCTS-certified professional in .NET 2.0 web applications, working as a Senior Software Engineer, and I am willing to share my knowledge on whatever topics I know. I am also an active presenter / speaker in the Microsoft Developer User Group HyderabadTechies, and I have presented many online sessions there. I keep myself updated on the latest Microsoft technologies. You can see my posts here on the following subjects: C#, ASP.NET, SQL Server, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS). I have a personal blog too where I share my knowledge; please take a note of it: http://cybersathya.blogspot.com. You can see me here often posting updates on technologies and the technical challenges I have faced and the solutions for the same. Stay tuned!!! Regards, Sathya Narayanan Srinivasan

    Read the article

  • Load Balancer impact on web development

    - by confusedGeek
    This question has its roots in a SharePoint site that I am helping with. Background on the issue I dealt with: the dev box and integration server are not set up behind a load balancer. The links were being built using the HttpRequest.Url value from the current context. Note that the links weren't relative links but full URIs. Once we deployed to testing (which has a LB, amongst other things), we received errors on the links being built, since the server had an address of "http://some.site.org:999" while the address at the LB was "https://site.org" (SSL was off-loaded at the LB). The fix was easy enough: use relative URIs. The Question: since this is the first site I've worked with that's behind a load balancer, I'm wondering if there are other gotchas that I need to consider when developing a site behind one.
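    As a sketch of the relative-URI fix and one related gotcha: anything derived from HttpRequest.Url reflects the machine behind the load balancer, not the public address, so when an absolute URL is truly required it is safer to use whatever the LB forwards. The header names below are hypothetical conventions (they depend entirely on how your load balancer is configured), and this is plain System.Web code, not a SharePoint-specific API.

    using System;
    using System.Web;

    public static class PublicUrlHelper
    {
        // A root-relative link survives SSL offloading and host rewriting at the LB.
        public static string ToRelative(Uri requestUrl)
        {
            return requestUrl.PathAndQuery;   // e.g. "/sites/hr/Pages/default.aspx?view=1"
        }

        // If an absolute URL is unavoidable (e-mails, links shared outside the farm),
        // prefer values forwarded by the load balancer over the server's own address.
        public static string ToAbsolute(HttpRequest request, string relativePath)
        {
            // Header names vary by load balancer; X-Forwarded-* is a common convention.
            string scheme = request.Headers["X-Forwarded-Proto"] ?? request.Url.Scheme;
            string host = request.Headers["X-Forwarded-Host"] ?? request.Url.Authority;

            return string.Format("{0}://{1}{2}", scheme, host, relativePath);
        }
    }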

    Read the article

  • What can I do with the twitter API?

    - by aditya menon
    I've tried googling for this but could not find concrete developer examples. When building mundane daily web applications like classified websites, job boards, or intranet-targeted document management systems, how can the Twitter API help me do more? May I please have some examples of how developers have used Twitter to make their apps better? Other than the obvious use for promotional and search engine optimization purposes (yay, there's a new job post on our site), what can I do with it? Also, am I late to the party? I hear a lot of upset on the internet about how Twitter is apparently slowly betraying developers (I don't understand the specifics), so should I even look at the system or consider alternatives?

    Read the article

  • AutoVue for Mobile Asset Management

    - by Warren Baird
    Our partner VentureForth is working with us on a solution to use AutoVue in their mobile asset management system. The idea is to allow workers in the field to easily access the visual information they might need while doing their work, and to annotate it. For instance, a worker can take a photo of something unexpected found on-site and indicate on the drawing what was found, so there's a record of it for the future. VentureForth's vMobile suite allows workers to take information from EBS eAM and PeopleSoft CAM (as well as multiple other systems) offline with them while working in the field, and with the AutoVue integration the workers can then view and annotate CAD drawings or photographs and have that information saved back into the asset management system. VentureForth has posted a video of this in action on YouTube: http://youtu.be/X5Lv_X2v7uc - check it out and let us know what your thoughts are! Learn more about VentureForth at http://ventureforth.com/

    Read the article

  • SAS unifies the view of risk for banks with SAS Risk Management for Banking

    SAS unifies the view of risk for banks with SAS Risk Management for Banking. SAS is launching SAS Risk Management for Banking, an integrated solution dedicated to risk management in the banking sector. The solution leverages the capabilities of the SAS Business Analytics platform for data integration, analytics, and reporting. "Its risk management capabilities address regulatory requirements and the performance needs of the various business units," the vendor notes. SAS Risk Management for Banking covers the complete process, from data management to reporting…

    Read the article

  • Update errors on Xubuntu12.04

    - by Kris Everett
    I'm trying to install updates via the Update Manager, and I got this error: "The package system is broken. Check if you are using third party repositories. If so disable them, since they are a common source of problems. Furthermore run the following command in a Terminal: apt-get install -f". When I run apt-get install -f, I get: "E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied)  E: Unable to lock the administration directory (/var/lib/dpkg/), are you root?" What is wrong? How do I fix it? Why does this happen?

    Read the article

  • Free Virtual Developer Day - Oracle Fusion Development

    - by Grant Ronald
    You know, there is no reason for you or your developers not to be top-notch Fusion developers. This is my third blog post in a row telling you about some sort of free training; in this case it's a whole online conference! In this online conference you can learn about the various components that make up the Oracle Fusion Middleware development platform, including Oracle ADF, Oracle WebCenter, Business Intelligence, BPM and more! The online conference will include seminars, hands-on labs and live chats with our technical staff, including me! And the best bit: it doesn't cost you a single penny/cent. It's free and available right on your desktop. You have to register to attend, so click here to confirm your place.

    Read the article

  • How do I prove or disprove "god" objects are wrong?

    - by honestduane
    Problem Summary: Long story short, I inherited a code base and a development team I am not allowed to replace, and the use of God Objects is a big issue. Going forward, I want to have us refactor things, but I am getting push-back from the teams who want to do everything with God Objects "because it's easier", which means I would not be allowed to refactor. I pushed back citing my years of dev experience and that I'm the new boss who was hired to know these things, etc., and so did the third-party offshore company's account sales rep; this is now at the executive level, my meeting is tomorrow, and I want to go in with a lot of technical ammo to advocate best practices, because I feel it will be cheaper in the long run for the company (and I personally feel that is what the third party is worried about). My issue is that, at a technical level, I know it's good long term, but I'm having trouble with the ultra-short term and the 6-month term, and while it's something I "know", I can't prove it with references and cited resources outside of one person (Robert C. Martin, aka Uncle Bob), which is what I am being asked to do: I have been told that having data from one person and only one person (Robert C. Martin) is not a good enough argument. Question: What are some resources I can cite directly (title, year published, page number, quote) by well-known experts in the field that explicitly say this use of "God" Objects/Classes/Systems is bad (or good, since we are looking for the most technically valid solution)? Research I have already done: I have a number of books here and I have searched their indexes for the use of the words "god object" and "god class". I found that, oddly, it's almost never used; the copy of the GoF book I have, for example, never uses it (at least according to the index in front of me), but I have found it in the 2 books below, and I want more I can use. I checked the Wikipedia page for "God Object" and it's currently a stub with few reference links, so although I personally agree with what it says, it doesn't have much I can use in an environment where personal experience is not considered valid. The book it cites is also considered too old to be valid by the people I am debating these technical points with, as the argument they are making is that "it was once thought to be bad but nobody could prove it, and now modern software says 'god' objects are good to use". I personally believe that this statement is incorrect, but I want to prove the truth, whatever it is. In Robert C. Martin's "Agile Principles, Patterns, and Practices in C#" (ISBN: 0-13-185725-8, hardcover), page 266 states: "Everybody knows that god classes are a bad idea. We don't want to concentrate all the intelligence of a system into a single object or a single function. One of the goals of OOD is the partitioning and distribution of behavior into many classes and many functions." -- and then goes on to say that sometimes it's better to use God Classes anyway (citing micro-controllers as an example). In Robert C. Martin's "Clean Code: A Handbook of Agile Software Craftsmanship", page 136 (and only this page) talks about the "God class" and calls it out as a prime example of a violation of the "classes should be small" rule he uses to promote the Single Responsibility Principle, starting on page 138. The problem I have is that all my references and citations come from the same person (Robert C. Martin), i.e. from the same single source. I am being told that because he is just one guy, my desire not to use "God Classes" is invalid and not accepted as a standard best practice in the software industry. Is this true? Am I doing things wrong from a technical perspective by trying to keep to the teachings of Uncle Bob? God Objects and Object Oriented Programming and Design: The more I think of this, the more I think this is something you learn when you study OOP and it's never explicitly called out; it's implicit to good design, is my thinking (feel free to correct me, please, as I want to learn). The problem is that I "know" this, but not everybody does, so in this case it's not considered a valid argument, because I am effectively calling it out as a universal truth when in fact most people are statistically ignorant of it, since statistically most people are not programmers. Conclusion: I am at a loss on what to search for to get the best additional results to cite, since they are making a technical claim and I want to know the truth and be able to prove it with citations like a real engineer/scientist, even if I am biased against god objects due to my personal experience with code that used them. Any assistance or citations would be deeply appreciated.
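    For the meeting itself, a small before/after sketch sometimes lands better with executives than citations. The following is a hedged illustration in C# with entirely hypothetical names; it is not taken from any of the books cited above, it simply shows the partitioning of behavior that the Robert C. Martin quote describes.

    // Before: a "god" class that knows about validation, pricing, persistence and e-mail.
    public class OrderManager
    {
        public void PlaceOrder(string customerEmail, decimal amount)
        {
            // validate, calculate tax, write SQL, send e-mail, log... all in one place.
        }
    }

    // After: behavior partitioned into small, single-responsibility collaborators.
    public class Order
    {
        public string CustomerEmail { get; private set; }
        public decimal Amount { get; private set; }

        public Order(string customerEmail, decimal amount)
        {
            CustomerEmail = customerEmail;
            Amount = amount;
        }
    }

    public interface IOrderRepository
    {
        void Save(Order order);
    }

    public interface INotificationService
    {
        void SendConfirmation(Order order);
    }

    public class OrderService
    {
        private readonly IOrderRepository _repository;
        private readonly INotificationService _notifications;

        public OrderService(IOrderRepository repository, INotificationService notifications)
        {
            _repository = repository;
            _notifications = notifications;
        }

        public void PlaceOrder(string customerEmail, decimal amount)
        {
            var order = new Order(customerEmail, amount);
            _repository.Save(order);                // persistence concern
            _notifications.SendConfirmation(order); // notification concern
        }
    }

    Each small class can now be changed, replaced or tested on its own, which is the short-term argument: the cost of change drops immediately, not only in six months.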

    Read the article

  • Ask the Readers: Are You A Second Screen Multi-tasker?

    - by Jason Fitzpatrick
    Television watchers are no longer keeping their eyes continuously glued to the screen–increasingly smartphone, tablet, and laptop users have merged their mobile device and television time. Are you one of the second screen multi-taskers? Image courtesy of Umani, a TV-companion application for iPad. According to Nielsen user surveys, at least 80% of mobile device owners have used their device while watching television in the past month–27% said they use their mobile device alongside the television multiple times a day. What the survey results are light on, however, is an in-depth look at what the users are doing with their second screen. This week we want to hear about whether or not you’re one of the second screen multi-taskers and what you use your mobile device for during your television/movie time. Sound off in the comments and then check back in on Friday for the What You Said roundup.

    Read the article

  • GWB | What is the next feature you want to see?

    - by Jeff Julian
    We want to know what you are thinking, bloggers and visitors of Geekswithblogs.net. If you were able to add items to the product backlog for this site, what would they be? New skins? Better search? An organic tag system? Better Twitter integration? More ways to link other social media outlets to your blog, like LinkedIn, Plaxo, Flickr? … What would you like to see? You can leave feedback on this post or email me at [email protected]. We love this community and want to see how we can continue to make Geekswithblogs.net relevant to developers in 2010. Technorati Tags: Geekswithblogs,2010,Next Features

    Read the article

  • Display folder sizes in file manager

    - by wim
    In the nautilus (or nemo) file manager, the "Size" column shows the file size for files and the number of items contained in a folder for subdirectories. The number of items is not that important for me; it would be more useful if I could make this column show the total size contained under the directory. I had an extension on Windows called foldersize which shows what I mean: I think it involved a service running in the background monitoring filesystem modifications to make sure the column was kept up to date. I am interested to know if there is any similar extension for nautilus, and I would also be open to switching to another file manager to get this functionality. I am aware of the Disk Usage Analyser in Ubuntu, but what I'm looking for is a solution with file manager integration.

    Read the article

  • question about 12.10 32 bit syslinux 4.06 EDD 2012-10-23 usb direct boot

    - by logan
    Alright, so I am trying to do a direct boot for Ubuntu 12.10 (as stated in the topic). I have been trying for about ten minutes now. I have to do a direct boot because the original OS has a corrupted kernel (I am trying to fix a friend's laptop, a Toshiba Satellite C655D-S5518). Getting to the main point: when I put the USB in, everything starts up fine and it loads the USB, but then it goes to a screen saying "SYSLINUX 4.06 EDD 2012-10-23 Copyright (C) 1994-2012 H. Peter Anvin et al." I also get a blinking cursor (as if I could type something, but when I try to type nothing shows up, so I'm guessing it's telling me it is loading). However, I am not getting a corrupted file or any kind of file error, so I guess my main question is: is this normal? If so, how long does it take to finish, and what are the steps that follow? Sorry guys if this is a dumb question; I am new to the Ubuntu party. Thanks, Logan.

    Read the article

  • Free white paper: "The Customizable Environment in Windows Embedded 7", the embedded OS based on Windows 7

    White paper: the customizable environment in Windows Embedded 7, Microsoft's embedded system based on Windows 7. The use of embedded systems differs from that of PCs in many respects. One of the differences concerns customization and branding. An embedded system generally displays no brand other than the OEM's, a requirement that often poses problems for the customer. In its latest version, Windows Embedded Standard 7 attempts to address these issues by offering new features to facilitate branding and message blocking. To present these new features (boot screens without the brand…

    Read the article

  • Package System Broken AMD FGLRX Unmet Dependencies

    - by fleamour
    I uninstalled the proprietary ATI driver to check if it would fix the wrong Plymouth resolution. It did. But now I cannot reinstall either of the proprietary drivers listed. Also, the package system has broken with this error: "The package system is broken. If you are using third party repositories then disable them, since they are a common source of problems. Now run the following command in a terminal: apt-get install -f. The following packages have unmet dependencies: fglrx-amdcccle: Depends: fglrx but it is not installed; Depends: libqtcore4 (= 4:4.5.3) but 4:4.8.1-0ubuntu4.1 is installed; Depends: libqtgui4 (= 4:4.5.3) but 4:4.8.1-0ubuntu4.1 is installed." What should I do?

    Read the article

  • When is the best time to do self learning in relation with software management?

    - by shankbond
    It all started from here. I have been following Software Estimation: Demystifying the Black Art (Best Practices (Microsoft)). The third chapter says that in software management: You cannot give too much time to software developers; if you do, it is likely that the extra time will be filled by other tasks (in other words, the developers will eat that time :)), as per Parkinson's Law. You also cannot squeeze time out of their schedule, because if you do, it is likely that they will deliver a poor-quality product and poor design, which will hurt you in the long run; there will be a panic situation and total chaos in the project, lots of rework, etc. My question is related to the first point. If you don't give enough time, then when will the typical software engineer learn his/her skills? The market is always coming up with new technologies, and you need to learn them. Even with the existing, familiar technologies there are always best practices and dos and don'ts.

    Read the article

  • SQL SERVER – What is Incremental Statistics? – Performance improvements in SQL Server 2014 – Part 1

    - by Pinal Dave
    This is the first part of the series on Incremental Statistics. Here is the index of the complete series: What is Incremental Statistics? – Performance improvements in SQL Server 2014 – Part 1; Simple Example of Incremental Statistics – Performance improvements in SQL Server 2014 – Part 2; DMV to Identify Incremental Statistics – Performance improvements in SQL Server 2014 – Part 3. Statistics are considered one of the most important aspects of SQL Server performance tuning. You might often have heard the phrase related to performance tuning: "Update statistics before you take any other steps to tune performance." Honestly, I have said the above statement many times, and many times I have personally updated statistics before starting any performance tuning exercise. You may agree or disagree with the point, but there is no denying that statistics play an extremely vital role in performance tuning. SQL Server 2014 has a new feature called Incremental Statistics. I have been playing with this feature for quite a while and I find it very interesting. After spending some time with it, I decided to write about the subject here. New in SQL Server 2014 – Incremental Statistics Well, it seems like lots of people want to start using SQL Server 2014's new feature, Incremental Statistics. However, let us understand what this feature actually does and how it can help. I will try to simplify it before I start working on the demo code. Code for all versions of SQL Server Here is the code which you can execute on all versions of SQL Server, and it will update the statistics of your table. The keyword you should pay attention to is WITH FULLSCAN. It will scan the entire table and build brand-new statistics for you, which the SQL Server query optimizer can use for better estimation of your execution plan. UPDATE STATISTICS TableName(StatisticsName) WITH FULLSCAN Who should learn about this? Why? If you are using partitions in your database, you should consider implementing this feature; otherwise, this feature is pretty much not applicable to you. If you are using a single partition and your table data is in a single place, you still have to update your statistics the same way you have been doing. If you are using multiple partitions, this may be a very useful feature for you. In most cases, users have multiple partitions because they have lots of data in their table. Each partition holds the data which belongs to it, and it is very common for each partition to be populated separately in SQL Server. Real World Example For example, if your table contains data related to sales, you will have plenty of entries in your table. It would be a good idea to divide the table into multiple partitions across filegroups; for example, you can divide this table into 3 semesters, 4 quarters, or even 12 months. Let us assume that we have divided our table into 12 different partitions. Now, for the month of January our first partition will be populated, and for the month of February our second partition will be populated. Now assume that you have plenty of data in your first and second partitions. The month of March has just started and your third partition has started to populate. If, for some reason, you want to update your statistics, what will you do? In SQL Server 2012 and earlier versions You will just use the code WITH FULLSCAN and update the entire table. That means that even though you have new data only in the third partition, you will still update the entire table. This will be a VERY resource-intensive process, as you will be updating the statistics of partitions 1 and 2, where the data has not changed at all. In SQL Server 2014 You will just update the statistics of Partition 3. There is a special syntax where you can now specify which partition you want to update. The impact of this is that it smartly merges the new data with the old statistics and updates the overall statistics without doing a FULLSCAN of your entire table. This has a huge impact on performance. Remember that the new feature in SQL Server 2014 does not change anything besides the capability to update a single partition. However, there is one aspect which is indeed attractive: previously, a statistics update was triggered when 20% of the table data changed; now the same threshold applies to a single partition. That means if a single partition sees a 20% data change, it will also trigger a partition-level statistics update which, when merged into your final statistics, will give you better performance. In summary If you are not using partitions, this feature is not applicable to you. If you are using partitions, this feature can be very helpful to you. Tomorrow: We will see working code of SQL Server 2014 Incremental Statistics. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: SQL Statistics, Statistics
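    Pending the working code promised for the next part, here is a minimal hedged sketch of issuing such a partition-level update from client code, written in C#/ADO.NET for consistency with the other examples in this listing. The connection string, table, statistics name and partition number are hypothetical, and it assumes SQL Server 2014 with a statistics object created as incremental (INCREMENTAL = ON).

    using System;
    using System.Data.SqlClient;

    class IncrementalStatsDemo
    {
        static void Main()
        {
            // Hypothetical connection string and object names, for illustration only.
            const string connectionString = "Server=.;Database=SalesDB;Integrated Security=true";

            // SQL Server 2014 syntax for refreshing the statistics of a single partition;
            // the statistics object is assumed to have been created WITH (INCREMENTAL = ON).
            const string sql = "UPDATE STATISTICS dbo.Sales (Sales_IncrStats) " +
                               "WITH RESAMPLE ON PARTITIONS (3);";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();   // Only partition 3 is rescanned and merged.
                Console.WriteLine("Partition-level statistics update issued.");
            }
        }
    }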

    Read the article

  • Differences between software testing processes and techniques?

    - by Aptos
    I get confused by these terms. For example, should unit testing be listed as a software testing process or a technique? I think unit testing is a software testing technique. And how about test-driven development? Can you give me some examples of software testing processes and techniques? In my opinion, a software testing process is a part of the software development life cycle. For example, if we use the V-Model, the software testing process will include system test, acceptance test, integration test... Thank you.
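    For what it's worth, here is what the "technique" side of the distinction looks like in its smallest form: a single unit test. This is a hedged sketch assuming NUnit and a hypothetical Calculator class; the same technique can be applied inside many different processes (a V-Model system/integration/acceptance phase, TDD, and so on).

    using NUnit.Framework;

    // Hypothetical class under test.
    public class Calculator
    {
        public int Add(int a, int b)
        {
            return a + b;
        }
    }

    [TestFixture]
    public class CalculatorTests
    {
        [Test]
        public void Add_TwoPositiveNumbers_ReturnsSum()
        {
            var calculator = new Calculator();

            int result = calculator.Add(2, 3);

            // The technique: verify one small unit in isolation, independent of process.
            Assert.AreEqual(5, result);
        }
    }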

    Read the article

  • Microsoft Introduces WebMatrix

    - by Rick Strahl
    originally published in CoDe Magazine Editorial Microsoft recently released the first CTP of a new development environment called WebMatrix, which along with some of its supporting technologies are squarely aimed at making the Microsoft Web Platform more approachable for first-time developers and hobbyists. But in the process, it also provides some updated technologies that can make life easier for existing .NET developers. Let’s face it: ASP.NET development isn’t exactly trivial unless you already have a fair bit of familiarity with sophisticated development practices. Stick a non-developer in front of Visual Studio .NET or even the Visual Web Developer Express edition and it’s not likely that the person in front of the screen will be very productive or feel inspired. Yet other technologies like PHP and even classic ASP did provide the ability for non-developers and hobbyists to become reasonably proficient in creating basic web content quickly and efficiently. WebMatrix appears to be Microsoft’s attempt to bring back some of that simplicity with a number of technologies and tools. The key is to provide a friendly and fully self-contained development environment that provides all the tools needed to build an application in one place, as well as tools that allow publishing of content and databases easily to the web server. WebMatrix is made up of several components and technologies: IIS Developer Express IIS Developer Express is a new, self-contained development web server that is fully compatible with IIS 7.5 and based on the same codebase that IIS 7.5 uses. This new development server replaces the much less compatible Cassini web server that’s been used in Visual Studio and the Express editions. IIS Express addresses a few shortcomings of the Cassini server such as the inability to serve custom ISAPI extensions (i.e., things like PHP or ASP classic for example), as well as not supporting advanced authentication. IIS Developer Express provides most of the IIS 7.5 feature set providing much better compatibility between development and live deployment scenarios. SQL Server Compact 4.0 Database access is a key component for most web-driven applications, but on the Microsoft stack this has mostly meant you have to use SQL Server or SQL Server Express. SQL Server Compact is not new-it’s been around for a few years, but it’s been severely hobbled in the past by terrible tool support and the inability to support more than a single connection in Microsoft’s attempt to avoid losing SQL Server licensing. The new release of SQL Server Compact 4.0 supports multiple connections and you can run it in ASP.NET web applications simply by installing an assembly into the bin folder of the web application. In effect, you don’t have to install a special system configuration to run SQL Compact as it is a drop-in database engine: Copy the small assembly into your BIN folder (or from the GAC if installed fully), create a connection string against a local file-based database file, and then start firing SQL requests. Additionally WebMatrix includes nice tools to edit the database tables and files, along with tools to easily upsize (and hopefully downsize in the future) to full SQL Server. This is a big win, pending compatibility and performance limits. In my simple testing the data engine performed well enough for small data sets. This is not only useful for web applications, but also for desktop applications for which a fully installed SQL engine like SQL Server would be overkill. 
Having a local data store in those applications that can potentially be accessed by multiple users is a welcome feature. ASP.NET Razor View Engine What? Yet another native ASP.NET view engine? We already have Web Forms and various different flavors of using that view engine with Web Forms and MVC. Do we really need another? Microsoft thinks so, and Razor is an implementation of a lightweight, script-only view engine. Unlike the Web Forms view engine, Razor works only with inline code, snippets, and markup; therefore, it is more in line with current thinking of what a view engine should represent. There’s no support for a “page model” or any of the other Web Forms features of the full-page framework, but just a lightweight scripting engine that works with plain markup plus embedded expressions and code. The markup syntax for Razor is geared for minimal typing, plus some progressive detection of where a script block/expression starts and ends. This results in a much leaner syntax than the typical ASP.NET Web Forms alligator (<% %>) tags. Razor uses the @ sign plus standard C# (or Visual Basic) block syntax to delineate code snippets and expressions. Here’s a very simple example of what Razor markup looks like along with some comment annotations: <!DOCTYPE html> <html>     <head>         <title></title>     </head>     <body>     <h1>Razor Test</h1>          <!-- simple expressions -->     @DateTime.Now     <hr />     <!-- method expressions -->     @DateTime.Now.ToString("T")          <!-- code blocks -->     @{         List<string> names = new List<string>();         names.Add("Rick");         names.Add("Markus");         names.Add("Claudio");         names.Add("Kevin");     }          <!-- structured block statements -->     <ul>     @foreach(string name in names){             <li>@name</li>     }     </ul>           <!-- Conditional code -->        @if(true) {                        <!-- Literal Text embedding in code -->        <text>         true        </text>;    }    else    {        <!-- Literal Text embedding in code -->       <text>       false       </text>;    }    </body> </html> Like the Web Forms view engine, Razor parses pages into code, and then executes that run-time compiled code. Effectively a “page” becomes a code file with markup becoming literal text written into the Response stream, code snippets becoming raw code, and expressions being written out with Response.Write(). The code generated from Razor doesn’t look much different from similar Web Forms code that only uses script tags; so although the syntax may look different, the operational model is fairly similar to the Web Forms engine minus the overhead of the large Page object model. However, there are differences: -Razor pages are based on a new base class, Microsoft.WebPages.WebPage, which is hosted in the Microsoft.WebPages assembly that houses all the Razor engine parsing and processing logic. Browsing through the assembly (in the generated ASP.NET Temporary Files folder or GAC) will give you a good idea of the functionality that Razor provides. If you look closely, a lot of the feature set matches ASP.NET MVC’s view implementation as well as many of the helper classes found in MVC. It’s not hard to guess the motivation for this sort of view engine: For beginning developers the simple markup syntax is easier to work with, although you obviously still need to have some understanding of the .NET Framework in order to create dynamic content. 
The syntax is easier to read and grok and much shorter to type than ASP.NET alligator tags (<% %>) and also easier to understand aesthetically what’s happening in the markup code. Razor also is a better fit for Microsoft’s vision of ASP.NET MVC: It’s a new view engine without the baggage of Web Forms attached to it. The engine is more lightweight since it doesn’t carry all the features and object model of Web Forms with it and it can be instantiated directly outside of the HTTP environment, which has been rather tricky to do for the Web Forms view engine. Having a standalone script parser is a huge win for other applications as well – it makes it much easier to create script or meta driven output generators for many types of applications from code/screen generators, to simple form letters to data merging applications with user customizability. For me personally this is very useful side effect and who knows maybe Microsoft will actually standardize they’re scripting engines (die T4 die!) on this engine. Razor also better fits the “view-based” approach where the view is supposed to be mostly a visual representation that doesn’t hold much, if any, code. While you can still use code, the code you do write has to be self-contained. Overall I wouldn’t be surprised if Razor will become the new standard view engine for MVC in the future – and in fact there have been announcements recently that Razor will become the default script engine in ASP.NET MVC 3.0. Razor can also be used in existing Web Forms and MVC applications, although that’s not working currently unless you manually configure the script mappings and add the appropriate assemblies. It’s possible to do it, but it’s probably better to wait until Microsoft releases official support for Razor scripts in Visual Studio. Once that happens, you can simply drop .cshtml and .vbhtml pages into an existing ASP.NET project and they will work side by side with classic ASP.NET pages. WebMatrix Development Environment To tie all of these three technologies together, Microsoft is shipping WebMatrix with an integrated development environment. An integrated gallery manager makes it easy to download and load existing projects, and then extend them with custom functionality. It seems to be a prominent goal to provide community-oriented content that can act as a starting point, be it via a custom templates or a complete standard application. The IDE includes a project manager that works with a single project and provides an integrated IDE/editor for editing the .cshtml and .vbhtml pages. A run button allows you to quickly run pages in the project manager in a variety of browsers. There’s no debugging support for code at this time. Note that Razor pages don’t require explicit compilation, so making a change, saving, and then refreshing your page in the browser is all that’s needed to see changes while testing an application locally. It’s essentially using the auto-compiling Web Project that was introduced with .NET 2.0. All code is compiled during run time into dynamically created assemblies in the ASP.NET temp folder. WebMatrix also has PHP Editing support with syntax highlighting. You can load various PHP-based applications from the WebMatrix Web Gallery directly into the IDE. Most of the Web Gallery applications are ready to install and run without further configuration, with Wizards taking you through installation of tools, dependencies, and configuration of the database as needed. 
WebMatrix leverages the Web Platform installer to pull the pieces down from websites in a tight integration of tools that worked nicely for the four or five applications I tried this out on. Click a couple of check boxes and fill in a few simple configuration options and you end up with a running application that’s ready to be customized. Nice! You can easily deploy completed applications via WebDeploy (to an IIS server) or FTP directly from within the development environment. The deploy tool also can handle automatically uploading and installing the database and all related assemblies required, making deployment a simple one-click install step. Simplified Database Access The IDE contains a database editor that can edit SQL Compact and SQL Server databases. There is also a Database helper class that facilitates database access by providing easy-to-use, high-level query execution and iteration methods: @{       var db = Database.OpenFile("FirstApp.sdf");     string sql = "select * from customers where Id > @0"; } <ul> @foreach(var row in db.Query(sql,1)){         <li>@row.FirstName @row.LastName</li> } </ul> The query function takes a SQL statement plus any number of positional (@0,@1 etc.) SQL parameters by simple values. The result is returned as a collection of rows which in turn have a row object with dynamic properties for each of the columns giving easy (though untyped) access to each of the fields. Likewise Execute and ExecuteNonQuery allow execution of more complex queries using similar parameter passing schemes. Note these queries use string-based queries rather than LINQ or Entity Framework’s strongly typed LINQ queries. While this may seem like a step back, it’s also in line with the expectations of non .NET script developers who are quite used to writing and using SQL strings in code rather than using OR/M frameworks. The only question is why was something not included from the beginning in .NET and Microsoft made developers build custom implementations of these basic building blocks. The implementation looks a lot like a DataTable-style data access mechanism, but to be fair, this is a common approach in scripting languages. This type of syntax that uses simple, static, data object methods to perform simple data tasks with one line of code are common in scripting languages and are a good match for folks working in PHP/Python, etc. Seems like Microsoft has taken great advantage of .NET 4.0’s dynamic typing to provide this sort of interface for row iteration where each row has properties for each field. FWIW, all the examples demonstrate using local SQL Compact files - I was unable to get a SQL Server connection string to work with the Database class (the connection string wasn’t accepted). However, since the code in the page is still plain old .NET, you can easily use standard ADO.NET code or even LINQ or Entity Framework models that are created outside of WebMatrix in separate assemblies as required. The good the bad the obnoxious - It’s still .NET The beauty (or curse depending on how you look at it :)) of Razor and the compilation model is that, behind it all, it’s still .NET. Although the syntax may look foreign, it’s still all .NET behind the scenes. You can easily access existing tools, helpers, and utilities simply by adding them to the project as references or to the bin folder. Razor automatically recognizes any assembly reference from assemblies in the bin folder. 
In the default configuration, Microsoft provides a host of helper functions in a Microsoft.WebPages assembly (check it out in the ASP.NET temp folder for your application), which includes a host of HTML Helpers. If you’ve used ASP.NET MVC before, a lot of the helpers should look familiar. Documentation at the moment is sketchy-there’s a very rough API reference you can check out here: http://www.asp.net/webmatrix/tutorials/asp-net-web-pages-api-reference Who needs WebMatrix? Uhm… good Question Clearly Microsoft is trying hard to create an environment with WebMatrix that is easy to use for newbie developers. The goal seems to be simplicity in providing a minimal development environment and an easy-to-use script engine/language that makes it easy to get started with. There’s also some focus on community features that can be used as starting points, such as Web Gallery applications and templates. The community features in particular are very nice and something that would be nice to eventually see in Visual Studio as well. The question is whether this is too little too late. Developers who have been clamoring for a simpler development environment on the .NET stack have mostly left for other simpler platforms like PHP or Python which are catering to the down and dirty developer. Microsoft will be hard pressed to win those folks-and other hardcore PHP developers-back. Regardless of how much you dress up a script engine fronted by the .NET Framework, it’s still the .NET Framework and all the complexity that drives it. While .NET is a fine solution in its breadth and features once you get a basic handle on the core features, the bar of entry to being productive with the .NET Framework is still pretty high. The MVC style helpers Microsoft provides are a good step in the right direction, but I suspect it’s not enough to shield new developers from having to delve much deeper into the Framework to get even basic applications built. Razor and its helpers is trying to make .NET more accessible but the reality is that in order to do useful stuff that goes beyond the handful of simple helpers you still are going to have to write some C# or VB or other .NET code. If the target is a hobby/amateur/non-programmer the learning curve isn’t made any easier by WebMatrix it’s just been shifted a tad bit further along in your development endeavor when you run out of canned components that are supplied either by Microsoft or the community. The database helpers are interesting and actually I’ve heard a lot of discussion from various developers who’ve been resisting .NET for a really long time perking up at the prospect of easier data access in .NET than the ridiculous amount of code it takes to do even simple data access with raw ADO.NET. It seems sad that such a simple concept and implementation should trigger this sort of response (especially since it’s practically trivial to create helpers like these or pick them up from countless libraries available), but there it is. It also shows that there are plenty of developers out there who are more interested in ‘getting stuff done’ easily than necessarily following the latest and greatest practices which are overkill for many development scenarios. Sometimes it seems that all of .NET is focused on the big life changing issues of development, rather than the bread and butter scenarios that many developers are interested in to get their work accomplished. 
And that in the end may be WebMatrix’s main raison d'être: To bring some focus back at Microsoft that simpler and more high level solutions are actually needed to appeal to the non-high end developers as well as providing the necessary tools for the high end developers who want to follow the latest and greatest trends. The current version of WebMatrix hits many sweet spots, but it also feels like it has a long way to go before it really can be a tool that a beginning developer or an accomplished developer can feel comfortable with. Although there are some really good ideas in the environment (like the gallery for downloading apps and components) which would be a great addition for Visual Studio as well, the rest of the development environment just feels like crippleware with required functionality missing especially debugging and Intellisense, but also general editor support. It’s not clear whether these are because the product is still in an early alpha release or whether it’s simply designed that way to be a really limited development environment. While simple can be good, nobody wants to feel left out when it comes to necessary tool support and WebMatrix just has that left out feeling to it. If anything WebMatrix’s technology pieces (which are really independent of the WebMatrix product) are what are interesting to developers in general. The compact IIS implementation is a nice improvement for development scenarios and SQL Compact 4.0 seems to address a lot of concerns that people have had and have complained about for some time with previous SQL Compact implementations. By far the most interesting and useful technology though seems to be the Razor view engine for its light weight implementation and it’s decoupling from the ASP.NET/HTTP pipeline to provide a standalone scripting/view engine that is pluggable. The first winner of this is going to be ASP.NET MVC which can now have a cleaner view model that isn’t inconsistent due to the baggage of non-implemented WebForms features that don’t work in MVC. But I expect that Razor will end up in many other applications as a scripting and code generation engine eventually. Visual Studio integration for Razor is currently missing, but is promised for a later release. The ASP.NET MVC team has already mentioned that Razor will eventually become the default MVC view engine, which will guarantee continued growth and development of this tool along those lines. And the Razor engine and support tools actually inherit many of the features that MVC pioneered, so there’s some synergy flowing both ways between Razor and MVC. As an existing ASP.NET developer who’s already familiar with Visual Studio and ASP.NET development, the WebMatrix IDE doesn’t give you anything that you want. The tools provided are minimal and provide nothing that you can’t get in Visual Studio today, except the minimal Razor syntax highlighting, so there’s little need to take a step back. With Visual Studio integration coming later there’s little reason to look at WebMatrix for tooling. It’s good to see that Microsoft is giving some thought about the ease of use of .NET as a platform For so many years, we’ve been piling on more and more new features without trying to take a step back and see how complicated the development/configuration/deployment process has become. Sometimes it’s good to take a step - or several steps - back and take another look and realize just how far we’ve come. 
WebMatrix is one of those reminders and one that likely will result in some positive changes on the platform as a whole. © Rick Strahl, West Wind Technologies, 2005-2010. Posted in ASP.NET, IIS7

    Read the article

  • SQL Server Data Tools–BI for Visual Studio 2013 Re-released

    - by Greg Low
    Customers used to complain that the tooling for creating BI projects (Analysis Services MD and Tabular, Reporting Services, and Integration Services) was based on earlier versions of Visual Studio than the ones they were using for their other work in Visual Studio (such as C#, VB, and ASP.NET projects). To alleviate that problem, the shipment of those tools has been decoupled from the shipment of the SQL Server product. In SQL Server 2014, the BI tooling isn't even included in the released version of SQL Server. This allows the team to keep up to date with the releases of Visual Studio. A little while back, I was really pleased to see that the Visual Studio 2013 update for SSDT-BI (SQL Server Data Tools for Business Intelligence) had been released. Unfortunately, it then had to be withdrawn. The good news is that it's back, and you can get the latest version from here: http://www.microsoft.com/en-us/download/details.aspx?id=42313

    Read the article

  • Welcome to Linchpin People, Tim Mitchell!

    - by andyleonard
    I am honored to welcome Tim Mitchell ( blog | @Tim_Mitchell ) to Linchpin People! Tim brings years of experience consulting with SQL Server, Integration Services, and Business Intelligence to our growing organization. I am overjoyed to be able to work with my friend! Rather than babble on about Linchpin People (using words like "synergy" and "world class"), I direct you to Tim's awesome remarks on his transition, and end with a simple "w00t!" :{>...(read more)

    Read the article

  • WWDC 2010: Apple presented the new version of XCode, Apple's development environment

    Apple unveiled the new version of XCode 4 to the developers attending the 2010 edition of WWDC. Only developers registered for WWDC can currently download XCode 4 on their Mac. XCode 4 requires Mac OS X 10.6.4. What does it bring that's new? Not much is known yet, since everything that happens at WWDC outside of the keynote is under NDA. But despite the NDA, a few rumors are leaking onto the net. It seems that XCode 4 brings better support for code versioning, with a Time Machine-like interface for merging code easily, and better integration with Subversion and Git. Interface Builder would apparently no longer be something that is...

    Read the article

  • Code from my DevConnections Talks and Workshop

    - by dwahlin
    Thanks to everyone who attended my sessions at DevConnections Las Vegas. I had a great time meeting new people, discussing business problems and solutions and interacting. Here’s the code and slides for the sessions.  For those that came to the full-day Silverlight workshop I’ve included the slides that didn’t get printed plus a ton of code to help you get started with various Silverlight topics.   Get Started Building Silverlight Applications Building Architecturally Sound Silverlight Applications Using WCF RIA Services in Silverlight Applications (will post soon) Silverlight Data Integration Options and Usage Scenarios Silverlight Workshop Code

    Read the article

  • Scheduling Jobs in SQL Server Express

    As we all know, SQL Server 2005 Express is a very powerful free edition of SQL Server 2005. However, it does not contain the SQL Server Agent service, so scheduling jobs is not possible. If we want to do this, we have to install a free or commercial 3rd party product. This usually isn't allowed due to the security policies of many hosting companies and thus presents a problem. Maybe we want to schedule daily backups, database reindexing, statistics updating, etc. This is why I wanted to have a solution based only on SQL Server 2005 Express and not dependent on the hosting company. And of course there is one, based on our old friend the Service Broker.

    Read the article
