Search Results

Search found 766 results on 31 pages for 'simplicity'.

Page 15 of 31

  • Passing report values to a query

    - by Beavis
    I'm a novice with Microsoft Access, as my background is mostly .NET. I'm sure what I'm trying to accomplish is dead simple, but I need some direction. I have a report and a query. The query returns a single numeric value based on a single numeric criterion:

        Select total from table where id = [topic]

    I have placed a text box on my report so I can feed the id to this query and get the total in return. It seems like DLookUp is what I want, but no matter how I construct it, I get "#Error" in the text box when I run the report. Currently my DLookUp looks like this (I've hard-coded the criteria for now, for simplicity):

        =DLookUp("[total]","myquery","[topic] = 3")

    How can I pass a value from a field on my report to a query so I can get back the query's single numeric value? Thanks.
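    For reference, a minimal sketch of the usual fix - build the criteria string from the report control's value instead of hard-coding it (txtTopic is an assumed name for the text box bound to the id):

        =DLookUp("[total]", "myquery", "[topic] = " & [txtTopic])

    If topic were a text field rather than numeric, the concatenated value would also need quote delimiters around it.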


  • Offline database access

    - by dtech
    I have a small application which basically consists of a user-friendly CRUD interface to a few tables (and joined tables). It currently works with a MySQL database, but I would like to make it available offline. My first thought was to create a SQLite "buffer" between the MySQL database and the application: execute all queries against SQLite, but also store them in a log table so they can be replayed later against the main database, with very basic conflict resolution (I will basically let the user resolve a conflict if one is detected). Due to the simplicity of the application this shouldn't be too difficult, and it would be a good exercise, but I think I would be re-inventing the wheel. So my question is: are there existing solutions or other approaches for this problem?
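    For reference, a minimal SQLite sketch of the replay-log idea (every name here is an assumption for illustration, not part of the original design):

        -- statements executed locally but not yet applied to the MySQL master
        CREATE TABLE pending_statements (
            id          INTEGER PRIMARY KEY AUTOINCREMENT,
            sql_text    TEXT    NOT NULL,
            executed_at TEXT    NOT NULL DEFAULT CURRENT_TIMESTAMP,
            synced      INTEGER NOT NULL DEFAULT 0   -- 0 = pending, 1 = applied
        );

    A sync pass would walk this table in id order, run each statement against MySQL, mark it as synced, and stop to ask the user whenever a conflict is detected.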


  • Will a source-removal sort always return a maximal cycle?

    - by Jason Baker
    I wrote a source-removal algorithm to sort some dependencies between tables in our database, and it turns out we have a cycle. For simplicity, let's say we have tables A, B, C, and D. The edges are like this: (A, B) (B, A) (B, C) (C, D) (D, A) As you can see, there are two cycles here. One is between A and B and another is between all four of them. Will this type of sort always choke on the largest cycle? Or is that not necessarily the case?
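    One quick way to see what a source-removal sort does with this graph is a minimal Kahn-style sketch (illustrative code, not the poster's implementation):

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class SourceRemovalDemo
        {
            static void Main()
            {
                var nodes = new HashSet<string> { "A", "B", "C", "D" };
                var edges = new HashSet<(string From, string To)>
                    { ("A","B"), ("B","A"), ("B","C"), ("C","D"), ("D","A") };

                while (true)
                {
                    // a "source" is a node with no remaining incoming edges
                    var source = nodes.FirstOrDefault(n => !edges.Any(e => e.To == n));
                    if (source == null) break;   // stalled: only cycle-bound nodes remain
                    nodes.Remove(source);
                    edges.RemoveWhere(e => e.From == source);
                }

                Console.WriteLine(string.Join(", ", nodes)); // prints: A, B, C, D
            }
        }

    On these edges the sort stalls immediately (no vertex ever has in-degree zero), which suggests the leftover set is everything on or downstream of any cycle, not the largest cycle specifically.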


  • Android Loading & Playing Sound Based on String

    - by Chance
    I'm currently working on a simple Android app, and right now I am trying to get it to load and play sounds. The problem I am faced with is that I want the sound it uses to be chosen based on a string (with the same name as the sound file). The reason for this is simplicity, both in the code and in adding on to it. Unfortunately I can't just drop a string in place of a reference to the actual sound, but is there some way for me to match a string against the entire raw folder to find the corresponding sound, or some other alternative short of defining every sound manually? Thank you for your time.
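    For reference, a minimal sketch of the usual approach: Resources.getIdentifier() resolves a raw resource id from its name at runtime (soundName and the surrounding context variable are illustrative):

        // look up R.raw.<soundName> by name instead of a compile-time reference
        int resId = context.getResources().getIdentifier(
                soundName, "raw", context.getPackageName());
        if (resId != 0) {                 // 0 means no resource with that name
            MediaPlayer player = MediaPlayer.create(context, resId);
            player.start();
        }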


  • Fast, cross-platform timer?

    - by dsimcha
    I'm looking to improve the D garbage collector by adding some heuristics to avoid garbage collection runs that are unlikely to result in significant freeing. One heuristic I'd like to add is that the GC should not run more than once per X amount of time (maybe once per second or so). To do this I need a timer with the following properties: It must be able to grab the correct time with minimal overhead - calling core.stdc.time takes roughly as long as a small memory allocation, so it's not a good option. Ideally it should be cross-platform (both OS and CPU), for maintenance simplicity. Super-high resolution isn't terribly important; if the times are accurate to maybe 1/4 of a second, that's good enough. It must work in a multithreaded/multi-CPU context, so the x86 rdtsc instruction won't work.
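    For reference, a minimal sketch of the kind of primitive that usually satisfies these constraints on the POSIX side (Windows would use QueryPerformanceCounter or GetTickCount; this is illustrative, not the D runtime's actual code):

        #include <time.h>

        /* cheap, monotonic, thread-safe clock for rate-limiting GC runs */
        double now_seconds(void)
        {
            struct timespec ts;
            clock_gettime(CLOCK_MONOTONIC, &ts);
            return (double)ts.tv_sec + (double)ts.tv_nsec / 1e9;
        }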


  • VB.NET: SQLite to MSSQL

    - by user1736785
    I have a VB.NET project that uses a SQLite database, accessed through dataset/table adapters. The client is happy and all works well. However, I have just heard that they plan on providing this product to another customer that wishes to use their MSSQL database, so I am writing this post to mentally prepare before I begin. I am not a database pro and have really enjoyed the simplicity of setting up and managing a SQLite database. Any ideas on the easiest way to support MSSQL as well? I am happy to run them in parallel with each other. Can I just make a separate service/middleware that syncs the SQLite database to MSSQL on a timer and does not care what the main app is up to? Any pointers are appreciated.
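    For reference, a minimal sketch of one way to keep a single code path over both engines - resolve the ADO.NET provider by name so only configuration changes per customer (providerNameFromConfig and connStr are assumptions; SQLite's provider comes from the System.Data.SQLite package):

        Imports System.Data.Common

        ' "System.Data.SQLite" for the SQLite customer,
        ' "System.Data.SqlClient" for the MSSQL customer
        Dim factory As DbProviderFactory =
            DbProviderFactories.GetFactory(providerNameFromConfig)

        Using conn As DbConnection = factory.CreateConnection()
            conn.ConnectionString = connStr
            conn.Open()
            ' issue everything through the abstract DbCommand/DbDataAdapter types
        End Using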


  • 3D points to quaternions

    - by Hubrus
    For simplicity, we'll consider two 3D points that move relative to each other over time. Let's say: at moment t0, we have P1(0,0,0) and P2(0,2,0); at moment t1, P1 is still (0,0,0) but P2 has changed to (0,2,2). From what I've understood reading about quaternions, at moment t0, Q1 (representing P1) and Q2 (representing P2) will both be (0, 0, 0, 0). But at moment t1, Q2 will become something else (w, x, y, z). How do I calculate Q2 at moment t1? I've googled a lot on this subject, but I was only able to find material on rotation between quaternions. I will appreciate any guidance. Thanks!
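    For reference, a worked sketch of the standard axis-angle construction for the rotation that carries P2's direction at t0 onto its direction at t1 (a single point never defines a quaternion by itself; only the change of direction does):

        v1 = (0,2,0) normalized            -> (0, 1, 0)
        v2 = (0,2,2) normalized            -> (0, 0.7071, 0.7071)
        axis  a = (v1 x v2) / |v1 x v2|    -> (1, 0, 0)
        angle t = acos(v1 . v2)            -> 45 degrees
        q = (cos(t/2), a * sin(t/2))       -> (0.9239, 0.3827, 0, 0)   [w, x, y, z]

    Note also that the identity rotation at t0 would be q = (1, 0, 0, 0), not (0, 0, 0, 0).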


  • Copying the primary key to another field in Access

    - by BashLover
    Hey, I'm struggling to copy the primary key to another field in Access. This fragment is probably irrelevant, but it clarifies what I'm comparing:

        ... WHERE Tunniste=" & [Tarkiste] & ""

    Tunniste is the primary key: an AutoNumber ID generated by Access. Tarkiste is the field I want to copy it into so I can compare the two. I'm open to suggestions; I've already tried Form_Load, using the following code:

        Private Sub Form_Load()
            DoCmd.RunSQL "UPDATE Korut SET [Tarkiste]=('" & Tunniste & "');"
        End Sub

    But this copied the same key to all the entries in the "Tarkiste" field. Put simply, I want a 1:1 copy of field "Tunniste" into "Tarkiste", whichever method it takes. This started from the question "File Picker Replaces All Rows With The Same Choice".
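    For reference, a minimal sketch of the likely fix: the string concatenation bakes the current form record's Tunniste into the SQL, so every row receives that one value, whereas referencing the column copies row by row:

        UPDATE Korut SET [Tarkiste] = [Tunniste];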


  • What's the right way to do polymorphism with protocol buffers?

    - by user364003
    I'm trying to serialize a bunch of objects related by a strong class hierarchy in Java for long-term storage, and I'd like to use protocol buffers to do it due to their simplicity, performance, and ease of upgrade. However, they don't provide much support for polymorphism. Right now, the way I'm handling it is with a "one message to rule them all" solution: it has a required string uri field that allows me to instantiate the correct type via reflection, plus a bunch of optional fields for all the other possible classes I could serialize, only one of which will be used (based on the value of the uri field). Is there a better way to handle polymorphism, or is this as good as I'm going to get?
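    For reference, a minimal .proto sketch of the wrapper pattern described (message and field names here are illustrative, not from the original):

        message Envelope {
          required string type_uri = 1;  // drives reflective instantiation
          optional FooPayload foo = 2;   // exactly one of the optionals is set,
          optional BarPayload bar = 3;   // selected by type_uri
        }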


  • C++ STL: Trouble with string iterators

    - by Rosarch
    I'm making a simple command line Hangman game:

        void Hangman::printStatus() {
            cout << "Lives remaining: " << livesRemaining << endl;
            cout << getFormattedAnswer() << endl;
        }

        string Hangman::getFormattedAnswer() {
            return getFormattedAnswerFrom(correctAnswer.begin(), correctAnswer.end());
        }

        string Hangman::getFormattedAnswerFrom(string::const_iterator begin,
                                               string::const_iterator end) {
            return begin == end ? ""
                : displayChar(*begin) + getFormattedAnswerFrom(++begin, end);
        }

        char Hangman::displayChar(const char c) {
            return c;
        }

    (Eventually, I'll change this so displayChar() displays a - or a character if the user has guessed it, but for simplicity now I'm just returning everything.) When I build and run this from VS 2010, I get a popup box:

        Debug Assertion Failed!
        xstring Line: 78
        Expression: string iterator not dereferenceable

    What am I doing wrong?
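    A hedged note on the likely culprit: the two operands of + are unsequenced relative to each other (pre-C++17), so ++begin may execute before *begin, dereferencing an iterator that has already reached end - exactly the assertion shown. Reading before advancing avoids it (a sketch of the same function, not the only possible fix):

        string Hangman::getFormattedAnswerFrom(string::const_iterator begin,
                                               string::const_iterator end) {
            if (begin == end) return "";
            char c = displayChar(*begin);                     // read first...
            return c + getFormattedAnswerFrom(++begin, end);  // ...then advance
        }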


  • When is it good to use FTP?

    - by Tom Duckering
    In my experience I see a lot of architecture diagrams that make extensive use of FTP as a medium for linking architectural components. As someone who doesn't make architectural decisions but tends to look at architecture diagrams, could anyone explain the value of using FTP, where it's appropriate, and when transferring data as files is a good idea? I get that there are often legacy systems that just need to work that way, although any historical insight would be interesting too. I can see the attraction of transferring files (especially if files are what actually needs to be transferred) because of the simplicity and familiarity, and I wonder whether the reasoning goes beyond this.


  • Entity Framework 4 & WCF Data Service: N:M mapping

    - by JJO
    I have three tables in my database: an A table, a B table, and a many-to-many ABMapping table. For simplicity, A and B are keyed with identity columns; ABMapping has just two columns: AId and BId. I built an Entity Framework 4 model from this, and it correctly identified the N:M mapping between A and B. I then built a WCF Data Service based on this EF model. I'm now trying to consume this WCF Data Service, but unfortunately I can't figure out how to get a mapping between As and Bs to make it back to the database. I've tried something like this:

        A a = new A();
        B b = new B();
        a.Bs.Add(b);
        connection.SaveChanges();

    But this doesn't seem to have worked. Any clues? What am I missing?
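    For reference, a hedged sketch of what the WCF Data Services client usually requires - new entities and the link between them must each be registered with the DataServiceContext (the AddToAs/AddToBs methods are generated per entity set, so the names here are assumptions):

        A a = new A();
        B b = new B();
        connection.AddToAs(a);           // track both new entities...
        connection.AddToBs(b);
        connection.AddLink(a, "Bs", b);  // ...then record the N:M link itself
        connection.SaveChanges();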


  • Rails - How to secure foreign keys and still allow association selection

    - by Bryce
    For simplicity, assume that I have a simple has-many-through relationship:

        class User < ActiveRecord::Base
          has_many :courses, :through => :registrations
        end

        class Registration < ActiveRecord::Base
          belongs_to :user
          belongs_to :course
        end

        class Course < ActiveRecord::Base
          has_many :users, :through => :registrations
        end

    I want to keep my app secure, so I use attr_accessible to whitelist my attributes. My question is twofold: (1) How would I set up my whitelisted attributes such that I could create a new Registration object through a form (passing in :user and :course) without risking those foreign keys being maliciously updated later? (2) How would I set up my validations such that both belongs_to associations are required, BUT Registration objects can still be created in nested forms?
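    For reference, a hedged sketch of one common pattern (not the only answer): whitelist only the attribute the form legitimately sets, derive the user from the session, and validate both associations:

        class Registration < ActiveRecord::Base
          belongs_to :user
          belongs_to :course

          attr_accessible :course_id        # user_id never rides in from the form
          validates_presence_of :user, :course
        end

        # in the controller, the current user supplies the other key:
        # current_user.registrations.build(params[:registration])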


  • Find number of serialized objects

    - by tmosier
    Hello all. My issue is determining the number of objects created, where the objects are deserialized from an XML document. The XML document should be set up for simplicity, so any developer can add an additional object without any further modification to the code. However, each of these objects needs to be handled/updated separately; specifically, some of the objects are of different sub-classes, which need to be handled differently. Here is some pseudocode for what I am going for:

        class A {
            public B myClassB;
            public void Update() {
                // for every object of whatever type in myClassB:
                //     update logic
            }
        }

        XML:
        <data>
            <object1 ... />
            etc...

    So what would be my simplest course of action, allowing others to add objects via the XML while still ensuring the proper logic happens for each?
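    For reference, a hedged sketch of one common shape for this in .NET - XmlSerializer can deserialize a polymorphic list when element names are mapped to sub-classes, after which counting and per-type handling fall out naturally (all type names here are illustrative):

        using System.Collections.Generic;
        using System.Xml.Serialization;

        public abstract class GameObject
        {
            public abstract void Update();    // each sub-class supplies its own logic
        }

        public class SubObjectOne : GameObject { public override void Update() { /* ... */ } }
        public class SubObjectTwo : GameObject { public override void Update() { /* ... */ } }

        [XmlRoot("data")]
        public class Data
        {
            [XmlElement("object1", typeof(SubObjectOne))]  // element name picks the sub-class
            [XmlElement("object2", typeof(SubObjectTwo))]
            public List<GameObject> Objects { get; set; }
            // after deserializing: Objects.Count is the number created, and
            // foreach (var o in Objects) o.Update(); dispatches the right logic
        }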


  • Dynamically Generate Multi-Dimensional Array in Ruby

    - by user335729
    Hi, I'm trying to build a multidimensional array dynamically. What I want is basically this (written out for simplicity):

        b = 0
        test = [[]]
        test[b] << ["a", "b", "c"]
        b += 1
        test[b] << ["d", "e", "f"]
        b += 1
        test[b] << ["g", "h", "i"]

    This gives me the error: NoMethodError: undefined method `<<' for nil:NilClass. I can make it work by setting the array up as test = [[], [], []], but in my actual usage I won't know beforehand how many arrays will be needed. Is there a better way to do this? Thanks
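    For reference, a minimal sketch of the usual fixes: either append rows directly (no counter, no pre-sizing), or create each slot on first use (illustrative, not the poster's real data):

        # simplest: rows append themselves
        test = []
        test << ["a", "b", "c"]
        test << ["d", "e", "f"]
        test << ["g", "h", "i"]

        # or, keeping the index-based shape: allocate the slot on demand
        (test[b] ||= []) << ["a", "b", "c"]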


  • OpenGL multitexture tessellation

    - by user1715296
    I have to tessellate a surface in OpenGL with rectangular textures. Let it be a single triangle for simplicity. The textures touch each other along their sides and do not overlap; that is done by setting GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T to GL_CLAMP_TO_BORDER and adjusting the texture coords properly. Everything goes fine while GL_TEXTURE_MIN_FILTER and GL_TEXTURE_MAG_FILTER are set to GL_NEAREST, but when I apply GL_LINEAR filtering and/or anisotropic filtering, the following artifact appears: the textures' border pixels' alpha gradually falls to transparent, so that a line of background color is visible between neighbouring textures. How can I avoid this artifact without merging the multiple textures into one, while preserving linear filtering?
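    A hedged note on the usual cause: with GL_CLAMP_TO_BORDER, linear filtering blends the edge texels with the (transparent) border color, which is exactly a seam. Clamping to the edge instead makes the outermost texels repeat (a sketch; whether it fits depends on the tiles' content lining up at the edges):

        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);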


  • Complex SQL queries (DELETE)?

    - by Joe
    Hello all, I'm working with three tables; for simplicity's sake let's call them tables A, B, and C. Tables A and B both have a column called id, as well as one other column, Aattribute and Battribute respectively. Table C also has an id column, plus two other columns which hold values for A.id and B.id. Now, in my code I have easy access to the values of both Aattribute and Battribute, and I want to delete the matching row in C. Effectively I want to do something like this:

        DELETE FROM C
        WHERE aid = (SELECT id FROM A WHERE Aattribute = 'myvalue')
          AND bid = (SELECT id FROM B WHERE Battribute = 'myothervalue')

    But this obviously doesn't work. Is there any way to make a single complex query, or do I have to run three queries, first getting the value of A.id using a SELECT with 'myvalue', then the same for B.id, and then using those in the final query? [Edit: it's not letting me comment, so in response to the first comment on this: I tried the above query and it did not work; I figured it just wasn't syntactically correct. Using MS Access, for what it's worth.]
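    For reference, a hedged sketch of a form that Access's SQL dialect is generally happier with - IN instead of = for the subqueries (worth trying, not guaranteed; table and column names are the poster's own):

        DELETE FROM C
        WHERE aid IN (SELECT id FROM A WHERE Aattribute = 'myvalue')
          AND bid IN (SELECT id FROM B WHERE Battribute = 'myothervalue');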


  • Add multiple buttons to a view programmatically, call the same method, determine which button it was

    - by just_another_coder
    I want to programmatically add multiple UIButtons to a view - the number of buttons is unknown at compile time. I can make one or more UIButtons like so (in a loop, but shortened for simplicity):

        UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
        [button addTarget:self action:@selector(buttonClicked:)
            forControlEvents:UIControlEventTouchDown];
        [button setTitle:@"Button x" forState:UIControlStateNormal];
        button.frame = CGRectMake(100.0, 100.0, 120.0, 50.0);
        [view addSubview:button];

    Copied/edited from this link: http://stackoverflow.com/questions/1378765/how-do-i-create-a-basic-uibutton-programmatically But how do I determine in buttonClicked: which button was clicked? I'd like to pass tag data if possible to identify the button.
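    For reference, a minimal sketch of the usual tag-based pattern - the action method receives the tapped button as sender, so stamping each button with its loop index is enough (i is the assumed loop variable):

        // inside the creation loop:
        button.tag = i;

        // the shared handler:
        - (void)buttonClicked:(UIButton *)sender {
            NSLog(@"Button with tag %ld was clicked", (long)sender.tag);
        }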


  • Display System Information on Your Desktop with Desktop Info

    - by Asian Angel
    Do you like to monitor your system but do not want a complicated app to do it with? If you love simplicity and easy configuration, then join us as we look at Desktop Info.

    Desktop Info in Action: Desktop Info comes in a zip file format, so you will need to unzip the app, place it into an appropriate "Program Files" folder, and create a shortcut. Do NOT delete the "Read Me" file - it will be extremely useful to you when you make changes to the configuration file. Once you have everything set up, you are ready to start Desktop Info.

    This is the default layout and set of listings displayed when you start Desktop Info for the first time. The font colors will be a mix of colors, as seen here, and the font size will perhaps be a bit small, but both are very easy to change if desired. You can access the context menu directly over the information area, so there is no need to look for it in the system tray. Notice that you can easily access that important "Read Me" file from here.

    The full contents of the configuration file (.ini file) are displayed so that you can see exactly what kind of information can be shown using the default listings. The first section is "Options" - you will most likely want to increase the font size while you are here. Then "Items" - if you are unhappy with any of the font colors in the information area, this is where you can make the changes, and you can also turn information display items on or off here. And finally "Files, Registry & Event Logs". Here is our displayed information after a few tweaks in the configuration file. Very nice.

    Conclusion: If you have been looking for a system information app that is simple and easy to set up, then you should definitely give Desktop Info a try.

    Links: Download Desktop Info


  • MVC Portable Areas – Web Application Projects

    - by Steve Michelotti
    This is the first post in a series related to build and deployment considerations as I've been exploring MVC Portable Areas:

    #1 – Using Web Application Project to build portable areas
    #2 – Conventions for deploying portable area static files
    #3 – Portable area static files as embedded resources

    Portable Areas is a relatively new feature available in MvcContrib that builds upon the new feature called Areas that was introduced in MVC 2. In short, portable areas provide a way to distribute MVC binary components as simple .NET assemblies rather than an assembly along with all the physical files for the views. At the heart of portable areas is a custom view engine that delivers the *.aspx pages by pulling them from embedded resources rather than from the physical file system. A portable area can be anything from a tiny snippet of html that eventually gets rendered on a page to an entire MVC web application. You should read this 4-part series to get up to speed on what portable areas are.

    Web Application Project: In most of the posts to date, portable areas are shown being created with a simple C# class library. This is cool, and it serves as an effective way to illustrate the simplicity of portable areas. However, the problem with that is that the developer loses out on the normal developer experience with the various tooling/scaffolding options that we've come to expect in Visual Studio, like the ability to easily add controllers, views, etc.

    I've had good results just using a normal web application project (rather than a class library) to develop portable areas and get the normal VS.NET benefits. However, one gotcha that comes as a result is that it's easy to forget to set the file to "Embedded Resource" every time you add a new aspx page. To mitigate this, simply add the MSBuild snippet shown below to your *.csproj file, and all *.aspx and *.ascx files will automatically be set as embedded resources when your project compiles:

        <Target Name="BeforeBuild">
          <ItemGroup>
            <EmbeddedResource Include="**\*.aspx;**\*.ascx" />
          </ItemGroup>
        </Target>

    Also, you should remove the Global.asax from this web application, as it is not the host. Being able to have the normal tooling experience we've come to expect from Visual Studio makes creating portable areas quite simple. This even allows us to do things like creating a project template such as "MVC Portable Area Web Application" that would come pre-configured with routes set up in the PortableAreaRegistration and no Global.asax file.


  • Limiting DOPs – Who rules over whom?

    - by jean-pierre.dijcks
    I've gotten a couple of questions from Dan Morgan and figured I'd start to answer them in this way. Dan is running a big system with Database Resource Manager, and he is trying to make sure the system doesn't go crazy (remember, end users are never, ever crazy!) on very high DOPs.

    Q: How do I control statements with very high DOPs driven from user hints in queries?

    A: The best way to do this is to work with DBRM and impose limits on consumer groups. The Max DOP setting you can set in DBRM allows you to overwrite the hint. Now let's go into some more detail. Assume my object (for simplicity, assume a single object - and do remember that we always pick the highest DOP when in doubt and when conflicting DOPs are available in a query) has PARALLEL 64 as its setting. Assume that the query that selects something cool from that table lives in a consumer group with a max DOP of 32. Assume no goofy things (like running out of parallel_max_servers) are happening. A query selecting from this table will run at DOP 32, because DBRM caps the DOP. As of 11.2.0.1 we also use the DBRM cap to create the original plan (at compile time) and not just to enforce the cap at runtime. Now, my user is smart and writes a query with a parallel hint requesting DOP 128. This query is still capped by DBRM, which overrules the hint in the statement. The statement, despite the hint, runs at DOP 32. Note that in the hinted scenario we do compile the statement with DOP 128 (the optimizer obeys the hint). This is another reason to use table decoration rather than hints.

    Q: What happens if I set parallel_max_servers higher than processes (i.e., the maximum number of processes allowed to run on my machine)?

    A: Processes rules. It is important to understand that processes are fixed at startup time. If you increase parallel_max_servers above the number of processes in the processes parameter, you should get a warning in the alert log stating it cannot take effect. As a follow-up, a hinted query requesting more parallel processes than either parallel_max_servers or processes will not be able to acquire the requested number; parallel_max_servers will prevent this. And since parallel_max_servers should be lower than max processes, you can never go over either.
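    For reference, a hedged sketch of the DBRM knob being described - the DOP cap is a plan-directive attribute (the plan and group names here are illustrative, and the usual pending-area create/validate/submit calls around this are omitted):

        BEGIN
          DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
            plan                     => 'DAYTIME_PLAN',
            group_or_subplan         => 'REPORTING_GROUP',
            comment                  => 'cap reporting queries at DOP 32',
            parallel_degree_limit_p1 => 32);
        END;
        /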


  • “Cloud Integration in Minutes” – True or False?

    - by Bruce Tierney
    The short answer is "yes". Connecting on-premise and cloud applications "in minutes" is true… provided you only consider the connectivity subset of integration and have a small number of cloud integration touch points. At the recent Gartner AADI conference, 230 attendees filled up the Oracle session to get a more comprehensive answer to this question. During the session, titled "Simplifying Integration – The Cloud & Mobile Pre-requisite", Oracle's Tim Hall described cloud connectivity and then, equally importantly, the other essential and sometimes overlooked aspects of integration required to ensure a long-term application and service integration strategy. To understand the challenges and opportunities faced by cloud integration, the session started off with a slide describing how connectivity can quickly transition from simplicity to complexity as the number of applications and service vendor instances grows.

    [Figure: Increased complexity puts increased demand on the integration platform]

    As companies expand from on-premise applications into a hybrid on-premise/cloud infrastructure with support for mobile, cloud, and social, there is a new sense of urgency to implement a unified and comprehensive service integration platform. Without this unified platform in place, companies face increased complexity and cost managing a growing patchwork of niche integration toolsets, as well as the disparate standards mandated by each SaaS vendor, as shown in the image below.

    [Figure: Incomplete and overlapping offerings from a patchwork of niche vendors]

    Also at Gartner AADI, Oracle SOA Suite customer Geeta Pyne, Director of Middleware at BMC, presented BMC's successful strategy for efficiently managing their cloud integration despite disparate requirements from each vendor. From one of Geeta's slides:

    - Interfaces are dictated by SaaS vendors; wide variety (SOAP, REST, Socket, HTTP/POX, SFTP); the flexibility of Oracle Service Bus/SOA Suite helps to support them.
    - Every vendor has its own way to handle security (WS-Security, custom headers); support in Oracle Service Bus helps to adhere to the disparate requirements.

    At BMC, the flexibility of Oracle Service Bus and Oracle SOA Suite allowed them to support the wide variation in functional requirements mandated by their SaaS vendors. In contrast to the patchwork approach of escalating complexity from overlapping SaaS toolkits, Oracle's strategy is to provide a unified platform that supports disparate requirements from your SaaS vendors, on-premise apps, legacy apps, and more. Furthermore, Oracle SOA Suite includes the many aspects of comprehensive integration beyond basic connectivity - orchestration, analytics (BAM, events…), service virtualization, and more - in a single unified interface.

    [Figure: Oracle SOA Suite – Unified and comprehensive]

    To summarize: yes, you can achieve "cloud integration in minutes" when considering the connectivity subset of integration, but be sure to look for ways to simplify as you consider a more comprehensive view of integration beyond basic connectivity, such as service virtualization, management, and event processing. And finally, be sure your integration platform has the deep flexibility to handle the requirements of all your future SaaS applications - many of which are unknown to you now.


  • ArchBeat Link-o-Rama for 11/15/2011

    - by Bob Rhubart
    - Java Magazine - November/December 2011 - by and for the Java Community: Java Magazine is an essential source of knowledge about Java technology, the Java programming language, and Java-based applications for people who rely on them in their professional careers, or who aspire to.
    - Enterprise 2.0 Conference: November 14-17 | Kellsey Ruppel: "Oracle is proud to be a Gold sponsor of the Enterprise 2.0 West Conference, November 14-17, 2011 in Santa Clara, CA. You will see the latest collaboration tools and technologies, and learn from thought leaders in Enterprise 2.0's comprehensive conference."
    - The Return of Oracle Wikis: Bigger and Better | @oracletechnet: The Oracle Wikis are back - this time with Oracle SSO on top and powered by Atlassian's Confluence technology. These wikis offer quite a bit more functionality than the old platform.
    - Cloud Migration Lifecycle | Tom Laszewski: Laszewski breaks down the four steps in the Set Up phase of the cloud migration lifecycle.
    - Architecture all day: Oracle Technology Network Architect Day - Phoenix, AZ - Dec 14: Spend the day with your peers learning from Oracle experts in engineered systems, cloud computing, Oracle Coherence, Oracle WebLogic, and more. Registration is free, but seating is limited.
    - SOA all the Time; Architects in AZ; Clearing Info Integration Hurdles: This week on the Architect Home Page on OTN.
    - Live Webcast: New Innovations in Oracle Linux. Date: Tuesday, November 15, 2011. Time: 9:00 AM PT / Noon ET. Speakers: Chris Mason, Elena Zannoni.
    - People in glass futures should throw stones | Nicholas Carr: "Remember that Microsoft video on our glassy future? Or that one from Corning? Or that one from Toyota?" asks Carr. "What they all suggest, and assume, is that our rich natural 'interface' with the world will steadily wither away as we become more reliant on software mediation."
    - Integration of SABSA Security Architecture Approaches with TOGAF ADM | Jeevak Kasarkod: Jeevak Kasarkod's overview of a new paper from the Open Group and the SABSA Institute, "which delves into the incorporation of risk management and security architecture approaches into a well established enterprise architecture methodology - TOGAF."
    - Cloud Computing at the Tactical Edge | Grace Lewis - SEI: Lewis describes the SEI's work with cloudlets, "lightweight servers running one or more virtual machines (VMs), [that] allow soldiers in the field to offload resource-consumptive and battery-draining computations from their handheld devices to nearby cloudlets."
    - Simplicity Is Good | James Morle: "When designing cluster and storage networking for database platforms, keep the architecture simple and avoid the complexities of multi-tier topologies," says Morle. "Complexity is the enemy of availability."
    - Mainframe as the cloud? | Tom Laszewski: There's nothing new about using the mainframe in the cloud, says Laszewski.
    - Let Devoxx 2011 begin! | The Aquarium: The Aquarium marks the kick-off of Devoxx 2011 with "a quick rundown of the Java EE and GlassFish side of things."


  • Microsoft Introduces WebMatrix

    - by Rick Strahl
    (Originally published as a CoDe Magazine editorial.)

    Microsoft recently released the first CTP of a new development environment called WebMatrix, which, along with some of its supporting technologies, is squarely aimed at making the Microsoft Web Platform more approachable for first-time developers and hobbyists. But in the process, it also provides some updated technologies that can make life easier for existing .NET developers. Let's face it: ASP.NET development isn't exactly trivial unless you already have a fair bit of familiarity with sophisticated development practices. Stick a non-developer in front of Visual Studio .NET or even the Visual Web Developer Express edition and it's not likely that the person in front of the screen will be very productive or feel inspired. Yet other technologies like PHP and even classic ASP did provide the ability for non-developers and hobbyists to become reasonably proficient in creating basic web content quickly and efficiently. WebMatrix appears to be Microsoft's attempt to bring back some of that simplicity with a number of technologies and tools. The key is to provide a friendly and fully self-contained development environment that provides all the tools needed to build an application in one place, as well as tools that allow easy publishing of content and databases to the web server. WebMatrix is made up of several components and technologies:

    IIS Developer Express

    IIS Developer Express is a new, self-contained development web server that is fully compatible with IIS 7.5 and based on the same codebase that IIS 7.5 uses. This new development server replaces the much less compatible Cassini web server that's been used in Visual Studio and the Express editions. IIS Express addresses a few shortcomings of the Cassini server, such as the inability to serve custom ISAPI extensions (things like PHP or classic ASP, for example), as well as the lack of support for advanced authentication. IIS Developer Express provides most of the IIS 7.5 feature set, providing much better compatibility between development and live deployment scenarios.

    SQL Server Compact 4.0

    Database access is a key component for most web-driven applications, but on the Microsoft stack this has mostly meant you have to use SQL Server or SQL Server Express. SQL Server Compact is not new - it's been around for a few years, but it's been severely hobbled in the past by terrible tool support and the inability to support more than a single connection, in Microsoft's attempt to avoid losing SQL Server licensing. The new release of SQL Server Compact 4.0 supports multiple connections, and you can run it in ASP.NET web applications simply by installing an assembly into the bin folder of the web application. In effect, you don't have to install a special system configuration to run SQL Compact, as it is a drop-in database engine: copy the small assembly into your bin folder (or load it from the GAC if installed fully), create a connection string against a local file-based database, and then start firing SQL requests. Additionally, WebMatrix includes nice tools to edit the database tables and files, along with tools to easily upsize (and hopefully, in the future, downsize) to full SQL Server. This is a big win, pending compatibility and performance limits. In my simple testing the data engine performed well enough for small data sets. This is not only useful for web applications, but also for desktop applications for which a fully installed SQL engine like SQL Server would be overkill. Having a local data store in those applications that can potentially be accessed by multiple users is a welcome feature.

    ASP.NET Razor View Engine

    What? Yet another native ASP.NET view engine? We already have Web Forms and various different flavors of using that view engine with Web Forms and MVC. Do we really need another? Microsoft thinks so, and Razor is an implementation of a lightweight, script-only view engine. Unlike the Web Forms view engine, Razor works only with inline code, snippets, and markup; therefore, it is more in line with current thinking of what a view engine should represent. There's no support for a "page model" or any of the other Web Forms features of the full-page framework - just a lightweight scripting engine that works with plain markup plus embedded expressions and code. The markup syntax for Razor is geared for minimal typing, plus some progressive detection of where a script block/expression starts and ends. This results in a much leaner syntax than the typical ASP.NET Web Forms alligator (<% %>) tags. Razor uses the @ sign plus standard C# (or Visual Basic) block syntax to delineate code snippets and expressions. Here's a very simple example of what Razor markup looks like, along with some comment annotations:

        <!DOCTYPE html>
        <html>
            <head>
                <title></title>
            </head>
            <body>
            <h1>Razor Test</h1>

            <!-- simple expressions -->
            @DateTime.Now
            <hr />
            <!-- method expressions -->
            @DateTime.Now.ToString("T")

            <!-- code blocks -->
            @{
                List<string> names = new List<string>();
                names.Add("Rick");
                names.Add("Markus");
                names.Add("Claudio");
                names.Add("Kevin");
            }

            <!-- structured block statements -->
            <ul>
            @foreach(string name in names){
                <li>@name</li>
            }
            </ul>

            <!-- Conditional code -->
            @if(true) {
                <!-- Literal Text embedding in code -->
                <text>
                true
                </text>;
            }
            else
            {
                <!-- Literal Text embedding in code -->
                <text>
                false
                </text>;
            }
            </body>
        </html>

    Like the Web Forms view engine, Razor parses pages into code, and then executes that run-time compiled code. Effectively a "page" becomes a code file, with markup becoming literal text written into the Response stream, code snippets becoming raw code, and expressions being written out with Response.Write(). The code generated from Razor doesn't look much different from similar Web Forms code that only uses script tags; so although the syntax may look different, the operational model is fairly similar to the Web Forms engine, minus the overhead of the large Page object model. However, there are differences: Razor pages are based on a new base class, Microsoft.WebPages.WebPage, which is hosted in the Microsoft.WebPages assembly that houses all the Razor engine parsing and processing logic. Browsing through the assembly (in the generated ASP.NET Temporary Files folder or the GAC) will give you a good idea of the functionality that Razor provides. If you look closely, a lot of the feature set matches ASP.NET MVC's view implementation as well as many of the helper classes found in MVC. It's not hard to guess the motivation for this sort of view engine: for beginning developers the simple markup syntax is easier to work with, although you obviously still need to have some understanding of the .NET Framework in order to create dynamic content. The syntax is easier to read and grok, and much shorter to type, than ASP.NET alligator (<% %>) tags, and it is also easier to understand aesthetically what's happening in the markup code.

    Razor is also a better fit for Microsoft's vision of ASP.NET MVC: it's a new view engine without the baggage of Web Forms attached to it. The engine is more lightweight since it doesn't carry all the features and object model of Web Forms with it, and it can be instantiated directly outside of the HTTP environment, which has been rather tricky to do for the Web Forms view engine. Having a standalone script parser is a huge win for other applications as well – it makes it much easier to create script- or meta-driven output generators for many types of applications, from code/screen generators to simple form letters to data-merging applications with user customizability. For me personally this is a very useful side effect, and who knows, maybe Microsoft will actually standardize their scripting engines (die T4 die!) on this engine. Razor also better fits the "view-based" approach, where the view is supposed to be mostly a visual representation that doesn't hold much, if any, code. While you can still use code, the code you do write has to be self-contained. Overall I wouldn't be surprised if Razor becomes the new standard view engine for MVC in the future – and in fact there have been announcements recently that Razor will become the default script engine in ASP.NET MVC 3.0. Razor can also be used in existing Web Forms and MVC applications, although that's not working currently unless you manually configure the script mappings and add the appropriate assemblies. It's possible to do it, but it's probably better to wait until Microsoft releases official support for Razor scripts in Visual Studio. Once that happens, you can simply drop .cshtml and .vbhtml pages into an existing ASP.NET project and they will work side by side with classic ASP.NET pages.

    WebMatrix Development Environment

    To tie all three of these technologies together, Microsoft is shipping WebMatrix with an integrated development environment. An integrated gallery manager makes it easy to download and load existing projects, and then extend them with custom functionality. It seems to be a prominent goal to provide community-oriented content that can act as a starting point, be it via custom templates or a complete standard application. The IDE includes a project manager that works with a single project and provides an integrated IDE/editor for editing the .cshtml and .vbhtml pages. A run button allows you to quickly run pages in the project manager in a variety of browsers. There's no debugging support for code at this time. Note that Razor pages don't require explicit compilation, so making a change, saving, and then refreshing your page in the browser is all that's needed to see changes while testing an application locally. It's essentially using the auto-compiling Web Project that was introduced with .NET 2.0: all code is compiled at run time into dynamically created assemblies in the ASP.NET temp folder. WebMatrix also has PHP editing support with syntax highlighting. You can load various PHP-based applications from the WebMatrix Web Gallery directly into the IDE. Most of the Web Gallery applications are ready to install and run without further configuration, with wizards taking you through installation of tools, dependencies, and configuration of the database as needed. WebMatrix leverages the Web Platform Installer to pull the pieces down from websites in a tight integration of tools that worked nicely for the four or five applications I tried this out on. Click a couple of check boxes, fill in a few simple configuration options, and you end up with a running application that's ready to be customized. Nice! You can easily deploy completed applications via WebDeploy (to an IIS server) or FTP directly from within the development environment. The deploy tool can also handle automatically uploading and installing the database and all related assemblies required, making deployment a simple one-click install step.

    Simplified Database Access

    The IDE contains a database editor that can edit SQL Compact and SQL Server databases. There is also a Database helper class that facilitates database access by providing easy-to-use, high-level query execution and iteration methods:

        @{
            var db = Database.OpenFile("FirstApp.sdf");
            string sql = "select * from customers where Id > @0";
        }
        <ul>
        @foreach(var row in db.Query(sql, 1)){
            <li>@row.FirstName @row.LastName</li>
        }
        </ul>

    The Query function takes a SQL statement plus any number of positional (@0, @1, etc.) SQL parameters passed as simple values. The result is returned as a collection of rows, where each row object has dynamic properties for each of the columns, giving easy (though untyped) access to each of the fields. Likewise, Execute and ExecuteNonQuery allow execution of more complex queries using similar parameter-passing schemes. Note that these are string-based queries rather than LINQ or Entity Framework's strongly typed LINQ queries. While this may seem like a step back, it's also in line with the expectations of non-.NET script developers, who are quite used to writing and using SQL strings in code rather than OR/M frameworks. The only question is why something like this wasn't included in .NET from the beginning, instead of making developers build custom implementations of these basic building blocks. The implementation looks a lot like a DataTable-style data access mechanism but, to be fair, this is a common approach in scripting languages: simple, static data-object methods that perform simple data tasks with one line of code are common there and a good match for folks working in PHP, Python, etc. It seems Microsoft has taken great advantage of .NET 4.0's dynamic typing to provide this sort of interface for row iteration, where each row has properties for each field. FWIW, all the examples demonstrate using local SQL Compact files - I was unable to get a SQL Server connection string to work with the Database class (the connection string wasn't accepted). However, since the code in the page is still plain old .NET, you can easily use standard ADO.NET code, or even LINQ or Entity Framework models created outside of WebMatrix in separate assemblies, as required.

    The Good, the Bad, the Obnoxious - It's Still .NET

    The beauty (or curse, depending on how you look at it :)) of Razor and the compilation model is that, behind it all, it's still .NET. Although the syntax may look foreign, it's still all .NET behind the scenes. You can easily access existing tools, helpers, and utilities simply by adding them to the project as references or to the bin folder; Razor automatically recognizes any assembly reference from assemblies in the bin folder. In the default configuration, Microsoft provides a host of helper functions in a Microsoft.WebPages assembly (check it out in the ASP.NET temp folder for your application), which includes a host of HTML helpers. If you've used ASP.NET MVC before, a lot of the helpers should look familiar. Documentation at the moment is sketchy - there's a very rough API reference you can check out here: http://www.asp.net/webmatrix/tutorials/asp-net-web-pages-api-reference

    Who Needs WebMatrix? Uhm… Good Question

    Clearly Microsoft is trying hard to create an environment with WebMatrix that is easy to use for newbie developers. The goal seems to be simplicity: a minimal development environment and an easy-to-use script engine/language that makes it easy to get started. There's also some focus on community features that can be used as starting points, such as Web Gallery applications and templates. The community features in particular are very nice and something that would be nice to eventually see in Visual Studio as well. The question is whether this is too little, too late. Developers who have been clamoring for a simpler development environment on the .NET stack have mostly left for other, simpler platforms like PHP or Python, which cater to the down-and-dirty developer. Microsoft will be hard pressed to win those folks - and other hardcore PHP developers - back. Regardless of how much you dress up a script engine fronted by the .NET Framework, it's still the .NET Framework and all the complexity that drives it. While .NET is a fine solution in its breadth and features once you get a basic handle on the core features, the bar of entry to being productive with the .NET Framework is still pretty high. The MVC-style helpers Microsoft provides are a good step in the right direction, but I suspect it's not enough to shield new developers from having to delve much deeper into the Framework to get even basic applications built. Razor and its helpers try to make .NET more accessible, but the reality is that in order to do useful stuff that goes beyond the handful of simple helpers, you are still going to have to write some C# or VB or other .NET code. If the target is a hobbyist/amateur/non-programmer, the learning curve isn't made any easier by WebMatrix; it's just been shifted a bit further along in your development endeavor, to the point where you run out of the canned components supplied either by Microsoft or the community.

    The database helpers are interesting, and I've actually heard a lot of discussion from various developers who've been resisting .NET for a really long time perking up at the prospect of easier data access in .NET than the ridiculous amount of code it takes to do even simple data access with raw ADO.NET. It seems sad that such a simple concept and implementation should trigger this sort of response (especially since it's practically trivial to create helpers like these, or pick them up from countless available libraries), but there it is. It also shows that there are plenty of developers out there who are more interested in getting stuff done easily than in necessarily following the latest and greatest practices, which are overkill for many development scenarios. Sometimes it seems that all of .NET is focused on the big, life-changing issues of development rather than the bread-and-butter scenarios that many developers care about to get their work accomplished. And that, in the end, may be WebMatrix's main raison d'être: to bring some focus back at Microsoft on the fact that simpler, more high-level solutions are actually needed to appeal to non-high-end developers, while still providing the necessary tools for the high-end developers who want to follow the latest and greatest trends.

    The current version of WebMatrix hits many sweet spots, but it also feels like it has a long way to go before it can really be a tool that a beginning developer or an accomplished developer can feel comfortable with. Although there are some really good ideas in the environment (like the gallery for downloading apps and components) which would be a great addition for Visual Studio as well, the rest of the development environment just feels like crippleware, with required functionality missing - especially debugging and IntelliSense, but also general editor support. It's not clear whether this is because the product is still in an early alpha release or whether it's simply designed that way to be a really limited development environment. While simple can be good, nobody wants to feel left out when it comes to necessary tool support, and WebMatrix just has that left-out feeling to it. If anything, WebMatrix's technology pieces (which are really independent of the WebMatrix product) are what is interesting to developers in general. The compact IIS implementation is a nice improvement for development scenarios, and SQL Compact 4.0 seems to address a lot of concerns that people have had and complained about for some time with previous SQL Compact implementations. By far the most interesting and useful technology, though, seems to be the Razor view engine, for its lightweight implementation and its decoupling from the ASP.NET/HTTP pipeline to provide a standalone, pluggable scripting/view engine. The first winner of this is going to be ASP.NET MVC, which can now have a cleaner view model that isn't inconsistent due to the baggage of non-implemented Web Forms features that don't work in MVC. But I expect that Razor will eventually end up in many other applications as a scripting and code-generation engine. Visual Studio integration for Razor is currently missing, but is promised for a later release. The ASP.NET MVC team has already mentioned that Razor will eventually become the default MVC view engine, which will guarantee continued growth and development of this tool along those lines. And the Razor engine and support tools actually inherit many of the features that MVC pioneered, so there's some synergy flowing both ways between Razor and MVC. As an existing ASP.NET developer who's already familiar with Visual Studio and ASP.NET development, the WebMatrix IDE doesn't give you anything that you want: the tools provided are minimal and offer nothing you can't get in Visual Studio today, except the minimal Razor syntax highlighting, so there's little need to take a step back. With Visual Studio integration coming later, there's little reason to look at WebMatrix for tooling.

    It's good to see that Microsoft is giving some thought to the ease of use of .NET as a platform. For so many years, we've been piling on more and more new features without trying to take a step back and see how complicated the development/configuration/deployment process has become. Sometimes it's good to take a step - or several steps - back, take another look, and realize just how far we've come. WebMatrix is one of those reminders, and one that will likely result in some positive changes on the platform as a whole.

    © Rick Strahl, West Wind Technologies, 2005-2010. Posted in ASP.NET, IIS7.


  • Transparent Technology from Amazon

    - by David Dorf
    Amazon has been making some interesting moves again, this time in the augmented humanity area. Augmented humanity is about helping humans overcome their shortcomings using technology. Putting a powerful smartphone in your pocket helps you in many ways, like navigating streets, communicating with far-off friends, and accessing information. But the interface for smartphones is somewhat limiting and unnatural, so companies have been looking for ways to make the technology more transparent and therefore easier to use.

    When Apple helped us drop the stylus, we took a giant leap forward in simplicity. Using touchscreens with intuitive gestures was part of the iPhone's original appeal. People don't want to know that technology is there - they just want the benefits. So what's the next leap beyond the touchscreen to make smartphones even easier to use? Two natural ways we interact with the world around us are sight and voice. Google and Apple have been using both in their mobile platforms for limited use cases. Nobody actually wants to type a text message, so why not just speak it? And if you want more information about a book, why not just snap a picture of the cover? That's much more accurate than trying to key in the title and/or author.

    So what's Amazon been doing? First, Amazon released a new iPhone app called Flow that allows iPhone users to see information about products in context. Yes, it's an augmented reality app that uses the phone's camera to view products and overlays data about the products on the screen. For the most part it requires the barcode to be visible to correctly identify the product, but I believe it can also recognize certain logos as well. Download the app and try it out, but don't expect perfection: it's good enough to demonstrate the concept, but far from accurate enough. (MobileBeat did a pretty good review.) Extrapolate to the future and we might just have a heads-up display in our eyeglasses.

    The second interesting area is voice response, for which Siri is getting lots of attention. Amazon may have purchased a voice recognition company called Yap, although the deal is not confirmed. But it would make perfect sense, especially with the Kindle Fire in Amazon's lineup.

    I believe over the next 3-5 years the way in which we interact with smartphones will mature, and they will become more transparent yet more important to our daily lives. This will, of course, impact the way we shop, making information more readily accessible than it already is. Amazon seems to be positioning itself to be at the forefront of this trend, so we should be watching them carefully.

