Search Results

Search found 1367 results on 55 pages for 'eric nguyen'.

Page 14/55 | < Previous Page | 10 11 12 13 14 15 16 17 18 19 20 21  | Next Page >

  • Q&A: Will my favourite ORM Foo work with SQL Azure?

    - by Eric Nelson
    Short answer: quite probably, as SQL Azure is very similar to SQL Server. Longer answer: Object Relational Mappers (ORMs) that work with SQL Server are likely, but not guaranteed, to work with SQL Azure. The differences between the RDBMS versions are small – but may cause problems, for example in tools used to create the mapping between objects and tables, or in generated SQL from the ORM which expects “certain things” :-) More specifically: ADO.NET Entity Framework / LINQ to Entities can be used with SQL Azure, but the Visual Studio designer does not currently work. You will need to point the designer at a version of your database running on SQL Server to create the mapping, then change the connection details to run against SQL Azure. LINQ to SQL has similar issues to ADO.NET Entity Framework above. NHibernate can be used against SQL Azure. DevExpress XPO supports SQL Azure from version 9.3. DataObjects.Net supports SQL Azure. Open Access from Telerik works “seamlessly” – their words, not mine :-) The list above is by no means comprehensive – please leave a comment with details of other ORMs that work (or do not work) with SQL Azure. Related Links: General guidelines and limitations of SQL Azure SQL Azure vs SQL Server

    Read the article

  • Have you downloaded the All-In-One Code Framework?

    - by Eric Nelson
    The Microsoft All-In-One Code Framework is a free, centralized code sample library provided by the Microsoft Community team. It aims to provide typical code samples for all Microsoft development technologies. The team listens to developers’ pains in MSDN forums, social media and various developer communities and writes code samples based on developers’ frequently asked programming tasks. Additionally, our team offers a free code sample request service. Awesome?! I think so. Have also just added it to 99 technical resources for developers and architects inside ISVs – also worth checking out. Check it out on CodePlex

    Read the article

  • Announcing a new Free Windows Azure Platform Trial offer

    - by Eric Nelson
    We now have a truly useful Windows Azure Platform trial, which makes me very happy as I was a vocal critic of the original trial offer. Simply put, the small number of compute hours it included made it useless for many potential early adopters. This is now fixed. The new Introductory Special now includes a generous 750 hours of compute – enough to run a web role 24/7. Enjoy! Related Links Full announcement If you are an ISV then there is a better offer for you via Microsoft Platform Ready and Cloud Essentials, and keep an eye on our events for ISVs as we will be doing Windows Azure Platform technical briefings starting March 31st.

    Read the article

  • Windows Azure Platform eBook Update #2 – 100 pages of goodness

    - by Eric Nelson
    I previously mentioned I was working on a community authored eBook for the Windows Azure Platform. Well, today I assembled the 20 articles that made it through to the end of the review process into a single eBook – and it looks (and reads) great. Still a lot more to do (and stuff in the way of me doing it) but as a teaser, here is the (very draft) table of contents:

    Read the article

  • #OOW 2012: IaaS, Private Cloud, Multitenant Database, and X3H2M2

    - by Eric Bezille
    The title of this post is a summary of the four announcements made by Larry Ellison today, during the opening session of Oracle Open World 2012... To know what's behind X3H2M2 you will have to wait a little, as I will go in order, beginning with the IaaS - Infrastructure as a Service - announcement. Oracle IaaS goes Public... and Private... Starting in 2004 with Fusion development, Oracle Cloud was launched last year to provide not only SaaS applications, based on standard development, but also the underlying PaaS required to build the specifics and the interconnections between applications, in and outside of the Cloud. Still, to cover the end-to-end Cloud Services spectrum, we had to provide an Infrastructure as a Service, leveraging our servers, storage, OS, and virtualization technologies, all "Engineered Together". This Cloud Infrastructure was already available for our customers to rapidly build their own Private Cloud, either on SPARC/Solaris or x86/Linux... The second announcement made today brings that proposition a big step further: for cautious customers (like banks, or sensitive industries) who would like to benefit from the Cloud value of "as a Service" but don't want their data out in the Cloud, we propose that they operate the same systems that provide our Public Cloud Infrastructure - Exadata, Exalogic & SuperCluster - behind their firewall, in a Private Cloud model. Oracle 12c Multitenant Database This is also a major announcement made today on what's coming with Oracle Database 12c: the ability to consolidate multiple databases with no additional cost, especially in terms of memory needed on the server node, which is often THE consolidation-limiting factor. The principle can be compared to Solaris Zones: you have a Database Container, which owns the memory and database background processes, and "Pluggable" Databases inside this Database Container. This particular feature is a strong compelling event to evaluate Oracle Database 12c quickly once it is available, as it is a major step forward in true database consolidation with multitenancy on a shared (optimized) infrastructure. X3H2M2, enabling the new Exadata X3 in-Memory Database Here we are: X3H2M2 stands for X3 (the new version of Exadata, also announced today) Heuristic Hierarchical Mass Memory, providing the capability to keep most if not all of the data in the memory cache hierarchy. Of course, this is the major software enhancement of the new X3 Exadata machine, but as it is software, our current customers will be able to benefit from it on their existing systems by upgrading to the new release. But that's not the only thing we did with X3; at the same time we upgraded everything: the CPUs, adding more cores per server node (16 vs. 12, with the arrival of Intel E5 / Sandy Bridge), the memory, now 512GB per node, and the new Flash Fire card, bringing up to 22 TB of Flash cache. All of this 4TB of RAM + 22TB of Flash is used cleverly, not only for reads but also for writes, by the X3H2M2 algorithm... making a very big difference compared to a traditional storage flash extension. But what do those extra performance gains bring you on an already very efficient system? Double the performance of the fastest storage array on the market today (including flash), with the storage price divided by 10 at the same time... Something to consider closely these days...
    Especially since we also announced the availability of a new Exadata X3-2 eighth rack: a good starting point. As you can see, a major opening for this year, again with true innovation. But that was not the only thing we saw today: before Larry's talk, Fujitsu introduced in more depth the upcoming new SPARC processor that they are co-developing with us. And as such, Andrew Mendelsohn - Senior Vice President, Database Server Technologies - came on stage to explain that the next step after I/O optimization for the database with Exadata was to accelerate the database at the execution level by bringing functions into the SPARC processor silicon. All in all, to process more and more data... The big theme of the day... and of the Oracle User Groups conferences that were also happening today, where I had the opportunity to attend some interesting sessions on practical use cases of Big Data, one on finance and fraud profiling and the other on practical deployment of Oracle Exalytics for data analytics. In conclusion, one picture to try to convey the size of Oracle Open World... and you can understand why, with such rich content... and this is only the first day!

    Read the article

  • How should I plan the inheritance structure for my game?

    - by Eric Thoma
    I am trying to write a platform shooter in C++ with a really good class structure for robustness. The game itself is secondary; it is the learning process of writing it that is primary. I am implementing an inheritance tree for all of the objects in my game, but I find myself unsure about some decisions. One specific issue that is bugging me is this: I have an Actor that is simply defined as anything in the game world. Under Actor is Character. Both of these classes are abstract. Under Character is the Philosopher, who is the main character that the user commands. Also under Character is NPC, which uses an AI module with stock routines for friendly, enemy and (maybe) neutral alignments. So under NPC I want to have three subclasses: FriendlyNPC, EnemyNPC and NeutralNPC. These classes are not abstract, and will often be subclassed in order to make different types of NPCs, like Engineer, Scientist and the most evil Programmer. Still, if I want to implement a generic NPC named Kevin, it would be nice to be able to put him in without making a new class for him. I could just instantiate a FriendlyNPC and pass some values for the AI machine and for the dialogue; that would be ideal. But what if Kevin is the one benevolent Programmer in the whole world? Now we must make a class for him (but what should it be called?). Now we have a character that should inherit from Programmer (as Kevin has all the same abilities but just uses the friendly AI functions) but also should inherit from FriendlyNPC. Programmer and FriendlyNPC branched away from each other on the inheritance tree, so inheriting from both of them would have conflicts, because some of the same functions have been implemented in different ways on the two of them. 1) Is there a better way to order these classes to avoid these conflicts? Having three subclasses (Friendly, Enemy and Neutral) for each type of NPC (Engineer, Scientist, and Programmer) would amount to a huge number of classes. I would share specific implementation details, but I am writing the game slowly, piece by piece, and so I haven't implemented past Character yet. 2) Is there a place where I can learn these programming paradigms? I am already trying to take advantage of some good design patterns, like MVC architecture and Mediator objects. The whole point of this project is to write something in good style. It is difficult to tell what should become a subclass and what should become a state (i.e. Friendly boolean v. Friendly class). Having many states slows down code with if statements and makes classes long and unwieldy. On the other hand, having a class for everything isn't practical. 3) Are there good rules of thumb or resources to learn more about this? 4) Finally, where does templating come into this? How should I fit templates into my class structure? I have never actually taken advantage of templating honestly, but I hear that it increases modularity, which means good code.
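
    Not part of the original question, but a minimal C++ sketch of one common way out of this dilemma (all names below are hypothetical, not taken from the post): make alignment a strategy object that the NPC owns rather than an inheritance axis, so Friendly/Enemy/Neutral stop multiplying the class count and "Kevin, the one benevolent Programmer" needs no class of his own.

    #include <iostream>
    #include <memory>
    #include <string>

    // Strategy interface for alignment; one implementation per stock AI routine.
    struct AIBehavior {
        virtual ~AIBehavior() = default;
        virtual void update() = 0;
    };
    struct FriendlyAI : AIBehavior { void update() override { std::cout << "friendly routine\n"; } };
    struct HostileAI  : AIBehavior { void update() override { std::cout << "hostile routine\n"; } };

    // Anything in the game world.
    class Actor {
    public:
        virtual ~Actor() = default;
        virtual void update() = 0;
    };

    class Character : public Actor { /* shared movement, health, ... */ };

    // NPC delegates its alignment to the injected behavior instead of subclassing it.
    class NPC : public Character {
    public:
        NPC(std::string name, std::unique_ptr<AIBehavior> ai)
            : name_(std::move(name)), ai_(std::move(ai)) {}
        void update() override { ai_->update(); }
    private:
        std::string name_;
        std::unique_ptr<AIBehavior> ai_;
    };

    // Profession subclasses only add profession-specific abilities.
    class Programmer : public NPC {
    public:
        Programmer(std::string name, std::unique_ptr<AIBehavior> ai)
            : NPC(std::move(name), std::move(ai)) {}
    };

    int main() {
        Programmer kevin("Kevin", std::make_unique<FriendlyAI>()); // no FriendlyProgrammer class needed
        kevin.update();
    }

    The same idea extends to dialogue and other per-instance data: pass them to the constructor, and reserve inheritance for genuinely different capabilities.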

    Read the article

  • What is the path to JavaScript mastery?

    - by Eric Wilson
    I know how we start with JavaScript: we cut and paste a snippet to gain a little client-side functionality or validation. But if you follow this path in trying to implement rich interactive behavior, it doesn't take long before you realize that you are creating a Big Ball Of Mud. So what is the path towards expertise in programming the interaction layer? What books, tutorials, exercises, and processes contribute towards the ability to program robust, maintainable JavaScript? We all know that practice is important in any endeavor, but I'm looking for a path similar to the answer here: http://stackoverflow.com/questions/2573135/

    Read the article

  • Darn, no pay rise again then…

    - by Eric Nelson
    Fantastic news… “Great Place to Work Institute has announced Microsoft as the number one best place to work in Europe for the third year running” (more) But… it does nothing to help when you are trying to convince your manager to give you a pay rise because the conditions are so awful here :-)

    Read the article

  • Superb post - What if Visual Studio had Achievements?

    - by Eric Nelson
    This post is simply superb – What if Visual Studio had Achievements :-) Although maybe you need to be a developer who also has an Xbox to fully understand how good it is. My favourites: Shotgun Debugging – 5 Consecutive Solution Rebuilds with a single character change The Architect – Created 25 Interfaces in a single project The Multitasker – Have more than 50 source files open at the same time Every Option Considered – Created an enum with more than 30 values Thanks to Dominic for highlighting it to me!

    Read the article

  • Can I animate render targets or the swap chain?

    - by Eric F.
    I want to animate some synthetic video bits to fullscreen w/o tearing. Can I set up D3D 9/10/11 in exclusive mode, and have it present a series of buffers that I'm writing to? I know how to copy system memory bits into a texture, then draw that texture as a fullscreen quad, but it seems like overkill. Why should I use the triangle rasterizer when I want to do something so simple? All I want to do is set up a long (4-8 buffer) swapchain and set the bits of the back buffer that is about to be displayed. Or, I want to allocate 4-8 RenderTargets, and on each frame, copy the bits from system memory to the RenderTarget, then set it as the next thing to display. I've never seen or heard about anybody doing this, but it seems so dead simple!
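
    Not from the post, but a rough D3D11 sketch of the no-rasterizer path being described, under the assumption that the device, swap chain and a DYNAMIC texture matching the back buffer's size and format already exist (function and parameter names below are illustrative): map the dynamic texture, copy the system-memory bits in, CopyResource it onto the back buffer, then Present with a sync interval of 1 so the flip waits for vblank and avoids tearing.

    #include <d3d11.h>
    #include <cstring>

    // Assumes: dynamicTex was created with D3D11_USAGE_DYNAMIC and D3D11_CPU_ACCESS_WRITE,
    // and has the same width, height and format as the swap chain's back buffer.
    void PresentSystemMemoryFrame(ID3D11DeviceContext* ctx,
                                  IDXGISwapChain* swapChain,
                                  ID3D11Texture2D* dynamicTex,
                                  const unsigned char* srcPixels, // tightly packed BGRA frame
                                  UINT width, UINT height, UINT srcPitch)
    {
        // 1. Upload the synthetic video bits into the dynamic texture.
        D3D11_MAPPED_SUBRESOURCE mapped = {};
        if (SUCCEEDED(ctx->Map(dynamicTex, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
        {
            for (UINT y = 0; y < height; ++y) // copy row by row, pitches may differ
                std::memcpy(static_cast<unsigned char*>(mapped.pData) + y * mapped.RowPitch,
                            srcPixels + y * srcPitch,
                            width * 4);
            ctx->Unmap(dynamicTex, 0);
        }

        // 2. Straight copy onto the back buffer -- no quad, no rasterizer.
        ID3D11Texture2D* backBuffer = nullptr;
        if (SUCCEEDED(swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D),
                                           reinterpret_cast<void**>(&backBuffer))))
        {
            ctx->CopyResource(backBuffer, dynamicTex);
            backBuffer->Release();
        }

        // 3. Present with vsync; combined with (exclusive) fullscreen this avoids tearing.
        swapChain->Present(1, 0);
    }

    Whether this actually beats the textured-quad route is worth measuring; drivers optimize the quad path heavily, and the system-memory copy has to cross the bus either way.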

    Read the article

  • Force Your Mac to Sort Folders on Top of Files (Windows Style)

    - by Eric Z Goodnight
    Even die-hard Mac converts have their issues with Mac OS, and one of those problems is that OS X lists folders mixed in with all other files. Here’s how to fix that in under five minutes with a clever hack. You know you’ve had that issue. You’ve dug through your files looking for that one elusive folder, and because it’s jumbled in with all the other stuff, it’s more or less impossible to find. Have no fear, with no downloads or silly plug-in software, you can finally make Mac OS behave like Windows and Linux and list those folders in the proper order.

    Read the article

  • Free Windows Azure training in Reading, UK on the 25th of May for partners

    - by Eric Nelson
    The 6 weeks of Windows Azure training is full (500 registrations in around a week) but it turns out we have a few places free on the 25th if you can make it to Reading. 14 places when I last checked (today, 5th May). Register now if you can make it. Workshop Outline Module 1: Windows Azure Platform overview Module 2: Introduction to Windows Azure Module 3: Building services using Windows Azure Module 4: Windows Azure storage Module 5: Building applications using SQL Azure Module 6: Introduction to .NET Services Module 7: Building applications using the .NET Service Bus

    Read the article

  • Problem with dpkg-preconfigure, how to correct?

    - by Eric Wilson
    I was trying to install TeamViewer, and I followed the instructions here even though they specify 11.10 instead of 12.04 (what I'm running). In particular, I executed: $ wget http://www.teamviewer.com/download/teamviewer_linux.deb $ sudo dpkg -i teamviewer_linux.deb The dpkg command failed, and ever since, my packaging system has been broken. The software center instructs me to try: $ sudo apt-get -f install which leads to Reading package lists... Done Building dependency tree Reading state information... Done Correcting dependencies... Done The following packages will be REMOVED: teamviewer7:i386 0 upgraded, 0 newly installed, 1 to remove and 17 not upgraded. 9 not fully installed or removed. Need to get 89.0 kB of archives. After this operation, 81.9 MB disk space will be freed. Do you want to continue [Y/n]? y Get:1 http://us.archive.ubuntu.com/ubuntu/ precise/main dash amd64 0.5.7-2ubuntu2 [89.0 kB] Fetched 89.0 kB in 1s (83.9 kB/s) E: Sub-process /usr/sbin/dpkg-preconfigure --apt || true returned an error code (100) E: Failure running script /usr/sbin/dpkg-preconfigure --apt || true At this point I'm stumped.

    Read the article

  • Current State EA: Focus on the Integration!!!

    - by Eric A. Stephens
    A recent project has me at the front end of a large implementation effort covering multiple software components. In addition to the challenges of integrating 15-20 separate and new software components there is the challenge of integrating the portfolio into an existing environment. Like other clients I've worked with and other environments I've worked in for many years, this is typical. The applications are undocumented and under-patched, leading to a mystery for any architect leading change. We can boil down most architecture development methodologies (ADM) into first understanding the current/baseline state and then envisioning one or more future states. Many pundits emphasize the need to focus on the future/target states. I agree, since enterprise architecture (EA) is about where you are going and not so much where you have been. But to be effective in the future, I contend some focused time needs to be spent on the current state. And specifically on the integration. Integration is always the difficult part of a project (I might put it more coarsely at a cocktail party). While I don't have a case study, my anecdotal experience suggests poorly integrated application portfolios tend to cost more to operate and create entropy when trying to respond to new changes and opportunities. In the aforementioned project, I was able to get one of our EAs assigned to focus on just integration almost immediately. While we're still early in the process, this EA is uncovering all sorts of information that will greatly assist our future state planning for this solution. This information is driving early decision making that we anticipate will accelerate our efforts moving forward.

    Read the article

  • Google Keyword Competition rating

    - by Eric
    Google offers a Keyword application that allows me to see the number of times a particular query has been made in Google. There is a column in the results named "Competition" (actually it's "Concurrence" in French; I'm just translating). It's a rating from 0 to 1, as in a percentage. What indicator is that? EDIT: Is this something useful I should rely on? I'm not sure about how to interpret this data. Should I go for less competitive keywords with a lower number of searches, or not worry about it and go for the highly searched keywords anyway? Is 50% considered high? What about 75%? I have a very niche market that sells expensive offline services, so the very long tail is my goal (I assume). If you haven't already figured it out, I'm very new to SEO =)

    Read the article

  • Hiding recent files in Unity dashboard

    - by Eric
    Ubuntu 13.04 (though I had the same issue in both 12.04 LTS and 12.10). Unity desktop (yes I like it, shush). Anyways, when clicking on the dashboard there is a tab for 'Files and Folders'. I don't have any files on this computer that aren't porn. In other words, it displays the images there (as it's supposed to), but I can't have it displaying the porn for obvious reasons. I have disabled 'recent activity' and even added the folder it's all in to the 'do not record activity in the following folders' list. I'm assuming that works, but as I don't actually have any other files, it still displays them. I don't want to have to make it a hidden folder because it's on an external HDD and causes issues when moving from computer to computer (I have other movies on it as well). TL;DR: Get rid of the 'Files and Folders' tab in the dashboard. Is it possible?

    Read the article

  • How do I make my hosting detect _escaped_fragment_ and fetch the corresponding HTML? [on hold]

    - by Eric
    I have an AJAX site and I'm using hashbangs (#!) in my URLs with the intention of then providing the correct HTML versions when Google bots replace the #! with _escaped_fragment_. How do I go about routing/proxying/redirecting the URL with _escaped_fragment_ to the corresponding HTML pages? I can't find documentation on this part of the process specifically, and my first thought was that I should be using a 301 or 302 redirect, but I was told that wasn't the case, albeit not given any more info.

    Read the article

  • How to backup encrypted home in encrypted form only?

    - by Eric
    I want to back up the encrypted home of a user who might be logged in at backup time. Which directories should I back up if I want to ensure that absolutely no plaintext data can be leaked? Are the following folders always encrypted? /home/user/.Private /home/user/.ecryptfs Just want to make sure that no data leaks, as the backup destination is untrustworthy. Edit: Yes, as Lord of Time has suggested, I'd like to know which folders and/or files I need to back up if I need to store only encrypted content in a way that allows me to recover it later with the right passphrase.

    Read the article

  • What is a Histogram, and How Can I Use it to Improve My Photos?

    - by Eric Z Goodnight
    What’s with that weird graph with all the peaks and valleys? You’ve seen it when you open Photoshop or go to edit a camera raw file. But what is that weird thing called a histogram, and what does it mean? The histogram is one of the most important and powerful tools for the digital imagemaker. And with a few moments’ reading, you’ll understand how a few simple rules can make you a much more powerful image editor, as well as helping you shoot better photographs in the first place. So what are you waiting for? Read on!

    Read the article

  • Steve Ballmer on Cloud Computing – We’re all in

    - by Eric Nelson
    Steve spoke last week (March 4th 2010) on the possibilities of the Cloud and its importance to Microsoft. Of our 40,000 people building software, 70% of the people at Microsoft today are working on the Cloud – 90% in a year’s time. In other words: The video is 85 minutes of Steve and there is an easy way of navigating to some sound bites on PressPass. I also like the new website that simplifies our story and commitment around the cloud http://www.microsoft.com/cloud/ which includes an easy mapping between well-known product offerings from Microsoft and their Cloud “equivalents”

    Read the article

  • Unicode license

    - by Eric Grange
    The Unicode Terms of Use (http://www.unicode.org/copyright.html) state that any software that uses their data files (or a modification of them) should carry the Unicode license references. It seems to me that most Unicode libraries have functions to check if a character is a digit, a letter, a symbol, etc., and so will contain a modification of the Unicode Data Files (usually in the form of tables). Does that mean the license applies and all applications that use such Unicode libraries should carry the license? I've checked around, and it appears very few pieces of Unicode software carry the license, though arguably most of those that didn't carry the license were from companies that were members of the Unicode consortium (do they get a license exemption?). Some (e.g. Mozilla) are only "Liaison Members", and while their software does not carry the license (AFAICT), they do obviously rely on data derived from those data files. Is Mozilla in breach of the license? Should we carry the license in all apps that include any form of advanced Unicode support? (i.e. are bound to rely on the Unicode data files) Or is there some form of broad exemption? (since very, very little software out there carries the license)

    Read the article

  • Can I improve my AdWords quality scores with better landing pages?

    - by Eric
    I noticed that I have some keywords in my AdWords that are totally applicable to my site but the quality score of the keyword is 4 or 5. I'd like to get it up higher by creating custom versions of my site's home page (landing page) targeted specifically for people searching on those keywords. So for example, if we pretend my site sells pet food, my current home page has the phrase "dog food." I have a specific AdWords campaign for people searching on cat food (with cat food-specific ads). I'm thinking about changing the URL on those ads to something like http://mysite.com/cat.html, so a different home page comes up with the phrase "cat food." My thinking is that will help Google see that this new landing page is appropriate for the keywords and will raise my quality score for the "cat food" keywords. (Note that none of what I'm doing is shady or misleading; nobody would disagree that all of the keywords and ads I've created are perfect and appropriate for what my site offers.) Question: is what I describe the correct way to raise poor quality scores on keywords, and will it help?

    Read the article

  • A little gem from MPN – FREE online course on Architectural Guidance for Migrating Applications to Windows Azure Platform

    - by Eric Nelson
    I know a lot of technical people who work in partners (ISVs, System Integrators etc). I know that virtually none of them would think of going to the Microsoft Partner Network (MPN) learning portal to find some deep and high quality technical content. Instead they would head to MSDN, Channel 9, msdev.com etc. I am one of those people :-) Hence imagine my surprise when I stumbled upon this little gem Architectural Guidance for Migrating Applications to Windows Azure Platform (your company and hence your Live ID need to be a member of MPN – which is free to join). This is first-class stuff – and represents about 4 hours which is really 8 if you stop and ponder :) Course Structure The course is divided into eight modules. Each module explores a different factor that needs to be considered as part of the migration process. Module 1: Introduction: This section provides an introduction to the training course, highlighting the values of the Windows Azure Platform for developers. Module 2: Dynamic Environment: This section goes into detail about the dynamic environment of the Windows Azure Platform. This session will explain the difference between current development states and the Windows Azure Platform environment, detail the functions of roles, and highlight development considerations to be aware of when working with the Windows Azure Platform. Module 3: Local State: This session details the local state of the Windows Azure Platform. This section details the different types of storage within the Windows Azure Platform (Blobs, Tables, Queues, and SQL Azure). The training will provide technical guidance on local storage usage, how to write to blobs, how to effectively use table storage, and other authorization methods. Module 4: Latency and Timeouts: This session goes into detail explaining the considerations surrounding latency, timeouts and how to assess an IT portfolio. Module 5: Transactions and Bandwidth: This session details the performance metrics surrounding transactions and bandwidth in the Windows Azure Platform environment. This session will detail the transaction and bandwidth costs involved with the Windows Azure Platform and mitigation techniques that can be used to properly manage those costs. Module 6: Authentication and Authorization: This session details authentication and authorization protocols within the Windows Azure Platform. This session will detail information around web methods of authorization, web identification, Access Control benefits, and a walkthrough of Windows Identity Foundation. Module 7: Data Sensitivity: This session details data considerations that users and developers will experience when placing data into the cloud. This section of the training highlights these concerns, and details the strategies that developers can take to increase the security of their data in the cloud. Module 8: Summary: Provides an overall review of the course.

    Read the article

  • Functions that only call other functions. Is this a good practice?

    - by Eric C.
    I'm currently working on a set of reports that have many different sections (all requiring different formatting), and I'm trying to figure out the best way to structure my code. Similar reports we've done in the past end up having very large (200+ line) functions that do all of the data manipulation and formatting for the report, such that the workflow looks something like this: DataTable reportTable = new DataTable(); void RunReport() { reportTable = DataClass.getReportData(); largeReportProcessingFunction(); outputReportToUser(); } I would like to be able to break these large functions up into smaller chunks, but I'm afraid that I'll just end up having dozens of non-reusable functions, and a similar "do everything here" function whose only job is to call all these smaller functions, like so: void largeReportProcessingFunction() { processSection1HeaderData(); calculateSection1HeaderAverages(); formatSection1HeaderDisplay(); processSection1SummaryTableData(); calculateSection1SummaryTableTotalRow(); formatSection1SummaryTableDisplay(); processSection1FooterData(); getSection1FooterSummaryTotals(); formatSection1FooterDisplay(); processSection2HeaderData(); calculateSection1HeaderAverages(); formatSection1HeaderDisplay(); calculateSection1HeaderAverages(); ... } Or, if we go one step further: void largeReportProcessingFunction() { callAllSection1Functions(); callAllSection2Functions(); callAllSection3Functions(); ... } Is this really a better solution? From an organizational point of view I suppose it is (i.e. everything is much more organized than it might otherwise be), but as far as code readability I'm not sure (potentially large chains of functions that only call other functions). Thoughts?
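
    Not from the post (and the snippet above looks like C#), but here is a minimal, language-neutral sketch in C++ of the usual alternative, with hypothetical type names: give each section an object behind a small interface, so the "function that only calls other functions" collapses into a data-driven loop and each section's process/format logic gets a home of its own.

    #include <memory>
    #include <vector>

    struct ReportSection {
        virtual ~ReportSection() = default;
        virtual void process() = 0;  // gather and massage this section's data
        virtual void format() = 0;   // lay the section out for display
    };

    class SummaryTableSection : public ReportSection {
    public:
        void process() override { /* load rows, compute the total row ... */ }
        void format() override  { /* build the display table ... */ }
    };

    class FooterSection : public ReportSection {
    public:
        void process() override { /* footer summary totals ... */ }
        void format() override  { /* footer layout ... */ }
    };

    // The old "do everything" function becomes a loop over data.
    void runReport(const std::vector<std::unique_ptr<ReportSection>>& sections) {
        for (const auto& section : sections) {
            section->process();
            section->format();
        }
    }

    int main() {
        std::vector<std::unique_ptr<ReportSection>> sections;
        sections.push_back(std::make_unique<SummaryTableSection>());
        sections.push_back(std::make_unique<FooterSection>());
        runReport(sections);
    }

    The small helper functions stop being non-reusable clutter once they become private methods of the section that owns them.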

    Read the article

  • Basic is Best

    - by Eric A. Stephens
    Fellow foodies will recognize the recent movement towards "farm-to-table" restaurants. These venues attempt to simplify their menus and source ingredients as close to the farm as possible. I had the opportunity to dine at such a restaurant the other evening. I was gushing about the appetizer to my server when she described the preparation for the item and then punctuated her comments with "basic is best". I reminded my fellow enterprise architect diners there was an architecture lesson in that statement. They rolled their eyes and chuckled. But they also knew I was right. I'm reminded of Frederick Brooks' book The Mythical Man-Month and his latest, The Design of Design. The former, a must-read, talks about complexity. But he refrains from damning all complexity. The world we live in and the enterprises we strive to transform with enterprise architecture are complicated organisms, much like the human body. But sometimes a simple solution is the best approach. Fewer applications (think: portfolio rationalization). Fewer components. Fewer lines of code. Whatever level of abstraction you are working at, less is more. I'm reminded of the enterprise architecture principle "Control Technical Diversity". At one firm I created pithy catch phrases for each principle. I named this one "Less is More". But perhaps another variation is what my server said the other night: "Basic is Best".

    Read the article
