Search Results

Search found 1280 results on 52 pages for 'eric maurice'.


  • FREE goodies if you are a UK based software house already live on the Windows Azure Platform

    - by Eric Nelson
    In the UK we have seen some fantastic take-up of the Windows Azure Platform, and we have lined up some great stuff in 2011 to help companies fully exploit the Cloud – but we need you to tell us what you are up to! Once you tell us about your plans around Windows Azure, you will get access to FREE benefits including email-based developer support and a free monthly allowance of Windows Azure, SQL Azure and AppFabric from Jan 2011 – and more! (This offer is referred to as Cloud Essentials and is explained here.) And… we will be able to plan the right amount of activity to continue to help early adopters through 2011.
    Step 1: Sign up your company to Microsoft Platform Ready (you will need a Windows Live ID to do this).
    Step 2: Add your applications. For each application, state your intention around Windows Azure (and SQL Azure etc. if you so wish).
    Step 3: Verify your application works on the Windows Azure Platform.
    Step 4 (Optional): Test your application works on the Windows Azure Platform. Download the FREE test tool, test your application with it and upload the successful results.
    Step 5: Revisit the MPR site in early January to get details of Cloud Essentials and other benefits.
    P.S. You might want some background on the "fantastic take-up" bit:
    We helped over 3000 UK companies deploy test applications during the beta phase of Windows Azure.
    We directly trained over 1000 UK developers during 2010.
    We already have over 100 UK applications profiled on the Microsoft Platform Ready site.
    And in a recent survey of UK ISVs you all look pretty excited about the Cloud – 42% already offer their solution on the Cloud or plan to.

    Read the article

  • Announcing a new Free Windows Azure Platform Trial offer

    - by Eric Nelson
    We now have a truly useful Windows Azure Platform trial, which makes me very happy, as I was a vocal critic of the original trial offer. Simply put, the small number of compute hours it included made it useless for many potential early adopters. This is now fixed. The new Introductory Special includes a generous 750 hours of compute – enough to run a web role 24/7. Enjoy! Related Links: Full announcement. If you are an ISV then there is a better offer for you via Microsoft Platform Ready and Cloud Essentials – and keep an eye on our events for ISVs, as we will be doing Windows Azure Platform technical briefings starting March 31st.

    Read the article

  • #OOW 2012 : IaaS, Private Cloud, Multitenant Database, and X3H2M2

    - by Eric Bezille
    The title of this post is a summary of the four announcements made by Larry Ellison today during the opening session of Oracle Open World 2012... To learn what's behind X3H2M2 you will have to wait a little, as I will go in order, beginning with the IaaS – Infrastructure as a Service – announcement.
    Oracle IaaS goes Public... and Private... Starting in 2004 with Fusion development, Oracle Cloud was launched last year to provide not only SaaS applications, based on standard development, but also the underlying PaaS required to build the specifics and the necessary interconnections between applications, in and outside of the Cloud. Still, to cover the end-to-end Cloud Services spectrum, we had to provide an Infrastructure as a Service, leveraging our server, storage, OS and virtualization technologies, all "Engineered Together". This Cloud infrastructure was already available for our customers to rapidly build their own Private Cloud, either on SPARC/Solaris or x86/Linux... The second announcement made today takes that proposition a big step further: for cautious customers (like banks, or sensitive industries) who would like to benefit from the Cloud value of "as a Service" but don't want their data out in the Cloud, we propose that they operate the same systems that provide our Public Cloud infrastructure – Exadata, Exalogic & SuperCluster – behind their own firewall, in a Private Cloud model.
    Oracle 12c Multitenant Database This is also a major announcement made today on what's coming with Oracle Database 12c: the ability to consolidate multiple databases with no additional cost, especially in terms of the memory needed on the server node, which is often THE limiting factor for consolidation. The principle can be compared to Solaris Zones: you have a Database Container, which "owns" the memory and the database background processes, and "Pluggable" Databases inside that Database Container. This particular feature is a strong compelling reason to evaluate Oracle Database 12c quickly once it is available, as it is a major step forward into true database consolidation with multitenancy on a shared (optimized) infrastructure.
    X3H2M2, enabling the new Exadata X3 in-Memory Database Here we are: X3H2M2 stands for X3 (the new version of Exadata, also announced today) Heuristic Hierarchical Mass Memory, providing the capability to keep most if not all of the data in the memory cache hierarchy. Of course, this is the major software enhancement of the new X3 Exadata machine, but as it is software, our current customers will be able to benefit from it on their existing systems by upgrading to the new release. But that's not the only thing we did with X3; at the same time we upgraded everything: the CPUs, adding more cores per server node (16 vs. 12, with the arrival of Intel E5 / Sandy Bridge), the memory, now 512GB per node, and the new Flash Fire card, bringing up to 22 TB of Flash cache. All of this – 4TB of RAM plus 22TB of Flash – is used cleverly not only for reads but also for writes by the X3H2M2 algorithm, making a very big difference compared to a traditional storage flash extension. And what does that extra performance bring you on an already very efficient system? Double the performance of the fastest storage array on the market today (including flash) while dividing your storage price by 10 at the same time... Something to consider closely these days...
    Especially as we also announced the availability of a new Exadata X3-2 eighth rack: a good starting point. As you have seen, a major opening for this year again, with true innovation. But that was not the only thing we saw today: before Larry's talk, Fujitsu introduced in more depth the upcoming new SPARC processor that they are co-developing with us. And Andrew Mendelsohn – Senior Vice President, Database Server Technologies – came on stage to explain that the next step after I/O optimization for the database with Exadata is to accelerate the database at execution level by bringing functions into the SPARC processor silicon. All in all, to process more and more data... The big theme of the day... and of the Oracle User Group conferences that were also happening today, where I had the opportunity to attend some interesting sessions on practical use cases of Big Data – one on finance and fraud profiling, and another on practical deployment of Oracle Exalytics for data analytics. In conclusion, one picture to try to convey the scale of Oracle Open World... and you can understand why, with such rich content... and this is only the first day!

    Read the article

  • How should I plan the inheritance structure for my game?

    - by Eric Thoma
    I am trying to write a platform shooter in C++ with a really good class structure for robustness. The game itself is secondary; it is the learning process of writing it that is primary. I am implementing an inheritance tree for all of the objects in my game, but I find myself unsure on some decisions. One specific issue that is bugging me is this: I have an Actor that is simply defined as anything in the game world. Under Actor is Character. Both of these classes are abstract. Under Character is the Philosopher, who is the main character that the user commands. Also under Character is NPC, which uses an AI module with stock routines for friendly, enemy and (maybe) neutral alignments. So under NPC I want to have three subclasses: FriendlyNPC, EnemyNPC and NeutralNPC. These classes are not abstract, and will often be subclassed in order to make different types of NPCs, like Engineer, Scientist and the most evil Programmer. Still, if I want to implement a generic NPC named Kevin, it would be nice to be able to put him in without making a new class for him. I could just instantiate a FriendlyNPC and pass some values for the AI machine and for the dialogue; that would be ideal. But what if Kevin is the one benevolent Programmer in the whole world? Now we must make a class for him (but what should it be called?). Now we have a character that should inherit from Programmer (as Kevin has all the same abilities but just uses the friendly AI functions) but also should inherit from FriendlyNPC. Programmer and FriendlyNPC branched away from each other on the inheritance tree, so inheriting from both of them would create conflicts, because some of the same functions have been implemented in different ways on the two of them. 1) Is there a better way to order these classes to avoid these conflicts? Having three subclasses (Friendly, Enemy and Neutral) of each type of NPC (Engineer, Scientist, and Programmer) would amount to a huge number of classes. I would share specific implementation details, but I am writing the game slowly, piece by piece, and so I haven't implemented past Character yet. 2) Is there a place where I can learn these programming paradigms? I am already trying to take advantage of some good design patterns, like MVC architecture and Mediator objects. The whole point of this project is to write something in good style. It is difficult to tell what should become a subclass and what should become a state (i.e. a Friendly boolean v. a Friendly class). Having many states slows down code with if statements and makes classes long and unwieldy. On the other hand, having a class for everything isn't practical. 3) Are there good rules of thumb or resources to learn more about this? 4) Finally, where does templating come into this? How should I coordinate templates into my class structure? I have honestly never taken advantage of templating, but I hear that it increases modularity, which means good code.
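
    To make the tree concrete, here is a minimal C++ sketch of the hierarchy described above, with one possible way around the Programmer/FriendlyNPC clash: the alignment (the stock friendly/enemy/neutral AI routines) is injected as a component rather than inherited as a base class. All names and signatures are illustrative assumptions, not code from the actual project.

        // Minimal sketch (illustrative names): the alignment is a component the NPC
        // owns, so a "friendly Programmer" needs no extra class and no multiple inheritance.
        #include <memory>

        struct AIModule {                        // stock friendly/enemy/neutral routines
            virtual ~AIModule() = default;
            virtual void think() = 0;
        };
        struct FriendlyAI : AIModule { void think() override { /* friendly routines */ } };
        struct EnemyAI    : AIModule { void think() override { /* enemy routines   */ } };

        class Actor {                            // anything in the game world (abstract)
        public:
            virtual ~Actor() = default;
            virtual void update(float dt) = 0;
        };

        class Character : public Actor {};       // still abstract: update() not defined yet

        class NPC : public Character {
        public:
            explicit NPC(std::unique_ptr<AIModule> ai) : ai_(std::move(ai)) {}
            void update(float) override { ai_->think(); }
        private:
            std::unique_ptr<AIModule> ai_;
        };

        class Programmer : public NPC {          // profession adds abilities; alignment is
        public:                                  // whatever AIModule it was constructed with
            using NPC::NPC;
        };

        int main() {
            NPC kevin{std::make_unique<FriendlyAI>()};              // generic NPC, no new class
            Programmer benevolent{std::make_unique<FriendlyAI>()};  // the one benevolent Programmer
            kevin.update(0.016f);
            benevolent.update(0.016f);
        }

    With the alignment carried by the AIModule the NPC owns, "Kevin the benevolent Programmer" is just a Programmer constructed with the friendly routines, and the Friendly/Enemy/Neutral times Engineer/Scientist/Programmer class explosion never happens. Whether that fits the rest of the design is, of course, the author's call.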

    Read the article

  • Windows Azure Platform eBook Update #2 – 100 pages of goodness

    - by Eric Nelson
    I previously mentioned I was working on a community authored eBook for the Windows Azure Platform. Well, today I assembled the 20 articles that made it through to the end of the review process into a single eBook – and it looks (and reads) great. Still a lot more to do (and stuff in the way of me doing it) but as a teaser, here is the (very draft) table of contents:

    Read the article

  • Superb post - What if Visual Studio had Achievements?

    - by Eric Nelson
    This post is simply superb – What if Visual Studio had Achievements :-) Although maybe you need to be a developer who also has an Xbox to fully understand how good it is. My favourites:
    Shotgun Debugging – 5 Consecutive Solution Rebuilds with a single character change
    The Architect – Created 25 Interfaces in a single project
    The Multitasker – Have more than 50 source files open at the same time
    Every Option Considered – Created an enum with more than 30 values
    Thanks to Dominic for highlighting it to me!

    Read the article

  • What is the path to JavaScript mastery?

    - by Eric Wilson
    I know how we start with JavaScript: we cut and paste a snippet to gain a little client-side functionality or validation. But if you follow this path in trying to implement rich interactive behavior, it doesn't take long before you realize that you are creating a Big Ball Of Mud. So what is the path towards expertise in programming the interaction layer? What books, tutorials, exercises, and processes contribute towards the ability to program robust, maintainable JavaScript? We all know that practice is important in any endeavor, but I'm looking for a path similar to the answer here: http://stackoverflow.com/questions/2573135/

    Read the article

  • Darn, no pay rise again then…

    - by Eric Nelson
    Fantastic news… "Great Place to Work Institute has announced Microsoft as the number one best place to work in Europe for the third year running" (more) But… it does nothing to help when you are trying to convince your manager to give you a pay rise because the conditions are so awful here :-)

    Read the article

  • Can I animate render targets or the swap chain?

    - by Eric F.
    I want to animate some synthetic video bits to fullscreen w/o tearing. Can I set up D3D 9/10/11 in exclusive mode, and have it present a series of buffers that I'm writing to? I know how to copy system memory bits into a texture, then draw that texture as a fullscreen quad, but it seems like overkill. Why should I use the triangle rasterizer when I want to do something so simple? All I want to do is set up a long (4-8 buffer) swapchain and set the bits of the back buffer that is about to be displayed. Or, I want to allocate 4-8 RenderTargets, and on each frame, copy the bits from system memory to the RenderTarget, then set it as the next thing to display. I've never seen or heard about anybody doing this, but it seems so dead simple!
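
    For what it's worth, here is a minimal Direct3D 11 sketch of the idea described above – filling a texture from system memory and copying it straight into the swap-chain back buffer, with no quad or rasterizer involved. It assumes the device, context and swap chain already exist and that the CPU frame matches the back buffer's size and BGRA format; the function and variable names are illustrative, not a statement of how any particular engine does it.

        // Sketch: push CPU pixels straight to the back buffer each frame, no quad drawn.
        // Assumes the device, context and swap chain already exist, and that cpuPixels is
        // width*height BGRA data (rowPitch bytes per row) matching the back buffer.
        #include <d3d11.h>
        #include <wrl/client.h>
        using Microsoft::WRL::ComPtr;

        void PresentCpuFrame(ID3D11Device* device, ID3D11DeviceContext* ctx,
                             IDXGISwapChain* swapChain, const void* cpuPixels,
                             UINT width, UINT height, UINT rowPitch)
        {
            // Intermediate default-usage texture we can fill with UpdateSubresource.
            // (Created per call only for brevity; a real loop would create it once.)
            D3D11_TEXTURE2D_DESC desc = {};
            desc.Width = width;   desc.Height = height;
            desc.MipLevels = 1;   desc.ArraySize = 1;
            desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;    // must match the swap chain format
            desc.SampleDesc.Count = 1;
            desc.Usage = D3D11_USAGE_DEFAULT;
            desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; // not used for drawing here
            ComPtr<ID3D11Texture2D> frameTex;
            device->CreateTexture2D(&desc, nullptr, &frameTex);

            ctx->UpdateSubresource(frameTex.Get(), 0, nullptr, cpuPixels, rowPitch, 0);

            // Copy into the back buffer (same size/format) and present with vsync: no tearing.
            ComPtr<ID3D11Texture2D> backBuffer;
            swapChain->GetBuffer(0, IID_PPV_ARGS(&backBuffer));
            ctx->CopyResource(backBuffer.Get(), frameTex.Get());
            swapChain->Present(1, 0);
        }

    In a real loop the intermediate texture would be created once up front rather than per call, and a swap chain created with a larger BufferCount is one way to get the 4-8 buffer queue the question mentions.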

    Read the article

  • Force Your Mac to Sort Folders on Top of Files (Windows Style)

    - by Eric Z Goodnight
    Even die-hard Mac converts have their issues with Mac OS, and one of those problems is that OS X lists folders mixed in with all other files. Here’s how to fix that in under five minutes with a clever hack. You know you’ve had that issue. You’ve dug through your files looking for that one elusive folder, and because it’s jumbled in with all the other stuff, it’s more or less impossible to find. Have no fear, with no downloads or silly plug-in software, you can finally make Mac OS behave like Windows and Linux and list those folders in the proper order.

    Read the article

  • Free Windows Azure training in Reading, UK on the 25th of May for partners

    - by Eric Nelson
    The 6 weeks of Windows Azure training is full (500 registrations in around a week) but it turns out we have a few places free on the 25th if you can make it to Reading. 14 places when I last checked (today, 5th May). Register now if you can make it.
    Workshop Outline:
    Module 1: Windows Azure Platform overview
    Module 2: Introduction to Windows Azure
    Module 3: Building services using Windows Azure
    Module 4: Windows Azure storage
    Module 5: Building applications using SQL Azure
    Module 6: Introduction to .NET Services
    Module 7: Building applications using the .NET Service Bus

    Read the article

  • Google Keyword Competition rating

    - by Eric
    Google offers a Keyword tool that allows me to see the number of times a particular query has been made in Google. There is a column in the results named "Competition" (actually it's "Concurrence" in French; I'm just translating). It's a rating from 0 to 1, as a percentage. What is that an indicator of? EDIT: Is this something useful I should rely on? I'm not sure how to interpret this data. Should I go for less competitive keywords with a lower number of searches, or not worry about it and go for the highly searched keywords anyway? Is 50% considered high? What about 75%? I have a very niche market that sells expensive offline services, so the very long tail is my goal (I assume). If you haven't already figured it out, I'm very new to SEO =)

    Read the article

  • Hiding recent files in Unity dashboard

    - by Eric
    Ubuntu 13.04 (though I had the same issue in both 12.04 LTS and 12.10), Unity desktop (yes, I like it, shush). Anyway, when clicking on the dashboard there is a tab for 'Files and Folders'. I don't have any files on this computer that aren't porn. In other words, it displays the images there (as it's supposed to), but I can't have it displaying the porn for obvious reasons. I have disabled 'recent activity' and even added the folder it's all in to the 'do not record activity in the following folders' list. I assume that works, but as I don't actually have any other files, it still displays them. I don't want to make it a hidden folder because it's on an external HDD and that causes issues when moving from computer to computer (I have other movies on it as well). TL;DR: Get rid of the 'Files and Folders' tab in the dashboard. Is it possible?

    Read the article

  • Problem with dpkg-preconfigure, how to correct?

    - by Eric Wilson
    I was trying to install TeamViewer, and I followed the instructions here even though they specify 11.10 instead of 12.04 (what I'm running). In particular, I executed:
    $ wget http://www.teamviewer.com/download/teamviewer_linux.deb
    $ sudo dpkg -i teamviewer_linux.deb
    The dpkg command failed, and after this point my packaging system has been broken. The Software Center instructs me to try:
    $ sudo apt-get -f install
    which leads to:
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    Correcting dependencies... Done
    The following packages will be REMOVED:
      teamviewer7:i386
    0 upgraded, 0 newly installed, 1 to remove and 17 not upgraded.
    9 not fully installed or removed.
    Need to get 89.0 kB of archives.
    After this operation, 81.9 MB disk space will be freed.
    Do you want to continue [Y/n]? y
    Get:1 http://us.archive.ubuntu.com/ubuntu/ precise/main dash amd64 0.5.7-2ubuntu2 [89.0 kB]
    Fetched 89.0 kB in 1s (83.9 kB/s)
    E: Sub-process /usr/sbin/dpkg-preconfigure --apt || true returned an error code (100)
    E: Failure running script /usr/sbin/dpkg-preconfigure --apt || true
    At this point I'm stumped.

    Read the article

  • Current State EA: Focus on the Integration!!!

    - by Eric A. Stephens
    A recent project has me at the front end of a large implementation effort covering multiple software components. In addition to the challenges of integrating 15-20 separate and new software components, there is the challenge of integrating the portfolio into an existing environment. Like other clients I've worked with and other environments I've worked in for many years, this is typical. The applications are undocumented and under-patched, leading to a mystery for any architect leading change. We can boil down most architecture development methodologies (ADM) into first understanding the current/baseline state and then envisioning one or more future states. Many pundits emphasize the need to focus on the future/target states. I agree, since enterprise architecture (EA) is about where you are going and not so much where you have been. But to be effective in the future, I contend some focused time needs to be spent on the current state – and specifically on the integration. Integration is always the difficult part of a project (I might put it more coarsely at a cocktail party). While I don't have a case study, my anecdotal experience suggests poorly integrated application portfolios tend to cost more to operate and create entropy when trying to respond to new changes and opportunities. In the aforementioned project, I was able to get one of our EAs assigned to focus on just integration almost immediately. While we're still early in the process, this EA is uncovering all sorts of information that will greatly assist our future-state planning for this solution. This information is driving early decision making that we anticipate will accelerate our efforts moving forward.

    Read the article

  • What is a Histogram, and How Can I Use it to Improve My Photos?

    - by Eric Z Goodnight
    What’s with that weird graph with all the peaks and valleys? You’ve seen it when you open Photoshop or go to edit a camera raw file. But what is that weird thing called a histogram, and what does it mean? The histogram is one of the most important and powerful tools for the digital imagemaker. And with a few moments’ reading, you’ll understand how a few simple rules can make you a much more powerful image editor, as well as help you shoot better photographs in the first place. So what are you waiting for? Read on!

    Read the article

  • How do I make my hosting detect _escaped_fragment_ and fetch the corresponding HTML? [on hold]

    - by Eric
    I have an AJAX site and I'm using hashbangs (#!) in my URLs with the intention of then providing the correct HTML versions when Google's bots replace the #! with _escaped_fragment_. How do I go about routing/proxying/redirecting the URLs containing _escaped_fragment_ to the corresponding HTML pages? I can't find documentation on this part of the process specifically, and my first thought was that I should be using a 301 or 302 redirect, but I was told that wasn't the case, albeit not given any more info.

    Read the article

  • How to backup encrypted home in encrypted form only?

    - by Eric
    I want to backup the encrypted home of a user who might be logged in at backup time. Which directories should I backup if I want to ensure that absolutely no plaintext data can be leaked? Are the following folders always encrypted? /home/user/.Private /home/user/.ecryptfs Just want to make sure that no data leaks, as the backup destination is untrustworthy. Edit: Yes, as Lord of Time has suggested, I'd like to know which folders and/or files I need to backup if I need to store only encrypted content in a way that allows me to recover it later with the right passphrase.

    Read the article

  • Steve Ballmer on Cloud Computing – We’re all in

    - by Eric Nelson
    Steve spoke last week (March 4th 2010) on the possibilities of the Cloud and its importance to Microsoft: of our 40,000 people building software, 70% of the people at Microsoft today are working on the Cloud – 90% in a year's time. In other words: The video is 85 mins of Steve, and there is an easy way of navigating to some soundbites on presspass. I also like the new website that simplifies our story and commitment around the cloud, http://www.microsoft.com/cloud/, which includes an easy mapping between well-known product offerings from Microsoft and the Cloud "equivalents".

    Read the article

  • Unicode license

    - by Eric Grange
    Unicode Terms of Use (http://www.unicode.org/copyright.html) state that any software that uses their data files (or a modification of them) should carry the Unicode license references. It seems to me that most Unicode libraries have functions to check if a character is a digit, a letter, a symbol, etc., and so will contain a modification of the Unicode Data Files (usually in the form of tables). Does that mean the license applies and all applications that use such Unicode libraries should carry the license? I've checked around, and it appears very few pieces of Unicode software carry the license, though arguably most of those that don't carry the license are from companies that are members of the Unicode consortium (do they get a license exemption?). Some (e.g. Mozilla) are only "Liaison Members", and while their software does not carry the license (AFAICT), they do obviously rely on data derived from those data files. Is Mozilla in breach of the license? Should we carry the license in all apps that include any form of advanced Unicode support? (i.e. that are bound to rely on the Unicode data files) Or is there some form of broad exemption? (since very, very few software out there carries the license)
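
    To make the "tables derived from the Data Files" point concrete, here is a tiny illustrative C++ sketch of the kind of lookup such a library might generate from UnicodeData.txt. The ranges shown are a hand-picked subset for illustration, not real generated output; it is that generated table, shipped inside the library, that arguably makes the library a derivative of the Data Files.

        // Illustrative only: the kind of lookup table a Unicode library might generate
        // from UnicodeData.txt. The ranges below are a tiny hand-picked subset (general
        // category Nd, decimal digits), not real generated output.
        struct Range { char32_t lo, hi; };

        static constexpr Range kDigitRanges[] = {
            {U'0', U'9'},        // ASCII digits
            {0x0660, 0x0669},    // Arabic-Indic digits
            {0x0966, 0x096F},    // Devanagari digits
            // ... a full table would enumerate every Nd range in the Unicode Data Files
        };

        bool IsUnicodeDigit(char32_t c) {
            for (const Range& r : kDigitRanges)
                if (c >= r.lo && c <= r.hi) return true;
            return false;
        }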

    Read the article

  • Can I improve my AdWords quality scores with better landing pages?

    - by Eric
    I noticed that I have some keywords in my AdWords that are totally applicable to my site but the quality score of the keyword is 4 or 5. I'd like to get it up higher by creating custom versions of my site's home page (landing page) targeted specifically for people searching on those keywords. So for example, if we pretend my site sells pet food, my current home page has the phrase "dog food." I have a specific AdWords campaign for people searching on cat food (with cat food-specific ads). I'm thinking about changing the URL on those ads to something like http://mysite.com/cat.html, so a different home page comes up with the phrase "cat food." My thinking is that will help Google see that this new landing page is appropriate for the keywords and will raise my quality score for the "cat food" keywords. (Note that none of what I'm doing is shady or misleading; nobody would disagree that all of the keywords and ads I've created are perfect and appropriate for what my site offers.) Question: is what I describe the correct way to raise poor quality scores on keywords, and will it help?

    Read the article

  • A little gem from MPN – FREE online course on Architectural Guidance for Migrating Applications to Windows Azure Platform

    - by Eric Nelson
    I know a lot of technical people who work in partners (ISVs, System Integrators etc). I know that virtually none of them would think of going to the Microsoft Partner Network (MPN) learning portal to find some deep and high quality technical content. Instead they would head to MSDN, Channel 9, msdev.com etc. I am one of those people :-) Hence imagine my surprise when I stumbled upon this little gem: Architectural Guidance for Migrating Applications to Windows Azure Platform (your company, and hence your Live ID, needs to be a member of MPN – which is free to join). This is first-class stuff – and represents about 4 hours, which is really 8 if you stop and ponder :)
    Course Structure
    The course is divided into eight modules. Each module explores a different factor that needs to be considered as part of the migration process.
    Module 1: Introduction: This section provides an introduction to the training course, highlighting the value of the Windows Azure Platform for developers.
    Module 2: Dynamic Environment: This section goes into detail about the dynamic environment of the Windows Azure Platform. This session explains the difference between current development states and the Windows Azure Platform environment, details the functions of roles, and highlights development considerations to be aware of when working with the Windows Azure Platform.
    Module 3: Local State: This session details the local state of the Windows Azure Platform. It covers the different types of storage within the Windows Azure Platform (Blobs, Tables, Queues, and SQL Azure) and provides technical guidance on local storage usage, how to write to blobs, how to effectively use table storage, and other authorization methods.
    Module 4: Latency and Timeouts: This session goes into detail explaining the considerations surrounding latency and timeouts, and how to assess an IT portfolio.
    Module 5: Transactions and Bandwidth: This session details the performance metrics surrounding transactions and bandwidth in the Windows Azure Platform environment. It covers the transaction and bandwidth costs involved with the Windows Azure Platform and mitigation techniques that can be used to properly manage those costs.
    Module 6: Authentication and Authorization: This session details authentication and authorization protocols within the Windows Azure Platform, including web methods of authorization, web identification, Access Control benefits, and a walkthrough of Windows Identity Foundation.
    Module 7: Data Sensitivity: This session details data considerations that users and developers will experience when placing data into the cloud. This section of the training highlights these concerns and details the strategies that developers can take to increase the security of their data in the cloud.
    Module 8: Summary: Provides an overall review of the course.

    Read the article

  • Functions that only call other functions. Is this a good practice?

    - by Eric C.
    I'm currently working on a set of reports that have many different sections (all requiring different formatting), and I'm trying to figure out the best way to structure my code. Similar reports we've done in the past end up having very large (200+ line) functions that do all of the data manipulation and formatting for the report, such that the workflow looks something like this:
    DataTable reportTable = new DataTable();
    void RunReport() {
        reportTable = DataClass.getReportData();
        largeReportProcessingFunction();
        outputReportToUser();
    }
    I would like to be able to break these large functions up into smaller chunks, but I'm afraid that I'll just end up having dozens of non-reusable functions, and a similar "do everything here" function whose only job is to call all these smaller functions, like so:
    void largeReportProcessingFunction() {
        processSection1HeaderData();
        calculateSection1HeaderAverages();
        formatSection1HeaderDisplay();
        processSection1SummaryTableData();
        calculateSection1SummaryTableTotalRow();
        formatSection1SummaryTableDisplay();
        processSection1FooterData();
        getSection1FooterSummaryTotals();
        formatSection1FooterDisplay();
        processSection2HeaderData();
        calculateSection2HeaderAverages();
        formatSection2HeaderDisplay();
        ...
    }
    Or, if we go one step further:
    void largeReportProcessingFunction() {
        callAllSection1Functions();
        callAllSection2Functions();
        callAllSection3Functions();
        ...
    }
    Is this really a better solution? From an organizational point of view I suppose it is (i.e. everything is much more organized than it might otherwise be), but as far as code readability goes I'm not sure (potentially large chains of functions that only call other functions). Thoughts?

    Read the article

  • How should I track approval workflow when users at every security level can create a request?

    - by Eric Belair
    I am writing a new application that allows users to enter requests. Once a request is entered, it must follow an approval workflow to be finally approved by a user at the highest security level. So, let's say a user at Security Level 1 enters a request. This request must be approved by his superior – a user at Security Level 2. Once the Security Level 2 user approves it, it must be approved by a user at Security Level 3. Once the Security Level 3 user approves it, it is considered fully approved. However, users at any of the three Security Levels can enter requests. So, if a Security Level 3 user enters a request, it is automatically considered "fully approved". And, if a Security Level 2 user enters a request, it must only be approved by a Security Level 3 user. I'm currently storing each approval status in a database log table, like so:
    STATUS_ID (PK)  REQUEST_ID  STATUS           STATUS_DATE
    --------------  ----------  ---------------  -----------------------
    1               1           USER_SUBMIT      2012-09-01 00:00:00.000
    2               1           APPROVED_LEVEL2  2012-09-01 01:00:00.000
    3               1           APPROVED_LEVEL3  2012-09-01 02:00:00.000
    4               2           USER_SUBMIT      2012-09-01 02:30:00.000
    5               2           APPROVED_LEVEL2  2012-09-01 02:45:00.000
    My question is, which is a better design:
    1. Record all three statuses for every request, or
    2. Record only the statuses needed according to the Security Level of the user submitting the request.
    In Case 2, the data might look like this for two requests – one submitted by a Security Level 2 user and another submitted by a Security Level 3 user:
    STATUS_ID (PK)  REQUEST_ID  STATUS           STATUS_DATE
    --------------  ----------  ---------------  -----------------------
    1               3           APPROVED_LEVEL2  2012-09-01 01:00:00.000
    2               3           APPROVED_LEVEL3  2012-09-01 02:00:00.000
    3               4           APPROVED_LEVEL3  2012-09-01 02:00:00.000
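
    As a side note, the rule the excerpt describes (whoever submits, the remaining approvals are exactly the levels above the submitter's) is small enough to sketch in code; the following C++ helper is purely illustrative and assumes the three fixed security levels mentioned above.

        #include <vector>

        // Hypothetical helper for the rule described above: whoever submits, the
        // remaining approvals are exactly the security levels above the submitter's.
        std::vector<int> RequiredApprovalLevels(int submitterLevel) {
            std::vector<int> required;
            for (int level = submitterLevel + 1; level <= 3; ++level)
                required.push_back(level);   // Level 1 -> {2, 3}; Level 3 -> {} (auto-approved)
            return required;
        }

    Either storage design can feed a rule like this; the difference is only whether the skipped levels are also written to the log.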

    Read the article

  • FREE three days of online SharePoint 2010 development training for UK software houses Feb 9th to 11th

    - by Eric Nelson
    I have been working to get a SharePoint development course delivered online in February and March – online means lots of opportunities to ask questions. The first dates are now in place. The training is being delivered as a benefit for companies signed up to Microsoft Platform Ready. It is intended for UK-based companies who develop software products*
    Agenda:
    Day 1 (Live Meeting, 3 hours, 1:30 - 4:30)
    • Getting Started with SharePoint: Understand why and how to start developing for SharePoint 2010
    • SharePoint 2010 Developer Roadmap: Explore the new capabilities and features
    • UI Enhancements: How to take advantage of the many UI enhancements, including the fluent UI ribbon and extensible dialog system
    Day 2 (Live Meeting, 3 hours, 1:30 - 4:30)
    • Visual Studio 2010 Tools for SharePoint 2010: Overview of the project and item templates and a walkthrough of the designers
    • Sandboxed Solutions: The new deployment model can help mitigate the risk of deploying custom code
    • LINQ to SharePoint: SharePoint now fully supports LINQ for querying lists
    Day 3 (Live Meeting, 3 hours, 1:30 - 4:30)
    • Client Object Model: The Client OM can be accessed via web services, via a client (JavaScript) API, and via REST
    • Accessing External Data: Business Connectivity Services (BCS) enables integration with back-end systems
    • Workflow: A powerful mechanism to create functionality using Windows Workflow Foundation
    Register for FREE (and tell your colleagues – we have a pretty decent capacity). To take advantage of this you need to:
    Sign your company up to Microsoft Platform Ready and record your SharePoint interest against one of your company's products
    Read about Microsoft Platform Ready
    Navigate to the "Get Technical Benefits" tab for SharePoint and click on Register Today
    You will then ultimately get an email with details of the Live Meeting to join on the 9th. But you should also favourite the team blog for any last-minute details.
    * Such companies are often referred to as Independent Software Vendors. My team is focused on companies that create products used by many other companies or individuals. That could be a packaged product you can buy "off the shelf" or a web site offering a service – the definition is actually pretty wide these days :-) What it does not include is a company building software which will only be used by its own people.

    Read the article
