Search Results

Search found 2575 results for 'canonical cover'.


  • ApiChange Corporate Edition

    - by Alois Kraus
    In my initial announcement I could only cover a small subset of what ApiChange can do for you. Let's look at how ApiChange can help you fix bugs caused by wrong usage of an API in a fraction of the time it would normally take. It happens that software is tested and some bugs show up. One bug could be: we get way too many log messages during our test run. Now you have the task of finding the most frequent messages and eliminating those Log calls from the source code. But what about the myriad other log calls? How can we check that the distribution of log calls is nearly equal across all developers? And if not, how can we contact the developers to check their code? ApiChange can help you connect these loose ends. It combines several information silos into one cohesive view.

    The public version does currently "only" parse the binaries and pdbs to give you a set of result columns for a -whousesmethod query. If you happen to have Rational ClearCase (a source control system) in your development shop and an Active Directory in place, then ApiChange will try to determine, from the source file recovered from the pdb, the last check-in user, who should be present in your Active Directory. From there it is only a small hop to an LDAP query against your AD domain or the GC (Global Catalog) to get from the user name his full name, email, phone number, department, and so on. ApiChange will append this additional data to all of your query results that contain source files if you add the -fileinfo option. As I said, this is currently not enabled by default, since the AD domain needs to be configured; the settings are currently only some hard-coded values in the SiteConstants.cs source file of ApiChange.Api.dll.

    Once you have this data you can generate metrics based on source file, developer, assembly, and so on, and add additional data by drag and drop directly into the pivot tables inside Excel. This allows you, for example, to generate a report that lists the source files with the most log calls in descending order, along with the developer name and email, in the pivot table. Armed with this knowledge you can take meaningful measures, e.g. ask the developer whether the huge number of log calls in this source file can be optimized. I am aware that this is a very specific scenario, but it is a huge time saver when you are able to fill the missing gaps of information. ApiChange does this in an extensible way:

        namespace ApiChange.ExternalData
        {
            public interface IFileInformationProvider
            {
                UserInfo GetInformationFromFile(string fileName);
            }
        }

    It defines an interface where you can implement your custom information provider to close the gap between the source control system and the real person I have to email to ask whether his code needs a closer inspection.
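    To make the extension point concrete, here is a minimal sketch of a custom provider. Only IFileInformationProvider and UserInfo come from ApiChange.Api.dll; the ClearCase and directory helpers are hypothetical placeholders for whatever lookups your shop uses:

        using ApiChange.ExternalData;

        namespace MyCompany.ApiChangeExtensions
        {
            public class ClearCaseFileInformationProvider : IFileInformationProvider
            {
                public UserInfo GetInformationFromFile(string fileName)
                {
                    // Ask the source control system who last checked the file in.
                    string account = GetLastCheckInUser(fileName);

                    // Resolve that account to a real person via your directory.
                    return LookUpInDirectory(account);
                }

                private string GetLastCheckInUser(string fileName)
                {
                    // Hypothetical: shell out to cleartool or call your SCM API here.
                    throw new System.NotImplementedException();
                }

                private UserInfo LookUpInDirectory(string account)
                {
                    // Hypothetical: LDAP query to AD/GC for name, email, phone, department.
                    throw new System.NotImplementedException();
                }
            }
        }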

    Read the article

  • Cooking with Wessty: WordPress and HTML 5

    - by David Wesst
    WordPress is easily one of, if not the, most popular blogging platforms on the web. With the release of WordPress 3.x, the potential for what you can do with this open source software is limitless. This technique intends to show you how to get your WordPress wielding the power of the future web, that being HTML 5.

    Ingredients

    - WordPress 3.x
    - Your favourite HTML 5 compliant browser (e.g. Internet Explorer 9)

    Directions

    1. Set up WordPress on your server or host. Note: you can set up a WordPress.com account, but you will require a paid add-on to really take advantage of this technique.
    2. Log in to the administration section of your blog using your web browser.
    3. On the left side of the page, click the Appearance heading, then click Themes.
    4. At the top of the page, select the Install Themes tab.
    5. In the search box, type "toolbox" and click Search. In the search results, you should see a theme called Toolbox.
    6. Click the Install link in the Toolbox item. A dialog window should appear with a sample picture of what the theme looks like. Click the Install Now button in the bottom right corner.

    Et voila! Once the installation is done, you are ready to bring your blog into the future of the web. Try previewing your blog in HTML 5 by clicking the preview link. Now, you are probably thinking "Man... HTML 5 looks like junk". To that, I respond: "HTML was never why your site looked good in the first place. It was the CSS." Now you have an un-stylized theme that uses HTML 5 elements throughout your WordPress site. If you want to learn how to apply CSS to your WordPress blog, you should check out the WordPress codex, which covers pretty much everything there is to cover about WordPress development.

    Now, remember how we noted earlier that your free WordPress.com account wouldn't take advantage of this technique? That is because, as of the time of this writing, you need to pay a fee to use custom CSS. Remember, this only gives you the foundation for creating your own HTML 5 WordPress site. There are some HTML 5 themes out there that already look good; they were built using this same foundation with some CSS 3 added to really spice them up. Looking forward to seeing more HTML 5 WordPress sites! Enjoy developing the future of the web.

    Resources: Toolbox Theme | JustCSS Theme | WordPress Installation Tutorial | WordPress Theme Development Tutorial

    This post also appears at http://david.wes.st

    Read the article

  • Three Master Data Management Deployment Tips

    - by david.butler(at)oracle.com
    MDM is all about data quality and data governance. We now know that improved data quality raises all operational and analytical boats. But it's not just about deploying data quality tools. It's about deploying data quality tools within and across the IT landscape - from a thousand points of data entry to a single version of the truth. Here are three tips for deploying MDM across your applications and enterprise.

    #1: Identify a tactical, high-value business problem where MDM can materially help.

    - Support a customer acquisition and retention program with a 'customer' master data solution.
    - Accelerate new products and services to market with a 'product' master data solution.
    - Reduce supplier exceptions or support spend control initiatives with a 'supplier' master data solution.
    - Support new store (branch, campus, restaurant, hospital, office, well head) location analysis with a 'site' master data solution.
    - Fix long-standing Chart of Accounts and Cost Center problems with a 'financial' master data solution.
    - Support M&A activity, application upgrades, an SOA initiative, a cloud computing program, or a new business intelligence deployment by implementing a mix of master data solutions.

    #2: Incrementally expand to a full information architecture. Quite often, the measurable return on investment from tactical MDM initiatives will fund future deployments. Over time, the MDM solution expands into its full architecture to cover the entire IT landscape. Operations and analytics are united, IT flexibility is restored, and sustainable competitive advantage is achieved.

    #3: Bring business into every MDM deployment. To be successful, MDM must work hand in hand with data governance. In fact, Oracle MDM incorporates data governance tools for business users. IT can ensure data quality, but only after the business side has defined what quality means. The business establishes the rules for governing the master data, and then IT enforces the rules via the MDM applications. Without this business/IT collaboration, MDM initiatives seldom achieve their full potential.

    It is not very often that a technology comes along that can measurably assist organizations across a wide variety of top IT initiatives. Reducing costs, increasing flexibility, getting more out of existing assets, and aligning business and IT are not easy tasks for any CIO. But with MDM, success is achievable. IT can regain its place as a center for innovation. For more information on this topic, take a look at my article Master Data Management Deployment Tips in the Opinion Section of Oracle's Profit Online magazine.

    Read the article

  • What are some good realistic programming related movies (docu-dramas, documentaries, accurate fiction, etc)?

    - by EpsilonVector
    A while ago I asked this question and the result was this. Following the response I got in the meta question, I'm re-asking the question with new guidelines to focus it on the direction I wanted it to have originally.

    The guidelines are as follows: by "programming related" I mean movies from which we can learn about things like the development process, the history of software/computers, or programming culture. In other words, they must be grounded in the industry. No tangential stuff. Good entries answer as many of the following criteria as possible:

    - They teach you about the history of the industry, the development process, or important industry-related topics (software patents, for example).
    - They are based on real-life events, companies, people, and practices, and those are the main focus of the movie.
    - After watching them, you feel like you understand or know something about the programmers' world that you didn't before (or you can see how someone could have such a response).
    - You can point to one and say "this faithfully represents the industry/programmer culture at some point in time". This might be something you would show laymen to explain to them what "your people" are like and what it is that you do.

    Examples of good entries include: Pirates of Silicon Valley - the story of how Microsoft and Apple started the industry; Revolution OS - the story of Linux's rise to fame, and a pretty good cover of the Free Software/Open Source world; Aardvark'd: 12 Weeks with Geeks - the development process.

    Examples of bad entries:

    - Movies whose sole relevance is that they can be appreciated by programmers. The point of this question is not to be "what are some good movies" with "for a programmer" appended to it. Just because the writers got a few computer jokes right doesn't in itself make a movie about the industry.
    - Movies where there's a computer-related element, but which are not about the industry. For example, 24 (the TV series): it's a product of the information age but it isn't actually about it. Another example is movies with a really cool programmer character that are overall about something completely different. Likewise, The Big Bang Theory is not about physics, even though it has a cool physicist as a character.
    - Science fiction, even if it draws ideas from computers. For example, the Matrix trilogy.
    - Movies you can't point to and say: this is a faithful representation of our world (at some point in time). If you can't do that, then it doesn't mirror the industry.

    Keep it to one entry per answer so that the voting can sort the entries out.

    Read the article

  • How do I get dual monitors to work properly in Ubuntu 11.10 on a Dell Latitude D630?

    - by wes cook
    I have spent a lot of time trying to get dual monitors to work on Ubuntu 11.10 on my Dell Latitude D630 (nVidia NVS 135m video card).

    - For starters, the System Displays settings app always showed only one unknown monitor, even though I had the external Acer monitor connected.
    - So I downloaded and installed the nVidia drivers. According to what I read, I would need to use only the nVidia driver app (nVidia X Server Settings), so that's what I've done. (System Displays settings continued to show only a single monitor anyway.)
    - The nVidia settings app only showed one monitor until I changed the BIOS setting to use the onboard video for the external monitor (not the dock video, which it was set to, even though I don't have a docking station).
    - The nVidia settings app now recognized both monitors, so I set up the X Server display config with a separate X screen for each monitor. My laptop screen shows up as AUO 1440x900 and my external monitor as Acer E211H 1920x1080.
    - Everything seemed like it would work, but the external monitor was just a complete white screen. It was non-functional; sometimes it would show the background image, but still nothing would show up over there.
    - So, I checked the Enable Xinerama box.
    - Now, after logging out and back in, the wallpaper extends to both screens, but I get no taskbar at the bottom or top and no system menus, and I have to press the power button to restart or log off.
    - After experimenting with all the shells, the only one that shows the menus and taskbars when I log in is Gnome Classic.
    - These are pretty much the same symptoms as found here: How do I fix 11.10 GUI?
    - So, I resign myself to the older shell. Everything works fine until ... I unplug the external monitor ... this is a laptop after all.
    - Anyway, after doing some work on the road, I plug back in and I still see both screens and everything is functional except ... now the laptop screen (with the taskbar and menu bar) has 4 black bars at the top that windows cannot cover. The top bar is the menu bar (with Applications, Places, the date and time, and the system menu on the right). But the next 3 bars (the same height as the top menu bar) are empty and just reduce the maximum size of windows on that screen. See screenshot here: http://i39.tinypic.com/35d2kh1.png

    So ...

    1. How do I get rid of those extra 3 black bars? They're taking valuable screen space.
    2. (less critical) How do I successfully use both screens in the Ubuntu or Ubuntu 2D shell?

    Read the article

  • EPM Planning (Hyperion) V11.1.2 Implementation Hands-On Boot-camp

    - by Mike.Hallett(at)Oracle-BI&EPM
    5-Day Training for Partners: 29th October - 2nd November 2012, London (UK): REGISTER Here

    This 5-day workshop, free for partners, is designed to provide implementation instruction on Oracle Hyperion EPM Planning. The boot-camp is intended for prospective implementers of the Planning and Budgeting functionality of Oracle EPM, or for implementers who are already familiar with the basics of EPM Planning and looking to strengthen their knowledge of the product.

    The class begins with an overview of Essbase, the foundation of Hyperion Planning. It provides a general overview of Planning and Planning terms, the architecture of all the Planning components, and how they are commonly used. The course goes over all the steps to create an application from scratch. This involves some preparation work outside of Planning and leads to developing the application in both the Planning Windows and Web clients. Participants will modify existing dimensions and build out the hierarchies using the Web client.

    Topics Covered: The boot-camp shows developers how to build out dimensions using Classic Planning and EPMA. It covers the mechanics and strategies for automating the build process, such as interface tables. It reviews data loads using Load Rules to load the Planning database. The course focuses on tasks that end users must perform during the planning cycle. It walks students through creating and modifying forms, working with forms to enter data, adding annotations, and the rest of the form features, such as running business rules and managing task lists. It covers how to use the forms in the Smart View client and finishes the end-user perspective by going through Workflow Management and the process of submitting a plan for review. The final section of the course covers security and other administration topics such as automation and deployment.

    Prerequisites: Ideal participants are Oracle partners (SIs and resellers) with a background in business information systems and a clientele of customers with ongoing or prospective EPM initiatives. Alternatively, partners with the background described above and an interest in evolving their practice to a similar profile are suitable participants. Further online OPN guided learning path information and webinars are available at: Oracle Hyperion Planning 11 Essentials. Please note that attendees are required to bring a laptop. See the laptop requirements and detailed agenda here.

    REGISTER Here: acceptance is subject to availability and your place will be confirmed within two weeks (for help, see the Partner Registration Guide).

    Training Location: Oracle Corporation UK Ltd, Columbus Room, Customer Visit Center, 1 South Place, London EC2M 2RB

    Training Dates: 29th October - 2nd November, 9:30 am - 5:00 pm BST

    For more information please contact [email protected].

    Read the article

  • Transform Your Application Integration with Best Practices from Oracle Customers

    - by Lionel Dubreuil
    You want to transform your application integration into an environment based on a service-oriented architecture (SOA). You also want to utilize business process management (BPM) to improve efficiency, deliver business agility, lower total cost of ownership, and increase business visibility. And you want to hear directly from like-minded professionals who have made those types of transformations. Easy enough. Attend this Webcast series to learn from customers who have successfully integrated with Oracle SOA and BPM solutions. Join us for this series and discover how to:

    - Use a single unified platform for all types of processes
    - Increase real-time process visibility
    - Improve efficiency of existing IT investments
    - Lower up-front costs and achieve faster time to market
    - Gain greater benefit from SOA with the addition of BPM

    Here's the list of upcoming webcasts:

    - "Migrating to SOA at Choice Hotels" on Thurs., June 21, 2012 - 10 a.m. PT / 1 p.m. ET. Hear how Choice Hotels successfully made the transition from a complex legacy environment into a SOA-based shared services infrastructure that accelerated time to market as the company implemented its event-driven Google API project.
    - "San Joaquin County - Optimizing Justice and Public Safety with Oracle BPM and Oracle SOA" on Thurs., July 26, 2012 - 10 a.m. PT / 1 p.m. ET. Learn how San Joaquin County moved to a service-oriented architecture foundation and business process management platform to gain efficiency and greater visibility into mission-critical information for public safety.
    - "Streamlining Order to Cash with SOA at Eaton" on Thurs., August 23, 2012 - 10 a.m. PT / 1 p.m. ET. Discover how Eaton transitioned from a legacy TIBCO infrastructure. Learn about the company's reference architecture for a SOA-based Oracle Fusion Distributed Order Orchestration (DOO).
    - "Fast BPM Implementation with Fusion: Production in Five Months" on Thurs., September 13, 2012 - 10 a.m. PT / 1 p.m. ET. Learn how Nets Denmark A/S implemented Oracle Unified Business Process Management Suite in just five months. The Webcast will cover the implementation from start to production, including integration with legacy systems.
    - "SOA Implementation at Farmers Insurance" on Thurs., October 18, 2012 - 10 a.m. PT / 1 p.m. ET. Learn how Farmers Insurance Group lowered application infrastructure costs, reduced time to market, and introduced flexibility by transforming to a SOA-based infrastructure with SOA governance.

    Register today!

    Read the article

  • Upgrade from ASP.NET MVC 1 to MVC 2 - how to, and issues with JsonRequestBehavior

    - by Renso
    Goal: Upgrade your MVC 1 app to MVC 2.

    Issues: You may get errors about your Json data being returned via a GET request violating security principles - we also address this here. This post is not intended to delve into why the Json GET request is or may be an issue, just how to resolve it as part of upgrading from MVC 1 to 2.

    Solution: First remove all references from your projects to the MVC 1 dll and replace them with the MVC 2 dll. Now update the web.config file in your web app root folder by simply changing references to assembly="System.Web.Mvc" from Version=1.0.0.0 to Version=2.0.0.0. There are a couple of references in your config file; here are probably most of the ones you may have:

        <compilation debug="true" defaultLanguage="c#">
            <assemblies>
                <add assembly="System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
            </assemblies>
        </compilation>

        <pages masterPageFile="~/Views/Masters/CRMTemplate.master"
               pageParserFilterType="System.Web.Mvc.ViewTypeParserFilter, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"
               pageBaseType="System.Web.Mvc.ViewPage, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"
               userControlBaseType="System.Web.Mvc.ViewUserControl, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"
               validateRequest="False">
            <controls>
                <add assembly="System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" namespace="System.Web.Mvc" tagPrefix="mvc" />

    Secondly, if you return Json objects from an ajax call via the GET method, you have several options to fix this, depending on your situation:

    1. The simplest - as in my case, since I did this for an internal web app - you may simply do:

        return Json(myObject, JsonRequestBehavior.AllowGet);

    2. If you have a controller base in your MVC app, you could wrap the Json method with:

        public new JsonResult Json(object data)
        {
            return Json(data, "application/json", JsonRequestBehavior.AllowGet);
        }

    3. The most work would be to decorate your Actions with:

        [AcceptVerbs(HttpVerbs.Get)]

    4. Another fix that is also a lot of work, since it needs to be applied to every ajax call returning Json:

        msg = $.ajax({
            url: $('#ajaxGetSampleUrl').val(),
            dataType: 'json',
            type: 'POST',
            async: false,
            data: { name: theClass },
            success: function(data, result) {
                if (!result) alert('Failure to retrieve the Sample Data.');
            }
        }).responseText;

    This should cover all the issues you may run into when upgrading. Let me know if you run into any other ones.
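    If you go with option 2, a minimal sketch of how the pieces fit together might look like the following; the BaseController and SampleController names are just assumptions for illustration, not types from the framework:

        using System.Web.Mvc;

        namespace MyApp.Controllers
        {
            // Shared base class that restores the MVC 1 behavior of
            // serving Json over GET for every derived controller.
            public abstract class BaseController : Controller
            {
                protected new JsonResult Json(object data)
                {
                    return Json(data, "application/json", JsonRequestBehavior.AllowGet);
                }
            }

            public class SampleController : BaseController
            {
                // GET requests to /Sample/GetSample now return Json without
                // tripping the MVC 2 JsonRequestBehavior check.
                public JsonResult GetSample(string name)
                {
                    return Json(new { name });
                }
            }
        }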

    Read the article

  • Pro SharePoint 2010 Business Intelligence Solutions

    - by Sahil Malik
    Oh yeah baby, it's out finally! This book is what I wanted to write for so long now, but never really got a chance to. For SharePoint 2007, I authored the SharePoint section of "Smart BI Solutions with SQL Server 2008" for MS Press. But I never really got the time to author a full book that this topic deserved. With SharePoint 2010, we finally have a full book on this topic.

    So first things first, I didn't actually write it. My role was limited to the overall concept, the outline, the layout, completion of it, code samples, identifying what we needed in here, vouching for technical accuracy, identifying authors, etc. The real work was done by Srini (5 chapters) and Steve (1 chapter). So credit given where it is due. But, with that said, this is a pretty good book. It has always been a challenge to find the superman who knows both data warehousing concepts and SharePoint concepts. The data warehousing concepts include the basics you need to know to work in the BI area, such as cubes, MDX queries, etc. Chapter 1 covers that - and if you're a hardcore DBA, feel free to skip Chapter 1. Beyond that, we take every single SharePoint 2010 BI topic and slice and dice it in detail. The topics we deal with are:

    - Visio Services
    - Reporting Services
    - Business Connectivity Services
    - Excel Services
    - PerformancePoint Services

    In covering each of these topics, we ensured that a general layout was followed for each, to ensure completeness of content. We make sure we cover setup-related issues and advice, point-and-click usage, code usage (i.e. extensibility using Visual Studio), and a walkthrough of the administration side of things, including PowerShell. (Yes, I insisted on that being there in every chapter.)

    Writing a book is always a lot of work, so we hope you find it useful. And it should go very well with the other book I just reviewed, which is Microsoft ADO.NET 4, Step by Step.

    Read the article

  • Avoiding Duplicate Content Penalties on a Corporate/Franchise website

    - by heath
    My question is really an extension of a previous question that was ported from Stack Overflow and closed, so I cannot edit it. The basic gist is that a regional franchise company has decided to force all independent stores into one website look; they currently all have their own domains and completely different websites. After reading the helpful answers and looking over some links provided, I think my solution is to put a 301 on each franchise store site (acme-store1.com, acme-store2.com, etc.) back to the main corporate site (acme.com). All of the company history, product info, etc. (about 90% of the entire site) applies to all stores. However, each store should have some exclusive content such as staff, location pictures, exclusive events and promotions, etc. I originally thought that I would simply do something like acme.com/store1/staff, acme.com/store2/staff, etc. for the store-exclusive content, and then acme.com/our-company, for example, would cover all stores. However, I now see two issues that I don't know how to solve.

    1. They want to see site stats based on which store site the visitor came from. If a user comes from acme-store1.com, is redirected to acme.com, and hits several pages, don't I need to somehow keep that original site in the new URL to track each page in that user's session and show they originally came from acme-store1.com?

    2. Each store is still independently owned and is essentially still in competition with the other stores, albeit in less competition than they are with other brands. This is important because each store would like THEIR contact info, links to their social media pages, their mailing list sign-up, and customer requests on EVERY page. So if a user originally goes to acme-store1.com and is redirected to acme.com, it should still look to the user like it's all about store 1, even though 90% of the content will be exactly the same as it is on the store 2, store 3, and corporate sites. For example, acme.com/our-company would have the same company history and the same header/footer/navigation, BUT depending on the original site the user came from, it would display the contact info and links for THAT store. If someone came directly to the corporate site, it would display the corporate contact and links (they have their own as well).

    I was considering having all redirects point to store1.acme.com, store2.acme.com, etc. (or acme.com/store1), so that I can dynamically add the contact info and appropriate links based on the subdomain or subfolder. But then I have to worry about duplicate content penalties because, again, about 90% of the text in these "subdomains" is all the same. For reference, this is a PHP5 site. I've already written a compact framework utilizing templates and mod_rewrite that I've used for other sites. Is this an easy fix that I'm just not grasping? Any suggestions?

    Read the article

  • Do we need to adopt a black-box asset our project is inheriting from its predecessor?

    - by Tom Anderson
    Our client has an eCommerce site which was developed by an in-house team, and is now showing its age. I work for a firm brought in as external contractors to build a replacement. Part of the current site is a Flash viewer applet which displays media about the product - zoom-able images, 360-degree views, movies, and so on. We need to show the same media the current site does, so we are simply reusing the viewer. The viewer is embedded on a page in the usual way, and told what media to show by means of an XML file it loads from our server, which is pretty simple for us to generate. We've got this working; it was pretty straightforward.

    But what else do we need to do? The thing is, as far as we're concerned, the viewer is a binary blob which is served from the client's content-distribution network. We embed it, feed it some XML, and it does its job, but we have no power over its internals. It's completely opaque to us - a black box. We can use it to do what it does, but we can't change it, so if we ever need to do something different, we're stuffed.

    We're building this site for the client, and when we're done, we'll hand it over for them to maintain. We won't be doing the maintenance ourselves. There's a small team within the client who are working as part of our team, and who will be the ones doing the maintenance. That team only includes one person from the team that built the old site, and it's not someone who knows the image viewer. The people who do know the image viewer are not slated to join our team when our system replaces theirs - they'll be moved to other projects. The documentation on the viewer is extremely thin, and as far as I know doesn't cover the internals at all.

    My worry is that if someone doesn't take some positive action, all knowledge of the internal workings of the viewer - even down to where the source code for it is - will be lost. It's possible it already has been. Is this something to worry about? If so, whose job is it to worry about it? What should they do about it once they've got worried?

    Read the article

  • Cheating on Technical Debt

    - by Tony Davis
    One bad practice guaranteed to cause dismay amongst your colleagues is passing on technical debt without full disclosure. There could only be two reasons for this. Either the developer or DBA didn't know the difference between good and bad practices, or they concealed the debt. Neither reflects well on their professional competence.

    Technical debt, or code debt, is a convenient term to cover all the compromises between the ideal solution and the actual solution, reflecting the reality of the pressures of commercial coding. The one time you're guaranteed to hear one developer, or DBA, pass judgment on another is when he or she inherits their project, and is surprised by the amount of technical debt left lying around in the form of inelegant architecture, incomplete tests, confusing interface design, no documentation, and so on.

    It is often expedient for a project manager to ignore the build-up of technical debt, the cut corners, not-quite-finished features, and rushed designs that mean progress is satisfyingly rapid in the short term. It's far less satisfying for the poor person who inherits the code. Nothing sends a colder chill down the spine than the dawning realization that you've inherited a system crippled with performance and functional issues that will take months of pain to fix before you can even begin to make progress on any of the planned new features. It's often hard to justify this 'debt paying' time to the project owners and managers. It just looks as if you are making no progress, in marked contrast to your predecessor.

    There can be many good reasons for allowing technical debt to build up, at least in the short term. Often, rapid prototyping is essential, there is a temporary shortfall in test resources, or the domain knowledge is incomplete. It may be necessary to hit a specific deadline with a prototype, or proof-of-concept, to explore a possible market opportunity, with planned iterations and refactoring to follow later. However, it is a crime for a developer to build up technical debt without making this clear to the project participants. He or she needs to record it explicitly. A design compromise made in order to hit a deadline, be it an outright hack or a decision made without time for rigorous investigation and testing, needs to be documented with the same rigor with which one tracks a bug.

    What's the best way to do this? Ideally, we'd have some kind of objective assessment of the level of technical debt in a software project, although that smacks of science fiction even as I write it. I'd be interested to hear of any methods you've used, but I'm sure most teams have to rely simply on the integrity of their colleagues and the clear perceptions of the project manager...

    Cheers, Tony.

    Read the article

  • Configuring the iPlanet as web tier for Oracle WebCenter Content (UCM)

    - by Adao Junior
    If you are looking to configure iPlanet as the web server/proxy for Oracle WebCenter Content, you probably won't find specific documentation for it, or will find only some old, complex notes related to the old 10gR3. This post will help you out with a few simple steps. The test scenario assumes that in production you will deploy to a clustered environment.

    First you need the software; for this scenario you will need:

    - Oracle iPlanet Web Server 7.0.15+ (installed)
    - Oracle WebCenter Content 11gR1 PS5 (installed)
    - Oracle WebLogic Web Server Plugins 11g (1.1)
    - Supported JDK (using Oracle Java JDK 7u4 for the test)
    - Certified client OS
    - Certified server OS (using Oracle Solaris 11 for the test)
    - Certified database (using Oracle Database 11.2.0.3 for the test)

    Then the configuration:

    - Download the latest plugin: http://www.oracle.com/technetwork/middleware/ias/downloads/wls-plugins-096117.html
    - Extract WLSPlugin11g-iPlanet7.0 into some folder, like <iPlanet_Home>/plugins/wls11
    - Include the plugin reference in magnus.conf. If Unix (Solaris or Linux), include the line:

        Init fn="load-modules" shlib="/apps/oracle/WebServer7/plugins/wls11/lib/mod_wl.so"

      If Windows, include the line:

        Init fn="load-modules" shlib="D:\\oracle\\WebServer7\\plugins\\wls11\\lib\\mod_wl.dll"

    - Include the proxy reference in the obj.conf of each instance:

        <Object name="weblogic" ppath="*/cs/*">
            Service fn="wl-proxy" WebLogicCluster="wcc-node1:16201,wcc-node2:16202,wcc-node3:16203"
        </Object>

        <Object name="weblogic" ppath="*/_dav/*">
            Service fn="wl-proxy" WebLogicCluster="wcc-node1:16201,wcc-node2:16202,wcc-node3:16203"
        </Object>

        <Object name="weblogic" ppath="*/_ocsh/*">
            Service fn="wl-proxy" WebLogicCluster="wcc-node1:16201,wcc-node2:16202,wcc-node3:16203"
        </Object>

        <Object name="weblogic" ppath="*/adfAuthentication/*">
            Service fn="wl-proxy" WebLogicCluster="wcc-node1:16201,wcc-node2:16202,wcc-node3:16203"
        </Object>

    If you are using a single-node setup, change the Service fn=... line to something like:

        Service fn="wl-proxy" WebLogicHost=<wcc-server> WebLogicPort=16200

    With these configurations you should have the WebCenter Content UI working with iPlanet; test it at http://<web-server>/cs/.

    With the UI working, the last step is to configure WebDAV:

    - Go to the iPlanet Admin Console (usually https://<web-server>:8989)
    - Go to Configurations >> [instance] >> Virtual Servers >> [Virtual Server] >> WebDAV
    - Click New
    - Populate the URI with /cs/idcplg/webdav
    - Select "Anyone (No Authentication)"; WebCenter Content will take care of security.

    This will allow you to use the WebDAV feature and the Desktop Integration Suite, including double-byte characters. Other iPlanet tuning could be done; I may cover that in a future post related to iPlanet.

    Cross-posted on the ContentrA.com Blog

    Read the article

  • Maximum Availability with Oracle GoldenGate

    - by Irem Radzik
    Oracle Database offers a variety of built-in and optional products for maximum availability, and it is well known for its robust high availability and disaster recovery solutions. With its heterogeneous, real-time, transactional data movement capabilities, Oracle GoldenGate is a key part of the Maximum Availability Architecture for Oracle Database.

    This week, on Thursday Dec. 13th, we will present in a live webcast how Oracle GoldenGate fits into the Oracle Database Maximum Availability Architecture (MAA). Joe Meeks from the Oracle Database High Availability team will discuss how Oracle GoldenGate complements other key products within MAA, such as Active Data Guard. Nick Wagner from the GoldenGate PM team will present how to upgrade to the latest Oracle Database release without any downtime. Nick will also cover two new features of Oracle GoldenGate 11gR2: Integrated Capture for Oracle Database and Automated Conflict Detection and Resolution. Nick will provide an in-depth review of these new features with examples.

    Oracle GoldenGate also offers maximum availability for non-Oracle databases, such as HP NonStop, SQL Server, DB2 (LUW, iSeries, or zSeries), and more. The same robust, reliable, real-time, bidirectional data movement capabilities apply to all supported databases.

    I'd like to invite you to join us on Thursday Dec. 13th, 10am PT/1pm ET, to hear from the product experts how to use GoldenGate for maximizing database availability and to ask your questions. You can find the registration link below.

    Webcast: Maximum Availability with Oracle GoldenGate
    Thursday Dec. 13th, 10am PT/1pm ET

    Look forward to another great webcast with lots of interaction with the audience.

    Read the article

  • Oracle JDeveloper 11gR2 Cookbook book review

    - by Chris Muir
    I recently received a free copy of Oracle JDeveloper 11gR2 Cookbook, published by Packt Publishing, for review. Readers of technical cookbooks will know this genre of text includes problems that developers will hit and the prescribed solutions, in this case for Oracle's Application Development Framework (ADF). Books like this excel on thorough coverage, a logical progression of solutions throughout the book, and a readable narrative around the numerous steps and code.

    This book progresses well through ADF application assembly, ADF Business Components, the view layer, security, deployment, and tuning. Each recipe has a clear introduction, and I especially enjoyed the "There's more" follow-up sections for some recipes, which lead the reader on to related ideas and issues the reader really needs to be aware of. Also worthy of comment: having worked with ADF for over 5 years, there certainly were recipes and solutions I hadn't encountered before; the book gets bonus points for that.

    As a reviewer, what negatives can I give this text? The book has cast its net too wide by trying to cover "everything from design and construction, to deployment, testing, debugging and optimization." ADF is such a large and sophisticated technology that this book, with 100 recipes, barely scrapes the surface. Don't expect all your ADF problems to be solved here. In turn, there is inconsistency in the level of problems and solutions. I felt at the beginning the book was pitching itself at advanced problems to solve (that's great for me), but then it introduces topics like building a static View Object or train. These topics in my opinion are fairly simple and are covered by the Oracle documentation just as well; they shouldn't have been included here.

    In conclusion, ADF beginners will find this book worthwhile, as it will open your eyes to the wider problems and solutions required for ADF, and experts will value the fact that they can point junior programmers at the book for certain problems and say "get on with it". Is there scope for more ADF tomes like this? Yes! I'd love to see a cookbook specializing in ADF Business Components (hint hint to budding authors).

    Read the article

  • ASP.NET vNext Blog Post Series

    - by Soe Tun
    Originally posted on: http://geekswithblogs.net/stun/archive/2014/06/04/asp.net-vnext-blog-post-series.aspx

    ASP.NET vNext was announced at TechEd 2014, and I have been playing around with it a bit. ASP.NET vNext is an exciting and revolutionary change for the Microsoft .NET development platform. ASP.NET vNext is now open source, and available on GitHub at this location: https://github.com/aspnet/Home. I want to start a blog post series on ASP.NET vNext, and share my experience as I learn more about it.

    Keeping it simple: Each blog post in the series will be short and simple so I can write them in a short amount of time, and keep each focused on one (at most two) topics per post. My goal is to make it easy to absorb the information, as there is a ton of great new stuff to cover. Many other people in the community have blogged about the key new features of ASP.NET vNext; I will link to those blog posts in my next post.

    MVC 6 POCO Controller: Today, I want to start this blog post series with a teaser code snippet for those developers familiar with ASP.NET MVC. The Getting Started with ASP.NET MVC 6 article from the ASP.NET website shows how to write a lightweight POCO (plain-old CLR object) MVC controller class in the upcoming ASP.NET MVC 6. However, it doesn't show us how to use the IActionResultHelper interface to render a View. This is how I wrote my POCO MVC controller, based on the https://github.com/aspnet/Home/blob/master/samples/HelloMvc/Controllers/HomeController.cs sample from GitHub. Note that this may not be the best way to write it, but it is good enough for now.

        using Microsoft.AspNet.Mvc;
        using Microsoft.AspNet.Mvc.ModelBinding;
        using MvcSample.Web.Models;

        namespace MvcSample.Web
        {
            public class HomeController
            {
                IActionResultHelper html;
                IModelMetadataProvider mmp;

                // Both dependencies are injected by the framework; no base class needed.
                public HomeController(IActionResultHelper h, IModelMetadataProvider mmp)
                {
                    this.html = h;
                    this.mmp = mmp;
                }

                public IActionResult Index()
                {
                    // Wrap the model in a ViewDataDictionary and render the Index view.
                    var viewData = new ViewDataDictionary<User>(mmp) { Model = User() };
                    return html.View("Index", viewData);
                }

                public User User()
                {
                    return new User { Name = "My name", Address = "My address" };
                }
            }
        }

    Please feel free to give me feedback, as this will greatly help me organize the blog posts in this series and plan ahead. Thanks for reading!

    Read the article

  • Exalogic 2.0.1 Tea Break Snippets - Creating and using Distribution Groups

    - by The Old Toxophilist
    By default, running your Exalogic virtualized presents what appears to Cloud Users as a single large resource: they can just create vServers without caring about how those are laid out on the underlying infrastructure. All the Cloud Users will know is that they can create vServers. For example, if we have a Quarter Rack (8 nodes) and our Cloud User creates 8 vServers, those 8 vServers may run on 8 distinct nodes or may all run on the same node.

    Although in many cases we, as Cloud Users, may not be too worried about how the virtualisation algorithm decides where to place our vServers, there are cases where it is extremely important that vServers run on distinct physical compute nodes. For example, if we have a WebLogic cluster, we will want the servers within the cluster to run on distinct physical nodes to cover the situation where one physical node is lost. To achieve this, the Exalogic virtualised implementation provides Distribution Groups, which define an anti-affinity policy that the underlying virtualisation algorithm takes into account when placing vServers. It should be noted that Distribution Groups must be created before you create vServers, because a vServer can only be added to a Distribution Group at creation time.

    Creating a Distribution Group: To create a Distribution Group we first need to select the Account in which we want the Distribution Group to be created. Once we have selected the account, the interface updates and account-specific actions are displayed within the Action panes. From the Action pane (or by right-clicking on the Account), select the "Create Distribution Group" action. This initiates the creation wizard, as follows.

    Distribution Group Details: Within the first step of the wizard we can specify the name of the distribution group, which should be unique. In addition we can provide a detailed description of the group.

    Distribution Group Configuration: The second step of the wizard allows you to specify the number of elements that are required within this group, up to a maximum of the number of nodes in your Exalogic. At this point it is always better to specify a group with spare capacity, allowing for future expansion. As vServers are added to the group, the available slots decrease.

    Summary: Finally, the last step of the wizard displays a summary of the information entered.

    Read the article

  • Several New Hints

    - by Ondrej Brejla
    Hi all! Today we would like to introduce you to some of our new experimental hints for NetBeans 7.2. They are called Unused Use Statement and Immutable Variables.

    Unused Use Statement: This hint is quite simple. It highlights (underlines) your use statements which are not used. A typical use case is after some refactoring, when you forgot to remove some obsolete use statements. This hint warns you about them and allows you to remove them easily. Just click on the hint bulb in the gutter and select Remove Unused Use Statement. And of course, it works in multiple combined use statements too.

    Immutable Variables: The next one is the hint which checks for too many assignments into a variable. And why? That's simple. Mostly you should use just one assignment per variable. But sometimes you are lazy and you simply reassign a new value to a variable that was already assigned. And that's exactly the case when our new hint warns you that Too many assignments (2) into variable $foo occurred. Nothing more.

    Yes, we know that there are some cases where more assignments could occur and no warning should be shown - for example, when one prefers the longer increment syntax ($i = $i + 1;) over the short one ($i++;). So we tried to handle these cases so that the hint doesn't bother you when there is no need. Note: we are almost sure that this hint doesn't cover all your use cases, because there are a lot of them. So if you find something strange, write it into our Bugzilla so we can handle it better for you. Thanks for your patience!

    The last thing is that you can set the number of allowed assignments in Tools -> Options -> Editor -> Hints -> PHP: Immutable Variables. Note: this hint works just for common variables, not for fields. We have an enhancement request for that and it should be implemented in the next version of NetBeans (probably 7.3).

    And that's all for today. As usual, please test it, and if you find something strange, don't hesitate to file a new issue (product php, component Editor). Thanks.

    Read the article

  • How do I use IIS7 rewrite to redirect requests for (HTTP or HTTPS):// (www or no-www) .domainaliases.ext to HTTPS://maindomain.ext

    - by costax
    I have multiple domain names assigned to the same site, and I want all possible access combinations redirected to one domain. In other words, whether the visitor uses http://domainalias.ext or http://www.domainalias.ext or https://www.domainalias3.ext or https://domainalias4.ext or any other combination, including http://maindomain.ext, http://www.maindomain.ext, and https://www.maindomain.ext, they are all redirected to https://maindomain.ext. I currently use the following code to partially achieve my objectives:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
            <system.webServer>
                <rewrite>
                    <rules>
                        <rule name="CanonicalHostNameRule" stopProcessing="true">
                            <match url="(.*)" />
                            <conditions>
                                <add input="{HTTP_HOST}" pattern="^MAINDOMAIN\.EXT$" negate="true" />
                            </conditions>
                            <action type="Redirect" redirectType="Permanent" url="https://MAINDOMAIN.EXT/{R:1}" />
                        </rule>
                        <rule name="HTTP2HTTPS" stopProcessing="true">
                            <match url="(.*)" />
                            <conditions>
                                <add input="{HTTPS}" pattern="off" ignoreCase="true" />
                            </conditions>
                            <action type="Redirect" redirectType="Permanent" url="https://MAINDOMAIN.EXT/{R:1}" />
                        </rule>
                    </rules>
                </rewrite>
            </system.webServer>
        </configuration>

    ...but it fails to work in all instances: it does not redirect to https://maindomain.ext when the user enters https://(www.)domainalias.ext.

    So my question is: are there any programmers here familiar with IIS7 rewrite who can help me modify my existing code to cover all possibilities and reroute all my domain aliases - entered by themselves or with www in front, over HTTP or HTTPS - to my main domain over HTTPS? The logic would be: if the entire URL does NOT start with https://maindomain.ext, then REDIRECT to https://maindomain.ext/(plus_whatever_else_that_followed). Thank you very much for your attention; any help would be appreciated.

    NOTE TO MODS: If my question is not in the correct format, please edit or advise. Thanks in advance.

    Read the article

  • Styling Windows Phone Silverlight Applications

    - by Tim Murphy
    If you have not developed with styles in Silverlight/XAML, it can be challenging, and resources can be sparse depending on how deep you get. One thing you need to understand is at what level you can apply styles and how much they cascade. What I am finding is that this doesn't go to the level we are used to in HTML and CSS.

    While styles can be defined at the page level, if you want to share styles throughout your application they should be defined in the App.xaml file. This is of course analogous to placing a style in your HTML file versus an external CSS file. This is the type of style I will concentrate on in this post.

    The first thing to look at is how styles associate with elements. TargetType defines the object type your style will apply to. In the example below the style is targeting the TextBlock object type.

        <Style x:Key="TextBlockSmallGray" TargetType="TextBlock">

    Next we use a Setter, which allows you to apply values to specific attributes of the target object type. Setters can be simple or complex. The first example here simply applies a color to the Background property of the target.

        <Setter Property="Background" Value="White"/>

    The second setter example is for the same property, but this time we apply the definition of a LinearGradientBrush.

        <Setter Property="Background">
          <Setter.Value>
            <LinearGradientBrush>
              <GradientStop Offset="0" Color="Black"/>
              <GradientStop Offset="1" Color="White"/>
            </LinearGradientBrush>
          </Setter.Value>
        </Setter>

    The last thing I want to cover here is that you can leverage the system styles and then override or extend them. The BasedOn attribute of the Style tag allows this sort of inheritance. In the example below I am going to start with PhoneTextTitle1Style and then override properties as needed.

        <Style x:Key="TextBlockTitle" BasedOn="{StaticResource PhoneTextTitle1Style}" TargetType="TextBlock">

    Now that we have our styles defined, applying them is fairly straightforward. Add the style name as a static resource to the Style property of the element in your page and off you go.

        <Grid x:Name="LayoutRoot" Style="{StaticResource PageGridStyle}">

    So this is one step in creating consistency in your application's look. In future posts I will dig a little deeper. del.icio.us Tags: windows phone 7,mobile development,windows phone 7 development,.NET,software development,design,UX
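    Putting those fragments together, here is a sketch of how the complete styles could sit in App.xaml and be consumed from a page. The setter values and the TextBlock text are invented for illustration; the style names come from the post, and PhoneTextTitle1Style is one of the built-in Windows Phone theme styles:

        <!-- App.xaml -->
        <Application.Resources>
          <Style x:Key="TextBlockSmallGray" TargetType="TextBlock">
            <Setter Property="Foreground" Value="Gray"/>
            <Setter Property="FontSize" Value="20"/>
          </Style>
          <Style x:Key="TextBlockTitle" BasedOn="{StaticResource PhoneTextTitle1Style}" TargetType="TextBlock">
            <Setter Property="Foreground" Value="White"/>
          </Style>
        </Application.Resources>

        <!-- MainPage.xaml -->
        <TextBlock Text="Page Title" Style="{StaticResource TextBlockTitle}"/>
        <TextBlock Text="Some detail text" Style="{StaticResource TextBlockSmallGray}"/>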

    Read the article

  • Portland Silverlight User Group: WP7 & XNA – I survived.

    - by George Clingerman
    Last night I gave a talk to the Portland Silverlight User Group. http://www.portlandsilverlight.net/Meetings/Details/15 And I survived (which you should have probably already figured out, since you're reading this post AND that's what I titled it...). Really, though, it was a fantastic time and I had a lot of fun! I was a little nervous getting ready for it, but I'm always a little nervous getting ready for things. I had the game all written, and I knew the general flow for what the talk was going to be. I read over Scott Hanselman's 11 Top Tips for a Successful Technical Presentation (which has become something I do EVERY time I'm preparing for a talk). I gave myself a brief list of points I wanted to make sure I covered or worked into the talk. But then I was ready and I waited. I hate the waiting. It makes me nervous. Once I was up in front of the room with my laptop open and some XNA code in front of me, though, my nerves went away. Then I was ready. I know XNA, I love talking about XNA, and I love sharing what I know and hearing questions that make me think. And hopefully that came through while I was talking. I had a freaking blast. The swag went quickly (and I was even able to hand out some XNA 2.0 books that have been around forever!) and the pizza was gobbled down. I started the talk at about 6:10 and managed to cover a very brief intro to programming against the game loop (a different concept, and one that seems to trip a lot of people up when getting started with game development -- see the sketch below) and then rolled into making a basic 2D game for Windows Phone 7 using XNA. And I finished the whole thing before 8:30. Wahoo! I'm planning on posting the source code and assets on my site, so those should be coming soon. And to make things even better, they recorded the whole thing on video, so everyone will get to see me pretend I can speak! Just wait till you hear the new song I wrote for this talk...
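    For anyone who hasn't seen the game-loop model before: instead of reacting to events, an XNA game overrides Update (advance the game state) and Draw (render the current state), and the framework calls both in a loop for the life of the game. A minimal sketch of the pattern follows; this is not the talk's demo code, just an assumed illustration:

        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        public class SampleGame : Game
        {
            GraphicsDeviceManager graphics;
            Vector2 position = Vector2.Zero;

            public SampleGame()
            {
                graphics = new GraphicsDeviceManager(this);
            }

            // Called on a fixed cadence (30 times per second on WP7 by
            // default): read input and advance the game state here.
            protected override void Update(GameTime gameTime)
            {
                position.X += 50f * (float)gameTime.ElapsedGameTime.TotalSeconds;
                base.Update(gameTime);
            }

            // Called once per frame: render the current state; avoid
            // mutating game state in here.
            protected override void Draw(GameTime gameTime)
            {
                GraphicsDevice.Clear(Color.CornflowerBlue);
                base.Draw(gameTime);
            }
        }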

    Read the article

  • More on Map Testing

    - by Michael Stephenson
    I have been chatting with Maurice den Heijer recently about his CodePlex project, the BizTalk Map Testing Framework (http://mtf.codeplex.com/). Some of you may remember the article I did for BizTalk 2009 and 2006 about how to test maps, but with Maurice's project he is effectively looking at how to improve productivity and quality by building useful testing features into the framework to simplify the process of testing maps. As part of our discussion we realized that we both had slightly different approaches to how we validate the output from the map. Put simply, Maurice does some XPath validation of the data in various nodes, whereas my approach for most standard cases is to use serialization, which lets you validate the output using normal MSTest assertions. I'm not really going to go into the pros and cons of each approach, because I think there is a place for both, and I'm sure others have various approaches which work too. What would be great is for the map testing framework to provide support for different ways of testing, covering everything from simple cases to some very specialized scenarios. So, as agreed with Maurice, I have done the sample which I will talk about in the rest of this article to show how we can use the serialization approach to create and compare the input and output from a map in normal development testing.

    Prerequisites: One of the common patterns I usually implement when developing BizTalk solutions is to use xsd.exe to create .NET classes for most of the schemas used within the solution. In the testing pattern I will take advantage of these .NET classes.

    The Map: In this sample the map we will use is very simple and just concatenates some data from the input message to the output message. Hopefully the below picture illustrates this well.

    The Test: In the test I'm basically taking the following actions (see the sketch after this list):

    1. Use the .NET class generated from the schema to create an input message for the map
    2. Serialize the input object to a file
    3. Run the map from .NET using the standard BizTalk test method which was generated for running the map
    4. Deserialize the output file from the map execution to a .NET class representing the output schema
    5. Use MSTest assertions to validate things about the output message

    The below picture shows this. As you can see, the code for this is pretty simple and it's all strongly typed, which means changes to my schema which can affect the tests can easily be picked up as compilation errors. I can then choose to have one test which validates most of the output from the map, or to have many specific tests covering individual scenarios within the map.

    Summary: Hopefully this post illustrates a powerful yet simple way of effectively testing many BizTalk mapping scenarios. I will probably have more conversations with Maurice about these approaches, and perhaps some of the above will be included in the mapping test framework. The sample can be downloaded from here: http://cid-983a58358c675769.office.live.com/self.aspx/Blog%20Samples/More%20Map%20Testing/MapTestSample.zip
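    A sketch of what such a test can look like. The schema classes (OrderInput/OrderOutput), the map class name, and the asserted values are all invented for illustration, and the TestMap call follows the BizTalk-generated map test support from memory, so treat its exact signature as an assumption rather than the sample's actual code:

        using System.IO;
        using System.Xml.Serialization;
        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestClass]
        public class OrderMapTests
        {
            [TestMethod]
            public void Map_ConcatenatesFirstAndLastName()
            {
                // 1. Build the input message from the xsd.exe-generated class.
                var input = new OrderInput { FirstName = "Jane", LastName = "Doe" };

                // 2. Serialize the input object to a file.
                string inputPath = Path.GetTempFileName();
                string outputPath = Path.GetTempFileName();
                using (var writer = new StreamWriter(inputPath))
                {
                    new XmlSerializer(typeof(OrderInput)).Serialize(writer, input);
                }

                // 3. Run the map via the BizTalk-generated test method.
                var map = new OrderInputToOrderOutput();
                map.TestMap(inputPath, InputInstanceType.Xml,
                            outputPath, OutputInstanceType.XML);

                // 4. Deserialize the map output into the output schema's class.
                OrderOutput output;
                using (var reader = new StreamReader(outputPath))
                {
                    output = (OrderOutput)new XmlSerializer(typeof(OrderOutput)).Deserialize(reader);
                }

                // 5. Assert on the strongly typed result.
                Assert.AreEqual("Jane Doe", output.FullName);
            }
        }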

    Read the article

< Previous Page | 72 73 74 75 76 77 78 79 80 81 82 83  | Next Page >