Search Results

Search found 1081 results on 44 pages for 'migrating'.


  • Oracle Forms Migration to ADF - Webinar from Oracle Partner PITSS

    - by Thomas Leopold
      Tuesday, February 22, 2011, 5:00 PM - 6:00 PM CET. Free Webinar: Re-Engineering Legacy Oracle Forms - Migration from Forms to ADF, A Case Study. Join Oracle's Grant Ronald and PITSS to see a software architecture comparison of Oracle Forms and ADF and a live step-by-step presentation on how to achieve a successful migration. Learn about various migration options, challenges and best practices to protect your current investment in Oracle Forms.
      "PL/SQL is without match for what it does: manipulating data in the database. If you blindly migrate all your PL/SQL to Java you will, in all probability, end up with less maintainable and less efficient code. Instead you should consider which code is best left as PL/SQL..." - Grant Ronald, "Migrating Oracle Forms to Fusion: Myth or Magic Bullet?"
      Re-engineering existing business logic is mandatory for your legacy Forms application to take advantage of new software architectures like ADF. The PITSS.CON solution combines a deep understanding of Oracle Forms and Reports application architecture with powerful re-engineering capabilities that allow the developer community to protect the investment in existing Forms applications and to concentrate on fine-tuning and customization of the modernized functionality rather than manually recreating every module and business logic from the bottom up.
      Registration: https://www2.gotomeeting.com/register/971702250
      PITSS GmbH, Königsdorferstrasse 25, D-82515 Wolfratshausen
      Do not forget to check out these free webinars in March! Thursday, March 3, 2011: Upgrade and Modernize Your Application to Forms 11g (Registration/Information). Tuesday, March 15, 2011: Shaping the Future for your Oracle Forms Application: Forms 11g, ADF, APEX (Registration/Information). Tuesday, March 29, 2011: Oracle Forms Modernization to APEX (Registration/Information). Registration is limited, so sign up today!
      Presented By: Grant Ronald, Senior Group Product Manager, Oracle; Magdalena Serban, Product Manager, PITSS
      Contact Us: PITSS in Americas, +1 248.740.0935, [email protected], www.pitssamerica.com | PITSS in Europe, +49 (0) 717287 5200, [email protected], www.pitss.com
      White Paper: From Oracle Forms to Oracle ADF and JEE
      © Copyright 2010 PITSS GmbH, Wolfratshausen, Stuttgart, München; Managing Directors: Dipl.-Ing. Andreas Gaede, Michael Kilimann, Dipl.-Ing. Dirk Fleischmann. Commercial Register: HRB 125471 at District Court Munich. All rights reserved. Any duplication or further treatment in any medium, in parts or as a whole, requires a written agreement.

    Read the article

  • Oracle Database 11gR2 11.2.0.3 Certified with E-Business Suite on Windows

    - by John Abraham
    As a follow up to our original certification announcement, Oracle Database 11g Release 2 (11.2.0.3) is now certified with Oracle E-Business Suite Release 11i and Release 12 on the following Microsoft Windows operating systems:
    Release 12.1 (12.1.1 and higher): Microsoft Windows Server (32-bit) (2003, 2008); Microsoft Windows x64 (64-bit) (2003 [1], 2008 [1], 2008 R2 [2])
    Release 12.0 (12.0.4 and higher): Microsoft Windows Server (32-bit) (2003); Microsoft Windows x64 (64-bit) (2003, 2008) [1]
    Release 11i (11.5.10.2 + ATG PF.H RUP 6 and higher): Microsoft Windows Server (32-bit) (2003, 2008 [1]); Microsoft Windows x64 (64-bit) (2003, 2008, 2008 R2) [1]
    Notes:
    1: This OS is a 'database tier only' or 'split tier configuration' platform where the application tier must be on a fully certified E-Business Suite platform.
    2: This OS is a 'database tier only' platform for Release 11i. For 12.1.1 or higher, it is also supported on the application tier via the migration process outlined in My Oracle Support Document 1188535.1.
    Pending Certification: E-Business Suite 12.0 with 11.2.0.3 Split Tier Certification on Microsoft Windows x64 (64-bit) (2008 R2) is in progress and will be announced separately.
    This announcement for Oracle E-Business Suite 11i and R12 includes:
    Real Application Clusters (RAC)
    Oracle Database Vault
    Transparent Data Encryption (Column Encryption)
    TDE Tablespace Encryption
    Advanced Security Option (ASO)/Advanced Networking Option (ANO)
    Export/Import Process for Oracle E-Business Suite Release 11i and Release 12 Database Instances
    Transportable Database and Transportable Tablespaces Data Migration Processes for Oracle E-Business Suite Release 11i and Release 12
    References:
    MOS Document 881505.1 - Interoperability Notes - Oracle E-Business Suite Release 11i with Oracle Database 11g Release 2 (11.2.0)
    MOS Document 1058763.1 - Interoperability Notes - Oracle E-Business Suite Release 12 with Oracle Database 11g Release 2 (11.2.0)
    MOS Document 1091086.1 - Integrating Oracle E-Business Suite Release 11i with Oracle Database Vault 11gR2
    MOS Document 1091083.1 - Integrating Oracle E-Business Suite Release 12 with Oracle Database Vault 11gR2
    MOS Document 216205.1 - Database Initialization Parameters for Oracle E-Business Suite 11i
    MOS Document 396009.1 - Database Initialization Parameters for Oracle Applications Release 12
    MOS Document 823586.1 - Using Oracle 11g Release 2 Real Application Clusters with Oracle E-Business Suite Release 11i
    MOS Document 823587.1 - Using Oracle 11g Release 2 Real Application Clusters with Oracle E-Business Suite Release 12
    MOS Document 403294.1 - Using Transparent Data Encryption (TDE) Column Encryption with Oracle E-Business Suite Release 11i
    MOS Document 732764.1 - Using Transparent Data Encryption (TDE) Column Encryption with Oracle E-Business Suite Release 12
    MOS Document 828223.1 - Using TDE Tablespace Encryption with Oracle E-Business Suite Release 11i
    MOS Document 828229.1 - Using TDE Tablespace Encryption with Oracle E-Business Suite Release 12
    MOS Document 391248.1 - Encrypting Oracle E-Business Suite Release 11i Network Traffic using Advanced Security Option and Advanced Networking Option
    MOS Document 557738.1 - Export/Import Process for Oracle E-Business Suite Release 11i Database Instances Using Oracle Database 11g Release 1 or 11g Release 2
    MOS Document 741818.1 - Export/Import Process for Oracle E-Business Suite Release 12 Database Instances Using Oracle Database 11g Release 1 or 11g Release 2
    MOS Document 1366265.1 - Using Transportable Tablespaces to Migrate Oracle Applications 11i Using Oracle Database 11g Release 2
    MOS Document 1311487.1 - Using Transportable Tablespaces to Migrate Oracle E-Business Suite Release 12 Using Oracle Database 11g Release 2
    MOS Document 729309.1 - Using Transportable Database to Migrate Oracle E-Business Suite Release 11i Using Oracle Database 10g Release 2 or 11g
    MOS Document 734763.1 - Using Transportable Database to Migrate Oracle E-Business Suite Release 12 Using Oracle Database 10g Release 2 or 11g
    MOS Document 1188535.1 - Migrating Oracle E-Business Suite R12 to Microsoft Windows Server 2008 R2
    Please also review the platform-specific Oracle Database Installation Guides for operating system and other prerequisites.
    Related Articles: Database 11.2.0.2 Certified with EBS R12 on IBM: Linux on System z | EBS R12 Certified with Database 11gR2 on SLES 11 | 11gR2 11.2.0.3 Database Certified with E-Business Suite
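    Before planning an upgrade against this certification matrix, it can help to confirm what you are actually running. A minimal sketch, assuming SQL*Plus access as an appropriate user and the standard EBS APPS schema:

        -- Current database version:
        SELECT banner FROM v$version;

        -- E-Business Suite release level (FND_PRODUCT_GROUPS holds the release name):
        SELECT release_name FROM apps.fnd_product_groups;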

    Read the article

  • From NaN to Infinity...and Beyond!

    - by Tony Davis
    It is hard to believe that it was once possible to corrupt a SQL Server database by storing perfectly normal data values in a table; but it is true. In SQL Server 2000 and before, one could inadvertently load invalid data values into certain data types via RPC calls or bulk insert methods rather than DML. In the particular case of the FLOAT data type, this meant that common 'special values' for this type, namely NaN (not-a-number) and +/- infinity, could be quite happily plugged into the database from an application and stored as 'out-of-range' values. This was like a time-bomb. When one then tried to query this data, the values were unsupported and so data pages containing them were flagged as being corrupt. Any query that needed to read a column containing the special value could fail or return unpredictable results. Microsoft even had to issue a hotfix to deal with failures in the automatic recovery process, caused by the presence of these NaN values, which rendered the whole database inaccessible!
    This problem is history for those of us on more current versions of SQL Server, but its ghost still haunts us. Recently, for example, a developer on Red Gate's SQL Response team reported a strange problem when attempting to load historical monitoring data into a SQL Server 2005 database via the C# ADO.NET provider. The ratios used in some of their reporting calculations occasionally threw out NaN or infinity values, and the subsequent attempts to load these values resulted in a nasty error. It turns out to be a different manifestation of the same problem. SQL Server 2005 still does not fully support the IEEE 754 standard for floating point numbers, in that the FLOAT data type still cannot handle NaN or infinity values. Instead, Microsoft just added validation checks that prevent the 'invalid' values from being loaded in the first place.
    For people migrating from SQL Server 2000 databases that contained out-of-range FLOAT (or DATETIME, etc.) data to SQL Server 2005, Microsoft added a DATA_PURITY clause to the latter's version of the DBCC CHECKDB (or CHECKTABLE) command. When enabled, this will seek out the corrupt data, but won't fix it. You have to do that yourself, in what can often be a slow, painful manual process. Our development team, after a quizzical shrug of the shoulders, simply decided to represent NaN and infinity values as NULL, and move on, accepting the minor inconvenience of not being able to tell them apart.
    However, what of scientific, engineering and other applications that really would like the luxury of being able to both store and access these perfectly reasonable floating point data values? The sticking point seems to be the stipulation in the IEEE 754 standard that, when NaN is compared to any other value including itself, the answer is "unequal" (i.e. FALSE). This is clearly different from normal number comparisons and has repercussions for such things as indexing operations. Even so, this hardly applies to infinity values, which are single definite values. In fact, there was some encouraging talk in the Connect note on this issue that they might be supported 'in the SQL Server 2008 timeframe'. It didn't happen: SQL Server 2008 doesn't support NaN or infinity values, though one could be forgiven for thinking otherwise, based on the MSDN documentation for the FLOAT type, which states that "The behavior of float and real follows the IEEE 754 specification on approximate numeric data types".
    However, the truth is revealed in the XPath documentation, which states that "…float (53) is not exactly IEEE 754. For example, neither NaN (Not-a-Number) nor infinity is used…". Is it really so hard to fix this problem the right way, and properly support in SQL Server the IEEE 754 standard for the floating point data type, NaNs, infinities and all? Oracle seems to have managed it quite nicely with its BINARY_FLOAT and BINARY_DOUBLE types, so it is technically possible. We have an enterprise-class database that is marketed as being part of an 'integrated' Windows platform. Absurdly, we have .NET and XPath libraries that fully support the standard for floating point numbers, and we can't even properly store these values, let alone query them, in the SQL Server database! Cheers, Tony.
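    For anyone facing that cleanup, here is a minimal sketch of the manual process described above. The database, table, column and key values are hypothetical; in practice, the rows to fix come from the DBCC output.

        -- Locate out-of-range FLOAT values imported from a SQL Server 2000 database
        -- (this reports the offending rows but does not repair them):
        DBCC CHECKDB ('LegacyMonitoring') WITH DATA_PURITY;

        -- Then NULL out each flagged value by key, as the SQL Response team chose
        -- to do, accepting that NaN and infinity become indistinguishable:
        UPDATE dbo.RatioHistory
        SET    Ratio = NULL
        WHERE  MetricId = 42;  -- key taken from the DBCC CHECKDB output

    Once the data is clean, SQL Server validates column values on subsequent CHECKDB runs by default, so the fix only has to be made once.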

    Read the article

  • Session Update from IASA 2010

    - by [email protected]
    Below: Tom Kristensen, senior vice president at Marsh US Consumer, and Roger Soppe, CLU, LUTCF, senior director of insurance strategy, Oracle Insurance. Tom and Roger participated in a panel discussion on policy administration systems this week at IASA 2010.
    This week was the 82nd Annual IASA Educational Conference & Business Show, held in Grapevine, Texas. While attending the conference, I had the pleasure of serving as a panelist in one of the many outstanding sessions conducted this year. The session, entitled "Achieving Business Agility and Promoting Growth with a Modern Policy Administration System", included industry experts Steve Forte from OneShield, Mike Sciole of IFG Companies, and Tom Kristensen, senior vice president at Marsh US Consumer. The session was conducted as a panel discussion and focused on how insurers can leverage best practices to mitigate risk while enabling rapid product innovation through a modern policy administration system. The panelists offered insight into business and technical challenges for both Life & Annuity and Property & Casualty carriers.
    The session had three primary learning objectives:
    Identifying how replacing a legacy system with a more modern policy administration solution can deliver agility and growth
    Identifying how processes and systems should be re-engineered or replaced in order to improve speed-to-market and product support
    Uncovering how to leverage best practices to mitigate risk during a migration to a new platform
    Tom Kristensen, an industry veteran with over 20 years of experience, was able to offer a unique perspective as a business process outsourcer (BPO). Marsh US Consumer is currently implementing both the Oracle Insurance Policy Administration solution and the Oracle Revenue Management and Billing platform while at the same time onboarding a new BPO customer. Tom offered insight on the need to replace their aging systems and on Marsh's ability to drive new products and processes with a modern solution. As a best practice, their current project has empowered their business users to play a major role in both the requirements gathering and configuration phases. Tom stated that working with a modern solution has also enabled his organization to use a more agile implementation methodology and get hands-on experience with the software earlier in the project. He also indicated that Marsh was encouraged by how quickly it will be able to implement new products, which is another major advantage of a modern rules-based system.
    One of the more interesting issues was raised by an audience member who asked, "With all the vendor solutions available in North America and across Europe, what is going to make some of them more successful than others and help ensure their long-term success?" Panelist Mike Sciole of IFG Companies suggested that carriers do their due diligence and follow a structured evaluation process, focusing on vendors who demonstrate they have the "cash to invest in long term R&D", and evaluate audited annual statements for verification. Other panelists suggested that the vendor space will continue to evolve, and that those with a strong strategy focused on the insurance industry and a solid roadmap will likely separate themselves from the rest.
    The session concluded with the panelists offering advice about not being afraid to evaluate new modern systems. While migrating to a new platform can be challenging and is typically undertaken only every 15+ years by carriers, the ability to rapidly deploy and manage new products, to create consistent processes that better serve customers, and to manage the business more effectively, transparently and securely is well worth the effort.
    Roger A. Soppe, CLU, LUTCF, is the Senior Director of Insurance Strategy, Oracle Insurance.

    Read the article

  • Oracle Database 11.2.0.4 Certified with EBS on Microsoft Windows Server

    - by John Abraham
    As a follow up to a previous announcement, Oracle Database 11g Release 2 (11.2.0.4) is now certified with Oracle E-Business Suite Release 11i and Release 12 on the following Microsoft Windows Server operating systems:
    Release 12.2 (12.2.3 and higher): Microsoft Windows x64 (64-bit) (2008 R2)
    Release 12.1 (12.1.1 and higher): Microsoft Windows Server (32-bit) (2003, 2008); Microsoft Windows x64 (64-bit) (2003 [1], 2008 [1], 2008 R2 [2])
    Release 12.0 (12.0.4 and higher): Microsoft Windows Server (32-bit) (2003); Microsoft Windows x64 (64-bit) (2003, 2008, 2008 R2) [1]
    Release 11i (11.5.10.2 + ATG PF.H RUP 6 and higher): Microsoft Windows Server (32-bit) (2003, 2008 [1]); Microsoft Windows x64 (64-bit) (2003, 2008, 2008 R2) [1]
    Notes:
    1: This OS is a 'database tier only' or 'split tier configuration' platform where the application tier must be on a fully certified E-Business Suite platform.
    2: This OS is a 'database tier only' platform for Release 11i. For 12.1.1 or higher, it is also supported on the application tier via the migration process outlined in My Oracle Support Document 1188535.1.
    This announcement for Oracle E-Business Suite 11i and R12 includes:
    Oracle Database 11gR2 version 11.2.0.4
    Oracle Database 11gR2 version 11.2.0.4 Real Application Clusters (RAC)
    Oracle Database Vault 11gR2 version 11.2.0.4
    Transparent Data Encryption (Column Encryption) using Oracle Database 11gR2 version 11.2.0.4
    TDE Tablespace Encryption using Oracle Database 11gR2 version 11.2.0.4
    Advanced Security Option (ASO)/Advanced Networking Option (ANO) with Oracle Database 11gR2 version 11.2.0.4
    Export/Import Process for Oracle E-Business Suite Release 11i and Release 12 Database Instances
    Transportable Database and Transportable Tablespaces Data Migration Processes for Oracle E-Business Suite Release 11i and Release 12
    Certification data in My Oracle Support (http://support.oracle.com) has been updated with this certification - please review the documents below for all requirements and additional details.
    Where can I find more information?
    MOS Document 881505.1 - Interoperability Notes - Oracle E-Business Suite Release 11i with Oracle Database 11g Release 2 (11.2.0)
    MOS Document 1058763.1 - Interoperability Notes - Oracle E-Business Suite Release 12 with Oracle Database 11g Release 2 (11.2.0)
    MOS Document 1623879.1 - Interoperability Notes - Oracle E-Business Suite Release 12.2 with Oracle Database 11g Release 2 (11.2.0)
    MOS Document 1091086.1 - Integrating Oracle E-Business Suite Release 11i with Oracle Database Vault 11gR2
    MOS Document 1091083.1 - Integrating Oracle E-Business Suite Release 12 with Oracle Database Vault 11gR2
    MOS Document 216205.1 - Database Initialization Parameters for Oracle E-Business Suite 11i
    MOS Document 396009.1 - Database Initialization Parameters for Oracle Applications Release 12
    MOS Document 823586.1 - Using Oracle 11g Release 2 Real Application Clusters with Oracle E-Business Suite Release 11i
    MOS Document 823587.1 - Using Oracle 11g Release 2 Real Application Clusters with Oracle E-Business Suite Release 12
    MOS Document 946413.1 - Using Oracle Applications with a Split Configuration Database Tier on Oracle Release 11g Release 2
    MOS Document 403294.1 - Using Transparent Data Encryption (TDE) Column Encryption with Oracle E-Business Suite Release 11i
    MOS Document 732764.1 - Using Transparent Data Encryption (TDE) Column Encryption with Oracle E-Business Suite Release 12
    MOS Document 828223.1 - Using TDE Tablespace Encryption with Oracle E-Business Suite Release 11i
    MOS Document 828229.1 - Using TDE Tablespace Encryption with Oracle E-Business Suite Release 12
    MOS Document 391248.1 - Encrypting Oracle E-Business Suite Release 11i Network Traffic using Advanced Security Option and Advanced Networking Option
    MOS Document 376700.1 - Enabling SSL in Oracle Application Release 12
    MOS Document 557738.1 - Export/Import Process for Oracle E-Business Suite Release 11i Database Instances Using Oracle Database 11g Release 1 or 11g Release 2
    MOS Document 741818.1 - Export/Import Process for Oracle E-Business Suite Release 12 Database Instances Using Oracle Database 11g Release 1 or 11g Release 2
    MOS Document 1366265.1 - Using Transportable Tablespaces to Migrate Oracle Applications 11i Using Oracle Database 11g Release 2
    MOS Document 1311487.1 - Using Transportable Tablespaces to Migrate Oracle E-Business Suite Release 12 Using Oracle Database 11g Release 2
    MOS Document 729309.1 - Using Transportable Database to Migrate Oracle E-Business Suite Release 11i Using Oracle Database 10g Release 2 or 11g
    MOS Document 734763.1 - Using Transportable Database to Migrate Oracle E-Business Suite Release 12 Using Oracle Database 10g Release 2 or 11g
    MOS Document 1188535.1 - Migrating Oracle E-Business Suite R12 to Microsoft Windows Server 2008 R2
    MOS Document 1349240.1 - Database Preparation Guidelines for an Oracle E-Business Suite Release 12.2 Upgrade
    MOS Document 1594274.1 - Oracle E-Business Suite Release 12.2: Consolidated List of Patches and Technology Bug Fixes
    Please also review the platform-specific Oracle Database Installation Guides for operating system and other prerequisites.

    Read the article

  • Pimp my Silverlight Firestarter

    - by mbcrump
    So Silverlight Firestarter is over and you're sitting on your couch thinking… what now? Well, it's time to pimp it. How exactly can you pimp the Silverlight Firestarter? Read below and you will find out:
    1) Pimp the videos: First we are going to use a program named Juice to download all of the Silverlight Firestarter videos. Go ahead and point your browser to http://juicereceiver.sourceforge.net/ and download the application. It works on Mac, Linux and PC. After it is downloaded you are going to want to add an RSS feed by clicking the button highlighted below. At this point you are going to want to add the following URL inside the textbox and hit Save: http://channel9.msdn.com/Series/Silverlight-Firestarter/RSS
    This RSS feed includes all the Silverlight Firestarter labs and presentations listed below:
    The Future of Silverlight
    Data Binding Strategies with Silverlight and WP7
    Building Compelling Apps with WCF using REST and LINQ
    Building Feature Rich Business Apps Today with RIA Services
    MVVM: Why and How?
    Tips and Patterns using MVVM and Service Patterns with Silverlight and WP7
    Tips and Tricks for a Great Installation Experience
    Tune Your Application: Profiling and Performance Tips
    Performance Tips for Silverlight Windows Phone 7
    Select all the videos and click the Download button located below (has blue arrow). Once all the videos are downloaded you will have about 4.64GB of Silverlight fun. You can now move these videos to your media server and watch them with whatever device you want. Put it on an iPad, iPhone.. emm wait I mean WP7 or WMC7.
    2) Pimp the training material: Download the offline installer for the labs here. This will give you almost a gig of free training materials. Here are the topics covered:
    Level 100: Getting Started - Lab 01: WinForms and Silverlight; Lab 02: ASP.NET and Silverlight; Lab 03: XAML and Controls; Lab 04: Data Binding
    Level 200: Ready for More - Lab 05: Migrating Apps to Out-of-Browser; Lab 06: Great UX with Blend; Lab 07: Web Services and Silverlight; Lab 08: Using WCF RIA Services
    Level 300: Take me Further - Lab 09: Deep Dive into Out-of-Browser; Lab 10: Silverlight Patterns: Using MVVM; Lab 11: Silverlight and Windows Phone 7
    You will notice that it installs Firestarter to the default location of C:\Firestarter, so you will have to navigate to that folder and double-click on Default.htm to get started. Now, if you followed part one of the pimping guide, you will already have all the videos on your PC. You will notice that once you go into a lab you will get a lab document and source at the bottom of the article. Instead of opening the source folder in a web browser, you can just copy the folder C:\Firestarter\Labs into your Visual Studio 2010 project folder. This will save a lot of time later.
    3) Pimp my Silverlight 5 knowledge: Always keep reading as much as possible, and remember that the Silverlight 5 beta should come in Q1 of 2011 and the final release at the end of 2011. Here are 5 great blog posts on Silverlight 5:
    Scott Gu's Blog
    Mary Jo's article on Silverlight 5
    The Future of Silverlight (Official)
    Kunal Chowdhury's Blog
    Tim Heuer's Blog
    That's all I've got for now. Have fun with all the new Silverlight content. Subscribe to my feed

    Read the article

  • Copy TFS Build Definitions between Projects and Collections

    - by Jakob Ehn
    Originally posted on: http://geekswithblogs.net/jakob/archive/2014/06/05/copy-tfs-build-definitions-between-projects-and-collections.aspx
    The last couple of years it has become apparent that using multiple team projects in TFS is generally a bad idea. There are of course exceptions to this, but there are a lot of things that become much easier when you put all of your projects and teams in the same team project. Fellow ALM MVP Martin Hinshelwood has blogged about this several times, as well as other people in the community. In particular, using the backlog and portfolio management tools makes much more sense when everything is located in the same team project.
    Consolidating multiple team projects into one is unfortunately not that easy; it involves migrating source code, work items, reports etc. Another thing that also needs to be migrated is build definitions. It is possible to clone build definitions within the same team project using the TFS power tools. The Community TFS Build Manager also lets you clone build definitions to other team projects. But there is no tool that allows you to clone/copy a build definition to another collection. So, I whipped up a simple console application that lets you do this. The tool can be downloaded from https://onedrive.live.com/redir?resid=EE034C9F620CD58D!8162&authkey=!ACTr56v1QVowzuE&ithint=file%2c.zip
    Using CopyTFSBuildDefinitions
    You use the tool like this:
    CopyTFSBuildDefinitions SourceCollectionUrl SourceTeamProject BuildDefinitionName DestinationCollectionUrl DestinationTeamProject [NewDefinitionName]
    Arguments:
    SourceCollectionUrl - The URL to the TFS collection that contains the team project with the build definition that you want to copy
    SourceTeamProject - The name of the team project that contains the build definition
    BuildDefinitionName - The name of the build definition
    DestinationCollectionUrl - The URL to the TFS collection that contains the team project that you want to copy your build definition to
    DestinationTeamProject - The name of the team project in the destination collection
    NewDefinitionName - (Optional) Use this to override the name of the new build definition. If you don't specify this, the name will be the same as the original one
    Example:
    CopyTFSBuildDefinitions https://jakob.visualstudio.com DemoProject WebApplication.CI https://anotheraccount.visualstudio.com
    Notes
    Since we are (potentially) creating a build definition in a new collection, there is no guarantee that the various paths defined in the build definition exist in the new collection. For example, a build definition refers to server paths in TFVC or repos + branches in TFGit. It also refers to build controllers that definitely don't exist in the new collection. So there will be some cleanup to do after you copy your build definitions. You can fix some of these using the Community TFS Build Manager; for example, it is very easy to apply the correct build controller to a set of build definitions.
    The problem stated above also applies to build process templates. However, the tool tries to find a build process template in the new team project with the same file name as the one that existed in the old team project. If it finds one, it will be used for the new build definition. Otherwise it will use the default build template.
    If you want to run the tool for many build definitions, you can use this SQL script, compliments of Scrum/ALM MVP Richard Hundhausen, to generate the necessary commands:

    USE Tfs_Collection
    GO
    SELECT 'CopyTFSBuildDefinitions.exe http://SERVER:8080/tfs/collection "' + P.ProjectName + '" "' + REPLACE(BD.DefinitionName,'\','') + '" http://NEWSERVER:8080/tfs/COLLECTION TEAMPROJECT'
    FROM tbl_Project P
         INNER JOIN tbl_BuildGroup BG ON BG.TeamProject = P.ProjectUri
         INNER JOIN tbl_BuildDefinition BD ON BD.GroupId = BG.GroupId
    ORDER BY P.ProjectName, BD.DefinitionName

    Hope that helps! Let me know if you have any problems with the tool or if you find it useful.

    Read the article

  • 30 Steps to Master ASP.NET MVC Application development

    - by Rajesh Pillai
    Welcome Readers!
    I am starting out a new series on ASP.NET MVC skill building which will be posted over the next couple of weeks. Let me know your thoughts on the content I have planned; a couple of the topics have been taken from the ASP.NET MVC 2 Cookbook. (NOTE: Only the headings have been taken, the content will not be :)). Do let me know what you would like to see, or any additional inputs or ideas to cover in these topics. The 30 steps are outlined below for quick reference. Will start filling this out quickly.
    A Peek Into Model: What is a model?; Different types of model; Presentation/ViewModel; Model Mapping (AutoMapper)
    A Peek Into View: How the view works in ASP.NET MVC; View Engine Design; Custom View Engine; View Best Practices; Templated Helpers; Partial Views
    A Peek Into Controller: Introduction; Controller Design; Controller Best Practices; Asynchronous Controller; Custom Action Result; Action Filters; Controller Factory to use with IOC
    Routes: Explanation; Routes from the database; Routes from XML; More complex routing
    Master Pages: Basics; Setting Master Page Dynamically
    Working with data in the view: Repeating Views; Array of check boxes; Array of radio buttons; Paged data; CRUD; Client side action; Confirmation Dialog (modal window); jqGrid
    Working with Forms
    Validation: Model Validation with DataAnnotations; Using the xVal validation framework; Client side validation with jQuery Validation; Fluent Validation; Model Binders
    Templating: Create strongly typed helper using T4; Custom View Templates with T4; Create custom MVC project template using T4
    IOC: AutoFac; Ninject; Unity
    Application Areas
    jQuery, Ajax and jQuery Plugins
    State Maintenance: Application State; User state; Cookies; Webfarm
    Error Handling: View error handling; Controller error handling; ELMAH (Error Logging Modules and Handlers)
    Authentication and Authorization: User Registration form; SignOn Process; Password Reminder; Membership and Roles; Windows authentication; Restricting access to all pages; Restricting access to selected pages; Restricting access to pages by role; Restricting access to a controller; Restricting access to selected area
    Profiles and Themes: Using Profiles; Inheriting a Profile; Migrating an anonymous profile; Creating custom themes; Using themes; User personalized themes
    Configuration: Adding custom application settings in web.config; Displaying custom error messages; Accessing other web.config configuration elements; Adding custom configuration elements to web.config; Encrypting web.config sections
    Tracing, Debugging and Logging
    Caching: Caching a whole page; Caching pages based on route details; Caching pages based on browser type and version; Caching pages based on custom strings; Caching partial pages; Caching application data; Object Caching; Using Microsoft Velocity; Using MemCache; Using AppFabric cache
    Localization
    HTTP Handlers and Modules
    Security: XSS/CSRF; AntiForgery; Encoding
    HtmlHelpers: Strongly typed helpers; Writing custom helpers
    Repository Pattern (Data access)
    WF/WCF
    Unit Testing
    Mocking Framework
    Integration Testing
    Load / Performance Testing
    Deployment
    Once again, let me know your thoughts on this. Till then, Enjoy MVC'ing!!!

    Read the article

  • Bridging the Gap in Cloud, Big Data, and Real-time

    - by Dain C. Hansen
    With all the buzz around big data and cloud computing, it is easy to overlook one of your most precious commodities: your data. Today's businesses cannot stand still when it comes to data. Market success now depends on speed, volume, complexity, and keeping pace with the latest data integration breakthroughs. Are you up to speed with big data, cloud integration, and real-time analytics? Join us in this three-part blog series where we'll look at each component in more detail. Meet us online on October 24th, where we'll take your questions about the issues you are facing in this brave new world of integration.
    Let's start first with cloud. What happens with your data when you decide to implement a private cloud architecture? Or a public cloud? Data integration solutions play a vital role in migrating data simply, efficiently, and reliably to the cloud; they are a necessary ingredient of any platform-as-a-service strategy because they support cloud deployments with data-layer application integration between on-premise and cloud environments of all kinds.
    For private cloud architectures, consolidation of your databases and data stores is an important step to take to receive the full benefits of cloud computing. Private cloud integration requires bidirectional replication between heterogeneous systems so that you can perform data consolidation without interrupting your business operations. In addition, bulk load and transformation of data into and out of your private cloud is a crucial step for those companies moving to private cloud. There is also a need to manage data services as part of SOA/BPM solutions that enable agile application delivery and help build shared data services for organizations.
    But what about public cloud? If you have moved your data to a public cloud application, you may also need to connect your on-premise enterprise systems and the cloud environment by moving data in bulk or as real-time transactions across geographies. For both public and private cloud architectures, Oracle offers a complete and extensible set of integration options that span not only data integration but also service and process integration, security, and management. For those companies investing in Oracle Cloud, you can move your data through Oracle SOA Suite using REST APIs to Oracle Messaging Cloud Service, a new service that lets applications deployed in Oracle Cloud securely and reliably communicate over Java Message Service. As an example of loading and transforming data into other public clouds, Oracle Data Integrator supports a knowledge module for Salesforce.com, now available on AppExchange. Other third-party knowledge modules are being developed by customers and partners every day.
    To learn more about how to leverage Oracle's Data Integration products for cloud, join us live: Data Integration Breakthroughs Webcast on October 24th, 10 AM PST.

    Read the article

  • ORA-4031 Troubleshooting

    - by [email protected]
    QUICKLINK: Note 396940.1 Troubleshooting and Diagnosing ORA-4031 Error
    Note 1087773.1: ORA-4031 Diagnostics Tools [Video]
    Have you observed an ORA-04031 error reported in your alert log? An ORA-4031 error is raised when memory is unavailable for use or reuse in the System Global Area (SGA). The error message will indicate the memory pool getting errors and high-level information about what kind of allocation failed and how much memory was unavailable. The challenge with ORA-4031 analysis is that the error and associated trace are for a "victim" of the problem. The failing code ran into the memory limitation, but in almost all cases it was not part of the root problem.
    Looking for the best way to diagnose?
    When an ORA-4031 error occurs, a trace file is raised and noted in the alert log if the process experiencing the error is a background process. User processes may experience errors without reports in the alert log or traces generated. The V$SHARED_POOL_RESERVED view will show reports of misses for memory over the life of the database. Diagnostic scripts are available in Note 430473.1 to help in analysis of the problem. There is also a training video on using and interpreting the script data: Note 1087773.1.
    11g Diagnosability
    Starting with Oracle Database 11g Release 1, the Diagnosability infrastructure was introduced, which places traces and core files into a location controlled by the DIAGNOSTIC_DEST initialization parameter when an incident, such as an ORA-4031, occurs. For earlier versions, the trace file will be written to either USER_DUMP_DEST (if the error was caught in a user process) or BACKGROUND_DUMP_DEST (if the error was caught in a background process like PMON or SMON). The trace file contains vital information about what led to the error condition.
    Note 443529.1: 11g Quick Steps to Package and Send Critical Error Diagnostic Information to Support [Video]
    Oracle Configuration Manager (OCM)
    Oracle Configuration Manager (OCM) works with My Oracle Support to enable a proactive support capability that helps you organize, collect and manage your Oracle configurations.
    Oracle Configuration Manager Quick Start Guide
    Note 548815.1: My Oracle Support Configuration Management FAQ
    Note 250434.1: BULLETIN: Learn More About My Oracle Support Configuration Manager
    Common Causes/Solutions
    The ORA-4031 can occur for many different reasons. Some possible causes are:
    SGA components too small for workload
    Auto-tuning issues
    Fragmentation due to application design
    Bugs/leaks in memory allocations
    For more on the 4031 and how this affects the SGA, see Note 396940.1 Troubleshooting and Diagnosing ORA-4031 Error.
    Because of the multiple potential causes, it is important to gather enough diagnostics so that an appropriate solution can be identified. However, most commonly the cause is associated with configuration tuning. Ensuring that MEMORY_TARGET or SGA_TARGET is large enough to accommodate the workload can get around many scenarios. The default trace associated with the error provides very high-level information about the memory problem and the "victim" that ran into the issue. The data in the default trace is not going to point to the root cause of the problem.
    When migrating from 9i to 10g and higher, it is necessary to increase the size of the Shared Pool due to changes in the basic design of the shared memory area. Note 270935.1 Shared pool sizing in 10g
    NOTE: Diagnostics on the errors should be investigated as close to the time of the error(s) as possible. If you must restart a database, it is not feasible to diagnose the problem until the database has matured and/or started seeing the problems again.
    Note 801787.1 Common Cause for ORA-4031 in 10gR2, Excess "KGH: NO ACCESS" Memory Allocation
    For reference to the content in this blog, refer to Note 1088239.1 Master Note for Diagnosing ORA-4031
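    As a companion to the notes above, the queries below are a minimal first-look sketch of reserved-pool pressure and SGA free memory. They use standard V$ views; interpreting the numbers still requires the diagnostic scripts referenced earlier.

        -- Misses and failures against the shared pool reserved area over the life
        -- of the instance; non-zero REQUEST_FAILURES often precedes ORA-4031:
        SELECT requests, request_misses, request_failures, free_space
        FROM   v$shared_pool_reserved;

        -- Free memory per SGA pool; chronically low values suggest undersized
        -- components (see the MEMORY_TARGET / SGA_TARGET discussion above):
        SELECT pool, name, bytes
        FROM   v$sgastat
        WHERE  name = 'free memory';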

    Read the article

  • OTN ArchBeat Top 10 for September 2012

    - by Bob Rhubart
    The results are in... Listed below are the Top 10 most popular items shared via the OTN ArchBeat Facebook Page for the month of September 2012.
    The Real Architects of Los Angeles - OTN Architect Day - Oct 25: No gossip. No drama. No hair pulling. Just a full day of technical sessions and peer interaction focused on using Oracle technologies in today's cloud and SOA architectures. The event is free, but seating is limited, so register now. Thursday October 25, 2012. 8:00 a.m. – 5:00 p.m. Sofitel Los Angeles, 8555 Beverly Boulevard, Los Angeles, CA 90048.
    Oracle Fusion Middleware Security: Attaching OWSM policies to JRF-based web services clients: "OWSM (Oracle Web Services Manager) is Oracle's recommended method for securing SOAP web services," says Oracle Fusion Middleware A-Team member Andre Correa. "It provides agents that encapsulate the necessary logic to interact with the underlying software stack on both service and client sides. Such agents have their behavior driven by policies. OWSM ships with a bunch of policies that are adequate to most common real world scenarios." His detailed post shows how to make it happen.
    Oracle 11gR2 RAC on Software Defined Network (SDN) (OpenvSwitch, Floodlight, Beacon) | Gilbert Standen: "The SDN [software defined network] idea is to separate the control plane and the data plane in networking and to virtualize networking the same way we have virtualized servers," explains Gil Standen. "This is an idea whose time has come because VMs and vmotion have created all kinds of problems with how to tell networking equipment that a VM has moved and to preserve connectivity to VPN end points, preserve IP, etc." H/T to Oracle ACE Director Tim Hall for the recommendation.
    Process Oracle OER Events using a simple Web Service | Bob Webster: Bob Webster's post "provides an example of a simple web service that processes Oracle Enterprise Repository (OER) Events. The service receives events from OER and utilizes the OER REX API to implement simple OER automations for selected event types."
    Understanding Oracle BI 11g Security vs Legacy Oracle BI 10g | Christian Screen: "After conducting a large amount of Oracle BI 10g to Oracle BI 11g upgrades and after writing the Oracle BI 11g book," says Oracle ACE Christian Screen, "I still continually get asked one of the most basic questions regarding security in Oracle BI 11g: How does it compare to Oracle BI 10g? The trail of questions typically goes on to what are the differences? And, how do we leverage our current Oracle BI 10g security table schema in Oracle BI 11g?"
    OIM-OAM-OAAM integration using TAP – Request Flow you must understand!! | Atul Kumar: Atul Kumar's post addresses "key points and request flow that you must understand" when integrating three Oracle Identity Management products: Oracle Identity Management, Oracle Access Management, and Oracle Adaptive Access Manager.
    Adding a runtime LOV for a taskflow parameter in WebCenter | Yannick Ongena: Oracle ACE Yannick Ongena illustrates how to customize the parameters tab for a taskflow in WebCenter.
    Tips on Migrating from AquaLogic .NET Accelerator to WebCenter WSRP Producer for .NET | Scott Nelson: "It has been a very winding path and this blog entry is intended to share both the lessons learned and relevant approaches that led to those learnings," says Scott Nelson. "Like most journeys of discovery, it was not a direct path, and there are notes to let you know when it is practical to skip a section if you are in a hurry to get from here to there."
    15 Lessons from 15 Years as a Software Architect | Ingo Rammer: In this presentation from the GOTO Conference in Copenhagen, Ingo Rammer shares 15 tips regarding people, complexity and technology that he learned doing software architecture for 15 years.
    WebCenter Content (WCC) Trace Sections | ECM Architect: ECM Architect Kevin Smith shares a detailed technical post covering WebCenter Content (WCC) Trace Sections.
    Thought for the Day
    "Eventually everything connects - people, ideas, objects. The quality of the connections is the key to quality per se." — Charles Eames (June 17, 1907 – August 21, 1978)
    Source: SoftwareQuotes.com

    Read the article

  • JavaOne Latin America 2012 Trip Report

    - by reza_rahman
    JavaOne Latin America 2012 was held at the Transamerica Expo Center in Sao Paulo, Brazil on December 4-6. The conference was a resounding success with a great vibe, excellent technical content and numerous world-class speakers. Some notable local and international speakers included Bruno Souza, Yara Senger, Mattias Karlsson, Vinicius Senger, Heather Vancura, Tori Wieldt, Arun Gupta, Jim Weaver, Stephen Chin, Simon Ritter and Henrik Stahl. Topics covered included the JCP/JUGs, Java SE 7, HTML 5/WebSocket, CDI, Java EE 6, Java EE 7, JSF 2.2, JMS 2, JAX-RS 2, Arquillian and JavaFX.
    Bruno Borges and I manned the GlassFish booth at the Java Pavilion on Tuesday and Wednesday. The booth traffic was decent and not too hectic. We met a number of GlassFish adopters, including perhaps one of the largest GlassFish deployments in Brazil, as well as some folks migrating to Java EE from Spring. We invited them to share their stories with us. We also talked with some key members of the local Java community. Tuesday evening we had the GlassFish party at the Tribeca Pub. The party was definitely a hit and we could have used a larger venue (this was the first time we had the GlassFish party in Brazil). Along with GlassFish enthusiasts, a number of Java community leaders were there. We met some of the same folks again at the JUG leaders' party on Wednesday evening.
    On Thursday, Arun Gupta, Bruno Borges and I ran a hands-on lab on JAX-RS, WebSocket and Server-Sent Events (SSE) titled "Developing JAX-RS Web Applications Utilizing Server-Sent Events and WebSocket". This is the same Java EE 7 lab run at JavaOne San Francisco. The lab gives developers a first-hand glimpse of what an HTML 5 powered Java EE application might look like. We had an overflow crowd for the lab (at one point we had about twenty people standing) and the lab went very well. The slides for the lab are here: Developing JAX-RS Web Applications Utilizing Server-Sent Events and WebSocket, from Reza Rahman. The actual contents for the lab are available here. Give me a shout if you need help getting it up and running.
    I gave two solo talks following the lab. The first was on JMS 2, titled "What's New in Java Message Service 2". This was essentially the same talk given by JMS 2 specification lead Nigel Deakin at JavaOne San Francisco. I talked about the JMS 2 simplified API, JMSContext injection, delivery delays, asynchronous send, JMS resource definition in Java EE 7, standardized configuration for JMS MDBs in EJB 3.2, mandatory JCA pluggability and the like. The session went very well, there was good Q&A, and someone even told me this was the best session of the conference! The slides for the talk are here: What's New in Java Message Service 2, from Reza Rahman.
    My last talk for the conference was on JAX-RS 2, in the keynote hall. Titled "JAX-RS 2: New and Noteworthy in the RESTful Web Services API", this was basically the same talk given by the specification leads Santiago Pericas-Geertsen and Marek Potociar at JavaOne San Francisco. I talked about the JAX-RS 2 client API, asynchronous processing, filters/interceptors, hypermedia support, server-side content negotiation and the like. The talk went very well and I got a few very kind compliments afterwards. The slides for the talk are here: JAX-RS 2: New and Noteworthy in the RESTful Web Services API, from Reza Rahman.
    On a more personal note, Sao Paulo has always had a special place in my heart as the incubating city for Sepultura and Soulfly, two of my most favorite heavy metal musical groups of all time! Consequently, the city has a perpetually alive and kicking metal scene pretty much any given day of the week. This time I got to check out a solid performance by the local metal band Republica at the legendary Manifesto Bar. I also wanted to see a Dio tribute at the Blackmore but ran out of time and energy... Overall I enjoyed the conference and Sao Paulo, and look forward to going to Brazil again next year!

    Read the article

  • ArchBeat Link-o-Rama Top 10 for September 2-8, 2012

    - by Bob Rhubart
    The Top 10 items shared on the OTN Facebook Page for the week of September 2-8, 2012.
    Adding a runtime LOV for a taskflow parameter in WebCenter | Yannick Ongena: Oracle ACE Yannick Ongena illustrates how to customize the parameters tab for a taskflow in WebCenter.
    Tips on Migrating from AquaLogic .NET Accelerator to WebCenter WSRP Producer for .NET | Scott Nelson: "It has been a very winding path and this blog entry is intended to share both the lessons learned and relevant approaches that led to those learnings," says Scott Nelson. "Like most journeys of discovery, it was not a direct path, and there are notes to let you know when it is practical to skip a section if you are in a hurry to get from here to there."
    Free Event: Oracle Technology Network Architect Day – Boston, MA – 9/12/2012: Sure, you could ask a voodoo priestess for help in improving your solution architecture skills. But there's the whole snake thing, and the zombie thing, and other complications. So why not keep it simple and register for Oracle Technology Network Architect Day in Boston, MA. There's no magic, just a full day of technical sessions covering Cloud, SOA, Engineered Systems, and more. Registration is free, but seating is limited. You'll curse yourself if you miss this one.
    Starting and Stopping Fusion Applications the Right Way | Ronaldo Viscuso: While the fastartstop tool that ships with Oracle Fusion Applications does most of the work to start/stop/bounce the Fusion Apps environment, it does not do it all. Oracle Fusion Applications A-Team blogger Ronaldo Viscuso's post "aims to explain all tasks involved in starting and stopping a Fusion Apps environment completely."
    Article Index: Architect Community Column in Oracle Magazine: Did you know that Oracle Magazine features a regular column devoted specifically to the architect community? Every issue includes insight and expertise from architects who regularly work with Oracle technologies. Click here to see a complete list of these articles.
    Using FMAP and AnalyticsRes in a Oracle BI High Availability Implementation | Art of Business Intelligence: "The fmap syntax has been used for a long time in Oracle BI / Siebel Analytics when referencing images inherent in the application as well as custom images," says Oracle ACE Christian Screen. "This syntax is used on Analysis requests and dashboards."
    Dodeca Customer Feedback - The Rosewood Company | Tim Tow: Oracle ACE Director Tim Tow shares anecdotal comments from one of his clients, a company that is deploying Dodeca to replace an aging VBA/Essbase application.
    Configuring UCM cache to check for external Content Server changes | Martin Deh: Oracle WebCenter and ADF A-Team blogger Martin Deh shares the background information and the solution to a recently encountered customer scenario.
    Attend OTN Architect Day in Los Angeles – by Architects, for Architects – October 25: The OTN Architect Day roadshow stops in Boston next week, then it's on to Los Angeles for another all-architecture, all-day event on Thursday October 25, 2012 at the Sofitel Los Angeles, 8555 Beverly Boulevard, Los Angeles, CA 90048. Like all Architect Day events, this one is absolutely free, so register now.
    The Role of Oracle VM Server for SPARC In a Virtualization Strategy: New OTN article from Matthias Pfutzner.
    Thought for the Day
    "Practicing architects, through education, experience and examples, accumulate a considerable body of contextual sense by the time they're entrusted with solving a system-level problem…" — Eberhardt Rechtin (January 16, 1926 – April 14, 2006)
    Source: SoftwareQuotes.com

    Read the article

  • Administer, manage, monitor, and fine tune the performance of your Oracle SOA Suite 11g Service Infrastructure and SOA composite applications.

    - by JuergenKress
    Key Features of the book
    If you are an Oracle SOA Suite administrator, then this book is your bible. It gives you everything you need to know about all your tasks and helps you apply what you learn in your everyday life, right from the first chapter. The book walks through promoting code across environments, performance tuning the service infrastructure, monitoring the environment, configuring security policies, managing the dehydration store, backing up and restoring environments, and so on. Packed with real-world examples from the authors' own experiences, this book offers a unique insight into Oracle SOA Suite administration.
    Detailed description
    The book begins with an introduction to SOA and quickly moves on to management of SOA composite applications. Readers will learn how to manage composite applications, their deployments and lifecycles. Equipped with this knowledge, readers will be introduced to monitoring and performance tuning SOA Suite, monitoring instances, messages, and composite applications, managing faults and exceptions, and configuring audit levels of composite applications to include end-to-end monitoring through the use of extended logging, as well as administering and configuring all SOA Suite components. A very important aspect of administration is tuning and optimizing the infrastructure for performance, and the book offers real-world recommendations to monitor and performance tune service engines, the underlying WebLogic server, threads and timeouts, file systems, and composite applications. It also covers detailed administration of individual service components, configuring the infrastructure MBeans using both Oracle Enterprise Manager Fusion Middleware Control and WLST-based scripts, migrating worklist preferences and BAM data across environments, and setting up Email, LDAP and custom XPath.
    An administrator is always trusted with troubleshooting and root-causing problems in the infrastructure, and this book will help you through the troubleshooting approaches: how to identify faults and exceptions through extended logging and thread dumps, and how to find solutions to common startup problems and deployment issues. The advanced contents of this book explain the OWSM security framework and how to secure components deployed to the infrastructure, along with the details of all groundwork needed to ready the environment. The last few chapters help you understand and deal with managing the metadata services repository and dehydration store, and backup and recovery, concluding with advanced topics such as silent/scripted installations, cloning, upgrading, patching and high availability installations.
    Packed with real-world examples and tips straight from the trench, this book offers insights into SOA Suite administration that you will not find elsewhere. Part of our writing style in this book draws heavily on the philosophy of reuse, and as such the book provides ample executable SQL queries and WLST scripts that administrators can reuse and extend to perform most administration tasks, such as monitoring instances, processing times, and instance states, and performing automatic deployments, tuning, migration, and installation. These scripts are spread over each of the chapters in the book and can also be downloaded from here.
    The book is available in different formats at the following websites: Paperback and eBook versions & Kindle version. It is available for order and signed copies are available through our web site.
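    In that reuse spirit, here is a minimal sketch of the kind of instance-state query such scripts typically build on. The DEV_SOAINFRA schema prefix is an installation-time choice, and the table layout should be verified against your own 11g version, so treat this as an assumption-laden starting point rather than one of the book's supported scripts.

        -- Count SOA composite instances per composite and state from the
        -- dehydration store; STATE is a numeric code in the 11g schema.
        SELECT composite_dn, state, COUNT(*) AS instances
        FROM   dev_soainfra.composite_instance
        GROUP  BY composite_dn, state
        ORDER  BY composite_dn, state;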
SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit  www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: SOA book,SOA Suite Adminsitration,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • Accounts in Work Items after migration to TFS 2010 and to new domain

    - by Clara Oscura
    Lately I've been doing some tests on migrating our TFS 2008 installation to TFS 2010, coupled with a machine and domain change. One particularly tricky topic is user accounts. We first installed a new machine with TFS 2010 and then migrated the projects from the old server. The work items were migrated with the projects. Great, but if I try to edit one of the old work items, I cannot save it anymore, because some fields contain old user names (e.g. OLDDOMAIN\user) which are not known in the new domain (they should be NEWDOMAIN\user). The errors look like this: [error screenshot]
    When I correct the 'Assigned To' field value, I get another error regarding another field: [error screenshot]
    Before TFS 2010, we had the TFSUsers power tool. It allowed you to map an old user name to a new user name. This is not available anymore because work item fields with user accounts are now synchronized with Active Directory display name changes (explained here). The correct way to go about this in TFS 2010 is to use TFSConfig Identities before adding the new domain accounts into the TFS groups (documented here). So, too late for us.
    I've found a (tedious) workaround to change those old accounts in work items, so that people can keep working with them:
    1. Install the TFS 2010 power tools.
    2. Export the work item type from your project (VS | Tools | Process Editor | Work Item Types). Save the definition, for example: Original_MyProject_Task.xml
    3. Copy the xml (NoReadOnly_MyProject_Task.xml) and edit it. From the field definitions of 'Activated By', 'Closed By' and 'Resolved By', remove the following:
    <WHENNOTCHANGED field="System.State">
        <READONLY />
    </WHENNOTCHANGED>
    4. Import the edited WIT in VS. Choose the new file (NoReadOnly_MyProject_Task.xml) and import it in MyProject.
    5. Open all tasks in Excel (flat list) and display the following columns: Assigned To, Activated By, Closed By, Resolved By. Change the user accounts to the new ones (I usually sort each column alphabetically to make it easier).
    6. Publish. If you get a conflict on a field, tough luck: you will have to manually choose "Local version" for each work item. I told you it was a tedious process.
    7. Import the original WIT (Original_MyProject_Task.xml) in MyProject. We only changed the WI definition so that we could change some fields; the original definition should be put back.
    And what about these other fields?
    Created By
    Authorized As
    These fields are not editable by definition (VS | Tools | Process Editor | Work Item Fields Explorer), even if they are not marked as read-only in the WIT. You can leave the old values; it doesn't seem to matter to TFS. The other four fields are editable by definition, so only the WIT read-only rule prevents us from changing them.
    Technorati Tags: TFS, Team Foundation Server 2010, Work Item, Domain change

    Read the article

  • Perm SSIS Developer Urgently Required

    - by blakmk
Job Role
To provide dedicated data services support to the company by designing, creating, maintaining and enhancing database objects, ensuring data quality, consistency and integrity. Migrating data from various sources to a central SQL 2008 data warehouse will be the primary function:
- Migration of data from bespoke legacy databases to the SQL 2008 data warehouse.
- Understand key business requirements, liaising with various areas of the company.
- Create advanced transformations of data, with a focus on data cleansing, redundant data and duplication.
- Create complex business rules regarding data services, migration, integrity and support (best practices).

Experience
- Minimum 3 years' SSIS experience, in a project or BI development role, and involvement in at least 3 full ETL project life cycles, using the following methodologies and tools:
  - Excellent knowledge of ETL concepts including data migration & integrity, focusing on SSIS.
  - Extensive experience with SQL 2005 products; SQL 2008 desirable.
  - Working knowledge of SSRS and its integration with other BI products.
  - Extensive knowledge of T-SQL, stored procedures, triggers (table/database), views and functions, in particular coding and querying.
  - Data cleansing and harmonisation.
  - Understanding and knowledge of indexes, statistics and table structure.
  - SQL Agent – scheduling jobs, optimisation, multiple jobs, DTS.
  - Troubleshoot, diagnose and tune database and physical server performance.
  - Knowledge and understanding of locking, blocks, table and index design and SQL configuration.
- Demonstrable ability to understand and analyse business processes.
- Experience in creating business rules on best practices for data services.
- Experience in working with, supporting and troubleshooting MS SQL servers running enterprise applications.
- Proven ability to work well within a team and liaise with other technical support staff such as networking administrators, system administrators and support engineers.
- Ability to create formal documentation, work procedures, and service level agreements.
- Ability to communicate technical issues at all levels, including to a non-technical audience.
- Good working knowledge of MS Word, Excel, PowerPoint, Visio and Project.

Location
Based in Crawley with the possibility of some remote working

Contact me for more info: http://sqlblogcasts.com/blogs/blakmk/contact.aspx

    Read the article

  • Oracle GoldenGate 11g Release 2 Launch Webcast Replay Available

    - by Irem Radzik
For those of you who missed the Oracle GoldenGate 11g Release 2 launch webcasts last week, the replay is now available from the following url.

Harnessing the Power of the New Release of Oracle GoldenGate 11g

I would highly recommend watching the webcast to see the many new features of the new release and hear the product management team respond to questions from the audience in a nice long Q&A section. In my blog last week I listed the media coverage for this new release. There is a new article published by ITJungle talking about Oracle GoldenGate’s heterogeneity and support for DB2 for iSeries: Oracle Completes DB2/400 Support in Data Replication Tool

As mentioned in last week’s blog, we received over 150 questions from the audience, and in this blog I'd like to continue to post some of the frequently asked questions and their answers:

Question: What are the fundamental differences between classic data capture and integrated data capture? Do both use the redo logs in the source database?
Answer: Yes, they both use redo logs. Classic capture parses the redo log data directly, whereas Integrated Capture lets the Oracle database parse the redo log record using an internal API.

Question: Does the GoldenGate version need to match the Oracle Database version?
Answer: No, they are not directly linked. Oracle GoldenGate 11g Release 2 supports Oracle Database version 10gR2 as well. For Oracle Database version 10gR1 and Oracle Database version 9i you will need GoldenGate 11g Release 1 or lower. And for Oracle Database 8i you need Oracle GoldenGate 10 or earlier versions.

Question: If I already use Data Guard, do I need GoldenGate?
Answer: Data Guard is designed as the best disaster recovery solution for Oracle Database. If you would like to implement a bidirectional Active-Active replication solution or need to move data between heterogeneous systems, you will need GoldenGate.

Question: On compression and GoldenGate, if the source uses compression, is it required that the target also use compression?
Answer: No, the source and target do not need to have the same compression settings.

Question: Does GoldenGate support Advanced Security Option on the source database?
Answer: Yes it does.

Question: Can I use GoldenGate to upgrade the Oracle Database to 11g and do an OS migration at the same time?
Answer: Yes, this is a very common project where GoldenGate can eliminate downtime, give flexibility to test the target as needed, and minimize risks with a fail-back option to the old environment. For more information on database upgrades please check out the following white papers: Best Practices for Migrating/Upgrading Oracle Database Using Oracle GoldenGate 11g; Zero-Downtime Database Upgrades Using Oracle GoldenGate

Question: Does GoldenGate create any triggers in the source database, at table or row level, for real-time data integration?
Answer: No, GoldenGate does not create triggers.

Question: Can transformation be done after insert to the destination table, or does it need to be done before?
Answer: It can happen in the Capture (Extract) process, in the Delivery (Replicat) process, or in the target database (a sketch of both parameter files follows below).

For more resources on Oracle GoldenGate 11gR2 please check out our Oracle GoldenGate 11gR2 resource kit as well.
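To make that Extract/Replicat split concrete, below is a hypothetical pair of parameter files. The schema, trail, and login names are invented, the integrated-capture options assume 11gR2 syntax, and an integrated Extract must first be registered with the database (REGISTER EXTRACT ext1 DATABASE in GGSCI):

-- Extract (capture) side, integrated capture; all names are illustrative
EXTRACT ext1
USERID ggadmin, PASSWORD ggadmin
TRANLOGOPTIONS INTEGRATEDPARAMS (MAX_SGA_SIZE 256)
EXTTRAIL ./dirdat/aa
TABLE hr.employees;

-- Replicat (delivery) side, with a simple column transformation
REPLICAT rep1
USERID ggadmin, PASSWORD ggadmin
ASSUMETARGETDEFS
MAP hr.employees, TARGET hr.employees_dw,
  COLMAP (USEDEFAULTS, full_name = @STRCAT(first_name, ' ', last_name));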

    Read the article

  • PaaS, DBaaS and the Oracle Database Cloud Service

    - by yaldahhakim
As with many widely hyped areas, there is much more variation within the broad spectrum of products referred to as “Cloud” than is immediately apparent. This variation is evident in one of the key misunderstandings about the Oracle Database Cloud Service. People could be forgiven for thinking that the Database Cloud Service was a Database-as-a-Service (DBaaS), but this is actually not true. The Database Cloud Service is a Platform-as-a-Service, which presents a different user and developer interface and has a different set of qualities.

A good way to think about the difference between these two varieties of Cloud offerings is that you, the customer, have to deal with things at the level of the offering, but not for anything below it. In practice, this means that you do not have to deal with hardware or system software, including installation and maintenance, for DBaaS. You also do not have much control over the configuration of these layers. For PaaS, you don’t have to deal with hardware, system software, or database software – and also do not have control over these levels in the stack.

So you cannot modify configuration parameters for the database with the Database Cloud Service – your interface is through SQL and PL/SQL, with Application Express, included in the Database Cloud Service, or through JDBC for Java apps running in the Java Cloud Service, or through RESTful Web Services. You will notice what is not mentioned there – SQL*Net. You cannot access your Oracle Database Cloud Service by changing an entry in the TNSNames file and using SQL*Net, so the effort involved in migrating an existing Oracle Database in your data center to the Database Cloud Service may be prohibitive. The good news is that Application Express and the RESTful Web Services wizard in the Database Cloud Service allow you to develop new applications very quickly, and, of course, the provisioning of the entire Database Cloud Service takes only minutes.
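As a purely hypothetical illustration of what that HTTPS-instead-of-SQL*Net access looks like, a resource defined with the RESTful Web Services wizard might be queried like this (the host, path, and JSON shape are invented):

# Hypothetical: query a REST resource exposed by the Database Cloud Service;
# the URI and the response below are invented for illustration.
curl -u username:password \
  "https://myservice-mydomain.db.us1.oraclecloudapps.com/apex/hr/employees/"
# -> {"items":[{"employee_id":100,"last_name":"King"}, ...]}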

    Read the article

  • Getting Help with 'SEPA' Questions

    - by MargaretW
What is 'SEPA'? The Single Euro Payments Area (SEPA) is a self-regulatory initiative for the European banking industry championed by the European Commission (EC) and the European Central Bank (ECB). The aim of the SEPA initiative is to improve the efficiency of cross border payments and the economies of scale by developing common standards, procedures, and infrastructure. The SEPA territory currently consists of 33 European countries -- the 28 EU states, together with Iceland, Liechtenstein, Monaco, Norway and Switzerland. Part of that infrastructure includes two new SEPA instruments that were introduced in 2008:
- SEPA Credit Transfer (a Payables transaction in Oracle EBS)
- SEPA Core Direct Debit (a Receivables transaction in Oracle EBS)

A SEPA Credit Transfer (SCT) is an outgoing payment instrument for the execution of credit transfers in Euro between customer payment accounts located in SEPA. SEPA Credit Transfers are executed on behalf of an Originator holding a payment account with an Originator Bank in favor of a Beneficiary holding a payment account at a Beneficiary Bank. In R12 of Oracle applications, the current SEPA credit transfer implementation is based on Version 5 of the "SEPA Credit Transfer Scheme Customer-To-Bank Implementation Guidelines" and the "SEPA Credit Transfer Scheme Rulebook" issued by the European Payments Council (EPC). These guidelines define the rules to be applied to the UNIFI (ISO20022) XML message standards for the implementation of the SEPA Credit Transfers in the customer-to-bank space (a trimmed example of such a message is sketched at the end of this entry). This format is compliant with SEPA Credit Transfer version 6.

A SEPA Core Direct Debit (SDD) is an incoming payment instrument used for making domestic and cross-border payments within the 33 countries of SEPA, wherein the debtor (payer) authorizes the creditor (payee) to collect the payment from his bank account. The payment can be a fixed amount like a mortgage payment, or variable amounts such as those of invoices. The "SEPA Core Direct Debit" scheme replaces various country-specific direct debit schemes currently prevailing within the SEPA zone. SDD is based on the ISO20022 XML messaging standards, version 5.0 of the "SEPA Core Direct Debit Scheme Rulebook", and the "SEPA Direct Debit Core Scheme Customer-to-Bank Implementation Guidelines". This format is also compliant with SEPA Core Direct Debit version 6.

EU Regulation #260/2012 established the technical and business requirements for both instruments in euro. The regulation is referred to as the "SEPA end-date regulation", and also defines the deadlines for the migration to the new SEPA instruments:
- Euro Member States: February 1, 2014
- Non-Euro Member States: October 31, 2016

Oracle and SEPA

Within the Oracle E-Business Suite of applications, Oracle Payables (AP), Oracle Receivables (AR), and Oracle Payments (IBY) provide SEPA transaction capabilities for the following releases, as noted:
- Release 11.5.10.x - AP & AR
- Release 12.0.x - AP & AR & IBY
- Release 12.1.x - AP & AR & IBY
- Release 12.2.x - AP & AR & IBY

Resources

To assist our customers in migrating, using, and troubleshooting SEPA functionality, a number of resource documents related to SEPA are available on My Oracle Support (MOS), including:
- R11i: AP: White Paper - SEPA Credit Transfer V5 support in Oracle Payables, Doc ID 1404743.1
- R11i: AR: White Paper - SEPA Core Direct Debit v5.0 support in Oracle Receivables, Doc ID 1410159.1
- R12: IBY: White Paper - SEPA Credit Transfer v5 support in Oracle Payments, Doc ID 1404007.1
- R12: IBY: White Paper - SEPA Core Direct Debit v5 support in Oracle Payments, Doc ID 1420049.1
- R11i/R12: AP/AR/IBY: Get Help Setting Up, Using, and Troubleshooting SEPA Payments in Oracle, Doc ID 1594441.2
- R11i/R12: Single European Payments Area (SEPA) - UPDATES, Doc ID 1541718.1
- R11i/R12: FAQs for Single European Payments Area (SEPA), Doc ID 791226.1
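To illustrate the ISO 20022 format behind an SCT (referenced in the SCT paragraph above), here is a heavily trimmed pain.001.001.03 skeleton. Every identifier, amount, IBAN, and BIC below is invented, and a real file carries many more mandatory elements (such as the requested execution date):

<?xml version="1.0" encoding="UTF-8"?>
<!-- Trimmed, illustrative SEPA Credit Transfer initiation (pain.001.001.03);
     all values are invented and several mandatory elements are omitted. -->
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pain.001.001.03">
  <CstmrCdtTrfInitn>
    <GrpHdr>
      <MsgId>MSG-2014-0001</MsgId>
      <CreDtTm>2014-01-15T09:30:00</CreDtTm>
      <NbOfTxs>1</NbOfTxs>
      <InitgPty><Nm>ACME GmbH</Nm></InitgPty>
    </GrpHdr>
    <PmtInf>
      <PmtInfId>PMT-0001</PmtInfId>
      <PmtMtd>TRF</PmtMtd>
      <Dbtr><Nm>ACME GmbH</Nm></Dbtr>
      <DbtrAcct><Id><IBAN>DE89370400440532013000</IBAN></Id></DbtrAcct>
      <DbtrAgt><FinInstnId><BIC>COBADEFFXXX</BIC></FinInstnId></DbtrAgt>
      <CdtTrfTxInf>
        <PmtId><EndToEndId>E2E-0001</EndToEndId></PmtId>
        <Amt><InstdAmt Ccy="EUR">150.00</InstdAmt></Amt>
        <Cdtr><Nm>Supplier SARL</Nm></Cdtr>
        <CdtrAcct><Id><IBAN>FR1420041010050500013M02606</IBAN></Id></CdtrAcct>
      </CdtTrfTxInf>
    </PmtInf>
  </CstmrCdtTrfInitn>
</Document>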

    Read the article

  • Qt vs WPF/.NET

    - by aaronc
My company is trying to make the decision between using Qt/C++ for our GUI framework or migrating to .NET and using WPF. We have up to this point been using MFC. It seems that .NET/WPF is technically the most advanced and feature-rich platform. I do, however, have several concerns. These include:
- Platform support
- Framework longevity (i.e. future-proofing)
- Performance and overhead

For this application we are willing to sacrifice support for Windows 2000, Macs, and Linux. But the issue is more related to Microsoft's commitment to the framework and their extant platforms. It seems like Microsoft has a bad habit of coming up with something new, hyping it for a few years, and then relegating it to the waste-bin, essentially abandoning the developers who chose it. First it was MFC and VB6, then Windows Forms, and now there's WPF. Also, with .NET, versions of Windows were progressively nicked off the support list. It looks like WPF could be here to stay for a while, but since it's not open source it's really in Microsoft's hands. I'm also concerned about the overhead and performance of WPF since some of our applications involve processing large amounts of information and doing real-time data capture. Qt seems like a really good option, but it doesn't have all the features of WPF/.NET, and we couldn't use languages like C#.

Basically, what does the community think about Microsoft's commitment to WPF compared with previous frameworks? Are the performance considerations significant enough to avoid using it for a realtime app? And how significant are the benefits of WPF/.NET in terms of productivity and features compared to Qt?

    Read the article

  • Core Data migration problem: "Persistent store migration failed, missing source managed object model

    - by John Gallagher
The Background
A Cocoa non-document Core Data project with two Managed Object Models.
- Model 1 stays the same.
- Model 2 has changed, so I want to migrate the store.

I've created a new version by Design > Data Model > Add Model Version in Xcode. The difference between versions is a single relationship that's been changed from a to-one to a to-many. I've made my changes to the model, then saved. I've made a new Mapping Model that has the old model as a source and the new model as a destination. I've ensured all Mapping Models and Data Models are being compiled and all are copied to the Resources folder of my app bundle. I've switched on migrations by passing in a dictionary with the NSMigratePersistentStoresAutomaticallyOption key as [NSNumber numberWithBool:YES] when adding the Persistent Store (as in the sketch below). Rather than merging all models in the bundle, I've specified the two models I want to use (model 1 and the new version of model 2) and merged them using modelByMergingModels:.

The Problem
No matter what I do to migrate, I get the error message: "Persistent store migration failed, missing source managed object model."

What I've Tried
- I clean after every single build.
- I've tried various combinations of having only the model I'm migrating to in Resources, being compiled, or both.
- Since the error message implies it can't find the source model for my migration, I've tried having every version of the model in both the Resources folder and being compiled.
- I've made sure I'm not making a really basic error by switching back to the original version of my data model. The app runs fine.
- I've deleted the Mapping Model and the new version of the model, cleaned, then recreated both.
- I've tried making a different change in the new model - deleting an entity instead.

I'm at my wits' end. I can't help but think I've made a huge mistake somewhere that I'm not seeing. Any ideas?
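For reference, the store-opening call described in the question typically looks like the sketch below (coordinator and storeURL are assumed to exist; NSInferMappingModelAutomaticallyOption is deliberately absent here, since the questioner is using an explicit mapping model rather than lightweight migration):

// Sketch of adding a store with automatic migration switched on.
// Even with an explicit mapping model, Core Data must be able to locate the
// *source* model in the bundle at migration time - the error suggests it cannot.
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
    nil];

NSError *error = nil;
NSPersistentStore *store =
    [coordinator addPersistentStoreWithType:NSSQLiteStoreType
                              configuration:nil
                                        URL:storeURL
                                    options:options
                                      error:&error];
if (!store) NSLog(@"Migration failed: %@", error);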

    Read the article

  • LinkBuilder.BuildUrlFromExpression not working anymore in .Net 4 / VS 2010 ?

    - by Mose
Hi, I'm migrating my ASP.NET MVC 1 application from VS.NET 2008 / C# 3.5 to VS.NET 2010 / C# 4.0. I made heavy use of a builder to get URL strings from strongly typed calls. It looks like this:

// sample call:
string toSamplePage = Url.To<SampleController>(c => c.Page(parameter1, parameter2));

The code is added as an extension to the default UrlHelper:

public static string To<Tcontroller>(this UrlHelper helper, Expression<Action<Tcontroller>> action) where Tcontroller : Controller
{
    // based on Microsoft.Web.Mvc.dll LinkBuilder
    return LinkBuilder.BuildUrlFromExpression<Tcontroller>(helper.RequestContext, helper.RouteCollection, action);
}

The only problem with this is the reference to the Microsoft.Web.Mvc dll, but the gain in readability was worth it.

Problem: it does not work anymore; it returns null whatever the parameters.

Questions:
- Is there a better way now to build links from an expression? (Yes, I tried to google it without success.)
- Is there a trick to make the former LinkBuilder.BuildUrlFromExpression work? I tried to recompile it in C# 4.0, but the problem is that it implies working on my own compiled version of System.Web.Mvc, which is not an option.

I'm currently trying to migrate to MVC 2 but I still have issues... Waiting for your suggestions...
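One hedged workaround, assuming the MVC 2 futures assembly still ships Microsoft.Web.Mvc.Internal.ExpressionHelper (worth verifying against your build), is to extract the route values from the expression yourself and let UrlHelper do the rest; otherwise a plain Url.Action call is the safe fallback:

// Hedged sketch: rebuild the strongly typed helper on top of the MVC 2 futures
// ExpressionHelper; verify the type exists in your Microsoft.Web.Mvc build.
using System;
using System.Linq.Expressions;
using System.Web.Mvc;
using System.Web.Routing;
using Microsoft.Web.Mvc.Internal;

public static class UrlHelperExtensions
{
    public static string To<TController>(this UrlHelper helper,
        Expression<Action<TController>> action) where TController : Controller
    {
        RouteValueDictionary values = ExpressionHelper.GetRouteValuesFromExpression(action);
        return helper.RouteUrl(values);
    }
}

// Plain fallback without the futures assembly:
// string url = Url.Action("Page", "Sample", new { parameter1 = ..., parameter2 = ... });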

    Read the article

  • IIS7 Mixed Mode Authentication

    - by drachenstern
We're getting ready to start migrating some of our IIS6 sites to IIS7, and the application currently uses Forms authentication. We have started getting requests from various sites to use Windows authentication for the users. While this is easy enough to implement (and I've shown internally that there is no issue with the app, as expected), the question then is how to keep Forms authentication for when Integrated Windows doesn't work. I've seen several walkthroughs on how to configure this on IIS6, and I could do the same thing on IIS7, but then I have to turn on classic-mode processing. Any solution should also be portable back to IIS6, if possible, to keep the build tree simple.

So what are my options? Do I set up the app with Integrated Windows authentication in IIS7 and Forms auth in the web.config, and redirect 401 errors to an "error page" allowing them to login using forms, then back to the regular app? (A sketch of one such pattern is below.) The cases where Forms is likely to be needed are going to be reserved for contract workers, our support staff, and someone needing to access the app from their Extranet. So primarily it's for our staff to log in to check functionality and confirm bug reports. I suggested that for our support staff to work, we just need a Windows login that will always be live, and then we'll enforce local responsibility on who can log in to the site, but I'm told we would do better to have Forms authentication. Any thoughts? I can post some of the links of the articles I've already read through if that would help the forum better narrow my needs. Many thanks.

tl;dr: How to do mixed-mode authentication (Forms, Windows) in IIS7 without changing to the classic pipeline, and still be able to use the build in IIS6 if possible.
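For what it's worth, one common pattern is sketched below: keep Forms on the application and enable Windows auth only on a dedicated endpoint that issues the Forms ticket. The paths are invented, and on IIS7 the <security> sections must be unlocked in applicationHost.config before they can be set at location level:

<!-- Hedged sketch: Forms auth app-wide; Windows auth only on /WinLogin,
     whose code-behind issues the Forms ticket and redirects back. -->
<configuration>
  <system.web>
    <authentication mode="Forms">
      <forms loginUrl="~/Login.aspx" />
    </authentication>
  </system.web>
  <location path="WinLogin">
    <system.webServer>
      <security>
        <authentication>
          <windowsAuthentication enabled="true" />
          <anonymousAuthentication enabled="false" />
        </authentication>
      </security>
    </system.webServer>
  </location>
</configuration>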

    Read the article

  • Negative ItemCount in SharePoint Document Library

    - by ccomet
What can be done about negative numbers in library item counts? ItemCount is a read-only property; what are you supposed to do when it is drastically incorrect?

Earlier last week, I was doing some testing involving the copying and moving of files and folders from one document library to another. I was transferring the items from our actual document library to a sandbox "Test" library that I used to run all sorts of object model and workflow testing in before migrating to the public lists and libraries. I noticed that with files, things worked correctly, but when I copied a folder that had a file inside it (using SPFolder.CopyTo()), the item count for the test library did not actually update. Since this testing was mostly playing around, I paid it little heed.

Today I was back in the test library to test a different workflow (regarding PDF conversion). While I was there, I decided to delete the folder I left last week since I didn't need it anymore. And that's when I saw the item count for the list drop to -1 in the All Site Content view. When I deleted the new PDF I had just uploaded, it then dropped to -2! I even checked with the object model... getting an instance of the library, I checked the ItemCount property... lo and behold, it was also -2 (a diagnostic sketch is below).

Is there any process which runs in the background, kinda like the one that cleans up workflow history, which will correct this kind of issue? Or is a programmer expected to keep watch for this kind of situation and come up with calculations to compensate the "count penalty", as it were?
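There is no supported setter for ItemCount, but the drift can at least be detected by comparing the cached value against a real enumeration — a hedged console sketch (the site URL is a placeholder; folders are counted separately because ItemCount includes them while Items does not):

// Hedged diagnostic sketch: report lists whose cached ItemCount disagrees
// with an actual enumeration of items plus folders.
using System;
using Microsoft.SharePoint;

class ItemCountCheck
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://server/sites/test"))
        using (SPWeb web = site.OpenWeb())
        {
            foreach (SPList list in web.Lists)
            {
                int actual = list.Items.Count + list.Folders.Count;
                if (list.ItemCount != actual)
                    Console.WriteLine("{0}: cached={1}, actual={2}",
                        list.Title, list.ItemCount, actual);
            }
        }
    }
}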

    Read the article

  • How to add custom SOAP-Header element to the generated WSDL in Spring-WS

    - by Petr Macek
Hi, we are migrating from WebLogic web services to Spring-WS (1.5.X). There is currently one issue we are facing: we need to pass a context object (on WLS it is passed as a SOAP-Header element) to other services that are still running on WLS from the Spring-WS-powered service. The header element is still formulated on the client side, and the newly created WS (Spring-WS) should just pass it on to the other services. I can imagine how the custom element would be passed: override the doWithMessage(WebServiceMessage message) method (a sketch is below)... Is there a way to generate the WSDL with the help of DefaultWsdl11Definition so that it contains that custom header element? See the example:

<wsdl:operation name="GetSomeInformation">
  <soap:operation soapAction="http://www.dummyservice.com/InformationService/GetSomeInformation" />
  <wsdl:input>
    <soap:body use="literal" />
    <soap:header message="ctx:ServiceContextMessage" part="serviceContext" use="literal" />
  </wsdl:input>
  <wsdl:output>
    <soap:body use="literal" />
  </wsdl:output>
  <wsdl:fault name="Error">
    <soap:fault name="Error" use="literal" />
  </wsdl:fault>
</wsdl:operation>

Thanks for help
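For the pass-through side, a hedged Spring-WS 1.5-era sketch of the doWithMessage approach mentioned above — the serialized context payload is invented, and the callback simply copies the already-formulated header element into the outgoing request:

// Hedged sketch: copy a pre-built context header into the outgoing request.
import java.io.StringReader;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamSource;
import org.springframework.ws.WebServiceMessage;
import org.springframework.ws.client.core.WebServiceMessageCallback;
import org.springframework.ws.soap.SoapHeader;
import org.springframework.ws.soap.SoapMessage;

public class ServiceContextCallback implements WebServiceMessageCallback {

    private final String contextXml; // serialized ServiceContext from the caller

    public ServiceContextCallback(String contextXml) {
        this.contextXml = contextXml;
    }

    public void doWithMessage(WebServiceMessage message) throws Exception {
        SoapHeader header = ((SoapMessage) message).getSoapHeader();
        // Stream the pre-formulated header element into the outgoing SOAP header
        TransformerFactory.newInstance().newTransformer().transform(
            new StreamSource(new StringReader(contextXml)), header.getResult());
    }
}

As for the generated WSDL: to my knowledge DefaultWsdl11Definition only builds soap:body bindings, so the usual route for a contract like the one above is to hand-edit the WSDL and serve it statically (e.g. via SimpleWsdl11Definition).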

    Read the article

< Previous Page | 32 33 34 35 36 37 38 39 40 41 42 43  | Next Page >