Search Results

Search found 24784 results on 992 pages for 'process integration packs'.

Page 26 of 992

  • SQL Server 2008 and 2008 R2 Integration Services - Managing Local Processes Using Script Task

    SQL Server 2008 R2 Integration Services includes a number of predefined tasks that implement common administrative actions to help with data extraction, transformation and loading (ETL). While in a majority of cases they are sufficient to deliver required functionality, there might be situations where an extra level of flexibility is desired.

    Read the article

  • ZionTech blogging about integration of OUD and EUS

    - by Sylvain Duloutre
    Here is a good post about OUD and EUS integration: http://ziontech.com/blog/integrating-oud-and-eus/
    The OUD/EUS-related posts available as of now are:
    http://ziontech.com/blog/preparing_database/
    http://ziontech.com/blog/integrating-oud-eus-users-groups-mapping/
    http://ziontech.com/blog/integrating-oud-eus-oudproxy/
    http://ziontech.com/blog/integrating-oud-eus-troubleshooting/

    Read the article

  • SQL Server 2012 Integration Services- Using Environments in Package Execution

    SQL Server 2012 Integration Services offers several different options for deploying and storing SSIS packages along with their associated projects, two of which correspond directly to the two deployment models available in the SQL Server Data Tools console. Marcin Policht presents one of these methods, which deals with packages deployed using the Project Deployment Model and leverages the newly introduced Environments.

    Read the article

  • SQL Server 2012 Integration Services - Implementing Package Security using Access Control

    SQL Server 2012 Integration Services offers a wide range of powerful features that allow you to streamline and automate tasks involving data extraction, transformation, and loading. However, incorporating these features into your existing business intelligence framework frequently necessitates additional security measures to ensure that the data being processed remains protected from unauthorized access.

    Read the article

  • Application Integration Architecture – Bringing It All Together - Part 1

    Oracle's Application Integration Architecture (AIA) provides Oracle customers, prospects, and partners with the capability to more easily integrate and orchestrate information and transactions across multiple systems. Learn more about Oracle AIA and get an update on new and planned integrations from Jose Lazares, Vice President, Oracle Applications Development.

    Read the article

  • Application Integration Architecture – Bringing It All Together - Part 2

    Oracle's Application Integration Architecture (AIA) provides Oracle customers, prospects, and partners with the capability to more easily integrate and orchestrate information and transactions across multiple systems. Learn more about Oracle AIA and get an update on new and planned integrations from Jose Lazares, Vice President, Oracle Applications Development.

    Read the article

  • SQL Server 2012 Integration Services - Project Deployment

    SQL Server 2012 Integration Services parameters introduce a new way of dealing with package development, deployment, and execution. In order to truly appreciate their relevance, it is necessary to take a look at the new Project Deployment Model.

    Read the article

  • Data Mining: Part 14 Export DMX results with Integration Services

    In this chapter we will explain how to work with Data Mining models and the Integration Services. Specifically, we will talk about the Data Mining Query Task in SSIS.

    Read the article

  • Running a Java program with a .dll from Adobe AIR's native process

    - by Donny
    I would like to be able to operate a scanner from my AIR application. Since there's no support for this natively, I'm trying to use the NativeProcess class to start a jar file that can run the scanner. The Java code uses the JTwain library to operate the scanner. The Java application runs fine by itself, and the AIR application can start and communicate with the Java application. The problem seems to be that any time I attempt to use a function from JTwain (which relies on JTwain.dll), the application dies IF AIR STARTED IT. I'm not sure if there's some limit on referencing dll files from the native process or what. I've included my code below.

    Java code (fragment; "text" and "in" are declared elsewhere):

        while (true) {
            try {
                System.out.println("Start");
                text = in.readLine();
                Source source = SourceManager.instance().getCurrentSource();
                System.out.println("Java says: " + text);
            } catch (IOException e) {
                System.err.println("Exception while reading the input. " + e);
            } catch (Exception e) {
                System.out.println("Other exception occurred: " + e.toString());
            } finally {
            }
        }

    AIR application (script and markup fragment):

        import mx.events.FlexEvent;

        private var nativeProcess:NativeProcess;
        private var npInfo:NativeProcessStartupInfo;
        private var processBuffer:ByteArray;
        private var bLength:int = 0;

        protected function windowedapplication1_applicationCompleteHandler(event:FlexEvent):void
        {
            var arg:Vector.<String> = new Vector.<String>;
            arg.push("-jar");
            arg.push(File.applicationDirectory.resolvePath("Hello2.jar").nativePath);

            processBuffer = new ByteArray;
            npInfo = new NativeProcessStartupInfo;
            npInfo.executable = new File("C:/Program Files/Java/jre6/bin/javaw.exe");
            npInfo.arguments = arg;

            nativeProcess = new NativeProcess;
            nativeProcess.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onStandardOutputData);
            nativeProcess.start(npInfo);
        }

        private function onStandardOutputData(e:ProgressEvent):void
        {
            tArea.text += nativeProcess.standardOutput.readUTFBytes(nativeProcess.standardOutput.bytesAvailable);
        }

        protected function button1_clickHandler(event:MouseEvent):void
        {
            tArea.text += 'AIR app: ' + tInput.text + '\n';
            nativeProcess.standardInput.writeMultiByte(tInput.text + "\n", 'utf-8');
            tInput.text = '';
        }

        protected function windowedapplication1_closeHandler(event:Event):void
        {
            nativeProcess.closeInput();
        }
        ]]>
        </fx:Script>

        <s:Button label="Send" x="221" y="11" click="button1_clickHandler(event)"/>
        <s:TextInput id="tInput" x="10" y="10" width="203"/>
        <s:TextArea id="tArea" x="10" width="282" height="88" top="40"/>

    I would love some explanation of why this is dying. I've done enough testing to know that the line that kills it is SourceManager.instance().getCurrentSource(). I would love any suggestions. Thanks.
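
    Not from the original question, but one assumption worth checking: when AIR launches javaw.exe, the child JVM's working directory and java.library.path can differ from a normal console launch, so JTwain.dll may not be found or loadable even though the same jar works when started by hand. A minimal Java sketch for diagnosing that, with C:/scanner/JTwain.dll as a placeholder path:

        // Hypothetical diagnostic for a JVM launched by AIR's NativeProcess:
        // print where the JVM looks for native libraries, then try to load the
        // DLL from an absolute path so the working directory no longer matters.
        public class NativeLibCheck {
            public static void main(String[] args) {
                System.out.println("user.dir          = " + System.getProperty("user.dir"));
                System.out.println("java.library.path = " + System.getProperty("java.library.path"));
                try {
                    // Placeholder path; point this at the real location of JTwain.dll.
                    System.load("C:/scanner/JTwain.dll");
                    System.out.println("DLL loaded successfully");
                } catch (UnsatisfiedLinkError e) {
                    // If this fails only when started from AIR, the launch environment
                    // (working directory, library path, 32/64-bit JVM) is the likely culprit.
                    System.err.println("Failed to load DLL: " + e.getMessage());
                }
            }
        }

    Running this jar once from a console and once from the AIR application, then comparing the output, is a quick way to confirm or rule out an environment difference.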

    Read the article

  • Oracle Process Accelerators Release 11.1.1.7.0 Now Available

    - by Cesare Rotundo
    The new Oracle Process Accelerators (PA) Release (11.1.1.7.0) delivers key functionality in many dimensions: new PAs across industries, new functionality in preexisting PAs, and an improved installation process. All PAs in Release 11.1.1.7.0 run on the latest Oracle BPM Suite and SOA Suite, 11.1.1.7. New PAs include:

    - Financial Reports Approval (FRA): an end-to-end solution for an efficient and controlled financial report review and approval process, enabling financial analysts and decision makers to collaborate around Excel.
    - Electronic Forms Management (EFM): supports the process of designing and exposing eForms, with the ability to quickly design eForms, associate approval processes with them, and then let users select, fill, and submit eForms for approval.
    - Mobile Data Offloading (MDO): enables telecommunications providers to reduce congestion on cellular networks and lower the cost of operations by using Oracle Event Processing (OEP) and BAM to switch devices from cellular networks to Wi-Fi.

    By adopting the latest PA release, customers will also be able to better identify and kick-start smart extensions of their processes where business steps are supported by Apps: PA 11.1.1.7.0 includes out-of-the-box business process extension scenarios with Oracle Apps such as Siebel (FSLO) and PeopleSoft (EOB).

    Read the article

  • How to avoid “The web server process that was being debugged has been terminated by IIS”

    - by ybbest
    Problem: When debugging ASP.NET by attaching to the w3wp.exe process, you will often encounter the following error message: "The web server process that was being debugged has been terminated by IIS."

    Analysis: While the debugger holds the worker process paused (for example, at a breakpoint), the process stops answering IIS's health-check pings. IIS therefore assumes the worker process has stopped responding and terminates it.

    Solution:
    1. Open IIS Manager.
    2. Click Application Pools, select the application pool associated with the site, and click Advanced Settings.
    3. In Advanced Settings, set the Ping Enabled property from True to False.

    Now reattach to the process from Visual Studio; you should no longer get the error message.

    References: msdn

    Read the article

  • Should integration testing of DAOs be done in an application server?

    - by HDave
    I have a three-tier application under development and am creating integration tests for the DAOs in the persistence layer. When the application runs in WebSphere or JBoss, I expect to use the connection pooling and transaction manager of those application servers. When the application runs in Tomcat or Jetty, we'll be using C3P0 for pooling and Atomikos for transactions. Because of these different subsystems, should the DAOs be tested in a fully configured application server environment, or should we handle those concerns when integration testing the service layer? Currently we plan on setting up a simple JDBC data source with non-JTA (i.e. resource-local) transactions for DAO integration testing, so no application server is involved, but this leaves me wondering about environmental problems we won't uncover.
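
    Not part of the original question, but a minimal sketch of the resource-local arrangement described above, assuming JUnit 4, C3P0, and an in-memory H2 database are on the test classpath (the driver, URL, and credentials are placeholders):

        import java.sql.Connection;
        import java.sql.ResultSet;
        import java.sql.Statement;
        import com.mchange.v2.c3p0.ComboPooledDataSource;
        import org.junit.AfterClass;
        import org.junit.BeforeClass;
        import org.junit.Test;
        import static org.junit.Assert.assertTrue;

        public class DaoIntegrationTestBase {

            private static ComboPooledDataSource dataSource;

            @BeforeClass
            public static void initDataSource() throws Exception {
                // Plain C3P0 pooling with resource-local (connection-level) transactions;
                // no JTA transaction manager and no application server involved.
                dataSource = new ComboPooledDataSource();
                dataSource.setDriverClass("org.h2.Driver");                       // placeholder test driver
                dataSource.setJdbcUrl("jdbc:h2:mem:daotests;DB_CLOSE_DELAY=-1");  // placeholder test database
                dataSource.setUser("sa");
                dataSource.setPassword("");
            }

            @Test
            public void dataSourceHandsOutWorkingConnections() throws Exception {
                try (Connection c = dataSource.getConnection();
                     Statement st = c.createStatement();
                     ResultSet rs = st.executeQuery("SELECT 1")) {
                    assertTrue(rs.next());
                }
                // Real DAO tests would be built on top of this data source; what such a
                // setup cannot exercise is the container's own pooling and JTA behaviour.
            }

            @AfterClass
            public static void closeDataSource() {
                dataSource.close();
            }
        }

    The last comment is the crux of the question: this style of test is fast and portable, but container-specific concerns such as JTA transaction demarcation still need to be covered somewhere, for example in service-layer integration tests against the real server.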

    Read the article

  • Portal And Content - Content Integration - Best Practices

    - by Stefan Krantz
    Lately we have seen an increase in projects that fail to achieve either user-friendly content integration or satisfactory performance. Our intention is to close any knowledge gap that our previous posts might have left you with, so this post will repeat some recommendations or refer back to older, useful posts. Moreover, this post will help you understand, from the ground up, how to design, architect, and implement business-enabled, responsive, and performant portals with complex requirements for business-centric information publishing.

    Design the Information Model

    The key to successful portal deployments is information modeling. It is essential to understand the use case you are designing for, so here is a set of questions to ask yourself or your customer:

    - Who will own the content, IT or Business? Answer: Business
    - Who will publish the content, IT or Business? Answer: Business
    - Will there be multiple publishers? Answer: Yes
    - Are the publishers computer scientists? Answer: No
    - How often does the information change: daily, weekly, monthly? Answer: Daily, weekly

    If your answers match at least two of these, we strongly recommend you design your content with the following principles:

    - Divide your pages into logical sections, where each section is marked with its purpose.
    - Assign capabilities to each section: does it contain text, images, formatting, and/or is it static and populated through other contextual information?
    - Select the editor/design element type for each section:
      - WYSIWYG: rich text
      - Plain Text: non-formatted text
      - Image: image object
      - Static List: static list of formatted information
      - Dynamic Data List: assembled information from multiple data files through a CMIS query

    (The original post includes example design maps illustrating the result of such a design exercise.)

    Based on the required element types identified in the design, you can now design a data model in WebCenter Content - Site Studio by creating a Region Definition structure matching your design requirements. For more information on how to create a Region Definition, see the Region Definition post (see instruction 7 for details). Each region definition can then be used to instantiate data files; a data file holds the actual data for each element in the region definition. Another way to see this is to view the region definition as an extension to the metadata model in WebCenter Content for each data file item.

    Design content templates

    With a solid, dependable information model we can proceed to template creation and page design. This phase focuses on how to place the content sections from the region definition on the page via a Content Presenter template. Remember, by creating Content Presenter templates you leverage the latest and most integrated technology WebCenter has to offer. This phase is much easier since you already have the information model and design wire-frames to base the logic on; however, there are still a few considerations to pay attention to:

    - Base the template on ADF and make only necessary exceptions to markup when required.
    - Leverage ADF design components for tabs, accordions, and other similar components, so the design in the content-published areas complies with other design areas based on custom ADF task flows.
    - There is no performance impact when using metadata-based or region-definition-based data.
    - All data, regardless of type (metadata or XML data), can be accessed via the Content Presenter node. Examples of how to access data:
      - Access a metadata property from a document: #{node.propertyMap['myProp'].value} (myProp can be, for instance, dDocName, dDocTitle, xComments, or any other available metadata)
      - Access element data from a data file's XML: #{node.propertyMap['[Region Definition Name]:[Element name]'].asTextHtml} (Region Definition Name is the region definition that the current data file instantiates; Element name is the element value you want to grab from the data file)

    I recommend reading the following useful posts on the content template topic: CMIS queries and template creation (see instruction 9 for details) and Static List template rendering. For more information on templates: Single Item Content Template, Multi Item Content Template, Expression Language.

    Internationalization Considerations

    When integrating content assets via the Content Presenter, you probably understand by now that the content item/data file is wired to the page. What is also pretty common at this stage is that the content item/data file only supports one language, since it is not practical or business-friendly to mix languages in a complex structure. You are therefore left with a very common dilemma: building a complete new portal for each locale, which is not a good option! However, with a little information modeling and a clear naming convention this can be addressed. Basically, make sure that all content items/data files are named with a predictable convention, such as "Content1_EN" for the English rendition and "Content1_ES" for the Spanish rendition. This way, through simple, non-complex customizations, you will be able to dynamically switch the actual content item/data file just before rendering (a small illustrative sketch of this lookup follows this excerpt). By following the proposed approach you not only enable a simple mechanism for internationalized content, you also preserve the Content Presenter's support for business-accessible, run-time publishing of information on existing and new pages. I recommend reading the following useful post on internationalization topics: Internationalize with Content Presenter.

    Integrate with Review & Approval processes

    Today the review and approval functionality and configuration is based on WebCenter Content Criteria Workflows. Criteria Workflows use the metadata of the checked-in document to evaluate whether the document is under any review/approval process. For instance, if a Criteria Workflow is configured to trigger on documents with Version "2" or higher and Content Type "Instructions", any matching content item version will, on check-in, enter the workflow before being released for general access. A few things to consider when configuring Criteria Workflows:

    - Make sure not to trigger on version one for content items that are data files. If you trigger on version 1, you will not only approve an empty document, you will also have a Content Presenter pointing to a non-existing document, since the document only becomes available after successful completion of the workflow.
    - Approval workflows sometimes require more complex criteria; if that is the case, the recommendation is that the metadata triggering such criteria is populated automatically, which can be achieved through many approaches, including Content Profiles.
    - Criteria Workflows are configured and managed in the WebCenter Content Administration Applets, where you can configure one or more workflows.

    When you have configured Criteria Workflows, the Content Presenter supports editors with the approval process directly inline in the "Contribution mode" of the portal. In addition to approve/reject actions and task details, the Content Presenter natively lets the user view the current and future version of the change he/she is approving. (The original post includes a screenshot example.)

    Architectural recommendation

    - To support review & approval processes, minimize the number of data files per page.
    - Each CMIS query can consume significant time depending on its complexity; minimize the number of CMIS queries per page.
    - Use Content Presenter templates based on ADF; this minimizes design considerations and optimizes the use of caching.
    - Implement the page in as few data files as possible; this simplifies the publishing process, increases performance, and simplifies the release process.
    - A named data file (node), or a list of named nodes, integrated into pages increases performance versus querying for data.
    - A named data file (node), or a list of named nodes, integrated into pages enables business-centric page creation and publishing and reduces the need for IT department interaction.

    Summary

    Just because one architectural decision solves a business problem doesn't mean it's the right one; when designing portals, all architecture has to be in harmony, with decisions not working against each other. For instance, the most technically complex solution is not always the best, since it will most likely defeat business accessibility, performance, or both. The best approach is therefore to first design for simplicity, so that even a non-technical user can operate the result; then consider the performance impact; and finally look at the technology challenges these choices bring, working around them first with out-of-the-box features and only then designing and developing functions to complement the shortcomings.
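
    A small illustrative sketch of the locale-suffix naming convention from the "Internationalization Considerations" section above; the base name, supported suffixes, and fallback are assumptions for illustration, not values from the article:

        import java.util.Arrays;
        import java.util.List;
        import java.util.Locale;

        public class LocalizedContentResolver {

            // Hypothetical list of renditions that actually exist in WebCenter Content.
            private static final List<String> SUPPORTED_SUFFIXES = Arrays.asList("EN", "ES");
            private static final String DEFAULT_SUFFIX = "EN";

            // Maps a base data-file name ("Content1") and the user's locale to the
            // predictable rendition name ("Content1_EN", "Content1_ES", ...) described
            // in the post, falling back to the default rendition when no match exists.
            public static String resolve(String baseName, Locale locale) {
                String suffix = locale.getLanguage().toUpperCase(Locale.ROOT);
                if (!SUPPORTED_SUFFIXES.contains(suffix)) {
                    suffix = DEFAULT_SUFFIX;
                }
                return baseName + "_" + suffix;
            }

            public static void main(String[] args) {
                System.out.println(resolve("Content1", new Locale("es"))); // Content1_ES
                System.out.println(resolve("Content1", Locale.FRENCH));    // Content1_EN (fallback)
            }
        }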

    Read the article

  • OAM OVD integration - Error encountered during performance test: "LDAP response read timed out, timeout used:2000ms"

    - by siddhartha_sinha
    While working on an OAM-OVD integration for one of my clients, I was involved in performance testing of the products, during which I encountered OAM authentication failures while talking to OVD under heavy load. The OAM logs revealed the following:

        oracle.security.am.common.policy.common.response.ResponseException: oracle.security.am.engines.common.identity.provider.exceptions.IdentityProviderException: OAMSSA-20012: Exception in getting user attributes for user : dummy_user1, idstore MyIdentityStore with exception javax.naming.NamingException: LDAP response read timed out, timeout used:2000ms.; remaining name 'ou=people,dc=oracle,dc=com' at oracle.security.am.common.policy.common.response.IdentityValueProvider.getUserAttribute(IdentityValueProvider.java:271) ...

    During the authentication and authorization process, OAM complains that the LDAP repository is taking too long to return user attributes. The default value is 2 seconds, as can be seen from the exception ("2000ms"). While troubleshooting the issue, it was found that the LDAP read timeout can be increased in oam-config.xml. For reference, the attribute to add in the oam-config.xml file is:

        <Setting Name="LdapReadTimeout" Type="xsd:string">2000</Setting>

    However, it is not recommended to increase the timeout unless it is absolutely necessary; first ensure that the back-end directory servers are working fine. Instead, I took the path of tuning OVD in the following manner (a small standalone JNDI sketch of the read-timeout property itself follows this excerpt):

    1) Navigate to the ORACLE_INSTANCE/config/OPMN/opmn folder and edit opmn.xml. Search for <data id="java-options" ...> and edit the contents of the file with the highlighted items:

        <category id="start-options">
          <data id="java-bin" value="$ORACLE_HOME/jdk/bin/java"/>
          <data id="java-options" value="-server -Xms1024m -Xmx1024m -Dvde.soTimeoutBackend=0 -Didm.oracle.home=$ORACLE_HOME -Dcommon.components.home=$ORACLE_HOME/../oracle_common -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:/opt/bea/Middleware/asinst_1/diagnostics/logs/OVD/ovd1/ovdGClog.log -XX:+UseConcMarkSweepGC -Doracle.security.jps.config=$ORACLE_INSTANCE/config/JPS/jps-config-jse.xml"/>
          <data id="java-classpath" value="$ORACLE_HOME/ovd/jlib/vde.jar$:$ORACLE_HOME/jdbc/lib/ojdbc6.jar"/>
        </category>
        </module-data>
        <stop timeout="120"/>
        <ping interval="60"/>
        </process-type>

    When the system is busy, a ping from the Oracle Process Manager and Notification Server (OPMN) to Oracle Virtual Directory may fail. As a result, OPMN will restart Oracle Virtual Directory after 20 seconds (the default ping interval). To avoid this, consider increasing the ping interval to 60 seconds or more.

    2) Navigate to the ORACLE_INSTANCE/config/OVD/ovd1 folder. Open the listeners.os_xml file and perform the following changes:
    - Search for <ldap id="Ldap Endpoint"...> and point the cursor to that line.
    - Change the threads count to 200.
    - Change anonymous bind to Deny.
    - Change workQueueCapacity to 8096.
    - Add a new parameter <useNIO> and set its value to false, viz: <useNIO>false</useNIO>

    Snippet:

        <ldap version="8" id="LDAP Endpoint">
          ...
          <socketOptions>
            <backlog>128</backlog>
            <reuseAddress>false</reuseAddress>
            <keepAlive>false</keepAlive>
            <tcpNoDelay>true</tcpNoDelay>
            <readTimeout>0</readTimeout>
          </socketOptions>
          <useNIO>false</useNIO>
        </ldap>

    Restart the OVD server. For more information on OVD tune-up, refer to http://docs.oracle.com/cd/E25054_01/core.1111/e10108/ovd.htm.

    Please note: a few patches were also released on the OAM side for performance tuning. Will provide the updates shortly!
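
    For context (not from the original post): the "LDAP response read timed out, timeout used:2000ms" message is produced by the JNDI LDAP provider, whose read timeout is controlled by the com.sun.jndi.ldap.read.timeout environment property. A minimal standalone Java sketch showing the property in isolation; the host, port, and timeout values are placeholders:

        import java.util.Hashtable;
        import javax.naming.Context;
        import javax.naming.NamingException;
        import javax.naming.directory.DirContext;
        import javax.naming.directory.InitialDirContext;

        public class LdapReadTimeoutDemo {
            public static void main(String[] args) {
                Hashtable<String, String> env = new Hashtable<>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(Context.PROVIDER_URL, "ldap://ovd-host:6501"); // placeholder OVD endpoint
                // Read timeout in milliseconds; if the server takes longer than this to
                // answer, the provider throws "LDAP response read timed out".
                env.put("com.sun.jndi.ldap.read.timeout", "2000");
                env.put("com.sun.jndi.ldap.connect.timeout", "2000");
                try {
                    DirContext ctx = new InitialDirContext(env);
                    System.out.println("Connected as: " + ctx.getNameInNamespace());
                    ctx.close();
                } catch (NamingException e) {
                    System.err.println("LDAP operation failed: " + e);
                }
            }
        }

    This is only meant to show where the 2000ms default in the error message comes from; in OAM itself the value is governed by the LdapReadTimeout setting in oam-config.xml, as described above.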

    Read the article

  • VS.NET solution built differently on build server

    - by slolife
    I have a VS.NET solution with two projects, ProjectWeb and ProjectLibrary. PW depends on PL, so I have a VS.NET project reference to PL in PW. That works all well and good on my dev box, but when it all gets to the build server, I have two different build projects, one for PL and one for PW. I'd like to build PL and copy the binaries somewhere. Then I'd like to build PW, and it only, using the binaries from the previous PL build. But will that work, since the PW VS.NET project references a project that doesn't exist when I build PW alone on the build server? How can I set this up? For specifics, I am using CC.NET and NAnt, but I have other projects that use Hudson and straight MSBuild.

    Read the article
