Search Results

Search found 3738 results on 150 pages for 'communications suite 7'.

Page 12/150

  • Webcast Replay Available: Scrambling Sensitive Data in E-Business Suite Release 12 Cloned Environments

    - by BillSawyer
    I am pleased to release the replay and presentation for the ATG Live Webcast: Scrambling Sensitive Data in EBS 12 Cloned Environments (Presentation). Eric Bing, Senior Director; Jagan Athreya, Enterprise Manager Product Management; and Elke Phelps, Senior Principal Product Manager, discussed the Oracle E-Business Suite Template for Data Masking Pack and how it can be used in situations where confidential or regulated data needs to be shared with non-production users who need access to some of the original data, but not necessarily every table. Examples of non-production users include internal application developers or external business partners such as offshore testing companies, suppliers, or customers. (July 2012) Finding other recorded ATG webcasts: The catalog of ATG Live Webcast replays, presentations, and all ATG training materials is available in this blog's Webcasts and Training section.

    Read the article

  • SOA Suite demo pod at OOW 2012

    - by Simone Geib
    Visit us in the exhibition hall at Oracle OpenWorld from Monday, October 1 through Wednesday, October 3. You'll have the opportunity to meet our engineering team and product managers, learn what SOA Suite is about, and see all the cool stuff you can do with it. Come to get a general demo, ask specific questions, give feedback, or just have a discussion with the Oracle SOA team. You can find us at Moscone South, Right - S-229. And don't forget to check out the Focus on SOA and BPM document for an overview of all SOA and BPM sessions.

    Read the article

  • How to Integrate Cloud Applications with Oracle SOA Suite

    - by Bruce Tierney
    Having seen a preview of the slides for this upcoming Oracle OpenWorld session (Tuesday, October 2 at 11:45 at Moscone West 3003), I'm definitely looking forward to this deep-dive view into cloud integration. Oracle's Senior Director of Product Management Rajesh Raheja will cover what's involved in cloud integration and provide technical solutions for integrating with Oracle cloud applications such as Oracle Fusion and RightNow, as well as third-party applications like Salesforce. [Screenshot from the draft presentation] Also presenting will be Geeta Pyne, Director of Middleware at BMC, who will discuss how they have used Oracle SOA Suite for cloud integration.

    Read the article

  • Installing Oracle 11g SOA Suite?

    - by asantaga
    Are you working for an SI like Accenture or Capgemini? Are you a sales consultant who needs to install software quickly? Well, if you're reading this, you probably are. Anyway, if you're like me and many other techies, reading manuals doesn't come naturally: we download the software, try to install it, and then ultimately fail, or take a lot longer than we should. However, never fear, help is here! For Oracle 11g SOA Suite (PS3), a good friend of mine, a SOA 11g PM in the States, has written a quick-start document, and it's on OTN. Although the document is PS3-focused, apart from the download URLs it is equally applicable to PS4. The document can be found at this link.

    Read the article

  • Building iPhone Interfaces for Oracle E-Business Suite

    - by Shay Shmeltzer
    Over on his blog, Juan has been showing how simple it is to develop a rich web user interface with ADF accessing Oracle E-Business Suite data, and even how to expose it on an iPad. In this entry I'm starting from his sample application and showing how easy it is to build an interface that will look great on an iPhone (or other mobile devices) using Oracle ADF Mobile Browser. For those of you who are using ADF but have never tried ADF Mobile Browser - you'll find that the development experience is quite familiar and similar to your normal web application development. The latest version of JDeveloper (11.1.2.1), which I'm using in this demo, has a built-in skin that will give your application the native iOS look and feel. In the demo I achieve this by setting the styleClass of a tr:panelHeader component to af_m_toolbar. For more on this styling, read the doc. Check out this quick demo:
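    For illustration, the markup change might look like the following (a hedged sketch; the header text and nested content are placeholders, not taken from the sample application):

        <tr:panelHeader text="Employees" styleClass="af_m_toolbar">
            <!-- page content rendered below the iOS-styled toolbar -->
        </tr:panelHeader>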

    Read the article

  • MoneyGram Shares their Oracle E-Business Suite Release Upgrade Story from 11i to 12

    MoneyGram International is a leading global provider of money transfer services. MoneyGram helps consumers send money around the world, with funds arriving at available agent locations in as little as 10 minutes. Its global network comprises 203,000 agent locations in more than 190 countries and territories. With the help of Oracle Partner Infosys, MoneyGram was able to upgrade to Oracle E-Business Suite Release 12 in record time. As a result, MoneyGram has been able to reduce accounting errors and close its books more efficiently, ultimately saving time and money.

    Read the article

  • NUnit not running Suite tests

    - by Assaf Lavie
    I've created a test suite in NUnit that references several distinct unit test fixtures in various assemblies. I've pretty much used the example code from NUnit's docs:

        namespace NUnit.Tests
        {
            using System;
            using NUnit.Framework;
            using System.Collections;

            public class AllTests
            {
                [Suite]
                public static IEnumerable Suite
                {
                    get
                    {
                        ArrayList suite = new ArrayList();
                        suite.Add(new VisionMap.DotNet.Tests.ManagedInteropTest.DotNetUtilsTest());
                        return suite;
                    }
                }
            }
        }

    My goal is to add several tests to the list above so I can run them all in a batch. But when I try to load the DLL in NUnit's GUI I get this: [screenshot of the error]. What am I doing wrong? I'm using NUnit 2.5.0.9122.

    Read the article

  • SOA Suite Integration: Part 3: Loading files

    - by Anthony Shorten
    One of the most common scenarios in SOA integration is loading a file into the product from an external source. Oracle SOA Suite provides a File Adapter that can feed many file types into your BPEL process. For this example I will use the File Adapter to load a file of users and emails to update the user object within the Oracle Utilities Application Framework. Remember, you can repeat this process with other objects and other file types. Again, I am illustrating the ease of integration.

    The first thing is to create an empty BPEL process that will hold our flow. In Oracle JDeveloper this can be achieved by specifying the Define Service Later template (the other templates have predefined inputs and outputs, and in this case we want to specify those ourselves). So I will create a simpleFileLoad process to house our process.

    You will start with an empty canvas, so you first need to specify the load part of the process using the File Adapter. Select the File Adapter from the Component Palette under BPEL Services and drag and drop it onto the left-side Partner Links (left is input). You name the service; in this case I chose LoadFile. Press Next. We will define the interface as part of the wizard, so select Define from operation and schema (specified later). Press Next. We choose Read File to denote that we will read the file, and accept the default Operation Name of Read. Press Next.

    The next step is to tell the adapter the location of the files, how to process them, and what to do with them after they have been processed. I am using hardcoded locations in this example, but you can use logical locations as well. Press Next. I am now going to tell the adapter how to recognize the files I want to load. In my case I am using CSV files and, more importantly, I am telling the adapter to run the process for each record it encounters in the file. Press Next. Now I tell the adapter how often I want to poll for the files. I have taken the defaults. Press Next.

    At this stage I have no description of the format of the input, so I am going to invoke the Native Format Wizard, which will guide me through the process of creating the file input format. Clicking the purple cog icon starts the wizard. After an introduction screen (not shown), you specify the format of the input file. The File Adapter supports multiple format types; for this example I will use Delimited, as I am going to load a CSV file. Press Next. The best way for the wizard to work is with a sample. I have a sample file, and the wizard will ask how much of the file to use as a template. I will use the defaults. Note: if you are using a character set other than US-ASCII, this is the point at which you specify it. Press Next.

    The sample contains multiple instances of a single record type (the wizard supports complex types as well); we will use the appropriate setting for our file. Press Next. You have to specify the file element and the record element. These will be used by the wizard to translate the CSV data into an XML structure (this will make sense later). I am using LoadUsers as my file root element and UserRecord as my record root element. Press Next. As the file is CSV, the delimiter is ",", and I also specify that the end-of-line (EOL) indicator marks the end of a record. Press Next. Up until this point you have not given the columns their names. In my case my sample includes the column names in the first record. This is not always the case, but you can specify the names and formats of columns in this dialog (not shown). Press Next.

    The wizard now generates the schema for the input file. You can specify a name for the schema; I have used userupdate.xsd. We want to verify the schema, so press Test. You can test the schema by specifying an input sample and pressing the green play button. You will see the delimiters you specified earlier for the file and the records. Press OK to continue. A confirmation screen will be displayed showing you the location of the schema in your project. Press Finish to return to the File Adapter configuration. You will now see the schema and elements prepopulated from the wizard. Press Next. The File Adapter configuration is now complete. Press Finish.

    Now you need to receive the input from the LoadFile component, so place a Receive node in the BPEL process by dragging and dropping the Receive component from the Component Palette under BPEL Constructs onto the BPEL process. We link the Receive node with the LoadFile component by dragging the leftmost connect node of the Receive node to the LoadFile component. Once the link is established, you need to name the Receive node appropriately and, as in the previous post in this series, generate input variables for the BPEL process to hold the input records.

    You now need to add the product web service. The process is the same as described in the previous post in this series: you drop the Web Service BPEL service onto the right side of the process and fill in the details of the WSDL URL. You also have to add an Invoke node to call the service and generate the input and output variables for the call in the Invoke node.

    Now, to get the inputs from the file to the service, you have to use a Transform (you can use an Assign action, but a Transform action is more flexible). You drag and drop the Transform component from the Component Palette under Oracle Extensions and place it between the Receive and Invoke nodes. We name the Transform node Mapper File, associate the source of the mapping with the schema from the Receive node, and make the output the input variable of the Invoke node.

    We now build the transform. We first map the user and email attributes by dragging and dropping the elements from the left to the right. The reason we needed the transform is that we will be telling the AS-User service that we want to issue an update action. Remember, when we registered the service we used Read as the default; if we do not otherwise inform the service to use the Update action, it will use the Read action instead (which is not desired). To specify the update action you click on the transactionType node on the right and select Set Text to set the action. You need to specify a transactionType of UPD (for update). The mapping is now complete.

    The final BPEL process is ready for deployment. You then deploy the BPEL process to the server and test the service by simply dropping a file, matching the pattern/name you specified, into the directory you specified in the File Adapter. You will see each record as a separate instance entry in the Fusion Middleware Control console. You can now load files into the product, and you can repeat this process for each type of file to process. While this was a simple example, it illustrates how loading data can be achieved using SOA Suite in conjunction with our products.
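    To make the native-format step concrete, here is a hedged sketch of a sample input file and the XML the translation would produce from it under the element names used above (the column names user and email, and the exact spelling UserRecord, are assumptions for illustration):

        user,email
        jsmith,jsmith@example.com
        mjones,mjones@example.com

    which the File Adapter's native-format translation would render roughly as:

        <LoadUsers>
            <UserRecord><user>jsmith</user><email>jsmith@example.com</email></UserRecord>
            <UserRecord><user>mjones</user><email>mjones@example.com</email></UserRecord>
        </LoadUsers>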

    Read the article

  • Can I upgrade Photoshop CS6 to Adobe Creative Suite 6 at a discounted price?

    - by indifferentDrum
    I've had a look around on the official Adobe Shop website, but I couldn't find a clear answer for this. I am a freelance developer, and I recently purchased Adobe Photoshop CS6 through Adobe.com (small/medium business site, volume licensing). However, after downloading and installing it a couple of days ago, I realised that I may require the complete Adobe Creative Suite 6 instead. Does anyone know if it is possible to upgrade your purchase, rather than just the specific product you've bought previously? I guess I'm looking for something along the lines of 'Complete My Album', but for Adobe software, so that I don't end up with two copies of Photoshop. Thanks :)

    Read the article

  • How to set the service endpoint URI dynamically in SOA Suite 11gR1, by Sylvain Grosjean

    - by JuergenKress
    Use case: This example demonstrates how to get the URI of the backend service from a repository and how to set it dynamically on our partner link (dynamic partner link). Implementation steps:
    1. Create a DVM file
    2. Create a BPEL component
    3. Add the endPointURI variable and assign the URI
    4. Set the endpointURI property in the invoke activity (see the sketch below)

    1. Create a DVM file: In order to define our repository, we are going to use DVM (Domain Value Maps). For more explanation regarding DVM, you should read this documentation.

    2. Create a BPEL component: First you need to implement a simple BPEL process like this:
    - The AssignPayload is used to set the input variable of our invoke activity.
    - The AssignEndpointURI is used to dynamically set the endPointURI variable from our DVM repository.
    - The invoke activity calls the external service.
    Read the complete article here.

    SOA & BPM Partner Community: For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.
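    As a rough illustration of step 4, the invoke activity in the BPEL source would carry the property assignment along these lines (a hedged sketch of the Oracle BPEL bpelx property extension; the partner link, operation, and variable names are placeholders, not taken from the article):

        <invoke name="InvokeBackend" partnerLink="BackendService"
                operation="process"
                inputVariable="request" outputVariable="response">
            <!-- route this call to the URI held in the endPointURI string variable -->
            <bpelx:toProperty name="endpointURI" variable="endPointURI"/>
        </invoke>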

    Read the article

  • The Scoop: Oracle E-Business Suite Support on 64-bit Linux

    - by Terri Noyes
    This article addresses frequently asked questions about Oracle E-Business Suite (EBS) 64-bit Linux support.

    Q: Which 64-bit Linux operating systems are supported for EBS?
    A: Beginning with Release 12, we support the following 64-bit operating systems for both application and database tiers on x86-64 servers: Oracle Enterprise Linux, Red Hat Enterprise Linux, and SUSE Linux Enterprise Server. For EBS Release 11i (and again in Release 12), when the application tier is installed on a certified platform, additional platforms (including the above) may be used for a 64-bit database tier on x86-64 servers. This is an example of a mixed platform architecture (Release 12), or a Split Configuration (Release 11i).

    Q: I understand that the EBS application tier code is 32-bit, even on a 64-bit Linux OS -- is this the case?
    A: It is true that the majority of executables provided as part of our release media on the application tier are 32-bit (as are the Fusion Middleware libraries and objects they depend on). However, the Planning products have large memory requirements and are therefore compiled 64-bit to take advantage of the larger memory space afforded by a 64-bit OS.

    Read the article

  • Resolve Instructional Webcast Series — E-Business Suite Payables Period Close

    - by user793044
    Resolve Instructional Webcast Series — new product-specific troubleshooting topics. Coming up for E-Business Suite:

    Title: Resolve — Best Practices for E-Business Suite Payables Period Close
    Date: Nov 7, 2012
    Time: 8:00 am MT - 3:00 pm GMT - 10:00 am Eastern - 8:30 pm India - 7:00 am Pacific

    This one-hour webcast covers three main recommendations:
    - Period Close Helper – a new diagnostic to identify and resolve period close issues
    - Master Generic Datafix Diagnostic (MGD) usage in proactive/reactive mode
    - Recommended Patch Collection (RPC) uptake

    This session will help customers plan and complete their month-on-month period close activities successfully and approach period close proactively, avoiding last-minute hassles and preventing delays in meeting period close deadlines. Customers who are involved in Payables period close activities (both functional and technical) will benefit most from the webcast.

    Join us. Leverage this opportunity to learn support best practices that help you resolve the issues you face with your Oracle products. Oracle Support experts provide live demonstrations of proactive resources. You will see how working proactively helps you work more efficiently—from using the right tools to providing the right information on service requests—so you can get answers faster. Register for sessions now. Questions? Contact Oracle's "Get Proactive" team today. WORK SMART. SOLVE FAST. RESOLVE.

    Read the article

  • SOA Suite Demo System updated – make use of Oracle hosted demo systems or download the image

    - by JuergenKress
    To get access to the demo environments, please contact OPN! Global Sales Engineering (GSE, formerly DSS) is happy to announce the availability of the SOA 11g (11.1.1.8) platform. The platform is fully featured, based on a plug-and-play architecture, and designed for building best-of-breed SOA & Business Process Management 11g demos.

    Demo highlights:
    - Designed and developed on the "build your own demos (POC)" concept
    - Installed and configured latest versions of FMW products: SOA, Business Activity Monitoring (BAM), Oracle Service Bus (OSB), Oracle Enterprise Repository (OER), Oracle Event Processor (OEP), Oracle Service Registry (OSR), WebCenter Content, and WebCenter Portal
    - Platform designed and tuned for best performance
    - Hot plug-in capability for additional middleware components

    Call to action:
    - Check out the 1-minute video overview of the SOA 11g (11.1.1.8) platform
    - Review the latest release notes and other collateral on the Demo Store
    - Visit the GSE home page to book the "SOA 11.1.1.8.0 Platform" customizable demo

    Additional information is available on this page. For questions or feedback please contact [email protected] or [email protected]. This announcement will appear in the archive as Number 453.

    Support: If you need assistance or encounter any issues, please submit a GSE Repository ticket or call the GSE Support Hotline, available 24 hours a day, Monday through Friday, at: US/CAN: +1.650.506.8763, EMEA: +44 118 9240808, APAC: +65.6436.2150, LAD: +1.650.506.8763, Japan: +81-3-6834-6097.

    SOA & BPM Partner Community: For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

    Read the article

  • Hibernate Communications Link Failure in Hibernate Based Java Servlet application powered by MySQL

    - by Vatsala
    Let me describe my question. I have a Java application with Hibernate as the DB interfacing layer over MySQL, and I get a communications link failure error in the application. The occurrence of this error is a very specific case: I get it when I leave the MySQL server unattended for more than approximately 6 hours (i.e., when no queries are issued to MySQL for more than approximately 6 hours). I am pasting a top-level description of the exception below, and adding a pastebin link for the detailed stack trace.

    javax.persistence.PersistenceException: org.hibernate.exception.JDBCConnectionException: Cannot open connection
    - Caused by: org.hibernate.exception.JDBCConnectionException: Cannot open connection
    - Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure - The last packet successfully received from the server was 1,274,868,181,212 milliseconds ago. The last packet sent successfully to the server was 0 milliseconds ago.
    - Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure - The last packet successfully received from the server was 1,274,868,181,212 milliseconds ago. The last packet sent successfully to the server was 0 milliseconds ago.
    - Caused by: java.net.ConnectException: Connection refused: connect

    The link to the pastebin for further investigation: http://pastebin.com/4KujAmgD

    What I understand from these exception statements is that MySQL is refusing to take in any connections after a period of idle/nil activity. I have been reading up a bit about this via Google search, and learned that one of the possible ways to overcome this is to set values for c3p0 properties, as c3p0 comes bundled with Hibernate. Specifically, I read at http://www.mchange.com/projects/c3p0/index.html that setting two properties, idleConnectionTestPeriod and preferredTestQuery, will solve this for me. But these values don't seem to have had an effect. Is this the correct approach to fixing this? If not, what is the right way to get over this? The following are related communications link failure questions at stackoverflow.com, but I've not found a satisfactory answer among them: http://stackoverflow.com/questions/2121829/java-db-communications-link-failure and http://stackoverflow.com/questions/298988/how-to-handle-communication-link-failure

    Note 1 - I don't get this error when I am using my application continuously.
    Note 2 - I use JPA with Hibernate, and hence my hibernate.dialect and other Hibernate properties reside within the persistence.xml in the META-INF folder (does that prevent the c3p0 properties from working?)

    edit - Here are the c3p0 parameters I tried out - select 1; 2
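    For reference, routing Hibernate's connection pooling through c3p0 inside persistence.xml usually looks something like the following hedged sketch (the numeric values are placeholders; note that preferredTestQuery is a c3p0-native setting, so it may need to go into a c3p0.properties file on the classpath rather than into persistence.xml):

        <persistence-unit name="myUnit">
            <properties>
                <!-- hand connection pooling to c3p0 instead of Hibernate's built-in pool -->
                <property name="hibernate.connection.provider_class"
                          value="org.hibernate.connection.C3P0ConnectionProvider"/>
                <!-- test idle connections every 300 seconds, well inside MySQL's wait_timeout -->
                <property name="hibernate.c3p0.idle_test_period" value="300"/>
                <property name="hibernate.c3p0.min_size" value="5"/>
                <property name="hibernate.c3p0.max_size" value="20"/>
                <property name="hibernate.c3p0.timeout" value="1800"/>
            </properties>
        </persistence-unit>

    A matching c3p0.properties entry for the test query would be c3p0.preferredTestQuery=SELECT 1.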

    Read the article

  • Hibernate Communications Link Failure in Restlet-Hibernate Based Java application powered by MySQL

    - by Vatsala
    Let me describe my question. I have a Java application with Hibernate as the DB interfacing layer over MySQL, and I get a communications link failure error in the application. The occurrence of this error is a very specific case: I get it when I leave the MySQL server unattended for more than approximately 6 hours (i.e., when no queries are issued to MySQL for more than approximately 6 hours). I am pasting a top-level description of the exception below, and adding a pastebin link for the detailed stack trace.

    javax.persistence.PersistenceException: org.hibernate.exception.JDBCConnectionException: Cannot open connection
    - Caused by: org.hibernate.exception.JDBCConnectionException: Cannot open connection
    - Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure - The last packet successfully received from the server was 1,274,868,181,212 milliseconds ago. The last packet sent successfully to the server was 0 milliseconds ago.
    - Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure - The last packet successfully received from the server was 1,274,868,181,212 milliseconds ago. The last packet sent successfully to the server was 0 milliseconds ago.
    - Caused by: java.net.ConnectException: Connection refused: connect

    The link to the pastebin for further investigation: http://pastebin.com/4KujAmgD

    What I understand from these exception statements is that MySQL is refusing to take in any connections after a period of idle/nil activity. I have been reading up a bit about this via Google search, and learned that one of the possible ways to overcome this is to set values for c3p0 properties, as c3p0 comes bundled with Hibernate. Specifically, I read at http://www.mchange.com/projects/c3p0/index.html that setting two properties, idleConnectionTestPeriod and preferredTestQuery, will solve this for me. But these values don't seem to have had an effect. Is this the correct approach to fixing this? If not, what is the right way to get over this? The following are related communications link failure questions at stackoverflow.com, but I've not found a satisfactory answer among them: http://stackoverflow.com/questions/2121829/java-db-communications-link-failure and http://stackoverflow.com/questions/298988/how-to-handle-communication-link-failure

    Note 1 - I don't get this error when I am using my application continuously.
    Note 2 - I use JPA with Hibernate, and hence my hibernate.dialect and other Hibernate properties reside within the persistence.xml in the META-INF folder (does that prevent the c3p0 properties from working?)

    Read the article

  • Level 3 Communications Monitoring

    - by thursdasy
    Does Level 3 Communications offer any kind of map of current network status? It seems our connection here is experiencing high latency through their network. Also, what has your experience been when asking them whether one of their appliances is malfunctioning or overloaded because you are seeing bad latency on their network?

    Read the article

  • ODEE Green Field (Windows) Part 3 - SOA Suite

    - by AndyL-Oracle
     So you're still here, are you? I'm sure you're probably overjoyed at the prospect of continuing with our green field installation of ODEE. In my previous post, I covered the installation of WebLogic - you probably noticed, like I did, that it's a pretty quick install. I'm pretty certain this had everything to do with how quickly the next post made it to the internet! So let's dig in. Make sure you've followed the steps from the initial post to obtain the necessary software and prerequisites!

    Unpack the RCU (Repository Creation Utility). This ZIP file contains a directory (rcuHome) that should be extracted into your ORACLE_HOME. Run the RCU - execute rcuHome/bin/rcu.bat. Click Next. Select Create and click Next. Enter the database connection details and click Next - any connection failure will show in the Messages box. Click OK. Expand and select the SOA Infrastructure item; this will automatically select additional required components. You can change the prefix used, but DEV is recommended. If you are creating a sandbox that includes additional components like WebCenter Content and UMS, you may select those schemas as well, but they are not required for a basic ODEE installation. Click Next. Click OK. Specify the password for the schema(s), then click Next. Click Next. Click OK. Click OK. Click Create. Click Close.

    Unpack the SOA Suite installation files into a single directory, e.g. SOA. Run the installer - navigate to and execute SOA/Disk1/setup.exe. If you receive a JDK error, switch to a command line to start the installer. To start the installer via the command line, do Start > Run > cmd and cd into the SOA\Disk1 directory. Run setup.exe -jreLoc <path to JRE>. Ensure you do not use a path with spaces - use the ~1 notation as necessary (each directory name must not exceed 8 characters, so "Program Files" becomes "Progra~1" and "Program Files (x86)" becomes "Progra~2" in this notation).

    Click Next. Select Skip and click Next. Resolve any issues shown and click Next. Verify your Oracle home locations; defaults are recommended. Click Next. Select your application server - if you've already installed WebLogic, this should be selected automatically. Click Next. Click Install. Allow the installation to progress... Click Next. Click Finish. You can save the installation details if you want. That should keep you satisfied for the moment. Get ready, because the next posts are going to be meaty!
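    For example, with a 64-bit JDK installed under C:\Program Files\Java, the command-line launch described above might look like this (a hedged sketch; the JDK folder name is an assumption for illustration):

        cd SOA\Disk1
        REM Progra~1 is the 8.3 short name for "Program Files"
        setup.exe -jreLoc C:\Progra~1\Java\jdk1.7.0\jre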

    Read the article

  • SOA Suite 11g Dynamic Payload Testing with soapUI Free Edition

    - by Greg Mally
    Overview

    Many web service developers use soapUI for tests such as smoke tests, unit tests, and load tests, because the free edition is fairly robust. However, if you need to venture into more complex testing that requires a dynamic payload, the free edition doesn't necessarily make it easy. This feature does exist in soapUI, but for obvious reasons it is in the Pro version. In this blog I will show you how to use soapUI free edition for dynamic payloads in a simplified example. Hopefully this will open the door for you to expand into more complex scenarios. The following assumes that you have a working knowledge of soapUI and will not go into concepts like setting up a project. For the basics, please review the soapUI documentation: http://www.soapui.org/Getting-Started/. Additionally, we will be using asynchronous web services; you can review the setup for this in my blog: SOA Suite 11g Asynchronous Testing with soapUI.

    Features in soapUI Free Edition Relating to this Topic

    The soapUI test tool provides a feature-rich environment that can do many things, provided you are willing to go beyond point and click. For this example, we will be leveraging just a couple of features: TestCase properties, and scripting with Groovy. Basically, we will use a property as a global variable and manipulate that property using a Groovy script.

    Setting Up Our Property

    Properties are available throughout soapUI; here is a snippet from the soapUI website defining the locations:
    - Projects: for handling Project-scope values, for example a subscription ID
    - TestSuite: for handling TestSuite-scoped values; can be seen as "arguments" to a TestSuite
    - TestCases: for handling TestCase-scoped values; can be seen as "arguments" to a TestCase
    - Properties TestStep: for providing local values/state within a TestCase
    - Local TestStep properties: several TestStep types maintain their own list of properties specific to their functionality: DataSource, DataSink, Run TestCase
    - MockServices: for handling MockService-scoped values/arguments
    - MockResponses: for handling MockResponse-scoped values
    - Global Properties: for handling global properties, optionally from an external source

    For our example, we will define a custom property in a TestCase called SimpleAsyncPayload. The property can be created in either the Custom Properties tab located at the bottom of the Navigator panel when the TestCase is selected in the Navigator, or the Properties label in the TestCase editor. [Screenshots: Navigator Panel; TestCase Editor] You will notice that I set a value of "0" for the custom property. For this simplified example, we will need to retrieve that value and manipulate it prior to making the web service request invocation. In order to accomplish this, we will need to get Groovy ;)

    Let's Get Groovy

    We will now add a new Groovy Script step to the TestCase called Manipulate Payload (TestCase Editor > Append Step > Groovy Script). Once we have added the Groovy Script step, we can open the Groovy Script editor and add code to:
    1. Get the current value of the property we created, SimpleAsyncPayload.
    2. Convert the value of the property to an integer.
    3. Increment the value.
    4. Store the incremented value back into the TestCase property SimpleAsyncPayload.

    The script should look something like the sketch at the end of this excerpt. [Screenshot: Groovy Script Editor – Manipulate Payload] At this point we can test the script by simply running the TestCase (left-click the green triangle in the upper left-hand corner of the TestCase editor). To verify that it ran correctly, we can look at the value of the SimpleAsyncPayload property, which should now be 1. [Screenshot: TestCase Editor – Run Results]

    All that is left to complete the TestCase is to append another step of type Test Request. The information required to append the request is a name and an operation to invoke. In this example we will use the default name and select SimpleAsyncBPELProcessBingd -> process as the operation (for any other information requested, simply use the defaults; however, if you are calling an asynchronous operation, do not add any assertions). We are now on familiar ground with the Test Request editor. Depending upon the type of operation you are invoking (synchronous or asynchronous), update the request with the necessary information (e.g., callback information for asynchronous operations).

    We will now tweak the Test Request payload to retrieve the value of the SimpleAsyncPayload property. The soapUI editor makes this very simple: right-click in the payload and navigate to the property (e.g., right-click > Get Data.. > TestCase: [Groovy TestCase] > Property [SimpleAsyncPayload]). [Screenshots: Test Request Editor – Insert Property Value; Test Request Editor – Inserted Property Value] Just like before, we are now ready to run the TestCase. If everything goes as expected, we should see a successful response. [Screenshot: Message Viewer – Results of TestCase Run]

    We are now set up to run a stress test where the payload changes for each request. This simple example can be expanded to include multiple payload values, complex calculations in the scripts, or whatever else can be done via soapUI scripting. Hopefully you have found this useful, and happy testing to you :)
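    Here is a hedged sketch of what the Manipulate Payload Groovy step might contain (soapUI exposes the enclosing test case to script steps through the built-in testRunner variable; the logic follows the four steps listed above):

        // 1. Read the current value of the TestCase property
        def current = testRunner.testCase.getPropertyValue("SimpleAsyncPayload")
        // 2 & 3. Convert it to an integer and increment it
        def next = current.toInteger() + 1
        // 4. Store the incremented value back into the property
        testRunner.testCase.setPropertyValue("SimpleAsyncPayload", next.toString())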

    Read the article

  • Fraud Detection with the SQL Server Suite Part 2

    - by Dejan Sarka
    This is the second part of the fraud detection whitepaper. You can find the first part in my previous blog post about this topic.

    My Approach to Data Mining Projects

    It is impossible to evaluate the time and money needed for a complete fraud detection infrastructure in advance. Personally, I do not know the customer's data in advance. I don't know whether there is already an existing infrastructure, like a data warehouse, in place, or whether we would need to build one from scratch. Therefore, I always suggest starting with a proof-of-concept (POC) project. A POC takes between 5 and 10 working days, and involves personnel from the customer's site - either employees or outsourced consultants. The team should include a subject matter expert (SME) and at least one information technology (IT) expert. The SME must be familiar with both the domain in question and the meaning of the data at hand, while the IT expert should be familiar with the structure of the data and how to access it, and have some programming (preferably Transact-SQL) knowledge. With more than one IT expert, the most time-consuming work, namely data preparation and overview, can be completed sooner. I assume that the relevant data is already extracted and available at the very beginning of the POC project.

    If a customer wants to have their people involved in the project directly and requests the transfer of knowledge, the project begins with training. I strongly advise this approach, as it establishes a common background for all people involved, an understanding of how the algorithms work and how the results should be interpreted, a way of becoming familiar with the SQL Server suite, and more. Once the data has been extracted, the customer's SME (i.e., the analyst) and the IT expert assigned to the project will learn how to prepare the data in an efficient manner. Together with me, their knowledge and expertise allow us to focus immediately on the most interesting attributes and identify any additional, calculated ones soon after. By employing our programming knowledge, we can, for example, prepare tens of derived variables, detect outliers, identify the relationships between pairs of input variables, and more, in only two or three days, depending on the quantity and quality of the input data.

    I favor the customer's decision to assign additional personnel to the project. For example, I actually prefer to work with two teams simultaneously. I demonstrate and explain the subject matter by applying techniques directly on the data managed by each team, and then both teams continue to work on the data overview and data preparation under our supervision. I explain to the teams what kind of results we expect, the reasons why they are needed, and how to achieve them. Afterwards we review and explain the results, and continue with new instructions, until we resolve all known problems. Simultaneously with the data preparation, the data overview is performed. The logic behind this task is the same - again I show the teams involved the expected results, how to achieve them, and what they mean. This is also done in multiple cycles, as is the case with data preparation, because, quite frankly, both tasks are completely interleaved. A specific objective of the data overview is of principal importance: it is represented by a simple star schema and a simple OLAP cube that will, first of all, simplify data discovery and interpretation of the results, and will also prove useful in the following tasks. The presence of the customer's SME is the key to resolving possible issues with the actual meaning of the data. We can always replace the IT part of the team with another database developer; however, we cannot conduct this kind of project without the customer's SME.

    After the data preparation, and when the data overview is available, we begin the scientific part of the project. I assist the team in developing a variety of models and in interpreting the results. The results are presented graphically, in an intuitive way. While it is possible to interpret the results on the fly, a much more appropriate alternative is available if the initial training was also performed, because it allows the customer's personnel to interpret the results by themselves, with only some guidance from me. The models are evaluated immediately by using several different techniques. One of the techniques includes evaluation over time, where we use an OLAP cube. After evaluating the models, we select the most appropriate model to be deployed for a production test; this allows the team to understand the deployment process. There are many possibilities for deploying data mining models into production; at the POC stage, we select the one that can be completed quickly. Typically, this means that we add the mining model as an additional dimension to an existing DW or OLAP cube, or to the OLAP cube developed during the data overview phase. Finally, we spend some time presenting the results of the POC project to the stakeholders and managers.

    Even from a POC, the customer will receive lots of benefits, all at the sole risk of spending money and time on a single 5-to-10-day project:
    - The customer learns the basic patterns of fraud and fraud detection
    - The customer learns how to do the entire cycle with their own people, relying on me only for the most complex problems
    - The customer's analysts learn how to perform much more in-depth analyses than they ever thought possible
    - The customer's IT experts learn how to perform data extraction and preparation much more efficiently than they did before
    - All of the attendees of this training learn how to use their own creativity to implement further improvements of the process and procedures, even after the solution has been deployed to production
    - The POC output for a smaller company, or for a subsidiary of a larger company, can actually be considered a finished, production-ready solution
    - It is possible to utilize the results of the POC project at subsidiary level, as a finished POC project for the entire enterprise

    Typically, the project also results in several important "side effects":
    - Improved data quality
    - Improved employee job satisfaction, as they are able to proactively contribute to the central knowledge about fraud patterns in the organization
    - Because eventually more minds get involved across the enterprise, the company should expect more and better fraud detection patterns

    After the POC project is completed as described above, the actual project does not need months of engagement from my side. This is possible due to our preference to transfer the knowledge to the customer's employees: typically, the customer will use the results of the POC project for some time, and only engage me again to complete the project, or to ask for additional expertise if the complexity of the problem increases significantly. I usually expect to perform the following tasks:
    - Establish the final infrastructure to measure the efficiency of the deployed models
    - Deploy the models in additional scenarios:
      - Through reports
      - By including Data Mining Extensions (DMX) queries in OLTP applications to support real-time early warnings
    - Include data mining models as dimensions in OLAP cubes, if this was not done already during the POC project
    - Create smart ETL applications that divert suspicious data for immediate or later inspection
    - I would also offer to investigate how the outcome could be transferred automatically to the central system; for instance, if the POC project was performed in a subsidiary whereas a central system is available as well
    - Of course, for the actual project, I would repeat the data and model preparation as needed

    It is virtually impossible to tell in advance how much time the deployment will take before we decide, together with the customer, what exactly the deployment process should cover. Without considering the deployment part, and with the POC project conducted as suggested above (including the transfer of knowledge), the actual project should still take only an additional 5 to 10 days. The approximate timeline for the POC project is as follows:
    - 1-2 days of training
    - 2-3 days for data preparation and data overview
    - 2 days for creating and evaluating the models
    - 1 day for initial preparation of the continuous learning infrastructure
    - 1 day for presentation of the results and discussion of further actions

    Quite frequently I receive the following question: are we going to find the best possible model during the POC project, or during the actual project? My answer is always quite simple: I do not know. Maybe, if we spent just one hour more on data preparation, or created just one more model, we would get better patterns and predictions. However, we simply must stop somewhere, and the best possible way to do this, in my experience, is to restrict the time spent on the project in advance, after an agreement with the customer. You must also never forget that, because we build the complete learning infrastructure and transfer the knowledge, the customer will be capable of doing further investigations independently and improving the models and predictions over time, without the need for constant engagement with me.

    Read the article

  • Zeacom UC Compared To Microsoft UC

    - by Kia
    Which is the better solution: Zeacom's Unified Communications or Microsoft's Unified Communications (UC)? Which one has your company implemented? I heard Microsoft coined the term "Unified Communications" but was slow to jump-start it... Other companies such as Zeacom have been working on and improving their UC products for years. But Microsoft is such an established standard. Which one would you go with?

    Read the article

  • Configuring a Unified Communications Certificate for many virtual hosts running in Jetty

    - by rrc7cz
    I have a single IP with Jetty serving up X sites on port 80. Basically, you can sign up for our service, then point your domain www.mycompany.com to that IP, and Jetty will serve up your custom site. I would like to add SSL support for all sites. To simplify things, I've looked at getting a single Unified Communications Certificate to plug into Jetty and have it work for all sites. Is this possible? Has anyone done this before? Does Jetty only support traditional, single-domain certs? What issues might I run into compared to a single-domain cert?
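    For what it's worth, a Unified Communications (SAN) certificate is an ordinary certificate with extra subjectAltName entries, so wiring it into Jetty should look the same as any single-domain certificate; one keystore simply covers every hosted domain. A hedged sketch of a Jetty 7-era jetty.xml connector (the connector class is from that era's API, and the keystore path and passwords are placeholders):

        <Call name="addConnector">
            <Arg>
                <New class="org.eclipse.jetty.server.ssl.SslSelectChannelConnector">
                    <Set name="port">443</Set>
                    <!-- one keystore holding the single UCC/SAN certificate for all hosted domains -->
                    <Set name="keystore">/etc/jetty/ucc-keystore.jks</Set>
                    <Set name="password">changeit</Set>
                    <Set name="keyPassword">changeit</Set>
                </New>
            </Arg>
        </Call>

    The main caveat is that every customer domain has to be listed in the certificate's SAN field, so the certificate must be reissued whenever a new site signs up.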

    Read the article

  • Simulating Production Load to Anticipate Errors

    - by [email protected]
    The day-to-day pressure on the business for agility, and for consistently high service levels, makes quality management a basic imperative. In this context, Oracle offers services through its ATS (Application Testing Suite) solution to meet quality objectives. Oracle Functional Testing automates tedious testing tasks, reducing the level of effort within test teams and guaranteeing quality for every change to production systems. Oracle Load Testing simulates production load on environments and anticipates errors arising from concurrency, congestion, performance, and lack of capacity, without affecting end users. The Oracle suite is tested and certified on the following platforms: Siebel 7.x and 8.x, E-Business Suite 11i10 and higher, Hyperion, PeopleSoft, JD Edwards, web applications, Web Services, and databases. Brochure: Oracle Load Testing

    Read the article

  • Three Global Telecoms Soar With Siebel

    - by michael.seback
    Deutsche Telekom Group Selects Oracle's Siebel CRM to Underpin Next-Generation CRM Strategy. The Deutsche Telekom Group (DTAG), one of the world's leading telecommunications companies and a customer of Oracle since 2001, has invested in Oracle's Siebel CRM as the standard platform for its Next-Generation CRM strategy, a move to lower the cost of managing its 120 million customers across its European businesses. Oracle's Siebel CRM is planned for deployment in Germany and all of the company's European businesses within five years. "...Our Next-Generation strategy is a significant move to lower our operating costs and enhance customer service for all our European customers. Not only is Oracle underpinning this strategy, but it is also shaping the way our company operates and sells to customers. We look forward to working with Oracle over the coming years as the technology is extended across Europe," said Dr. Steffen Roehn, CIO, Deutsche Telekom AG. "The telecommunications industry is currently undergoing some major changes. As a result, companies like Deutsche Telekom need to be more intelligent about the way they use technology, particularly when it comes to customer service. Deutsche Telekom is a great example of how organisations can use CRM not just to improve services, but also to drive more commercial opportunities through the ability to make highly tailored offers while the customer is engaged online or on the phone," said Steve Fearon, Vice President CRM, EMEA. Read more. Telecom Argentina S.A. Accelerates Time-to-Market for New Communications Products and Services. Telecom Argentina S.A. offers basic telephone, urban landline, and national and international long-distance services. "With Oracle's Siebel CRM and Oracle Communications Billing and Revenue Management, we started a technological transformation that allows us to satisfy our critical business needs, such as improving customer service and quickly launching new phone and internet products and services." - Saba Gooley, Chief Information Officer, Wire Line and Internet Services, Telecom Argentina S.A. Read more. Türk Telekom Develops Benefits-Driven CRM Roadmap. Türk Telekom Group provides integrated telecommunication services, from public switched telephone network (PSTN) and Global System for Mobile Communications (GSM) technology to broadband internet. "Oracle Insight provided us with a structured deployment approach that makes sense for our business. It quantified the benefits of the CRM solution, allowing us to engage with the relevant business owners; essential for a successful transformation program." - Paul Taylor, VP Commercial Transformation, Türk Telekom. Read more.

    Read the article

  • How do I code this relationship in SQLAlchemy?

    - by Martin Del Vecchio
    I am new to SQLAlchemy (and SQL, for that matter). I can't figure out how to code the idea I have in my head. I am creating a database of performance-test results:
    - A test run consists of a test type and a number (this is class TestRun below).
    - A test suite consists of the version string of the software being tested, and one or more TestRun objects (this is class TestSuite below).
    - A test version consists of all test suites with the given version name.

    Here is my code, as simple as I can make it:

        from sqlalchemy import *
        from sqlalchemy.ext.declarative import declarative_base
        from sqlalchemy.orm import relationship, backref, sessionmaker

        Base = declarative_base()

        class TestVersion(Base):
            __tablename__ = 'versions'
            id = Column(Integer, primary_key=True)
            version_name = Column(String)

            def __init__(self, version_name):
                self.version_name = version_name

        class TestRun(Base):
            __tablename__ = 'runs'
            id = Column(Integer, primary_key=True)
            suite_directory = Column(String, ForeignKey('suites.directory'))
            suite = relationship('TestSuite', backref=backref('runs', order_by=id))
            test_type = Column(String)
            rate = Column(Integer)

            def __init__(self, test_type, rate):
                self.test_type = test_type
                self.rate = rate

        class TestSuite(Base):
            __tablename__ = 'suites'
            directory = Column(String, primary_key=True)
            version_id = Column(Integer, ForeignKey('versions.id'))
            version_ref = relationship('TestVersion', backref=backref('suites', order_by=directory))
            version_name = Column(String)

            def __init__(self, directory, version_name):
                self.directory = directory
                self.version_name = version_name

        # Create a v1.0 suite
        suite1 = TestSuite('dir1', 'v1.0')
        suite1.runs.append(TestRun('test1', 100))
        suite1.runs.append(TestRun('test2', 200))

        # Create another v1.0 suite
        suite2 = TestSuite('dir2', 'v1.0')
        suite2.runs.append(TestRun('test1', 101))
        suite2.runs.append(TestRun('test2', 201))

        # Create another suite
        suite3 = TestSuite('dir3', 'v2.0')
        suite3.runs.append(TestRun('test1', 102))
        suite3.runs.append(TestRun('test2', 202))

        # Create the in-memory database
        engine = create_engine('sqlite://')
        Session = sessionmaker(bind=engine)
        session = Session()
        Base.metadata.create_all(engine)

        # Add the suites in
        version1 = TestVersion(suite1.version_name)
        version1.suites.append(suite1)
        session.add(suite1)

        version2 = TestVersion(suite2.version_name)
        version2.suites.append(suite2)
        session.add(suite2)

        version3 = TestVersion(suite3.version_name)
        version3.suites.append(suite3)
        session.add(suite3)

        session.commit()

        # Query the suites
        for suite in session.query(TestSuite).order_by(TestSuite.directory):
            print "\nSuite directory %s, version %s has %d test runs:" % (suite.directory, suite.version_name, len(suite.runs))
            for run in suite.runs:
                print "  Test '%s', result %d" % (run.test_type, run.rate)

        # Query the versions
        for version in session.query(TestVersion).order_by(TestVersion.version_name):
            print "\nVersion %s has %d test suites:" % (version.version_name, len(version.suites))
            for suite in version.suites:
                print "  Suite directory %s, version %s has %d test runs:" % (suite.directory, suite.version_name, len(suite.runs))
                for run in suite.runs:
                    print "    Test '%s', result %d" % (run.test_type, run.rate)

    The output of this program:

        Suite directory dir1, version v1.0 has 2 test runs:
          Test 'test1', result 100
          Test 'test2', result 200
        Suite directory dir2, version v1.0 has 2 test runs:
          Test 'test1', result 101
          Test 'test2', result 201
        Suite directory dir3, version v2.0 has 2 test runs:
          Test 'test1', result 102
          Test 'test2', result 202

        Version v1.0 has 1 test suites:
          Suite directory dir1, version v1.0 has 2 test runs:
            Test 'test1', result 100
            Test 'test2', result 200
        Version v1.0 has 1 test suites:
          Suite directory dir2, version v1.0 has 2 test runs:
            Test 'test1', result 101
            Test 'test2', result 201
        Version v2.0 has 1 test suites:
          Suite directory dir3, version v2.0 has 2 test runs:
            Test 'test1', result 102
            Test 'test2', result 202

    This is not correct, since there are two TestVersion objects with the name 'v1.0'. I hacked my way around this by adding a private list of TestVersion objects, and a function to find a matching one:

        versions = []

        def find_or_create_version(version_name):
            # Find existing
            for version in versions:
                if version.version_name == version_name:
                    return version
            # Create new
            version = TestVersion(version_name)
            versions.append(version)
            return version

    Then I modified my code that adds the records to use it:

        # Add the suites in
        version1 = find_or_create_version(suite1.version_name)
        version1.suites.append(suite1)
        session.add(suite1)

        version2 = find_or_create_version(suite2.version_name)
        version2.suites.append(suite2)
        session.add(suite2)

        version3 = find_or_create_version(suite3.version_name)
        version3.suites.append(suite3)
        session.add(suite3)

    Now the output is what I want:

        Suite directory dir1, version v1.0 has 2 test runs:
          Test 'test1', result 100
          Test 'test2', result 200
        Suite directory dir2, version v1.0 has 2 test runs:
          Test 'test1', result 101
          Test 'test2', result 201
        Suite directory dir3, version v2.0 has 2 test runs:
          Test 'test1', result 102
          Test 'test2', result 202

        Version v1.0 has 2 test suites:
          Suite directory dir1, version v1.0 has 2 test runs:
            Test 'test1', result 100
            Test 'test2', result 200
          Suite directory dir2, version v1.0 has 2 test runs:
            Test 'test1', result 101
            Test 'test2', result 201
        Version v2.0 has 1 test suites:
          Suite directory dir3, version v2.0 has 2 test runs:
            Test 'test1', result 102
            Test 'test2', result 202

    This feels wrong to me; it doesn't feel right that I am manually keeping track of the unique version names, and manually adding the suites to the appropriate TestVersion objects. Is this code even close to being correct? And what happens when I'm not building the entire database from scratch, as in this example? If the database already exists, do I have to query the database's TestVersion table to discover the unique version names? Thanks in advance. I know this is a lot of code to wade through, and I appreciate the help.
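    For comparison, the usual SQLAlchemy idiom is to let the session and the database answer the "does this version already exist?" question, instead of keeping a parallel Python list. A hedged sketch of that pattern (my illustration, not the poster's code):

        def get_or_create_version(session, version_name):
            # Ask the database first; with the session's default autoflush,
            # this finds rows added earlier in this session as well as rows
            # that already existed in the database.
            version = session.query(TestVersion).filter_by(version_name=version_name).first()
            if version is None:
                version = TestVersion(version_name)
                session.add(version)
            return version

    Putting a unique constraint on TestVersion.version_name would also let the database itself reject duplicate versions.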

    Read the article
