Search Results

Search found 604 results on 25 pages for 'appstore approval'.


  • Looking for a 'pick a-or-b' voting system script

    - by user324455
    Apologies: this is my first time on Stack Overflow and I'm starting with a question and seeking advice. Sorry. Caveats: I know HTML and CSS pretty well. JavaScript and PHP are not completely alien, but I'm really pretty basic on those. That said, I'm pretty sharp and willing to search for explanations independently. Ok, so my question is this: I want to create a site with a voting system very much like the one on kittenwar.com - the page loads 2 random images from a db of some sort and you click on the one you want to 'win'. Ranked pairs kind of deal. Then there is a leaderboard of those images which have the highest win-loss ratio. There also needs to be an uploader for people to upload their own images and have them go into an approval workflow, and from there into the db that feeds the voting thing. I tried a pre-made solution ('photo battle') but found it was completely standalone, so trying to integrate it or change any of the options was a nightmare, plus it was buggy. I'm sure there has to be a relatively easy way to do this, right? Ideally I'd like to build my site in Joomla and integrate this functionality somehow. I'd be very grateful for any advice on this. Thanks, Tom
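    A minimal sketch of the core mechanic in PHP with PDO, assuming a hypothetical MySQL table images(id, filename, approved, wins, losses) -- the table, column and function names here are placeholders, not from any particular package:

        <?php
        // Connection credentials are placeholders.
        $db = new PDO('mysql:host=localhost;dbname=battle', 'user', 'pass');

        // Pick two random approved images for the matchup.
        $pair = $db->query(
            "SELECT id, filename FROM images WHERE approved = 1
             ORDER BY RAND() LIMIT 2"
        )->fetchAll(PDO::FETCH_ASSOC);

        // Record a click on the winner.
        function recordVote(PDO $db, $winnerId, $loserId)
        {
            $db->prepare("UPDATE images SET wins = wins + 1 WHERE id = ?")
               ->execute(array($winnerId));
            $db->prepare("UPDATE images SET losses = losses + 1 WHERE id = ?")
               ->execute(array($loserId));
        }

        // Leaderboard: best win-loss ratio first, among images with votes.
        $leaders = $db->query(
            "SELECT filename, wins / (wins + losses) AS ratio FROM images
             WHERE wins + losses > 0 ORDER BY ratio DESC LIMIT 10"
        )->fetchAll(PDO::FETCH_ASSOC);

    The uploader would then insert rows with approved = 0, and the approval workflow simply flips that flag.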

    Read the article

  • Does new JUnit 4.8 @Category render test suites almost obsolete?

    - by grigory
    Given the question 'How to run all tests belonging to a certain Category?' and its answer, would the following approach be better for test organization?
    - define a master test suite that contains all tests (e.g. using ClasspathSuite)
    - design a sufficient set of JUnit categories (sufficient meaning that every desirable collection of tests is identifiable using one or more categories)
    - define targeted test suites based on the master test suite and the set of categories
    For example: identify categories for speed (slow, fast), dependencies (mock, database, integration), function, domain, etc.; demand that each test is properly qualified (tagged) with the relevant set of categories; create the master test suite using ClasspathSuite (all tests found in the classpath); and create targeted suites by qualifying the master test suite with categories, e.g. a mock test suite, a fast database test suite, a slow integration test suite for domain X, etc. My question is more about soliciting opinions on this approach vs. the classic test suite approach. One unbeatable benefit is that every new test is immediately contained in the relevant suites with no suite maintenance. One concern is the proper categorization of each test.

    Read the article

  • CodeIgniter OAuth 2.0 database setup for users and access_tokens

    - by xref
    Per this question I am using CodeIgniter and OAuth 2 in an attempt to provide SSO for internal users of my webapp, ideally verifying them against their Google Apps account. No registrations or anything, just existing users. Using the CI oauth2 spark I'm getting back from Google an OAuth token similar to below:

        OAuth2_Token_Access Object
        (
            [access_token:protected] => dp83.AHSDj899sDHHD908DHFBDjidkd8989dDHhjjd
            [expires:protected] => 1349816820
            [refresh_token:protected] =>
            [uid:protected] =>
        )

    And using that token I can retrieve some user info from Google:

        [uid] => 3849450385394595
        [nickname] => this_guy
        [name] => This Guy
        [first_name] => This
        [last_name] => Guy
        [email] => [email protected]
        [location] =>
        [image] =>
        [description] =>
        [urls] => Array ( )

    Now to allow the 15 people or so who will be using the webapp currently to log in, do I need to create a users table in the MySQL database with their email address as a key? Then compare the email which just came back from the Google OAuth request and see if it exists in my users table? What about the Google access_token, do I store that now along with the email which already existed in the users table? Related: how would I go about verifying the user automatically in the future against that access_token so they don't have to go through the whole OAuth approval process with Google again?
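    One minimal way to wire that up (a plain-PDO sketch rather than CodeIgniter's query builder; the table users(id, email, access_token, token_expires) and the function name are assumptions for illustration):

        <?php
        // After the OAuth flow returns, check the Google email against the
        // pre-created users table; unknown addresses are simply rejected.
        function loginFromGoogle(PDO $db, $email, $accessToken, $expires)
        {
            $stmt = $db->prepare("SELECT id FROM users WHERE email = ?");
            $stmt->execute(array($email));
            $userId = $stmt->fetchColumn();
            if ($userId === false) {
                return false; // not one of the ~15 known internal users
            }
            // Keep the latest token and its expiry with the existing row.
            $db->prepare("UPDATE users SET access_token = ?, token_expires = ?
                          WHERE id = ?")
               ->execute(array($accessToken, $expires, $userId));
            return true;
        }

    On the last question: a stored access_token expires (note the expires timestamp above) and cannot silently re-authenticate the user by itself. The usual options are to request offline access so Google also returns a refresh token, or, simpler for 15 internal users, to set your own session cookie once the first OAuth approval succeeds and only repeat the Google round-trip when that session ends.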

    Read the article

  • PHP - file uploads and ways to prevent viruses from being uploaded in zip/rar archives

    - by Joe
    I am trying to provide a service on my website to allow users to upload files so others can download them. The issue is, since some of the files I will allow to be uploaded will be .zip/.rar files, I am curious as to what ideas exist to help prevent the uploading of archives with viruses/trojans etc. included. Some .zip files will include legitimate .exe files, though I am not sure what options I have. I thought about it and I don't have a method for verifying with a virus scanner on the server, since I am on shared hosting without the option to run a service like that... nor do I have the knowledge on how to do that. I am also aware there is no PHP class or database to scan the files for viruses. This means my only options are to rely on: a) manual approval <-- not an acceptable option for me as it might become a busy site with thousands of uploads; b) getting the users to somehow point it out if it has viruses through voting or "flagging", etc. Anyway, regarding "b" - what ideas would you suggest?
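    Regarding "b", a sketch of a simple flagging scheme (the tables uploads(id, quarantined) and flags(upload_id, user_id), the unique key on that pair, and the threshold of 3 are all assumptions):

        <?php
        function flagUpload(PDO $db, $uploadId, $userId, $threshold = 3)
        {
            // One flag per user per file; relies on a unique key over
            // (upload_id, user_id) so repeat flags are ignored.
            $db->prepare("INSERT IGNORE INTO flags (upload_id, user_id)
                          VALUES (?, ?)")
               ->execute(array($uploadId, $userId));

            // Count independent reporters so far.
            $stmt = $db->prepare("SELECT COUNT(*) FROM flags WHERE upload_id = ?");
            $stmt->execute(array($uploadId));

            if ((int) $stmt->fetchColumn() >= $threshold) {
                // Enough reports: pull the file from circulation until
                // someone reviews it manually.
                $db->prepare("UPDATE uploads SET quarantined = 1 WHERE id = ?")
                   ->execute(array($uploadId));
            }
        }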

    Read the article

  • Log activity, intrusion detection, user event notification (interaction), messaging

    - by Julian Davchev
    Have three questions that I somehow find related, so I put them in the same place. Currently building a relatively large LAMP system - making use of messaging (ActiveMQ), memcache and other goodies. I wonder if there are best practices or nice tips and tricks on how to implement those. The system is user aware - meaning all actions done can be bound to the particular logged-in user.

    1. How to log all actions/activities of users? So that stats/graphs might be extracted later for analysis. At best that will include all URL calls, POST data etc. Meaning tons of inserts. I am thinking sending messages to ActiveMQ, with a cron later dumping them into the DB and another cron analysing them, might be a good idea here. Since I'm using Zend Framework I guess I may use some request plugin so I don't have to make the log() call all over the code (a sketch of that follows below).

    2. How to log stuff so it may be used for intrusion detection? I know most things might be done at the HTTP level using Apache mods for example, but there are also specific cases (5 failed login attempts in a row leads to a captcha, etc.). This also would include tons of inserts. Here I guess direct usage of memcache might be the best approach, as the data doesn't seem vital enough to be permanently persisted. Not sure if I cannot reuse the data from point 1.

    3. The system will notify users of some events - like needing approval, something broke, whatever. Some events will need feedback (action) from the user; others are just informational. I wonder if there are common solutions for needs like this. Example: based on occurring event(s) the user will be notified (in a user inbox, for example) of what happened. There will be a link or something to lead him to the details of the thing that happened so he can take action accordingly. These seem trivial at first look, but the problem I see is that coding it directly becomes hard to maintain really fast.
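    For point 1, a minimal sketch of such a request plugin (ZF1 style; the $queue object and its send() method stand in for whatever ActiveMQ client is used, e.g. a STOMP connection, so those details are assumptions):

        <?php
        class ActivityLogPlugin extends Zend_Controller_Plugin_Abstract
        {
            private $queue;

            public function __construct($queue)
            {
                $this->queue = $queue;
            }

            // Runs once per request after the dispatch loop finishes,
            // so no explicit log() calls are scattered through the code.
            public function dispatchLoopShutdown()
            {
                $request = $this->getRequest();
                $this->queue->send('/queue/activity', json_encode(array(
                    'user' => Zend_Auth::getInstance()->getIdentity(),
                    'url'  => $request->getRequestUri(),
                    'post' => $request->getPost(),
                    'time' => time(),
                )));
            }
        }

        // Registered once in the bootstrap:
        // Zend_Controller_Front::getInstance()
        //     ->registerPlugin(new ActivityLogPlugin($queue));

    A cron consumer then drains the queue into the database in batches, keeping the heavy inserts off the request path.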

    Read the article

  • Where is a small, simple CMS that has no Front End done in PHP?

    - by user559469
    The keys are:
    - small and simple
    - PHP
    - MySQL
    - no front end

    By "no front end" I mean literally, I can control the look 100%. I just want a CMS on the "backend" to manage content (user login/security, upload images, update articles, etc.) that will not dictate in any way how the managed data is presented. Maybe it just keeps the info in a (MySQL) database (which I can query and extract myself), or if it writes content, it is in super-clean XHTML fragments or even just XML I will parse myself? I have looked at WordPress -- and don't like the code it generates, not to mention the sites look too "canned" (you can usually spot a WP site a mile away). Joomla and Drupal look more customizable, but they are bloated now in my opinion, and really I just want something lightweight and simple. For one-user mom-and-pop sites. (No tiered publishing/approval systems, and all that.) I envision plugging this CMS into existing websites/web apps where most of the site is made and managed by me, but a few choice areas are managed by the site owner.

    Read the article

  • My iOS app has a + in its name. Bundle is invalid due to this. Need help resolving

    - by d.altman
    I did find a couple of very similar or identical threads here, but they seemed to end before full resolution. My app runs fine on my device with no build error. I am trying to submit the app for approval and I get the following error: "This bundle is invalid. The executable name, as reported by CFBundleExecutable in the info.plist file may not contain any of these characters ..... +". So I opened my Info.plist file and changed the executable name from the macro ${EXECUTABLE_NAME} to the name of my app without the +. I did a new archive but then got an error saying "codesign failed with exit code 1". In another thread I read to just change the target's name, removing the + from there and leaving the Info.plist file with the macro for the executable name, restarting Xcode and then archiving again. That allowed me to archive, but I received the same error in iTunes Connect. I have been working on this all day and can't find the solution. Can anyone please point me in the right direction? Thank you for any help.
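    For reference, the usual resolution is to keep the '+' out of the executable name entirely and show it only in the user-facing name. A sketch of the relevant Info.plist keys ("MyApp+" is a placeholder):

        <key>CFBundleExecutable</key>
        <string>${EXECUTABLE_NAME}</string>
        <key>CFBundleDisplayName</key>
        <string>MyApp+</string>

    That is: leave CFBundleExecutable as the ${EXECUTABLE_NAME} macro, rename the target's Product Name build setting so the compiled binary itself contains no '+' (which also keeps codesign happy, since the plist then still matches the actual binary), and put the '+' only in CFBundleDisplayName, the name shown under the icon.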

    Read the article

  • Access database: need to prevent approving overlapping OT (second try, with modified request; not a programmer) [on hold]

    - by user2512764
    Employees sign up on the company website for advance overtime lines. The Access table already holds the overtime signups, so approving one does not require the user to enter the time - only to add the location. The table has the fields Employee name, Date, Start time, End time and Location, and every field has data except Location. In the database I have created a form based on this table; since the table already has most of the information, the user only has to fill in the Location field on the form to approve the overtime. For example: the user approves overtime for employee 'John' starting 7/1/2013 at 0400-0800, and the location is successfully added. Later the user tries to add a location for John again, for a line on 7/1/2013 at 0600-0900. Again, we are not entering the start time, end time or date - those are already in the table; we are only entering the location as the approval. As soon as the user enters the location for John, there is a conflict with the previously approved (location added) overtime line. The program needs to check the employee name, date and time against previously approved lines, and if they overlap, the Location in the current record needs to be cleared and the form should move on to the next record. I hope I have explained it in an understandable format. Thank you.
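    A sketch of the overlap check (Access-style SQL; the table name Overtime and the parameter names are assumptions). Two time ranges overlap exactly when each one starts before the other ends, and only already-approved lines (those with Location filled in) should count:

        SELECT COUNT(*)
        FROM Overtime AS prior
        WHERE prior.[Employee name] = [pEmployee]
          AND prior.[Date] = [pDate]
          AND prior.Location Is Not Null
          AND prior.[Start time] < [pEndTime]
          AND prior.[End time] > [pStartTime];

    If the count is non-zero, the form's event code clears the Location control and moves to the next record. In the example above, 0400 < 0900 and 0800 > 0600, so the second John line (0600-0900) is correctly caught against the approved 0400-0800 line.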

    Read the article

  • How to program a real-time accurate audio sequencer on the iphone?

    - by Walchy
    Hi... I want to program a simple audio sequencer on the iPhone but I can't get accurate timing. The last days I tried all possible audio techniques on the iPhone, starting from AudioServicesPlaySystemSound and AVAudioPlayer and OpenAL to AudioQueues. In my last attempt I tried the CocosDenshion sound engine, which uses OpenAL and allows sounds to be loaded into multiple buffers and then played whenever needed. Here is the basic code:

    init:

        int channelGroups[1];
        channelGroups[0] = 8;
        soundEngine = [[CDSoundEngine alloc] init:channelGroups channelGroupTotal:1];

        int i = 0;
        for (NSString *soundName in [NSArray arrayWithObjects:@"base1", @"snare1", @"hihat1", @"dit", @"snare", nil]) {
            [soundEngine loadBuffer:i fileName:soundName fileType:@"wav"];
            i++;
        }

        [NSTimer scheduledTimerWithTimeInterval:0.14 target:self selector:@selector(drumLoop:) userInfo:nil repeats:YES];

    In the initialisation I create the sound engine, load some sounds into different buffers and then establish the sequencer loop with NSTimer.

    audio loop:

        - (void)drumLoop:(NSTimer *)timer
        {
            for (int track = 0; track < 4; track++) {
                unsigned char note = pattern[track][step];
                if (note)
                    [soundEngine playSound:note-1 channelGroupId:0 pitch:1.0f pan:.5 gain:1.0 loop:NO];
            }
            if (++step >= 16)
                step = 0;
        }

    That's it, and it works as it should, BUT the timing is shaky and unstable. As soon as something else happens (e.g. drawing in a view) it goes out of sync. As I understand the sound engine and OpenAL, the buffers are loaded (in the init code) and are then ready to start immediately with alSourcePlay(source); - so the problem may be with NSTimer? Now there are dozens of sound sequencer apps in the appstore and they have accurate timing. E.g. "idrum" has a perfectly stable beat even at 180 bpm while zooming and drawing is done. So there must be a solution. Does anybody have any idea? Thanks for any help in advance! Best regards, Walchy

    Read the article

  • Oracle Announces New Oracle Exastack Program for ISV Partners

    - by pfolgado
    Oracle Exastack Program Enables ISV Partners to Leverage a Scalable, Integrated Infrastructure to Deliver Their Applications Tuned and Optimized for High-Performance News Facts Enabling Independent Software Vendors (ISVs) and other members of Oracle Partner Network (OPN) to rapidly build and deliver faster, more reliable applications to end customers, Oracle today introduced Oracle Exastack Ready, available now, and Oracle Exastack Optimized, available in fall 2011 through OPN. The Oracle Exastack Program focuses on helping ISVs run their solutions on Oracle Exadata Database Machine and Oracle Exalogic Elastic Cloud -- integrated systems in which the software and hardware are engineered to work together. These products provide partners with a lower cost and high performance infrastructure for database and application workloads across on-premise and cloud based environments. Leveraging the new Oracle Exastack Program in which applications can qualify as Oracle Exastack Ready or Oracle Exastack Optimized, partners can use available OPN resources to optimize their applications to run faster and more reliably -- providing increased performance to their end users. By deploying their applications on Oracle Exadata Database Machine and Oracle Exalogic Elastic Cloud, ISVs can reduce the cost, time and support complexities typically associated with building and maintaining a disparate application infrastructure -- enabling them to focus more on their core competencies, accelerating innovation and delivering superior value to customers. After qualifying their applications as Oracle Exastack Ready, partners can note to customers that their applications run on and support Oracle Exadata Database Machine and Oracle Exalogic Elastic Cloud component products including Oracle Solaris, Oracle Linux, Oracle Database and Oracle WebLogic Server. Customers can be confident when choosing a partner's Oracle Exastack Optimized application, knowing it has been tuned by the OPN member on Oracle Exadata Database Machine or Oracle Exalogic Elastic Cloud with a goal of delivering optimum speed, scalability and reliability. Partners participating in the Oracle Exastack Program can also leverage their Oracle Exastack Ready and Oracle Exastack Optimized applications to advance to Platinum or Diamond level in OPN. Oracle Exastack Programs Provide ISVs a Reliable, High-Performance Application Infrastructure With the Oracle Exastack Program ISVs have several options to qualify and tune their applications with Oracle Exastack, including: Oracle Exastack Ready: Oracle Exastack Ready provides qualifying partners with specific branding and promotional benefits based on their adoption of Oracle products. If a partner application supports the latest major release of one of these products, the partner may use the corresponding logo with their product marketing materials: Oracle Solaris Ready, Oracle Linux Ready, Oracle Database Ready, and Oracle WebLogic Ready. Oracle Exastack Ready is available to OPN members at the Gold level or above. Additionally, OPN members participating in the program can leverage their Oracle Exastack Ready applications toward advancement to the Platinum or Diamond levels in the OPN Specialized program and toward achieving Oracle Exastack Optimized status. 
Oracle Exastack Optimized: When available, for OPN members at the Gold level or above, Oracle Exastack Optimized will provide direct access to Oracle technical resources and dedicated Oracle Exastack lab environments so OPN members can test and tune their applications to deliver optimal performance and scalability on Oracle Exadata Database Machine or Oracle Exalogic Elastic Cloud. Oracle Exastack Optimized will provide OPN members with specific branding and promotional benefits including the use of the Oracle Exastack Optimized logo. OPN members participating in the program will also be able to leverage their Oracle Exastack Optimized applications toward advancement to Platinum or Diamond level in the OPN Specialized program. Oracle Exastack Labs and ISV Enablement: Dedicated Oracle Exastack lab environments and related technical enablement resources (including Guided Learning Paths and Boot Camps) will be available through OPN for OPN members to further their knowledge of Oracle Exastack offerings, and qualify their applications for Oracle Exastack Optimized or Oracle Exastack Ready. Oracle Exastack labs will be available to qualifying OPN members at the Gold level or above. Partners are eligible to participate in the Oracle Exastack Ready program immediately, which will help them meet the requirements to attain Oracle Exastack Optimized status in the future. Guidelines for Oracle Exastack Optimized, as well as Oracle Exastack Labs will be available in fall 2011. Supporting Quotes "In order to effectively differentiate their software applications in the marketplace, ISVs need to rapidly deliver new capabilities and performance improvements," said Judson Althoff, Oracle senior vice president of Worldwide Alliances and Channels and Embedded Sales. "With Oracle Exastack, ISVs have the ability to optimize and deploy their applications with a complete, integrated and cloud-ready infrastructure that will help them accelerate innovation, unlock new features and functionality, and deliver superior value to customers." "We view performance as absolutely critical and a key differentiator," said Tom Stock, SVP of Product Management, GoldenSource. "As a leading provider of enterprise data management solutions for securities and investment management firms, with Oracle Exadata Database Machine, we see an opportunity to notably improve data processing performance -- providing high quality 'golden copy' data in a reduced timeframe. Achieving Oracle Exastack Optimized status will be a stamp of approval that our solution will provide the performance and scalability that our customers demand." "As a leading provider of Revenue Intelligence solutions for telecommunications, media and entertainment service providers, our customers continually demand more readily accessible, enriched and pre-analyzed information to minimize their financial risks and maximize their margins," said Alon Aginsky, President and CEO of cVidya Networks. "Oracle Exastack enables our solutions to deliver the power, infrastructure, and innovation required to transform our customers' business operations and stay ahead of the game." Supporting Resources Oracle PartnerNetwork (OPN) Oracle Exastack Oracle Exastack Datasheet Judson Althoff blog Connect with the Oracle Partner community at OPN on Facebook, OPN on LinkedIn, OPN on YouTube, or OPN on Twitter

    Read the article

  • Tykie

    - by Brian
    Here’s the obituary my mother wrote for Tykie; I still miss the little guy quite a bit. Anyone who’s interested in further information on hearing dogs should check out the IHDI website. I cannot begin to express how helpful a hearing dog can be for the hearing impaired. If you feel so inclined, please make a donation.

    In Memoriam, Tykie 1993-2010

    The American Legion Post 401, South Wichita, KS, supported one of its members and commander by sponsoring a service dog for him. Unlike most service dogs, this one was for the hearing impaired. Both Ocie and Betty Sims had hearing loss – Ocie more than Betty. The Post and Auxiliary had garage sales, auctions and other fund-raising endeavors to get donations for the dog. Betty made Teddy bears with growlers that were auctioned for donations to bring a hearing dog from International Hearing Dog, Henderson, Colorado. Tykie, a small, wiry, salt-and-pepper terrier, arrived September 1, 1994, to begin his work that included attending Post 401 meetings and celebrations as well as raising more money to be donated to IHD to help others have hearing dogs. Tykie was a young dog, less than a year old, when he came to Wichita. He was always anxious to please and seldom barked, though he did put out a kind of cry when he was giving his urgent announcement that someone was at the door or the telephone was ringing. He also enjoyed chasing squirrels in the backyard garden that Ocie prized. In 1995, Betty almost died of a lung infection. Tykie was at the hospital with Ocie when he could visit. Several weeks after she was able to come home after a miraculous recovery, Tykie and Ocie went to a car show in downtown Wichita. Ocie’s retina tore loose in the only eye he could see out of and, almost blind, he was in great pain. How Ocie and Tykie got home is still a mystery, but the family legend goes that Tykie added seeing-eye dog to his repertoire and helped drive him home. Health problems continued for Ocie, and when he was placed in a nursing home, Tykie was moved to be Betty’s hearing dog. No problem for Tykie; he still saw his friends at the Post and continued to help with visitors at the door. The night of May 3, 1999, Betty and Tykie were in the bedroom watching TV when Tykie began hitting her with both front paws as he would if something were urgent. She said later she thought he wanted to go out. As she and the dog walked down the hall towards the back of the house, Tykie hit her again with his front paws with such urgency that she fell into a small coat closet. That small 2-by-2 closet became their refuge as that very second the roof of her house went off as the F4 tornado raced through the city. Betty acquired one small wound on her hand from a piece of flying glass as she pulled Tykie into the closet with her. Tykie was a hero that day and a lot of days after. He kept Betty going as she rebuilt her home and after her husband died April 15, 2000. Tykie had to be cared for, so she had to take him outside and bring him inside. He attended weddings of grandchildren and funerals of Post friends. When Betty died February 17, 2002, Tykie’s life changed again. IHD gave approval for his transfer and retirement to Betty and Ocie’s grandson, Brian Laird, who has a similar hearing loss to his grandfather. A few days after the funeral Tykie flew to his new home in Rutherford, NJ, where he was able to take long walks for a couple of years before moving back to the Kansas City area. He was still full of adventure.

    He was written up in a book about service dogs, where his story of the tornado and his picture appeared. He spent weekends at Brian’s mother’s farm to get muddy and be afraid of cats and chickens. He also took on an odyssey as he slipped from his fenced yard in Lenexa one day and walked more than seven miles in Overland Park traffic before being found by a good Samaritan who called IHD to find out where he belonged. Tykie was deaf for about the last two years of his long life and became blind as well, but he continued to strive to please. Tykie was 16 years and 4 months old when he was cremated. His ashes were scattered on the graves of Betty and Ocie Sims at Greenwood Cemetery west of Wichita on the afternoon of March 21, 2010, with about a dozen family and Post 401 members present. It is still the rule: service dogs are the only dogs allowed inside the Post home. Submitted by Linda Laird, daughter of Betty and Ocie and mother of Brian Laird.

    Read the article

  • Using CMS for App Configuration - Part 1, Deploying Umbraco

    - by Elton Stoneman
    Originally posted on: http://geekswithblogs.net/EltonStoneman/archive/2014/06/04/using-cms-for-app-configurationndashpart-1-deploy-umbraco.aspx

    Since my last post on using CMS for semi-static API content, How about a new platform for your next API… a CMS?, I’ve been using the idea for centralized app configuration, and this post is the first in a series that will walk through how to do that, step-by-step. The approach gives you a platform-independent, easily configurable way to specify your application configuration for different environments, with a built-in approval workflow, change auditing and the ability to easily rollback to previous settings. It’s like Azure Web and Worker Roles where you can specify settings that change at runtime, but it's not specific to Azure - you can use it for any app that needs changeable config, provided it can access the Internet. The series breaks down into four posts:
    - Deploying Umbraco – the CMS that will store your configurable settings and the current values;
    - Publishing your config – create a document type that encapsulates your settings and a template to expose them as JSON;
    - Consuming your config – in .NET, a simple client that uses dynamic objects to access settings;
    - Config lifecycle management – how to publish, audit, and rollback settings.
    Let’s get started.

    Deploying Umbraco
    There’s an Umbraco package on Azure Websites, so deploying your own instance is easy – but there are a couple of things to watch out for, so this step-by-step will put you in a good place.

    Create From Gallery
    The easiest way to get started is with an Azure subscription: navigate to add a new Website and then Create From Gallery. Under CMS, you’ll see an Umbraco package (currently at version 7.1.3).

    Configure Your App
    For high availability and scale, you’ll want your CMS on separate kit from anything else you have in Azure, so in the configuration of Umbraco I’d create a new SQL Azure database – which Umbraco will use to store all its content. You can use the free 20 MB database option if you don’t have demanding NFRs, or if you’re just experimenting. You’ll need to specify a password for a SQL Server account which the Umbraco service will use, and changing from the default username umbracouser is probably wise.

    Specify Database Settings
    You can create a new database on an existing server if you have one, or create new. If you create a new server *do not* use the same username for the database server login as you used for the Umbraco account. If you do, the deployment will fail later. Think of this as the SQL Admin account that you can use for managing the db; the previous account was the service account Umbraco uses to connect.

    Make Tea
    If you have a fast kettle. It takes about two minutes for Azure to create and provision the website and the database.

    Install Umbraco
    So far we’ve deployed an empty instance of Umbraco using the Azure package, and now we need to browse to the site and complete installation. My Website was called my-app-config, so to complete installation I browse to http://my-app-config.azurewebsites.net. Enter the credentials you want to use to login – this account will have full admin rights to the Umbraco instance. Note that between deploying your new Umbraco instance and completing installation in this step, anyone can browse to your website and complete the installation themselves with their own credentials, if they know the URL. Remote possibility, but it’s there. From this page *do not* click the big green Install button. If you do, Umbraco will configure itself with a local SQL Server CE database (.sdf file on the Web server), and ignore the SQL Azure database you’ve carefully provisioned and may be paying for. Instead, click on the Customize link and:

    Configure Your Database
    You need to enter your SQL Azure database details here, so you’ll have to get the server name from the Azure Management Console. You don’t need to explicitly grant access to your Umbraco website for the database though. Click Continue and you’ll be offered a “starter” website to install. If you don’t know Umbraco at all (but you are familiar with ASP.NET MVC) then a starter website is worthwhile to see how it all hangs together. But after a while you’ll have a bunch of artifacts in your CMS that you don’t want and you’ll have to work out which you can safely delete. So I’d click “No thanks, I do not want to install a starter website” and give yourself a clean Umbraco install. When it completes, the installation will log you in to the welcome screen for managing Umbraco – which you can access from http://my-app-config.azurewebsites.net/umbraco.

    That’s It
    Easy. Umbraco is installed, using a dedicated SQL Azure instance that you can separately scale, sync and backup, and ready for your content. In the next post, we’ll define what our app config looks like, and publish some settings for the dev environment.

    Read the article

  • Pella Increases Online Appointment Scheduling and Rapidly Personalizes and Updates Marketing Initiatives

    - by Michael Snow
    Originally posted on Oracle Customers page.
    Oracle Customer: Pella Corporation
    Location: Pella, Iowa
    Industry: Industrial Manufacturing
    Employees: 7,100

    Pella Corporation is an innovative leader in creating a better view for homes and businesses by designing, testing, manufacturing, and installing quality windows and doors for new construction, remodeling, and replacement applications. A family-owned company, Pella has an 88-year history of innovation and, today, is the second-largest manufacturer in the country of windows and doors, including patio, entry, and storm doors. The company has 10 manufacturing facilities in the United States and window and door showrooms across the United States and Canada. In-home consultations are an important part of Pella’s sales process. Several years ago, the company launched an online appointment scheduling tool to improve customer convenience. While the functionality worked well, the company wanted to increase online conversion rates and decrease the number of incomplete online appointment schedules. It also wanted to give its business analysts and other line-of-business personnel the ability to update the scheduling tool and interface quickly, without needing IT team intervention and recoding, to better capitalize on opportunities and personalize the interface for specific markets. Pella also looked to reduce IT complexity by selecting a system that integrated easily with its Oracle E-Business Suite Release 12.1 enterprise applications.

    Pella, which has a large Oracle footprint, selected Oracle WebCenter Sites as the foundation for its new, real-time appointment scheduling application. It used the solution to re-engineer the scheduling process and the information required to set up an appointment. Just a few months after launch, it is seeing improvement in the number of appointments booked online and experiencing fewer abandoned appointments during the scheduling process. As important, Pella can now quickly and easily make changes to images, video, and content displayed on the scheduling tool interface, delivering greater business agility. Previously, such changes required a developer and weeks of coding and testing. Today, a member of Pella’s business analyst team can complete the changes in hours. This capability enables Pella to personalize the Web experience for customers. For example, it can display different products or images for clients in different regions.

    The solution is also highly scalable. Pella is using Oracle WebCenter Sites for appointment scheduling now and plans to migrate Pella.com, its configurator tool, and dealer microsites onto the platform. Further, Pella plans to leverage the solution to optimize mobile devices. “Moving ahead, we expect to extensively leverage Oracle WebCenter Sites to gain greater flexibility in updating the Web experience, thanks to the ability to make updates quickly without developer resources. Segmentation and targeting capabilities will allow us to create a more personalized experience across both traditional and mobile platforms,” said Teri Lancaster, IT manager, customer experience applications, Pella Corporation.

    A word from Pella Corporation
    "Oracle WebCenter Sites, from the start, delivered important benefits. We’ve redesigned the online scheduling process and are seeing more potential customers completing consultation bookings online. More important, the solution opens a world of other possibilities as we plan to migrate Pella.com and our dealer microsites to the platform, and leverage it to optimize the Web experience for our mobile devices.” – Teri Lancaster, IT Manager, Customer Experience Applications, Pella Corporation

    Oracle Product and Services
    Oracle WebCenter Sites

    Why Oracle
    Pella has a long-standing relationship with Oracle. “We look to Oracle first for a solution. Our Oracle account team came to us with several solutions, and Oracle WebCenter Sites delivered the scalability, ease of use, and flexibility that we required for the appointment scheduling initiative and other Web projects on the horizon, including migrating Pella.com and optimizing our site for mobile platforms,” said Teri Lancaster, IT manager, customer experience applications, Pella Corporation.

    Implementation Process
    The Pella implementation team, working with Oracle partner Element Solutions, LLC, integrated the appointment setting application with Pella.com as well as the company’s Oracle E-Business Suite customer relationship management applications. Using Oracle WebCenter Sites’ development tools and Subversion capabilities to develop the application, the Element Solutions and Pella teams could work remotely and collaboratively, accelerating deployment. Pella went live with the new scheduling tool in just six months.

    Partner
    Oracle Partner: Element Solutions, LLC
    Element Solutions was instrumental at every major stage of the project, including design creation and approval, development, training, and rollout. “Element Solutions was a vital partner for our Oracle WebCenter Sites initiative. The team provided guidance, and more important, critical knowledge transfer at every stage, which equipped us to get the most out of this powerful and versatile solution. We were definitely collaboration partners,” Lancaster said.

    Resources
    - Pella Corporation Upgrades Enterprise Applications to Continue to Improve Manufacturing Efficiency
    - Thousands of Customers Successfully and Smoothly Upgrade to Oracle E-Business Suite 12.1 for New Functionality, Lower Operating Costs and Improved Shared Operations
    - Managing the Virtual World

    Read the article

  • Do You Develop Your PL/SQL Directly in the Database?

    - by thatjeffsmith
    I know this sounds like a REALLY weird question for many of you. Let me make one thing clear right away though, I am NOT talking about creating and replacing PLSQL objects directly into a production environment. Do we really need to talk about developers in production again? No, what I am talking about is a developer doing their work from start to finish in a development database. These are generally available to a development team for building the next and greatest version of your databases and database applications. And of course you are using a third party source control system, right? Last week I was in Tampa, FL presenting at the monthly Suncoast Oracle User’s Group meeting. Had a wonderful time, great questions and back-and-forth. My favorite heckler was there, @oraclenered, AKA Chet Justice. I was in the middle of talking about how it’s better to do your PLSQL work in the Procedure Editor when Chet pipes up:

        Don’t do it that way, that’s wrong. Just press play to edit the PLSQL directly in the database.

    Or something along those lines. I didn’t get what the heck he was talking about. I had been showing how the Procedure Editor gives you much better feedback and support when working with PLSQL. After a few back-and-forths I got to what Chet’s main objection was, and again I’m going to paraphrase: you should develop offline in your SQL worksheet, and don’t do anything in the database until it’s done. I didn’t understand. Were developers expected to be able to internalize and mentally model the PL/SQL engine, see where their errors were, etc. in these offline scripts? No, please give Chet more credit than that.

    What is the ideal Oracle Development Environment?
    If I were back in the ‘real world’ of database development, I would do all of my development outside of the ‘dev’ instance. My development process looks a little something like this:
    - Do I have a program that already does something like this – copy and paste
    - Has some smart person already written something like this – copy and paste
    - Start typing in the white-screen-of-panic and bungle along until I get something that half-works
    - Tweak, debug, test until I have fooled my subconscious into thinking that it’s ‘good’
    As you might understand, I don’t want my co-workers to see the evolution of my code. It would seriously freak them out and I probably wouldn’t have a job anymore (don’t remind me that I already worked myself out of development). So here’s what I like to do:

    Run a Local Instance of Oracle on my Machine and Develop My Code Privately
    I take a copy of development – that’s what source control is for after all – and run it where no one else can see it. I now get to be my own DBA. If I need a trace – no problem. If I want to run an ASH report, no worries. If I need to create a directory or run some DataPump jobs, that’s all on me. Now when I get my code ‘up to snuff,’ then I will check it into source control and compile it into the official development instance. So my teammates suddenly go from seeing no program to a mostly complete program. Is this right? If not, it doesn’t seem wrong to me. And after talking to Chet in the car on the way to the local cigar bar, it seems that he’s of the same opinion. So what’s so wrong with coding directly into a development instance? I think ‘wrong’ is a bit strong here. But there are a few pitfalls that you might want to look out for. A few come to mind – and I’m sure Chet could add many more as my memory fails me at the moment. But here goes:
    - Development instance isn’t properly backed up – would hate to lose that work
    - Development is wiped once a week and copied over from Prod – don’t laugh
    - Someone clobbers your code
    - You accidentally on purpose clobber someone else’s code
    - The more developers you have in a single fish pond, the greater chance something ‘bad’ will happen

    This Isn’t One of Those Posts Where I Tell You What You Should Be Doing
    I realize many shops won’t be open to allowing developers to stage their own local copies of Oracle. But I would at least be aware that many of your developers are probably doing this anyway – with or without your tacit approval.

    SQL Developer can do local file tracking, but you should be using Source Control too!
    I will say that I think it’s imperative that you control your source code outside the database, even if your development team is comprised of a single developer. Store your source code in a file, and control that file in something like Subversion. You would be shocked at the number of teams that do not use a source control system. I know I continue to be shocked no matter how many times I meet another team running by the seat-of-their-pants. I’d love to hear how your development process works. And of course I want to know how SQL Developer and the rest of our tools can better support your processes. And one last thing: if you want a fun and interactive presentation experience, be sure to have Chet in the room.

    Read the article

  • Design a T-shirt for .NET Reflector Pro

    - by Laila
    Win a .NET Reflector Pro license, a box of Red Gate goodies, and a t-shirt printed with your design! Red Gate likes t-shirts. Each of our teams has one. In fact, each individual person has one, numbered according to when they joined the company: Red Gate's 1st, 2nd, and so on right up to Red Gate's 170th, with the slogan "More than just a number". Those t-shirts are important, chiefly because they remind the people wearing them that they are important. But that isn't enough. What really makes us great are the people who choose to use our tools. So we'd like to extend our tradition of t-shirts to include you and put the design of our next shirt entirely in your hands. We'd like you to come up with a witty slogan or create an inventive or simply beautiful t-shirt design for .NET Reflector Pro, our add-in for Visual Studio, which allows you to step into decompiled assemblies whilst debugging in Visual Studio. When you're done, post your masterpiece to Twitter with the hash tag #reflectortees, and @redgate will take a look! We'll pick the best design, and the winner will get a licensed copy of .NET Reflector Pro and a box of Red Gate goodies - not to mention a copy of their t-shirt. The winning design will go into production and be worn and given out at tradeshows, conferences, and user group events across the world, proudly bearing the name of their designer. We'll also pick three runners-up who will receive licenses for .NET Reflector Pro. Red Gate goodie box Interested? If you're up for the challenge, then we've got some resources to get you started. Inside the .zip file you'll find high-quality versions of the following: T-shirt templates: don't forget to design the front and the back! Different versions of the .NET Reflector Pro logo and Red Gate logo. Colour sheets to give you an easy reference to the Red Gate colours, including hex and RGB values. You can create and send us as many designs as you like, and each of them will be considered for the prize. To submit your designs, simply tweet including the competition hash tag, #reflectortees, and a link to somewhere we can see your design: either an image hosting site such as Twitpic, Flickr or Picasa, or a personal blog. You will need to create a Twitter account (which is free), if you don't already have one. You only have three limits: The background colour of the t-shirt should be one of our brand colours (red, light/dark grey or black), though you're welcome to use other colours in the rest of the design. You need to make use of either the .NET Reflector Pro logo OR the Red Gate logo (please keep them as they are) If you include any text or slogan, stick with just one or two colors for it. Apart from that, go wild. Go and do whatever it is you do when you get creative: whether you walk barefoot on the grass with a pencil and paper, sit cross-legged on a pile of cushions with a laptop, or simply close your eyes and float through a mist of ideas, now is your chance. Make sure you enjoy it. We're looking forward to seeing your creations. Terms and conditions 1. The closing date for entries is June 11th, 2010 (4 p.m. UK time). Red Gate Software Ltd reserves the right to extend the competition deadline at its discretion. If there is a revision, the revised date will be published on this blog and the date for announcing the results will be postponed accordingly. 2. The winning designer will be notified on June 14th, 2010 through Twitter. 
The winner must claim his/her prize by sending us a high-resolution image of their design via email (i.e. Illustrator EPS files or appropriate format, ideally at 300dpi). If the winner does not come forward within 3 days of the announcement, they will forfeit their prize and another winner will be selected from the runners-up. The names of the winner and runners-up will be posted on this blog by June 18th.  3. Entry is completed on the designer posting a link to their entry in a tweet with the correct hash tag, #reflectortees. 4. Red Gate Software needs to hold the rights to using the winning design in order to put the t-shirt into production. We will make sure that this is fine with the winner before we do so, but if you do not want us holding the rights to your design, please do not submit your designs. We reserve the right to slightly alter or adjust any artwork we decide to use (mainly to make it easier to print), but we will make sure we contact the winner for approval first. The winner will also need to allow us the use of his/her name for purposes of promoting your design. 5. Entries must be entirely your own original work and must not breach any copyright or third party rights. Red Gate Software Ltd will not be made partially or fully liable for any non-original work submitted by you. 6. This competition is free: you do not need to buy anything or be an existing customer to enter. 7. This competition is not open to employees of Red Gate Software Ltd, their families, or any other company directly connected with the administration of this promotion.

    Read the article

  • Calling All Agile Customers-Share Your Stories at the Upcoming PLM Summit

    - by Terri Hiskey
    Now that we've closed the door on another Oracle OpenWorld, planning is in full swing for the next PLM Summit, taking place February 4-6, 2013 in San Francisco, in conjunction with the Oracle Value Chain Summit. This event is a must-attend for all Agile PLM customers. We will be holding five tracks with over forty Agile PLM-focused sessions covering a range of topics and industries. If you'd like to be notified once registration is live for this event, be sure to sign up at www.oracle.com/goto/vcs.

    CALL FOR PRESENTATIONS: We are looking for some fresh, new customer stories to share with attendees. Read below for descriptions of the five tracks and the suggested topics that we'd like to hear about from customers. If you are interested in presenting at the PLM Summit (and getting a FREE pass to attend if your presentation is accepted!), send me an email at terri.hiskey-AT-oracle.com with:
    - Your proposed session title and the track your session fits into
    - 3-5 bullets of takeaways that attendees will get from your presentation
    - Your complete contact information including name, title, company, telephone number and email
    The deadline for this call for presentations is Thursday, November 15, so get your submission in soon!

    PLM Track #1: Product Insights and Best Practices
    This track will provide executive attendees and line of business managers with an overview of how Agile PLM has been deployed and used at customers to enable and manage critical product-related business processes including enterprise quality and supplier management, compliance, product cost management, portfolio management, commercialization and software lifecycle management. These sessions will also provide details around how to manage the development and rollout of the solutions and how to achieve and track value. Possible session topics:
    - Software Lifecycle Management
    - Enterprise Quality Management
    - New Product Development
    - Integrated Business Planning
    - ECO Effectivity Planning
    - Rapid Commercialization
    - Manage the Design to Release Process for Complex Configured Products
    - PLM for Life Sciences Companies I (Compliant Data Set)
    - PLM for Life Sciences Companies II (eMDR, UDI)
    - Discrete CPG – Private Label Mgmt
    - Cost Management and Strategic Sourcing
    - IP Mgmt in the Semiconductor Industry
    - Implementing the Enterprise Training Record using Agile PLM

    PLM Track #2: Product Deep Dives & Demos
    This track is aimed at line of business and IT managers who would like to understand the benefits of expanding their PLM footprint. The sessions in this track will provide attendees with an up-close and in-depth look at Agile PLM’s newer and exciting applications, including analytics and innovation management, and will detail features and functionality that are available in the latest version of Agile PLM. Possible session topics:
    - Oracle Product Lifecycle Analytics
    - Integrating PLM with Engineering and Supply Chain Systems
    - Streamline PLM Design to Manufacturing Processes with AutoVue Visualization Solutions
    - Achieve Environmental Compliance (REACH and ROHS) with Agile Product Governance & Compliance
    - PIM Deep Dive
    - Achieving Integrated Change Control with Agile PLM and E-Business Suite
    - Deploying PLM at Small and Midsize Enterprises
    - Enhancing Oracle PQM w/APQP and 8D Functionality
    - Advanced Roles and Privileges – Enabling ITAR
    - Model Unit Effectivity
    - Implementing REACH with 9.3.2
    - Deploying Job Functions, Functional Teams in 9.3.2 to Improve Your Approval Matrix

    PLM Track #3: Administration & Integrations
    This track will provide sessions for Agile administrators, managers and daily Agile PLM users who are preparing to upgrade or looking to extend the use of their current PLM implementation through AIA and process extensions. It will include deeper conversation about Agile PLM features and best practices on managing an Agile PLM infrastructure. Possible session topics:
    - Expand the Value of your Agile Investment with Innovative Process Extension Ideas
    - Ensuring Implementation & Upgrade Success
    - Ensure the Integrity and Accuracy of Product Data Across the Enterprise
    - Maximize the Benefits of an Integrated Architecture with AIA
    - Integrating your PLM Implementation with ERP
    - Infrastructure Optimization
    - Expanding Your PLM Implementation
    - PLM Administrator Open Forum Q&A/Discussion
    - FDA Validation Best Practices
    - Best Practices for Managing a Large Agile Deployment: Clustering, Load Balancing and Firewalls

    PLM Track #4: Agile PLM for Process
    This track is aimed at attendees interested in or currently using Agile PLM for Process. The sessions in this track will go over new features and functionality available in the newest version of PLM for Process and will give attendees an overview on how PLM for Process is being used to manage critical business processes such as formulation, recipe and specification management. Possible session topics:
    - PLM for Process Strategy, Roadmap and Update
    - New Product Development and Introduction
    - Effective Product Supplier Collaboration
    - Leverage Agile Formulation and Compliance to Manage Cost, Compliance, Quality, Labeling and Nutrition
    - Menu Management
    - Innovation Data Management
    - Food Safety/Introduction of P4P
    - Quality Mgmt

    PLM Track #5: Agile PLM and Innovation Management
    This track consists of five sessions, and is for attendees interested in learning more about Oracle’s Agile Innovation Management, an exciting new addition to the Agile PLM application family that redefines the industry’s scope of product lifecycle management. Oracle’s innovation solutions enable companies to collaborate in a focused way among various functional groups (marketing, sales, operations, engineering/R&D and sourcing), combining insights of customer needs/requirements, competition, available technologies, alternate design scenarios and portfolio constraints to deliver what customers truly value. The results are better products, higher margins, greater efficiencies, more satisfied customers and the increased ability to continuously innovate. Possible session topics:
    - Product Innovation Management Solution Overview
    - Product Requirements & Ideation Management
    - Concept Design Management
    - Product Lifecycle Portfolio Management
    - Innovation as a Competitive Differentiator

    Read the article

  • College Courses through distance learning

    - by Matt
    I realize this isn't really a programming question, but didn't really know where to post this in the stackexchange and because I am a computer science major i thought id ask here. This is pretty unique to the programmer community since my degree is about 95% programming. I have 1 semester left, but i work full time. I would like to finish up in December, but to make things easier i like to take online classes whenever I can. So, my question is does anyone know of any colleges that offer distance learning courses for computer science? I have been searching around and found a few potential classes, but not sure yet. I would like to gather some classes and see what i can get approval for. Class I need: Only need one C SC 437 Geometric Algorithms C SC 445 Algorithms C SC 473 Automata Only need one C SC 452 Operating Systems C SC 453 Compilers/Systems Software While i only need of each of the above courses i still need to take two more electives. These also have to be upper 400 level classes. So i can take multiple in each category. Some other classes I can take are: CSC 447 - Green Computing CSC 425 - Computer Networking CSC 460 - Database Design CSC 466 - Computer Security I hoping to take one or two of these courses over the summer. If not, then online over the regular semester would be ok too. Any help in helping find these classes would be awesome. Maybe you went to a college that offered distance learning. Some of these classes may be considered to be graduate courses too. Descriptions are listed below if you need. Thanks! Descriptions Computer Security This is an introductory course covering the fundamentals of computer security. In particular, the course will cover basic concepts of computer security such as threat models and security policies, and will show how these concepts apply to specific areas such as communication security, software security, operating systems security, network security, web security, and hardware-based security. Computer Networking Theory and practice of computer networks, emphasizing the principles underlying the design of network software and the role of the communications system in distributed computing. Topics include routing, flow and congestion control, end-to-end protocols, and multicast. Database Design Functions of a database system. Data modeling and logical database design. Query languages and query optimization. Efficient data storage and access. Database access through standalone and web applications. Green Computing This course covers fundamental principles of energy management faced by designers of hardware, operating systems, and data centers. We will explore basic energy management option in individual components such as CPUs, network interfaces, hard drives, memory. We will further present the energy management policies at the operating system level that consider performance vs. energy saving tradeoffs. Finally we will consider large scale data centers where energy management is done at multiple layers from individual components in the system to shutting down entries subset of machines. We will also discuss energy generation and delivery and well as cooling issues in large data centers. Compilers/Systems Software Basic concepts of compilation and related systems software. Topics include lexical analysis, parsing, semantic analysis, code generation; assemblers, loaders, linkers; debuggers. 
Operating Systems Concepts of modern operating systems; concurrent processes; process synchronization and communication; resource allocation; kernels; deadlock; memory management; file systems. Algorithms Introduction to the design and analysis of algorithms: basic analysis techniques (asymptotics, sums, recurrences); basic design techniques (divide and conquer, dynamic programming, greedy, amortization); acquiring an algorithm repertoire (sorting, median finding, strong components, spanning trees, shortest paths, maximum flow, string matching); and handling intractability (approximation algorithms, branch and bound). Automata Introduction to models of computation (finite automata, pushdown automata, Turing machines), representations of languages (regular expressions, context-free grammars), and the basic hierarchy of languages (regular, context-free, decidable, and undecidable languages). Geometric Algorithms The study of algorithms for geometric objects, using a computational geometry approach, with an emphasis on applications for graphics, VLSI, GIS, robotics, and sensor networks. Topics may include the representation and overlaying of maps, finding nearest neighbors, solving linear programming problems, and searching geometric databases.

    Read the article

  • Database-as-a-Service on Exadata Cloud

    - by Gagan Chawla
    Note: Oracle Enterprise Manager 12c DBaaS is platform agnostic and is designed to work on Exadata/non-Exadata, physical/virtual, and Oracle/non-Oracle platforms; it is not a mandatory requirement to use Exadata as the base platform.

    Database-as-a-Service (DBaaS) is an important trend these days, and the top business drivers motivating customers towards a private database cloud model include constant pressure to reduce IT cost and complexity, and the need to improve agility and quality of service. The first step many enterprises take in their journey towards cloud computing is to move to a consolidated and standardized environment. Exadata is already a proven, best-in-class consolidation platform, and we now see more and more customers evolve from an Exadata-based platform into an agile, self-service-driven private database cloud using Oracle Enterprise Manager 12c. Together, Exadata Database Machine and Enterprise Manager 12c provide the industry's most comprehensive and integrated solution to transform a typical siloed environment into an enterprise-class database cloud with self service, rapid elasticity and pay-per-use capabilities.

    In today's post, I'll list the important steps to enable DBaaS on Exadata using Enterprise Manager 12c. These steps are based on a recent DBaaS implementation from a real customer engagement:

    1. Project Planning: The first step involves defining the scope of implementation, mapping functional requirements and objectives to use cases, defining high availability, network and security requirements, and delivering the project plan. In a cloud project you plan around technology, business and processes all together, so ensure you engage your actual end users and stakeholders early on in the project, right from the scoping and planning stage.

    2. Set up your EM 12c Cloud Control site: Once project plan approval and sign-off from stakeholders is achieved, refer to the EM 12c install guide. Some important tips to follow during the site setup phase:
    - Review the new EM 12c sizing paper before you get started with the install.
    - The Cloud, Chargeback and Trending, and Exadata plug-ins should be selected for deployment during install.
    - Refer to the EM 12c Administrator's Guide for high availability, security, and network/firewall best practices and options.
    - Your management and managed infrastructure should not be combined, i.e. the EM 12c repository should not be hosted on the same Exadata where the target database cloud is to be set up.

    3. Set up roles and users: Cloud Administrator (EM_CLOUD_ADMINISTRATOR), Self Service Administrator (EM_SSA_ADMINISTRATOR) and Self Service User (EM_SSA_USER) are the important roles required for cloud lifecycle management. Roles and users are managed by the Super Administrator via the Setup menu -> Security option. For self service (SSA) users, custom role(s) based on EM_SSA_USER should be created, and the EM_USER and PUBLIC roles should be revoked during SSA user account creation.

    4. Configure the Software Library: The Cloud Administrator logs in and configures the software library via the Enterprise menu -> Provisioning and Patching option; the storage location is an OMS shared filesystem. The Software Library is the centralized repository that stores all software entities and is often termed the 'local store'.

    5. Set up Self Update: Self Update is one of the most innovative and cool new features in the EM 12c framework. Self Update can be accessed via Setup -> Extensibility by the Super Administrator and is the unified delivery mechanism to get all new and updated entities (agent software, plug-ins, connectors, gold images, provisioning bundles, etc.) in EM 12c.

    6. Deploy agents on all compute nodes and discover Exadata targets: Refer to the Exadata discovery cookbook for a detailed walkthrough to ensure successful discovery of Exadata targets.

    7. Configure privilege delegation settings: This step involves deployment of the privilege setting template on all the nodes by the Super Administrator via the Setup menu -> Security option, with the option to define whether to use sudo or PowerBroker for all provisioning and patching operations.

    8. Provision Grid Infrastructure with RAC Database on compute nodes: Software is provisioned in this step via a provisioning profile using EM 12c database provisioning. In the case of Exadata, Grid Infrastructure and RAC Database software is already deployed on the compute nodes via OneCommand from Oracle, so the SSA Administrator just needs to discover the Oracle Homes and Listener as EM targets. Databases will be created as and when users request them from the cloud.

    9. Customize the Create Database deployment procedure: The actual database creation steps are "templatized" in this step by the Self Service Administrator, and the newly saved deployment procedure is used during service template creation in the next step. This is an important step; make sure you have locked all the required variables marked as locked = 'Y' in this table.

    10. Set up the Self Service Portal: This step involves setting up zones, user quotas, service templates and the chargeback plan. The SSA portal is set up by the Self Service Administrator via the Setup menu -> Cloud -> Database option, following the guided workflow. Refer to the DBaaS cookbook for details. You also have the option to customize the SSA login page via the steps documented in the EM 12c Cloud Administrator's Guide.

    11. Final checks: Define and document process guidelines for SSA users and administrators. Get your SSA users trained on the Self Service Portal features and the overall DBaaS model; SSA administrators should be familiar with the Self Service Portal setup pieces, EM 12c database lifecycle management capabilities and the overall EM 12c monitoring framework.

    12. GO LIVE: Announce the rollout of Database-as-a-Service to your SSA users. Users can log in to the Self Service Portal and request/monitor/view their databases in the Exadata-based database cloud. Congratulations! You just delivered a successful database cloud implementation project!

    In future posts, we will cover these additional useful topics around database cloud:
    - DBaaS implementation tips and tricks, right from setup to self service to managing the cloud lifecycle
    - 'How to' enable copies of real production databases in DBaaS with rapid provisioning in the database cloud
    - A case study of a customer who recently achieved success in their transformational journey from a traditional siloed environment to an Exadata-based database cloud using Enterprise Manager 12c

    More information:
    - Podcast on Database as a Service using Oracle Enterprise Manager 12c
    - Oracle Enterprise Manager 12c Installation and Administration Guide, Cloud Administration Guide
    - DBaaS Cookbook
    - Exadata Discovery Cookbook
    - Screenwatch: Private Database Cloud: Set Up the Cloud Self-Service Portal
    - Screenwatch: Private Database Cloud: Use the Cloud Self-Service Portal

    Read the article

  • BPM 11g - Dynamic Task Assignment with Multi-level Organization Units

    - by Mark Foster
    I've seen several requirements for a more granular level of task assignment in BPM 11g, based on some value in the data passed to the process. Parametric Roles is normally the first port of call to try to satisfy this requirement, but in this blog we will show how a lot of use-cases can be satisfied by the easier-to-implement and flexible Organization Unit.

    The Use-Case
    Task assignment is to an approval group containing several users. At runtime, a location value in the input data determines which of those users the task is ultimately assigned to. In this case we use the Demo Community referenced in the SOA Admin Guide, and specifically the "LoanAnalyticGroup", which contains three users: "szweig", "mmitch" and "fkafka". In our scenario we would like to assign a task to "szweig" if the input data specifies that the location is "JapanCentral", to "fkafka" if the location is "JapanNorth", to "mmitch" if it is "JapanSouth", and to all of them if the location is "Japan".

    The Process
    A simple process with one human task. In the output data association of the "Start" activity we set the value of the "Organization Unit" predefined variable based on the input data (note that predefined variables can only be set on output data associations). In the output data association of the human activity we reset the "Organization Unit" to empty: always good practice, to ensure that the Organization Unit will not be used for any subsequent human activities for which we do not require it.

    Set Up the Organization Unit
    Log in to the BPM Workspace with an administrator user (weblogic/welcome1 in our case) and choose the "Administration" option. Within "Roles", assign the "ProcessOwner" swim-lane for our process to "LoanAnalyticGroup". Within "Organization Units" we can model our organization: "Japan" as the root organization unit, and "Central", "South" and "North" as child organization units. As described previously, add user "szweig" to "Central", "mmitch" to "South" and "fkafka" to "North".

    Test the Process
    Invalid data: first let us test with invalid data in the input to see what the consequences are; here we use "X" as input. Looking at the instance, we can see it has errored.
    Organization Unit root-level assignment: now let us see what happens with "Japan" in the input data. In the flow trace we can see that the task has been assigned, but to whom? Looking in the BPM Workspace for user "szweig", then for "mmitch", and then for "fkafka", each of them has the task, so with an Organization Unit at root level we have successfully assigned the task to all users.
    Organization Unit child-level assignment: now let us test with "Japan/North" in the input data. In the workspace of "fkafka" we see the task has been assigned (remember, he was associated with "JapanNorth"). But what about the workspace of "szweig"? No tasks assigned, and neither for "mmitch", just as we expected.

    Summary
    We have seen in this blog how to easily implement multi-level dynamic task routing using Organization Units: a common use-case, and a simpler solution than Parametric Roles.
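    To make the location-to-Organization-Unit mapping explicit, here is a minimal Groovy sketch of the logic described above. This is illustrative only: in BPM 11g the value is actually set declaratively in the output data association (e.g. via an XPath expression), and the locationToOrgUnit function and location codes are assumptions made for the sketch, not a BPM API.

        // Illustrative sketch: map the location code from the input data to the
        // Organization Unit path expected by the "Organization Unit" predefined variable.
        String locationToOrgUnit(String location) {
            switch (location) {
                case 'Japan':        return 'Japan'          // root OU: szweig, mmitch and fkafka all see the task
                case 'JapanCentral': return 'Japan/Central'  // child OU: szweig only
                case 'JapanSouth':   return 'Japan/South'    // child OU: mmitch only
                case 'JapanNorth':   return 'Japan/North'    // child OU: fkafka only
                default:
                    // mirrors the "X" test above: an unknown value leaves the instance in error
                    throw new IllegalArgumentException("Unknown location: ${location}")
            }
        }

        assert locationToOrgUnit('JapanNorth') == 'Japan/North'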

    Read the article

  • Detecting Duplicates Using Oracle Business Rules

    - by joeywong-Oracle
    Recently I was involved with a Business Process Management proof of concept (BPM PoC) where we wanted to show how customers could use Oracle Business Rules (OBR) to easily define rules that detect certain conditions, such as duplicate account numbers, duplicate names or high transaction amounts, in a set of transactions. Traditionally you would have to loop through the transactions and compare each transaction with every other one to find matching conditions. This is not particularly nice, as it relies on more traditional approaches (coding) and is not the most efficient way. OBR is a great place to house these types of rules, as it lets users and developers externalise the rules from the message flows in a simpler manner, and lets users change them when required.

    So I went looking for some examples. After quite a bit of time spent Googling, I did not find much out in the blogosphere. In fact the best example was actually from...... wait for it...... Oracle documentation! (http://docs.oracle.com/cd/E28271_01/user.1111/e10228/rules_start.htm#ASRUG228) However, if you followed the link, there was not much explanation provided with the example. So the aim of this article is to provide a little more explanation so that it can be better understood. Note: I won't be covering the BPM parts in great detail.

    Use case: A payment instruction file needs to be processed. Before the instruction file can be processed, it needs to be approved by a business user. Before the approval process, it would be useful to run the payment instruction file through OBR to look for transactions of interest. The output of the OBR can then be used to flag the transactions for the approvers to investigate. (Figure: example BPM process)

    So let's start defining the Business Rules dictionary. For the input into our rules, we will pass in an array of payments which contains some basic information for our demo purposes (Figure: input to Business Rules). For our output we want an array of rule output messages (Figure: output from Business Rules). Note that the element I am using for the output is for only one rule message element and not an array; we will configure the Business Rules component later to return an array instead.

    Create the dictionary: fill in all the details and click OK. Then open the Business Rules component and select Decision Functions from the side. Select the decision function and click on the edit button (the pencil); don't worry that JDeveloper indicates that there is an error with the decision function. Then click the Outputs tab and make sure the checkbox under the List column is checked; this tells the Business Rules component that it should return an array of rule message elements.

    Next we will define the actual rules. Click on Ruleset1 on the side and then on Create Rule in the IF/THEN Rule section.

    OK, this is where some detailed explanation is required. Remember that the input to this Business Rules dictionary is a list of payments, and each of those payments is of the complex type PaymentType. Each payment is treated as a fact in the Oracle Business Rules engine's working memory. So in the IF/THEN rule, the first task is to grab two PaymentType facts from the working memory and assign them to temporary variable names (payment1 and payment2 in our example). Once we have them in the temporary variables, we can start comparing them to each other. For our demonstration we want to find payments where the account numbers are the same but the account name is different. And to stop the rule from comparing the same facts to each other, over and over again, we have to include one last test. And that's it! No for-loops, no need to keep track of what you have or have not compared: OBR handles all that for you, because everything is done in its working memory. Once all the tests have been satisfied, we assert a new fact for the output.

    Save your Business Rules. The next step is to complete the data association in the BPM process. Pay extra care to use Copy List instead of the default Copy when doing data association at an array level. Deploy and test.

    Parting words: ideally you would then use the output of the Business Rules component to display/flag the transactions which triggered the rule, so that the approver can investigate. Link: SOA Project Archive [Download]
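    For readers who want to see, outside the rules engine, what the rule effectively evaluates, here is a minimal Groovy sketch of the pairwise matching logic. The PaymentType property names (id, accountNumber, accountName) are assumptions for illustration; the real ones come from the XSD imported into the dictionary. This is exactly the hand-written looping that OBR's working memory saves you from:

        // Assumed fact shape for the sketch; real properties come from the payment XSD.
        class PaymentType {
            long id                 // fact identity, used to avoid comparing a pair twice
            String accountNumber
            String accountName
        }

        List<PaymentType> payments = [
            new PaymentType(id: 1, accountNumber: '1001', accountName: 'J. Smith'),
            new PaymentType(id: 2, accountNumber: '1001', accountName: 'John Smyth'),  // suspicious pair
            new PaymentType(id: 3, accountNumber: '2002', accountName: 'A. Jones')
        ]

        def ruleMessages = []
        for (payment1 in payments) {
            for (payment2 in payments) {
                if (payment1.accountNumber == payment2.accountNumber    // same account number...
                        && payment1.accountName != payment2.accountName // ...but a different name...
                        && payment1.id < payment2.id) {                 // ...and the "last test": never compare a pair twice
                    // in OBR this is where the output fact is asserted
                    ruleMessages << "Possible duplicate: payments ${payment1.id} and ${payment2.id}"
                }
            }
        }
        assert ruleMessages.size() == 1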

    Read the article

  • Thought Oracle Usability Advisory Board Was Stuffy? Wrong. Justification for Attending OUAB: ROI

    - by ultan o'broin
    Looking for reasons to tell your boss why your organization needs to join the Oracle Usability Advisory Board, or why you need approval to attend one of its meetings (see the requirements)? Try phrases such as "continued return on investment (ROI)", "increased productivity" or "happy workers". With OUAB your participation is about realizing and sustaining ROI across the entire applications life-cycle: from input to designs to implementation choices and integration, usage and performance, and on to measuring and improving the onboarding and support experience.

    If you think this is a boring meeting of middle-aged people sitting around moaning about customizing desktop forms and why the BlackBerry is here to stay, think again! How about this for a rich agenda, all designed to engage the audience in a thought-provoking and feedback-eliciting day of swirling interactions, contextual usage, global delivery, mobility, consumerization, gamification and tailoring your implementation to reflect real users doing real work in real environments.

    (Figure: foldable, rollable ereader devices provide a newspaper-like UX for electronic news. Or a way to wrap silicon chips, perhaps. Explored at the OUAB Europe meeting; photograph from the Terrace Restaurant in TVP. Nom.)

    At the 7 December 2012 OUAB Europe meeting in Oracle Thames Valley Park, UK, Oracle partners and customers stepped up to the mic and PPT decks with a range of facts and examples to astound any UX conference C-level sceptic. Over the course of the day we covered much ground, but it all related, in a contextual, flexible, simplification and engagement way, to delivering results for business: that means solving problems. This means being about the user and their tasks, and how design and technology transform work into a productive activity that users and bean counters alike will be excited by. The sessions really gelled for me:

    1. Mobile design patterns and the powerful propositions for customers and partners offered by using the design guidance with Oracle ADF Mobile. Customers' and partners' existing ADF developers are now productive, efficient ADF Mobile developers, applying proven UX guidance using ADF Mobile components and other Oracle Fusion Middleware in the development toolkit. You can find the Mobile UX Design Patterns and Guidance on Building Mobile Apps on OTN.

    2. Oracle Voice and Apps: how this medium offers so much potential in the enterprise and offers a window into Fusion Apps cloud web services, Oracle RightNow NLP and Nuance technology. Exciting stuff, demoed live on a mobile phone. Stay tuned for more features and modalities and how you can tailor your own apps experience.

    3. Oracle RightNow Natural Language Processing (NLP) Virtual Assistant technology (Ella): how contextual intervention and learning from user sessions delivers a great personalized UX for users interacting with Ella, a fifth-generation VA, to solve problems and seek knowledge.

    4. BYOD keynote: a balanced keynote address contrasting Fujitsu's explanation of the concept, challenges and trends, and its expectation that BYOD must be embraced in a flexible way, against the resolute, crafted, high-security enterprise requirements that temper the BYOD concept with the realities of a world of water-tight information and device-sharing policies. Fascinating stuff, as well as providing anecdotes to make us think about our own BYOD deployments. One size does not fit all.

    5. Icon cultural surveys, results and insights arising: ever wondered about the cultural appropriateness of icons used in software UIs, and how these icons are assessed for global use? Or considered that social media "Like" icons might be unacceptable hand gestures in some cultures or enterprises? Or whether old-world icons, like the floppy disk for "save", are still acceptable? Well, the survey results told you. Challenges must be tested, over time, and context of use is critical now, including external factors such as internet and social media adoption. Indeed, the fears about global rejection of the face and hand icons were not borne out, and some of the more anachronistic icons (checkbooks, microphones, reel-to-reel tape decks, 3.5" floppies for "save") have become accepted metaphors for current actions. More importantly, the findings brought into focus the reason for OUAB: engage with and elicit feedback through working groups before we build anything.

    6. EReaders and Oracle iBook: what are the uptake and trends of ereaders? And how about a demo of an iBook with enterprise apps content? Well received by the audience, the session included a live running poll of ereader usage.

    7. Gamification design jam: a fun, hands-on event for teams of Oracle staff, partners and customers, actually building gamified flows, a practice that can be applied right away by customers and partners.

    8. UX Direct: a new offering of usability best practices, coming to an external website for you in 2013. Find a real user, observe their tasks, design and approve, build and measure. Simple stuff that improves apps implementations no end.

    9. FUSE (an internal term only, basically Fusion Simplified Experience): a demo of the new face of Fusion Applications: inherently mobile, simple to use, social, personalizable and FAST. Three great demos from the HCM, CRM and ICT world showed how these UX designs can be used in different ways.

    So, a powerful breadth and depth of UX solutions and opportunities for customers and partners to engage with, and to explore how they can make their users happy and benefit their business, reaping continued ROI from those apps investments. Find out more about the OUAB and how to get involved here ...

    Read the article

  • Poor Customer Service Example

    - by MightyZot
    Lately I have been frustrated by examples of poor customer service. At least one is worth writing about, because I don't think companies realize the effects of their service policies on loyal customers.

    Bad Customer Service Example #1

    Recently, I received an offer in the mail from my cable company, suddenLink. The offer was for an updated TiVo for $12/mo. Normally I ignore offers like this one because I already have the service they're offering, and many times advertisers are offering alternatives to what is already an excellent product offering. I tend to exhibit a high level of loyalty to the products and brands that I use. In this case, we were looking to upgrade our TiVo and this deal is attractive for several reasons:
    - I don't want to pay a huge amount up-front for the device, so paying a monthly amount for the device is attractive to me.
    - My entertainment is almost all on a single invoice. I'm no longer going to be billed by suddenLink and TiVo.
    - TiVo is still involved, so I am still loyal to the brand I love. I have resisted moving to other DVRs and services for over a decade.

    I called suddenLink to order the new TiVo and was rewarded with great customer service. In fact, I can't remember ever getting poor customer service from suddenLink. They are always there to answer my technical support questions and they are very responsive to outages.

    Then I called TiVo. First of all, I chose the option on the phone system to change or cancel my service, which was consequently met by an inordinate hold time. (I'm calling this time inordinate because I get through very quickly if I want to purchase something.) This is a trend that I've noticed with companies: if you want me to be loyal to you, it should be just as easy to cancel your service as it is to purchase it. Because I should never be cancelling because I am unhappy. And if you ever want my business again, or more importantly a reference, then you'd better make the exit door open just as easily as the enter door.

    After quite some time on hold, I talked to "Victor", who was very courteous. Victor canceled my service and then told me that I could keep my current TiVo and transfer recorded programs to it from the new TiVo. Cool, I said, but what about the cost? He said there was no extra cost. This was also attractive to me because I paid for my TiVo and it would be good to use it for something at least. That was four months ago.

    This month I noticed that TiVo was still charging me for my original service. I was a little upset, but I decided to give them the benefit of the doubt. After all, I am a loyal TiVo customer and I have resisted moving to other solutions for over a decade. I'm sure they will do whatever it takes to keep my business, through TiVo or through suddenLink.

    After quite some time on hold, I was able to talk to a customer service representative, "Les". I explained that I am a loyal TiVo customer, but I purchased this deal through my cable provider. I'm still with TiVo, I just wanted a single bill and to take advantage of the pay-over-time option. "Les" told me that he was very sorry to hear that I'm leaving TiVo, to which I responded again that I wasn't leaving TiVo, I just want one invoice and to take advantage of the pay-over-time. So, after explaining that I requested a termination of the non-suddenLink account (TiVo can see both of course), I was put on hold again for quite some time while my refund was "approved". "Les" said that he could see my cancellation request back in July. Note that it is now November, so they have billed me inappropriately four times.

    After quite some time, he came back on the line and told me that he was able to "get me most of my money back." He got approval to refund 90 days. Even though I requested cancellation of one of my accounts, TiVo has that cancellation request on file and they admit overbilling me, yet I am only going to get "most" of my money back.

    To top this experience off, when we were ready to hang up, "Les" told me that he was sorry to see me go and that he hoped I would come back to TiVo again. Again, I explained to "Les" that I have not left TiVo. I am just paying them through suddenLink. At that point, he went into a small dissertation about how this is a special arrangement they have with suddenLink and very few others. He made me feel like I was doing something wrong. Why should I feel that way? TiVo made the deal with suddenLink, not me, and the deal seemed like a good compromise for me to be able to get what I need.

    Here is what TiVo customer service accomplished on those two calls: I no longer feel like I need to be loyal to the TiVo brand or service. If I had been treated better on these two calls, I would still be recommending TiVo to my friends. They would still be getting revenue from a loyal customer, who paid the same rate for over a decade, and this article wouldn't be here for you to read. Interesting... In my opinion, if you want brand loyalty, be loyal to your customers!

    Read the article

  • Breaking 1NF to model subset constraints. Does this sound sane?

    - by Chris Travers
    My first question here. Apologies if it is in the wrong forum, but this seems pretty conceptual. I am looking at doing something that goes against conventional wisdom and want to get some feedback as to whether this is totally insane or will result in problems, so critique away! I am on PostgreSQL 9.1 but may be moving to 9.2 for this part of the project. To re-iterate: does it seem sane to break 1NF in this way? I am not looking for debugging of code so much as for where people see problems this might lead.

    The Problem
    In double-entry accounting, financial transactions are journal entries with an arbitrary number of lines. Each line has either a left value (debit) or a right value (credit), which can be modelled as a single value with negatives as debits and positives as credits, or vice versa. The sum of all debits and credits must equal zero (so if we go with a single amount field, sum(amount) must equal zero for each financial journal entry). SQL-based databases, pretty much required for this sort of work, have no way to express this sort of constraint natively, and so any approach to enforcing it in the database seems rather complex.

    The Write Model
    The journal entries are append-only. There is a possibility we will add a delete model, but it will be subject to a different set of restrictions and so is not applicable here. If and when we allow deletes, we will probably do them using a simple ON DELETE CASCADE designation on the foreign key, and require that deletes go through a dedicated stored procedure which can enforce the other constraints. So inserts and selects have to be accommodated, but updates and deletes do not for this task.

    My Proposed Solution
    My proposed solution is to break first normal form and model constraints on arrays of tuples, with a trigger that breaks the rows out into another table.
    CREATE TABLE journal_line (
        entry_id bigserial primary key,
        account_id int not null references account(id),
        journal_entry_id bigint not null, -- adding references later
        amount numeric not null
    );

    I would then add "table methods" to extract debits and credits for reporting purposes:

    CREATE OR REPLACE FUNCTION debits(journal_line) RETURNS numeric
    LANGUAGE sql IMMUTABLE AS
    $$ SELECT CASE WHEN $1.amount < 0 THEN $1.amount * -1 ELSE NULL END; $$;

    CREATE OR REPLACE FUNCTION credits(journal_line) RETURNS numeric
    LANGUAGE sql IMMUTABLE AS
    $$ SELECT CASE WHEN $1.amount > 0 THEN $1.amount ELSE NULL END; $$;

    Then the journal entry table (simplified for this example):

    CREATE TABLE journal_entry (
        entry_id bigserial primary key, -- no natural keys :-(
        journal_id int not null references journal(id),
        date_posted date not null,
        reference text not null,
        description text not null,
        journal_lines journal_line[] not null
    );

    Then a table method and check constraint (the constraint needs a name):

    CREATE OR REPLACE FUNCTION running_total(journal_entry) RETURNS numeric
    LANGUAGE sql IMMUTABLE AS
    $$ SELECT sum(amount) FROM unnest($1.journal_lines); $$;

    ALTER TABLE journal_entry
        ADD CONSTRAINT journal_entry_balanced CHECK (running_total(journal_entry) = 0);

    ALTER TABLE journal_line
        ADD FOREIGN KEY (journal_entry_id) REFERENCES journal_entry(entry_id);

    And finally we'd have a breakout trigger:

    CREATE OR REPLACE FUNCTION je_breakout() RETURNS TRIGGER LANGUAGE plpgsql AS
    $$
    BEGIN
        IF TG_OP = 'INSERT' THEN
            INSERT INTO journal_line (journal_entry_id, account_id, amount)
            SELECT NEW.entry_id, account_id, amount
              FROM unnest(NEW.journal_lines);
            RETURN NEW;
        ELSE
            RAISE EXCEPTION 'Operation Not Allowed';
        END IF;
    END;
    $$;

    And finally:

    CREATE TRIGGER je_breakout_trg
    AFTER INSERT OR UPDATE OR DELETE ON journal_entry
    FOR EACH ROW EXECUTE PROCEDURE je_breakout();

    Of course the example above is simplified. There will be a status table that will track approval status, allowing for separation of duties, etc. However, the goal here is to prevent unbalanced transactions. Any feedback? Does this sound entirely insane?

    Standard Solutions?
    In getting to this point I have to say I have looked at four different current ERP solutions to this problem:
    1. Represent every line item as a debit and a credit against different accounts.
    2. Use of foreign keys against the line item table to enforce an eventual running total of 0.
    3. Use of constraint triggers in PostgreSQL.
    4. Forcing all validation solely through the app logic.

    My concerns are that #1 is pretty limiting and very hard to audit internally. It's not programmer-transparent, and so it strikes me as being difficult to work with in the future. The second strikes me as being very complex, requiring a series of constraints and foreign keys against self to make work, and is therefore hard to sort out, at least in my mind, and thus hard to work with. The fourth could be done, as we force all access through stored procedures anyway, and this is the most common solution (have the app total things up and throw an error otherwise). However, I think proof that a constraint is followed is superior to test cases, and so the question becomes whether this in fact generates insert anomalies rather than solving them. If this is a solved problem, it isn't the case that everyone agrees on the solution....

    Read the article

  • Application Composer Series: Where and When to use Groovy

    - by Richard Bingham
    This brief post is really intended as more of a reference than an article. The list below highlights two things: firstly, where you can add your own custom logic via Groovy code (the Groovy marker on each feature), and secondly, when you might use each particular feature (the most common use case). Obviously this applies only where Application Composer exists, namely Fusion CRM and Oracle Sales Cloud, and is based on current (Release 8) functionality.

    Field Triggers (Groovy: Y)
    React to run-time data changes. Only fired when the field is changed and upon submit.

    Object Triggers (Groovy: Y)
    To extend the standard processing logic for an object, based on record creation, updates and deletes. There is a split between these firing events, with some related to UI/ADF actions and others originating in the database.
    UI trigger points:
    - After Create: fires when a new object record is created. Commonly used to set default values for fields.
    - Before Modify: fires when the end-user tries to modify a field value. Could be used for generic warnings or extra security logic.
    - Before Invalidate: fires on the parent object when one of its child object records is created, updated, or deleted. For building in relationship logic.
    - Before Remove: fires when an attempt is made to delete an object record. Can be used to create conditions that prevent deletes.
    Database trigger points:
    - Before Insert in Database: fires before a new object is inserted into the database. Can be used to ensure a dependent record exists or to check for duplicates.
    - After Insert in Database: fires after a new object is inserted into the database. Could be used to create a complementary record.
    - Before Update in Database: fires before an existing object is modified in the database. Could be used to check dependent record values.
    - After Update in Database: fires after an existing object is modified in the database. Could be used to update a complementary record.
    - Before Delete in Database: fires before an existing object is deleted from the database. Could be used to check dependent record values.
    - After Delete in Database: fires after an existing object is deleted from the database. Could be used to remove dependent records.
    - After Commit in Database: fires after the change pending for the current object (insert, update, delete) is made permanent in the current transaction. Could be used when committed data that has passed all validation is required.
    - After Changes Posted to Database: fires after all changes have been posted to the database, but before they are permanently committed. Could be used to make additional changes that will be saved as part of the current transaction.

    Field Validation (Groovy: Y)
    Displays a user-entered error message based on Groovy logic validating the field value. The message is shown only when the validation logic returns false, and the logic is triggered only when tabbing out of the field on the user interface.

    Object Validation (Groovy: Y)
    Commonly used where validation is needed across multiple related fields on the object. Triggered on the submit UI action.

    Object Workflows (Groovy: Y)
    All object workflows are fired upon either record creation or update, along with the option of adding a custom Groovy firing condition. The available actions are:
    - Field Updates (Groovy: Y): change another field when a specified one changes. Intended as an easy way to set different run-time values (e.g. pick values for LOVs); the value field permits Groovy logic entry.
    - E-Mail Notification (Groovy: N): sends an email notification to specified users/roles. Templates support run-time value tokens and rich text.
    - Task Creation (Groovy: N): for adding standard tasks for use in the worklist functionality.
    - Outbound Message (Groovy: N): will create and send an XML payload of the related object SDO to a specified endpoint.
    - Business Process Flow (Groovy: N): intended for approval using the seeded process; however, it can also trigger custom BPMN flows.

    Global Functions (Groovy: Y)
    Utility functions that can be called from any Groovy code in Application Composer (across applications).

    Object Functions (Groovy: Y)
    Utility functions that are local to the parent object. Usually triggered from within 'Buttons and Actions' definitions in Application Composer, although they can be called from other code for that object (e.g. from a trigger).

    Add Custom Fields (Groovy: Y)
    When adding custom fields there are a few places you can include Groovy logic:
    - Default Value (Groovy: Y): to add logic for setting the default value when new records are entered.
    - Conditionally Updateable (Groovy: Y): to add logic to set the field to read-only or not.
    - Conditionally Required (Groovy: Y): to add logic to set the field to required or not.
    - Formula Field (Groovy: Y): used to provide a new aggregate field that is entirely based on Groovy logic and other field values.

    Simplified UI Layouts - Advanced Expressions (Groovy: Y)
    Used for creating dynamic layouts for simplified UI pages where fields and regions show/hide based on run-time context values and logic. Also includes support for the depends-on feature as a trigger.

    Related References
    - This Blog: Application Composer Series
    - Extending Sales Guide: Using Groovy Scripts
    - Groovy Scripting Reference Guide
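    For a concrete taste of the Groovy: Y entries above, here are two minimal sketches of the kind of logic these hooks accept. These run inside Application Composer's scripting context, where field API names resolve against the current record; the names used here (CustomerTier_c, Discount_c, Amount_c, ApprovalRequired_c) are illustrative assumptions, not fields from a real object, so check the actual API names in Application Composer before reusing anything.

        // Sketch 1: a Default Value expression on a custom Discount_c field.
        // The value the expression returns is what a new record starts with.
        if (CustomerTier_c == 'STRATEGIC') {
            return 0.15   // strategic customers default to a 15% discount
        }
        return 0.0        // everyone else starts at zero

        // Sketch 2: a field trigger on Amount_c, fired when the field value changes.
        // setAttribute writes another field on the same record.
        if (Amount_c != null && Amount_c > 100000) {
            setAttribute('ApprovalRequired_c', 'Y')   // flag large deals for approval
        }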

    Read the article

  • The Best BPM Journey: More Exciting Destinations with Process Accelerators

    - by Cesare Rotundo
    Oracle Open World (OOW) earlier this month was a great occasion to talk with our BPM customers. It was interesting to hear definite patterns emerging from those conversations: "BPM is a journey", "experiences to share", "our organization now understands what BPM is", and my favorite (with some caveats): "BPM is like wine tasting, once you start, you want to try more".

    These customers have started their journey, climbed up the learning curve, and reached a vantage point that allows them to see their next BPM destination. They see the next few processes they are going to tackle and improve with BPM. These processes/destinations target both horizontal processes, where BPM replaces or coordinates manual activities, and critical industry processes that the company needs to improve to compete and deliver increasing value. Each new destination generates value, allowing the organization to reduce the cost of manual processes that were not supported by apps/custom development, and to increase the efficiency of end-to-end processes partially covered by apps/custom dev.

    The question we wanted to answer is how to help organizations experience deeper success with BPM, by increasing their awareness of the potential for reaching new targets and equipping them with the right tools. We decided that we needed to identify destinations, and plot routes to show the fastest path to those destinations. In the end we want to enable customers to reach "Process Excellence": continuously set new targets and consistently and efficiently reach them.

    The result is Oracle Process Accelerators (PA), solutions built using the rich functionality in Oracle BPM Suite. PAs offer a rapidly expanding list of exciting destinations. Our launch of the latest installment of Process Accelerators at Oracle Open World includes new industry-focused solutions, such as Public Sector Incident Reporting and Financial Services Loan Origination, and improves other horizontal PAs, including Travel Request Management, Document Routing and Approval, and Internal Service Requests.

    Just before OOW we had extended the Oracle deployment of Travel Request Management, riding the enthusiastic response from early adopters among travelers (employees), management and support (approvers). "Getting there first" means being among the first to extract value from the PA approach, while acquiring deeper insights into the customers' perspective. This is especially noteworthy when it comes to PAs, a set of solutions designed to be quickly deployed and iteratively improved by customers.

    The OOW launch has generated immediate feedback from customers, non-customers, analysts, and partners. They all confirmed that both business and IT at organizations benefit from PAs when it comes to exploring the potential for BPM to improve their business processes. PAs help customers visualize what can be done with BPM, and PAs are made to be extended: you can see your destination, change the path to fit your needs, and deploy.

    We're discovering new destinations/processes that the market wants us to support, generic enough across industries and within industries. We'll keep on building sets of requirements, delivering functional designs, constructing solutions using Oracle BPM, and testing them not only functionally but for performance, scalability and clustering, making them robust and product-quality. Delivering BPM solutions with product-grade quality is the equivalent of following a tried-and-tested path on a map. Do you know of existing destinations in your industry? If yes, we can draw a path to innovative processes together.

    Read the article

< Previous Page | 18 19 20 21 22 23 24 25  | Next Page >