Search Results

Search found 69138 results on 2766 pages for 'oracle data mining'.

Page 343/2766 | < Previous Page | 339 340 341 342 343 344 345 346 347 348 349 350  | Next Page >

  • B2B communication using IBM MQ

    - by Dheeraj Kumar M
    Oracle B2B 11g provides the out-of-the-box ability to connect to IBM MQ to exchange messages. This support is provided via the JMS offering of Oracle B2B and is an addition to the existing stack of B2B communication capabilities with trading partners. There are two ways of connecting to IBM MQ using B2B: 1) credential-based connectivity and 2) .bindings-based connectivity. As a pre-requisite to connecting to IBM MQ, the following libraries must be provided on the classpath: com.ibm.mqjms.jar, dhbcore.jar, com.ibm.mq.jar, com.ibm.mq.jmqi.jar, mqcontext.jar, com.ibm.mq.pcf.jar, com.ibm.mq.commonservices.jar, com.ibm.mq.headers.jar, fscontext.jar and jms.jar. Add the above jars to the domain library directory, usually located at $DOMAIN_DIR/lib (for example /user_projects/domains//lib/). The jars located in this ($DOMAIN_DIR/lib) directory are picked up and added dynamically to the end of the server classpath at server startup. Alternatively, the above jars can also be added as part of setDomainEnv.sh.
    Credential-based connectivity. Outbound: configure the trading partner delivery channel to use the "Generic JMS" protocol. Inbound: configure the internal delivery channel to use the "Generic JMS" protocol with the following details:
      - Destination Name: MQ queue name
      - Connection Factory: MQ queue manager name
      - Destination Provider: java.naming.factory.initial=com.ibm.mq.jms.context.WMQInitialContextFactory;java.naming.provider.url=<host>:<QM Listen port>/<MQ Channel Name>;
      - User Name: MQ user name
      - Password: MQ password
    .bindings-based connectivity. As a pre-requisite, get or generate the .bindings file on the MQ server (this can be done by the MQ administrator). Then set the following values in the respective delivery channel for outbound/inbound:
      - Destination Name: MQ queue name
      - Connection Factory: MQ queue manager name
      - Destination Provider: java.naming.factory.initial=com.ibm.mq.jms.context.WMQInitialContextFactory;java.naming.provider.url=file:///<location of .bindings file>;
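    Purely for illustration (this code is not part of the B2B configuration; the host, port, channel, queue manager, queue name and the exact lookup behaviour of WMQInitialContextFactory are assumptions), a rough Java sketch of how the same Destination Provider properties resolve MQ objects over JNDI:
      import java.util.Hashtable;
      import javax.jms.Connection;
      import javax.jms.ConnectionFactory;
      import javax.jms.Queue;
      import javax.naming.Context;
      import javax.naming.InitialContext;

      public class MqJndiCheck {
          public static void main(String[] args) throws Exception {
              Hashtable<String, String> env = new Hashtable<String, String>();
              // The same values that go into the B2B "Destination Provider" field above.
              env.put(Context.INITIAL_CONTEXT_FACTORY,
                      "com.ibm.mq.jms.context.WMQInitialContextFactory");
              env.put(Context.PROVIDER_URL, "mqhost:1414/SYSTEM.DEF.SVRCONN"); // <host>:<port>/<channel>

              Context ctx = new InitialContext(env);
              ConnectionFactory cf = (ConnectionFactory) ctx.lookup("QM1"); // queue manager name (assumption)
              Queue queue = (Queue) ctx.lookup("B2B.IN.QUEUE");             // queue name (assumption)

              Connection con = cf.createConnection("mqUser", "mqPassword"); // MQ user name / password
              con.start();
              System.out.println("Connected; resolved queue " + queue.getQueueName());
              con.close();
              ctx.close();
          }
      }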

    Read the article

  • What Should You Look for In a CRM Demo?

    - by charles.knapp
    I have helped firms evaluate software demos and delivered demos in diverse industries such as manufacturing, healthcare, life sciences, and travel (to name just a few). Here are a few suggestions. First, which vendor has the best fit for your industry? Make sure that the vendor demo staff tell you clearly throughout the demo (not just in a passing comment) what portion of each business process and screen is standard, what has been configured, what has been custom coded, and what has been provided by a partner. If you don't keep asking, what you buy may be less useful than what you saw. This will lead to added (and unbudgeted) costs and time. Second, what are the roles of the primary users? What are their top-most needs, such as exception-oriented dashboards or rapid data entry? Can you get a demo for each key role, showing how the software fits a typical workday? Have the vendor repeatedly tell you what is standard, configured, custom coded, or provided by a partner. Third, how well does the demo balance ease of use with completeness of business processes? One common approach is to hide needed fields or steps that are of low visual value. Another approach is to focus heavily on a visually appealing capability, while downplaying the fit with your key business processes. The result: despite their business acumen, demo attendees may not focus adequately on gaps in business fit. So, look for complete disclosure and complete CRM. To arrange a demo from Oracle, please visit http://www.oracle.com/crm.

    Read the article

  • The New OEPE 12.1.1.2 is Out - ADF Development and More

    - by Juan Camilo Ruiz
    Yes, you are reading that right. Just last week we announced the general availability of OEPE release 12.1.1.1.1, which includes support for developing applications for the Oracle Cloud. Today we are happy to announce the release of OEPE 12c (12.1.1.2), which includes various improvements for web services policies and security, and new features for implementing ADF applications in Eclipse Juno (3.8.1 and 4.2.1), as well as bug fixes for other areas of the product - all of the above on top of the Oracle Cloud support from the previous release. Many of the new features in this release have been added based on the feedback that we got from the ADF community, so many thanks to you all and please, keep them coming! The main new features of this release are:
      - ADF bindings support on task flow activities on the diagram.
      - Support for multi-node tree component bindings.
      - Automatic ID generation for ADF Faces components.
      - Support for drag-and-drop of components and bindings into the page outline, in addition to the regular JSP editor.
      - Improved web services policies and security.
    You can download the new version from here. Remember that you can send us your feedback or post your questions on our forum on OTN. The OEPE Team.

    Read the article

  • WebSocket@QCon NY

    - by reza_rahman
    QCon NY was held on June 10-14 at the New York Marriott/Brooklyn Bridge. Part of the QCon franchise, this is one of the most significant IT conferences in the greater NYC area. It was an honor to do a WebSocket (JSR 356) talk at the conference. Unfortunately, my schedule was such that I could only attend one day of the conference and did not really get a chance to attend many sessions or do much networking. I did get a chance to talk to fellow Oracle speakers Doug Clarke, Stephen Chin and Frederic Desbiens, which was great. My session, titled Building Java HTML5/WebSocket Applications with JSR 356, was very well attended and I had some excellent Q & A. The talk introduces HTML5 WebSocket, overviews JSR 356, tours the API and ends with a small WebSocket demo on GlassFish 4. The slide deck for the talk is posted below. Building Java HTML5/WebSocket Applications with JSR 356 from Reza Rahman. The demo code is posted on GitHub: https://github.com/m-reza-rahman/hello-websocket. Oracle hosted a reception in the evening which was very well attended. Later in the evening the QCon organizers hosted a very nice speakers' dinner at a local boutique restaurant with excellent atmosphere and good food.
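    The actual demo lives in the GitHub repository above; for a flavour of the JSR 356 API the talk tours, here is a minimal, hypothetical echo endpoint (the path and class name are mine, not taken from the demo) that deploys as-is on a Java EE 7 container such as GlassFish 4:
      import javax.websocket.OnMessage;
      import javax.websocket.OnOpen;
      import javax.websocket.Session;
      import javax.websocket.server.ServerEndpoint;

      // Annotated endpoints are discovered and deployed automatically by the container.
      @ServerEndpoint("/hello")
      public class HelloEndpoint {

          @OnOpen
          public void onOpen(Session session) {
              System.out.println("WebSocket session opened: " + session.getId());
          }

          @OnMessage
          public String onMessage(String message) {
              return "Hello, " + message; // the return value is sent back to the client
          }
      }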

    Read the article

  • Drive project success & financial performance with business critical Enterprise Project Portfolio Management

    - by Sylvie MacKenzie, PMP
    Oracle Primavera invites you to the first in a series of three webcasts linking Enterprise Project Portfolio Management with enhanced operational performance and better financial results. Few organizations fully understand the impact projects have on their business. Consistently delivering successful projects is vital to the financial success of an asset-intensive organization. Enterprise Project Portfolio Management (EPPM) is not a new concept, yet for many organizations it is not considered "business critical".
    Webcast 1: Plan – Aligning project selection and prioritization with corporate objectives. This webcast will look at two key questions: Are you aligning portfolio decisions with strategic objectives? How do you effectively measure the success of your portfolio decisions? Hear from Accenture, who'll present a compelling case for why asset-intensive organizations should consider EPPM business critical. They'll explore:
      - How technology is being used to enhance project delivery
      - How collaboration enhances delivery performance
      - The major challenges associated with the planning phase of a project
    Next hear from Geoff Roberts, Industry Strategist from Oracle Primavera. With over 30 years' experience in project management/project controls in the construction, utilities and oil & gas sectors, Geoff will investigate how EPPM is a best practice that can support an organization through project selection and prioritization, ensuring that decisions are aligned with corporate objectives. Don't miss out, register today!

    Read the article

  • What Poor Project Management Might Be Costing You

    - by Sylvie MacKenzie, PMP
    For project-intensive organizations, capital investment decisions define both success and failure. Getting them wrong—the risk of delays and schedule and cost overruns is ever present—introduces the potential for huge financial losses. The resulting consequences can be significant, and directly impact both a company's profit outlook and its share price performance—which in turn is the fundamental measure of executive performance. This intrinsic link between long-term investment planning and short-term market performance is investigated in the independent report Stock Shock, written by a consultant from Clarity Economics and commissioned by the EPPM Board. A new international steering group organized by Oracle, the EPPM Board brings together senior executives from leading public and private sector organizations to explore the critical role played by enterprise project and portfolio management (EPPM). Stock Shock reviews several recent high-profile project failures and, combined with other research, draws out the lessons to be learned. It analyzes how portfolio management is an exercise in balancing risk and reward, a process that places the emphasis firmly on executives to correctly determine which potential investments will deliver the greatest value and contribute most to the bottom line. Conversely, it also details how poor evaluation decisions can quickly impact the overall value of an organization's project portfolio and compromise long-range capital planning goals.
    Failure to Deliver—In Search of ROI
    The report also cites figures from an Economist Intelligence Unit survey which found that more organizations (12 percent) expect to deliver planned ROI less than half the time than claim to deliver it 90 percent or more of the time (11 percent). This is linked to a recent report from Booz & Co. that shows the average tenure of a global chief executive has fallen from 8.1 years to 6.3 years. "Senior executives need to begin looking at effective project delivery not as a bonus, but as an essential facet of business success," according to Stock Shock author Phil Thornton. "Consolidated and integrated visibility into individual projects is the most practical solution to overcoming these challenges, which explains the increasing popularity of PPM technologies as an effective oversight and delivery platform." Stock Shock is available for download on the EPPM microsite at http://www.oracle.com/oms/eppm/us/stock-shock-report-1691569.html

    Read the article

  • LibGdx efficient data saving/loading?

    - by grimrader22
    Currently, my LibGDX game consists of a 512 x 512 map of tiles and entities such as players and monsters. I am wondering how to efficiently save and load the data of my levels. At the moment I am using JSON serialization for each class I want to save. I implement the Json.Serializable interface for all of these classes and write only the variables that are necessary. So my map consists of 512 x 512 tiles, that's about 260,000 tiles. Each tile on the map consists of a Tile object, which points to some final Tile object like a GRASS_TILE or a STONE_TILE. When I serialize each level tile, the final Tile that it points to is re-serialized over and over again, so if I have 100 tiles all pointing to GRASS_TILE, the data of GRASS_TILE is written 100 times over. When I go to load/deserialize my objects, 100 GrassTile objects are created, but they are each their own object. They no longer point to the final tile object. I feel like this makes reading/writing files very slow. If I were to abandon JSON serialization, to my knowledge my next best option would be saving the level data to an SQL database. Unless there is a way to speed up serializing/deserializing 260,000 tiles, I may have to do this. Is this a good idea? Could I really write that many tiles to the database efficiently? To sum all this up, I am trying to save my levels using JSON serialization, but it is VERY slow. What other options do I have for saving the data of so many tiles? I also must note that the JSON serialization is not slow on a PC; it is only VERY slow on a mobile device. Since file writing/reading is so slow on mobile devices, what can I do?
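    One commonly suggested way around the re-serialization described above (not from the post itself; the class, field and method names below are made up) is to write only a small type id in write()/read() and re-point to the shared definition on load:
      import com.badlogic.gdx.utils.Json;
      import com.badlogic.gdx.utils.JsonValue;

      // Hypothetical flyweight tile definitions shared by every map tile.
      enum TileType {
          GRASS(0), STONE(1);
          private final int id;
          TileType(int id) { this.id = id; }
          int getId() { return id; }
          static TileType byId(int id) { return values()[id]; }
      }

      // Hypothetical per-cell tile: only the shared TileType's id is serialized,
      // so GRASS_TILE's data is never written 100 times over.
      public class MapTile implements Json.Serializable {
          private TileType type;
          private int x, y;

          @Override
          public void write(Json json) {
              json.writeValue("t", type.getId()); // a small int instead of the whole object
              json.writeValue("x", x);
              json.writeValue("y", y);
          }

          @Override
          public void read(Json json, JsonValue jsonData) {
              type = TileType.byId(jsonData.getInt("t")); // re-point to the shared instance
              x = jsonData.getInt("x");
              y = jsonData.getInt("y");
          }
      }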

    Read the article

  • Learn How to Integrate Social Media into Your Customer Service - December 12 Webcast

    - by Tuula Fai
    Are you interested in learning more about social media customer service strategies? Then register for CRM Magazine's Roundtable Webcast, Four Social Media Support Strategies, being held Wednesday, December 12 from 11 AM - 12 PM PT (2 - 3 PM ET). The webcast features Oracle's Charlie Knapp, Director of CRM/CX Applications, Product Marketing, who will speak on best practices for social-enabling your contact center and customer support. Here is a brief overview of the webinar: Today's customers reveal an incredible amount of valuable information through social media on a daily basis. How well is your organization able to listen and respond? Join Parature, Verint Systems, KANA, and Oracle in this free webinar and learn how to:
      - Enable collaboration across the enterprise to provide service and support in social media.
      - Enhance loyalty, drive voice-of-the-customer listening, and reduce costs.
      - Intelligently identify, route, and engage directly with your customers through social media.
      - Integrate social media into contact center workflows to solve customer issues, protect your brand, and improve satisfaction.
    Register now to join us for this free web event.

    Read the article

  • MVVM - child windows and data contexts

    - by GlenH7
    Should a child window have its own data context (View-Model) or use the data context of the parent? More broadly, should each View have its own View-Model? Are there any rules to guide making that decision? What if the various View-Models will be accessing the same Model? I haven't been able to find any consistent guidance on my question. The MS definition of MVVM appears to be silent on child windows. For one example, I have created a warning message notification View. It really didn't need a data context since it was passed the message to display. But if I needed to fancy it up a bit, I would have tapped the parent's data context. I have run into another scenario that needs a child window and is more complicated than the notification box. The parent's View-Model is already getting cluttered, so I had planned on generating a dedicated VM for the child window. But I can't find any guidance on whether this is a good idea or what the potential consequences may be. FWIW, I happen to be working in Silverlight, but I don't know that this question is strictly a Silverlight issue.

    Read the article

  • How do I capture a 10053 trace for a SQL statement called in a PL/SQL package?

    - by Maria Colgan
    Traditionally, if you wanted to capture an Optimizer trace (10053) for a SQL statement, you would issue an alter session command to switch on a 10053 trace for that entire session, and then issue the SQL statement you wanted to capture the trace for. Once the statement completed you would exit the session to disable the trace. You would then look in the USER_DUMP_DEST directory for the trace file. But what if the SQL statement you were interested in was actually called as part of a PL/SQL package? Oracle Database 11g introduced a new diagnostic events infrastructure, which greatly simplifies the task of generating a 10053 trace for a specific SQL statement in a PL/SQL package. All you need to know is the SQL_ID of the statement you are interested in. Instead of turning on the trace event for the entire session, you can now switch it on for a specific SQL_ID. Oracle will then capture a 10053 trace for the corresponding SQL statement when it is issued in that session. Remember, the SQL statement still has to be hard parsed for the 10053 trace to be generated. Let's begin our example by creating a PL/SQL package called 'cal_total_sales'. The SQL statement we are interested in is the same as the one in our original example, SELECT SUM(AMOUNT_SOLD) FROM SALES WHERE CUST_ID = :B1. We need to know the SQL_ID of this SQL statement to set up the trace, and we can find it in V$SQL. We now have everything we need to generate the trace. Finally, you would look in the USER_DUMP_DEST directory for the trace file with the name you specified. Maria Colgan
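    A sketch of the sequence described above, wrapped in JDBC so it stays in Java; the connection details and SQL_ID are placeholders, the package call is schematic (its signature isn't shown in the post), and the event syntax is the 11g diagnostic-events form:
      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.Statement;

      public class TraceBySqlId {
          public static void main(String[] args) throws Exception {
              // Placeholder connection details.
              try (Connection con = DriverManager.getConnection(
                      "jdbc:oracle:thin:@//dbhost:1521/orcl", "scott", "tiger");
                   Statement stmt = con.createStatement()) {

                  String sqlId = "fnjs0nhxdsddp"; // placeholder SQL_ID found earlier in V$SQL

                  // Switch the optimizer (10053) trace on for this SQL_ID only.
                  stmt.execute("alter session set events "
                          + "'trace[rdbms.SQL_Optimizer.*][sql:" + sqlId + "]'");

                  // ... call the PL/SQL package here; the trace is written when the
                  //     statement is hard parsed in this session.

                  // Switch the event off again.
                  stmt.execute("alter session set events 'trace[rdbms.SQL_Optimizer.*] off'");
              }
          }
      }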

    Read the article

  • Best Persistence choice for J2EE-App with frequently changing Data Model

    - by Ben-G
    Whenever I develop a J2EE application, at some point I decide to switch from my dummy persistence (simply using lists and other data structures) to some sort of database persistence, usually when I hope the data model is more or less complete. From this point on, changes to the data model become exhausting, but unfortunately they occur rather often. I've used different object-relational mappers (iBatis, Hibernate) for my projects. They definitely reduce the pain that comes with data model changes, but they still make me adjust code/configuration in 3 or 4 places for every single change. To me, that's cumbersome and error prone. I had a better experience with DB4O, which simply persists Java objects as they are, but I believe its performance does not scale for huge applications. Is there any way to maintain performance while leaving out all the ugly configuration work? I'm seeking a performant framework which really hides persistence from my code. Wishful thinking? Or am I missing out on THE technology? Hope you can help.
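    Not a silver bullet, but one hedged sketch (the Customer entity and the JPA wiring are illustrative assumptions, not a recommendation of a specific framework) of keeping the "dummy" stage and the database stage behind the same seam, so a model change touches the entity and one implementation class rather than scattered configuration:
      import java.util.ArrayList;
      import java.util.HashMap;
      import java.util.List;
      import java.util.Map;
      import javax.persistence.EntityManager;

      // Minimal example entity (assumption).
      @javax.persistence.Entity
      class Customer {
          @javax.persistence.Id
          private long id;
          public long getId() { return id; }
      }

      // The rest of the application only ever talks to this interface.
      interface CustomerRepository {
          Customer findById(long id);
          List<Customer> findAll();
          void save(Customer customer);
      }

      // Early on: the "dummy" persistence, just lists/maps in memory.
      class InMemoryCustomerRepository implements CustomerRepository {
          private final Map<Long, Customer> store = new HashMap<Long, Customer>();
          public Customer findById(long id) { return store.get(id); }
          public List<Customer> findAll() { return new ArrayList<Customer>(store.values()); }
          public void save(Customer c) { store.put(c.getId(), c); }
      }

      // Later: the same contract backed by JPA; only this class changes on the switch.
      class JpaCustomerRepository implements CustomerRepository {
          private final EntityManager em;
          JpaCustomerRepository(EntityManager em) { this.em = em; }
          public Customer findById(long id) { return em.find(Customer.class, id); }
          public List<Customer> findAll() {
              return em.createQuery("select c from Customer c", Customer.class).getResultList();
          }
          public void save(Customer c) { em.persist(c); }
      }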

    Read the article

  • Organising data access for dependency injection

    - by IanAWP
    In our company we have a relatively long history of database-backed applications, but have only just begun experimenting with dependency injection. I am looking for advice about how to convert our existing data access pattern into one more suited for dependency injection. Some specific questions: Do you create one access object per table (given that a table represents an entity collection)? One interface per table? All of these would need the low-level data access object to be injected, right? What if there are dozens of tables - wouldn't that make the composition root a nightmare? Would you instead have a single interface that defines things like GetCustomer(), GetOrder(), etc.? If I took the example of EntityFramework, then I would have one Container that exposes an object for each table, but that container doesn't conform to any interface itself, so it doesn't seem compatible with DI. What we do now, in case it helps: the way we normally manage data access is through a generic data layer which exposes CRUD/transaction capabilities and has provider-specific subclasses which handle the creation of IDbConnection, IDbCommand, etc. Actual table access uses Table classes that perform the CRUD operations associated with a particular table and accept/return domain objects that the rest of the system deals with. These table classes expose only static methods, and utilise a static DataAccess singleton instantiated from a config file.

    Read the article

  • The JavaServer Faces 2.2 viewAction Component

    - by Janice J. Heiss
    Life just got easier for users of JavaServer Faces. In a new article, now up on otn/java, titled “New JavaServer Faces 2.2 Feature: The viewAction Component,” Tom McGinn, Oracle’s Principal Curriculum Developer for Oracle Server Technologies, explores the advantages offered by the JavaServer Faces 2.2 view action feature, which, according to McGinn, “simplifies the process for performing conditional checks on initial and postback requests, enables control over which phase of the lifecycle an action is performed in, and enables both implicit and declarative navigation.” As McGinn observes: “A view action operates like a button command (UICommand) component. By default, it is executed during the Invoke Application phase in response to an initial request. However, as you'll see, view actions can be invoked during any phase of the lifecycle and, optionally, during postback, making view actions well suited to performing preview checks.” McGinn explains that the JavaServer Faces 2.2 view action feature offers several advantages over the previous method of performing evaluations before a page is rendered:
      - View actions can be triggered early on, before a full component tree is built, resulting in a lighter-weight call.
      - View action timing can be controlled.
      - View actions can be used in the same context as the GET request.
      - View actions support both implicit and explicit navigation.
      - View actions support both non-faces (initial) and faces (postback) requests.
    Read the complete article here.
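    To make the view action's role concrete, here is a small hypothetical backing bean of the kind such a check might live in (bean, method and outcome names are mine, not from the article); the matching page metadata is sketched in the comment:
      import javax.faces.bean.ManagedBean; // a CDI @Named bean would work as well
      import javax.faces.bean.ViewScoped;

      /*
       * Hypothetical usage in the page's metadata section:
       *
       *   <f:metadata>
       *     <f:viewAction action="#{orderBean.checkAccess}" onPostback="false"/>
       *   </f:metadata>
       */
      @ManagedBean(name = "orderBean")
      @ViewScoped
      public class OrderBean {

          // Invoked before the view renders; a non-null outcome triggers implicit navigation.
          public String checkAccess() {
              boolean allowed = false; // placeholder for a real conditional check
              return allowed ? null : "accessDenied"; // null keeps the current view
          }
      }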

    Read the article

  • PeopleSoft RECONNECT Conference Unites the PeopleSoft Community

    - by Marc Weintraub
    The PeopleSoft team is looking forward to participating in this new PeopleSoft deep-dive conference from the Quest International Users Group. We've worked diligently with the leadership of Quest's PeopleSoft Special Interest Groups (SIGs) and Regional User Groups (RUGs) to make sure this national user event delivers PeopleSoft content that meets the needs of the PeopleSoft community. The inaugural PeopleSoft RECONNECT conference will be held August 27-29, 2012, in Hartford, Connecticut. Through our Product Strategy, Development and Support teams, Oracle will provide support for education sessions in these key tracks:
      - Human Capital Management (HCM)
      - Financials (FMS)
      - Supplier Relationship Management (SRM)
      - Supply Chain, Manufacturing & Distribution (SCM)
      - Project Costing
      - Applications Technology (PeopleTools)
    Oracle will host a general session from John Webb, plus roadmap sessions for the major PeopleSoft product areas. We will also host enhancement discussions for our key PeopleSoft solutions, allowing participants to contribute to the future of PeopleSoft through an interactive forum. All of this is part of the 100+ education sessions being offered by the customer and vendor community. There's a lot of buzz around this conference, so don't delay in registering key members of your team today. We look forward to seeing you there, so register NOW!

    Read the article

  • Technical Questions for Java Experts

    - by Tori Wieldt
    The "Oracle Technology Network" (meaning me) will be at Devoxx next week doing interviews with Java experts. Do you have technical questions about Project Jigsaw, JavaFX or Java on MacOS? Take a look at the list below of experts and topics. Leave your questions as a comment on this blog and I'll do my best to include them. Most of the interviews happen Tuesday, so get you questions in quickly. Thanks! Interviewee InterviewTopic Arun Gupta and Alexis Moussine-Pouchkine Java EE Mark Reinhold OpenJDK Mark Reinhold Project Jigsaw Jasper Potts JavaFX Scott Kovatch Java on Mac OS Brian Goetz & Mark Reinhold JDK 8 Brian Goetz Project Lambda Steven Chin JavaFX Marek Potociar JAX-RS Claude Falguiere Dev for Tablets Alan Bateman NIO2 Regina ten Bruggencate JDuchess Martijn Verburg Adopt a JSR Note: This is different than the call for questions for the Fireside chat on Tuesday afternoon, Devoxx conference keynote speakers (Henrik Ståhl, senior director of product management for the Java platform at Oracle, and Cameron Purdy, VP of development for the Java EE platform) and the technical discussion panel on Friday morning. Leave (and vote on) those questions here. 

    Read the article

  • CEO Taken Captive in His Own Factory?

    - by Stephen Slade
    Last Friday was no ordinary day for Chip Starnes, the 42-year-old factory owner of Specialty Medical Supplies in China. He recently announced moving some of the production of the company's diabetes testing equipment from Beijing to Mumbai, India. Of the 110 employees at the facility, about 80 protested by blocking the doors and refusing to let Chip Starnes out of the facility. He has been trapped in his office for several days now. The employees thought the factory was closing, but Mr. Starnes said it was not. Misinformation? Poor communication? Work stoppage? This is a good example of supply chain disruption. Parked cars are blocking the entrance to the facility, the front gates are chained closed, and the CEO is a prisoner in his own factory. Chip Starnes was presented with documents in Chinese to sign, indicating he would pay severance and meet other demands he did not understand, possibly bankrupting the company. If you depend on supply from China and other foreign suppliers, how reliable are your sources? For example, how are the shop-floor employee relations? Is it possible to predict these types of HR risks and plan around them? What are your contingencies? It's important to ask the right questions and hear good answers. Having tools in place to rapidly evaluate, assess and react to these disruptions is key to survival. Hear how leading organizations are reinforcing their supply chains and mitigating risk through technology with Oracle's latest release of Oracle Supply Chain Management. Source: WSJ pg. B1, June 25, 2013

    Read the article

  • Sorting a Grid of Data in ASP.NET MVC

    Last week's article, Displaying a Grid of Data in ASP.NET MVC, showed, step-by-step, how to display a grid of data in an ASP.NET MVC application. Last week's article started with creating a new ASP.NET MVC application in Visual Studio, then added the Northwind database to the project and showed how to use Microsoft's Linq-to-SQL tool to access data from the database. The article then looked at creating a Controller and View for displaying a list of product information (the Model). This article builds on the demo application created in Displaying a Grid of Data in ASP.NET MVC, enhancing the grid to include bi-directional sorting. If you come from an ASP.NET WebForms background, you know that the GridView control makes implementing sorting as easy as ticking a checkbox. Unfortunately, implementing sorting in ASP.NET MVC involves a bit more work than simply checking a checkbox, but the quantity of work isn't significantly greater and with ASP.NET MVC we have more control over the grid and sorting interface's layout and markup, as well as the mechanism through which sorting is implemented. With the GridView control, sorting is handled through form postbacks with the sorting parameters - what column to sort by and whether to sort in ascending or descending order - being submitted as hidden form fields. In this article we'll use querystring parameters to indicate the sorting parameters, which means a particular sort order can be indexed by search engines, bookmarked, emailed to a colleague, and so on - things that are not possible with the GridView's built-in sorting capabilities. Like with its predecessor, this article offers step-by-step instructions and includes a complete, working demo available for download at the end of the article. Read on to learn more! Read More >

    Read the article

  • External table and preprocessor for loading LOBs

    - by David Allan
    I was using the COLUMN TRANSFORMS syntax to load LOBs into Oracle using the Oracle external table, which is a handy way of doing several things - from loading LOBs from the filesystem to having constants as fields. In OWB you can use unbound external tables to define an external table using your own arbitrary access parameters - I blogged a while back on this for doing preprocessing before it was added into OWB 11gR2. For loading LOBs using the COLUMN TRANSFORMS syntax, have a read through this post on loading CLOB, BLOB or any LOB: the files to load can be specified in a field that holds a filename, and the content of that file becomes the LOB data. So, using the example from the linked post, you can define the columns, then define the access parameters - if you go the unbound external table route you can put whatever you want in here (your external table get-out-of-jail-free card). This will let you read the LOB files from the filesystem and use the external table in a mapping. Pushing the envelope a little further, I then thought about marrying the preprocessor together with COLUMN TRANSFORMS; this would have let me use a shell script, for example, as the preprocessor to list the contents of a directory and read the files as LOBs via an external table. Unfortunately that doesn't quite work - there is now a bug/enhancement logged, so one day maybe. So I'm afraid my blog title was a little bit of a teaser....
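    For readers who want the gist without clicking through, here is a hedged sketch of the kind of DDL involved - directory object, table, file and column names are placeholders, and the exact COLUMN TRANSFORMS syntax should be checked against the linked post and the Oracle documentation; it is wrapped in JDBC only to keep the example in Java:
      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.Statement;

      public class CreateLobExternalTable {
          public static void main(String[] args) throws Exception {
              // Placeholder DDL: doclist.csv holds "id,filename" rows; each named file
              // in the data_dir directory object becomes the CLOB value for that row.
              String ddl =
                  "CREATE TABLE docs_ext ( "
                + "  doc_id NUMBER, "
                + "  doc    CLOB "
                + ") ORGANIZATION EXTERNAL ( "
                + "  TYPE ORACLE_LOADER "
                + "  DEFAULT DIRECTORY data_dir "
                + "  ACCESS PARAMETERS ( "
                + "    RECORDS DELIMITED BY NEWLINE "
                + "    FIELDS TERMINATED BY ',' "
                + "    ( doc_id CHAR(10), fname CHAR(100) ) "
                + "    COLUMN TRANSFORMS ( doc FROM LOBFILE (fname) FROM (data_dir) CLOB ) "
                + "  ) "
                + "  LOCATION ('doclist.csv') "
                + ")";

              // Placeholder connection details.
              try (Connection con = DriverManager.getConnection(
                       "jdbc:oracle:thin:@//dbhost:1521/orcl", "scott", "tiger");
                   Statement stmt = con.createStatement()) {
                  stmt.execute(ddl); // the table can then be queried or used in a mapping
              }
          }
      }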

    Read the article

  • SQLAuthority News – Download Whitepaper – Power View Infrastructure Configuration and Installation: Step-by-Step and Scripts

    - by pinaldave
    Power View, a feature of the SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Server 2010 Enterprise Edition, is an interactive data exploration, visualization, and presentation experience. It provides intuitive ad-hoc reporting for business users such as data analysts, business decision makers, and information workers. Microsoft has recently released a very interesting whitepaper which covers a sample scenario that validates the connectivity of Power View reports to both PowerPivot workbooks and tabular models. This white paper talks about the following important concepts about Power View:
      - Understanding the hardware and software requirements and their download locations
      - Installing and configuring the required infrastructure when Power View and its data models are on the same computer and on different computers
      - Installing and configuring a computer used for client access to Power View reports, models, SharePoint 2012 and Power View in a workgroup
      - Configuring single sign-on access for double-hop scenarios with and without Kerberos
    You can download the whitepaper from here. This whitepaper talks about many interesting scenarios. It would be really interesting to know if you are using Power View in your production environment. If yes, would you please share your experience here? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, Data Warehousing, PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, T SQL, Technology

    Read the article

  • ALERT: Error Processing US Wage Attachment Elements In Payroll Run After RUP Patches

    - by LuciaC
    Customers who have run the Upgrade Wage Attachments process after applying the 2012 RUP are reporting errors similar to those listed below when either running a quickpay or processing a payroll for employee(s) with involuntary deductions:
      - HR_51118_HRPROC_ERR_ON_ASG ASGNO 1115 APP-PAY-51118: Error was encountered when processing assignment 1115
      - HR_51119_HRPROC_ERR_OCC_ON_ET ETNAME: Garnishment 3 APP-PAY-51119: Error was encountered when processing Element Type Garnishment 3
      - HR_6881_HRPROC_ORA_ERR SQLERRMC ORA-01403: No data found SQL_NO 520 TABLE_NAME pay_input_values_f APP-PAY-06881: Error ORA-01403: no data found has occured in table pay_input_values_f at location 520
    This issue was logged in Bug 14679161 - QUICK PAY ERROR AFTER RUP (2012) AND WAGE ATTACHMENT UPGRADE APP-PAY-06881. The following one-off patches have been released to My Oracle Support to resolve this issue*:
      - 11i: Patch 14679161
      - 12.0: Patch 14849394:R12.PAY.A
      - 12.1: Patch 14849394:R12.PAY.B
    * IMPORTANT: When (and whether) customers have run the Wage Attachment upgrade process determines the appropriate action to take. Any customer who is encountering the above error and/or has run the Wage Attachment upgrade process AFTER applying the 2012 RUP (applicable to their release level) should log a Service Request with Oracle Support to receive assistance on the necessary steps to resolve the problem BEFORE applying the above patch. Any customer who has not yet run the Wage Attachment upgrade process (either before or after applying the 2012 RUP) should follow the action plan documented in the patch readme. Customers who have already run the Wage Attachment upgrade process BEFORE applying the 2012 RUP should apply the patch (applicable to their release) listed above. Be sure to run any post-install processes, such as the data install utility and HR global driver; see the patch readme for full details. Please consult Note 404478.1: Americas (US, CA, MX) HCM High Priority Alert for the latest alert status.

    Read the article

  • How Do Top Performing High Tech Companies Measure Online Marketing Success?

    - by Charles Knapp
    You might expect a focus on Net Promoter scores, open rates, and click metrics. The real answers from top performers may surprise you. I've been working for a few months with Aberdeen Group and colleagues from IBM and Oracle to survey high technology firms worldwide on best practices in marketing and channel sales effectiveness.  Now, we will share the results of our original customer research in a new white paper and webcast. Register today to learn how leading High Tech companies are increasing their Return on Marketing Investment (ROMI) and growing channel sales revenue. Discover how top performing high tech companies manage and use customer data, measure marketing spend effectiveness, and support internal and channel sales. Learn how best in class high tech companies use enterprise data throughout their customer lifecycle -- messaging to leads, selling to prospects, and serving customers. Our speakers will be: Peter Ostrow, Research Director - Sales Effectiveness, Aberdeen Group David Lasher, Global Business Services Partner, IBM Jonathan Oomrigar, Vice President, Global High Technology Business Unit, Oracle Reserve your place now! This global webinar is on Tuesday, November 15, 10-11 am PST / 1-2 pm EST / 6-7 GMT / 7-8 CET

    Read the article

  • Best of OTN - Week of May 25th

    - by CassandraClark-OTN
    Architect Community Podcast: Going Mobile - Developing Enterprise Mobile Apps. This four-part OTN ArchBeat Podcast series is devoted to a discussion about bringing mobility to the enterprise, and how architects and developers can take advantage of the opportunities in the evolution of mobile application development. Video: Data Modeling and Moving Meditation with Kent Graziano - Want to learn more about Kent's Kscope 2014 data modeling sessions and how Chi Gung can help you get a great start on your day? Check out this video interview. Video: Oracle ACE Director Stewart Bryson on OBIEE, ODI, GoldenGate - In this interview Stewart talks about how OBIEE, ODI, GoldenGate and other technologies fit into his Kscope 2014 sessions, and about the sessions he plans to attend. Friday Funny from OTN Architect Community Manager Bob Rhubart: Even if you're not a person of a certain age, you need to read A journey into my colon -- and yours, humorist Dave Barry's wildly funny 2008 account of his colonoscopy. Because one day you will be a person of a certain age... Get involved in community conversations on the following OTN channels... OTN TechBlog The Java Source Blog The OTN Garage Blog The OTN ArchBeat Blog @oracleotn @java @OTN_Garage @OTNArchBeat @OracleDBDev OTN I Love Java OTN Garage OTN ArchBeat Oracle DB Dev OTN Java OTN ArchBeat

    Read the article

  • Configuring log4j on weblogic server for web applications.

    - by adejuanc
    To configure the WebLogic server:
    1. Read the following link: How to Use Log4j with WebLogic Logging Services - http://download.oracle.com/docs/cd/E12840_01/wls/docs103/logging/config_logs.html#wp1014610. Here is the step by step:
    2. Go to WL_HOME/server/lib and copy wllog4j.jar to the server CLASSPATH; to do this, copy the file into DOMAIN_NAME/lib.
    3. Download the log4j jar (in my case I did not have the file) from http://logging.apache.org/log4j/1.2/download.html - at the time of writing the latest available version is log4j-1.2.17.jar - and copy the file into DOMAIN_NAME/lib (as in step 2).
    4. In this case I activate log4j using WLST (the WebLogic Scripting Tool), as below:
    4.1 As you're using Windows, open a terminal window, go to DOMAIN_NAME/bin and run setDomainEnv.cmd (this file sets the environment needed to run Java).
    4.2 Execute the following commands:
        C:\>java weblogic.WLST
        wls:/offline> connect('username','password')
        wls:/mydomain/serverConfig> edit()
        wls:/mydomain/edit> startEdit()
        wls:/mydomain/edit !> cd('Servers/$YOUR_SERVER_NAME/Log/$YOUR_SERVER_NAME')
        wls:/mydomain/edit/Servers/myserver/Log/myserver !> cmo.setLog4jLoggingEnabled(true)
        wls:/mydomain/edit/Servers/myserver/Log/myserver !> save()
        wls:/mydomain/edit/Servers/myserver/Log/myserver !> activate()
    You can use ls() to list the objects under the WLS directory. This will activate log4j for use with WLS. See also Configuring WebLogic Logging Services: http://download.oracle.com/docs/cd/E12840_01/wls/docs103/logging/config_logs.html
    To configure applications:
    1. Create a log4j.properties file as below:
        log4j.debug=TRUE
        log4j.rootLogger=INFO, R
        log4j.appender.R=org.apache.log4j.RollingFileAppender
        log4j.appender.R.File=/home/server.log
        log4j.appender.R.MaxFileSize=100KB
        log4j.appender.R.MaxBackupIndex=5
        log4j.appender.R.layout=org.apache.log4j.PatternLayout
        log4j.appender.R.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSSS} %p %t %c - %m%n
    2. Copy the file to the /WEB-INF/classes directory of your application.
    3. Also perform the server-side steps above to activate log4j on WLS.
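    Once the server and the application are configured as above, application code simply asks log4j for a logger in the usual way; a trivial, hypothetical servlet for illustration (class and messages are mine):
      import java.io.IOException;
      import javax.servlet.ServletException;
      import javax.servlet.http.HttpServlet;
      import javax.servlet.http.HttpServletRequest;
      import javax.servlet.http.HttpServletResponse;
      import org.apache.log4j.Logger;

      public class OrderServlet extends HttpServlet {

          // Configured by the log4j.properties packaged under WEB-INF/classes.
          private static final Logger log = Logger.getLogger(OrderServlet.class);

          @Override
          protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                  throws ServletException, IOException {
              log.info("Processing request from " + req.getRemoteAddr());
              resp.getWriter().println("ok"); // errors would be logged with log.error(msg, ex)
          }
      }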

    Read the article

  • Introduction to the ADF Debugger

    - by Shay Shmeltzer
    Not that you'll ever need this blog entry - after all, there are never bugs in the code that YOU write. But maybe one day one of your peers will ask you for help debugging their ADF application, so here we go... One of the cool features of JDeveloper and ADF is the ADF Debugger - a way to debug the declarative parts of Oracle ADF. The debugger goes beyond your regular Java debugger and clearly shows you specific information related to Oracle ADF - things like where you are in the task flow/region hierarchy, what is in your various scopes, what the value of a specific EL expression is, and much more. However, from the number of posts on OTN where people are saying "I placed a System.out.println() to see what the value was...", it seems that not many are familiar with the power of the debugger. So here is a short demo that shows you some aspects of the debugger, such as:
      - Setting breakpoints on various ADF artifacts
      - The ADF Structure window
      - The ADF Data window
      - The EL Evaluator window
    Want to learn more about debugging ADF applications? I highly recommend that you go back in time to 2009 and attend Steve Muench's OOW presentation about ADF debugging. Can't travel in time yet? Then the second best option is to look at his very clear ADF Debugging Slides, which were the inspiration for the above demo.

    Read the article

  • Gotcha when using JavaScript in ADF Regions

    - by Frank Nimphius
    You use the ADF Faces af:resource tag to add or reference JavaScript on a page. However, adding the af:resource tag to a page fragment may not produce the desired result if the script is added as shown below:
      <?xml version='1.0' encoding='UTF-8'?>
      <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
                xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
        <af:resource type="javascript">
          function yourMethod(evt){ ... }
        </af:resource>
    Adding scripts to a page fragment like this will see the script added for the first page fragment loaded by an ADF region, but not for any subsequent fragment navigated to within the context of task flow navigation. The cause of this problem is caching: the af:resource tag is a JSP element and not a lazily loaded JSF component, which makes it a candidate for caching. To solve the problem, move the af:resource tag into a container component like af:panelFormLayout, so the script is added when the component is instantiated and added to the page:
      <?xml version='1.0' encoding='UTF-8'?>
      <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
                xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
        <af:panelFormLayout>
          <af:resource type="javascript">
            function yourMethod(evt){ ... }
          </af:resource>
        </af:panelFormLayout>
    Magically, this then works and prevents browser caching of the script when using page fragments.

    Read the article
