Search Results

Search found 23653 results on 947 pages for 'oracle workforce management'.


  • Post Crosstalk 2012

    - by David Dorf
    This year the Oracle Retail users conference, Crosstalk, had a 20% increase in attendees, driven by both new customers and those acquired via Endeca. As the product assets of Oracle have grown, so has the completeness of the solution set. This year was marked by the breadth of omni-channel stories. Rose Spicer and her marketing team always strive for an equal balance of retailer presentations, networking opportunities, and unique experiences -- this year was no exception. We had 41 different retailers from China, Russia, South Africa, Brazil, Chile, the US, Canada and the UK sharing their insights with one another. In all there were 251 executives from 120 iconic brands such as Daphne, Kohl's, Morrisons, Abercrombie & Fitch, Hot Topic, Talbots, Petco, Deckers, Sportmaster, Mr. Price, Falabella, and Disney, to name a few.

    From a product perspective, there were a few new developments from Oracle Retail: Endeca's search engine has been integrated into the ATG commerce platform; the latest Retail Analytics application, Oracle Retail Customer Analytics, is generally available; and Oracle Retail previewed a new fully-integrated mobile POS.

    But the real benefit of attending Crosstalk was hearing about the experiences of retailers and partners. Here are a few interesting facts I picked up:

    - At Kohl's, the most popular website accessed by customers within their stores is Facebook. With all the buzz about showrooming, I was really expecting it to be Amazon.
    - Daphne, a Chinese shoe retailer, is opening 3 new stores per day. Being located near the factories allows them to have a very agile supply chain as well.
    - Disney Stores have increased sales by 25% at stores upgraded to include Mobile POS. They continue to lead the pack with excellent customer experiences.
    - Quiksilver reported that 1 in 5 visits to their website comes from a tablet. More evidence that tablets are replacing traditional PCs in households.
    - By tagging shoes with RFID, Saks is able to ensure all shoe models are on display. If a model is not being displayed, it has no chance of being sold.

    Additionally, there were awards, store tours on Michigan Avenue, fireworks at Navy Pier, and the Oracle Retail house band, Bolo313, performing at Soldier Field. Speaking of which, a few retailers got on stage and jammed with the band -- a possible rival to Rock & Roll Retail? You can always find the latest info from us at the Retail Rack. The next events on tap are the Partner Summit followed by OpenWorld.

    Read the article

  • Webcast Replay Available: SOA Integration Options for E-Business Suite

    - by BillSawyer
    I am pleased to release the replay and presentation for the latest ATG Live Webcast: SOA Integration Options for E-Business Suite (Presentation). Abhishek Verma, Manager, Applications Technology Group, and Rajesh Ghosh, Group Manager, ATG Development, discussed the web service and SOA integration options for Oracle E-Business Suite. The presentation covered Oracle's integration tools and technologies, including the Oracle Applications Adapter and the Integrated SOA Gateway.

    Finding other recorded ATG webcasts: the catalog of ATG Live Webcast replays, presentations, and all ATG training materials is available in this blog's Webcasts and Training section.

    Read the article

  • This Week on the Green Data Center Management Front

    Among the big news this week in green data center management: Horizon Data Center Solutions announces a partnership with Power Loft; Cisco, GDSYS, and Incheon Metropolitan City partner to build a next-generation green data center; and Vycon Energy demonstrates how its flywheel backup power systems can be used for health-care data system power backup.

    Read the article

  • Top-Class Expertise for Middleware

    - by A&C Redaktion
    virtual7 GmbH has been a certified Oracle Platinum Partner since mid-2010. The software and consulting company focuses its work on consulting and on delivering middleware projects in the Oracle environment. Only one year later, virtual7 was named OPN Specialized Middleware Partner of the Year! Marcus Weiss, managing director of virtual7, has clear goals for the future of his company. He knows that the market for networked data systems will continue to grow, so he relies on the highest quality and technological expertise. In the Oracle webcast, Weiss talks about the company's focus areas and what makes the collaboration with Oracle special.

    Read the article

  • Latest Security Inside Out Newsletter Now Available

    - by Troy Kitch
    The September/October edition of the Security Inside Out Newsletter is now available. Learn about the Oracle OpenWorld database security sessions, hands-on labs, and demos you'll want to attend, as well as frequently asked questions about Label-Based Access Controls in Oracle Database 11g. Subscribe here for the bi-monthly newsletter.  ...and if you haven't already done so, join Oracle Database on these social networks: Twitter, Facebook, LinkedIn, Google+

    Read the article

  • New ways for backup, recovery and restore of Essbase Block Storage databases – part 2 by Bernhard Kinkel

    - by Alexandra Georgescu
    After covering the new general backup and restore options in Essbase in the first part of this article, this second part deals with the rather new Transaction Logging and Replay feature, released in version 11.1, which enhances the existing restore options. Tip: Transaction logging and replay cannot be used for aggregate storage databases. Please refer to the Oracle Hyperion Enterprise Performance Management System Backup and Recovery Guide (rel. 11.1.2.1).

    Even if backups are done on a regular, frequent basis, subsequent data entries, loads or calculations would not be reflected in a restored database. Activating Transaction Logging fills that gap and gives you an option to capture these post-backup transactions for later replay. The transactions that can be captured when Transaction Logging is enabled include, for example, data loads and calculations (the original article lists the full set in a table).

    To activate the feature, add corresponding statements to the Essbase.cfg file using the TRANSACTIONLOGLOCATION command. The complete syntax reads:

    TRANSACTIONLOGLOCATION [appname [dbname]] LOGLOCATION NATIVE ENABLE | DISABLE

    Here appname and dbname are optional parameters that, in combination with ENABLE or DISABLE, let you switch Transaction Logging on for certain applications or databases or exclude them from being logged. If only an appname is specified, the setting applies to all databases in that application; if neither appname nor dbname is defined, all applications and databases are covered. LOGLOCATION specifies the directory to which the log is written, e.g. D:\temp\trlogs; this directory must already exist or needs to be created before log information can be written to it. NATIVE is a reserved keyword that should not be changed. The following example first enables logging at a general level for all databases in the application Sample, then disables it at a more granular level for only the Basic database in that application, excluding it from being logged:

    TRANSACTIONLOGLOCATION Sample Hyperion/trlog/Sample NATIVE ENABLE
    TRANSACTIONLOGLOCATION Sample Basic Hyperion/trlog/Sample NATIVE DISABLE

    Tip: After applying changes to the configuration file you must restart the Essbase server to initialize the settings.

    Any replay of logged transactions that becomes necessary after restoring a database can be performed only by administrators. The following options are available in Administration Services via Replay Transactions on the database's right-click menu: you can replay transactions logged after the last replay request was originally executed or after the time of the last restored backup (whichever occurred later), or transactions logged after a specified time. Alternatively, you can replay transactions selectively based on a range of sequence IDs, which can be viewed using Display Transactions on the same right-click menu. These sequence IDs (0, 1, 2 and so on, shown in a screenshot in the original article) are assigned to each logged transaction and indicate the order in which the transactions were performed. This helps to ensure the integrity of the restored data after a replay, as transactions are replayed in the same order in which they were originally performed; for example, a calculation originally run after a data load cannot be replayed before the data load has been replayed. After a transaction is replayed, you can replay only transactions with a greater sequence ID. For example, replaying the transaction with sequence ID 4 includes all preceding transactions, and afterwards you can only replay transactions with a sequence ID of 5 or greater. Tip: After restoring a database from a backup you should always completely replay all logged transactions that were executed after the backup, before executing new transactions.

    Not only the transaction information itself is logged and stored in the specified directory. During transaction logging, Essbase also creates archive copies of data load and rules files in the following default directory:

    ARBORPATH/app/appname/dbname/Replay

    These files are then used during the replay of a logged transaction. By default Essbase archives only data load and rules files for client data loads, but to specify the type of data to archive when logging transactions you can add the TRANSACTIONLOGDATALOADARCHIVE command as an additional entry in the Essbase.cfg file. The syntax for the statement is:

    TRANSACTIONLOGDATALOADARCHIVE [appname [dbname]] [OPTION]

    The [appname [dbname]] argument works the same way as for TRANSACTIONLOGLOCATION. The OPTION argument controls which file copies are archived (the original article lists the valid values in a table; besides the default behaviour for client data loads, the options referenced here include SERVER, SERVER_CLIENT and NONE); choose the setting that matches the location from which transactions usually take place. Selecting the NONE option prevents Essbase from saving the respective files, and the data load cannot be replayed; in that case you must first manually load the data before you can replay the transactions. Tip: If you use server or SQL data and the data and rules files are not archived in the Replay directory (for example, you did not use the SERVER or SERVER_CLIENT option), Essbase replays the data that is actually in the data source at the moment of the replay, which may or may not be the data that was originally loaded.

    You can find more detailed information in the following documents: Oracle Hyperion Enterprise Performance Management System Backup and Recovery Guide (rel. 11.1.2.1), Oracle Essbase Online Documentation (rel. 11.1.2.1), and the Enterprise Performance Management System Documentation (including previous releases), or on the Oracle Technology Network. If you are also interested in other new features and smart enhancements in Essbase or Hyperion Planning, stay tuned for coming articles or check our training courses and web presentations. You can find general information about offerings for the Essbase and Planning curriculum or other Oracle-Hyperion products here (please make sure to select your country/region at the top of the page), or in the OU Learning Paths section, where Planning, Essbase and other Hyperion products can be found under the Fusion Middleware heading (again, please select the right country/region). Or drop me a note directly: [email protected].

    About the Author: Bernhard Kinkel started working for Hyperion Solutions as a Presales Consultant and Consultant in 1998 and moved to Hyperion Education Services in 1999. He joined Oracle University in 2007, where he is a Principal Education Consultant. Based on many years of working with Hyperion products, he has detailed product knowledge across several versions. He delivers both classroom and live virtual courses. His areas of expertise are Oracle/Hyperion Essbase, Oracle Hyperion Planning and Hyperion Web Analysis.

    Disclaimer: All methods and features mentioned in this article must be considered and tested carefully in relation to your environment, processes and requirements. As guidance please always refer to the available software documentation. This article does not recommend or advise any explicit action or change, hence the author cannot be held responsible for any consequences arising from the use or implementation of these features.

    Read the article

  • Next Quarterly Customer Update Webcast is July 24th (July 25th in Asia Pacific)

    - by R.Hunter
    Join Team Informatics, Kyle Hatlestad from the WebCenter Content "A-Team" and Oracle WebCenter Product Management for the next Oracle WebCenter Quarterly Customer Update Webcast scheduled for July 24th (July 25th in Asia Pacific). Get the latest product management updates and learn more about WebCenter Content and WebCenter Sites. Team Informatics will give an overview of the WebCenter Sites 11g Connector to WebCenter Content, and Kyle Hatlestad will discuss best practices for WCC deployment and configuration. You can follow Kyle's blog at http://blogs.oracle.com/kyle/. Don't miss out; there will be two live sessions with Q&A. Further details and the registration links for the webcast can be found on our Oracle Technology Network page.

    Read the article

  • So long Oracle ...

    - by arungupta
    ... and thanks for all the fish! This Friday (October 18, 2013) is my last day at Oracle. After publishing almost 1400 blog entries with 5500+ comments on them, working in the Java EE team since its inception, visiting 35+ countries and several cities around the world, speaking at all major Java conferences and lots of Java User Groups, being a 15-year JavaOne alumnus as staff, meeting and working with the best of the best in the Java community and, most importantly, having lots of fun, it's time for me to move on! No new blog entries will be posted on this blog. Feel free to subscribe to The Aquarium for the latest updates on Java EE and GlassFish. From now on I'll continue to publish all the excellent content that you've been used to at blog.arungupta.me. Read my new blog to learn about my new adventures! Here are some of the conference badges collected over the past years ... and the cities visited, shown on the "Miles To Go..." map in the original post. The comments on this blog are disabled as I'll not be able to respond to them. Feel free to leave comments on the new blog and I'd love to follow up with you there. Thank you very much for all the support that has been shown on this blog. I'd like to conclude with a Hindi song that I've been humming for the past few days now ... Abhi alvida mat kaho doston ... Na jaane kahan phir mulaqaat ho ... Kyonki ... Beete huye lamhon ki kasak saath to hogi ... Khawabon mein hi ho chahe mulaqaat to hogi ... For my non-Hindi readers, here is my paraphrased meaning ... Don't say goodbye yet my friends ... We'll likely meet somewhere else ... Because ... We'll always have the memories of the wonderful time spent together ... Maybe in dreams, but we will meet again ... With that, over and out, and see you at blog.arungupta.me!

    Read the article

  • ORA-7445 Troubleshooting

    - by [email protected]
    QUICKLINK: Note 153788.1 ORA-600/ORA-7445 Lookup tool; Note 1082674.1 A Video To Demonstrate The Usage Of The ORA-600/ORA-7445 Lookup Tool [Video]

    Have you observed an ORA-07445 error reported in your alert log? While an ORA-600 error is "captured" as a handled exception in the Oracle source code, an ORA-7445 is an unhandled exception caused by an OS exception, which should result in the creation of a core file. An ORA-7445 is a generic error and can occur from anywhere in the Oracle code. The precise location of the error is identified by the core file and/or trace file it produces.

    Looking for the best way to diagnose? Whenever an ORA-7445 error is raised a core file is generated, and there may be a trace file generated with the error as well. Prior to 11g, core files are located in the CORE_DUMP_DEST directory. Starting with 11g, there is a new advanced fault diagnosability infrastructure to manage trace data: diagnostic files are written into a root directory for all diagnostic data called the ADR home, and core files at 11g go to the ADR home's cdump directory. For more information on the Oracle 11g Diagnosability feature see Note 453125.1 11g Diagnosability Frequently Asked Questions and Note 443529.1 11g Quick Steps to Package and Send Critical Error Diagnostic Information to Support [Video]. NOTE: While the core file is captured in the Diagnosability infrastructure, the file may not be included with a diagnostic package.

    1. Check the Alert Log. The alert log may indicate additional errors or other internal errors at the time of the problem. In some cases, the ORA-7445 error will occur along with ORA-600, ORA-3113 or ORA-4030 errors. The ORA-7445 error can be a side effect of the other problems, and you should review the first error and its associated core file or trace file and work down the list of errors. See Note 1020463.6 DIAGNOSING ORA-3113 ERRORS, Note 1812.1 TECH: Getting a Stack Trace from a CORE file, and Note 414966.1 RDA Documentation Index. If the ORA-7445 errors are not associated with other error conditions, ensure the trace data is not truncated. If you see the message "MAX DUMP FILE SIZE EXCEEDED" at the end of the file, the MAX_DUMP_FILE_SIZE parameter is not set high enough or to 'unlimited'; vital diagnostic information could be missing from the file, making it very difficult to discover the root issue. Set MAX_DUMP_FILE_SIZE appropriately and regenerate the error for complete trace information. For pointers on deeper analysis of these errors see Note 390293.1 Introduction to 600/7445 Internal Error Analysis and Note 211909.1 Customer Introduction to ORA-7445 Errors.

    2. Search the 600/7445 Lookup Tool. Visit My Oracle Support to access the ORA-600/ORA-7445 Lookup tool (Note 153788.1). The Lookup tool may lead you to applicable content in My Oracle Support on the problem and can be used to investigate the problem with argument data from the error message, or you can pull key stack pointers from the associated trace file to match against known bugs.

    3. "Fine tune" searches in the Knowledge Base. As the ORA-7445 error indicates an unhandled exception in the Oracle source code, your search in the Oracle Knowledge Base will need to focus on the stack data from the core file or the trace file. Keep in mind that searches on generic argument data will bring back a large result set; the more you can learn about the environment and code leading to the errors, the easier it will be to narrow the hit list to match your problem. See Note 153788.1 ORA-600/ORA-7445 Troubleshooter and Note 1082674.1 A Video To Demonstrate The Usage Of The ORA-600/ORA-7445 Lookup Tool [Video]. NOTE: If no trace file is captured, see Note 1812.1 TECH: Getting a Stack Trace from a CORE file. Core files are managed through 11g Diagnosability, but are not packaged with other diagnostic data automatically. The core files can be quite large, but may be useful during analysis within Oracle Support.

    4. If assistance is required from Oracle. Should it become necessary to get assistance from Oracle Support on an ORA-7445 problem, please provide at a minimum:
    - the alert log
    - associated trace file(s), or the incident package at 11g
    - patch level information
    - core file(s)
    - information about changes in configuration and/or application prior to the issues
    - if the error is reproducible, a self-contained reproducible testcase (Note 232963.1 How to Build a Testcase for Oracle Data Server Support to Reproduce ORA-600 and ORA-7445 Errors)
    - an RDA report or Oracle Configuration Manager information (Oracle Configuration Manager Quick Start Guide, Note 548815.1 My Oracle Support Configuration Management FAQ, Note 414966.1 RDA Documentation Index)

    ***For reference to the content in this blog, refer to Note 1092832.1 Master Note for Diagnosing ORA-600.
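    As a quick illustration (a sketch only -- the v$diag_info view and the MAX_DUMP_FILE_SIZE parameter are standard, but the exact statements below are examples rather than prescribed steps), the 11g trace and core file locations and the dump size setting discussed above can be checked from SQL*Plus:

    -- Where does this 11g instance write trace files and core dumps (ADR)?
    SELECT name, value
      FROM v$diag_info
     WHERE name IN ('ADR Home', 'Diag Trace', 'Diag Cdump');

    -- Guard against truncated traces ("MAX DUMP FILE SIZE EXCEEDED")
    SHOW PARAMETER max_dump_file_size
    ALTER SYSTEM SET max_dump_file_size = 'UNLIMITED' SCOPE = BOTH;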

    Read the article

  • BI for Fast Analyses: A Successful Project for DAB bank, Carried Out by Riverland Reply

    - by A & C Redaktion
    Satisfied customers are the best marketing strategy. That is why we offer specialized partners the opportunity to have professional customer reference stories written about their own successful Oracle projects. Here on the blog we present, as an occasional series, reference reports that partners are already using successfully in their marketing. Today: Oracle partner Riverland Reply and its business intelligence project for DAB bank. For the Direkt Anlage Bank, DAB for short, BI solutions have long since become an important steering tool, and data streams from different sources must be managed reliably. To take advantage of the latest CRM and business intelligence functionality, DAB bank decided on an upgrade to Siebel CRM 8.1.1.7 with CTI together with the integration of Oracle Business Intelligence 11g. To realize its goal of simplification and standardization, DAB found a reliable partner in Munich-based Riverland Reply GmbH, which specializes in technical consulting, implementation and systems integration across processes, business solutions and technologies. Usability improved noticeably, which simplifies process flows; a unified environment now makes data access easier, and the reporting, visualization, search and collaboration functions are brought together in one BI tool. Details on the exact course of the project and the specific requirements can be found here in the DAB customer reference report. Any specialized partner that has completed a representative Oracle project can take advantage of this opportunity to showcase itself and its work. Experienced trade journalists interview both the partner and the end customer and produce a detailed, attractively prepared report, which is published through various marketing channels. Partners can of course also use the reference reports for their own marketing purposes, for example at events. Interested? Then contact Renate Mayer. We need a few key facts from you, such as the customer name, contact person and the Oracle products used, a description of the project in three to four sentences, and your in-house contact. And then: let your good work speak for itself!

    Read the article

  • SQLAuthority News Free Download Microsoft SQL Server 2008 R2 RTM Express with Management Tools S

    This blog post is in response to several inquiries about the free download of SQL Server 2008 R2 RTM. Microsoft has announced SQL Server 2008 R2 as RTM (Release to Manufacturing). Microsoft SQL Server 2008 R2 Express is a powerful and reliable data management system that delivers a rich set of features, data protection, and performance [...]...Did you know that DotNetSlackers also publishes .NET articles written by top known .NET authors? We already have over 80 articles in several categories, including Silverlight. Take a look: here.

    Read the article

  • Deploying Data Mining Models using Model Export and Import, Part 2

    - by [email protected]
    In my last post, Deploying Data Mining Models using Model Export and Import, we explored using DBMS_DATA_MINING.EXPORT_MODEL and DBMS_DATA_MINING.IMPORT_MODEL to move a model from one system to another. In this post, we'll look at two distributed scenarios that make use of this capability, plus a tip for easily moving models from one machine to another using only Oracle Database, rather than an external file transport mechanism such as FTP.

    In the first scenario, consider a company with geographically distributed business units, each collecting and managing its data locally for the products it sells. Each business unit has in-house data analysts who build models to predict which products to recommend to customers in their space. A central telemarketing business unit also uses these models to score new customers locally using data collected over the phone. Since the models recommend different products, each customer is scored using each model. This is depicted in Figure 1 (Figure 1: Target instance importing multiple remote models for local scoring).

    In the second scenario, consider multiple hospitals that collect data on patients with certain types of cancer. The data collection is standardized, so each hospital collects the same patient demographic and other health / tumor data, along with the clinical diagnosis. Instead of each hospital building its own models, the data is pooled at a central data analysis lab where a predictive model is built. Once completed, the model is distributed to hospitals, clinics, and doctor offices that can score patient data locally (Figure 2: Multiple target instances importing the same model from a source instance for local scoring).

    Since this blog focuses on model export and import, we'll only discuss what is necessary to move a model from one database to another. Here, we use the package DBMS_FILE_TRANSFER, which can move files between Oracle databases. The script is fairly straightforward, but requires setting up a database link and directory objects. We saw how to create directory objects in the previous post. To create a database link to the source database from the target, we can use, for example:

    create database link SOURCE1_LINK connect to <schema> identified by <password> using 'SOURCE1';

    Note that 'SOURCE1' refers to the service name of the remote database entry in your tnsnames.ora file. From SQL*Plus, first connect to the remote database and export the model. Note that the model_file_name does not include the .dmp extension, because export_model appends "01" to this name. Next, connect to the local database, invoke DBMS_FILE_TRANSFER.GET_FILE, and import the model. Note that "01" is eliminated in the target system file name.

    connect <source_schema>/<password>@SOURCE1_LINK;
    BEGIN
      DBMS_DATA_MINING.EXPORT_MODEL ('EXPORT_FILE_NAME' || '.dmp',
                                     'MY_SOURCE_DIR_OBJECT',
                                     'name =''MY_MINING_MODEL''');
    END;
    /

    connect <target_schema>/<password>;
    BEGIN
      DBMS_FILE_TRANSFER.GET_FILE ('MY_SOURCE_DIR_OBJECT',
                                   'EXPORT_FILE_NAME' || '01.dmp',
                                   'SOURCE1_LINK',
                                   'MY_TARGET_DIR_OBJECT',
                                   'EXPORT_FILE_NAME' || '.dmp');
      DBMS_DATA_MINING.IMPORT_MODEL ('EXPORT_FILE_NAME' || '.dmp',
                                     'MY_TARGET_DIR_OBJECT');
    END;
    /

    To clean up afterward, you may want to drop the exported .dmp file at the source and the transferred file at the target.
For example, utl_file.fremove('&directory_name', '&model_file_name' || '.dmp');
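    As a reminder from the previous post (a minimal sketch; the directory object names match the examples above, but the file system paths are illustrative only), the scripts assume a directory object exists on each instance and that the connecting schemas can read and write to it:

    -- On the source instance (path is an example)
    CREATE OR REPLACE DIRECTORY my_source_dir_object AS '/u01/app/oracle/dm_transfer';
    GRANT READ, WRITE ON DIRECTORY my_source_dir_object TO <source_schema>;

    -- On the target instance (path is an example)
    CREATE OR REPLACE DIRECTORY my_target_dir_object AS '/u01/app/oracle/dm_transfer';
    GRANT READ, WRITE ON DIRECTORY my_target_dir_object TO <target_schema>;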

    Read the article

  • Industry perspectives on managing content

    - by aahluwalia
    Earlier this week I was noodling over a topic for my first blog post. My intention for this blog is to bring a practitioner's perspective on ECM to the community; to share and collaborate on best practices and approaches that address today's business problems. Reviewing my past 14 years of experience with web technologies, I wondered what topic would serve as a good "conversation starter". During this time, I received a call from a friend who was seeking insights on how content management applies to specific industries. She approached me because she vaguely remembered that I had worked in the Health Insurance industry in the recent past. She wanted me to tell her about the specific business needs of this industry. She was in for quite a surprise as she found out that I had spent the better part of a decade managing content within the Health Insurance industry, and I discovered a great topic for my first blog post! I offer some insights from Health Insurance and invite my fellow practitioners to share their insights from other industries. What does content management mean to these industries? What can solution providers be aware of when offering solutions to these industries? The United States health care system relies heavily on private health insurance, which is the primary source of coverage for approximately 58% of Americans. In the late 19th century, "accident insurance" began to be available, which operated much like modern disability insurance. In the late 20th century, traditional disability insurance evolved into modern health insurance programs. The first thing a solution provider must be aware of about the Health Insurance industry is that it tends to be transaction intensive. These are the companies who manage and administer our health plans and process our claims when we visit our health care providers. It helps to keep in mind that they are in the business of delivering health insurance and not technology. You may find the mindset conservative in comparison to the IT industry; however, the Health Insurance industry has benefited and will continue to benefit from the efficiency that technology brings to traditionally paper-driven processes. We are all aware of the significant impact that the healthcare reform bill has had on the Health Insurance industry. Insurers are under a great deal of pressure to explore ways to reduce their administrative costs and increase operational efficiency. Overall, administrative costs of health insurance include the insurer's cost to administer the health plan, the costs borne by employers, health-care providers, governments and individual consumers. Inefficiencies plague health insurance, owing largely to the absence of standardized processes across the industry. To address this, industry leaders have come together to establish standards and invest in initiatives to help their healthcare provider partners transition to the next generation of healthcare technology. The move to online services and paperless explanation of benefits are some manifestations of technological advancements in health insurance. Several companies have adopted Toyota's LEAN methodology or Six Sigma principles to improve quality, reduce waste and excessive costs, thereby increasing the value of their plan offerings. A growing number of health insurance companies have transformed their business systems in the past decade alone and adopted some form of content management to reduce the costs involved in administering health plans.
The key strategy has been to convert paper documents and forms into electronic formats, automate the content development process and securely distribute content to various audiences via diverse marketing channels, including web and mobile. Enterprise content management solutions can enable document capture of claim forms, manage digital assets, integrate with Enterprise Resource Planning (ERP) and Human Capital Management (HCM) solutions, build Business Process Management (BPM) processes, define retention and disposition instructions to comply with state and federal regulations and allow eBusiness and Marketing departments to develop and deliver web content to multiple websites, mobile devices and portals. Content can be shared securely within and outside the organization using Information Rights Management.  At the end of the day, solution providers who can translate strategic goals into solutions that maximize process automation, increase ease of use and minimize IT overhead are likely to be successful in today's health insurance environment.

    Read the article

  • Experiencing the New Social Enterprise

    - by kellsey.ruppel(at)oracle.com
    Social media and networking tools, popularly known as Web 2.0 technologies, are rapidly transforming user expectations of enterprise systems. Many organizations are investing in these new tools to cultivate a modern user experience in an "Enterprise 2.0" environment that unlocks the full potential of traditional IT systems and fosters collaboration in key business processes. Is your organization a social enterprise? How are you using Web 2.0 and Enterprise 2.0 technologies? Read this white paper to learn how Oracle WebCenter Suite enables organizations to become social enterprises and is the modern user experience platform for the enterprise and the Web.

    Read the article

  • Top 5 Sites and Activities in San Francisco to Experience During Oracle OpenWorld

    - by kgee
    While Oracle OpenWorld may provide solutions and information on topics like how to simplify your IT, the importance of cloud, and what types of storage may satisfy your enterprise needs, who is going to tell you more about San Francisco? Here are some suggested sites and activities to experience after OpenWorld that aren't too far from the Moscone Center. It is recommended to take a cab for the sake of time, but the 6 square miles that make up San Francisco will make for a quick trek to any of the following destinations:

    The Golden Gate Bridge. An image often associated with San Francisco, this bridge is one of the most impressive in the world. Take a walk across it, or view it from nearby Crissy Field; it is a sight that floors even the most veteran of San Franciscans.

    The Ferry Building. Located at the end of Market Street in the Embarcadero, the Ferry Building once served as a hub of water transport and trade. The building has a bay-front view and an array of food choices and restaurants. It is easily accessible via the Muni, BART, trolley or by cab, and it is a must-see in San Francisco, not too far from the Moscone Center.

    Ride the Trolley to the Castro. For only $2, you can go back in history for a moment on the trolley. Take the F-line from the Embarcadero and ride it all the way to the Castro district. During the ride, you will get an overview of the landscape and cultures that are prevalent in San Francisco, but be wary that some areas may beg for an open mind more than others.

    Golden Gate Park. When you tire of the concrete jungle, the lucky part of being in San Francisco is that you can escape to a natural refuge, this park being one of the favorites. The park is known for its hiking trails, cultural attractions, monuments, lakes and gardens. It is one good reason to bring your sneakers to San Francisco, and it is also a great place to picnic. Be aware that it is easy to get lost, and it is advisable to bring a map (just in case) if you go.

    Haight Ashbury. For a complete change of scenery, Haight Ashbury is known as one of the places hippies used to live and the location of "The Summer of Love." It is now a more affluent neighborhood with boutique shops and the occasional drum circle. While it may be perceived as grungy in certain spots, it is one of the most photographed places in San Francisco and an integral part of San Franciscan history.

    Read the article

  • Consumer Electronics Show (CES):CRM for High Technology Firms

    - by charles.knapp
    The Consumer Electronics Show, opening Thursday, showcases product innovations that stem from best practices in design, manufacturing, and distribution. Oracle and IBM invite you to learn best practices from peers, as well as why it matters to use CRM tailored for high technology firms -- offered only by Oracle. On Wednesday, January 5, 1-7 pm at the Bellagio Hotel Las Vegas, learn from peers at IBM, VTech, Plantronics, Cisco, Symantec, and Oracle about how to improve:

    - Channel sales, marketing, and operations management - maximize new product introductions (NPI), sales, forecasts, training, channel promotions, and settlement
    - Winning the deal - determine the right price for the right deal for the "perfect quote," capture the order, and manage orders
    - Collaborative and rapid supply chain planning - improve agility, inventory turns, and profits

    Please join us for the Oracle/IBM CES High Technology Summit and make useful connections with your peers at the evening networking reception. Register now for this FREE event.

    Read the article

  • This Week's News From the Green Data Center Management Front

    Among this week's developments in green data center management: myriad federal and state tax credits are now available for Green IT projects related to data centers; the EPA is finalizing its Energy Star program for data centers; and Numara Software has a way to help you better green your data center.

    Read the article

  • Webcast Reminder: Implementing IDM in Healthcare, September 19th @10:00 am PST

    - by Darin Pendergraft
    Join me and Rex Thexton from PwC tomorrow (September 19th) as we review an IDM project that Rex and his team completed for a large healthcare organization. Rex will talk through the IT environment and business drivers that led to the project, and then we will go through the planning, design and implementation of the Oracle Identity Management products that PwC and the customer chose to complete the project. This will be a great opportunity to hear about the trends that are driving healthcare IT, and to get your Identity Management questions answered. If you haven't already registered - Register Here!

    Read the article

  • Join Our Call: Sun Storage 2500-M2 Announcement

    - by mseika
    Oracle's Sun Storage 2500-M2 array brings together the latest Fibre Channel (FC) and SAS2 technologies with Oracle's Sun Storage Common Array software to create a robust solution that's equally adept in an entry-level storage area network (SAN) for the mid-size business and at integrating into an existing storage network within the enterprise. The Sun Storage 2500-M2 replaces Sun's Storage 2500 array product line and is designed so that customers can qualify it quickly for fast and easy deployment in traditional 2500 environments. Jun Jang, Oracle Principal Product Manager, will be hosting this one-hour live call (a recording will be available). Please join us to find out more: 24 Jun 2011, 08:00 am PST/PDT / 4 pm UK time. Web Registration and Access; Access for Mobile Devices; International Participant Dial-In Number: 706-634-8508; Additional International Dial-In Numbers Link; Dial-In Passcode: 6395

    Read the article

  • This Week on the Green Data Center Management Front

    Among the big news this week in green data center management: APC will demonstrate how to provide energy while addressing energy efficiency legislation; Altruent Systems announced it has completed a new energy-efficient data center for one of its key clients; and Voonami is unveiling what it claims is the greenest data center in Utah.

    Read the article

  • Running Multiple Queries in Oracle SQL Developer

    - by thatjeffsmith
    There are two methods for running queries in SQL Developer:

    - Run Statement: Shift+Enter, F9, or the Run Statement toolbar button; results come back in a grid.
    - Run Script: no grids, just script (SQL*Plus-like) output, thank you very much!

    What's the Difference? There are some obvious differences between the two features, the most obvious being the format of the output delivered. But there are some other, more subtle differences here, primarily around fetching.

    What is Fetch? After you send your query to Oracle, it has to do three things: parse, execute, and fetch. Technically it has to do at least two, and sometimes only one. But to get the data back to the user, the fetch must occur. Whether you have a 10-row query or a 1,000,000-row query, this can mean one or many fetches in groups of records.

    Run Statement brings your query results to a grid with a single fetch. The user sees 50, 100, 500, etc. rows come back, but SQL Developer and the database know that there are more rows waiting to be retrieved, and the server process that was used to execute the query is still hanging around too. To alleviate this, increase your fetch size to 500. Every query you run will come back with the first 500 rows, and rows will continue to be fetched in 500-row increments; you'll then see most of your ad hoc queries complete with a single fetch. Scroll down, or hit Ctrl+End, to force a full fetch and get all your rows back.

    Run Script runs the contents of the worksheet (or what's highlighted) as a 'script.' What does that mean exactly? Think of this as being equivalent to running @my_script.sql in SQL*Plus. Each statement is executed and ALL rows are fetched, so once it's finished executing there are no open cursors left around. The more obvious difference here is that the output comes back formatted as plain old text, and you can run one or more SQL statements plus SQL*Plus commands like SET and SPOOL.

    The Trick: Run Statement Works With Multiple Statements! It says 'run statement,' but if you select more than one statement with your mouse and hit the button, it will run each one and throw the results to one grid per statement. If you hover over the Query Result panel tab, SQL Developer will tell you the query used to populate that grid. This works regardless of what you have the preference Database - Worksheet - Show query results in new tabs set to. Mind the fetch though! Close those cursors by bringing back all the records or by closing the grids when you're done with them.
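    As a small illustration (a hypothetical sketch; the HR sample-schema tables and the spool path are placeholders, not anything prescribed by the post), a worksheet like the following behaves quite differently under the two modes: Run Script executes everything top to bottom, honors SET and SPOOL, and fetches all rows as plain text, while selecting only the two queries and hitting Run Statement returns one grid per query.

    -- hypothetical worksheet contents; table names and spool path are examples only
    SET FEEDBACK ON
    SPOOL /tmp/region_report.txt

    SELECT region_id, region_name FROM hr.regions;
    SELECT country_name FROM hr.countries WHERE region_id = 1;

    SPOOL OFF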

    Read the article

  • MDM for Tax Authorities

    - by david.butler(at)oracle.com
    In last week's MDM blog, we discussed MDM in the Public Sector. I want to continue that thread. After all, no industry faces tougher data quality problems than governmental organizations, and few industries suffer more significant downside consequences of poor operations than local, state and federal governments. One key challenge area is taxation. Tax Authorities face a multitude of IT challenges. Firstly, the data used in tax calculations is increasing in volume and complexity. They must improve service by introducing multi-channel contact centers and self-service capabilities. Security concerns necessitate increasingly sophisticated data protection procedures. And cost constraints are driving Tax Authorities to rely on off-the-shelf software for many of their functional areas. Compounding these issues is the fact that the IT architectures in operation at most revenue and collections agencies are very complex. They typically include multiple, disparate operational and analytical systems across which the sum total of data about individual constituents is fragmented. To make matters more complicated, taxation is not carried out by a single jurisdiction, and often sources of income including employers, investments and other sources of taxable income and deductions must also be tracked and shared among tax authorities. Collectively, these systems are involved in tax assessment and collections, risk analysis, scoring, tracking, auditing and investigation case management.

    The Problem of Constituent Data Management

    The infrastructure described above makes it very difficult to create a consolidated representation of a given party. Differing formats and data models mean that a constituent may be represented in one way in one system and in a different way in another. Individual records are frequently inaccurate, incomplete, out of date and/or inconsistent with other records relating to the same constituent. When constituent data must be aggregated and scored, information within each system must be rationalized and normalized so the agency can produce a constituent information file (CIF) that provides a single source of truth about that party. If information about that constituent changes, each system in turn must be updated. There have been many attempts to solve this problem with technology: from consolidating transactional systems to conducting manual systems integration projects and superimposing layers of business intelligence and analytics. All these approaches can be successful in solving a portion of the problem at a specific point in time, but without an enterprise perspective, anything gained is quickly lost again.

    Oracle Constituent Data Mastering for Tax Authorities: A Single View of the Constituent

    Oracle has a flexible and long-term solution to the problem of securely integrating and managing constituent data. The Oracle solution for mastering constituent data for Tax Authorities is based on two core product offerings: Oracle Customer Hub and, optionally, Oracle Application Integration Architecture (AIA). Customer Hub is a master data management (MDM) product that centralizes, de-duplicates, and enriches constituent data. It unifies fragmented information without disrupting existing business processes or IT investments. Role-based data access and privacy rules guarantee maximum security and privacy. Data is continuously and automatically synchronized with all source systems.
With the Oracle Customer Hub managing the master constituent identity, every department can capture transaction activity against the same record, improving reporting accuracy, employee productivity, reliability of constituent analytics, and day-to-day constituent relationships. Oracle Application Integration Architecture provides a collection of core pre-built processes to support out of the box Master Data Governance across Oracle Customer Hub, Siebel CRM, and Oracle E-Business Suite. It also provides a framework to enable MDM integrations with other Oracle and non-Oracle applications. Oracle AIA removes some of the key inhibitors to implementing a service-oriented architecture (SOA) by providing a pre-built SOA-based middleware foundation as well as industry-optimized service oriented applications, all built around a SOA governance model that encourages effective design and reuse. I encourage you to read Oracle Solution for Mastering Constituents Data for Public Sector – Tax Authorities by Roberto Negro. It is an outstanding whitepaper that describes how the Oracle MDM solution allows you to create a unified, reconciled source of high-quality constituent data and gain an accurate single view of each constituent. This foundation enables you to lower the costs associated with data quality and integration and create a tax organization that is efficient, secure and constituent-centric. Also, don’t forget the upcoming webcast on Thursday, February 10th: Deliver Improved Services to Citizens at Lower Cost to your Organization Our Guest Speaker is Ruben Spekle, from Capgemini. He will also provide insight into Public Sector Master Data Management and Case Management implementations including one that was executed for a Dutch Government Agency. If you are interested in how governmental organizations from around the world are using MDM to advance their cause, click here to register for the webcast.

    Read the article
