Search Results

Search found 23098 results on 924 pages for 'multiple processes'.


  • A debugging experience with "highly compatible" ASP.NET 4.5

    - by Jeff
    I have to admit that I will pretty much upgrade software for no reason other than being on the latest version. I won't do it if it's super expensive (Adobe gets money from me about once every three or four years at best), but particularly with frameworks and stuff generally available as part of my MSDN subscription, I'll be bleeding edge. CoasterBuzz was running on the MVC 4 framework pretty much as soon as they did a "go live" license for it. I didn't really jump in head-first with Windows 8 and Visual Studio 2012, in part because I just wasn't interested in doing the reinstalls for each new version. Turns out there weren't that many revisions anyway. But when the final versions were released a week and a half ago, I jumped in. I saw on one of the Microsoft sites that .NET 4.5 was a "highly compatible in-place update" to the framework. Good enough for me. I was obviously running it by default in Windows 8, and installed it on my production server. I suppose it's "highly compatible," except when it isn't. Three of my sites are running with various flavors of the MVC version of POP Forums. All of them stopped working under ASP.NET 4.5. It was not immediately obvious what the problem might be beyond an exception indicating that there were no repository classes registered with Ninject, which I use for dependency injection in the forums. This was made all the more weird by the fact that it ran fine locally in the dev Web host. My first instinct was to spin up a Windows Server VM on my local box and put the remote debugger on it. (Side note: running multiple VMs on a Retina MacBook Pro with 16 gigs of RAM is pretty much the most awesome thing ever. I can't believe this computer is for real, and not a 50-pound tower under my desk.) What might have been going on in IIS that doesn't happen in Visual Studio? In the debugging process, I realized that I might be looking in the wrong place. POP Forums creates a Ninject container using a method called from a PreApplicationStartMethod attribute, and at that time registers a module (what Ninject uses to map interfaces to implementations) that maps all of the core dependencies. It also creates an instance of an HttpModule that originally hosted the "services" (search indexing, mailer, etc.), but now just records errors. That's all well and good, but the actual repository mapping, where data is actually read or persisted, happens in Application_Start() in global.asax. The idea there is that you can swap out the SqlSingleWebServer repos for something tuned for multiple servers, Oracle or something else. Of course, if I used something like StructureMap, which does convention-based mapping for dependency injection (a class implementing ISettingsRepository called SettingsRepository is automagically mapped), I wouldn't have to worry about it. In any case, the HttpModule, being instantiated before Application_Start() gets to run, would throw because there was no repo mapped where it could get settings from the database. This makes total sense. The fix is sort of a hack, where I don't set up the innards of the HttpModule until a call to its BeginRequest is made. I say it's a hack, because its primary function, logging exceptions, won't work until the app has warmed up. Still, this brings up an interesting question about the race condition, and what changed in 4.5 when it's running in IIS.
In ASP.NET 4, it would appear that the code called via the PreApplicationStartMethod was either failing silently and running again later, or it was getting to that code after Application_Start was called. Either way, it's a weird thing. The real pain point I'm experiencing now is a bug in MVC 4 that is extremely serious, because it leaves the mobile/alternate view functionality very much broken.

    Read the article

  • Access PC Settings Easily from Your Desktop in Windows 8 and 8.1

    - by Akemi Iwaya
    Accessing your system’s settings in Windows 8 is not exactly the most straight-forward of processes, so if you need to change your settings often, then it can be a bit frustrating. With that in mind, the good folks over at 7 Tutorials have created an awesome shortcut that will take all the hassle out of accessing those settings, and make ‘tweaking’ Windows 8 much easier. After downloading the zip file, extract the exe file and place it in an appropriate folder, then create a shortcut. Once you have the new shortcut set up in the desired location (i.e. desktop or pinned to the taskbar), accessing your system’s settings has never been easier in Windows 8 and 8.1! Special Note: If you are someone who runs files through VirusTotal before using them, be aware that two listings there (Commtouch and Symantec) will flag the file as malware. We had no problems on our system whatsoever and believe the malware flags to be false positives. Download the Desktop Shortcut to PC Settings, for Windows 8 & 8.1 [7 Tutorials]     

    Read the article

  • Oracle University New Courses (Week 35)

    - by swalker
    Among this month's new courses from Oracle University, you will find: Fusion Middleware Oracle Directory Services 11g: Administration (5 days) Oracle SOA Suite 11g: Essential Concepts (Training on Demand) e-Business Suite R12 Oracle HRMS iRecruitment Fundamentals (Self-Study Course) R12 Oracle Payroll Fundamentals: Administration (Self-Study Course) R12 Oracle HRMS System Administration Fundamentals (Self-Study Course) R12 Oracle HRMS Self Service Fundamentals (Self-Study Course) R12 Oracle HRMS Implement and Use Fast Formula (Self-Study Course) R12 HRMS Work Structures Fundamentals (Self-Study Course) R12 HRMS Total Compensation Foundations (Self-Study Course) Siebel Siebel 8.1.x Chat and Voice Integration Using CCA (Self-Study Course) Siebel 8.1.x Search using Oracle Secure Enterprise Search (Self-Study Course) Siebel 8.1.x COM Web Services (Self-Study Course) Siebel 8.1.x COM Asset Based Order Management (Self-Study Course) Siebel 8.1.x COM: What is New in Product Configurator (Self-Study Course) Siebel 8.1.x COM Product Configurator Caching & Performance Management (Self-Study Course) Siebel 8.1.x COM PSP Engine Caching and Performance Management (Self-Study Course) Siebel 8.1.x Remote: Administration (Self-Study Course) Siebel 8.1.x Remote: Technical Foundations (Self-Study Course) Siebel Tools: Configuring Chart and Tree Applets (Self-Study Course) Sun - Server Administration SPARC SuperCluster Administration and Maintenance Seminar (2 days) OPN Only Sparc T4-Based Servers Installation Boot Camp (1 day) Primavera Primavera P6 Application Administration Rel 8.x (2 days) Oracle Retail Retail Merchandising System (RMS) Business Overview (Self-Study Course) Retail Invoice Matching (ReIM) Product Overview (Self-Study Course) Retail Invoice Matching (ReIM) Business Introduction (Self-Study Course) Retail Demand Forecasting: RDF Classic Product Overview (Self-Study Course) Retail Demand Forecasting Introduction (Self-Study Course) Retail Data Warehouse (RDW) Overview 13.1 (Self-Study Course) Oracle Retail Point-of-Service (POS) Product Overview (Self-Study Course) Retail Sales Audit (ReSA) Product Overview (Self-Study Course) Retail Price Management (RPM) Product Overview (Self-Study Course) Retail Merchandising System (RMS) Technical Introduction (Self-Study Course) Oracle Retail Integration Bus (RIB) Product Overview (Self-Study Course) Oracle Communications Unified Communications Suite Convergence Customization (2 days) OSM Foundations I: Tasks, Processes and Orders Contact your local Oracle University team for any information and course dates. Stay connected to Oracle University: LinkedIn OracleMix Twitter Facebook Google+

    Read the article

  • CRM@Oracle Series: CRM Analytics

    - by tony.berk
    What is the most important factor that leads to a successful CRM deployment? Is it the overall strategy, strong governance, defined processes or good data quality? Well, it's definitely a combination of all these, but the most important differentiator from our experience is Business Intelligence. Business Intelligence or Analytics is commonly mentioned as a key aspect to successful CRM and other enterprise deployments. The good news is that Oracle provides pre-built analytics dashboards, which provide real-time, actionable insight, and tools to build custom analyses. However, success with analytics, especially in a large enterprise, still requires a strong strategy, clean data for analysis, and performance. Today's CRM@Oracle slidecast covers Oracle's strategy, architecture and key success factors for deploying CRM Analytics internally at Oracle. CRM@Oracle: CRM Analytics Click here to learn more about Oracle CRM products and here to learn about Oracle Business Intelligence Applications. Have you read our other postings in the CRM@Oracle Series? If you have a particular CRM area or function which you'd like to hear how Oracle implemented it internally, post a comment and we'll get it on our list.

    Read the article

  • How to Watch NCAA March Madness Online

    - by DigitalGeekery
    You’ve filled out your brackets and now you are ready for one of America’s most popular sporting events. But what if you are stuck at work or away from your TV? Or your local affiliate is showing a different game? Today we show how to catch all the March Madness online. March Madness on Demand You’ll need a broadband connection, 512 MB RAM or higher, with cookies and Javascript enabled in your browser. March Madness on Demand offers two viewing options, a Standard Player and a High Quality player. The High Quality player is not, unfortunately, high definition. Standard Player Requirements Windows XP/Vista/7 or Mac OS X IE 6+ (We also successfully tested it in Firefox, Chrome, & Opera) Adobe Flash Player 9 or higher High Quality Player Requirements 2.4 GHz Pentium 4 or Intel-based Macintosh Mac OS 10.4.8+ (Intel-based) Windows: XP SP2, Vista, Server 2003, Server 2008, Windows 7 Firefox 1.5+ or IE 6/7/8 Silverlight 3 browser plug-in Watching March Madness on Demand Go to the March Madness on Demand website. (Link below) Check the “Watch in High Quality” section to see if your browser is ready and compatible for the High Quality viewer. If not, you’ll see a message indicating either your browser and system are incompatible… Or that you need to install Silverlight. To install Silverlight, click on the “Get HQ” button and follow the prompts to download and install Silverlight. To launch the player, click the large red “Launch Player” button. At the top of the screen, you’ll see the current and upcoming games. Click on “Watch Now” below to begin watching. At the bottom left is where you click to watch with the High Quality player. If too many people are watching the High Quality player, you’ll see the following message and have to go back to the Standard Player. At the lower right are volume controls, a “Full Screen” button, and a “Share” button which allows you to share the game you are watching on various social networking sites like Facebook and Twitter. Perhaps most importantly for those who want to steal a bit of viewing time while at work is the “Boss Button” at the top right. Clicking on the “Boss Button” will open a fake Office document so it may appear at first glance like you’re actually doing legitimate work. To return to the game, click anywhere on the screen with your mouse. You’ll be able to catch every single game of the tournament from the first round all the way through the championship with March Madness on Demand. If your computer and Internet connection can handle it, you can even watch multiple games at the same time by opening March Madness on Demand in multiple browser windows. Watch March Madness online

    Read the article

  • Iterative Conversion

    - by stuart ramage
    Question Received: I am toying with the idea of migrating the current information first and the remainder of the history at a later date. I have heard that the conversion tool copes with this, but haven't found any information on how it does. Answer: The Toolkit will support iterative conversions as long as the original master data key tables (the CK_* tables) are not cleared down from Staging (the already converted Transactional Data would need to be cleared down) and the Production instance being migrated into is actually Production (we have migrated into a pre-prod instance in the past and then unloaded this and loaded it into the real PROD instance, but this will not work for your situation. You need to be migrating directly into your intended environment). In this case the migration tool will still know all about the original keys and the generated keys for the primary objects (Account, SA, etc.) and as such it will be able to link the data converted as part of a second pass onto these entities. It should be noted that this may result in the original opening balances potentially being displayed with an incorrect value (if we are talking about Financial Transactions) and also that care will have to be taken to ensure that all related objects are aligned (e.g. a Bill must have a set of bill segments, meter reads and financial transactions, and these entities cannot exist independently). It should also be noted that subsequent runs of the conversion tool would need to be 'trimmed' to ensure that they are only doing work on the objects affected. You would not want to revalidate and migrate all Person, Account, SA, SA/SP, SP and Premise details since this information has already been processed, but you would definitely want to run the affected transactional record validation and keygen processes. There is no real "hard-and-fast" rule around this processing since it is specific to each implementation's needs, but the majority of the effort required should be detailed in the Conversion Tool section of the online help (under Administration / The Conversion Tool). The major rule is to ensure that you only run the steps and validation/keygen steps that you need and do not do a complete rerun for your subsequent conversion.

    Read the article

  • San Joaquin County, California Wins AIIM 2012 Carl E. Nelson Best Practice Award

    - by Peggy Chen
    Last month, AIIM, the global community of information professionals, announced the winners of the 2012 Carl E. Nelson Best Practices Awards. And San Joaquin County, California won in the small company category for 1-100 employees. The Carl E. Nelson Best Practices Award was established to recognize excellence in the area of information management. "Best practice" denotes a standard of excellence that has been achieved within an organization and refers to a process that can be quantified, adapted and repeated. Like many counties, San Joaquin County, California, was faced with huge challenges due to decreasing funds and staff, including a decreased capacity to build capability. It needed to streamline processes, cut costs per activity, modernize and strengthen the infrastructure, and adopt new technology and standards such as the National Information Exchange Model (NIEM). The Integrated Justice Information System (IJIS) provides a Web-based system to link more than 650,000 residents, 18 agencies countywide and other law enforcement systems nationwide. The county’s modernization initiative focused on replacing its outdated warrant system, implementing service-oriented architecture (SOA) to simplify integration between county law and justice systems, and deploying Business Process Management (BPM), Case Management with content management, and Web technologies from Oracle. A critical part of the county's success has been the proper alignment of its strategic vision with the way the organization was enabled to plan and execute (and continues to execute) its modernization project. Congratulations to San Joaquin County!

    Read the article

  • concurrency::accelerator_view

    - by Daniel Moth
    Overview We saw previously that accelerator represents a target for our C++ AMP computation or memory allocation and that there is a notion of a default accelerator. We ended that post by introducing how one can obtain accelerator_view objects from an accelerator object through the accelerator class's default_view property and the create_view method. The accelerator_view objects can be thought of as handles to an accelerator. You can also construct an accelerator_view given another accelerator_view (through the copy constructor or the assignment operator overload). Speaking of operator overloading, you can also compare (for equality and inequality) two accelerator_view objects between them to determine if they refer to the same underlying accelerator. We'll see later that when we use concurrency::array objects, the allocation of data takes place on an accelerator at array construction time, so there is a constructor overload that accepts an accelerator_view object. We'll also see later that a new concurrency::parallel_for_each function overload can take an accelerator_view object, so it knows on what target to execute the computation (represented by a lambda that the parallel_for_each also accepts). Beyond normal usage, accelerator_view is a quality of service concept that offers isolation to multiple "consumers" of an accelerator. If in your code you are accessing the accelerator from multiple threads (or, in general, from different parts of your app), then you'll want to create separate accelerator_view objects for each thread. flush, wait, and queuing_mode When you create an accelerator_view via the create_view method of the accelerator, you pass in an option of immediate or deferred, which are the two members of the queuing_mode enum. At any point you can access this value from the queuing_mode property of the accelerator_view. When the queuing_mode value is immediate (which is the default), any commands sent to the device such as kernel invocations and data transfers (e.g. parallel_for_each and copy, as we'll see in future posts), will get submitted as soon as the runtime sees fit (that is the definition of immediate). When the value of queuing_mode is deferred, the commands will be batched up. To send all buffered commands to the device for execution, there is a non-blocking flush method that you can call. If you wish to block until all the commands have been sent, there is a wait method you can call. Deferring is a more advanced scenario aimed at performance gains when you are submitting many device commands and you want to avoid the tiny overhead of flushing/submitting each command separately. Querying information Just like accelerator, accelerator_view exposes the is_debug and version properties. In fact, you can always access the accelerator object from the accelerator property on the accelerator_view class to access the accelerator interface we looked at previously. Interop with D3D (aka DX) In a later post I'll show an example of an app that uses C++ AMP to compute data that is used in pixel shaders. In those scenarios, you can benefit by integrating C++ AMP into your graphics pipeline and one of the building blocks for that is being able to use the same device context from both the compute kernel and the other shaders. You can do that by going from accelerator_view to device context (and vice versa), through part of our interop API in amp.h: *get_device, create_accelerator_view. More on those in a later post. 
Comments about this post by Daniel Moth welcome at the original blog.
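    To make the flush/wait discussion concrete, here is a minimal sketch of the accelerator_view workflow described above. It assumes Visual C++ with amp.h; the deferred queuing_mode member is spelled as in this pre-release write-up, so check your header for the exact name in the shipped version.

```cpp
#include <amp.h>
#include <iostream>

int main()
{
    using namespace concurrency;

    accelerator acc;                              // the default accelerator
    accelerator_view av1 = acc.default_view;      // handle obtained via the default_view property

    // A second, isolated view with batched (deferred) command submission.
    // The enum member name follows the post and may differ in later releases.
    accelerator_view av2 = acc.create_view(queuing_mode_deferred);

    std::wcout << acc.description << std::endl;
    std::wcout << std::boolalpha << (av1 == av2) << std::endl;   // false: two distinct views

    // ... queue parallel_for_each / copy commands against av2 here ...

    av2.flush();   // non-blocking: submit everything batched so far
    av2.wait();    // block until the queued commands have been sent to the device
    return 0;
}
```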

    Read the article

  • Oracle Insurance Gets Innovative with Insurance Business Intelligence

    - by nicole.bruns(at)oracle.com
    Oracle Insurance announced yesterday the availability of Oracle Insurance Insight 7.0, an insurance-specific data warehouse and business intelligence (BI) system that transforms the traditional approach to BI by involving business users in its creation and maintenance. "Rapid access to business intelligence is essential to compete and thrive in today's insurance industry," said Srini Venkatasantham, vice president, Product Strategy, Oracle Insurance. "The adaptive data modeling approach of Oracle Insurance Insight 7.0, combined with the insurance-specific data model, offers global insurance companies a faster, easier way to get the intelligence they need to make better-informed business decisions." New features in Oracle Insurance Insight 7.0 include: "Adaptive Data Modeling" via the new warehouse palette: Gives business users the power to configure lines of business via an easy-to-use warehouse palette tool. Oracle Insurance Insight then automatically creates data warehouse elements - such as line-specific database structures and extract-transform-load (ETL) processes - speeding up time-to-value for BI initiatives. Out-of-the-box insurance models or create-from-scratch option: Includes pre-built content and interfaces for six Property and Casualty (P&C) lines. Additionally, insurers can use the warehouse palette to deploy any and all P&C or General Insurance lines of business from scratch, helping insurers support operations in any country. Leverages Oracle technologies: In addition to Oracle Business Intelligence Enterprise Edition, the solution includes Oracle Database 11g as well as Oracle Data Integrator Enterprise Edition 11g, which delivers an Extract, Load and Transform (E-L-T) architecture and eliminates the need for a separate transformation server. Additionally, the expanded Oracle technology infrastructure enables support for Oracle Exadata. Martina Conlon, a Principal with Novarica's Insurance practice and author of Business Intelligence in Insurance: Current State, Challenges, and Expectations, says, "The need for continued investment by insurers in business intelligence capabilities is widely understood, and the industry is acting. Arming the business intelligence implementation with predefined insurance-specific content and flexible, configurable technology will get these projects up and running faster." Learn more: To see a demo of the Oracle Insurance Insight system, click here. To read the press announcement, click here.

    Read the article

  • OpenFilesView Displays All Open and Locked Files to Help Resolve In-Use Errors

    - by Jason Fitzpatrick
    Windows: You go to move a file and Windows throws up an “In Use” error. OpenFilesView shows you what application or system process is locking up the files you’re trying to move. Sometimes the culprit is obvious; if you go to move your media folder and you’ve got your media player open watching South Park, then shutting down the media player is the obvious solution. Other times the culprit is less obvious; sometimes Windows processes and less-than-obvious applications are accessing your files in ways that aren’t apparent. The screenshot below showcases the “In Use” error: This is where OpenFilesView comes into play. Fire up the application to see a list of all active files on your system. The master list is a bit overwhelming (on our test system there were over 1200 open files), but you can use the find command to drill down to specific file or folder names. Once you’ve found the locked file you can close the file handle, kill the process, or bring the process to the front (so you can examine the program, if possible, before terminating it). It’s much more efficient than rebooting in an attempt to shake the In-Use error. OpenFilesView is freeware and works on Windows XP through Windows 7.

    Read the article

  • ArchBeat Link-o-Rama for 2012-06-19

    - by Bob Rhubart
    Discussion: Public, Private, and Hybrid Clouds A conversation about the similarities and differences between public, private, and hybrid clouds; the connection between cows, condos, and cloud computing; and what architects need to know in order to take advantage of cloud computing. (OTN ArchBeat Podcast transcript) InfoQ: Current Trends in Enterprise Mobility Interesting infographics that show current developments and major trends in enterprise mobility. Recap: EMEA User Group Leaders Meeting Latvia May 2012 Tom Scheirsen recaps the recent IOUC event in Riga. Oracle Fusion Middleware Summer Camps in Lisbon: Includes Advanced ADF Training by Oracle Product Management This is how IT people deal with the Summertime Blues. Enterprise 2.0 Conference: Building Social Business | Oracle WebCenter Blog Kellsey Ruppel shares a list of E2.0 conference sessions being presented by members of the Oracle community. Linux 6 Transparent Huge Pages and Hadoop Workloads | Structured Data Greg Rahn documents a problem. BPM Standard Edition to start your BPM project "BPM Standard Edition is an entry level BPM offering designed to help organisations implement their first few processes in order to prove the value of BPM within their own organisation." Troubleshooting ADF Security 11g Login Page Failure | Andrejus Baranovskis Oracle ACE Director Andrejus Baranovskis takes a deep dive into one of the most common ADF 11g Security issues. It's Alive! - The Oracle OpenWorld Content Catalog It's what you’ve been waiting for—the central repository for information on sessions, demos, labs, user groups, exhibitors, and more. 5 minutes or less: Indexing Attributes in OID | Andre Correa Fusion Middleware A-Team blogger Andre Correa offers help for those who encounter issues when running searches with LDAP filters against OID (Oracle Internet Directory). Condos and Clouds: Thinking about Cloud Computing by Looking at Condominiums | Pat Helland In part two of the OTN ArchBeat Podcast Public, Private, and Hybrid Clouds, Oracle Cloud chief architect Mark Nelson mentions an analogy by Pat Helland that compares condos to cloud computing. After some digging I found the October 2011 presentation in which Helland explains that analogy. Thought for the Day "I have always found that plans are useless, but planning is indispensable." — Dwight Eisenhower (October 14, 1890 – March 28, 1969) Source: Quotes for Software Engineers

    Read the article

  • "Has Oracle written the script for CRM success?" - Anthony Lye on Customer Experience at BAFTA

    - by Richard Lefebvre
    Anthony Lye showcased Oracle Fusion CRM at a BAFTA gathering, and MyCustomer.com covered the story under the title of "Has Oracle written the script for CRM success?' According to MyCustomer.com, "Oracle's SVP of CRM Anthony Lye set the scene for the event, suggesting products are becoming commoditized, so that the only way to differentiate is through the relationship with the customer. But he warned that "customers are more and more in control of that relationship, so you have to provide great experiences for them." "The quickest win within your organization to create a single view is to connect your marketing organization with your selling organization, align goals, processes, people and technology," Anthony explained.   "And this is a transition that is already happening - "VPs of marketing have started turning up in the same meetings as VPs of sales, we have started to see that they want to work together" - but this convergence needs nurturing." "In Fusion there are capabilities to align the organisation - we enable marketing on the same platform to build campaigns connected to sales stages. It can affect leads and opportunities at the top end of the funnel. And the selling organisation can take advantage of marketing content - the materials that are exclusively within marketing can now be used by sales. Your sales teams have been campaigning forever, but it's usually by email, it isn't aligned with the corporate message and it's being sent to people it shouldn't. By aligning them we can increase output and the quality of that output." Anthony concluded: "Operating in a disconnected fashion having two distinct systems will cost you time and money. So we feel there's a material advantage in a solution like this." Enjoy the full story at http://www.mycustomer.com/topic/marketing/has-oracle-written-script-crm-success/139958

    Read the article

  • "Oracle", "Sybase", "SQL Server" vs just "SQL/JDBC" in the CV

    - by bobah
    How would you define a testable measure of the expertise that, if you're honest with yourself, lets you write the words "Oracle", "Sybase", or "SQL Server", and not just "Relational Databases, SQL, JDBC", in your software developer's CV? What should every XXX developer (where XXX is a vendor name) know? The question is similar to http://stackoverflow.com/questions/2119859/questions-every-good-database-sql-developer-should-be-able-to-answer but is vendor-specific. Below is a start of the list as an example, to demonstrate what kind of answers I am hoping to get. If you are an expert in X then you know that Y (X - Y below): Sybase/SQL Server - they are very similar, Sybase is much more expensive Sybase/SQL Server - for Java you can use either the native Sybase/JSQLDB driver or jTDS, which uses the TDS protocol and can connect to SQL Server as well; TDS traffic can be dumped and analyzed with the hexdump command Sybase/SQL Server - for C++ you can use FreeTDS to connect to either; for Perl, the same Sybase/SQL Server - a query can return multiple result sets and return codes, all need to be processed, otherwise errors can happen Sybase/SQL Server - sp_help, sp_helptext Sybase/SQL Server - your tables/views/procedures are under DBName/dbo/... Sybase - for C++ on Linux you can use the Sybase client API to connect (at least until recently) SQL Server - the JDBC driver has a configurable transparent failover capability Oracle - for C++ on Linux one can use OTLv4, which is a very powerful yet lightweight wrapper around the Oracle client API Oracle compilation (contributors: ammoQ) PLSQL Java Stored Procedures '' is null Hierarchical Query Analytic Functions Oracle Text
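    As a small illustration of the OTLv4 bullet above, a typical Oracle query through OTL's otl_connect/otl_stream interface looks roughly like the sketch below. This is an assumption-laden example, not production code: the OTL_ORA11G_R2 define, the scott/tiger@orcl credentials and the emp table are placeholders you would swap for your own.

```cpp
// Hypothetical OTLv4 sketch: one buffered SELECT with a typed bind variable.
#define OTL_ORA11G_R2   // pick the define that matches your Oracle client version
#define OTL_STL         // enable std::string support
#include <otlv4.h>
#include <iostream>
#include <string>

int main()
{
    otl_connect::otl_initialize();                 // initialize the OCI environment
    otl_connect db;
    try {
        db.rlogon("scott/tiger@orcl");             // user/password@tns alias (placeholder)

        // 50-row fetch buffer; :dn is a typed bind placeholder.
        otl_stream s(50, "select ename, sal from emp where deptno = :dn<int>", db);
        s << 10;                                   // bind :dn

        std::string ename;
        double sal = 0.0;
        while (!s.eof()) {
            s >> ename >> sal;
            std::cout << ename << " " << sal << "\n";
        }
        db.logoff();
    } catch (otl_exception& e) {
        std::cerr << e.msg << "\n" << e.stm_text << "\n";   // Oracle error text and offending SQL
        db.logoff();
    }
    return 0;
}
```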

    Read the article

  • ext4 jbd2 journaling active even on empty filesystem

    - by Paul
    I have been having several issues with my ext4 filesystems that seem to be due to jbd2 journaling. I made a related post here and am rephrasing it with the hope that someone may be able to help. For a minimal example, I start with an empty 8 GB USB stick and use gparted to create one ext4 partition. The command used by gparted when creating the ext4 file system is: mkfs.ext4 -j -O extent -L DataTraveler8gb /dev/sde1 I check the filesystem with gparted: e2fsck -f -y -v /dev/sde1 and I mount it: sudo mount /dev/sde1 /media/test The disk is empty, but the journaling is very active on this disk (/dev/sde1). The other disks are ext4 SSDs formatted similarly. A snapshot of iotop:
% sudo iotop -oPa
Total DISK READ: 0.00 B/s | Total DISK WRITE: 2027.21 K/s
  PID PRIO USER  DISK READ  DISK WRITE  SWAPIN     IO COMMAND
  262 be/3 root     0.00 B     56.00 K  0.00 % 0.18 % [jbd2/sda1-8]
29069 be/3 root     0.00 B      0.00 B  0.00 % 0.16 % [jbd2/sde1-8]
  891 be/3 root     0.00 B      4.00 K  0.00 % 0.03 % [jbd2/sdc1-8]
What is jbd2 doing with /dev/sde1? If I follow the same steps with a larger 2 TB disk, iotop indicates this empty disk is constantly being written to by jbd2 at a rate of MB/s as soon as I mount it. On the other disks, which have the OS and /home, I have tried to find whether any files are being modified by processes to cause this behavior, but couldn't find any. I also moved many of the disk-intensive processes to use a tmpfs, and used noatime. I have another non-SSD hard disk on this machine, /dev/sdb, that is also ext4 but was not formatted by gparted (it was given to me by a coworker). It does not appear in iotop. So I am assuming there is an issue with gparted. Any suggestions are appreciated. Also, any tips on how to modify the existing partitions to fix the issue without having to start from scratch would be great. There are some posts related to jbd2 but they didn't help (e.g. here).

    Read the article

  • MySQL for Excel 1.3.0 Beta has been released

    - by Javier Treviño
    The MySQL Windows Experience Team is proud to announce the release of MySQL for Excel version 1.3.0.  This is a beta release for 1.3.x. MySQL for Excel is an application plug-in enabling data analysts to very easily access and manipulate MySQL data within Microsoft Excel. It enables you to directly work with a MySQL database from within Microsoft Excel so you can easily do tasks such as: Importing MySQL data into Excel Exporting Excel data directly into MySQL to a new or existing table Editing MySQL data directly within Excel As this is a beta version the MySQL for Excel product can be downloaded only by using the product standalone installer at this link http://dev.mysql.com/downloads/windows/excel/ Your feedback on this beta version is very well appreciated, you can raise bugs on the MySQL bugs page or give us your comments on the MySQL for Excel forum. Changes in MySQL for Excel 1.3.0 (2014-06-06, Beta) This section documents all changes and bug fixes applied to MySQL for Excel since the release of 1.2.1. Several new features were added, for more information see What Is New In MySQL for Excel (http://dev.mysql.com/doc/refman/5.6/en/mysql-for-excel-what-is-new.html). Known limitations: Upgrading from versions MySQL for Excel 1.2.0 and lower is not possible due to a bug fixed in MySQL for Excel 1.2.1. In that scenario, the old version (MySQL for Excel 1.2.0 or lower) must be uninstalled first. Upgrading from version 1.2.1 works correctly. <CTRL> + <A> cannot be used to select all database objects. Either <SHIFT> + <Arrow Key> or <CTRL> + click must be used instead. PivotTables are normally placed to the right (skipping one column) of the imported data, they will not be created if there is another existing Excel object at that position. Functionality Added or Changed Imported data can now be refreshed by using the native Refresh feature. Fields in the imported data sheet are then updated against the live MySQL database using the saved connection ID. Functionality was added to import data directly into PivotTables, which can be created from any Import operation. Multiple objects (tables and views) can now be imported into Excel, when before only one object could be selected. Relational information is also utilized when importing multiple objects. All options now have descriptive tooltips. Hovering over an option/preference displays helpful information about its use. A new Export Data, Advanced Options option was added that shows all available data types in the Data Type combo box, instead of only showing a subset of the most popular data types. The option dialogs now include a Refresh to Defaults button that resets the dialog's options to their defaults values. Each option dialog is set individually. A new Add Summary Fields for Numeric Columns option was added to the Import Data dialog that automatically adds summary fields for numeric data after the last row of the imported data. The specific summary function is selectable from many options, such as "Total" and "Average." A new collation option was added for the schema and table creation wizards. The default schema collation is "Server Default", and the default table collation is "Schema Default". Collation options may be selected from a drop-down list of all available collations. Quick links: MySQL for Excel documentation: http://dev.mysql.com/doc/en/mysql-for-excel.html. MySQL on Windows blog: http://blogs.oracle.com/MySQLOnWindows. MySQL for Excel forum: http://forums.mysql.com/list.php?172. 
MySQL YouTube channel: http://www.youtube.com/user/MySQLChannel. Enjoy and thanks for the support! 

    Read the article

  • Learn How to Use Oracle’s Spatial and BI Tools for Location-aware Predictive Analytics

    - by Mandy Ho
    November 29, 2-3pm EST Are you an OBIEE (Oracle Business Intelligence Enterprise Edition) user? Have location data you'd like to incorporate into your analysis as well? This is a great webinar for you! Join us, as Oracle experts from both teams show how to perform predictive analytics, network analytics and spatial analysis, combined together, in real-world scenarios. We will include demos evaluating airline on-time performance and retail establishment performance. Learn how to: - Gain better business insights and improve ROI with Oracle Spatial and Graph, Oracle Advanced Analytics, and Oracle Business Intelligence Enterprise Edition (OBIEE). - Streamline and remove the complexity of building applications with OBIEE’s built-in location and analytics features. - Create the statistical model, build interactive reports and dashboards including location analysis and map visualization, and incorporate network analytics for geomarketing and site scoring. - Perform location analysis and processing such as proximity, containment, geocoding, aggregation of geographic regions, and more. Speakers include Jayant Sharma, Director, Product Management, Oracle Spatial and Mapping Technologies; Jean Ihm, Principal Product Manager, Oracle Spatial and Mapping Technologies; and Abhinav Agarwal, OBIEE Product Management. Who should attend: This webinar is appropriate for CIOs, business and technical managers, developers, and analysts involved in the design and management of analytic applications and solutions where spatial analysis can add insight and value to business processes. Click here, or the link below, to sign up today! https://www2.gotomeeting.com/register/764677554

    Read the article

  • Improving the performance of a db import process

    - by mmr
    I have a program in Microsoft Access that processes text and also inserts data into a MySQL database. This operation takes 30 minutes or less to finish. I translated it into VB.NET and it now takes 2 hours. The program works like this: a text file contains individual swipes from each person, including their ID, the time and date of the swipe in the machine, and an indicator of whether it is a time-in or a time-out. I process this text, segregate the information, and insert the time-in and time-out per row. I also check for duplicate occurrences in the database. After checking, I simply merge the time-in and time-out of the corresponding person into a single row. This takes 2 hours in VB.NET, considering I have a comparison table that contains 600,000+ rows. I have read on the internet that Python is best at text processing; I already ran a test, but I have doubts about the database operations. What do you think is the best programming language for this kind of problem? How can I speed up the process? My first idea was to use Python instead of VB.NET, but since people here on SO are telling me that this most probably won't help, I am searching for different solutions.
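    Whatever language ends up doing the work, the usual way to cut a two-hour import down is to reduce round-trips: wrap the inserts in one transaction, send multi-row INSERT statements in batches, and keep the 600,000-row comparison table in an in-memory lookup instead of querying it once per swipe. The sketch below shows the batching idea against the MySQL C API; the connection details, table and column names are made up, and real code should bind or escape the values rather than concatenate them.

```cpp
// Sketch: batched inserts into MySQL inside a single transaction.
// Placeholder host/user/table names; values are concatenated only for brevity.
#include <mysql/mysql.h>
#include <cstdio>
#include <string>
#include <vector>

struct Swipe { std::string emp_id, date, time_in, time_out; };

static void import_swipes(MYSQL* conn, const std::vector<Swipe>& rows)
{
    mysql_query(conn, "START TRANSACTION");

    const std::size_t batch = 1000;                 // rows per INSERT statement
    for (std::size_t i = 0; i < rows.size(); i += batch) {
        std::string sql = "INSERT INTO attendance (emp_id, swipe_date, time_in, time_out) VALUES ";
        for (std::size_t j = i; j < rows.size() && j < i + batch; ++j) {
            if (j != i) sql += ",";
            sql += "('" + rows[j].emp_id + "','" + rows[j].date + "','" +
                   rows[j].time_in + "','" + rows[j].time_out + "')";
        }
        if (mysql_query(conn, sql.c_str()) != 0)    // one round-trip per 1000 rows
            std::fprintf(stderr, "insert failed: %s\n", mysql_error(conn));
    }

    mysql_query(conn, "COMMIT");                    // one commit for the whole import
}

int main()
{
    MYSQL* conn = mysql_init(nullptr);
    if (!mysql_real_connect(conn, "localhost", "user", "password", "hr", 3306, nullptr, 0)) {
        std::fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return 1;
    }
    std::vector<Swipe> rows;                        // ...parse the swipe text file into rows here...
    import_swipes(conn, rows);
    mysql_close(conn);
    return 0;
}
```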

    Read the article

  • How do I implement the bg, &, and fg commands functionality in my custom unix shell program written in C

    - by user1631009
    I am extending the functionality of a custom unix shell which I wrote as part of my lab assignment. It currently supports all commands through execvp calls, built-in commands like pwd, cd, history, echo and export, and also redirection and pipes. Now I want to add support for running a command in the background, e.g. $ ls -la &, and I also want to implement the bg and fg job control commands. I know this can be achieved if I execute the command by forking a new child process and not waiting for it in the parent process. But how do I then bring this command back to the foreground using fg? I have the idea of entering each background command in a list and assigning each of them a serial number. But I don't know how to make the processes execute in the background and then bring them back to the foreground. I guess the wait() and waitpid() system calls would come in handy, but I am not that comfortable with them. I tried reading the man pages but am still in the dark. Can someone please explain in layman's language how to achieve this in UNIX system programming? And does it have something to do with the SIGCONT and SIGTSTP signals?
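    For reference, a minimal sketch of the fork/waitpid/SIGCONT plumbing that &, fg and bg usually need. It uses only plain POSIX calls, so it drops into a C shell unchanged; the job table and error handling are stripped down, and it assumes the shell has already put itself in its own process group, taken the terminal, and set SIGTTOU/SIGTSTP to be ignored.

```cpp
// Minimal POSIX job-control sketch; assumes the shell did, at startup:
//   shell_pgid = getpid(); setpgid(shell_pgid, shell_pgid);
//   tcsetpgrp(STDIN_FILENO, shell_pgid); signal(SIGTTOU, SIG_IGN);
#include <signal.h>
#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

static pid_t shell_pgid;
static pid_t jobs[64];            // toy job table: job number -> process-group id
static int   njobs;

// Run one parsed command line; background is non-zero when it ended with '&'.
void launch(char* const argv[], int background)
{
    pid_t pid = fork();
    if (pid == 0) {                               // child
        setpgid(0, 0);                            // put it in its own process group
        if (!background)
            tcsetpgrp(STDIN_FILENO, getpid());    // a foreground child owns the terminal
        signal(SIGTSTP, SIG_DFL);                 // restore default Ctrl-Z handling
        execvp(argv[0], argv);
        _exit(127);
    }
    setpgid(pid, pid);                            // also in the parent, to avoid a race
    if (background) {
        jobs[njobs++] = pid;                      // don't wait: it keeps running in the background
        printf("[%d] %d\n", njobs, (int)pid);
    } else {
        int status;
        tcsetpgrp(STDIN_FILENO, pid);
        waitpid(pid, &status, WUNTRACED);         // returns on exit OR on Ctrl-Z (SIGTSTP)
        if (WIFSTOPPED(status))
            jobs[njobs++] = pid;                  // a stopped job becomes a background job
        tcsetpgrp(STDIN_FILENO, shell_pgid);      // shell takes the terminal back
    }
}

void builtin_fg(int jobno)                        // fg: continue a job in the foreground
{
    pid_t pgid = jobs[jobno - 1];
    int status;
    tcsetpgrp(STDIN_FILENO, pgid);                // hand it the terminal
    kill(-pgid, SIGCONT);                         // resume the whole process group
    waitpid(pgid, &status, WUNTRACED);            // wait again; it may stop once more
    tcsetpgrp(STDIN_FILENO, shell_pgid);
}

void builtin_bg(int jobno)                        // bg: continue a stopped job in the background
{
    kill(-jobs[jobno - 1], SIGCONT);              // just resume it; don't wait, don't give it the tty
}
```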

    Read the article

  • SQLAuthority News – Download Whitepaper – SQL Server 2008 R2 Analysis Services Operations Guide

    - by pinaldave
    SQL Server Analysis Services (SSAS) has always been an interesting subject for research. Analysis Services cubes are a very powerful tool in the hands of the business intelligence (BI) developer. They provide an easy way to expose even large data models directly to business users. Microsoft has published a very informative white paper, the Analysis Services Operations Guide. This white paper is authored by Thomas Kejser, John Sirmon, and Denny Lee. In this guide you will find information on how to test and run Microsoft SQL Server Analysis Services in SQL Server 2005, SQL Server 2008, and SQL Server 2008 R2 in a production environment. The focus of this guide is how you can test, monitor, diagnose, and remove production issues on even the largest scaled cubes. This paper also provides guidance on how to configure the server for the best possible performance. It is the goal of this guide to make your operations processes as painless as possible, and to have you run with the best possible performance without any additional development effort to your deployed cubes. In this guide, you will learn how to get the best out of your existing data model by making changes transparent to the data model and by making configuration changes that improve the user experience of the cube. Download SQL Server 2008 R2 Analysis Services Operations Guide Note: Abstract taken from the white paper. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, T SQL, Technology

    Read the article

  • SQL SERVER – Why Do We Need Data Quality Services – Importance and Significance of Data Quality Services (DQS)

    - by pinaldave
    Databases are awesome.  I’m sure my readers know my opinion about this – I have made SQL Server my life’s work after all!  I love technology and all things computer-related.  Of course, even with my love for technology, I have to admit that it has its limits.  For example, it takes a human brain to notice that data has been input incorrectly.  Computer “brains” might be faster than humans, but human brains are still better at pattern recognition.  For example, a human brain will notice that “300” is a ridiculous age for a human to be, but to a computer it is just a number.  A human will also notice similarities between “P. Dave” and “Pinal Dave,” but this would stump most computers. In a database, these sorts of anomalies are incredibly important.  Databases are often used by multiple people who rely on this data to be true and accurate, so data quality is key.  That is why the improved SQL Server Master Data Management features include Data Quality Services.  This service has the ability to recognize and flag anomalies like out-of-range numbers and similarities between data.  This allows a human brain with its pattern recognition abilities to double-check and ensure that P. Dave is the same as Pinal Dave. A nice feature of Data Quality Services is that once you set the rules for the program to follow, it will not only keep your data organized in the future, but go to the past and “fix up” any data that has already been entered.  It also allows you to combine data from multiple places and it will apply these rules across the board, so that you don’t have any weird issues that crop up when trying to fit a round peg into a square hole. There are two parts of Data Quality Services that help you accomplish all these neat things.  The first part is DQS Server, which you can think of as the hardware component of the system.  It is installed alongside SQL Server (it needs to be installed separately, after SQL Server is installed) and runs quietly in the background, performing all its cleanup services. DQS Client is the user interface that you can interact with to set the rules and check over your data.  There are three main aspects of Client: knowledge base management, data quality projects and administration.  Knowledge base management is the part of the system that allows you to set the rules, or program the “knowledge base,” so that your database is clean and consistent. Data Quality projects are what run in the background and clean up the data that is already present.  The administration allows you to check out what DQS Client is doing, change rules, and generally oversee the entire process.  The whole process is user-friendly and a pleasure to use.  I highly recommend implementing Data Quality Services in your database. Here are a few of my blog posts which are related to Data Quality Services, and I encourage you to try this out. 
SQL SERVER – Installing Data Quality Services (DQS) on SQL Server 2012 SQL SERVER – Step by Step Guide to Beginning Data Quality Services in SQL Server 2012 – Introduction to DQS SQL SERVER – DQS Error – Cannot connect to server – A .NET Framework error occurred during execution of user-defined routine or aggregate “SetDataQualitySessions” – SetDataQualitySessionPhaseTwo SQL SERVER – Configuring Interactive Cleansing Suggestion Min Score for Suggestions in Data Quality Services (DQS) – Sensitivity of Suggestion SQL SERVER – Unable to DELETE Project in Data Quality Projects (DQS) Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Data Quality Services, DQS

    Read the article

  • ODI 11g – Oracle Multi Table Insert

    - by David Allan
    With the IKM Oracle Multi Table Insert you can generate Oracle specific DML for inserting into multiple target tables from a single query result – without reprocessing the query or staging its result. When designing this to exploit the IKM you must split the problem into the reusable parts – the select part goes in one interface (I named SELECT_PART), then each target goes in a separate interface (INSERT_SPECIAL and INSERT_REGULAR). So for my statement below… /*INSERT_SPECIAL interface */ insert  all when 1=1 And (INCOME_LEVEL > 250000) then into SCOTT.CUSTOMERS_NEW (ID, NAME, GENDER, BIRTH_DATE, MARITAL_STATUS, INCOME_LEVEL, CREDIT_LIMIT, EMAIL, USER_CREATED, DATE_CREATED, USER_MODIFIED, DATE_MODIFIED) values (ID, NAME, GENDER, BIRTH_DATE, MARITAL_STATUS, INCOME_LEVEL, CREDIT_LIMIT, EMAIL, USER_CREATED, DATE_CREATED, USER_MODIFIED, DATE_MODIFIED) /* INSERT_REGULAR interface */ when 1=1  then into SCOTT.CUSTOMERS_SPECIAL (ID, NAME, GENDER, BIRTH_DATE, MARITAL_STATUS, INCOME_LEVEL, CREDIT_LIMIT, EMAIL, USER_CREATED, DATE_CREATED, USER_MODIFIED, DATE_MODIFIED) values (ID, NAME, GENDER, BIRTH_DATE, MARITAL_STATUS, INCOME_LEVEL, CREDIT_LIMIT, EMAIL, USER_CREATED, DATE_CREATED, USER_MODIFIED, DATE_MODIFIED) /*SELECT*PART interface */ select        CUSTOMERS.EMAIL EMAIL,     CUSTOMERS.CREDIT_LIMIT CREDIT_LIMIT,     UPPER(CUSTOMERS.NAME) NAME,     CUSTOMERS.USER_MODIFIED USER_MODIFIED,     CUSTOMERS.DATE_MODIFIED DATE_MODIFIED,     CUSTOMERS.BIRTH_DATE BIRTH_DATE,     CUSTOMERS.MARITAL_STATUS MARITAL_STATUS,     CUSTOMERS.ID ID,     CUSTOMERS.USER_CREATED USER_CREATED,     CUSTOMERS.GENDER GENDER,     CUSTOMERS.DATE_CREATED DATE_CREATED,     CUSTOMERS.INCOME_LEVEL INCOME_LEVEL from    SCOTT.CUSTOMERS   CUSTOMERS where    (1=1) Firstly I create a SELECT_PART temporary interface for the query to be reused and in the IKM assignment I state that it is defining the query, it is not a target and it should not be executed. Then in my INSERT_SPECIAL interface loading a target with a filter, I set define query to false, then set true for the target table and execute to false. This interface uses the SELECT_PART query definition interface as a source. Finally in my final interface loading another target I set define query to false again, set target table to true and execute to true – this is the go run it indicator! To coordinate the statement construction you will need to create a package with the select and insert statements. With 11g you can now execute the package in simulation mode and preview the generated code including the SQL statements. Hopefully this helps shed some light on how you can leverage the Oracle MTI statement. A similar IKM exists for Teradata. The ODI IKM Teradata Multi Statement supports this multi statement request in 11g, here is an extract from the paper at www.teradata.com/white-papers/born-to-be-parallel-eb3053/ Teradata Database offers an SQL extension called a Multi-Statement Request that allows several distinct SQL statements to be bundled together and sent to the optimizer as if they were one. Teradata Database will attempt to execute these SQL statements in parallel. When this feature is used, any sub-expressions that the different SQL statements have in common will be executed once, and the results shared among them. It works in the same way as the ODI MTI IKM, multiple interfaces orchestrated in a package, each interface contributes some SQL, the last interface in the chain executes the multi statement.

    Read the article

  • A Model for Planning Your Oracle BPM 10g Migration by Kris Nelson

    - by JuergenKress
    As the Oracle SOA Suite and BPM Suite 12c products enter beta, many of our clients are starting to discuss migrating from the Oracle 10g or prior platforms. With the BPM Suite 11g, Oracle introduced a major change in architecture with a strong focus on integration with SOA and an entirely new technology stack. In addition, there were fresh new UIs and a renewed business focus with an improved Process Composer and features like Adaptive Case Management. While very beneficial to both technology and the business, the fundamental change in architecture does pose clear migration challenges for clients who have made investments in the 10g platform. Some of the key challenges facing 10g customers include: Managing in-process instance migration and running multiple process engines Migration of User Interfaces and other code within the environment that may not be automated Growing or finding technical staff with both 10g and 12c experience Managing migration projects while continuing to move the business forward and meet day-to-day responsibilities As a former practitioner in a mixed 10g/11g shop, I wrestled with many of these challenges as we tried to plan ahead for the migration. Luckily, there is migration tooling on the way from Oracle and several approaches you can use in planning your migration efforts. In addition, you already have a defined and visible process on the current platform, which will be invaluable as you migrate. A Migration Model: This model presents several options across a value and investment spectrum. The goal of the AVIO Migration Model is to kick-start discussions within your company and assist in creating a plan of action to take advantage of the new platform. As with all models, this is a framework for discussion and certain processes or situations may not fit. Please contact us if you have specific questions or want to discuss migration efforts in your situation. Read the complete article here. SOA & BPM Partner Community For regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Technorati Tags: Kris Nelson,ACM,Adaptive Case Management,Community,Oracle SOA,Oracle BPM,OPN,Jürgen Kress

    Read the article

  • Enterprise Architecture IS (should not be) Arbitrary

    - by pat.shepherd
    I took a look at a blog entry today by Jordan Braunstein where he comments on another blog entry titled “Yes, Enterprise Architecture is Relative BUT it is not Arbitrary.”  The blog makes some good points such as the following: Lock 10 architects in 10 separate rooms; provide them all an identical copy of the same business, technical, process, and system requirements; have them design an architecture under the same rules and perspectives; and I guarantee your result will be 10 different architectures of varying degrees. SOA Today: Enterprise Architecture IS Arbitrary Agreed, …to a degree….but less so if all 10 truly followed one of the widely accepted EA frameworks. My thinking is that EA frameworks all focus on getting the business goals/vision locked down first as the primary drivers for decisions made lower down the architecture stack.  Many people I talk to know about frameworks such as TOGAF, FEA, etc. but seldom apply their tenets to the architecture at hand.  We all seem to want to get right into the Visio diagrams and boxes and arrows and connecting protocols and implementation details and lions and tigers and bears (Oh, my!) too early. If done properly the Business, Application and Information architectures are nailed down BEFORE any technological direction (SOA or otherwise) is set.  Those 3 layers and Governance (people and processes), IMHO, are layers that should not vary much as they have everything to do with understanding the business -- from which technological conclusions can later be drawn. I really like what he went on to say later in the post about the fact that architecture attempts to reduce the amount of variance between the 10 different architects’ work.  That is the real heart of what EA is about; REMOVING THE ARBITRARINESS.

    Read the article

  • Silverlight Cream for February 13, 2011 -- #1046

    - by Dave Campbell
    In this Issue: Loek van den Ouweland, Colin Eberhardt, Rudi Grobler, Joost van Schaik, Mike Taulty(-2-, -3-), Deborah Kurata, David Kelley, Peter Foot, Samuel Jack(-2-), and WindowsPhoneGeek(-2-). Above the Fold: Silverlight: "Silverlight Simple MVVM Commanding" Deborah Kurata WP7: "WP7 CustomInputPrompt control with Cancel button" WindowsPhoneGeek Expression Blend: "Silverlight Templated Image Button with two images" Loek van den Ouweland Shoutouts: Dave Campbell posted a write-up about the project he's on and the use of Sterling: Sterling Object-Oriented Database for ISO 1.0 Released!... Also see Jeremy Likness' post on the 1.0 release: Sterling Object-Oriented Database 1.0 RTM Not necessarily Silverlight, but darn cool, a great control by Sasha Barber: WPF : A Weird 3d based control snoutholder announced new content: Windows Phone 7 QuickStarts Live! From SilverlightCream.com: Silverlight Templated Image Button with two images Loek van den Ouweland has a video tutorial up for creating an ImageButton with a hover state... Expression Blend coolness, and check out the external links he has to their training site. Windows Phone 7 Performance Measurements – Emulator vs. Hardware Colin Eberhardt's latest is a popular post comparing performance metrics between the WP7 emulator and a real device. Mileage may vary, but I'm pretty sure the overall results are conclusive, and should help the way you view your app as you're building in the emulator. WP7: WebClient vs HttpWebRequest Rudi Grobler's latest is a discussion of WebClient and HttpWebRequest, gives coding examples of each plus discussion of why you may choose one over the other... and pay attention to his comment about mobile providers. A Blendable Windows Phone 7 / Silverlight clipping behavior Joost van Schaik posted this WP7/Silverlight clipping behavior he developed because all the other solutions were not blendable. Another really useful piece of code from Joost! Blend Bits 22–Being Stylish Mike Taulty has 3 more episodes in his Blend Bits series... first up is one on Styles... explicit, implicit, inheriting... you name it, he's covering it! Blend Bits 23–Templating Part 1 Mike Taulty then has the beginning of a series within his Blend Bits series on Templating. This is something you just have to either bite the bullet and go with Blend to do, or consume someone else's work. Mike shows us how to do it ourselves by tweaking the visual aspects of a checkbox Blend Bits 24–Templating Part 2 In part 2 of the Templating series, Mike Taulty digs deeper into Blend and cracks open the Listbox control to take a bunch of the inner elements out for a spin... fun stuff and great tutorial, Mike! Silverlight Simple MVVM Commanding Deborah Kurata has another great MVVM post up... if you don't have your head wrapped around commanding yet, this is a good place to start that process... VB and C# as always. App Development for Windows Phone 7 101 David Kelley goes through the basics of producing a WP7 app both from the Silverlight and XNA side... good info and good external links to get you going. Copyable TextBlock for Windows Phone Peter Foot takes a look at the Copy/Paste functionality in WP7 and how to apply it to a TextBlock... which is NOT an out-of-the-box solution. How to deploy to, and debug, multiple instances of the Windows Phone 7 emulator Samuel Jack has a couple posts up this week... first is this clever one on running multiple copies of the emulator at once... too cool for debugging a multi-player game! 
Multi-player enabling my Windows Phone 7 game: Day 3 – The Server Side Samuel Jack's latest is a detailed look at his day 3 adventure of taking his multi-player game to WP7... lots of information and external links... what do you say, give him another day? :) WP7 CustomInputPrompt control with Cancel button WindowsPhoneGeek has a couple more posts up... first is this "CustomInputPrompt" control based off the InputPrompt from Coding4Fun. Implementing Windows Phone 7 DataTemplateSelector and CustomDataTemplateSelector In his latest post, WindowsPhoneGeek writes a DataTemplateSelector to allow different data templates for different list elements based on the type of the element.

    Read the article

  • Vitality of Product Information Management Showcased at OpenWorld 2012

    - by Mala Narasimharajan
     By Sachin Patel Can you hear the countdown clock ticking!! OpenWorld 2012 is almost here, and as I write this Oracle is buzzing with fresh new ideas and solutions that will be showcased this year. What an exciting time for all of us to be in the midst of a digital revolution. Whether it is Apple fans clamoring to find every new feature that has been added to the iPhone 5 or a startup launching a new digital thermostat (has anyone looked at the new one from Nest?), product information is vital for companies to grow and compete in this cut-throat market. Customers today struggle to aggregate and enrich this product data from the myriad of systems they have in place to run their businesses and operations. Having a product information strategy is paramount to aligning your sales channels and operations with the most accurate and up-to-date product data. We have a number of sessions this year at OpenWorld where you can gain more insight into how Oracle’s next generation of Fusion Applications, in this case Fusion Product Hub, can provide you with a solution to streamline and get control of your Product Master Data. Enabling Trusted Enterprise Product Data with Oracle Fusion Product Hub - Tuesday, October 2nd, 11:45 am, Moscone West 2022 Join me, Sachin Patel, Director of Product Strategy, and Milan Bhatia, VP of Development, as we discuss how you can enable trusted product master data in your enterprise. In this session we plan to cover the challenges companies face today in mastering product data. The discussion will also include how Fusion Product Hub brings new and innovative features to empower your product data owners to create a holistic and rich product definition that can be leveraged across your enterprise. We will also be joined by Pawel Fidelus from Fideltronik, an Early Adopter for Fusion Product Hub, who will showcase their plans to implement Fusion Product Hub and the value it will bring to Fideltronik. Multichannel Fulfillment Excellence in Direct-to-Consumer Market - Thursday, October 4th, 12:45 am, Moscone West 2024 Do you have multiple order capture systems? Do you have difficulty in fulfilling orders for your customers across various channels and suppliers? Mark Carson, Director, Fusion DOO, and Brad Kerr, Director, AGSS, will be showcasing the Fusion Distributed Order Orchestration solution and how companies can orchestrate orders from multiple order capture systems and route them to the appropriate fulfillment system. Sachin Patel, Director of Product Strategy for Product MDM, will highlight the business pain points in consolidating and commercializing data from a Multi Channel Commerce point of view and how Fusion Product Hub helps you provide a single source of truth to drive a singular and rich customer experience. Oracle Fusion Supply Chain Management: Customer Adoption and Experiences - Wednesday, October 3rd, 10:15 am, Moscone West 2003 This is a great session to attend to learn about how Fusion Supply Chain Management and Fusion Product Hub Early Adopters, including Boeing and Fideltronik, are leveraging Fusion Applications to improve their Supply Chain operations. Have a great OpenWorld and see you soon!!

    Read the article
