Search Results

Search found 16971 results on 679 pages for 'blogs'.


  • Oracle Partner Tier1, inc. Launches the Tier1 Private Oracle Cloud

    - by Catalin Teodor
    Tier1, Inc. announced the availability of the Tier1 Private Oracle Cloud, the most optimized and protected computing environment for Oracle Applications and databases. Built on Oracle's Virtual Compute Appliance (VCA) technology, it is the only virtual environment certified to use Oracle Trusted Partitions, which gives the Tier1 Private Cloud the flexibility to license Oracle software on a virtual-CPU basis. Read more!

    Read the article

  • What's up with LDoms: An Article Series on Oracle VM Server for SPARC

    - by Stefan Hinker
    Under the title "What's up with LDoms" I have just published the first article of a whole series. The goal of the series is to cover the complete feature set of LDoms. Since that is quite a lot of material, I will depart from my usual bilingual approach and write these articles exclusively in English; I ask for your understanding. The first article is available here: What's up with LDoms: Part 1 - Introduction & Basic Concepts. Enjoy the read!

    Read the article

  • Will we ever lose the human touch?

    - by divya.malik
    I was at a conference two weeks ago that was targeted at sales and marketing professionals. The discussions around the changing sales landscape were very interesting. More and more selling is moving to the internet: salespeople are delivering more of their presentations online or over the phone. Budget constraints and new technologies have dramatically decreased the need for face-to-face interactions. At the same time, customers are researching products on their own, taking the advice of peers, making up their minds, and then contacting the vendor. That takes care of more than half of the usual selling process. But humans are social animals, and because of that I believe that despite these changing trends and technologies, the human touch will always be necessary. One of the presenters at the conference shared this video, which stayed in my mind.

    Read the article

  • [ADF tip #1] Working around some clientListener limitation

    - by julien.schneider(at)oracle.com
    As I occasionally work on Oracle ADF, I've decided to write tips on this framework. The previous version of ADF (10g) had an <afh:body> component to represent the body of your page, which allowed you to specify JavaScript events like onload or onunload. In ADF RC, the body (as well as html and head) is generated by a single <af:document>. To implement the onXXXXX events you embed an <af:clientListener> within this <af:document> and specify the property type="load" with the method to trigger. The problem is that the onunload event is not supported by the af:clientListener. So what do you do? Well, one solution is to register this event during page load.

    First step: embed the following in your <af:document>:

        <af:clientListener type="load" method="addOnUnload()"/>

    Second step: add your unload event using this kind of script:

        <af:document>
            <f:facet name="metaContainer">
                <f:verbatim>
                    <script type="text/javascript">
                        function addOnUnload(evt) {
                            window.onunload = function() {
                                alert('Goodbye World');
                            }
                        }
                    </script>
                </f:verbatim>

    When closing the browser, your event will be triggered as expected.

    Read the article

  • Why do we (really) program to interfaces?

    - by Kyle Burns
    One of the earliest lessons I was taught in enterprise development was "always program against an interface". This was back in the VB6 days, and I quickly learned that no code would be allowed to move to the QA server unless my business objects and data access objects were each defined as an interface with a matching implementation class. Why? "It's more reusable" was one answer. "It doesn't tie you to a specific implementation" was a slightly more knowing answer. And let's not forget the discussion-ending "it's a standard". The problem with these responses was that senior people didn't really understand the reason we were doing the things we were doing, and because of that, we were entirely unable to realize the intent behind the practice - we simply used interfaces and had a bunch of extra code to maintain to show for it.

    It wasn't until a few years later that I finally heard the term "Inversion of Control". Simply put, "Inversion of Control" takes the creation of objects that used to be within the control (and therefore a responsibility) of your component and moves it to some outside force. For example, consider the following code, which follows the old "always program against an interface" rule in the manner of many corporate development shops:

        ICatalog catalog = new Catalog();
        Category[] categories = catalog.GetCategories();

    In this example, I met the requirement of the rule by declaring the variable as ICatalog, but I didn't hit "it doesn't tie you to a specific implementation" because I explicitly created an instance of the concrete Catalog object. If I want to test the functionality of the code I just wrote, I have to have an environment in which Catalog can be created along with any of the resources upon which it depends (e.g. configuration files, database connections, etc.) in order to test my functionality. That's a lot of setup work, and one of the things that I think ultimately discourages real buy-in of unit testing in many development shops.

    So how do I test my code without needing Catalog to work? A very primitive approach I've seen is to change the line that instantiates the catalog to read:

        ICatalog catalog = new FakeCatalog();

    Once the test is run and passes, the code is switched back to the real thing. This obviously poses a huge risk of introducing test code into production and in my opinion is worse than just keeping the dependency and its associated setup work. Another popular approach is to make use of factory methods, which use an object whose "job" is to know how to obtain a valid instance of the object. Using this approach, the code may look something like this:

        ICatalog catalog = CatalogFactory.GetCatalog();

    The code inside the factory is responsible for deciding "what kind" of catalog is needed. This is a far better approach than the previous one, but it does make projects grow considerably, because now in addition to the interface, the real implementation, and the fake implementation(s) for testing, you have added a minimum of one factory (or at least a factory method) for each of your interfaces. Once again, developers say "that's too complicated and has me writing a bunch of useless code" and quietly slip back into just creating a new Catalog and chalking any test failures up to "it will probably work on the server". This is where software intended specifically to facilitate Inversion of Control comes into play.
    There are many libraries that take on the Inversion of Control responsibilities in .NET, and each of them has its own pros and cons. From this point forward I'll discuss concepts from the standpoint of the Unity framework produced by Microsoft's Patterns and Practices team. I'm primarily focusing on this library because questions about it inspired this posting.

    At Unity's core, and that of most any IoC framework, is a catalog or registry of components. This registry can be configured either through code or using the application's configuration file, and in the simplest terms it says "interface X maps to concrete implementation Y". It can get much more complicated, but I want to keep things at the "what does it do" level instead of "how does it do it". The object that exposes most of the Unity functionality is the UnityContainer. This object exposes methods to configure the catalog as well as the Resolve<T> method, which is used to obtain an instance of the type represented by T. When using the Resolve<T> method, Unity does not necessarily have to just "new up" the requested object, but can also track dependencies of that object and ensure that the entire dependency chain is satisfied.

    There are three basic ways that I have seen Unity used within projects: classes directly using the Unity container, classes requiring injection of dependencies, and classes making use of the Service Locator pattern.

    The first usage of Unity is when classes are aware of the Unity container and directly call its Resolve method whenever they need the services advertised by an interface. The up side of this approach is that IoC is utilized, but the down side is that every class has to be aware that Unity is being used and is tied directly to that implementation. Many developers don't like the idea of as close a tie to a specific IoC implementation as is represented by using Unity within all of your classes, and for the most part I agree that this isn't a good idea.

    As an alternative, classes can be designed for Dependency Injection. Dependency Injection is where a force outside the class itself manipulates the object to provide implementations of the interfaces that the class needs to interact with the outside world. This is typically done either through constructor injection, where the object has a constructor that accepts an instance of each interface it requires, or through property setters accepting the service providers. When using dependency injection, I lean toward the use of constructor injection because I view the constructor as being a much better way to "discover" what is required for the instance to be ready for use. During resolution, Unity looks for an injection constructor and will attempt to resolve instances of each interface required by the constructor, throwing an exception if it is unable to meet the advertised needs of the class. The up side of this approach is that the needs of the class are very clearly advertised and the class is unaware of which IoC container (if any) is being used. The down side is that you're required to maintain the objects passed to the constructor as instance variables throughout the life of your object, and that objects which coordinate with many external services require a lot of additional constructor arguments (this gets ugly and may indicate a need for refactoring).
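
    To make the constructor injection idea concrete, here is a minimal, hypothetical sketch written in Java rather than the C#/Unity stack this post uses; the ICatalog, Catalog and CatalogBrowser names simply mirror the example above and are illustrative only:

        public class ConstructorInjectionSketch {

            interface ICatalog {
                String[] getCategories();
            }

            static class Catalog implements ICatalog {
                // "real" implementation: would read configuration, hit a database, etc.
                public String[] getCategories() { return new String[] { "Books", "Music" }; }
            }

            static class FakeCatalog implements ICatalog {
                // test double: needs no external resources
                public String[] getCategories() { return new String[] { "Test" }; }
            }

            // The class advertises its needs through its constructor and never creates a Catalog itself;
            // an IoC container (or a unit test) supplies the implementation.
            static class CatalogBrowser {
                private final ICatalog catalog;

                CatalogBrowser(ICatalog catalog) {
                    this.catalog = catalog;
                }

                int countCategories() { return catalog.getCategories().length; }
            }

            public static void main(String[] args) {
                // Production wiring would pass new Catalog(); a test can pass the fake with no factory at all.
                System.out.println(new CatalogBrowser(new FakeCatalog()).countCategories());
            }
        }
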
    The final way that I've seen and used Unity is to make use of the ServiceLocator pattern, of which the Patterns and Practices team has also provided a Unity-compatible implementation. When using the ServiceLocator, your class calls ServiceLocator.Retrieve in places where it would have called Resolve on the Unity container. Like using Unity directly, it does tie you directly to the ServiceLocator implementation and makes your code aware that dependency injection is taking place, but it does have the up side of giving you the freedom to swap out the underlying IoC container if necessary. I'm not hugely concerned with hiding IoC entirely from the class (I view this as a "nice to have"), so the single biggest problem that I see with the ServiceLocator approach is that it provides no way to proactively advertise needs in the way that constructor injection does, allowing more opportunity for difficult-to-track runtime errors.

    This blog entry has not been intended in any way to be a definitive work on IoC, but rather as something to spur thought about why we program to interfaces and some ways to reach the intended value of the practice instead of having it just complicate your code. I hope that it helps somebody begin or continue a journey away from being a "Cargo Cult Programmer".

    Read the article

  • Fix import hint

    - by Martin Janicek
    Good news, everyone! I've implemented the 'Fix import hint', which should make your life (and most probably Groovy development in general) much easier! It works the same way as in the Java editor, so you can choose between classes with the same name. Hope you will enjoy it! And as usual, if you would like to try it on your own, download the latest development build, and I will be more than happy to receive any feedback!

    Read the article

  • Development Quirk From ASP.NET Dynamic Compilation

    - by jkauffman
    The Problem

    I got a compilation error in my ASP.NET MVC3 project that tested my sanity today. (As always, names are changed to protect the innocent.)

    The type or namespace name 'FishViewModel' does not exist in the namespace 'Company.Product.Application.Models' (are you missing an assembly reference?)

    Sure looks easy! There must be something in the project referring to a FishViewModel.

    The Confusing Part

    The first thing I noticed was that the error was occurring in a folder clearly not in my project and in files that I definitely had not created: %SystemRoot%\Microsoft.NET\Framework\(versionNumber)\Temporary ASP.NET Files\ App_Web_mezpfjae.1.cs

    I also ascertained these facts, each of which made me more confused than the last:
    Rebuild and Clean had no effect.
    No controllers in the project ever returned a ViewResult using FishViewModel.
    No views in the project defined that they use FishViewModel.
    Searching across all files included in the project for "FishViewModel" provided no results.
    The build server did not report a problem.

    The Solution

    The problem stemmed from a file that was not included in the project but still present on the file system. (By the way, if you don't know this trick already, there is a toolbar button in the Solution Explorer window, "Show All Files", which allows you to see all files in the file system.) In my situation, I was working on the mission-critical Fish view before abandoning the feature. Instead of deleting the file, I excluded it from the project. However, this was a bad move. It caused the build failure, and in order to fix the error, this file must be deleted. By the way, this file was not in source control, so the build server did not have it. This explains why my build server did not report a problem for me.

    The Explanation

    So, what's going on? This file isn't even a part of the project, so why is it failing the build? This is a behavior of ASP.NET dynamic compilation. This is the same process that occurs when deploying a web application: ASP.NET compiles the web application's code. When this occurs on a production server, it has to do so without the .csproj file (which isn't usually deployed, if you've taken your time to do a deployment cleanly). This process has merely the file system available to identify what to compile. So, back in the world of developing the web application in Visual Studio on my developer box, I run into the situation because the same process is occurring there. This is true even though I have more files on my machine than will actually get deployed. I can't help but think that this error could be attributed back to the real culprit file (Fish.cshtml, rather than the temporary files) with some work, but at least the error had enough information in it to narrow it down.

    The Conclusion

    I had previously been accustomed to the idea that for C# projects, the .csproj file always "defines" the build behavior. This investigation has taught me that I'll need to shift my thinking a bit to remember that the file system has the final say when it comes to web applications, even on the developer's machine!

    Read the article

  • Gotcha | Installing .net 4.0 and IIS 6

    - by Steve Clements
    Just a quick one - seems pretty weird to me. I installed .NET 4.0 on an old IIS 6 box, ready to deploy an ASP.NET MVC app targeting .NET 4.0. I thought, which to me seems logical, that I could install .NET 4.0, set up a new web site and app pool, set the web site to ASP.NET 4.0 (other configuration is also needed to run MVC on IIS 6 - here and here), and it would just work. Errr… No. "The page cannot be displayed!" Nothing to do with MVC. Apparently, just because you have installed .NET 4 and the option is available in IIS, it isn't automatically enabled. I'm not going to repeat anything here… take a look at this post - clear, easy steps on exactly what you need to do and how to check whether this is the problem. http://johan.driessen.se/archive/2010/04/13/getting-an-asp.net-4-application-to-work-on-iis6.aspx

    Read the article

  • Why people don't patch and upgrade?!?

    - by Mike Dietrich
    Discussing the topic "Why Upgrade" or "Why not Upgrade" is not always fun. Actually the arguments repeat from customer to customer. Typically we hear things such as: A PSU or Patch Set introduces new bugs A new PSU or Patch Set introduces new features which lead to risk and require application verification  Patching means risk Patching changes the execution plans Patching requires too much testing Patching is too much work for our DBAs Patching costs a lot of money and doesn't pay out And to be very honest sometimes it's hard for me to stay calm in such discussions. Let's discuss some of these points a bit more in detail. A PSU or Patch Set introduces new bugsWell, yes, that is true as no software containing more than some lines of code is bug free. This applies to Oracle's code as well as too any application or operating system code. But first of all, does that mean you never patch your OS because the patch may introduce new flaws? And second, what is the point of saying "it introduces new bugs"? Does that mean you will never get rid of the mean issues we know about and we fixed already? Scroll down from MOS Note:161818.1 to the patch release you are on, no matter if it's 10.2.0.4 or 11.2.0.3 and check for the Known Issues And Alerts.Will you take responsibility to know about all these issues and refuse to upgrade to 11.2.0.4? I won't. A new PSU or Patch Set introduces new featuresOk, we can discuss that. Offering new functionality within a database patch set is a dubious thing. It has advantages such as in 11.2.0.4 where we backported Database Redaction to. But this is something you will only use once you have an Advanced Security license. I interpret that statement I've heard quite often from customers in a different way: People don't want to get surprises such as new behaviour. This certainly gives everybody a hard time. And we've had many examples in the past (SESSION_CACHED_CURSROS in 10.2.0.4,  _DATAFILE_WRITE_ERRORS_CRASH_INSTANCE in 11.2.0.2 and others) where those things weren't documented, not even in the README. Thanks to many friends out there I learned about those as well. So new behaviour is the topic people consider as risky - not really new features. And just to point this out: A PSU never brings in new features or new behaviour by definition! Patching means riskDoes it really mean risk? Yes, there were issues in the past (and sometimes in the present as well) where a patch didn't get installed correctly. But personally I consider it way more risky to not patch. Keep that in mind: The day Oracle publishes an PSU (or CPU) containing security fixes all the great security experts out there go public with their findings as well. So from that day on even my grandma can find out about those issues and try to attack somebody. Now a lot of people say: "My database does not face the internet." And I will answer: "The enemy is sitting already behind your firewalls. And knows potentially about these things." My statement: Not patching introduces way more risk to your environment than patching. Seriously! Patching changes the execution plansDo they really? I agree - there's a very small risk for this happening with Patch Sets. But not with PSUs or CPUs as they contain no optimizer fixes changing behaviour (but they may contain fixes curing wrong-query-result-bugs). But what's the point of a changing execution plan? In Oracle Database 11g it is so simple to be prepared. SQL Plan Management is a free EE feature - so once that occurs you'll put the plan into the Plan Baseline. Basta! 
    You wouldn't like to get such surprises? Then please use the SQL Performance Analyzer (SPA) from Real Application Testing and you'll detect such changes easily upfront, in minutes. And not to forget: a plan change can also be very positive! Yes, there's a little risk with a database patch set - and we have many possibilities to detect this before patching.

    Patching requires too much testing
    Well, does it really? I have seen in the past 12 years how people test. There are very different efforts and approaches to this. I have seen people spend a hell of a lot of money on licenses or on project team staffing. And I have seen people sail blindly without any tests, just going the John Wayne approach. Proper tools will allow you to test easily without too much effort. See the paragraph above. We have used Real Application Testing in so many customer projects, reducing the amount of work spent on testing by over 50%. But apart from that, at some point you will have to stop testing. If you don't, you'll get lost and you'll burn money. There's no 100% guarantee. You will have to deal with a little risk, as reaching the final 5% of certainty will cost you the same as it cost to reach 95%. And doing this will lead to abnormally long product cycles that you'll run behind forever. And this will cost even more money.

    Patching is too much work for our DBAs
    Patching is a lot of work. I agree. And it's no fun work. It's boring, annoying. You don't learn much from it. That's why you should try to automate this task. Use the Database Lifecycle Management Pack. And don't cry about the fact that it costs money. Yes, it does. But it will ease the process, and you'll save a lot of cost as you don't waste your valuable time on patching. Or use Oracle Multitenant in Oracle Database 12c and patch either by unplug/plug, or patch an entire container database, with all its PDBs, with one patch in one task. We have customer reference cases proving it saved them 75% of time, effort and cost since they've used the Lifecycle Management Pack. So why don't you use it?

    Patching costs a lot of money and doesn't pay off
    Well, see my statements in the paragraph above. And it does pay off, as running a database with 100 known critical flaws in it which are already fixed by Oracle (such as in the Oct 2013 PSU for Oracle Database 12c) will cost way more in case of failure or even data loss. Bet with me?

    Let me finally ask you some questions. What cell phone are you using and which OS does it run? Do you have an iPhone 5, and have you already upgraded to iOS 7.0.3? I've just noticed on mine that the alarm (which I rely on when traveling) now has a dependency on the physical "sound on/off" switch. If it is physically switched to "off", the alarm rings "silently". What a wonderful example of a behaviour change coming in with a patch set. Will this push you to stay with iOS 5 or iOS 6? No, because those have security flaws which won't be fixed anymore. What browser are you surfing with? Do you use Mozilla 3.6? Well, congratulations to all the hackers. It will be easy for them to attack you and harm your system. I'd guess you have the auto-updater on. Same for Google Chrome, Safari, IE. Right? -Mike

    Read the article

  • Do you know about the Visual Studio 2010 Architecture Guidance?

    - by Martin Hinshelwood
    If you have not seen the Visual Studio 2010 Architectural Guidance from the Visual Studio ALM Rangers then you are missing out. I have been spelunking the TFS Guidance recently and I discovered the Visual Studio 2010 Architectural Guidance. This is not an in-depth look at the capabilities of the architectural tools that shipped with Visual Studio 2010 Ultimate, but is instead a set of samples that lead you by example through real-world scenarios. There is practical guidance and there are checklists to help guide lead developers and architects through the common challenges in understanding both existing and new applications. The content concentrates on practical guidance for Visual Studio 2010 Ultimate and is focused on the modelling tools. There is integration into Visual Studio, so all you need to do to access it is select "Architecture | Visual Studio ALM Rangers – Architecture Guidance".

    Figure: Accessing the Architecture guidance is easy

    This brings up an inline version of the documentation and a kind of explorer that lets you pick the tasks you want to perform and takes you straight to that part of the Guidance.

    Figure: Access the Guidance from right within Visual Studio 2010

    This is a big help when you just want to figure out how to do something and can't be bothered searching for and through the content in the provided Word documents. The Question and Answer section is full of useful content, and there are six Hands-On Labs to sink your teeth into:

    Creating extensions with the feature extension
    Explore an Existing System Scenario
    Extensibility Layer Diagrams
    New Solution Scenario
    Reusable Architecture Scenario
    Validating an Architecture Scenario

    I'm sold! Where can I get my hands on this fantastic content? Download the Visual Studio 2010 Architecture Tooling Guidance, and if you like it, don't forget to add a review to make the team that put it together in their spare time feel all the more loved.

    Read the article

  • Musical Movements on the NetBeans Platform

    - by Geertjan
    I came across VirtMus recently, the "modern music stand", on the NetBeans Platform: Its intentions remind me a LOT of Mike Kelly's Chord Maestro, which is also on the NetBeans Platform. Maybe the two should integrate? Speaking of music, I've been in touch with Winston Dehaney who is creating score notation software, named "Acapella Score", also on the NetBeans Platform: That's an app that could be integrated with the JFugue Music NotePad at some stage!

    Read the article

  • Multiple vulnerabilities in Network Time Protocol (NTP)

    - by chandan
    CVE-2009-0021 - Improper Authentication vulnerability - CVSSv2 Base Score 5.0
    CVE-2009-0159 - Buffer Overflow vulnerability - CVSSv2 Base Score 6.8
    CVE-2009-3563 - Denial of Service (DoS) vulnerability - CVSSv2 Base Score 6.4

    Affected component: Firmware

    Product and Resolution:
    SPARC T3-4 - SPARC: 147317-01
    SPARC T3-2 - SPARC: 147316-01
    SPARC T3-1B - SPARC: 147318-01
    SPARC T3-1 - SPARC: 147315-01
    Netra SPARC T3-1B - SPARC: 147320-01
    Netra SPARC T3-1 - SPARC: 147319-01
    Netra SPARC T3-1BA - SPARC: 144609-07

    This notification describes vulnerabilities fixed in third-party components that are included in Sun's product distribution. Information about vulnerabilities affecting Oracle Sun products can be found on the Oracle Critical Patch Updates and Security Alerts page.

    Read the article

  • Oracle Linux Newsletter, March Edition is Here...

    - by Monica Kumar
    The March 2012 edition of Oracle Linux Newsletter is now available. It is chock full of new content including:

    30-day free trial of Ksplice for Red Hat Enterprise Linux customers
    Oracle Linux Online Forum, March 27, 2012
    Unbreakable Enterprise Kernel Release 2 details
    Why and how Dell IT migrated from SUSE Linux to Oracle Linux
    Technical articles
    Events, and more

    Read it here. Subscribe to it now.

    Read the article

  • Oracle Enterprise Pack for Eclipse 12.1.1 update on OTN

    - by gstachni
    Oracle Enterprise Pack for Eclipse (OEPE) 12.1.1.0.1 was released to OTN last week with support for new standards and features including:

    Support for Eclipse Indigo SR2 (3.7.2)
    Updated server plugins for Glassfish 3.1.2
    SSL configuration support for WebLogic Server deployment and debugging

    The SSL configuration option can be found when configuring the domain for a new WebLogic Server connection. For Eclipse early adopters, an OEPE 12c update based on Eclipse Juno M6 will be available soon.

    Read the article

  • Gotcha when using JavaScript in ADF Regions

    - by Frank Nimphius
    You use the ADF Faces af:resource tag to add or reference JavaScript on a page. However, adding the af:resource tag to a page fragment may not produce the desired result if the script is added as shown below:

        <?xml version='1.0' encoding='UTF-8'?>
        <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
                  xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
          <af:resource type="javascript">
            function yourMethod(evt){ ... }
          </af:resource>

    Adding scripts to a page fragment like this will see the script added for the first page fragment loaded by an ADF region, but not for any subsequent fragment navigated to within the context of task flow navigation. The cause of this problem is caching: the af:resource tag is a JSP element and not a lazily loaded JSF component, which makes it a candidate for caching. To solve the problem, move the af:resource tag into a container component like af:panelFormLayout so the script is added when the component is instantiated and added to the page.

        <?xml version='1.0' encoding='UTF-8'?>
        <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
                  xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
          <af:panelFormLayout>
            <af:resource type="javascript">
              function yourMethod(evt){ ... }
            </af:resource>
          </af:panelFormLayout>

    Magically this then works and prevents browser caching of the script when using page fragments.

    Read the article

  • A Modern Marketing Marvel: Eloqua Experience 2013

    - by kristin.jellison
    Hey there, partners— You'd be hard pressed to find a more convincing example of modern marketing than the one that descended upon San Francisco last week. We're talking about Eloqua Experience 2013, of course. It is remarkable that a marketing technology conference has become a case study in successful 21st-century marketing practices. Eloqua Experience 2013 (#EE13) was all about customer-focused, targeted messaging, multichannel content, analytics and real-time multiscreen engagement. It made for a busy, yet interactive experience for over 2,000 eager attendees. This year's event brought together some of the world's most innovative marketers for three days of immersive sessions covering marketing best practices, customer stories and deep-dive technical classes. With 70 breakout sessions, product announcements, and a special conversation with Vince Gilligan, creator and executive producer of "Breaking Bad," #EE13 brought a lot of critical marketing news to light. Oracle's goal: to make sure our partners stay updated. As you know, Eloqua joined Oracle in late 2012, further rounding out our Customer Experience applications platform. Eloqua is a marketing automation solution and marketing cloud centerpiece that partners can use to target the right buyers, easily execute campaigns, bring leads to sales and bring in high ROIs. The resources below will help you stay on top of the industry's best practices for marketing, plus all the advantages Eloqua can bring to partners.

    Partner Opportunities and Strategy with Eloqua - The latest Eloqua partner strategy.
    Interview with Oracle Eloqua GM Kevin Akeroyd on Eloqua Experience - A short recap of 2013's Experience.
    Eloqua Product Announcements - John Stetic, VP of Products for Oracle Eloqua, highlights the top product news, including a new profiler app and the ability to integrate display advertising into multichannel campaigns.
    Eloqua Experience Highlight Reel - See what all the bustle was about.
    Eloqua Experience Session Overviews - A quick look at what the keynote and breakout sessions covered, with links to session content.
    Modern Marketing Essentials Library - Tips, blueprints, and strategies for success based on the 5 Tenets of Modern Marketing.

    Over and out,
    Your OPN Marketing Allies

    Read the article

  • Oracle Knowledge Courses

    - by mseika
    Oracle Knowledge products offer simple and convenient ways for users to access knowledge contained in corporate information stores. With Oracle Knowledge Training, you learn how to utilize tools that improve customer service and satisfaction by helping customers find more relevant answers to questions online or from a service agent guided by a scalable knowledge management platform. The following courses have been scheduled at Oracle in Utrecht:

    Oracle Knowledge Overview Rel 8.5 (1 day)
    Learn the technical architecture of Oracle Knowledge at a high-level and the key technologies including InfoCenter, iConnect, Search, Information Manager, Answerflow and Analytics.
    Dates: to be scheduled

    Knowledge Technical Architecture and Configuration Rel 8.5 (5 days)
    Learn to implement and maintain Oracle Knowledge's core technologies through hands-on exercises including Intelligent Search, Information Manager, iConnect, AnswerFlow and Analytics.
    Dates: 13-17 January 2014 (afternoon/evening)
    Location: Live Virtual Class

    Knowledge Content Administration Rel 8.5 (2 days)
    Learn to implement, use and manage knowledge and content creation with Oracle Knowledge Information Manager.
    Dates: 4-5 December 2013
    Location: Utrecht, The Netherlands

    Knowledge Analytics Rel 8.5 (1 day)
    Learn KPI analyses and how to close gaps using reports and tools provided in Oracle Business Intelligence Enterprise Edition.
    Dates: 6 December
    Location: Utrecht, The Netherlands

    Remember: your OPN discount is always applied to the standard prices shown on the Oracle University web pages. For assistance in booking, scheduling requests and more information contact the Education Service Desk:
    eMail: [email protected]
    Telephone: +31 30 66 27 675

    Read the article

  • Gamification = -10#/3mo

    - by erikanollwebb
    One of the purposes of gamification of anything is to see if you can modify the behavior of the user. In the enterprise, that might mean getting sales people to enter more information into a CRM system, encouraging employees to update their HR records, motivating people to participate in forums and discussions, or to process invoices more quickly. Wikipedia defines behavior modification as "the traditional term for the use of empirically demonstrated behavior change techniques to increase or decrease the frequency of behaviors, such as altering an individual's behaviors and reactions to stimuli through positive and negative reinforcement of adaptive behavior and/or the reduction of behavior through its extinction, punishment and/or satiation." Gamification is just a way to modify someone's behavior using game mechanics. And the magic question is always whether it works. So I thought I would present my own little experiment from the last few months.

    This spring, I upgraded to a Samsung Galaxy 4. It's a pretty sweet phone in many ways, but one of the little extras I discovered was a built-in app called S Health. S Health is an app that you can use to track calories, weight and exercise, and it has a built-in pedometer. I looked at it when I got the phone, but assumed you had to turn it on to use it, so I didn't look at it much. But sometime in July, I realized that, in fact, it just ran in the background and was quietly tracking my steps, with a goal of 10,000 per day. 10,000 steps per day is the magic number recommended by the Surgeon General and the American Heart Association. Dr. Oz pushes it as the goal for daily exercise. It's about 5 miles of walking.

    I'm generally not the kind of person who always has my phone with me. I leave it in my purse and pull it out when I need it. But then I realized that meant I wasn't getting a good measure of my steps. I decided to do a little experiment, and carry it with me as much as possible for a week. That's when I discovered the gamification that changed my life over the last 3 months. When I hit 10,000 steps, the app jingled out a little "success!" tune and I got a badge. I was hooked. I started carrying my phone. I started making sure I had shoes I could walk in with me. I started walking at lunch time, because I realized how often I sat at my desk for 8-10 hours every day without moving. I started pestering my husband to walk with me after work because I hadn't hit my 10,000 yet, leading him at one point to say "I'm not as much a slave to that badge as you are!" I started looking at parking lots differently. Can't get a space up close? No worries, just that many steps toward my 10,000. I even tried to see if there was a second power-user level at 15,000 or 20,000 (sadly, no). If I was close at the end of the day, I have done laps around my house until I got my badge. I have walked around the block one more time to get my badge. I have mentally chastised myself when I forgot to put my phone in my pocket because I don't know how many steps I got. The badge below I got when my boss and I were in New York City and we walked around the block of our hotel just to watch the badge pop up.

    There are a bunch of tools out on the market now that have similar ideas for helping you to track your exercise and make it social. There are apps (my favorite is still Zombies, Run!). You could buy a FitBit or UP by Jawbone. Interactive Fitness makes the Expresso stationary bike with built-in video games.
All designed to help you be more aware of your activity and keep you engaged and motivated.  And the idea is to help you change your behavior. I know someone who would spend extra time and work hard on the Expresso because he had built up strategies for how to kill the most dragons while he was riding to get more points.  When the machine broke down, he didn't ride a different bike because it just wasn't that interesting. But for me, just the simple jingle and badge have been all I needed.  I admit, I still giggle gleefully when I hear the tune sing out from my pocket. After a few weeks, I noticed I had dropped a few pounds.  Not a lot, just 2-3.  But then I was really hooked.  I started making a point both to eat a little less and hit 10,000 steps as much as I could.  I bemoaned that during the floods in Boulder, I wasn't hitting my 10,000 steps.  And now, a few months later, I'm almost 10 lbs lighter. All for 1 badge a day. So yes, simple gamification can increase motivation and engagement.  And that can lead to changes in behavior.  Now the job is to apply that to the enterprise space in a meaningful and engaging way. 

    Read the article

  • How-to read data from selected tree node

    - by Frank Nimphius
    By default, the SelectionListener property of an ADF bound tree points to the makeCurrent method of the FacesCtrlHierBinding class in ADF to synchronize the current row in the ADF binding layer with the selected tree node. To customize the selection behavior, or just to read the selected node value in Java, you override the default configuration with an EL string pointing to a managed bean method property. In the following I show how you change the selection listener while preserving the default ADF selection behavior. To change the SelectionListener, select the tree component in the Structure Window and open the Oracle JDeveloper Property Inspector. From the context menu, select the Edit option to create a new listener method in a new or an existing managed bean. For this example, I created a new managed bean.
    On tree node select, the managed bean code prints the selected tree node value(s):

        import java.util.Iterator;
        import java.util.List;

        import javax.el.ELContext;
        import javax.el.ExpressionFactory;
        import javax.el.MethodExpression;
        import javax.faces.application.Application;
        import javax.faces.context.FacesContext;

        import oracle.adf.view.rich.component.rich.data.RichTree;
        import oracle.jbo.Row;
        import oracle.jbo.uicli.binding.JUCtrlHierBinding;
        import oracle.jbo.uicli.binding.JUCtrlHierNodeBinding;

        import org.apache.myfaces.trinidad.event.SelectionEvent;
        import org.apache.myfaces.trinidad.model.CollectionModel;
        import org.apache.myfaces.trinidad.model.RowKeySet;
        import org.apache.myfaces.trinidad.model.TreeModel;

        public class TreeSampleBean {

            public TreeSampleBean() {}

            public void onTreeSelect(SelectionEvent selectionEvent) {
                // original selection listener set by ADF
                // #{bindings.allDepartments.treeModel.makeCurrent}
                String adfSelectionListener = "#{bindings.allDepartments.treeModel.makeCurrent}";

                // Make sure the default selection listener functionality is preserved.
                // You don't need to do this for multi-select trees, as the ADF binding
                // only supports single current row selection.

                /* START PRESERVE DEFAULT ADF SELECT BEHAVIOR */
                FacesContext fctx = FacesContext.getCurrentInstance();
                Application application = fctx.getApplication();
                ELContext elCtx = fctx.getELContext();
                ExpressionFactory exprFactory = application.getExpressionFactory();

                MethodExpression me = exprFactory.createMethodExpression(elCtx, adfSelectionListener,
                                          Object.class, new Class[] { SelectionEvent.class });
                me.invoke(elCtx, new Object[] { selectionEvent });
                /* END PRESERVE DEFAULT ADF SELECT BEHAVIOR */

                RichTree tree = (RichTree) selectionEvent.getSource();
                TreeModel model = (TreeModel) tree.getValue();

                // get selected nodes
                RowKeySet rowKeySet = selectionEvent.getAddedSet();
                Iterator rksIterator = rowKeySet.iterator();

                // for single-select configurations, this is only called once
                while (rksIterator.hasNext()) {
                    List key = (List) rksIterator.next();
                    CollectionModel collectionModel = (CollectionModel) tree.getValue();
                    JUCtrlHierBinding treeBinding = (JUCtrlHierBinding) collectionModel.getWrappedData();
                    JUCtrlHierNodeBinding nodeBinding = treeBinding.findNodeByKeyPath(key);
                    Row rw = nodeBinding.getRow();

                    // Print the first row attribute. Note that in a tree you have to
                    // determine the node type if you want to select node attributes
                    // by name and not by index.
                    String rowType = rw.getStructureDef().getDefName();
                    if (rowType.equalsIgnoreCase("DepartmentsView")) {
                        System.out.println("This row is a department: " + rw.getAttribute("DepartmentId"));
                    }
                    else if (rowType.equalsIgnoreCase("EmployeesView")) {
                        System.out.println("This row is an employee: " + rw.getAttribute("EmployeeId"));
                    }
                    else {
                        System.out.println("Huh????");
                    }
                    // ... do more useful stuff here
                }
            }
        }

    --------------------
    Download JDeveloper 11.1.2.1 Sample Workspace

    Read the article

  • Turning Supply Data Into Savings- (Almost) Everything You Need to Know in 12 Minutes

    - by David Hope-Ross
    Strategic sourcing and supplier management analytics are easy. The hard part is getting reliable data that provide an accurate record of suppliers, spend, invoices, expenses, and so on. In this new AppsCast, e-Three’s Amy Joshi provides an extraordinarily cogent explanation of key challenges, technologies, and tactics for improving spend visibility. Take twelve minutes to listen and learn. The techniques that Amy outlines can add millions to your organization’s bottom line.

    Read the article

  • Expressing the UI for Enterprise Applications with JavaFX 2.0 FXML - Part Two

    - by Janice J. Heiss
    A new article by Oracle’s Java Champion Jim Weaver, titled “Expressing the UI for Enterprise Applications with JavaFX 2.0 FXML -- Part Two,” now up on otn/java, shows developers how to leverage the power of the FX Markup Language to define the UI for enterprise applications. Weaver, the author of Pro JavaFX Platform, extends the SearchDemoFXML example used in Part One to include more concepts and techniques for creating an enterprise application using FXML. Weaver concludes the article by summarizing its content, “FXML provides the ability to radically change the UI without modifying the controller. This task can be accomplished by loading different FXML documents, leveraging JavaFX cascading style sheets, and creating localized resource bundles. Named parameters can be used with these features to provide relevant information to an application at startup.” Check out the article here.
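
    As a rough, hypothetical illustration of the idea Weaver describes - swapping FXML documents, style sheets and resource bundles without touching the controller - a minimal JavaFX sketch might look like the following (the Search.fxml, search.css and bundle names are invented, not taken from the SearchDemoFXML example):

        import java.util.ResourceBundle;
        import javafx.application.Application;
        import javafx.fxml.FXMLLoader;
        import javafx.scene.Parent;
        import javafx.scene.Scene;
        import javafx.stage.Stage;

        public class SearchApp extends Application {
            @Override
            public void start(Stage stage) throws Exception {
                // Localized strings come from a resource bundle chosen at startup.
                ResourceBundle bundle = ResourceBundle.getBundle("search");

                // The UI definition lives outside the code; pointing the loader at a
                // different FXML document changes the UI without modifying the controller.
                Parent root = FXMLLoader.load(getClass().getResource("Search.fxml"), bundle);

                Scene scene = new Scene(root);
                // Styling is also external and swappable via a cascading style sheet.
                scene.getStylesheets().add(getClass().getResource("search.css").toExternalForm());

                stage.setScene(scene);
                stage.show();
            }

            public static void main(String[] args) {
                launch(args);
            }
        }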

    Read the article

  • 2011 - ALMs for your development team and the people they work with.

    - by David V. Corbin
    Welcome to 2011; it is already shaping up to be a very exciting year. The title of the post is not about charitable giving, although that is also a great topic. It is about Application Lifecycle Management and the systems that support it, and 2011 will be a year in which I expect many teams to invest heavily in this area. For those not familiar with ALM, it can be simplified down to "a comprehensive view of all of the ideas, requirements, activities and artifacts that impact an application over the course of its lifecycle, from concept until decommissioning". Obviously, this encompasses a large number of different areas even for relatively small and medium-sized projects. In recent years, many teams have adopted methodologies which address individual aspects of this, but the majority of this adoption has resulted in "islands of improvement" rather than the desired comprehensive outcome... until now!

    Last year Microsoft released Team Foundation Server 2010 along with Visual Studio 2010 Ultimate Edition, and with these two in combination the situation has drastically changed. At last there is a single environment that is capable of handling all aspects of ALM, and is also capable of dealing with migration and integration with existing systems to make the transition to a single solution much easier. These possibilities (and practicalities) are nothing short of amazing. Architecture through Testing integration? YES. Being able to correlate specific requirement items (and their history) to actual code (and code history)? YES. Identification of which tests will be potentially impacted by a given code change? YES. Resilient automated testing of user interfaces? YES. Automatic deployment management? YES. Integration-level testing as part of (designated) builds? YES. I could easily double or triple the above list, but these items should be enough to get you thinking about the "pain points" your team and organization currently face and the fact that there IS a way to relieve the pain.

    Over the course of the year, I am hoping to bring together some of the "best of breed" information, along with hosting (and participating in) discussions with various experts in the field. There are already a number of groups (including many on LinkedIn) that have an ALM focus, and I encourage everyone to check them out. I will be posting a list of the ones I find most helpful in the not too distant future. As I said at the beginning, 2011 is shaping up to be a very interesting (and productive) year. Why wait to start investigating and adopting ALM?

    ps: For those interested in becoming an "Alms Giver" in the charitable sense, I highly recommend checking out GiveCamp. A group of developers, designers and others get together to create a solution for a charity in just under 48 hours. I will be attending the GiveCamp in New York City on Jan 14-16; more information is available at nycgivecamp.org/

    Read the article

  • FREE Windows Azure Platform Compute and Storage through the Cloud Essentials Pack for Partners

    - by Eric Nelson
    It can be difficult to find something to look forward to in January - but this year it was a little easier as a) I got lots of great Xbox 360 games and b) the Windows Azure Platform element of the Cloud Essentials Pack for Microsoft Partner Network partners went live. I have previously explained what the Cloud Essentials Pack is and how you can access it - but at the time I couldn't share the details of the Windows Azure Platform element. The Windows Azure Platform element is now available. It gives you each month, for FREE:

    Windows Azure:
    750 hours of extra small compute instance
    25 hours of small compute instance
    3GB of storage and 250,000 storage transactions

    SQL Azure:
    1 SQL Azure Web Edition database (5GB)

    Windows Azure AppFabric:
    App Fabric with 100,000 Access Control transactions and 2 Service Bus connections

    Plus Data Transfer: 3GB in and 6GB out

    (More details of the offer)

    To activate this offer you need to:
    Sign your company up to Microsoft Platform Ready (NB: there are other routes to get this benefit - but I know about MPR)
    Read about Microsoft Platform Ready
    Visit http://www.microsoftcloudpartner.com/ and sign up.

    Read the article

  • JAX-RS 2.0 Early Draft - Third Edition Available

    - by arungupta
    JAX-RS 2.0 Early Draft Third Edition is now available. This updated draft includes new samples explaining the features, and clarifications in content negotiation, discovery of providers, the client-side API, filters and entity interceptors, and several other sections. Provide feedback to users@jax-rs-spec. Jersey 2.0, the Reference Implementation of JAX-RS 2.0, released its fourth milestone a few days ago as well. Several features have already been implemented there. Note, this is an early development preview and several parts of the API and implementation are still evolving. Feel like trying it out? Simply go to Maven Central (of course none of this is production quality at this point). The latest JAX-RS Javadocs and Jersey 2.0 API docs are good starting points to explore. And provide feedback at [email protected] or @gf_jersey.
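
    Since the headline addition in the draft is a standard client-side API, here is a minimal sketch of the call pattern as it looks in the final JAX-RS 2.0 API; names in the early draft and in the Jersey 2.0 milestones may differ, and the target URL here is invented:

        import javax.ws.rs.client.Client;
        import javax.ws.rs.client.ClientBuilder;
        import javax.ws.rs.core.MediaType;

        public class RestClientSketch {
            public static void main(String[] args) {
                // Build a client, point it at a resource, negotiate a media type and read the entity.
                Client client = ClientBuilder.newClient();
                String body = client.target("http://example.org/api/items")
                                    .request(MediaType.APPLICATION_JSON)
                                    .get(String.class);
                System.out.println(body);
                client.close();
            }
        }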

    Read the article

  • Disconnected Online Customer Experience? Connect It with Oracle WebCenter

    - by Christie Flanagan
    Engage. Empower. Optimize. Today's customers have higher expectations and more choices than ever before. Successful organizations must deliver an engaging online experience that is personalized, interactive and consistent across all phases of the customer journey. This requires a new approach that connects and optimizes all customer touch points as they research, select and transact with your brand. Attend this webcast to learn how Oracle WebCenter:

    Works with Oracle ATG Commerce and Oracle Endeca to deliver consistent and engaging browsing, shopping and search experiences across all of your customer facing websites
    Enables you to optimize the performance of your online initiatives through integration with Oracle Real-Time Decisions for automated targeting and segmentation
    Connects with Siebel CRM to maintain a single view of the customer and integrate campaigns across channels

    Register now for the Webcast.
    Register Now
    Thurs., November 8, 2012, 10 a.m. PT / 1 p.m. ET

    Presented by:
    Joshua Duhl, Senior Principal Product Manager, Oracle WebCenter
    Christie Flanagan, Director of Product Marketing, Oracle WebCenter

    Read the article
