Search Results

Search found 14457 results on 579 pages for 'oracle payroll'.


  • Pillar Axiom 600 Playbook for Partners

    - by swalker
    Learn more about the product positioning and functionality of the Pillar Axiom 600. The Pillar Axiom 600 Playbook (sales script) supports you in selling, in identifying and qualifying sales opportunities, in developing sales scenarios, and in differentiating yourself from your competition. Focus your resources and expand your offering with the OPN Specialized options available to your company.

    Read the article

  • RESTful Java on Steroids (Parleys, Podcast, ...)

    - by alexismp
    As reported previously here, the JAX-RS 2.0 (JSR 339) expert group is making good progress. If you're interested in what the future holds for RESTful Java web services, you can now watch Marek's Devoxx presentation or listen to him in the latest Java Spotlight Podcast (#74). Marek discusses the new client API, filters/handlers, BeanValidation integration, Hypermedia support (HATEOAS), server-side async processing and more. With JSR 339's Early Draft Review 2 currently out, another draft review is planned for April, the public review should follow in June, and the final draft is currently scheduled for the end of the summer. In short, expect completion sometime before the end of 2012.
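    For a sense of what the new client API looks like in code, here is a minimal sketch using the javax.ws.rs.client API as it eventually shipped in JAX-RS 2.0 (the URL and query parameter are made up for illustration; details were still subject to change while the spec was in draft):

    import javax.ws.rs.client.Client;
    import javax.ws.rs.client.ClientBuilder;
    import javax.ws.rs.core.MediaType;

    public class RestClientSketch {
        public static void main(String[] args) {
            // Build a client and target a (hypothetical) resource URI.
            Client client = ClientBuilder.newClient();
            String result = client.target("http://localhost:8080/api/orders")
                    .queryParam("status", "open")          // query parameter
                    .request(MediaType.APPLICATION_JSON)   // content negotiation
                    .get(String.class);                    // synchronous GET
            System.out.println(result);
            client.close();
        }
    }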

    Read the article

  • Gradle Support in NetBeans IDE 7.2

    - by Geertjan
    Russel Winder and Steve Chin spent half an hour, and then gave up, setting up NetBeans IDE to use Gradle, because they couldn't find the NetBeans Gradle plugin during Steve's NightHacking tour. That no longer needs to happen, because Attila Kelemen's NetBeans Gradle plugin is now available in the Plugin Manager in NetBeans IDE 7.2. Aside from opening Gradle-based applications, you can now also create new ones. Details and documentation: https://github.com/kelemen/netbeans-gradle-project

    Read the article

  • BPM 11g - Dynamic Task Assignment with Multi-level Organization Units

    - by Mark Foster
    I've seen several requirements to have a more granular level of task assignment in BPM 11g based on some value in the data passed to the process. Parametric Roles is normally the first port of call to try to satisfy this requirement, but in this blog we will show how a lot of use-cases can be satisfied by the easier-to-implement and flexible Organization Unit.
    The Use-Case: Task assignment is to an approval group containing several users. At runtime, a location value in the input data determines which of the particular users the task is ultimately assigned to. In this case we use the Demo Community referenced in the SOA Admin Guide, and specifically the "LoanAnalyticGroup", which contains three users: "szweig", "mmitch" & "fkafka". In our scenario we would like to assign a task to "szweig" if the input data specifies that the location is "JapanCentral", to "fkafka" if the location is "JapanNorth", to "mmitch" if it is "JapanSouth", and to all of them if the location is "Japan".
    The Process: A simple one-human-task process. In the output data association of the "Start" activity we need to set the value of the "Organization Unit" predefined variable based on the input data (note that the predefined variables can only be set on output data associations). In the output data association of the human activity we reset the "Organization Unit" to empty; it is always good practice to ensure that the Organization Unit will not be used for any subsequent human activities for which we do not require it.
    Set Up the Organization Unit: Log in to the BPM Workspace with an administrator user (weblogic/welcome1 in our case) and choose the "Administration" option. Within "Roles", assign the "ProcessOwner" swim-lane for our process to "LoanAnalyticGroup". Within "Organization Units" we can model our organization: "Root Organization Unit" as "Japan" and child Organization Units as "Central", "South" & "North". As described previously, add user "szweig" to "Central", "mmitch" to "South" and "fkafka" to "North".
    Test the Process with Invalid Data: First let us test with invalid data in the input to see what the consequences are; here we use "X" as input. Looking at the instance, we can see it has errored.
    Organization Unit Root Level Assignment: Now let us see what happens if we have "Japan" in the input data. Looking at the flow trace we can see that the task has been assigned, but to whom? Checking the BPM Workspace for "szweig", for "mmitch" and for "fkafka" shows the task in each worklist, so with an Organization Unit at "Root" level we have successfully assigned the task to all users.
    Organization Unit Child Level Assignment: Now let us test with "Japan/North" in the input data. Looking in the "fkafka" workspace we see the task has been assigned (remember, he was associated with "JapanNorth"), while "szweig" and "mmitch" have no tasks assigned, just as we expected.
    Summary: We have seen in this blog how to easily implement multi-level dynamic task routing using Organization Units, a common use-case and a simpler solution than Parametric Roles.

    Read the article

  • Murali Papana Blogs About Date Effectivity

    - by steve.muench
    Murali Papana from our Human Capital Management (HCM) Fusion Applications team has posted a series of blogs on a lesser-known, but quite powerful, feature of ADF called "date effectivity". This feature allows the framework to simplify managing records whose data values are effective for a given period of time. Imagine an employee's job title or salary that changes over time, which might well be entered today by an HR representative but go into effect at some time in the future. Check out these articles if you're curious to learn more:
    - Learning basics of Date Effectivity in ADF
    - ADF Model: Creating Date Effective EO
    - ADF Model: Creating Date Effective Association and Date Effective VO
    - ADF UI - Implementing Date Effective Search with Example

    Read the article

  • Asciidoctor / NetBeans

    - by Geertjan
    With Jason Lee's NetBake plugin (https://bitbucket.org/jdlee/netbake), once you've installed JRuby and the Asciidoctor gem, you're ready to use Asciidoctor with NetBeans IDE. New Asciidoc files can be created, which have a Source view and a Visual view. The current content of the text editor is parsed by the Asciidoctor gem and the resulting HTML is displayed in a JEditorPane. Awestruct support is also part of the NetBake plugin, with a new project type and other related features, and an Options window is included for configuring the plugin. I've been in touch with Jason and we're discussing separating the Asciidoctor parts from the Awestruct parts and then publishing them separately as plugins on the NetBeans Plugin Portal.
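    For readers curious how the pieces fit together, here is a rough, stand-alone sketch of the general idea (not the plugin's actual code): AsciiDoc text is converted to HTML by the Asciidoctor gem running inside embedded JRuby, and the result is shown in a JEditorPane. It assumes the JRuby embed API is on the classpath and that the Asciidoctor gem is visible to that JRuby runtime.

    import javax.swing.JEditorPane;
    import javax.swing.JFrame;
    import javax.swing.JScrollPane;
    import org.jruby.embed.ScriptingContainer;

    public class AsciidocPreviewSketch {
        public static void main(String[] args) {
            String source = "= Hello\n\nThis is *AsciiDoc*.";

            // Run the Asciidoctor gem inside embedded JRuby.
            ScriptingContainer ruby = new ScriptingContainer();
            ruby.put("$source", source); // expose the text as a Ruby global variable
            Object html = ruby.runScriptlet(
                    "require 'asciidoctor'; Asciidoctor.convert($source)"); // 'render' in older gem versions

            // Show the generated HTML in a JEditorPane, as the plugin's Visual view does.
            JEditorPane preview = new JEditorPane();
            preview.setContentType("text/html");
            preview.setText(String.valueOf(html));

            JFrame frame = new JFrame("AsciiDoc preview");
            frame.add(new JScrollPane(preview));
            frame.setSize(400, 300);
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.setVisible(true);
        }
    }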

    Read the article

  • DISA Cross Domain Enterprise Solutions on the NetBeans Platform

    - by Geertjan
    Bray 2.0 is a tool based on the NetBeans Platform that assists in creating valid Data Flow Configuration (DFC) files. The DFC Specification was developed to provide a standardized way for defining, validating, and approving data flows for use on cross-domain guarding solutions. A DFC document specifies key entities such as security domains, guards that facilitate data between security domains, data flows that describe how data travels between security domains, filters that transform and validate the data, and more. Related info: http://www.disa.mil/Services/Information-Assurance/Cross-Domain-Solutions The Bray product is in development at Fulcrum IT (http://www.fulcrumco.com). The DFC Specification and Bray were developed in support of the US Department of Defense. Bray 2.0 marks the first release of Bray on the NetBeans Platform and utilizes a number of features that are core to the NetBeans Platform:
    - Modular pluggability. Bray consumers can integrate their own tools, file types, and more into the product with relative ease.
    - Robust UI. The NetBeans Platform's intuitive UI makes it easy to access and manipulate multiple aspects of a DFC.
    - Explorer. The Explorer is a key component that makes the DFC XML easy to traverse, edit, and find errors in.
    - Context-sensitive help. JavaHelp can be readily integrated for the product as well as all the UI within it.
    - Editors. Any external file can be added to a DFC. Users can register their own editors or use the provided NetBeans editors to edit files.
    - Printing. The NetBeans Platform Print API makes it easy to determine what should be printed and how.
    Bray 2.0 provides a lot of key features for developing valid, robust DFC files:
    - XML validation. A DFC can be validated against the DFC schema specification.
    - DFC Check List. An interactive, minimal guide for creating a complete DFC.
    - Summary Window. The Summary Window functions like the Navigator in NetBeans IDE. The current "item of interest" is checked against various business rules, and the window provides the ability to quickly find and fix errors.
    - Change Log. Bray audits every change to a DFC and places the changes in a change log for users to peruse.
    - Comments. Users can optionally add comments for other users to see.
    - Digital signatures. DFC files can be digitally signed. A signature history and signature validation are provided in Bray.
    - Pluggable security schemes. Bray ships with plain text and IC-ISM security schemes. If needed, users can integrate additional ones.
    ...and more to come! New features for Bray are constantly in development, including use of the NetBeans Visual Library, language support, and more.

    Read the article

  • Customer Engagement: Are Your Customers Engaged With Your Brands?

    - by Michael Snow
    Engaging Customers is Critical for Business Growth
    This week we'll be spending some time looking at Customer Engagement. We all have stories about how we try to engage our customers better than ever before. We all know that successfully engaging customers is critical to an organization's business success. We also know that engaging our customers is more challenging today than ever before. There is so much noise to compete with for getting anyone's attention.
    Over the last decade and a half we've watched as the online channel became a primary one for conducting our business and even managing our lives. And during this whole process or evolution, the customer journey has grown increasingly complex. Customers themselves have assumed increasing power and influence over the purchase process and over setting the tone and pace of the relationships they have with brands, and you see the evidence of this in the really high expectations that customers have today. They expect brand experiences that are personalized and relevant -- in other words, they want experiences that demonstrate that the brand understands their interests, preferences and past interactions with them. They also expect their experience with a brand and the community surrounding it to be social and interactive – it's no longer acceptable to have a static, one-way dialogue with your customer base or to fail to connect your customers with fellow customers, or with your employees and partners. And on top of all this, customers expect us to deliver this rich and engaging, personalized and interactive experience in a consistent way across a variety of channels, including web, mobile and social channels, and even offline venues such as in-store or via a call center. As a result, delivering on these expectations and successfully engaging your customers is a great challenge today.
    Customers expect a personal, engaging and consistent online customer experience. Today's consumer expects to engage with your brand and the community surrounding it in an interactive and social way. Customers have come to expect a lot from the online customer experience:
    - They expect it to be personal:
      - Accessible: regardless of my device, via my existing online identities
      - Relevant: content that interests me
      - Customized: the ability to tailor my online experience
    - They expect it to be engaging:
      - Social: so I can share content with my social networks
      - Intuitive: so I can easily find what I need
      - Interactive: so I can interact with online communities
    And they expect it to be consistent across the online experience – so you had better have your brand and information ducks in a row.
    These expectations are not limited to your customers by any means. Your employees (and partners) are also expecting to be empowered with engagement tools across their internal and external communications and interactions with customers, partners and other employees. We had a great conversation with Ted Schadler from Forrester Research entitled "Mobile is the New Face of Engagement" that is now available On-Demand. Take a look at all the webcasts available to watch from our Social Business Thought Leader Series.
    Social capabilities have become pervasive and have changed customers' expectations for their online experiences. The days of one-direction communication with customers are at an end. Today's customers expect to engage in a dialogue with your brand and the community surrounding it in an interactive and social way.
    You have a very short window of opportunity to engage a customer before they go to another site in their pursuit of information, products, or services. In fact, customers who engage with brands via social media tend to spend more than customers who don't: between 20% and 40% more. And your customers are also increasingly influenced by their social networks – 40% of consumers say they factor in Facebook recommendations when making purchasing decisions. This means a few different things for today's businesses: incorporating forms of social interaction such as commenting or reviews, and tightly integrating your online experience with your customers' social networking experiences, are crucial for keeping eyeballs on your desired pages.
    Notes/Sources:
    - 93% - Cone finds that Americans expect companies to have a presence in social media - http://www.coneinc.com/content1182
    - 40% of consumers factor in Facebook recommendations when making decisions about purchasing (Increasing Campaign Effectiveness with Social Media, Syncapse, March 2011)
    - 20%-40% - Customers who engage with a company via social media spend this percentage more with that company than other customers (Source: Bain & Company report, "Putting Social Media to Work")

    Read the article

  • OOW 2012: Kings of Leon & Pearl Jam - Appreciation Event

    - by Mike Dietrich
    June 15, 1992 - that was the day Pearl Jam played their first concert at the Serenadenhof in my hometown, Nürnberg. Oops ... that's over 20 years ago ... So I was very happy to get a ticket to this year's OOW 2012 appreciation event on Treasure Island. Every year it amazes me over and over again how the organizers manage it logistically to bring almost 40,000 people to and back from the island. Food was ... I would say fairly OK ... and the beer (as always) is not - actually, even though I'm not a beer drinker, I wouldn't call it beer. Kings of Leon started. I like them a lot and own their 2008 album Only by the Night. That was a good start to warm up the crowd. And then Pearl Jam took over - and ... wooooooow ... they are such a great live band. First of all, as far as I understood, they were donating the money they got for that gig to an NGO. And Eddie Vedder's voice is simply striking ... I had shivers running down my spine. They played a good excerpt of their 20+ year career, closing with Alive at the very end. It seemed to everybody that the band had real fun playing there - and it was sooooo good. Thanks a lot to the person who organized a ticket for me. Catching my bus back to my hotel area down at Fisherman's Wharf worked well - but I must have fallen asleep 5 minutes after we left the parking lot. The next thing I recognized was the bus driver hitting the brakes at Northpoint. What a wonderful night ...

    Read the article

  • Learn more about SPARC by listening to our newly recorded podcasts

    - by Cinzia Mascanzoni
    Please listen to our newly recorded series of four podcasts focused on SPARC. The topics are:
    - How SPARC T4 Servers Open New Opportunities
    - SPARC Roadmap and SPARC T4 Architecture Highlights
    - SPARC T4 for Installed Base Refresh and Consolidation
    - SPARC T4 – How Does it Stack up Against the Competition?
    Rob Ludeman, from SPARC Product Management, and Thomas Ressler, WWA&C Alliances Consultant, are your hosts. The intent is to continue to help you understand how to position and sell SPARC T4 into your customers' architectures. Details on how to access these podcasts can be found here.

    Read the article

  • OWB 11gR2 – JDBC Helper Utility

    - by David Allan
    One of the common queries when importing tables via JDBC with 11gR2 is determining why the import wizard doesn't display the tables that you think it should. I often just use the script below to dump out the schemas, tables and columns that the JDBC driver is returning. This is useful in a few areas:
    - to figure out what schema name is returned, so you can double-check it against the schema name you have used in the location (this is used in the DatabaseMetaData.getTables API call within the basic JDBC metadata import);
    - to figure out the data types returned from the JDBC driver when you see columns skipped because of "no datatype supported" messages;
    - also ... I can do it via scripting and don't need to recompile classes and stuff :-)
    Edit the tcl script and set the JDBC driver, the connection URL and the username and password (they are at the bottom of the script). The script then calls a basic tcl procedure which writes the schemas, tables and columns, with various properties, to standard out. For example, I executed it using the XML JDBC driver from ODI over a simple customers XML file and it wrote out the corresponding metadata. You can add more details as you need and execute it from the OMBPlus panel within OWB. Download the sample tcl jdbc script here. There is a bunch of really useful stuff on OTN documenting this area (start with the white paper here) that is worth checking out, all related to the OWB SDK, covering everything from platform definitions, custom metadata importers, application adapters, code templates, etc. You can find a bunch of goodies on the OWB SDK here.
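    If you prefer plain JDBC to the tcl/OMBPlus route, the same information can be dumped with a few java.sql.DatabaseMetaData calls. This is only a minimal sketch; the driver URL and credentials below are placeholders, not values from the article:

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class JdbcMetadataDump {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details - substitute your own driver, URL and credentials.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//localhost:1521/ORCL", "scott", "tiger")) {
                DatabaseMetaData md = conn.getMetaData();

                // Schemas as reported by the driver (compare with the name used in the location).
                try (ResultSet schemas = md.getSchemas()) {
                    while (schemas.next()) {
                        System.out.println("SCHEMA: " + schemas.getString("TABLE_SCHEM"));
                    }
                }

                // Tables and their columns, including the driver-reported data types.
                try (ResultSet tables = md.getTables(null, null, "%", new String[]{"TABLE"})) {
                    while (tables.next()) {
                        String schema = tables.getString("TABLE_SCHEM");
                        String table = tables.getString("TABLE_NAME");
                        System.out.println("TABLE: " + schema + "." + table);
                        try (ResultSet cols = md.getColumns(null, schema, table, "%")) {
                            while (cols.next()) {
                                System.out.println("  COLUMN: " + cols.getString("COLUMN_NAME")
                                        + " type=" + cols.getString("TYPE_NAME"));
                            }
                        }
                    }
                }
            }
        }
    }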

    Read the article

  • How to Open Any Folder as a Project in the NetBeans Platform

    - by Geertjan
    Typically, as described in the NetBeans Project Type Tutorial, you'll define a project type based on the presence of a file (e.g., "project.xml" or "customer.txt" or something like that) in a folder. I.e., if the file is there, then its parent, i.e., the folder that contains the file, is a project and should be opened in your application. However, in some scenarios (as with the HTML5 project type introduced in NetBeans IDE 7.3), the user should be able to open absolutely any folder at all into the application. How to create a project type that is that liberal? Here you go: the only condition that needs to be true is that the selected item in the "Open Project" dialog is a folder, as defined in the "isProject" method below. Nothing else. That's it. If you select a folder, it will be opened in your application, displaying absolutely everything as-is (since below there's no ProjectLogicalView defined):

    import java.beans.PropertyChangeListener;
    import java.io.IOException;
    import javax.swing.Icon;
    import org.netbeans.api.project.Project;
    import org.netbeans.api.project.ProjectInformation;
    import org.netbeans.spi.project.ProjectFactory;
    import org.netbeans.spi.project.ProjectState;
    import org.openide.filesystems.FileObject;
    import org.openide.loaders.DataFolder;
    import org.openide.loaders.DataObjectNotFoundException;
    import org.openide.nodes.FilterNode;
    import org.openide.util.Exceptions;
    import org.openide.util.ImageUtilities;
    import org.openide.util.Lookup;
    import org.openide.util.lookup.Lookups;
    import org.openide.util.lookup.ServiceProvider;

    @ServiceProvider(service = ProjectFactory.class)
    public class FolderProjectFactory implements ProjectFactory {

        @Override
        public boolean isProject(FileObject projectDirectory) {
            return DataFolder.findFolder(projectDirectory) != null;
        }

        @Override
        public Project loadProject(FileObject dir, ProjectState state) throws IOException {
            return isProject(dir) ? new FolderProject(dir) : null;
        }

        @Override
        public void saveProject(Project prjct) throws IOException, ClassCastException {
            // leave unimplemented for the moment
        }

        private class FolderProject implements Project {

            private final FileObject projectDir;
            private Lookup lkp;

            private FolderProject(FileObject dir) {
                this.projectDir = dir;
            }

            @Override
            public FileObject getProjectDirectory() {
                return projectDir;
            }

            @Override
            public Lookup getLookup() {
                if (lkp == null) {
                    lkp = Lookups.fixed(new Object[]{
                        new Info(),
                    });
                }
                return lkp;
            }

            private final class Info implements ProjectInformation {

                @Override
                public Icon getIcon() {
                    Icon icon = null;
                    try {
                        icon = ImageUtilities.image2Icon(
                                new FilterNode(DataFolder.find(
                                getProjectDirectory()).getNodeDelegate()).getIcon(1));
                    } catch (DataObjectNotFoundException ex) {
                        Exceptions.printStackTrace(ex);
                    }
                    return icon;
                }

                @Override
                public String getName() {
                    return getProjectDirectory().getName();
                }

                @Override
                public String getDisplayName() {
                    return getName();
                }

                @Override
                public void addPropertyChangeListener(PropertyChangeListener pcl) {
                    // do nothing, won't change
                }

                @Override
                public void removePropertyChangeListener(PropertyChangeListener pcl) {
                    // do nothing, won't change
                }

                @Override
                public Project getProject() {
                    return FolderProject.this;
                }
            }
        }
    }

    Even the ProjectInformation implementation really isn't needed at all, since it provides nothing more than the icon in the "Open Project" dialog; the rest (i.e., the display name in the "Open Project" dialog) is provided by default regardless of whether you have a ProjectInformation implementation or not.

    Read the article

  • Why do large IT projects tend to fail or have big cost/schedule overruns?

    - by Pratik
    I always read about large-scale transformation or integration projects that are total or almost total disasters. Even if they somehow manage to succeed, the cost and schedule blow-out is enormous. What is the real reason behind large projects being more prone to failure? Can agile be used in these sorts of projects, or is a traditional approach still the best? One example from Australia is the Queensland Payroll project, where they changed the test success criteria to deliver the project. See some more failed projects in this SO question. Have you got any personal experience to share?

    Read the article

  • TomEE Integration in NetBeans Next

    - by Geertjan
    At JavaOne 2013, there was a lot of buzz around the TomEE server, e.g., many Tweets, a nice party, and a new TomEE consulting company. For those tracking TomEE developments, it is interesting to note that the NetBeans IDE development builds have recently gained... TomEE support. Note: The TomEE support described here is not in NetBeans IDE 7.4, but in development builds for the next release of NetBeans IDE. For example, with NetBeans IDE development builds you're able to:
    - register TomEE as a server in the Services window (TomEE has several distributions; one can use the "with JAX-RS" one, for example)
    - create a Java EE 6 web project (e.g., Maven based) against this server
    - create JPA entities from a database
    - create JAX-RS classes from JPA entities
    - create JSF pages from JPA entities
    - create a new data source for TomEE and deploy it to the server
    - let the IDE figure out the components that are already packaged in TomEE, and the fact that (unlike with regular Tomcat) it does not need to package components such as a JSF implementation, persistence provider, or JAX-RS runtime, so that the resulting WAR file is very small
    - use "deploy on save" with TomEE, so that your development cycle is very fast
    Adam Bien blogged about how he set up TomEE some time ago, here. The official support in NetBeans IDE will be much more tightly integrated, simplifying the steps Adam describes. For example, the IDE does step 2 from Adam's blog for you, i.e., it sets up the TomEE deployment roles. Moreover, it knows about all the technologies included in TomEE so that it can optimize the packaging; it knows about TomEE's persistence setup; it can work with TomEE data sources, etc. Below you see a Maven-based Java EE 6 PrimeFaces application (all entities and JSF pages generated from a database) deployed to TomEE in NetBeans IDE, and the management console for configuring and fine-tuning TomEE in NetBeans IDE. When I tried out the NetBeans IDE development build and TomEE, to see how everything fits together, I was surprised at how fast TomEE started up. Not sure what they did to it, but it seems like a server on steroids. And setting it up in NetBeans IDE was trivial. Add the simple setup of TomEE in NetBeans IDE to the many benefits that the widely praised out-of-the-box NetBeans Maven tools make possible, together with the fact that not one single plugin had to be installed to get everything you see described here up and running... and you have a really powerful combination of dev tools, all for free.
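    For context, the kind of artifact those wizards produce is roughly the following minimal JAX-RS resource backed by JPA (the class, entity and persistence-unit names here are illustrative assumptions, not code generated by the IDE). On TomEE's "with JAX-RS" distribution, this deploys without bundling any JAX-RS, JSF or JPA implementation in the WAR:

    // File: Customer.java - a plain JPA entity; @XmlRootElement lets JAX-RS serialize it.
    import java.io.Serializable;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.xml.bind.annotation.XmlRootElement;

    @Entity
    @XmlRootElement
    public class Customer implements Serializable {
        @Id
        @GeneratedValue
        private Long id;
        private String name;

        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    // File: CustomerResource.java - an EJB-based JAX-RS resource reading the entity via JPA.
    import java.util.List;
    import javax.ejb.Stateless;
    import javax.persistence.EntityManager;
    import javax.persistence.PersistenceContext;
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    @Stateless
    @Path("customers")
    public class CustomerResource {

        @PersistenceContext(unitName = "demoPU") // persistence unit name is an assumption
        private EntityManager em;

        @GET
        @Produces(MediaType.APPLICATION_JSON)
        public List<Customer> findAll() {
            return em.createQuery("SELECT c FROM Customer c", Customer.class).getResultList();
        }
    }

    // File: RestConfig.java - activates JAX-RS under /resources; all @Path classes are picked up.
    import javax.ws.rs.ApplicationPath;
    import javax.ws.rs.core.Application;

    @ApplicationPath("resources")
    public class RestConfig extends Application {
    }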

    Read the article

  • JavaOne - Java SE Embedded Booth - Digi - Home Health Hub (HHH)

    - by David Clack
    Hi All, another exciting platform we will have in the booth at JavaOne is the Digi Home Health Hub (HHH) platform. http://www.digi.com/products/wireless-wired-embedded-solutions/single-board-computers/idigi-telehealth-application-kit#overview This is a Freescale reference design that has been built by Digi. The system is powered by a Freescale i.MX28 ARM SoC, and what's really exciting me is that it has every wireless protocol you could ever want on a single motherboard: Ethernet, 802.11b/g/n Wi-Fi, Bluetooth, ZigBee, a configurable sub-GHz radio, and NFC, plus USB, audio and an LCD/touch screen option. I've been experimenting with lots of wireless-capable healthcare products in the last few months, plus some Bluetooth pulse/oxy meters, and we have been looking at how the actual healthcare wireless protocols work. Steve Popovich, Vice President, Digi International, will be doing a talk at the Java Embedded @ JavaOne conference in the Hotel Nikko, right next door to the JavaOne show in the Hilton. If you are registered at JavaOne you can come over to Java Embedded @ JavaOne for $100. Come see us in booth 5605. See you there. Dave

    Read the article

  • Clustering in GlassFish with DCOM on Windows 7

    - by ByronNevins
    I've discovered that Windows 7 makes it very difficult to use DCOM and, mainly, the GlassFish clustering commands that rely on DCOM. I spent a few days trying to solve the problems. I don't yet have a cookbook for making DCOM work on Windows 7, but here are a few tips and advice I've found.
    1. Run asadmin setup-local-dcom -- it now comes automatically with the open source GlassFish 4. It will write some critical registry entries for you.
    2. Run asadmin validate-dcom to test DCOM.
    3. When I ran validate-dcom on my Windows 7 network I saw the problem below:
       Successfully resolved host name to: gloin/10.28.51.10
       Successfully connected to DCOM Port at port 135 on host gloin.
       Successfully connected to NetBIOS Session Service at port 139 on host gloin.
       Successfully connected to Windows Shares at port 445 on host gloin.
       Can not access the remote file system.  Is UAC on? : Access is denied.
    I discovered the actual problem is that Windows 7 no longer has the "C$" administrative file share available by default. If "C$" isn't available then nothing will work. Here is how to expose the "C$" share: this registry change allows "C$" to be accessed, and as soon as I set it, the file copying started working! Under the key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System, create a 32-bit DWORD value named LocalAccountTokenFilterPolicy with value 1.
    4. Turn on the Remote Registry Service -- this is critical and it's easy to do; Windows 7 has it turned off by default. My Computer > right-click > Manage > Services, then turn on the Remote Registry Service and set it to start automatically in the future.
    5. Turn off UAC: %systemroot%\system32\UserAccountControlSettings.exe
    6. This is where I discovered that the McAfee virus scanner blocks all the NetBIOS shares! It has to be disabled.

    Read the article

  • JPedal Action for Converting PDF to JavaFX

    - by Geertjan
    The question of the day comes from Mark Stephens, from JPedal (JPedal is the leading 100% Java PDF library, providing a Java PDF viewer, PDF to image conversion, PDF printing, and PDF search and PDF extraction features), in the form of a screenshot. The question is clear. By looking at the annotations above, you can see that Mark has an ActionListener that has been bound to the right-click popup menu on PDF files. Now he needs to get hold of the file to which the Action has been bound. How, oh how, can one get hold of that file? Well, it's simple. Leave everything you see above exactly as it is but change the Java code section to this:

    public final class PDF2JavaFXContext implements ActionListener {

        private final DataObject context;

        public PDF2JavaFXContext(DataObject context) {
            this.context = context;
        }

        public void actionPerformed(ActionEvent ev) {
            FileObject fo = context.getPrimaryFile();
            File theFile = FileUtil.toFile(fo);
            //do something with your file...
        }
    }

    The point is that the annotations at the top of the class bind the Action to either Actions.alwaysEnabled, which is a factory method for creating always-enabled Actions, or Actions.context, which is a factory method for creating context-sensitive Actions. How does the Action get bound to the factory method? The annotations are converted, when the module is compiled, into XML registration entries in the "generated-layer.xml", which you can find in your "build" folder, in the Files window, after building the module. In Mark's case, since the Action should be context-sensitive to PDF files, he needs to bind his PDF2JavaFXContext ActionListener (which should probably be named "PDF2JavaFXActionListener", since the class is an ActionListener) to Actions.context. All he needs to do that is pass the object he wants to work with into the constructor of the ActionListener. Now, when the module is built, the annotation processor is going to take the annotations and convert them to XML registration entries, but the constructor will also be checked to see whether it is empty or not. In this case, the constructor isn't empty, hence the Action should be context-sensitive and so the ActionListener is bound to Actions.context. Actions.context will do all the enablement work for Mark, so that he will not need to provide any code for enabling/disabling the Action. The Action will be enabled whenever a DataObject is selected. Since his Action is bound to Nodes in the Projects window that represent PDF files, the Action will always be enabled whenever Mark right-clicks on a PDF Node, since the Node exposes its own DataObject. Once Mark has access to the DataObject, he can get the underlying FileObject via getPrimaryFile, and he can then convert the FileObject to a java.io.File via FileUtil.toFile. Once he's got the java.io.File, he can do with it whatever he needs. Further reading: http://bits.netbeans.org/dev/javadoc/
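    Since the screenshot with the annotations is not reproduced here, a typical registration for a context-sensitive Action of this kind looks roughly like the following; the category, ID, display name and MIME-type folder path are illustrative assumptions, not values from Mark's module:

    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import java.io.File;
    import org.openide.awt.ActionID;
    import org.openide.awt.ActionReference;
    import org.openide.awt.ActionRegistration;
    import org.openide.filesystems.FileObject;
    import org.openide.filesystems.FileUtil;
    import org.openide.loaders.DataObject;

    @ActionID(category = "Tools", id = "com.example.pdf2javafx.PDF2JavaFXContext")
    @ActionRegistration(displayName = "Convert PDF to JavaFX")
    @ActionReference(path = "Loaders/application/pdf/Actions", position = 100)
    public final class PDF2JavaFXContext implements ActionListener {

        private final DataObject context;

        // Non-empty constructor => the annotation processor binds this to Actions.context.
        public PDF2JavaFXContext(DataObject context) {
            this.context = context;
        }

        @Override
        public void actionPerformed(ActionEvent ev) {
            FileObject fo = context.getPrimaryFile();
            File theFile = FileUtil.toFile(fo);
            // do something with the PDF file...
        }
    }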

    Read the article

  • java.net outage

    - by alexismp
    The GlassFish website has been down for a number of hours (together with a number of other projects) as a result of a general outage in the java.net datacenter. The team is working hard on getting everything back to normal. You can track progress by following @ProjectKenai. Update: services should now all be back to normal. If you face java.net issues in the future, consider reporting them here. And now, back to work!

    Read the article

  • How to update MDS ?

    - by harsh.singla
    The other question that comes to mind is how to update an existing file, or create a new file, in MDS. It's really simple; we already have the setup done for this in Foundation Pack. Here are the steps you need to follow to update or create a file in MDS:
    1) Browse to AIA_HOME/AIAMetaData.
    2) Place your file (it may be a new file or a modified existing file) at the desired location. The same hierarchy will appear in MDS, starting from AIAMetaData.
    3) Edit UpdateMetaDataDeploymentPlan.xml in aia_instances/AIA_INSTANCE_NAME/config.
    4) In the appropriate tag (see the "fileset"/"include" notes below), list the files which you have modified or added. You can add a file or a directory as required.
    5) Browse to AIA_HOME/Infrastructure/Install/scripts.
    6) Run ant -f UpdateMetaData.xml
    Here is the sample UpdateMetaData deployment plan: UpdateMetaDataDP.xml
    There are some things that you may want to take care of while updating MDS:
    1) The path in the "fileset" tag can be anywhere on disk.
    2) The path in the "include" tag has to be relative to what is given in the "fileset" tag.
    3) The path in MDS will be what you have given in the "include" tag; the path in this tag will automatically be appended to /apps/AIAMetaData/.

    Read the article

  • GlassFish 3.1 Released

    - by peter.nagy
    Since the blogosphere and most of the Java and open source community sites are full of this news, I won't go into the details. But if you need information, I definitely recommend Arun Gupta's blog as a starting point. In my opinion his blog is, on this topic, a better starting page than a Google search. The most important thing, though: clustering has arrived. Below are the components of the downloadable packages.

    Read the article

  • A Virtual Dilemma

    - by antony.reynolds
    Solving a Gotcha with VirtualBox Guest Additions. I was just building a new virtual machine based off an existing image that didn't have the VirtualBox Guest Additions enabled. The guest additions allow tight integration between the guest OS and the host environment, providing seamless mouse transfer and the ability to take advantage of the full video screen size. The guest additions need to be linked with the kernel, which requires the kernel-devel package to be installed. After installing this package and then trying to add the guest additions, it failed, suggesting that the kernel-devel package might not be installed, even though I had just installed it. After a little thought I finally realized what had happened. When I grabbed the kernel-devel package I hadn't checked the version of my kernel. The kernel-devel I downloaded didn't match the revision of the kernel I was running! Hence my problems. I upgraded the kernel to the same revision as my kernel-devel package and rebooted. I had installed dkms, so I was pleased to see that my VBox Additions successfully built and the mouse and screen now worked as expected. So now you know my embarrassing story for the day :-)

    Read the article

  • Reminder: ATG Live Webcast Feb. 24th: Using the R12 EBS Adapter

    - by Bill Sawyer
    Reminder: Our next ATG Live Webcast is happening on 24-Feb. The event is titled "E-Business Suite R12.x SOA Using the E-Business Suite Adapter". This live one-hour webcast will offer a review of the Service Oriented Architecture (SOA) capabilities within E-Business Suite R12, focusing on the E-Business Suite Adapter. While the webcast is primarily aimed at integrators and developers, understanding SOA capabilities is important for all E-Business Suite technologists and superusers.

    Read the article

  • EL 3.0 Public Review - JSR 341 and Java EE 7 Moving Along

    - by arungupta
    Following closely on the heels of the EL 3.0 Early Draft, the specification is now available for Public Review. The JCP 2 Process Document defines the different stages of a specification; this review period closes Jul 30, 2012. Some of the main goals of the JSR are to separate ELContext into parsing and evaluation contexts, to add operators such as equality and string concatenation, and to integrate with CDI. Section A.7 of the specification highlights the differences between the Early Draft and the Public Review. Download the Public Review and follow the updates at el-spec.java.net. For more information about EL 3.0 (JSR 341), check out the JSR project on java.net. The archives of the EG discussion are available at jsr341-experts, and you can subscribe to the users@el-spec and other aliases on the Mailing Lists page.
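    To make the goals concrete, here is a small sketch using the standalone ELProcessor and the string concatenation operator from the EL 3.0 API as it eventually shipped (at Public Review time the exact operator syntax was still subject to change):

    import javax.el.ELProcessor;

    public class El30Sketch {
        public static void main(String[] args) {
            // ELProcessor is the standalone entry point added in EL 3.0.
            ELProcessor elp = new ELProcessor();

            // String concatenation operator (+=), one of the additions discussed in the draft.
            Object greeting = elp.eval("'Hello, ' += 'EL 3.0'");
            System.out.println(greeting); // Hello, EL 3.0

            // Expressions can also work against beans defined on the processor.
            elp.defineBean("answer", 42);
            System.out.println(elp.eval("answer * 2")); // 84
        }
    }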

    Read the article

  • Is Cloud Security Holding Back Social SaaS?

    - by Mike Stiles
    The true promise of social data co-mingling with enterprise data to influence and inform social marketing (all marketing really) lives in cloud computing. The cloud brings processing power, services, speed and cost savings the likes of which few organizations could ever put into action on their own. So why wouldn't anyone jump into SaaS (Software as a Service) with both feet? Cloud security.
    Being concerned about security is proper and healthy. That just means you're a responsible operator. Whether it's protecting your customers' data or trying to stay off the radar of regulatory agencies, you have plenty of reasons to make sure you're as protected from hacking, theft and loss as you can possibly be.
    But you also have plenty of reasons to not let security concerns freeze you in your tracks, preventing you from innovating, moving the socially-enabled enterprise forward, and keeping up with competitors who may not be as skittish regarding SaaS technology adoption. Over half of organizations are transferring sensitive or confidential data to the cloud, an increase of 10% over last year.
    With the roles and responsibilities of CMO's, CIO's and other C's changing, the first thing you should probably determine is who should take point on analyzing cloud software options, providers, and policies. An oft-quoted Ponemon Institute study found 36% of businesses don't have a cloud security policy at all. So that's as good a place to start as any.
    What applications and data are you comfortable housing in the cloud? Do you have a classification system for data that clearly spells out where data types can go and how they can be used? Who, both internally and at the cloud provider, will function as admins? What are the different levels of admin clearance? Will your security policies and procedures sync up with those of your cloud provider?
    The key is verifiable trust. Trust in cloud security is actually going up. 1/3 of organizations polled say it's the cloud provider who should be responsible for data protection. And when you look specifically at SaaS providers, that expectation goes up to 60%. 57% "strongly agree" or "agree" there's more confidence in cloud providers' ability to protect data. In fact, some businesses bypass the "verifiable" part of verifiable trust: just over half have no idea what their cloud provider does to protect data.
    And yet, according to the "Private Cloud Vision vs. Reality" InformationWeek Report, 82% of organizations say security/data privacy is one of the main reasons they're still holding the public cloud at arm's length. That's going to be a tough position to maintain, because just as social is rapidly changing the face of marketing, big data is rapidly changing the face of enterprise IT. Netflix, who's particularly big on the benefits of the cloud, says, "We're systematically disassembling the corporate IT components."
    An enterprise can never realize the full power of big data, nor get the full potential value out of it, if it's unwilling to enable the integrations and dataset connections necessary in the cloud. Because integration is called for to reduce fragmentation, a standardized platform makes a lot of sense. With multiple components crafted to work together, you're maximizing scalability, optimization, cost effectiveness, and yes, security and identity management benefits. You can see how the incentive is there for cloud companies to develop and add ever-improving security features, making cloud computing an eventual far safer bet than traditional IT.
@mikestiles
Photo: stock.xchng

    Read the article
