Search Results

Search found 46178 results on 1848 pages for 'java home'.


  • Java heap space

    - by java_mouse
    In Java/JVM, why is the memory area where Java creates objects called the "heap"? Does it use the heap data structure to create/remove/maintain the objects? As I read in the documentation of the heap data structure, the algorithm compares an object with the existing nodes and places it in such a way that the parent is "greater" than its children (or "lesser" in the case of a min-heap). So in the JVM, how are objects compared against each other before being placed in the heap?
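
    A short sketch (my addition, not part of the question) may make the distinction concrete: the JVM heap is simply the pool of memory that new allocates from, with no ordering at all, whereas the heap data structure shows up in library classes such as java.util.PriorityQueue:

        import java.util.PriorityQueue;

        public class HeapVsHeap {
            public static void main(String[] args) {
                // Every "new" just allocates from the JVM heap; nothing is compared or ordered.
                String a = new String("banana");
                String b = new String("apple");

                // The heap data structure (a binary min-heap here) does compare its elements.
                PriorityQueue<String> minHeap = new PriorityQueue<>();
                minHeap.add(a);
                minHeap.add(b);
                System.out.println(minHeap.peek()); // prints "apple", the smallest element
            }
        }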


  • Java GNOME bindings, are those a good idea?

    - by Phobia
    What do you think of Java's GNOME bindings? I was surprised to learn that the latest version of the bindings was released this month and that they're backed by a company that uses them, which means there's a considerable amount of activity in the project and it's not going to be ditched anytime soon. Is this going to be a second chance for Java on the desktop, since GTK+ is cross-platform like Swing but less bloated and more responsive? Should I be learning how to develop applications with it, or is it not worth the time?


  • Instantiating objects in Java

    - by Davis Naglis
    I'm learning Java from scratch, and now that I've started to learn about instantiating objects I don't understand in which cases I need to instantiate them. For example, I'm studying a TutsPlus course about it and there is an example with a "Rectangle" class. The instructor says that it needs to be instantiated. So I started to wonder: when do I need to instantiate objects when writing Java code?
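
    As a minimal illustration (the Rectangle below is a hypothetical stand-in for the course's class, not its actual code): a class is only a blueprint, and you instantiate it with new whenever you need a concrete object, with its own state, to call methods on.

        public class Rectangle {
            private final int width;
            private final int height;

            public Rectangle(int width, int height) {
                this.width = width;
                this.height = height;
            }

            public int area() {
                return width * height;
            }

            public static void main(String[] args) {
                // Instantiate: create an actual Rectangle object from the class blueprint.
                Rectangle r = new Rectangle(3, 4);
                System.out.println(r.area()); // 12
            }
        }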


  • Why can't Java/C# implement RAII?

    - by mike30
    Question: Why can't Java/C# implement RAII?

    Clarification: I am aware the garbage collector is not deterministic, so with the current language features it is not possible for an object's Dispose() method to be called automatically on scope exit. But could such a deterministic feature be added?

    My understanding: I feel an implementation of RAII must satisfy two requirements:
    1. The lifetime of a resource must be bound to a scope.
    2. Implicit. The freeing of the resource must happen without an explicit statement by the programmer, analogous to a garbage collector freeing memory without an explicit statement. The "implicitness" only needs to occur at the point of use of the class. The class library creator must of course explicitly implement a destructor or Dispose() method.

    Java/C# satisfy point 1. In C# a resource implementing IDisposable can be bound to a "using" scope:

        void test()
        {
            using (Resource r = new Resource())
            {
                r.foo();
            } // resource released on scope exit
        }

    This does not satisfy point 2. The programmer must explicitly tie the object to a special "using" scope. Programmers can (and do) forget to explicitly tie the resource to a scope, creating a leak. In fact the "using" blocks are converted to try-finally-dispose() code by the compiler. It has the same explicit nature as the try-finally-dispose() pattern. Without an implicit release, the hook to a scope is syntactic sugar.

        void test()
        {
            // Programmer forgot (or was not aware of the need) to explicitly
            // bind Resource to a scope.
            Resource r = new Resource();
            r.foo();
        } // resource leaked!!!

    I think it is worth creating a language feature in Java/C# allowing special objects that are hooked to the stack via a smart-pointer. The feature would allow you to flag a class as scope-bound, so that it is always created with a hook to the stack. There could be options for different types of smart pointers.

        class Resource - ScopeBound
        {
            /* class details */
            void Dispose()
            {
                // free resource
            }
        }

        void test()
        {
            // class Resource was flagged as ScopeBound so the tie to the stack is implicit.
            Resource r = new Resource(); // r is a smart-pointer
            r.foo();
        } // resource released on scope exit

    I think implicitness is "worth it", just as the implicitness of garbage collection is "worth it". Explicit using blocks are refreshing on the eyes, but offer no semantic advantage over try-finally-dispose(). Is it impractical to implement such a feature in the Java/C# languages? Could it be introduced without breaking old code?
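
    For comparison (my addition, not part of the question): Java's closest equivalent to the C# using block is try-with-resources, and it shares exactly the explicitness the question objects to, since the programmer must still remember to put the resource in the try header. A minimal sketch (the file name is just an example):

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;

        public class TryWithResources {
            public static void main(String[] args) throws IOException {
                // The resource must be declared in the try header to be closed automatically;
                // a plain "BufferedReader r = ..." outside the try would leak just like the C# example.
                try (BufferedReader r = new BufferedReader(new FileReader("data.txt"))) {
                    System.out.println(r.readLine());
                } // r.close() is called here, on scope exit
            }
        }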


  • Java game applet development

    - by RomZes
    I'm getting a 4-second delay when sending objects over UDP. I'm working on a small game and trying to implement multiplayer; for now I'm just trying to synchronize the movements of 2 balls on the screen. StartingPoint.java is my server (first player), which receives the serialized objects (coordinates). SecondPlayer.java is the client that sends serialized objects to the server. When I move my first object it appears 4 seconds later on the other screen.

    StartingPoint.java:

        @Override
        public void run() {
            byte[] receiveData = new byte[256];
            byte[] sendData = new byte[256];
            // DatagramSocket socketS;
            try {
                socket = new DatagramSocket(5000);
                System.out.println("Socket created on " + port + " port");
            } catch (SocketException e1) {
                // TODO Auto-generated catch block
                e1.printStackTrace();
            }
            while (true) {
                b1.update(this);
                b3.update();
                System.out.println("Starting server...");
                //// Receiving and deserializing object
                try {
                    // socket.setSoTimeout(1000);
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    socket.receive(packet);
                    byte[] data = packet.getData();
                    ByteArrayInputStream in = new ByteArrayInputStream(data);
                    ObjectInputStream is = new ObjectInputStream(in);
                    // socket.setSoTimeout(300);
                    b1 = (Ball) is.readObject();
                } catch (IOException | ClassNotFoundException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                }
                repaint();
                try {
                    Thread.sleep(17);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }

    SecondPlayer.java:

        @Override
        public void run() {
            while (true) {
                b.update();
                networkSend();
                repaint();
                try {
                    Thread.sleep(17);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }

        public void networkSend() {
            // Serialize to a byte array
            try {
                ByteArrayOutputStream bStream = new ByteArrayOutputStream();
                ObjectOutputStream oo;
                oo = new ObjectOutputStream(bStream);
                oo.writeObject(b);
                oo.flush();
                oo.close();
                byte[] bufCar = bStream.toByteArray();
                // socket = new DatagramSocket();
                // socket.setSoTimeout(1000);
                InetAddress address = InetAddress.getByName("localhost");
                DatagramPacket packet = new DatagramPacket(bufCar, bufCar.length, address, port);
                socket.send(packet);
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }


  • Suggestions on the best home server rack cabinet

    - by allentown
    I have a lot of gear in a colocation facility right now, and some of it is going to come home with me. I do not know anything about the "rack mount" side of the industry; I lease a rack and I put my stuff in it. I have a few 1U boxes, a few 2U boxes, a few 4U boxes, and a 1U switch. One is a new Xserve, which means it is deep. I think I can get by with around 12U to 18U. I want to keep it as small as possible, since I do not have a lot of spare space at my home. I will not be able to bolt it to the wall, floor, etc., so it should not be tall. Ideally this would more or less just be a box that sits on the floor but gives me the ability to mount gear nicely, do nice cable management, etc. Are the "post" style racks junk? I like the open space and the lack of limitations on depth of something like this: http://www.rackmountsolutions.net/images/products/Martin-relay-rack.jpg However, that thing is way too tall, and probably way too expensive. I am looking to spend around $300.00 or less; more if I have to, though I would prefer not to. These look near perfect: (see comment for this link, the system will not let me post a second URL) but I am worried the Xserve will not fit in it. If anyone has any good links, website recommendations, or good past experience, I would appreciate it. I am also considering that I may be able to build something out of random scraps of stuff from Home Depot.


  • Setting Up My Home Network

    - by Skizz
    I currently have five PCs at home, three running WinXP and two running Ubuntu. They are set up like this:

        ISP ----- Modem ---- Switch ---- Ubuntu1 -- B&W Printer
                               |
                               |--WinXP1
                               |
                               |--WinXP2
                             Wireless
                               |--Colour Printer
                               |
                               |---------Ubuntu2
                               |---------WinXP3 (laptop)

    The Ubuntu1 machine is set up as a PDC using Samba and runs fetchmail, procmail and dovecot to get my e-mail and let me access it via IMAP, so I can read the e-mail on any PC. I'd like to set up the network like this:

        ISP ----- Modem ---- Ubuntu1 ---- Switch ------WinXP1
                                |            |
                                |            |--WinXP2
                           B&W Printer    Wireless
                                             |--Colour Printer
                                             |
                                             |---------Ubuntu2
                                             |---------WinXP3 (laptop)

    My questions are:
    1. How to configure Ubuntu1 to act as a firewall.
    2. How to configure Ubuntu1 to provide consistent user authentication across the network. At the moment Samba provides roaming profiles for the XP machines, but the Ubuntu2 machine has its own user lists. I'd like to have a single authentication for both the XP machines and the Linux machines, so that users added to the server list propagate to all PCs (i.e. new users can log on using any PC without modifying any of the client PCs).
    3. How to configure a Linux client (Ubuntu2 above) to access files on the server (Ubuntu1), some of which are in user-specific folders, effectively sharing /home/{user} per user (read and write access) and stuff like /home/media/photos with read access for everyone and limited write access.
    4. How to configure the XP machines (if it is different from the Samba method).
    5. How to set up e-mail filtering. I'd like to have a whitelist/blacklist system for incoming e-mails for some of the e-mail accounts (mainly my kids' accounts), with filtered e-mails being put into quarantine until a sysadmin either adds the sender to a blacklist or a whitelist.

    OK, that's a lot of stuff. For now, I don't want config files*; rather, I want to know what services/applications to use and how they interact. For example, LDAP could be used for authentication, but what else would be useful to make the administration of the LDAP easier? Once I have a general idea of the overall configuration, I can ask other questions about the specifics. Skizz

    * I have looked around for information, but most answers are usually in the form of abstract config files and lists of packages to install.


  • Copying windows home server backup offsite

    - by Simon
    What ways are there to copy a Windows Home Server backup to an offsite location? I'm talking specifically (and only) about the automated backup of my entire machine, and not the shared network folders. I am working away from home 90% of the time on my laptop, which has a 640GB drive, so the shared folders are essentially useless to me. I back up every night, but if my house burns down or is broken into then I'm in serious, serious trouble! I'm really looking for some alternative way to back up my entire machine, which must not interfere with the reliability or speed with which my WHS backs up my laptop every night. Either a way to 'export' a complete machine backup from the server, or recommendations for non-conflicting software I can use to back up to a 1TB drive at work, are what I'm looking for. Note: I believe that WHS uses its own completely proprietary backup and doesn't use things like a 'backup bit' or 'archive bit'. I just don't want to install some other backup software that will conflict. PS: I'm now running Windows 7 and just realized that I should probably check out the backup functionality it gives me. I assume that won't conflict, right? Edit: Thanks for the hosted solutions. I'd also appreciate ways to back up to an 'offsite' location that I control, like my office vs. my home. I think the hosted solutions will be too slow or expensive for my needs.


  • creating a Menu from SQLite values in Java

    - by shanahobo86
    I am trying to create a ListMenu using data from an SQLite database to define the name of each MenuItem. So in a class called menu.java I have defined the array String classes[] = {}; which should hold each menu item name. In a DBAdapter class I created a function so the user can insert info into a table (this all works fine, by the way):

        public long insertContact(String name, String code, String location, String comments,
                int days, int start, int end, String type) {
            ContentValues initialValues = new ContentValues();
            initialValues.put(KEY_NAME, name);
            initialValues.put(KEY_CODE, code);
            initialValues.put(KEY_LOCATION, location);
            initialValues.put(KEY_COMMENTS, comments);
            initialValues.put(KEY_DAYS, days);
            initialValues.put(KEY_START, start);
            initialValues.put(KEY_END, end);
            initialValues.put(KEY_TYPE, type);
            return db.insert(DATABASE_TABLE, null, initialValues);
        }

    It is the Strings inserted into KEY_NAME that I need to populate that String array with. Does anyone know if this is possible? Thanks so much for the help, guys.

    If I implement the function suggested by Sam/Mango, the program crashes. Am I using it incorrectly, or is the error due to the unknown size of the array?

        DBAdapter db = new DBAdapter(this);
        String classes[] = db.getClasses();

    Edit: I should mention that if I manually define the array:

        String classes[] = {"test1", "test2", "test3", etc};

    it works fine. The error is a NullPointerException. Here's the logcat (sorry about the formatting). I hadn't initialized with db = helper.getReadableDatabase(); in the getClasses() function, but unfortunately fixing that didn't solve the problem.

    11-11 22:53:39.117: D/dalvikvm(17856): Late-enabling CheckJNI 11-11 22:53:39.297: D/TextLayoutCache(17856): Using debug level: 0 - Debug Enabled: 0 11-11 22:53:39.337: D/libEGL(17856): loaded /system/lib/egl/libGLES_android.so 11-11 22:53:39.337: D/libEGL(17856): loaded /system/lib/egl/libEGL_adreno200.so 11-11 22:53:39.357: D/libEGL(17856): loaded /system/lib/egl/libGLESv1_CM_adreno200.so 11-11 22:53:39.357: D/libEGL(17856): loaded /system/lib/egl/libGLESv2_adreno200.so 11-11 22:53:39.387: I/Adreno200-EGLSUB(17856): <ConfigWindowMatch:2078>: Format RGBA_8888. 11-11 22:53:39.407: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x5c66d000 size:36593664 offset:32825344 fd:65 11-11 22:53:39.417: E/(17856): Can't open file for reading 11-11 22:53:39.417: E/(17856): Can't open file for reading 11-11 22:53:39.417: D/OpenGLRenderer(17856): Enabling debug mode 0 11-11 22:53:39.477: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x5ecd3000 size:40361984 offset:36593664 fd:68 11-11 22:53:40.507: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x61451000 size:7254016 offset:3485696 fd:71 11-11 22:53:41.077: I/Adreno200-EGLSUB(17856): <ConfigWindowMatch:2078>: Format RGBA_8888.
11-11 22:53:41.077: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x61c4c000 size:7725056 offset:7254016 fd:74 11-11 22:53:41.097: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x623aa000 size:8196096 offset:7725056 fd:80 11-11 22:53:41.937: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x62b7b000 size:8667136 offset:8196096 fd:83 11-11 22:53:41.977: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x61c4c000 size:7725056 offset:7254016 11-11 22:53:41.977: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x623aa000 size:8196096 offset:7725056 11-11 22:53:41.977: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x62b7b000 size:8667136 offset:8196096 11-11 22:53:42.167: I/Adreno200-EGLSUB(17856): <ConfigWindowMatch:2078>: Format RGBA_8888. 11-11 22:53:42.177: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x61c5d000 size:17084416 offset:13316096 fd:74 11-11 22:53:42.317: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x63853000 size:20852736 offset:17084416 fd:80 11-11 22:53:42.357: D/OpenGLRenderer(17856): Flushing caches (mode 0) 11-11 22:53:42.357: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x5c66d000 size:36593664 offset:32825344 11-11 22:53:42.357: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x5ecd3000 size:40361984 offset:36593664 11-11 22:53:42.367: D/memalloc(17856): /dev/pmem: Unmapping buffer base:0x61451000 size:7254016 offset:3485696 11-11 22:53:42.757: D/memalloc(17856): /dev/pmem: Mapped buffer base:0x5c56d000 size:24621056 offset:20852736 fd:65 11-11 22:53:44.247: D/AndroidRuntime(17856): Shutting down VM 11-11 22:53:44.247: W/dalvikvm(17856): threadid=1: thread exiting with uncaught exception (group=0x40ac3210) 11-11 22:53:44.257: E/AndroidRuntime(17856): FATAL EXCEPTION: main 11-11 22:53:44.257: E/AndroidRuntime(17856): java.lang.RuntimeException: Unable to instantiate activity ComponentInfo{niall.shannon.timetable/niall.shannon.timetable.menu}: java.lang.NullPointerException 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:1891) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:1992) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.access$600(ActivityThread.java:127) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1158) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.os.Handler.dispatchMessage(Handler.java:99) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.os.Looper.loop(Looper.java:137) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.main(ActivityThread.java:4441) 11-11 22:53:44.257: E/AndroidRuntime(17856): at java.lang.reflect.Method.invokeNative(Native Method) 11-11 22:53:44.257: E/AndroidRuntime(17856): at java.lang.reflect.Method.invoke(Method.java:511) 11-11 22:53:44.257: E/AndroidRuntime(17856): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:823) 11-11 22:53:44.257: E/AndroidRuntime(17856): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:590) 11-11 22:53:44.257: E/AndroidRuntime(17856): at dalvik.system.NativeStart.main(Native Method) 11-11 22:53:44.257: E/AndroidRuntime(17856): Caused by: java.lang.NullPointerException 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.content.ContextWrapper.openOrCreateDatabase(ContextWrapper.java:221) 11-11 22:53:44.257: E/AndroidRuntime(17856): at 
android.database.sqlite.SQLiteOpenHelper.getWritableDatabase(SQLiteOpenHelper.java:157) 11-11 22:53:44.257: E/AndroidRuntime(17856): at niall.shannon.timetable.DBAdapter.getClasses(DBAdapter.java:151) 11-11 22:53:44.257: E/AndroidRuntime(17856): at niall.shannon.timetable.menu.<init>(menu.java:15) 11-11 22:53:44.257: E/AndroidRuntime(17856): at java.lang.Class.newInstanceImpl(Native Method) 11-11 22:53:44.257: E/AndroidRuntime(17856): at java.lang.Class.newInstance(Class.java:1319) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.Instrumentation.newActivity(Instrumentation.java:1023) 11-11 22:53:44.257: E/AndroidRuntime(17856): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:1882) 11-11 22:53:44.257: E/AndroidRuntime(17856): ... 11 more 11-11 22:53:46.527: I/Process(17856): Sending signal. PID: 17856 SIG: 9
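
    For reference, a minimal sketch of what a getClasses() helper could look like (hypothetical code; it assumes db is an already-opened SQLiteDatabase and that DATABASE_TABLE and KEY_NAME are the constants used above). Note that the logcat shows the NullPointerException coming from ContextWrapper.openOrCreateDatabase() called via getClasses() inside menu's constructor, which typically means the database is being opened from a field initializer or constructor before the Activity's Context is ready.

        // Hypothetical helper in DBAdapter; assumes the database has already been opened,
        // e.g. db = helper.getReadableDatabase() has succeeded.
        public String[] getClasses() {
            Cursor cursor = db.query(DATABASE_TABLE, new String[] { KEY_NAME },
                    null, null, null, null, null);
            String[] names = new String[cursor.getCount()];
            int i = 0;
            while (cursor.moveToNext()) {
                names[i++] = cursor.getString(0); // column 0 holds KEY_NAME
            }
            cursor.close();
            return names;
        }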


  • Java ME scorecard with vector and multiple input fields/screens

    - by proximity
    I have made a scorecard with 5 holes; for each there is an input field (shots), and an image is shown. The input should be saved into a Vector and shown on each hole, e.g. on hole 2: enter shots, and underneath it "total shots: 4" (if you made 4 shots on hole 1). At the end I would need a sum-up of all shots, e.g.:

        Hole 1: 4
        Hole 2: 3
        Hole 3: 2
        ...
        Total: 17

    Could someone please help me with this task?

        {
            f = new Form("Scorecard");
            d = Display.getDisplay(this);
            mTextField = new TextField("Shots:", "", 2, TextField.NUMERIC);
            f.append(mTextField);
            mStatus = new StringItem("Hole 1:", "Par 3, 480m");
            f.append(mStatus);
            try {
                Image j = Image.createImage("/hole1.png");
                ImageItem ii = new ImageItem("", j, 3, "Hole 1");
                f.append(ii);
            } catch (java.io.IOException ioe) {
            } catch (Exception e) {
            }
            f.addCommand(mBackCommand);
            f.addCommand(mNextCommand);
            f.addCommand(mExitCommand);
            f.setCommandListener(this);
            Display.getDisplay(this).setCurrent(f);
        }

        public void startApp() {
            mBackCommand = new Command("Back", Command.BACK, 0);
            mNextCommand = new Command("Next", Command.OK, 1);
            mExitCommand = new Command("Exit", Command.EXIT, 2);
        }

        public void pauseApp() {
        }

        public void destroyApp(boolean unconditional) {
        }

        public void commandAction(Command c, Displayable d) {
            if (c == mExitCommand) {
                destroyApp(true);
                notifyDestroyed();
            } else if (c == mNextCommand) {
                // -> go to next hole input! save the mTextField input into a vector.
            }
        }
        }

    Full code:

        import java.util.*;
        import javax.microedition.midlet.*;
        import javax.microedition.lcdui.*;

        public class ScorerMIDlet extends MIDlet implements CommandListener {
            private Command mExitCommand, mBackCommand, mNextCommand;
            private Display d;
            private Form f;
            private TextField mTextField;
            private Alert a;
            private StringItem mHole1;
            private int b;

            // repeat holeForm for all five holes and add the input into a vector or array.
            // Display the values in the end after asking for today's date and put today's date
            // at the top of the list. Make it possible to go back in the form, e.g. hole 3 - hole 2 - hole 1
            public void holeForm(int b) {
                f = new Form("Scorecard");
                d = Display.getDisplay(this);
                mTextField = new TextField("Shots:", "", 2, TextField.NUMERIC);
                f.append(mTextField);
                mHole1 = new StringItem("Hole 1:", "Par 5, 480m");
                f.append(mHole1);
                try {
                    Image j = Image.createImage("/hole1.png");
                    ImageItem ii = new ImageItem("", j, 3, "Hole 1");
                    f.append(ii);
                } catch (java.io.IOException ioe) {
                } catch (Exception e) {
                }
                // Set date&time in the end
                long now = System.currentTimeMillis();
                DateField df = new DateField("Playing date:", DateField.DATE_TIME);
                df.setDate(new Date(now));
                f.append(df);
                f.addCommand(mBackCommand);
                f.addCommand(mNextCommand);
                f.addCommand(mExitCommand);
                f.setCommandListener(this);
                Display.getDisplay(this).setCurrent(f);
            }

            public void startApp() {
                mBackCommand = new Command("Back", Command.BACK, 0);
                mNextCommand = new Command("OK-Next", Command.OK, 1);
                mExitCommand = new Command("Exit", Command.EXIT, 2);
                b = 0;
                holeForm(b);
            }

            public void pauseApp() {}

            public void destroyApp(boolean unconditional) {}

            public void commandAction(Command c, Displayable d) {
                if (c == mExitCommand) {
                    destroyApp(true);
                    notifyDestroyed();
                } else if (c == mNextCommand) {
                    holeForm(b);
                }
            }
        }
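
    A rough sketch of the missing piece (my own hypothetical code, not part of the question): store each hole's shots in a Vector when the Next command fires, and sum the Vector for the total.

        // Hypothetical additions to ScorerMIDlet; assumes mTextField always holds a number.
        private Vector shots = new Vector();

        private void saveCurrentHole() {
            int s = Integer.parseInt(mTextField.getString());
            shots.addElement(new Integer(s)); // one entry per hole, in order
        }

        private int totalShots() {
            int total = 0;
            for (int i = 0; i < shots.size(); i++) {
                total += ((Integer) shots.elementAt(i)).intValue();
            }
            return total;
        }

    saveCurrentHole() would be called in commandAction() when mNextCommand is pressed, before holeForm() builds the next screen, and a StringItem such as "Total shots: " + totalShots() could then be appended to each form.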


  • jenkins-maven-android when running throwing the error "android-sdk-linux/platforms" is not a directory"

    - by Sam
    I start setting up the jenkins-maven-android and i'm facing an issue when running the jenkin job. My Machine Details $uname -a Linux development2 3.0.0-12-virtual #20-Ubuntu SMP Fri Oct 7 18:19:02 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux Steps to install the Android SDK in Ubuntu https://help.ubuntu.com/community/AndroidSDK since i'm working on headless env (ssh to client machine) i used following command to install the platform tools android update sdk --no-ui download apache maven and install on http://maven.apache.org/download.html mvn -version output root@development2:/opt/android-sdk-linux/tools# mvn -version Apache Maven 3.0.4 (r1232337; 2012-01-17 08:44:56+0000) Maven home: /opt/apache-maven-3.0.4 Java version: 1.6.0_24, vendor: Sun Microsystems Inc. Java home: /usr/lib/jvm/java-6-openjdk/jre Default locale: en_US, platform encoding: UTF-8 OS name: "linux", version: "3.0.0-12-virtual", arch: "amd64", family: "unix" root@development2:/opt/android-sdk-linux/tools# ran the following two command as mention in below sudo apt-get update sudo apt-get install ia32-libs Problems with Eclipse and Android SDK http://developer.android.com/sdk/installing/index.html As error suggest i gave the path to android SDK in jenkins build config still im getting the error clean install -Dandroid.sdk.path=/opt/android-sdk-linux Can someone help me to resolve this. Thanks Error I'm Getting Waiting for Jenkins to finish collecting data mavenExecutionResult exceptions not empty message : Failed to execute goal com.jayway.maven.plugins.android.generation2:android-maven-plugin:3.1.1:generate-sources (default-generate-sources) on project base-template: Execution default-generate-sources of goal com.jayway.maven.plugins.android.generation2:android-maven-plugin:3.1.1:generate-sources failed: Path "/opt/android-sdk-linux/platforms" is not a directory. Please provide a proper Android SDK directory path as configuration parameter <sdk><path>...</path></sdk> in the plugin <configuration/>. As an alternative, you may add the parameter to commandline: -Dandroid.sdk.path=... or set environment variable ANDROID_HOME. cause : Execution default-generate-sources of goal com.jayway.maven.plugins.android.generation2:android-maven-plugin:3.1.1:generate-sources failed: Path "/opt/android-sdk-linux/platforms" is not a directory. Please provide a proper Android SDK directory path as configuration parameter <sdk><path>...</path></sdk> in the plugin <configuration/>. As an alternative, you may add the parameter to commandline: -Dandroid.sdk.path=... or set environment variable ANDROID_HOME. Stack trace : org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal com.jayway.maven.plugins.android.generation2:android-maven-plugin:3.1.1:generate-sources (default-generate-sources) on project base-template: Execution default-generate-sources of goal com.jayway.maven.plugins.android.generation2:android-maven-plugin:3.1.1:generate-sources failed: Path "/opt/android-sdk-linux/platforms" is not a directory. Please provide a proper Android SDK directory path as configuration parameter <sdk><path>...</path></sdk> in the plugin <configuration/>. As an alternative, you may add the parameter to commandline: -Dandroid.sdk.path=... or set environment variable ANDROID_HOME. 
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:225) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84) at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59) at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183) at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161) at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156) at org.jvnet.hudson.maven3.launcher.Maven3Launcher.main(Maven3Launcher.java:79) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:616) at org.codehaus.plexus.classworlds.launcher.Launcher.launchStandard(Launcher.java:329) at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:239) at org.jvnet.hudson.maven3.agent.Maven3Main.launch(Maven3Main.java:158) at hudson.maven.Maven3Builder.call(Maven3Builder.java:98) at hudson.maven.Maven3Builder.call(Maven3Builder.java:64) at hudson.remoting.UserRequest.perform(UserRequest.java:118) at hudson.remoting.UserRequest.perform(UserRequest.java:48) at hudson.remoting.Request$2.run(Request.java:326) at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72) at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334) at java.util.concurrent.FutureTask.run(FutureTask.java:166) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) at java.lang.Thread.run(Thread.java:679) Caused by: org.apache.maven.plugin.PluginExecutionException: Execution default-generate-sources of goal com.jayway.maven.plugins.android.generation2:android-maven-plugin:3.1.1:generate-sources failed: Path "/opt/android-sdk-linux/platforms" is not a directory. Please provide a proper Android SDK directory path as configuration parameter <sdk><path>...</path></sdk> in the plugin <configuration/>. As an alternative, you may add the parameter to commandline: -Dandroid.sdk.path=... or set environment variable ANDROID_HOME. at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:110) at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209) ... 27 more Caused by: com.jayway.maven.plugins.android.InvalidSdkException: Path "/opt/android-sdk-linux/platforms" is not a directory. Please provide a proper Android SDK directory path as configuration parameter <sdk><path>...</path></sdk> in the plugin <configuration/>. As an alternative, you may add the parameter to commandline: -Dandroid.sdk.path=... or set environment variable ANDROID_HOME. 
at com.jayway.maven.plugins.android.AndroidSdk.assertPathIsDirectory(AndroidSdk.java:125) at com.jayway.maven.plugins.android.AndroidSdk.getPlatformDirectories(AndroidSdk.java:285) at com.jayway.maven.plugins.android.AndroidSdk.findAvailablePlatforms(AndroidSdk.java:260) at com.jayway.maven.plugins.android.AndroidSdk.<init>(AndroidSdk.java:80) at com.jayway.maven.plugins.android.AbstractAndroidMojo.getAndroidSdk(AbstractAndroidMojo.java:844) at com.jayway.maven.plugins.android.phase01generatesources.GenerateSourcesMojo.generateR(GenerateSourcesMojo.java:329) at com.jayway.maven.plugins.android.phase01generatesources.GenerateSourcesMojo.execute(GenerateSourcesMojo.java:102) at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101) ... 28 more channel stopped Finished: FAILURE* android home Echo root@development2:~# echo $ANDROID_HOME /opt/android-sdk-linux
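
    For context, the parameter the error message asks for can be set on the plugin named in the stack trace. A sketch of the pom.xml configuration (an illustration, not a verified fix for this particular setup):

        <plugin>
          <groupId>com.jayway.maven.plugins.android.generation2</groupId>
          <artifactId>android-maven-plugin</artifactId>
          <version>3.1.1</version>
          <configuration>
            <sdk>
              <!-- must point at an SDK that actually contains a platforms/ directory -->
              <path>/opt/android-sdk-linux</path>
            </sdk>
          </configuration>
        </plugin>

    Since the error complains that /opt/android-sdk-linux/platforms is not a directory, it may also be worth confirming that the platform packages really were downloaded into that location and that the Jenkins user can read it.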


  • Hibernate unable to instantiate default tuplizer - cannot find getter

    - by ZeldaPinwheel
    I'm trying to use Hibernate to persist a class that looks like this: public class Item implements Serializable, Comparable<Item> { // Item id private Integer id; // Description of item in inventory private String description; // Number of items described by this inventory item private int count; //Category item belongs to private String category; // Date item was purchased private GregorianCalendar purchaseDate; public Item() { } public Integer getId() { return id; } public void setId(Integer id) { this.id = id; } public String getDescription() { return description; } public void setDescription(String description) { this.description = description; } public int getCount() { return count; } public void setCount(int count) { this.count = count; } public String getCategory() { return category; } public void setCategory(String category) { this.category = category; } public GregorianCalendar getPurchaseDate() { return purchaseDate; } public void setPurchasedate(GregorianCalendar purchaseDate) { this.purchaseDate = purchaseDate; } My Hibernate mapping file contains the following: <property name="puchaseDate" type="java.util.GregorianCalendar"> <column name="purchase_date"></column> </property> When I try to run, I get error messages indicating there is no getter function for the purchaseDate attribute: 577 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - Using Hibernate built-in connection pool (not for production use!) 577 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - Hibernate connection pool size: 20 577 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - autocommit mode: false 592 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - using driver: com.mysql.jdbc.Driver at URL: jdbc:mysql://localhost:3306/home_inventory 592 [main] INFO org.hibernate.connection.DriverManagerConnectionProvider - connection properties: {user=root, password=****} 1078 [main] INFO org.hibernate.cfg.SettingsFactory - RDBMS: MySQL, version: 5.1.45 1078 [main] INFO org.hibernate.cfg.SettingsFactory - JDBC driver: MySQL-AB JDBC Driver, version: mysql-connector-java-5.1.12 ( Revision: ${bzr.revision-id} ) 1103 [main] INFO org.hibernate.dialect.Dialect - Using dialect: org.hibernate.dialect.MySQLDialect 1107 [main] INFO org.hibernate.engine.jdbc.JdbcSupportLoader - Disabling contextual LOB creation as JDBC driver reported JDBC version [3] less than 4 1109 [main] INFO org.hibernate.transaction.TransactionFactoryFactory - Using default transaction strategy (direct JDBC transactions) 1110 [main] INFO org.hibernate.transaction.TransactionManagerLookupFactory - No TransactionManagerLookup configured (in JTA environment, use of read-write or transactional second-level cache is not recommended) 1110 [main] INFO org.hibernate.cfg.SettingsFactory - Automatic flush during beforeCompletion(): disabled 1110 [main] INFO org.hibernate.cfg.SettingsFactory - Automatic session close at end of transaction: disabled 1110 [main] INFO org.hibernate.cfg.SettingsFactory - JDBC batch size: 15 1110 [main] INFO org.hibernate.cfg.SettingsFactory - JDBC batch updates for versioned data: disabled 1111 [main] INFO org.hibernate.cfg.SettingsFactory - Scrollable result sets: enabled 1111 [main] INFO org.hibernate.cfg.SettingsFactory - JDBC3 getGeneratedKeys(): enabled 1111 [main] INFO org.hibernate.cfg.SettingsFactory - Connection release mode: auto 1111 [main] INFO org.hibernate.cfg.SettingsFactory - Maximum outer join fetch depth: 2 1111 [main] INFO 
org.hibernate.cfg.SettingsFactory - Default batch fetch size: 1 1111 [main] INFO org.hibernate.cfg.SettingsFactory - Generate SQL with comments: disabled 1111 [main] INFO org.hibernate.cfg.SettingsFactory - Order SQL updates by primary key: disabled 1111 [main] INFO org.hibernate.cfg.SettingsFactory - Order SQL inserts for batching: disabled 1112 [main] INFO org.hibernate.cfg.SettingsFactory - Query translator: org.hibernate.hql.ast.ASTQueryTranslatorFactory 1113 [main] INFO org.hibernate.hql.ast.ASTQueryTranslatorFactory - Using ASTQueryTranslatorFactory 1113 [main] INFO org.hibernate.cfg.SettingsFactory - Query language substitutions: {} 1113 [main] INFO org.hibernate.cfg.SettingsFactory - JPA-QL strict compliance: disabled 1113 [main] INFO org.hibernate.cfg.SettingsFactory - Second-level cache: enabled 1113 [main] INFO org.hibernate.cfg.SettingsFactory - Query cache: disabled 1113 [main] INFO org.hibernate.cfg.SettingsFactory - Cache region factory : org.hibernate.cache.impl.NoCachingRegionFactory 1113 [main] INFO org.hibernate.cfg.SettingsFactory - Optimize cache for minimal puts: disabled 1114 [main] INFO org.hibernate.cfg.SettingsFactory - Structured second-level cache entries: disabled 1117 [main] INFO org.hibernate.cfg.SettingsFactory - Echoing all SQL to stdout 1118 [main] INFO org.hibernate.cfg.SettingsFactory - Statistics: disabled 1118 [main] INFO org.hibernate.cfg.SettingsFactory - Deleted entity synthetic identifier rollback: disabled 1118 [main] INFO org.hibernate.cfg.SettingsFactory - Default entity-mode: pojo 1118 [main] INFO org.hibernate.cfg.SettingsFactory - Named query checking : enabled 1118 [main] INFO org.hibernate.cfg.SettingsFactory - Check Nullability in Core (should be disabled when Bean Validation is on): enabled 1151 [main] INFO org.hibernate.impl.SessionFactoryImpl - building session factory org.hibernate.HibernateException: Unable to instantiate default tuplizer [org.hibernate.tuple.entity.PojoEntityTuplizer] at org.hibernate.tuple.entity.EntityTuplizerFactory.constructTuplizer(EntityTuplizerFactory.java:110) at org.hibernate.tuple.entity.EntityTuplizerFactory.constructDefaultTuplizer(EntityTuplizerFactory.java:135) at org.hibernate.tuple.entity.EntityEntityModeToTuplizerMapping.<init>(EntityEntityModeToTuplizerMapping.java:80) at org.hibernate.tuple.entity.EntityMetamodel.<init>(EntityMetamodel.java:323) at org.hibernate.persister.entity.AbstractEntityPersister.<init>(AbstractEntityPersister.java:475) at org.hibernate.persister.entity.SingleTableEntityPersister.<init>(SingleTableEntityPersister.java:133) at org.hibernate.persister.PersisterFactory.createClassPersister(PersisterFactory.java:84) at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:295) at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1385) at service.HibernateSessionFactory.currentSession(HibernateSessionFactory.java:53) at service.ItemSvcHibImpl.generateReport(ItemSvcHibImpl.java:78) at service.test.ItemSvcTest.testGenerateReport(ItemSvcTest.java:226) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at junit.framework.TestCase.runTest(TestCase.java:164) at junit.framework.TestCase.runBare(TestCase.java:130) at junit.framework.TestResult$1.protect(TestResult.java:106) at 
junit.framework.TestResult.runProtected(TestResult.java:124) at junit.framework.TestResult.run(TestResult.java:109) at junit.framework.TestCase.run(TestCase.java:120) at junit.framework.TestSuite.runTest(TestSuite.java:230) at junit.framework.TestSuite.run(TestSuite.java:225) at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130) at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197) Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27) at java.lang.reflect.Constructor.newInstance(Constructor.java:513) at org.hibernate.tuple.entity.EntityTuplizerFactory.constructTuplizer(EntityTuplizerFactory.java:107) ... 29 more Caused by: org.hibernate.PropertyNotFoundException: Could not find a getter for puchaseDate in class domain.Item at org.hibernate.property.BasicPropertyAccessor.createGetter(BasicPropertyAccessor.java:328) at org.hibernate.property.BasicPropertyAccessor.getGetter(BasicPropertyAccessor.java:321) at org.hibernate.mapping.Property.getGetter(Property.java:304) at org.hibernate.tuple.entity.PojoEntityTuplizer.buildPropertyGetter(PojoEntityTuplizer.java:299) at org.hibernate.tuple.entity.AbstractEntityTuplizer.<init>(AbstractEntityTuplizer.java:158) at org.hibernate.tuple.entity.PojoEntityTuplizer.<init>(PojoEntityTuplizer.java:77) ... 34 more I'm new to Hibernate, so I don't know all the ins and outs, but I do have the getter and setter for the purchaseDate attribute. I don't know what I'm missing here - does anyone else? Thanks!


  • Selenium Webdriver (FirefoxDriver)

    - by Samareri
    Hey all, I'm using Selenium Webdriver to do some robottesting. Since some functions appear to only work in Firefox, I'm obligated to use Firefoxdriver. Now and then, something weird happens. Starting up te driver driver = new FirefoxDriver(); driver.get(URL); gets firefox to startup but not to go to the specified url. The strange thing is that it works on another computer with the same preferences set in Firefox. I solved this problem once by changing to another version of firefox, but this time this doesn't do the trick for me, it did however worked for the other developers. Yes, the error started for all developers on the same time, same day... My first question is: is it a firefox problem or Webdriver problem. Second question: how is it possible that it works on other pc's? Any help would be very appreciated Thanks Error: org.openqa.selenium.WebDriverException: Could not parse "". System info: os.name: 'Windows XP', os.arch: 'x86', os.version: '5.1', java.version: '1.6.0_18' Driver info: driver.version: firefox at org.openqa.selenium.firefox.Response.<init>(Response.java:53) at org.openqa.selenium.firefox.internal.AbstractExtensionConnection.nextResponse(AbstractExtensionConnection.java:258) at org.openqa.selenium.firefox.internal.AbstractExtensionConnection.readLoop(AbstractExtensionConnection.java:220) at org.openqa.selenium.firefox.internal.AbstractExtensionConnection.waitForResponseFor(AbstractExtensionConnection.java:213) at org.openqa.selenium.firefox.internal.AbstractExtensionConnection.sendMessageAndWaitForResponse(AbstractExtensionConnection.java:162) at org.openqa.selenium.firefox.FirefoxDriver.executeCommand(FirefoxDriver.java:329) at org.openqa.selenium.firefox.FirefoxDriver.sendMessage(FirefoxDriver.java:312) at org.openqa.selenium.firefox.FirefoxDriver.sendMessage(FirefoxDriver.java:308) at org.openqa.selenium.firefox.FirefoxDriver.fixId(FirefoxDriver.java:350) at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:130) at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:109) at be.....MMCRobotTest.login(MMCRobotTest.java:98) at be.....MMCRobotTestAttribute.testNewAttribute(MMCRobotTestAttribute.java:12) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at junit.framework.TestCase.runTest(TestCase.java:164) at junit.framework.TestCase.runBare(TestCase.java:130) at junit.framework.TestResult$1.protect(TestResult.java:106) at junit.framework.TestResult.runProtected(TestResult.java:124) at junit.framework.TestResult.run(TestResult.java:109) at junit.framework.TestCase.run(TestCase.java:120) at junit.framework.TestSuite.runTest(TestSuite.java:230) at junit.framework.TestSuite.run(TestSuite.java:225) at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130) at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197) Caused by: org.json.JSONException: A JSONObject text must begin with '{' at character 0 at 
org.json.JSONTokener.syntaxError(JSONTokener.java:496) at org.json.JSONObject.<init>(JSONObject.java:180) at org.json.JSONObject.<init>(JSONObject.java:403) at org.openqa.selenium.firefox.Response.<init>(Response.java:41) ... 30 more


  • Eclipse Galileo + Glassfish v3: JPADeployer NullPointerException on deploy

    - by bshacklett
    I've created a very simple "Enterprise Application" project with about 7 entity beans and one stateless session bean. I've also configured an instance of Glassfish v3 to run as my application server. Unfortunately, when I attempt to publish the EAR to Glassfish, I'm getting the following response: SEVERE: Exception while invoking class org.glassfish.persistence.jpa.JPADeployer prepare method java.lang.NullPointerException at org.glassfish.persistence.jpa.JPADeployer.prepare(JPADeployer.java:104) at com.sun.enterprise.v3.server.ApplicationLifecycle.prepareModule(ApplicationLifecycle.java:644) at org.glassfish.javaee.full.deployment.EarDeployer.prepareBundle(EarDeployer.java:269) at org.glassfish.javaee.full.deployment.EarDeployer.access$200(EarDeployer.java:79) at org.glassfish.javaee.full.deployment.EarDeployer$1.doBundle(EarDeployer.java:131) at org.glassfish.javaee.full.deployment.EarDeployer$1.doBundle(EarDeployer.java:129) at org.glassfish.javaee.full.deployment.EarDeployer.doOnBundles(EarDeployer.java:197) at org.glassfish.javaee.full.deployment.EarDeployer.doOnAllTypedBundles(EarDeployer.java:206) at org.glassfish.javaee.full.deployment.EarDeployer.doOnAllBundles(EarDeployer.java:232) at org.glassfish.javaee.full.deployment.EarDeployer.prepare(EarDeployer.java:129) at com.sun.enterprise.v3.server.ApplicationLifecycle.prepareModule(ApplicationLifecycle.java:644) at com.sun.enterprise.v3.server.ApplicationLifecycle.deploy(ApplicationLifecycle.java:296) at com.sun.enterprise.v3.server.ApplicationLifecycle.deploy(ApplicationLifecycle.java:183) at org.glassfish.deployment.admin.DeployCommand.execute(DeployCommand.java:272) at com.sun.enterprise.v3.admin.CommandRunnerImpl$1.execute(CommandRunnerImpl.java:305) at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:320) at com.sun.enterprise.v3.admin.CommandRunnerImpl.doCommand(CommandRunnerImpl.java:1176) at com.sun.enterprise.v3.admin.CommandRunnerImpl.access$900(CommandRunnerImpl.java:83) at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1235) at com.sun.enterprise.v3.admin.CommandRunnerImpl$ExecutionContext.execute(CommandRunnerImpl.java:1224) at com.sun.enterprise.v3.admin.AdminAdapter.doCommand(AdminAdapter.java:365) at com.sun.enterprise.v3.admin.AdminAdapter.service(AdminAdapter.java:204) at com.sun.grizzly.tcp.http11.GrizzlyAdapter.service(GrizzlyAdapter.java:166) at com.sun.enterprise.v3.server.HK2Dispatcher.dispath(HK2Dispatcher.java:100) at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:245) at com.sun.grizzly.http.ProcessorTask.invokeAdapter(ProcessorTask.java:791) at com.sun.grizzly.http.ProcessorTask.doProcess(ProcessorTask.java:693) at com.sun.grizzly.http.ProcessorTask.process(ProcessorTask.java:954) at com.sun.grizzly.http.DefaultProtocolFilter.execute(DefaultProtocolFilter.java:170) at com.sun.grizzly.DefaultProtocolChain.executeProtocolFilter(DefaultProtocolChain.java:135) at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:102) at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:88) at com.sun.grizzly.http.HttpProtocolChain.execute(HttpProtocolChain.java:76) at com.sun.grizzly.ProtocolChainContextTask.doCall(ProtocolChainContextTask.java:53) at com.sun.grizzly.SelectionKeyContextTask.call(SelectionKeyContextTask.java:57) at com.sun.grizzly.ContextTask.run(ContextTask.java:69) at com.sun.grizzly.util.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:330) at 
com.sun.grizzly.util.AbstractThreadPool$Worker.run(AbstractThreadPool.java:309) at java.lang.Thread.run(Thread.java:637)


  • Index out of bounds error

    - by sprasad12
    Hello, I am working on a program where i am recreating the saved widgets back on to the boundary panel. When i am creating them i am also trying to put the values into the ArrayList so that if i want to update and save the opened project i should be able to do so by getting the values from the ArrayList. Here is how the code looks like: for(int i = 0; i < result.length; i++){ if(ename.contains(result[i].getParticipateEntityName())){ ername.add(ename.indexOf(result[i].getParticipateEntityName()), result[i].getParticipateRelatioshipName()); etotalpartial.add(ename.indexOf(result[i].getParticipateEntityName()), result[i].getTotalPartial()); }else if(wename.contains(result[i].getParticipateEntityName())){ wrname.add(wename.indexOf(result[i].getParticipateEntityName()), result[i].getParticipateRelatioshipName()); } } Here ename, ername, etotalpartial, wename and wrname are all ArrayList. This piece of code is included in an asynchronous class method. When i run the code i get error at "ername.add(ename......". Here is the error stack: java.lang.IndexOutOfBoundsException: Index: 1, Size: 0 at java.util.ArrayList.add(ArrayList.java:367) at com.e.r.d.client.ERD1$16.onSuccess(ERD1.java:898) at com.e.r.d.client.ERD1$16.onSuccess(ERD1.java:1) at com.google.gwt.user.client.rpc.impl.RequestCallbackAdapter.onResponseReceived(RequestCallbackAdapter.java:216) at com.google.gwt.http.client.Request.fireOnResponseReceived(Request.java:287) at com.google.gwt.http.client.RequestBuilder$1.onReadyStateChange(RequestBuilder.java:393) at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at com.google.gwt.dev.shell.MethodAdaptor.invoke(MethodAdaptor.java:103) at com.google.gwt.dev.shell.MethodDispatch.invoke(MethodDispatch.java:71) at com.google.gwt.dev.shell.OophmSessionHandler.invoke(OophmSessionHandler.java:157) at com.google.gwt.dev.shell.BrowserChannel.reactToMessagesWhileWaitingForReturn(BrowserChannel.java:1713) at com.google.gwt.dev.shell.BrowserChannelServer.invokeJavascript(BrowserChannelServer.java:165) at com.google.gwt.dev.shell.ModuleSpaceOOPHM.doInvoke(ModuleSpaceOOPHM.java:120) at com.google.gwt.dev.shell.ModuleSpace.invokeNative(ModuleSpace.java:507) at com.google.gwt.dev.shell.ModuleSpace.invokeNativeObject(ModuleSpace.java:264) at com.google.gwt.dev.shell.JavaScriptHost.invokeNativeObject(JavaScriptHost.java:91) at com.google.gwt.core.client.impl.Impl.apply(Impl.java) at com.google.gwt.core.client.impl.Impl.entry0(Impl.java:188) at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at com.google.gwt.dev.shell.MethodAdaptor.invoke(MethodAdaptor.java:103) at com.google.gwt.dev.shell.MethodDispatch.invoke(MethodDispatch.java:71) at com.google.gwt.dev.shell.OophmSessionHandler.invoke(OophmSessionHandler.java:157) at com.google.gwt.dev.shell.BrowserChannel.reactToMessages(BrowserChannel.java:1668) at com.google.gwt.dev.shell.BrowserChannelServer.processConnection(BrowserChannelServer.java:401) at com.google.gwt.dev.shell.BrowserChannelServer.run(BrowserChannelServer.java:222) at java.lang.Thread.run(Thread.java:619) I am not sure what i am doing wrong. Any input will be of great help. Thank you.


  • UnsatisfiedLinkError on Websphere Application Server 6.1 Data Source

    - by user338154
    Hi, I am unable to start the installed App on my WAS instance. I believe the root cause is an UnsatisfiedLinkError which is shown as follows: Caused by: java.lang.UnsatisfiedLinkError: no ocijdbc10 in java.library.path at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1682) at java.lang.Runtime.loadLibrary0(Runtime.java:822) at java.lang.System.loadLibrary(System.java:993) at oracle.jdbc.driver.T2CConnection$1.run(T2CConnection.java:3147) at java.security.AccessController.doPrivileged(Native Method) at oracle.jdbc.driver.T2CConnection.loadNativeLibrary(T2CConnection.java:3143) at oracle.jdbc.driver.T2CConnection.logon(T2CConnection.java:221) at oracle.jdbc.driver.PhysicalConnection.(PhysicalConnection.java:441) at oracle.jdbc.driver.T2CConnection.(T2CConnection.java:132) at oracle.jdbc.driver.T2CDriverExtension.getConnection(T2CDriverExtension.java:78) at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:801) at oracle.jdbc.pool.OracleDataSource.getPhysicalConnection(OracleDataSource.java:297) at oracle.jdbc.xa.client.OracleXADataSource.getPooledConnection(OracleXADataSource.java:515) at oracle.jdbc.xa.client.OracleXADataSource.getXAConnection(OracleXADataSource.java:159) at oracle.jdbc.xa.client.OracleXADataSource.getXAConnection(OracleXADataSource.java:133) at com.ibm.ws.rsadapter.spi.InternalGenericDataStoreHelper$1.run(InternalGenericDataStoreHelper.java:935) at com.ibm.ws.security.util.AccessController.doPrivileged(AccessController.java:118) at com.ibm.ws.rsadapter.spi.InternalGenericDataStoreHelper.getPooledConnection(InternalGenericDataStoreHelper.java:972) at com.ibm.ws.rsadapter.spi.WSRdbDataSource.getPooledConnection(WSRdbDataSource.java:1625) at com.ibm.ws.rsadapter.spi.WSManagedConnectionFactoryImpl.createManagedConnection(WSManagedConnectionFactoryImpl.java:1220) at com.ibm.ejs.j2c.FreePool.createManagedConnectionWithMCWrapper(FreePool.java:1988) at com.ibm.ejs.j2c.FreePool.createOrWaitForConnection(FreePool.java:1660) at com.ibm.ejs.j2c.PoolManager.reserve(PoolManager.java:2341) at com.ibm.ejs.j2c.ConnectionManager.allocateMCWrapper(ConnectionManager.java:932) at com.ibm.ejs.j2c.ConnectionManager.allocateConnection(ConnectionManager.java:608) at com.ibm.ws.rsadapter.jdbc.WSJdbcDataSource.getConnection(WSJdbcDataSource.java:449) at com.ibm.ws.rsadapter.jdbc.WSJdbcDataSource.getConnection(WSJdbcDataSource.java:418) at org.apache.ojb.broker.accesslayer.ConnectionFactoryAbstractImpl.newConnectionFromDataSource(Unknown Source) at org.apache.ojb.broker.accesslayer.ConnectionFactoryAbstractImpl.lookupConnection(Unknown Source) at org.apache.ojb.broker.accesslayer.ConnectionFactoryManagedImpl.lookupConnection(Unknown Source) at org.apache.ojb.broker.accesslayer.ConnectionManagerImpl.getConnection(Unknown Source) at org.apache.ojb.broker.accesslayer.StatementManager.getPreparedStatement(Unknown Source) at org.apache.ojb.broker.accesslayer.JdbcAccessImpl.executeQuery(Unknown Source) at org.apache.ojb.broker.accesslayer.RsQueryObject.performQuery(Unknown Source) at org.apache.ojb.broker.accesslayer.RsIterator.(Unknown Source) at org.apache.ojb.broker.core.RsIteratorFactoryImpl.createRsIterator(Unknown Source) at org.apache.ojb.broker.core.PersistenceBrokerImpl.getRsIteratorFromQuery(Unknown Source) at org.apache.ojb.broker.core.PersistenceBrokerImpl.getIteratorFromQuery(Unknown Source) at org.apache.ojb.broker.core.QueryReferenceBroker.getCollectionByQuery(Unknown Source) at org.apache.ojb.broker.core.QueryReferenceBroker.getCollectionByQuery(Unknown Source) at 
org.apache.ojb.broker.core.QueryReferenceBroker.getCollectionByQuery(Unknown Source) at org.apache.ojb.broker.core.PersistenceBrokerImpl.getCollectionByQuery(Unknown Source) at org.apache.ojb.broker.core.DelegatingPersistenceBroker.getCollectionByQuery(Unknown Source) at org.apache.ojb.broker.core.DelegatingPersistenceBroker.getCollectionByQuery(Unknown Source) at com.ascential.xmeta.persistence.orm.impl.ojb.OjbPersistentEObjectPersistenceRegistry.loadPackageCache(OjbPersistentEObjectPersistenceRegistry.java:371) ... 115 more My LD_LIBRARY_PATH variable for the 'was' user is /opt/oracle/product/10.2.0/lib What else should I be checking to fix this error? Please help. Thanks


  • Google App Engine 1.3.1 <admin-console> Issue

    - by Taylor L
    I attempted to add an <admin-console> section to my appengine-web.xml and I got the exception below. The <admin-console> element is a valid element according to the appengine-web.xsd. It's also documented in the app engine docs. Any ideas as to what is wrong? <admin-console> <page name="My Admin" url="/app/admin" /> </admin-console> Feb 14, 2010 12:40:09 AM com.google.apphosting.utils.config.AppEngineWebXmlReader readAppEngineWebXml SEVERE: Received exception processing C:/development/taylor/myapp/target/myapp-web-0.0.1-SNAPSHOT\WEB-INF/appengine-web.xml com.google.apphosting.utils.config.AppEngineConfigException: Unrecognized element <admin-console> at com.google.apphosting.utils.config.AppEngineWebXmlProcessor.processSecondLevelNode(AppEngineWebXmlProcessor.java:99) at com.google.apphosting.utils.config.AppEngineWebXmlProcessor.processXml(AppEngineWebXmlProcessor.java:46) at com.google.apphosting.utils.config.AppEngineWebXmlReader.processXml(AppEngineWebXmlReader.java:94) at com.google.apphosting.utils.config.AppEngineWebXmlReader.readAppEngineWebXml(AppEngineWebXmlReader.java:61) at com.google.appengine.tools.admin.Application.<init>(Application.java:88) at com.google.appengine.tools.admin.Application.readApplication(Application.java:120) at com.google.appengine.tools.admin.AppCfg.<init>(AppCfg.java:107) at com.google.appengine.tools.admin.AppCfg.<init>(AppCfg.java:58) at com.google.appengine.tools.admin.AppCfg.main(AppCfg.java:54) at net.kindleit.gae.EngineGoalBase.runAppCfg(EngineGoalBase.java:140) at net.kindleit.gae.DeployGoal.execute(DeployGoal.java:38) at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:579) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:498) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegmentForProject(DefaultLifecycleExecutor.java:265) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:191) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:149) at org.apache.maven.DefaultMaven.execute_aroundBody0(DefaultMaven.java:223) at org.apache.maven.DefaultMaven.execute_aroundBody1$advice(DefaultMaven.java:304) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:1) at org.apache.maven.embedder.MavenEmbedder.execute_aroundBody2(MavenEmbedder.java:904) at org.apache.maven.embedder.MavenEmbedder.execute_aroundBody3$advice(MavenEmbedder.java:304) at org.apache.maven.embedder.MavenEmbedder.execute(MavenEmbedder.java:1) at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:176) at org.apache.maven.cli.MavenCli.main(MavenCli.java:63) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289) at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229) at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:408) at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:351) at org.codehaus.classworlds.Launcher.main(Launcher.java:31)

    Read the article

  • Where does java.util.logging.Logger store its logs

    - by Harry Pham
    This might be a stupid question, but I am a bit lost with Java's Logger: private static Logger logger = Logger.getLogger("order.web.OrderManager"); logger.info("Removed order " + id + "."); Where do I see the log? Also, this quote is from the java.util.logging.Logger documentation: On each logging call the Logger initially performs a cheap check of the request level (e.g. SEVERE or FINE) against the effective log level of the logger. If the request level is lower than the log level, the logging call returns immediately. After passing this initial (cheap) test, the Logger will allocate a LogRecord to describe the logging message. It will then call a Filter (if present) to do a more detailed check on whether the record should be published. If that passes it will then publish the LogRecord to its output Handlers. Does this mean that if I have three logging calls at different request levels: logger.log(Level.FINE, "Something"); logger.log(Level.WARNING, "Something"); logger.log(Level.SEVERE, "Something"); and my log level is SEVERE, I can see all three logs, but if my log level is WARNING, then I can't see the SEVERE log? Is that correct? And how do I set the log level?
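    For reference, java.util.logging sends output by default through the root logger's ConsoleHandler to stderr (the console), at level INFO and above, unless a different configuration is loaded from a logging.properties file. Levels are ordered SEVERE > WARNING > INFO > FINE, and a record is published only if its level is at or above the logger's level, so SEVERE is the most restrictive setting (it hides WARNING and FINE), not the most permissive. A minimal sketch (class and message names are illustrative) showing how to set the level programmatically; note that the handler's level must be lowered too if you want FINE output:

    import java.util.logging.ConsoleHandler;
    import java.util.logging.Handler;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class LoggingDemo {
        public static void main(String[] args) {
            Logger logger = Logger.getLogger("order.web.OrderManager");
            // The logger publishes a record only if its level is >= the logger's level.
            logger.setLevel(Level.FINE);
            // The default ConsoleHandler only passes INFO and above to stderr,
            // so attach a handler at FINE as well to actually see FINE output.
            Handler handler = new ConsoleHandler();
            handler.setLevel(Level.FINE);
            logger.addHandler(handler);
            logger.setUseParentHandlers(false); // avoid duplicate output via the root logger

            logger.log(Level.FINE, "Something fine");
            logger.log(Level.WARNING, "Something warning");
            logger.log(Level.SEVERE, "Something severe");
        }
    }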

    Read the article

  • Increase Timeout for remote sessions in Debian 5 Lenny

    - by Ash
    I always get a remote connection timeout when using PuTTY, and also when I send emails with attachments from a mail server installed on Debian. I always get this error. I'm not sure if this is the firewall or the new Debian 5 installation I made. Are there any settings I need to fix after a fresh install? Any inputs are highly appreciated. This error is pulling my brains out. Thanks. Error: 2011-01-10 15:21:13,454 INFO [btpool0-23://69.19.19.89/service/upload?fmt=extended] [[email protected];mid=72;ip=10.10.01.78;ua=Mozilla/5.0 (Windows;; U;; Windows NT 5.2;; en-US;; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13 (.NET CLR 3.5.30729);] FileUploadServlet - File upload failed org.apache.commons.fileupload.FileUploadBase$IOFileUploadException: Processing of multipart/form-data request failed. timeout at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:367) at org.apache.commons.fileupload.servlet.ServletFileUpload.parseRequest(ServletFileUpload.java:126) at com.zimbra.cs.service.FileUploadServlet.handleMultipartUpload(FileUploadServlet.java:430) at com.zimbra.cs.service.FileUploadServlet.doPost(FileUploadServlet.java:412) at javax.servlet.http.HttpServlet.service(HttpServlet.java:727) at com.zimbra.cs.servlet.ZimbraServlet.service(ZimbraServlet.java:181) at javax.servlet.http.HttpServlet.service(HttpServlet.java:820) at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511) at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1166) at com.zimbra.cs.servlet.SetHeaderFilter.doFilter(SetHeaderFilter.java:79) at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157) at org.mortbay.servlet.UserAgentFilter.doFilter(UserAgentFilter.java:81) at org.mortbay.servlet.GzipFilter.doFilter(GzipFilter.java:132) at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157) at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:388) at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:218) at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182) at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765) at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:418) at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230) at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.handler.rewrite.RewriteHandler.handle(RewriteHandler.java:230) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.handler.DebugHandler.handle(DebugHandler.java:77) at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152) at org.mortbay.jetty.Server.handle(Server.java:326) at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:543) at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:939) at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:755) at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218) at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:405) at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:413) at org.mortbay.thread.BoundedThreadPool$PoolThread.run(BoundedThreadPool.java:451) Caused by: org.mortbay.jetty.EofException: timeout at 
org.mortbay.jetty.HttpParser$Input.blockForContent(HttpParser.java:1172) at org.mortbay.jetty.HttpParser$Input.read(HttpParser.java:1122) at org.apache.commons.fileupload.MultipartStream$ItemInputStream.makeAvailable(MultipartStream.java:977) at org.apache.commons.fileupload.MultipartStream$ItemInputStream.read(MultipartStream.java:887) at java.io.InputStream.read(InputStream.java:85) at org.apache.commons.fileupload.util.Streams.copy(Streams.java:94) at org.apache.commons.fileupload.util.Streams.copy(Streams.java:64) at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:362) ... 33 more
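    Two separate knobs are usually involved here. For the PuTTY/SSH side, keepalives (PuTTY's "Seconds between keepalives", or ClientAliveInterval/ClientAliveCountMax in sshd_config) are what stop an idle session from being dropped by a firewall or NAT device. The Zimbra trace itself is Jetty aborting a slow attachment upload once the connection has been idle longer than the connector's maxIdleTime; Zimbra sets that value in its jetty.xml template rather than in code, but a minimal embedded Jetty 6 sketch (port and timeout values are illustrative) shows the setting involved:

    import org.mortbay.jetty.Server;
    import org.mortbay.jetty.nio.SelectChannelConnector;

    public class JettyIdleTimeoutDemo {
        public static void main(String[] args) throws Exception {
            Server server = new Server();
            SelectChannelConnector connector = new SelectChannelConnector();
            connector.setPort(8080);          // illustrative port
            // maxIdleTime is how long (in milliseconds) a connection may sit idle,
            // e.g. while a slow client uploads an attachment, before Jetty aborts it
            // with the "EofException: timeout" seen in the trace above.
            connector.setMaxIdleTime(600000); // 10 minutes instead of the default
            server.addConnector(connector);
            server.start();
            server.join();
        }
    }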

    Read the article

  • Java JRE 1.7.0_45 Certified with Oracle E-Business Suite

    - by Steven Chan (Oracle Development)
    Java Runtime Environment 7u45 (a.k.a. JRE 7u45-b18) and later updates on the JRE 7 codeline are now certified with Oracle E-Business Suite Release 11i and 12.0, 12.1, and 12.2 for Windows-based desktop clients. Effects of new support dates on Java upgrades for EBS environments Support dates for the E-Business Suite and Java have changed.  Please review the sections below for more details: What does this mean for Oracle E-Business Suite users? Will EBS users be forced to upgrade to JRE 7 for Windows desktop clients? Will EBS users be forced to upgrade to JDK 7 for EBS application tier servers? All JRE 6 and 7 releases are certified with EBS upon release Our standard policy is that all E-Business Suite customers can apply all JRE updates to end-user desktops from JRE 1.6.0_03 and later updates on the 1.6 codeline, and from JRE 7u10 and later updates on the JRE 7 codeline.  We test all new JRE 1.6 and JRE 7 releases in parallel with the JRE development process, so all new JRE 1.6 and 7 releases are considered certified with the E-Business Suite on the same day that they're released by our Java team.  You do not need to wait for a certification announcement before applying new JRE 1.6 or JRE 7 releases to your EBS users' desktops. What's needed to enable EBS environments for JRE 7? EBS customers should ensure that they are running JRE 7u17, at minimum, on Windows desktop clients. Of the compatibility issues identified with JRE 7, the most critical is an issue that prevents E-Business Suite Forms-based products from launching on Windows desktops that are running JRE 7.  Customers can prevent this issue -- and all other JRE 7 compatibility issues -- by ensuring that they have applied the latest certified patches documented for JRE 7 configurations to their EBS application tier servers.  These patches are compatible with JRE 6 and 7, production ready, and fully-tested with the E-Business Suite.  These patches may be applied immediately to all E-Business Suite environments. All other Forms prerequisites documented in the Notes above should also be applied.  Where are the official patch requirements documented? All patches required for ensuring full compatibility of the E-Business Suite with JRE 7 are documented in these Notes: For EBS 11i: Deploying Sun JRE (Native Plug-in) for Windows Clients in Oracle E-Business Suite Release 11i (Note 290807.1) Upgrading Developer 6i with Oracle E-Business Suite 11i (Note 125767.1) For EBS 12.0, 12.1, 12.2 Deploying Sun JRE (Native Plug-in) for Windows Clients in Oracle E-Business Suite Release 12 (Note 393931.1) Upgrading OracleAS 10g Forms and Reports in Oracle E-Business Suite Release 12 (Note 437878.1) EBS + Discoverer 11g Users JRE 1.7.0_45 is certified for Discoverer 11g in E-Business Suite environments with the following minimum requirements: Discoverer (11g) 11.1.1.6 plus Patch 13877486 and later  Reference: How To Find Oracle BI Discoverer 10g and 11g Certification Information (Document 233047.1) Worried about the 'mismanaged session cookie' issue? No need to worry -- it's fixed.  To recap: JRE releases 1.6.0_18 through 1.6.0_22 had issues with mismanaging session cookies that affected some users in some circumstances. The fix for those issues was first included in JRE 1.6.0_23. These fixes will carry forward and continue to be fixed in all future JRE releases on the JRE 6 and 7 codelines.  
In other words, if you wish to avoid the mismanaged session cookie issue, you should apply any release after JRE 1.6.0_22 on the JRE 6 codeline, and JRE 7u10 and later JRE 7 codeline updates. Implications of Java 6 End of Public Updates for EBS Users The Support Roadmap for Oracle Java is published here: Oracle Java SE Support Roadmap The latest updates to that page (as of Sept. 19, 2012) state (emphasis added): Java SE 6 End of Public Updates Notice After February 2013, Oracle will no longer post updates of Java SE 6 to its public download sites. Existing Java SE 6 downloads already posted as of February 2013 will remain accessible in the Java Archive on Oracle Technology Network. Developers and end-users are encouraged to update to more recent Java SE versions that remain available for public download. For enterprise customers, who need continued access to critical bug fixes and security fixes as well as general maintenance for Java SE 6 or older versions, long term support is available through Oracle Java SE Support . What does this mean for Oracle E-Business Suite users? EBS users fall under the category of "enterprise users" above.  Java is an integral part of the Oracle E-Business Suite technology stack, so EBS users will continue to receive Java SE 6 updates from February 2013 to the end of Java SE 6 Extended Support in June 2017. In other words, nothing changes for EBS users after February 2013.  EBS users will continue to receive critical bug fixes and security fixes as well as general maintenance for Java SE 6 until the end of Java SE 6 Extended Support in June 2017. How can EBS customers obtain Java 6 updates after the public end-of-life? EBS customers can download Java 6 patches from My Oracle Support.  For a complete list of all Java SE patch numbers, see: All Java SE Downloads on MOS (Note 1439822.1) Will EBS users be forced to upgrade to JRE 7 for Windows desktop clients? This upgrade is highly recommended but remains optional while Java 6 is covered by Extended Support. Updates will be delivered via My Oracle Support, where you can continue to receive critical bug fixes and security fixes as well as general maintenance for JRE 6 desktop clients.  Java 6 is covered by Extended Support until June 2017.  All E-Business Suite customers must upgrade to JRE 7 by June 2017. Coexistence of JRE 6 and JRE 7 on Windows desktops The upgrade to JRE 7 is highly recommended for EBS users, but some users may need to run both JRE 6 and 7 on their Windows desktops for reasons unrelated to the E-Business Suite. Most EBS configurations with IE and Firefox use non-static versioning by default. JRE 7 will be invoked instead of JRE 6 if both are installed on a Windows desktop. For more details, see "Appendix B: Static vs. Non-static Versioning and Set Up Options" in Notes 290807.1 and 393931.1. Applying Updates to JRE 6 and JRE 7 to Windows desktops Auto-update will keep JRE 7 up-to-date for Windows users with JRE 7 installed. Auto-update will only keep JRE 7 up-to-date for Windows users with both JRE 6 and 7 installed.  JRE 6 users are strongly encouraged to apply the latest Critical Patch Updates as soon as possible after each release. The Jave SE CPUs will be available via My Oracle Support.  EBS users can find more information about JRE 6 and 7 updates here: Information Center: Installation & Configuration for Oracle Java SE (Note 1412103.2) The dates for future Java SE CPUs can be found on the Critical Patch Updates, Security Alerts and Third Party Bulletin.  
An RSS feed is available on that site for those who would like to be kept up-to-date. What do Mac users need? Mac users running Mac OS 10.7 or 10.8 can run JRE 7 plug-ins.  See this article: EBS 12 certified with Mac OS X 10.7 and 10.8 with Safari 6 and JRE 7 Will EBS users be forced to upgrade to JDK 7 for EBS application tier servers? JRE is used for desktop clients.  JDK is used for application tier servers JDK upgrades for E-Business Suite application tier servers are highly recommended but currently remain optional while Java 6 is covered by Extended Support. Updates will be delivered via My Oracle Support, where you can continue to receive critical bug fixes and security fixes as well as general maintenance for JDK 6 for application tier servers.  Java SE 6 is covered by Extended Support until June 2017.  All EBS customers with application tier servers on Windows, Solaris, and Linux must upgrade to JDK 7 by June 2017. EBS customers running their application tier servers on other operating systems should check with their respective vendors for the support dates for those platforms. JDK 7 is certified with E-Business Suite 12.  See: Java (JDK) 7 Certified for E-Business Suite 12 Servers References Recommended Browsers for Oracle Applications 11i (Metalink Note 285218.1) Upgrading Sun JRE (Native Plug-in) with Oracle Applications 11i for Windows Clients (Metalink Note 290807.1) Recommended Browsers for Oracle Applications 12 (MetaLink Note 389422.1) Upgrading JRE Plugin with Oracle Applications R12 (MetaLink Note 393931.1) Related Articles Mismanaged Session Cookie Issue Fixed for EBS in JRE 1.6.0_23 Roundup: Oracle JInitiator 1.3 Desupported for EBS Customers in July 2009

    Read the article

  • What is meant by 4GL?

    - by Geek
    I came across the term 4GL (fourth-generation language) while reading about Oracle ADF Business Components. I want to know what exactly 4GL is. This is the actual quote from the book Oracle Fusion Guide: Oracle ADF Business Components is the business services layer of choice in Oracle Fusion application development. Compared to other persistence layers, ADF Business Components provides exceptional built-in business application functionality and maximally declarative experience that makes it a perfect match for those who seek an end-to-end fourth generation language (4GL) Java EE development architecture.

    Read the article

  • Problem compiling hive with ant

    - by conandor
    I compiling with Solaris 10 SPARC, jdk 1.6 from Sun, Ant 1.7.1 from OpenCSW. I have no problem running hadoop 0.17.2.1 However, I have problem compiling/integrating hive with the error 'cannot find symbol', although I followed the tutorial. I have the hive source code from SVN exactly from tutorial. How can I know the hive version I compiling and how can I compile against hadoop 0.17.2.1? Please advice. Thank you. -bash-3.00$ export PATH=/usr/jdk/instances/jdk1.6.0/bin:/usr/bin:/opt/csw/bin:/opt/webstack/bin -bash-3.00$ export JAVA_HOME=/usr/jdk/instances/jdk1.6.0 -bash-3.00$ export HADOOP=/export/home/mywork/hadoop-0.17.2.1/bin/hadoop -bash-3.00$ /opt/csw/bin/ant package -Dhadoop.version=0.17.2.1 Buildfile: build.xml jar: create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: compile: ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 25878ms :: artifacts dl 37ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/82ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.17.2.1 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.17.2.1) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 12041ms :: artifacts dl 30ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default 
| 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/39ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.18.3 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.18.3) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 11107ms :: artifacts dl 36ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/49ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.19.0 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.19.0) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 9969ms :: artifacts dl 33ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/57ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.20.0 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.20.0) jar: [echo] Jar: shims create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks 
deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: install-hadoopcore: install-hadoopcore-default: ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 4864ms :: artifacts dl 13ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 1 | 0 | 0 | 0 || 1 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/52ms) install-hadoopcore-internal: setup: compile: [echo] Compiling: common jar: [echo] Jar: common create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: dynamic-serde: compile: [echo] Compiling: hive [javac] Compiling 167 source files to /export/home/mywork/hive/build/serde/classes [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/ObjectInspectorFactory.java:30: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/ObjectInspectorFactory.java:31: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorUtils [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/MetadataTypedColumnsetSerDe.java:31: cannot find symbol [javac] symbol : class MetadataListStructObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector [javac] import org.apache.hadoop.hive.serde2.objectinspector.MetadataListStructObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:33: cannot find symbol [javac] symbol : class BooleanObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.BooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:35: cannot find symbol [javac] symbol : class DoubleObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import 
org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:36: cannot find symbol [javac] symbol : class FloatObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.FloatObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:39: cannot find symbol [javac] symbol : class ShortObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.ShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:40: cannot find symbol [javac] symbol : class StringObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:44: cannot find symbol [javac] symbol : class BooleanObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.BooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:46: cannot find symbol [javac] symbol : class DoubleObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:47: cannot find symbol [javac] symbol : class FloatObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.FloatObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:50: cannot find symbol [javac] symbol : class ShortObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.ShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:51: cannot find symbol [javac] symbol : class StringObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazySimpleSerDe.java:43: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/columnar/ColumnarSerDe.java:41: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory 
[javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:26: cannot find symbol [javac] symbol : class LazySimpleStructObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.lazy.objectinspector [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:39: cannot find symbol [javac] symbol: class LazySimpleStructObjectInspector [javac] LazyNonPrimitive<LazySimpleStructObjectInspector> { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:68: cannot find symbol [javac] symbol : class LazySimpleStructObjectInspector [javac] location: class org.apache.hadoop.hive.serde2.lazy.LazyStruct [javac] public LazyStruct(LazySimpleStructObjectInspector oi) { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDe.java:36: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDe.java:37: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorUtils [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeString.java:23: cannot find symbol [javac] symbol : class StringObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypei16.java:23: cannot find symbol [javac] symbol : class ShortObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.ShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeDouble.java:23: cannot find symbol [javac] symbol : class DoubleObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeBool.java:23: cannot find symbol [javac] symbol : class BooleanObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.BooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:20: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does 
not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyBooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:37: cannot find symbol [javac] symbol: class LazyBooleanObjectInspector [javac] LazyPrimitive<LazyBooleanObjectInspector, BooleanWritable> { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:39: cannot find symbol [javac] symbol : class LazyBooleanObjectInspector [javac] location: class org.apache.hadoop.hive.serde2.lazy.LazyBoolean [javac] public LazyBoolean(LazyBooleanObjectInspector oi) { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:21: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyByteObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:37: cannot find symbol [javac] symbol: class LazyByteObjectInspector [javac] LazyPrimitive<LazyByteObjectInspector, ByteWritable> { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:39: cannot find symbol [javac] symbol : class LazyByteObjectInspector [javac] location: class org.apache.hadoop.hive.serde2.lazy.LazyByte [javac] public LazyByte(LazyByteObjectInspector oi) { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:23: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyDoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:31: cannot find symbol [javac] symbol: class LazyDoubleObjectInspector [javac] LazyPrimitive<LazyDoubleObjectInspector, DoubleWritable> { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:33: cannot find symbol [javac] symbol : class LazyDoubleObjectInspector [javac] location: class org.apache.hadoop.hive.serde2.lazy.LazyDouble [javac] public LazyDouble(LazyDoubleObjectInspector oi) { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:25: cannot find symbol [javac] symbol : class LazyObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.lazy.objectinspector [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.LazyObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:26: cannot find symbol [javac] symbol : class LazySimpleStructObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.lazy.objectinspector [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:27: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyBooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:28: package 
org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyByteObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:29: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyDoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:30: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyFloatObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:31: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyIntObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:32: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyLongObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:33: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyPrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:34: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:35: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyStringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFloat.java:

    Read the article

  • EntityManager injection works in JBoss 7.1.1 but not WebSphere 7

    - by BikerJared
    I've built an EJB that will manage my database access. I'm building a web app around it that uses Struts 2. The problem I'm having is when I deploy the ear, the EntityManager doesn't get injected into my service class (and winds up null and results in NullPointerExceptions). The weird thing is, it works on JBoss 7.1.1 but not on WebSphere 7. You'll notice that Struts doesn't inject the EJB, so I've got some intercepter code that does that. My current working theory right now is that the WS7 container can't inject the EntityManager because of Struts for some unknown reason. My next step is to try Spring next, but I'd really like to get this to work if possible. I've spent a few days searching and trying various things and haven't had any luck. I figured I'd give this a shot. Let me know if I can provide additional information. <?xml version="1.0" encoding="UTF-8"?> <persistence xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.0" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd"> <persistence-unit name="JPATestPU" transaction-type="JTA"> <description>JPATest Persistence Unit</description> <jta-data-source>jdbc/Test-DS</jta-data-source> <class>org.jaredstevens.jpatest.db.entities.User</class> <properties> <property name="hibernate.hbm2ddl.auto" value="update"/> </properties> </persistence-unit> </persistence> package org.jaredstevens.jpatest.db.entities; import java.io.Serializable; import javax.persistence.*; @Entity @Table public class User implements Serializable { private static final long serialVersionUID = -2643583108587251245L; private long id; private String name; private String email; @Id @GeneratedValue(strategy = GenerationType.TABLE) public long getId() { return id; } public void setId(long id) { this.id = id; } @Column(nullable=false) public String getName() { return this.name; } public void setName( String name ) { this.name = name; } @Column(nullable=false) public String getEmail() { return this.email; } @Column(nullable=false) public void setEmail( String email ) { this.email= email; } } package org.jaredstevens.jpatest.db.services; import java.util.List; import javax.ejb.Remote; import javax.ejb.Stateless; import javax.ejb.TransactionAttribute; import javax.ejb.TransactionAttributeType; import javax.persistence.EntityManager; import javax.persistence.PersistenceContext; import javax.persistence.PersistenceContextType; import javax.persistence.Query; import org.jaredstevens.jpatest.db.entities.User; import org.jaredstevens.jpatest.db.interfaces.IUserService; @Stateless(name="UserService",mappedName="UserService") @Remote public class UserService implements IUserService { @PersistenceContext(unitName="JPATestPU",type=PersistenceContextType.TRANSACTION) private EntityManager em; @TransactionAttribute(TransactionAttributeType.REQUIRED) public User getUserById(long userId) { User retVal = null; if(userId > 0) { retVal = (User)this.getEm().find(User.class, userId); } return retVal; } @TransactionAttribute(TransactionAttributeType.REQUIRED) public List<User> getUsers() { List<User> retVal = null; String sql; sql = "SELECT u FROM User u ORDER BY u.id ASC"; Query q = this.getEm().createQuery(sql); retVal = (List<User>)q.getResultList(); return retVal; } @TransactionAttribute(TransactionAttributeType.REQUIRED) public void save(User user) { this.getEm().persist(user); } @TransactionAttribute(TransactionAttributeType.REQUIRED) public boolean remove(long 
userId) { boolean retVal = false; if(userId > 0) { User user = null; user = (User)this.getEm().find(User.class, userId); if(user != null) this.getEm().remove(user); if(this.getEm().find(User.class, userId) == null) retVal = true; } return retVal; } public EntityManager getEm() { return em; } public void setEm(EntityManager em) { this.em = em; } } package org.jaredstevens.jpatest.actions.user; import javax.ejb.EJB; import org.jaredstevens.jpatest.db.entities.User; import org.jaredstevens.jpatest.db.interfaces.IUserService; import com.opensymphony.xwork2.ActionSupport; public class UserAction extends ActionSupport { @EJB(mappedName="UserService") private IUserService userService; private static final long serialVersionUID = 1L; private String userId; private String name; private String email; private User user; public String getUserById() { String retVal = ActionSupport.SUCCESS; this.setUser(userService.getUserById(Long.parseLong(this.userId))); return retVal; } public String save() { String retVal = ActionSupport.SUCCESS; User user = new User(); if(this.getUserId() != null && Long.parseLong(this.getUserId()) > 0) user.setId(Long.parseLong(this.getUserId())); user.setName(this.getName()); user.setEmail(this.getEmail()); userService.save(user); this.setUser(user); return retVal; } public String getUserId() { return this.userId; } public void setUserId(String userId) { this.userId = userId; } public String getName() { return this.name; } public void setName( String name ) { this.name = name; } public String getEmail() { return this.email; } public void setEmail( String email ) { this.email = email; } public User getUser() { return this.user; } public void setUser(User user) { this.user = user; } } package org.jaredstevens.jpatest.utils; import com.opensymphony.xwork2.ActionInvocation; import com.opensymphony.xwork2.interceptor.Interceptor; public class EJBAnnotationProcessorInterceptor implements Interceptor { private static final long serialVersionUID = 1L; public void destroy() { } public void init() { } public String intercept(ActionInvocation ai) throws Exception { EJBAnnotationProcessor.process(ai.getAction()); return ai.invoke(); } } package org.jaredstevens.jpatest.utils; import java.lang.reflect.Field; import javax.ejb.EJB; import javax.naming.Context; import javax.naming.InitialContext; import javax.naming.NamingException; public class EJBAnnotationProcessor { public static void process(Object instance)throws Exception{ Field[] fields = instance.getClass().getDeclaredFields(); if(fields != null && fields.length > 0){ EJB ejb; for(Field field : fields){ ejb = field.getAnnotation(EJB.class); if(ejb != null){ field.setAccessible(true); field.set(instance, EJBAnnotationProcessor.getEJB(ejb.mappedName())); } } } } private static Object getEJB(String mappedName) { Object retVal = null; String path = ""; Context cxt = null; String[] paths = {"cell/nodes/virgoNode01/servers/server1/","java:module/"}; for( int i=0; i < paths.length; ++i ) { try { path = paths[i]+mappedName; cxt = new InitialContext(); retVal = cxt.lookup(path); if(retVal != null) break; } catch (NamingException e) { retVal = null; } } return retVal; } } <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE struts PUBLIC "-//Apache Software Foundation//DTD Struts Configuration 2.0//EN" "http://struts.apache.org/dtds/struts-2.0.dtd"> <struts> <constant name="struts.devMode" value="true" /> <package name="basicstruts2" namespace="/diagnostics" extends="struts-default"> <interceptors> <interceptor name="ejbAnnotationProcessor" 
class="org.jaredstevens.jpatest.utils.EJBAnnotationProcessorInterceptor"/> <interceptor-stack name="baseStack"> <interceptor-ref name="defaultStack"/> <interceptor-ref name="ejbAnnotationProcessor"/> </interceptor-stack> </interceptors> <default-interceptor-ref name="baseStack"/> </package> <package name="restAPI" namespace="/conduit" extends="json-default"> <interceptors> <interceptor name="ejbAnnotationProcessor" class="org.jaredstevens.jpatest.utils.EJBAnnotationProcessorInterceptor" /> <interceptor-stack name="baseStack"> <interceptor-ref name="defaultStack" /> <interceptor-ref name="ejbAnnotationProcessor" /> </interceptor-stack> </interceptors> <default-interceptor-ref name="baseStack" /> <action name="UserAction.getUserById" class="org.jaredstevens.jpatest.actions.user.UserAction" method="getUserById"> <result type="json"> <param name="ignoreHierarchy">false</param> <param name="includeProperties"> ^user\.id, ^user\.name, ^user\.email </param> </result> <result name="error" type="json" /> </action> <action name="UserAction.save" class="org.jaredstevens.jpatest.actions.user.UserAction" method="save"> <result type="json"> <param name="ignoreHierarchy">false</param> <param name="includeProperties"> ^user\.id, ^user\.name, ^user\.email </param> </result> <result name="error" type="json" /> </action> </package> </struts> Stack Trace java.lang.NullPointerException org.jaredstevens.jpatest.actions.user.UserAction.save(UserAction.java:38) sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60) sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37) java.lang.reflect.Method.invoke(Method.java:611) com.opensymphony.xwork2.DefaultActionInvocation.invokeAction(DefaultActionInvocation.java:453) com.opensymphony.xwork2.DefaultActionInvocation.invokeActionOnly(DefaultActionInvocation.java:292) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:255) org.jaredstevens.jpatest.utils.EJBAnnotationProcessorInterceptor.intercept(EJBAnnotationProcessorInterceptor.java:21) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) org.apache.struts2.interceptor.debugging.DebuggingInterceptor.intercept(DebuggingInterceptor.java:256) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.DefaultWorkflowInterceptor.doIntercept(DefaultWorkflowInterceptor.java:176) com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.validator.ValidationInterceptor.doIntercept(ValidationInterceptor.java:265) org.apache.struts2.interceptor.validation.AnnotationValidationInterceptor.doIntercept(AnnotationValidationInterceptor.java:68) com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.ConversionErrorInterceptor.intercept(ConversionErrorInterceptor.java:138) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.ParametersInterceptor.doIntercept(ParametersInterceptor.java:211) com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98) 
com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.ParametersInterceptor.doIntercept(ParametersInterceptor.java:211) com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.StaticParametersInterceptor.intercept(StaticParametersInterceptor.java:190) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) org.apache.struts2.interceptor.MultiselectInterceptor.intercept(MultiselectInterceptor.java:75) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) org.apache.struts2.interceptor.CheckboxInterceptor.intercept(CheckboxInterceptor.java:90) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) org.apache.struts2.interceptor.FileUploadInterceptor.intercept(FileUploadInterceptor.java:243) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.ModelDrivenInterceptor.intercept(ModelDrivenInterceptor.java:100) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.ScopedModelDrivenInterceptor.intercept(ScopedModelDrivenInterceptor.java:141) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.ChainingInterceptor.intercept(ChainingInterceptor.java:145) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.PrepareInterceptor.doIntercept(PrepareInterceptor.java:171) com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.I18nInterceptor.intercept(I18nInterceptor.java:176) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) org.apache.struts2.interceptor.ServletConfigInterceptor.intercept(ServletConfigInterceptor.java:164) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.AliasInterceptor.intercept(AliasInterceptor.java:192) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) com.opensymphony.xwork2.interceptor.ExceptionMappingInterceptor.intercept(ExceptionMappingInterceptor.java:187) com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:249) org.apache.struts2.impl.StrutsActionProxy.execute(StrutsActionProxy.java:54) org.apache.struts2.dispatcher.Dispatcher.serviceAction(Dispatcher.java:511) org.apache.struts2.dispatcher.ng.ExecuteOperations.executeAction(ExecuteOperations.java:77) org.apache.struts2.dispatcher.ng.filter.StrutsPrepareAndExecuteFilter.doFilter(StrutsPrepareAndExecuteFilter.java:91) com.ibm.ws.webcontainer.filter.FilterInstanceWrapper.doFilter(FilterInstanceWrapper.java:188) com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:116) com.ibm.ws.webcontainer.filter.WebAppFilterChain._doFilter(WebAppFilterChain.java:77) com.ibm.ws.webcontainer.filter.WebAppFilterManager.doFilter(WebAppFilterManager.java:908) 
com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:997) com.ibm.ws.webcontainer.extension.DefaultExtensionProcessor.invokeFilters(DefaultExtensionProcessor.java:1062) com.ibm.ws.webcontainer.extension.DefaultExtensionProcessor.handleRequest(DefaultExtensionProcessor.java:982) com.ibm.ws.webcontainer.webapp.WebApp.handleRequest(WebApp.java:3935) com.ibm.ws.webcontainer.webapp.WebGroup.handleRequest(WebGroup.java:276) com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:931) com.ibm.ws.webcontainer.WSWebContainer.handleRequest(WSWebContainer.java:1583) com.ibm.ws.webcontainer.channel.WCChannelLink.ready(WCChannelLink.java:186) com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleDiscrimination(HttpInboundLink.java:452) com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleNewRequest(HttpInboundLink.java:511) com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.processRequest(HttpInboundLink.java:305) com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.ready(HttpInboundLink.java:276) com.ibm.ws.tcp.channel.impl.NewConnectionInitialReadCallback.sendToDiscriminators(NewConnectionInitialReadCallback.java:214) com.ibm.ws.tcp.channel.impl.NewConnectionInitialReadCallback.complete(NewConnectionInitialReadCallback.java:113) com.ibm.ws.tcp.channel.impl.AioReadCompletionListener.futureCompleted(AioReadCompletionListener.java:165) com.ibm.io.async.AbstractAsyncFuture.invokeCallback(AbstractAsyncFuture.java:217) com.ibm.io.async.AsyncChannelFuture.fireCompletionActions(AsyncChannelFuture.java:161) com.ibm.io.async.AsyncFuture.completed(AsyncFuture.java:138) com.ibm.io.async.ResultHandler.complete(ResultHandler.java:204) com.ibm.io.async.ResultHandler.runEventProcessingLoop(ResultHandler.java:775) com.ibm.io.async.ResultHandler$2.run(ResultHandler.java:905) com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1604)
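    Since the NullPointerException is thrown inside UserAction.save itself, it points at the userService reference (rather than the EntityManager inside the bean) being null, which would mean the interceptor's JNDI lookup never found the bean on WebSphere. Of the two paths it tries, "java:module/..." is an EJB 3.1 portable name that WebSphere 7 (an EJB 3.0 container) does not provide, the hard-coded cell/nodes/virgoNode01/servers/server1 path must match the actual WebSphere topology, and the NamingException is silently swallowed either way. A small sketch along these lines (the candidate names are assumptions for illustration; the authoritative bindings are the EJB binding messages WebSphere writes to SystemOut.log at application start) makes the actual lookup failures visible:

    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;

    public class EjbLookupProbe {
        // Candidate names are assumptions for illustration; confirm the real ones
        // from the binding messages in SystemOut.log when the EAR starts.
        private static final String[] CANDIDATES = {
            "UserService",
            "ejblocal:org.jaredstevens.jpatest.db.interfaces.IUserService",
            "org.jaredstevens.jpatest.db.interfaces.IUserService"
        };

        public static void probe() {
            for (String name : CANDIDATES) {
                try {
                    Context ctx = new InitialContext();
                    Object ref = ctx.lookup(name);
                    System.out.println("lookup(" + name + ") -> " + ref);
                } catch (NamingException e) {
                    // Logging instead of swallowing the exception is the point:
                    // the original interceptor hides why every lookup failed.
                    System.out.println("lookup(" + name + ") failed: " + e);
                }
            }
        }
    }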

    Read the article
