Search Results

Search found 1342 results on 54 pages for 'maven cargo'.

Page 31 of 54

  • Why is javac 1.5 running so slowly compared with the Eclipse compiler?

    - by Simon Nickerson
    I have a Java Maven project with about 800 source files (some generated by javacc/JTB) which is taking a good 25 minutes to compile with javac. When I changed my pom.xml over to use the Eclipse compiler, it takes about 30 seconds to compile. Any suggestions as to why javac (1.5) is running so slowly? (I don't want to switch over to the Eclipse compiler permanently, as the plugin for Maven seems more than a little buggy.) I have a test case which easily reproduces the problem. The following code generates a number of source files in the default package. If you try to compile ImplementingClass.java with javac, it will seem to pause for an inordinately long time.

        import java.io.File;
        import java.io.FileNotFoundException;
        import java.io.PrintStream;

        public class CodeGenerator {
            private final static String PATH = System.getProperty("java.io.tmpdir");
            private final static int NUM_TYPES = 1000;

            public static void main(String[] args) throws FileNotFoundException {
                PrintStream interfacePs = new PrintStream(PATH + File.separator + "Interface.java");
                PrintStream abstractClassPs = new PrintStream(PATH + File.separator + "AbstractClass.java");
                PrintStream implementingClassPs = new PrintStream(PATH + File.separator + "ImplementingClass.java");

                interfacePs.println("public interface Interface<T> {");
                abstractClassPs.println("public abstract class AbstractClass<T> implements Interface<T> {");
                implementingClassPs.println("public class ImplementingClass extends AbstractClass<Object> {");

                for (int i = 0; i < NUM_TYPES; i++) {
                    String nodeName = "Node" + i;
                    PrintStream nodePs = new PrintStream(PATH + File.separator + nodeName + ".java");
                    nodePs.printf("public class %s { }\n", nodeName);
                    nodePs.close();
                    interfacePs.printf("void visit(%s node, T obj);%n", nodeName);
                    abstractClassPs.printf("public void visit(%s node, T obj) { System.out.println(obj.toString()); }%n", nodeName);
                }

                interfacePs.println("}");
                abstractClassPs.println("}");
                implementingClassPs.println("}");

                interfacePs.close();
                abstractClassPs.close();
                implementingClassPs.close();
            }
        }

    Read the article

  • m2eclipse sets JDK compliance to 1.4

    - by jihedamine
    Using Eclipse 3.5, when I create a new Maven project, m2eclipse automatically adds J2SE-1.4 to the libraries and sets the compiler compliance level to 1.4 (Project properties > Java Compiler). My JRE system library is 1.6 and my default compiler compliance level is 1.6; I don't even have 1.4 installed. Can I make m2eclipse use my default settings and prevent it from modifying the project settings?
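
    m2eclipse derives the Eclipse compiler settings from the POM rather than from the workspace defaults, and with no explicit compiler configuration it falls back to an old default level (hence the 1.4). A minimal sketch of pinning the level in pom.xml, assuming Java 6 is the target:

        <build>
          <plugins>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-compiler-plugin</artifactId>
              <configuration>
                <!-- Compile for Java 6; m2eclipse maps this to the Eclipse compliance level -->
                <source>1.6</source>
                <target>1.6</target>
              </configuration>
            </plugin>
          </plugins>
        </build>

    After editing the POM, updating the project configuration from the project's Maven context menu should realign the compliance level and the JRE container.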

    Read the article

  • How to find where a library is used across multiple pom files

    - by Pablojim
    We have multiple Maven projects depending on our own common libraries. When we upgrade a library, it would be useful to quickly find out which projects have a dependency on it (and might need to pick up the new version). Obviously I can look through all the pom files by hand or write a script to do it, but this is less than ideal. Are there any tools that provide this functionality, e.g. a Hudson plugin, Nexus, Artifactory etc.?
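
    Short of a repository-manager feature, one hedged option is the dependency plugin's tree goal with an artifact filter, run per project (or once from an aggregator POM to cover all of its modules); the coordinates below are placeholders:

        # Lists the dependency paths, if any, that pull in the given artifact
        mvn dependency:tree -Dincludes=com.mycompany:common-lib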

    Read the article

  • How to use a properties file with Hudson at compile time?

    - by Neuquino
    Hi, I have a pom.xml that uses cxf-codegen-plugin to generate a couple of WS clients. Inside the cxf-codegen-plugin configuration are the WSDL locations. I would like to externalize those strings to an env.properties file. I used org.codehaus.mojo's properties-maven-plugin to read src/main/resources/conf/app/env.properties. How can I make Hudson replace those properties with the appropriate host? Thanks in advance.
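
    One hedged approach, since properties loaded this way are ordinary Maven properties: reference a property in the WSDL location and let the Hudson job override it on the command line (the property name and host below are placeholders, and command-line -D values normally take precedence during interpolation):

        <!-- pom.xml excerpt: the WSDL location is parameterized -->
        <wsdlOptions>
          <wsdlOption>
            <wsdl>http://${wsdl.host}/services/MyService?wsdl</wsdl>
          </wsdlOption>
        </wsdlOptions>

    The Hudson job's Maven goals would then be something like clean install -Dwsdl.host=ci-server.example.com, while local builds keep the value from env.properties.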

    Read the article

  • commons-logging-1.1.jar; cannot read zip file entry

    - by user1226162
    I have imported a GWT project from Git, but when I run a Maven install it says:

        .m2\repository\commons-logging\commons-logging\1.1\commons-logging-1.1.jar; cannot read zip file entry

    and if I simply run my application, I get this:

        \git\my-Search-Engine\qsse\war: java.lang.NoClassDefFoundError: com/google/inject/servlet/GuiceServletContextListener

    I tried to find a way around it: one solution I found was to move guice-servlet-3.0 from the build path to \qsse\war\WEB-INF\lib, but if I do that I start getting the exception java.lang.NoClassDefFoundError: com/google/inject/Injector. Any idea how I can resolve this?
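
    The "cannot read zip file entry" message usually points to a corrupted artifact in the local repository, so a hedged first step is to delete that one artifact and let Maven download it again:

        # Delete the suspect artifact (path taken from the error message);
        # on Windows the repository lives under %USERPROFILE%\.m2\repository
        rm -rf ~/.m2/repository/commons-logging/commons-logging/1.1

        # Re-run the build, forcing Maven to re-check remote repositories
        mvn -U clean install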

    Read the article

  • JUnit run not picking up a file from src/test/resources that is required by a dependency jar

    - by saddy-dj
    Hi, I'm facing an issue where a file in test/resources is not picked up; instead, the copy in a jar's main/resources is used. The scenario: MyProject has src/test/resources containing a config.xml that is needed by abc.jar, which is a dependency of MyProject. When running MyProject's test cases, the config.xml from abc.jar is loaded instead of the one in MyProject's test/resources. I need to know the order in which Maven picks up resources, or whether what I'm trying to do is simply not possible. Thanks.
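
    For what it's worth, Surefire normally puts target/test-classes ahead of the module's own classes and of dependency jars on the test classpath, so the test copy should win. A quick hedged way to see every copy the classloader can find, in lookup order:

        import java.io.IOException;
        import java.net.URL;
        import java.util.Enumeration;

        // Drop this into a test (or call it from one) to list every config.xml on the classpath.
        public class ResourceOrderCheck {
            public static void main(String[] args) throws IOException {
                Enumeration<URL> urls = Thread.currentThread()
                        .getContextClassLoader().getResources("config.xml");
                while (urls.hasMoreElements()) {
                    System.out.println(urls.nextElement());
                }
            }
        }

    The first URL printed is the one a plain getResource/getResourceAsStream lookup will use.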

    Read the article

  • Performance Testing through distributed jmeter instances and bamboo

    - by user1617754
    I'm working on performance tests for several services running in an Amazon network. Our architecture is: a continuous integration server running in our facilities (Bamboo); a JMeter server instance in the same network as the services under test; a JMeter client in our facilities, connected to the JMeter server over SSH tunnels. I want to start the test execution from Bamboo, and see the results there too.

        Bamboo with JMeter client  <---------->  JMeter server on Amazon  <---------->  WebService on Amazon

    Has anybody tried something like this?
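
    A hedged sketch of the Bamboo side, assuming the JMeter client is installed on the build agent and the server is reachable through the tunnel (host, test plan and output paths are placeholders):

        # Run the plan in non-GUI mode, driving the remote JMeter server through the tunnel,
        # and write the results where Bamboo can collect them as a build artifact.
        jmeter -n -t perf/services.jmx -R jmeter-server-tunnel -l results/services.jtl

    Bamboo would run this as a script/command task, with the .jtl file published as an artifact or fed to a results parser.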

    Read the article

  • java.lang.OutOfMemoryError: unable to create new native thread

    - by Brad
    I consistently get this exception when trying to run my JUnit tests on my Mac:

        java.lang.OutOfMemoryError: unable to create new native thread
            at java.lang.Thread.start0(Native Method)
            at java.lang.Thread.start(Thread.java:658)
            at java.util.concurrent.ThreadPoolExecutor.addIfUnderMaximumPoolSize(ThreadPoolExecutor.java:727)
            at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:657)
            at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:92)
            at com.google.appengine.tools.development.ApiProxyLocalImpl$PrivilegedApiAction.run(ApiProxyLocalImpl.java:197)
            at com.google.appengine.tools.development.ApiProxyLocalImpl$PrivilegedApiAction.run(ApiProxyLocalImpl.java:184)
            at java.security.AccessController.doPrivileged(Native Method)
            at com.google.appengine.tools.development.ApiProxyLocalImpl.doAsyncCall(ApiProxyLocalImpl.java:172)
            at com.google.appengine.tools.development.ApiProxyLocalImpl.makeAsyncCall(ApiProxyLocalImpl.java:138)

    The same set of unit tests passes perfectly fine on Ubuntu and Windows. Some information about my system resources on the Mac:

        $ ulimit -a
        core file size        (blocks, -c) 0
        data seg size         (kbytes, -d) unlimited
        file size             (blocks, -f) unlimited
        max locked memory     (kbytes, -l) unlimited
        max memory size       (kbytes, -m) unlimited
        open files                    (-n) 1024
        pipe size          (512 bytes, -p) 1
        stack size            (kbytes, -s) 8192
        cpu time             (seconds, -t) unlimited
        max user processes            (-u) 266
        virtual memory        (kbytes, -v) unlimited

        $ java -version
        java version "1.6.0_24"
        Java(TM) SE Runtime Environment (build 1.6.0_24-b07-334-10M3326)
        Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02-334, mixed mode)

    The reason I don't think this is an application issue is that the same tests pass in different environments. I have tried setting the heap to 1024m and 512m and the stack to 64k and 128k (and each of these combinations) with no luck. My open files limit was originally 256 and I have bumped it to 1024. I have been googling around for a bit and all posts say to decrease the heap size and increase the stack size, but that doesn't seem to help. Anyone have any more ideas?

    EDIT: Here is some environment information on my Ubuntu box:

        $ ulimit -a
        core file size        (blocks, -c) 0
        data seg size         (kbytes, -d) unlimited
        scheduling priority           (-e) 20
        file size             (blocks, -f) unlimited
        pending signals               (-i) 16382
        max locked memory     (kbytes, -l) 64
        max memory size       (kbytes, -m) unlimited
        open files                    (-n) 1024
        pipe size          (512 bytes, -p) 8
        POSIX message queues   (bytes, -q) 819200
        real-time priority            (-r) 0
        stack size            (kbytes, -s) 8192
        cpu time             (seconds, -t) unlimited
        max user processes            (-u) unlimited
        virtual memory        (kbytes, -v) unlimited
        file locks                    (-x) unlimited

        $ java -version
        java version "1.6.0_24"
        Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
        Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02, mixed mode)
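
    One hedged observation from the two ulimit dumps above: on the Mac the max user processes limit is 266, while on Ubuntu it is unlimited, and on OS X that per-user process/thread budget is a common cause of "unable to create new native thread". A sketch of raising it for the shell that launches the tests (the value is illustrative):

        # Raise the per-user process/thread limit for this shell session, then re-run the tests from it
        ulimit -u 1024
        mvn test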

    Read the article

  • Running Sonatype Nexus in Tomcat 7.0, Tomcat blocking PUT requests

    - by gdm
    I was previously running Nexus 1.8 on OS X and uploading jars for releases without any issues. The OS X box died, so I moved to a FreeBSD server. Since Nexus doesn't have binaries for FreeBSD, I decided to run it in my Tomcat container. Now I have set up Nexus 1.9 in Tomcat 7.0 on FreeBSD. Everything is working well, except that I can't upload jars to my release or snapshot repositories. If I try via Hudson, I get a 401 error (and no further details). If I try manually via curl, I get an error message back from Tomcat: "This request requires HTTP authentication." Why is Tomcat giving this error, and how do I stop it? If I look in the Nexus logs I can see that the PUT request doesn't even reach Nexus; Tomcat is intercepting it.
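
    Since the 401 is issued before the request reaches Nexus, a hedged way to narrow things down is to send an authenticated PUT directly and compare the behaviour with and without credentials (URL, coordinates and credentials below are placeholders):

        # Authenticated upload straight to the hosted releases repository
        curl -v -u deployer:secret --upload-file my-lib-1.0.jar \
          "http://myserver:8080/nexus/content/repositories/releases/com/example/my-lib/1.0/my-lib-1.0.jar"

    If the authenticated request succeeds, the Hudson/Maven side usually only needs a <server> entry in settings.xml whose id matches the distributionManagement repository; if it still fails, the culprit is more likely a security constraint in Tomcat's own web.xml or a valve in front of the Nexus context.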

    Read the article

  • JUnit Parameterized Runner and mvn Surefire Report integration

    - by fraido
    I'm using the JUnit Parameterized runner and the Maven Surefire Report plugin to generate detailed reports during the mvn site phase. I have something like this:

        @RunWith(Parameterized.class)
        public class MyTest {

            private String string1;
            private String string2;

            @Parameterized.Parameters
            public static Collection params() {
                return Arrays.asList(new String[][] { { "1", "2" }, { "3", "4" }, { "5", "6" } });
            }

            public MyTest(String string1, String string2) {
                this.string1 = string1;
                this.string2 = string2;
            }

            @Test
            public void myTestMethod() { ... }

            @Test
            public void myOtherTestMethod() { ... }
        }

    The report shows something like:

        myTestMethod[0]       0.018
        myTestMethod[1]       0.009
        myTestMethod[2]       0.009
        ...
        myOtherTestMethod[0]  0.018
        myOtherTestMethod[1]  0.009
        myOtherTestMethod[2]  0.009
        ...

    Is there a way to display something other than the iteration number [0]..[1]..etc.? The constructor parameters would be much more useful information. For example:

        myTestMethod["1", "2"]  0.018
        ...
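
    For what it's worth, later JUnit 4 releases (4.11 and up) let the Parameterized runner label each iteration from the parameter values, and that label is what ends up in the Surefire output; a minimal sketch, assuming an upgrade is an option:

        // Iterations are then reported as e.g. myTestMethod[1-2] instead of myTestMethod[0]
        @Parameterized.Parameters(name = "{0}-{1}")
        public static Collection<Object[]> params() {
            return Arrays.asList(new Object[][] { { "1", "2" }, { "3", "4" }, { "5", "6" } });
        }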

    Read the article

  • m2eclipse workspace resolution

    - by Bartosz Radaczynski
    Hi all, I am using m2eclipse for managing Maven projects in Eclipse. It seems that in the previous release I was using (0.9.8) the workspace resolution did not work at all, but right now it also does not work quite as I would expect. Namely, when the "resolve dependencies from workspace" setting for a project is not checked, the project turns red and cannot be built. The message says: artifact xxx x.y-SNAPSHOT cannot be found in the local repository (or something to that effect). The trouble is that m2eclipse is putting information about the workspace project into my local repo. Is there a way to change this behaviour? P.S. The workaround for this is to close the xxx project; m2eclipse then resolves the dependency to whatever version I had previously in the local repository (i.e. the non-snapshot version).
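
    A hedged workaround while workspace resolution is switched off: install the SNAPSHOT artifact into the local repository yourself, so the dependent project can resolve it the ordinary Maven way (the project path is a placeholder):

        # From the root of the xxx project: publish the current SNAPSHOT to ~/.m2/repository
        cd ../xxx
        mvn clean install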

    Read the article

  • Build Issue with multi module project

    - by vijay.shad
    Hi, I have a multi-module web project. Four modules of the project are packaged as jars and added as dependencies of the fifth module, which is packaged as a war. When it is time to deploy the application I just run package on the war project, and my war is created with all the dependencies. Now there is a problem: one of my modules has heavy changes, and when I created the war these changes were not reflected in the output war file (the jar in the war's lib folder still has the old code). Can you please point out what I am missing from the release process? Why is the old code being packaged into the war? Can you also point me to a good resource on a real-life build process using Maven? Regards, Vijay
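
    When only the war module is packaged, Maven resolves the sibling modules from the local repository, which can still hold stale jars from an earlier install. A hedged sketch of the usual fix is to build the whole reactor from the parent so every module is rebuilt before the war is assembled:

        # From the parent (aggregator) directory: rebuild and install all modules, including the war
        mvn clean install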

    Read the article

  • Sonar default, meet "container state was: CONSTRUCTED"

    - by larry cai
    Environment: Hudson/Sonar/Maven 2 on Ubuntu, locally, with default parameters. I got the log below from Hudson, and I can't figure out where the problem is.

        [INFO] Sonar host: http://localhost:9000
        [INFO] Sonar version: 2.0.1
        [INFO] [sonar-core:internal {execution: default-internal}]
        [INFO] Database dialect class org.sonar.api.database.dialect.Derby
        [INFO] -------------  Analyzing Game of Life business logic module
        [INFO] ------------------------------------------------------------------------
        [ERROR] BUILD ERROR
        [INFO] ------------------------------------------------------------------------
        [INFO] Can not execute Sonar
        Embedded error: Can not analyze the project
        Cannot stop. Current container state was: CONSTRUCTED
        [INFO] ------------------------------------------------------------------------
        [INFO] Trace
        org.apache.maven.lifecycle.LifecycleExecutionException: Can not execute Sonar

    I notice it also has the same problem when I run it from the command line, without Hudson: mvn sonar:sonar

    Read the article

  • M2Eclipse and EAR projects on Weblogic

    - by Steve
    How can I import a Maven EAR project into Eclipse 3.4 and be able to use the IDE (WTP) to deploy the EAR successfully to WebLogic 9.2? The main issue is that the dependent jars are not being included in the EAR (under APP-INF/lib) when it gets deployed through the IDE. When I build from the command line, the EAR is exactly how I want it. I am using the APP-INF/lib configuration for the EAR plugin, and have included the jarModule sections for all the required jars. When editing the Eclipse EAR project's Java EE components, all the jars are listed, but not in APP-INF/lib. Only when I open a dependent jar project do those specific jars get placed under that subfolder. All the third-party jars show that they will end up in the wrong place. If you need more info, just let me know. Thanks!
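
    For reference, a minimal sketch of the ear plugin configuration the question describes for the command-line build (artifact coordinates are placeholders); the WTP/Java EE module settings would need to mirror the same layout:

        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-ear-plugin</artifactId>
          <configuration>
            <!-- Put library jars under APP-INF/lib inside the EAR -->
            <defaultLibBundleDir>APP-INF/lib</defaultLibBundleDir>
            <modules>
              <jarModule>
                <groupId>com.example</groupId>
                <artifactId>shared-lib</artifactId>
                <bundleDir>APP-INF/lib</bundleDir>
              </jarModule>
            </modules>
          </configuration>
        </plugin>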

    Read the article

  • m2eclipse resource filtering

    - by drewzilla
    I'm having problems with resource filtering using m2eclipse Maven support in Eclipse. It seems that filtering only takes place on resources that have changed. This is fundamentally flawed because, if I have a file that references a property (e.g. ${my.property}) and the value of that property changes, the filtering will only be performed if the referencing file is also modified; if I only change the property value (in my pom.xml), the filtering is not applied to the files that reference it. So, if I make a change to a property in my pom file, the filtering is not applied. However, if I then go to the file that references that property (e.g. a Spring config file) and edit and save it, the filtering is applied. I did read somewhere that "m2eclipse skips filtering if there were no resource changes during incremental build". I'm using m2eclipse 0.10.x. Has anyone else come across this? Thanks, Andrew
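
    A hedged workaround until the incremental build notices the change: force a full filtering pass yourself, either with Project > Clean... in Eclipse or from the command line:

        # Re-runs resource filtering for the module, refreshing the copies under target/classes
        mvn process-resources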

    Read the article

  • How to override TOMCAT Oracle ojdbc14 driver in the application?

    - by Luís Henrique Rocha
    The Tomcat server is using an Oracle 9 ojdbc14 driver for its JNDI connections, located in the /common/lib folder. My web application uses Maven + Spring, and I'm getting the dataSource using Spring's JNDI features. I'm trying to bypass Tomcat's old ojdbc14 driver with a newer one (ojdbc14 10.2.0.4.0). I've tried putting the jars in the WEB-INF/lib folder as a project dependency, but it doesn't work; the application keeps using the old Oracle driver that is in the Tomcat folder. I'm trying to bypass the Tomcat Oracle driver because I cannot update it to the newest version, as there are lots of other projects using it. Does anyone have a clue?
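
    A hedged observation: a JNDI DataSource is created by Tomcat itself, so its connections are opened with the driver visible to the container (common/lib), no matter what sits in WEB-INF/lib. One way to sidestep that, sketched below with placeholder values, is to define the DataSource inside the webapp so the driver in WEB-INF/lib is the one that gets loaded (DriverManagerDataSource is not pooled; a pooling implementation could be substituted):

        <!-- Spring context excerpt: DataSource owned by the webapp instead of looked up via JNDI -->
        <bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
            <property name="driverClassName" value="oracle.jdbc.OracleDriver"/>
            <property name="url" value="jdbc:oracle:thin:@dbhost:1521:ORCL"/>
            <property name="username" value="scott"/>
            <property name="password" value="tiger"/>
        </bean>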

    Read the article

  • How to figure out which jars are needed?

    - by Ari
    How can I systematically determine which jars I'll need, and thus should include in my pom.xml file (I'm using Maven as my project management tool)? When learning Spring, to keep things simple, I added all the jars (even the ones I never used) to the classpath. Right now, for the most part, I'm guessing which jars to include. For example, I know that in my Spring configuration file I have:

        <tx:annotation-driven />
        <context:annotation-config />
        <aop:aspectj-autoproxy />

    So I guess I'll need spring-context-x.x.x.jar, spring-tx-x.x.x.jar and spring-aop-x.x.x.jar. Thanks.
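
    Once the build compiles, one hedged aid is the dependency plugin's analyze goal, which compares what the bytecode actually references against what the POM declares:

        # Reports "Used undeclared dependencies" and "Unused declared dependencies"
        mvn dependency:analyze

    One caveat: the analysis works on bytecode, so jars that are only pulled in through XML configuration (like the Spring namespace handlers above) or by reflection can still be reported as unused.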

    Read the article

  • How do I exclude the sources jar in mvn deploy?

    - by Richard
    When I run "mvn deploy:deploy", Maven deploys four files to my internal remote repository:

        [module-name]-1.jar
        [module-name]-1.pom
        [module-name]-1-sources.jar
        [module-name]-1-tests.jar

    There are actually more files, such as md5 and sha1 files, being deployed, but for simplicity I skip those here. Is there any way to exclude [module-name]-1-sources.jar from the deployment process? One way I can think of is to use "mvn deploy:deploy-file", which lets me pinpoint which jar to deploy, but since I have a few dozen modules to deploy, it would be nice if I could configure the exclusion in pom.xml. Otherwise I'll have to write a script to deploy. Thanks, Richard
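
    The deploy plugin uploads every artifact that is attached to the build, so the usual route is to stop attaching the sources jar in the first place. A minimal sketch, assuming the sources jar is produced by the maven-source-plugin:

        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-source-plugin</artifactId>
          <configuration>
            <!-- Build the sources jar if desired, but don't attach it, so deploy skips it -->
            <attach>false</attach>
          </configuration>
        </plugin>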

    Read the article

  • Reading a file from a jar, or anywhere on the classpath?

    - by Stefan Kendall
    I'm trying to build an application that builds a resource file into a jar, but I'd like to have the project runnable within Eclipse. I have a basic Maven 2 structure for my project, and I'm unsure how to read in the file such that it's found and used when run from the JAR or from within Eclipse. Thoughts?

    Structure:

        src/main/java
        src/main/resources/file.txt

    Current reading method:

        getClass().getResourceAsStream("/file.txt")

    Is there a reading method that will pick up src/main/resources/*, as well as the root level of the JAR (where resources are deployed)?
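
    For what it's worth, that call should already behave the same in both environments, because Maven (and the m2eclipse builder) copies src/main/resources to target/classes and into the root of the jar, so "/file.txt" resolves from the classpath either way. A minimal self-contained sketch:

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.InputStreamReader;

        public class ResourceReadDemo {
            public static void main(String[] args) throws IOException {
                // Works in Eclipse (target/classes on the classpath) and from the packaged jar
                InputStream in = ResourceReadDemo.class.getResourceAsStream("/file.txt");
                if (in == null) {
                    throw new IllegalStateException("file.txt not found on the classpath");
                }
                BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
                try {
                    System.out.println(reader.readLine());
                } finally {
                    reader.close();
                }
            }
        }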

    Read the article

  • Measuring code coverage for selenium tests that reside in separate project

    - by ilu
    I have two separate Java Maven projects: one is my web app itself, and the other is the Tellurium + Selenium automation tests for my web app. (I moved the tests to a separate project because their code doesn't really belong to the web app project and doesn't use the web app's Java classes; I also want to reuse parts of those tests for testing my other web apps.) Therefore, the project where my tests reside doesn't know anything about my web app, except for the Tellurium/Selenium configuration files (host name, credentials, browser). So the question: is there any way to measure code coverage of my web app backend as it is invoked by my Tellurium/Selenium tests residing in a separate project? Thanks in advance. Any help is highly appreciated.
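
    Since the tests drive the app over HTTP, coverage has to be collected in the web app's own JVM rather than in the test project. A hedged sketch using a coverage agent such as JaCoCo, assuming a Tomcat-style container (paths are placeholders):

        # Attach the coverage agent to the server JVM before starting the Selenium run;
        # the .exec file is written when the JVM shuts down and can then be turned into
        # a report against the web app's classes and sources.
        export CATALINA_OPTS="-javaagent:/opt/jacoco/lib/jacocoagent.jar=destfile=/tmp/webapp-coverage.exec"
        ./bin/startup.sh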

    Read the article

  • Netbeans 6.8 groovy files in src/main/java

    - by Jeff Storey
    I have a new NetBeans Maven/Groovy project, and I actually prefer to mix my Java and Groovy files in src/main/java and src/test/java (I find it easier to navigate this way, and my pom reflects this configuration). However, when I have my project set up this way in NetBeans 6.8, it always shows the generated-sources folder in error. The stubs generated from the Groovy files in src/test/java can't be opened by NetBeans, and it gives an error that they can't be parsed; in Windows Explorer, however, the files are intact. NetBeans can run the project, but it keeps telling me that some files are in error (even though I know they're not). It's like NetBeans isn't refreshing itself. Any thoughts on how to fix this? Thanks, Jeff

    Read the article
