Search Results

Search found 1621 results on 65 pages for 'maven scm'.

Page 36 of 65

  • Mercurial "server"

    - by user85116
    I've been using Mercurial for a little while, but mainly for my own usage. Now, though, I have a project where two of us are building the same thing, and we will probably be modifying each other's files. I would like to set up a Mercurial repo on a server and make that repo the "server", so that my changes and the other editor's changes both push to it (basically the Subversion/CVS model); I like Mercurial, though, and don't want to switch to something like Subversion.

    Here on my own network everything is Linux, and my "server" has OpenSSH installed, so pushing my changes (I work on multiple computers) from one computer to the server is just a matter of "hg push"; the protocol used to transfer the changes is SSH. The problem is that I use Linux, the real server will be Windows (so no OpenSSH, right?), and the other editor will be using Windows too. As far as I know, the usual way of working with Mercurial in this kind of setup is for the repo to pull changes from the source, rather than the source pushing to the "server". I'm behind several firewalls (not entirely my network) and my computer won't be visible from the server, and I'm assuming the other editor is behind a firewall too (so we can't just start up a local Mercurial HTTP server and have the "server" computer pull from that).

    What's the best way for both editors to get our changes to the server repo? (I should add that the server is on the internet, so just as visible as something like google.com. It's a hosted Windows server, but I would probably have permission to install software on it if needed for this.)
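
    One setup worth sketching here (an assumption, not a tested recipe): Mercurial can serve a repository over HTTP on the Windows box via hgweb (or a plain "hg serve" process), and pushing can be enabled in that repository's configuration, so both editors push over HTTP(S) and no SSH is needed on the server. Host names and paths below are placeholders:

        # .hg/hgrc of the central repository on the Windows server
        [web]
        # who may push; "*" means anyone the web server has authenticated
        allow_push = *
        # only set this to false if the site is already served over HTTPS
        push_ssl = false

        # from each editor's machine
        hg push http://myserver.example.com/hg/myproject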

    Read the article

  • How to automatically add SVN commit messages and revision numbering to a Java file?

    - by John
    I'm working on an Apache Wicket project in Eclipse with Maven 2; my SCM is Subversion. I've got Subclipse set up, which I use to commit changes to the repository. I've seen several projects with nice headers containing the current revision number, and at the bottom of the Java source file there's a list of all the changes that have been committed to the file, including the comments that were passed. Is there any way of achieving this sort of behaviour automatically? At work I'm using MKS, which does this automatically, but I have yet to figure out how to achieve this with SVN and Eclipse.
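
    For reference, Subversion's built-in keyword substitution can keep a revision header current once the svn:keywords property is set on the file (for example: svn propset svn:keywords "Id Revision Date Author" MyClass.java). A minimal sketch of such a header follows; the class name is made up, and note that unlike CVS or MKS, Subversion has no $Log$ keyword, so the full change list at the bottom of the file cannot be generated this way:

        /*
         * $Id$        expands to file name, last revision, date and author
         * $Revision$  expands to the revision in which the file last changed
         * $Date$      expands to the date of that revision
         * $Author$    expands to the author of that revision
         */
        public class MyClass {
            // class body unchanged; only the header matters here
        }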

    Read the article

  • Performance Testing through distributed jmeter instances and bamboo

    - by user1617754
    I'm working on performance tests for several services running in an Amazon network. Our architecture is: a continuous integration server running in our facilities (Bamboo); a JMeter server instance on the same network as the services under test; a JMeter client in our facilities connected to the JMeter server (SSH tunnels). I want to start the execution of the tests from Bamboo, and see the different results on it too:

        Bamboo (with JMeter client)  <--->  JMeter server on Amazon  <--->  web service on Amazon

    Has anybody tried something like this?
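
    For context, the Bamboo side of such a setup is usually just a task that invokes the JMeter client in non-GUI mode and points it at the remote engine; a sketch, with host, port and file names as placeholders (the remote engine is assumed to be reachable through the SSH tunnel on a locally forwarded port):

        # -n non-GUI, -t test plan, -R remote JMeter engine(s), -l results file
        jmeter -n -t perf-plan.jmx -R localhost:1099 -l results.jtl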

    Read the article

  • java.lang.OutOfMemoryError: unable to create new native thread

    - by Brad
    I consistently get this exception when trying to run my JUnit tests on my Mac:

        java.lang.OutOfMemoryError: unable to create new native thread
            at java.lang.Thread.start0(Native Method)
            at java.lang.Thread.start(Thread.java:658)
            at java.util.concurrent.ThreadPoolExecutor.addIfUnderMaximumPoolSize(ThreadPoolExecutor.java:727)
            at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:657)
            at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:92)
            at com.google.appengine.tools.development.ApiProxyLocalImpl$PrivilegedApiAction.run(ApiProxyLocalImpl.java:197)
            at com.google.appengine.tools.development.ApiProxyLocalImpl$PrivilegedApiAction.run(ApiProxyLocalImpl.java:184)
            at java.security.AccessController.doPrivileged(Native Method)
            at com.google.appengine.tools.development.ApiProxyLocalImpl.doAsyncCall(ApiProxyLocalImpl.java:172)
            at com.google.appengine.tools.development.ApiProxyLocalImpl.makeAsyncCall(ApiProxyLocalImpl.java:138)

    The same set of unit tests passes perfectly fine on Ubuntu and Windows. Some information about my system resources on the Mac:

        $ ulimit -a
        core file size          (blocks, -c) 0
        data seg size           (kbytes, -d) unlimited
        file size               (blocks, -f) unlimited
        max locked memory       (kbytes, -l) unlimited
        max memory size         (kbytes, -m) unlimited
        open files                      (-n) 1024
        pipe size            (512 bytes, -p) 1
        stack size              (kbytes, -s) 8192
        cpu time               (seconds, -t) unlimited
        max user processes              (-u) 266
        virtual memory          (kbytes, -v) unlimited

        $ java -version
        java version "1.6.0_24"
        Java(TM) SE Runtime Environment (build 1.6.0_24-b07-334-10M3326)
        Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02-334, mixed mode)

    The reason I don't think this is an application issue is that the same tests pass in different environments. I have tried setting the heap to 1024m and 512m and setting the stack to 64k and 128k (and each of these combinations) with no luck. My open files limit was originally 256 and I have bumped it to 1024. I have been googling around for a bit and all posts say to decrease heap size and increase stack size, but that doesn't seem to help. Anyone have any more ideas?

    EDIT: Here is some environment information on my Ubuntu box:

        $ ulimit -a
        core file size          (blocks, -c) 0
        data seg size           (kbytes, -d) unlimited
        scheduling priority             (-e) 20
        file size               (blocks, -f) unlimited
        pending signals                 (-i) 16382
        max locked memory       (kbytes, -l) 64
        max memory size         (kbytes, -m) unlimited
        open files                      (-n) 1024
        pipe size            (512 bytes, -p) 8
        POSIX message queues     (bytes, -q) 819200
        real-time priority              (-r) 0
        stack size              (kbytes, -s) 8192
        cpu time               (seconds, -t) unlimited
        max user processes              (-u) unlimited
        virtual memory          (kbytes, -v) unlimited
        file locks                      (-x) unlimited

        $ java -version
        java version "1.6.0_24"
        Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
        Java HotSpot(TM) 64-Bit Server VM (build 19.1-b02, mixed mode)
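
    One lead worth checking (an assumption, not a confirmed diagnosis): this error means the JVM could not obtain a native thread from the OS, and the Mac's "max user processes" limit of 266 is far lower than the Ubuntu box's "unlimited", so the test run may simply be hitting that ceiling. The usual knobs, sketched for the shell that launches the tests:

        # raise the per-user process/thread limit for this shell (OS X)
        ulimit -u 1024
        # and/or give each thread a smaller native stack by passing -Xss
        # (e.g. -Xss256k) to whatever JVM actually runs the tests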

    Read the article

  • Running Sonatype Nexus in Tomcat 7.0, Tomcat blocking PUT requests

    - by gdm
    I was previously running Nexus 1.8 on OS X and uploading jars for releases without any issues. The OS X box died, so I moved to a FreeBSD server. Since Nexus doesn't have binaries for FreeBSD, I decided to run it in my Tomcat container. Now I have set up Nexus 1.9 in Tomcat 7.0 on FreeBSD. Everything is working well, except that I can't upload jars to my release or snapshot repositories. If I try via Hudson, I get a 401 error (and no further details). If I try manually via curl, I get an error message back from Tomcat: "This request requires HTTP authentication." Why is Tomcat giving this error, and how do I stop it? If I look in the Nexus logs I can see that the PUT request doesn't even reach Nexus; Tomcat is intercepting it.

    Read the article

  • JUnit Parameterized Runner and mvn Surefire Report integration

    - by fraido
    I'm using the JUnit Parameterized runner and the Maven Surefire Report plugin to generate detailed reports during the mvn site phase. I have something like this:

        @RunWith(Parameterized.class)
        public class MyTest {

            private String string1;
            private String string2;

            @Parameterized.Parameters
            public static Collection params() {
                return Arrays.asList(new String[][] { { "1", "2" }, { "3", "4" }, { "5", "6" } });
            }

            public MyTest(String string1, String string2) {
                this.string1 = string1;
                this.string2 = string2;
            }

            @Test
            public void myTestMethod() { ... }

            @Test
            public void myOtherTestMethod() { ... }
        }

    The report shows something like:

        myTestMethod[0]       0.018
        myTestMethod[1]       0.009
        myTestMethod[2]       0.009
        ...
        myOtherTestMethod[0]  0.018
        myOtherTestMethod[1]  0.009
        myOtherTestMethod[2]  0.009
        ...

    Is there a way to display something other than the iteration number [0]..[1]..etc.? The constructor parameters would be much better information. For example:

        myTestMethod["1", "2"]  0.018
        ...
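
    For reference, newer JUnit 4 releases (4.11 and later) let the Parameterized runner label each iteration from its parameters via the name attribute of @Parameters; whether the Surefire report shows that label depends on the plugin versions in use. A minimal sketch:

        // {index}, {0}, {1}, ... are placeholders filled in per iteration
        @Parameterized.Parameters(name = "{index}: [{0}, {1}]")
        public static Collection<Object[]> params() {
            return Arrays.asList(new Object[][] { { "1", "2" }, { "3", "4" }, { "5", "6" } });
        }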

    Read the article

  • m2eclipse workspace resolution

    - by Bartosz Radaczynski
    Hi all, I am using m2eclipse for managing Maven projects in Eclipse. It seems that in the previous release I was using (0.9.8) workspace resolution did not work at all, but right now it also does not work quite as I would expect. Namely, when the "resolve dependencies from workspace" setting for a project is not checked, the project turns red and cannot be built. The message says: artifact xxx x.y-SNAPSHOT cannot be found in the local repository (or something to that effect). The trouble is that m2eclipse is putting information about the workspace project into my local repo. Is there a way to change this behaviour? P.S. The workaround for this is to close the xxx project; then m2eclipse resolves the dependency to whatever version I had previously in the local repository (i.e. the non-snapshot version).

    Read the article

  • Build Issue with multi module project

    - by vijay.shad
    Hi, I have a multi-module web project. Four modules of the project are packaged as jars and added as dependencies to the fifth module, which is packaged as a war. When it is time to deploy the application I just run package on the war project and my war is created with all the dependencies. Now there is a problem: one of my modules has had heavy changes, and when I create the war for my project these changes are not reflected in the output war file (the jar in the war's lib folder still has the old code). Can you please point out what I am missing in the release process? Why is the old code being packaged into the war? Can you also point me to a good resource on a real-life build process using Maven? Regards, Vijay
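
    For context, a sketch of the usual way to avoid stale module jars, assuming a standard parent/reactor layout (the module name below is a placeholder): the war picks up whatever module jars sit in the local repository, so the changed module has to be rebuilt and installed, most simply by building from the parent so the whole reactor goes together:

        # from the parent (aggregator) directory
        mvn clean install
        # or, with Maven 2.1+/3, build only the war module plus what it depends on
        mvn clean package -pl my-war-module -am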

    Read the article

  • Sonar default, meet "container state was: CONSTRUCTED"

    - by larry cai
    Environment: Hudson/Sonar/Maven 2 on Ubuntu, locally, with default parameters. I got the log below from Hudson, and I can't figure out where the problem is:

        [INFO] Sonar host: http://localhost:9000
        [INFO] Sonar version: 2.0.1
        [INFO] [sonar-core:internal {execution: default-internal}]
        [INFO] Database dialect class org.sonar.api.database.dialect.Derby
        [INFO] ------------- Analyzing Game of Life business logic module
        [INFO] ------------------------------------------------------------------------
        [ERROR] BUILD ERROR
        [INFO] ------------------------------------------------------------------------
        [INFO] Can not execute Sonar
        Embedded error: Can not analyze the project
        Cannot stop. Current container state was: CONSTRUCTED
        [INFO] ------------------------------------------------------------------------
        [INFO] Trace
        org.apache.maven.lifecycle.LifecycleExecutionException: Can not execute Sonar

    And I notice it fails the same way when I run it from the command line without Hudson: mvn sonar:sonar

    Read the article

  • M2Eclipse and EAR projects on Weblogic

    - by Steve
    How can I import a Maven EAR project into Eclipse 3.4 and be able to use the IDE (WTP) to deploy the EAR successfully to WebLogic (9.2)? The main issue is that the dependent jars are not being included in the EAR (under APP-INF/lib) when it gets deployed through the IDE. When I build from the command line, the EAR is exactly how I want it. I am using the APP-INF/lib configuration for the EAR plugin and have included the jarModule sections for all the required jars. When editing the Eclipse EAR project's Java EE components, all the jars are listed, but not under APP-INF/lib. Only when I open a dependent jar project do those specific jars get set under that subfolder. All the 3rd-party jars are showing they will end up in the wrong place. If you need more info, just let me know. Thanks!
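
    For reference, the command-line behaviour described here normally comes from the maven-ear-plugin configuration; a minimal sketch of the relevant part (jarModule entries omitted), mainly useful when comparing against what WTP generates:

        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-ear-plugin</artifactId>
          <configuration>
            <defaultLibBundleDir>APP-INF/lib</defaultLibBundleDir>
          </configuration>
        </plugin>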

    Read the article

  • m2eclipse resource filtering

    - by drewzilla
    I'm having problems with resource filtering using m2eclipse Maven support in Eclipse. It seems that filtering only takes place on resources that have changed. This is fundamentally flawed because, if I have a file that references a property (e.g. ${my.property}) and the value of the property changes, the filtering will only be performed if the referencing file is also modified; if I only change the property value (in my pom.xml), the filtering is not applied to the files that reference it. So, if I make a change to a property in my pom file, the filtering is not applied. However, if I then go to the file that references that property (e.g. a Spring config file) and edit and save it, the filtering is applied. I did read somewhere that: "m2eclipse skips filtering if there were no resource changes during incremental build". I'm using m2eclipse 0.10.x. Has anyone else come across this? Thanks, Andrew

    Read the article

  • How to override TOMCAT Oracle ojdbc14 driver in the application?

    - by Luís Henrique Rocha
    The Tomcat server is using an Oracle 9G ojdbc14 driver for its JNDI connections, located in the /common/lib folder. My web application uses Maven + Spring, and I'm getting the dataSource using Spring's JNDI features. I'm trying to bypass Tomcat's old ojdbc14 driver with a newer one (ojdbc14 10.2.0.4.0). I've tried putting the jars in the WEB-INF/lib folder as a project dependency, but it doesn't work; the application keeps using the old Oracle driver that is in the Tomcat folder. I'm trying to bypass the Tomcat Oracle driver because I cannot update it to the newest version, since lots of other projects are using it. Does anyone have a clue?

    Read the article

  • How to figure out which jars are needed?

    - by Ari
    How can I systematically determine which jars I'll need, and thus should include in my pom.xml file (I'm using Maven as my project management tool)? When learning Spring, to keep things simple, I added all the jars (even the ones I never used) to the classpath. Right now, for the most part, I'm guessing which jars to include. For example, I know that in my Spring configuration file I have:

        <tx:annotation-driven />
        <context:annotation-config />
        <aop:aspectj-autoproxy />

    So I guess I'll need: spring-context-x.x.x.jar, spring-tx-x.x.x.jar, spring-aop-x.x.x.jar. Thanks.
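
    As a general aid (not specific to Spring), the Maven dependency plugin can report what the compiled code actually uses versus what is declared, which takes some of the guessing out of this; note that it analyzes bytecode, so jars pulled in only through XML namespace handlers (like the tx/context/aop namespaces above) may still have to be added by hand:

        # declared-but-unused and used-but-undeclared dependencies
        mvn dependency:analyze
        # the full resolved dependency tree
        mvn dependency:tree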

    Read the article

  • How do I exclude the sources jar in mvn deploy?

    - by Richard
    When I run "mvn deploy:deploy", maven deploys 4 jar files to my internal remote repository. They are: [module-name]-1.jar [module-name]-1.pom [module-name]-1-sources.jar [module-name]-1-tests.jar There are actually more files, such as md5 and sha1 files, being deployed. But for simplicity, I just skip these files here. Is there any way to exclude [module-name]-1-sources.jar from the deployment process? One way I can think of is to use "mvn deploy:deploy-file", which allows me to pinpoint which jar to deploy. But since I have a few dozen modules to deploy, it'll be nice if I can configure the deployment file exclusion in pom.xml. Otherwise, I'll have to write a script to deploy. Thanks, Richard

    Read the article

  • Reading a file from a jar, or anywhere on the classpath?

    - by Stefan Kendall
    I'm trying to build an application that packages a resource file into a jar, but I'd like to have the project runnable within Eclipse. I have a basic Maven 2 structure for my project, and I'm unsure how to read in the file such that it's found and used whether run from the JAR or from within Eclipse. Thoughts?

    Structure:

        src/main/java
        src/main/resources/file.txt

    Current reading method:

        getClass().getResourceAsStream("/file.txt")

    Is there a reading method that will pick up src/main/resources/*, as well as the root level of the JAR (where resources are deployed)?
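
    For what it's worth, a minimal sketch of classpath-based loading with the same resource name as above: Maven puts src/main/resources on the classpath for IDE runs and copies it into the root of the jar for packaged runs, so one lookup covers both, provided the stream is null-checked and closed (the class name here is made up):

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.InputStreamReader;

        public class ResourceReader {
            public static String readFileTxt() throws IOException {
                // works in Eclipse (target/classes on the classpath)
                // and inside the packaged jar (file.txt at the jar root)
                InputStream in = ResourceReader.class.getResourceAsStream("/file.txt");
                if (in == null) {
                    throw new IOException("file.txt not found on the classpath");
                }
                BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
                try {
                    StringBuilder sb = new StringBuilder();
                    String line;
                    while ((line = reader.readLine()) != null) {
                        sb.append(line).append('\n');
                    }
                    return sb.toString();
                } finally {
                    reader.close();
                }
            }
        }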

    Read the article

  • Measuring code coverage for selenium tests that reside in separate project

    - by ilu
    I have two separate Java Maven projects: one is my web app itself, and the other is the Tellurium + Selenium automation tests for that web app (I moved the tests to a separate project because their code doesn't really belong to the web app project's code and doesn't use the web app's Java classes; I also want to reuse parts of those tests for testing my other web apps). Therefore, the project where my tests reside doesn't know anything about my web app, except the Tellurium/Selenium conf files (host name, credentials, browser). So the question: is there any way to measure code coverage of my web app backend that is invoked by my Tellurium/Selenium tests residing in the separate project? Thanks in advance. Any help is highly appreciated.

    Read the article

  • Netbeans 6.8 groovy files in src/main/java

    - by Jeff Storey
    I have a new NetBeans Maven/Groovy project, and I actually prefer to mix my Java and Groovy files in src/main/java and src/test/java (I find it easier to navigate this way, and my pom reflects this configuration). However, when I have my project set up this way in NetBeans 6.8, it always shows the generated-sources folder in error. The stubs generated from Groovy files in src/test/java can't be opened by NetBeans, and it gives an error that they can't be parsed. However, in Windows Explorer the files are intact. NetBeans can run the project, but it continues to tell me that some files are in error (even though I know they're not). It's like NetBeans isn't refreshing itself. Any thoughts on how to fix this? Thanks, Jeff

    Read the article

  • Can someone explain the ivy.xml dependency's conf attribute?

    - by tieTYT
    I can't find any thorough explanation of the Ivy dependency tag's conf attribute:

        <dependency org="hibernate" name="hibernate" rev="3.1.3" conf="runtime, standalone -> runtime(*)"/>

    See that conf attribute? I can't find any explanation (that I can understand) of the right-hand side of the -> symbol. PLEASE keep in mind I don't know the first thing about Maven, so please explain this attribute with that in mind. Yes, I've already looked at this: http://ant.apache.org/ivy/history/latest-release/ivyfile/dependency.html Thanks, Dan

    Read the article

  • Why can I not deploy my ear on Glassfish

    - by hexin
    I have a standard Maven project in NetBeans (NetBeans' enterprise application) that has 1 war, 1 ejb and 1 ear module. I want to inject (with @Inject) my @Stateless bean from the ejb into the war (a REST class) using its interface. I have added beans.xml files in the correct folders of the project, but I'm still getting this:

        Error occurred during deployment: Exception while loading the app :
        WELD-001409 Ambiguous dependencies for type [LogicBean] with qualifiers [@Default]
        at injection point [[field] @Inject private pl.edu.amu.wmi.kino.rk.rest.ReportRest.bean].
        Possible dependencies [[Session bean [class pl.edu.amu.wmi.kino.rk.data.impl.LogicBeanImpl
        with qualifiers [@Any @Default]; local interfaces are [LogicBean],
        Session bean [class pl.edu.amu.wmi.kino.rk.data.impl.LogicBeanImpl
        with qualifiers [@Any @Default]; local interfaces are [LogicBean]]].
        Please see server.log for more details.

    What am I doing wrong? I have searched the whole internet but could not find the solution. I know it is possible because I worked on a project with such a setup. Thanks for any help :)

    Read the article

  • How do I get Tycho?

    - by Jens Schauder
    Tycho is supposed to be a plug-in for Maven for building Eclipse plug-ins. I have found various blogs and other articles about it, but the contained links to Tycho are all dead or not accessible to the general public, for example: http://www.sonatype.com/people/2009/04/tycho-040-roadmap/ The only thing I found is a project proposal on the Eclipse site, but it doesn't reference any downloads: http://www.eclipse.org/proposals/tycho/ I found an svn repository, but it seems to be extremely dated: http://svn.codehaus.org/m2eclipse/tycho/trunk/ So my question is: where do I get Tycho from? Or is it dead and should I stop bothering? I doubt it's dead, since I found out there is a talk about it at JAX 2010...

    Read the article

  • How to repeat a particular execution multiple times

    - by Joshua
    The following snippet generates create/drop SQL for a particular database whenever there is a modification to the JPA entity classes. How do I perform something equivalent to a 'for' loop, wherein the following code can be used to generate SQL for all supported databases (e.g. H2, MySQL, Postgres)? Currently I have to modify db.groupId, db.artifactId and db.driver.version every time to generate the SQL files.

        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>hibernate3-maven-plugin</artifactId>
          <version>${hibernate3-maven-plugin.version}</version>
          <executions>
            <execution>
              <id>create schema</id>
              <phase>process-test-resources</phase>
              <goals>
                <goal>hbm2ddl</goal>
              </goals>
              <configuration>
                <componentProperties>
                  <persistenceunit>${app.module}</persistenceunit>
                  <drop>false</drop>
                  <create>true</create>
                  <outputfilename>${app.sql}-create.sql</outputfilename>
                </componentProperties>
              </configuration>
            </execution>
            <execution>
              <id>drop schema</id>
              <phase>process-test-resources</phase>
              <goals>
                <goal>hbm2ddl</goal>
              </goals>
              <configuration>
                <componentProperties>
                  <persistenceunit>${app.module}</persistenceunit>
                  <drop>true</drop>
                  <create>false</create>
                  <outputfilename>${app.sql}-drop.sql</outputfilename>
                </componentProperties>
              </configuration>
            </execution>
          </executions>
          <dependencies>
            <dependency>
              <groupId>org.hibernate</groupId>
              <artifactId>hibernate-core</artifactId>
              <version>${hibernate-core.version}</version>
            </dependency>
            <dependency>
              <groupId>org.slf4j</groupId>
              <artifactId>slf4j-api</artifactId>
              <version>${slf4j-api.version}</version>
            </dependency>
            <dependency>
              <groupId>org.slf4j</groupId>
              <artifactId>slf4j-nop</artifactId>
              <version>${slf4j-nop.version}</version>
            </dependency>
            <dependency>
              <groupId>${db.groupId}</groupId>
              <artifactId>${db.artifactId}</artifactId>
              <version>${db.driver.version}</version>
            </dependency>
          </dependencies>
          <configuration>
            <components>
              <component>
                <name>hbm2cfgxml</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2dao</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2ddl</name>
                <implementation>jpaconfiguration</implementation>
                <outputDirectory>src/main/sql</outputDirectory>
              </component>
              <component>
                <name>hbm2doc</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2hbmxml</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2java</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2template</name>
                <implementation>annotationconfiguration</implementation>
              </component>
            </components>
          </configuration>
        </plugin>
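
    One commonly used pattern for this (a hedged sketch; the coordinates and versions below are placeholders): declare one Maven profile per database that sets db.groupId, db.artifactId and db.driver.version, then run the build once per profile instead of editing the POM. This is still one invocation per database rather than a true 'for' loop in a single run, but nothing has to be hand-edited:

        <profiles>
          <profile>
            <id>h2</id>
            <properties>
              <db.groupId>com.h2database</db.groupId>
              <db.artifactId>h2</db.artifactId>
              <db.driver.version>1.2.140</db.driver.version>
            </properties>
          </profile>
          <profile>
            <id>mysql</id>
            <properties>
              <db.groupId>mysql</db.groupId>
              <db.artifactId>mysql-connector-java</db.artifactId>
              <db.driver.version>5.1.13</db.driver.version>
            </properties>
          </profile>
        </profiles>

        <!-- one run per database, e.g.:
             mvn process-test-resources -P h2
             mvn process-test-resources -P mysql -->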

    Read the article

  • Apache Commons EmailValidator and SeamListener Exception (not deploying)

    - by ranirani
    Hi, friends. When using the Apache Commons EmailValidator through Maven, I have the following problem that stops my app from deploying:

        Exception sending context initialized event to listener instance of class org.jboss.seam.servlet.SeamListener
        java.lang.LinkageError: loader constraints violated when linking org/xml/sax/EntityResolver class

    I've used the following in my pom.xml:

        <dependency>
            <groupId>commons-validator</groupId>
            <artifactId>commons-validator</artifactId>
            <version>1.3.1</version>
        </dependency>

    Any help?
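
    One direction worth checking (an assumption, not a confirmed cause): this kind of LinkageError usually means two class loaders each see a copy of the SAX API, for example an xml-apis or xerces jar arriving transitively and clashing with the one the JBoss/Seam server already provides. A sketch of how to look for it and, if it shows up, exclude it; the groupId/artifactId in the exclusion are only an example:

        <!-- first inspect who brings it in:
             mvn dependency:tree -Dincludes=xml-apis -->
        <dependency>
            <groupId>commons-validator</groupId>
            <artifactId>commons-validator</artifactId>
            <version>1.3.1</version>
            <exclusions>
                <exclusion>
                    <groupId>xml-apis</groupId>
                    <artifactId>xml-apis</artifactId>
                </exclusion>
            </exclusions>
        </dependency>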

    Read the article
