Search Results

Search found 600 results on 24 pages for 'commons dbcp'.

Page 11/24 | < Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >

  • Is it possible to optimize maven dependencies automatically?

    - by AlexR
    I am working on a big project that consists of about 40 sub-projects with poorly optimized dependencies. There are declared dependencies that are not in use, as well as used but undeclared dependencies. The second case happens when a dependency is pulled in transitively through another dependency. I want to remove the redundant dependencies and add the required ones. I ran mvn dependency:analyze and got a long list of warnings I now have to fix. I wonder whether there is a Maven plugin or any other utility that can update my pom.xml files automatically. I tried to do it manually, but it takes a lot of time; it seems it will take a couple of days of copy/paste to complete the task. In the worst case I can write such a script myself, but perhaps something ready-made already exists? Here is how mvn dependency:analyze reports dependency warnings: [WARNING] Used undeclared dependencies found: [WARNING] org.apache.httpcomponents:httpcore:jar:4.1:compile [WARNING] Unused declared dependencies found: [WARNING] commons-lang:commons-lang:jar:2.4:compile [WARNING] org.json:json:jar:20090211:compile
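
    For reference, the fix those warnings ask for is only a few lines per pom.xml — a hand-written sketch, using the coordinates printed by dependency:analyze (I am not aware of a plugin that rewrites the pom for you, so treat the automation part as still open):

        <!-- declare the used-but-undeclared dependency explicitly -->
        <dependency>
          <groupId>org.apache.httpcomponents</groupId>
          <artifactId>httpcore</artifactId>
          <version>4.1</version>
        </dependency>

        <!-- and remove the unused declared ones, e.g. -->
        <!--
        <dependency>
          <groupId>commons-lang</groupId>
          <artifactId>commons-lang</artifactId>
          <version>2.4</version>
        </dependency>
        <dependency>
          <groupId>org.json</groupId>
          <artifactId>json</artifactId>
          <version>20090211</version>
        </dependency>
        -->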

    Read the article

  • Clojure warn-on-reflection and type hints

    - by Ralph
    In the following code, I am getting a reflection warning: (ns com.example (:import [org.apache.commons.cli CommandLine Option Options PosixParser])) (def *help-option* "help") (def *host-option* "db-host") (def *options* (doto (Options.) (.addOption "?" *help-option* false "Show this usage information") (.addOption "h" *host-option* true "Name of the database host"))) (let [^CommandLine command-line (.. (PosixParser.) (parse *options* (into-array String args))) db-host (.getOptionValue command-line "h")] ; WARNING HERE ON .getOptionValue ; Do stuff with db-host ) I have a type hint on command-line. Why the warning? I am using Clojure 1.2 on OS X 10.6.6 (Apple VM). I assume that I do not get a warning on (.addOption ...) because the compiler knows that (Options.) is an org.apache.commons.cli.Options.

    Read the article

  • How to deploy custom MBean to Tomcat?

    - by Christian
    Hi, I'm trying to deploy a custom MBean to Tomcat. This MBean is not part of a webapp; it should be instantiated when Tomcat starts. My problem is, I can't find any complete documentation about how to deploy such an MBean. I'm getting different exceptions, depending on my configuration. Does anyone have hints, complete documentation, or an MBean they have implemented themselves and can post as an example? I configured Tomcat to read a configuration from its conf directory: <Engine name="Catalina" defaultHost="localhost" mbeansFile="${catalina.base}/conf/mbeans-descriptors.xml"> The content is as follows: <?xml version="1.0"?> <!-- <!DOCTYPE mbeans-descriptors PUBLIC "-//Apache Software Foundation//DTD Model MBeans Configuration File" "http://jakarta.apache.org/commons/dtds/mbeans-descriptors.dtd"> --> <!-- Descriptions of JMX MBeans --> <mbeans-descriptors> <mbean name="Performance" description="Calculate JVM throughput" type="Performance"> <attribute name="throughput" description="calculated throughput (ratio between gc times and uptime of JVM)" type="double" writeable="false"/> </mbean> </mbeans-descriptors> When the name in the XML file and the class name match, I get this exception: SEVERE: Error creating mbean Performance javax.management.MalformedObjectNameException: Key properties cannot be empty at javax.management.ObjectName.construct(ObjectName.java:467) at javax.management.ObjectName.<init>(ObjectName.java:1403) at org.apache.tomcat.util.modeler.modules.MbeansSource.execute(MbeansSource.java:202) at org.apache.tomcat.util.modeler.modules.MbeansSource.load(MbeansSource.java:137) at org.apache.catalina.core.StandardEngine.readEngineMbeans(StandardEngine.java:517) at org.apache.catalina.core.StandardEngine.init(StandardEngine.java:321) at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:411) at org.apache.catalina.core.StandardService.start(StandardService.java:519) at org.apache.catalina.core.StandardServer.start(StandardServer.java:710) at org.apache.catalina.startup.Catalina.start(Catalina.java:581) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.apache.commons.daemon.support.DaemonLoader.start(DaemonLoader.java:177) When changing the name attribute in the XML file to test.example:type=Performance, I get this exception: SEVERE: Error creating mbean test.example:type=Performance javax.management.NotCompliantMBeanException: MBean class must have public constructor at com.sun.jmx.mbeanserver.Introspector.testCreation(Introspector.java:127) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.createMBean(DefaultMBeanServerInterceptor.java:284) at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.createMBean(DefaultMBeanServerInterceptor.java:199) at com.sun.jmx.mbeanserver.JmxMBeanServer.createMBean(JmxMBeanServer.java:393) at org.apache.tomcat.util.modeler.modules.MbeansSource.execute(MbeansSource.java:207) at org.apache.tomcat.util.modeler.modules.MbeansSource.load(MbeansSource.java:137) at 
org.apache.catalina.core.StandardEngine.readEngineMbeans(StandardEngine.java:517) at org.apache.catalina.core.StandardEngine.init(StandardEngine.java:321) at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:411) at org.apache.catalina.core.StandardService.start(StandardService.java:519) at org.apache.catalina.core.StandardServer.start(StandardServer.java:710) at org.apache.catalina.startup.Catalina.start(Catalina.java:581) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.apache.commons.daemon.support.DaemonLoader.start(DaemonLoader.java:177) The documentation from apache is not really helpful, as it just explains a small part. I'm aware of this question but it doesn't help me. The answer I gave worked just for a short time, after that I got some other exceptions. For additional info, the java interface public interface PerformanceMBean { public double getThroughput(); } and implementing class /* some import statements */ public class Performance implements PerformanceMBean { public double getThroughput() { ... } }
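
    For what it's worth, the two stack traces point at two different problems: the first name ("Performance") has no key properties, and the second attempt fails because the server cannot instantiate the MBean class. A sketch of what the class side would need (names copied from the question, the explicit constructor being the assumed missing piece):

        // PerformanceMBean.java
        public interface PerformanceMBean {
            double getThroughput();
        }

        // Performance.java -- must be a public class with a public no-arg constructor
        public class Performance implements PerformanceMBean {

            // explicit public no-arg constructor, which is what NotCompliantMBeanException complains about
            public Performance() {
            }

            public double getThroughput() {
                return 0.0;   // placeholder; the real GC-time / uptime ratio goes here
            }
        }

        // and in mbeans-descriptors.xml the name needs key properties, e.g.
        // <mbean name="test.example:type=Performance" ... type="Performance">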

    Read the article

  • Python and tool aggregation, by Laurent Pointal

    Hello, here is a new article entitled "Python et l'agrégation d'outils" (Python and tool aggregation). Quote: This article originally appeared in issue 3/2007 of the French-language Linux Developer Journal. The version presented here largely follows the published article, adding hypertext links and references. This document is made available under a Creative Commons Attribution licence. ...

    Read the article

  • Allegheny first-years dive into Fedora

    Opensource.com: "Over the course of fourteen weeks, we've introduced them to the Creative Commons, blogging, and open source software in the context of social change, trying to get them ready for their dive into the Fedora project."

    Read the article

  • Licenses that I can use for my work: web apps, desktop apps, WordPress themes, etc.

    - by jiewmeng
    I originally thought of Creative Commons; then, while reading a book about WordPress (Professional WordPress), I learned that I should also specify that the product is provided ... WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE, and they recommend the GNU GPL. How do I write a license or select one? By the way, what does 'MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE' actually mean? Isn't 'without warranty' enough?

    Read the article

  • Seattle Code Camp 5.0

    - by jerrykross
    I will be attending Seattle Code Camp 5.0 tomorrow at Microsoft Commons. Take a look at http://seattle.codecamp.us/default.aspx for details. It looks like you can still register.

    Read the article

  • Free texture sites – is it safe to use their content?

    - by AllCoder
    There are multiple free texture sites, like texturemate.com or plaintextures.com. I am wondering how safe it is to use their content. I imagine that a texture could be withdrawn from the site, making it difficult to prove it was really there. Or a file could originate from another, restricted source. I am thinking about using Wikimedia Commons instead, as there is the OTRS system there. However, most media requires attribution.

    Read the article

  • Tomcat 6 can't connect to MySQL (The driver has not received any packets from the server)

    - by Tobias Wiesenthal
    Hi all, i'm running an Apache Tomcat 6.0.20 / MySQL 5.1.37-lubuntu / sun-java6-jdk /sun-java6-jre / sun-java6-bin on my local machine using Ubuntu 9.10 as OS. I'm trying to get a simple DB-query example running for 2 days now, but i still get this Exception: org.apache.jasper.JasperException: javax.servlet.ServletException: javax.servlet.jsp.JspException: Unable to get connection, DataSource invalid: "org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Communications link failure The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.)" org.apache.jasper.servlet.JspServletWrapper.handleJspException(JspServletWrapper.java:522) org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:398) org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:342) org.apache.jasper.servlet.JspServlet.service(JspServlet.java:267) javax.servlet.http.HttpServlet.service(HttpServlet.java:717) root cause javax.servlet.ServletException: javax.servlet.jsp.JspException: Unable to get connection, DataSource invalid: "org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Communications link failure The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.)" org.apache.jasper.runtime.PageContextImpl.doHandlePageException(PageContextImpl.java:862) org.apache.jasper.runtime.PageContextImpl.handlePageException(PageContextImpl.java:791) org.apache.jsp.index_jsp._jspService(index_jsp.java:104) org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70) javax.servlet.http.HttpServlet.service(HttpServlet.java:717) org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:374) org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:342) org.apache.jasper.servlet.JspServlet.service(JspServlet.java:267) javax.servlet.http.HttpServlet.service(HttpServlet.java:717) root cause javax.servlet.jsp.JspException: Unable to get connection, DataSource invalid: "org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (Communications link failure The last packet sent successfully to the server was 0 milliseconds ago. 
The driver has not received any packets from the server.)" org.apache.taglibs.standard.tag.common.sql.QueryTagSupport.getConnection(QueryTagSupport.java:285) org.apache.taglibs.standard.tag.common.sql.QueryTagSupport.doStartTag(QueryTagSupport.java:168) org.apache.jsp.index_jsp._jspx_meth_sql_005fquery_005f0(index_jsp.java:274) org.apache.jsp.index_jsp._jspx_meth_c_005fotherwise_005f0(index_jsp.java:216) org.apache.jsp.index_jsp._jspx_meth_c_005fchoose_005f0(index_jsp.java:130) org.apache.jsp.index_jsp._jspService(index_jsp.java:93) org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70) javax.servlet.http.HttpServlet.service(HttpServlet.java:717) org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:374) org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:342) org.apache.jasper.servlet.JspServlet.service(JspServlet.java:267) javax.servlet.http.HttpServlet.service(HttpServlet.java:717) my web.xml looks like this : <?xml version="1.0" encoding="utf-8"?> <web-app xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd" version="2.5"> <resource-ref> <description>DB Connection</description> <res-ref-name>jdbc/testDB</res-ref-name> <res-type>javax.sql.DataSource</res-type> <res-auth>Container</res-auth> </resource-ref> </web-app> the context.xml looks like this : <?xml version="1.0" encoding="UTF-8"?> <Context path="/my1stApp" docBase="/var/www/jsp/my1stApp" debug="5" reloadable="true" crossContext="true"> <Resource name="jdbc/testDB" auth="Container" type="javax.sql.DataSource" maxActive="5" maxIdle="5" maxWait="10000" username="user" password="password" driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost:3306/some"/> </Context> and the jsp file looks like this: <%@ page contentType="text/html" %> <%@ taglib prefix="c" uri="http://java.sun.com/jsp/jstl/core" %> <%@ taglib prefix="fn" uri="http://java.sun.com/jsp/jstl/functions" %> <%@ taglib prefix="fmt" uri="http://java.sun.com/jsp/jstl/fmt" %> <%@ taglib prefix="sql" uri="http://java.sun.com/jsp/jstl/sql" %> <html> <head> <title>DroneLootTool</title> </head> <body bgcolor="white"> <sql:query var="res" dataSource="jdbc/testDB"> select name, othername from mytable </sql:query> <h2>Results</h2> <c:forEach var="row" items="${res.rows}"> Name ${row.name}<br/> MoreName ${row.othername}<br/><br/> </c:forEach> </body> </html> read lots of forum entries / tried lots of different settings (always changed back to original settings when it didnt' work) set TOMCAT6_SECURITY=no in /etc/default/tomcat6 because TOMCAT6_SECURITY=yes was causing trouble too the skip-networking flag is not set for the DB (BIND 127.0.0.1 is set) firewall is swiched off (sudo ufw disable) MySQL works (tested several times with user used in this skript) telnet localhost 3306 says Trying ::1... Trying 127.0.0.1... Connected to localhost. Escape character is '^]'. Connection closed by foreign host. The TestConnection.java produced the following output: me@my-laptop:~/Desktop$ java -classpath '/usr/share/java/mysql.jar:./' TestConnection com.mysql.jdbc.Driver jdbc:mysql://localhost:3306/testDB myuser mypassword com.mysql.jdbc.CommunicationsException: Communications link failure Last packet sent to the server was 0 ms ago. 
at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1070) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2103) at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:718) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:298) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:282) at java.sql.DriverManager.getConnection(DriverManager.java:582) at java.sql.DriverManager.getConnection(DriverManager.java:185) at TestConnection.checkConnection(TestConnection.java:40) at TestConnection.main(TestConnection.java:21) Caused by: com.mysql.jdbc.CommunicationsException: Communications link failure Last packet sent to the server was 0 ms ago. at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1070) at com.mysql.jdbc.MysqlIO.readPacket(MysqlIO.java:666) at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1069) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2031) ... 7 more Caused by: java.io.EOFException: Can not read response from server. Expected to read 4 bytes, read 0 bytes before connection was unexpectedly lost. at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:2431) at com.mysql.jdbc.MysqlIO.readPacket(MysqlIO.java:590) ... 9 more Connection failed. i don't know if there is a difference between the way the java driver connects to the DB and the Perl DBI module does, but this PERL skript works #!/usr/bin/perl -w use CGI; use DBI; use strict; print CGI::header(); my $dbh = DBI->connect("dbi:mysql:some:localhost", "user", "password"); my $sSql = "SELECT * from mytable"; my $ppl = $dbh->selectall_arrayref( $sSql ); foreach my $pl (@$ppl) { my @array = @$pl; print @array; } $dbh->disconnect; enabled --log-warnings on the mysql, but i didn't get any new warnings. When i was searching the logs for warnings i found this messages when i restart the tomcat, don't know if it helps to find the problem : Feb 2 19:50:37 tobias-laptop jsvc.exec[3129]: 02.02.2010 19:50:37 org.apache.catalina.startup.HostConfig checkResources#012INFO: Undeploying context [/myapp] Feb 2 19:50:37 tobias-laptop jsvc.exec[3129]: 02.02.2010 19:50:37 org.apache.catalina.loader.WebappClassLoader clearReferencesThreads#012SCHWERWIEGEND: A web application appears to have started a thread named [MySQL Statement Cancellation Timer] but has failed to stop it. This is very likely to create a memory leak. Feb 2 19:50:37 tobias-laptop jsvc.exec[3129]: 02.02.2010 19:50:37 org.apache.catalina.startup.HostConfig deployDescriptor#012INFO: Deploying configuration descriptor myapp.xml
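
    One way to narrow this down — a diagnostic sketch, not a fix — is to bypass both DBCP and Connector/J and read MySQL's greeting straight off the socket. If nothing arrives before the connection closes, the problem is on the server/network side rather than in Tomcat's pool configuration:

        import java.io.InputStream;
        import java.net.InetSocketAddress;
        import java.net.Socket;

        public class MySqlHandshakeCheck {
            public static void main(String[] args) throws Exception {
                Socket socket = new Socket();
                // same host/port the DataSource URL points at
                socket.connect(new InetSocketAddress("127.0.0.1", 3306), 5000);
                socket.setSoTimeout(5000);
                InputStream in = socket.getInputStream();
                byte[] buf = new byte[256];
                int read = in.read(buf);   // MySQL pushes its greeting packet without being asked
                System.out.println("bytes received: " + read);
                if (read > 5) {
                    // bytes 0-3 are the packet header, byte 4 the protocol version,
                    // then a null-terminated server version string
                    int end = 5;
                    while (end < read && buf[end] != 0) {
                        end++;
                    }
                    System.out.println("server version: " + new String(buf, 5, end - 5, "US-ASCII"));
                }
                socket.close();
            }
        }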

    Read the article

  • Oracle thin driver vs. OCI driver. Pros and Cons?

    - by Zwei Steinen
    Hi, when you develop a Java application that talks to Oracle databases, there are two options, right? One is the Oracle thin driver, and the other is the OCI driver, which requires its own installation (please correct me if I'm misunderstanding). Now, what are the pros and cons? Obviously the thin driver sounds much better in terms of installation, but is there anything the OCI driver can do that the thin one can't? The development environment is Tomcat 6 + Spring 3.0 + JPA (Hibernate) + Apache DBCP. Thanks in advance.
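
    For comparison, the difference visible from application code is mostly the JDBC URL; a sketch using DBCP with hypothetical host/SID values (the OCI variant additionally assumes an Oracle client installation on the machine):

        import org.apache.commons.dbcp.BasicDataSource;

        public class OracleDataSources {
            // Thin driver: pure Java, no client installation; host, port and SID are placeholders
            static BasicDataSource thin() {
                BasicDataSource ds = new BasicDataSource();
                ds.setDriverClassName("oracle.jdbc.OracleDriver");
                ds.setUrl("jdbc:oracle:thin:@dbhost:1521:ORCL");
                ds.setUsername("scott");
                ds.setPassword("tiger");
                return ds;
            }

            // OCI driver: same driver class, different URL prefix; needs the Oracle client libraries installed
            static BasicDataSource oci() {
                BasicDataSource ds = new BasicDataSource();
                ds.setDriverClassName("oracle.jdbc.OracleDriver");
                ds.setUrl("jdbc:oracle:oci:@ORCL");   // TNS alias resolved by the local Oracle client
                ds.setUsername("scott");
                ds.setPassword("tiger");
                return ds;
            }
        }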

    Read the article

  • What is the best approach for connection pooling?

    - by Bhushan
    I am implementing connection pooling in a project. Performance-wise, which is the better approach? Hibernate (using C3P0 or DBCP) Configuring a JDBC data source in the application server. Application server portability is not an important factor for me. Please suggest an approach.
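
    For illustration, the two options compared above look roughly like this; the property keys are the documented Hibernate/C3P0 ones, while the concrete values and JNDI name are placeholders:

        import java.util.Properties;

        public class PoolingOptions {

            // Option 1: let Hibernate manage the pool (C3P0) via hibernate.c3p0.* properties
            static Properties hibernateManagedPool() {
                Properties p = new Properties();
                p.setProperty("hibernate.connection.driver_class", "com.mysql.jdbc.Driver");
                p.setProperty("hibernate.connection.url", "jdbc:mysql://localhost:3306/mydb"); // placeholder URL
                p.setProperty("hibernate.c3p0.min_size", "5");
                p.setProperty("hibernate.c3p0.max_size", "20");
                p.setProperty("hibernate.c3p0.timeout", "300");
                p.setProperty("hibernate.c3p0.max_statements", "50");
                return p;
            }

            // Option 2: point Hibernate at a pool the application server manages and exposes through JNDI
            static Properties containerManagedPool() {
                Properties p = new Properties();
                p.setProperty("hibernate.connection.datasource", "java:comp/env/jdbc/myDS"); // JNDI name is a placeholder
                return p;
            }
        }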

    Read the article

  • Connection hangs after a period of inactivity

    - by Sinuhe
    In my application, Spring manages the connection pool for database access. Hibernate uses these connections for its queries. At first glance, I have no problems with the pool: it works correctly with concurrent clients and a pool with only one connection. I can execute a lot of queries, so I think that I (or Spring) am not leaving connections open. My problem appears after some period of inactivity (sometimes 30 minutes, sometimes more than 2 hours). Then, when Hibernate does some search, it takes far too long. Setting the log4j level to TRACE, I get these logs: ... 18:27:01 DEBUG nsactionSynchronizationManager - Retrieved value [org.springframework.orm.hibernate3.SessionHolder@99abd7] for key [org.hibernate.impl.SessionFactoryImpl@7d2897] bound to thread [http-8080-Processor24] 18:27:01 DEBUG HibernateTransactionManager - Found thread-bound Session [org.hibernate.impl.SessionImpl@8878cd] for Hibernate transaction 18:27:01 DEBUG HibernateTransactionManager - Using transaction object [org.springframework.orm.hibernate3.HibernateTransactionManager$HibernateTransactionObject@1b2ffee] 18:27:01 DEBUG HibernateTransactionManager - Creating new transaction with name [com.acjoventut.service.GenericManager.findByExample]: PROPAGATION_REQUIRED,ISOLATION_DEFAULT 18:27:01 DEBUG HibernateTransactionManager - Preparing JDBC Connection of Hibernate Session [org.hibernate.impl.SessionImpl@8878cd] 18:27:01 TRACE SessionImpl - setting flush mode to: AUTO 18:27:01 DEBUG JDBCTransaction - begin 18:27:01 DEBUG ConnectionManager - opening JDBC connection Here it gets frozen for about 2 - 10 minutes, but then it continues: 18:30:11 DEBUG JDBCTransaction - current autocommit status: true 18:30:11 DEBUG JDBCTransaction - disabling autocommit 18:30:11 TRACE JDBCContext - after transaction begin 18:30:11 DEBUG HibernateTransactionManager - Exposing Hibernate transaction as JDBC transaction [jdbc:oracle:thin:@212.31.39.50:30998:orcl, UserName=DEVELOP, Oracle JDBC driver] 18:30:11 DEBUG nsactionSynchronizationManager - Bound value [org.springframework.jdbc.datasource.ConnectionHolder@843a9d] for key [org.apache.commons.dbcp.BasicDataSource@7745fd] to thread [http-8080-Processor24] 18:30:11 DEBUG nsactionSynchronizationManager - Initializing transaction synchronization ... After that, it works with no problems until another period of inactivity. IMHO, it seems like the connection pool returns an invalid/closed connection, and when Hibernate realizes that, it asks the pool for another connection. I don't know how to solve this problem or what I can do to narrow it down. Any help will be appreciated. Thanks. EDIT: Well, it finally turned out to be due to a firewall rule. The database detects that the connection is lost, but the pool (DBCP or C3P0) does not, so it tries to query the database with no success. What is still strange to me is that the timeout period is very variable. Maybe the rule is especially strange, or the firewall doesn't work correctly. Anyway, I have no access to that machine and I can only wait for an explanation. :(
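
    Given the edit (a firewall silently dropping idle connections), the usual mitigation on the DBCP side is to validate connections before handing them out and to evict long-idle ones; a sketch of the relevant BasicDataSource settings, with placeholder thresholds and an Oracle-style validation query:

        import org.apache.commons.dbcp.BasicDataSource;

        public class PoolValidationConfig {
            static void configure(BasicDataSource ds) {
                // check each connection with a cheap query before giving it to Hibernate
                ds.setValidationQuery("SELECT 1 FROM DUAL");   // Oracle; use "SELECT 1" on MySQL etc.
                ds.setTestOnBorrow(true);
                // additionally, evict connections that have sat idle longer than the firewall's cutoff
                ds.setTestWhileIdle(true);
                ds.setTimeBetweenEvictionRunsMillis(5 * 60 * 1000L);   // run the evictor every 5 minutes (placeholder)
                ds.setMinEvictableIdleTimeMillis(10 * 60 * 1000L);     // drop connections idle for 10+ minutes (placeholder)
            }
        }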

    Read the article

  • Boost your infrastructure with Coherence into the Cloud

    - by Nino Guarnacci
    Authors: Nino Guarnacci & Francesco Scarano,  at this URL could be found the original article:  http://blogs.oracle.com/slc/coherence_into_the_cloud_boost. Thinking about the enterprise cloud, come to mind many possible configurations and new opportunities in enterprise environments. Various customers needs that serve as guides to this new trend are often very different, but almost always united by two main objectives: Elasticity of infrastructure both Hardware and Software Investments related to the progressive needs of the current infrastructure Characteristics of innovation and economy. A concrete use case that I worked on recently demanded the fulfillment of two basic requirements of economy and innovation.The client had the need to manage a variety of data cache, which can process complex queries and parallel computational operations, maintaining the caches in a consistent state on different server instances, on which the application was installed.In addition, the customer was looking for a solution that would allow him to manage the likely situations in load peak during certain times of the year.For this reason, the customer requires a replication site, on which convey part of the requests during periods of peak; the desire was, however, to prevent the immobilization of investments in owned hardware-software architectures; so, to respond to this need, it was requested to seek a solution based on Cloud technologies and architectures already offered by the market. Coherence can already now address the requirements of large cache between different nodes in the cluster, providing further technology to search and parallel computing, with the simultaneous use of all hardware infrastructure resources. Moreover, thanks to the functionality of "Push Replication", which can replicate and update the information contained in the cache, even to a site hosted in the cloud, it is satisfied the need to make resilient infrastructure that can be based also on nodes temporarily housed in the Cloud architectures. There are different types of configurations that can be realized using the functionality "Push-Replication" of Coherence. Configurations can be either: Active - Passive  Hub and Spoke Active - Active Multi Master Centralized Replication Whereas the architecture of this particular project consists of two sites (Site 1 and Site Cloud), between which only Site 1 is enabled to write into the cache, it was decided to adopt an Active-Passive Configuration type (Hub and Spoke). If, however, the requirement should change over time, it will be particularly easy to change this configuration in an Active-Active configuration type. Although very simple, the small sample in this post, inspired by the specific project is effective, to better understand the features and capabilities of Coherence and its configurations. Let's create two distinct coherence cluster, located at miles apart, on two different domain contexts, one of them "hosted" at home (on-premise) and the other one hosted by any cloud provider on the network (or just the same laptop to test it :)). These two clusters, which we call Site 1 and Site Cloud, will contain the necessary information, so a simple client can insert data only into the Site 1. On both sites will be subscribed a listener, who listens to the variations of specific objects within the various caches. 
To implement these features, you need 4 simple classes: CachedResponse.java Represents the POJO class that will be inserted into the cache, and fulfills the task of containing useful information about the hypothetical links navigation ResponseSimulatorHelper.java Represents a link simulator, which has the task of randomly creating objects of type CachedResponse that will be added into the caches CacheCommands.java Represents the model of our example, because it is responsible for receiving instructions from the controller and performing basic operations against the cache, such as insert, delete, update, listening, objects within the cache Shell.java It is our controller, which give commands to be executed within the cache of the two Sites So, summarily, we execute the java class "Shell", asking it to put into the cache 100 objects of type "CachedResponse" through the java class "CacheCommands", then the simulator "ResponseSimulatorHelper" will randomly create new instances of objects "CachedResponse ". Finally, the Shell class will listen to for events occurring within the cache on the Site Cloud, while insertions and deletions are performed on Site 1. Now, we realize the two configurations of two respective sites / cluster: Site 1 and Site Cloud.For the Site 1 we define a cache of type "distributed" with features of "read and write", using the cache class store for the "push replication", a functionality offered by the project "incubator" of Oracle Coherence.For the "Site Cloud" we expect even the definition of “distributed” cache type with tcp proxy feature enabled, so it can receive updates from Site 1.  Coherence Cache Config XML file for "storage node" on "Site 1" site1-prod-cache-config.xml Coherence Cache Config XML file for "storage node" on "Site Cloud" site2-prod-cache-config.xml For two clients "Shell" which will connect respectively to the two clusters we have provided two easy access configurations.  Coherence Cache Config XML file for Shell on "Site 1" site1-shell-prod-cache-config.xml Coherence Cache Config XML file for Shell on "Site Cloud" site2-shell-prod-cache-config.xml Now, we just have to get everything and run our tests. 
To start at least one "storage" node (which holds the data) for the "Cloud Site", we can run the standard class  provided OOTB by Oracle Coherence com.tangosol.net.DefaultCacheServer with the following parameters and values:-Xmx128m-Xms64m-Dcom.sun.management.jmxremote -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dtangosol.coherence.distributed.localstorage=true -Dtangosol.coherence.cacheconfig=config/site2-prod-cache-config.xml-Dtangosol.coherence.clusterport=9002-Dtangosol.coherence.site=SiteCloud To start at least one "storage" node (which holds the data) for the "Site 1", we can perform again the standard class provided by Coherence  com.tangosol.net.DefaultCacheServer with the following parameters and values:-Xmx128m-Xms64m-Dcom.sun.management.jmxremote -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dtangosol.coherence.distributed.localstorage=true -Dtangosol.coherence.cacheconfig=config/site1-prod-cache-config.xml-Dtangosol.coherence.clusterport=9001-Dtangosol.coherence.site=Site1 Then, we start the first client "Shell" for the "Cloud Site", launching the java class it.javac.Shell  using these parameters and values: -Xmx64m-Xms64m-Dcom.sun.management.jmxremote -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dtangosol.coherence.distributed.localstorage=false -Dtangosol.coherence.cacheconfig=config/site2-shell-prod-cache-config.xml-Dtangosol.coherence.clusterport=9002-Dtangosol.coherence.site=SiteCloud Finally, we start the second client "Shell" for the "Site 1", re-launching a new instance of class  it.javac.Shell  using  the following parameters and values: -Xmx64m-Xms64m-Dcom.sun.management.jmxremote -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dtangosol.coherence.distributed.localstorage=false -Dtangosol.coherence.cacheconfig=config/site1-shell-prod-cache-config.xml-Dtangosol.coherence.clusterport=9001-Dtangosol.coherence.site=Site1  And now, let’s execute some tests to validate and better understand our configuration. TEST 1The purpose of this test is to load the objects into the "Site 1" cache and seeing how many objects are cached on the "Site Cloud". Within the "Shell" launched with parameters to access the "Site 1", let’s write and run the command: load test/100 Within the "Shell" launched with parameters to access the "Site Cloud" let’s write and run the command: size passive-cache Expected result If all is OK, the first "Shell" has uploaded 100 objects into a cache named "test"; consequently the "push-replication" functionality has updated the "Site Cloud" by sending the 100 objects to the second cluster where they will have been posted into a respective cache, which we named "passive-cache". TEST 2The purpose of this test is to listen to deleting and adding events happening on the "Site 1" and that are replicated within the cache on "Cloud Site". 
In the "Shell" launched with parameters to access the "Site Cloud" let’s write and run the command: listen passive-cache/name like '%' or a "cohql" query, with your preferred parameters In the "Shell" launched with parameters to access the "Site 1" let’s write and run the following commands: load test/10 load test2/20 delete test/50 Expected result If all is OK, the "Shell" to Site Cloud let us to listen to all the add and delete events within the cache "cache-passive", whose objects satisfy the query condition "name like '%' " (ie, every objects in the cache; you could change the tests and create different queries).Through the Shell to "Site 1" we launched the commands to add and to delete objects on different caches (test and test2). With the "Shell" running on "Site Cloud" we got the evidence (displayed or printed, or in a log file) that its cache has been filled with events and related objects generated by commands executed from the" Shell "on" Site 1 ", thanks to "push-replication" feature.  Other tests can be performed, such as, for example, the subscription to the events on the "Site 1" too, using different "cohql" queries, changing the cache configuration,  to effectively demonstrate both the potentiality and  the versatility produced by these different configurations, even in the cloud, as in our case. More information on how to configure Coherence "Push Replication" can be found in the Oracle Coherence Incubator project documentation at the following link: http://coherence.oracle.com/display/INC10/Home More information on Oracle Coherence "In Memory Data Grid" can be found at the following link: http://www.oracle.com/technetwork/middleware/coherence/overview/index.html To download and execute the whole sources and configurations of the example explained in the above post,  click here to download them; After download the last available version of the Push-Replication Pattern library implementation from the Oracle Coherence Incubator site, and download also the related and required version of Oracle Coherence. For simplicity the required .jarS to execute the example (that can be found into the Push-Replication-Pattern  download and Coherence Distribution download) are: activemq-core-5.3.1.jar activemq-protobuf-1.0.jar aopalliance-1.0.jar coherence-commandpattern-2.8.4.32329.jar coherence-common-2.2.0.32329.jar coherence-eventdistributionpattern-1.2.0.32329.jar coherence-functorpattern-1.5.4.32329.jar coherence-messagingpattern-2.8.4.32329.jar coherence-processingpattern-1.4.4.32329.jar coherence-pushreplicationpattern-4.0.4.32329.jar coherence-rest.jar coherence.jar commons-logging-1.1.jar commons-logging-api-1.1.jar commons-net-2.0.jar geronimo-j2ee-management_1.0_spec-1.0.jar geronimo-jms_1.1_spec-1.1.1.jar http.jar jackson-all-1.8.1.jar je.jar jersey-core-1.8.jar jersey-json-1.8.jar jersey-server-1.8.jar jl1.0.jar kahadb-5.3.1.jar miglayout-3.6.3.jar org.osgi.core-4.1.0.jar spring-beans-2.5.6.jar spring-context-2.5.6.jar spring-core-2.5.6.jar spring-osgi-core-1.2.1.jar spring-osgi-io-1.2.1.jar At this URL could be found the original article: http://blogs.oracle.com/slc/coherence_into_the_cloud_boost Authors: Nino Guarnacci & Francesco Scarano

    Read the article

  • PHP/GD - Cropping and Resizing Images

    - by Alix Axel
    I've coded a function that crops an image to a given aspect ratio and finally then resizes it and outputs it as JPG: <?php function Image($image, $crop = null, $size = null) { $image = ImageCreateFromString(file_get_contents($image)); if (is_resource($image) === true) { $x = 0; $y = 0; $width = imagesx($image); $height = imagesy($image); /* CROP (Aspect Ratio) Section */ if (is_null($crop) === true) { $crop = array($width, $height); } else { $crop = array_filter(explode(':', $crop)); if (empty($crop) === true) { $crop = array($width, $height); } else { if ((empty($crop[0]) === true) || (is_numeric($crop[0]) === false)) { $crop[0] = $crop[1]; } else if ((empty($crop[1]) === true) || (is_numeric($crop[1]) === false)) { $crop[1] = $crop[0]; } } $ratio = array ( 0 => $width / $height, 1 => $crop[0] / $crop[1], ); if ($ratio[0] > $ratio[1]) { $width = $height * $ratio[1]; $x = (imagesx($image) - $width) / 2; } else if ($ratio[0] < $ratio[1]) { $height = $width / $ratio[1]; $y = (imagesy($image) - $height) / 2; } /* How can I skip (join) this operation with the one in the Resize Section? */ $result = ImageCreateTrueColor($width, $height); if (is_resource($result) === true) { ImageSaveAlpha($result, true); ImageAlphaBlending($result, false); ImageFill($result, 0, 0, ImageColorAllocateAlpha($result, 255, 255, 255, 127)); ImageCopyResampled($result, $image, 0, 0, $x, $y, $width, $height, $width, $height); $image = $result; } } /* Resize Section */ if (is_null($size) === true) { $size = array(imagesx($image), imagesy($image)); } else { $size = array_filter(explode('x', $size)); if (empty($size) === true) { $size = array(imagesx($image), imagesy($image)); } else { if ((empty($size[0]) === true) || (is_numeric($size[0]) === false)) { $size[0] = round($size[1] * imagesx($image) / imagesy($image)); } else if ((empty($size[1]) === true) || (is_numeric($size[1]) === false)) { $size[1] = round($size[0] * imagesy($image) / imagesx($image)); } } } $result = ImageCreateTrueColor($size[0], $size[1]); if (is_resource($result) === true) { ImageSaveAlpha($result, true); ImageAlphaBlending($result, true); ImageFill($result, 0, 0, ImageColorAllocate($result, 255, 255, 255)); ImageCopyResampled($result, $image, 0, 0, 0, 0, $size[0], $size[1], imagesx($image), imagesy($image)); header('Content-Type: image/jpeg'); ImageInterlace($result, true); ImageJPEG($result, null, 90); } } return false; } ?> The function works as expected but I'm creating a non-required GD image resource, how can I fix it? I've tried joining both calls but I must be doing some miscalculations. <?php /* Usage Examples */ Image('http://upload.wikimedia.org/wikipedia/commons/4/47/PNG_transparency_demonstration_1.png', '1:1', '600x'); Image('http://upload.wikimedia.org/wikipedia/commons/4/47/PNG_transparency_demonstration_1.png', '2:1', '600x'); Image('http://upload.wikimedia.org/wikipedia/commons/4/47/PNG_transparency_demonstration_1.png', '2:', '250x300'); ?> Any help is greatly appreciated, thanks.

    Read the article

  • Overwrite archetypes in Maven

    - by Random
    Hello again! I'm having some trouble using Maven for my archetypes and I need to overwrite some of them. I run a command that does an archetype:generate in a directory where the archetype output already exists. Is there a parameter that lets me overwrite existing archetypes? I have searched the Maven definitive guide, but it states that the only parameters accepted are: -DgroupId -DartifactId -Dversion -DpackageName -DarchetypeGroupId -DarchetypeArtifactId -DarchetypeVersion -DinteractiveMode I could just search the directory and delete the files, but this process is going to be done automatically (so no human involved, no brains involved) and I wouldn't like the machine deleting things around. Thanks for all! Edit: I almost forgot, here is some Maven trace: [INFO] Scanning for projects... [INFO] Searching repository for plugin with prefix: 'archetype'. [INFO] ------------------------------------------------------------------------ [INFO] Building Maven Default Project [INFO] task-segment: [archetype:generate] (aggregator-style) [INFO] ------------------------------------------------------------------------ [INFO] Preparing archetype:generate [INFO] No goals needed for project - skipping [INFO] Setting property: classpath.resource.loader.class => 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'. [INFO] Setting property: velocimacro.messages.on => 'false'. [INFO] Setting property: resource.loader => 'classpath'. [INFO] Setting property: resource.manager.logwhenfound => 'false'. [INFO] [archetype:generate {execution: default-cli}] [INFO] Generating project in Batch mode [INFO] Archetype defined by properties [INFO] ---------------------------------------------------------------------------- [INFO] Using following parameters for creating OldArchetype: archetype-foo-lib:1.0 [INFO] ---------------------------------------------------------------------------- [INFO] Parameter: groupId, Value: foo.tecnologia [INFO] Parameter: packageName, Value: foo.tecnologia [INFO] Parameter: basedir, Value: C:\temp\Desarrollo [INFO] Parameter: package, Value: foo.tecnologia [INFO] Parameter: version, Value: 1.0 [INFO] Parameter: artifactId, Value: Foo-Lib-Test [ERROR] Directory Foo-Lib-Test already exists - please run from a clean directory org.apache.maven.archetype.old.ArchetypeTemplateProcessingException: Directory Foo-Lib-Test already exists - please run from a clean directory at org.apache.maven.archetype.old.DefaultOldArchetype.createArchetype(DefaultOldArchetype.java:242) at org.apache.maven.archetype.generator.DefaultArchetypeGenerator.processOldArchetype(DefaultArchetypeGenerator.java:253) at org.apache.maven.archetype.generator.DefaultArchetypeGenerator.generateArchetype(DefaultArchetypeGenerator.java:143) at org.apache.maven.archetype.generator.DefaultArchetypeGenerator.generateArchetype(DefaultArchetypeGenerator.java:286) at org.apache.maven.archetype.DefaultArchetype.generateProjectFromArchetype(DefaultArchetype.java:69) at org.apache.maven.archetype.mojos.CreateProjectFromArchetypeMojo.execute(CreateProjectFromArchetypeMojo.java:184) at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:569) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:539) at 
org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:284) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180) at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138) at com.foo.model.CSMavenCli.main(CSMavenCli.java:391) at com.foo.model.MavenAdmin.generateArchetype(MavenAdmin.java:399) at com.foo.model.ValidarPom.validarPom(ValidarPom.java:167) at com.foo.prueba.GenerarPOM.execute(GenerarPOM.java:93) at org.apache.struts.chain.commands.servlet.ExecuteAction.execute(ExecuteAction.java:58) at org.apache.struts.chain.commands.AbstractExecuteAction.execute(AbstractExecuteAction.java:67) at org.apache.struts.chain.commands.ActionCommandBase.execute(ActionCommandBase.java:51) at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:191) at org.apache.commons.chain.generic.LookupCommand.execute(LookupCommand.java:305) at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:191) at org.apache.struts.chain.ComposableRequestProcessor.process(ComposableRequestProcessor.java:283) at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1913) at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:462) at javax.servlet.http.HttpServlet.service(HttpServlet.java:647) at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:269) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:174) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:873) at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665) at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528) at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81) at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689) at java.lang.Thread.run(Unknown Source) [INFO] ------------------------------------------------------------------------ [ERROR] BUILD FAILURE [INFO] ------------------------------------------------------------------------ [INFO] : org.apache.maven.archetype.old.ArchetypeTemplateProcessingException: Directory Foo-Lib-Test already exists - please run from a clean directory Directory Foo-Lib-Test already exists - please run from a clean directory [INFO] ------------------------------------------------------------------------ [INFO] For more information, run Maven with the -e switch [INFO] ------------------------------------------------------------------------ [INFO] Total time: 1 second [INFO] Finished at: Fri Apr 09 10:01:33 CEST 2010 [INFO] Final Memory: 15M/28M [INFO] 
------------------------------------------------------------------------
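
    Since the generation is already driven from code (CSMavenCli appears in the trace), one pragmatic workaround — assuming deleting only the generated project directory turns out to be acceptable, and no overwrite parameter exists — is to clear that directory before calling archetype:generate; a sketch using Commons IO, with the directory names taken from the trace as placeholders:

        import java.io.File;
        import org.apache.commons.io.FileUtils;

        public class ArchetypeWorkspaceCleaner {
            // Delete only the previously generated project directory so archetype:generate starts clean.
            // baseDir/artifactId mirror the values in the trace and are placeholders here.
            static void cleanBefore(String baseDir, String artifactId) throws Exception {
                File projectDir = new File(baseDir, artifactId);   // e.g. C:\temp\Desarrollo\Foo-Lib-Test
                if (projectDir.isDirectory()) {
                    FileUtils.deleteDirectory(projectDir);         // org.apache.commons.io
                }
            }
        }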

    Read the article

  • com.sun.org.apache.xerces.internal.dom.DocumentImpl cannot be cast to org.codehaus.plexus.util.xml.X

    - by Random
    After using the Model library (see this) for my Maven pom.xml over the past few weeks, I ran into this error while attempting to write a ConfigurationContainer into the pom.xml. The javadoc for Model says: public void setConfiguration(Object configuration) Set the configuration as DOM object. Parameters: configuration - So I made some XML with Document (org.w3c.dom.Document) and the standard javax XML parser library. You know the code (it's all over the net) but I paste it here... DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance(); DocumentBuilder parser = factory.newDocumentBuilder(); Document doc =parser.newDocument(); And then you fill the code with Element declarations, a lot of appendChild calls and such, and try to make it work. It doesn't seem to go well. First I tried using plugin.setConfiguration(doc); but it didn't work. Then I tried Object obj = doc; plugin.setConfiguration(obj); but it didn't work. Then I searched the net for answers, and guess... it didn't work. There are like 0 people using this library (maven.model) on the net. So here I am again, asking the senseis for help. Where is my error? I have downloaded the source code of the library trying to see where the error is, but it seems to be OK (as it should). I add the stack trace for reference: java.lang.ClassCastException: com.sun.org.apache.xerces.internal.dom.DocumentImpl cannot be cast to org.codehaus.plexus.util.xml.Xpp3Dom at org.apache.maven.model.io.xpp3.MavenXpp3Writer.writePlugin(MavenXpp3Writer.java:1472) at org.apache.maven.model.io.xpp3.MavenXpp3Writer.writeBuild(MavenXpp3Writer.java:326) at org.apache.maven.model.io.xpp3.MavenXpp3Writer.writeModel(MavenXpp3Writer.java:1093) at org.apache.maven.model.io.xpp3.MavenXpp3Writer.write(MavenXpp3Writer.java:102) at com.mapfre.mutua.PDA.model.GeneratePOM.createPOM(GeneratePOM.java:28) at com.mapfre.mutua.PDA.prueba.MavenPOM.generatePOMWAR(MavenPOM.java:681) at com.mapfre.mutua.PDA.prueba.GenerarPOMMapfreAction.execute(GenerarPOMMapfreAction.java:36) at org.apache.struts.chain.commands.servlet.ExecuteAction.execute(ExecuteAction.java:58) at org.apache.struts.chain.commands.AbstractExecuteAction.execute(AbstractExecuteAction.java:67) at org.apache.struts.chain.commands.ActionCommandBase.execute(ActionCommandBase.java:51) at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:191) at org.apache.commons.chain.generic.LookupCommand.execute(LookupCommand.java:305) at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:191) at org.apache.struts.chain.ComposableRequestProcessor.process(ComposableRequestProcessor.java:283) at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1913) at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:462) at javax.servlet.http.HttpServlet.service(HttpServlet.java:647) at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:269) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108) at 
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:174) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:873) at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665) at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528) at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81) at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689) at java.lang.Thread.run(Unknown Source)
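
    The cast inside MavenXpp3Writer.writePlugin suggests the writer expects the configuration to be a Plexus Xpp3Dom rather than a w3c DOM Document. A sketch of building the same configuration as an Xpp3Dom instead (the element names are hypothetical):

        import java.io.StringReader;
        import org.apache.maven.model.Plugin;
        import org.codehaus.plexus.util.xml.Xpp3Dom;
        import org.codehaus.plexus.util.xml.Xpp3DomBuilder;

        public class PluginConfigurationExample {
            static void configure(Plugin plugin) throws Exception {
                // Option A: build the tree directly
                Xpp3Dom configuration = new Xpp3Dom("configuration");
                Xpp3Dom source = new Xpp3Dom("source");   // hypothetical parameter
                source.setValue("1.5");
                configuration.addChild(source);
                plugin.setConfiguration(configuration);

                // Option B: parse it from an XML snippet
                String xml = "<configuration><source>1.5</source></configuration>";
                plugin.setConfiguration(Xpp3DomBuilder.build(new StringReader(xml)));
            }
        }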

    Read the article

  • wsdl2java exception

    - by Daniel
    java org.apache.axis2.wsdl.WSDL2Java -s -p studs.exchange -uri https://api.betfair.com/exchange/v5/BFExchangeService.wsdl Retrieving document at 'https://api.betfair.com/exchange/v5/BFExchangeService.wsdl'. log4j:WARN No appenders could be found for logger (org.apache.axis2.description.WSDL11ToAllAxisServicesBuilder). log4j:WARN Please initialize the log4j system properly. Exception in thread "main" org.apache.axis2.wsdl.codegen.CodeGenerationException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.generate(CodeGenerationEngine.java:271) at org.apache.axis2.wsdl.WSDL2Code.main(WSDL2Code.java:35) at org.apache.axis2.wsdl.WSDL2Java.main(WSDL2Java.java:24) Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException at org.apache.axis2.wsdl.codegen.extension.SimpleDBExtension.engage(SimpleDBExtension.java:53) at org.apache.axis2.wsdl.codegen.CodeGenerationEngine.generate(CodeGenerationEngine.java:224) ... 2 more Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.apache.axis2.wsdl.codegen.extension.SimpleDBExtension.engage(SimpleDBExtension.java:50) ... 3 more Caused by: java.lang.NoSuchMethodError: org.apache.ws.commons.schema.XmlSchema.getTypeByName(Ljava/lang/String;)Lorg/apache/ws/commons/schema/XmlSchemaType; at org.apache.axis2.schema.SchemaCompiler.isComponetExists(SchemaCompiler.java:2728) at org.apache.axis2.schema.SchemaCompiler.getParentSchemaFromIncludes(SchemaCompiler.java:2670) at org.apache.axis2.schema.SchemaCompiler.getParentSchemaFromIncludes(SchemaCompiler.java:2704) at org.apache.axis2.schema.SchemaCompiler.getParentSchema(SchemaCompiler.java:2644) at org.apache.axis2.schema.SchemaCompiler.processElement(SchemaCompiler.java:758) at org.apache.axis2.schema.SchemaCompiler.processElement(SchemaCompiler.java:552) at org.apache.axis2.schema.SchemaCompiler.process(SchemaCompiler.java:1991) at org.apache.axis2.schema.SchemaCompiler.processParticle(SchemaCompiler.java:1874) at org.apache.axis2.schema.SchemaCompiler.processComplexType(SchemaCompiler.java:1081) at org.apache.axis2.schema.SchemaCompiler.processAnonymousComplexSchemaType(SchemaCompiler.java:980) at org.apache.axis2.schema.SchemaCompiler.processSchema(SchemaCompiler.java:934) at org.apache.axis2.schema.SchemaCompiler.processElement(SchemaCompiler.java:592) at org.apache.axis2.schema.SchemaCompiler.processElement(SchemaCompiler.java:563) at org.apache.axis2.schema.SchemaCompiler.compile(SchemaCompiler.java:370) at org.apache.axis2.schema.SchemaCompiler.compile(SchemaCompiler.java:280) at org.apache.axis2.schema.ExtensionUtility.invoke(ExtensionUtility.java:103) ... 8 more whats is happening here? what about log4j

    Read the article

  • Character encoding issues when generating MD5 hash cross-platform

    - by rogueprocess
    This is a general question about character encoding when using MD5 libraries in various languages. My concern is: suppose I generate an MD5 hash using a native Python string object, like this: message = "hello world" m = md5() m.update(message) Then I take a hex version of that MD5 hash using: m.hexdigest() and send the message & MD5 hash via a network, let's say, a JMS message or an HTTP request. Now I get this message in a Java program in the form of a native Java string, along with the checksum. Then I generate an MD5 hash using Java, like this (using the Commons Codec library): String md5 = org.apache.commons.codec.digest.DigestUtils.md5Hex(s) My feeling is that this is wrong because I have not specified character encoding at either end. So the original hash will be based on the bytes of the Python version of the string; the Java one will be based on the bytes of the Java version of the string, and these two byte sequences will often not be the same - is that right? So really I need to specify "UTF-8" or whatever at both ends, right? (I am actually getting an intermittent error in my code where the MD5 checksum fails, and I suspect this is the reason - but because it's intermittent, it's difficult to say if changing this fixes it or not.) Thank you!
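
    That reading looks right: the hash is computed over bytes, so both ends have to agree on an encoding. A sketch of the Java side pinned to UTF-8; the Python side would correspondingly hash message.encode('utf-8'):

        import java.io.UnsupportedEncodingException;
        import org.apache.commons.codec.digest.DigestUtils;

        public class Md5Utf8 {
            static String md5Utf8(String message) throws UnsupportedEncodingException {
                // hash the UTF-8 bytes explicitly instead of whatever the platform default happens to be
                return DigestUtils.md5Hex(message.getBytes("UTF-8"));
            }
        }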

    Read the article

  • evaluation of a java thread dump

    - by raticulin
    I got a thread dump of one of my processes. It has a bunch of these threads, and I guess they are holding on to a lot of memory, so I am getting an OOM. "Thread-8264" prio=6 tid=0x4c94ac00 nid=0xf3c runnable [0x4fe7f000] java.lang.Thread.State: RUNNABLE at java.util.zip.Inflater.inflateBytes(Native Method) at java.util.zip.Inflater.inflate(Inflater.java:223) - locked <0x0c9bc640 (a java.util.zip.Inflater) at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.read(ZipArchiveInputStream.java:235) at com.my.ZipExtractorCommonsCompress.extract(ZipExtractorCommonsCompress.java:48) at com.my.CustomThreadedExtractorWrapper$ExtractionThread.run(CustomThreadedExtractorWrapper.java:151) Locked ownable synchronizers: - None "Thread-8241" prio=6 tid=0x4c94a400 nid=0xb8c runnable [0x4faef000] java.lang.Thread.State: RUNNABLE at java.util.zip.Inflater.inflateBytes(Native Method) at java.util.zip.Inflater.inflate(Inflater.java:223) - locked <0x0c36b808 (a java.util.zip.Inflater) at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.read(ZipArchiveInputStream.java:235) at com.my.ZipExtractorCommonsCompress.extract(ZipExtractorCommonsCompress.java:48) at com.my.CustomThreadedExtractorWrapper$ExtractionThread.run(CustomThreadedExtractorWrapper.java:151) Locked ownable synchronizers: - None I am trying to find out how it arrived at this situation. CustomThreadedExtractorWrapper is a wrapper class that fires a thread to do some work (ExtractionThread, which uses ZipExtractorCommonsCompress to extract zip contents from a compressed stream). If the task is taking too long, ExtractionThread.interrupt(); is called to cancel the operation. I can see in my logs that the cancellation happened 25 times, and I see 21 of these threads in my dump. My questions: What is the status of these threads? Alive and running? Blocked somehow? Apparently they did not die with .interrupt()? Is there a sure way to really kill a thread? What does 'locked' really mean in the stack trace? Line 223 in Inflater.java is: public synchronized int inflate(byte[] b, int off, int len) { ... //return is line 223 return inflateBytes(b, off, len); }
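
    On the interrupt question: a thread busy inside Inflater.inflateBytes is RUNNABLE, not parked in an interruptible wait, so interrupt() only sets a flag and the thread keeps running until it checks that flag. A sketch of the cooperative pattern the extraction loop would need (class and method names here are hypothetical, not the actual com.my code):

        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;

        public class CooperativeExtractor {
            // copy loop that honours interrupt() between reads instead of ignoring it
            static void extract(InputStream in, OutputStream out) throws IOException {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    if (Thread.currentThread().isInterrupted()) {
                        // give up promptly so the thread (and the buffers it holds) can be reclaimed
                        throw new IOException("extraction cancelled");
                    }
                    out.write(buffer, 0, read);
                }
            }
        }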

    Read the article

  • Lazy non-modifiable list in Google Collections

    - by mindas
    I was looking for a decent implementation of a generic lazy non-modifiable list to wrap my search result entries. The unmodifiable part of the task is easy, as it can be achieved with Collections.unmodifiableList(), so I only need to sort out the lazy part. Surprisingly, google-collections doesn't have anything to offer, while LazyList from Apache Commons Collections does not support generics. I have found an attempt to build something on top of google-collections, but it seems to be incomplete (e.g. does not support size()), outdated (does not compile with 1.0 final) and requires some external classes, though it could be used as a good starting point to build my own class. Is anybody aware of any good implementation of a LazyList? If not, which option do you think is best: write my own implementation, based on google-collections ForwardingList, similar to what Peter Maas did; write my own wrapper around Commons Collections LazyList (the wrapper would only add generics so I don't have to cast everywhere but only in the wrapper itself); just write something on top of java.util.AbstractList; Any other suggestions are welcome.
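
    For completeness, a bare-bones sketch of option 3 — wrapping java.util.AbstractList so elements are computed and cached on first access; the ElementLoader interface is made up for the example (a Guava Supplier or a Callable would do equally well):

        import java.util.AbstractList;
        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;

        public class LazyList<T> extends AbstractList<T> {

            /** Hypothetical factory for the element at a given index. */
            public interface ElementLoader<T> {
                T load(int index);
            }

            private final int size;
            private final ElementLoader<T> loader;
            private final List<T> cache;

            public LazyList(int size, ElementLoader<T> loader) {
                this.size = size;
                this.loader = loader;
                this.cache = new ArrayList<T>(Collections.<T>nCopies(size, null));
            }

            @Override
            public T get(int index) {
                T value = cache.get(index);
                if (value == null) {            // assumes the loader never returns null
                    value = loader.load(index);
                    cache.set(index, value);
                }
                return value;
            }

            @Override
            public int size() {
                return size;
            }
        }

        // usage: Collections.unmodifiableList(new LazyList<Result>(hitCount, loaderForSearchHits))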

    Read the article

  • Lazy non-modifiable list

    - by mindas
    I was looking for a decent implementation of a generic lazy non-modifiable list to wrap my search result entries. The unmodifiable part of the task is easy, as it can be achieved with Collections.unmodifiableList(), so I only need to sort out the lazy part. Surprisingly, google-collections doesn't have anything to offer, while LazyList from Apache Commons Collections does not support generics. I have found an attempt to build something on top of google-collections, but it seems to be incomplete (e.g. does not support size()), outdated (does not compile with 1.0 final) and requires some external classes, though it could be used as a good starting point to build my own class. Is anybody aware of any good implementation of a LazyList? If not, which option do you think is best: write my own implementation, based on google-collections ForwardingList, similar to what Peter Maas did; write my own wrapper around Commons Collections LazyList (the wrapper would only add generics so I don't have to cast everywhere but only in the wrapper itself); just write something on top of java.util.AbstractList; Any other suggestions are welcome.

    Read the article

  • Why is the Apache StringUtils class not recognized in Android?

    - by Maxood
    Why can't org.apache.commons.lang.StringUtils be imported in Android by default? Do I have to include an external library? If so, where can I find that library on the web? package com.myapps.urlencoding; import android.app.Activity; import org.apache.commons.lang.StringUtils; public class EncodeIdUtil extends Activity { /** Called when the activity is first created. */ private static Long multiplier=Long.parseLong("1zzzz",36); /** * Encodes the id. * @param id the id to encode * @return encoded string */ public static String encode(Long id) { return StringUtils.reverse(Long.toString((id*multiplier), 35)); } /** * Decodes the encoded id. * @param encodedId the encodedId to decode * @return the Id * @throws IllegalArgumentException if encodedId is not a validly encoded id. */ public static Long decode(String encodedId) throws IllegalArgumentException { long product; try { product = Long.parseLong(StringUtils.reverse(encodedId), 35); } catch (Exception e) { throw new IllegalArgumentException(); } if ( 0 != product % multiplier || product < 0) { throw new IllegalArgumentException(); } return product/multiplier; } }
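
    commons-lang is not bundled with the Android platform, so the choices are adding the commons-lang JAR to the project's build path or replacing the one call used here with plain JDK code; a sketch of the latter (the method name mirrors the question):

        public final class StringReverse {
            private StringReverse() {
            }

            // plain-JDK replacement for StringUtils.reverse(), so no external library is needed
            public static String reverse(String s) {
                if (s == null) {
                    return null;
                }
                return new StringBuilder(s).reverse().toString();
            }
        }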

    Read the article

  • Null Pointer Exception in Primavera P6 8.1

    - by gwrichard
    I am getting a null pointer exception in a Primavera P6 8.1 installation. The exception only occurs in one section of the web-client: Application settings.P6 web and the P6 API are deployed on the same WebLogic (10.3.5) node running on a Windows Server 2008 R2 installation. I have done this installation using this same software stack a dozen times and don't have this issue on any of the other installs. Exact error below: Match: beginTraversal Match: digest selected JREDesc: JREDesc[version 1.6.0_20+, heap=-1--1, args=null, href=http://java.sun.com/products/autodl/j2se, sel=false, null, null], JREInfo: JREInfo for index 0: platform is: 1.7 product is: 1.7.0_17 location is: http://java.sun.com/products/autodl/j2se path is: C:\Program Files (x86)\Java\jre7\bin\javaw.exe args is: null native platform is: Windows, x86 [ x86, 32bit ] JavaFX runtime is: JavaFX 2.2.7 found at C:\Program Files (x86)\Java\jre7\ enabled is: true registered is: true system is: true Match: ignoring maxHeap: -1 Match: ignoring InitHeap: -1 Match: digesting vmargs: null Match: digested vmargs: [JVMParameters: isSecure: true, args: ] Match: JVM args after accumulation: [JVMParameters: isSecure: true, args: ] Match: digest LaunchDesc: http://localhost:7001/p6/action/jnlp/appletsjnlp.jnlp?mainClass=com.primavera.pvapplets.adminpreferences.AdminPreferencesApplet&classPath=adminpreferences.jar,prm-applets-common.jar,forms-1.0.7.jar,prm-guisupport.jar,prm-to.jar,jide.jar,tablesupport.jar,formsupport.jar,applets-bo.jar,commons-lang.jar,prm-common.jar,resource_strings.jar,prm-img.jar,commons-logging.jar&name=AdminPreferences&version=8.1.2.0.0602 Match: digest properties: [] Match: JVM args: [JVMParameters: isSecure: true, args: ] Match: endTraversal .. Match: JVM args final: Match: Running JREInfo Version match: 1.7.0.17 == 1.7.0.17 Match: Running JVM args match: have:<> satisfy want:<> Java Plug-in 10.17.2.02 Using JRE version 1.7.0_17-b02 Java HotSpot(TM) Client VM User home directory = C:\Users\gwrichard ---------------------------------------------------- c: clear console window f: finalize objects on finalization queue g: garbage collect h: display this help message l: dump classloader list m: print memory usage o: trigger logging q: hide console r: reload policy configuration s: dump system and deployment properties t: dump thread list v: dump thread stack x: clear classloader cache 0-5: set trace level to <n> ----------------------------------------------------

    Read the article

  • JavaFX: File upload to REST service / servlet fails because of missing boundary

    - by spa
    I'm trying to upload a file using JavaFX using the HttpRequest. For this purpose I have written the following function. function uploadFile(inputFile : File) : Void { // check file if (inputFile == null or not(inputFile.exists()) or inputFile.isDirectory()) { return; } def httpRequest : HttpRequest = HttpRequest { location: urlConverter.encodeURL("{serverUrl}"); source: new FileInputStream(inputFile) method: HttpRequest.POST headers: [ HttpHeader { name: HttpHeader.CONTENT_TYPE value: "multipart/form-data" } ] } httpRequest.start(); } On the server side, I am trying to handle the incoming data using the Apache Commons FileUpload API using a Jersey REST service. The code used to do this is a simple copy of the FileUpload tutorial on the Apache homepage. @Path("Upload") public class UploadService { public static final String RC_OK = "OK"; public static final String RC_ERROR = "ERROR"; @POST @Produces("text/plain") public String handleFileUpload(@Context HttpServletRequest request) { if (!ServletFileUpload.isMultipartContent(request)) { return RC_ERROR; } FileItemFactory factory = new DiskFileItemFactory(); ServletFileUpload upload = new ServletFileUpload(factory); List<FileItem> items = null; try { items = upload.parseRequest(request); } catch (FileUploadException e) { e.printStackTrace(); return RC_ERROR; } ... } } However, I get a exception at items = upload.parseRequest(request);: org.apache.commons.fileupload.FileUploadException: the request was rejected because no multipart boundary was found I guess I have to add a manual boundary info to the InputStream. Is there any easy solution to do this? Or are there even other solutions?
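
    The server-side exception matches the request being sent: multipart/form-data requires a boundary parameter in the Content-Type header and a body framed with that boundary, while the code above streams the raw file bytes. A sketch in plain Java (field and file names are hypothetical) of what a minimal multipart body looks like — this is the shape of payload Commons FileUpload expects to parse:

        import java.io.ByteArrayOutputStream;
        import java.io.File;
        import java.io.FileInputStream;
        import java.io.InputStream;

        public class MultipartBodyBuilder {
            // Builds a single-part multipart/form-data body by hand.
            // The matching request header must be: Content-Type: multipart/form-data; boundary=<boundary>
            static byte[] build(String boundary, String fieldName, File file) throws Exception {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                String header = "--" + boundary + "\r\n"
                        + "Content-Disposition: form-data; name=\"" + fieldName + "\"; filename=\"" + file.getName() + "\"\r\n"
                        + "Content-Type: application/octet-stream\r\n\r\n";
                out.write(header.getBytes("US-ASCII"));

                InputStream in = new FileInputStream(file);
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
                in.close();

                out.write(("\r\n--" + boundary + "--\r\n").getBytes("US-ASCII"));
                return out.toByteArray();
            }
        }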

    Read the article

< Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >