Search Results

Search found 46178 results on 1848 pages for 'java home'.


  • Can we view objects in the JVM memory?

    - by Sebastien Lorber
    Hey, at work we found that on some instances (particularly the slow ones) we have a different behaviour, acquired at reboot. We guess a cache is not initialized correctly, or maybe there is a concurrency problem... Anyway, it's not reproducible in any env other than production. We actually don't have loggers to activate... it's an old component... Thus I'd like to know if there are tools that can help us see the different objects present in the JVM memory, in order to check the content of the cache... Thank you!
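
    A sketch of one option (assuming a HotSpot JVM on JDK 6+, and a placeholder dump path): the heap can be dumped programmatically through the HotSpotDiagnostic MXBean and then browsed offline with jhat, VisualVM or Eclipse MAT to inspect the cache contents. jmap -dump:live,format=b,file=heap.hprof <pid> does the same from the command line.

        import java.lang.management.ManagementFactory;
        import javax.management.MBeanServer;
        import com.sun.management.HotSpotDiagnosticMXBean;

        public class HeapDumper {
            public static void main(String[] args) throws Exception {
                MBeanServer server = ManagementFactory.getPlatformMBeanServer();
                // Proxy for the HotSpot-specific diagnostic bean (Sun/Oracle JVMs only)
                HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                        server, "com.sun.management:type=HotSpotDiagnostic", HotSpotDiagnosticMXBean.class);
                // Second argument = true keeps only live (reachable) objects in the dump
                bean.dumpHeap("/tmp/heap.hprof", true);
            }
        }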

    Read the article

  • jboss cache as hibernate 2nd level - cluster node doesn't persist replicated data

    - by Sergey Grashchenko
    I'm trying to build an architecture basically described in the user guide http://www.jboss.org/file-access/default/members/jbosscache/freezone/docs/3.2.1.GA/userguide_en/html/cache_loaders.html#d0e3090 (replicated caches with each cache having its own store), but with JBoss Cache configured as the Hibernate second-level cache. I've read the manual for several days and played with the settings, but could not achieve the result: the data in memory (JBoss Cache) gets replicated across the hosts, but it's not persisted in the datasource/database of the target (not original) cluster host. I had a hope that a node might become persistent at eviction, so I've got a cache listener and attached it to the @NodeEvicted event. I found that though I could adjust the eviction policy to fully control it, no persistence takes place. Then I had a thought that I could try to modify the CacheLoader to set "passivation" to true, but I found that in my case (Hibernate 2nd level cache) I don't have a way to access a loader. I wonder if replicated data persistence is possible at all by configuration tuning? If not, will it work for me to create some manual persistence in a CacheListener (I could check whether the eviction event is local, and if not, persist it to the Hibernate datasource somehow)? I've used the mvcc-entity configuration with a modification of cacheMode, set to REPL_ASYNC. I've also played with the eviction policy configuration. Last thing to mention is that I've tested entity persistence and replication in a project that has been generated with Seam. I guess it's not important though.

    Read the article

  • Weak hashmap with weak references to the values?

    - by Razor Storm
    I am building an Android app where each entity has a bitmap that represents its sprite. However, each entity can be duplicated (there might be 3 copies of entity asdf, for example). One approach is to load all the sprites upfront and then put the correct sprite in the constructors of the entities. However, I want to decode the bitmaps lazily, so that the constructors of the entities will decode the bitmaps. The only problem with this is that duplicated entities will load the same bitmap twice, using 2x the memory (or n times if the entity is created n times). To fix this, I built a SingularBitmapFactory that will store a decoded Bitmap into a hash, and if the same bitmap is asked for again, will simply return the previously hashed one instead of building a new one. The problem with this, though, is that the factory holds a reference to all bitmaps, and so they won't ever get garbage collected. What's the best way to switch the hashmap to one with weakly referenced values? In other words, I want a structure where a value won't be GC'd while any other object holds a reference to it, but as long as no other object refers to it, it can be GC'd.
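
    A common sketch (assuming android.graphics.Bitmap and resource ids as keys; the class and method names here are only illustrative) keeps WeakReference values in an ordinary HashMap and re-decodes when a reference has been cleared, since WeakHashMap only weakens its keys. Guava's MapMaker/CacheBuilder with weakValues(), or SoftReference values, are alternatives if the bitmaps should live a bit longer.

        import java.lang.ref.WeakReference;
        import java.util.HashMap;
        import java.util.Map;

        import android.content.res.Resources;
        import android.graphics.Bitmap;
        import android.graphics.BitmapFactory;

        public class SingularBitmapFactory {
            private final Map<Integer, WeakReference<Bitmap>> cache =
                    new HashMap<Integer, WeakReference<Bitmap>>();
            private final Resources resources;

            public SingularBitmapFactory(Resources resources) {
                this.resources = resources;
            }

            public Bitmap get(int resId) {
                WeakReference<Bitmap> ref = cache.get(resId);
                Bitmap bitmap = (ref == null) ? null : ref.get();   // null if never cached or already GC'd
                if (bitmap == null) {
                    bitmap = BitmapFactory.decodeResource(resources, resId);
                    // the factory itself holds only a weak reference, so the entities keep the bitmap alive
                    cache.put(resId, new WeakReference<Bitmap>(bitmap));
                }
                return bitmap;
            }
        }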

    Read the article

  • I am facing a problem with this error

    - by Sanjeev
    I am using a Struts application. While running, the welcome page loads successfully, but after that the following error appears: org.apache.jasper.JasperException: Failed to load or instantiate TagLibraryValidator class: org.apache.taglibs.standard.tlv.JstlCoreTLV. Any idea what the problem is?

    Read the article

  • Configuring Hadoop logging to avoid too many log files

    - by Eric Wendelin
    I'm having a problem with Hadoop producing too many log files in $HADOOP_LOG_DIR/userlogs (the Ext3 filesystem allows only 32000 subdirectories) which looks like the same problem in this question: http://stackoverflow.com/questions/2091287/error-in-hadoop-mapreduce My question is: does anyone know how to configure Hadoop to roll the log dir or otherwise prevent this? I'm trying to avoid just setting the "mapred.userlog.retain.hours" and/or "mapred.userlog.limit.kb" properties because I want to actually keep the log files. I was also hoping to configure this in log4j.properties, but looking at the Hadoop 0.20.2 source, it writes directly to logfiles instead of actually using log4j. Perhaps I don't understand how it's using log4j fully. Any suggestions or clarifications would be greatly appreciated.

    Read the article

  • Improving a Swing GUI

    - by radi
    Hi all, I am looking for ways to improve a Swing GUI. For example, I want to know about new component libraries, new methods to enhance Swing performance, new methods to add special effects to a Swing GUI, and new PLAFs for Swing. Please, I want to know how to use these topics and where to find some tutorials about them. Thanks.
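
    As one small, hedged example (Nimbus ships with Java 6u10 and later; this is just a sketch of switching PLAFs, not an endorsement of any particular library), a different pluggable look and feel can be installed before the components are built:

        import javax.swing.SwingUtilities;
        import javax.swing.UIManager;

        public class NimbusDemo {
            public static void main(String[] args) {
                SwingUtilities.invokeLater(new Runnable() {
                    public void run() {
                        try {
                            // Pick Nimbus if it is installed, otherwise keep the default look and feel
                            for (UIManager.LookAndFeelInfo info : UIManager.getInstalledLookAndFeels()) {
                                if ("Nimbus".equals(info.getName())) {
                                    UIManager.setLookAndFeel(info.getClassName());
                                    break;
                                }
                            }
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                        // ... build the frame and components here ...
                    }
                });
            }
        }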

    Read the article

  • Jetty offline documentation

    - by Victor Sorokin
    Is there any more-or-less comprehensive documentation for Jetty (e.g., similar to Tomcat's or, really, in any form)? Their online wikis seem to be quite informative, but the servers seem to be slow. Maybe there is some way to build docs from the Jetty source distribution? I tried mvn site to no avail.

    Read the article

  • understanding list[i-1] vs list[i]-1

    - by user3720527
    Hopefully this is a simple answer that I am just failing to understand. The full code is public static void mystery(int[] list) { for (int i = list.length - 1; i > 1; i--) { if (list[i] > list[i - 1]) { list[i - 1] = list[i] - 2; list[i]++; } } } and let's say we are using a list of [2,3,4]. I know that it will output 2,2,5, but I am unclear how to actually work through it. I understand that list.length is 3 here, and I understand that the for loop will only run once, but I am very unclear what happens at the list[i - 1] = list[i] - 2; part. Should it be list[2-1] = list[2] - 2? How does the 2 being outside the bracket affect it differently? Much thanks.
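
    A minimal trace of the question's own example may make the indexing clearer: list[i - 1] is the element at position i - 1, whereas list[i] - 2 is the value stored at position i, minus 2.

        public class MysteryTrace {
            public static void main(String[] args) {
                int[] list = {2, 3, 4};
                // Only i = 2 runs: the loop starts at list.length - 1 = 2 and stops once i is not > 1
                int i = 2;
                if (list[i] > list[i - 1]) {        // 4 > 3, so the branch is taken
                    list[i - 1] = list[i] - 2;      // list[1] = 4 - 2 = 2  (index i-1 gets value-at-i minus 2)
                    list[i]++;                      // list[2] becomes 5
                }
                System.out.println(java.util.Arrays.toString(list));  // prints [2, 2, 5]
            }
        }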

    Read the article

  • Problem signing jars for web applet

    - by nuno_cruz
    I sign my jars like this:
        keytool -genkey -keystore myKeyStore -alias me
        keytool -selfcert -keystore myKeyStore -alias me
        jarsigner -keystore myKeyStore jarfile.jar me
    I use my own jar and a few more as libraries, and all of them are signed this way. Still, when I open the browser I get the warning that there is signed and unsigned code, so this is confusing me... :/ http://dl.dropbox.com/u/1430071/images/errormessage.png

    Read the article

  • unable to set fields of collection-property elements after changing their order (elements becoming 'deleted')

    - by Jaroslav Záruba
    Hello. I want to change the order of objects in a collection, and then access and modify fields of those items. Unfortunately the items somehow become 'deleted'. This is what I do... if(someCondition) { MainEvent mainEvent = pm.getObjectById(MainEvent.class, mainEventKey); /* * events in the original order * MainEvent.subEvents field is not in default fetch group, * therefore I also tried to add the named group into the * persistenceManager fetch plan, no difference * (mainEvent is not an instance of the Event sub/class BTW) */ List<Event> subEvents = mainEvent.getSubEvents(); // re-arrange the events according to keysOrdered { Map<Key, Event> eventMap = new HashMap<Key, Event>(); for(Event event : subEvents) eventMap.put(event.getKey(), event); List<Event> eventsOrdered = new LinkedList<Event>(); for(Key eventKey : keysOrdered) eventsOrdered.add(eventMap.put(eventKey, eventMap.get(eventKey))); // } // put the re-arranged items back into the collection property { subEvents.clear(); subEvents.addAll(eventsOrdered); // } pm.makePersistent(mainEvent); eventsOrdered = subEvents; } else eventsOrdered = getEventsUsingAlternateApproach(); /* * so by now the mainEvent variable does not exist; * could it be this led the persistence manager to mark * my events as abandoned/obsolete/invalid/deleted...? */ for(Event event : eventsOrdered) event.setDate(new Date()); // -> "Cannot write fields to a deleted object" What am I doing wrong, please?

    Read the article

  • Accept All Cookies via HttpClient

    - by Vinay
    So this is currently how my app is set up: 1.) Login Activity. 2.) Once logged in, other activities may be fired up that use PHP scripts that require the cookies sent from logging in. I am using one HttpClient across my app to ensure that the same cookies are used, but my problem is that I am getting 2 of the 3 cookies rejected. I do not care about the validity of the cookies, but I do need them to be accepted. I tried setting the CookiePolicy, but that hasn't worked either. This is what logcat is saying: 11-26 10:33:57.613: WARN/ResponseProcessCookies(271): Cookie rejected: "[version: 0] [name: cookie_user_id][value: 1][domain: www.trackallthethings.com][path: trackallthethings][expiry: Sun Nov 25 11:33:00 CST 2012]". Illegal path attribute "trackallthethings". Path of origin: "/mobile-api/login.php" 11-26 10:33:57.593: WARN/ResponseProcessCookies(271): Cookie rejected: "[version: 0][name: cookie_session_id][value: 1985208971][domain: www.trackallthethings.com][path: trackallthethings][expiry: Sun Nov 25 11:33:00 CST 2012]". Illegal path attribute "trackallthethings". Path of origin: "/mobile-api/login.php" I am sure that my actual code is correct (my app still logs in correctly, just doesn't accept the aforementioned cookies), but here it is anyway: HttpGet httpget = new HttpGet(//MY URL); HttpResponse response; response = Main.httpclient.execute(httpget); HttpEntity entity = response.getEntity(); InputStream in = entity.getContent(); BufferedReader reader = new BufferedReader(new InputStreamReader(in)); StringBuilder sb = new StringBuilder(); From here I use the StringBuilder to simply get the String of the response. Nothing fancy. I understand that the reason my cookies are being rejected is because of an "Illegal path attribute" (I am running a script at /mobile-api/login.php whereas the cookie will return with a path of just "/" for trackallthethings), but I would like to accept the cookies anyhow. Is there a way to do this?
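
    One approach that is often suggested for HttpClient 4.x (a sketch only; the registry name "lenient" is arbitrary and the exact packages depend on the HttpClient bundled with the platform) is to register a CookieSpec whose validate() accepts everything, and make it the default policy on the shared client:

        import org.apache.http.client.params.ClientPNames;
        import org.apache.http.cookie.Cookie;
        import org.apache.http.cookie.CookieOrigin;
        import org.apache.http.cookie.CookieSpec;
        import org.apache.http.cookie.CookieSpecFactory;
        import org.apache.http.impl.client.DefaultHttpClient;
        import org.apache.http.impl.cookie.BrowserCompatSpec;
        import org.apache.http.params.HttpParams;

        public class LenientCookieClient {
            public static DefaultHttpClient create() {
                DefaultHttpClient client = new DefaultHttpClient();
                CookieSpecFactory lenient = new CookieSpecFactory() {
                    public CookieSpec newInstance(HttpParams params) {
                        return new BrowserCompatSpec() {
                            @Override
                            public void validate(Cookie cookie, CookieOrigin origin) {
                                // accept every cookie, even ones with an "illegal" path attribute
                            }
                        };
                    }
                };
                client.getCookieSpecs().register("lenient", lenient);
                client.getParams().setParameter(ClientPNames.COOKIE_POLICY, "lenient");
                return client;
            }
        }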

    Read the article

  • Exception when inserting data into the database using JPA in NetBeans

    - by sandeep
    SEVERE: Local Exception Stack: Exception [EclipseLink-7092] (Eclipse Persistence Services - 2.0.0.v20091127-r5931): org.eclipse.persistence.exceptions.ValidationException Exception Description: Cannot add a query whose types conflict with an existing query. Query To Be Added: [ReadAllQuery(name="Voter.findAll" referenceClass=Voter jpql="SELECT v FROM Voter v")] is named: [Voter.findAll] with arguments [[]].The existing conflicting query: [ReadAllQuery(name="Voter.findAll" referenceClass=Voter jpql="SELECT v FROM Voter v")] is named: [Voter.findAll] with arguments: [[]].

    Read the article

  • Why is JavaMail Transport.send() a static method?

    - by skiphoppy
    I'm revising code I did not write that uses JavaMail, and having a little trouble understanding why the JavaMail API is designed the way it is. I have the feeling that if I understood, I could be doing a better job. We call: transport = session.getTransport("smtp"); transport.connect(hostName, port, user, password); So why is Eclipse warning me that this: transport.send(message, message.getAllRecipients()); is a call to a static method? Why am I getting a Transport object and providing settings that are specific to it if I can't use that object to send the message? How does the Transport class even know what server and other settings to use to send the message? It's working fine, which is hard to believe. What if I had instantiated Transport objects for two different servers; how would it know which one to use? In the course of writing this question, I've discovered that I should really be calling: transport.sendMessage(message, message.getAllRecipients()); So what is the purpose of the static Transport.send() method? Is this just poor design, or is there a reason it is this way?
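
    For what it's worth, a short side-by-side sketch of the two call styles (host, port, user and password are placeholders):

        import javax.mail.Message;
        import javax.mail.MessagingException;
        import javax.mail.Session;
        import javax.mail.Transport;

        public class SendStyles {
            // Static convenience: Transport.send() looks up a Transport from the message's own
            // Session, connects, sends and closes it per call -- any Transport instance you happen
            // to hold (and its connect() settings) is ignored entirely.
            static void sendViaStatic(Message message) throws MessagingException {
                Transport.send(message, message.getAllRecipients());
            }

            // Instance form: reuses the connection you opened yourself, which matters when sending
            // several messages over one SMTP session or when the server settings differ from the
            // session defaults.
            static void sendViaInstance(Session session, Message message,
                                        String host, int port, String user, String password)
                    throws MessagingException {
                Transport transport = session.getTransport("smtp");
                transport.connect(host, port, user, password);
                try {
                    transport.sendMessage(message, message.getAllRecipients());
                } finally {
                    transport.close();
                }
            }
        }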

    Read the article

  • Collections not read from hibernate/ehcache second-level-cache

    - by Mark van Venrooij
    I'm trying to cache lazy loaded collections with ehcache/hibernate in a Spring project. When I execute a session.get(Parent.class, 123) and browse through the children multiple times a query is executed every time to fetch the children. The parent is only queried the first time and then resolved from the cache. Probably I'm missing something, but I can't find the solution. Please see the relevant code below. I'm using Spring (3.2.4.RELEASE) Hibernate(4.2.1.Final) and ehcache(2.6.6) The parent class: @Entity @Table(name = "PARENT") @Cacheable @Cache(usage = CacheConcurrencyStrategy.READ_WRITE, include = "all") public class ServiceSubscriptionGroup implements Serializable { /** The Id. */ @Id @Column(name = "ID") private int id; @OneToMany(fetch = FetchType.LAZY, mappedBy = "parent") @Cache(usage = CacheConcurrencyStrategy.READ_WRITE) private List<Child> children; public List<Child> getChildren() { return children; } public void setChildren(List<Child> children) { this.children = children; } @Override public boolean equals(Object o) { if (this == o) return true; if (o == null || getClass() != o.getClass()) return false; Parent that = (Parent) o; if (id != that.id) return false; return true; } @Override public int hashCode() { return id; } } The child class: @Entity @Table(name = "CHILD") @Cacheable @Cache(usage = CacheConcurrencyStrategy.READ_WRITE, include = "all") public class Child { @Id @Column(name = "ID") private int id; @ManyToOne(fetch = FetchType.LAZY, cascade = CascadeType.ALL) @JoinColumn(name = "PARENT_ID") @Cache(usage = CacheConcurrencyStrategy.READ_WRITE) private Parent parent; public int getId() { return id; } public void setId(final int id) { this.id = id; } private Parent getParent(){ return parent; } private void setParent(Parent parent) { this.parent = parent; } @Override public boolean equals(final Object o) { if (this == o) { return true; } if (o == null || getClass() != o.getClass()) { return false; } final Child that = (Child) o; return id == that.id; } @Override public int hashCode() { return id; } } The application context: <bean id="sessionFactory" class="org.springframework.orm.hibernate4.LocalSessionFactoryBean"> <property name="dataSource" ref="dataSource" /> <property name="annotatedClasses"> <list> <value>Parent</value> <value>Child</value> </list> </property> <property name="hibernateProperties"> <props> <prop key="hibernate.dialect">org.hibernate.dialect.SQLServer2008Dialect</prop> <prop key="hibernate.hbm2ddl.auto">validate</prop> <prop key="hibernate.ejb.naming_strategy">org.hibernate.cfg.ImprovedNamingStrategy</prop> <prop key="hibernate.connection.charSet">UTF-8</prop> <prop key="hibernate.show_sql">true</prop> <prop key="hibernate.format_sql">true</prop> <prop key="hibernate.use_sql_comments">true</prop> <!-- cache settings ehcache--> <prop key="hibernate.cache.use_second_level_cache">true</prop> <prop key="hibernate.cache.use_query_cache">true</prop> <prop key="hibernate.cache.region.factory_class"> org.hibernate.cache.ehcache.SingletonEhCacheRegionFactory</prop> <prop key="hibernate.generate_statistics">true</prop> <prop key="hibernate.cache.use_structured_entries">true</prop> <prop key="hibernate.cache.use_query_cache">true</prop> <prop key="hibernate.transaction.factory_class"> org.hibernate.engine.transaction.internal.jta.JtaTransactionFactory</prop> <prop key="hibernate.transaction.jta.platform"> org.hibernate.service.jta.platform.internal.JBossStandAloneJtaPlatform</prop> </props> </property> </bean> The testcase I'm running: @Test public 
void testGetParentFromCache() { for (int i = 0; i < 3; i++) { getEntity(); } } private void getEntity() { Session sess = sessionFactory.openSession(); sess.setCacheMode(CacheMode.NORMAL); Transaction t = sess.beginTransaction(); Parent p = (Parent) sess.get(Parent.class, 123); Assert.assertNotNull(p); Assert.assertNotNull(p.getChildren().size()); t.commit(); sess.flush(); sess.clear(); sess.close(); } In the logging I can see that the first time two queries are executed: one getting the parent and one getting the children. Furthermore, the logging shows that the child entities as well as the collection are stored in the 2nd level cache. However, a query is still executed to fetch the children on the second and third attempts.

    Read the article

  • bridge methods explanation

    - by xdevel2000
    If I override the clone method, the compiler creates a bridge method to guarantee correct polymorphism: class Point { Point() { } protected Point clone() throws CloneNotSupportedException { return this; // not good, only for example!!! } protected volatile Object clone() throws CloneNotSupportedException { return clone(); } } So when the clone method is invoked, the bridge method is invoked, and inside it the correct clone method is invoked. But my question is: when, inside the bridge method, return clone() is called, how does the VM know that it must invoke Point clone() and not the bridge itself again???
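
    A hedged sketch of what the compiler emits: in bytecode a call site names a method by its full descriptor, return type included, so the synthetic bridge clone()Ljava/lang/Object; is compiled to invoke clone()LPoint; and therefore cannot resolve back to itself.

        class Point implements Cloneable {
            // The only method written in source: a covariant override of Object.clone().
            @Override
            protected Point clone() throws CloneNotSupportedException {
                return (Point) super.clone();
            }

            /*
             * The compiler adds a synthetic bridge roughly equivalent to:
             *
             *     protected Object clone() throws CloneNotSupportedException {
             *         return clone();   // compiled as: invokevirtual Point.clone:()LPoint;
             *     }
             *
             * Because the bytecode reference carries the exact descriptor ()LPoint;, the VM
             * resolves the covariant method above, never the bridge, so there is no recursion.
             */
        }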

    Read the article

  • Source Lookup Path is correct but debugger can't find file (Eclipse EE IDE)?

    - by Greg McNulty
    When debugging, stepping over each line works. Stepping into a function located in another file makes the debugger display: Source not found. It also displays the option Edit Source Lookup Path..., but the correct package is already listed there. (I also tried pointing to the directory path.) No other breakpoints are set, which is a commonly suggested fix. Any point in the right direction is helpful. Thank you. Thread[main] in the debugger window: Thread [main] (Suspended) ClassNotFoundException(Throwable).<init>(String, Throwable) line: 217 ClassNotFoundException(Exception).<init>(String, Throwable) line: not available ClassNotFoundException.<init>(String) line: not available URLClassLoader$1.run() line: not available AccessController.doPrivileged(PrivilegedExceptionAction<T>, AccessControlContext) line: not available [native method] Launcher$ExtClassLoader(URLClassLoader).findClass(String) line: not available Launcher$ExtClassLoader.findClass(String) line: not available Launcher$ExtClassLoader(ClassLoader).loadClass(String, boolean) line: not available Launcher$AppClassLoader(ClassLoader).loadClass(String, boolean) line: not available Launcher$AppClassLoader.loadClass(String, boolean) line: not available Launcher$AppClassLoader(ClassLoader).loadClass(String) line: not available MyMain.<init>() line: 24 MyMain.main(String[]) line: 36

    Read the article

  • Parsing custom time format with SimpleDateFormat

    - by ggrigery
    I'm having trouble parsing a date format that I'm getting back from an API and that I have never seen (I believe it is a custom format). An example of a date: /Date(1353447000000+0000)/ When I first encountered this format it didn't take me long to see that it was the time in milliseconds with a time zone offset. I'm having trouble extracting this date using SimpleDateFormat though. Here was my first attempt: String weirdDate = "/Date(1353447000000+0000)/"; SimpleDateFormat sdf = new SimpleDateFormat("'/Date('SSSSSSSSSSSSSZ')/'"); Date d1 = sdf.parse(weirdDate); System.out.println(d1.toString()); System.out.println(d1.getTime()); System.out.println(); Date d2 = new Date(Long.parseLong("1353447000000")); System.out.println(d2.toString()); System.out.println(d2.getTime()); And output: Tue Jan 06 22:51:41 EST 1970 532301760 Tue Nov 20 16:30:00 EST 2012 1353447000000 The date (and number of milliseconds parsed) is not even close and I haven't been able to figure out why. After some troubleshooting, I discovered that the way I'm trying to use SDF is clearly flawed. Example: String weirdDate = "1353447000000"; SimpleDateFormat sdf = new SimpleDateFormat("S"); Date d1 = sdf.parse(weirdDate); System.out.println(d1.toString()); System.out.println(d1.getTime()); And output: Wed Jan 07 03:51:41 EST 1970 550301760 I can't say I've ever tried to use SDF in this way to just parse a time in milliseconds because I would normally use Long.parseLong() and just pass it straight into new Date(long) (and in fact the solution I have in place right now is just a regular expression and parsing a long). I'm looking for a cleaner solution where I can easily extract this time in milliseconds with the timezone and quickly parse it out into a date without the messy manual handling. Does anyone have any ideas, or can anyone spot the errors in my logic above? Help is much appreciated.
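
    For reference, SimpleDateFormat's S pattern letter means the millisecond field within a second, not "milliseconds since the epoch", which is why both attempts drift. A regex plus new Date(long) is arguably the cleanest route; a sketch (assuming the value is the Microsoft/WCF-style JSON date and that the trailing offset can be ignored, since epoch milliseconds are already UTC):

        import java.util.Date;
        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class WcfDateParser {
            private static final Pattern WCF_DATE = Pattern.compile("/Date\\((\\d+)([+-]\\d{4})?\\)/");

            public static Date parse(String weirdDate) {
                Matcher m = WCF_DATE.matcher(weirdDate);
                if (!m.matches()) {
                    throw new IllegalArgumentException("Not a /Date(...)/ value: " + weirdDate);
                }
                long millis = Long.parseLong(m.group(1));   // epoch milliseconds, already UTC
                // m.group(2) holds the "+0000" offset if it is ever needed for display purposes
                return new Date(millis);
            }

            public static void main(String[] args) {
                System.out.println(parse("/Date(1353447000000+0000)/"));  // Tue Nov 20 16:30:00 2012 in EST
            }
        }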

    Read the article

  • Calculated group-by fields in MongoDB

    - by Navin Viswanath
    For this example from the MongoDB documentation, how do I write the query using MongoTemplate? db.sales.aggregate( [ { $group : { _id : { month: { $month: "$date" }, day: { $dayOfMonth: "$date" }, year: { $year: "$date" } }, totalPrice: { $sum: { $multiply: [ "$price", "$quantity" ] } }, averageQuantity: { $avg: "$quantity" }, count: { $sum: 1 } } } ] ) Or in general, how do I group by a calculated field?
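
    If the installed spring-data-mongodb version does not expose $multiply and the date operators through its typed Aggregation builder, one fallback sketch (assuming a 2.x Java driver where DBCollection.aggregate is available, and reusing the collection and field names from the example) is to build the $group stage by hand and run it on the driver collection that MongoTemplate wraps:

        import java.util.Arrays;

        import org.springframework.data.mongodb.core.MongoTemplate;

        import com.mongodb.AggregationOutput;
        import com.mongodb.BasicDBObject;
        import com.mongodb.DBObject;

        public class SalesAggregation {
            // Rebuilds the shell pipeline: { $group: { _id: {month, day, year}, totalPrice, averageQuantity, count } }
            public static AggregationOutput totalsPerDay(MongoTemplate mongoTemplate) {
                DBObject dateId = new BasicDBObject("month", new BasicDBObject("$month", "$date"))
                        .append("day", new BasicDBObject("$dayOfMonth", "$date"))
                        .append("year", new BasicDBObject("$year", "$date"));
                DBObject groupFields = new BasicDBObject("_id", dateId)
                        .append("totalPrice", new BasicDBObject("$sum",
                                new BasicDBObject("$multiply", Arrays.asList("$price", "$quantity"))))
                        .append("averageQuantity", new BasicDBObject("$avg", "$quantity"))
                        .append("count", new BasicDBObject("$sum", 1));
                return mongoTemplate.getCollection("sales").aggregate(new BasicDBObject("$group", groupFields));
            }
        }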

    Read the article

  • How do you create your own drag-n-drop GUI designer?

    - by panzerschreck
    Hello. I am looking at creating a UI for developing web forms, similar to the NetBeans visual JSF form designer. It will be targeted at GWT/GXT components, and I am aiming for a look very similar to VS/NetBeans. Any thoughts on where/how to start? Initially I would prefer to have it as a standalone application, and later develop it as an Eclipse plugin. I have already evaluated the Ext JS designer and the Eclipse plugin from Instantiations, but I would prefer to develop it on my own, as it looks challenging. Also, I have a few custom components that have already been developed. Thanks

    Read the article

  • Eclipse - Force Refresh of IDs

    - by Echilon
    I'm using eclipse for Android development, and the editor always seems to take a while to actually update and recognize if I change an ID in a layout, then try to use it in a class with R.id.someId. Is there a way to force a refresh?

    Read the article

  • org.apache.commons.httpclient.NameValuePair in post method

    - by pushkins
    I'm writing some code like: PostMethod p = new PostMethod(someurl); ... NameValuePair[] data = { new NameValuePair("name1", "somevalue1"), new NameValuePair("var[3][1]", "10") }; try { hc.executeMethod(p); } ... And this is what I get when I look at my POST in Wireshark: POST /someurl HTTP/1.1 ... type=var&ship%5B3%5D%5B1%5D=10 (%5B means [ and %5D means ]). So the problem is: how can I get square brackets in my POST?
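
    Note that the brackets are in the POST body -- %5B and %5D are simply their URL-encoded forms, which HttpClient applies and the server decodes automatically. A quick sanity check (a sketch using java.net.URLDecoder):

        import java.io.UnsupportedEncodingException;
        import java.net.URLDecoder;

        public class DecodeCheck {
            public static void main(String[] args) throws UnsupportedEncodingException {
                String wireForm = "var%5B3%5D%5B1%5D=10";
                System.out.println(URLDecoder.decode(wireForm, "UTF-8"));   // prints: var[3][1]=10
            }
        }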

    Read the article
