Search Results

Search found 33297 results on 1332 pages for 'java java ee'.


  • Using Hibernate to do a query involving two tables

    - by Nathan Spears
    I'm inexperienced with SQL in general, so using Hibernate is like looking for an answer before I know exactly what the question is. Please feel free to correct any misunderstandings I have. I am on a project where I have to use Hibernate. Most of what I am doing is pretty basic and I could copy and modify. Now I would like to do something different and I'm not sure how configuration and syntax need to come together.

    Let's say I have two tables. Table A has two (relevant) columns, user GUID and manager GUID. Obviously managers can have more than one user under them, so queries on manager can return more than one row. Additionally, a manager can be managing the same user on multiple projects, so the same user can be returned multiple times for the same manager query. Table B has two columns, user GUID and user full name. One-to-one mapping there.

    I want to do a query on manager GUID from Table A, group the results by unique user GUID (so the same user isn't in the results twice), then return those users' full names from Table B. I could do this in SQL without too much trouble, but I want to use Hibernate so I don't have to parse the SQL results by hand. That's one of the points of using Hibernate, isn't it?

    Right now I have Hibernate mappings that map each column in Table A to a field (well, the get/set methods, I guess) in a DAO object that I wrote just to hold that table's data. I could also use the Hibernate DAOs I have to access each table separately and do each of the things I mentioned above in separate steps, but that would be less efficient (I assume) than doing one query. I wrote a Service object to hold the data that gets returned from the query (my example is simplified - I'm going to keep some other data from Table A and get multiple columns from Table B), but I'm at a loss for how to write a DAO that can do the join, or use the DAOs I have to do the join.

    FYI, here is a sample of my Hibernate config file (simplified to match my example):

        <hibernate-mapping package="com.my.dao">
            <class name="TableA" table="table_a">
                <id name="pkIndex" column="pk_index" />
                <property name="userGuid" column="user_guid" />
                <property name="managerGuid" column="manager_guid" />
            </class>
        </hibernate-mapping>

    So then I have a DAOImplementation class that does queries and returns lists, like

        public List<TableA> findByHQL(String hql, Map<String, String> params)

    etc. I'm not sure how "best practice" that is either.
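    A minimal HQL sketch of the join being described. It assumes a second mapped class TableB with userGuid and fullName properties (TableB's mapping is not shown in the post, so those names are assumptions), and "session" stands for an open Hibernate Session:

        // Hypothetical sketch: theta-join TableA and TableB on the user GUID and let
        // "distinct" collapse the duplicate users a manager query can return.
        String hql = "select distinct b.fullName "
                   + "from TableA a, TableB b "
                   + "where b.userGuid = a.userGuid "
                   + "and a.managerGuid = :managerGuid";
        List<String> fullNames = session.createQuery(hql)
                .setParameter("managerGuid", managerGuid)
                .list();

    Hibernate returns the projected values directly, so no hand-parsing of SQL results is needed; selecting several columns (e.g. select distinct a.userGuid, b.fullName) would come back as a List of Object[] rows instead.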

  • Trouble getting started with Spring Roo and GWT

    - by Abdel Olakara
    Hi all, I am trying to get started with Spring Roo and GWT after seeing the keynote, but unfortunately I am stuck at this issue. I successfully created the project using Roo and added the persistence and the entities, but when I run the command "perform package" I get this error:

        23/5/10 12:10:13 AM AST: [ERROR] ApplicationEntityTypesProcessor cannot be resolved
        23/5/10 12:10:13 AM AST: [ERROR] ApplicationEntityTypesProcessor cannot be resolved to a type
        23/5/10 12:10:13 AM AST: [WARN] advice defined in org.springframework.mock.staticmock.AnnotationDrivenStaticEntityMockingControl has not been applied [Xlint:adviceDidNotMatch]
        23/5/10 12:10:13 AM AST: [WARN] advice defined in org.springframework.mock.staticmock.AbstractMethodMockingControl has not been applied [Xlint:adviceDidNotMatch]
        23/5/10 12:10:13 AM AST: Build errors for helloroo; org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:aspectj-maven-plugin:1.0:compile (default) on project helloroo: Compiler errors : error at import tp.gwt.request.ApplicationEntityTypesProcessor;

    I see this in the Maven console and cannot complete the build. I know there is some jar missing, but how and why? I downloaded all the latest versions, including the GWT milestone release. Any idea why this error is occurring? How do I resolve this issue? Thanks in advance, Abdel Olakara

  • Hibernate MySQL transaction configuration issue

    - by James
    I'm having trouble starting a transaction with Hibernate and MySQL while running in JUnit. I'm getting a HibernateException which states: "No TransactionManagerLookup specified". I believe this error is because I don't have a proper configuration setting for hibernate.transaction.manager_lookup_class. I see that under the namespace of org.hibernate.transaction there are quite a few different lookup classes that I could use. All of the documentation that I could find on these was very vague. My question is what is the appropriate one for MySQL?
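    A hedged illustration of the configuration area being discussed, for the common case of plain JDBC transactions in a JUnit test outside a JTA container (in that setup no TransactionManagerLookup is needed at all); the property values are standard Hibernate 3 classes, but whether they fit depends on how the session factory is actually built:

        // Sketch only, assuming Hibernate 3 with programmatic configuration:
        // a manager_lookup_class is only required when delegating to a JTA manager.
        Configuration cfg = new Configuration()
                .setProperty("hibernate.dialect", "org.hibernate.dialect.MySQL5InnoDBDialect")
                .setProperty("hibernate.connection.driver_class", "com.mysql.jdbc.Driver")
                .setProperty("hibernate.transaction.factory_class",
                             "org.hibernate.transaction.JDBCTransactionFactory");
        SessionFactory sessionFactory = cfg.buildSessionFactory();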

  • Eclipselink read complex object model in an ordered way

    - by Raven
    Hi, I need to read a complex model in an ordered way with EclipseLink. The order is mandatory because it is a huge database and I want to display a small portion of it in a JFace table view. Trying to reorder it in the loading/querying thread takes too long, and ordering it in the LabelProvider blocks the UI thread for too much time, so I thought that if EclipseLink could let the database do the ordering, it might give me the performance I need. Unfortunately the object model can not be changed :-( The model is something like:

        @SuppressWarnings("serial")
        @Entity
        public class Thing implements Serializable {
            @Id
            @GeneratedValue(strategy = GenerationType.TABLE)
            private int id;
            private String name;
            @OneToMany(cascade=CascadeType.ALL)
            @PrivateOwned
            private List<Property> properties = new ArrayList<Property>();
            ... // getter and setter following here
        }

        public class Property implements Serializable {
            @Id
            @GeneratedValue(strategy = GenerationType.TABLE)
            private int id;
            @OneToOne
            private Item item;
            private String value;
            ... // getter and setter following here
        }

        public class Item implements Serializable {
            @Id
            @GeneratedValue(strategy = GenerationType.TABLE)
            private int id;
            private String name;
            .... // getter and setter following here
        }
        // Code end

    In the table view the y-axis is more or less created with the query

        Query q = em.createQuery("SELECT m FROM Thing m ORDER BY m.name ASC");

    using the "name" attribute of the Thing objects as label. In the table view the x-axis is more or less created with the query

        Query q = em.createQuery("SELECT m FROM Item m ORDER BY m.name ASC");

    using the "name" attribute of the Item objects as label. Each cell has the value Things.getProperties().get(x).getValue(). Unfortunately the list "properties" is not ordered, so the combination of cell value and x-axis column number (x) is not necessarily correct. Therefore I need to order the list "properties" in the same way as I ordered the labeling of the x-axis. And exactly this is the thing I don't know how to do. So querying for the Thing objects should return the list "properties" ordered by name ascending, but by the name of the associated "Item" objects. My idea is something like a query with two JOINs, joining Thing with Property and with Item, but somehow I have been unable to get it to work yet. Thank you for your help and your ideas to solve this riddle.
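    One possible direction, sketched under the assumption that the mappings above are complete; note that this does not reorder the mapped properties list itself, it just returns the Thing/Property pairs already sorted the way the x-axis labels are sorted:

        // Sketch: let the database order Property rows by the name of their Item,
        // matching the ORDER BY used to build the x-axis labels.
        Query q = em.createQuery(
            "SELECT t, p FROM Thing t JOIN t.properties p JOIN p.item i " +
            "ORDER BY t.name ASC, i.name ASC");
        List<Object[]> rows = q.getResultList();   // each row is {Thing, Property}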

  • Memory dump much smaller than available memory

    - by Daniel
    I have a Tomcat application server that is configured to create a memory dump on OOM, and it is started with -Xmx1024M, so a gigabyte should be available to it. Now I have found one such dump and it contains only 260MB of unretained memory. How is it possible that the dump is so much smaller than the size the JVM should have available?

  • Postgresql 8.4 reading OID style BLOBs with Hibernate

    - by peter
    I am getting this weird case when querying Postgres 8.4 with Hibernate for some records with BLOBs (of type OID). The query returns all right, but when my code wants to read the content of the BLOB with the simple code below, it gets 0 bytes back:

        public static byte[] readBlob(Blob blob) throws Exception {
            InputStream is = null;
            try {
                is = blob.getBinaryStream();
                return org.apache.commons.io.IOUtils.toByteArray(is);
            } finally {
                if (is != null)
                    try { is.close(); } catch (Exception e) {}
            }
        }

    The funny thing is that I have been getting this behavior only since I started adding more than one such record to the table. The underlying JDBC library is type 3 (postgresql 8.4-701). Can someone give me a hint as to how to solve this issue? Thanks, Peter
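    A hedged sketch of one thing worth checking: PostgreSQL large objects (the OID-backed kind) can generally only be streamed while the transaction that loaded them is still open and autocommit is off, so the read should happen before the session/transaction ends. The entity and getter names below are made up for illustration:

        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            // 'Document' and getContent() are hypothetical stand-ins for the mapped entity
            Document doc = (Document) session.get(Document.class, id);
            byte[] bytes = readBlob(doc.getContent());   // read while the tx is still open
            tx.commit();
        } finally {
            session.close();
        }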

  • Eclipse plugin development for Eclipse

    - by Raven
    Hi, how do I start to develop a plugin for Eclipse? I need a tool which isn't out there yet for my further development, so my main concerns are:

    - How to add a perspective?
    - How to add menu items?
    - How to add a view to the "Show Views" menu?
    - How to add to the preference pane?
    - How do I get information like:
      * where is the workspace?
      * which files are open in the editor?
      * which projects are collapsed / expanded in the left projects view?
      * .....
    - Which conventions (naming, ...) should I consider?
    - How to set up an update site / deployment?

    Can you give me some hints, links, tutorials? Don't get me wrong: I have been using Eclipse RCP to develop stand-alone apps for quite some time, so I am familiar with the techniques, but I have never developed a plugin for Eclipse itself. Thanks
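    As a small, hedged illustration of the style of these contributions: a perspective is declared against the org.eclipse.ui.perspectives extension point in plugin.xml and backed by a factory class, and workspace information comes from the resources plug-in. The class name and layout choices below are examples, not required names:

        import org.eclipse.core.resources.ResourcesPlugin;
        import org.eclipse.core.runtime.IPath;
        import org.eclipse.ui.IPageLayout;
        import org.eclipse.ui.IPerspectiveFactory;

        // Factory referenced by an org.eclipse.ui.perspectives extension (sketch only).
        public class MyPerspectiveFactory implements IPerspectiveFactory {
            public void createInitialLayout(IPageLayout layout) {
                // place the Problems view below the editor area as a starting layout
                layout.addView(IPageLayout.ID_PROBLEM_VIEW,
                               IPageLayout.BOTTOM, 0.75f, layout.getEditorArea());
            }
        }

        // "Where is the workspace?" from any plug-in code:
        IPath workspaceLocation = ResourcesPlugin.getWorkspace().getRoot().getLocation();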

  • How to split a path platform independent?

    - by Janusz
    I'm using the following code to get an array with all sub directories from a given path.

        String[] subDirs = path.split(File.separator);

    I need the array to check whether certain folders are at the right place in this path. This looked like a good solution until FindBugs complained that File.separator is used as a regular expression. It seems that passing the Windows file separator to a function that builds a regex from it is a bad idea, because the backslash is an escape character. How can I split the path in a cross-platform way without using File.separator? Or is code like this okay?

        String[] subDirs = path.split("/");
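    Two hedged sketches of the usual workarounds: quote the separator so the regex engine treats it literally, or avoid string splitting entirely and walk the path with java.io.File:

        // Option 1: make the separator a literal regex token.
        String[] subDirs = path.split(Pattern.quote(File.separator));

        // Option 2: let File do the parsing (no regex involved at all).
        List<String> parts = new ArrayList<String>();
        for (File f = new File(path); f != null; f = f.getParentFile()) {
            parts.add(0, f.getName());   // the filesystem root may contribute an empty name
        }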

  • Convert scientific notation to decimal notation

    - by Ankur
    There is a similar question on SO which suggests using NumberFormat, which is what I have done. I am using the parse() method of NumberFormat.

        public static void main(String[] args) throws ParseException {
            DecToTime dtt = new DecToTime();
            dtt.decToTime("1.930000000000E+02");
        }

        public void decToTime(String angle) throws ParseException {
            DecimalFormat dform = new DecimalFormat();
            //ParsePosition pp = new ParsePosition(13);
            Number angleAsNumber = dform.parse(angle);
            System.out.println(angleAsNumber);
        }

    The result I get is 1.93. I didn't really expect this to work, because 1.930000000000E+02 is a pretty unusual looking number. Do I have to do some string parsing first to remove the zeros? Or is there a quick and elegant way?
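    A hedged sketch of the short route: both Double.parseDouble and BigDecimal understand scientific notation directly, so no string surgery is needed (a default DecimalFormat pattern stops at the 'E', which is why 1.93 comes back):

        double d = Double.parseDouble("1.930000000000E+02");       // 193.0
        BigDecimal bd = new BigDecimal("1.930000000000E+02");
        System.out.println(bd.toPlainString());                    // 193.0000000000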

  • Why are error pages ignored in RESTEasy web service running on Tomcat?

    - by Chandru
    I'm developing a RESTful web service using RESTEasy deployed on Tomcat. I've configured an error page which takes the exception's message and generates an XML response from it when any exception occurs during the request. This works fine for any application-generated exception. However, if the client sends invalid XML which cannot be unmarshalled correctly, a javax.xml.bind.UnmarshalException is thrown and Tomcat's default error page is used instead of mine. I have mapped my error page to error code 500 in web.xml. Is using error pages the correct way to handle errors when using RESTEasy, or is there an alternative way?
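    A hedged alternative worth sketching: JAX-RS exception mappers keep error rendering inside the application instead of relying on container error pages. Depending on the RESTEasy version the JAXB failure may arrive wrapped in a provider-specific exception rather than as a raw UnmarshalException, so the exact exception type mapped below is an assumption:

        import javax.ws.rs.core.MediaType;
        import javax.ws.rs.core.Response;
        import javax.ws.rs.ext.ExceptionMapper;
        import javax.ws.rs.ext.Provider;
        import javax.xml.bind.UnmarshalException;

        @Provider
        public class UnmarshalExceptionMapper implements ExceptionMapper<UnmarshalException> {
            public Response toResponse(UnmarshalException e) {
                // build the same XML error body the error page would have produced
                String xml = "<error><message>" + e.getMessage() + "</message></error>";
                return Response.status(Response.Status.BAD_REQUEST)
                               .entity(xml)
                               .type(MediaType.APPLICATION_XML)
                               .build();
            }
        }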

  • Flex/Flash 4 datagrid literally displays XML

    - by Setori
    Problem: Flex/Flash4 client (built with FlashBuilder4) displays the xml sent from the server exactly as is - the datagrid keeps the format of the xml. I need the datagrid to parse the input and place the data in the correct rows and columns of the datagrid.

    Flow: click on a date in the tree and it makes a server request for batch information in xml form. Using a CallResponder I then update the datagrid's dataProvider.

    [code]
    <fx:Script>
        <![CDATA[
            import mx.controls.Alert;

            [Bindable]public var selectedTreeNode:XML;

            public function taskTreeChanged(event:Event):void {
                selectedTreeNode=Tree(event.target).selectedItem as XML;
                var searchHubId:String = selectedTreeNode.@hub;
                var searchDate:String = selectedTreeNode.@lbl;
                if((searchHubId == "") || (searchDate == "")){
                    return;
                }
                findShipmentBatches(searchDate,searchHubId);
            }

            protected function findShipmentBatches(searchDate:String, searchHubId:String):void{
                findShipmentBatchesResult.token = actWs.findShipmentBatches(searchDate, searchHubId);
            }

            protected function updateBatchDataGridDP():void{
                task_list_dg.dataProvider = findShipmentBatchesResult.lastResult;
            }
        ]]>
    </fx:Script>

    <fx:Declarations>
        <actws:ActWs id="actWs" fault="Alert.show(event.fault.faultString + '\n' + event.fault.faultDetail)" showBusyCursor="true"/>
        <s:CallResponder id="findShipmentBatchesResult" result="updateBatchDataGridDP()"/>
    </fx:Declarations>

    <mx:AdvancedDataGrid id="task_list_dg" width="100%" height="95%" paddingLeft="0" paddingTop="0" paddingBottom="0">
        <mx:columns>
            <mx:AdvancedDataGridColumn headerText="Receiving date" dataField="rd"/>
            <mx:AdvancedDataGridColumn headerText="Msg type" dataField="mt"/>
            <mx:AdvancedDataGridColumn headerText="SSD" dataField="ssd"/>
            <mx:AdvancedDataGridColumn headerText="Shipping site" dataField="sss"/>
            <mx:AdvancedDataGridColumn headerText="File name" dataField="fn"/>
            <mx:AdvancedDataGridColumn headerText="Batch number" dataField="bn"/>
        </mx:columns>
    </mx:AdvancedDataGrid>
    [/code]

    I cannot upload a pic, but this is the xml:

    [code]
    2010-04-23 16:35:51.0 PRESHIP 2010-02-15 00:00:00.0 100000009 DF-Ocean-PRESHIPSUM-Quanta-PACT-EMEA-Scheduled Ship Date 20100215.csv 10053
    [/code]

    and the xml is pretty much displayed exactly as is in the datagrid columns... I would appreciate your assistance.

  • How to figure out which jars are needed?

    - by Ari
    How can I systematically determine which jars I'll need, and thus should include in my pom.xml file (I'm using Maven as my project management tool)? When learning Spring, to keep things simple, I added all the jars (even the ones I never used) to the classpath. Right now, for the most part, I'm guessing which jars to include. For example, I know that in my Spring configuration file I have:

        <tx:annotation-driven />
        <context:annotation-config />
        <aop:aspectj-autoproxy />

    So I guess I'll need spring-context-x.x.x.jar, spring-tx-x.x.x.jar, and spring-aop-x.x.x.jar. Thanks.

  • HttpURLConnection: Is it necessary to call connect()?

    - by stormin986
    Many examples I've seen don't explicitly call connect(). Instead they just use getInputStream() or getResponseCode(). I'm assuming all of these HttpURLConnection methods that require a connection just call connect() themselves? Are there any cases where connect() must be explicitly called for an HttpURLConnection?
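    A hedged illustration of the behaviour being asked about: methods that need a connection (getResponseCode(), getInputStream(), getHeaderField(), ...) open it implicitly, so an explicit connect() mainly matters when the connection should be established at a specific point, and all request configuration has to happen before that point either way. The URL below is just a placeholder:

        HttpURLConnection conn =
                (HttpURLConnection) new URL("http://example.com/").openConnection();
        conn.setRequestMethod("GET");
        conn.setConnectTimeout(5000);          // must be configured before the connection is made
        int status = conn.getResponseCode();   // connects implicitly if connect() was never called
        InputStream in = conn.getInputStream();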

  • How to get local ActiveMQ broker to "mirror" a queue on a remote ActiveMQ broker?

    - by T.K.
    I have a local ActiveMQ broker which is on an unreliable Internet connection, and also a remote ActiveMQ broker in a reliable datacenter. I have already sorted out a "store and forward" setup so that outgoing messages are sent to the remote broker when the Internet connection is available. That alone works great, but only for outbound messages. Now I have to do the reverse. Here is the scenario: A new message appears in the remote ActiveMQ broker. The message is put into a specific queue. A few minutes later, the Internet connection becomes available to the local ActiveMQ broker. The local broker should then be able to pull the message from the remote broker and place it in its own local queue. Local consumers will then be able to see the message. So in essence, I need the local broker to become a subscribed consumer of the remote queue. I have looked through the ActiveMQ documentation but I can't find anything yet about how to do this in the .xml configuration file. Is this what I should be looking for? See: "ActiveMQ: JMS to JMS Bridge". Any advice and tips would be highly appreciated.
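    A hedged sketch of the usual building block: a network connector on the local broker pointing at the remote one (the same thing the <networkConnector> element expresses in activemq.xml). Whether a duplex connector or one connector per direction fits better here is a judgment call, and the host names and ports below are placeholders:

        import org.apache.activemq.broker.BrokerService;
        import org.apache.activemq.network.NetworkConnector;

        BrokerService local = new BrokerService();
        local.setBrokerName("local");
        local.addConnector("tcp://0.0.0.0:61616");

        // One outbound TCP link to the datacenter broker; duplex=true lets messages
        // flow in both directions even though the local side dials out.
        NetworkConnector bridge = local.addNetworkConnector("static:(tcp://remote-host:61616)");
        bridge.setDuplex(true);

        local.start();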

  • How can I turn a string of text into a BigInteger representation for use in an El Gamal cryptosystem

    - by angstrom91
    I'm playing with the El Gamal cryptosystem, and my goal is to be able to encipher and decipher long sequences of text. I have come up with a method that works for short sequences but does not work for long sequences, and I cannot figure out why. El Gamal requires the plaintext to be an integer. I have turned my string into a byte[] using the .getBytes() method for Strings, and then created a BigInteger out of the byte[]. After encryption/decryption, I turn the BigInteger into a byte[] using the .toByteArray() method for BigIntegers, and then create a new String object from the byte[]. This works perfectly when I call ElGamalEncipher with strings of up to 129 characters. With 130 or more characters, the output produced is garbled. Can someone suggest how to solve this issue? Is this an issue with my method of turning the string into a BigInteger? If so, is there a better way to turn my string of text into a BigInteger and back? Below is my encipher/decipher code.

        public static BigInteger[] ElGamalEncipher(String plaintext, BigInteger p, BigInteger g, BigInteger r) {
            // returns a BigInteger[] cipherText
            // cipherText[0] is c
            // cipherText[1] is d
            BigInteger[] cipherText = new BigInteger[2];
            BigInteger pText = new BigInteger(plaintext.getBytes());

            // 1: select a random integer k such that 1 <= k <= p-2
            BigInteger k = new BigInteger(p.bitLength() - 2, sr);

            // 2: Compute c = g^k(mod p)
            BigInteger c = g.modPow(k, p);

            // 3: Compute d = P*r^k = P(g^a)^k(mod p)
            BigInteger d = pText.multiply(r.modPow(k, p)).mod(p);

            // C = (c,d) is the ciphertext
            cipherText[0] = c;
            cipherText[1] = d;
            return cipherText;
        }

        public static String ElGamalDecipher(BigInteger c, BigInteger d, BigInteger a, BigInteger p) {
            // returns the plaintext enciphered as (c,d)
            // 1: use the private key a to compute the least non-negative residue
            //    of an inverse of (c^a)' (mod p)
            BigInteger z = c.modPow(a, p).modInverse(p);
            BigInteger P = z.multiply(d).mod(p);
            byte[] plainTextArray = P.toByteArray();
            String output = null;
            try {
                output = new String(plainTextArray, "UTF8");
            } catch (Exception e) {
            }
            return output;
        }
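    A hedged sketch of one likely culprit and workaround: ElGamal can only carry a plaintext integer smaller than the modulus p, so once plaintext.getBytes() is longer than p's byte length, the multiply(...).mod(p) step silently reduces it and the original bytes cannot be recovered. Splitting the message into blocks strictly smaller than p (or, in practice, using ElGamal only to wrap a symmetric key) avoids this; the block splitting below is illustrative only:

        // Split the bytes into non-negative BigInteger blocks that each fit below p.
        int blockSize = (p.bitLength() - 1) / 8;            // each block is guaranteed < p
        byte[] all = plaintext.getBytes("UTF8");            // may throw UnsupportedEncodingException
        List<BigInteger> blocks = new ArrayList<BigInteger>();
        for (int off = 0; off < all.length; off += blockSize) {
            int len = Math.min(blockSize, all.length - off);
            byte[] chunk = new byte[len];
            System.arraycopy(all, off, chunk, 0, len);
            blocks.add(new BigInteger(1, chunk));           // sign forced positive
        }
        // Each block is then enciphered/deciphered separately and the byte chunks re-joined.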

  • PE Header Requirements

    - by Pindatjuh
    What are the requirements of a PE file (PE/COFF)? What fields should be set, which value, at a bare minimum for enabling it to "run" on Windows (i.e. executing "ret" instruction and then close, without error). The library I am building first is the linker: Now, the problem I have is the PE file (PE/COFF). I don't know what is "required" for a PE file before it can actually execute on my platform. My testing platform is Vista. I get an error message, saying "This is not a valid Win32 executable." when I execute it by double-clicking, and I get an "Access Denied." when executing it with CLI cmd. I have two sections, .text and .data. I've implemented the PE headers as provided by several online documents, i.e. MSDN and some other thirdparty documentation. If I use a hex-editor, it looks almost like a regular PE file. I don't use any imports, nor IAT, nor any directories in the PE header. Edit: I've added an import table, still not a valid .exe-file, says my Windows. I've tried to use values which are also mentioned at the smallest PE-file guide. No luck. Really the only thing I can't seem to figure out is what is required and what isn't. Some guides tell me everything is required, whilst others say about deprications: and it can be zero. I hope this is enough information. Thank you, in advance. Raw data (as requested) of current PE header: 4D 5A 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 40 00 00 00 50 45 00 00 4C 01 02 00 C8 7A 55 4B 00 00 00 00 00 00 00 00 E0 00 82 01 0B 01 0D 25 00 10 00 00 00 10 00 00 00 00 00 00 00 10 00 00 00 10 00 00 00 20 00 00 00 00 40 00 00 10 00 00 00 02 00 00 01 00 0B 00 00 00 00 00 03 00 0A 00 00 00 00 00 00 22 00 00 38 01 00 00 00 00 00 00 03 00 00 00 00 40 00 00 00 40 00 00 00 40 00 00 00 40 00 00 00 00 00 00 0E 00 00 00 00 00 00 00 00 00 00 00 00 20 00 00 24 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 2E 74 65 78 74 00 00 00 00 00 00 00 00 10 00 00 00 02 00 00 00 02 00 00 00 00 00 00 00 00 00 00 00 00 00 00 20 00 00 60 2E 69 64 61 74 61 00 00 00 00 00 00 00 20 00 00 00 02 00 00 00 04 00 00 00 00 00 00 00 00 00 00 00 00 00 00 40 00 00 C0 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 3C 20 00 00 00 00 00 00 00 00 00 00 24 20 00 00 34 20 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 4B 45 52 4E 45 4C 33 32 2E 64 6C 6C 00 00 00 00 01 00 00 80 00 00 00 00 01 00 00 80 00 00 00 00

  • Confused as to how to validate spring mvc form, what are my options?

    - by Blankman
    Latest Spring MVC, using FreeMarker. Hoping someone could tell me what my options are for validating a form with Spring MVC, and what the recommended way would be to do this. I have a form that doesn't map directly to a model; it has input fields that, when posted, will be used to initialize two model objects which I will then need to validate, and if they pass, I will save them. If they fail, I want to return back to the form, pre-fill the values with what the user entered, and display the error messages. I have read here and there about two methods, one of which I have done and understand how it works:

        @RequestMapping(...., method = RequestMethod.POST)
        public ModelAndView myMethod(@Valid MyModel myModel, BindingResult bindingResult) {
            ModelAndView mav = new ModelAndView("some/view");
            mav.addObject("mymodel", myModel);
            if(bindingResult.hasErrors()) {
                return mav;
            }
        }

    Now this worked when my form mapped directly to a single model, but in my situation I have:

    - form fields that don't map to any specific model; they hold a few properties from two models.
    - before validation occurs, I need to create the two models manually, set the values from the form, and also set some properties manually.
    - then call validate on both models (model1, model2) and append their error messages to the errors collection, which I need to pass back to the same view page if things don't work.
    - when the form posts, I have to do some database calls, and based on those results I may need to add additional messages to the errors collection.

    Can someone tell me how to do this sort of validation? Pseudo code below:

        Model1 model1 = new Model1();
        Model2 model2 = new Model2();

        // manually or somehow automatically set the posted form values to model1 and model2.

        // set some fields manually, not from posted form
        model1.setProperty10(GlobalSettings.getDefaultProperty10());
        model2.setProperty11(GlobalSettings.getDefaultProperty11());

        // db calls, if they fail, add to errors collection

        if(bindingResult.hasErrors()) {
            return mav;
        }

        // validation passed, save
        Model1Service.save(model1);
        Model2Service.save(model2);

        redirect to another view

    Update: I am using the JSR 303 annotations on my models right now, and it would be great if I can still use those.

    Update II: Please read the bounty description below for a summary of what I am looking for.
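    A hedged sketch of one way to wire this together, assuming a dedicated form-backing object for the raw input and an injected javax.validation.Validator for programmatic JSR-303 checks; MyForm, the service fields, and the view names are placeholders built from the pseudo code above:

        @RequestMapping(value = "/save", method = RequestMethod.POST)
        public ModelAndView save(@ModelAttribute("form") MyForm form, BindingResult errors) {
            Model1 model1 = new Model1();
            Model2 model2 = new Model2();
            // copy posted fields from 'form' onto the two models, then set defaults
            model1.setProperty10(GlobalSettings.getDefaultProperty10());
            model2.setProperty11(GlobalSettings.getDefaultProperty11());

            // run the JSR-303 annotations programmatically and fold both result sets
            // into the single errors object the view will render
            for (ConstraintViolation<Model1> v : validator.validate(model1)) {
                errors.reject("model1.invalid", v.getPropertyPath() + " " + v.getMessage());
            }
            for (ConstraintViolation<Model2> v : validator.validate(model2)) {
                errors.reject("model2.invalid", v.getPropertyPath() + " " + v.getMessage());
            }
            // ...database checks can call errors.reject(...) here as well...

            if (errors.hasErrors()) {
                return new ModelAndView("some/view", "form", form);  // re-render with user input
            }
            model1Service.save(model1);
            model2Service.save(model2);
            return new ModelAndView("redirect:/somewhere");
        }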

  • Spring MVC Best Practice Handling Unrecoverable Exceptions In Controller

    - by jboyd
    When you have a controller that does logic with services and DAOs that may throw an unrecoverable exception, what is the best practice for dealing with those method calls? Currently an app I'm working on has very lengthy try/catch methods that simply print exception messages to the error output. This doesn't seem very robust, and I think this code smells. Is there any cookie-cutter best practice for handling this in Spring MVC?
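    A hedged sketch of the usual alternative to per-method try/catch blocks: let Spring MVC route the exception to a single handler, either per controller with @ExceptionHandler (Spring 3+) or globally with a HandlerExceptionResolver such as SimpleMappingExceptionResolver. The view name and exception type below are placeholders:

        // Inside the controller: any handler method that lets a DataAccessException
        // escape ends up here instead of in a local try/catch.
        @ExceptionHandler(DataAccessException.class)
        public ModelAndView handleUnrecoverable(DataAccessException ex) {
            ModelAndView mav = new ModelAndView("error/general");
            mav.addObject("message", ex.getMessage());
            return mav;
        }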

  • Reloading of persisted entity

    - by Udi
    I'm using OpenJPA as the JPA vendor in my application. The question is theoretical or conceptual: is there any way to tell an entity manager to load an entity from the DB rather than from its cache? The problematic scenario:

        EM1.persist(Entity1)
        EM2.merge(Entity1)
        EM1.find(Entity1)   <--- Entity1 is the cached version rather than the merged one

    Any elegant way to do it? I really don't want to call em.refresh(entity).
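    Two hedged sketches of what is usually available short of refresh(): detach or clear the first persistence context so find() has to reload, or (with JPA 2.0) ask a single find() to bypass the shared second-level cache. Note that find() on a context that still manages the instance will always return that managed instance, so the clear/detach step is the part doing the work:

        // Option 1: drop EM1's managed copy, then re-read from the database.
        em1.clear();                                      // or em1.detach(entity1) on JPA 2.0
        Entity1 fresh = em1.find(Entity1.class, id);

        // Option 2 (JPA 2.0): bypass the shared L2 cache for one lookup; this does not
        // affect entities already managed by the persistence context.
        Map<String, Object> hints = new HashMap<String, Object>();
        hints.put("javax.persistence.cache.retrieveMode", CacheRetrieveMode.BYPASS);
        Entity1 e = em1.find(Entity1.class, id, hints);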

  • Separate log files for each web application and shared libraries with log4j

    - by oo_olo_oo
    I have a few web applications running on a Tomcat server. Each application contains its own copy of the log4j library inside its war, which allows for separate, flexible logging configuration per application. I also have a few shared libraries (kept in Tomcat's shared libraries directory). I would like the shared libraries' logger output to end up together with the output of the application that uses them (for example: if application A logs to file a.log and uses library b.jar, I would like b.jar to also log to the a.log file). The problem is that the shared libraries are loaded by the shared classloader, which means they can't access the loggers defined by the applications. Is there any solution for this issue?
