Search Results

Search found 60688 results on 2428 pages for 'spring data solr'.

Page 46/2428 | < Previous Page | 42 43 44 45 46 47 48 49 50 51 52 53  | Next Page >

  • Get invalid user input with a Spring typeMismatch error

    - by TimmyJ
    I've implemented a ReloadableResourceBundleMessageSource in my Spring MVC application which I use to display prettier error messages for binding exceptions. The problem I'm having is that, due to a company policy, these errors must be displayed in the following format: [inputData] is not a valid [fieldName]. The field name is accessible by default in my message properties file (as the {0} argument), but I can't figure out a way to display the invalid user input. Is this possible?
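
    One way to get the rejected input into the message is a custom BindingErrorProcessor that appends the offending value to the FieldError arguments, so the typeMismatch message can be written as "{1} is not a valid {0}". The sketch below is a minimal, hedged example against the Spring 3.x binding API, not TimmyJ's actual code; the class name and the argument position are assumptions.

        import org.springframework.beans.PropertyAccessException;
        import org.springframework.validation.BindingResult;
        import org.springframework.validation.DefaultBindingErrorProcessor;
        import org.springframework.validation.FieldError;

        public class RejectedValueErrorProcessor extends DefaultBindingErrorProcessor {

            @Override
            public void processPropertyAccessException(PropertyAccessException ex,
                                                       BindingResult bindingResult) {
                String field = ex.getPropertyName();
                String[] codes = bindingResult.resolveMessageCodes(ex.getErrorCode(), field);
                // Default arguments: {0} is a resolvable for the field name.
                // Append the raw user input so messages can reference it as {1}.
                Object[] defaults = getArgumentsForBindError(bindingResult.getObjectName(), field);
                Object[] arguments = new Object[defaults.length + 1];
                System.arraycopy(defaults, 0, arguments, 0, defaults.length);
                arguments[defaults.length] = ex.getValue();
                bindingResult.addError(new FieldError(bindingResult.getObjectName(), field,
                        ex.getValue(), true, codes, arguments, ex.getLocalizedMessage()));
            }
        }

        // Registered per controller (or through a shared WebBindingInitializer):
        // @InitBinder
        // public void initBinder(WebDataBinder binder) {
        //     binder.setBindingErrorProcessor(new RejectedValueErrorProcessor());
        // }

    With that in place, a properties entry such as typeMismatch.fieldName={1} is not a valid {0} would pick up both values.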

    Read the article

  • Using single spring application context for web app

    - by Ramo
    Hi, I'm using org.springframework.web.servlet.DispatcherServlet and org.springframework.ws.transport.http.MessageDispatcherServlet in the same app, but each loads its own application context; I need to load all beans into a single application context. The application consists of the typical layers (web, app, dao, etc.). What I have tried is to use one single spring-root-context.xml by setting it as the contextConfigLocation, but that didn't help. This has been an issue for me for a long time and I would appreciate any help with it. Any online references would also be a great help. Regards, Ramo
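
    One way to get every bean into one place is to keep a single root WebApplicationContext (loaded by ContextLoaderListener from spring-root-context.xml) and give both servlets an empty contextConfigLocation, so neither creates beans of its own and both simply see the root context as their parent. The sketch below is a hedged example assuming Servlet 3.0 and Spring 3.1+ (the same layout can be written in web.xml); the servlet names and URL mappings are made up.

        import javax.servlet.ServletContext;
        import javax.servlet.ServletRegistration;

        import org.springframework.web.WebApplicationInitializer;
        import org.springframework.web.context.ContextLoaderListener;
        import org.springframework.web.context.support.XmlWebApplicationContext;
        import org.springframework.web.servlet.DispatcherServlet;
        import org.springframework.ws.transport.http.MessageDispatcherServlet;

        public class SingleContextInitializer implements WebApplicationInitializer {

            @Override
            public void onStartup(ServletContext servletContext) {
                // One root context holds every bean (web, service and dao layers).
                XmlWebApplicationContext root = new XmlWebApplicationContext();
                root.setConfigLocation("classpath:spring-root-context.xml");
                servletContext.addListener(new ContextLoaderListener(root));

                // Empty contextConfigLocation: the servlets add no beans of their
                // own and inherit everything from the root context.
                ServletRegistration.Dynamic mvc =
                        servletContext.addServlet("mvc", new DispatcherServlet());
                mvc.setInitParameter("contextConfigLocation", "");
                mvc.addMapping("/app/*");

                ServletRegistration.Dynamic ws =
                        servletContext.addServlet("ws", new MessageDispatcherServlet());
                ws.setInitParameter("contextConfigLocation", "");
                ws.addMapping("/services/*");
            }
        }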

    Read the article

  • Spring Batch validation

    - by sergionni
    Hello. Does the Spring Batch framework provide its own validation mechanism? I mean, how is it possible to specify a validation bean? My validation is based on the result of a @NamedQuery: if the query returns a result, validation passes; otherwise it fails.
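
    Spring Batch does ship a validation hook for item processing: the Validator interface plus ValidatingItemProcessor, which rejects (or optionally filters) an item when validate() throws. Below is a minimal hedged sketch of wiring a "row exists" check behind that interface; the named query "findMatching" and its parameter are hypothetical, not taken from the question.

        import java.util.List;

        import javax.persistence.EntityManager;

        import org.springframework.batch.item.validator.ValidatingItemProcessor;
        import org.springframework.batch.item.validator.ValidationException;
        import org.springframework.batch.item.validator.Validator;

        public class NamedQueryValidator implements Validator<Object> {

            private final EntityManager entityManager;

            public NamedQueryValidator(EntityManager entityManager) {
                this.entityManager = entityManager;
            }

            @Override
            public void validate(Object item) throws ValidationException {
                // Valid when the named query returns at least one row, invalid otherwise.
                List<?> rows = entityManager.createNamedQuery("findMatching")
                                            .setParameter("key", item)
                                            .getResultList();
                if (rows.isEmpty()) {
                    throw new ValidationException("No matching record for " + item);
                }
            }
        }

        // Used as the step's item processor; setFilter(true) skips invalid items
        // instead of failing the step:
        // ValidatingItemProcessor<Object> processor =
        //         new ValidatingItemProcessor<Object>(new NamedQueryValidator(entityManager));
        // processor.setFilter(true);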

    Read the article

  • Spring 3 MVC - Form Failure Causes Exception When Reloading JSP

    - by jboyd
    Using Spring 3 MVC. Please bear with the long code example; it's quite simple, but I want to make sure all relevant information is posted. Basically, here is the use case: there is a registration page where a user can either log in OR fill out a registration form. The login form is a simple HTML form; the registration form is a more complicated, Spring-bound form that uses a RegistrationFormData bean. Here is the relevant code from UserController.java:

    ...
    @RequestMapping(value = "/login", method = RequestMethod.GET)
    public String login(Model model) {
        model.addAttribute("registrationInfo", new ProfileAdminFormData());
        return "login";
    }
    ...
    @RequestMapping(value = "/login.do", method = RequestMethod.POST)
    public String doLogin(
            @RequestParam(value = "userName") String userName,
            @RequestParam(value = "password") String password,
            Model model) {
        logger.info("login.do : userName=" + userName + ", password=" + password);
        try {
            getUser().login(userName, password);
        } catch (UserNotFoundException ex) {
            logger.error(ex);
            model.addAttribute("loginError", ex.getWebViewableErrorMessage());
            return "login";
        }
        return "redirect:/";
    }
    ...
    @RequestMapping(value = "/register.do")
    public String register(
            @ModelAttribute(value = "registrationInfo") ProfileAdminFormData profileAdminFormData,
            BindingResult result,
            Model model) {
        // todo: redirect
        if (new RegistrationValidator(profileAdminFormData, result).validate()) {
            try {
                getUser().register(profileAdminFormData);
                return "index";
            } catch (UserException ex) {
                logger.error(ex);
                model.addAttribute("registrationErrorMessage", ex.getWebViewableErrorMessage());
                return "login";
            }
        }
        return "login";
    }

    and the JSP:

    ...
    <form:form commandName="registrationInfo" action="register.do">
    ...

    So the problem here is that when login fails I get an exception, because there is no "registrationInfo" bean in the model attributes. What I need is that, regardless of the path through this controller, the "registrationInfo" bean is never null; that way, if login fails (as opposed to registration), that bean is still in the model. As you can see, I create the registrationInfo object explicitly in my controller in the method bound to "/login", which is what I thought was going to be a kind of setup method. Something doesn't feel right about the "/login" method that sets up the page, but I needed to do that in order to get the page to render at all without throwing an exception over the missing "registrationInfo" model attribute needed by the form in the JSP.
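
    A common way to guarantee the form-backing object exists no matter which handler in the controller runs is an @ModelAttribute method: Spring MVC calls it before every @RequestMapping method in that controller, so "registrationInfo" is always in the model, including when doLogin() re-renders the login view. This is a hedged sketch reusing the names from the question, not necessarily the poster's eventual fix.

        import org.springframework.stereotype.Controller;
        import org.springframework.web.bind.annotation.ModelAttribute;

        @Controller
        public class UserController {

            // Invoked before every handler method in this controller, so the
            // "registrationInfo" attribute is always present in the model --
            // the explicit model.addAttribute(...) in login() is no longer needed.
            @ModelAttribute("registrationInfo")
            public ProfileAdminFormData registrationInfo() {
                return new ProfileAdminFormData();
            }

            // ... existing /login, /login.do and /register.do handlers unchanged ...
        }

    On a failed login, returning "login" then finds the freshly created "registrationInfo" in the model and the <form:form> binds without an exception.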

    Read the article

  • Download Spring 2.5 sample applications

    - by Don
    Hi, I've seen various references to a couple of Spring MVC 2.5 example applications named 'petclinic' and 'jpetstore'. I can't seem to find where to download these examples. Can anyone provide a link to download them (ideally also with instructions for setup/deployment)? Thanks, Don

    Read the article

  • Spring custom error message

    - by Ale
    I want to set a custom error message via a @Controller. Is there something like Struts' saveMessages(...) in Spring? For example:

    ActionErrors actionErrors = new ActionErrors();
    actionErrors.add("error", new ActionMessage("error.missing.key",
            messageResources.getMessage("label.username"),
            messageResources.getMessage("label.password")));
    saveErrors(request, actionErrors);
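
    Spring MVC has no direct saveErrors() equivalent; the usual options are to register an error on the BindingResult of the form-backing object (rendered by <form:errors>), or to resolve the text through MessageSource and expose it as a plain model attribute. A hedged sketch of both, assuming message keys error.missing.key, label.username and label.password as in the question and a made-up LoginForm bean:

        import java.util.Locale;

        import org.springframework.beans.factory.annotation.Autowired;
        import org.springframework.context.MessageSource;
        import org.springframework.stereotype.Controller;
        import org.springframework.ui.Model;
        import org.springframework.validation.BindingResult;
        import org.springframework.web.bind.annotation.ModelAttribute;
        import org.springframework.web.bind.annotation.RequestMapping;

        @Controller
        public class LoginController {

            @Autowired
            private MessageSource messageSource;

            @RequestMapping("/login.do")
            public String login(@ModelAttribute("loginForm") LoginForm form,
                                BindingResult errors, Model model, Locale locale) {

                // Option 1: global error on the form object, shown by <form:errors>.
                errors.reject("error.missing.key", new Object[] {
                        messageSource.getMessage("label.username", null, locale),
                        messageSource.getMessage("label.password", null, locale) },
                        "required fields are missing");

                // Option 2: resolve the text yourself and put it in the model.
                model.addAttribute("errorMessage",
                        messageSource.getMessage("error.missing.key", new Object[] {
                                messageSource.getMessage("label.username", null, locale),
                                messageSource.getMessage("label.password", null, locale) },
                                locale));

                return "login";
            }

            /** Hypothetical form-backing bean, only to keep the sketch self-contained. */
            public static class LoginForm {
                private String username;
                private String password;
                public String getUsername() { return username; }
                public void setUsername(String username) { this.username = username; }
                public String getPassword() { return password; }
                public void setPassword(String password) { this.password = password; }
            }
        }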

    Read the article

  • Two Persistence Units, Two entityManagerFactory(s), Spring, JPA, Hibernate

    - by sebajb
    I am using two entityManagerFactory beans (emf1, emf2) based on two persistence units, autowired through Spring and Hibernate with RESOURCE_LOCAL transactions. My DaoImpls are configured with @PersistenceUnit(unitName='...') according to the unit needed. I was under the impression that specifying the unitName would let me use any number of PUs without problems, but I still get: expected single matching bean but found two: [emf1, emf2]. Am I missing some other configuration?
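
    The "expected single matching bean but found two" error usually comes from some other injection point that asks for an EntityManagerFactory (or a transaction manager) by type, rather than from the unitName-qualified DAO fields themselves. Below is a hedged Java-config sketch with two named factories, each pinned to its persistence unit, and qualified wiring wherever a factory is needed by type; the unit names pu1/pu2 and data sources ds1/ds2 are placeholders, not names from the question.

        import javax.persistence.EntityManagerFactory;
        import javax.sql.DataSource;

        import org.springframework.beans.factory.annotation.Qualifier;
        import org.springframework.context.annotation.Bean;
        import org.springframework.context.annotation.Configuration;
        import org.springframework.orm.jpa.JpaTransactionManager;
        import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;

        @Configuration
        public class PersistenceConfig {

            @Bean(name = "emf1")
            public LocalContainerEntityManagerFactoryBean emf1(@Qualifier("ds1") DataSource ds) {
                LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
                factory.setPersistenceUnitName("pu1");   // must match persistence.xml
                factory.setDataSource(ds);
                return factory;
            }

            @Bean(name = "emf2")
            public LocalContainerEntityManagerFactoryBean emf2(@Qualifier("ds2") DataSource ds) {
                LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
                factory.setPersistenceUnitName("pu2");
                factory.setDataSource(ds);
                return factory;
            }

            // Anything that needs a factory by type must say which one it wants.
            @Bean
            public JpaTransactionManager txManager1(@Qualifier("emf1") EntityManagerFactory emf) {
                return new JpaTransactionManager(emf);
            }

            @Bean
            public JpaTransactionManager txManager2(@Qualifier("emf2") EntityManagerFactory emf) {
                return new JpaTransactionManager(emf);
            }
        }

        // DAO fields annotated with @PersistenceUnit(unitName = "pu1") or
        // @PersistenceContext(unitName = "pu1") then resolve unambiguously.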

    Read the article

  • client side spring

    - by Gholi
    I want something like the Spring framework to use on the client side. Essentially, I am going to abstract the UI from data sources that may be added to the system while it is up. An XML file will be injected into the system and the UI will be generated from it automatically. The client side would then be able to search the new data source while objects are created on the client side. Thanks

    Read the article

  • Spring Security Taglibs control statement

    - by Blake
    Is there a way to implement a control statement with the Spring Security taglibs? Currently we can only check whether a user has a role... <security:authorize access="hasRole('ROLE_ADMIN')"> // display something </security:authorize> What about an else branch?

    Read the article

  • Swiz Framework and Spring Framework - Are they related?

    - by theband
    I was looking into the Swiz framework and it felt much the same as Spring; the only difference I felt between the two is that one is Java based and the other is ActionScript based. http://swizframework.org/ http://www.springsource.org/ My questions are: Is the goal of both frameworks the same? Are the patterns they apply the same or different? The concepts of beans, dependency injection and IoC exist in both.

    Read the article

  • Grails spring security defaultTargetUrl going wrong path

    - by fsi
    Grails 2.4 with the Spring Security plugin 2.0 RC3. I have this in my Config.groovy:

    grails.plugin.springsecurity.controllerAnnotations.staticRules = [
        '/': ['permitAll'],
        '/index': ['permitAll'],
        '/index.gsp': ['permitAll'],
        '/**/js/**': ['permitAll'],
        '/**/css/**': ['permitAll'],
        '/**/images/**': ['permitAll'],
        '/**/favicon.ico': ['permitAll']
    ]
    grails.plugin.springsecurity.successHandler.defaultTargetUrl = "/home/index"

    But this keeps redirecting me to assets/favicon.ico. And my HomeController is like this:

    @Secured(['ROLE_ADMIN', 'ROLE_USER'])
    def index() {
        if (SpringSecurityUtils.ifAllGranted('ROLE_ADMIN')) {
            redirect controller: 'admin', action: 'index'
            return
        }
    }

    And I modified this in my UrlMappings:

    "/"(controller: 'home', action: 'index')

    Why does it keep sending me to the wrong path? Update: using another computer, it redirects me to /asset/grails_logo.png

    Read the article

  • spring-security and jsf

    - by Mike
    Hi! I am developing a JSF application with Spring Security. The login form works fine; however, when I try to retrieve the authentication object later in the code, I always get the authentication principal as anonymous. I fetch it like this: Authentication auth = SecurityContextHolder.getContext().getAuthentication();
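
    When the principal keeps coming back anonymous, the first thing to check is whether the request actually carries the logged-in token or the AnonymousAuthenticationToken that Spring Security installs for unauthenticated requests (typical causes: the JSF request does not pass through springSecurityFilterChain, or the security context is not kept in the HTTP session between login and the later request). A small hedged diagnostic sketch against the Spring Security 3 API:

        import org.springframework.security.authentication.AnonymousAuthenticationToken;
        import org.springframework.security.core.Authentication;
        import org.springframework.security.core.context.SecurityContextHolder;

        public final class CurrentUser {

            private CurrentUser() {
            }

            /** Returns the name of a real (non-anonymous) login, or null. */
            public static String loggedInUserName() {
                Authentication auth = SecurityContextHolder.getContext().getAuthentication();
                if (auth == null
                        || !auth.isAuthenticated()
                        || auth instanceof AnonymousAuthenticationToken) {
                    // Anonymous: this request never went through an authenticated
                    // SecurityContext -- check the filter mapping and session storage.
                    return null;
                }
                return auth.getName();
            }
        }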

    Read the article

  • spring component scan for classes

    - by awk
    I want to scan all the classes in a package that are subclasses of a particular class. Then I want to take these classes and, for each of them, instantiate a bean of the same type, using the class as a property. Then I want to gather all these anonymous beans and put them into a collection. Is it possible to configure the Spring context in XML like this? Thx
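
    Plain <context:component-scan> cannot express "every subclass of X, each wrapped in another bean and collected", but the same scanning machinery is available programmatically: ClassPathScanningCandidateComponentProvider with an AssignableTypeFilter finds the classes, and the calling code (for example a FactoryBean or a BeanDefinitionRegistryPostProcessor) can then build and gather the wrapper beans. A minimal hedged sketch; the base type and package are placeholders:

        import java.util.ArrayList;
        import java.util.List;

        import org.springframework.beans.factory.config.BeanDefinition;
        import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
        import org.springframework.core.type.filter.AssignableTypeFilter;

        public class SubclassScanner {

            /** Finds every concrete subclass of baseType under basePackage. */
            public static List<Class<?>> findSubclasses(Class<?> baseType, String basePackage)
                    throws ClassNotFoundException {
                ClassPathScanningCandidateComponentProvider scanner =
                        new ClassPathScanningCandidateComponentProvider(false); // no default filters
                scanner.addIncludeFilter(new AssignableTypeFilter(baseType));

                List<Class<?>> result = new ArrayList<Class<?>>();
                for (BeanDefinition candidate : scanner.findCandidateComponents(basePackage)) {
                    result.add(Class.forName(candidate.getBeanClassName()));
                }
                return result;
            }
        }

        // Each discovered class can then be passed as a property to a new wrapper
        // bean definition, and all wrappers gathered into a List by autowiring
        // the collection by type.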

    Read the article

  • Custom spring bean

    - by Hari
    Hi, I want to convert some of our internal API into a Spring bean that we can use in other projects. This API needs some instantiation and other logic that I want to encapsulate in the bean, so that we can just put the bean into our app context with the necessary properties alone. I remember having read an article on this somewhere in the past but can't find it now. Any pointers to something similar would be helpful.
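
    The usual Spring idiom for this is a FactoryBean: consumers configure only the properties they care about, the factory hides the instantiation and bootstrap logic, and injecting the bean yields the API object itself rather than the factory. A generic hedged sketch; InternalApiClient and its single "endpoint" property are made-up stand-ins for the internal API:

        import org.springframework.beans.factory.FactoryBean;
        import org.springframework.beans.factory.InitializingBean;

        /** Stand-in for the real internal API type. */
        class InternalApiClient {
            private final String endpoint;
            InternalApiClient(String endpoint) { this.endpoint = endpoint; }
            public String getEndpoint() { return endpoint; }
        }

        public class InternalApiClientFactoryBean
                implements FactoryBean<InternalApiClient>, InitializingBean {

            private String endpoint;          // the only property callers must set
            private InternalApiClient client;

            public void setEndpoint(String endpoint) { this.endpoint = endpoint; }

            @Override
            public void afterPropertiesSet() {
                // All the encapsulated instantiation / bootstrap logic lives here.
                this.client = new InternalApiClient(endpoint);
            }

            @Override
            public InternalApiClient getObject() { return client; }

            @Override
            public Class<?> getObjectType() { return InternalApiClient.class; }

            @Override
            public boolean isSingleton() { return true; }
        }

        // <bean id="apiClient" class="example.InternalApiClientFactoryBean">
        //     <property name="endpoint" value="http://internal.example/api"/>
        // </bean>
        // Injecting "apiClient" elsewhere yields an InternalApiClient, not the factory.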

    Read the article

  • implement acl on field in spring security

    - by Mike
    Hi! I would like to implement Spring ACL for my object fields. Does anyone have an idea what I have to implement for this? For example, I have a Purchase object. I would like admin_role to have read access on all the fields, and secretary_role to have read access only on the username and address fields.
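
    Spring Security's ACL module secures whole domain objects, not individual fields, so field-level rules are usually enforced in the service layer instead: either @PreAuthorize/@PostAuthorize around the accessors, or blanking restricted fields before the object is returned. A hedged sketch of the second approach for the Purchase example; the field names and the admin-only field are assumptions, not taken from the question.

        import org.springframework.security.core.Authentication;
        import org.springframework.security.core.GrantedAuthority;
        import org.springframework.security.core.context.SecurityContextHolder;

        public class PurchaseFieldFilter {

            /** Hypothetical domain object from the question. */
            public static class Purchase {
                public String username;
                public String address;
                public String creditCardNumber;   // admin-only in this sketch
            }

            /** Secretaries see only username and address; admins see everything. */
            public Purchase filterForCurrentUser(Purchase purchase) {
                if (!hasAuthority("ROLE_ADMIN")) {
                    purchase.creditCardNumber = null;   // strip admin-only fields
                }
                return purchase;
            }

            private boolean hasAuthority(String role) {
                Authentication auth = SecurityContextHolder.getContext().getAuthentication();
                if (auth == null) {
                    return false;
                }
                for (GrantedAuthority granted : auth.getAuthorities()) {
                    if (role.equals(granted.getAuthority())) {
                        return true;
                    }
                }
                return false;
            }
        }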

    Read the article

  • Using a "white list" for extracting terms for Text Mining, Part 2

    - by [email protected]
    In my last post, we set the groundwork for extracting specific tokens from a white list using a CTXRULE index. In this post, we will populate a table with the extracted tokens and produce a case table suitable for clustering with Oracle Data Mining.

    Our corpus of documents will be stored in a database table that is defined as

        create table documents(id NUMBER, text VARCHAR2(4000));

    However, any suitable Oracle Text-accepted data type can be used for the text.

    We then create a table to contain the extracted tokens. The id column contains the unique identifier (or case id) of the document. The token column contains the extracted token. Note that a given document may have many tokens, so there will be one row per token for a given document.

        create table extracted_tokens (id NUMBER, token VARCHAR2(4000));

    The next step is to iterate over the documents, extract the matching tokens using the index, and insert them into our token table. We use the MATCHES function for matching the query_string from my_thesaurus_rules with the text.

        DECLARE
            cursor c2 is
              select id, text
              from documents;
        BEGIN
            for r_c2 in c2 loop
               insert into extracted_tokens
                 select r_c2.id id, main_term token
                 from my_thesaurus_rules
                 where matches(query_string, r_c2.text) > 0;
            end loop;
        END;

    Now that we have the tokens, we can compute the term frequency - inverse document frequency (TF-IDF) for each token of each document.

        create table extracted_tokens_tfidf as
          with num_docs as (select count(distinct id) doc_cnt
                            from extracted_tokens),
               tf       as (select a.id, a.token,
                                   a.token_cnt/b.num_tokens token_freq
                            from
                              (select id, token, count(*) token_cnt
                               from extracted_tokens
                               group by id, token) a,
                              (select id, count(*) num_tokens
                               from extracted_tokens
                               group by id) b
                            where a.id=b.id),
               doc_freq as (select token, count(*) overall_token_cnt
                            from extracted_tokens
                            group by token)
          select tf.id, tf.token,
                 token_freq * ln(doc_cnt/df.overall_token_cnt) tf_idf
          from num_docs,
               tf,
               doc_freq df
          where df.token=tf.token;

    From the WITH clause, the num_docs query simply counts the number of documents in the corpus. The tf query computes the term (token) frequency by counting the number of times each token appears in a document and dividing that by the number of tokens found in the document. The doc_freq query counts the number of times each token appears overall in the corpus. In the SELECT clause, we compute the tf_idf.

    Next, we create the nested table required to produce one record per case, where a case corresponds to an individual document. Here, we COLLECT all the tokens for a given document into the nested column extracted_tokens_tfidf_1.

        CREATE TABLE extracted_tokens_tfidf_nt
                     NESTED TABLE extracted_tokens_tfidf_1
                         STORE AS extracted_tokens_tfidf_tab AS
                     select id,
                            cast(collect(DM_NESTED_NUMERICAL(token,tf_idf)) as DM_NESTED_NUMERICALS) extracted_tokens_tfidf_1
                     from extracted_tokens_tfidf
                     group by id;

    To build the clustering model, we create a settings table and then insert the various settings. Most notable are the number of clusters (20), using cosine distance, which is better for text, turning off automatic data preparation since the values are ready for mining, the number of iterations (20) to get a better model, and the split criterion of size for clusters that are roughly balanced in the number of cases assigned.

        CREATE TABLE km_settings (setting_name VARCHAR2(30), setting_value VARCHAR2(30));

        BEGIN
          INSERT INTO km_settings (setting_name, setting_value)
            VALUES (dbms_data_mining.clus_num_clusters, 20);
          INSERT INTO km_settings (setting_name, setting_value)
            VALUES (dbms_data_mining.kmns_distance, dbms_data_mining.kmns_cosine);
          INSERT INTO km_settings (setting_name, setting_value)
            VALUES (dbms_data_mining.prep_auto, dbms_data_mining.prep_auto_off);
          INSERT INTO km_settings (setting_name, setting_value)
            VALUES (dbms_data_mining.kmns_iterations, 20);
          INSERT INTO km_settings (setting_name, setting_value)
            VALUES (dbms_data_mining.kmns_split_criterion, dbms_data_mining.kmns_size);
          COMMIT;
        END;

    With this in place, we can now build the clustering model.

        BEGIN
            DBMS_DATA_MINING.CREATE_MODEL(
                model_name          => 'TEXT_CLUSTERING_MODEL',
                mining_function     => dbms_data_mining.clustering,
                data_table_name     => 'extracted_tokens_tfidf_nt',
                case_id_column_name => 'id',
                settings_table_name => 'km_settings');
        END;

    To generate cluster names from this model, check out my earlier post on that topic.

    Read the article

  • Building vs. Buying a Master Data Management Solution

    - by david.butler(at)oracle.com
    Many organizations prefer to build their own MDM solutions. The argument is that they know their data quality issues and their data better than anyone, and a focused solution will cost less in the long run than a vendor-supplied general purpose product. This is not unreasonable if you think of MDM as a point solution for a particular data quality problem, but the approach carries significant risk. We now know that organizations achieve significant competitive advantages when they deploy MDM as a strategic, enterprise-wide solution, with the most common best practice being to deploy a tactical MDM solution and grow it into a full information architecture. A build-your-own approach most certainly will not scale to a larger architecture unless it is done correctly with the larger solution in mind.

    It is possible to build a home-grown point MDM solution in such a way that it will dovetail into broader MDM architectures. A very good place to start is to use the same basic technologies that Oracle uses to build its own MDM solutions. Start with the Oracle 11g database to create a flexible, extensible and open data model to hold the master data and all needed attributes. The Oracle database is the most flexible, highly available and scalable database system on the market; with its Real Application Clusters (RAC) it can even support the mixed OLTP and BI workloads that represent typical MDM data access profiles. Use Oracle Data Integrator (ODI) for batch data movement between applications, MDM data stores, and the BI layer. Use Oracle GoldenGate for more real-time data movement. Use Oracle's SOA Suite for application integration, with its BPEL Process Manager to orchestrate MDM connections to business processes, Identity Management for managing users, WS Manager for managing web services, Business Intelligence Enterprise Edition for analytics, and JDeveloper for creating or extending the MDM management application. Oracle utilizes these technologies to build its MDM Hubs. Customers who build their own MDM solution using these components will easily migrate to Oracle-provided MDM solutions when the home-grown solution runs out of gas.

    But even with a full stack of open, flexible MDM technologies, creating a robust MDM application can be a daunting task. For example, a basic MDM solution will need: a set of data access methods that support master data as a service as well as direct real-time access and batch loads and extracts; a data migration service for initial loads and periodic updates; a metadata management capability for items such as business entity matrixed relationships and hierarchies; a source system management capability to fully cross-reference business objects and to satisfy seemingly conflicting data ownership requirements; a data quality function that can find and eliminate duplicate data while ensuring correct data attribute survivorship; a set of data quality functions that can manage structured and unstructured data; a data quality interface to assist with preventing new errors from entering the system even when data entry happens outside the MDM application itself; a continuing data cleansing function to keep the data up to date; an internal triggering mechanism to create and deploy change information to all connected systems; a comprehensive role-based data security system to control and monitor data access, update rights, and maintain change history; a flexible business rules engine for managing master data processes such as privacy and data movement; a user interface to support casual users and data stewards; a business intelligence structure to support profiling, compliance, and business performance indicators; and an analytical foundation for directly analyzing master data.

    Oracle's pre-built MDM Hub solutions are full-featured, 3-tier Internet applications designed to participate in the full Oracle technology stack or to run independently in other open IT SOA environments. Building MDM solutions from scratch can take years; Oracle's pre-built MDM solutions can bring quality data to the enterprise in a matter of months. But if you must build, at least build with the world's best technology stack in a way that simplifies the eventual upgrade to Oracle MDM and to the full enterprise-wide information architecture that it enables.

    Read the article
