Search Results

Search found 3186 results on 128 pages for 'snowflake schema'.

  • Writing custom Message Formatter for SOAP basicHttpBinding

    - by Lijo
    I have a WSDL published by our service development team. It is using SOAP and basicHttpBinding. I can add the service reference to the project using Add Service Reference option in Visual Studio. I need to develop the WCF client. I need to use custom Message Formatter (for mapping between Messages and CLR types). Can you please show how to write the custom Message Formatter (in C# )for the following wsdl? Note: I am planning to use custom Message Formatter due to an issue mentioned in http://stackoverflow.com/questions/12316884/header-namespace-mismatch-issue WSDL <definitions xmlns:import0="urn:thinktecture-com:demos:restaurantservice:data:v1" xmlns:import2="urn:thinktecture-com:demos:restaurantservice:messages:v1" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:import1="urn:thinktecture-com:demos:restaurantservice:headerdata:v1" xmlns:tns="urn:thinktecture-com:demos:restaurantservice:wsdl:v1" xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:soap12="http://schemas.xmlsoap.org/wsdl/soap12/" name="RestauarntService" targetNamespace="urn:thinktecture-com:demos:restaurantservice:wsdl:v1" xmlns="http://schemas.xmlsoap.org/wsdl/"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <types> <xsd:schema> <xsd:import schemaLocation="RestaurantData.xsd" namespace="urn:thinktecture-com:demos:restaurantservice:data:v1" /> <xsd:import schemaLocation="RestaurantHeaderData.xsd" namespace="urn:thinktecture-com:demos:restaurantservice:headerdata:v1" /> <xsd:import schemaLocation="RestaurantMessages.xsd" namespace="urn:thinktecture-com:demos:restaurantservice:messages:v1" /> </xsd:schema> </types> <message name="getRestaurantsIn"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <part name="parameters" element="import2:getRestaurants" /> </message> <message name="getRestaurantsOut"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <part name="parameters" element="import2:getRestaurantsResponse" /> </message> <message name="lijosCustomFaultMessage"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <part name="fault" element="import2:customFault" /> </message> <message name="userCredentialsIn"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <part name="parameters" element="import1:userCredentials" /> </message> <message name="addRestaurantIn"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <part name="parameters" element="import2:addRestaurant" /> </message> <message name="addRestaurantInHeader1"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <part name="parameters" element="import1:userCredentials" /> </message> <message name="customFaultIn"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <part name="parameters" element="import2:customFault" /> </message> <portType name="RestauarntServiceInterface"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <operation name="getRestaurants"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <input message="tns:getRestaurantsIn" /> <output message="tns:getRestaurantsOut" /> <fault name="lijosCustomFaultMessage" message="tns:lijosCustomFaultMessage" /> </operation> <operation name="userCredentials"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <input message="tns:userCredentialsIn" /> </operation> <operation name="addRestaurant"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <input 
message="tns:addRestaurantIn" /> </operation> <operation name="customFault"> <wsdl:documentation xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" /> <input message="tns:customFaultIn" /> </operation> </portType> <binding name="BasicHttpBinding_RestauarntServiceInterface" type="tns:RestauarntServiceInterface"> <soap:binding transport="http://schemas.xmlsoap.org/soap/http" /> <operation name="getRestaurants"> <soap:operation soapAction="urn:thinktecture-com:demos:restaurantservice:wsdl:v1:getRestaurantsIn" style="document" /> <input> <soap:body use="literal" /> </input> <output> <soap:body use="literal" /> </output> <fault name="lijosCustomFaultMessage"> <soap:fault use="literal" name="lijosCustomFaultMessage" namespace="" /> </fault> </operation> <operation name="userCredentials"> <soap:operation soapAction="urn:thinktecture-com:demos:restaurantservice:wsdl:v1:userCredentialsIn" style="document" /> <input> <soap:body use="literal" /> </input> </operation> <operation name="addRestaurant"> <soap:operation soapAction="urn:thinktecture-com:demos:restaurantservice:wsdl:v1:addRestaurantIn" style="document" /> <input> <soap:body use="literal" /> <soap:header message="tns:addRestaurantInHeader1" part="parameters" use="literal" /> </input> </operation> <operation name="customFault"> <soap:operation soapAction="urn:thinktecture-com:demos:restaurantservice:wsdl:v1:customFaultIn" style="document" /> <input> <soap:body use="literal" /> </input> </operation> </binding> <service name="RestauarntServicePort"> <port name="RestauarntServicePort" binding="tns:BasicHttpBinding_RestauarntServiceInterface"> <soap:address location="http://localhost/RestauarntService" /> </port> </service> </definitions>?? RestaurantData.xsd <?xml version="1.0" encoding="utf-8" ?> <xs:schema id="RestaurantData" targetNamespace="urn:thinktecture-com:demos:restaurantservice:data:v1" elementFormDefault="qualified" xmlns="urn:thinktecture-com:demos:restaurantservice:data:v1" xmlns:mstns="urn:thinktecture-com:demos:restaurantservice:data:v1" xmlns:xs="http://www.w3.org/2001/XMLSchema"> <xs:complexType name="restaurantInfo"> <xs:sequence> <xs:element name="restaurantID" type="xs:int" /> <xs:element name="name" type="xs:string" /> <xs:element name="address" type="xs:string" /> <xs:element name="city" type="xs:string" /> <xs:element name="state" type="xs:string" /> <xs:element name="zip" type="xs:string" /> <xs:element name="openFrom" type="xs:time" /> <xs:element name="openTo" type="xs:time" /> </xs:sequence> </xs:complexType> <xs:complexType name="restaurantsList"> <xs:sequence> <xs:element name="restaurant" type="restaurantInfo" maxOccurs="unbounded" minOccurs="0" /> </xs:sequence> </xs:complexType> <xs:complexType name="customFault"> <xs:sequence> <xs:element name="errorCode" type="xs:string"/> <xs:element name="message" type="xs:string"/> <xs:element maxOccurs="unbounded" minOccurs="0" name="messages" type="xs:string" /> </xs:sequence> </xs:complexType> </xs:schema> RestaurantHeaderData.xsd <?xml version="1.0" encoding="utf-8" ?> <xs:schema id="RestaurantHeaderData" targetNamespace="urn:thinktecture-com:demos:restaurantservice:headerdata:v1" elementFormDefault="qualified" xmlns="urn:thinktecture-com:demos:restaurantservice:headerdata:v1" xmlns:mstns="urn:thinktecture-com:demos:restaurantservice:headerdata:v1" xmlns:xs="http://www.w3.org/2001/XMLSchema"> <xs:complexType name="credentials"> <xs:sequence> <xs:element name="username" type="xs:string" /> <xs:element name="password" type="xs:string" /> </xs:sequence> </xs:complexType> 
<xs:element name="userCredentials" type="credentials"> </xs:element> </xs:schema> ? RestaurantMessages.xsd <?xml version="1.0" encoding="utf-8" ?> <xs:schema id="RestaurantMessages" targetNamespace="urn:thinktecture-com:demos:restaurantservice:messages:v1" elementFormDefault="qualified" xmlns="urn:thinktecture-com:demos:restaurantservice:messages:v1" xmlns:mstns="urn:thinktecture-com:demos:restaurantservice:messages:v1" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:import="urn:thinktecture-com:demos:restaurantservice:data:v1"> <xs:import id="RestaurantData" schemaLocation="RestaurantData.xsd" namespace="urn:thinktecture-com:demos:restaurantservice:data:v1"> </xs:import> <xs:element name="getRestaurants"> <xs:complexType> <xs:sequence> <xs:element name="zip" type="xs:string" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="getRestaurantsResponse"> <xs:complexType> <xs:sequence> <xs:element name="restaurants" type="import:restaurantsList" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="addRestaurant"> <xs:complexType> <xs:sequence> <xs:element name="restaurant" type="import:restaurantInfo" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element name="customFault" type="import:customFault" /> </xs:schema>

    Read the article

  • BizTalk 2009 - Scoped Record Counting in Maps

    - by StuartBrierley
    Within BizTalk there is a functoid called Record Count that will return the number of instances of a repeated record or repeated element that occur in a message instance. The input to this functoid is the record or element to be counted. As an example, take the following Source schema, where the Source message has a repeated record called Box and each Box has a repeated element called Item: An instance of this Source schema may look as follows: 2 box records - one with 2 items and one with only 1 item. Our destination schema has a number of elements and a repeated box record. The top level elements contain totals for the number of boxes and the overall number of items. Each box record contains a single element representing the number of items in that box. Using the Record Count functoid it is easy to map the top level elements, producing the expected totals of 2 boxes and 3 items: We now need to map the total number of items per box, but how will we do this? We have already seen that the Record Count functoid returns the total number of instances for the entire message, and unfortunately it does not allow you to specify a scoping parameter. In order to achieve Scoped Record Counting we will need to make use of a combination of functoids. As you can see above, by linking to a Logical Existence functoid from the record/element to be counted we can then feed the output into a Value Mapping functoid. Set the other Value Mapping parameter to "1" and link the output to a Cumulative Sum functoid. Set the other Cumulative Sum functoid parameter to "1" to limit the scope of the Cumulative Sum. This gives us the expected results of Items per Box of 2 and 1 respectively. I ran into this issue with a larger schema on a more complex map, but the eventual solution is still the same. Hopefully this simplified example will act as a good reminder to me and save someone out there a few minutes of brain scratching.
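    Since a BizTalk map ultimately compiles down to XSLT, the scoped count that this functoid chain produces is equivalent to counting Item inside a per-Box loop. A rough sketch follows, using the simplified Source/Box/Item and Destination element names from this example (real schemas and namespaces will differ):

    ```xml
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/Source">
        <Destination>
          <!-- totals across the whole message, as produced by the Record Count functoid -->
          <BoxCount><xsl:value-of select="count(Box)"/></BoxCount>
          <ItemCount><xsl:value-of select="count(Box/Item)"/></ItemCount>
          <xsl:for-each select="Box">
            <Box>
              <!-- count() evaluated inside the for-each is scoped to the current Box -->
              <ItemsPerBox><xsl:value-of select="count(Item)"/></ItemsPerBox>
            </Box>
          </xsl:for-each>
        </Destination>
      </xsl:template>
    </xsl:stylesheet>
    ```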

    Read the article

  • HTML 5 Video OnEnded Event not Firing

    - by Eric J.
    I'm trying to react to the HTML 5 onended event for the video tag without success. In the code snippet below I added the mouseleave event to be sure the jQuery code is correct, and that event does activate the alert() box. The video plays just fine but onended does not cause my alert() to fire off. Testing in Chrome, updated today to version 5.0.375.55. <html> <head> <script src="http://code.jquery.com/jquery-1.4.2.min.js"></script> <script> $(document).ready(function(){ $("#myVideoTag").bind('mouseleave onended', function(){ alert("All done, function!"); }); }); </script> </head> <body> <video id="myVideoTag" width="640" height="360" poster="Fractal-zoom-1-04-Snowflake.jpg" controls> <source src="Fractal-zoom-1-04-Snowflake.mp4" type="video/mp4"></source> <source src="Fractal-zoom-1-04-Snowflake.ogg" type="video/ogg"></source> </video> </body> </html>
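    For reference, a minimal sketch (not from the original post): the DOM event fired when playback finishes is named 'ended' (onended is only the corresponding attribute/handler property), so the jQuery handler would be bound to 'ended':

    ```javascript
    // Bind to the 'ended' media event; 'onended' is not an event name jQuery can bind to.
    $(document).ready(function () {
        $("#myVideoTag").bind('ended', function () {
            alert("All done, function!");
        });
    });
    ```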

    Read the article

  • Spring MVC configuration problems

    - by Smek
    i have some problems with configuring Spring MVC. I made a maven multi module project with the following modules: /api /domain /repositories /webapp I like to share the domain and the repositories between the api and the webapp (both web projects). First i want to configure the webapp to use the repositories module so i added the dependencies in the xml file like this: <dependency> <groupId>${project.groupId}</groupId> <artifactId>domain</artifactId> <version>1.0-SNAPSHOT</version> </dependency> <dependency> <groupId>${project.groupId}</groupId> <artifactId>repositories</artifactId> <version>1.0-SNAPSHOT</version> </dependency> And my controller in the webapp module looks like this: package com.mywebapp.webapp; import com.mywebapp.domain.Person; import com.mywebapp.repositories.services.PersonService; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.context.annotation.ComponentScan; import org.springframework.context.annotation.Configuration; import org.springframework.stereotype.Controller; import org.springframework.ui.ModelMap; import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RequestMethod; @Controller @RequestMapping("/") @Configuration @ComponentScan("com.mywebapp.repositories") public class PersonController { @Autowired PersonService personservice; @RequestMapping(method = RequestMethod.GET) public String printWelcome(ModelMap model) { Person p = new Person(); p.age = 23; p.firstName = "John"; p.lastName = "Doe"; personservice.createNewPerson(p); model.addAttribute("message", "Hello world!"); return "index"; } } In my webapp module i try to load configuration files in my web.xml like this: <context-param> <param-name>contextConfigLocation</param-name> <param-value>classpath:/META-INF/persistence-context.xml, classpath:/META-INF/service-context.xml</param-value> </context-param> These files cannot be found so i get the following error: org.springframework.beans.factory.BeanDefinitionStoreException: IOException parsing XML document from class path resource [META-INF/persistence-context.xml]; nested exception is java.io.FileNotFoundException: class path resource [META-INF/persistence-context.xml] cannot be opened because it does not exist These files are in the repositories module so my first question is how can i make Spring to find these files? I also have trouble Autowiring the PersonService to my Controller class did i forget to configure something in my XML? Here is the error message: [INFO] [talledLocalContainer] SEVERE: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener [INFO] [talledLocalContainer] org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'personServiceImpl': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private com.mywebapp.repositories.repository.PersonRepository com.mywebapp.repositories.services.PersonServiceImpl.personRepository; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No matching bean of type [com.mywebapp.repositories.repository.PersonRepository] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. 
Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)} PersonServiceImple.java: package com.mywebapp.repositories.services; import com.mywebapp.domain.Person; import com.mywebapp.repositories.repository.PersonRepository; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.data.mongodb.core.MongoTemplate; import org.springframework.stereotype.Service; @Service public class PersonServiceImpl implements PersonService{ @Autowired public PersonRepository personRepository; @Autowired public MongoTemplate personTemplate; @Override public Person createNewPerson(Person person) { return personRepository.save(person); } } PersonService.java package com.mywebapp.repositories.services; import com.mywebapp.domain.Person; public interface PersonService { Person createNewPerson(Person person); } PersonRepository.java: package com.mywebapp.repositories.repository; import com.mywebapp.domain.Person; import org.springframework.data.mongodb.repository.MongoRepository; import org.springframework.stereotype.Repository; import java.math.BigInteger; @Repository public interface PersonRepository extends MongoRepository<Person, BigInteger> { } persistance-context.xml <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:context="http://www.springframework.org/schema/context" xmlns:mongo="http://www.springframework.org/schema/data/mongo" xsi:schemaLocation= "http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd"> <context:property-placeholder location="classpath:mongo.properties"/> <mongo:mongo host="${mongo.host}" port="${mongo.port}" id="mongo"> <mongo:options connections-per-host="${mongo.connectionsPerHost}" threads-allowed-to-block-for-connection-multiplier="${mongo.threadsAllowedToBlockForConnectionMultiplier}" connect-timeout="${mongo.connectTimeout}" max-wait-time="${mongo.maxWaitTime}" auto-connect-retry="${mongo.autoConnectRetry}" socket-keep-alive="${mongo.socketKeepAlive}" socket-timeout="${mongo.socketTimeout}" slave-ok="${mongo.slaveOk}" write-number="1" write-timeout="0" write-fsync="true"/> </mongo:mongo> <mongo:db-factory dbname="person" mongo-ref="mongo" id="mongoDbFactory"/> <bean id="personTemplate" name="personTemplate" class="org.springframework.data.mongodb.core.MongoTemplate"> <constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/> </bean> <mongo:repositories base-package="com.mywebapp.repositories.repository" mongo-template-ref="personTemplate"> <mongo:repository id="personRepository" repository-impl-postfix="PersonRepository" mongo-template-ref="personTemplate" create-query-indexes="true"/> </mongo:repositories> Thanks
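    On the first question, a sketch worth trying (assuming persistence-context.xml and service-context.xml are meant to live under META-INF inside the repositories jar, e.g. in that module's src/main/resources/META-INF so Maven packages them): first make sure the files actually end up inside the built jar, and then consider the classpath*: prefix, which searches every jar on the classpath rather than a single location:

    ```xml
    <context-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>
            classpath*:META-INF/persistence-context.xml,
            classpath*:META-INF/service-context.xml
        </param-value>
    </context-param>
    ```

    Once those context files load, the mongo:repositories scan they contain should also register PersonRepository, which would resolve the NoSuchBeanDefinitionException reported for the autowired field.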

    Read the article

  • spring - NullPointerException - can't access autowired annotated service (or dao) in a no-annotations class

    - by user286806
    I have this problem that I cannot fix. From my @Controller, i can easily access my autowired @Service class and play with it no problem. But when I do that from a separate class without annotations, it gives me a NullPointerException. My Controller (works)- @Controller public class UserController { @Autowired UserService userService;... My separate Java class (not working)- public final class UsersManagementUtil { @Autowired UserService userService; or @Autowired UserDao userDao; userService or userDao are always null! Was just trying if any one of them works. My component scan setting has the root level package set for scanning so that should be OK. my servlet context - <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:context="http://www.springframework.org/schema/context" xmlns:tx="http://www.springframework.org/schema/tx" xsi:schemaLocation=" http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-3.0.xsd"> <!-- the application context definition for the springapp DispatcherServlet --> <!-- Enable annotation driven controllers, validation etc... --> <context:property-placeholder location="classpath:jdbc.properties" /> <context:component-scan base-package="x" /> <tx:annotation-driven transaction-manager="hibernateTransactionManager" /> <!-- package shortended --> <bean id="messageSource" class="o.s.c.s.ReloadableResourceBundleMessageSource"> <property name="basename" value="/WEB-INF/messages" /> </bean> <bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource"> <property name="driverClassName" value="${database.driver}" /> <property name="url" value="${database.url}" /> <property name="username" value="${database.user}" /> <property name="password" value="${database.password}" /> </bean> <!-- package shortened --> <bean id="viewResolver" class="o.s.w.s.v.InternalResourceViewResolver"> <property name="prefix"> <value>/</value> </property> <property name="suffix"> <value>.jsp</value> </property> <property name="order"> <value>0</value> </property> </bean> <!-- package shortened --> <bean id="sessionFactory" class="o.s.o.h3.a.AnnotationSessionFactoryBean"> <property name="dataSource" ref="dataSource" /> <property name="annotatedClasses"> <list> <value>orion.core.models.Question</value> <value>orion.core.models.User</value> <value>orion.core.models.Space</value> <value>orion.core.models.UserSkill</value> <value>orion.core.models.Question</value> <value>orion.core.models.Rating</value> </list> </property> <property name="hibernateProperties"> <props> <prop key="hibernate.dialect">${hibernate.dialect}</prop> <prop key="hibernate.show_sql">${hibernate.show_sql}</prop> <prop key="hibernate.hbm2ddl.auto">${hibernate.hbm2ddl.auto}</prop> </props> </property> </bean> <bean id="hibernateTransactionManager" class="org.springframework.orm.hibernate3.HibernateTransactionManager"> <property name="sessionFactory" ref="sessionFactory" /> </bean> Any clue?
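    A common cause, offered as a sketch since the code that creates UsersManagementUtil isn't shown: @Autowired is only processed on beans that Spring itself instantiates, so a utility object created with new will always have null fields. One way around that is to let Spring manage the utility class as well and inject it where needed:

    ```java
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Component;

    // Sketch only: make the utility a Spring-managed bean so its @Autowired fields are populated.
    @Component
    public class UsersManagementUtil {

        @Autowired
        private UserService userService;

        public UserService getUserService() {
            return userService;   // non-null once Spring has built this bean
        }
    }
    ```

    The controller (or any other bean) would then declare an @Autowired UsersManagementUtil field instead of constructing the class manually, and the class has to live under the base-package given to context:component-scan.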

    Read the article

  • WCF Error when using “Match Data” function in MDS Excel AddIn

    - by Davide Mauri
    If you’re using MDS and DQS with the Excel Integration you may get an error when trying to use the “Match Data” feature that uses DQS in order to help to identify duplicate data in your data set. The error is quite obscure and you have to enable WCF error reporting in order to have the error details and you’ll discover that they are related to some missing permission in MDS and DQS_STAGING_DATA database. To fix the problem you just have to give the needed permession, as the following script does: use MDS go GRANT SELECT ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb] GRANT INSERT ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb] GRANT DELETE ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb] GRANT UPDATE ON mdm.tblDataQualityOperationsState TO [VMSRV02\mdsweb] USE [DQS_STAGING_DATA] GO ALTER AUTHORIZATION ON SCHEMA::[db_datareader] TO [VMSRV02\mdsweb] ALTER AUTHORIZATION ON SCHEMA::[db_datawriter] TO [VMSRV02\mdsweb] ALTER AUTHORIZATION ON SCHEMA::[db_ddladmin] TO [VMSRV02\mdsweb] GO Where “VMSRV02\mdsweb” is the user you configured for MDS Service execution. If you don’t remember it, you can just check which account has been assigned to the IIS application pool that your MDS website is using:

    Read the article

  • Has anyone used these database publish tools in Visual Studio 2012/2013?

    - by punkouter
    I am trying to figure out if they are actually useful... I have a DBProject and it would be nice if, when I publish from my DEV to my TEST, it would automatically copy the schema and data from DEV to TEST... is that what this tab in Visual Studio is for? On PROD I would never want to copy the data over, but maybe I would change the schema and want to move that schema to PROD... so is that what this tab is for? For now we are using WEB DEPLOY but none of these database options... We are manually using DATA COMPARE to sync changes on the various databases/DBProject... Any advice?

    Read the article

  • Stairway to XML: Level 3 - Working with Typed XML

    You can enforce the validation of an XML data type, variable or column by associating it with an XML Schema Collection. SQL Server validates a typed XML value against the rules defined in the schema collection so that INSERT or UPDATE operations will succeed only if the value being inserted or updated is valid as per the rules defined in the Schema Collection.
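    A minimal illustration of the idea (the collection, table and element names below are invented for the example): a typed xml column is declared against a schema collection, and SQL Server rejects any INSERT or UPDATE whose value does not validate.

    ```sql
    -- Create a schema collection and a table with a typed xml column bound to it.
    CREATE XML SCHEMA COLLECTION ProductSchemaCollection AS
    N'<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
                 targetNamespace="urn:example:product" xmlns="urn:example:product">
        <xs:element name="Product">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Name"  type="xs:string"/>
              <xs:element name="Price" type="xs:decimal"/>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:schema>';
    GO

    CREATE TABLE dbo.ProductCatalog
    (
        ProductCatalogId int IDENTITY(1,1) PRIMARY KEY,
        Detail           xml(ProductSchemaCollection) NOT NULL   -- typed XML column
    );
    GO

    -- Succeeds: the document conforms to the schema collection.
    INSERT INTO dbo.ProductCatalog (Detail)
    VALUES (N'<Product xmlns="urn:example:product"><Name>Widget</Name><Price>9.99</Price></Product>');

    -- Fails with a validation error: Price is not a valid xs:decimal.
    -- INSERT INTO dbo.ProductCatalog (Detail)
    -- VALUES (N'<Product xmlns="urn:example:product"><Name>Widget</Name><Price>cheap</Price></Product>');
    ```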

    Read the article

  • SOA 11g Technology Adapters – ECID Propagation

    - by Greg Mally
    Overview Many SOA Suite 11g deployments include the use of the technology adapters for various activities including integration with FTP, database, and files to name a few. Although the integrations with these adapters are easy and feature rich, there can be some challenges from the operations perspective. One of these challenges is how to correlate a logical business transaction across SOA component instances. This correlation is typically accomplished via the execution context ID (ECID), but we lose the ECID correlation when the business transaction spans technologies like FTP, database, and files. A new feature has been introduced in the Oracle adapter JCA framework to allow the propagation of the ECID. This feature is available in the forthcoming SOA Suite 11.1.1.7 (PS6). The basic concept of propagating the ECID is to identify somewhere in the payload of the message where the ECID can be stored. Then two Binding Properties, relating to the location of the ECID in the message, are added to either the Exposed Service (left-hand side of composite) or External Reference (right-hand side of composite). This will give the JCA framework enough information to either extract the ECID from or add the ECID to the message. In the scenario of extracting the ECID from the message, the ECID will be used for the new component instance. Where to Put the ECID When trying to determine where to store the ECID in the message, you basically have two options: Add a new optional element to your message schema. Leverage an existing element that is not used in your schema. The best scenario is that you are able to add the optional element to your message since trying to find an unused element will prove difficult in most situations. The schema will be holding the ECID value which looks something like the following: 11d1def534ea1be0:7ae4cac3:13b4455735c:-8000-00000000000002dc Configuring Composite Services/References Now that you have identified where you want the ECID to be stored in the message, the JCA framework needs to have this information as well. The two pieces of information that the framework needs relates to the message schema: The namespace for the element in the message. The XPath to the element in the message. To better understand this, let's look at an example for the following database table: When an Exposed Service is created via the Database Adapter Wizard in the composite, the following schema is created: For this example, the two Binding Properties we add to the ReadRow service in the composite are: <!-- Properties for the binding to propagate the ECID from the database table --> <property name="jca.ecid.nslist" type="xs:string" many="false">  xmlns:ns1="http://xmlns.oracle.com/pcbpel/adapter/db/top/ReadRow"</property> <property name="jca.ecid.xpath" type="xs:string" many="false">  /ns1:EcidPropagationCollection/ns1:EcidPropagation/ns1:ecid</property> Notice that the property called jca.ecid.nslist contains the targetNamespace defined in the schema and the property called jca.ecid.xpath contains the XPath statement to the element. The XPath statement also contains the appropriate namespace prefix (ns1) which is defined in the jca.ecid.nslist property. When the Database Adapter service reads a row from the database, it will retrieve the ECID value from the payload and remove the element from the payload. When the component instance is created, it will be associated with the retrieved ECID and the payload contains everything except the ECID element/value. 
    The only time the ECID is visible is when it is stored safely in the resource technology like the database, a file, or a queue. Simple Database/File/JMS Example This section contains a simplified example of how the ECID can propagate through a database table, a file, and a JMS queue. The composite for the example looks like the following: The flow of this example is as follows:
    1. Invoke database insert using the insertwithecidbpelprocess_client_ep Service.
    2. The InsertWithECIDBPELProcess adds a row to the database via the Database Adapter. The JCA Framework adds the ECID to the message prior to inserting.
    3. The ReadRow Service retrieves the record and the JCA Framework extracts the ECID from the message. The ECID element is removed from the message. An instance of ReadRowBPELProcess is created and it is associated with the retrieved ECID.
    4. The ReadRowBPELProcess now writes the record to the file system via the File Adapter. The JCA Framework adds the ECID to the message prior to writing the message to file.
    5. The ReadFile Service retrieves the record from the file system and the JCA Framework extracts the ECID from the message. The ECID element is removed from the message. An instance of ReadFileBPELProcess is created and it is associated with the retrieved ECID.
    6. The ReadFileBPELProcess now enqueues the message via the JMS Adapter. The JCA Framework adds the ECID to the message prior to enqueuing the message.
    7. The DequeueMessage Service retrieves the record and the JCA Framework extracts the ECID from the message. The ECID element is removed from the message. An instance of DequeueMessageBPELProcess is created and it is associated with the retrieved ECID.
    8. The logical flow ends.
    When viewing the Flow Trace in Enterprise Manager, you will now see all the instances correlated via ECID. Please check back here when SOA Suite 11.1.1.7 is released for this example. With the example you can run it yourself and reinforce what has been shared in this blog via a hands-on experience. One final note: the contents of this blog may be included in the official SOA Suite 11.1.1.7 documentation, but you will still need to come here to get the example.

    Read the article

  • Building a Data Warehouse

    - by Paul
    I've seen tutorials, articles and posts on how to build data warehouses with star and snowflake schemas, denormalization of OLTP databases, fact and dimension tables, and so on. I've also seen comments like: "Star schemas are for datamarts, at best. There is absolutely no way a true enterprise data warehouse could be represented in a star schema, or snowflake either." I want to create a database that will serve Reporting Services and maybe (if that isn't enough) install Analysis Services and extract reports and data from cubes. My question is: is it really necessary to redesign my current database and follow the star/snowflake schemas with fact and dimension tables? Thank you
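    For reference, a minimal star-schema sketch (all table and column names here are invented): one fact table holding the measures, joined by surrogate keys to denormalized dimension tables. A snowflake schema would further normalize the dimensions, for example by splitting Category out of DimProduct into its own table.

    ```sql
    CREATE TABLE DimDate (
        DateKey      int PRIMARY KEY,        -- surrogate key, e.g. 20240131
        CalendarDate date NOT NULL,
        MonthName    varchar(20) NOT NULL,
        CalendarYear smallint NOT NULL
    );

    CREATE TABLE DimProduct (
        ProductKey  int IDENTITY(1,1) PRIMARY KEY,
        ProductName varchar(100) NOT NULL,
        Category    varchar(50)  NOT NULL    -- denormalized into the dimension (star style)
    );

    CREATE TABLE FactSales (
        DateKey     int NOT NULL REFERENCES DimDate(DateKey),
        ProductKey  int NOT NULL REFERENCES DimProduct(ProductKey),
        Quantity    int NOT NULL,
        SalesAmount decimal(18,2) NOT NULL
    );
    ```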

    Read the article

  • when should a database table be broken into multiple tables with relations?

    - by GSto
    I have an application that needs to store client data, and part of that is some data about their employer as well. Assuming that a client can only have one employer, and that the chance of people having identical employer data is slim to none, which schema would make more sense to use?
    Schema 1 - Client Table:
    id int, name varchar(255), email varchar(255), address varchar(255), city varchar(255), state char(2), zip varchar(16), employer_name varchar(255), employer_phone varchar(255), employer_address varchar(255), employer_city varchar(255), employer_state char(2), employer_zip varchar(16)
    Schema 2 - Client Table:
    id int, name varchar(255), email varchar(255), address varchar(255), city varchar(255), state char(2), zip varchar(16)
    Schema 2 - Employer Table:
    id int, name varchar(255), phone varchar(255), address varchar(255), city varchar(255), state char(2), zip varchar(16), patient_id int
    Part of me thinks that since these are clearly two different 'objects' in the real world, separating them out into two different tables makes sense. However, since a client will always have an employer, I'm also not seeing any real benefits to separating them out, and it would make querying data about clients more complex. Is there any benefit / reason for creating two tables in a situation like this one instead of one?
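    Expressed as DDL for illustration (column lists abbreviated; the foreign-key column is assumed to point back at the client, matching the patient_id field in the question), the two-table option looks like this:

    ```sql
    CREATE TABLE client (
        id      int PRIMARY KEY,
        name    varchar(255),
        email   varchar(255)
        -- address, city, state, zip ...
    );

    CREATE TABLE employer (
        id        int PRIMARY KEY,
        name      varchar(255),
        phone     varchar(255),
        -- address, city, state, zip ...
        client_id int REFERENCES client(id)   -- one employer row per client
    );
    ```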

    Read the article

  • BAD DC transferring FSMO Roles to ADC

    - by Suleman
    I have a DC (FQDN:server.icmcpk.local) and an ADC (FQDN:file-server.icmcpk.local). Recently my DC is facing a bad sector problem so I changed the Operation Masters to file-server for all five roles. but when ever i turn off the OLD DC the file-server also stops wroking with AD and GPMC further i m also unable to join any other computer to this domain. For Test purpose i also added a new ADC (FQDN:wds-server.icmcpk.local) but no succes with the old DC off i had to turn the old DC on and then joined it. I m attaching the Dcdiags for all three servers. Kindly help me so that i b able to reinstall new HDD and it can go online again. --------------------------------------- Server --------------------------------------- C:\Program Files\Support Tools>dcdiag Domain Controller Diagnosis Performing initial setup: Done gathering initial info. Doing initial required tests Testing server: Default-First-Site-Name\SERVER Starting test: Connectivity ......................... SERVER passed test Connectivity Doing primary tests Testing server: Default-First-Site-Name\SERVER Starting test: Replications [Replications Check,SERVER] A recent replication attempt failed: From FILE-SERVER to SERVER Naming Context: DC=ForestDnsZones,DC=icmcpk,DC=local The replication generated an error (1908): Could not find the domain controller for this domain. The failure occurred at 2012-05-04 14:07:13. The last success occurred at 2012-05-04 13:48:39. 1 failures have occurred since the last success. Kerberos Error. A KDC was not found to authenticate the call. Check that sufficient domain controllers are available. [Replications Check,SERVER] A recent replication attempt failed: From WDS-SERVER to SERVER Naming Context: DC=ForestDnsZones,DC=icmcpk,DC=local The replication generated an error (1908): Could not find the domain controller for this domain. The failure occurred at 2012-05-04 14:07:13. The last success occurred at 2012-05-04 13:48:39. 1 failures have occurred since the last success. Kerberos Error. A KDC was not found to authenticate the call. Check that sufficient domain controllers are available. [Replications Check,SERVER] A recent replication attempt failed: From FILE-SERVER to SERVER Naming Context: DC=DomainDnsZones,DC=icmcpk,DC=local The replication generated an error (1908): Could not find the domain controller for this domain. The failure occurred at 2012-05-04 14:07:13. The last success occurred at 2012-05-04 13:48:39. 1 failures have occurred since the last success. Kerberos Error. A KDC was not found to authenticate the call. Check that sufficient domain controllers are available. [Replications Check,SERVER] A recent replication attempt failed: From WDS-SERVER to SERVER Naming Context: DC=DomainDnsZones,DC=icmcpk,DC=local The replication generated an error (1908): Could not find the domain controller for this domain. The failure occurred at 2012-05-04 14:07:13. The last success occurred at 2012-05-04 13:48:39. 1 failures have occurred since the last success. Kerberos Error. A KDC was not found to authenticate the call. Check that sufficient domain controllers are available. [Replications Check,SERVER] A recent replication attempt failed: From FILE-SERVER to SERVER Naming Context: CN=Schema,CN=Configuration,DC=icmcpk,DC=local The replication generated an error (1908): Could not find the domain controller for this domain. The failure occurred at 2012-05-04 14:07:13. The last success occurred at 2012-05-04 13:48:39. 1 failures have occurred since the last success. Kerberos Error. 
A KDC was not found to authenticate the call. Check that sufficient domain controllers are available. [Replications Check,SERVER] A recent replication attempt failed: From WDS-SERVER to SERVER Naming Context: CN=Schema,CN=Configuration,DC=icmcpk,DC=local The replication generated an error (1908): Could not find the domain controller for this domain. The failure occurred at 2012-05-04 14:07:13. The last success occurred at 2012-05-04 13:48:39. 1 failures have occurred since the last success. Kerberos Error. A KDC was not found to authenticate the call. Check that sufficient domain controllers are available. [Replications Check,SERVER] A recent replication attempt failed: From WDS-SERVER to SERVER Naming Context: DC=icmcpk,DC=local The replication generated an error (1908): Could not find the domain controller for this domain. The failure occurred at 2012-05-04 14:07:13. The last success occurred at 2012-05-04 13:48:39. 1 failures have occurred since the last success. Kerberos Error. A KDC was not found to authenticate the call. Check that sufficient domain controllers are available. ......................... SERVER passed test Replications Starting test: NCSecDesc ......................... SERVER passed test NCSecDesc Starting test: NetLogons ......................... SERVER passed test NetLogons Starting test: Advertising ......................... SERVER passed test Advertising Starting test: KnowsOfRoleHolders ......................... SERVER passed test KnowsOfRoleHolders Starting test: RidManager ......................... SERVER passed test RidManager Starting test: MachineAccount ......................... SERVER passed test MachineAccount Starting test: Services ......................... SERVER passed test Services Starting test: ObjectsReplicated ......................... SERVER passed test ObjectsReplicated Starting test: frssysvol ......................... SERVER passed test frssysvol Starting test: frsevent There are warning or error events within the last 24 hours after the SYSVOL has been shared. Failing SYSVOL replication problems may cause Group Policy problems. ......................... SERVER failed test frsevent Starting test: kccevent ......................... SERVER passed test kccevent Starting test: systemlog An Error Event occured. EventID: 0x80001778 Time Generated: 05/04/2012 14:05:39 Event String: The previous system shutdown at 1:26:31 PM on An Error Event occured. EventID: 0x825A0011 Time Generated: 05/04/2012 14:07:45 (Event String could not be retrieved) An Error Event occured. EventID: 0x00000457 Time Generated: 05/04/2012 14:13:40 (Event String could not be retrieved) An Error Event occured. EventID: 0x00000457 Time Generated: 05/04/2012 14:14:25 (Event String could not be retrieved) An Error Event occured. EventID: 0x00000457 Time Generated: 05/04/2012 14:14:25 (Event String could not be retrieved) An Error Event occured. EventID: 0x00000457 Time Generated: 05/04/2012 14:14:38 (Event String could not be retrieved) An Error Event occured. EventID: 0xC1010020 Time Generated: 05/04/2012 14:16:14 Event String: Dependent Assembly Microsoft.VC80.MFCLOC could An Error Event occured. EventID: 0xC101003B Time Generated: 05/04/2012 14:16:14 Event String: Resolve Partial Assembly failed for An Error Event occured. EventID: 0xC101003B Time Generated: 05/04/2012 14:16:14 Event String: Generate Activation Context failed for An Error Event occured. 
EventID: 0xC1010020 Time Generated: 05/04/2012 14:16:14 Event String: Dependent Assembly Microsoft.VC80.MFCLOC could An Error Event occured. EventID: 0xC101003B Time Generated: 05/04/2012 14:16:14 Event String: Resolve Partial Assembly failed for An Error Event occured. EventID: 0xC101003B Time Generated: 05/04/2012 14:16:14 Event String: Generate Activation Context failed for An Error Event occured. EventID: 0x825A0011 Time Generated: 05/04/2012 14:22:57 (Event String could not be retrieved) An Error Event occured. EventID: 0xC1010020 Time Generated: 05/04/2012 14:22:59 Event String: Dependent Assembly Microsoft.VC80.MFCLOC could An Error Event occured. EventID: 0xC101003B Time Generated: 05/04/2012 14:22:59 Event String: Resolve Partial Assembly failed for An Error Event occured. EventID: 0xC101003B Time Generated: 05/04/2012 14:22:59 Event String: Generate Activation Context failed for An Error Event occured. EventID: 0xC1010020 Time Generated: 05/04/2012 14:22:59 Event String: Dependent Assembly Microsoft.VC80.MFCLOC could An Error Event occured. EventID: 0xC101003B Time Generated: 05/04/2012 14:22:59 Event String: Resolve Partial Assembly failed for An Error Event occured. EventID: 0xC101003B Time Generated: 05/04/2012 14:22:59 Event String: Generate Activation Context failed for ......................... SERVER failed test systemlog Starting test: VerifyReferences ......................... SERVER passed test VerifyReferences Running partition tests on : ForestDnsZones Starting test: CrossRefValidation ......................... ForestDnsZones passed test CrossRefValidation Starting test: CheckSDRefDom ......................... ForestDnsZones passed test CheckSDRefDom Running partition tests on : DomainDnsZones Starting test: CrossRefValidation ......................... DomainDnsZones passed test CrossRefValidation Starting test: CheckSDRefDom ......................... DomainDnsZones passed test CheckSDRefDom Running partition tests on : Schema Starting test: CrossRefValidation ......................... Schema passed test CrossRefValidation Starting test: CheckSDRefDom ......................... Schema passed test CheckSDRefDom Running partition tests on : Configuration Starting test: CrossRefValidation ......................... Configuration passed test CrossRefValidation Starting test: CheckSDRefDom ......................... Configuration passed test CheckSDRefDom Running partition tests on : icmcpk Starting test: CrossRefValidation ......................... icmcpk passed test CrossRefValidation Starting test: CheckSDRefDom ......................... icmcpk passed test CheckSDRefDom Running enterprise tests on : icmcpk.local Starting test: Intersite ......................... icmcpk.local passed test Intersite Starting test: FsmoCheck ......................... icmcpk.local passed test FsmoCheck ---------------------- File-Server ---------------------- C:\Users\Administrator.ICMCPK>dcdiag Directory Server Diagnosis Performing initial setup: Trying to find home server... Home Server = FILE-SERVER * Identified AD Forest. Done gathering initial info. Doing initial required tests Testing server: Default-First-Site-Name\FILE-SERVER Starting test: Connectivity ......................... FILE-SERVER passed test Connectivity Doing primary tests Testing server: Default-First-Site-Name\FILE-SERVER Starting test: Advertising Warning: DsGetDcName returned information for \\Server.icmcpk.local, when we were trying to reach FILE-SERVER. SERVER IS NOT RESPONDING or IS NOT CONSIDERED SUITABLE. 
......................... FILE-SERVER failed test Advertising Starting test: FrsEvent ......................... FILE-SERVER passed test FrsEvent Starting test: DFSREvent ......................... FILE-SERVER passed test DFSREvent Starting test: SysVolCheck ......................... FILE-SERVER passed test SysVolCheck Starting test: KccEvent ......................... FILE-SERVER passed test KccEvent Starting test: KnowsOfRoleHolders ......................... FILE-SERVER passed test KnowsOfRoleHolders Starting test: MachineAccount ......................... FILE-SERVER passed test MachineAccount Starting test: NCSecDesc Error NT AUTHORITY\ENTERPRISE DOMAIN CONTROLLERS doesn't have Replicating Directory Changes In Filtered Set access rights for the naming context: DC=ForestDnsZones,DC=icmcpk,DC=local Error NT AUTHORITY\ENTERPRISE DOMAIN CONTROLLERS doesn't have Replicating Directory Changes In Filtered Set access rights for the naming context: DC=DomainDnsZones,DC=icmcpk,DC=local ......................... FILE-SERVER failed test NCSecDesc Starting test: NetLogons Unable to connect to the NETLOGON share! (\\FILE-SERVER\netlogon) [FILE-SERVER] An net use or LsaPolicy operation failed with error 67, The network name cannot be found.. ......................... FILE-SERVER failed test NetLogons Starting test: ObjectsReplicated ......................... FILE-SERVER passed test ObjectsReplicated Starting test: Replications ......................... FILE-SERVER passed test Replications Starting test: RidManager ......................... FILE-SERVER passed test RidManager Starting test: Services ......................... FILE-SERVER passed test Services Starting test: SystemLog An Error Event occurred. EventID: 0x00000469 Time Generated: 05/04/2012 14:01:10 Event String: The processing of Group Policy failed because of lack of network con nectivity to a domain controller. This may be a transient condition. A success m essage would be generated once the machine gets connected to the domain controll er and Group Policy has succesfully processed. If you do not see a success messa ge for several hours, then contact your administrator. An Warning Event occurred. EventID: 0x8000A001 Time Generated: 05/04/2012 14:07:11 Event String: The Security System could not establish a secured connection with th e server ldap/icmcpk.local/[email protected]. No authentication protocol was available. An Warning Event occurred. EventID: 0x00000BBC Time Generated: 05/04/2012 14:30:34 Event String: Windows Defender Real-Time Protection agent has detected changes. Mi crosoft recommends you analyze the software that made these changes for potentia l risks. You can use information about how these programs operate to choose whet her to allow them to run or remove them from your computer. Allow changes only if you trust the program or the software publisher. Windows Defender can't undo changes that you allow. An Warning Event occurred. EventID: 0x00000BBC Time Generated: 05/04/2012 14:30:36 Event String: Windows Defender Real-Time Protection agent has detected changes. Mi crosoft recommends you analyze the software that made these changes for potentia l risks. You can use information about how these programs operate to choose whet her to allow them to run or remove them from your computer. Allow changes only if you trust the program or the software publisher. Windows Defender can't undo changes that you allow. ......................... FILE-SERVER failed test SystemLog Starting test: VerifyReferences ......................... 
FILE-SERVER passed test VerifyReferences Running partition tests on : ForestDnsZones Starting test: CheckSDRefDom ......................... ForestDnsZones passed test CheckSDRefDom Starting test: CrossRefValidation ......................... ForestDnsZones passed test CrossRefValidation Running partition tests on : DomainDnsZones Starting test: CheckSDRefDom ......................... DomainDnsZones passed test CheckSDRefDom Starting test: CrossRefValidation ......................... DomainDnsZones passed test CrossRefValidation Running partition tests on : Schema Starting test: CheckSDRefDom ......................... Schema passed test CheckSDRefDom Starting test: CrossRefValidation ......................... Schema passed test CrossRefValidation Running partition tests on : Configuration Starting test: CheckSDRefDom ......................... Configuration passed test CheckSDRefDom Starting test: CrossRefValidation ......................... Configuration passed test CrossRefValidation Running partition tests on : icmcpk Starting test: CheckSDRefDom ......................... icmcpk passed test CheckSDRefDom Starting test: CrossRefValidation ......................... icmcpk passed test CrossRefValidation Running enterprise tests on : icmcpk.local Starting test: LocatorCheck ......................... icmcpk.local passed test LocatorCheck Starting test: Intersite ......................... icmcpk.local passed test Intersite --------------------- WDS-Server --------------------- C:\Users\Administrator.ICMCPK>dcdiag Directory Server Diagnosis Performing initial setup: Trying to find home server... Home Server = WDS-SERVER * Identified AD Forest. Done gathering initial info. Doing initial required tests Testing server: Default-First-Site-Name\WDS-SERVER Starting test: Connectivity ......................... WDS-SERVER passed test Connectivity Doing primary tests Testing server: Default-First-Site-Name\WDS-SERVER Starting test: Advertising Warning: DsGetDcName returned information for \\Server.icmcpk.local, when we were trying to reach WDS-SERVER. SERVER IS NOT RESPONDING or IS NOT CONSIDERED SUITABLE. ......................... WDS-SERVER failed test Advertising Starting test: FrsEvent There are warning or error events within the last 24 hours after the SYSVOL has been shared. Failing SYSVOL replication problems may cause Group Policy problems. ......................... WDS-SERVER passed test FrsEvent Starting test: DFSREvent ......................... WDS-SERVER passed test DFSREvent Starting test: SysVolCheck ......................... WDS-SERVER passed test SysVolCheck Starting test: KccEvent ......................... WDS-SERVER passed test KccEvent Starting test: KnowsOfRoleHolders ......................... WDS-SERVER passed test KnowsOfRoleHolders Starting test: MachineAccount ......................... WDS-SERVER passed test MachineAccount Starting test: NCSecDesc Error NT AUTHORITY\ENTERPRISE DOMAIN CONTROLLERS doesn't have Replicating Directory Changes In Filtered Set access rights for the naming context: DC=ForestDnsZones,DC=icmcpk,DC=local Error NT AUTHORITY\ENTERPRISE DOMAIN CONTROLLERS doesn't have Replicating Directory Changes In Filtered Set access rights for the naming context: DC=DomainDnsZones,DC=icmcpk,DC=local ......................... WDS-SERVER failed test NCSecDesc Starting test: NetLogons Unable to connect to the NETLOGON share! (\\WDS-SERVER\netlogon) [WDS-SERVER] An net use or LsaPolicy operation failed with error 67, The network name cannot be found.. ......................... 
WDS-SERVER failed test NetLogons Starting test: ObjectsReplicated ......................... WDS-SERVER passed test ObjectsReplicated Starting test: Replications ......................... WDS-SERVER passed test Replications Starting test: RidManager ......................... WDS-SERVER passed test RidManager Starting test: Services ......................... WDS-SERVER passed test Services Starting test: SystemLog An Error Event occurred. EventID: 0x0000041E Time Generated: 05/04/2012 14:02:55 Event String: The processing of Group Policy failed. Windows could not obtain the name of a domain controller. This could be caused by a name resolution failure. Verify your Domain Name Sysytem (DNS) is configured and working correctly. An Error Event occurred. EventID: 0x0000041E Time Generated: 05/04/2012 14:08:33 Event String: The processing of Group Policy failed. Windows could not obtain the name of a domain controller. This could be caused by a name resolution failure. Verify your Domain Name Sysytem (DNS) is configured and working correctly. ......................... WDS-SERVER failed test SystemLog Starting test: VerifyReferences ......................... WDS-SERVER passed test VerifyReferences Running partition tests on : ForestDnsZones Starting test: CheckSDRefDom ......................... ForestDnsZones passed test CheckSDRefDom Starting test: CrossRefValidation ......................... ForestDnsZones passed test CrossRefValidation Running partition tests on : DomainDnsZones Starting test: CheckSDRefDom ......................... DomainDnsZones passed test CheckSDRefDom Starting test: CrossRefValidation ......................... DomainDnsZones passed test CrossRefValidation Running partition tests on : Schema Starting test: CheckSDRefDom ......................... Schema passed test CheckSDRefDom Starting test: CrossRefValidation ......................... Schema passed test CrossRefValidation Running partition tests on : Configuration Starting test: CheckSDRefDom ......................... Configuration passed test CheckSDRefDom Starting test: CrossRefValidation ......................... Configuration passed test CrossRefValidation Running partition tests on : icmcpk Starting test: CheckSDRefDom ......................... icmcpk passed test CheckSDRefDom Starting test: CrossRefValidation ......................... icmcpk passed test CrossRefValidation Running enterprise tests on : icmcpk.local Starting test: LocatorCheck ......................... icmcpk.local passed test LocatorCheck Starting test: Intersite ......................... icmcpk.local passed test Intersite
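    As a general sanity check (standard Windows Server tooling, nothing specific to this environment): the dcdiag output above shows both new DCs failing the Advertising and NetLogons tests, so the domain is still leaning on the old server; before powering it off, the role holders, DNS and replication health can be confirmed from any DC with the following commands.

    ```
    rem Lists the current holder of all five operations master roles
    netdom query fsmo

    rem Checks DNS registration and resolution for the domain controllers
    dcdiag /test:dns /v

    rem Summarises replication health between the DCs
    repadmin /replsummary
    ```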

    Read the article

  • Folders in SQL Server Data Tools

    - by jamiet
    Recently I have begun a new project in which I am using SQL Server Data Tools (SSDT) and SQL Server Integration Services (SSIS) 2012. Although I have been using SSDT & SSIS fairly extensively while SQL Server 2012 was in the beta phase I usually find that you don’t learn about the capabilities and quirks of new products until you use them on a real project, hence I am hoping I’m going to have a lot of experiences to share on my blog over the coming few weeks. In this first such blog post I want to talk about file and folder organisation in SSDT. The predecessor to SSDT is Visual Studio Database Projects. When one created a new Visual Studio Database Project a folder structure was provided with “Schema Objects” and “Scripts” in the root and a series of subfolders for each schema: Apparently a few customers were not too happy with the tool arbitrarily creating lots of folders in Solution Explorer and hence SSDT has gone in completely the opposite direction; now no folders are created and new objects will get created in the root – it is at your discretion where they get moved to: After using SSDT for a few weeks I can safely say that I preferred the older way because I never used Solution Explorer to navigate my schema objects anyway so it didn’t bother me how many folders it created. Having said that the thought of a single long list of files in Solution Explorer without any folders makes me shudder so on this project I have been manually creating folders in which to organise files and I have tried to mimic the old way as much as possible by creating two folders in the root, one for all schema objects and another for Pre/Post deployment scripts: This works fine until different developers start to build their own different subfolder structures; if you are OCD-inclined like me this is going to grate on you eventually and hence you are going to want to move stuff around so that you have consistent folder structures for each schema and (if you have multiple databases) each project. Moreover new files get created with a filename of the object name + “.sql” and often people like to have an extra identifier in the filename to indicate the object type: The overall point is this – files and folders in your solution are going to change. Some version control systems (VCSs) don’t take kindly to files being moved around or renamed because they recognise the renamed/moved file simply as a new file and when they do that you lose the revision history which, to my mind, is one of the key benefits of using a VCS in the first place. On this project we have been using Team Foundation Server (TFS) and while it pains me to say it (as I am no great fan of TFS’s version control system) it has proved invaluable when dealing with the SSDT problems that I outlined above because it is integrated right into the Visual Studio IDE. Thus the advice from this blog post is: If you are using SSDT consider using an Visual-Studio-integrated VCS that can easily handle file renames and file moves I suspect that fans of other VCSs will counter by saying that their VCS weapon of choice can handle renames/file moves quite satisfactorily and if that’s the case…great…let me know about them in the comments. This blog post is not an attempt to make people use one particular VCS, only to make people aware of this issue that might rise when using SSDT. More to come in the coming few weeks! @jamiet

    Read the article

  • Use MTOM/streaming from C# calling a webservice in java exposed via jaxws

    - by raticulin
    We have this webservice created with jax-ws @WebService(name = "Mywebser", targetNamespace = "http://namespace") @MTOM(threshold = 2048) @SOAPBinding(style = SOAPBinding.Style.DOCUMENT, use = SOAPBinding.Use.LITERAL, parameterStyle = SOAPBinding.ParameterStyle.WRAPPED) public class Mywebser { @WebMethod(operationName = "doStreaming", action = "urn:doStreaming") @WebResult(name = "return") public ResultInfo doStreaming(String username, String pwd, @XmlMimeType("application/octet-stream") DataHandler data, boolean overw){ ... } } The generated client side looks like this: @WebMethod(action = "urn:doStreaming") @WebResult(targetNamespace = "") @RequestWrapper(localName = "doStreaming", targetNamespace = "http://namespace", className = "com.mypack.client.doStreaming") @ResponseWrapper(localName = "doStreamingResponse", targetNamespace = "http://namespace", className = "com.mypack.client.doStreamingResponse") public ResultInfo doStreaming( @WebParam(name = "arg0", targetNamespace = "") String arg0, @WebParam(name = "arg1", targetNamespace = "") String arg1, @WebParam(name = "arg2", targetNamespace = "") DataHandler arg2, @WebParam(name = "arg3", targetNamespace = "") boolean arg3); By using it this way it uses streaming properly (verified we can pass an argument of 80mb when the jvm had less allowed. MywebserService serv = ...; Mywebser wso = serv.getMywebserPort(new MTOMFeature()); Map<String, Object> ctxt = ((BindingProvider) wso).getRequestContext(); ctxt.put(JAXWSProperties.HTTP_CLIENT_STREAMING_CHUNK_SIZE, 8192); DataHandler dataHandler = new DataHandler(new FileDataSource("c:\\temp\\A.dat")); arcres = wso.doStreaming("a", "b", dataHandler, true); We generate a clienet for .net, with VS2008, using "Add Web Reference", we get this C# code: [System.Web.Services.Protocols.SoapDocumentMethodAttribute("urn:doStreaming",RequestNamespace="http://namespace",ResponseNamespace="http://namespace",Use=System.Web.Services.Description.SoapBindingUse.Literal,ParameterStyle=System.Web.Services.Protocols.SoapParameterStyle.Wrapped)] [return: System.Xml.Serialization.XmlElementAttribute("return",Form=System.Xml.Schema.XmlSchemaForm.Unqualified)] public ResultInfo doStreaming( [System.Xml.Serialization.XmlElementAttribute(Form=System.Xml.Schema.XmlSchemaForm.Unqualified)] string arg0, [System.Xml.Serialization.XmlElementAttribute(Form=System.Xml.Schema.XmlSchemaForm.Unqualified)] string arg1, [System.Xml.Serialization.XmlElementAttribute(Form=System.Xml.Schema.XmlSchemaForm.Unqualified,DataType="base64Binary")] byte[] arg2, [System.Xml.Serialization.XmlElementAttribute(Form=System.Xml.Schema.XmlSchemaForm.Unqualified)] bool arg3) Apparently this is not using streaming? The type base64Binary of arg2 seems not the right one? In java it's a DataHandler. By testing it with low memory on the java side we can see it is not using streaming as it fails with OOM. Does someone knows if this is possible, and if so how? Our environment: server: jdk1.6, jaxws 2.1.7 client: C# 2.0, visual studio 2008
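    One observation, offered as a sketch rather than a verified fix: the proxy produced by "Add Web Reference" is a legacy ASMX-style client, which maps the DataHandler parameter to an in-memory byte[] and does not negotiate MTOM. Generating the client with "Add Service Reference" (WCF) instead allows MTOM and streaming to be turned on at the binding; the proxy class name and URL below are placeholders.

    ```csharp
    using System.ServiceModel;

    class MtomClientSetup
    {
        static void Main()
        {
            var binding = new BasicHttpBinding
            {
                MessageEncoding = WSMessageEncoding.Mtom,   // send the payload as an MTOM attachment
                TransferMode = TransferMode.Streamed,       // stream the request instead of buffering it
                MaxReceivedMessageSize = int.MaxValue
            };
            var endpoint = new EndpointAddress("http://server:8080/MywebserService"); // placeholder URL

            // var client = new MywebserClient(binding, endpoint);   // generated proxy (placeholder name)
            // client.doStreaming("a", "b", streamedPayload, true);  // hypothetical call
        }
    }
    ```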

    Read the article

  • NullPointerException in com.sun.tools.jxc.SchemaGenTask

    - by David Collie
    Given this ant script: <?xml version="1.0" encoding="UTF-8"?> <project name="projectname" default="generate-schema" basedir="."> <taskdef name="schemagen" classname="com.sun.tools.jxc.SchemaGenTask"> <classpath> <fileset dir="../BuildJars/lib" includes="*.jar" /> </classpath> </taskdef> <target name="generate-schema"> <schemagen srcdir="src/gb/informaticasystems/messages" destdir="schema"> <schema namespace="http://www.informatica-systems.co.uk/aquarius/messages/1.0" file="messages-1.0.xsd" /> </schemagen> </target> </project> I am getting this error: Buildfile: C:\Users\davidcollie\workspace\aquarius-feature\AquariusServerLibrary\schemagen.xml generate-schema: [schemagen] Generating schema from 4 source files [schemagen] Problem encountered during annotation processing; [schemagen] see stacktrace below for more information. [schemagen] java.lang.NullPointerException [schemagen] at com.sun.tools.jxc.model.nav.APTNavigator$2.onDeclaredType(APTNavigator.java:428) [schemagen] at com.sun.tools.jxc.model.nav.APTNavigator$2.onClassType(APTNavigator.java:402) [schemagen] at com.sun.tools.jxc.model.nav.APTNavigator$2.onClassType(APTNavigator.java:456) [schemagen] at com.sun.istack.tools.APTTypeVisitor.apply(APTTypeVisitor.java:27) [schemagen] at com.sun.tools.jxc.model.nav.APTNavigator.getBaseClass(APTNavigator.java:109) [schemagen] at com.sun.tools.jxc.model.nav.APTNavigator.getBaseClass(APTNavigator.java:85) [schemagen] at com.sun.xml.bind.v2.model.impl.PropertyInfoImpl.getIndividualType(PropertyInfoImpl.java:190) [schemagen] at com.sun.xml.bind.v2.model.impl.PropertyInfoImpl.<init>(PropertyInfoImpl.java:132) [schemagen] at com.sun.xml.bind.v2.model.impl.MapPropertyInfoImpl.<init>(MapPropertyInfoImpl.java:67) [schemagen] at com.sun.xml.bind.v2.model.impl.ClassInfoImpl.createMapProperty(ClassInfoImpl.java:917) [schemagen] at com.sun.xml.bind.v2.model.impl.ClassInfoImpl.addProperty(ClassInfoImpl.java:874) [schemagen] at com.sun.xml.bind.v2.model.impl.ClassInfoImpl.findGetterSetterProperties(ClassInfoImpl.java:993) [schemagen] at com.sun.xml.bind.v2.model.impl.ClassInfoImpl.getProperties(ClassInfoImpl.java:303) [schemagen] at com.sun.xml.bind.v2.model.impl.ModelBuilder.getClassInfo(ModelBuilder.java:243) [schemagen] at com.sun.xml.bind.v2.model.impl.ModelBuilder.getClassInfo(ModelBuilder.java:209) [schemagen] at com.sun.xml.bind.v2.model.impl.ModelBuilder.getTypeInfo(ModelBuilder.java:315) [schemagen] at com.sun.xml.bind.v2.model.impl.ModelBuilder.getTypeInfo(ModelBuilder.java:330) [schemagen] at com.sun.tools.xjc.api.impl.j2s.JavaCompilerImpl.bind(JavaCompilerImpl.java:90) [schemagen] at com.sun.tools.jxc.apt.SchemaGenerator$1.process(SchemaGenerator.java:115) [schemagen] at com.sun.mirror.apt.AnnotationProcessors$CompositeAnnotationProcessor.process(AnnotationProcessors.java:60) [schemagen] at com.sun.tools.apt.comp.Apt.main(Apt.java:454) [schemagen] at com.sun.tools.apt.main.JavaCompiler.compile(JavaCompiler.java:258) [schemagen] at com.sun.tools.apt.main.Main.compile(Main.java:1102) [schemagen] at com.sun.tools.apt.main.Main.compile(Main.java:964) [schemagen] at com.sun.tools.apt.Main.processing(Main.java:95) [schemagen] at com.sun.tools.apt.Main.process(Main.java:85) [schemagen] at com.sun.tools.apt.Main.process(Main.java:67) [schemagen] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [schemagen] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) [schemagen] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) 
[schemagen] at java.lang.reflect.Method.invoke(Method.java:597) [schemagen] at com.sun.tools.jxc.AptBasedTask$InternalAptAdapter.execute(AptBasedTask.java:97) [schemagen] at com.sun.tools.jxc.AptBasedTask.compile(AptBasedTask.java:144) [schemagen] at org.apache.tools.ant.taskdefs.Javac.execute(Javac.java:820) [schemagen] at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288) [schemagen] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [schemagen] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) [schemagen] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) [schemagen] at java.lang.reflect.Method.invoke(Method.java:597) [schemagen] at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:105) [schemagen] at org.apache.tools.ant.Task.perform(Task.java:348) [schemagen] at org.apache.tools.ant.Target.execute(Target.java:357) [schemagen] at org.apache.tools.ant.Target.performTasks(Target.java:385) [schemagen] at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1329) [schemagen] at org.apache.tools.ant.Project.executeTarget(Project.java:1298) [schemagen] at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41) [schemagen] at org.eclipse.ant.internal.ui.antsupport.EclipseDefaultExecutor.executeTargets(EclipseDefaultExecutor.java:32) [schemagen] at org.apache.tools.ant.Project.executeTargets(Project.java:1181) [schemagen] at org.eclipse.ant.internal.ui.antsupport.InternalAntRunner.run(InternalAntRunner.java:423) [schemagen] at org.eclipse.ant.internal.ui.antsupport.InternalAntRunner.main(InternalAntRunner.java:137) BUILD FAILED C:\Users\davidcollie\workspace\aquarius-feature\AquariusServerLibrary\schemagen.xml:11: schema generation failed Total time: 1 second Any ideas?
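    From the trace alone this is guesswork, but the failure path runs through MapPropertyInfoImpl and APTNavigator.getBaseClass, which suggests the apt-based schemagen is stumbling over a Map-typed bean property in one of the four source files (generic or wildcard-typed Map properties have tripped up older JAXB RI builds). A cheap experiment is to mark the suspect getter @XmlTransient and rerun the target; the class below is a hypothetical illustration of that workaround, not code from this build.

    import java.util.Map;
    import javax.xml.bind.annotation.XmlRootElement;
    import javax.xml.bind.annotation.XmlTransient;

    @XmlRootElement
    public class MessageEnvelope {                 // hypothetical class standing in for one of the four sources
        private Map<String, Object> headers;

        @XmlTransient                              // exclude the Map property so schemagen never analyses it
        public Map<String, Object> getHeaders() {
            return headers;
        }

        public void setHeaders(Map<String, Object> headers) {
            this.headers = headers;
        }
    }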

    Read the article

  • OpenLDAP and SSL

    - by Stormshadow
    I am having trouble trying to connect to a secure OpenLDAP server which I have set up. On running my LDAP client code java -Djavax.net.debug=ssl LDAPConnector I get the following exception trace (java version 1.6.0_17) trigger seeding of SecureRandom done seeding SecureRandom %% No cached client session *** ClientHello, TLSv1 RandomCookie: GMT: 1256110124 bytes = { 224, 19, 193, 148, 45, 205, 108, 37, 101, 247, 112, 24, 157, 39, 111, 177, 43, 53, 206, 224, 68, 165, 55, 185, 54, 203, 43, 91 } Session ID: {} Cipher Suites: [SSL_RSA_WITH_RC4_128_MD5, SSL_RSA_WITH_RC4_128_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA, SSL_RSA_W ITH_3DES_EDE_CBC_SHA, SSL_DHE_RSA_WITH_3DES_EDE_CBC_SHA, SSL_DHE_DSS_WITH_3DES_EDE_CBC_SHA, SSL_RSA_WITH_DES_CBC_SHA, SSL_DHE_RSA_WITH_DES_CBC_SHA, SSL_DHE_DSS_WITH_DES_CBC_SH A, SSL_RSA_EXPORT_WITH_RC4_40_MD5, SSL_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA, SSL_DHE_DSS_EXPORT_WITH_DES40_CBC_SHA] Compression Methods: { 0 } *** Thread-0, WRITE: TLSv1 Handshake, length = 73 Thread-0, WRITE: SSLv2 client hello message, length = 98 Thread-0, received EOFException: error Thread-0, handling exception: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake Thread-0, SEND TLSv1 ALERT: fatal, description = handshake_failure Thread-0, WRITE: TLSv1 Alert, length = 2 Thread-0, called closeSocket() main, handling exception: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake javax.naming.CommunicationException: simple bind failed: ldap.natraj.com:636 [Root exception is javax.net.ssl.SSLHandshakeException: Remote host closed connection during hands hake] at com.sun.jndi.ldap.LdapClient.authenticate(Unknown Source) at com.sun.jndi.ldap.LdapCtx.connect(Unknown Source) at com.sun.jndi.ldap.LdapCtx.<init>(Unknown Source) at com.sun.jndi.ldap.LdapCtxFactory.getUsingURL(Unknown Source) at com.sun.jndi.ldap.LdapCtxFactory.getUsingURLs(Unknown Source) at com.sun.jndi.ldap.LdapCtxFactory.getLdapCtxInstance(Unknown Source) at com.sun.jndi.ldap.LdapCtxFactory.getInitialContext(Unknown Source) at javax.naming.spi.NamingManager.getInitialContext(Unknown Source) at javax.naming.InitialContext.getDefaultInitCtx(Unknown Source) at javax.naming.InitialContext.init(Unknown Source) at javax.naming.InitialContext.<init>(Unknown Source) at javax.naming.directory.InitialDirContext.<init>(Unknown Source) at LDAPConnector.CallSecureLDAPServer(LDAPConnector.java:43) at LDAPConnector.main(LDAPConnector.java:237) Caused by: javax.net.ssl.SSLHandshakeException: Remote host closed connection during handshake at com.sun.net.ssl.internal.ssl.SSLSocketImpl.readRecord(Unknown Source) at com.sun.net.ssl.internal.ssl.SSLSocketImpl.performInitialHandshake(Unknown Source) at com.sun.net.ssl.internal.ssl.SSLSocketImpl.readDataRecord(Unknown Source) at com.sun.net.ssl.internal.ssl.AppInputStream.read(Unknown Source) at java.io.BufferedInputStream.fill(Unknown Source) at java.io.BufferedInputStream.read1(Unknown Source) at java.io.BufferedInputStream.read(Unknown Source) at com.sun.jndi.ldap.Connection.run(Unknown Source) at java.lang.Thread.run(Unknown Source) Caused by: java.io.EOFException: SSL peer shut down incorrectly at com.sun.net.ssl.internal.ssl.InputRecord.read(Unknown Source) ... 
9 more I am able to connect to the same secure LDAP server however if I use another version of java (1.6.0_14) I have created and installed the server certificates in the cacerts of both the JRE's as mentioned in this guide -- OpenLDAP with SSL When I run ldapsearch -x on the server I get # extended LDIF # # LDAPv3 # base <dc=localdomain> (default) with scope subtree # filter: (objectclass=*) # requesting: ALL # # localdomain dn: dc=localdomain objectClass: top objectClass: dcObject objectClass: organization o: localdomain dc: localdomain # admin, localdomain dn: cn=admin,dc=localdomain objectClass: simpleSecurityObject objectClass: organizationalRole cn: admin description: LDAP administrator # search result search: 2 result: 0 Success # numResponses: 3 # numEntries: 2 On running openssl s_client -connect ldap.natraj.com:636 -showcerts , I obtain the self signed certificate. My slapd.conf file is as follows ####################################################################### # Global Directives: # Features to permit #allow bind_v2 # Schema and objectClass definitions include /etc/ldap/schema/core.schema include /etc/ldap/schema/cosine.schema include /etc/ldap/schema/nis.schema include /etc/ldap/schema/inetorgperson.schema # Where the pid file is put. The init.d script # will not stop the server if you change this. pidfile /var/run/slapd/slapd.pid # List of arguments that were passed to the server argsfile /var/run/slapd/slapd.args # Read slapd.conf(5) for possible values loglevel none # Where the dynamically loaded modules are stored modulepath /usr/lib/ldap moduleload back_hdb # The maximum number of entries that is returned for a search operation sizelimit 500 # The tool-threads parameter sets the actual amount of cpu's that is used # for indexing. tool-threads 1 ####################################################################### # Specific Backend Directives for hdb: # Backend specific directives apply to this backend until another # 'backend' directive occurs backend hdb ####################################################################### # Specific Backend Directives for 'other': # Backend specific directives apply to this backend until another # 'backend' directive occurs #backend <other> ####################################################################### # Specific Directives for database #1, of type hdb: # Database specific directives apply to this databasse until another # 'database' directive occurs database hdb # The base of your directory in database #1 suffix "dc=localdomain" # rootdn directive for specifying a superuser on the database. This is needed # for syncrepl. rootdn "cn=admin,dc=localdomain" # Where the database file are physically stored for database #1 directory "/var/lib/ldap" # The dbconfig settings are used to generate a DB_CONFIG file the first # time slapd starts. They do NOT override existing an existing DB_CONFIG # file. You should therefore change these settings in DB_CONFIG directly # or remove DB_CONFIG and restart slapd for changes to take effect. # For the Debian package we use 2MB as default but be sure to update this # value if you have plenty of RAM dbconfig set_cachesize 0 2097152 0 # Sven Hartge reported that he had to set this value incredibly high # to get slapd running at all. See http://bugs.debian.org/303057 for more # information. # Number of objects that can be locked at the same time. 
dbconfig set_lk_max_objects 1500 # Number of locks (both requested and granted) dbconfig set_lk_max_locks 1500 # Number of lockers dbconfig set_lk_max_lockers 1500 # Indexing options for database #1 index objectClass eq # Save the time that the entry gets modified, for database #1 lastmod on # Checkpoint the BerkeleyDB database periodically in case of system # failure and to speed slapd shutdown. checkpoint 512 30 # Where to store the replica logs for database #1 # replogfile /var/lib/ldap/replog # The userPassword by default can be changed # by the entry owning it if they are authenticated. # Others should not be able to see it, except the # admin entry below # These access lines apply to database #1 only access to attrs=userPassword,shadowLastChange by dn="cn=admin,dc=localdomain" write by anonymous auth by self write by * none # Ensure read access to the base for things like # supportedSASLMechanisms. Without this you may # have problems with SASL not knowing what # mechanisms are available and the like. # Note that this is covered by the 'access to *' # ACL below too but if you change that as people # are wont to do you'll still need this if you # want SASL (and possible other things) to work # happily. access to dn.base="" by * read # The admin dn has full write access, everyone else # can read everything. access to * by dn="cn=admin,dc=localdomain" write by * read # For Netscape Roaming support, each user gets a roaming # profile for which they have write access to #access to dn=".*,ou=Roaming,o=morsnet" # by dn="cn=admin,dc=localdomain" write # by dnattr=owner write ####################################################################### # Specific Directives for database #2, of type 'other' (can be hdb too): # Database specific directives apply to this databasse until another # 'database' directive occurs #database <other> # The base of your directory for database #2 #suffix "dc=debian,dc=org" ####################################################################### # SSL: # Uncomment the following lines to enable SSL and use the default # snakeoil certificates. #TLSCertificateFile /etc/ssl/certs/ssl-cert-snakeoil.pem #TLSCertificateKeyFile /etc/ssl/private/ssl-cert-snakeoil.key TLSCipherSuite TLS_RSA_AES_256_CBC_SHA TLSCACertificateFile /etc/ldap/ssl/server.pem TLSCertificateFile /etc/ldap/ssl/server.pem TLSCertificateKeyFile /etc/ldap/ssl/server.pem My ldap.conf file is # # LDAP Defaults # # See ldap.conf(5) for details # This file should be world readable but not world writable. HOST ldap.natraj.com PORT 636 BASE dc=localdomain URI ldaps://ldap.natraj.com TLS_CACERT /etc/ldap/ssl/server.pem TLS_REQCERT allow #SIZELIMIT 12 #TIMELIMIT 15 #DEREF never
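    One hedged observation from the debug output above: the ClientHello offers only AES-128, 3DES, DES and RC4 suites, while slapd.conf pins TLSCipherSuite to TLS_RSA_AES_256_CBC_SHA. If that directive really restricts the server to AES-256 and the JRE lacks the unlimited-strength JCE policy files (so it never offers an AES-256 suite), the server would drop the handshake exactly as shown, and a difference between the two JRE installations could explain why one works and the other does not. That is a hypothesis to verify, not a confirmed diagnosis. Separately, a minimal sketch of pointing the JNDI client explicitly at a truststore containing the self-signed certificate (truststore path, password and bind credentials are placeholders):

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.directory.DirContext;
    import javax.naming.directory.InitialDirContext;

    public class SecureLdapConnect {
        public static DirContext connect() throws Exception {
            // Placeholder truststore: a JKS keystore into which the server certificate was imported.
            System.setProperty("javax.net.ssl.trustStore", "/etc/ldap/ssl/client-truststore.jks");
            System.setProperty("javax.net.ssl.trustStorePassword", "changeit");

            Hashtable<String, String> env = new Hashtable<String, String>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldaps://ldap.natraj.com:636");
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, "cn=admin,dc=localdomain");
            env.put(Context.SECURITY_CREDENTIALS, "secret");        // placeholder
            return new InitialDirContext(env);                      // fails here if the TLS handshake is refused
        }
    }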

    Read the article

  • JSON-RPC and Json-rpc service discovery specifications

    - by Artyom
    Hello, I'm going to implement a JSON-RPC web service and I need specifications for it. So far I have found only one resource that can be called a real specification: JSON-RPC 1.0 http://json-rpc.org/wiki/specification Proposal of JSON-RPC 2.0: http://groups.google.com/group/json-rpc/web/json-rpc-2-0 (why is it on google groups?) However I've seen that JavaScript frameworks like Dojo actively use the JSON-RPC SMD (Service Mapping Description) proposal, but it requires the JSON Schema specification, and it redirects to an incorrect URL as a reference. So far I have found the following: http://tools.ietf.org/html/draft-zyp-json-schema-02 And it is still a draft... Can anybody point me to some actual specifications... at least something official and updated? Because it looks like implementing JSON-RPC 1.0 as-is may not be enough, at least for frameworks like Dojo. Or am I wrong? Questions: Would implementing the JSON-RPC 1.0 specification be enough to provide a JSON-RPC service for most modern clients, and how many clients are there (if any at all) that actually support capabilities beyond JSON-RPC 1.0 (SMD, Schema, 2.0)? Because it looks like JSON-RPC 1.0 is the only one that has official specifications (and not a draft). If I should implement SMD, or it is recommended, can somebody point to the official, most recent specifications of JSON Schema and Service Mapping Description, or are the links I found really "the specifications"? Are the JSON-RPC 2.0, SMD and JSON Schema drafts stable enough to implement them? Note: do not suggest existing JSON-RPC service implementations. Anybody?
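    For orientation, the difference between the envelopes under discussion is small; below is a minimal, hand-rolled sketch (made-up method and params, plain string formatting rather than any particular JSON library) of what the 1.0 and 2.0 wire formats look like.

    // JSON-RPC 1.0 request:  {"method": "echo", "params": ["hi"], "id": 1}
    // JSON-RPC 2.0 request:  {"jsonrpc": "2.0", "method": "echo", "params": ["hi"], "id": 1}
    // JSON-RPC 2.0 error:    {"jsonrpc": "2.0", "error": {"code": -32601, "message": "Method not found"}, "id": 1}
    public class JsonRpcEnvelope {
        static String request20(String method, String paramsJson, int id) {
            return String.format(
                "{\"jsonrpc\":\"2.0\",\"method\":\"%s\",\"params\":%s,\"id\":%d}",
                method, paramsJson, id);
        }

        public static void main(String[] args) {
            System.out.println(request20("echo", "[\"hi\"]", 1));
        }
    }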

    Read the article

  • PHP XML Validation

    - by efritz
    What's the best way to validate an XML file (or a portion of it) against multiple XSD files? For example, I have the following schema for a configuration loader: <xsd:schema xmlns="http://www.kauriproject.org/schema/configuration" xmlns:xsd="http://www.w3.org/2001/XMLSchema" targetNamespace="http://www.kauriproject.org/schema/configuration" elementFormDefault="qualified"> <xsd:element name="configuration" type="configuration" /> <xsd:complexType name="configuration"> <xsd:choice maxOccurs="unbounded"> <xsd:element name="import" type="import" minOccurs="0" maxOccurs="unbounded" /> <xsd:element name="section" type="section" /> </xsd:choice> </xsd:complexType> <xsd:complexType name="section"> <xsd:sequence> <xsd:any minOccurs="0" maxOccurs="unbounded" processContents="lax" /> </xsd:sequence> <xsd:attribute name="name" type="xsd:string" use="required" /> <xsd:attribute name="type" type="xsd:string" use="required" /> </xsd:complexType> <xsd:complexType name="import" mixed="true"> <xsd:attribute name="resource" type="xsd:string" /> </xsd:complexType> </xsd:schema> As the Configuration class exists now, it lets one add a <section> tag with a defined concrete parser class (much like custom configuration sections in ASP.NET). However, I'm unsure of how to validate the section being parsed. Is it possible to validate just this section with an XSD file/string without writing it back to a file?

    Read the article

  • Membership provider stopped working on using Membership Provider in ASP.Net MVC

    - by Michael Meadows
    I have a production ASP.Net MVC application that has been using the membership service. It has worked fine up until today. Nothing has changed on the server, but I now have this error: System.Configuration.Provider.ProviderException: The 'System.Web.Security.SqlMembershipProvider' requires a database schema compatible with schema version '1'. However, the current database schema is not compatible with this version. You may need to either install a compatible schema with aspnet_regsql.exe (available in the framework installation directory), or upgrade the provider to a newer version. I checked the asp_schemaversions table, and it looks fine. Any help would be appreciated.

    Read the article

  • Is it possible to handle User Defined Exception using JAX WS Dispatch API ?

    - by snowflake
    Hello, I'm performing dynamic webservices calls using the following code snippet: JAXBContext jc = getJAXBContext(requestClass, responseClass, jaxbContextExtraClasses); Dispatch<Object> dispatch = service.createDispatch(portQName, jc, Service.Mode.PAYLOAD); Object requestValue = getRequestValue(requestClass, pOrderedParameters); JAXBElement<?> request = new JAXBElement(new QName(serviceQNameStr, pOperationName), requestValue.getClass(), null, requestValue); Object tmpResponse = dispatch.invoke(request); Invocation works perfectly, except if I add a user-defined exception on the service (a basic UserException extending java.lang.Exception). First I get: javax.xml.bind.UnmarshalException: unexpected element (uri:"http://schemas.xmlsoap.org/soap/envelope/", local:"Fault"). Expected elements are <{http://my.namespace/}myMethod,<{http://my.namespace/}myResponse Then I added the UserException_Exception JAX-WS generated type to my JAXB context, and then get: Caused by: com.sun.xml.bind.v2.runtime.IllegalAnnotationsException: 1 counts of IllegalAnnotationExceptions java.lang.StackTraceElement does not have a no-arg default constructor. this problem is related to the following location: at java.lang.StackTraceElement at public java.lang.StackTraceElement[] java.lang.Throwable.getStackTrace() at java.lang.Throwable at java.lang.Exception The only solutions I have found are: dispatch a SOAP message directly and handle the SOAP fault directly (this is the way the JBoss JAX-WS implementation performs a standard JAX-WS call using service interfaces). This is not an acceptable solution for me; I want to use a high-level implementation (the closer I get to the SOAP message, the less dynamic my code can be). usage of JAXBRIContext.ANNOTATION_READER, which is implementation specific and not an acceptable solution for me, in order to annotate java.lang.Exception as @XmlTransient The service with a user-defined exception performs well using the standard JAX-WS generated client stubs and using a tool such as SoapUI. The problem occurs in deserialization of the message when I have no user-defined exception artifact in the JAXB context, and during invocation when I add those non-JAXB-compatible artifacts to the JAXB context. I'm using the JBoss WS webservice stack within JBoss 4.2.3.GA. Any elegant solution to this problem is welcome!
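    One avenue worth testing before modifying the JAXBContext at all: some JAX-WS stacks surface a returned <Fault> from dispatch.invoke() as a javax.xml.ws.soap.SOAPFaultException instead of handing the payload to JAXB, in which case the user-defined exception's detail can be read through the SAAJ API without registering any exception class. Whether JBossWS behaves this way here is exactly what the UnmarshalException above puts in doubt, so treat the following as a sketch to experiment with, not a confirmed fix.

    import java.util.Iterator;
    import javax.xml.soap.Detail;
    import javax.xml.soap.DetailEntry;
    import javax.xml.soap.SOAPFault;
    import javax.xml.ws.Dispatch;
    import javax.xml.ws.soap.SOAPFaultException;

    public class FaultAwareInvoker {
        public Object invoke(Dispatch<Object> dispatch, Object request) {
            try {
                return dispatch.invoke(request);
            } catch (SOAPFaultException e) {
                SOAPFault fault = e.getFault();
                System.err.println("faultcode:   " + fault.getFaultCode());
                System.err.println("faultstring: " + fault.getFaultString());
                Detail detail = fault.getDetail();              // the UserException data travels here, if present
                if (detail != null) {
                    for (Iterator<?> it = detail.getDetailEntries(); it.hasNext();) {
                        DetailEntry entry = (DetailEntry) it.next();
                        System.err.println("detail element: " + entry.getElementQName());
                    }
                }
                throw e;                                        // or translate into an application-level exception
            }
        }
    }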

    Read the article

  • spring3.0 mvc problem(The requested resource is not Available)

    - by Daniel
    Hi, I am a newbie to Spring MVC 3.0 and am trying to write a sample webapp to get a feel for it. I am able to get the URL to invoke the associated controller, but I am not able to forward the request from the controller to my jsp resource, as indicated by the output in the browser: The requested resource (/Spring30HelloWorld/helloworldcontroller) is not available. A word of advice on fixing the issue would be appreciated! Please refer below for my code setup. Thanks in advance! web.xml (location: /WebContent) <?xml version="1.0" encoding="UTF-8"?> <web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://java.sun.com/xml/ns/javaee" xmlns:web="http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd" id="WebApp_ID" version="2.5"> <display-name>Spring30HelloWorld</display-name> <servlet> <servlet-name>A</servlet-name> <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class> <load-on-startup>1</load-on-startup> </servlet> <servlet-mapping> <servlet-name>A</servlet-name> <url-pattern>*.htm</url-pattern> </servlet-mapping> <welcome-file-list> <welcome-file>index.htm</welcome-file> </welcome-file-list> </web-app> A-servlet.xml (location: /WebContent/WEB-INF/) <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:p="http://www.springframework.org/schema/p" xmlns:context="http://www.springframework.org/schema/context" xsi:schemaLocation=" http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd"> <context:component-scan base-package="com.controller" /> </beans> HelloWorldController.java (location: /src/com/controller) package com.controller; import org.springframework.stereotype.Controller; import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.portlet.ModelAndView; @Controller public class HelloWorldController { @RequestMapping("/helloWorld" ) public ModelAndView sayHello() { System.out.println("hello!"); //return new ModelAndView("helloworld.jsp", "hello", "hello"); return new ModelAndView("helloworld.jsp"); } } helloworld.jsp (location: /WebContent/) <html> <head> <title>Hello World</title> </head> <body> <h1>Simple Spring 3.0 Web App</h1> <p></p> </body> </html>
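    Two things in the snippets above look suspicious, and a hedged guess is that each contributes to the 404. First, the URL being requested (/Spring30HelloWorld/helloworldcontroller) matches neither the *.htm servlet mapping nor the /helloWorld request mapping, so /Spring30HelloWorld/helloWorld.htm is the address to try. Second, the controller imports org.springframework.web.portlet.ModelAndView (the portlet flavour) instead of org.springframework.web.servlet.ModelAndView, and returns the view name with its ".jsp" extension although no view resolver is declared. A corrected controller might look like the sketch below; it assumes an InternalResourceViewResolver with suffix ".jsp" has been added to A-servlet.xml, which is not shown in the question.

    package com.controller;

    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.servlet.ModelAndView;   // servlet flavour, not portlet

    @Controller
    public class HelloWorldController {

        // Reached as /Spring30HelloWorld/helloWorld.htm because the DispatcherServlet is mapped to *.htm
        @RequestMapping("/helloWorld")
        public ModelAndView sayHello() {
            System.out.println("hello!");
            // Logical view name only; the (assumed) view resolver appends ".jsp"
            return new ModelAndView("helloworld", "hello", "hello");
        }
    }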

    Read the article

  • ClearCase: Versioning at file level

    - by Naveen
    Hi, I have a peculiar problem about how to maintain a ClearCase project. This project is an XML schema repository where each schema has a version. This repository is common and is used by all the apps in the enterprise. From a ClearCase perspective the project has a single component. Now the apps can be using different versions of the schema(s), so we are trying to figure out a way to set up the project in such a way that a project can have a baseline of which versions of these files are included in a build. The only way we know how to do this is to create a component for each schema (or group of schemas) and create a stream for each app to include the components it uses. But that would result in too many components. Has anyone dealt with something like this before? We are prepared to restructure the whole project if necessary, so we are open to any ideas. Thanks for the help.

    Read the article

  • Does JAXWS client make difference between an empty collection and a null collection value as returne

    - by snowflake
    Since JAX-WS relies on JAXB, and since I have observed the code that unpacks the XML bean in the JAXB Reference Implementation, I guess the difference is not made and that a JAX-WS client always returns an empty collection, even if the webservice result was a null element: public T startPacking(BeanT bean, Accessor<BeanT, T> acc) throws AccessorException { T collection = acc.get(bean); if(collection==null) { collection = ClassFactory.create(implClass); if(!acc.isAdapted()) acc.set(bean,collection); } collection.clear(); return collection; } I agree that for best interoperability service contracts should be unambiguous and avoid such differences, but it seems that the JAX-WS service I'm invoking (hosted on a JBoss server with the JBossWS implementation) is returning, as expected, either null or an empty collection (tested with SoapUI). For my test I used code generated by wsimport. The return element is defined as: @XmlElement(name = "return", nillable = true) protected List<String> _return; I even tried changing the Response class getReturn method from: public List<String> getReturn() { if (_return == null) { _return = new ArrayList<String>(); } return this._return; } to public List<String> getReturn() { return this._return; } but without success. Any helpful information/comment regarding this problem is welcome!
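    A small standalone experiment makes it easy to see what survives unmarshalling at the pure JAXB level, independent of the web service stack. The element names below are made up; the expectation (worth verifying against your RI version) is that an absent element leaves a non-lazily-initialised field null, while a single xsi:nil item comes back as a one-element list containing null, and that it is the lazily-initialising generated getter which erases the first case.

    import java.io.StringReader;
    import java.util.List;
    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Unmarshaller;
    import javax.xml.bind.annotation.XmlAccessType;
    import javax.xml.bind.annotation.XmlAccessorType;
    import javax.xml.bind.annotation.XmlElement;
    import javax.xml.bind.annotation.XmlRootElement;

    public class NullVsEmptyCheck {

        @XmlRootElement(name = "resp")
        @XmlAccessorType(XmlAccessType.FIELD)
        static class Resp {
            @XmlElement(name = "return", nillable = true)
            List<String> items;                     // inspected directly, no lazy getter in the way
        }

        public static void main(String[] args) throws Exception {
            Unmarshaller u = JAXBContext.newInstance(Resp.class).createUnmarshaller();

            Resp absent = (Resp) u.unmarshal(new StringReader("<resp/>"));
            Resp nil = (Resp) u.unmarshal(new StringReader(
                    "<resp><return xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xsi:nil=\"true\"/></resp>"));

            System.out.println("absent -> " + absent.items);   // expected: null
            System.out.println("nil    -> " + nil.items);      // expected: [null]
        }
    }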

    Read the article

  • Adding custom filter in spring framework problem?

    - by user298768
    Hello there, I am trying to make a custom AuthenticationProcessingFilter to save some user data in the session after a successful login. Here's my filter: Code: package projects.internal; import java.io.IOException; import javax.servlet.http.HttpServletRequest; import javax.servlet.http.HttpServletResponse; import org.springframework.security.Authentication; import org.springframework.security.ui.webapp.AuthenticationProcessingFilter; public class MyAuthenticationProcessingFilter extends AuthenticationProcessingFilter { protected void onSuccessfulAuthentication(HttpServletRequest request, HttpServletResponse response, Authentication authResult) throws IOException { super.onSuccessfulAuthentication(request, response, authResult); request.getSession().setAttribute("myValue", "My value is set"); } } And here's my security.xml file: Code: <beans:beans xmlns="http://www.springframework.org/schema/security" xmlns:beans="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd http://www.springframework.org/schema/security http://www.springframework.org/schema/security/spring-security-3.0.xsd"> <global-method-security pre-post-annotations="enabled"> </global-method-security> <http use-expressions="true" auto-config="false" entry-point-ref="authenticationProcessingFilterEntryPoint"> <intercept-url pattern="/" access="permitAll" /> <intercept-url pattern="/images/**" filters="none" /> <intercept-url pattern="/scripts/**" filters="none" /> <intercept-url pattern="/styles/**" filters="none" /> <intercept-url pattern="/p/login.jsp" filters="none" /> <intercept-url pattern="/p/register" filters="none" /> <intercept-url pattern="/p/**" access="isAuthenticated()" /> <form-login login-processing-url="/j_spring_security_check" login-page="/p/login.jsp" authentication-failure-url="/p/login_error.jsp" /> <logout /> </http> <authentication-manager alias="authenticationManager"> <authentication-provider> <jdbc-user-service data-source-ref="dataSource"/> </authentication-provider> </authentication-manager> <beans:bean id="authenticationProcessingFilter" class="projects.internal.MyAuthenticationProcessingFilter"> <custom-filter position="AUTHENTICATION_PROCESSING_FILTER" /> </beans:bean> <beans:bean id="authenticationProcessingFilterEntryPoint" class="org.springframework.security.ui.webapp.AuthenticationProcessingFilterEntryPoint"> </beans:bean> </beans:beans> It gives an error here: Code: <custom-filter position="AUTHENTICATION_PROCESSING_FILTER" /> Multiple annotations found at this line: cvc-attribute.3, cvc-complex-type.4, cvc-enumeration-valid. What is the problem? Thanks in advance.

    Read the article

  • .Net Architecture challenge: The Change-prone Frankestein Model

    - by SDReyes
    Good Morning SO! We've been scratching our heads with this interesting scenario at the office, and we're anxious to hear your ideas and approaches: We have a database whose schema is prone to changes - let's call it Prony. (It is used to store configuration parameters for embedded devices, so if the embedded devices guy needs a new table, property or relationship for the model, he should be able to adapt the schema in an easy way - it happens so often.) Prony needs a web interface to create/edit its data. We have another database containing data that also needs to be loaded to the devices, after making some transformations - let's call this one Oddy (this data is generated by an already existing administrative web application). Finally we have Tracy, a server that communicates between our DBs and our embedded devices. She should auto-adapt herself to our DBs' schema changes and serialize the data to the devices. Nice puzzle, don't you think? : ) Our current candidates: Rady: the fast one. Let's create some views in Prony that make the data transformation from Oddy, then use DynamicData (or some RAD tool) to create/update a simple web interface for Prony (so he can even consult the transformed data coming from Prony : ). About Tracy, she will need to be recompiled to update her DB schema (Entity Framework should work) and use Reflection to explore the schema recursively and serialize data. Cons: We would have to recompile Tracy and Prony's web interface. What do you think of the candidate(s)? What would you do?

    Read the article

< Previous Page | 31 32 33 34 35 36 37 38 39 40 41 42  | Next Page >