Search Results

Search found 29422 results on 1177 pages for 'port scanning service'.

Page 65/1177 | < Previous Page | 61 62 63 64 65 66 67 68 69 70 71 72  | Next Page >

  • Set default system audio output port (for all accounts)

    - by Ludwik Trammer
    The default audio output port in Ubuntu doesn't work on my system. It should be "Analog Mono Output/Amplifier" instead of "Analog Output/Amplifier". I can easily change that in Sound Preferences, just by choosing the right port on the "Output" tab. The problem is that this only applies to a single account, and I would like to change it system-wide so it applies to all accounts on the system (I have more than 100 users...). I'm two hours into Googling, so any help would be appreciated.
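    One hedged approach (not from the question itself): set the port in PulseAudio's system-wide startup script, assuming a PulseAudio version that understands set-sink-port and that no per-user saved setting overrides it:

        # list sink names and their available ports first
        pacmd list-sinks
        # then append to /etc/pulse/default.pa (read by every user's PulseAudio
        # instance at startup; the sink and port names below are placeholders)
        set-sink-port alsa_output.pci-0000_00_1b.0.analog-stereo analog-output-mono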

    Read the article

  • How can I port forward with iptables?

    - by stu
    I want connections coming in on ppp0 on port 8001 to be routed to 192.168.1.200 on eth0, port 8080. I've got these two rules:

        -A PREROUTING -p tcp -m tcp --dport 8001 -j DNAT --to-destination 192.168.1.200:8080
        -A FORWARD -m state -p tcp -d 192.168.1.200 --dport 8080 --state NEW,ESTABLISHED,RELATED -j ACCEPT

    and it doesn't work. What am I missing?
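    A sketch of the pieces that are commonly missing in this kind of setup (assumptions: the PREROUTING rule sits in the nat table, ppp0 is the WAN side and eth0 faces the LAN):

        # packet forwarding must be enabled in the kernel
        sysctl -w net.ipv4.ip_forward=1
        # make sure the DNAT rule is in the nat table and matches the incoming interface
        iptables -t nat -A PREROUTING -i ppp0 -p tcp --dport 8001 -j DNAT --to-destination 192.168.1.200:8080
        # masquerade the forwarded traffic so replies come back through this box
        iptables -t nat -A POSTROUTING -o eth0 -p tcp -d 192.168.1.200 --dport 8080 -j MASQUERADE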

    Read the article

  • Manually closing a port from the command line

    - by codingfreak
    Hi, I want to close an open port which is in listening mode between my client and server application. Is there any manual command-line option in Linux to close a port? NOTE: I came to know that "only the application which owns the connected socket should close it, which will happen when the application terminates." I don't understand why it is only possible from the application which opened it, but I am still eager to know if there is another way to do it.
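    There is no standalone "close this port" command; the usual workaround is to find the process that owns the listening socket and stop it. A minimal sketch (port 8080 and the PID are placeholders):

        # identify the owning process
        sudo lsof -i :8080
        # or: sudo netstat -ltnp | grep :8080
        # then terminate it, which closes the socket
        sudo kill <PID>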

    Read the article

  • XAMPP won't start because I'm already running MySQL on port 3306

    - by JellicleCat
    I'm trying to start XAMPP, but it fails with only the word 'Busy' in the Control Panel console. I ran xampp/install/portcheck.bat to see if the ports were available, and I see that port 3306 is busy (because I'm running MySQL as a service on it). I suppose that XAMPP wants to take over and run MySQL on that port. Can anyone tell me how to get XAMPP to just make use of the existing server and not try to control it itself?
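    One hedged option (the details below are assumptions, not from the question): leave the existing MySQL service running, never start XAMPP's bundled MySQL from the control panel, and point XAMPP's phpMyAdmin at the service that is already on 3306:

        // in xampp\phpMyAdmin\config.inc.php - host and auth values are placeholders
        $cfg['Servers'][$i]['host'] = '127.0.0.1';
        $cfg['Servers'][$i]['port'] = '3306';
        $cfg['Servers'][$i]['auth_type'] = 'cookie';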

    Read the article

  • How to open port 25 on the server

    - by liuxingruo
    I'm using CentOS. I want to implement an SMTP mail server, and I have installed Postfix and Dovecot (both have been set up correctly). I tried to telnet to port 25, but it returns: Unable to connect to remote host: Connection refused. So, how can I open port 25? Thanks!
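    "Connection refused" usually means either the host firewall is dropping port 25 or Postfix is only listening on localhost. A sketch of the usual checks, assuming a stock Postfix and the default CentOS iptables firewall:

        # is Postfix listening on all interfaces?
        postconf inet_interfaces        # should be "all", not "localhost"
        netstat -ltnp | grep :25
        # open port 25 in the firewall and persist the rule
        iptables -I INPUT -p tcp --dport 25 -j ACCEPT
        service iptables save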

    Read the article

  • VNC connection through machine with only SSH port open

    - by pufferfish
    I would like to make a VNC connection from home to a Windows machine at work. The Windows machine is not accessible from the outside, but there is a Linux box that does have port 22 open, so it would seem that this can be done. I suspect it's just a command that "forwards" connections through port 22 on the Linux machine to the Windows machine? I just can't find an example that does exactly this. Thanks in advance!
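    Yes - this is plain SSH local port forwarding. A minimal sketch, assuming the Windows box is reachable from the Linux machine as "winbox" and VNC listens on its default port 5900:

        ssh -L 5900:winbox:5900 user@linux-gateway.example.com
        # then point the VNC viewer at localhost:5900 (display :0)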

    Read the article

  • Port 80 was blocked

    - by Kombuwa
    Hi, on my home computer incoming port 80 is blocked. I guess it was done by some virus. Does anybody know how to open a closed port in XP, or any tool to open closed ports?
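    If the block is coming from the built-in Windows Firewall (rather than actual malware damage, which should be cleaned up first), XP's netsh can reopen the port - a sketch:

        netsh firewall show portopening
        netsh firewall add portopening TCP 80 WebServer ENABLE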

    Read the article

  • Why does no small RJ45 port exist?

    - by Christian Sauer
    While buying my last tablet I noticed that no tablet (and hardly any smaller notebook/convertible) has an RJ45 port, which I find quite dissatisfying since I like to use RJ45 in numerous places. I think a reason could be that RJ45 is simply too big for a tablet - it is downright massive compared to micro USB/HDMI etc. But that leads me to my question: why is there no attempt to build a smaller micro-RJ45 port that could be used in constrained spaces?

    Read the article

  • Setting up a subdomain SSL with custom port

    - by Webnet
    I'm setting up a subdomain on a dedicated server that I'm going to use for SVN services. The SVN server is up and running; I just need to set up the subdomain. HTTPS has been switched to a custom port because there's a conflict with a port forward pointing to another server. Should I do this through GoDaddy or Apache?
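    The subdomain itself is just a DNS A record at GoDaddy; the SSL-on-a-custom-port part is handled in Apache. A sketch of such a virtual host, with the port, paths and SVN location all placeholders:

        Listen 8443
        <VirtualHost *:8443>
            ServerName svn.example.com
            SSLEngine on
            SSLCertificateFile    /etc/ssl/certs/svn.example.com.crt
            SSLCertificateKeyFile /etc/ssl/private/svn.example.com.key
            # serving the repository over mod_dav_svn is one common arrangement
            <Location /svn>
                DAV svn
                SVNParentPath /var/svn
            </Location>
        </VirtualHost>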

    Read the article

  • What's the extra FTP port here?

    - by warl0ck
    While downloading a tarball from GNU's FTP server, I noticed that besides the standard TCP port 21 connection, I also see an extra connection:

        tcp  0  0  192.168.1.109:45056  208.118.235.20:21     ESTABLISHED  10956/wget
        tcp  0  0  192.168.1.109:56724  208.118.235.20:22259  ESTABLISHED  10956/wget

    What is that port used for? I checked /etc/services; only 20 and 21 should be in use - am I wrong? The command in use was wget 'ftp://ftp.gnu.org/gnu/tar/tar-1.26.tar.xz'

    Read the article

  • Remove a port from a VLAN (HP ProCurve)

    - by sixnumber
    I'm not too familiar with switches, but I want to remove port 6 from the following VLAN 12. I've tried searching for an easy explanation to no avail - how do I do this, please?

        Port Information  Mode      Unknown VLAN  Status
        ----------------  --------  ------------  ------
        5                 Untagged  Learn         Up
        6                 Tagged    Learn         Up
        8                 Untagged  Learn         Up
        18                Untagged  Learn         Down
        22                Untagged  Learn         Up
        26                Tagged    Learn         Up
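    A sketch of the usual ProCurve CLI sequence, assuming port 6 should simply leave VLAN 12 (it is a tagged member there, so "no tagged" is the matching command):

        switch# configure terminal
        switch(config)# vlan 12
        switch(vlan-12)# no tagged 6
        switch(vlan-12)# exit
        switch(config)# write memory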

    Read the article

  • Sharing a serial port between two processes

    - by peterrus
    As it is not possible to directly share a serial port between two processes using Linux, I am looking for another way to achieve this. I have heard about socat but could not find a concrete example of how to realize the following: split one physical serial port (/dev/ttyUSB0) into two virtual ports, one for reading and one for writing, since one process only needs to send data and the other only needs to receive data. Unfortunately, I cannot modify the sending application.
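    socat's usual role here is to bridge the physical port to a virtual PTY; splitting traffic into a read-only and a write-only PTY builds on the same pattern. A minimal sketch of the basic bridge (device path and baud rate are assumptions):

        socat -d -d \
            PTY,link=/tmp/vserial,raw,echo=0 \
            /dev/ttyUSB0,raw,echo=0,b9600
        # the receiving process can read from /tmp/vserial while the sending
        # process writes to it; a clean read/write split would need a second
        # PTY plus a small relay process in between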

    Read the article

  • Webcast Q&A: Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter

    - by Kellsey Ruppel
    This week we had the fifth webcast in our WebCenter in Action webcast series, "Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter", where customers Giovani Dacumos and Minh Ong from the Los Angeles Department of Building & Safety (LADBS), and Sheetal Paranjpye and Rajiv Desai from Oracle Partner 3Di, shared how Oracle WebCenter is powering LADBS' externally facing website and providing a superior self-service experience for their customers. We asked the speakers to provide some dialogue for Q&A.

    Giovani Dacumos, Director of Systems, and Minh Ong, LADBS

    Q: Did you run into any issues when integrating all of the different applications together?
    A: Yes. We did have issues integrating secure sign-on between the portal and other legacy applications. We used portlets and iframes to overcome those. This is a new technology for us and we are also learning as we go, so there were a lot of challenges in developing and implementing our vision.

    Q: What has been the biggest benefit your end users have seen?
    A: The biggest benefit for our end users is ease of use. We've given them a system that provides a new and improved source of information, as well as a very organized flow of transaction processing. It has made our online service very user friendly.

    Q: Was there any resistance internally when implementing the solution? If so, how did you overcome that?
    A: There was no internal resistance during the implementation, only challenges. As mentioned earlier, this is a new technology for us. We came across issues that needed assistance from Oracle. Working with 3Di and Oracle has helped us tremendously to find solutions to our implementation issues.

    Q: Given the performance, what do you estimate to be the top-end capacity of the system?
    A: With the current performance and architecture we have, we are able to support approximately 300-400 concurrent users. We would need more hardware to support additional user load.

    Q: What's the overview or summary of feedback from the users interacting with the site?
    A: LADBS has a wide spectrum of customers, from simple users like homeowners to large construction firms. Anything new that we offer could be a little bit challenging for some, but overall, the customers liked it. They saw a huge improvement in usability.

    Q: Can you describe the impressions about the site before and after the project within LADBS?
    A: The old site was using old technology and it was hard for us to keep building onto it as we got more business requirements. It made our application seem a bit complicated. It was confusing for our new customers to use, and we've improved on this with the new site. It's now easier for them to complete their transactions and, at the same time, it has allowed us to provide more useful information.

    Sheetal Paranjpye and Rajiv Desai, 3Di

    Q: Did you run into any obstacles when implementing the solution?
    A: Yes, we did run into some obstacles. One of the key showstoppers was the issue with portlet-to-portal communication. The GIS viewer (portlet) needed information to be passed to and from Permit LA (the portal), but we were able to get everything configured and up and working quickly!

    Q: Was there a lot of custom work that needed to be done for this particular solution?
    A: We have done some customizations where workflows/task flows are involved.

    Q: What do you think were the keys to success for rolling out WebCenter?
    A: Having a service-oriented architecture and using portlets have been the key areas for rolling out Oracle WebCenter at LADBS. The Oracle WebCenter Content integration gives business users the flexibility to maintain the content, which has really cut down on the reliance on IT, and employee productivity has increased as a result.

    If you missed the webcast, be sure to catch the replay to see a live demonstration of WebCenter in action! (Replay: "Los Angeles Department of Building & Safety Lowers Customer Service Costs with Oracle WebCenter", from Oracle WebCenter.)

    Read the article

  • Technical workshop with the gurus: Architecting Oracle Database-As-A-Service (DBaaS)

    - by Javier Puerta
    OCTOBER 2013 Invitation: Architecting Oracle Database-As-A-Service (DBaaS)

    Dear partner,

    We are pleased to invite you to a 2-day workshop dedicated to EMEA partners on "Architecting Oracle Private Database Cloud & Delivering Database-As-A-Service (DBaaS)". This exclusive workshop will be delivered by Product Management and Product Development from Oracle HQ and focuses on the main theme CIOs have been tackling over the last decade: consolidation to private cloud. For many customers the journey to consolidation has led to DBaaS cloud deployments that significantly reduce costs and offer agile IT services. With the recent launch of Oracle Database 12c, the game really has changed in terms of what Oracle offers and how database clouds can be deployed.

    Who should attend: enterprise architects, infrastructure architects, and DB architects from system integrators and large independent software vendors. Take this opportunity to learn from the gurus how you can help your customers maximize their cloud consolidation strategies. The workshop's main focus is service delivery, which includes standardization and consolidation, and how you would help your customers transform their current IT infrastructure to a service delivery model. It will discuss best practices and review customer examples that have successfully implemented a database cloud.

    The agenda is split into two days of sessions:
    Day 1: Overview & Planning; Database Cloud - Demos; Customer Case Studies; Database 12c
    Day 2: Database Cloud - Design; Database Cloud - Implementation; EM Cloud Control; DBaaS on Engineered Systems; Questions and Answers

    Attendance is free of charge for qualified Oracle partners - register now for one of the sessions below:
    5 & 6 November 2013 - United Kingdom, Manchester
    7 & 8 November 2013 - Germany, Munich
    11 & 12 November 2013 - Netherlands, Amsterdam
    14 & 15 November 2013 - Turkey, Istanbul
    18 & 19 November 2013 - Austria, Vienna

    Looking forward to seeing you!
    Javier Puerta, Director, Core Technology Partner Programs EMEA
    Prashant Barot, Director, Core Technology

    Read the article

  • ADF Mobile - Update through Web Service (with ADF Business Components)

    - by Shay Shmeltzer
    In my previous blog entry I went over the basics of exposing ADF Business Components through service interfaces and developing a simple ADF Mobile application that accesses and fetches data from those services. In this entry we'll dive a bit deeper and address an update scenario through these web service interfaces. You can see the full demo video at the end of the post. In the first steps I show how to add an explicit method execution to fetch a specific record we want to update on the second page of a flow. For an update you'll be invoking a service method and passing the record you want to update as a parameter. As in many other web services scenarios, we need to provide a complete object of a specific type to the method. The ADF Web service data control helps you here by offering an object of this type that you can drag and drop into your page. The next step is to make sure to fill that object with the values you want to update. In the demo we do this through coding in a backing bean that shows how to use the AdfmfJavaUtilities utility. The code gets the value from one field, gets a pointer to the parallel update field, and then copies from one to the other. At the end of the bean we manually execute the call to the update method on the Web service. Here is the demo: [embedded demo video] And here is the code used in the backing bean in the demo above:

        package a.mobile;

        import javax.el.MethodExpression;
        import javax.el.ValueExpression;
        import oracle.adfmf.amx.event.ActionEvent;
        import oracle.adfmf.framework.api.AdfmfJavaUtilities;
        import oracle.adfmf.framework.model.AdfELContext;

        public class backing {
            public backing() {
            }

            public void copyAndUpdate(ActionEvent actionEvent) {
                AdfELContext adfELContext = AdfmfJavaUtilities.getAdfELContext();
                // copy each field from the display binding to the parallel update binding
                ValueExpression ve = AdfmfJavaUtilities.getValueExpression("#{bindings.DepartmentName.inputValue}", String.class);
                ValueExpression ve3 = AdfmfJavaUtilities.getValueExpression("#{bindings.DepartmentName1.inputValue}", String.class);
                ve3.setValue(adfELContext, ve.getValue(adfELContext));
                ve = AdfmfJavaUtilities.getValueExpression("#{bindings.DepartmentId.inputValue}", int.class);
                ve3 = AdfmfJavaUtilities.getValueExpression("#{bindings.DepartmentId1.inputValue}", int.class);
                ve3.setValue(adfELContext, ve.getValue(adfELContext));
                ve = AdfmfJavaUtilities.getValueExpression("#{bindings.ManagerId.inputValue}", int.class);
                ve3 = AdfmfJavaUtilities.getValueExpression("#{bindings.ManagerId1.inputValue}", int.class);
                ve3.setValue(adfELContext, ve.getValue(adfELContext));
                ve = AdfmfJavaUtilities.getValueExpression("#{bindings.LocationId.inputValue}", int.class);
                ve3 = AdfmfJavaUtilities.getValueExpression("#{bindings.LocationId1.inputValue}", int.class);
                ve3.setValue(adfELContext, ve.getValue(adfELContext));
                // invoke the update method binding on the Web service
                MethodExpression me = AdfmfJavaUtilities.getMethodExpression("#{bindings.updateDepartmentsView1.execute}", Object.class, new Class[] {});
                me.invoke(adfELContext, new Object[] {});
            }
        }

    Read the article

  • User defined datatypes CANNOT be returned in web service in Jboss 5.0.1

    - by user1503117
    I am using Jboss 5.0.1, jdk 1.6.0 update 31 and implementing an EJB as a web service and my method in web service module returns an Array of JavaBean objects in my example BenefitLevel array object. When executed in JBoss it throws the following exception: 08:57:08,552 ERROR [ServiceProxy] Service error javax.xml.rpc.ServiceException: Cannot create proxy at org.jboss.ws.core.jaxrpc.client.ServiceImpl.getPort(ServiceImpl.java:359) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.jboss.ws.core.jaxrpc.client.ServiceProxy.invoke(ServiceProxy.java:127) at $Proxy105.getCarrierWSSEIPort(Unknown Source) at org.apache.jsp.index_jsp._jspService(index_jsp.java:92) at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70) at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:369) at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:322) at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:249) at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191) at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:190) at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:92) at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126) at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330) at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829) at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:601) at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447) at java.lang.Thread.run(Thread.java:662) Caused by: java.lang.IllegalStateException: Cannot synchronize to any of these methods: public abstract stubs.BenefitLevel[] stubs.CarrierWSSEI.getActiveBenData() throws java.rmi.RemoteException OperationMetaData: qname={urn:CarrierWS/wsdl}getActiveBenData javaName=getActiveBenData style=rpc/literal oneWay=false soapAction= ReturnMetaData: xmlName=result partName=result 
xmlType={urn:CarrierWS/types/arrays/com/test/cas/carrier/plan/info}BenefitLevelArray javaType=com.benefitpartnersinc.cas.carrier.plan.info.BenefitLevel[] mode=OUT inHeader=false index=-1 at org.jboss.ws.metadata.umdm.OperationMetaData.eagerInitialize(OperationMetaData.java:491) at org.jboss.ws.metadata.umdm.EndpointMetaData.eagerInitializeOperations(EndpointMetaData.java:557) at org.jboss.ws.metadata.umdm.EndpointMetaData.initializeInternal(EndpointMetaData.java:541) at org.jboss.ws.metadata.umdm.EndpointMetaData.setServiceEndpointInterfaceName(EndpointMetaData.java:220) at org.jboss.ws.core.jaxrpc.client.ServiceImpl.getPort(ServiceImpl.java:345) ... 33 more 08:57:08,567 ERROR [STDERR] javax.xml.rpc.ServiceException: Cannot create proxy 08:57:08,567 ERROR [STDERR] at org.jboss.ws.core.jaxrpc.client.ServiceImpl.getPort(ServiceImpl.java:359) 08:57:08,567 ERROR [STDERR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 08:57:08,567 ERROR [STDERR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) 08:57:08,567 ERROR [STDERR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) 08:57:08,567 ERROR [STDERR] at java.lang.reflect.Method.invoke(Method.java:597) 08:57:08,567 ERROR [STDERR] at org.jboss.ws.core.jaxrpc.client.ServiceProxy.invoke(ServiceProxy.java:127) 08:57:08,567 ERROR [STDERR] at $Proxy105.getCarrierWSSEIPort(Unknown Source) 08:57:08,567 ERROR [STDERR] at org.apache.jsp.index_jsp._jspService(index_jsp.java:92) 08:57:08,567 ERROR [STDERR] at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70) 08:57:08,567 ERROR [STDERR] at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) 08:57:08,567 ERROR [STDERR] at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:369) 08:57:08,567 ERROR [STDERR] at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:322) 08:57:08,567 ERROR [STDERR] at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:249) 08:57:08,567 ERROR [STDERR] at javax.servlet.http.HttpServlet.service(HttpServlet.java:717) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) 08:57:08,567 ERROR [STDERR] at org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:96) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:235) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191) 08:57:08,567 ERROR [STDERR] at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:190) 08:57:08,567 ERROR [STDERR] at org.jboss.web.tomcat.security.JaccContextValve.invoke(JaccContextValve.java:92) 08:57:08,567 ERROR [STDERR] at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.process(SecurityContextEstablishmentValve.java:126) 08:57:08,567 ERROR [STDERR] at org.jboss.web.tomcat.security.SecurityContextEstablishmentValve.invoke(SecurityContextEstablishmentValve.java:70) 08:57:08,567 ERROR [STDERR] at 
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102) 08:57:08,567 ERROR [STDERR] at org.jboss.web.tomcat.service.jca.CachedConnectionValve.invoke(CachedConnectionValve.java:158) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109) 08:57:08,567 ERROR [STDERR] at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:330) 08:57:08,567 ERROR [STDERR] at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:829) 08:57:08,567 ERROR [STDERR] at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:601) 08:57:08,567 ERROR [STDERR] at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447) 08:57:08,567 ERROR [STDERR] at java.lang.Thread.run(Thread.java:662) 08:57:08,567 ERROR [STDERR] Caused by: java.lang.IllegalStateException: Cannot synchronize to any of these methods: public abstract stubs.BenefitLevel[] stubs.CarrierWSSEI.getActiveBenData() throws java.rmi.RemoteException OperationMetaData: qname={urn:CarrierWS/wsdl}getActiveBenData javaName=getActiveBenData style=rpc/literal oneWay=false soapAction= ReturnMetaData: xmlName=result partName=result xmlType={urn:CarrierWS/types/arrays/com/test/cas/carrier/plan/info}BenefitLevelArray javaType=com.test.cas.carrier.plan.info.BenefitLevel[] mode=OUT inHeader=false index=-1 08:57:08,567 ERROR [STDERR] at org.jboss.ws.metadata.umdm.OperationMetaData.eagerInitialize(OperationMetaData.java:491) 08:57:08,567 ERROR [STDERR] at org.jboss.ws.metadata.umdm.EndpointMetaData.eagerInitializeOperations(EndpointMetaData.java:557) 08:57:08,567 ERROR [STDERR] at org.jboss.ws.metadata.umdm.EndpointMetaData.initializeInternal(EndpointMetaData.java:541) 08:57:08,567 ERROR [STDERR] at org.jboss.ws.metadata.umdm.EndpointMetaData.setServiceEndpointInterfaceName(EndpointMetaData.java:220) 08:57:08,567 ERROR [STDERR] at org.jboss.ws.core.jaxrpc.client.ServiceImpl.getPort(ServiceImpl.java:345) 08:57:08,567 ERROR [STDERR] ... 33 more My Web client code is as follows : <%@page import="java.util.Hashtable"%> <%@page import="javax.naming.*,com.q4.*,javax.xml.rpc.Stub,stubs.CarrierWS,stubs.CarrierWSSEI,stubs.CarrierWSSEI_Impl"%> <%@page contentType="text/html" pageEncoding="UTF-8"%> <!DOCTYPE html> <html> <head> <meta http-equiv="Content-Type" content="text/html; charset=UTF-8"> <title>JSP Page</title> </head> <body> <h1>Hello World!</h1> <% try { InitialContext ic = new InitialContext( ); CarrierWS carrierws = (CarrierWS)ic.lookup("java:comp/env/service/CarrierWS"); out.println("========================" + carrierws); CarrierWSSEI sei = carrierws.getCarrierWSSEIPort(); out.println("Invoking the service please wait ............." + carrierws.getCarrierWSSEIPort()); ((Stub)sei)._setProperty(Stub.ENDPOINT_ADDRESS_PROPERTY,"http://localhost:8080/TestWS3WAR/CarrierWS"); out.println("Invoking the service please wait ............." + sei.getActiveBenData().length); } catch(Exception e) { out.println("Exception occurred : " + e.getMessage()); e.printStackTrace(); } %> </body> </html> Please help me where I am going wrong.

    Read the article

  • SharePoint custom web service consumption problems - HTTP 401: Unauthorized

    - by alekz
    I have a custom web service deployed into WSS 3. It has two web methods. The first one returns the version of the loaded assembly without any invocation of the SharePoint objects. The second returns some basic info about the library, something like:

        var spLibrary = [find library logic];
        return spLibrary.Name + "@" + spLibrary.Url;

    In the client app I have something like the following:

        var service = new WebService1();
        service.Url = [url];
        service.Credentials = System.Net.CredentialCache.DefaultCredentials;
        service.Method1();
        service.Method2();

    When the client app runs on the machine where SharePoint is deployed, everything works just fine. When the client app runs on a remote machine (but under the same user) the first method still works, but the second one throws System.Net.WebException: HTTP 401: Unauthorized. I have tried to set credentials manually (service.Credentials = new System.Net.NetworkCredential(login, password, domain);) but this doesn't help. I've tried to invoke the built-in SharePoint web services using a similar scenario, and they work just fine (sorry for the earlier mistake - some methods were not working without the appropriate privileges):

        var service = new GroupsService();
        service.Url = [url];
        service.Credentials = System.Net.CredentialCache.DefaultCredentials;
        service.SomeMethod();

    Read the article

  • DocumentDB - Another Azure NoSQL Storage Service

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2014/08/25/documentdb---another-azure-nosql-storage-service.aspx. Microsoft just released a bunch of new features for Azure on the 22nd, and the one I was most interested in is DocumentDB, a document NoSQL database service on the cloud.

    Quick Look at DocumentDB

    We can try DocumentDB from the new Azure preview portal. Just click the NEW button and select the item named DocumentDB to create a new account. Specify the name of the DocumentDB, which will be the endpoint we are going to use to connect later. Select the capacity unit, resource group and subscription. In the resource group section we can select the region where our DocumentDB will be located. As with other Azure services, pick the same location as the consumers of the DocumentDB, for example the website, web services, etc. After several minutes the DocumentDB will be ready. Clicking the KEYS button we can find the URI and primary key, which will be used when connecting.

    Now let's open Visual Studio and try to use the DocumentDB we have just created. Create a new console application and install the DocumentDB .NET client library from NuGet with the keyword "DocumentDB". You need to select "Include Prerelease" in the NuGet Package Manager window since this library has not been officially released yet. Next we will create a new database and document collection under our DocumentDB account. The code below creates an instance of DocumentClient with the URI and primary key we just copied from the Azure portal, and creates a database and collection. It also prints the database and collection link strings, which will be used later to insert and query documents.

        static void Main(string[] args)
        {
            var endpoint = new Uri("https://shx.documents.azure.com:443/");
            var key = "LU2NoyS2fH0131TGxtBE4DW/CjHQBzAaUx/mbuJ1X77C4FWUG129wWk2oyS2odgkFO2Xdif9/ZddintQicF+lA==";

            var client = new DocumentClient(endpoint, key);
            Run(client).Wait();

            Console.WriteLine("done");
            Console.ReadKey();
        }

        static async Task Run(DocumentClient client)
        {
            var database = new Database() { Id = "testdb" };
            database = await client.CreateDatabaseAsync(database);
            Console.WriteLine("database link = {0}", database.SelfLink);

            var collection = new DocumentCollection() { Id = "testcol" };
            collection = await client.CreateDocumentCollectionAsync(database.SelfLink, collection);
            Console.WriteLine("collection link = {0}", collection.SelfLink);
        }

    Below is the result in the console window (screenshot in the original post). We need to copy the collection link string for later use. Now if we go back to the portal, we will find a database listed with the name we specified in the code. Next we will insert a document into the database and collection we have just created. In the code below we paste the collection link copied in the previous step and create a dynamic object with several properties defined. As you can see, we can add normal properties containing strings or integers, and we can also add complex properties, for example an array, a dictionary or an object reference, as long as they can be serialized to JSON:
        static void Main(string[] args)
        {
            var endpoint = new Uri("https://shx.documents.azure.com:443/");
            var key = "LU2NoyS2fH0131TGxtBE4DW/CjHQBzAaUx/mbuJ1X77C4FWUG129wWk2oyS2odgkFO2Xdif9/ZddintQicF+lA==";

            var client = new DocumentClient(endpoint, key);

            // collection link pasted from the result in the previous demo
            var collectionLink = "dbs/AAk3AA==/colls/AAk3AP6oFgA=/";

            // document we are going to insert into the database
            dynamic doc = new ExpandoObject();
            doc.firstName = "Shaun";
            doc.lastName = "Xu";
            doc.roles = new string[] { "developer", "trainer", "presenter", "father" };

            // insert the document
            InsertADoc(client, collectionLink, doc).Wait();

            Console.WriteLine("done");
            Console.ReadKey();
        }

    The insert code itself is very simple, as below; just provide the collection link and the object we are going to insert.

        static async Task InsertADoc(DocumentClient client, string collectionLink, dynamic doc)
        {
            var document = await client.CreateDocumentAsync(collectionLink, doc);
            Console.WriteLine(await JsonConvert.SerializeObjectAsync(document, Formatting.Indented));
        }

    Below is the result after the object has been inserted. Finally we will query the document from the database and collection. Similar to the insert code, we just need to specify the collection link so that the .NET SDK will retrieve all documents in it.

        static void Main(string[] args)
        {
            var endpoint = new Uri("https://shx.documents.azure.com:443/");
            var key = "LU2NoyS2fH0131TGxtBE4DW/CjHQBzAaUx/mbuJ1X77C4FWUG129wWk2oyS2odgkFO2Xdif9/ZddintQicF+lA==";

            var client = new DocumentClient(endpoint, key);

            var collectionLink = "dbs/AAk3AA==/colls/AAk3AP6oFgA=/";

            SelectDocs(client, collectionLink);

            Console.WriteLine("done");
            Console.ReadKey();
        }

        static void SelectDocs(DocumentClient client, string collectionLink)
        {
            var docs = client.CreateDocumentQuery(collectionLink + "docs/").ToList();
            foreach (var doc in docs)
            {
                Console.WriteLine(doc);
            }
        }

    Since there's only one document in my collection, below is the result when I executed the code. As you can see, all properties, including the array, were retrieved at the same time. DocumentDB also attached some properties we didn't specify, such as "_rid", "_ts", "_self", etc., which are controlled by the service.

    DocumentDB Benefit

    DocumentDB is a document NoSQL database service. Unlike a traditional database, a document database is truly schema-free. In a nutshell, you can save anything in the same database and collection as long as it can be serialized to JSON. When you query the document database, all sub-documents are retrieved at the same time; this means you don't need the joins you would need across tables in a traditional database. A document database is very useful when building a high-performance system with a hierarchical data structure. For example, assume we need to build a blog system; there will be many blog posts, and each of them contains the content and comments. A comment can be commented on as well. If we were using a traditional database, let's say SQL Server, the schema might be defined with a Posts table and a Comments table (shown as a diagram in the original post). When we need to display a post, we need to load the post content from the Posts table as well as the comments from the Comments table, and we also need to build the comment tree based on the CommentID field. But if we were using DocumentDB, all we need to do is save the post as a document with a list containing all comments.
    Under a comment, all sub-comments are a list inside it. When we display this post we just need to query the post document; the content and all comments will be loaded in the proper structure.

        {
            "id": "xxxxx-xxxxx-xxxxx-xxxxx",
            "title": "xxxxx",
            "content": "xxxxx, xxxxxxxxx. xxxxxx, xx, xxxx.",
            "postedOn": "08/25/2014 13:55",
            "comments":
            [
                {
                    "id": "xxxxx-xxxxx-xxxxx-xxxxx",
                    "content": "xxxxx, xxxxxxxxx. xxxxxx, xx, xxxx.",
                    "commentedOn": "08/25/2014 14:00",
                    "commentedBy": "xxx"
                },
                {
                    "id": "xxxxx-xxxxx-xxxxx-xxxxx",
                    "content": "xxxxx, xxxxxxxxx. xxxxxx, xx, xxxx.",
                    "commentedOn": "08/25/2014 14:10",
                    "commentedBy": "xxx",
                    "comments":
                    [
                        {
                            "id": "xxxxx-xxxxx-xxxxx-xxxxx",
                            "content": "xxxxx, xxxxxxxxx. xxxxxx, xx, xxxx.",
                            "commentedOn": "08/25/2014 14:18",
                            "commentedBy": "xxx",
                            "comments":
                            [
                                {
                                    "id": "xxxxx-xxxxx-xxxxx-xxxxx",
                                    "content": "xxxxx, xxxxxxxxx. xxxxxx, xx, xxxx.",
                                    "commentedOn": "08/25/2014 18:22",
                                    "commentedBy": "xxx"
                                }
                            ]
                        },
                        {
                            "id": "xxxxx-xxxxx-xxxxx-xxxxx",
                            "content": "xxxxx, xxxxxxxxx. xxxxxx, xx, xxxx.",
                            "commentedOn": "08/25/2014 15:02",
                            "commentedBy": "xxx"
                        }
                    ]
                },
                {
                    "id": "xxxxx-xxxxx-xxxxx-xxxxx",
                    "content": "xxxxx, xxxxxxxxx. xxxxxx, xx, xxxx.",
                    "commentedOn": "08/25/2014 14:30",
                    "commentedBy": "xxx"
                }
            ]
        }

    DocumentDB vs. Table Storage

    DocumentDB and Table Storage are both NoSQL services in Microsoft Azure. One common question is "when should we use DocumentDB rather than Table Storage?" Here are some ideas from me and some MVPs. First of all, they are different kinds of NoSQL databases: DocumentDB is a document database while Table Storage is a key-value store. Second, Table Storage is cheaper. DocumentDB supports scaling out from one capacity unit to five during the preview period, and each capacity unit provides 10GB of local SSD storage. The price is $0.73/day, which includes a 50% discount; for the storage service the highest price is $0.061/GB, which is almost 10% of the DocumentDB price. Third, Table Storage provides local replication, geo-replication and read-access geo-replication, while DocumentDB doesn't. Fourth, there is a local emulator for Table Storage but none for DocumentDB; we have to connect to DocumentDB in the cloud when developing locally. But DocumentDB supports some cool features that Table Storage doesn't have: it supports stored procedures, triggers and user-defined functions. It supports rich indexing, while Table Storage only supports indexing on the partition key and row key. It supports transactions; Table Storage does as well, but restricted to the Entity Group Transaction scope. And last, Table Storage is GA while DocumentDB is still in preview.

    Summary

    In this post I gave a quick demonstration and introduction of the new DocumentDB service in Azure. It's very easy to interact with through .NET, and it also supports a REST API, a Node.js SDK and a Python SDK. I then explained the concept and benefits of using a document database, and compared it with Table Storage.

    Hope this helps, Shaun. All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Service Oriented Architecture & Domain-Driven Design

    - by Michael
    I've always developed code in an SOA type of way. This year I've been trying to do more DDD, but I keep getting the feeling that I'm not getting it. At work our systems are load balanced and designed not to have state. The architecture is:

        Website ==Physical Layer== Main Service ==Physical Layer== Service 1/Service 2/Service 3/Service 4

    Only Service 1, Service 2, Service 3 and Service 4 can talk to the database, and the Main Service calls the correct service based on the products ordered. Every physical layer is load balanced too. Now when I develop a new service, I try to think DDD in that service even though it doesn't really feel like it fits. I use good DDD principles like entities, value types, repositories, aggregates, factories, etc. I've even tried using ORMs, but they just don't seem to fit in a stateless architecture. I know there are ways around it, for example using IStatelessSession instead of ISession with NHibernate, but ORMs still feel like they don't belong in a stateless architecture. I've noticed I really only use some of the concepts and patterns DDD has taught me, while the overall architecture is still SOA. I am starting to think DDD doesn't fit in large systems, though I do think some of its patterns and concepts do. Like I said, maybe I'm just not grasping DDD, or maybe I'm over-analyzing my designs? Maybe by using the patterns and concepts DDD has taught me I am using DDD? Not sure if there is really a question in this post; it's more the thoughts I've had when trying to figure out where DDD fits in overall systems and how scalable it truly is. The truth is, I don't think I really even know what DDD is.

    Read the article

  • how to verify browser IP for server-side web service

    - by Anthony
    I have a web service that needs to be able to verify the end-user's IP that called the server-script that is requesting the web service. Simple layout: Person A goes to Webpage B. Webpage B calls Web Service C to get some info on Person A. Web Service C won't give Webpage B the requested information without confirmation that the request originated from Person A's IP and not someone who has stolen Person A's session. I'm thinking that for a browser-based solution, the original site (Webpage B) can open an iframe that goes to the Web Service's authentication page. A key of some kind is passed to the browser which will some how indicate both the user's IP and Web Page B's IP, so that the Web Service can confirm that no one has nabbed anything. I have two challenges, but I'll stick to the more immediate one first: I'm not sure if my browser-based plan really makes sense. If someone steals the session cookie, how is the Web Service going to know? Would this cookie be held be Web Page B and thus be harder to steal? Is it a sound assumption that a cookie or key held by the server only and not the browser is safe? Also, would the web service, based on the iframe initial connection, be expecting the server/user-ip combo? What I mean is, does the session key provided via the iframe get stored by the web service and the Web Site B shows it has a match? Or is the session key more generic, meaning the web service is passed the key by Website B and the Web Service verifies that this is a valid session key based on what a valid session key should look like?

    Read the article

  • Need Help With Conflicting Customer Support Goals?

    - by Tom Floodeen
    It seems that at every ops review, customer support executives are being asked to improve the customer KPIs while also improving gross margin. This is a tough road for even experienced leaders. You need to reduce your agents' research time while increasing their answer accuracy. You want to spend less time training them while growing the number of products and systems being used. You have to deal with increasing service volumes, but at the same time you need to focus on creating appropriate service insight. After all, to be a great support center you not only have to be good at answering questions, you also need to be good at preventing them. Five Key Benefits of Knowledge Management in Customer Service will help you start down the path to meeting these, and other, objectives. With Oracle Knowledge products, fully integrated with Oracle's CRM solutions, you can handle increased service demand while driving your costs down, and you can do both while positively impacting the satisfaction and loyalty of your customers. Take advantage of Oracle to get not only a great integrated tool suite, but also the vision to drive you down the path of success.

    Read the article
