Search Results

Search found 7306 results on 293 pages for 'wcf proxy'.

Page 94/293 | < Previous Page | 90 91 92 93 94 95 96 97 98 99 100 101  | Next Page >

  • How to configure mod_proxy for Apache 2 and Jetty

    - by Kaustubh P
    Hello, this is how I have set up my environment at the moment: an Apache 2 instance on port 80, and a Jetty instance on the same server, on port 8090.

    Use case: when I visit foo.com, I should see the webapp, which is hosted on Jetty, port 8090. If I go to foo.com/blog, I should see the WordPress blog, which is hosted on Apache. (I read how-tos on the web and installed it using AMP.) Below are my various configuration files.

    /etc/apache2/mods-enabled/proxy.conf:

        ProxyPass / http://foo.com:8090/ << this is the jetty server
        ProxyPass /blog http://foo.com/blog
        ProxyRequests On
        ProxyVia On
        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>
        ProxyPreserveHost On
        ProxyStatus On

    /etc/apache2/httpd.conf:

        LoadModule proxy_module /usr/lib/apache2/modules/mod_proxy.so
        LoadModule proxy_balancer_module /usr/lib/apache2/modules/mod_proxy_balancer.so
        LoadModule proxy_http_module /usr/lib/apache2/modules/mod_proxy_http.so
        LoadModule proxy_ajp_module /usr/lib/apache2/modules/mod_proxy_ajp.so

    I have not created any other files in sites-available or sites-enabled.

    Current situation: if I go to foo.com, I see the webapp. If I go to foo.com/blog, I see "HTTP ERROR 404 Problem accessing /errors/404.html. Reason: NOT_FOUND powered by jetty://". If I comment out the first ProxyPass line, then on foo.com I only see the homepage without CSS applied, i.e. only text, and going to foo.com/blog gives me this error: "The proxy server received an invalid response from an upstream server. The proxy server could not handle the request GET /blog. Reason: Error reading from remote server". I also cannot access /phpmyadmin, which gives the same 404 NOT_FOUND error as above. I am running Debian Squeeze on an Amazon EC2 instance.

    Question: where am I going wrong? What changes should I make in proxy.conf (or other conf files) to be able to visit the blog?

    Read the article

  • Virtualbox: host only networking - proxy internet connection

    - by Russell
    I'll ask my question first, then give details about where I am coming from: is it possible to use host-only networking, and then have Ubuntu act as a proxy to provide internet access to Windows? If so, how?

    I am trying to get the right combination of networking for my VirtualBox Windows client VM (Win7). My host is Ubuntu 10.10 (Maverick). I believe I understand the basic network options (please correct me if I am incorrect):

        NAT - host can't communicate with the guest, but the guest has access to all the host's adapters
        Host only - separate adapter, but the guest has no net access
        Bridged - bridge an adapter in the host with the virtual adapter to give the guest access to the host's adapter

    I am trying to give my Windows guest internet access, but also access the host on a separate network. Bridged only works when the host is connected to the internet (this is a laptop), so when it's not connected the network is down.

    Thanks, I appreciate your help.

    Read the article

  • Handle HTTPS Request in Proxy Server by C#

    - by Masoud Zohrabi
    I'm trying to write a home proxy server in C# and I have almost succeeded, but I have a problem handling HTTPS requests (CONNECT). I don't really know how to handle this type of request. From what I've read, I understand that for these requests we must connect the client to the target host directly. The steps for these requests, as I understand them:

        1. Receive the first request from the client (CONNECT https://www.example.ltd:443 HTTP/1.1) and send that to the target host
        2. Send HTTP/1.1 200 Connection Established\r\n\r\n to the client
        3. Listen on both sockets (client and target host) and forward whatever each one receives to the other
        4. Keep listening until one of the sockets is disconnected

    Is this correct? If it is, how do I implement it?
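
    For reference, a minimal sketch of steps 2-4 using TcpClient is shown below. This is illustrative only, not the asker's code: it assumes the client connection has already been accepted and the host/port already parsed out of the CONNECT line, and it simply relays raw bytes in both directions.

        using System.Net.Sockets;
        using System.Text;
        using System.Threading.Tasks;

        static class ConnectTunnel
        {
            // 'client' is the already-accepted browser connection; 'host'/'port' come from the CONNECT line.
            // The CONNECT line itself is consumed by the proxy; only the 200 reply goes back to the client.
            public static async Task TunnelAsync(TcpClient client, string host, int port)
            {
                using (var server = new TcpClient())
                {
                    await server.ConnectAsync(host, port);

                    var ok = Encoding.ASCII.GetBytes("HTTP/1.1 200 Connection Established\r\n\r\n");
                    var clientStream = client.GetStream();
                    await clientStream.WriteAsync(ok, 0, ok.Length);

                    var serverStream = server.GetStream();

                    // Relay bytes in both directions until either side closes.
                    var upstream = clientStream.CopyToAsync(serverStream);
                    var downstream = serverStream.CopyToAsync(clientStream);
                    await Task.WhenAny(upstream, downstream);
                }
            }
        }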

    Read the article

  • Confusion installing mongodb manually because of sitting behind a proxy

    - by black sensei
    I am forced to install mongodb manually on an Ubuntu server, because the machine sits behind a proxy and there is no way to temporarily open port 11371 for the key exchange to happen. I am following this official mongoDB tutorial. So I downloaded the tgz and extracted it into /usr/local/bin/mongodb. Where I got confused is when assigning ownership of /data/db to the user mongo. So who creates the user mongo? How do I get the control scripts working?

    Read the article

  • DevDays ‘10 The Netherlands day #2

    - by erwin21
    Day 2 of DevDays 2010, and again 5 interesting sessions at the World Forum in The Hague.

    The first session of the day, in the big World Forum theater, was from Scott Hanselman, who gave a lap around .NET 4.0. In his way of presenting he talked about all kinds of new features of .NET 4.0, like MEF, threading, parallel processing, changes and additions to the CLR and DLR, WPF, and all the new language features of .NET 4.0.

    After a small break it was time for session 2 from Scott Allen about tips, tricks and optimizations of LINQ. He talked about lazy and deferred execution, the difference between IQueryable and IEnumerable, and the two flavors of LINQ syntax.

    The lunch was again very well prepared and delicious, but after that it was time for session 3, Web Vulnerabilities and Exploits, from Alex Thissen. This was no normal session but more like a workshop; we decided what kind of subjects we discussed, and the subjects were OWASP, XSS and other injections, validation, and encoding. He gave some handy tips and tricks on how to prevent such attacks.

    Session 4 was about the new features of C# 4.0, from Alex van Beek. He talked about optional and named parameters, generic co- and contra-variance, the dynamic keyword, and COM interop features. He showed how to use them but also when not to use them.

    The last session of today, and also the last session of DevDays 2010, was about WCF best practices from Gerben van Loon. He talked about 7 best practices that you must know when you are going to use WCF. With some quick demos he showed the problem and the solution for some common issues.

    They were two interesting days, and next year I will surely be attending again.

    Read the article

  • Top 10 Posts in 2010

    - by dwahlin
    Blogging’s a lot of fun and a great way to share what you’ve learned. It’s also a great way to learn, based upon comments people leave that help you see things in an entirely new way in some cases. Since we’ve now moved on to 2011 (Happy New Year’s!) I wanted to list the Top 10 posts from my blog during 2010 based on individual views. Thanks to everyone who follows my blog and adds comments from time to time. Here’s wishing everyone a great 2011!

        1. Reducing Code by Using jQuery Templates
        2. Integrating HTML into Silverlight Applications
        3. Silverlight is Dead, the Moon is Made of Cheese, and HTML 5 is Ready for Prime Time
        4. Understanding the Role of Commanding in Silverlight 4 Applications
        5. New Article – Getting Started with WCF RIA Services
        6. Simplify Your Code with LINQ
        7. My Favorite iPad Apps….So Far
        8. Final Release of Silverlight Tools for Visual Studio 2010 Released
        9. Handling WCF Service Paths in Silverlight 4 – Relative Path Support
        10. Tales from the Trenches – Building a Real-World Silverlight Line of Business Application

    Honorable mention: Getting Started with the MVVM Pattern in Silverlight Applications – posted late 2009, so I’m giving it honorable mention status since it’s still one of the most popular posts.

    Read the article

  • Service Testing made easy with SO-Aware Test Workbench

    - by cibrax
    I'm happy to announce today a new addition to our SO-Aware service repository toolset: SO-Aware Test Workbench, a WPF desktop application for doing functional and load testing against existing WCF services. This tool is completely integrated with the SO-Aware service repository, which makes configuring new load and functional tests for WCF SOAP and REST services a breeze. From now on, the service repository can play a very important role in an organization by facilitating collaboration between developers and testers. Developers can create and register new services in the repository with all the related artifacts, like configuration. Testers, on the other hand, can just pick one of the existing services in the repository and create functional or load tests from there, with no need to deal with specific details of the service implementation, location or configuration settings. Developers and testers can later use the results of those tests to modify the services or adjust different settings on the tests or the service configuration. Gustavo Machado, one of the developers behind this project, has written an excellent post describing all the functionality that you can find today in the tool. You can also see the tool in action in this Endpoint TV episode with Jesus and Ron Jacobs.

    Read the article

  • If I implement a web-service, how do I respond to POST requests with JSON?

    - by Vova Stajilov
    I have to make a rather complex system for my diploma work. Logically it will consist of the following components:

        - Database
        - Web service
        - Management with web interface
        - Client iOS application that will consume the web service

    I decided to implement all of the first three components in .NET. Firstly I will create a database depending on the information load - this is clear. But then I need a web service that will return data in JSON format for the iOS clients to consume - that's obvious and not that hard to implement. For this I will use WCF technology. Now I have a question: if I implement the web service, how will I be able to respond to POST requests with JSON? It probably involves WCF JSON or something related? But I also need some web pages as an admin part, so will this web application be able to consume my centralized web services as well, or should I develop it separately? I just want my web service to act like a set of controllers. There is a related question here but it doesn't quite reflect my question.
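
    As a point of reference, WCF's web programming model handles the JSON POST case directly. A minimal sketch is shown below, assuming the service is exposed over webHttpBinding with the webHttp endpoint behavior enabled; the contract and DTO names are invented for illustration.

        using System.Runtime.Serialization;
        using System.ServiceModel;
        using System.ServiceModel.Web;

        [DataContract]
        public class ClientDto
        {
            [DataMember] public int Id { get; set; }
            [DataMember] public string Name { get; set; }
        }

        [ServiceContract]
        public interface IClientService
        {
            // Accepts a JSON body on POST and serializes the return value back as JSON.
            [OperationContract]
            [WebInvoke(Method = "POST", UriTemplate = "clients",
                       RequestFormat = WebMessageFormat.Json,
                       ResponseFormat = WebMessageFormat.Json)]
            ClientDto AddClient(ClientDto client);
        }

    The same endpoints can, in principle, also be called from the admin web application, though whether to share them or build a separate management API is a design choice rather than a requirement.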

    Read the article

  • What's the best practice for SOA exception handling?

    - by sun1991
    Here's an interesting debate going on between me and my colleague about how to handle SOA exceptions.

    On one side, I support what Juval Lowy said in Programming WCF Services, 3rd Edition: "As stated at the beginning of this chapter, it is a common illusion that clients care about errors or have anything meaningful to do when they occur. Any attempt to bake such capabilities into the client creates an inordinate degree of coupling between the client and the object, raising serious design questions. How could the client possibly know more about the error than the service, unless it is tightly coupled to it? What if the error originated several layers below the service—should the client be coupled to those low-level layers? Should the client try the call again? How often and how frequently? Should the client inform the user of the error? Is there a user? By having all service exceptions be indistinguishable from one another, WCF decouples the client from the service. The less the client knows about what happened on the service side, the more decoupled the interaction will be."

    On the other side, here's what my colleague suggests: "I believe it's simply incorrect, as it does not align with best practices in building a service oriented architecture and it ignores the general idea that there are problems that users are able to recover from, such as not keying a value correctly. If we considered only system exceptions, perhaps this idea holds, but system exceptions are only part of the exception domain. User-recoverable exceptions are the other part of the domain and are likely to happen on a regular basis. I believe the correct way to build a service oriented architecture is to map user-recoverable situations to checked exceptions, then to marshal each checked exception back to the client as a unique exception that client application programmers are able to handle appropriately. Marshal all runtime exceptions back to the client as a system exception, along with the stack trace, so that it is easy to troubleshoot the root cause."

    I'd like to know what you think about this. Thank you.
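
    For concreteness, the middle ground the colleague describes usually maps onto WCF fault contracts, which let a service expose a small set of named, recoverable failures while leaving unexpected errors as opaque faults. The sketch below is illustrative only; the contract and fault type names are invented.

        using System.Runtime.Serialization;
        using System.ServiceModel;

        [DataContract]
        public class ValidationFault
        {
            [DataMember] public string Field { get; set; }
            [DataMember] public string Message { get; set; }
        }

        [ServiceContract]
        public interface IOrderService
        {
            // Declares a recoverable, client-visible failure without exposing implementation details.
            [OperationContract]
            [FaultContract(typeof(ValidationFault))]
            void PlaceOrder(string orderNumber);
        }

        public class OrderService : IOrderService
        {
            public void PlaceOrder(string orderNumber)
            {
                if (string.IsNullOrEmpty(orderNumber))
                {
                    // Reaches the client as FaultException<ValidationFault>, which it can catch specifically.
                    throw new FaultException<ValidationFault>(
                        new ValidationFault { Field = "orderNumber", Message = "Order number is required." });
                }

                // Unexpected exceptions are deliberately left to surface as generic faults.
            }
        }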

    Read the article

  • Is Moving Entity Framework objects over a webservice really the best way?

    - by aceinthehole
    I've inherited a .NET project that has close to two thousand clients out in the field that need to push data periodically up to a central repository. The clients wake up and attempt to push the data up via a series of WCF web services, where they pass each Entity Framework entity as a parameter. Once the service receives this object, it performs some business logic on the data, and then turns around and sticks it in its own database that mirrors the database on the client machines.

    The trick is that this data is being transmitted over a metered connection, which is very expensive, so optimizing the data is a serious priority. Now, we are using a custom encoder that compresses the data (and decompresses it on the other end) while it is being transmitted, and this is reducing the data footprint. However, the amount of data that the clients are using seems ridiculously large, given the amount of information that is actually being transmitted. It seems to me that Entity Framework itself may be to blame. I suspect that the objects are very large when serialized to be sent over the wire, with a lot of context information and who knows what else, when what we really need is just the 'new' inserts.

    Is using Entity Framework and WCF services as we have done so far the correct way, architecturally, of approaching this n-tiered, asynchronous, push-only problem? Or is there a different approach that could optimize the data use?
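
    One commonly suggested direction for the payload concern (noted here as a sketch, not a verdict on the architecture) is to stop sending the EF entities themselves and instead map just the fields the central repository needs onto small data contracts. The type below is invented purely to illustrate the shape of such a DTO.

        using System;
        using System.Runtime.Serialization;

        // Carries only what an insert actually requires, instead of the full EF entity
        // with its navigation properties and change-tracking baggage.
        [DataContract]
        public class ReadingDto
        {
            [DataMember] public int ClientId { get; set; }
            [DataMember] public DateTime RecordedAtUtc { get; set; }
            [DataMember] public double Value { get; set; }
        }

    The service would accept a batch of these DTOs and translate them back into entities on the server side, which keeps the wire format independent of the EF model.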

    Read the article

  • How can I prevent a field from being copied to the client proxy in WCF RIA?

    - by Martin Doms
    Is there a metadata attribute I can use to prevent a field from being accessible on the client in WCF RIA Services? I'm sure I have seen this before, but I'm drawing a blank and Google isn't helping. It would look something like:

        [MetadataType(typeof(User.UserMetadata))]
        public partial class User
        {
            internal sealed class UserMetadata
            {
                private UserMetadata() { }

                public int Id { get; set; }

                [HideFromClientProxy]
                public string PasswordSalt { get; set; }
            }
        }
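
    If memory serves, WCF RIA Services ships an Exclude attribute for exactly this purpose; treat the namespace in the sketch below as an assumption to verify against your installed version rather than a given.

        using System.ComponentModel.DataAnnotations;
        using System.ServiceModel.DomainServices.Server; // assumed home of ExcludeAttribute

        [MetadataType(typeof(User.UserMetadata))]
        public partial class User
        {
            internal sealed class UserMetadata
            {
                private UserMetadata() { }

                public int Id { get; set; }

                [Exclude] // keeps the member out of the generated client proxy
                public string PasswordSalt { get; set; }
            }
        }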

    Read the article

  • Best PerfCounters for monitoring system health of IIS, WCF, WWF and .Net for a Workflow based solution

    - by Gineer
    We have a solution built in .Net that will be installed into a client environment. The solution will span multiple servers and run on multiple tiers. The client makes use of MOM (Microsoft Operations Manager) to monitor the system. What are the best counters to use for monitoring the overall health of the system? Are there any built-in counters that we could add to a MOM pack (as an alert) to test a given scenario? Any thoughts or suggestions would be much appreciated. Thanks

    Read the article

  • WIF, ADFS 2 and WCF – Part 2: The Service

    - by Your DisplayName here!
    OK – so let’s first start with a simple WCF service and connect that to ADFS 2 for authentication. The service itself simply echoes back the user’s claims – just so we can make sure it actually works and to see how the ADFS 2 issuance rules emit claims for the service:

        [ServiceContract(Namespace = "urn:leastprivilege:samples")]
        public interface IService
        {
            [OperationContract]
            List<ViewClaim> GetClaims();
        }

        public class Service : IService
        {
            public List<ViewClaim> GetClaims()
            {
                var id = Thread.CurrentPrincipal.Identity as IClaimsIdentity;

                return (from c in id.Claims
                        select new ViewClaim
                        {
                            ClaimType = c.ClaimType,
                            Value = c.Value,
                            Issuer = c.Issuer,
                            OriginalIssuer = c.OriginalIssuer
                        }).ToList();
            }
        }

    The ViewClaim data contract is simply a DTO that holds the claim information. Next is the WCF configuration – let’s have a look step by step. First I mapped all my http based services to the federation binding. This is achieved by using .NET 4.0’s protocol mapping feature (this can also be done the 3.x way – but in that scenario all services will be federated):

        <protocolMapping>
          <add scheme="http" binding="ws2007FederationHttpBinding" />
        </protocolMapping>

    Next, I provide a standard configuration for the federation binding:

        <bindings>
          <ws2007FederationHttpBinding>
            <binding>
              <security mode="TransportWithMessageCredential">
                <message establishSecurityContext="false">
                  <issuerMetadata address="https://server/adfs/services/trust/mex" />
                </message>
              </security>
            </binding>
          </ws2007FederationHttpBinding>
        </bindings>

    This binding points to our ADFS 2 installation’s metadata endpoint. This is all that is needed for svcutil (aka “Add Service Reference”) to generate the required client configuration. I also chose mixed mode security (SSL + basic message credential) for best performance. This binding also disables sessions – you can control that via the establishSecurityContext setting on the binding. This has its pros and cons. Something for a separate blog post, I guess.

    Next, the behavior section adds support for metadata and WIF:

        <behaviors>
          <serviceBehaviors>
            <behavior>
              <serviceMetadata httpsGetEnabled="true" />
              <federatedServiceHostConfiguration />
            </behavior>
          </serviceBehaviors>
        </behaviors>

    The next step is to add the WIF-specific configuration (in <microsoft.identityModel />). First we need to specify the key material that we will use to decrypt the incoming tokens. This is optional for web applications, but for web services you need to protect the proof key – so this is mandatory (at least for symmetric proof keys, which is the default):

        <serviceCertificate>
          <certificateReference storeLocation="LocalMachine"
                                storeName="My"
                                x509FindType="FindBySubjectDistinguishedName"
                                findValue="CN=Service" />
        </serviceCertificate>

    You also have to specify which incoming tokens you trust. This is accomplished by registering the thumbprint of the signing keys you want to accept. You get this information from the signing certificate configured in ADFS 2:

        <issuerNameRegistry type="...ConfigurationBasedIssuerNameRegistry">
          <trustedIssuers>
            <add thumbprint="d1 … db" name="ADFS" />
          </trustedIssuers>
        </issuerNameRegistry>

    The last step (promised) is to add the allowed audience URIs to the configuration – WCF clients use (by default – and we’ll come back to this) the endpoint address of the service:

        <audienceUris>
          <add value="https://machine/soapadfs/service.svc" />
        </audienceUris>

    OK – that’s it – now we have a basic WCF service that uses ADFS 2 for authentication. The next step will be to set up ADFS to issue tokens for this service. Afterwards we can explore various options on how to use this service from a client. Stay tuned… (if you want to have a look at the full source code or peek at the upcoming parts – you can download the complete solution here)

    Read the article

  • How do I avoid web method parameters using proxy classes?

    - by Alex Angas
    I have a serializable POCO called DataUnification.ClientData.ClientInfo in a .NET class library project A. It's used as a parameter for a web service defined in project B:

        public XmlDocument CreateNewClient(ClientInfo ci, string system)

    I now wish to call this web method from project C and use the original DataUnification.ClientData.ClientInfo type for the parameter. However, due to the generated proxy class it has now become a different type: WebServices.ClientDataUnification.DataUnificationWebService.ClientInfo. As far as .NET is concerned these are not the same types. How can I get around this?
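
    If the service is (or can be exposed as) a WCF service, one common way to keep the original type is to put the contract in an assembly that both projects reference and call it through ChannelFactory<T> instead of a generated proxy. The sketch below assumes exactly that, with an invented endpoint address and the return type simplified to string to keep the example small.

        using System.ServiceModel;
        using DataUnification.ClientData; // shared assembly that defines ClientInfo

        // Contract shared between the service project and project C, so no proxy type is generated.
        [ServiceContract]
        public interface IClientCreation
        {
            [OperationContract]
            string CreateNewClient(ClientInfo ci, string system);
        }

        public static class ClientCreationCaller
        {
            public static string Call(ClientInfo ci)
            {
                var factory = new ChannelFactory<IClientCreation>(
                    new BasicHttpBinding(),
                    new EndpointAddress("http://example.local/DataUnificationService.svc")); // hypothetical address

                IClientCreation channel = factory.CreateChannel();
                try
                {
                    return channel.CreateNewClient(ci, "systemA");
                }
                finally
                {
                    ((IClientChannel)channel).Close();
                    factory.Close();
                }
            }
        }

    For an ASMX-style reference, the equivalent lever is the proxy generator's type-sharing options rather than ChannelFactory.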

    Read the article

  • Castle Dynamic Proxy: is it possible to intercept value types?

    - by JS Future Software
    Hi, I have a problem and cannot find an answer or any tip on whether it is possible to intercept value types in C# with Castle Dynamic Proxy. I want to intercept an IDictionary with an INotifyChanged interface. I need this to update the view when the presenter changes the model. Boxing a decimal in an object only to get an interface is not a good idea... maybe somebody has an idea of how to intercept value types? Thanks for all answers.
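
    For background, Castle DynamicProxy builds proxies by subclassing a class or implementing an interface, so a value type itself cannot be proxied - but access through an interface can be intercepted. A rough sketch of an interface proxy with target follows; the interceptor logic is invented for illustration.

        using System;
        using System.Collections.Generic;
        using Castle.DynamicProxy;

        public class NotifyInterceptor : IInterceptor
        {
            public void Intercept(IInvocation invocation)
            {
                invocation.Proceed(); // run the real dictionary call first

                // React to mutating members; a real implementation would raise a change event here.
                if (invocation.Method.Name == "set_Item" || invocation.Method.Name == "Add")
                {
                    Console.WriteLine("Dictionary changed via {0}", invocation.Method.Name);
                }
            }
        }

        public static class ProxyExample
        {
            public static IDictionary<string, decimal> Wrap(IDictionary<string, decimal> target)
            {
                var generator = new ProxyGenerator();
                // Calls go through the interceptor and are then forwarded to 'target'.
                return generator.CreateInterfaceProxyWithTarget(target, new NotifyInterceptor());
            }
        }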

    Read the article

  • How to reliably categorize HTTP sessions in a proxy to the corresponding browser windows/tabs the user is viewing?

    - by Jehonathan
    I was using the Fiddler core .Net library as a local proxy to record the user's activity on the web. However, I ended up with a problem which seems dirty to solve. I have a web browser, say Google Chrome, and the user has opened, say, 10 different tabs, each with a different URL. The problem is that the proxy records all the HTTP sessions initiated by each page separately, leaving me to figure out, using my own intelligence, which tab the corresponding HTTP session belonged to. I understand that this is because of the stateless nature of the HTTP protocol. However, I am just wondering: is there an easy way to do this?

    I ended up with the C# code below for that in Fiddler. It is still not a reliable solution, due to the heuristics. This is a modification of the sample project bundled with Fiddler core for .NET 4. Basically what it does is filter HTTP sessions initiated in the last few seconds to find the first request, or a switch to another page, made by the same tab in the browser. It almost works, but does not seem to be a universal solution.

        Fiddler.FiddlerApplication.AfterSessionComplete += delegate(Fiddler.Session oS)
        {
            //exclude other HTTP methods
            if (oS.oRequest.headers.HTTPMethod == "GET" || oS.oRequest.headers.HTTPMethod == "POST")
                //exclude other HTTP Status codes
                if (oS.oResponse.headers.HTTPResponseStatus == "200 OK" ||
                    oS.oResponse.headers.HTTPResponseStatus == "304 Not Modified")
                {
                    //exclude other MIME responses (allow only text/html)
                    var accept = oS.oRequest.headers.FindAll("Accept");
                    if (accept != null)
                    {
                        if (accept.Count > 0)
                            if (accept[0].Value.Contains("text/html"))
                            {
                                //exclude AJAX
                                if (!oS.oRequest.headers.Exists("X-Requested-With"))
                                {
                                    //find the referer for this request
                                    var referer = oS.oRequest.headers.FindAll("Referer");
                                    //if no referer then assume this as a new request and display the same
                                    if (referer != null)
                                    {
                                        //if no referer then assume this as a new request and display the same
                                        if (referer.Count > 0)
                                        {
                                            //lock the sessions
                                            Monitor.Enter(oAllSessions);
                                            //filter further using the response
                                            if (oS.oResponse.MIMEType == string.Empty || oS.oResponse.MIMEType == "text/html")
                                                //get all previous sessions with the same process ID as this session's request
                                                if (oAllSessions.FindAll(a => a.LocalProcessID == oS.LocalProcessID)
                                                    //get all previous sessions within the last second (assuming the new tab initiated multiple sessions other than the parent)
                                                    .FindAll(z => (z.Timers.ClientBeginRequest > oS.Timers.ClientBeginRequest.AddSeconds(-1)))
                                                    //get all previous sessions that belong to the same port as the current session
                                                    .FindAll(b => b.port == oS.port).FindAll(c => c.clientIP == oS.clientIP)
                                                    //get all previous sessions with the same referrer URL as the current session
                                                    .FindAll(y => referer[0].Value.Equals(y.fullUrl))
                                                    //get all previous sessions with the same host name as the current session
                                                    .FindAll(m => m.hostname == oS.hostname).Count == 0)
                                                    //if count == 0 that means this is the parent request
                                                    Console.WriteLine(oS.fullUrl);
                                            //unlock sessions
                                            Monitor.Exit(oAllSessions);
                                        }
                                        else
                                            Console.WriteLine(oS.fullUrl);
                                    }
                                    else
                                        Console.WriteLine(oS.fullUrl);

                                    Console.WriteLine();
                                }
                            }
                    }
                }
        };

    Read the article

  • Connect to a site using a proxy in Java

    - by Nithin
    I want to connect to a site through a proxy in Java. This is the code which I have written:

        public class ConnectThroughProxy
        {
            Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("proxy ip", 8080));

            public static void main(String[] args)
            {
                try
                {
                    URL url = new URL("http://www.rgagnon.com/javadetails/java-0085.html");
                    URLConnection connection = url.openConnection();
                    String encoded = new String(Base64.encode(new String("user_name:pass_word").getBytes()));
                    connection.setDoOutput(true);
                    connection.setRequestProperty("Proxy-Authorization", "Basic " + encoded);

                    String page = "";
                    String line;
                    StringBuffer tmp = new StringBuffer();
                    BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()));
                    while ((line = in.readLine()) != null)
                    {
                        page.concat(line + "\n");
                    }
                    System.out.println(page);
                }
                catch (Exception ex)
                {
                    ex.printStackTrace();
                }
            }
        }

    While trying to run this code it throws the following error:

        java.lang.IllegalArgumentException: Illegal character(s) in message header value: Basic dXNlcl9uYW1lOnBhc3Nfd29yZA==
            at sun.net.www.protocol.http.HttpURLConnection.checkMessageHeader(HttpURLConnection.java:323)
            at sun.net.www.protocol.http.HttpURLConnection.setRequestProperty(HttpURLConnection.java:2054)
            at test.ConnectThroughProxy.main(ConnectThroughProxy.java:30)

    Any idea how to do it?

    Read the article

  • C# IE Proxy Detection not working from a Windows Service

    - by mytwocents
    This is very odd. I've written a Windows Forms tool in C# to return the default IE proxy. It works (see the code snippets below). I've created a service to return the proxy using the same code; it works when I step into the debugger, BUT it doesn't work when the service is running as binary code. This must be some sort of permissions or context thing that I cannot recreate. Further, if I call the tool from the Windows service, it still doesn't return the proxy information... this seems isolated to the service somehow. I don't see any permission errors in the event log; it's as if a service has a totally different context than a regular Windows Forms exe. I've tried both:

        WebRequest.GetSystemWebProxy()

    and

        Uri requestUri = new Uri(urlOfDomain);
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUri);
        IWebProxy proxy = request.Proxy;
        if (!proxy.IsBypassed(requestUri))
        {
            Uri uri2 = proxy.GetProxy(request.RequestUri);
        }

    Thanks for any help!

    Read the article

  • WCF - The maximum nametable character count quota (16384) has been exceeded while reading XML data.

    - by Jankhana
    I have a WCF service that uses wsHttpBinding. The server configuration is as follows:

        <bindings>
          <wsHttpBinding>
            <binding name="wsHttpBinding" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647">
              <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                            maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                            maxNameTableCharCount="2147483647" />
              <security mode="None">
                <transport clientCredentialType="Windows" proxyCredentialType="None" realm="" />
                <message clientCredentialType="Windows" negotiateServiceCredential="true"
                         algorithmSuite="Default" establishSecurityContext="true" />
              </security>
            </binding>
          </wsHttpBinding>
        </bindings>

    At the client side I'm including the service reference of the WCF service. It works great if I have a limited number of operations, say 90 OperationContracts in my IService, but if I add one more OperationContract then I'm unable to update the service reference, nor am I able to add the service reference. In this article it's mentioned that it will work after changing those config files (i.e. devenv.exe.config, WcfTestClient.exe.config and SvcUtil.exe.config), but even after including those bindings in the config files the error still pops up:

        There was an error downloading 'http://10.0.3.112/MyService/Service1.svc/mex'.
        The request failed with HTTP status 400: Bad Request.
        Metadata contains a reference that cannot be resolved: 'http://10.0.3.112/MyService/Service1.svc/mex'.
        There is an error in XML document (1, 89549).
        The maximum nametable character count quota (16384) has been exceeded while reading XML data. The nametable is a data structure used to store strings encountered during XML processing - long XML documents with non-repeating element names, attribute names and attribute values may trigger this quota. This quota may be increased by changing the MaxNameTableCharCount property on the XmlDictionaryReaderQuotas object used when creating the XML reader. Line 1, position 89549.
        If the service is defined in the current solution, try building the solution and adding the service reference again.

    Any idea how to solve this?
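
    For reference, the quota named in the error lives on the binding's ReaderQuotas, and it can also be raised programmatically when a channel is built by hand. The sketch below shows that; note it does not change what the Add Service Reference tooling does, since svcutil and Visual Studio read their own .config files.

        using System.ServiceModel;

        public static class QuotaExample
        {
            public static WSHttpBinding CreateGenerousBinding()
            {
                var binding = new WSHttpBinding(SecurityMode.None)
                {
                    MaxReceivedMessageSize = int.MaxValue
                };

                // The quota from the error message: MaxNameTableCharCount on XmlDictionaryReaderQuotas.
                binding.ReaderQuotas.MaxNameTableCharCount = int.MaxValue;
                binding.ReaderQuotas.MaxStringContentLength = int.MaxValue;

                return binding;
            }
        }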

    Read the article

  • Why would Basic Auth not work with my WCF client to Java SOAP Web Service?

    - by orj
    I have a Java-based web service that requires basic authentication to communicate with it. If I type the WSDL URL into my browser I'm prompted for Basic Auth, which I can get past by entering the correct credentials. However, using my WCF client doesn't work. I construct my WCF client like this:

        var binding = new BasicHttpBinding
        {
            MaxReceivedMessageSize = 2048 * 10240,
            Security =
            {
                Mode = BasicHttpSecurityMode.TransportCredentialOnly,
                Transport =
                {
                    ClientCredentialType = HttpClientCredentialType.Basic,
                    Realm = "MYREALM",
                    ProxyCredentialType = HttpProxyCredentialType.None
                },
                Message =
                {
                    ClientCredentialType = BasicHttpMessageCredentialType.UserName,
                    AlgorithmSuite = SecurityAlgorithmSuite.Default
                }
            }
        };

        var client = new WebServiceClient(binding, endpoint);
        client.ClientCredentials.UserName.UserName = username;
        client.ClientCredentials.UserName.Password = password;
        client.DoWebServiceMethod();

    I get the following exceptions:

        System.Net.WebException: The remote server returned an error: (401) Unauthorized.
           at System.Net.HttpWebRequest.GetResponse()
           at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)

        System.ServiceModel.Security.MessageSecurityException: The HTTP request is unauthorized with client authentication scheme 'Basic'. The authentication header received from the server was 'Basic realm="MYREALM"'.

    From what I can tell I'm doing things right. Where am I going wrong?
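
    One cause that comes up frequently with Java servers is that WCF sends Basic credentials only after receiving a 401 challenge, while some servers expect them preemptively. A commonly used workaround, sketched below under that assumption, is to attach the Authorization header explicitly per call; the WebServiceClient type and DoWebServiceMethod call follow the names in the question.

        using System;
        using System.Net;
        using System.ServiceModel;
        using System.ServiceModel.Channels;
        using System.Text;

        public static class PreemptiveBasicAuth
        {
            public static void CallWithBasicHeader(WebServiceClient client, string username, string password)
            {
                using (new OperationContextScope(client.InnerChannel))
                {
                    // Build the Basic Authorization header by hand and attach it to the outgoing HTTP request.
                    var httpRequest = new HttpRequestMessageProperty();
                    httpRequest.Headers[HttpRequestHeader.Authorization] =
                        "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes(username + ":" + password));

                    OperationContext.Current.OutgoingMessageProperties[HttpRequestMessageProperty.Name] = httpRequest;

                    client.DoWebServiceMethod();
                }
            }
        }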

    Read the article

  • How can I inject an object into a WCF IErrorHandler implementation with Castle Windsor?

    - by Michael Johnson
    I'm developing a set of services using WCF. The application is doing dependency injection with Castle Windsor. I've added an IErrorHandler implementation that is added to services via an attribute. Everything is working thus far. The IErrorHandler object (of a class called FaultHandler) is being applied properly and invoked. Now I'm adding logging. Castle Windsor is set up to inject the logger object (an instance of IOurLogger). This is working. But when I try to add it to FaultHandler my logger is null. The code for FaultHandler looks something like this:

        class FaultHandler : IErrorHandler
        {
            public IOurLogger logger { get; set; }

            public bool HandleError(Exception error)
            {
                logger.Write("Exception type {0}. Message: {1}", error.GetType(), error.Message);

                // Let WCF handle things its way. We only want to log.
                return false;
            }

            public void ProvideFault(Exception error, MessageVersion version, Message fault)
            {
            }
        }

    This throws its own exception, since logger is null when HandleError() is called. The logger is being successfully injected into the service itself and is usable there, but for some reason I can't use it in FaultHandler.

    Update: Here is the relevant part of the Windsor configuration file (edited to protect the innocent):

        <configuration>
          <components>
            <component id="Logger"
                       service="Our.Namespace.IOurLogger, Our.Namespace"
                       type="Our.Namespace.OurLogger, Our.Namespace" />
          </components>
        </configuration>
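
    For what it's worth, the usual explanation for the null logger is that WCF, not Windsor, instantiates an error handler that is wired up through an attribute, so the container never gets a chance to inject anything. One workaround is to resolve the dependency explicitly from wherever the application exposes its container; in the sketch below, ContainerAccessor is a hypothetical static holder for the IWindsorContainer, not a Windsor type.

        using System;
        using System.ServiceModel.Channels;
        using System.ServiceModel.Dispatcher;

        class FaultHandler : IErrorHandler
        {
            private readonly IOurLogger logger;

            public FaultHandler()
            {
                // WCF creates this instance directly, so pull the logger out of the container by hand.
                // ContainerAccessor.Container is a hypothetical static reference to the IWindsorContainer.
                logger = ContainerAccessor.Container.Resolve<IOurLogger>();
            }

            public bool HandleError(Exception error)
            {
                logger.Write("Exception type {0}. Message: {1}", error.GetType(), error.Message);
                return false; // let WCF handle the fault; we only log
            }

            public void ProvideFault(Exception error, MessageVersion version, ref Message fault)
            {
            }
        }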

    Read the article

  • How to produce an HTTP 403-equivalent WCF Message from an IErrorHandler?

    - by Andras Zoltan
    I want to write an IErrorHandler implementation that will handle AuthenticationException instances (a proprietary type), and then in the implementation of ProvideFault provide a traditional HTTP response with a status code of 403 as the fault message. So far I have my first best guess wired into a service, but WCF appears to be ignoring the output message completely, even though the error handler is being called. At the moment, the code looks like this:

        public class AuthWeb403ErrorHandler : IErrorHandler
        {
            #region IErrorHandler Members

            public bool HandleError(Exception error)
            {
                return error is AuthenticationException;
            }

            public void ProvideFault(Exception error, MessageVersion version, ref Message fault)
            {
                //first attempt - just a stab in the dark, really
                HttpResponseMessageProperty property = new HttpResponseMessageProperty();
                property.SuppressEntityBody = true;
                property.StatusCode = System.Net.HttpStatusCode.Forbidden;
                property.StatusDescription = "Forbidden";

                var m = Message.CreateMessage(version, null);
                m.Properties[HttpResponseMessageProperty.Name] = property;
                fault = m;
            }

            #endregion
        }

    With this in place, I just get the standard WCF HTML 'The server encountered an error processing the request. See server logs for more details.' - which is what would happen if there was no IErrorHandler. Is this a feature of the behaviours added by WebServiceHost? Or is it because the message I'm building is simply wrong!? I can verify that the event log is indeed not receiving anything. My current test environment is a WebGet method (both XML and JSON) hosted in a service that is created with WebServiceHostFactory, with ASP.NET compatibility switched off. The service method simply throws the exception in question.

    Read the article

  • Is there an easier way to create a WCF/OData Data Service Query Provider?

    - by routeNpingme
    I have a simple little data model resembling the following:

        InventoryContext
        {
            IEnumerable<Computer> GetComputers()
            IEnumerable<Printer> GetPrinters()
        }

        Computer
        {
            public string ComputerName { get; set; }
            public string Location { get; set; }
        }

        Printer
        {
            public string PrinterName { get; set; }
            public string Location { get; set; }
        }

    The results come from a non-SQL source, so this data does not come from Entity Framework connected up to a database. Now I want to expose the data through a WCF OData service. The only way I've found to do that thus far is creating my own Data Service Query Provider, per this blog tutorial: http://blogs.msdn.com/alexj/archive/2010/01/04/creating-a-data-service-provider-part-1-intro.aspx ... which is great, but seems like a pretty involved undertaking. The code for the provider would be 4 times longer than my whole data model to generate all of the resource sets and property definitions.

    Is there something like a generic provider in between Entity Framework and writing your own data source from zero? Maybe some way to build an object data source or something, so that the magical WCF unicorns can pick up my data and ride off into the sunset without having to explicitly code the provider?
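
    For what it's worth, WCF Data Services also has a "reflection provider" middle ground: a context class that exposes IQueryable<T> properties, with a key declared on each entity type, can be served by DataService<T> without a custom query provider. The sketch below shows roughly what that could look like for this model; the attribute and configuration names are from System.Data.Services as I recall them, so verify them against your installed version.

        using System.Collections.Generic;
        using System.Data.Services;
        using System.Data.Services.Common;
        using System.Linq;

        [DataServiceKey("ComputerName")] // the reflection provider needs a key on every entity type
        public class Computer
        {
            public string ComputerName { get; set; }
            public string Location { get; set; }
        }

        [DataServiceKey("PrinterName")]
        public class Printer
        {
            public string PrinterName { get; set; }
            public string Location { get; set; }
        }

        // The context only has to expose IQueryable<T> properties over the in-memory data.
        public class InventoryContext
        {
            public IQueryable<Computer> Computers
            {
                get { return GetComputers().AsQueryable(); }
            }

            public IQueryable<Printer> Printers
            {
                get { return GetPrinters().AsQueryable(); }
            }

            private IEnumerable<Computer> GetComputers() { yield break; } // non-SQL source goes here
            private IEnumerable<Printer> GetPrinters() { yield break; }   // non-SQL source goes here
        }

        public class InventoryService : DataService<InventoryContext>
        {
            public static void InitializeService(DataServiceConfiguration config)
            {
                config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
                config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
            }
        }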

    Read the article

  • WCF identity when moving from dev to prod. environment

    - by Anders Abel
    I have a web service developed with WCF. In the development environment the endpoint has the following identity section under the endpoint configuration:

        <identity>
          <dns value="myservice.devdomain.local" />
        </identity>

    myservice.devdomain.local is the dns name used to reach the development version of the service. The binding used is:

        <basicHttpBinding>
          <binding name="myBinding">
            <security mode="TransportCredentialOnly">
              <transport clientCredentialType="Windows"/>
            </security>
          </binding>
        </basicHttpBinding>

    I am about to put this into production. The binding will be the same, but the address will be a new production address, myservice.proddomain.local. I have planned to change the dns value in the configuration to myservice.proddomain.local in the production environment. However, this MSDN article on WCF Identity makes me worried about the impact on the clients when I change the identity.

    There are two clients - one .NET and one Java - using this service. Both of those have been developed against the dev instance of the service. The idea is to just reconfigure the endpoint used by the clients, without reloading the WSDL. But if the identity is somehow part of the WSDL and the identity changes when deploying to prod, that might not work.

    Will the new identity in the prod version cause issues for the clients that were developed using the dev WSDL? Do the Java and the .NET clients handle this differently?
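
    On the .NET side, the identity a client expects can also be set explicitly in code when pointing it at the production address, so nothing has to be regenerated from the WSDL. A small sketch follows; MyServiceClient is a placeholder for the generated proxy class, and the URL is invented.

        using System;
        using System.ServiceModel;

        public static class ProdEndpointExample
        {
            public static MyServiceClient CreateProdClient()
            {
                var binding = new BasicHttpBinding(BasicHttpSecurityMode.TransportCredentialOnly);
                binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Windows;

                // The DNS identity here just has to match the <dns value="..."/> published by the service.
                var address = new EndpointAddress(
                    new Uri("http://myservice.proddomain.local/MyService.svc"),
                    EndpointIdentity.CreateDnsIdentity("myservice.proddomain.local"));

                return new MyServiceClient(binding, address);
            }
        }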

    Read the article
