Search Results

Search found 50062 results on 2003 pages for 'http 1 0'.


  • Simulate Incorrect Content-Length Headers for HTTP in C#

    - by cfeduke
     We are building a comprehensive integration test framework in C# for our application, which sits on top of HTTP and uses IIS7 to host our applications. As part of our integration tests we want to test incoming requests which will result in EndOfStreamExceptions ("Unable to read beyond end of stream") that occur when a client sends an HTTP header indicating a larger body size than it actually transmits as part of the body. We want to test our error recovery code for this condition, so we need to simulate these sorts of requests. I am looking for a .NET Fx-based socket library or custom HttpWebRequest replacement that specifically allows developers to simulate such conditions to add to our integration test suite. Does anyone know of any such libraries? A scriptable solution would work as well.
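
     A minimal sketch of one way to produce such a request without a dedicated library: open a raw socket and declare a Content-Length larger than the body actually sent. Python is used here purely for illustration (not the .NET library being asked for), and the host, port, and path are placeholders.

         import socket

         # Claim more body bytes than we actually send, so the server's read of
         # the request body hits end-of-stream early.
         HOST, PORT = "localhost", 80          # placeholder test host
         body = b"short-body"
         request = (
             "POST /test HTTP/1.1\r\n"
             "Host: %s\r\n"
             "Content-Type: application/octet-stream\r\n"
             "Content-Length: 1024\r\n"        # deliberately larger than len(body)
             "Connection: close\r\n"
             "\r\n"
         ) % HOST

         with socket.create_connection((HOST, PORT)) as sock:
             sock.sendall(request.encode("ascii") + body)
             sock.shutdown(socket.SHUT_WR)     # stop sending; the body stays short
             print(sock.recv(4096))            # whatever error response comes back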

    Read the article

  • Multiple key/value pairs in HTTP POST where key is the same name

    - by randombits
     I'm working on an API that accepts data from remote clients, where in some cases the key in an HTTP POST almost functions as an array. In English, what this means is: say I have a resource on my server called "class". A class in this sense is the type a student sits in and a teacher educates in. When the user submits an HTTP POST to create a new class for their application, a lot of the key/value pairs look like: student_name: Bob Smith student_name: Jane Smith student_name: Chris Smith What's the best way to handle this on the client side (let's say the client is cURL or ActiveResource, whatever..), and what's a decent way of handling this on the server side if my server is a Ruby on Rails app? I need a way to allow multiple keys with the same name without any namespace clashing or loss of data.
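
     For the client side, a minimal sketch using Python's requests library (standing in for cURL or ActiveResource, purely for illustration): a list of tuples lets the same key repeat in the form body, and suffixing the key with [] follows the Rails convention for parsing repeated values into an array in params. The endpoint and key names are placeholders.

         import requests

         # A list of (key, value) tuples preserves duplicate keys, which a dict cannot.
         payload = [
             ("class[student_names][]", "Bob Smith"),
             ("class[student_names][]", "Jane Smith"),
             ("class[student_names][]", "Chris Smith"),
         ]

         # Placeholder URL; Rails would expose this as params[:class][:student_names]
         resp = requests.post("https://example.com/classes", data=payload)
         print(resp.status_code)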

    Read the article

  • Problem getting an HTTP response in Chrome

    - by Bhaskasr
     Hi, I am trying to get an HTTP response from a PHP web service in JavaScript, but I am getting null in Firefox and Chrome. Please tell me where I am making a mistake. Here is my code: function fetch_details() { if (window.XMLHttpRequest) { xhttp=new XMLHttpRequest() alert("first"); } else { xhttp=new ActiveXObject("Microsoft.XMLHTTP") alert("sec"); } alert("hi."); xhttp.open("GET","http://122.166.97.94:8080/proxim_live/phpsend.php?UserID=881&DeviceID=Imm123&LastSyncDate=",false); xhttp.send(""); xmlDoc=xhttp.responseXML; alert(xmlDoc); alert(xmlDoc.getElementsByTagName("Inbox")[0].childNodes[0].nodeValue); }

    Read the article

  • Why won't IIS serve my website? - 404 Page Not Found

    - by Giffyguy
    Built a brand new server, with a fresh copy of Windows Server 2003 Enterprise x86 Edition. Installed the .NET Framework 1.1, 2.0, 3.5, and 4.0 Added the "Domain Controller" and "Application Server" roles. Created a new website, pointed it to a local directory: C:\Inetpub\angryoctopus.net\ Added the appropriate headers: angryoctopus.net, www.angryoctopus.net, TCP port 80, all IPs Moved the website content into the local directory. Configured the default document in IIS: Default.aspx Enabled ASP.NET for this website, and set it to the correct version: 2.0.50727 Configured the zone angryoctopus.net in DNS. Tested DNS lookup here to ensure DNS was functional. Opened website in VS 2008 and re-built (and debugged) to ensure the content was functional. I can clearly see that IIS is responding normally, by browsing directly to my server's IP address. Since this does not use the angryoctopus HTTP header, the default website is displayed instead: the "Under Construction" page. And yet, after all of this, angryoctopus.net still returns 404. Does anybody know what could be wrong? What troubleshooting steps have I forgotten? Is there a command-line diagnostic that might provide more information?

    Read the article

  • Securely persist session between https://secure.yourname.com and http://www.yourname.com on a Rails app

    - by Matt
    My rails site posts to a secure host (e.g. 'https://secure.yourname.com') when the user logs into the site. Session data is stored in the database, with the cookie containing only the session ID. The problem is that when the user returns to a non-https page, such as the home page (e.g. 'http://www.yourname.com') the user appears to have logged out. I believe the reason for this is that a separate cookie is stored for each host (www vs. secure). Is this correct? What is the best secure way to persist the session between both the http and https sections of the site? Does anyone know of any plugins that address this problem? The site runs on Heroku.
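
     One common approach (a sketch, not a full answer to the security question) is to scope the session cookie to the parent domain so both www.yourname.com and secure.yourname.com present the same session ID; in Rails this usually corresponds to the session :domain option. The snippet below, in Python purely for illustration and with placeholder values, just shows the Domain attribute on the Set-Cookie header doing that work.

         from http import cookies

         # Scope the cookie to the parent domain so every subdomain receives it.
         c = cookies.SimpleCookie()
         c["_session_id"] = "abc123"                   # placeholder session id
         c["_session_id"]["domain"] = ".yourname.com"  # leading dot covers subdomains
         c["_session_id"]["path"] = "/"
         c["_session_id"]["httponly"] = True

         print(c.output())  # Set-Cookie: _session_id=abc123; Domain=.yourname.com; ...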

    Read the article

  • Subversion: svn protocol with HTTP/HTTPS proxy

    - by Neeraj
     Hi all, I need to do an svn checkout, say svn checkout svn://XYZ.com/trunk. I am using the svn client from behind a proxy. I have accessed other repositories using the http protocol in the past, but with the svn protocol it fails with "Connection refused"; the reason, I think, is that the port is not allowed by the proxy. However, the HTTP protocol is not supported on the server. svn+ssh does get connected, but it prompts for an account on that server, which I don't have. Is there any way out other than requesting an account? Note that I can't affect the settings of the proxy server.

    Read the article

  • Nginx won't send POST to fastcgi backend, but GET works fine?

    - by xyld
     Not sure why, but it is happy sending a GET to the fastcgi backend (Mercurial hgwebdir in this case), but simply resorts to the filesystem if the request is a POST. Relevant parts of nginx.conf: location / { root /var/www/htdocs/; index index.html; autoindex on; } location /hg { fastcgi_pass unix:/var/run/hg-fastcgi.socket; include fastcgi_params; if ($request_uri ~ ^/hg([^?#]*)) { set $rewritten_uri $1; } limit_except GET { allow all; deny all; auth_basic "hg secured repos"; auth_basic_user_file /var/trac.htpasswd; } fastcgi_param SCRIPT_NAME "/hg"; fastcgi_param PATH_INFO $rewritten_uri; # for authentication fastcgi_param AUTH_USER $remote_user; fastcgi_param REMOTE_USER $remote_user; #fastcgi_pass_header Authorization; #fastcgi_intercept_errors on; } GETs work fine, but a POST delivers this error to the error_log: 2010/05/17 14:12:27 [error] 18736#0: *1601 open() "/usr/html/hg/test" failed (2: No such file or directory), client: XX.XX.XX.XX, server: domain.com, request: "POST /hg/test HTTP/1.1", host: "domain.com" What could possibly be the issue? I'm trying to allow read-only access via GET to the page, but require authorization when using hg push to the same URL, which sends a POST request.

    Read the article

  • How to clone a mercurial repository over an ssh connection initiated by fabric when HTTP authorization is required

    - by Monika Sulik
    I'm attempting to use fabric for the first time and I really like it so far, but at a certain point in my deployment script I want to clone a mercurial repository. When I get to that point I get an error: err: abort: http authorization required My repository requires http authorization and fabric doesn't prompt me for the user and password. I can get around this by changing my repository address from: https://hostname/repository to: https://user:password@hostname/repository But for various reasons I would prefer not to go this route. Are there any other ways in which I could bypass this problem?
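
     One possible workaround, sketched below with made-up task and auth-section names: prompt for the credentials inside the fabric task and hand them to Mercurial through its [auth] configuration on the command line, so they never appear in the repository URL (a Fabric 1.x-style API is assumed).

         from getpass import getpass
         from fabric.api import prompt, run   # Fabric 1.x API

         def clone_repo():
             # Collect the HTTP-auth credentials interactively at deploy time.
             user = prompt("Repository username: ")
             password = getpass("Repository password: ")
             # Note: the password is still visible in the remote process list
             # while the command runs; an ssh:// clone would avoid that entirely.
             run("hg clone"
                 " --config auth.repo.prefix=hostname/repository"
                 " --config auth.repo.username=%s"
                 " --config auth.repo.password=%s"
                 " https://hostname/repository" % (user, password))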

    Read the article

  • WCF service hosted in IIS 5.0 - http get methods not invoked through the browser

    - by Dmitry
     I have a service hosted in IIS 5.0 and configured this way: <behaviors> <endpointBehaviors> <behavior name="GalleriesBehavior"> <webHttp/> </behavior> When I make a request to http://localhost/sandboxWidget/Galleries.svc?Test, the message I get is "Metadata publishing for this service is currently disabled." along with instructions on how to expose metadata exchange... I tried this but it doesn't help. I can write any function name, even one that doesn't exist (http://localhost/sandboxWidget/Galleries.svc?blabla), but the message is the same. I'm new to WCF, but what looks strange to me is that the same application was working in my job environment, but not on my home PC... I have the same VS2008 and .NET 3.5 with Service Pack 1.

    Read the article

  • Good tool to test WCF service (not SOAP/HTTP webservice)

    - by Kangkan
     I recently found SOAPUI and discovered that it is just a great tool for testing any SOAP/HTTP service. Conventionally, we have been developing our own driver to test our services (WCF based, netTCP binding) so far. But with the SOAPUI experience, I am really looking for a tool that can be used with similar ease, with built-in facilities for load testing, functional testing, etc. The other thought in my mind is that services I intend to deploy with netTCP could first be tested using an HTTP binding with SOAPUI; once found suitable, the binding can be changed to the intended one. I would like to hear views from the experts here. Thanks.

    Read the article

  • Is it valid to replace http:// with // in a <script src="http://...">?

    - by Darryl Hein
     I have the following tag: <script type="text/javascript" src="https://cdn.example.com/js_file.js"></script> In this case the site is HTTPS, but the site may also be just HTTP. (The JS file is on another domain.) I'm wondering if it's valid to do the following for convenience's sake: <script type="text/javascript" src="//cdn.example.com/js_file.js"></script> In other words, is it valid to remove the http: or https:? It seems to work everywhere I have tested, but are there any cases where it doesn't work?

    Read the article

  • How can I enable IIS to run Perl scripts?

    - by eidylon
     We're trying to get awstats up and running on our IIS6 server. awstats is running fine and generating output and all that jazz... no problem there. When trying to change the selected month/year in the output page, though, it is trying to run awstats.pl through IIS and coming up with a 404 error. To debug, I made a simple hello.pl in my root and tried to run that; it also 404s. I followed the directions on this page http://support.microsoft.com/kb/245225 regarding installing ActiveState Perl and then configuring IIS. I added the extension mapping on my directory and registered the web services extension as directed. The Perl scripts all run fine and produce output when run from the command line, so I know Perl is good, but I can't get IIS to find the files. Here is the configuration on the Home Directory tab of my site: Here is the configuration of my web service extension: I turned on directory browsing for this site, and when I get the listing of the directory, IIS actually does show the .pl files being in the directory. But if I click on one of them, I get the 404 error. 12/17 15:22 Also tried adding .pl as a mime-type on my site's configuration. This did not help. 12/17 16:57 Also tried Everyone Read/Execute permissions on both the Perl directory and the directory housing awstats. This did not help.

    Read the article

  • getRequestProperty("Authorization") always returns null

    - by Thilo
    I am trying to read the authorization header for an HTTP request (because I need to add something to it), but I always get null for the header value. Other headers work fine. public void testAuth() throws MalformedURLException, IOException{ URLConnection request = new URL("http://google.com").openConnection(); request.setRequestProperty("Authorization", "MyHeader"); request.setRequestProperty("Stackoverflow", "anotherHeader"); // works fine assertEquals("anotherHeader", request.getRequestProperty("Stackoverflow")); // Auth header returns null assertEquals("MyHeader", request.getRequestProperty("Authorization")); } Am I doing something wrong? Is this a "security" feature? Is there a way to make this work with URLConnection, or do I need to use another HTTP client library?

    Read the article

  • Is Webhooks a style/pattern or a specification?

    - by Emilio
     I've been reading about Webhooks and I'm trying to determine if it's a specification or a style/pattern. By "specification" I mean that the implementation details, e.g. headers, payload and so on, are well defined. By "style" or "pattern" I mean in the sense that REST is a style (as opposed to a spec) or a pattern which describes usage but doesn't define implementation details. From what I see, Webhooks is a style/pattern: the event(s) which trigger the http callbacks are generated however the developer wants, and the http callbacks have no specific implementation requirements except to be an http POST. Is this correct?
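
     To make the style/pattern reading concrete, a receiver can be nothing more than an ordinary HTTP POST endpoint whose payload shape is whatever the two sides agree on; a minimal sketch (Python standard library, placeholder port, no authentication) follows.

         from http.server import BaseHTTPRequestHandler, HTTPServer

         class WebhookHandler(BaseHTTPRequestHandler):
             # A webhook receiver is just a POST handler; the payload format
             # (JSON, form fields, XML) is a per-integration convention.
             def do_POST(self):
                 length = int(self.headers.get("Content-Length", 0))
                 payload = self.rfile.read(length)
                 print("callback received:", payload[:200])
                 self.send_response(200)
                 self.end_headers()

         if __name__ == "__main__":
             HTTPServer(("", 8080), WebhookHandler).serve_forever()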

    Read the article

  • I can't change HTTP request header Content-Type value using jQuery

    - by Matt
     Hi, I tried to override the HTTP request header content using jQuery's AJAX function. It looks like this: $.ajax({ type : "POST", url : url, data : data, contentType: "application/x-www-form-urlencoded;charset=big5", beforeSend: function(xhr) { xhr.setRequestHeader("Accept-Charset","big5"); xhr.setRequestHeader("Content-Type","application/x-www-form-urlencoded;charset=big5"); }, success: function(rs) { target.html(rs); } }); The Content-Type header defaults to "application/x-www-form-urlencoded; charset=UTF-8", but obviously I can't override its value whether I use the 'contentType' or the 'beforeSend' approach. Could anyone give me a hint on how (or whether) I can change the HTTP request's Content-Type value? Thanks a lot. By the way, is there any good documentation where I can study JavaScript's XMLHttpRequest encoding handling?

    Read the article

  • Getting an error when running compiled HttpWebRequest code

    - by Afnan
     I have written a program to search for a value on Google. Everything works fine, but the first time the page is loaded I encounter an error. Afterwards, if I click any link it works fine with no further errors. The code is as follows: private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e) { string raw = "http://www.google.com/search?hl=en&q={0}&aq=f&oq=&aqi=n1g10"; string search = string.Format(raw, HttpUtility.UrlEncode(searchTerm)); //string search = "http://www.whatismyip.com/"; HttpWebRequest request = (HttpWebRequest)WebRequest.Create(search); using (HttpWebResponse response = (HttpWebResponse)request.GetResponse()) { using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.ASCII)) { browserA = reader.ReadToEnd(); this.Invoke(new EventHandler(IE1)); } } }

    Read the article

  • How should I handle searching through byte arrays in Java?

    - by Zombies
     Preliminary: I am writing my own httpclient in Java. I am trying to parse out the contents of chunked encoding. Here is my dilemma: since I am trying to parse out chunked HTTP transfer encoding with a gzip payload, there is a mix of ASCII and binary. I can't just take the HTTP response content and convert it to a string and make use of StringUtils, since the binary data can easily contain null characters. So what I need to do is some basic things for parsing out each chunk and its chunk length (as per the chunked transfer/HTTP/1.1 spec). Are there any helpful ways of searching through byte arrays of binary/part-ASCII data for certain patterns (like a CR LF) instead of just a single byte? Or must I write the for loops for this?
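
     For the pattern-search part, a sketch of the idea in Python rather than Java (illustration only): scan the raw bytes for the two-byte CR LF delimiter, read the hex chunk-size line before it, and slice the payload out without ever converting the binary body to a string. Chunk extensions and the terminating zero-size chunk are ignored here.

         def find_pattern(data, pattern, start=0):
             """Naive scan for a multi-byte pattern; returns the index or -1."""
             last = len(data) - len(pattern)
             for i in range(start, last + 1):
                 if data[i:i + len(pattern)] == pattern:
                     return i
             return -1

         def read_chunk(data, offset):
             """Parse one chunk of a chunked transfer body starting at offset."""
             crlf = b"\r\n"
             line_end = find_pattern(data, crlf, offset)
             size = int(data[offset:line_end], 16)          # chunk size is hex ASCII
             chunk_start = line_end + len(crlf)
             chunk = data[chunk_start:chunk_start + size]   # raw (possibly gzip) bytes
             next_offset = chunk_start + size + len(crlf)   # skip the trailing CRLF
             return chunk, next_offset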

    Read the article

  • Force PHP through the .NET engine in IIS7

    - by Rippo
     I have converted a PHP site to ASP.NET MVC and have it hosted with the Rackspace cloud. All works great, apart from some PHP links that are still linked from other sites and within search engines. My question is: what do I need to add to my web.config to force .php URLs to go through the .NET engine? These links work as expected, as I can catch the 404 and redirect where need be: http://www.securahome.net/myjunk.info http://www.securahome.net/myjunk.phpp However this one doesn't: http://www.securahome.net/myjunk.php I have spoken to the Rackspace cloud and they say "its not possible as IIS doesn't recognize php files. You can setup mime types to handle them" This, however, makes no sense and I think they did not understand the problem. Does anyone have a solution?

    Read the article

  • How many concurrent HTTP requests can Erlang handle?

    - by user209123
     I am developing an application for benchmarking purposes, for which I need to create a large number of HTTP connections in a short time. I created a program in Java to test how many threads Java is able to create; it turns out that on my 2GB single-core machine the limit varies between 5000 and 6000 with 1 GB of memory given to the JVM, after which it hits an OutOfMemoryError with the heap limit reached. It is suggested that Erlang can create far more concurrent processes. I am willing to learn Erlang if it is capable of solving the problem, although I am interested in knowing whether Erlang can generate somewhere around 100,000 processes, each essentially an HTTP request waiting for a response, in a matter of a few seconds without hitting any limit like a memory error.
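
     Not an Erlang answer, but for comparison the sketch below (Python asyncio, placeholder host) shows the shape being asked about: many in-flight requests as cheap coroutines parked on sockets rather than one OS thread each. Whether 100,000 fit in a few seconds still depends on file-descriptor limits, kernel settings, and memory, on any runtime.

         import asyncio

         async def fetch(host, path="/"):
             # One coroutine per request: a small state object, not a thread stack.
             reader, writer = await asyncio.open_connection(host, 80)
             writer.write(("GET %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (path, host)).encode())
             await writer.drain()
             status_line = await reader.readline()
             writer.close()
             return status_line

         async def main(n):
             tasks = [fetch("example.com") for _ in range(n)]      # placeholder host
             results = await asyncio.gather(*tasks, return_exceptions=True)
             ok = sum(1 for r in results if isinstance(r, bytes))
             print(ok, "responses out of", n)

         asyncio.run(main(1000))   # raise n toward 100000 as ulimit -n allows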

    Read the article

  • [Rails] HTTP Get Request

    - by Karl
    I've been trying to get Rails to play with the new Facebook Graph API. After I get the authorization "code", I need to send another request which returns the access token in JSON form. It seems to work fine, however I want to fetch the access token JSON without redirecting the user. I'm attempting to use Net::HTTP.get, but I'm not sure how to use it to get a request body, or even if it's the right thing to use to begin with. Can anyone give an example of performing an HTTP GET?
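
     For the general shape of the call, a sketch in Python's http.client rather than Ruby's Net::HTTP (illustration only): a plain GET whose response body is read directly, with no redirect involved. The host and path mirror the Graph API token endpoint of the time, but treat them and the parameter values as placeholders.

         import http.client
         import urllib.parse

         # Exchange the authorization code for an access token and read the body.
         params = urllib.parse.urlencode({
             "client_id": "APP_ID",                            # placeholders
             "redirect_uri": "http://example.com/callback",
             "client_secret": "APP_SECRET",
             "code": "CODE_FROM_THE_REDIRECT",
         })

         conn = http.client.HTTPSConnection("graph.facebook.com")
         conn.request("GET", "/oauth/access_token?" + params)
         resp = conn.getresponse()
         print(resp.status, resp.read().decode())              # body holds the token
         conn.close()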

    Read the article

  • What is the current standard for authenticating Http requests (REST, Xml over Http)?

    - by CodeToGlory
     The standard should address the following authentication challenges: replay attacks, man-in-the-middle attacks, plaintext attacks, dictionary attacks, brute-force attacks, and spoofing by counterfeit servers. I have already looked at Amazon Web Services and that is one possibility. More importantly, there seem to be two common approaches: use an apiKey which is encoded in a similar fashion to AWS but sent as a POST parameter on the request, or use the HTTP Authorization header with an AWS-like signature. The signature is typically obtained by signing a date stamp with an encrypted shared secret, and is then passed either as an apiKey or in the HTTP Authorization header. I would like the community to weigh both options, especially anyone who has used one or more of them, and would also like to explore other options I am not considering. I would also use HTTPS to secure my services.
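
     A sketch of the second approach described above (an AWS-style signature over a date stamp with a shared secret), in Python purely for illustration; the header layout, string-to-sign, and key names are conventions you would define for your own API, not a standard.

         import base64
         import hashlib
         import hmac
         from email.utils import formatdate

         SHARED_SECRET = b"per-client-secret"     # issued to the client out of band
         API_KEY = "client-public-id"

         def sign_request(method, path, date_header):
             # Including the date in the string-to-sign bounds how long a captured
             # signature can be replayed; the server rejects stale dates.
             string_to_sign = "\n".join([method, path, date_header]).encode("utf-8")
             digest = hmac.new(SHARED_SECRET, string_to_sign, hashlib.sha256).digest()
             return base64.b64encode(digest).decode("ascii")

         date_header = formatdate(usegmt=True)
         signature = sign_request("GET", "/v1/orders", date_header)
         headers = {
             "Date": date_header,
             # Hypothetical scheme name; the key identifies the client, the signature proves it
             "Authorization": "MyAPI %s:%s" % (API_KEY, signature),
         }
         print(headers)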

    Read the article

  • Force encoding with IIS 7

    - by Cédric Boivin
     I am trying to force encoding with IIS 7. When I add the key Content-Type with the value charset=utf-8 to the HTTP response headers, I get this: content-type: text/html,content-type=utf-8. Is there a way to remove the comma? Thanks Justin for your answer, but it seems not to work. Here is my config; I need to do this for classic ASP. <?xml version="1.0" encoding="UTF-8"?> <configuration> <system.webServer> <staticContent> <remove fileExtension=".html" /> <remove fileExtension=".hxt" /> <remove fileExtension=".htm" /> <remove fileExtension=".asp" /> <mimeMap fileExtension=".htm" mimeType="text/html" /> <mimeMap fileExtension=".hxt" mimeType="text/html" /> <mimeMap fileExtension=".html" mimeType="text/html" /> <mimeMap fileExtension=".asp" mimeType="text/html; charset=UTF-8" /> </staticContent> </system.webServer> </configuration>

    Read the article

  • Why are cookies only sent to http://www.example.com and NOT http://example.com?

    - by Axel
     I have a PHP login which sets 2 cookies once someone logs in. The problem is that if you log in from http://www.example.com and you then go to http://example.com, you will find yourself not logged in. I think that is because the browser only sends the cookies to the hostname they were set for. It is only one domain; the difference is the www. before the domain name, so how do I set cookies for the whole domain, whether there is a www. or not? <?php setcookie('username',$username,time()+3600); ?>

    Read the article
