Search Results

Search found 92294 results on 3692 pages for 'http error 401 2'.

Page 38/3692 | < Previous Page | 34 35 36 37 38 39 40 41 42 43 44 45  | Next Page >

  • HTTP Content-type header for cached files

    - by Brian
    Hello, Using Apache with mod_rewrite, when I load a .css or .js file and view the HTTP headers, the Content-type is only set correctly the first time I load it - subsequent refreshes are missing the Content-type header altogether, and that is causing problems for me. I can work around this by appending a random query string value to the end of each filename, e.g. http://www.site.com/script.js?12345 However, I don't want to have to do that, since caching is good and all I want is for the Content-type to be present. I've tried using a RewriteRule to force the type, but that still didn't solve the problem. Any ideas? Thanks, Brian

    Read the article

  • Are rails timers reliable when using Net::HTTP?

    - by Frank
    Hi All. When reading data from a potentially slow website, I want to ensure that get_response cannot hang, so I added a timer to time out after x seconds. So far, so good. I then read http://ph7spot.com/musings/system-timer which illustrates that in certain situations timer.rb doesn't work due to Ruby's implementation of threads. Does anyone know if this is one of those situations?

        url = URI.parse(someurl)
        begin
          Timeout::timeout(30) do
            response = Net::HTTP.get_response(url)
            @responseValue = CGI.unescape(response.body)
          end
        rescue Exception => e
          dosomething
        end

    Read the article

  • Jaxer and HTTP proxy requests...

    - by rakhavan
    Thanks to everyone in advance. I'm using Jaxer.sandbox and making requests just fine. I'd like these requests to go through my HTTP proxy (Squid, for example). Here is the code that is currently working for me.

        window.onload = function() {
          // the url to scrape
          var url = "http://www.cnn.com/";
          // our sandboxed browser
          var sandbox = new Jaxer.Sandbox();
          // open options
          var openOptions = new Jaxer.Sandbox.OpenOptions();
          openOptions.allowJavaScript = false;
          openOptions.allowMetaRedirects = false;
          openOptions.allowSubFrames = false;
          openOptions.onload = function() {
            // do something onload
          };
          // make the call
          sandbox.open(url, null, openOptions);
          // write the response
          Jaxer.response.setContents(sandbox.toHTML());
        };

    How can I send this request through a proxy server? Thanks, Reza.

    Read the article

  • How to display characters in http get response correctly with the right encoding

    - by DixieFlatline
    Hello! Does anyone know how to read the č, š, ž characters in an HTTP GET response properly? When I make my request in a browser, the browser displays all characters correctly. But in a Java program with the Apache jars I don't know how to set the encoding right. I tried client.getParams().setParameter(CoreProtocolPNames.HTTP_CONTENT_CHARSET, "UTF-8"); but it's not working. My code:

        try {
            HttpClient client = new DefaultHttpClient();
            String getURL = "http://www.google.com";
            HttpGet get = new HttpGet(getURL);
            HttpResponse responseGet = client.execute(get);
            HttpEntity resEntityGet = responseGet.getEntity();
            if (resEntityGet != null) {
                Log.i("GET RESPONSE", EntityUtils.toString(resEntityGet));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
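
    A detail that often matters here: EntityUtils.toString(entity) with no charset argument falls back to ISO-8859-1 whenever the server's Content-Type header carries no charset, so č, š and ž come out mangled even if the bytes are valid UTF-8. A minimal sketch of reading the body with an explicit charset (the URL is a placeholder, not from the question):

        import org.apache.http.HttpEntity;
        import org.apache.http.HttpResponse;
        import org.apache.http.client.HttpClient;
        import org.apache.http.client.methods.HttpGet;
        import org.apache.http.impl.client.DefaultHttpClient;
        import org.apache.http.util.EntityUtils;

        public class Utf8GetSketch {
            public static void main(String[] args) throws Exception {
                HttpClient client = new DefaultHttpClient();
                HttpGet get = new HttpGet("http://example.com/page.html"); // placeholder URL
                HttpResponse response = client.execute(get);
                HttpEntity entity = response.getEntity();
                if (entity != null) {
                    // Pass the charset explicitly; without it, HttpClient falls back to
                    // ISO-8859-1 when the server omits "; charset=..." in Content-Type.
                    String body = EntityUtils.toString(entity, "UTF-8");
                    System.out.println(body);
                }
                client.getConnectionManager().shutdown();
            }
        }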

    Read the article

  • HTTP Data chunks over multiple packets?

    - by myforwik
    What is the correct way for an HTTP server to send data over multiple packets? For example, I want to transfer a file, and the first packet I send is:

        HTTP/1.1 200 OK
        Content-type: application/force-download
        Content-Type: application/download
        Content-Type: application/octet-stream
        Content-Description: File Transfer
        Content-disposition: attachment; filename=test.dat
        Content-Transfer-Encoding: chunked

        400
        <first 1024 bytes here>
        400
        <next 1024 bytes here>
        400
        <next 1024 bytes here>

    Now I need to make a new packet, and if I just send:

        400
        <next 1024 bytes here>

    all the clients close their connections on me and the files are cut short. What headers do I put in a second packet to continue on with the data stream?
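
    For reference, HTTP/1.1 chunked framing is a property of the byte stream, not of TCP packets: the header is Transfer-Encoding: chunked (not Content-Transfer-Encoding), each chunk is a hex length, CRLF, the data, CRLF, and the stream ends with a zero-length chunk; later packets carry no additional headers. A minimal sketch of that framing over a raw socket, in Java with placeholder port and file name:

        import java.io.FileInputStream;
        import java.io.OutputStream;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class ChunkedResponseSketch {
            public static void main(String[] args) throws Exception {
                ServerSocket server = new ServerSocket(8080);           // placeholder port
                Socket client = server.accept();                        // request parsing omitted
                OutputStream out = client.getOutputStream();
                out.write(("HTTP/1.1 200 OK\r\n"
                        + "Content-Type: application/octet-stream\r\n"
                        + "Content-Disposition: attachment; filename=test.dat\r\n"
                        + "Transfer-Encoding: chunked\r\n"
                        + "\r\n").getBytes("US-ASCII"));
                FileInputStream file = new FileInputStream("test.dat"); // placeholder file
                byte[] buf = new byte[1024];
                int n;
                while ((n = file.read(buf)) > 0) {
                    // Each chunk: size in hex, CRLF, the bytes, CRLF - no further HTTP
                    // headers, regardless of how TCP splits this into packets.
                    out.write((Integer.toHexString(n) + "\r\n").getBytes("US-ASCII"));
                    out.write(buf, 0, n);
                    out.write("\r\n".getBytes("US-ASCII"));
                }
                out.write("0\r\n\r\n".getBytes("US-ASCII"));            // terminating zero-length chunk
                out.flush();
                file.close();
                client.close();
                server.close();
            }
        }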

    Read the article

  • Ruby on Rails: reducing complexity of parameters in a RESTful HTTP POST request (multi-model)

    - by randombits
    I'm using cURL to test a RESTful HTTP web service. The problem is that I'm normally submitting a bunch of values like this:

        curl -d "firstname=bob&lastname=smith&age=25&from=kansas&someothermodelattr=val" -H "Content-Type: application/x-www-form-urlencoded" http://mysite/people.xml -i

    The problem with this is that my controller will then have code like this:

        unless params[:firstname].nil?
        end
        unless params[:lastname].nil?
        end
        # FINALLY
        @person = People.new(params[:firstname], params[:lastname], params[:age], params[:from])

    etc. What's the best way to simplify this? My Person model has all the validations it needs. Is there a way (assuming the request has multi-model parameters) that I can just do:

        @person = People.new(params[:person])

    and then the initializer can take care of the rest?

    Read the article

  • http response to GET request - working in FF not Chromium

    - by Tyler
    For fun I'm trying to write a very simple server in C. When I send this response to Firefox it prints out the body "hello, world", but Chromium gives me an Error 100 (net::ERR_CONNECTION_CLOSED): Unknown error. This, I believe, is the relevant code:

        char *response = "HTTP/1.0 200 OK\r\n"
                         "Vary: Accept-Encoding, Accept-Language\r\n"
                         "Connection: Close\r\n"
                         "Content-Type: text/plain\r\n"
                         "Content-Length:20\r\n"
                         "\r\n"
                         "hello, world";
        if (send(new_fd, response, strlen(response), 0) == strlen(response)) {
            printf("sent\n");
        }
        close(new_fd);

    What am I missing? Thanks!
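
    One framing rule worth checking against this: Content-Length must equal the exact number of body bytes ("hello, world" is 12 bytes, not 20), otherwise a strict client sees the connection close before the promised bytes arrive. The question's server is in C, but here is a minimal sketch of the same response in Java (port is a placeholder), deriving the header from the body:

        import java.io.OutputStream;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class TinyHttpSketch {
            public static void main(String[] args) throws Exception {
                ServerSocket server = new ServerSocket(8080);    // placeholder port
                Socket client = server.accept();                 // request parsing omitted
                byte[] body = "hello, world".getBytes("US-ASCII");
                String head = "HTTP/1.0 200 OK\r\n"
                        + "Content-Type: text/plain\r\n"
                        + "Connection: close\r\n"
                        // Content-Length is the exact number of body bytes (12 here),
                        // never an estimate.
                        + "Content-Length: " + body.length + "\r\n"
                        + "\r\n";
                OutputStream out = client.getOutputStream();
                out.write(head.getBytes("US-ASCII"));
                out.write(body);
                out.flush();
                client.close();
                server.close();
            }
        }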

    Read the article

  • How do I make an HTTP Post with HTTP Basic Authentication, using POCO?

    - by Alyoshak
    I'm trying to make an HTTP POST with HTTP Basic Authentication (cleartext username and password), using POCO. I found an example of a GET and have tried to modify it, but being a rookie I think I've mangled it beyond usefulness. Anyone know how to do this? Yes, I've already seen the other SO question on this: POCO C++ - NET SSL - how to POST HTTPS request, but I can't make sense of how it is trying to implement the username and password part. I also don't understand the use of "x-www-form-urlencoded". Is this required for a POST? I don't have a form. I just want to POST to the server with username and password parameters.
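
    Basic authentication itself is only an Authorization header containing base64("username:password"), and application/x-www-form-urlencoded is just the body format for key=value pairs; no HTML form is needed. A sketch of those mechanics using plain Java's HttpURLConnection rather than POCO (URL, credentials and parameters are placeholders):

        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.util.Base64;

        public class BasicAuthPostSketch {
            public static void main(String[] args) throws Exception {
                URL url = new URL("http://example.com/api/login");   // placeholder URL
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("POST");
                // Basic auth is just this header: "Basic " + base64("username:password").
                String credentials = Base64.getEncoder()
                        .encodeToString("myUser:myPassword".getBytes("UTF-8")); // placeholder credentials
                conn.setRequestProperty("Authorization", "Basic " + credentials);
                // x-www-form-urlencoded only describes how key=value pairs are encoded
                // in the body; no HTML form is involved.
                conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
                conn.setDoOutput(true);
                byte[] body = "param1=value1&param2=value2".getBytes("UTF-8");
                try (OutputStream out = conn.getOutputStream()) {
                    out.write(body);
                }
                System.out.println("HTTP status: " + conn.getResponseCode());
            }
        }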

    Read the article

  • Java HTTP Client Request with defined timeout

    - by Maxim Veksler
    Hello, I would like to run BIT (built-in tests) against a number of servers in my cloud. I need the request to fail once a (large) timeout is exceeded. How should I do this in Java? Trying something like the code below does not seem to work.

        public class TestNodeAliveness {
            public static NodeStatus nodeBIT(String elasticIP) throws ClientProtocolException, IOException {
                HttpClient client = new DefaultHttpClient();
                client.getParams().setIntParameter("http.connection.timeout", 1);
                HttpUriRequest request = new HttpGet("http://192.168.20.43");
                HttpResponse response = client.execute(request);
                System.out.println(response.toString());
                return null;
            }

            public static void main(String[] args) throws ClientProtocolException, IOException {
                nodeBIT("");
            }
        }

    -- EDIT: Clarify what library is being used -- I'm using httpclient from Apache; here is the relevant pom.xml section:

        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
            <version>4.0.1</version>
            <type>jar</type>
        </dependency>
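
    HttpClient 4.x distinguishes the connection timeout (establishing the TCP connection) from the socket timeout (waiting for data), and unless both are set a request can still hang. A sketch using the typed setters against the same 4.0.1 artifact (address and limits are placeholders):

        import org.apache.http.HttpResponse;
        import org.apache.http.client.HttpClient;
        import org.apache.http.client.methods.HttpGet;
        import org.apache.http.impl.client.DefaultHttpClient;
        import org.apache.http.params.HttpConnectionParams;

        public class TimeoutSketch {
            public static void main(String[] args) throws Exception {
                HttpClient client = new DefaultHttpClient();
                // Time allowed to establish the TCP connection, in milliseconds.
                HttpConnectionParams.setConnectionTimeout(client.getParams(), 5000);
                // Time allowed between data packets once connected, in milliseconds.
                HttpConnectionParams.setSoTimeout(client.getParams(), 5000);
                HttpGet request = new HttpGet("http://192.168.20.43/"); // placeholder address
                try {
                    HttpResponse response = client.execute(request);
                    System.out.println(response.getStatusLine());
                } catch (java.net.SocketTimeoutException e) {
                    System.out.println("node did not answer in time: " + e.getMessage());
                } catch (org.apache.http.conn.ConnectTimeoutException e) {
                    System.out.println("could not connect in time: " + e.getMessage());
                } finally {
                    client.getConnectionManager().shutdown();
                }
            }
        }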

    Read the article

  • Best practices retrieving XML/stream from HTTP in Android

    - by Jeffrey
    Hello everyone, what are the best practices for parsing XML from an HTTP resource in Android? I've been using HttpURLConnection to retrieve an InputStream, wrapping it with a BufferedInputStream, and then using SAX to parse the buffered stream. For the most part it works, though I do receive error reports of SocketTimeoutException: The operation timed out, or of general parsing errors. I believe it's due to the InputStream. Would using HttpClient instead of HttpURLConnection help? If yes, why? Should the stream be written to a file first, and the file parsed instead of the stream? Any input or direction would be greatly appreciated. Thanks for your time.
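
    For comparison, a minimal sketch of the pattern described, with explicit connect and read timeouts on HttpURLConnection so a stalled stream surfaces as a timeout instead of an indefinite hang (URL and handler body are placeholders):

        import java.io.BufferedInputStream;
        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import javax.xml.parsers.SAXParser;
        import javax.xml.parsers.SAXParserFactory;
        import org.xml.sax.helpers.DefaultHandler;

        public class XmlFetchSketch {
            public static void main(String[] args) throws Exception {
                URL url = new URL("http://example.com/feed.xml");       // placeholder URL
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setConnectTimeout(10000); // fail fast if the server is unreachable
                conn.setReadTimeout(10000);    // fail if the stream stalls mid-download
                InputStream in = new BufferedInputStream(conn.getInputStream());
                try {
                    SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
                    parser.parse(in, new DefaultHandler() {
                        // override startElement/characters/endElement as needed
                    });
                } finally {
                    in.close();
                    conn.disconnect();
                }
            }
        }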

    Read the article

  • How to communicate/share a session between pages over HTTP and HTTPS

    - by spirytus
    What is common practice for coding web applications where part of the site has to be secured (e.g. the checkout section) and part not necessarily, say the homepage? As far as I know, sharing sessions between the HTTP and HTTPS parts of a site is not easily possible (or is it?). What would be the common approach if I wanted to display, on an HTTP page like the homepage, the shopping cart data (items) that users ordered on HTTPS pages? How would those two parts of the site communicate if necessary? Also, isn't it a security flaw in popular shopping carts that many of them seem to secure only the checkout pages (SSL) and not the rest? I'm using PHP if it makes any difference.

    Read the article

  • WCF client using basic HTTP authentication

    - by AZ
    I'm trying to connect to a service that uses basic HTTP authentication. I've configured my binding like this:

        <bindings>
          <basicHttpBinding>
            <binding name="binding">
              <security mode="TransportCredentialOnly">
                <transport clientCredentialType="Basic"/>
              </security>
            </binding>
          </basicHttpBinding>
        </bindings>

    and I'm setting the credentials like this:

        client.ClientCredentials.UserName.UserName = Settings.UserName;
        client.ClientCredentials.UserName.Password = Settings.Password;

    Still, when I make a request I get a "The HTTP request is unauthorized with client authentication scheme 'Basic'" fault back. What am I doing wrong? (I don't have control over the service, so all solutions must relate to the client configuration.)

    Read the article

  • J2ME's extra annoying HTTP permission prompt

    - by Hans Malherbe
    Some phones only prompt the user for permission the first time a connection is made. Others pop up the permission prompt whenever the MIDlet attempts to make an HTTP connection! What are the options if we want to suppress the prompt? Can we sign the JAR using only one CA (Certificate Authority) and have it work on all devices? Do we have to pay for a signature on every release? Is it an option to create our own CA certificate and tell our customers to install it on their devices? Alternatively, it seems that plain socket connections do not suffer from this. Is there a free implementation of HTTP on top of TCP for J2ME?
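
    On the last question: HTTP over a plain socket is only text framing, so a TCP-based implementation mainly has to write a request line, a Host header and a blank line, then read the reply. A rough sketch of that exchange, written with java.net.Socket for brevity; on J2ME the socket would instead come from javax.microedition.io.Connector.open("socket://host:80") (host and path here are placeholders):

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.io.OutputStream;
        import java.net.Socket;

        public class RawHttpGetSketch {
            public static void main(String[] args) throws Exception {
                Socket socket = new Socket("example.com", 80);   // placeholder host
                OutputStream out = socket.getOutputStream();
                // A minimal HTTP/1.1 request: request line, Host header, blank line.
                out.write(("GET /index.html HTTP/1.1\r\n"
                        + "Host: example.com\r\n"
                        + "Connection: close\r\n"
                        + "\r\n").getBytes("US-ASCII"));
                out.flush();
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(socket.getInputStream(), "ISO-8859-1"));
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line); // status line, headers, then the body
                }
                socket.close();
            }
        }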

    Read the article

  • failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request

    - by muralikalpana
    I am accessing images from another website. I am getting a "failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request" error when copying some (not all) images. Here is my code:

        $img = $_GET['img']; // another website's url
        $file = $img;

        function getFileextension($file) {
            return end(explode(".", $file));
        }

        $fileext = getFileextension($file);
        if ($fileext == 'jpg' || $fileext == 'gif' || $fileext == 'jpeg' || $fileext == 'png' || $fileext == 'x-png' || $fileext == 'pjpeg') {
            if ($img != '') {
                $rand_variable1 = rand(10000, 100000);
                $node_online_name1 = $rand_variable1 . "image." . $fileext;
                $s = copy($img, "images/" . $node_online_name1);
            }
        }

    Read the article

  • Redirected wikipedia request

    - by Le_Coeur
    Hi people, I need to write a program that redirects http://localhost:8080 to en.wikipedia.org. It seems easy, but I have some problems (only with Wikipedia; with other sites it works fine). I build the URL to Wikipedia:

        URL url = new URL("http", "en.wikipedia.org", 80, "/wiki");

    then open a URLConnection, extract the headers, and when I call connection.getInputStream() I receive a 404 Not Found. So I tried a hack for the Host header, because at that point the Host header is localhost:8080; I changed the Host header to Wikipedia and it works, but after requesting http://localhost:8080 in the browser, Wikipedia opens and the URL in the browser changes to en.wikipedia.org, whereas I want to stay on localhost. :)
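
    To keep the browser on localhost, the usual approach is not to redirect at all but to fetch the Wikipedia page server-side and write the bytes back to the client. A rough sketch of that forwarding idea, assuming a hand-rolled listener on port 8080; the /wiki/Main_Page path and the User-Agent value are assumptions (the bare /wiki was what returned 404 in the question), and the incoming browser request is not parsed here:

        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.ServerSocket;
        import java.net.Socket;
        import java.net.URL;

        public class WikipediaForwardSketch {
            public static void main(String[] args) throws Exception {
                ServerSocket server = new ServerSocket(8080);
                while (true) {
                    Socket browser = server.accept();              // incoming localhost:8080 request
                    // Fetch the page ourselves instead of redirecting the browser.
                    URL url = new URL("http", "en.wikipedia.org", 80, "/wiki/Main_Page"); // assumed path
                    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                    conn.setRequestProperty("User-Agent", "forward-sketch/0.1"); // placeholder UA
                    InputStream in = conn.getInputStream();
                    OutputStream out = browser.getOutputStream();
                    out.write(("HTTP/1.0 200 OK\r\nContent-Type: text/html; charset=UTF-8\r\n\r\n")
                            .getBytes("US-ASCII"));
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) > 0) {
                        out.write(buf, 0, n);                      // browser URL stays on localhost
                    }
                    out.flush();
                    in.close();
                    browser.close();
                }
            }
        }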

    Read the article

  • Downloading HTTP URLs asynchronously in C++

    - by Joey Adams
    What's a good way to download HTTP URLs (such as http://0.0.0.0/foo.htm) in C++ on Linux? I strongly prefer something asynchronous. My program will have an event loop that repeatedly initiates multiple (very small) downloads and acts on them when they finish (either by polling or by being notified somehow). I would rather not have to spawn multiple threads/processes to accomplish this. That shouldn't be necessary. Should I look into libraries like libcurl? I suppose I could implement it manually with non-blocking TCP sockets and select() calls, but that would likely be less convenient.

    Read the article

  • After HTTP GET request, the resulting string is cut-off (incomplete)

    - by Jayomat
    Hi all, I'm making an HTTP GET request like this:

        try {
            HttpClient client = new DefaultHttpClient();
            String getURL = "http://busspur02.aseag.de/bs.exe?SID=5FC39&ScreenX=1440&ScreenY=900&CMD=CR&Karten=true&DatumT=" + day + "&DatumM=" + month + "&DatumJ=" + year + "&ZeitH=" + hour + "&ZeitM=" + min + "&Intervall=60&Suchen=(S)uchen&GT0=Aachen&T0=H&HT0=" + start_from + "&GT1=Aachen&T0=H&HT1=" + destination;
            HttpGet get = new HttpGet(getURL);
            HttpResponse responseGet = client.execute(get);
            HttpEntity resEntityGet = responseGet.getEntity();
            if (resEntityGet != null) {
                // do something with the response
                Log.i("GET RESPONSE", EntityUtils.toString(resEntityGet));
            }
        ...

    It all works well... the only problem: the output from Log.i is cut off... it's not the complete HTML page. If I make the same request in a browser, I get about three times the output compared to making the request in the emulator with the code above... what's wrong?
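
    One thing to rule out first: Android's logcat truncates a single Log.i() call at roughly 4,000 characters, so a long page can look cut off in the log even though the String holds all of it. A small sketch that reads the entity once and logs it in slices (tag and slice size are arbitrary):

        import org.apache.http.HttpEntity;
        import org.apache.http.util.EntityUtils;
        import android.util.Log;

        public class LogChunks {
            // Logs an arbitrarily long string in ~4000-character slices so logcat
            // shows all of it instead of silently truncating one huge message.
            static void logLong(String tag, String text) {
                final int max = 4000;
                for (int start = 0; start < text.length(); start += max) {
                    Log.i(tag, text.substring(start, Math.min(text.length(), start + max)));
                }
            }

            static void logResponse(HttpEntity resEntityGet) throws Exception {
                String body = EntityUtils.toString(resEntityGet); // read once, then log in pieces
                logLong("GET RESPONSE", body);
            }
        }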

    Read the article

  • problem with multiple ajax HTTP get requests with different input variables using jQuery

    - by Thanasis
    I want to make asynchronous GET requests and get different results based on the input that I provide to each one. Here is my code:

        param = 1;
        $.get('http://localhost/my_page_1.php', param, function(data) {
          alert("id = " + param);
          $('.result').html(data);
        });

        param = 2;
        $.get('http://localhost/my_page_2.php', param, function(data) {
          alert("id = " + param);
          $('.result').html(data);
        });

    The result for both requests is "id = 2". I want the results to be "id = 1" for the first request and "id = 2" for the second one. I want to do this for many requests in one HTML file and integrate the results into the HTML as soon as they are ready. Can anyone help me solve this problem? Thank you in advance, Thanasis

    Read the article

  • 2 way communication over http between a .Net service and Windows Forms Client

    - by user1802969
    I am looking to accomplish two-way communication over HTTP between a .NET service (WCF SOAP or REST, both options are open) hosted on IIS 7 and a Windows Forms client running on Windows 7. WebSockets are not supported on IIS 7, and the other Comet techniques only allow the web server to push data to the client, not the other way around. The client will be very chatty, and there are thousands of clients, so I want to avoid creating a new HTTP request for each message to the server, though that is the last resort. Is there any way to do this?

    Read the article

  • Looking for a way to get HTTP Digest Authentication headers from incoming http requests

    - by duncancarroll
    I've been working on a REST implementation with my existing Cake install, and it's looking great except that I want to use HTTP Digest Authentication for all requests (Basic Auth won't cut it). So great, I'll generate a header in the client app (which is not Cake) and send it to my Cake install. The only problem is, I can't find a method for extracting that digest from the request... I've looked through the Cake API for something I can use to get the Digest header. You'd think RequestHandler would be able to grab it, but I can't find anything resembling that. Is there another method of getting the digest that I am overlooking? In the meantime I'm writing my own regex to parse it out of the request... once I'm done I'll post it here so no one has to waste as much time as I did hunting for it.

    Read the article

  • checksum error when building an HTTP packet (but over TCP, like SYN/ACK, it's OK)

    - by Hila
    I am building a NAT program: I modify each packet that comes from our internal subnet, changing its source IP address with libnet functions (I catch the packet with libpcap, put it into the sniff structures, and build the new packet with libnet). I am trying to build an HTTP packet. When I look in Wireshark, I see that the new packet I have built is exactly like the original packet (the only difference is that I changed the source port and IP), but there is a checksum error, so the server doesn't do anything with the packet I have sent to it, because the checksum field is wrong. When I send a TCP packet (like SYN or ACK), the checksum is OK and the server responds. Does anyone know what can cause this problem? The new checksum in other packets is calculated as it should be, but in the HTTP packet it isn't.

    Read the article

  • Java: stopping long HTTP operations

    - by kilonet
    I'm using the Apache Commons HttpClient library for HTTP operations:

        HttpClient client = getClient();
        PutMethod put = new PutMethod(url);
        FileRequestEntity countingFileRequestEntity = new FileRequestEntity(file, "application/octet-stream");
        put.setRequestEntity(countingFileRequestEntity);
        client.executeMethod(put);
        put.releaseConnection();

    I wonder how I can safely interrupt a long HTTP operation. Running it in a new thread and stopping the thread seems like the wrong way. HttpMethodBase has an abort() method, but I can't understand how to use it, because client.executeMethod blocks execution until it completes.
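
    Because executeMethod blocks, abort() is normally invoked from a different thread than the one running the request. A sketch of that shape using a scheduled watchdog that aborts the PUT after a deadline (the URL, file and 30-second limit are placeholders, standing in for the question's getClient(), url and file):

        import java.io.File;
        import java.util.concurrent.Executors;
        import java.util.concurrent.ScheduledExecutorService;
        import java.util.concurrent.TimeUnit;
        import org.apache.commons.httpclient.HttpClient;
        import org.apache.commons.httpclient.methods.FileRequestEntity;
        import org.apache.commons.httpclient.methods.PutMethod;

        public class AbortablePutSketch {
            public static void main(String[] args) throws Exception {
                HttpClient client = new HttpClient();                              // stand-in for getClient()
                final PutMethod put = new PutMethod("http://example.com/upload");  // placeholder URL
                put.setRequestEntity(new FileRequestEntity(new File("data.bin"),   // placeholder file
                        "application/octet-stream"));

                // Watchdog thread: abort() is designed to be called from another thread
                // and makes the blocking executeMethod() call fail instead of hanging.
                ScheduledExecutorService watchdog = Executors.newSingleThreadScheduledExecutor();
                watchdog.schedule(new Runnable() {
                    public void run() {
                        put.abort();
                    }
                }, 30, TimeUnit.SECONDS);                                          // arbitrary deadline

                try {
                    int status = client.executeMethod(put);
                    System.out.println("status: " + status);
                } catch (Exception e) {
                    System.out.println("request aborted or failed: " + e.getMessage());
                } finally {
                    put.releaseConnection();
                    watchdog.shutdownNow();
                }
            }
        }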

    Read the article
