Search Results


  • HTTP non-persistent connections and object splitting

    - by Fabio Carello
    I hope this is the right board for my question about the HTTP protocol with non-persistent connections. Suppose a single request for an HTML object requires the response to be split into two different response messages. My question is quite simple: will the connection be closed after the first part is dispatched, and the second one then sent on a new connection? I can't figure out whether a non-persistent connection applies at the "single object" level (no matter how many messages it takes) or to every single message. Thanks for your answers!
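
    For the common case where "splitting" just means the response arrives in multiple TCP packets, here is a minimal Python sketch (example.com is only a stand-in) of a non-persistent exchange at the socket level. However many segments the response takes, it is still one HTTP message on one connection, and the server closes the connection only after the complete response:

        import socket

        sock = socket.create_connection(("example.com", 80))
        sock.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\nConnection: close\r\n\r\n")

        response = b""
        while True:
            chunk = sock.recv(4096)      # one response, possibly many packets...
            if not chunk:                # ...recv() returning b"" means the server
                break                    # closed this (non-persistent) connection
            response += chunk
        sock.close()
        print(response.split(b"\r\n")[0])    # e.g. b'HTTP/1.0 200 OK'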

  • HTTP 400 'Bad Request' and win32 status 1450 when larger messages are sent to a WCF service

    - by Tim Mahy
    We sometimes receive HTTP 400 Bad Request result codes when posting a large file (10 MB) to a WCF service hosted in IIS 6. We can reproduce this using SoapUI, and it seems unpredictable when it happens. The call never shows up in our WCF log, so we believe the request reaches neither the ASP.NET nor the WCF runtime. This happens on multiple websites on the same machine, each having its own application pool. All IIS settings are at their defaults; only in ASP.NET and WCF do we allow bigger reader quotas etc. The win32 status logged in the IIS log is 1450, which we think means "insufficient system resources". So now the question: a) how can we solve this? b) (if a is not applicable :) ) which performance counters or logs are useful for learning more about this problem? greetings, Tim

  • HTTP 401 Challenge and HTTP 302 Login/Redirect won't work together in IIS7

    - by RandomBen
    I am developing a website using .NET 3.5 that allows users to visit the site and create logins using the standard Microsoft login controls. However, users do not need to log in for general things like viewing products. Now I need to set up the site so that some of our traveling sales people can access it, while not allowing anyone else to. The easiest way I know to do this is to turn on Windows Authentication for the site in IIS7. When I do that, I get all sorts of errors due to also having Forms Authentication turned on. If I turn off Forms Auth, I get a different kind of error. Does anyone know how to make Forms Auth and Windows Auth play nice on a single site in IIS7, or some other way to create a required login without having to kill Forms Auth?

  • What does 'http: getaddrinfo*.gaih_getanswer: got type "46"' mean

    - by koffie
    Today I got an e-mail from logcheck informing me that the following system event occurred: http: getaddrinfo*.gaih_getanswer: got type "46". Indeed, the above message occurs 4 times in /var/log/aut.log: it occurred twice yesterday around 6:46 am, and this morning it also occurred twice around the same time. I wonder what this message means, and whether it is something to worry about or I should just tell logcheck to ignore it. If it is something to worry about, any pointers on how to fix the problem are appreciated.

  • In IIS, why do HTTP requests use the host header while FTP requests do not

    - by Keeno
    So.... In IIS, if you use the built-in FTP you need to combine the FTP host header with the FTP username, e.g. www.hello.com|domain/username. So the FTP service gets its "hook" from the username. However, you can connect to the FTP site using www.hello.com:21 over the FTP port. Why then doesn't the FTP service work the same way as the HTTP service? IIS knows which site to serve back based on the host header, after all.... Thanks!

  • HTTP 400 error for all websites

    - by Jason Sherman
    A couple of days ago, I started getting HTTP 400 responses from all websites. Nothing will go across port 80, yet everything works if I connect to VPN. The weird thing is that without VPN, other things still work, such as IM and anything else that doesn't use port 80; pinging also works. I haven't noticed this behavior on any other computer on my network. The kicker is that if I log on as a local admin, everything works fine!!! I haven't installed anything in the last couple of weeks and I don't remember changing any configuration. I ran Forefront and HouseCall and neither found any problems.
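
    Since everything works for the local admin but not for the normal account, one per-user setting worth comparing between the two accounts is the WinINET proxy configuration, which lives under HKEY_CURRENT_USER. A small Python sketch (assuming Python is available on the machine; it only reads values, changing nothing):

        import winreg

        # Per-user proxy settings; these can differ between your account and
        # the local admin account that works.
        key = winreg.OpenKey(
            winreg.HKEY_CURRENT_USER,
            r"Software\Microsoft\Windows\CurrentVersion\Internet Settings")
        for name in ("ProxyEnable", "ProxyServer", "AutoConfigURL"):
            try:
                value, _ = winreg.QueryValueEx(key, name)
                print(name, "=", value)
            except FileNotFoundError:
                print(name, "is not set")
        winreg.CloseKey(key)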

  • Unable to connect to my computer over the LAN (http, smb) in Ubuntu 10.04

    - by Abdul Majeed
    I installed Ubuntu 10.04, Apache, PHP, MySQL and smb. Everything works fine locally on my own IP. When I try to access my computer from another computer on the LAN, it shows "unable to connect", yet when I ping my IP from the remote computer, it pings OK. I can access the internet and all the other systems (http, smb); the problem is that no one can access my computer remotely on my LAN. My IP is 192.168.85.105 and I want to access it (Apache, SMB) from 192.168.85.10. Is there some proxy or firewall setting? I have tried the following commands: sudo iptables -F, or sudo iptables-restore (logout required), and if that does not work, disabling netfilter with sudo ufw disable. Please give me the solution.
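
    To separate a firewall problem from a service problem, a quick check from the remote machine (192.168.85.10) is whether the relevant ports on 192.168.85.105 accept TCP connections at all. A small Python sketch; if ping works but these ports are refused or time out, a host firewall or the services' listen addresses are the likely culprits:

        import socket

        for port, name in [(80, "http"), (139, "netbios-ssn"), (445, "smb")]:
            try:
                socket.create_connection(("192.168.85.105", port), timeout=2).close()
                print(f"{name} ({port}): reachable")
            except OSError as exc:
                print(f"{name} ({port}): blocked or closed ({exc})")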

  • Download a file via HTTP from a script in Windows

    - by Jason R. Coombs
    I want a way to download a file via HTTP given its URL (similar to how wget works). I've seen the answers to this question, but I have two changes to the requirements: I'd like it to run on Windows 7 or later (though if it works on Windows XP, that's a bonus). I need to be able to do this on a stock machine with nothing but the script, which should be text that could be easily entered on a keyboard or copy/pasted. The shorter, the better. So, essentially, I'd like a .cmd (batch) script, VBScript, or Powershell script that can accomplish the download. It could use COM or invoke IE, but it needs to run without any input, and should behave well when invoked without a display (such as through a telnet session).

  • BITS HTTP download job fails to connect when owned by the Local SYSTEM account

    - by MikeT
    A service I have written that uses BITS (Background Intelligent Transfer Service) to auto-update itself is having a problem on some machines (Windows 7 so far). I have been investigating and have discovered that some of the jobs my service adds to the BITS queue fail immediately with the error code 0x80072efd (a connection with this server could not be established). There is no problem connecting to the server for the download: it works fine from the same machine using IE (or any other web browser), and other clients can connect and update from the same server. I tried using the BITSADMIN.exe tool to add the jobs manually and they worked OK. I then changed the account my service runs under to the Network Service account so the BITS jobs would be created with a different owner, and the jobs completed successfully. My question is: I don't want to run my service as this account, as it won't have the required local permissions, so how do I change the permissions of the Local SYSTEM user to allow it to download from the HTTP source? I'm not aware of any way this is restricted for that account, but it obviously is.

  • Slow HTTP traffic between VMware guest and host

    - by toluju
    I have a web application running as an http server inside the VMWare guest OS, and I'm trying to access the content from the host OS. The guest is running Ubuntu, and the host is running Windows XP. The problem is, when I try to access the application from a browser in the host OS, the content takes a very long time to load (up to a minute for a single page). A browser in the guest OS can access the application with no problems. I've tried using both NAT and bridged networking, but the results are the same. The Windows firewall is turned off. The connection itself appears fine, as ping requests from guest to host as well as host to guest complete without errors or delays. Both guest and host can access the external Internet connection without a problem. I'm using VMWare Player. Any ideas?

  • Funnelling http traffic

    - by spencer p
    I have a situation where a large batch of servers (X) need, on demand, to request data from a smaller set of web servers (Y). The worst-case scenario is if all servers in X decide to send different requests to one server in Y. That would be X connections, which could be a very large burst of traffic. The best-case scenario is if one server in X hits one server in Y in tandem. Life does not work like this. One idea to entertain is placing a proxy, similar to Squid, between X and Y. All of the X servers can connect to this proxy, which would result in only a few persistent (HTTP keep-alive) connections to Y. If the few were, say, 3 or 4, then it would funnel. If we could then rate-limit those connections and traffic decided to spike unusually high, we wouldn't hurt anyone but ourselves. Thoughts?
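
    A minimal Python sketch of the funnelling idea (y.example.com is a stand-in for a Y server; reconnect and error handling omitted): many callers enqueue requests, but only a fixed number of worker threads, each holding one persistent keep-alive connection, ever talk to Y. Rate limiting then only has to be applied to those few upstream connections:

        import http.client
        import queue
        import threading

        POOL_SIZE = 4                      # the "3 or 4" funnel width
        jobs = queue.Queue()

        def worker():
            # One keep-alive connection per worker, reused for every request.
            conn = http.client.HTTPConnection("y.example.com")
            while True:
                path = jobs.get()
                conn.request("GET", path)
                conn.getresponse().read()  # drain before reusing the connection
                jobs.task_done()

        for _ in range(POOL_SIZE):
            threading.Thread(target=worker, daemon=True).start()

        for i in range(100):               # a burst from many X servers becomes
            jobs.put(f"/data/{i}")         # at most POOL_SIZE connections to Y
        jobs.join()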

  • curl XPUT returning HTTP 500 error message

    - by pradeepchhetri
    I have added the following changes to my nginx configuration:

        server {
            listen 8080;
            root /usr/share/nginx/www;
            client_body_temp_path /tmp/;
            dav_methods PUT DELETE MKCOL COPY MOVE;
            create_full_put_path on;
            dav_access user:rw group:rw all:rw;
        }

    I also have nginx configured with --with-http_dav_module. But when I try running the command:

        curl -XPUT http://172.16.31.127:8080/test.html -d 'test'

    I get a 500 Internal Server error. Can anyone help me solve this?

  • Serve mirrored (static) web-page with original headers

    - by aioobe
    I have a dynamic web page of which I want to create a "frozen" copy. Typically I would do something like wget -m http://example.com and then put the files in the document root of the web server. This site, however, has some dynamic content, including dynamically generated images, for instance http://example.com/company/123/logo. This means that in order to mirror the page, I need to: 1) save whatever headers the server currently serves for each URL, which can be done using the wget option --save-headers; and 2) serve the static pages with the proper headers for each file (this I have no idea how to do). What is the best way to solve this? Any suggestions are welcome.
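
    For step 2, one approach is a tiny custom server that replays the header block that --save-headers prepends to each file. A Python sketch (the mirror/ directory layout is an assumption; no path sanitisation, demo only):

        from http.server import BaseHTTPRequestHandler, HTTPServer
        from pathlib import Path

        ROOT = Path("mirror")   # where the wget -m --save-headers output lives

        class ReplayHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                path = ROOT / self.path.lstrip("/")
                if path.is_dir():
                    path = path / "index.html"
                if not path.is_file():
                    self.send_error(404)
                    return
                raw = path.read_bytes()
                # --save-headers stores the original header block, then a
                # blank line, then the body.
                head, _, body = raw.partition(b"\r\n\r\n")
                lines = head.split(b"\r\n")
                self.send_response(int(lines[0].split()[1]))  # original status
                for line in lines[1:]:
                    name, _, value = line.partition(b": ")
                    # skip hop-by-hop headers that no longer apply
                    if name.lower() not in (b"transfer-encoding", b"connection",
                                            b"content-length"):
                        self.send_header(name.decode(), value.decode())
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        HTTPServer(("", 8000), ReplayHandler).serve_forever()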

  • iptables - drop all HTTP(S) traffic but from CloudFlare

    - by Martin
    I would like to allow HTTP(S) traffic only if it comes from CloudFlare, so that attackers cannot hit the server directly. I know CloudFlare is not mainly a DDoS mitigator, but I would like to try it either way. I currently only have access to iptables (IPv4 only), but will try to install ip6tables soon. I just need to have this fixed soon (we're getting (D)DoSed atm). I was thinking about something like this:

        iptables -I INPUT -p tcp -s <CloudFlare IP> --dport 80 -j ACCEPT
        iptables -I INPUT -p tcp -s <CloudFlare IP> --dport 443 -j ACCEPT
        iptables -I INPUT -p tcp --dport 80 -j DROP
        iptables -I INPUT -p tcp --dport 443 -j DROP

    (Note that --dport is only valid together with -p tcp, so the ACCEPT rules need it as well.) I know that CloudFlare has multiple IPs; this is just an example. Would this be the right way?

  • HTTPS and HTTP issue on server with SSL

    - by Asghar
    I have a site, www.example.com, for which I purchased and installed an SSL cert, and it was working fine. I also have a subdomain, app.example.com, which was not on SSL. Both www.example.com and app.example.com are on the same IP address. Later we decided to put SSL only on app.example.com, so I configured SSL for app.example.com and it worked fine. Now the issue is that Google has indexed my site as https://www.example.com/, and when users hit the site, an invalid-security warning is issued; when users click through the warning, they are shown my app.example.com contents. Note: I have my SSL configuration in /etc/httpd/conf.d/ssl.conf; its contents are at the link below. NOTE: I tried solutions in .htaccess, like 301 redirects etc., but none of those worked. http://pastebin.com/GCWhpQJq

  • Intermittent HTTP 401 errors

    - by forthrin
    I am using an intranet solution which requires basic HTTP login. However, there is an intermittent error which requires me to log in again, and then the server says "Forbidden" whether I give the correct login information or not. To add insult to injury, Safari (and Chrome) seems to show the login dialog for every included resource in the HTML, and it's impossible to cancel this modal dialog sequence, so the whole browser is blocked until I've pressed Esc some 30-odd times. After an hour, I may gain access again, without having really done anything. My questions: What could cause intermittent 401 errors? Why do the browsers show the login dialog 30 times per page load (presumably once for every resource included from the same domain)?
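
    One way to take the browser's dialog storm out of the picture while the problem is happening is to replay the request with explicit Basic credentials and look at the raw status code and WWW-Authenticate header. A Python sketch (URL and credentials are placeholders):

        import base64
        import urllib.error
        import urllib.request

        url = "http://intranet.example.com/page"             # placeholder
        token = base64.b64encode(b"user:password").decode()  # placeholder creds

        req = urllib.request.Request(url,
                                     headers={"Authorization": "Basic " + token})
        try:
            with urllib.request.urlopen(req) as resp:
                print(resp.status, resp.getheaders())
        except urllib.error.HTTPError as exc:
            # A 401 with a WWW-Authenticate header means "try again with
            # credentials"; a plain 403 means the server refuses regardless.
            print(exc.code, exc.headers.get("WWW-Authenticate"))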

  • Super simple high performance http server

    - by masylum
    I'm building a URL-shortener web application and I would like to know the best architecture for providing a fast and reliable service. I would like to have the services separated across different machines. The first machine will have the application itself, with Apache, nginx, whatever. The second one will contain the database. The third one will be the one responsible for handling the short-URL requests. For the third machine I just need to accept one kind of HTTP request (GET www.domain.com/shorturl), but it has to do it really fast and it should be stable enough. Which server do you recommend? Thanks in advance, and sorry for my English.
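
    Whatever server is chosen, the third machine's whole job is a redirect lookup. A minimal Python sketch of that single GET handler (an in-memory dict stands in for the database lookup):

        from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

        SHORT_URLS = {"abc123": "http://www.domain.com/some/long/target"}

        class RedirectHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                target = SHORT_URLS.get(self.path.lstrip("/"))
                if target is None:
                    self.send_error(404)
                    return
                self.send_response(301)              # permanent redirect
                self.send_header("Location", target)
                self.end_headers()

        ThreadingHTTPServer(("", 8080), RedirectHandler).serve_forever()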

  • Do HTTP reverse proxies typically enable HTTP Keep-Alive on the client side of the proxied connection and not on the server side?

    - by LostInComputer
    HAProxy has the ability to enable HTTP keep-alive on the client side (client <- HAProxy) but disable it on the server side (HAProxy <- server). Some of our clients connect to our web service via satellite, so the latency is ~600 ms, and I think that enabling keep-alive would speed things up a bit. Am I right? Is this supported by Nginx? Is this a widely implemented feature in other software and hardware load balancers, besides HAProxy?
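
    A rough way to see the effect client-side keep-alive would have at high latency (example.com is a stand-in): compare several requests over fresh connections against the same requests over one reused connection. Each fresh connection pays the TCP handshake again, which at ~600 ms RTT dominates the total time. A Python sketch:

        import http.client
        import time

        def timed(label, fn):
            t0 = time.perf_counter()
            fn()
            print(f"{label}: {time.perf_counter() - t0:.3f}s")

        def fresh_connections():
            for _ in range(3):
                conn = http.client.HTTPConnection("example.com")
                conn.request("GET", "/")
                conn.getresponse().read()
                conn.close()               # handshake paid again next time

        def reused_connection():
            conn = http.client.HTTPConnection("example.com")
            for _ in range(3):             # same TCP connection every time,
                conn.request("GET", "/")   # provided the server allows keep-alive
                conn.getresponse().read()
            conn.close()

        timed("3 requests, new connection each", fresh_connections)
        timed("3 requests, one keep-alive connection", reused_connection)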

  • Use XMPP instead of HTTP

    - by pavel
    Hey guys. My friend and I are working on an iPhone application that uses the XMPP protocol to provide chat functionality. Right now we are designing the architecture for the application. My friend is working on the iPhone side, and I am the Ruby on Rails guy. My friend suggested that we wrap every call that is usually served via HTTP into XMPP: user registration, user search, profile editing, photo uploading, everything goes via XMPP, no HTTP at all. My friend wants to use XMPP because he says it's much easier to implement on the client side than HTTP. As for me, this is bullshit, but we've got a product owner who has been working with my friend for a long time and trusts him. So what I'm trying to do is convince my friend and the product owner that using XMPP for what HTTP can do perfectly well is not the best idea. I feel that if we implement everything on XMPP, we will have a pain in the ass till the end of our lives, but how do I prove it? P.S. I'm not against chat over XMPP; I am against user search, photo uploading, rankings, nearby search and various other RESTful requests going over it. Please leave a response; any help appreciated. A little update: yesterday we had a long discussion, and it turns out it's quite hard to receive responses from both XMPP and HTTP in Objective-C, because every single object and its data should be stored in a Core Data model, while this model can't be safely modified from various places. Say, if you use HTTP transport, you always want to use only HTTP to update the data in your model, and if you use XMPP, you should always use XMPP; so you can't use both. That's what my iPhone buddy told me. It sounds weird to me; can anyone explain it?

  • Hooking the http/https protocol in IE causes GET requests to be sequential

    - by watsonmw
    I'm using the PassthruAPP method to hook into HTTP/HTTPS requests made by IE. It's working well for the most part; however, I noticed a problem: only one download thread is active at a time. I can see two IInternetProtocol objects getting created, but IE uses only one at a time. This is happening with IE7. The odd thing is that the problem occurs when overriding the existing default HTTP/HTTPS handler, even if the handler is not the one being used to make the request; e.g. registering a handler for the HTTPS protocol will cause HTTP requests to be made sequentially, even though HTTP requests are not hooked. I installed Google Gears and it has the same problem. This always happens for the first few items on the page, but it seems that after document complete is issued, concurrent downloads can occur again; for example, JavaScript code that executes after the page has finished loading can load images concurrently just fine. One option is to try to IAT-patch the IInternetProtocol registered for HTTP requests, but Google Gears does this already and it has the same problem. I know installing an HTTP proxy is another option, but I don't want to monkey with the users' HTTP proxy settings if there is another option.

  • Error while installing ia32-libs

    - by user3405516
    I am trying to install "ia32-libs". After some googling, I did the following steps, yet I am still not able to install it. 1st step: I added the i386 architecture with dpkg --add-architecture i386. 2nd step: I added "deb http://archive.ubuntu.com/ubuntu/ raring main restricted universe multiverse" to ia32-libs-raring.list:

        root@user:/etc/apt/sources.list.d# sudo dpkg --add-architecture i386
        root@user:/etc/apt/sources.list.d# echo "deb http://archive.ubuntu.com/ubuntu/ raring main restricted universe multiverse" > ia32-libs-raring.list
        root@user:/etc/apt/sources.list.d# apt-get update
        Ign http://archive.ubuntu.com raring InRelease
        Ign http://archive.ubuntu.com raring Release.gpg
        Ign http://archive.ubuntu.com raring Release
        [... Hit/Ign lines for the trusty, trusty-updates and trusty-security repositories and the raring Translation files omitted; they all succeed or are ignored ...]
        Err http://archive.ubuntu.com raring/main amd64 Packages  404 Not Found [IP: 91.189.91.13 80]
        Err http://archive.ubuntu.com raring/restricted amd64 Packages  404 Not Found [IP: 91.189.91.13 80]
        Err http://archive.ubuntu.com raring/universe amd64 Packages  404 Not Found [IP: 91.189.91.13 80]
        Err http://archive.ubuntu.com raring/multiverse amd64 Packages  404 Not Found [IP: 91.189.91.13 80]
        Err http://archive.ubuntu.com raring/main i386 Packages  404 Not Found [IP: 91.189.91.13 80]
        Err http://archive.ubuntu.com raring/restricted i386 Packages  404 Not Found [IP: 91.189.91.13 80]
        Err http://archive.ubuntu.com raring/universe i386 Packages  404 Not Found [IP: 91.189.91.13 80]
        Err http://archive.ubuntu.com raring/multiverse i386 Packages  404 Not Found [IP: 91.189.91.13 80]
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/raring/main/binary-amd64/Packages  404 Not Found [IP: 91.189.91.13 80]
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/raring/restricted/binary-amd64/Packages  404 Not Found [IP: 91.189.91.13 80]
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/raring/universe/binary-amd64/Packages  404 Not Found [IP: 91.189.91.13 80]
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/raring/multiverse/binary-amd64/Packages  404 Not Found [IP: 91.189.91.13 80]
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/raring/main/binary-i386/Packages  404 Not Found [IP: 91.189.91.13 80]
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/raring/restricted/binary-i386/Packages  404 Not Found [IP: 91.189.91.13 80]
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/raring/universe/binary-i386/Packages  404 Not Found [IP: 91.189.91.13 80]
        W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/raring/multiverse/binary-i386/Packages  404 Not Found [IP: 91.189.91.13 80]
        E: Some index files failed to download. They have been ignored, or old ones used instead.

  • Announcing RSS feeds of Microsoft All-In-One Code Framework code samples

    - by Jialiang
    Today, we are not only announcing Sample Browser v2 CTP, but we are also excited to announce the availability of RSS feeds of All-In-One Code Framework code samples. By using these feeds, you can easily track and download the new code samples.

    English RSS feeds:

        All code samples: http://support.microsoft.com/rss/en/rss.xml
        ASP.NET code samples: http://support.microsoft.com/rss/en/ASPNET.xml
        Silverlight code samples: http://support.microsoft.com/rss/en/Silverlight.xml
        Azure code samples: http://support.microsoft.com/rss/en/Azure.xml
        COM code samples: http://support.microsoft.com/rss/en/COM.xml
        Data Platform code samples: http://support.microsoft.com/rss/en/Data%20Platform.xml
        Library code samples: http://support.microsoft.com/rss/en/Library.xml
        Office dev code samples: http://support.microsoft.com/rss/en/Office.xml
        VSX code samples: http://support.microsoft.com/rss/en/VSX.xml
        Windows 7 code samples: http://support.microsoft.com/rss/en/Windows%207.xml
        Windows Forms code samples: http://support.microsoft.com/rss/en/Windows%20Forms.xml
        Windows General code samples: http://support.microsoft.com/rss/en/Windows%20General.xml
        Windows Service code samples: http://support.microsoft.com/rss/en/Windows%20Service.xml
        Windows Shell code samples: http://support.microsoft.com/rss/en/Windows%20Shell.xml
        Windows UI code samples: http://support.microsoft.com/rss/en/Windows%20UI.xml
        WPF code samples: http://support.microsoft.com/rss/en/WPF.xml

    Chinese RSS feeds:

        All code samples: http://support.microsoft.com/rss/zh-cn/codeplex/rss.xml
        ASP.NET code samples: http://support.microsoft.com/rss/zh-cn/codeplex/ASPNET.xml
        Silverlight code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Silverlight.xml
        Azure code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Azure.xml
        COM code samples: http://support.microsoft.com/rss/zh-cn/codeplex/COM.xml
        Data Platform code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Data%20Platform.xml
        Library code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Library.xml
        Office dev code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Office.xml
        VSX code samples: http://support.microsoft.com/rss/zh-cn/codeplex/VSX.xml
        Windows 7 code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Windows%207.xml
        Windows Forms code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Windows%20Forms.xml
        Windows General code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Windows%20General.xml
        Windows Service code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Windows%20Service.xml
        Windows Shell code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Windows%20Shell.xml
        Windows UI code samples: http://support.microsoft.com/rss/zh-cn/codeplex/Windows%20UI.xml
        WPF code samples: http://support.microsoft.com/rss/zh-cn/codeplex/WPF.xml

  • HTTP request, strange socket behavior

    - by hoodoos
    I experience strange behavior when doing HTTP requests through sockets. Here is the request:

        POST https://test.com:443/service/XMLSelect HTTP/1.1
        Content-Length: 10926
        Host: test.com
        User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 1.0.3705)
        Authorization: Basic XXX
        SOAPAction: http://test.com/SubmitXml

    Then goes the body of my request, with the given content length. After that I receive something like:

        HTTP/1.1 200 OK
        Server: Apache-Coyote/1.1
        Content-Type: text/xml;charset=utf-8
        Transfer-Encoding: chunked
        Date: Tue, 30 Mar 2010 06:13:52 GMT

    So everything seems to be fine here. I read all contents from the network stream and successfully receive the response. But my socket, which I'm polling on, switches its modes like this:

        write ( I write headers and request here )
        read ( after headers are sent, I begin to receive the response )
        write ( STRANGE BEHAVIOUR HERE. WHY? Here I send nothing really )
        read ( here it switches back to read again )

    The last two steps can repeat several times. So I want to ask what leads to the socket's mode change. In this case it's not a big problem, but when I use gzip compression in my request (no idea how it's related) and ask the server to send a gzipped response like this:

        POST https://test.com:443/service/XMLSelect HTTP/1.1
        Content-Length: 1076
        Accept-Encoding: gzip
        Content-Encoding: gzip
        Host: test.com
        User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 1.0.3705)
        Authorization: Basic XXX
        SOAPAction: http://test.com/SubmitXml

    I receive a response like this:

        HTTP/1.1 200 OK
        Server: Apache-Coyote/1.1
        Content-Encoding: gzip
        Content-Type: text/xml;charset=utf-8
        Transfer-Encoding: chunked
        Date: Tue, 30 Mar 2010 07:26:33 GMT

        2000
        ?

    I receive a chunk size and a gzip header; it's all okay. And here is what happens with my poor little socket meanwhile:

        write ( I write headers and request here )
        read ( after headers are sent, I begin to receive the response )
        write ( STRANGE BEHAVIOUR HERE. And it finally sits here forever waiting for me to send something! But according to HTTP I don't have to send anything more! )

    What can it be related to? What does it want me to send? Is it the remote web server's problem, or do I miss something? PS: All actual service references and logins/passwords are replaced with fake ones :)
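
    For comparison, a minimal non-blocking Python sketch (plain HTTP instead of the original HTTPS; test.com kept as the placeholder host) of the poll loop described above: the socket is registered for write interest while the request is being sent, then switched to read interest once there is nothing left to send. A socket that keeps reporting itself writable after the request is fully sent usually means the code re-registered write interest without actually having data left to write:

        import selectors
        import socket

        request = (b"POST /service/XMLSelect HTTP/1.1\r\n"
                   b"Host: test.com\r\n"
                   b"Content-Length: 4\r\n"
                   b"Connection: close\r\n\r\n"
                   b"test")

        sel = selectors.DefaultSelector()
        sock = socket.create_connection(("test.com", 80))
        sock.setblocking(False)
        sel.register(sock, selectors.EVENT_WRITE)            # write mode while sending

        outbuf, response, done = request, b"", False
        while not done:
            for key, events in sel.select():
                if events & selectors.EVENT_WRITE:
                    sent = sock.send(outbuf)
                    outbuf = outbuf[sent:]
                    if not outbuf:
                        sel.modify(sock, selectors.EVENT_READ)  # switch to read mode
                elif events & selectors.EVENT_READ:
                    chunk = sock.recv(4096)
                    if chunk:
                        response += chunk
                    else:                                    # server closed: done
                        done = True
        sel.unregister(sock)
        sock.close()
        print(response.split(b"\r\n")[0])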

  • How to maximize http.sys file upload performance

    - by anelson
    I'm building a tool that transfers very large streaming data sets (possibly on the order of terabytes in a single stream; routinely in the tens of gigabytes) from one server to another. The client portion of the tool will read blocks from the source disk and send them over the network; the server side will read these blocks off the network and write them to a file on the server disk. Right now I'm trying to decide which transport to use: the options are raw TCP and HTTP. I really, REALLY want to be able to use HTTP. HttpListener (or WCF, if I want to go that route) makes it easy to plug in to the HTTP Server API (http.sys), and I can get things like authentication and SSL for free. The problem right now is performance. I wrote a simple test harness that sends 128K blocks of NULL bytes using the BeginWrite/EndWrite async I/O idiom, with async BeginRead/EndRead on the server side. I've modified this test harness so I can do this either with HTTP PUT operations via HttpWebRequest/HttpListener, or with plain old socket writes using TcpClient/TcpListener. To rule out issues with network cards or network pathways, both the client and server are on one machine and communicate over localhost. On my 12-core Windows 2008 R2 test server, the TCP version of this test harness can push bytes at 450 MB/s with minimal CPU usage. On the same box, the HTTP version runs between 130 MB/s and 200 MB/s, depending upon how I tweak it. In both cases CPU usage is low, and the vast majority of what CPU usage there is is kernel time, so I'm pretty sure my usage of C# and the .NET runtime is not the bottleneck. The box has two 6-core Xeon X5650 processors, 24 GB of single-ranked DDR3 RAM, and is used exclusively by me for my own performance testing. I already know about HTTP client tweaks like ServicePointManager.MaxServicePointIdleTime, ServicePointManager.DefaultConnectionLimit, ServicePointManager.Expect100Continue, and HttpWebRequest.AllowWriteStreamBuffering. Does anyone have any ideas for how I can get http.sys performance beyond 200 MB/s? Has anyone seen it perform this well in any environment?

  • How to update Facebook status by HTTP request

    - by Narjes
    I try to send an HTTP request like:

        POST http://api.facebook.com/restserver.php?method=facebook.users.setStatus&api_key=762ec91e7987aaeaee7e2cdfdfcb3c30&call_id=$call_id&sig=$s&v=1.0&uid=1533439618&status=44 HTTP/1.1

    but I receive nothing... With Twitter I succeed:

        POST http://twitter.com/statuses/update.xml?status=123123 HTTP/1.1
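
    To see why "nothing" comes back, it helps to make the same call in a way that exposes the status line and any error body. A Python sketch (the sig and call_id parameters are omitted here, just as their $variables are in the question; the old Facebook REST API required them, so the server will most likely answer with an error document naming what is missing, which is exactly the clue to look for):

        import urllib.error
        import urllib.parse
        import urllib.request

        params = urllib.parse.urlencode({
            "method": "facebook.users.setStatus",
            "api_key": "762ec91e7987aaeaee7e2cdfdfcb3c30",
            "v": "1.0",
            "uid": "1533439618",
            "status": "44",
        })
        url = "http://api.facebook.com/restserver.php?" + params
        try:
            # data=b"" forces a POST, matching the raw request above
            with urllib.request.urlopen(urllib.request.Request(url, data=b"")) as resp:
                print(resp.status, resp.read().decode())
        except urllib.error.HTTPError as exc:
            print(exc.code, exc.read().decode())  # the error body names the problem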
