Search Results

Search found 50062 results on 2003 pages for 'http 1 1'.


  • choosing an image locally and serving it from an http url without a server round trip

    - by serverman
    Hi folks, I am a complete novice at Flash (I have never created anything in Flash). I am quite familiar with web applications (J2EE based) and have reasonable expertise in JavaScript. Here is my requirement: I want the user to select an image via an HTML form. Normally, on post, this image would be sent to the server and perhaps stored there to be served later. I do not want that. I want to store the image locally and then serve it via HTTP to the user. So the flow is:
    1. Go to the "select image URL": mywebsite.com/selectImage, then browse for and select the image. This transfers control to some code running locally on the client (JavaScript or Flash), which stores the image somewhere on the client machine.
    2. Go to the "show image URL": mywebsite.com/showImage. This eventually results in some client code running in the browser that retrieves the image and renders it (without any server round trips).
    I considered the following options:
    1. Use HTML5 local storage. Since I am a complete novice at Flash, I looked into this first. It is fairly straightforward to store and retrieve images in JavaScript (only strings are allowed, but I am hoping that storing base64-encoded strings will work, at least for small images). However, how do I serve the image via an HTTP URL that points to my server without a server round trip? I saw the interesting article at http://hacks.mozilla.org/category/fileapi/ but that works only in Firefox, and I need to support all the latest browsers (at least the ones supporting HTML5 local storage).
    2. Use Flash SharedObjects. OK, this would have been good; the only thing is I am not sure where to start. Snippets of ActionScript to do this are scattered everywhere, but I do not know how to use those scripts in an actual HTML page. I do not need to create any movies or anything; I just need to store an image and serve it locally. If I go this route, I would also use it to store other "strings" locally. If you suggest this, please give me the exact steps (pointers to other websites are fine). Ideally I would like to avoid paying for any Flash development environment software. Thank you!


  • 412 Precondition Failed error only occurs on certain networks

    - by Andy
    One of my favorite websites, http://jessiejofficial.com (yes, I'm a Jessie J fan :')), has recently started displaying the error message "412 Precondition Failed" whenever I visit it from my home network, even when I use the Tor Browser. At first I thought this was an issue with the whole website, but I contacted the web developer and he said there have been plenty of hits within the last 48 hours. Plus, I discovered tonight that I can access the website from my phone over the mobile network. So it appears to be just my network, as all of the devices in my house connected to the WiFi display the same error when I try to visit any page of the site. However, there have been no changes to our network that we are aware of since the site was last accessible, and I have just heard that another person in a different part of the country is experiencing the same difficulties. Any help/advice/suggestions would be greatly appreciated. Update: when trying to ping 'jessiejofficial.com' from the Windows command prompt, the request times out on all four attempts, on any computer connected to the wireless network. I can now also confirm that the same thing occurs on my MacBook Pro.


  • Play mp3 stream from http URL on Windows Mobile 6.0

    - by Thyphuong
    After a short period of time spent learning how to play an MP3 from an HTTP URL on Windows Mobile 6.0, I found that very few DLLs support it (so far, Bass.dll is the only one I've found that works nicely). So I intend to take a different approach to the goal. Here's my idea:
    1. Get a stream from the HTTP URL.
    2. Decode the MP3 stream.
    3. Play the result from step 2.
    Since I'm new to this field, feel free to explain what I'm getting wrong and/or show me the way.
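    The original question targets the .NET Compact Framework, where the code below does not apply directly; it is only a hedged sketch of the same three steps in Java, assuming the third-party JLayer decoder (javazoom.jl.player.Player) is on the classpath and using a placeholder stream URL.

    import java.io.BufferedInputStream;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import javazoom.jl.player.Player;   // JLayer handles steps 2 and 3 (decode + play)

    public class Mp3UrlPlayer {
        public static void main(String[] args) throws Exception {
            // Step 1: open a stream from the HTTP URL (placeholder address).
            HttpURLConnection conn = (HttpURLConnection) new URL("http://example.com/stream.mp3").openConnection();
            InputStream in = new BufferedInputStream(conn.getInputStream());
            // Steps 2 and 3: JLayer decodes the MP3 frames and plays them as they arrive.
            new Player(in).play();
        }
    }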


  • using php to download a file from a given URL by passing a username and password for http authentication

    - by Acharya
    Hi all, I need to download a text file using PHP code. The file is behind HTTP authentication. What procedure should I use for this: fsockopen, curl, or is there some other way to do it? I am using fsockopen, but it does not seem to work:
    $fp=fsockopen("www.example.com",80,$errno,$errorstr);
    $out = "GET abcdata/feed.txt HTTP/1.1\r\n";
    $out .= "User: xyz \r\n";
    $out .= "Password: xyz \r\n\r\n";
    fwrite($fp, $out);
    while(!feof($fp)) { echo fgets($fp,1024); }
    fclose($fp);
    Here fgets is returning false. Any help!!!
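    The question is about PHP, but the underlying HTTP detail is language-independent: standard HTTP authentication (assuming Basic auth here) is sent as a single "Authorization: Basic base64(user:password)" header, not as separate User:/Password: headers. A minimal Java sketch of that request, with placeholder host and credentials:

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public class BasicAuthDownload {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://www.example.com/abcdata/feed.txt");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            // Basic auth: base64 of "user:password" in one Authorization header.
            String credentials = Base64.getEncoder()
                    .encodeToString("xyz:xyz".getBytes(StandardCharsets.UTF_8));
            conn.setRequestProperty("Authorization", "Basic " + credentials);
            try (InputStream in = conn.getInputStream()) {
                System.out.write(in.readAllBytes());   // Java 9+ convenience method
                System.out.flush();
            }
        }
    }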


  • What is the best way to read GetResponseStream() ?

    - by Dev Dona
    What is the best way to read an HTTP response from GetResponseStream? Currently I'm using the following approach:
    Using SReader As StreamReader = New StreamReader(HttpRes.GetResponseStream)
        SourceCode = SReader.ReadToEnd()
    End Using
    I'm not quite sure if this is the most efficient way to read an HTTP response. I need the output as a string. I've seen an article (http://www.informit.com/guides/content.aspx?g=dotnet&seqNum=583) with a different approach, but I'm not quite sure it's a good one, and in my tests that code had encoding issues with different websites. How do you read web responses?
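    The question is about .NET, but the encoding issues mentioned above usually come from decoding the body with a guessed charset instead of the one the server declares. Purely as an illustration of charset-aware reading, a hedged Java sketch (the Content-Type parsing is deliberately naive):

    import java.io.ByteArrayOutputStream;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.Charset;
    import java.nio.charset.StandardCharsets;

    public class ResponseReader {
        public static String readBody(String address) throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(address).openConnection();
            // Use the charset from the Content-Type header, falling back to UTF-8.
            Charset charset = StandardCharsets.UTF_8;
            String contentType = conn.getContentType();   // e.g. "text/html; charset=iso-8859-1"
            if (contentType != null && contentType.contains("charset=")) {
                // Naive parse for illustration; real code should handle quotes and extra parameters.
                charset = Charset.forName(contentType.substring(contentType.indexOf("charset=") + 8).trim());
            }
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try (InputStream in = conn.getInputStream()) {
                byte[] chunk = new byte[8192];
                int read;
                while ((read = in.read(chunk)) != -1) {
                    buffer.write(chunk, 0, read);
                }
            }
            return new String(buffer.toByteArray(), charset);
        }
    }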


  • Using jQuery, how do I attach a string array as an HTTP parameter to an HTTP request?

    - by predhme
    I have a Spring controller with a request mapping as follows:
    @RequestMapping("/downloadSelected")
    public void downloadSelected(@RequestParam String[] ids) {
        // retrieve the file and write it to the http response outputstream
    }
    I have an HTML table of objects where every row has a checkbox whose value is the id of the object. When the user submits, I have a jQuery callback to serialize all the ids. I want to stick those ids into an HTTP request parameter called "ids" so that I can grab them easily. I figured I could do the following:
    var ids = $("#downloadall").serializeArray();
    Then I would need to take each of the ids and add them to a request param called ids. But is there a "standard" way to do this, using jQuery?
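    Not from the original post: Spring will bind a String[] when the parameter is simply repeated in the query string or form body (ids=1&ids=2&ids=3), which is what jQuery produces for an array value when traditional: true is passed to $.ajax (or $.param(data, true)). A hedged sketch of the controller half in Java; the class name and response handling are placeholders:

    import java.io.IOException;
    import javax.servlet.http.HttpServletResponse;
    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestParam;

    @Controller
    public class DownloadController {

        @RequestMapping("/downloadSelected")
        public void downloadSelected(@RequestParam("ids") String[] ids,
                                     HttpServletResponse response) throws IOException {
            // Binding: ?ids=1&ids=2&ids=3 arrives here as {"1", "2", "3"}.
            response.setContentType("application/octet-stream");
            for (String id : ids) {
                // Placeholder: look up and stream the real file for each id here.
                response.getWriter().println("would write file for id " + id);
            }
        }
    }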


  • .htaccess Problem

    - by ocergynohtna
    I'm having trouble redirecting URLs using the .htaccess file. This is what my .htaccess file looks like:
    Redirect 301 /file-name/example.php http://www.mysite.com/file-name/example-001.php
    Redirect 301 /section-name/example.php http://www.my-site.com/section-name/example-002.php
    RewriteEngine on
    RewriteCond %{HTTP_HOST} !^www.mysite.com$ [NC]
    RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^(.+)/(.*)$ hqtemplates/articles.php?file_name=$2 [L]
    php_value session.use_only_cookies 1
    php_value session.use_trans_sid 0
    The problem is that when I go to the page www.my-site.com/file-name/example.php, instead of redirecting me to www.my-site.com/file-name/example-001.php it redirects me to www.my-site.com/file-name/example.php?file_name=example-001.php. For some reason it adds "?file_name=example-001.php" to the URL. Does anyone know why this is happening and how to fix it?


  • Custom Response + HTTP status?

    - by Cristian Boariu
    Hi, I have a REST interface for my project. For one class I have a POST method where you can post an XML document, and I return a custom response like:
    <userInvitation>Invalid email</userInvitation>
    if the email in the posted XML was incorrect, plus other custom messages I have defined for different situations. For all of these, the HTTP status is automatically set to 200 (OK). Is there any way to change it? PS: I know that I can throw a WebApplicationException like:
    throw new WebApplicationException(Response.Status.BAD_REQUEST);
    but in that case my custom response is no longer included. So I just want to return my custom error body together with a 400 HTTP status. Thanks in advance.
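    A minimal JAX-RS sketch, assuming the resource method can return javax.ws.rs.core.Response: building the Response by hand keeps the custom body while still setting the status code. The entity string mirrors the example above; the path, method name, and validation are placeholders.

    import javax.ws.rs.POST;
    import javax.ws.rs.Path;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;

    @Path("/invitations")
    public class InvitationResource {

        @POST
        public Response create(String xmlBody) {
            if (!isEmailValid(xmlBody)) {
                return Response.status(Response.Status.BAD_REQUEST)   // 400 instead of 200
                               .entity("<userInvitation>Invalid email</userInvitation>")
                               .type(MediaType.APPLICATION_XML)
                               .build();
            }
            return Response.ok().build();
        }

        private boolean isEmailValid(String xmlBody) {
            // Placeholder for the real validation logic.
            return xmlBody != null && xmlBody.contains("@");
        }
    }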


  • how can I reliably check that requests to my service file have come from my website?

    - by woot586
    I have a service.php class that I use to service AJAX calls from my website. To prevent other people from accessing the service using PHP cURL, I would normally check that the request has come from my site, and if it hasn't, just redirect to my home page, e.g.:
    if($_SERVER['HTTP_REFERER'] != "http://www.mysite.com"){ header('location: http://www.mysite.com'); exit; }
    I read in the PHP holy bible (http://www.php.net/manual/en/reserved.variables.server.php) that "Not all user agents will set this, and some provide the ability to modify HTTP_REFERER as a feature. In short, it cannot really be trusted." So if this method is not reliable, my question is: how can I reliably check that requests to my service file have come from my website? Thanks for any help you can provide!
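    Not part of the original question: since HTTP_REFERER can be stripped or forged, one common alternative is a per-session token that the page embeds in each AJAX call and the service checks before doing any work. The original code is PHP; purely to illustrate the idea, a hedged Java servlet sketch with placeholder names:

    import java.io.IOException;
    import java.security.SecureRandom;
    import java.util.Base64;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    public class ServiceServlet extends HttpServlet {

        // Called by the page that renders the JavaScript: store a random token in the session
        // and embed the returned value in the AJAX request (e.g. as a form field or header).
        static String issueToken(HttpSession session) {
            byte[] raw = new byte[32];
            new SecureRandom().nextBytes(raw);
            String token = Base64.getUrlEncoder().withoutPadding().encodeToString(raw);
            session.setAttribute("serviceToken", token);
            return token;
        }

        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            Object expected = req.getSession().getAttribute("serviceToken");
            String presented = req.getParameter("token");
            if (expected == null || !expected.equals(presented)) {
                resp.sendError(HttpServletResponse.SC_FORBIDDEN);   // request did not come from our page
                return;
            }
            resp.getWriter().write("ok");   // the real service work goes here
        }
    }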


  • Rails gives wrong headers after upgrade 2.3.5 -> 2.3.8

    - by macsniper
    I just upgraded from Rails 2.3.5 to Rails 2.3.8, but now my redirects are not working properly. I get the following HTTP headers in the response:
    HTTP/1.1 302 Moved Temporarily
    Date: Wed, 02 Jun 2010 09:40:39 GMT
    Content-Length: 93
    Content-Type: text/html
    whereas previously I got:
    HTTP/1.1 302 Moved Temporarily
    Connection: close
    Date: Wed, 02 Jun 2010 09:41:18 GMT
    Set-Cookie: _session_id=<correct id>; path=/
    Status: 302 Found
    Location: <correct url>
    Cache-Control: no-cache
    Server: Mongrel 1.1.5
    Content-Type: text/html; charset=utf-8
    Content-Length: 93
    Does anyone know how to fix this? Besides the redirect not working, the login cookie is not set either (I think the two are related somehow). I have already tried to override redirect_to in order to set response.headers['Location'] etc., but they did not appear in the response.


  • Setting cookie for site in http and https under different subdomains in PHP

    - by nilacqua
    Situation:
    1. I'm trying to run an https store (xcart) under one domain, secure.mydomain, and I want to have access to a cookie it sets from http on www.mydomain.
    2. I'm running PHP on Apache (MAMP), testing in Firefox with Firecookie.
    3. The existing code sets cookies to .secure.mydomain. I'm not sure if this is xcart related, but setcookie is actually called using secure.mydomain; I'm not sure why the "." is prepended.
    Problems:
    1. When I try to use setcookie in https with the domain .mydomain or just mydomain, no cookie is created, whether I'm running the store under http or https. The testing code I'm using is:
    setcookie('three', 'two', 0, "/", ".mydomain");
    If I set the cookie to secure.mydomain or .secure.mydomain it does show up. Is there a reason the cookie isn't showing up?


  • Facebook iframe app redirecting https to http, how?

    - by Paul Whitrow
    I'm trying to get an app working within Facebook, but it seems that no matter what I try, including forcing https only in the app settings (see screen shot), the iframe source (Facebook's canvas) changes the https address to http (301), which then produces SEC7111: HTTPS security errors in IE. (Sorry, I can't post screen shots or extra links yet.) Header dump of the page in question:
    Request URL: https://[hidden]
    Request Method: POST
    Status Code: 301 Moved Permanently
    Request Headers (13)
    Form Data (1)
    Response Headers (view source):
    Connection: keep-alive
    Content-Encoding: gzip
    Content-Length: 253
    Content-Type: text/html; charset=iso-8859-1
    Date: Mon, 01 Jul 2013 09:42:32 GMT
    Location: http://[hidden]
    Server: Apache/2.2.22
    Vary: Accept-Encoding
    I'm getting quite confused by this and would welcome any help the community could offer.


  • Sending a file from memory (rather than disk) over HTTP using libcurl

    - by cinek1lol
    Hi! I would like to send pictures from a program written in C++. This works:
    WinExec("C:\\curl\\curl.exe -H Expect: -F \"fileupload=@C:\\curl\\ok.jpg\" -F \"xml=yes\" -# \"http://www.imageshack.us/index.php\" -o data.txt -A \"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1\" -e \"http://www.imageshack.us\"", NULL);
    However, I would like to load the picture into a char buffer first and send it from memory (you know what I mean? first I load the picture into a variable, then I send the variable), because right now I have to specify the path of the picture on disk. I also want to do this in C++ using the curl library, not by calling the .exe. I have found the following program (which I have modified a bit):
    #include <stdio.h>
    #include <string.h>
    #include <iostream>
    #include <curl/curl.h>
    #include <curl/types.h>
    #include <curl/easy.h>

    int main(int argc, char *argv[])
    {
        CURL *curl;
        CURLcode res;
        struct curl_httppost *formpost=NULL;
        struct curl_httppost *lastptr=NULL;
        struct curl_slist *headerlist=NULL;
        static const char buf[] = "Expect:";

        curl_global_init(CURL_GLOBAL_ALL);

        /* Fill in the file upload field */
        curl_formadd(&formpost, &lastptr,
                     CURLFORM_COPYNAME, "send",
                     CURLFORM_FILE, "nowy.jpg",
                     CURLFORM_END);
        curl_formadd(&formpost, &lastptr,
                     CURLFORM_COPYNAME, "nowy.jpg",
                     CURLFORM_COPYCONTENTS, "nowy.jpg",
                     CURLFORM_END);
        curl_formadd(&formpost, &lastptr,
                     CURLFORM_COPYNAME, "submit",
                     CURLFORM_COPYCONTENTS, "send",
                     CURLFORM_END);

        curl = curl_easy_init();
        headerlist = curl_slist_append(headerlist, buf);
        if(curl) {
            curl_easy_setopt(curl, CURLOPT_URL, "http://www.imageshack.us/index.php");
            if ( (argc == 2) && (!strcmp(argv[1], "xml=yes")) )
                curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);
            curl_easy_setopt(curl, CURLOPT_HTTPPOST, formpost);
            res = curl_easy_perform(curl);
            curl_easy_cleanup(curl);
            curl_formfree(formpost);
            curl_slist_free_all(headerlist);
        }
        system("pause");
        return 0;
    }


  • server not sending custom header values

    - by egza
    I'm using PHP 5.2.17 to fetch a remote page. The HTTP request contains some cookie values, but the cookies are not delivered to the destination page.
    $url = 'http://somesite.com/';
    $opts = array(
        'http' => array(
            'header' => array("Cookie: field1=value1; field2=value2\r\n")
        )
    );
    $context = stream_context_create($opts);
    echo file_get_contents($url, false, $context);
    Can you help me find the problem? Note: I can't use curl. Thanks.


  • nm-applet gone?

    - by welp
    nm-applet seems to have disappeared from my system. I am running 12.10. Here's what I get when I check my package manager logs: ? ~ grep network-manager /var/log/dpkg.log 2012-10-06 10:37:08 upgrade network-manager-gnome:amd64 0.9.6.2-0ubuntu5 0.9.6.2-0ubuntu6 2012-10-06 10:37:08 status half-configured network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:08 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:08 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:08 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:08 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:08 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:08 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:08 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:08 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:09 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu5 2012-10-06 10:37:09 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-06 10:37:09 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-06 10:39:50 configure network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-06 10:39:50 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-06 10:39:50 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-06 10:39:50 status half-configured network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-06 10:39:50 status installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 remove network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-configured network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status config-files network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-28 22:27:23 status config-files network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 install network-manager-gnome:amd64 0.9.6.2-0ubuntu6 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status half-installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status half-installed 
network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:03 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:06 configure network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:06 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:07 status unpacked network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:07 status half-configured network-manager-gnome:amd64 0.9.6.2-0ubuntu6 2012-10-31 19:58:07 status installed network-manager-gnome:amd64 0.9.6.2-0ubuntu6 ? ~ Unfortunately, I cannot find network-manager-applet package at all: ? ~ apt-cache search network-manager-applet ? ~ Here are the contents of /etc/apt/sources.list: ? ~ cat /etc/apt/sources.list # deb cdrom:[Ubuntu 12.04 LTS _Precise Pangolin_ - Release amd64 (20120425)]/ dists/precise/main/binary-i386/ # deb cdrom:[Ubuntu 12.04 LTS _Precise Pangolin_ - Release amd64 (20120425)]/ dists/precise/restricted/binary-i386/ # deb cdrom:[Ubuntu 12.04 LTS _Precise Pangolin_ - Release amd64 (20120425)]/ precise main restricted # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to # newer versions of the distribution. deb http://gb.archive.ubuntu.com/ubuntu/ quantal main restricted deb-src http://gb.archive.ubuntu.com/ubuntu/ quantal main restricted ## Major bug fix updates produced after the final release of the ## distribution. deb http://gb.archive.ubuntu.com/ubuntu/ quantal-updates main restricted deb-src http://gb.archive.ubuntu.com/ubuntu/ quantal-updates main restricted ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team. Also, please note that software in universe WILL NOT receive any ## review or updates from the Ubuntu security team. deb http://gb.archive.ubuntu.com/ubuntu/ quantal universe deb-src http://gb.archive.ubuntu.com/ubuntu/ quantal universe deb http://gb.archive.ubuntu.com/ubuntu/ quantal-updates universe deb-src http://gb.archive.ubuntu.com/ubuntu/ quantal-updates universe ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## multiverse WILL NOT receive any review or updates from the Ubuntu ## security team. deb http://gb.archive.ubuntu.com/ubuntu/ quantal multiverse deb-src http://gb.archive.ubuntu.com/ubuntu/ quantal multiverse deb http://gb.archive.ubuntu.com/ubuntu/ quantal-updates multiverse deb-src http://gb.archive.ubuntu.com/ubuntu/ quantal-updates multiverse ## N.B. software from this repository may not have been tested as ## extensively as that contained in the main release, although it includes ## newer versions of some applications which may provide useful features. ## Also, please note that software in backports WILL NOT receive any review ## or updates from the Ubuntu security team. 
deb http://gb.archive.ubuntu.com/ubuntu/ quantal-backports main restricted universe multiverse deb-src http://gb.archive.ubuntu.com/ubuntu/ quantal-backports main restricted universe multiverse deb http://security.ubuntu.com/ubuntu quantal-security main restricted deb-src http://security.ubuntu.com/ubuntu quantal-security main restricted deb http://security.ubuntu.com/ubuntu quantal-security universe deb-src http://security.ubuntu.com/ubuntu quantal-security universe deb http://security.ubuntu.com/ubuntu quantal-security multiverse deb-src http://security.ubuntu.com/ubuntu quantal-security multiverse ## Uncomment the following two lines to add software from Canonical's ## 'partner' repository. ## This software is not part of Ubuntu, but is offered by Canonical and the ## respective vendors as a service to Ubuntu users. # deb http://archive.canonical.com/ubuntu precise partner # deb-src http://archive.canonical.com/ubuntu precise partner ## This software is not part of Ubuntu, but is offered by third-party ## developers who want to ship their latest software. deb http://extras.ubuntu.com/ubuntu quantal main deb-src http://extras.ubuntu.com/ubuntu quantal main ? ~ Right now, I can't think of anything else. Happy to provide more info upon request.


  • WIF-less claim extraction from ACS: SWT

    - by Elton Stoneman
    WIF with SAML is solid and flexible, but unless you need the power, it can be overkill for simple claim assertion, and in the REST world WIF doesn’t have support for the latest token formats.  Simple Web Token (SWT) may not be around forever, but while it's here it's a nice easy format which you can manipulate in .NET without having to go down the WIF route. Assuming you have set up a Relying Party in ACS, specifying SWT as the token format: When ACS redirects to your login page, it will POST the SWT in the first form variable. It comes through in the BinarySecurityToken element of a RequestSecurityTokenResponse XML payload , the SWT type is specified with a TokenType of http://schemas.xmlsoap.org/ws/2009/11/swt-token-profile-1.0 : <t:RequestSecurityTokenResponse xmlns:t="http://schemas.xmlsoap.org/ws/2005/02/trust">   <t:Lifetime>     <wsu:Created xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2012-08-31T07:31:18.655Z</wsu:Created>     <wsu:Expires xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2012-08-31T09:11:18.655Z</wsu:Expires>   </t:Lifetime>   <wsp:AppliesTo xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy">     <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">       <Address>http://localhost/x.y.z</Address>     </EndpointReference>   </wsp:AppliesTo>   <t:RequestedSecurityToken>     <wsse:BinarySecurityToken wsu:Id="uuid:fc8d3332-d501-4bb0-84ba-d31aa95a1a6c" ValueType="http://schemas.xmlsoap.org/ws/2009/11/swt-token-profile-1.0" EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd" xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd"> [ base64string ] </wsse:BinarySecurityToken>   </t:RequestedSecurityToken>   <t:TokenType>http://schemas.xmlsoap.org/ws/2009/11/swt-token-profile-1.0</t:TokenType>   <t:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</t:RequestType>   <t:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</t:KeyType> </t:RequestSecurityTokenResponse> Reading the SWT is as simple as base-64 decoding, then URL-decoding the element value:     var wrappedToken = XDocument.Parse(HttpContext.Current.Request.Form[1]);     var binaryToken = wrappedToken.Root.Descendants("{http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd}BinarySecurityToken").First();     var tokenBytes = Convert.FromBase64String(binaryToken.Value);     var token = Encoding.UTF8.GetString(tokenBytes);     var tokenType = wrappedToken.Root.Descendants("{http://schemas.xmlsoap.org/ws/2005/02/trust}TokenType").First().Value; The decoded token contains the claims as key/value pairs, along with the issuer, audience (ACS realm), expiry date and an HMAC hash, which are in query string format. 
Separate them on the ampersand, and you can write out the claim values in your logged-in page:     var decoded = HttpUtility.UrlDecode(token);     foreach (var part in decoded.Split('&'))     {         Response.Write("<pre>" + part + "</pre><br/>");     } - which will produce something like this: http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationinstant=2012-08-31T06:57:01.855Z http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod=http://schemas.microsoft.com/ws/2008/06/identity/authenticationmethod/windows http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname=XYZ http://schemas.xmlsoap.org/ws/2005/05/identity/claims/[email protected] http://schemas.xmlsoap.org/ws/2005/05/identity/claims/[email protected] http://schemas.microsoft.com/accesscontrolservice/2010/07/claims/identityprovider=http://fs.svc.xyz.com/adfs/services/trust Audience=http://localhost/x.y.z ExpiresOn=1346402225 Issuer=https://x-y-z.accesscontrol.windows.net/ HMACSHA256=oDCeEDDAWEC8x+yBnTaCLnzp4L6jI0Z/xNK95PdZTts= The HMAC hash lets you validate the token to ensure it hasn’t been tampered with. You'll need the token signing key from ACS, then you can re-sign the token and compare hashes. There's a full implementation of an SWT parser and validator here: How To Request SWT Token From ACS And How To Validate It At The REST WCF Service Hosted In Windows Azure, and a cut-down claim inspector on my github code gallery: ACS Claim Inspector. Interestingly, ACS lets you have a value for your logged-in page which has no relation to the realm for authentication, so you can put this code into a generic claim inspector page, and set that to be your logged-in page for any relying party where you want to check what's being sent through. Particularly handy with ADFS, when you're modifying the claims provided, and want to quickly see the results.
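    The post shows the parsing in C#; what follows is only a hedged Java sketch of the re-sign-and-compare step it describes, assuming the usual SWT layout in which the HMAC covers everything before the trailing "&HMACSHA256=..." pair and the key is the base64 token-signing key taken from ACS.

    import java.net.URLDecoder;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;
    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;

    public class SwtValidator {
        public static boolean isSignatureValid(String swt, String base64SigningKey) throws Exception {
            int hmacIndex = swt.lastIndexOf("&HMACSHA256=");
            if (hmacIndex < 0) return false;
            String signedContent = swt.substring(0, hmacIndex);
            String presentedHmac = URLDecoder.decode(
                    swt.substring(hmacIndex + "&HMACSHA256=".length()), "UTF-8");

            // Re-sign the content with the ACS token-signing key and compare hashes.
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(Base64.getDecoder().decode(base64SigningKey), "HmacSHA256"));
            byte[] computed = mac.doFinal(signedContent.getBytes(StandardCharsets.UTF_8));
            String computedHmac = Base64.getEncoder().encodeToString(computed);

            // A constant-time comparison would be preferable in production code.
            return computedHmac.equals(presentedHmac);
        }
    }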


  • Why won't IE let users login to a website unless in In Private mode?

    - by Richard Fawcett
    I'm not entirely sure this belongs on SuperUser.com. I also considered ServerFault.com and StackOverflow.com, but on balance, I think it should belong here? We host a website which has the same code responding to multiple domain names. On 28th December (without any changes deployed to the website) a percentage of users suddenly could not login, and the blank login page was just rendered again even when the correct credentials were entered. The issue is still ongoing. After remote controlling an affected user's PC, we've found the following: The issue affects Internet Explorer 9. The user can login from the same machine on Chrome. The user can login from an In Private browser session using IE9. The user can login if the website is added to the Trusted Sites security zone. The user can NOT login from an IE session in safe mode (started with iexplore -extoff). Only one hostname that the website responds to prevents login, the same user account on the other hostname works fine (note that this is identical code and database running server side), even though that site is not in trusted sites zone. Series of HTTP requests in the failure case: GET request to protected page, returns a 302 FOUND response to login page. GET request to login page. POST to login page, containing credentials, returns redirect to protected page. GET request to protected page... for some reason auth fails and browser is redirected to login page, as in step 1. Other information: Operating system is Windows 7 Ultimate Edition. AV system is AVG Internet Security 2012. I can think of lots of things that could be going wrong, but in every case, one of the findings above is incompatible with the theory. Any ideas what is causing login to fail? Update 06-Jan-2012 Enhanced logging has shown that the .ASPXAUTH cookie is being set in step 3. Its expiry date is 28 days in the future, its path is /, the domain is mysite.com, and its value is an encrypted forms ticket, as expected. However, the cookie is not being received by the web server during step 4. Other cookies are being presented to the server during step 4, it's just this one that is missing. I've seen that cookies are usually set with a domain starting with a period, but mine isn't. Should it be .mysite.com instead of mysite.com? However, if this was wrong, it would presumably affect all users?


  • WCF GZip Compression Request/Response Processing

    - by IanT8
    How do I get a WCF client to process server responses which have been gzipped or deflated by IIS? On IIS, I've followed the instructions here on how to make IIS 6 gzip all responses (where the request contained "Accept-Encoding: gzip, deflate") emitted by .svc WCF services. On the client, I've followed the instructions here and here on how to inject this header into the web request: "Accept-Encoding: gzip, deflate". Fiddler2 shows the response is binary and not plain old XML. The client crashes with an exception which basically says there's no XML header, which of course is true. In my IClientMessageInspector, the app crashes before AfterReceiveReply is called. Some further notes: (1) I can't change the WCF service or client as they are supplied by a 3rd party. I can, however, attach behaviors and/or message inspectors via configuration if this is the right direction to take. (2) I don't want to compress/uncompress just the SOAP body, but the entire message. Any ideas/solutions?
    * SOLVED * It was not possible to write a WCF extension to achieve these goals. Instead I followed this CodeProject article, which advocates a helper class:
    public class CompressibleHttpRequestCreator : IWebRequestCreate
    {
        public CompressibleHttpRequestCreator() { }

        WebRequest IWebRequestCreate.Create(Uri uri)
        {
            HttpWebRequest httpWebRequest = Activator.CreateInstance(typeof(HttpWebRequest),
                BindingFlags.CreateInstance | BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance,
                null, new object[] { uri, null }, null) as HttpWebRequest;

            if (httpWebRequest == null)
            {
                return null;
            }

            httpWebRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

            return httpWebRequest;
        }
    }
    and also an addition to the application configuration file:
    <configuration>
      <system.net>
        <webRequestModules>
          <remove prefix="http:"/>
          <add prefix="http:" type="Pajocomo.Net.CompressibleHttpRequestCreator, Pajocomo" />
        </webRequestModules>
      </system.net>
    </configuration>
    What seems to be happening is that WCF eventually asks some factory or other deep down in System.Net to provide an HttpWebRequest instance, and we provide the helper that will be asked to create the required instance. In the WCF client configuration file, a simple basicHttpBinding is all that is required, without the need for any custom extensions. When the application runs, the client HTTP request contains the header "Accept-Encoding: gzip, deflate", the server returns a gzipped web response, and the client transparently decompresses the HTTP response before handing it over to WCF. When I tried to apply this technique to Web Services I found that it did NOT work. Although the helper class was executed in the same way as when used by the WCF client, the HTTP request did not contain the "Accept-Encoding: ..." header. To make this work for Web Services, I had to edit the Web Service proxy class and add this method:
    protected override System.Net.WebRequest GetWebRequest(Uri uri)
    {
        System.Net.HttpWebRequest rq = (System.Net.HttpWebRequest)base.GetWebRequest(uri);
        rq.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
        return rq;
    }
    Note that it did not matter whether the CompressibleHttpRequestCreator and the block from the application config file were present or not. For Web Services, only overriding GetWebRequest in the Web Service proxy worked.


  • IIS logs show sc-win32-status=64 but only through some networks

    - by wweicker
    I have an ASP.NET application running on a client server (W2k3, IIS6, .NET 2.0). FWIW, this is a Test instance, it hasn't been moved into Production yet. So it is not running under SSL, load balancing, etc. When I access one of the pages on their server from our office, the page gets hit once. Inspecting the IIS logs (c:WINDOWS\system32\LogFiles\W3SVC1) show a GET for that page, then I push a button on the page and the log file shows a POST. This seems to be working fine so far. Now when I remote into the client's network and access the page from one of their local machines, the log file shows a GET, then I push the button on the page and the log shows two POSTs at the same second. The first one shows status (sc-status, sc-substatus, sc-win32-status) 200 0 64, the second shows 200 0 0. In the log file, both POSTs are identical. Basically the log looks like this (except I masked some of the data): #Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status 2009-08-11 20:19:32 x.x.x.x GET /File.aspx - 80 - y.y.y.y Mozilla/4.0+(compatible;+MSIE+8.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+.NET+CLR+3.5.21022;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+MDDR;+OfficeLiveConnector.1.4;+OfficeLivePatch.0.0) 200 0 0 2009-08-11 20:19:45 x.x.x.x POST /File.aspx - 80 - y.y.y.y Mozilla/4.0+(compatible;+MSIE+8.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+.NET+CLR+3.5.21022;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+MDDR;+OfficeLiveConnector.1.4;+OfficeLivePatch.0.0) 200 0 64 2009-08-11 20:19:45 x.x.x.x POST /File.aspx - 80 - y.y.y.y Mozilla/4.0+(compatible;+MSIE+8.0;+Windows+NT+6.0;+WOW64;+Trident/4.0;+SLCC1;+.NET+CLR+2.0.50727;+.NET+CLR+3.5.21022;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618;+MDDR;+OfficeLiveConnector.1.4;+OfficeLivePatch.0.0) 200 0 0 The problem is, the page is getting hit twice. The database performs an operation for the first request, then the second request detects that a duplicate operation is being performed and throws an error message. The users think their operation failed, but it actually succeeded. The error description of sc-win32-status 64 is: "The specified network name is no longer available." This leads me to believe, given that both POST requests show an HTTP status of 200, that the server is successful in serving the request, but the client is never notified and resubmits the request. How can I troubleshoot this? Any ideas what could be causing this behavior on their internal network only? I should mention, this is happening at two separate client sites, but does not happen at six of our other client sites, or in our office, or connecting to any of our eight clients over the web. What could be making this reproducible 100% of the time on their local network but 0% of the time anywhere else? Update: I found a very small number of the duplicated POST requests had sc-win32-status of 995 instead of 64 as originally reported. The error description of sc-win32-status=995 is: "The I/O operation has been aborted because of either a thread exit or an application request." This doesn't make any sense (considering I have full access to the code). I still don't understand how or why this issue is occurring, but the new error code leads me to believe it may not be a network issue after all and I am now investigating the possibility of a random code bug.


  • Apache logs 200 instead of 404

    - by elle
    I've been getting the following in my apache access log: "GET /work//?module=www&section=working=../../../../../../../../../../../../../../../../../../../../../../../..//proc/self/environ%0000 HTTP/1.1" 200 5187 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.12) Gecko/20101026 Firefox/3.6.12\",\"Mozilla/5.0 (Windows; U; Windows NT 5.1; pl-PL; rv:1.8.1.24pre) Gecko/20100228 K-Meleon/1.5.4\",\"Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/540.0 (KHTML,like Gecko) Chrome/9.1.0.0 Safari/540.0\",\"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Comodo_Dragon/4.1.1.11 Chrome/4.1.249.1042 Safari/532.5\",\"Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US; rv:1.9.0.16) Gecko/2009122206 Firefox/3.0.16 Flock/2.5.6\",\"Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.1 (KHTML, like Gecko) Maxthon/3.0.8.2 Safari/533.1\",\"Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.8.1.8pre) Gecko/20070928 Firefox/2.0.0.7 Navigator/9.0RC1\",\"Opera/9.99 (Windows NT 5.1; U; pl) Presto/9.9.9\",\"Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-HK) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5\",\"Seamonkey-1.1.13-1(X11; U; GNU Fedora fc 10) Gecko/20081112\",\"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0; .NET CLR 2.0.50727; SLCC2; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; Zune 4.0; Tablet PC 2.0; InfoPath.3; .NET4.0C; .NET4.0E)\",\"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MS-RTC LM 8; .NET4.0C; .NET4.0E; InfoPath.3)" If I try the URL, I get a 404 instead of 200 which the above request received. Is there a way I can confirm that the 200 was real and not spoofed? Where is the long info on the client coming from?


  • Any way to turn off quips in OOWeb?

    - by Misha Koshelev
    http://ooweb.sourceforge.net/tutorial.html Not really a question, but I can't seem to stop writing stuff like this. Maybe someone will find it useful. I know rewriting an HTTP server is not the way to turn off the quips ;) /* Copyright 2010 Misha Koshelev. All Rights Reserved. */ package com.mksoft.common; import java.io.BufferedReader; import java.io.InputStreamReader; import java.io.IOException; import java.io.PrintWriter; import java.io.UnsupportedEncodingException; import java.net.URLDecoder; import java.text.SimpleDateFormat; import java.util.Date; import java.util.LinkedHashMap; import java.net.ServerSocket; import java.net.Socket; /** * Simple HTTP Server. * * @author Misha Koshelev */ public class HttpServer extends Thread { /* * Constants */ /** * 404 Not Found Result */ protected final static String result404NotFound="<html><head><title>404 Not Found</title></head><body bgcolor='#ffffff'><h1>404 Not Found</h1></body></html>"; /* * Variables */ /** * Port on which HTTP server handles requests. */ protected int port; public int getPort() { return port; } public void setPort(int _port) { port=_port; } /* * Constructors */ public HttpServer(int _port) { setPort(_port); } /* * Helpers */ /** * Errors */ protected void error(String message) { System.err.println(message); System.err.flush(); } /** * Debugging */ protected boolean debugOutput=true; protected void debug(String message) { if (debugOutput) { error(message); } } /** * Lock object */ private Object lock=new Object(); /** * Should we quit? */ protected boolean doQuit=false; /** * Are we done? */ protected boolean areWeDone=false; /** * Process POST request headers */ protected String processPostRequest(String url,LinkedHashMap<String,String> headers,String inputLine) { debug("HttpServer.processPostRequest: url=\""+url); if (debugOutput) { for (String key: headers.keySet()) { debug("HttpServer.processPostRequest: headers."+key+"=\""+headers.get(key)+"\""); } } debug("HttpServer.processPostRequest: inputLine=\""+inputLine+"\""); try { inputLine=new URLDecoder().decode(inputLine,"UTF-8"); } catch (UnsupportedEncodingException uee) { uee.printStackTrace(); } String[] keyValues=inputLine.split("&"); LinkedHashMap<String,String> post=new LinkedHashMap<String,String>(); for (int i=0;i<keyValues.length;i++) { String keyValue=keyValues[i]; int equals=keyValue.indexOf('='); String key=keyValue.substring(0,equals); String value=keyValue.substring(equals+1); post.put(key,value); } return post(url,headers,post); } /** * Server loop (here for exception handling purposes) */ protected void serverLoop() throws IOException { /* Start server socket */ ServerSocket serverSocket=null; try { serverSocket=new ServerSocket(getPort()); } catch (IOException ioe) { ioe.printStackTrace(); System.exit(1); } Socket clientSocket=null; while (true) { /* Quit if necessary */ if (doQuit) { break; } /* Accept incoming connections */ try { clientSocket=serverSocket.accept(); } catch (IOException ioe) { ioe.printStackTrace(); System.exit(1); } /* Read request */ BufferedReader in=null; String inputLine=null; String firstLine=null; String blankLine=null; LinkedHashMap<String,String> headers=new LinkedHashMap<String,String>(); try { in=new BufferedReader(new InputStreamReader(clientSocket.getInputStream())); while (true) { if (blankLine==null) { inputLine=in.readLine(); } else { /* POST request, read Content-length bytes */ int contentLength=new Integer(headers.get("Content-Length")).intValue(); StringBuilder sb=new StringBuilder(contentLength); for (int 
i=0;i<contentLength;i++) { sb.append((char)in.read()); } inputLine=sb.toString(); break; } if (firstLine==null) { firstLine=inputLine; } else if (blankLine==null) { if (inputLine.equals("")) { if (firstLine.startsWith("GET ")) { break; } blankLine=inputLine; } else { int colon=inputLine.indexOf(": "); String key=inputLine.substring(0,colon); String value=inputLine.substring(colon+2); headers.put(key,value); } } } } catch (IOException ioe) { ioe.printStackTrace(); } /* Process request */ String result=null; firstLine=firstLine.replaceAll(" HTTP/.*",""); if (firstLine.startsWith("GET ")) { result=get(firstLine.replaceFirst("GET ",""),headers); } else if (firstLine.startsWith("POST ")) { result=processPostRequest(firstLine.replaceFirst("POST ",""),headers,inputLine); } else { error("HttpServer.ServerLoop: Unhandled request \""+firstLine+"\""); } debug("HttpServer.ServerLoop: result=\""+result+"\""); /* Send response */ PrintWriter out=null; try { out=new PrintWriter(clientSocket.getOutputStream(),true); } catch (IOException ioe) { ioe.printStackTrace(); } if (result!=null) { out.println("HTTP/1.1 200 OK"); } else { out.println("HTTP/1.0 404 Not Found"); result=result404NotFound; } Date now=new Date(); out.println("Date: "+new SimpleDateFormat("EEE, d MMM yyyy HH:mm:ss z").format(now)); out.println("Content-Type: text/html; charset=UTF-8"); out.println("Content-Length: "+result.length()); out.println(""); out.print(result); /* Clean up */ out.close(); if (in!=null) { in.close(); } clientSocket.close(); } serverSocket.close(); areWeDone=true; synchronized(lock) { lock.notifyAll(); } } /* * Methods */ /** * Run server on port specified in constructor. */ public void run() { try { serverLoop(); } catch (IOException ioe) { ioe.printStackTrace(); System.exit(1); } } /** * Process GET request (should be overwritten). */ public String get(String url,LinkedHashMap<String,String> headers) { debug("HttpServer.get: url=\""+url+"\""); if (debugOutput) { for (String key: headers.keySet()) { debug("HttpServer.get: headers."+key+"=\""+headers.get(key)+"\""); } } if (url.equals("/")) { return "<html><head><title>HttpServer GET Test Page</title></head>\r\n"+ "<body bgcolor='#ffffff'>\r\n"+ "<center><h1>HttpServer GET Test Page</h1></center>\r\n"+ "<hr />\r\n"+ "<center><table>\r\n"+ "<form method='post' action='/'>\r\n"+ "<tr><td align=right>Test 1:</td>\r\n"+ " <td><input type='text' name='text 1' value='test me !!! !@#$'></td></tr>\r\n"+ "<tr><td align=right>Test 2:</td>\r\n"+ " <td><input type='text' name='text 2' value='type smthng'></td></tr>\r\n"+ "<tr><td>&nbsp;</td>\r\n"+ " <td align=right><input type='submit' value='Submit'></td></tr>\r\n"+ "</form>\r\n"+ "</table></center>\r\n"+ "<hr />\r\n"+ "<center><a href='/quit'>Shutdown Server</a></center>\r\n"+ "</html>"; } else if (url.equals("/quit")) { quit(); return ""; } else { return null; } } /** * Process POST request (should be overwritten). 
*/ public String post(String url,LinkedHashMap<String,String> headers,LinkedHashMap<String,String> post) { debug("HttpServer.post: url=\""+url+"\""); if (debugOutput) { for (String key: headers.keySet()) { debug("HttpServer.post: headers."+key+"=\""+headers.get(key)+"\""); } } if (url.equals("/")) { String result="<html><head><title>HttpServer Post Test Page</title></head>\r\n"+ "<body bgcolor='#ffffff'>\r\n"+ "<center><h1>HttpServer Post Test Page</h1></center>\r\n"+ "<hr />\r\n"+ "<center><table>\r\n"+ "<tr><th>Key</th><th>Value</th></tr>\r\n"; for (String key: post.keySet()) { result+="<tr><td align=right>"+key+"</td><td align=left>"+post.get(key)+"</td></tr>\r\n"; } result+="</table></center>\r\n"+ "</html>"; return result; } else { return null; } } /** * Wait for server to quit. */ public void waitForCompletion() { while (areWeDone==false) { synchronized(lock) { try { lock.wait(); } catch (InterruptedException ie) { } } } } /** * Shutdown server. */ public void quit() { doQuit=true; } }
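    A short usage sketch for the class above (not part of the original post): subclass HttpServer, override get()/post() for your own URLs while falling back to the built-in test pages, start the thread, and block until someone requests /quit. The port and class names here are arbitrary.

    import java.util.LinkedHashMap;
    import com.mksoft.common.HttpServer;

    public class HelloServer extends HttpServer {
        public HelloServer(int port) { super(port); }

        @Override
        public String get(String url, LinkedHashMap<String, String> headers) {
            if (url.equals("/hello")) {
                return "<html><body><h1>Hello from HttpServer</h1></body></html>";
            }
            return super.get(url, headers);   // keep the default test page and /quit handling
        }

        public static void main(String[] args) {
            HelloServer server = new HelloServer(8080);
            server.start();                   // HttpServer extends Thread, so this runs serverLoop()
            server.waitForCompletion();       // returns after a request to /quit
        }
    }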

