Search Results

Search found 64048 results on 2562 pages for 'http post'.


  • What's the difference between GET and POST encryption?

    - by Dju
    What is the difference when encrypting GET and POST data? Thanks for any answer. Edit: I need to be more specific. When HTTPS/SSL encrypts both of these methods, what is the difference in the way the browser does it? Which parts are encrypted and which are not? I read somewhere that the destination URL is not encrypted in a POST; is that true? If it is true, and the same applies to GET, where are all the parameters?
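
    For context, a minimal illustration of where the parameters travel on the wire (the host and values are hypothetical). Under HTTPS, TLS encrypts the entire request, including the path, query string, headers and body; only the destination hostname is exposed, via DNS and the TLS handshake:

        GET /search?q=secret HTTP/1.1              <- parameters ride in the URL's query string
        Host: example.com

        POST /search HTTP/1.1                      <- parameters ride in the request body
        Host: example.com
        Content-Type: application/x-www-form-urlencoded
        Content-Length: 8

        q=secret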

    Read the article

  • WordPress - Get Post Number in a Category

    - by Rfvgyhn
    Is there a way for me to get a pseudo-ID of a post from the category it belongs to? Let's say I have these posts:

        post_id | post_title | post_cat
        --------+------------+---------
        0       | a post     | cat1
        1       | a post1    | cat2
        2       | a post2    | cat1
        3       | a post3    | cat2
        ...
        57      | a post57   | cat2

    I want the posts from cat2, with the posts' IDs relative to the category they were posted in. Something like:

        post_id | post_title | post_cat | cat_post_id
        --------+------------+----------+------------
        1       | a post1    | cat2     | 1
        3       | a post3    | cat2     | 2
        57      | a post57   | cat2     | 3
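
    One possible direction, sketched in plain SQL rather than the WordPress API (the table and column names follow the example above, and this assumes a database with window-function support, e.g. MySQL 8+ or PostgreSQL):

        -- number the posts within each category by ascending post_id
        SELECT post_id,
               post_title,
               post_cat,
               ROW_NUMBER() OVER (PARTITION BY post_cat ORDER BY post_id) AS cat_post_id
        FROM   posts
        WHERE  post_cat = 'cat2';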

    Read the article

  • Looping through an array in PHP to post several multipart form-data requests

    - by Léon Pelletier
    I'm trying, in an ASP web application, to code a function that would loop through a list of files in a multiple-upload form and send them one by one. Is this something that can be done in ASP? I've read some posts about how to attach several files together, but saw nothing about looping through the files. I can easily imagine it in C# via HttpWebRequest or with a socket, but in PHP I guess there are already functions designed to handle it?

        // This is false/pseudo-code :)
        for (int index = 0; index < number_of_files; index++) {
            postfile(file[index]);
        }

    In each iteration, it should send a multipart form-data POST; postfile(TheFileInfos) should make a POST like this:

        POST /afs.aspx?fn=upload HTTP/1.1
        [Header stuff]
        Content-Type: multipart/form-data; boundary=----------Ef1Ef1cH2Ij5GI3ae0gL6KM7GI3GI3
        [Header stuff]

        ------------Ef1Ef1cH2Ij5GI3ae0gL6KM7GI3GI3
        Content-Disposition: form-data; name="Filename"

        myimage1.png
        ------------Ef1Ef1cH2Ij5GI3ae0gL6KM7GI3GI3
        Content-Disposition: form-data; name="fileid"

        58e21ede4ead43a5201206101806420000007667212251
        ------------Ef1Ef1cH2Ij5GI3ae0gL6KM7GI3GI3
        Content-Disposition: form-data; name="Filedata"; filename="myimage1.png"
        Content-Type: application/octet-stream

        [Octet Stream]

    [Edit] I'll try it:

        <html>
        <head>
        <title>Untitled Document</title>
        <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
        </head>
        <body>
        <form name="form1" enctype="multipart/form-data" method="post" action="processFiles.php">
          <p>
          <?php // start of dynamic form
          $uploadNeed = $_POST['uploadNeed'];
          for ($x = 0; $x < $uploadNeed; $x++) { ?>
            <input name="uploadFile<?php echo $x; ?>" type="file" id="uploadFile<?php echo $x; ?>">
          </p>
          <?php } // end of for loop ?>
          <p><input name="uploadNeed" type="hidden" value="<?php echo $uploadNeed; ?>">
             <input type="submit" name="Submit" value="Submit">
          </p>
        </form>
        </body>
        </html>
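
    A sketch of the sending side in PHP using the cURL extension (PHP 5.5+ for CURLFile; the endpoint host is hypothetical, and $fileId stands in for however your application generates the fileid token):

        <?php
        // Send each uploaded file as its own multipart/form-data POST
        function postfile($name, $tmpPath, $fileId) {
            $ch = curl_init('http://example.com/afs.aspx?fn=upload');
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            // CURLFile builds the Filedata part, boundary and headers included
            curl_setopt($ch, CURLOPT_POSTFIELDS, array(
                'Filename' => $name,
                'fileid'   => $fileId,
                'Filedata' => new CURLFile($tmpPath, 'application/octet-stream', $name),
            ));
            $response = curl_exec($ch);
            curl_close($ch);
            return $response;
        }

        foreach ($_FILES as $file) {
            if ($file['error'] === UPLOAD_ERR_OK) {
                postfile($file['name'], $file['tmp_name'], uniqid('', true));
            }
        }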

    Read the article

  • PHP - getting content of the POST request

    - by user364622
    I have a problem with getting the content. I don't know the names of the POST variables, so I can't use $_POST['name'], because I don't know the "name". I want to catch all of the variables sent by the POST method. How can I get the keys of the $_POST array and the values related to them?
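
    A minimal sketch of both common cases, using only PHP built-ins:

        <?php
        // Walk every form field that arrived in the POST, names unknown in advance
        foreach ($_POST as $key => $value) {
            echo htmlspecialchars($key) . ' => ' . htmlspecialchars($value) . "\n";
        }

        // For a non-form body (e.g. raw JSON or XML), read the input stream instead
        $rawBody = file_get_contents('php://input');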

    Read the article

  • Get info about HTTP POST field order

    - by Eugene
    Is it possible to get information about POST field order in ASP.NET? I need to know whether some field was the last one or not. I know I can do it through Request.InputStream, but I'm looking for a more high-level solution that avoids parsing the stream manually. Generally, I'm testing HTTP POSTs sent by my application; there is no practical use for this within ASP.NET itself.
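
    One possible shortcut, sketched on the assumption that relying on undocumented behavior is acceptable for a test: Request.Form is a NameValueCollection, and ASP.NET fills it while parsing the body front to back, so the key order normally mirrors the field order (repeated field names are merged into their first key, so this is not airtight):

        // hedged sketch; "lastExpectedField" is a hypothetical field name
        string[] keys = Request.Form.AllKeys;
        string lastField = keys.Length > 0 ? keys[keys.Length - 1] : null;
        bool wasLast = lastField == "lastExpectedField";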

    Read the article

  • IIS 401.3 - Unauthorized on only 1 server out of 3 set up for network load balancing

    - by Tony
    Over the weekend our server admin set up two virtual Windows 2008 machines with IIS installed and set them up under NLB. I came in and changed the application pool the website was running under to our domain account that has proper access to the database and the file share hosting our .NET web application, Sitefinity, and changed it to .NET 4 Integrated. NLB and everything was running fine on both servers.

    He brought up the third server for our cluster on Tuesday and I performed the same actions. The only difference was that I was given admin rights for the third server so I could set it up remotely instead of going to his office. He has full control over the share and the NTFS perms on \\hostname\Sitefinity, and I believe I only had read access. I pointed the web site to the same \\hostname\Sitefinity\sitename share that the others were on, and the authentication/authorization test settings passed. I hit the site from http://localhost (like I did successfully from the other two before trying the cluster's IP address) and I received an HTTP Error 401.3 - Unauthorized.

    I've verified many times that the application pool is running under the same service account. I tried hitting just a simple test.htm; it works fine on both of the first two servers, but I get the same 401.3 on the third. I copied my dev project to the local inetpub directory, re-pointed the website, and that ran perfectly. I turned on Failed Request Tracing, and it acts like it's still running under the local IUSR account (instead of my domain account)? Here is an excerpt of the File Cache Access Start and the error from the trace:

        FileName                 \\hostname\sitefinity\sitename\test.htm
        UserName                 IUSR
        DomainName               NT AUTHORITY
        ----------
        Successful               false
        FileFromCache            false
        FileAddedToCache         false
        FileDirmoned             true
        LastModCheckErrorIgnored true
        ErrorCode                2147942405
        LastModifiedTime
        ErrorCode                Access is denied. (0x80070005)
        ----------
        ModuleName               IIS Web Core
        Notification             2
        HttpStatus               401
        HttpReason               Unauthorized
        HttpSubStatus            3
        ErrorCode                2147942405
        ConfigExceptionInfo
        Notification             AUTHENTICATE_REQUEST
        ErrorCode                Access is denied. (0x80070005)

    My personal AD account was then granted read/write perms to the share, so I created a new application pool and set the site under it in case there was an issue with the application pool, but no success. I created another under my own account and it still failed. It just seems like it's not trying to access the files under the account my application pools are running under, although that's the only way I've done things before. I set the Physical Path Credentials in Advanced Settings on the site to the service account and it threw a 500 error of some sort, so I assume that's not the answer (and I don't have to do it on the other servers). It's like somehow I'm trying to force impersonation on the IUSR account or something?

    Read the article

  • Logging Into a site that uses Live.com authentication with C#

    - by Josh
    I've been trying to automate a log in to a website I frequent, www.bungie.net. The site is associated with Microsoft and Xbox Live, and as such makes use of the Windows Live ID API when people log in to their site. I am relatively new to creating web spiders/robots, and I worry that I'm misunderstanding some of the most basic concepts. I've simulated logins to other sites such as Facebook and Gmail, but live.com has given me nothing but trouble. Anyway, I've been using Wireshark and the Firefox addon Tamper Data to try and figure out what I need to post, and what cookies I need to include with my requests. As far as I know, these are the steps one must follow to log in to this site:

    1. Visit https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917
    2. Receive the cookies MSPRequ and MSPOK.
    3. Post the values from the form ID "PPSX", the values from the form ID "PPFT", your username, and your password, all to a changing URL similar to https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct= (there are a few numbers that change at the end of that URL).
    4. Live.com returns the user a page with more hidden forms to post. The client then posts the values from the form "ANON", the value from the form "ANONExp", and the values from the form "t" to the URL http://www.bungie.net/Default.aspx?wa=wsignin1.0
    5. After posting that data, the user is returned a variety of cookies, the most important of which is "BNGAuth", which is the log-in cookie for the site.

    Where I am having trouble is on the fifth step, but that doesn't necessarily mean I've done all the other steps correctly. I post the data from "ANON", "ANONExp" and "t", but instead of being returned a BNGAuth cookie, I'm returned a cookie named "RSPMaybe" and redirected to the home page. When I reviewed the Wireshark log, I noticed something that instantly stood out to me as different between the log when I logged in with Firefox and when my program ran. It could be nothing, but I'll include the picture here for you to review. I'm being returned an HTTP packet from the site before I post the data in the fourth step. I'm not sure how this is happening, but it must be a side effect from something I'm doing wrong in the HTTPS steps.
    Screenshot: http://img391.imageshack.us/img391/6049/31394881.gif

        using System;
        using System.Collections.Generic;
        using System.Collections.Specialized;
        using System.Text;
        using System.Net;
        using System.IO;
        using System.IO.Compression;
        using System.Security.Cryptography;
        using System.Security.Cryptography.X509Certificates;
        using System.Web;

        namespace SpiderFromScratch
        {
            class Program
            {
                static void Main(string[] args)
                {
                    CookieContainer cookies = new CookieContainer();
                    Uri url = new Uri("https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917");
                    HttpWebRequest http = (HttpWebRequest)HttpWebRequest.Create(url);
                    http.Timeout = 30000;
                    http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)";
                    http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
                    http.Headers.Add("Accept-Language", "en-us,en;q=0.5");
                    http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
                    http.Headers.Add("Keep-Alive", "300");
                    http.Referer = "http://www.bungie.net/";
                    http.ContentType = "application/x-www-form-urlencoded";
                    http.CookieContainer = new CookieContainer();
                    http.Method = WebRequestMethods.Http.Get;

                    HttpWebResponse response = (HttpWebResponse)http.GetResponse();
                    StreamReader readStream = new StreamReader(response.GetResponseStream());
                    string HTML = readStream.ReadToEnd();
                    readStream.Close();

                    // gets the cookies (they are set in the eighth header)
                    string[] strCookies = response.Headers.GetValues(8);
                    response.Close();

                    string name, value;
                    Cookie manualCookie;
                    for (int i = 0; i < strCookies.Length; i++)
                    {
                        name = strCookies[i].Substring(0, strCookies[i].IndexOf("="));
                        value = strCookies[i].Substring(strCookies[i].IndexOf("=") + 1,
                                strCookies[i].IndexOf(";") - strCookies[i].IndexOf("=") - 1);
                        manualCookie = new Cookie(name, "\"" + value + "\"");
                        Uri manualURL = new Uri("http://login.live.com");
                        http.CookieContainer.Add(manualURL, manualCookie);
                    }

                    // stores the cookies to be used later
                    cookies = http.CookieContainer;

                    // Get the PPSX value
                    string PPSX = HTML.Remove(0, HTML.IndexOf("PPSX"));
                    PPSX = PPSX.Remove(0, PPSX.IndexOf("value") + 7);
                    PPSX = PPSX.Substring(0, PPSX.IndexOf("\""));

                    // Get this random PPFT value
                    string PPFT = HTML.Remove(0, HTML.IndexOf("PPFT"));
                    PPFT = PPFT.Remove(0, PPFT.IndexOf("value") + 7);
                    PPFT = PPFT.Substring(0, PPFT.IndexOf("\""));

                    // Get the random URL you POST to
                    string POSTURL = HTML.Remove(0, HTML.IndexOf("https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct="));
                    POSTURL = POSTURL.Substring(0, POSTURL.IndexOf("\""));

                    // POST with cookies
                    http = (HttpWebRequest)HttpWebRequest.Create(POSTURL);
                    http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)";
                    http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
                    http.Headers.Add("Accept-Language", "en-us,en;q=0.5");
                    http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
                    http.Headers.Add("Keep-Alive", "300");
                    http.CookieContainer = cookies;
                    http.Referer = "https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268158321&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917";
                    http.ContentType = "application/x-www-form-urlencoded";
                    http.Method = WebRequestMethods.Http.Post;

                    Stream ostream = http.GetRequestStream();

                    // used to convert strings into bytes
                    System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();

                    // Post information
                    byte[] buffer = encoding.GetBytes("PPSX=" + PPSX + "&PwdPad=IfYouAreReadingThisYouHaveTooMuc&login=YOUREMAILGOESHERE&passwd=YOURWORDGOESHERE" +
                        "&LoginOptions=2&PPFT=" + PPFT);
                    ostream.Write(buffer, 0, buffer.Length);
                    ostream.Close();

                    HttpWebResponse response2 = (HttpWebResponse)http.GetResponse();
                    readStream = new StreamReader(response2.GetResponseStream());
                    HTML = readStream.ReadToEnd();
                    response2.Close();
                    ostream.Dispose();

                    foreach (Cookie cookie in response2.Cookies)
                    {
                        Console.WriteLine(cookie.Name + ": ");
                        Console.WriteLine(cookie.Value);
                        Console.WriteLine(cookie.Expires);
                        Console.WriteLine();
                    }

                    // SET POSTURL value
                    string POSTANON = "http://www.bungie.net/Default.aspx?wa=wsignin1.0";

                    // Get the ANON value
                    string ANON = HTML.Remove(0, HTML.IndexOf("ANON"));
                    ANON = ANON.Remove(0, ANON.IndexOf("value") + 7);
                    ANON = ANON.Substring(0, ANON.IndexOf("\""));
                    ANON = HttpUtility.UrlEncode(ANON);

                    // Get the ANONExp value
                    string ANONExp = HTML.Remove(0, HTML.IndexOf("ANONExp"));
                    ANONExp = ANONExp.Remove(0, ANONExp.IndexOf("value") + 7);
                    ANONExp = ANONExp.Substring(0, ANONExp.IndexOf("\""));
                    ANONExp = HttpUtility.UrlEncode(ANONExp);

                    // Get the t value
                    string t = HTML.Remove(0, HTML.IndexOf("id=\"t\""));
                    t = t.Remove(0, t.IndexOf("value") + 7);
                    t = t.Substring(0, t.IndexOf("\""));
                    t = HttpUtility.UrlEncode(t);

                    // POST the Info and Accept the Bungie Cookies
                    http = (HttpWebRequest)HttpWebRequest.Create(POSTANON);
                    http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)";
                    http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
                    http.Headers.Add("Accept-Language", "en-us,en;q=0.5");
                    http.Headers.Add("Accept-Encoding", "gzip,deflate");
                    http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
                    http.Headers.Add("Keep-Alive", "115");
                    http.CookieContainer = new CookieContainer();
                    http.ContentType = "application/x-www-form-urlencoded";
                    http.Method = WebRequestMethods.Http.Post;
                    http.Expect = null;

                    ostream = http.GetRequestStream();
                    int test = ANON.Length;
                    int test1 = ANONExp.Length;
                    int test2 = t.Length;
                    buffer = encoding.GetBytes("ANON=" + ANON + "&ANONExp=" + ANONExp + "&t=" + t);
                    ostream.Write(buffer, 0, buffer.Length);
                    ostream.Close();

                    // Here lies the problem, I am not returned the correct cookies.
                    HttpWebResponse response3 = (HttpWebResponse)http.GetResponse();
                    GZipStream gzip = new GZipStream(response3.GetResponseStream(), CompressionMode.Decompress);
                    readStream = new StreamReader(gzip);
                    HTML = readStream.ReadToEnd();

                    // gets both cookies
                    string[] strCookies2 = response3.Headers.GetValues(11);
                    response3.Close();
                }
            }
        }

    This has given me problems and I've put many hours into learning about HTTP protocols, so any help would be appreciated. If there is an article detailing a similar log in to live.com, feel free to point the way. I've been looking far and wide for any articles with working solutions. If I could be clearer, feel free to ask, as this is my first time using Stack Overflow. Cheers, --Josh

    Read the article

  • How to post source code inside a WordPress blog post

    - by jasondavis
    I just started a coding-related WordPress blog, basically to discuss topics and store code snippets in. I installed a code syntax highlighter plugin, which is very nice, but I am having a problem. I just tried publishing my first code-related blog post, and WordPress seems to only post some of my code; it makes some of it not show, like it filters part of it out, even though I am wrapping it in tags like this:

        <pre class="brush:php">
        <?PHP
        my code
        ?>
        </pre>

    Some of my code works and some gets filtered out somehow. I know this is possible, as many WordPress coding sites show source code. Do I need to modify something to make it work in WordPress?

    UPDATE: I have confirmed that it does save ALL my code into the database; it just filters it out when I try to print to screen. When I go in to edit the post, the textbox populates with ALL the source code intact.
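
    A common workaround (offered as an assumption about the cause, not a confirmed fix): WordPress's editor and filters treat raw angle brackets as markup to clean up, so code pasted into the HTML editor usually survives only if the brackets are escaped as HTML entities:

        <pre class="brush:php">
        &lt;?php
            echo "now the tag is literal text, not markup";
        ?&gt;
        </pre>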

    Read the article

  • How to do asynchronous HTTP requests with epoll and Python 3.1

    - by flow
    There is an interesting page, http://scotdoyle.com/python-epoll-howto.html, about how to do asynchronous / non-blocking / AIO HTTP serving in Python 3. There is the Tornado web server, which does include a non-blocking HTTP client. I have managed to port parts of the server to Python 3.1, but the implementation of the client requires pycurl and seems to have problems (with one participant stating how "libcurl is such a pain in the neck", and looking at the incredibly ugly pycurl page, I doubt pycurl will arrive in py3+ any time soon).

    Now that epoll is available in the standard library, it should be possible to do asynchronous HTTP requests out of the box with Python. I really do not want to use asyncore or whatnot; epoll has a reputation for being the ideal tool for the task, and it is part of the Python distribution, so using anything but epoll for non-blocking HTTP is highly counterintuitive (prove me wrong if you feel like it). Oh, and I feel threading is horrible. No threading. I use Stackless.

    People further interested in the topic of asynchronous HTTP should not miss out on this talk by Peter Portante at PyCon 2010; also of interest is the keynote, where speaker Antonio Rodriguez at one point emphasizes the importance of having up-to-date web technology libraries right in the standard library.
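
    For what it's worth, a minimal sketch of a single non-blocking GET driven by epoll, standard library only (the host is hypothetical, and a real client would also need chunked-encoding, error and redirect handling):

        import select
        import socket

        host = "example.com"
        request = b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n"

        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.setblocking(False)
        sock.connect_ex((host, 80))   # returns EINPROGRESS instead of blocking

        epoll = select.epoll()
        epoll.register(sock.fileno(), select.EPOLLOUT)  # writable = connect finished

        response, done = b"", False
        while not done:
            for fileno, event in epoll.poll(1):
                if event & select.EPOLLOUT:
                    sock.send(request)
                    epoll.modify(fileno, select.EPOLLIN)  # now wait for the reply
                elif event & select.EPOLLIN:
                    chunk = sock.recv(4096)
                    if chunk:
                        response += chunk
                    else:                                  # peer closed: response complete
                        epoll.unregister(fileno)
                        sock.close()
                        done = True

        print(response.decode("iso-8859-1"))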

    Read the article

  • OData / WCF Data Service - HTTP 500 Error

    - by Eric
    I have created an OData/WCF service using Visual Studio 2010 on Windows XP SP3 with all current patches installed. When I click on "View in Browser", the service opens and I see the 3 tables from my EF model. However, when I add a table name ("Commands" in this case) to the end of the query string, rather than seeing the data from the table, I get an HTTP 500 error ("This error (HTTP 500 Internal Server Error) means that the website you are visiting had a server problem which prevented the webpage from displaying."). I have not only followed the examples from two sites, but have also tried running the sample application that the blog poster sent me (which works on his machine), and still am not having any luck. The blog post is "Exposing OData from an Entity Framework Model". Does anyone have an idea why this is occurring and how to resolve it? Here is the output of "View in Browser":

        <?xml version="1.0" encoding="utf-8" standalone="yes" ?>
        <service xml:base="http://localhost:1883/VistaDBCommandService.svc/"
                 xmlns:atom="http://www.w3.org/2005/Atom"
                 xmlns:app="http://www.w3.org/2007/app"
                 xmlns="http://www.w3.org/2007/app">
          <workspace>
            <atom:title>Default</atom:title>
            <collection href="Commands">
              <atom:title>Commands</atom:title>
            </collection>
            <collection href="Databases">
              <atom:title>Databases</atom:title>
            </collection>
            <collection href="Statuses">
              <atom:title>Statuses</atom:title>
            </collection>
          </workspace>
        </service>

    Thanks, Eric
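
    One way to get past the opaque 500 is to turn on verbose errors so the real server-side exception shows up in the response body. A hedged sketch using the standard WCF Data Services settings (the entity container name is hypothetical; both settings should be removed in production):

        using System.Data.Services;
        using System.ServiceModel;

        // Surface the underlying exception instead of a bare HTTP 500
        [ServiceBehavior(IncludeExceptionDetailInFaults = true)]
        public class VistaDBCommandService : DataService<CommandEntities> // hypothetical container
        {
            public static void InitializeService(DataServiceConfiguration config)
            {
                config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
                config.UseVerboseErrors = true; // include error details in the OData payload
            }
        }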

    Read the article

  • Relation between HTTP Keep Alive duration and TCP timeout duration

    - by Suresh Kumar
    I am trying to understand the relation between TCP/IP and HTTP timeout values. Are these two timeout values different or the same? Most web servers allow users to set the HTTP keep-alive timeout value through some configuration. How is this value used by the web servers? Is this value just set on the underlying TCP/IP socket, i.e. are the HTTP keep-alive timeout and the TCP/IP keep-alive timeout the same? Or are they treated differently?

    My understanding is (maybe incorrect): the web server uses the default timeout on the underlying TCP socket (i.e. indefinite) regardless of the configured HTTP keep-alive timeout, and creates a worker thread that counts down the specified HTTP timeout interval. When the worker thread hits zero, it closes the connection.

    EDIT: My question is about the relation or difference between the two timeout durations, i.e. what will happen when the HTTP keep-alive timeout duration and the timeout on the socket (SO_TIMEOUT) which the web server uses are different? Should I even worry about these two being the same or not?

    Read the article

  • Rails' page caching vs. HTTP reverse proxy caches

    - by John Topley
    I've been catching up with the Scaling Rails screencasts. In episode 11, which covers advanced HTTP caching (using reverse proxy caches such as Varnish and Squid etc.), they recommend only considering a reverse proxy cache once you've already exhausted the possibilities of page, action and fragment caching within your Rails application (as well as memcached etc., but that's not relevant to this question).

    What I can't quite understand is how using an HTTP reverse proxy cache can provide a performance boost for an application that already uses page caching. To simplify matters, let's assume that I'm talking about a single host here. This is my understanding of how both techniques work (maybe I'm wrong):

    With page caching, the Rails process is hit initially and then generates a static HTML file that is served directly by the web server for subsequent requests, for as long as the cache for that request is valid. If the cache has expired, then Rails is hit again and the static file is regenerated with the updated content, ready for the next request.

    With an HTTP reverse proxy cache, the Rails process is hit when the proxy needs to determine whether the content is stale or not. This is done using various HTTP headers such as ETag, Last-Modified etc. If the content is fresh, then Rails responds to the proxy with an HTTP 304 Not Modified and the proxy serves its cached content to the browser, or even better, responds with its own HTTP 304. If the content is stale, then Rails serves the updated content to the proxy, which caches it and then serves it to the browser.

    If my understanding is correct, then doesn't page caching result in fewer hits to the Rails process? There isn't all that back and forth to determine if the content is stale, meaning better performance than reverse proxy caching. Why might you use both techniques in conjunction?
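
    To make the revalidation round trip concrete, this is roughly what the proxy-to-Rails exchange looks like on the wire when the content is still fresh (the path and ETag value are hypothetical):

        GET /posts/1 HTTP/1.1
        Host: example.com
        If-None-Match: "a64df951"

        HTTP/1.1 304 Not Modified
        ETag: "a64df951"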

    Read the article

  • Empty POST with jQuery Ajax post

    - by chandru
    Hi, I am getting an unusual problem with jQuery ajax. I am using IIS to host my web application, and I have an HTTP handler for which I have enabled only the POST verb. Using jQuery ajax, I am posting data to this HTTP handler. This works fine in our development and testing environments, and also most of the time in production. But sometimes we are getting empty POST data on the server. When we looked at the cs-bytes field in the IIS log, we found that it was very small compared to other successful POST requests. We are using JSON.js to convert the JavaScript object back to a raw JSON string, and the latest jQuery 1.3 for posting to the server. Anybody know why this is happening?

    Read the article

  • Unable to install anything, getting error "subprocess installed post-installation script returned error exit status 1"

    - by soum
        dpkg: error processing mono-4.0-gac (--configure):
         subprocess installed post-installation script returned error exit status 1
        Processing triggers for mousetweaks ...
        No apport report written because MaxReports is reached already
        postinst called with unknown argument `triggered'
        dpkg: error processing mousetweaks (--configure):
         subprocess installed post-installation script returned error exit status 1
        No apport report written because MaxReports is reached already
        Processing triggers for mozilla-plugin-vlc ...
        postinst called with unknown argument `triggered'
        dpkg: error processing mozilla-plugin-vlc (--configure):
         subprocess installed post-installation script returned error exit status 1
        Processing triggers for mtools ...
        No apport report written because MaxReports is reached already
        postinst called with unknown argument `triggered'
        dpkg: error processing mtools (--configure):
         subprocess installed post-installation script returned error exit status 1
        Processing triggers for network-manager-pptp-gnome ...
        No apport report written because MaxReports is reached already
        postinst called with unknown argument `triggered'
        dpkg: error processing network-manager-pptp-gnome (--configure):
         subprocess installed post-installation script returned error exit status 1
        No apport report written because MaxReports is reached already
        Processing triggers for network-manager-pptp ...
        postinst called with unknown argument `triggered'
        dpkg: error processing network-manager-pptp (--configure):
         subprocess installed post-installation script returned error exit status 1
        No apport report written because MaxReports is reached already
        Processing triggers for network-manager-gnome ...
        /var/lib/dpkg/info/network-manager-gnome.postinst called with unknown argument `triggered'
        dpkg: error processing network-manager-gnome (--configure):
         subprocess installed post-installation script returned error exit status 1
        Processing triggers for network-manager ...
        No apport report written because MaxReports is reached already
        /var/lib/dpkg/info/network-manager.postinst called with unknown argument `triggered'
        dpkg: error processing network-manager (--configure):
         subprocess installed post-installation script returned error exit status 1
        No apport report written because MaxReports is reached already
        Processing triggers for mscompress ...
        postinst called with unknown argument `triggered'
        dpkg: error processing mscompress (--configure):
         subprocess installed post-installation script returned error exit status 1
        No apport report written because MaxReports is reached already
        Errors were encountered while processing:
         netbase mtr-tiny module-init-tools mountmanager mono-4.0-gac mousetweaks
         mozilla-plugin-vlc mtools network-manager-pptp-gnome network-manager-pptp
         network-manager-gnome network-manager mscompress
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    Read the article

  • How can I prevent HTTPS on another domain from wrongly showing on my HTTP-only domain?

    - by Earlz
    So, I have a blog at domain.com. This blog is HTTP-only, because I would gain almost nothing from adding SSL support. I now have a web service that I want to enable SSL support on, and it runs on the same server and IP address as my blog. I got it all working pretty easily, but if I go to https://domain.com I see a huge warning about an SSL certificate error, and if I click "ok" through the warning, I see the web service with SSL support, not my blog.

    My biggest fear with this scheme is Google indexing an HTTPS version of my blog and penalizing it because the content between the two doesn't match. How can I force my blog's domain to either not serve anything on HTTPS, or redirect back to my HTTP blog, or serve my blog, albeit with an invalid SSL certificate? What can I do, preferably without buying another dedicated IP for my website?
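
    If the server supports name-based TLS (SNI), one possible shape for the "redirect back to HTTP" option, sketched for nginx (the certificate paths are hypothetical, and a certificate valid for domain.com is still needed, otherwise browsers show the warning before the redirect can happen):

        server {
            listen 443 ssl;
            server_name domain.com;

            # must cover domain.com, or clients still see the certificate warning first
            ssl_certificate     /etc/ssl/certs/domain.com.crt;
            ssl_certificate_key /etc/ssl/private/domain.com.key;

            # send HTTPS visitors back to the canonical HTTP blog
            return 301 http://domain.com$request_uri;
        }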

    Read the article

  • Cookie access within an HTTP class

    - by James Jeffery
    I have an HTTP class that has a Get and a Post method. It's a simple class I created to encapsulate POST and GET requests so I don't have to repeat the get/post code throughout the application. In C#:

        class HTTP
        {
            private CookieContainer cookieJar;
            private String userAgent = "...";

            public HTTP()
            {
                this.cookieJar = new CookieContainer();
            }

            public String get(String url)
            {
                // Make get request. Return the JSON
            }

            public String post(String url, String postData)
            {
                // Make post request. Return the JSON
            }
        }

    I've made the cookie jar a member because I want to preserve the cookie values throughout the session. If the user is logged into Twitter with my application, each request I make (be it GET or POST) should use the cookies so they remain logged in. That's the basics of it anyway. But I don't want to return a string in all instances. Sometimes I may want the cookie, or a header value, or something else from the request. Ideally I'd like to be able to do this in my code:

        Cookie cookie = http.get("http://google.com").cookie("g_user");
        String g_user = cookie.value;

    or

        String source = http.get("http://google.com").body;

    My question: to do this, would I need to have a Get class and a Post class that are included within the HTTP class and are accessible via accessors? Within the Get and Post classes I would then have the cookie method, the body property, and whatever else is needed. Should I also use an interface, or create a Request class and have Post and Get extend it so that common methods and properties are available to both classes? Or am I thinking totally wrong?
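
    One way to get that call chain, sketched as a single response type rather than separate Get and Post classes (all names here are hypothetical, not an existing API): have both methods build and return an object that carries the body, headers and cookies together.

        using System;
        using System.Net;

        // Hypothetical sketch: get() and post() would each fill one of these
        // from the HttpWebResponse before closing it.
        class HttpResult
        {
            public String body;                 // the response text
            public WebHeaderCollection headers; // raw response headers
            public CookieCollection cookies;    // cookies set by this response

            public Cookie cookie(String name)
            {
                return cookies[name];           // null if the server didn't set it
            }
        }

    Usage then reads exactly like the wished-for code, http.get("http://google.com").cookie("g_user").Value, and since both verbs produce the same kind of response, no separate Get and Post classes or extra interface are needed.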

    Read the article

  • How to post JSON object to a URL without cURL in PHP? [closed]

    - by empyreanphoenix
    I have written send-SMS code which uses an available SMS API. I need to post a string in JSON format (the string in $json) to a URL (specified in the $url var), but I am getting the following error:

        String index out of range: -1

    I understand that the error arises when requesting an index that isn't there, but I don't see how that applies in this case. Please help. Note: name1, name2 and name3 are sender name, phone number and content respectively. Thanks.

        $params = array(
            'http' => array(
                'method'  => 'POST',
                'content' => $data,
                'header'  => "Authentication: $key" .
                             "timeout: 50000" .
                             "ContentLength: " . strlen($data) . "\r\n"
            )
        );
        if ($optional_headers != null) {
            $params['http']['header'] = $optional_headers;
        }
        $ctx = stream_context_create($params);
        try {
            $fp = fopen($url, 'rb', false, $ctx);
            $response = stream_get_contents($fp);
            echo "response";
        } catch (Exception $e) {
            echo 'Exception: ' . $e->getMessage();
            echo "caught";
        }
        echo "done";
        return $response;
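
    For comparison, a hedged sketch of the same POST with the header block fixed: each header needs its own "Name: value" line ending in \r\n, whereas the run-together Authentication/timeout/ContentLength string above forms one malformed header line, which is a plausible source of the server's string-index error. (Header names other than Authentication are assumptions here.)

        <?php
        $ctx = stream_context_create(array(
            'http' => array(
                'method'  => 'POST',
                'header'  => "Authentication: $key\r\n" .
                             "Content-Type: application/json\r\n" .
                             "Content-Length: " . strlen($json) . "\r\n",
                'content' => $json,
                'timeout' => 50,   // seconds, not milliseconds
            ),
        ));
        $response = file_get_contents($url, false, $ctx);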

    Read the article

  • Google.com and clients1.google.com/generate_204

    - by David Murdoch
    I was looking into google.com's net activity in Firebug, just because I was curious, and noticed a request was returning "204 No Content." It turns out that a 204 No Content "is primarily intended to allow input for actions to take place without causing a change to the user agent's active document view, although any new or updated metainformation SHOULD be applied to the document currently in the user agent's active view." Whatever. I've looked into the JS source code and saw that "generate_204" is requested like this:

        (new Image).src = "http://clients1.google.com/generate_204"

    No variable declaration/assignment at all. My first idea was that it was being used to track whether JavaScript is enabled. But the "(new Image).src='...'" call is made from a dynamically loaded external JS file anyway, so that would be pointless. Anyone have any ideas as to what the point could be?

    UPDATE: "/generate_204" appears to be available on many Google services/servers (e.g., maps.google.com/generate_204, maps.gstatic.com/generate_204, etc.). You can take advantage of this by pre-fetching the generate_204 pages for each Google-owned service your web app may use. Like this:

        window.onload = function () {
            var two_o_fours = [
                // google maps domain ...
                "http://maps.google.com/generate_204",
                // google maps images domains ...
                "http://mt0.google.com/generate_204",
                "http://mt1.google.com/generate_204",
                "http://mt2.google.com/generate_204",
                "http://mt3.google.com/generate_204",
                // you can add your own 204 page for your subdomains too!
                "http://sub.domain.com/generate_204"
            ];
            for (var i = 0, l = two_o_fours.length; i < l; ++i) {
                (new Image).src = two_o_fours[i];
            }
        };

    Read the article

  • C# SOCKS proxy service for HTTP requests

    - by Ed
    I'm trying to build a service that will forward HTTP requests from agents like a browser to the Tor service. Problem is, the Tor service only accepts SOCKS4a connections. So my solution is to listen for HTTP requests, get the URL they're requesting, and make a request via Tor with the help of the Starksoft.Net.Proxy library, then return the response. The library kind of works, but I'm not happy. It returns HTTP headers with the response, and it can't handle images, so the responses are messed up. How could I improve my code? I'm very new to network programming. Sorry for the long example.

        public AnonymiserService(ILogger logger)
        {
            try
            {
                _logger = logger;
                _logger.Log("Listening on port {0}...", Properties.Settings.Default.ListeningPort);
                StartListener(new string[] { string.Format("http://*:{0}/", Properties.Settings.Default.ListeningPort) });
            }
            catch (Exception ex)
            {
                _logger.LogError("Exception!", ex);
            }
        }

        private void StartListener(string[] prefixes)
        {
            if (!HttpListener.IsSupported)
            {
                _logger.LogError("HttpListener isn't supported on this machine!");
                return;
            }

            HttpListener listener = new HttpListener();
            foreach (string s in prefixes)
                listener.Prefixes.Add(s);

            while (true)
            {
                listener.Start();
                IAsyncResult result = listener.BeginGetContext(new AsyncCallback(ListenerCallback), listener);
                result.AsyncWaitHandle.WaitOne();
            }
        }

        private void ListenerCallback(IAsyncResult result)
        {
            try
            {
                // Get HTTP request
                HttpListener listener = (HttpListener)result.AsyncState;
                HttpListenerContext context = listener.EndGetContext(result);
                _logger.Log("Retrieving [{0}]", context.Request.RawUrl);

                // Create connection
                // Use Tor as proxy
                IProxyClient proxyClient = new Socks4aProxyClient("localhost", 9050);
                TcpClient tcpClient = proxyClient.CreateConnection(context.Request.UserHostName, 80);

                // Create message
                // Need to set Connection: close to close the connection as soon as it's done
                byte[] data = Encoding.UTF8.GetBytes(String.Format("GET {0} HTTP/1.1\r\nHost: {1}\r\nConnection: close\r\n\r\n",
                    context.Request.Url.PathAndQuery, context.Request.UserHostName));

                // Send message
                NetworkStream ns = tcpClient.GetStream();
                ns.Write(data, 0, data.Length);

                // Pass on HTTP response
                HttpListenerResponse responseOut = context.Response;
                if (ns.CanRead)
                {
                    byte[] buffer = new byte[32768];
                    int read = 0;
                    string responseString = string.Empty;

                    // Read response
                    while ((read = ns.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        responseString += Encoding.UTF8.GetString(buffer, 0, read);
                    }

                    // Remove headers
                    if (responseString.IndexOf("HTTP/1.1 200 OK") > -1)
                        responseString = responseString.Substring(responseString.IndexOf("\r\n\r\n"));

                    // Forward response
                    byte[] byteArray = Encoding.UTF8.GetBytes(responseString);
                    responseOut.OutputStream.Write(byteArray, 0, byteArray.Length);
                }

                // Close streams
                responseOut.OutputStream.Close();
                ns.Close();

                // Close connection
                tcpClient.Close();
                _logger.Log("Retrieved [{0}]", context.Request.RawUrl);
            }
            catch (Exception ex)
            {
                _logger.LogError("Exception in ListenerCallback!", ex);
            }
        }

    Read the article

  • WCF RESTful service getting error 400 (Bad Request) when POSTing XML data

    - by Wayne Lo
    I am trying to self-host a WCF service and call the service via JavaScript. It works when I pass the request data via JSON, but not XML (400 Bad Request). Please help.

    Contract:

        public interface iSelfHostServices
        {
            [OperationContract]
            [WebInvoke(Method = "POST", UriTemplate = INFOMATO.RestTemplate.hello_post2,
                RequestFormat = WebMessageFormat.Xml, ResponseFormat = WebMessageFormat.Xml,
                BodyStyle = WebMessageBodyStyle.Wrapped)]
            Stream hello_post2(string helloString);
        }

    Server-side code:

        public Stream hello_post2(string helloString)
        {
            if (helloString == null)
            {
                WebOperationContext.Current.OutgoingResponse.StatusCode = System.Net.HttpStatusCode.BadRequest;
                return null;
            }
            WebOperationContext.Current.OutgoingResponse.StatusCode = System.Net.HttpStatusCode.OK;
            return new MemoryStream(Encoding.UTF8.GetBytes(helloString));
        }

    JavaScript:

        function testSelfHost_WCFService_post_Parameter() {
            var xmlString = "<helloString>'hello via Post'</helloString>";
            Ajax_sendData("hello/post2", xmlString);
        }

        function Ajax_sendData(url, data) {
            var request = false;
            request = getHTTPObject();
            if (request) {
                request.onreadystatechange = function () {
                    parseResponse(request);
                };
                request.open("post", url, true);
                request.setRequestHeader("Content-Type", "text/xml; charset=utf-8");
                request.send(data);
                return true;
            }
        }

        function getHTTPObject() {
            var xhr = false;
            if (window.XMLHttpRequest) {
                xhr = new XMLHttpRequest();
            } else if (window.ActiveXObject) { ... }
        }
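
    For what it's worth, one likely culprit given the contract above: with BodyStyle = WebMessageBodyStyle.Wrapped, WCF expects the XML body to be wrapped in an element named after the operation, in the service contract's namespace (http://tempuri.org/ unless one was set explicitly), so a bare <helloString> element is rejected as a bad request. A body shaped like this should match the contract:

        <hello_post2 xmlns="http://tempuri.org/">
          <helloString>hello via Post</helloString>
        </hello_post2>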

    Read the article

  • jQuery DataTables: Problems with POST server-side JSON output

    - by Tim
    Hello everyone, I am trying to get my DataTable to take POST JSON output from my server. This is my client-side code:

        <script>
            $(document).ready(function() {
                $('#example').dataTable({
                    "bProcessing": true,
                    "bServerSide": true,
                    "sAjaxSource": "http://localhost/staff/jobs/my_jobs",
                    "fnServerData": function (sSource, aoData, fnCallback) {
                        $.ajax({
                            "dataType": 'json',
                            "type": "POST",
                            "url": sSource,
                            "data": aoData,
                            "success": fnCallback
                        });
                    }
                });
            });
        </script>

    Now, I have copied and pasted the server-side code found in the DataTables examples found here. When I set my sAjaxSource to view this page, the table doesn't move beyond 'Processing'. When I view the JSON directly, I see this output:

        {"sEcho": 1, "iTotalRecords": 1, "iTotalDisplayRecords": 1, "aaData": [ ["Trident","First Ever Job"]] }

    Just for fun, I went to the POST server-side example, copied some of the JSON they are using for their example, and just echoed it straight out of another page with PHP. This is the output of that page:

        {"sEcho": 1, "iTotalRecords": 1, "iTotalDisplayRecords": 1, "aaData": [ ["Trident","Internet Explorer 4.0"]] }

    Here is where it gets interesting: the JSON that has been processed by the server fails to work, yet the JSON simply echoed by the same server on a different page does work... and both are almost identical in output. I hope someone can shed some light on this, because as the tree said to the lumberjack... I'm stumped. Thanks, Tim

    Read the article

  • jQuery, PHP, AJAX: "tu" variable being posted for no reason, shows in var_dump()

    - by Mattis
    A jQuery AJAX request .post()s data to page.php, which creates $res and var_dump()s it.

    $res:

        $res = array();
        foreach ($_REQUEST as $key => $value) {
            if ($key) {
                $res[$key] = $value;
            }
        }

    var_dump($res):

        array(4) {
          ["text1"]=> string(6) "mattis"
          ["text2"]=> string(4) "test"
          ["tu"]=> string(32) "deb6adbbff4234b5711cc4368c153bc4"
          ["PHPSESSID"]=> string(32) "cda24363cb9d3226bd37b2577ed0bc0b"
        }

    My JavaScript only sends text1 and text2:

        $.post("page.php", {
            text1: "mattis",
            text2: "test"
        });

    What is the "tu" variable being sent? Apparently it's very similar to the session ID, but I've never seen it before.

    EDIT: It is sent in IE but not in FF.

    Read the article

  • Java: Handling cookies when logging in with POST

    - by Cris Carter
    I'm having quite some trouble logging in to any site in Java. I'm using the default URLConnection POST request, but I'm unsure how to handle the cookies properly. I tried this guide: http://www.hccp.org/java-net-cookie-how-to.html but couldn't get it working. I've been trying for days now, and I really need help if anyone wants to help me. I'll probably be told that it's messy and that I should use a custom library meant for this stuff. I tried downloading one, but wasn't sure how to get it set up and working. I've been trying various things for hours and it just won't work. I'd rather do this with a standard URLConnection, but if anyone can help me get another library working that's better for this, that would be great too.

    I would really appreciate it if someone could post a working source that I could study. What I need is: POST login data to the site, get and store the cookie from the site, then use the cookie with the next URLConnection requests to get the logged-in version of the site. Can anyone help me with this? It would be EXTREMELY appreciated; it really does mean a lot. If anyone wants to actually help me out live, please leave an instant-messenger address. Thank you a lot for your time.
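
    A minimal sketch of that flow using only java.net (the URL and form-field names are hypothetical; a real site may also require hidden form fields and redirect handling):

        import java.io.*;
        import java.net.*;
        import java.util.*;

        public class LoginExample {
            public static void main(String[] args) throws IOException {
                // 1. POST the login form
                HttpURLConnection conn =
                        (HttpURLConnection) new URL("http://example.com/login").openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);
                conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
                String body = "username=" + URLEncoder.encode("me", "UTF-8")
                            + "&password=" + URLEncoder.encode("secret", "UTF-8");
                OutputStream out = conn.getOutputStream();
                out.write(body.getBytes("UTF-8"));
                out.close();
                conn.getResponseCode(); // force the exchange so the cookies come back

                // 2. Collect every Set-Cookie header, keeping only the name=value part
                StringBuilder cookies = new StringBuilder();
                List<String> setCookies = conn.getHeaderFields().get("Set-Cookie");
                if (setCookies != null) {
                    for (String c : setCookies) {
                        if (cookies.length() > 0) cookies.append("; ");
                        cookies.append(c.split(";", 2)[0]);
                    }
                }

                // 3. Replay the cookies on the next request to get the logged-in page
                HttpURLConnection page =
                        (HttpURLConnection) new URL("http://example.com/account").openConnection();
                page.setRequestProperty("Cookie", cookies.toString());
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(page.getInputStream(), "UTF-8"));
                for (String line; (line = in.readLine()) != null; ) {
                    System.out.println(line);
                }
                in.close();
            }
        }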

    Read the article
