Search Results

Search found 58094 results on 2324 pages for 'http status codes'.

Page 17/2324 | < Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >

  • uploading via http post (multipart/form-data) silently fails with big files

    - by matteo
    When uploading multipart/form-data forms via an HTTP POST request to my Apache web server, very big files (e.g. 30MB) are silently discarded. On the server side everything looks as if the attached file was received with a size of 0 bytes. On the client side everything looks as if it had been uploaded successfully (it takes the expected long time to upload and the browser gives no error message). On the server, nothing is logged into the error log. An entry is logged into the access log as if everything was OK (a POST request and a 200 OK response). These uploads are being posted to a PHP script. In the PHP script, if I print_r $_FILES, I see the following information for the relevant file: [file5] => Array ( [name] => MOV023.3gp [type] => video/3gpp [tmp_name] => /tmp/phpgOdvYQ [error] => 0 [size] => 0 ) Note both [error] = 0 (which should mean no error) and [size] = 0 (as if the file was empty). My PHP script runs fine and receives all the rest of the data except these files. move_uploaded_file succeeds on these files and actually copies them as 0-byte files. I've already changed the php directives max_upload_size to 50M and post_max_size to 200M, so neither the single file nor the request exceeds any size limit. max_execution_time is not relevant, because the time to transfer the data does not count; and I've increased max_input_time to 1000 seconds, though this shouldn't be necessary since this is the time taken to parse the input data, not the time taken to upload it. Is there any Apache configuration, prior to PHP, that could be causing these files to be discarded even before PHP executes? Some limit on size or on upload time? I've read about a default 300-second timeout limit, but this should apply to the time the connection is idle, not the time it takes while actually transferring data, right? Needless to say, uploads with all exactly identical conditions (including file format, client and everything) except smaller file size work seamlessly, so the issue is clearly related to the file or request size, or to the time it takes to send it.
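
    A few server-side limits are worth ruling out here; the directive names below are real, but the values are only illustrative, and the mod_security lines apply only if that module is actually loaded (its body limits are a common cause of uploads arriving as 0 bytes):

        # Apache (httpd.conf or vhost); 0 means unlimited
        LimitRequestBody 0
        # Only if mod_security is loaded; its default body limit is well below 30MB on some setups
        SecRequestBodyLimit 52428800
        SecRequestBodyInMemoryLimit 524288

        ; php.ini: note the upload directive is named upload_max_filesize, not max_upload_size
        upload_max_filesize = 50M
        post_max_size = 200M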

    Read the article

  • Axis 1.4 Java: Modify HTTP response code

    - by Achim Bitzer
    Hello, Is there a way to modify the HTTP response code when using Apache Axis 1.4 for Java? This would be useful for testing purposes, for example simulating server-side errors. I've already tried the following: Set the HTTP code directly on the HTTP servlet response: MessageContext context = MessageContext.getCurrentContext(); HttpServletResponse response = (HttpServletResponse) context.getProperty( HTTPConstants.MC_HTTP_SERVLETRESPONSE); response.setStatus(e.getErrorCode()); // no effect Set the HTTP code as an Axis message context property: MessageContext context = MessageContext.getCurrentContext(); context.setProperty(HTTPConstants.MC_HTTP_STATUS_CODE, e.getErrorCode()); But neither attempt seemed to work; the actual HTTP code was always 200. Any ideas would be greatly appreciated :-) Greetings, Achim Bitzer

    Read the article

  • Not getting a response when using Android Async HTTP (from loopj)

    - by conor
    I am using the Async HTTP library from loopj.com and also the sample code from the site. The problem is that when the request is made I don't get a response. I have even overridden the onFinish() function, which isn't getting fired either. I am using the sample code from their site, which is as follows: import com.loopj.android.http.AsyncHttpClient; import com.loopj.android.http.AsyncHttpResponseHandler; Log.v("bopzy_debug", "Testing HTTP Connectivity"); System.out.println("123"); AsyncHttpClient client = new AsyncHttpClient(); client.get("http://www.google.com", new AsyncHttpResponseHandler() { @Override public void onSuccess(String response) { Log.v("bopzy_debug", response); } @Override public void onFinish() { Log.v("bopzy_debug", "Finished.."); } }); Any ideas on how to solve this would be greatly appreciated, not really sure what is going on here.
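
    Two things worth ruling out before blaming the library (both are assumptions, since the manifest and logcat output aren't shown): the manifest may be missing the INTERNET permission, which makes every request fail before anything useful is logged, and onFailure isn't overridden above, so a failure would currently be invisible anyway. The permission line looks like this:

        <!-- AndroidManifest.xml, outside the <application> element -->
        <uses-permission android:name="android.permission.INTERNET" />

    Overriding onFailure in the same handler (whatever its exact signature is in your library version) and logging its Throwable should surface the underlying exception in logcat if the permission isn't the problem.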

    Read the article

  • htaccess mod rewrite changes http://www to http:/www

    - by Nir
    I want to replace calls like this: www.mysite.com/sub/file.php?param1=x&param2=http://www.someurl.com with: www.mysite.com/sub/param1/param2 Param 1 is an integer; param 2 is a URL. I wrote this rewrite rule in .htaccess: RewriteCond %{REQUEST_URI} \/sub\/ RewriteRule sub\/([0-9]+)\/(.*)$ sub\/file.php?param1=$2&param2=$1 [L] Unfortunately param2 (the URL) starts with http:/www.someurl.com instead of http://www.someurl.com (note the single slash). Any idea what causes this? When I call the same file with the same parameters in the format www.mysite.com/sub/file.php?param1=x&param2=http://www.someurl.com , param2 does appear OK, so it must be something in the rewrite rule.
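
    A likely explanation for the single slash: Apache collapses consecutive slashes in the path portion of the URL before per-directory rewrite rules see it, so the // after http: arrives as a single /. One hedged workaround is to tolerate the collapsed slash and rebuild the scheme in the substitution (the pattern and flags below are illustrative, and the mapping puts the integer in param1 and the URL in param2 to match the target format described above); alternatively, the un-collapsed request line is still available in %{THE_REQUEST} if you prefer to parse that:

        RewriteCond %{REQUEST_URI} /sub/
        RewriteRule sub/([0-9]+)/(https?):/+(.*)$ sub/file.php?param1=$1&param2=$2://$3 [L,NE]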

    Read the article

  • Server Push / HTTP Streaming on Windows Mobile / Windows CE

    - by afriza
    Hi, I find that HTTP Streaming / Server Push is quite promising for my project. Does anyone have any clue on how to implement this in Windows Mobile? .NET / native / other implementations are welcome, preferably with a permissive license. Some links on HTTP Streaming / Server Push: - Push Technology - Streaming HTTP / Server Push - Cross-browser implementation of “HTTP Streaming” (push) AJAX pattern - XEP-0124: Bidirectional-streams Over Synchronous HTTP (BOSH) I was thinking of using some Qt XMPP library (QXmpp) to do the job, but I'm not sure if it's up for the task and I also want to hear some opinions on this.

    Read the article

  • XML over HTTP with JMS and Spring

    - by Will Sumekar
    I have a legacy HTTP server where I need to send an XML file over HTTP request (POST) using Java (not browser) and the server will respond with another XML in its HTTP response. It is similar to Web Service but there's no WSDL and I have to follow the existing XML structure to construct my XML to be sent. I have done a research and found an example that matches my requirement here. The example uses HttpClient from Apache Commons. (There are also other examples I found but they use java.net networking package (like URLConnection) which is tedious so I don't want to use them). But it's also my requirement to use Spring and JMS. I know from Spring's reference that it's possible to combine HttpClient, JMS and Spring. My question is, how? Note that it's NOT in my requirement to use HttpClient. If you have a better suggestion, I'm welcome. Appreciate it. For your reference, here's the XML-over-HTTP example I've been talking about: /* * $Header: * $Revision$ * $Date$ * ==================================================================== * * Copyright 2002-2004 The Apache Software Foundation * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. * ==================================================================== * * This software consists of voluntary contributions made by many * individuals on behalf of the Apache Software Foundation. For more * information on the Apache Software Foundation, please see * <http://www.apache.org/>. * * [Additional notices, if required by prior licensing conditions] * */ import java.io.File; import java.io.FileInputStream; import org.apache.commons.httpclient.HttpClient; import org.apache.commons.httpclient.methods.InputStreamRequestEntity; import org.apache.commons.httpclient.methods.PostMethod; /** * * This is a sample application that demonstrates * how to use the Jakarta HttpClient API. * * This application sends an XML document * to a remote web server using HTTP POST * * @author Sean C. 
Sullivan * @author Ortwin Glück * @author Oleg Kalnichevski */ public class PostXML { /** * * Usage: * java PostXML http://mywebserver:80/ c:\foo.xml * * @param args command line arguments * Argument 0 is a URL to a web server * Argument 1 is a local filename * */ public static void main(String[] args) throws Exception { if (args.length != 2) { System.out.println( "Usage: java -classpath <classpath> [-Dorg.apache.commons."+ "logging.simplelog.defaultlog=<loglevel>]" + " PostXML <url> <filename>]"); System.out.println("<classpath> - must contain the "+ "commons-httpclient.jar and commons-logging.jar"); System.out.println("<loglevel> - one of error, "+ "warn, info, debug, trace"); System.out.println("<url> - the URL to post the file to"); System.out.println("<filename> - file to post to the URL"); System.out.println(); System.exit(1); } // Get target URL String strURL = args[0]; // Get file to be posted String strXMLFilename = args[1]; File input = new File(strXMLFilename); // Prepare HTTP post PostMethod post = new PostMethod(strURL); // Request content will be retrieved directly // from the input stream // Per default, the request content needs to be buffered // in order to determine its length. // Request body buffering can be avoided when // content length is explicitly specified post.setRequestEntity(new InputStreamRequestEntity( new FileInputStream(input), input.length())); // Specify content type and encoding // If content encoding is not explicitly specified // ISO-8859-1 is assumed post.setRequestHeader( "Content-type", "text/xml; charset=ISO-8859-1"); // Get HTTP client HttpClient httpclient = new HttpClient(); // Execute request try { int result = httpclient.executeMethod(post); // Display status code System.out.println("Response status code: " + result); // Display response System.out.println("Response body: "); System.out.println(post.getResponseBodyAsString()); } finally { // Release current connection to the connection pool // once you are done post.releaseConnection(); } } }

    Read the article

  • Pretty automatically updating lighttpd server status

    - by Marko
    I am trying to use your modified lighttpd server-status. The server-status mod is enabled and running, and I am getting the lighttpd status, but I cannot use yours because I cannot bind port 80 again: $SERVER["socket"] == "192.168......:80" { status.status-url = "/server-status" server.document-root = "/var/www/status" mimetype.assign = ( ".html" => "text/html" ) index-file.names = ( "pretty-status.html" ) } This gives me an error on lighty restart saying the port is already in use. Any help?
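
    The "already in use" error is what you would expect if that $SERVER["socket"] block asks lighttpd to bind 192.168.x.x:80 a second time while the main server is already listening there. A hedged alternative, assuming the status page can live under its own hostname rather than a dedicated socket, is to scope the settings with an $HTTP["host"] conditional so no extra bind is attempted (the hostname is a placeholder):

        $HTTP["host"] == "status.example.com" {
            server.document-root = "/var/www/status"
            index-file.names     = ( "pretty-status.html" )
            status.status-url    = "/server-status"
        }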

    Read the article

  • How to handle the "100 continue" HTTP message?

    - by Stephane
    Hello, I'm writing a simplistic HTTP server that will accept PUT requests mostly from cURL as the client, and I'm having a bit of an issue with handling the "Expect: 100-continue" header. As I understand it, the server is supposed to read the header, send back an "HTTP/1.1 100 Continue" response on the connection, read the stream up to the number of bytes given in "Content-Length" and then send back the real response code (usually "HTTP/1.1 200 OK", but any other valid HTTP answer should do). Well, that's exactly what my server does. The problem is that, apparently, if I send a "100 Continue" answer, cURL fails to report any subsequent HTTP error code and assumes the upload was a success. For instance, if the upload is rejected due to the nature of the content (there is a basic data check happening), I want the calling client to detect the problem and act accordingly. Am I missing something obvious? Thanks
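
    For reference, this is roughly what the exchange described above looks like on the wire; HTTP/1.1 also allows the server to skip the interim 100 entirely and answer with a final status as soon as the headers show the request will be rejected, which is often the easiest way to make clients notice the error (the URL, lengths and the 400 are illustrative):

        PUT /upload HTTP/1.1
        Host: example.com
        Content-Length: 1048576
        Expect: 100-continue

        HTTP/1.1 100 Continue

        ...request body...

        HTTP/1.1 400 Bad Request
        Connection: close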

    Read the article

  • ColdFusion Server crash after thousands of HTTP requests

    - by Jason Bristol
    We are running ColdFusion 8 on a windows server 2003 VPS with an API that exposes student records to a partner API through a connector. Our API returns around 50k student records serialized in XML format pretty seamlessly. My question originates when something very frightening happened today when we tested our connector to our partners API. Our entire website and web host went down. We assumed that our host was just having some issues and after 4 hours with no resolution and no response from their customer service we finally got a response from them claiming that they had an "unauthorized user" in their network. After our server was back up we were unable to connect to our website as if the web service or coldfusion itself had froze. This is really where my concern comes from as I fear we may have overloaded the web service. As I mentioned before we tried sending over 50k HTTP POST requests over to our partner's API, however everything stopped after around 1.6k Is this bad practice or is there some sort of rate limiting I can relax somewhere in server configuration? We managed to find a workaround, but it bypasses our connector which is critical to our design. This would have been a one time deal as the purpose of so many requests was to populate our partner's website with current data, after that hourly syncs will keep requests down to around 100 per hour. UPDATE Our partner API is owned and operated by Pardot. We are converting students to prospects by passing student data to their API which unfortunately only seems to accept one student at a time. For that reason we have to do all 50k requests individually. Our server has 4GB of RAM, an Intel Core 2 Duo @ 2.8GHz running Windows Server 2003 SP2. I monitored the server during a 100 student sync, a 400 student sync, and a 1.4k student sync with the following results: 100 students - 2.25GB of Memory, 30-40% CPU utilization, 0.2-0.3% network bandwidth 400 students - 2.30GB of Memory, 30-50% CPU utilization, 0.2-1.0% network bandwidth 1.4k students - 2.30GB of Memory, 30-70% CPU utilization, 0.2-1.0% network bandwidth I know this is a far cry from 50k students, but I don't want to risk taking down our CMS system again assuming that was the cause. 
To give you a look at our code: <cfif (#getStudents.statusCode# eq "200 OK")> <cftry> <cfloop index="StudentXML" array="#XmlSearch(responseSTUD,'/students/student')#"> <cfset StudentXML = XmlParse(StudentXML)> <cfhttp url="#PARDOT_CMS_UPSERT#" method="post" timeout="10000" > <cfhttpparam type="url" name="user_key" value="#PARDOT_CMS_USERKEY#"> <cfhttpparam type="url" name="api_key" value="#api_key#"> <cfhttpparam type="url" name="email" value="#StudentXML.student.email.XmlText#"> <cfhttpparam type="url" name="first_name" value="#StudentXML.student.first.XmlText#"> <cfhttpparam type="url" name="last_name" value="#StudentXML.student.last.XmlText#"> <cfhttpparam type="url" name="in_cms" value="#StudentXML.student.studentid.XmlText#"> <cfhttpparam type="url" name="company" value="#StudentXML.student.agencyname.XmlText#"> <cfhttpparam type="url" name="country" value="#StudentXML.student.countryname.XmlText#"> <cfhttpparam type="url" name="address_one" value="#StudentXML.student.address.XmlText#"> <cfhttpparam type="url" name="address_two" value="#StudentXML.student.address2.XmlText#"> <cfhttpparam type="url" name="city" value="#StudentXML.student.city.XmlText#"> <cfhttpparam type="url" name="state" value="#StudentXML.student.state_province.XmlText#"> <cfhttpparam type="url" name="zip" value="#StudentXML.student.postalcode.XmlText#"> <cfhttpparam type="url" name="phone" value="#StudentXML.student.phone.XmlText#"> <cfhttpparam type="url" name="fax" value="#StudentXML.student.fax.XmlText#"> <cfhttpparam type="url" name="output" value="simple"> </cfhttp> </cfloop> <cfcatch type="any"> <cfdump var="#cfcatch.Message#"> </cfcatch> </cftry> </cfif> UPDATE 2 I checked the CF logs and found a couple of these: "Error","jrpp-8","06/06/13","16:10:18","CMS-API","Java heap space The specific sequence of files included or processed is: D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm, line: 675 " java.lang.OutOfMemoryError: Java heap space at java.util.Arrays.copyOf(Arrays.java:2882) at java.io.CharArrayWriter.write(CharArrayWriter.java:105) at coldfusion.runtime.CharBuffer.replace(CharBuffer.java:37) at coldfusion.runtime.CharBuffer.replace(CharBuffer.java:50) at coldfusion.runtime.NeoBodyContent.write(NeoBodyContent.java:254) at cfapi2ecfm292155732._factor30(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm:675) at cfapi2ecfm292155732._factor31(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm:662) at cfapi2ecfm292155732._factor36(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm:659) at cfapi2ecfm292155732._factor42(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm:657) at cfapi2ecfm292155732._factor37(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm) at cfapi2ecfm292155732._factor44(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm:456) at cfapi2ecfm292155732._factor38(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm) at cfapi2ecfm292155732._factor46(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm:455) at cfapi2ecfm292155732._factor39(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm) at cfapi2ecfm292155732._factor47(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm:453) at cfapi2ecfm292155732.runPage(D:\Clients\www.xxx.com\www\dev.cms\api\v1\api.cfm:1) at coldfusion.runtime.CfJspPage.invoke(CfJspPage.java:192) at coldfusion.tagext.lang.IncludeTag.doStartTag(IncludeTag.java:366) at coldfusion.filter.CfincludeFilter.invoke(CfincludeFilter.java:65) at coldfusion.filter.ApplicationFilter.invoke(ApplicationFilter.java:279) at coldfusion.filter.RequestMonitorFilter.invoke(RequestMonitorFilter.java:48) at 
coldfusion.filter.MonitoringFilter.invoke(MonitoringFilter.java:40) at coldfusion.filter.PathFilter.invoke(PathFilter.java:86) at coldfusion.filter.ExceptionFilter.invoke(ExceptionFilter.java:70) at coldfusion.filter.ClientScopePersistenceFilter.invoke(ClientScopePersistenceFilter.java:28) at coldfusion.filter.BrowserFilter.invoke(BrowserFilter.java:38) at coldfusion.filter.NoCacheFilter.invoke(NoCacheFilter.java:46) at coldfusion.filter.GlobalsFilter.invoke(GlobalsFilter.java:38) at coldfusion.filter.DatasourceFilter.invoke(DatasourceFilter.java:22) at coldfusion.CfmServlet.service(CfmServlet.java:175) at coldfusion.bootstrap.BootstrapServlet.service(BootstrapServlet.java:89) at jrun.servlet.FilterChain.doFilter(FilterChain.java:86) Looks like I might have crashed the JVM in CF, is there a better way to do this? We are thinking of just exporting all records initially as a CSV file and importing it into Pardot seeing as we will never have to do a request this large again.
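
    The heap error above is raised from the page output buffer (coldfusion.runtime.NeoBodyContent.write), which suggests the response being built up during the loop is what fills the heap rather than cfhttp itself; that reading is an inference from the trace, not a certainty. Two hedged mitigations: flush or suppress the output buffer periodically inside the loop (for example with cfflush, or by writing no output per iteration), and give the JVM more headroom in jvm.config. The path and value below are illustrative for a ColdFusion 8 standalone install; keep the rest of the existing java.args line and only raise the heap ceiling:

        # {cf_root}/runtime/bin/jvm.config
        java.args=-server -Xmx1024m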

    Read the article

  • insert null character using tamper data

    - by Jeremy Comulu
    Using the firefox extension tamper data (for modifing http requests that firefox makes) how do I insert a null character into a post field? I can enter normal characters, but binary characters in it are not urlencoded and are shown as is, so how do I enter the null character into a field? If you know of a firefox extension like tamper data that I can do this or a way to do this using tamper data please post.

    Read the article

  • IIS 401.3 - Unauthorized on only 1 server out of 3 set up for network load balancing

    - by Tony
    Over the weekend our Server Admin set up two virtual Windows 2008 machines with IIS installed and set them up under NLB. I came in and changed the application pool the website was running under to our domain account that has proper access to the database and the file share hosting our .NET web application Sitefinity, and changed it to .NET 4 Integrated. NLB and everything was running fine on both servers. He brought up the third server for our cluster on Tuesday and I performed the same actions.. The only difference was that I was given admin rights for the third server so I could set it up remotely instead of going to his office. He has full control over the share and NTFS perms on \\hostname\Sitefinity and I believe I only had read access. I pointed the web site to the same \\hostname\Sitefinity\sitename share that the others were on and the authentication/authorization test settings passed. I hit the site from http://localhost (like I did successfully from the other two before trying the cluster's IP address) and I received a HTTP Error 401.3 - Unauthorized. I've verified many times that the application pool is running under the same service account. I tried hitting just a simple test.htm.. works fine on both of the first two servers but I get the same 401.3 on the third. I copied my dev project to the local inetpub directory and re-pointed the website and that ran perfectly. I turned on Failed Request Tracing and it acts like it's still running the local IUSR account I guess (instead of my domain account)? Here is an excerpt of the File Cache Access Start and the error from the trace: FileName \\hostname\sitefinity\sitename\test.htm UserName IUSR DomainName NT AUTHORITY ---------- Successful false FileFromCache false FileAddedToCache false FileDirmoned true LastModCheckErrorIgnored true ErrorCode 2147942405 LastModifiedTime ErrorCode Access is denied. (0x80070005) ---------- ModuleName IIS Web Core Notification 2 HttpStatus 401 HttpReason Unauthorized HttpSubStatus 3 ErrorCode 2147942405 ConfigExceptionInfo Notification AUTHENTICATE_REQUEST ErrorCode Access is denied. (0x80070005) ---------- My personal AD account was then granted read/write perms to the share so I created a new application pool and set the site under it in case there was an issue with the application pool but no success. I created another under my own account and it still failed. It just seems like maybe it's not trying to access the files under the account my application pools are running under although that's the only way I've done things before. I set the Physicial Path Credentials in Advanced Settings on the site to the service account and it threw a 500 error of some sort so I assume that's not the answer (and I don't have to do it on the other servers). It's like somehow I'm trying to force impersonation on the IUSR account or something?
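
    The ErrorCode 0x80070005 with username IUSR in the trace reads like an NTFS access-denied hit while IIS was still using the anonymous-authentication identity for file access instead of the pool account. Two hedged things to compare against the two working servers (the account and path below are placeholders): whether Anonymous Authentication on the site is set to "Application pool identity" rather than the specific IUSR user, and whether that domain account actually has read rights in the NTFS ACL on the share target, e.g.:

        rem grant the service account read/execute on the content, recursively
        icacls "\\hostname\Sitefinity\sitename" /grant "DOMAIN\svc_webapp":(OI)(CI)RX /T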

    Read the article

  • Logging Into a site that uses Live.com authentication with C#

    - by Josh
    I've been trying to automate a log in to a website I frequent, www.bungie.net. The site is associated with Microsoft and Xbox Live, and as such makes uses of the Windows Live ID API when people log in to their site. I am relatively new to creating web spiders/robots, and I worry that I'm misunderstanding some of the most basic concepts. I've simulated logins to other sites such as Facebook and Gmail, but live.com has given me nothing but trouble. Anyways, I've been using Wireshark and the Firefox addon Tamper Data to try and figure out what I need to post, and what cookies I need to include with my requests. As far as I know these are the steps one must follow to log in to this site. 1. Visit https: //login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917 2. Recieve the cookies MSPRequ and MSPOK. 3. Post the values from the form ID "PPSX", the values from the form ID "PPFT", your username, your password all to a changing URL similar to: https: //login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct= (there are a few numbers that change at the end of that URL) 4. Live.com returns the user a page with more hidden forms to post. The client then posts the values from the form "ANON", the value from the form "ANONExp" and the values from the form "t" to the URL: http ://www.bung ie.net/Default.aspx?wa=wsignin1.0 5. After posting that data, the user is returned a variety of cookies the most important of which is "BNGAuth" which is the log in cookie for the site. Where I am having trouble is on fifth step, but that doesn't neccesarily mean I've done all the other steps correctly. I post the data from "ANON", "ANONExp" and "t" but instead of being returned a BNGAuth cookie, I'm returned a cookie named "RSPMaybe" and redirected to the home page. When I review the Wireshark log, I noticed something that instantly stood out to me as different between the log when I logged in with Firefox and when my program ran. It could be nothing but I'll include the picture here for you to review. I'm being returned an HTTP packet from the site before I post the data in the fourth step. I'm not sure how this is happening, but it must be a side effect from something I'm doing wrong in the HTTPS steps. 
![alt text][1] http://img391.imageshack.us/img391/6049/31394881.gif using System; using System.Collections.Generic; using System.Collections.Specialized; using System.Text; using System.Net; using System.IO; using System.IO.Compression; using System.Security.Cryptography; using System.Security.Cryptography.X509Certificates; using System.Web; namespace SpiderFromScratch { class Program { static void Main(string[] args) { CookieContainer cookies = new CookieContainer(); Uri url = new Uri("https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268167141&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917"); HttpWebRequest http = (HttpWebRequest)HttpWebRequest.Create(url); http.Timeout = 30000; http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "300"); http.Referer = "http://www.bungie.net/"; http.ContentType = "application/x-www-form-urlencoded"; http.CookieContainer = new CookieContainer(); http.Method = WebRequestMethods.Http.Get; HttpWebResponse response = (HttpWebResponse)http.GetResponse(); StreamReader readStream = new StreamReader(response.GetResponseStream()); string HTML = readStream.ReadToEnd(); readStream.Close(); //gets the cookies (they are set in the eighth header) string[] strCookies = response.Headers.GetValues(8); response.Close(); string name, value; Cookie manualCookie; for (int i = 0; i < strCookies.Length; i++) { name = strCookies[i].Substring(0, strCookies[i].IndexOf("=")); value = strCookies[i].Substring(strCookies[i].IndexOf("=") + 1, strCookies[i].IndexOf(";") - strCookies[i].IndexOf("=") - 1); manualCookie = new Cookie(name, "\"" + value + "\""); Uri manualURL = new Uri("http://login.live.com"); http.CookieContainer.Add(manualURL, manualCookie); } //stores the cookies to be used later cookies = http.CookieContainer; //Get the PPSX value string PPSX = HTML.Remove(0, HTML.IndexOf("PPSX")); PPSX = PPSX.Remove(0, PPSX.IndexOf("value") + 7); PPSX = PPSX.Substring(0, PPSX.IndexOf("\"")); //Get this random PPFT value string PPFT = HTML.Remove(0, HTML.IndexOf("PPFT")); PPFT = PPFT.Remove(0, PPFT.IndexOf("value") + 7); PPFT = PPFT.Substring(0, PPFT.IndexOf("\"")); //Get the random URL you POST to string POSTURL = HTML.Remove(0, HTML.IndexOf("https://login.live.com/ppsecure/post.srf?wa=wsignin1.0&rpsnv=11&ct=")); POSTURL = POSTURL.Substring(0, POSTURL.IndexOf("\"")); //POST with cookies http = (HttpWebRequest)HttpWebRequest.Create(POSTURL); http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "300"); http.CookieContainer = cookies; http.Referer = "https://login.live.com/login.srf?wa=wsignin1.0&rpsnv=11&ct=1268158321&rver=5.5.4177.0&wp=LBI&wreply=http:%2F%2Fwww.bungie.net%2FDefault.aspx&id=42917"; http.ContentType = "application/x-www-form-urlencoded"; http.Method = WebRequestMethods.Http.Post; Stream ostream = http.GetRequestStream(); //used to convert strings into bytes System.Text.ASCIIEncoding encoding = new 
System.Text.ASCIIEncoding(); //Post information byte[] buffer = encoding.GetBytes("PPSX=" + PPSX +"&PwdPad=IfYouAreReadingThisYouHaveTooMuc&login=YOUREMAILGOESHERE&passwd=YOURWORDGOESHERE" + "&LoginOptions=2&PPFT=" + PPFT); ostream.Write(buffer, 0, buffer.Length); ostream.Close(); HttpWebResponse response2 = (HttpWebResponse)http.GetResponse(); readStream = new StreamReader(response2.GetResponseStream()); HTML = readStream.ReadToEnd(); response2.Close(); ostream.Dispose(); foreach (Cookie cookie in response2.Cookies) { Console.WriteLine(cookie.Name + ": "); Console.WriteLine(cookie.Value); Console.WriteLine(cookie.Expires); Console.WriteLine(); } //SET POSTURL value string POSTANON = "http://www.bungie.net/Default.aspx?wa=wsignin1.0"; //Get the ANON value string ANON = HTML.Remove(0, HTML.IndexOf("ANON")); ANON = ANON.Remove(0, ANON.IndexOf("value") + 7); ANON = ANON.Substring(0, ANON.IndexOf("\"")); ANON = HttpUtility.UrlEncode(ANON); //Get the ANONExp value string ANONExp = HTML.Remove(0, HTML.IndexOf("ANONExp")); ANONExp = ANONExp.Remove(0, ANONExp.IndexOf("value") + 7); ANONExp = ANONExp.Substring(0, ANONExp.IndexOf("\"")); ANONExp = HttpUtility.UrlEncode(ANONExp); //Get the t value string t = HTML.Remove(0, HTML.IndexOf("id=\"t\"")); t = t.Remove(0, t.IndexOf("value") + 7); t = t.Substring(0, t.IndexOf("\"")); t = HttpUtility.UrlEncode(t); //POST the Info and Accept the Bungie Cookies http = (HttpWebRequest)HttpWebRequest.Create(POSTANON); http.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729)"; http.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"; http.Headers.Add("Accept-Language", "en-us,en;q=0.5"); http.Headers.Add("Accept-Encoding", "gzip,deflate"); http.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7"); http.Headers.Add("Keep-Alive", "115"); http.CookieContainer = new CookieContainer(); http.ContentType = "application/x-www-form-urlencoded"; http.Method = WebRequestMethods.Http.Post; http.Expect = null; ostream = http.GetRequestStream(); int test = ANON.Length; int test1 = ANONExp.Length; int test2 = t.Length; buffer = encoding.GetBytes("ANON=" + ANON +"&ANONExp=" + ANONExp + "&t=" + t); ostream.Write(buffer, 0, buffer.Length); ostream.Close(); //Here lies the problem, I am not returned the correct cookies. HttpWebResponse response3 = (HttpWebResponse)http.GetResponse(); GZipStream gzip = new GZipStream(response3.GetResponseStream(), CompressionMode.Decompress); readStream = new StreamReader(gzip); HTML = readStream.ReadToEnd(); //gets both cookies string[] strCookies2 = response3.Headers.GetValues(11); response3.Close(); } } } This has given me problems and I've put many hours into learning about HTTP protocols so any help would be appreciated. If there is an article detailing a similar log in to live.com feel free to point the way. I've been looking far and wide for any articles with working solutions. If I could be clearer, feel free to ask as this is my first time using Stack Overflow. Cheers, --Josh

    Read the article

  • how to do asynchronous http requests with epoll and python 3.1

    - by flow
    there is an interesting page http://scotdoyle.com/python-epoll-howto.html about how to do asnchronous / non-blocking / AIO http serving in python 3. there is the tornado web server which does include a non-blocking http client. i have managed to port parts of the server to python 3.1, but the implementation of the client requires pyCurl and seems to have problems (with one participant stating how ‘Libcurl is such a pain in the neck’, and looking at the incredibly ugly pyCurl page i doubt pyCurl will arrive in py3+ any time soon). now that epoll is available in the standard library, it should be possible to do asynchronous http requests out of the box with python. i really do not want to use asyncore or whatnot; epoll has a reputation for being the ideal tool for the task, and it is part of the python distribution, so using anything but epoll for non-blocking http is highly counterintuitive (prove me wrong if you feel like it). oh, and i feel threading is horrible. no threading. i use stackless. people further interested in the topic of asynchronous http should not miss out on this talk by peter portante at PyCon2010; also of interest is the keynote, where speaker antonio rodriguez at one point emphasizes the importance of having up-to-date web technology libraries right in the standard library.

    Read the article

  • OData / WCF Data Service - HTTP 500 Error

    - by Eric
    I have created an OData/WCF service using Visual Studio 2010 on Windows XP SP3 with all current patches installed. When I click on "view in browser", the service opens and I see the 3 tables from my EF model. However, when I add a table name ("Commands" in this case) to the end of the query string, rather than seeing the data from the table, I get an HTTP 500 error. (This error (HTTP 500 Internal Server Error) means that the website you are visiting had a server problem which prevented the webpage from displaying.) I have not only followed the examples from 2 sites, but have also tried running the sample application that the blog poster sent me (that works on his machine), and still am not having any luck. The blog post is at Exposing OData from an Entity Framework Model. Does anyone have an idea why this is occurring and how to resolve it? Here is the output of the "View in Browser": <?xml version="1.0" encoding="utf-8" standalone="yes"?> <service xml:base="http://localhost:1883/VistaDBCommandService.svc/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:app="http://www.w3.org/2007/app" xmlns="http://www.w3.org/2007/app"> <workspace> <atom:title>Default</atom:title> <collection href="Commands"> <atom:title>Commands</atom:title> </collection> <collection href="Databases"> <atom:title>Databases</atom:title> </collection> <collection href="Statuses"> <atom:title>Statuses</atom:title> </collection> </workspace> </service> ============================= Thanks, Eric
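
    The blank 500 usually hides the real exception; turning on verbose errors makes the response carry it instead, either with config.UseVerboseErrors = true inside the service's InitializeService method or with the serviceDebug behavior in web.config. A hedged sketch of the web.config route (an unnamed behavior applies to all services on .NET 4):

        <system.serviceModel>
          <behaviors>
            <serviceBehaviors>
              <behavior>
                <serviceDebug includeExceptionDetailInFaults="true" />
              </behavior>
            </serviceBehaviors>
          </behaviors>
        </system.serviceModel>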

    Read the article

  • Relation between HTTP Keep Alive duration and TCP timeout duration

    - by Suresh Kumar
    I am trying to understand the relation between TCP/IP and HTTP timeout values. Are these two timeout values different or the same? Most Web servers allow users to set the HTTP Keep-Alive timeout value through some configuration. How is this value used by the Web server? Is this value just set on the underlying TCP/IP socket, i.e. are the HTTP Keep-Alive timeout and the TCP/IP Keep-Alive timeout the same, or are they treated differently? My understanding (maybe incorrect) is: the Web server uses the default timeout on the underlying TCP socket (i.e. indefinite) regardless of the configured HTTP Keep-Alive timeout, and creates a worker thread that counts down the specified HTTP timeout interval. When that countdown hits zero, the worker thread closes the connection. EDIT: My question is about the relation or difference between the two timeout durations, i.e. what will happen when the HTTP keep-alive timeout duration and the socket timeout (SO_TIMEOUT) that the Web server uses are different? Should I even worry about whether these two are the same or not?
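
    To make the distinction concrete, here is a minimal hand-rolled sketch (this is not how any particular web server implements it): the socket timeout bounds how long a single blocking read may wait for bytes, while the keep-alive window is the server's own bookkeeping about how long to keep an idle connection around between requests.

        import java.io.InputStream;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class KeepAliveSketch {
            public static void main(String[] args) throws Exception {
                try (ServerSocket server = new ServerSocket(8080);
                     Socket client = server.accept()) {
                    // TCP-level knob: a blocking read() gives up with a
                    // SocketTimeoutException if no bytes arrive for 30 seconds.
                    client.setSoTimeout(30_000);

                    long keepAliveMillis = 15_000;   // HTTP keep-alive window (application-level)
                    long lastResponseAt = System.currentTimeMillis();
                    InputStream in = client.getInputStream();

                    // Simplified request loop: after each response the server checks its
                    // own clock and closes the connection once the keep-alive window has
                    // elapsed, even though the socket itself could stay open much longer.
                    while (System.currentTimeMillis() - lastResponseAt < keepAliveMillis) {
                        if (in.read() == -1) {
                            break;                   // peer closed the connection
                        }
                        // ... read the rest of the request and write a response ...
                        lastResponseAt = System.currentTimeMillis();
                    }
                }
            }
        }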

    Read the article

  • Rails' page caching vs. HTTP reverse proxy caches

    - by John Topley
    I've been catching up with the Scaling Rails screencasts. In episode 11 which covers advanced HTTP caching (using reverse proxy caches such as Varnish and Squid etc.), they recommend only considering using a reverse proxy cache once you've already exhausted the possibilities of page, action and fragment caching within your Rails application (as well as memcached etc. but that's not relevant to this question). What I can't quite understand is how using an HTTP reverse proxy cache can provide a performance boost for an application that already uses page caching. To simplify matters, let's assume that I'm talking about a single host here. This is my understanding of how both techniques work (maybe I'm wrong): With page caching the Rails process is hit initially and then generates a static HTML file that is served directly by the Web server for subsequent requests, for as long as the cache for that request is valid. If the cache has expired then Rails is hit again and the static file is regenerated with the updated content ready for the next request With an HTTP reverse proxy cache the Rails process is hit when the proxy needs to determine whether the content is stale or not. This is done using various HTTP headers such as ETag, Last-Modified etc. If the content is fresh then Rails responds to the proxy with an HTTP 304 Not Modified and the proxy serves its cached content to the browser, or even better, responds with its own HTTP 304. If the content is stale then Rails serves the updated content to the proxy which caches it and then serves it to the browser If my understanding is correct, then doesn't page caching result in less hits to the Rails process? There isn't all that back and forth to determine if the content is stale, meaning better performance than reverse proxy caching. Why might you use both techniques in conjunction?
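
    For reference, the revalidation round-trip described above for the reverse proxy case is a conditional GET; on the wire it looks roughly like this (the path and ETag value are placeholders), and the point of page caching is that not even this cheap exchange reaches Rails while the cached file exists:

        GET /posts/1 HTTP/1.1
        Host: example.com
        If-None-Match: "a64df3"

        HTTP/1.1 304 Not Modified
        ETag: "a64df3"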

    Read the article

  • Add an Opera Style Status Bar to Firefox

    - by Asian Angel
    Anyone who has used Opera will be familiar with the information presented for the webpage that is currently loading in the browser (i.e. number of images loaded). If you would like to have that same functionality in Firefox then join us as we look at the Extended Statusbar extension. Before Here is the default setup for Firefox…not a lot of information available to indicate exactly how much of the webpage has already loaded versus what has not. For some people this is enough but what if you like more details? Extended Statusbar in Action You may be curious about the information that the Extended Statusbar extension will provide. The information includes: Percentage of the webpage loaded The number of images loaded Bytes downloaded Average download speed The load time After emptying the cache we once again reloaded the HTG homepage. The default style/mode is “Classic Style” and the “webpage load information” will be displayed within your “Status Bar” as shown here. The information available after the webpage finished loading in “Classic Style”. If you prefer “Slim Mode” this is how your “Status Bar” should look afterwards…very condensed. For those preferring the “New Style” a temporary addition will appear above your regular “Status Bar” and disappear just a few seconds after the webpage has fully loaded (unless changed in the “Settings”). Settings The “Settings” are set up in two different ways. For those who prefer to use the “Classic Style & Slim Mode” these are the options available to you. If you prefer the “New Style” then you will have a whole different set of options available. Notice that you can exclude certain webpages and set a custom style if desired. Conclusion If you have been wanting to add Opera style webpage loading information to your “Status Bar” then you should definitely give this extension a try. Links Download the Extended Statusbar extension (Mozilla Add-ons)

    Read the article

  • How can I prevent HTTPS on another domain from wrongly showing on my HTTP-only domain?

    - by Earlz
    So, I have a blog at domain.com. This blog is HTTP-only because I would gain almost nothing from adding SSL support. I have a web service now that I want to enable SSL support on, and it runs on the same server and IP address as my blog. I got it all working pretty easily, but now if I go to https://domain.com I see a huge warning about an SSL certificate error, and if I click "ok" through the warning, I'll see the web service with SSL support, not my blog. My biggest fear with this scheme is Google indexing an HTTPS version of it and penalizing my blog because the content between the two doesn't match. How can I somehow force my blog's domain to either not serve anything on HTTPS, or to redirect back to my HTTP blog, or to serve my blog, but with an invalid SSL certificate? What can I do, preferably without buying another dedicated IP for my website?
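
    If the "redirect back to my HTTP blog" option is acceptable, a name-based SSL vhost for the blog can answer on 443 and bounce visitors to HTTP, as in the hedged Apache sketch below (certificate paths are placeholders and the existing service certificate is reused). The certificate warning itself cannot be avoided this way: with a single IP and no certificate valid for domain.com, browsers will warn before they ever see the redirect, so this mainly keeps click-through visitors and crawlers from landing on the web service's content under the blog's hostname.

        <VirtualHost *:443>
            ServerName domain.com
            SSLEngine on
            SSLCertificateFile    /etc/ssl/certs/service.crt
            SSLCertificateKeyFile /etc/ssl/private/service.key
            Redirect permanent / http://domain.com/
        </VirtualHost>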

    Read the article

  • Build Controller status Unavailable issue in TFS2010

    - by jehan
    I ran into this problem a few days back: I was not able to run builds because the Build Controller was showing its status as Unavailable. It was showing the exception below: There was no endpoint listening at http://fullmachinename:9191/Build/v3.0/Services/Controller/2 that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details. After trying out a few things, I looked at the Build Service properties. Then I made the following modifications to the Build Service properties: 1) Changed the Local Build Service Endpoint (incoming) from http://machinename.domain.com:9191 to http://machinename:9191 2) Changed the Connect to Team Project Collection (outgoing) setting from localhost to the machine name: http://localhost:8080/tfs/defaultCollection to http://machinename:8080/tfs/DefaultCollection After that I started the Build Service and it fixed the issue; the Build Controller showed an Available status and was able to run the builds.

    Read the article

  • Setting MSN or Yahoo! Messenger status to Invisible or Offline when idle for an hour

    - by Jian Lin
    Where, or how, do I set it up in MSN Messenger or Yahoo! Messenger to automatically switch my status to either "Invisible" or "Offline" when idle for a half hour, or an hour? I know how to set my status as "Away" or "Busy" after 10 minutes, but can't seem to find a way to set the offline status options without manual intervention. Back story As a software developer, I am very used to turning the computer on for the whole day and not turning it back off. (For example, checking email for urgent fixes, fix issue and push to web server). It's not even turned off when heading to sleep in case I might find it hard to fall asleep and come back to check on the computer. Or to have it there ready in the morning to check that everything is okay. If I'm seen as being online for 24 hours of a day, some people see me as weird. Their perception of my value decreases as I'm always there (hard to get = high value; always there = low value). Leaving it on makes everyone in my contacts list think I have nothing better to do all day than sit in front of the computer. Even though it's my job and I do admittedly spend more time online than other people. That's why I'd like to find a way to set my status as Invisible or Offline.

    Read the article

  • Google.com and clients1.google.com/generate_204

    - by David Murdoch
    I was looking into google.com's Net activity in firebug just because I was curious and noticed a request was returning "204 No Content." It turns out that a 204 No Content "is primarily intended to allow input for actions to take place without causing a change to the user agent's active document view, although any new or updated metainformation SHOULD be applied to the document currently in the user agent's active view." Whatever. I've looked into the JS source code and saw that "generate_204" is requested like this: (new Image).src="http://clients1.google.com/generate_204" No variable declaration/assignment at all. My first idea is that it was being used to track if Javascript is enabled. But the "(new Image).src='...'" call is called from a dynamically loaded external JS file anyway, so that would be pointless. Anyone have any ideas as to what the point could be? UPDATE "/generate_204" appears to be available on many google services/servers (e.g., maps.google.com/generate_204, maps.gstatic.com/generate_204, etc...). You can take advantage of this by pre-fetching the generate_204 pages for each google-owned service your web app may use. Like This: window.onload = function(){ var two_o_fours = [ // google maps domain ... "http://maps.google.com/generate_204", // google maps images domains ... "http://mt0.google.com/generate_204", "http://mt1.google.com/generate_204", "http://mt2.google.com/generate_204", "http://mt3.google.com/generate_204", // you can add your own 204 page for your subdomains too! "http://sub.domain.com/generate_204" ]; for(var i = 0, l = two_o_fours.length; i < l; ++i){ (new Image).src = two_o_fours[i]; } };

    Read the article

  • C# SOCKS proxy service for HTTP requests

    - by Ed
    I'm trying to build a service that will forward HTTP requests from agents like a browser to the Tor service. Problem is, the Tor service only accepts SOCKS4a connections. So my solution is to listen for HTTP requests, get the URL they're requesting, and make a request via Tor with the help of the Starksoft.Net.Proxy library. Then return the response. The library kind of works, but I'm not happy. It returns HTTP headers with the response and it can't handle images. So the responses are messed up. How could I improve my code? I'm very new to network programming. Sorry for the long example. public AnonymiserService(ILogger logger) { try { _logger = logger; _logger.Log("Listening on port {0}...", Properties.Settings.Default.ListeningPort); StartListener(new string[] { string.Format("http://*:{0}/", Properties.Settings.Default.ListeningPort) }); } catch (Exception ex) { _logger.LogError("Exception!", ex); } } private void StartListener(string[] prefixes) { if (!HttpListener.IsSupported) { _logger.LogError("HttpListener isn't supported on this machine!"); return; } HttpListener listener = new HttpListener(); foreach (string s in prefixes) listener.Prefixes.Add(s); while (true) { listener.Start(); IAsyncResult result = listener.BeginGetContext(new AsyncCallback(ListenerCallback), listener); result.AsyncWaitHandle.WaitOne(); } } private void ListenerCallback(IAsyncResult result) { try { // Get HTTP request HttpListener listener = (HttpListener)result.AsyncState; HttpListenerContext context = listener.EndGetContext(result); _logger.Log("Retrieving [{0}]", context.Request.RawUrl); // Create connection // Use Tor as proxy IProxyClient proxyClient = new Socks4aProxyClient("localhost", 9050); TcpClient tcpClient = proxyClient.CreateConnection(context.Request.UserHostName, 80); // Create message // Need to set Connection: close to close the connection as soon as it's done byte[] data = Encoding.UTF8.GetBytes(String.Format("GET {0} HTTP/1.1\r\nHost: {1}\r\nConnection: close\r\n\r\n", context.Request.Url.PathAndQuery, context.Request.UserHostName)); // Send message NetworkStream ns = tcpClient.GetStream(); ns.Write(data, 0, data.Length); // Pass on HTTP response HttpListenerResponse responseOut = context.Response; if (ns.CanRead) { byte[] buffer = new byte[32768]; int read = 0; string responseString = string.Empty; // Read response while ((read = ns.Read(buffer, 0, buffer.Length)) > 0) { responseString += Encoding.UTF8.GetString(buffer, 0, read); } // Remove headers if (responseString.IndexOf("HTTP/1.1 200 OK") > -1) responseString = responseString.Substring(responseString.IndexOf("\r\n\r\n")); // Forward response byte[] byteArray = Encoding.UTF8.GetBytes(responseString); responseOut.OutputStream.Write(byteArray, 0, byteArray.Length); } // Close streams responseOut.OutputStream.Close(); ns.Close(); // Close connection tcpClient.Close(); _logger.Log("Retrieved [{0}]", context.Request.RawUrl); } catch (Exception ex) { _logger.LogError("Exception in ListenerCallback!", ex); } }

    Read the article

  • Quick Quips on QR Codes

    - by Tim Dexter
    Yes, I'm an alliterating all-star; I missed my calling as a newspaper headline writer. I have recently received questions from several folks on support for QR codes. You know them they are everywhere you look, even here! How does Publisher handle QR codes then? In theory, exactly the same way we handle any other 2D barcode font. We need the font file, a mapping entry and an encoding class. With those three pieces we can embed QR codes into any output. To test the theory, I went off to IDAutomation, I have worked with them and many customers over the years and their fonts and encoders have worked great and have been very reliable. They kindly provide demo fonts which has made my life so much easier to be able to write posts like this. Their QR font and encoder is a little tough to find. I started here and then hit the Demo Now button. On the next page I hit the right hand Demo Now button. In the resulting zip file you'll need two files: AdditionalFonts.zip >> Automation2DFonts >> TrueType >> IDAutomation2D.ttf Java Class Encoder >> IDAutomation_JavaFontEncoder_QRCode.jar - the QRBarcodeExample.java is useful to see how to call the encoder. The font file needs to be installed into the windows/fonts directory, just copy and paste it in using file explorer and windows will install it for you. Remember, we are using the demo font here and you'll see if you get your phones decoder to looks a the font above there is a fixed string 'DEMO' at the beginning. You want that removed? Go buy the font from the IDAutomation folks. The Encoder Next you need to create your encoding wrapper class. Publisher does ship a class but its compiled and I do not recommend trying to modify it, you can just build your own. I have loaded up my class here. You do not need to be a java guru, its pretty straightforward. I'd recommend a java IDE like JDeveloper from a convenience point of view. I have annotated my class and added a main method to it so you can test your encoders from JDeveloper without having to deploy them first. You can load up the project form the zip file straight into JDeveloper.Next, take a look at IDAutomation's example java class and you'll see: QRCodeEncoder qre=new QRCodeEncoder();  String DataToEncode = "IDAutmation Inc.";  boolean ApplyTilde = false;  int EncodingMode = 0;  int Version = 0;  int ErrorCorrectionLevel = 0;  System.out.println( qre.FontEncode(DataToEncode, ApplyTilde, EncodingMode, Version, ErrorCorrectionLevel) ); You'll need to check what settings you need to set for the ApplyTilde, EncodingMode, Version and ErrorCorrectionLevel. They are covered in the user guide from IDAutomation here. If you do not want to hard code the values in the encoder then you can quite easily externalize them and read the values from a text file. I have not covered that scenario here, I'm going with IDAutomation's defaults and my phone app is reading the fonts no problem. Now you know how to call the encoder, you need to incorporate it into your encoder wrapper class. From my sample class:       Class[] clazz = new Class[] { "".getClass() };        ENCODERS.put("code128a",mUtility.getClass().getMethod("code128a", clazz));       ENCODERS.put("code128b",mUtility.getClass().getMethod("code128b", clazz));       ENCODERS.put("code128c",mUtility.getClass().getMethod("code128c", clazz));       ENCODERS.put("qrcode",mUtility.getClass().getMethod("qrcode", clazz)); I just added a new entry to register the encoder method 'qrcode' (in red). 
Then I created a new method inside the class to call the IDAutomation encoder. /** Call to IDAutomations QR Code encoder. Passing the data to encode      Returning the encoded string to the template for formatting **/ public static final String qrcode (String DataToEncode) {   QRCodeEncoder qre=new QRCodeEncoder();    boolean ApplyTilde = false;    int EncodingMode = 0;    int Version = 0;    int ErrorCorrectionLevel = 0; return qre.FontEncode(DataToEncode, ApplyTilde, EncodingMode, Version, ErrorCorrectionLevel); } Almost the exact same code in their sample class. The DataToEncode string is passed in rather than hardcoded of course. With the class done you can now compile it, but you need to ensure that the IDAutomation_JavaFontEncoder_QRCode.jar is in the classpath. In JDeveloper, open the project properties >> Libraries and Classpaths and then add the jar to the list. You'll need the publisher jars too. You can find those in the jlib directory in your Template Builder for Word directory.Note! In my class, I have used package oracle.psbi.barcode; As my package spec, yours will be different but you need to note it for later. Once you have it compiling without errors you will need to generate a jar file to keep it in. In JDeveloper highlight your project node >> New >> Deployment Profile >> JAR file. Once you have created the descriptor, just take the defaults. It will tell you where the jar is located. Go get it and then its time to copy it and the IDAutomation jar into the Template Builder for Word directory structure. Deploying the jars On your windows machine locate the jlib directory under the Template Builder for Word install directory. On my machine its here, F:\Program Files\Oracle\BI Publisher\BI Publisher Desktop\Template Builder for Word\jlib. Copy both of the jar files into the directory. The next step is to get the jars into the classpath for the Word plugin so that Publisher can find your wrapper class and it can then find the IDAutomation encoder. The most consistent way I have found so far, is to open up the RTF2PDF.jar in the same directory and make some mods. First make a backup of the jar file then open it using winzip or 7zip or similar and get into the META-INF directory. In there is a file, MANIFEST.MF. This contains the classpath for the plugin, open it in an editor and add the jars to the end of the classpath list. In mine I have: Manifest-Version: 1.0 Class-Path: ./activation.jar ./mail.jar ./xdochartstyles.jar ./bicmn.jar ./jewt4.jar ./share.jar ./bipres.jar ./xdoparser.jar ./xdocore.jar ./xmlparserv2.jar ./xmlparserv2-904.jar  ./i18nAPI_v3.jar ./versioninfo.jar ./barcodejar.jar ./IDAutomation_JavaFontEncoder_QRCode.jar Main-Class: RTF2PDF I have put in carriage returns above to make the Class-Path: entry more readable, make sure yours is all on one line. Be sure to use the ./ as a prefix to the jar name. Ensure the file is saved inside the jar file 7zip and winzip both have popups asking if you want to update the file in the jar file.Now you have the jars on the classpath, the Publisher plugin will be able to find our classes at run time. Referencing the Font The next step is to reference the font location so that the rendering engine can find it and embed a subset into the PDF output. Remember the other output formats rely on the font being present on the machine that is opening the document. The PDF is the only truly portable format. 
Inside the config directory under the Template Builder for Word install directory, mine is here, F:\Program Files\Oracle\BI Publisher\BI Publisher Desktop\Template Builder for Word\config. You'll find the file, 'xdo example.cfg'. Rename it to xdo.cfg and open it in a text editor. In the fonts section, create a new entry:       <font family="IDAutomation2D" style="normal" weight="normal">              <truetype path="C:\windows\fonts\IDAutomation2D.ttf" />       </font> Note, 'IDAutomation2D' (in red) is the same name as you can see when you open MSWord and look for the QRCode font. This must match exactly. When Publisher looks at the fonts in the RTF template at runtime it will see 'IDAutomation2D' it will then look at its font mapping entries to find where that font file resides on the disk. If the names do not match or the font is not present then the font will not get used and it will fall back on Helvetica. Building the Template Now you have the data encoder and the font in place and mapped; you can use it in the template. The two commands you will need to have present are: <?register-barcode-vendor:'ENCODER WRAPPER CLASS'; 'ENCODER NAME'?> for my encoder I have: <?register-barcode-vendor:'oracle.psbi.barcode.BarcodeUtil'; 'MyBarcodeEncoder'?> Notice the two parameters for the command. The first provides the package 'path' and class name (remember I said you need to remember that above.)The second is the name of the encoder, in my case 'MyBarcodeEncoder'. Check my full encoder class in the zip linked below to see where I named it. You can change it to something else, no problem.This command needs to be near the top of the template. The second command is the encoding command: <?format-barcode:DATAT_TO_ENCODE;'ENCODER_METHOD_NAME';'ENCODER_NAME'?> for my command I have <?format-barcode:DATATEXT;'qrcode';'MyBarcodeEncoder'?>DATATEXT is the XML element that contains the text to be encoded. If you want to hard code a piece of text just surround it with single quotes. qrcode is the name of my encoder method that calls the IDAutomation encoder. Remember this.MyBarcodeEncoder is the name of my encoder. Repetition? Yes but its needed again. Both of these commands are put inside their own form fields. Do not apply the QRCode font to the second field just yet. Lets make sure the encoder is working. Run you template with some data and you should get something like this for your encoded data: AHEEEHAPPJOPMOFADIPFJKDCLPAHEEEHA BNFFFNBPJGMDIDJPFOJGIGBLMPBNFFFNB APIBOHFJCFBNKHGGBMPFJFJLJBKGOMNII OANKPJFFLEPLDNPCLMNGNIJIHFDNLJFEH FPLFLHFHFILKFBLOIGMDFCFLGJGOPJJME CPIACDFJPBGDODOJCHALJOBPECKMOEDDF MFFNFNEPKKKCHAIHCHPCFFLDAHFHAGLMK APBBBPAPLDKNKJKKGIPDLKGMGHDDEPHLN HHHHHHHPHPHHPHPPHPPPPHHPHHPHPHPHP Grooovy huh? If you do not get the encoded text then go back and check that your jars are in the right spot and that you have the MANIFEST.MF file updated correctly. Once you do get the encoded text, highlight the field and apply the IDAutomation2D font to it. Then re-run the report and you will hopefully see the QR code in your output. If not, go back and check the xdo.cfg entry and make sure its in the right place and the font location is correct. That's it, you now have QR codes in Publisher outputs. Everything I have written above, has been tested with the 5.6.3, 10.1.3.4.2 codelines. I'll be testing the 11g code in the next day or two and will update you with any changes. One thing I have not covered yet and will do in the next few days is how to deploy all of this to your server. 
Look out for a follow up post. One note on the apparent white lines in the font (see the image above). Once printed they disappear and even viewing the code on a screen with the white lines, my phone app is still able to read and interpret the contents no problem. I have zipped up my encoder wrapper class as a JDeveloper 11.1.1.6 project here. Just dig into the src directories to find the BarcodeUtil.java file if you just want the code. I have put comments into the file to hopefully help the novice java programmer out. Happy QR'ing!

    Read the article

  • Server returned HTTP response code: 500 for URL

    - by user617162
    java.io.IOException: Server returned HTTP response code: 500 for URL: http://ww.huadt.com.cn/zh-cn/i/l/@357671030745308@V500@0000@AUTOLOW@1@11d590f7$GPRMC,06548.000,A,3959.8587,N,11617.2447,E,0.00,55.32,210311,,,A*56@@ at sun.net.www.protocol.http.HttpURLConnection.getInputStream(Unknown Source) at hdt.SendCmdToP.Sendplatform(SendCmdToP.java:67) at hdt.SendCmdToP.process(SendCmdToP.java:198) at hdt.SendCmdToP.run(SendCmdToP.java:131) java.lang.NullPointerException at hdt.SendCmdToP.Sendplatform(SendCmdToP.java:91) at hdt.SendCmdToP.process(SendCmdToP.java:198) at hdt.SendCmdToP.run(SendCmdToP.java:131) A null pointer exception appeared along with the 500 error and the abnormal close. Could this be a firewall issue? If not, is it a problem in my code? Please help me figure out how to solve this. Thanks.
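
    The 500 itself is generated by the remote server, so the useful clue is whatever that server wrote into the error body. HttpURLConnection throws the IOException from getInputStream() for status codes of 400 and above, but the body is still readable through getErrorStream(); a hedged sketch (the URL is a placeholder):

        import java.io.BufferedReader;
        import java.io.InputStream;
        import java.io.InputStreamReader;
        import java.net.HttpURLConnection;
        import java.net.URL;

        public class ErrorBodyDump {
            public static void main(String[] args) throws Exception {
                URL url = new URL("http://example.com/some/endpoint");
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                int code = conn.getResponseCode();        // 500 means the server itself failed
                InputStream body = (code >= 400)
                        ? conn.getErrorStream()           // error responses are read from here
                        : conn.getInputStream();
                if (body != null) {
                    BufferedReader in = new BufferedReader(new InputStreamReader(body));
                    String line;
                    while ((line = in.readLine()) != null) {
                        System.out.println(line);         // the server's error page or stack trace
                    }
                    in.close();
                }
                conn.disconnect();
            }
        }

    The follow-on NullPointerException at SendCmdToP.java:91 most likely comes from code that assumed the failed call had produced a result; checking the status code before using the response should make that clearer.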

    Read the article

  • JAX-RS --- How to return JSON and HTTP Status code together?

    - by masato-san
    I'm writing a REST web app (NetBeans 6.9, JAX-RS, TopLink Essentials) and trying to return JSON and an HTTP status code together. I have code ready and working that returns JSON when an HTTP GET method is called from the client. Code snippet: @Path("get/id") @GET @Produces("application/json") public M_?? getMachineToUpdate(@PathParam("id") String id) { //some code to return JSON . . return myJson But I also want to return an HTTP status code (500, 200, 204, etc.) along with the JSON. I tried using the HttpServletResponse object: response.sendError("error message", 500); But this made the browser think it was a real 500, so the output page was the regular HTTP 500 error page. What I want is just to return the status code so that my JavaScript on the client side can handle some logic depending on which HTTP status code is returned (maybe just to display the error code and message on the HTML page). Is it possible to do so, or should HTTP status codes not be used for such things?
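
    JAX-RS supports this directly: declare the method to return javax.ws.rs.core.Response and you can set the status code and attach the JSON entity in one object, without going through HttpServletResponse.sendError (sendError is what triggers the container's error page). A hedged sketch; the path template and the JSON-building helper are made up for illustration:

        import javax.ws.rs.GET;
        import javax.ws.rs.Path;
        import javax.ws.rs.PathParam;
        import javax.ws.rs.Produces;
        import javax.ws.rs.core.Response;

        @Path("get/{id}")
        public class MachineResource {

            @GET
            @Produces("application/json")
            public Response getMachineToUpdate(@PathParam("id") String id) {
                String json = buildJsonFor(id);          // hypothetical helper
                if (json == null) {
                    // nothing to return: 204 with no body (404 would also be reasonable)
                    return Response.status(Response.Status.NO_CONTENT).build();
                }
                // 200 plus the JSON entity in a single return value
                return Response.ok(json, "application/json").build();
            }

            private String buildJsonFor(String id) {
                return "{\"id\": \"" + id + "\"}";
            }
        }

    Client-side JavaScript then sees the real status code and can branch on it, while the body still carries the JSON.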

    Read the article

< Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >