Search Results

Search found 8989 results on 360 pages for 'response'.


  • how to gzip-compress large Ajax responses (HTML only) in Coldfusion?

    - by frequent
    I'm running Coldfusion8 and jquery/jquery-mobile on the front-end. I'm playing around with an Ajax powered search engine trying to find the best tradeoff between data-volume and client-side processing time. Currently my AJAX search returns 40k of (JQM-enhanced markup), which avoids any client-side enhancement. This way I'm getting by without the page stalling for about 2-3 seconds, while JQM enhances all elements in the search results. What I'm curious is whether I can gzip Ajax responses sent from Coldfusion. If I check the header of my search right now, I'm having this: RESPONSE-header Connection Keep-Alive Content-Type text/html; charset=UTF-8 Date Sat, 01 Sep 2012 08:47:07 GMT Keep-Alive timeout=5, max=95 Server Apache/2.2.21 (Win32) mod_ssl/2.2.21 ... Transfer-Encoding chunked REQUEST-header Accept */* Accept-Encoding gzip, deflate Accept-Language de-de,de;q=0.8,en-us;q=0.5,en;q=0.3 Connection keep-alive Cookie CFID= ; CFTOKEN= ; resolution=1143 Host www.host.com Referer http://www.host.com/dev/users/index.cfm So, my request would accept gzip, deflate, but I'm getting back chunked. I'm generating the AJAX response in a cfsavecontent (called compressedHTML) and run this to eliminate whitespace <cfrscipt> compressedHTML = reReplace(renderedResults, "\>\s+\<", "> <", "ALL"); compressedHTML = reReplace(compressedHTML, "\s{2,}", chr(13), "ALL"); compressedHTML = reReplace(compressedHTML, "\s{2,}", chr(09), "ALL"); </cfscript> before sending the compressedHTML in a response object like this: {"SUCCESS":true,"DATA": compressedHTML } Question If I know I'm sending back HTML in my data object via Ajax, is there a way to gzip the response server-side before returning it vs sending chunked? If this is at all possible? If so, can I do this inside my response object or would I have to send back "pure" HTML? Thanks! EDIT: Found this on setting a 'web.config' for dynamic compression - doesn't seem to work EDIT2: Found thi snippet and am playing with it, although I'm not sure this will work. <cfscript> compressedHTML = reReplace(renderedResults, "\>\s+\<", "> <", "ALL"); compressedHTML = reReplace(compressedHTML, "\s{2,}", chr(13), "ALL"); compressedHTML = reReplace(compressedHTML, "\s{2,}", chr(09), "ALL"); if ( cgi.HTTP_ACCEPT_ENCODING contains "gzip" AND not showRaw ){ cfheader name="Content-Encoding" value="gzip"; bos = createObject("java","java.io.ByteArrayOutputStream").init(); gzipStream = createObject("java","java.util.zip.GZIPOutputStream"); gzipStream.init(bos); gzipStream.write(compressedHTML.getBytes("utf-8")); gzipStream.close(); bos.flush(); bos.close(); encoder = createObject("java","sun.misc. outStr= encoder.encode(bos.toByteArray()); compressedHTML = toString(bos.toByteArray()); } </cfscript> Probably need to try this on the response object and not the compressedTHML variable
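
    The question is ColdFusion-specific, but the server-side pattern it is reaching for is the same in any stack: gzip the rendered HTML bytes, send them with a Content-Encoding: gzip header, and let the browser decompress transparently. (Transfer-Encoding: chunked and Content-Encoding: gzip are independent; a response can legitimately be both.) As a minimal, language-neutral illustration of that pattern - not the asker's ColdFusion code - here is a C# sketch; the class and method names are invented for the example:

        using System.IO;
        using System.IO.Compression;
        using System.Text;

        static class GzipSketch
        {
            // Compress a rendered HTML string to gzip bytes. The caller is then
            // responsible for adding the "Content-Encoding: gzip" response header
            // before writing these bytes to the response stream.
            public static byte[] CompressHtml(string html)
            {
                byte[] raw = Encoding.UTF8.GetBytes(html);
                using (var output = new MemoryStream())
                {
                    using (var gzip = new GZipStream(output, CompressionMode.Compress))
                    {
                        gzip.Write(raw, 0, raw.Length);
                    }
                    return output.ToArray(); // ToArray still works after the streams close
                }
            }
        }

    In practice, letting the Apache front end shown in the response headers compress text/html via mod_deflate is usually simpler than compressing by hand in application code.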


  • Visual Studio reports that not all code paths return a value, even though they do

    - by chris12892
    I have an API in NETMF C# that I am writing that includes a function to send an HTTP request. For those who are familiar with NETMF, this is a heavily modified version of the "webClient" example, which a simple application that demonstrates how to submit an HTTP request, and recive a response. In the sample, it simply prints the response and returns void,. In my version, however, I need it to return the HTTP response. For some reason, Visual Studio reports that not all code paths return a value, even though, as far as I can tell, they do. Here is my code... /// <summary> /// This is a modified webClient /// </summary> /// <param name="url"></param> private string httpRequest(string url) { // Create an HTTP Web request. HttpWebRequest request = HttpWebRequest.Create(url) as HttpWebRequest; // Set request.KeepAlive to use a persistent connection. request.KeepAlive = true; // Get a response from the server. WebResponse resp = request.GetResponse(); // Get the network response stream to read the page data. if (resp != null) { Stream respStream = resp.GetResponseStream(); string page = ""; byte[] byteData = new byte[4096]; char[] charData = new char[4096]; int bytesRead = 0; Decoder UTF8decoder = System.Text.Encoding.UTF8.GetDecoder(); int totalBytes = 0; // allow 5 seconds for reading the stream respStream.ReadTimeout = 5000; // If we know the content length, read exactly that amount of // data; otherwise, read until there is nothing left to read. if (resp.ContentLength != -1) { for (int dataRem = (int)resp.ContentLength; dataRem > 0; ) { Thread.Sleep(500); bytesRead = respStream.Read(byteData, 0, byteData.Length); if (bytesRead == 0) throw new Exception("Data laes than expected"); dataRem -= bytesRead; // Convert from bytes to chars, and add to the page // string. int byteUsed, charUsed; bool completed = false; totalBytes += bytesRead; UTF8decoder.Convert(byteData, 0, bytesRead, charData, 0, bytesRead, true, out byteUsed, out charUsed, out completed); page = page + new String(charData, 0, charUsed); } page = new String(System.Text.Encoding.UTF8.GetChars(byteData)); } else throw new Exception("No content-Length reported"); // Close the response stream. For Keep-Alive streams, the // stream will remain open and will be pushed into the unused // stream list. resp.Close(); return page; } } Any ideas? Thanks...
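
    The warning comes from the shape of the method rather than the stream-reading logic: the if (resp != null) branch returns, but when resp is null execution falls off the end of a string-returning method. A minimal sketch of the usual fix is below (standard .NET shown; NETMF types are close but may differ slightly, and the empty page assignment stands in for the reading loop in the question):

        using System.Net;

        class Example
        {
            private string httpRequest(string url)
            {
                WebResponse resp = WebRequest.Create(url).GetResponse();
                if (resp != null)
                {
                    string page = "";   // placeholder for the stream-reading loop
                    resp.Close();
                    return page;
                }

                // A terminal throw (or return) on the null path is what the
                // compiler is asking for.
                throw new WebException("No response received");
            }
        }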


  • php soapclient returns null but getPreviousResults has proper results

    - by Joseph.Chambers
    I've ran into trouble with SOAP, I've never had this issue before and can't find any information on line that helps me solve it. The following code $wsdl = "path/to/my/wsdl"; $client = new SoapClient($wsdl, array('trace' => true)); //$$textinput is passed in and is a very large string with rows in <item></item> tags $soapInput = new SoapVar($textinput, XSD_ANYXML); $res = $client->dataprofilingservice(array("contents" => $soapInput)); $response = $client->__getLastResponse(); var_dump($res);//outputs null var_dump($response);//provides the proper response as I would expect. I've tried passing params into the SoapClient constructor to define soap version but that didnt' help. I've also tried it with the trace param set to false and not present which as expected made $response null but $res was still null. I've tried the code on both a linux and windows install running Apache. The function definition in the WSDL is (xxxx is for security reasons) <portType name="xxxxServiceSoap"> <operation name="dataprofilingservice"> <input message="tns:dataprofilingserviceSoapIn"/> <output message="tns:dataprofilingserviceSoapOut"/> </operation> </portType> I have it working using the __getLastResponse() but its annoying me it will not work properly. I've put together a small testing script, does anyone see any issues here. //very simplifed dataset that would normally be //read in from a CSV file of about 1mb $soapInput = getSoapInput("asdf,qwer\r\nzzxvc,ewrwe\r\n23424,2113"); $wsdl = "path to wsdl"; try { $client = new SoapClient($wsdl,array('trace' => true,'exceptions' => true)); } catch (SoapFault $fault) { $error = 1; var_dump($fault); } try { $res = $client->dataprofilingservice(array("contents" => $soapInput)); $response = $client->__getLastResponse(); echo htmlentities($client->__getLastRequest()); echo '<hr>'; var_dump($res); echo "<hr>"; echo(htmlentities($response)); } catch (SoapFault $fault) { $error = 1; var_dump($fault); } function getSoapInput($input){ $rows = array(); $userInputs = explode("\r\n", $input); $userInputs = array_filter($userInputs); // $inputTemplate = " <contents>%s</contents>"; $rowTemplate = "<Item>%s</Item>"; // $soapString = ""; foreach ($userInputs as $row) { // sanitize $row = htmlspecialchars(addslashes($row)); $xmlStr = sprintf($rowTemplate, $row); $rows[] = $xmlStr; } $textinput = sprintf($inputTemplate, implode(PHP_EOL, $rows)); $soapInput = new SoapVar($textinput, XSD_ANYXML); return $soapInput; }


  • jQuery code not working in Google Chrome...

    - by Jonathan
    Hi, I have writen a simple jQuery code to control ajax tabs navigation.. Its working in good on FireFox but in Chrome it working in one page but not in the home page I don't know why... Its really simple code just a lot of animations and callbacks and stuff like that.. here's the code: jQuery.fn.tabs = function({movieID, movieTitle}) { var tabsWrap = '#movie_details_wrap'; var tabsContent = '#tab_content'; var firstTab = '#tab_detalles'; var postPHP = 'index.php?controlador=pelicula'; //When page loads... first tab actions $('ul.tabs_nav a:first').addClass('active'); //Activate first tab nav $.get(postPHP, {"activeTab": firstTab, "movieID": movieID}, function(response){ $(tabsContent).html(response); // insert response into the faded out div $(tabsWrap).animate({ // animate the wrap div using the new container div height height: $(tabsContent).height() + "px" }, function() { $(tabsContent).fadeIn(); // fade in the div with all the info }); }); //On Click Event $('ul.tabs_nav li').click(function() { $('ul.tabs_nav a').removeClass('active'); //Remove any 'active' class $(this).find('a').addClass('active'); //Add 'active' class to selected tab var activeTab = $(this).find('a').attr('href'); //Find the href attribute value to identify the active tab + content var orgHeight = $(tabsContent).height() + 'px'; // get original height $(tabsWrap).css('height', orgHeight); // set height with css to freeze the wrap div when we hide the inner div $(tabsContent).fadeOut(200, function() { // fade out the inner div // send data by ajax (post) $.get(postPHP, {"activeTab": activeTab, "movieID": movieID , "movieTitle": movieTitle}, function(response){ $(tabsContent).html(response); // insert response into the faded out div $(tabsWrap).animate({ // animate the wrap div using the new container div height height: $(tabsContent).height() + "px" }, function() { $(tabsContent).fadeIn(); // fade in the div with all the info }); }); }); return false; }); }; Here's the HTML: <script type="text/javascript"> $(document).ready(function(){ $('.tabs_nav').tabs({movieID:'135353', movieTitle: 'Some Title'}); }); </script> <!--Navigation--> <ul id="details_nav" class="tabs_nav"> <li><a href="#tab_detalles">Detalles</a></li> <li><a href="#tab_criticas">Criticas</a></li> <li><a href="#tab_posters">Posters</a></li> <li><a href="#tab_trailers">Trailers</a></li> </ul> <div class="border_wrap"> <div id="movie_details_wrap"> <div id="tab_content"> <!--Tabs content here--> </div> </div> </div>


  • Java installation problem

    - by Zxy
    I cannot install java on my ubuntu 12.04: zero@ghostrider:~$ sudo apt-get purge openjdk* [sudo] password for zero: Reading package lists... Done Building dependency tree Reading state information... Done Note, selecting 'openjdk-6-demo' for regex 'openjdk*' Note, selecting 'openjdk-7-jre-headless' for regex 'openjdk*' Note, selecting 'uwsgi-plugin-jwsgi-openjdk-6' for regex 'openjdk*' Note, selecting 'openjdk-jre' for regex 'openjdk*' Note, selecting 'openjdk-7-source' for regex 'openjdk*' Note, selecting 'openjdk-6-dbg' for regex 'openjdk*' Note, selecting 'openjdk7-jdk' for regex 'openjdk*' Note, selecting 'openjdk-6-doc' for regex 'openjdk*' Note, selecting 'openjdk-7-jre-zero' for regex 'openjdk*' Note, selecting 'openjdk-7-demo' for regex 'openjdk*' Note, selecting 'openjdk-6-jre-headless' for regex 'openjdk*' Note, selecting 'openjdk-6-jdk' for regex 'openjdk*' Note, selecting 'openjdk-6-jre' for regex 'openjdk*' Note, selecting 'openjdk-6-jre-lib' for regex 'openjdk*' Note, selecting 'openjdk-6-jre-zero' for regex 'openjdk*' Note, selecting 'openjdk-7-dbg' for regex 'openjdk*' Note, selecting 'openjdk-7-doc' for regex 'openjdk*' Note, selecting 'openjdk-7-jdk' for regex 'openjdk*' Note, selecting 'openjdk-7-jre' for regex 'openjdk*' Note, selecting 'openjdk-6-source' for regex 'openjdk*' Note, selecting 'openjdk-7-jre-lib' for regex 'openjdk*' Note, selecting 'uwsgi-plugin-jvm-openjdk-6' for regex 'openjdk*' Package uwsgi-plugin-jvm-openjdk-6 is not installed, so not removed Package uwsgi-plugin-jwsgi-openjdk-6 is not installed, so not removed Package openjdk-6-dbg is not installed, so not removed Package openjdk-6-demo is not installed, so not removed Package openjdk-6-doc is not installed, so not removed Package openjdk-6-jdk is not installed, so not removed Package openjdk-6-jre is not installed, so not removed Package openjdk-6-jre-headless is not installed, so not removed Package openjdk-6-jre-lib is not installed, so not removed Package openjdk-6-source is not installed, so not removed Package openjdk-6-jre-zero is not installed, so not removed Package openjdk-7-dbg is not installed, so not removed Package openjdk-7-demo is not installed, so not removed Package openjdk-7-doc is not installed, so not removed Package openjdk-7-jdk is not installed, so not removed Package openjdk-7-jre is not installed, so not removed Package openjdk-7-jre-headless is not installed, so not removed Package openjdk-7-jre-lib is not installed, so not removed Package openjdk-7-jre-zero is not installed, so not removed Package openjdk-7-source is not installed, so not removed 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 1 not fully installed or removed. After this operation, 0 B of additional disk space will be used. Setting up oracle-java7-installer (7u3-0~eugenesan~precise4) ... Downloading... --2012-06-11 23:56:42-- http://download.oracle.com/otn-pub/java/jdk/7u3-b04/jdk- 7u3-linux-i586.tar.gz Resolving download.oracle.com (download.oracle.com)... 64.209.77.18 Connecting to download.oracle.com (download.oracle.com)|64.209.77.18|:80... connected. HTTP request sent, awaiting response... 302 Moved Temporarily Location: https://edelivery.oracle.com/otn-pub/java/jdk/7u3-b04/jdk-7u3-linux-i586.tar.gz [following] --2012-06-11 23:56:42-- https://edelivery.oracle.com/otn-pub/java/jdk/7u3-b04/jdk-7u3-linux-i586.tar.gz Resolving edelivery.oracle.com (edelivery.oracle.com)... 95.101.122.174 Connecting to edelivery.oracle.com (edelivery.oracle.com)|95.101.122.174|:443... connected. 
HTTP request sent, awaiting response... 302 Moved Temporarily Location: http://download.oracle.com/errors/download-fail-1505220.html [following] --2012-06-11 23:56:44-- http://download.oracle.com/errors/download-fail-1505220.html Connecting to download.oracle.com (download.oracle.com)|64.209.77.18|:80... connected. HTTP request sent, awaiting response... 200 OK Length: 5307 (5.2K) [text/html] Saving to: `./jdk-7u3-linux-i586.tar.gz' 0K ..... 100% 1007K=0.005s 2012-06-11 23:56:44 (1007 KB/s) - `./jdk-7u3-linux-i586.tar.gz' saved [5307/5307] Download done. sha256sum mismatch jdk-7u3-linux-i586.tar.gz Oracle JDK 7 is NOT installed. dpkg: error processing oracle-java7-installer (--configure): subprocess installed post-installation script returned error exit status 1 No apport report written because MaxReports is reached already Errors were encountered while processing: oracle-java7-installer E: Sub-process /usr/bin/dpkg returned an error code (1) zero@ghostrider:~$ sudo add-apt-repository ppa:eugenesan/java You are about to add the following PPA to your system: More info: https://launchpad.net/~eugenesan/+archive/java Press [ENTER] to continue or ctrl-c to cancel adding it Executing: gpg --ignore-time-conflict --no-options --no-default-keyring --secret- keyring /tmp/tmp.uGcZHfsoNF --trustdb-name /etc/apt/trustdb.gpg --keyring /etc/apt/trusted.gpg --primary-keyring /etc/apt/trusted.gpg --keyserver hkp://keyserver.ubuntu.com:80/ --recv 4346FBB158F4022C896164EEE61380B28313A596 gpg: requesting key 8313A596 from hkp server keyserver.ubuntu.com gpg: key 8313A596: "Launchpad synergy+" not changed gpg: Total number processed: 1 gpg: unchanged: 1 zero@ghostrider:~$ sudo apt-get update Ign http://tr.archive.ubuntu.com precise InRelease Ign http://tr.archive.ubuntu.com precise-updates InRelease Ign http://tr.archive.ubuntu.com precise-backports InRelease Hit http://tr.archive.ubuntu.com precise Release.gpg Hit http://tr.archive.ubuntu.com precise-updates Release.gpg Hit http://tr.archive.ubuntu.com precise-backports Release.gpg Hit http://tr.archive.ubuntu.com precise Release Ign http://extras.ubuntu.com precise InRelease Ign http://security.ubuntu.com precise-security InRelease Hit http://tr.archive.ubuntu.com precise-updates Release Ign http://ppa.launchpad.net precise InRelease Hit http://tr.archive.ubuntu.com precise-backports Release Hit http://tr.archive.ubuntu.com precise/main Sources Hit http://tr.archive.ubuntu.com precise/restricted Sources Hit http://tr.archive.ubuntu.com precise/universe Sources Hit http://tr.archive.ubuntu.com precise/multiverse Sources Hit http://tr.archive.ubuntu.com precise/main i386 Packages Hit http://tr.archive.ubuntu.com precise/restricted i386 Packages Hit http://tr.archive.ubuntu.com precise/universe i386 Packages Hit http://extras.ubuntu.com precise Release.gpg Hit http://ppa.launchpad.net precise Release.gpg Hit http://security.ubuntu.com precise-security Release.gpg Hit http://tr.archive.ubuntu.com precise/multiverse i386 Packages Hit http://tr.archive.ubuntu.com precise/main TranslationIndex Hit http://tr.archive.ubuntu.com precise/multiverse TranslationIndex Hit http://tr.archive.ubuntu.com precise/restricted TranslationIndex Hit http://tr.archive.ubuntu.com precise/universe TranslationIndex Hit http://tr.archive.ubuntu.com precise-updates/main Sources Hit http://tr.archive.ubuntu.com precise-updates/restricted Sources Hit http://tr.archive.ubuntu.com precise-updates/universe Sources Hit http://tr.archive.ubuntu.com precise-updates/multiverse Sources Hit 
http://tr.archive.ubuntu.com precise-updates/main i386 Packages Hit http://extras.ubuntu.com precise Release Hit http://ppa.launchpad.net precise Release Hit http://security.ubuntu.com precise-security Release Hit http://tr.archive.ubuntu.com precise-updates/restricted i386 Packages Hit http://tr.archive.ubuntu.com precise-updates/universe i386 Packages Hit http://tr.archive.ubuntu.com precise-updates/multiverse i386 Packages Hit http://tr.archive.ubuntu.com precise-updates/main TranslationIndex Hit http://tr.archive.ubuntu.com precise-updates/multiverse TranslationIndex Hit http://tr.archive.ubuntu.com precise-updates/restricted TranslationIndex Hit http://tr.archive.ubuntu.com precise-updates/universe TranslationIndex Hit http://tr.archive.ubuntu.com precise-backports/main Sources Hit http://tr.archive.ubuntu.com precise-backports/restricted Sources Hit http://tr.archive.ubuntu.com precise-backports/universe Sources Hit http://tr.archive.ubuntu.com precise-backports/multiverse Sources Hit http://tr.archive.ubuntu.com precise-backports/main i386 Packages Hit http://tr.archive.ubuntu.com precise-backports/restricted i386 Packages Hit http://tr.archive.ubuntu.com precise-backports/universe i386 Packages Hit http://tr.archive.ubuntu.com precise-backports/multiverse i386 Packages Hit http://tr.archive.ubuntu.com precise-backports/main TranslationIndex Hit http://extras.ubuntu.com precise/main Sources Hit http://ppa.launchpad.net precise/main Sources Hit http://security.ubuntu.com precise-security/main Sources Hit http://tr.archive.ubuntu.com precise-backports/multiverse TranslationIndex Hit http://tr.archive.ubuntu.com precise-backports/restricted TranslationIndex Hit http://tr.archive.ubuntu.com precise-backports/universe TranslationIndex Hit http://tr.archive.ubuntu.com precise/main Translation-en Hit http://tr.archive.ubuntu.com precise/multiverse Translation-en Hit http://extras.ubuntu.com precise/main i386 Packages Ign http://extras.ubuntu.com precise/main TranslationIndex Hit http://tr.archive.ubuntu.com precise/restricted Translation-en Hit http://tr.archive.ubuntu.com precise/universe Translation-en Hit http://tr.archive.ubuntu.com precise-updates/main Translation-en Hit http://tr.archive.ubuntu.com precise-updates/multiverse Translation-en Hit http://tr.archive.ubuntu.com precise-updates/restricted Translation-en Hit http://ppa.launchpad.net precise/main i386 Packages Ign http://ppa.launchpad.net precise/main TranslationIndex Hit http://security.ubuntu.com precise-security/restricted Sources Hit http://security.ubuntu.com precise-security/universe Sources Hit http://security.ubuntu.com precise-security/multiverse Sources Hit http://security.ubuntu.com precise-security/main i386 Packages Hit http://security.ubuntu.com precise-security/restricted i386 Packages Hit http://tr.archive.ubuntu.com precise-updates/universe Translation-en Hit http://tr.archive.ubuntu.com precise-backports/main Translation-en Hit http://tr.archive.ubuntu.com precise-backports/multiverse Translation-en Hit http://tr.archive.ubuntu.com precise-backports/restricted Translation-en Hit http://tr.archive.ubuntu.com precise-backports/universe Translation-en Hit http://security.ubuntu.com precise-security/universe i386 Packages Hit http://security.ubuntu.com precise-security/multiverse i386 Packages Hit http://security.ubuntu.com precise-security/main TranslationIndex Hit http://security.ubuntu.com precise-security/multiverse TranslationIndex Hit http://security.ubuntu.com precise-security/restricted TranslationIndex 
Hit http://security.ubuntu.com precise-security/universe TranslationIndex Hit http://security.ubuntu.com precise-security/main Translation-en Hit http://security.ubuntu.com precise-security/multiverse Translation-en Hit http://security.ubuntu.com precise-security/restricted Translation-en Hit http://security.ubuntu.com precise-security/universe Translation-en Ign http://ppa.launchpad.net precise/main Translation-en_US Ign http://extras.ubuntu.com precise/main Translation-en_US Ign http://ppa.launchpad.net precise/main Translation-en Ign http://extras.ubuntu.com precise/main Translation-en Reading package lists... Done zero@ghostrider:~$ sudo apt-get install oracle-java7-installer Reading package lists... Done Building dependency tree Reading state information... Done oracle-java7-installer is already the newest version. 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 1 not fully installed or removed. After this operation, 0 B of additional disk space will be used. Do you want to continue [Y/n]? Y Setting up oracle-java7-installer (7u3-0~eugenesan~precise4) ... Downloading... --2012-06-11 23:57:11-- http://download.oracle.com/otn-pub/java/jdk/7u3-b04/jdk- 7u3-linux-i586.tar.gz Resolving download.oracle.com (download.oracle.com)... 64.209.77.18 Connecting to download.oracle.com (download.oracle.com)|64.209.77.18|:80... connected. HTTP request sent, awaiting response... 302 Moved Temporarily Location: https://edelivery.oracle.com/otn-pub/java/jdk/7u3-b04/jdk-7u3-linux-i586.tar.gz [following] --2012-06-11 23:57:11-- https://edelivery.oracle.com/otn-pub/java/jdk/7u3-b04/jdk-7u3-linux-i586.tar.gz Resolving edelivery.oracle.com (edelivery.oracle.com)... 95.101.122.174 Connecting to edelivery.oracle.com (edelivery.oracle.com)|95.101.122.174|:443... connected. HTTP request sent, awaiting response... 302 Moved Temporarily Location: http://download.oracle.com/errors/download-fail-1505220.html [following] --2012-06-11 23:57:12-- http://download.oracle.com/errors/download-fail-1505220.html Connecting to download.oracle.com (download.oracle.com)|64.209.77.18|:80... connected. HTTP request sent, awaiting response... 200 OK Length: 5307 (5.2K) [text/html] Saving to: `./jdk-7u3-linux-i586.tar.gz' 0K ..... 100% 976K=0.005s 2012-06-11 23:57:12 (976 KB/s) - `./jdk-7u3-linux-i586.tar.gz' saved [5307/5307] Download done. sha256sum mismatch jdk-7u3-linux-i586.tar.gz Oracle JDK 7 is NOT installed. dpkg: error processing oracle-java7-installer (--configure): subprocess installed post-installation script returned error exit status 1 No apport report written because MaxReports is reached already Errors were encountered while processing: oracle-java7-installer E: Sub-process /usr/bin/dpkg returned an error code (1) zero@ghostrider:~$


  • Translating with Google Translate without API and C# Code

    - by Rick Strahl
    Some time back I created a data base driven ASP.NET Resource Provider along with some tools that make it easy to edit ASP.NET resources interactively in a Web application. One of the small helper features of the interactive resource admin tool is the ability to do simple translations using both Google Translate and Babelfish. Here's what this looks like in the resource administration form: When a resource is displayed, the user can click a Translate button and it will show the current resource text and then lets you set the source and target languages to translate. The Go button fires the translation for both Google and Babelfish and displays them - pressing use then changes the language of the resource to the target language and sets the resource value to the newly translated value. It's a nice and quick way to get a quick translation going. Ch… Ch… Changes Originally, both implementations basically did some screen scraping of the interactive Web sites and retrieved translated text out of result HTML. Screen scraping is always kind of an iffy proposition as content can be changed easily, but surprisingly that code worked for many years without fail. Recently however, Google at least changed their input pages to use AJAX callbacks and the page updates no longer worked the same way. End result: The Google translate code was broken. Now, Google does have an official API that you can access, but the API is being deprecated and you actually need to have an API key. Since I have public samples that people can download the API key is an issue if I want people to have the samples work out of the box - the only way I could even do this is by sharing my API key (not allowed).   However, after a bit of spelunking and playing around with the public site however I found that Google's interactive translate page actually makes callbacks using plain public access without an API key. By intercepting some of those AJAX calls and calling them directly from code I was able to get translation back up and working with minimal fuss, by parsing out the JSON these AJAX calls return. I don't think this particular Warning: This is hacky code, but after a fair bit of testing I found this to work very well with all sorts of languages and accented and escaped text etc. as long as you stick to small blocks of translated text. I thought I'd share it in case anybody else had been relying on a screen scraping mechanism like I did and needed a non-API based replacement. Here's the code: /// <summary> /// Translates a string into another language using Google's translate API JSON calls. /// <seealso>Class TranslationServices</seealso> /// </summary> /// <param name="Text">Text to translate. 
Should be a single word or sentence.</param> /// <param name="FromCulture"> /// Two letter culture (en of en-us, fr of fr-ca, de of de-ch) /// </param> /// <param name="ToCulture"> /// Two letter culture (as for FromCulture) /// </param> public string TranslateGoogle(string text, string fromCulture, string toCulture) { fromCulture = fromCulture.ToLower(); toCulture = toCulture.ToLower(); // normalize the culture in case something like en-us was passed // retrieve only en since Google doesn't support sub-locales string[] tokens = fromCulture.Split('-'); if (tokens.Length > 1) fromCulture = tokens[0]; // normalize ToCulture tokens = toCulture.Split('-'); if (tokens.Length > 1) toCulture = tokens[0]; string url = string.Format(@"http://translate.google.com/translate_a/t?client=j&text={0}&hl=en&sl={1}&tl={2}", HttpUtility.UrlEncode(text),fromCulture,toCulture); // Retrieve Translation with HTTP GET call string html = null; try { WebClient web = new WebClient(); // MUST add a known browser user agent or else response encoding doen't return UTF-8 (WTF Google?) web.Headers.Add(HttpRequestHeader.UserAgent, "Mozilla/5.0"); web.Headers.Add(HttpRequestHeader.AcceptCharset, "UTF-8"); // Make sure we have response encoding to UTF-8 web.Encoding = Encoding.UTF8; html = web.DownloadString(url); } catch (Exception ex) { this.ErrorMessage = Westwind.Globalization.Resources.Resources.ConnectionFailed + ": " + ex.GetBaseException().Message; return null; } // Extract out trans":"...[Extracted]...","from the JSON string string result = Regex.Match(html, "trans\":(\".*?\"),\"", RegexOptions.IgnoreCase).Groups[1].Value; if (string.IsNullOrEmpty(result)) { this.ErrorMessage = Westwind.Globalization.Resources.Resources.InvalidSearchResult; return null; } //return WebUtils.DecodeJsString(result); // Result is a JavaScript string so we need to deserialize it properly JavaScriptSerializer ser = new JavaScriptSerializer(); return ser.Deserialize(result, typeof(string)) as string; } To use the code is straightforward enough - simply provide a string to translate and a pair of two letter source and target languages: string result = service.TranslateGoogle("Life is great and one is spoiled when it goes on and on and on", "en", "de"); TestContext.WriteLine(result); How it works The code to translate is fairly straightforward. It basically uses the URL I snagged from the Google Translate Web Page slightly changed to return a JSON result (&client=j) instead of the funky nested PHP style JSON array that the default returns. The JSON result returned looks like this: {"sentences":[{"trans":"Das Leben ist großartig und man wird verwöhnt, wenn es weiter und weiter und weiter geht","orig":"Life is great and one is spoiled when it goes on and on and on","translit":"","src_translit":""}],"src":"en","server_time":24} I use WebClient to make an HTTP GET call to retrieve the JSON data and strip out part of the full JSON response that contains the actual translated text. Since this is a JSON response I need to deserialize the JSON string in case it's encoded (for upper/lower ASCII chars or quotes etc.). Couple of odd things to note in this code: First note that a valid user agent string must be passed (or at least one starting with a common browser identification - I use Mozilla/5.0). Without this Google doesn't encode the result with UTF-8, but instead uses a ISO encoding that .NET can't easily decode. Google seems to ignore the character set header and use the user agent instead which is - odd to say the least. 
The other is that the code returns a full JSON response. Rather than use the full response and decode it into a custom type that matches Google's result object, I just strip out the translated text. Yeah I know that's hacky but avoids an extra type and firing up the JavaScript deserializer. My internal version uses a small DecodeJsString() method to decode Javascript without the overhead of a full JSON parser. It's obviously not rocket science but as mentioned above what's nice about it is that it works without an Google API key. I can't vouch on how many translates you can do before there are cut offs but in my limited testing running a few stress tests on a Web server under load I didn't run into any problems. Limitations There are some restrictions with this: It only works on single words or single sentences - multiple sentences (delimited by .) are cut off at the ".". There is also a length limitation which appears to happen at around 220 characters or so. While that may not sound  like much for typical word or phrase translations this this is plenty of length. Use with a grain of salt - Google seems to be trying to limit their exposure to usage of the Translate APIs so this code might break in the future, but for now at least it works. FWIW, I also found that Google's translation is not as good as Babelfish, especially for contextual content like sentences. Google is faster, but Babelfish tends to give better translations. This is why in my translation tool I show both Google and Babelfish values retrieved. You can check out the code for this in the West Wind West Wind Web Toolkit's TranslationService.cs file which contains both the Google and Babelfish translation code pieces. Ironically the Babelfish code has been working forever using screen scraping and continues to work just fine today. I think it's a good idea to have multiple translation providers in case one is down or changes its format, hence the dual display in my translation form above. I hope this has been helpful to some of you - I've actually had many small uses for this code in a number of applications and it's sweet to have a simple routine that performs these operations for me easily. Resources Live Localization Sample Localization Resource Provider Administration form that includes options to translate text using Google and Babelfish interactively. TranslationService.cs The full source code in the West Wind West Wind Web Toolkit's Globalization library that contains the translation code. © Rick Strahl, West Wind Technologies, 2005-2011Posted in CSharp  HTTP   Tweet (function() { var po = document.createElement('script'); po.type = 'text/javascript'; po.async = true; po.src = 'https://apis.google.com/js/plusone.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(po, s); })();
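
    If you would rather deserialize the whole JSON result shown above into a typed object instead of the regex extraction, a hedged sketch could look like this (it assumes the undocumented response keeps the sentences/trans shape):

        using System.Collections.Generic;
        using System.Web.Script.Serialization;

        // Shapes mirroring the JSON keys shown above ("sentences", "trans", "orig", "src").
        public class TranslatedSentence
        {
            public string trans { get; set; }
            public string orig { get; set; }
        }

        public class GoogleTranslateResponse
        {
            public List<TranslatedSentence> sentences { get; set; }
            public string src { get; set; }
        }

        public static class TranslateJson
        {
            // json is the raw string returned by the translate_a/t call above.
            public static string ExtractTranslation(string json)
            {
                var result = new JavaScriptSerializer()
                    .Deserialize<GoogleTranslateResponse>(json);
                return result.sentences[0].trans;
            }
        }

    As the article notes, this trades an extra pair of types for skipping the regex; either way the endpoint is undocumented, so treat the shape as an assumption.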


  • Rails/Node.js interaction

    - by lpvn
    I and my co-worker are developing a web application with rails and node.js and we can't reach a consensus regarding a particular architectural decision. Our setup is basically a rails server working with node.js and redis, when a client makes a http request to our rails API in some cases our rails application posts the response to a redis database and then node.js transmits the response via websocket. Our disagreement occurs in the following point: my co-worker thinks that using node.js to send data to clients is somewhat business logic and should be inside the model, so in the first code he wrote he used commands of broadcast in callbacks and other places of the model, he's convinced that the models are the best place for the interaction between rails and node. I on the other hand think that using node.js belongs to the runtime realm, my take is that the broadcast commands and other node.js interactions should be in the controller and should only be used in a model if passed through a well defined interface, just like the situation when a model needs to access the current user of a session. At this point we're tired of arguing over this same thing and our discussion consists in us repeating to ourselves our same opinions over and over. Could anyone, preferably with experience in the same setup, give us an unambiguous response saying which solution is more adequate and why it is?


  • Delete eth0 avahi from the ifconfig list

    - by sai
    Hello this is the response I get from ifconfig. Now I have two eth0 things being showed up. I need to delete the second one which says eth0:avahi. I posted my ifconfig's response on a site as I has problem using wired internet, and they suggested to remove the eth0 avahi, to get internet. But I am a newbie to linux networking and have no idea how to delete this. response for ifconfig eth0 Link encap:Ethernet HWaddr 18:a9:05:22:cd:f9 UP BROADCAST MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:0 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:0 (0.0 B) TX bytes:0 (0.0 B) Interrupt:28 Base address:0x4000 eth0:avahi Link encap:Ethernet HWaddr 18:a9:05:22:cd:f9 inet addr:169.254.10.43 Bcast:169.254.255.255 Mask:255.255.0.0 UP BROADCAST MULTICAST MTU:1500 Metric:1 Interrupt:28 Base address:0x4000 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:796 errors:0 dropped:0 overruns:0 frame:0 TX packets:796 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:64016 (64.0 KB) TX bytes:64016 (64.0 KB) wlan0 Link encap:Ethernet HWaddr 00:26:82:3c:ac:27 inet6 addr: fe80::226:82ff:fe3c:ac27/64 Scope:Link UP BROADCAST MULTICAST MTU:1500 Metric:1 RX packets:52142 errors:0 dropped:0 overruns:0 frame:0 TX packets:30404 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:60816983 (60.8 MB) TX bytes:4160159 (4.1 MB)


  • How do I implement a Bullet Physics CollisionObject that represents my cube-like terrain?

    - by Byte56
    I've successfully integrated the Bullet Physics library into my entity/component system. Entities can collide with each other. Now I need to enable them to collide with the terrain, which is finite and cube-like (think InfiniMiner or it's clone Minecraft). I only started using the Bullet Physics library yesterday, so perhaps I'm missing something obvious. So far I've extended the RigidBody class to override the checkCollisionWith(CollisionObject co) function. At the moment it's just a simple check of the origin, not using the other shape. I'll iterate on that later. For now it looks like this: @Override public boolean checkCollideWith(CollisionObject co) { Transform t = new Transform(); co.getWorldTransform(t); if(COLONY.SolidAtPoint(t.origin.x, t.origin.y,t.origin.z)){ return true; } return false; } This works great, as far as detecting when collisions happen. However, this doesn't handle the collision response. It seems that the default collision response is to move the colliding objects outside of each others shapes, possibly their AABBs. At the moment the shape of the terrain is just a box the size of the world. This means the entities that collide with the terrain just shoot away to outside that world size box. So it's clear that I either need to modify the collision response or I need to create a shape that conforms directly to the shape of the terrain. So which option is best and how do I go about implementing it? It should be noted that the terrain is dynamic and frequently modified by the player.


  • Stream.CopyTo() extension method

    - by DigiMortal
    In one of my applications I needed copy data from one stream to another. After playing with streams a little bit I wrote CopyTo() extension method to Stream class you can use to copy the contents of current stream to target stream. Here is my extension method. It is my working draft and it is possible that there must be some more checks before we can say this extension method is ready to be part of some API or class library. public static void CopyTo(this Stream fromStream, Stream toStream) {     if (fromStream == null)         throw new ArgumentNullException("fromStream");     if (toStream == null)         throw new ArgumentNullException("toStream");       var bytes = new byte[8092];     int dataRead;     while ((dataRead = fromStream.Read(bytes, 0, bytes.Length)) > 0)         toStream.Write(bytes, 0, dataRead); } And here is example how to use this extension method. using(var stream = response.GetResponseStream()) using(var ms = new MemoryStream()) {     stream.CopyTo(ms);       // Do something with copied data } I am using this code to copy data from HTTP response stream to memory stream because I have to use serializer that needs more than response stream is able to offer.
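
    For what it's worth, .NET 4.0 and later ship a built-in Stream.CopyTo(Stream) instance method with the same behaviour, so the extension is only needed when targeting earlier frameworks; the usage example above works unchanged with it:

        using (var stream = response.GetResponseStream())
        using (var ms = new MemoryStream())
        {
            stream.CopyTo(ms);          // built-in overload since .NET 4.0
            var data = ms.ToArray();    // copied bytes, ready for a deserializer
        }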


  • Why does "quickly share --ppa share" abort with a "can't create" error?

    - by desgua
    I can not figure out what I am doing wrong. The package builds ok with quickly package, I could submit it, but I can not update my ppa. Here is what I got: desgua@desguai7:~/quickly/sbk$ quickly share --ppa sbk Get Launchpad Settings Launchpad connection is ok ..........An error has occurred when creating debian packaging ERROR: can't create or update ubuntu package ERROR: share command failed Aborting Edit The name of my ppa was wrong, but even using ppa:desgua/sbk still doesn't work: desgua@desguai7:~/quickly/sbk$ quickly share --ppa ppa:desgua/sbk Get Launchpad Settings Traceback (most recent call last): File "/usr/share/quickly/templates/ubuntu-application/share.py", line 101, in launchpad = launchpadaccess.initialize_lpi() File "/usr/lib/python2.7/dist-packages/quickly/launchpadaccess.py", line 91, in initialize_lpi allow_access_levels=["WRITE_PRIVATE"]) File "/usr/lib/python2.7/dist-packages/launchpadlib/launchpad.py", line 539, in login_with credential_save_failed, version) File "/usr/lib/python2.7/dist-packages/launchpadlib/launchpad.py", line 359, in _authorize_token_and_login service_root, cache, timeout, proxy_info, version) File "/usr/lib/python2.7/dist-packages/launchpadlib/launchpad.py", line 198, in __init__ credentials, service_root, cache, timeout, proxy_info, version) File "/usr/lib/python2.7/dist-packages/lazr/restfulclient/resource.py", line 460, in __init__ self._wadl = self._browser.get_wadl_application(self._root_uri) File "/usr/lib/python2.7/dist-packages/lazr/restfulclient/_browser.py", line 299, in get_wadl_application response, content = self._request(url, media_type=wadl_type) File "/usr/lib/python2.7/dist-packages/lazr/restfulclient/_browser.py", line 242, in _request str(url), method=method, body=data, headers=headers) File "/usr/lib/python2.7/dist-packages/lazr/restfulclient/_browser.py", line 211, in _request_and_retry url, method=method, body=body, headers=headers) File "/usr/lib/python2.7/dist-packages/httplib2/__init__.py", line 1414, in request (response, new_content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey) File "/usr/lib/python2.7/dist-packages/launchpadlib/launchpad.py", line 126, in _request LaunchpadOAuthAwareHttp, self)._request(*args) File "/usr/lib/python2.7/dist-packages/lazr/restfulclient/_browser.py", line 130, in _request redirections, cachekey) File "/usr/lib/python2.7/dist-packages/httplib2/__init__.py", line 1196, in _request (response, content) = self._conn_request(conn, request_uri, method, body, headers) File "/usr/lib/python2.7/dist-packages/httplib2/__init__.py", line 1138, in _conn_request raise ServerNotFoundError("Unable to find the server at %s" % conn.host) httplib2.ServerNotFoundError: Unable to find the server at api.launchpad.net ERROR: share command failed Aborting Any ideas? How could I troubleshot this error?


  • Unit testing ASP.NET Web API controllers that rely on the UrlHelper

    - by cibrax
    UrlHelper is the class you can use in ASP.NET Web API to automatically infer links from the routing table without hardcoding anything. For example, the following code uses the helper to infer the location url for a new resource,public HttpResponseMessage Post(User model) { var response = Request.CreateResponse(HttpStatusCode.Created, user); var link = Url.Link("DefaultApi", new { id = id, controller = "Users" }); response.Headers.Location = new Uri(link); return response; } That code uses a previously defined route “DefaultApi”, which you might configure in the HttpConfiguration object (This is the route generated by default when you create a new Web API project). The problem with UrlHelper is that it requires from some initialization code before you can invoking it from a unit test (for testing the Post method in this example). If you don’t initialize the HttpConfiguration and Request instances associated to the controller from the unit test, it will fail miserably. After digging into the ASP.NET Web API source code a little bit, I could figure out what the requirements for using the UrlHelper are. It relies on the routing table configuration, and a few properties you need to add to the HttpRequestMessage. The following code illustrates what’s needed,var controller = new UserController(); controller.Configuration = new HttpConfiguration(); var route = controller.Configuration.Routes.MapHttpRoute( name: "DefaultApi", routeTemplate: "api/{controller}/{id}", defaults: new { id = RouteParameter.Optional } ); var routeData = new HttpRouteData(route, new HttpRouteValueDictionary { { "id", "1" }, { "controller", "Users" } } ); controller.Request = new HttpRequestMessage(HttpMethod.Post, "http://localhost:9091/"); controller.Request.Properties.Add(HttpPropertyKeys.HttpConfigurationKey, controller.Configuration); controller.Request.Properties.Add(HttpPropertyKeys.HttpRouteDataKey, routeData);  The HttpRouteData instance should be initialized with the route values you will use in the controller method (“id” and “controller” in this example). Once you have correctly setup all those properties, you shouldn’t have any problem to use the UrlHelper. There is no need to mock anything else. Enjoy!!.
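
    With the configuration, route data and request wired up as in the snippet above, a test can exercise the action directly and assert on the generated link. A sketch using MSTest follows; the User type and Post signature are assumed from the article's first snippet, and the exact id in the URL depends on what the action assigns:

        [TestMethod]
        public void Post_SetsLocationHeader()
        {
            // controller configured exactly as in the setup code above
            var response = controller.Post(new User());

            Assert.AreEqual(HttpStatusCode.Created, response.StatusCode);
            Assert.IsNotNull(response.Headers.Location);
            StringAssert.StartsWith(response.Headers.Location.ToString(),
                                    "http://localhost:9091/api/Users");
        }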


  • Top 10 solution documents for Weblogic Server J2EE Feb 2014 - May 2014

    - by jhpierce -Oracle
    The following are the top 10 documents linked to SRs as solutions, for Weblogic Server J2EE issues, from Feb 2014 thru May 2014. 1163020.1 How to configure Filtering class loader in weblogic.xml   To configure the Filtering Class Loader to specify a certain package is loaded from an application, add a prefer-application-packages descriptor element. 1276593.1 WLS - How to supress servlet/JSP version details In WebLogic HTTP response header The string "X-Powered-By: Servlet/2.4 JSP/2.0" is showing up in the servlet response header.How to stop Weblogic from including servlet/JSP version details in the x-powered-by HTTP response header. 1490080.1 WebLogic Server 12.1.1.0 in a Cluster Environment Throws NotSerializableException for CDI Applications at com.sun.jersey.server.impl.cdi.CDIExtension When running in clustered environment, server start-up is not clean when you have CDI applications deployed. 1268138.1 Sample TwoWay SSL implementation for JAX-WS Webservice!   In this sample provided the recipient checks for the initiator's public certificate. Note that the client certificate can be used for authentication. 1584779.1 Socket Leaks When Calling Web-Service Over SSL This is a known bug 16810786 1598617.1 Secure WebService call throwing CANNOT RESOLVE URL FOR PROTOCOL HTTP/HTTPS through web server(APACHE) plug-in.    1056121.1 How to Timeout Weblogic Webservice Client   How to timeout a WebService client with and without using Stubs. 1568638.1 When packaging Jersey JAX-RS libraries into webapp throws NoSuchMethodError()  When attempting to include custom Jersey implementation libraries in to web application in a OSB domain. 1118264.1 WLS 10.3: Intermittent XA error: XAResource.XAER_RMERR In WebLogic 10.3, a CMP EJB sometimes throws the exception.   1608951.1 How to get More Details About Error BEA-101215 Malformed Request. Request parsing failed Code: -1   Which was seen when accessing the application via loadbalancer?


  • How can I check Internet connectivity in a console?

    - by Ashfame
    Is there an easy way to check Internet connectivity from console? I am trying to play around in a shell script. One idea I seem is to wget --spider http://www.google.co.in/ and check the HTTP response code to interpret if the Internet connection is working fine. But I think there must be easy way without the need of checking a site that never crash ;) Edit: Seems like there can be a lot of factors which can be individually examined, good thing. My intention at the moment is to check if my blog is down. I have setup cron to check it every minute. For this, I am checking the HTTP response code of wget --spider to my blog. If its not 200, it notifies me (I believe this will be better than just pinging it, as the site may under be heavy load and may be timing out or respond very late). Now yesterday, there was some problem with my Internet. LAN was connected fine but just I couldn't access any site. So I keep on getting notifications as the script couldn't find 200 in the wget response. Now I want to make sure that it displays me notification when I do have internet connectivity. So, checking for DNS and LAN connectivity is a bit overkill for me as I don't have that much specific need to figure out what problem it is. So what do you suggest how I do it?


  • ASP.NET: Serializing and deserializing JSON objects

    - by DigiMortal
    ASP.NET offers very easy way to serialize objects to JSON format. Also it is easy to deserialize JSON objects using same library. In this posting I will show you how to serialize and deserialize JSON objects in ASP.NET. All required classes are located in System.Servicemodel.Web assembly. There is namespace called System.Runtime.Serialization.Json for JSON serializer. To serialize object to stream we can use the following code. var serializer = new DataContractJsonSerializer(typeof(MyClass)); serializer.WriteObject(myStream, myObject); To deserialize object from stream we can use the following code. CopyStream() is practically same as my Stream.CopyTo() extension method. var serializer = new DataContractJsonSerializer(typeof(MyClass));   using(var stream = response.GetResponseStream()) using (var ms = new MemoryStream()) {     CopyStream(stream, ms);     results = serializer.ReadObject(ms) as MyClass; } Why I copied data from response stream to memory stream? Point is simple – serializer uses some stream features that are not supported by response stream. Using memory stream we can deserialize object that came from web.
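
    A small self-contained round trip makes the required attributes explicit; the Person type here is purely illustrative:

        using System;
        using System.IO;
        using System.Runtime.Serialization;
        using System.Runtime.Serialization.Json;
        using System.Text;

        [DataContract]
        public class Person
        {
            [DataMember(Name = "name")] public string Name { get; set; }
            [DataMember(Name = "age")]  public int Age { get; set; }
        }

        class JsonRoundTrip
        {
            static void Main()
            {
                var serializer = new DataContractJsonSerializer(typeof(Person));

                // Serialize to a JSON string.
                string json;
                using (var ms = new MemoryStream())
                {
                    serializer.WriteObject(ms, new Person { Name = "Jane", Age = 30 });
                    json = Encoding.UTF8.GetString(ms.ToArray());
                }
                Console.WriteLine(json); // {"age":30,"name":"Jane"}

                // Deserialize it back from a seekable stream.
                using (var ms = new MemoryStream(Encoding.UTF8.GetBytes(json)))
                {
                    var person = (Person)serializer.ReadObject(ms);
                    Console.WriteLine(person.Name);
                }
            }
        }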


  • How to post JSON object to a URL without cURL in PHP? [closed]

    - by empyreanphoenix
    I have written send-SMS code that uses an available SMS API. I need to post a string in JSON format (the string is in $json) to a URL (specified in the $url variable), but I am getting the following error: String index out of range: -1. I understand that the error arises when requesting an index that isn't there, but I don't see how that applies in this case. Please help. Note: name1, name2 and name3 are the sender name, phone number and content respectively. Thanks. $params = array('http' => array( 'method' => 'POST', 'content' => $data, 'header' => "Authentication:$key" . "timeout:50000" . "ContentLength: " . strlen($data) . "\r\n" )); if ($optional_headers != null) { $params['http']['header'] = $optional_headers; } $ctx = stream_context_create($params); try { $fp = fopen($url, 'rb', false, $ctx); $response = stream_get_contents($fp); echo "response"; } catch (Exception $e) { echo 'Exception: ' . $e->getMessage(); echo "caught"; } echo "done"; return $response; }


  • Search option: jqGrid + PHP

    - by Felix Guerrero
    Hi, I'm trying to get work this code: $("#list").jqGrid({ url: 'docente.php', datatype: 'json', mtype: 'POST', postData: {c : "", n: "", a: "", d: "", t: "", e: "", f: "", g: $('#selectD').value(), p: "", call: "report"}, colNames: ['C&eacute;dula','Nombres', 'Apellidos', 'Direcci&oacute;n', 'E-mail','Tel&eacute;fono', 'Profesi&oacute;n'], colModel: [ { name:'rows.cedula', index: 'cedula', search:true, jsonmap: 'cedula', width: 150, align: 'left', sortable:true}, { name:'rows.nombre', index: 'nombre', jsonmap: 'nombre', width: 150, align: 'left'}, { name:'rows.apellido', index: 'apellido', jsonmap: 'apellido', width: 240, align: 'left'}, { name:'rows.direccion', index: 'direccion', jsonmap: 'direccion', width: 330, align: 'left'}, { name:'rows.email', index: 'email', jsonmap: 'email',width: 200, align: 'left'}, { name:'rows.telefono', index: 'telefono', jsonmap: 'telefono', width: 170, align: 'left'}, { name:'rows.descripcion_profesion', index: 'descripcion_profesion', jsonmap: 'descripcion_profesion',width: 120, align: 'left'}], pager: '#pager', rowNum: 8, autowidth: true, rowList: [10, 20], sortname: 'cedula', sortorder: 'asc', viewrecords: true, caption: 'Docentes', jsonReader : { root: "rows", repeatitems: false }, height: 350, width: 900 }); $("#list").jqGrid('navGrid','#pager',{search:true, searchtext: "buscar",edit:false,add:false,del:false}); The HMTL I'm using: <div id="reporte"> <table id="list"></table> <div id="pager"> </div> <div> And the PHP code: $total_pages = 0; $limit=10; $count = Profesor::CountAll(); if($count> 0) $total_pages= ceil($count/$limit); else $total_pages=0; $cnx = new PDO("mysql:host=localhost;dbname=appsms","root",""); $rows = array(); $strQuery="select cedula, nombre, apellido, direccion, telefono, email, descripcion_profesion from Profesor join Profesion on " . " (profesor.profesion_id = profesion.id) and (profesor.grado_id_grado = '" .$this->grado . "') order by cedula;"; $stmt = $cnx->prepare($strQuery); $stmt->execute(array("reporte")); $rows = $stmt->fetchAll(PDO::FETCH_ASSOC); $response->page = 1; $response->total= $total_pages; $response->records = sizeof($rows); $i=0; foreach($rows as $result){ $response->rows[$i] = $result; $i++; } echo json_encode($response); The JSON I get: {"page":1,"total":1,"records":2, "rows":[{"cedula":"v-108984103","nombre":"Soneia","apellido":"Rond\u00f3n Contreras","direccion":"El Rosal, calle 2-44","telefono":"04544247008457","email":"[email protected]","descripcion_profesion":"Docente"}, {"cedula":"v-135254741","nombre":"Judith","apellido":"Rangel M\u00e1rquez","direccion":"Sabaneta","telefono":"04247644152499","email":"","descripcion_profesion":"Docente"}]} It loads the query from PHP but the search option doesn't work neither the sorting function on "cedula"'s column. What I need to do?. Thanks in advance.


  • Asynchronous callback for network in Objective-C iPhone

    - by vodkhang
    I am working with network request - response in Objective-C. There is something with asynchronous model that I don't understand. In summary, I have a view that will show my statuses from 2 social networks: Twitter and Facebook. When I clicked refresh, it will call a model manager. That model manager will call 2 service helpers to request for latest items. When 2 service helpers receive data, it will pass back to model manager and this model will add all data into a sorted array. What I don't understand here is that : when response from social networks come back, how many threads will handle the response. From my understanding about multithreading and networking (in Java), there must have 2 threads handle 2 responses and those 2 threads will execute the code to add the responses to the array. So, it can have race condition and the program can go wrong right? Is it the correct working model of iphone objective-C? Or they do it in a different way that it will never have race condition and we don't have to care about locking, synchronize? Here is my example code: ModelManager.m - (void)updateMyItems:(NSArray *)items { self.helpers = [self authenticatedHelpersForAction:NCHelperActionGetMyItems]; for (id<NCHelper> helper in self.helpers) { [helper updateMyItems:items]; // NETWORK request here } } - (void)helper:(id <NCHelper>)helper didReturnItems:(NSArray *)items { [self helperDidFinishGettingMyItems:items callback:@selector(model:didGetMyItems:)]; break; } } // some private attributes int *_currentSocialNetworkItemsCount = 0; // to count the number of items of a social network - (void)helperDidFinishGettingMyItems:(NSArray *)items { for (Item *item in items) { _currentSocialNetworkItemsCount ++; } NSLog(@"count: %d", _currentSocialNetworkItemsCount); _currentSocialNetworkItemsCount = 0; } I want to ask if there is a case that the method helperDidFinishGettingMyItems is called concurrently. That means, for example, faceboook returns 10 items, twitter returns 10 items, will the output of count will ever be larger than 10? And if there is only one single thread, how can the thread finishes parsing 1 response and jump to the other response because, IMO, thread is only executed sequently, block of code by block of code


  • HTTP request, strange socket behavior

    - by hoodoos
    I expirience strange behavior when doing HTTP requests through sockets, here the request: POST https://test.com:443/service/XMLSelect HTTP/1.1 Content-Length: 10926 Host: test.com User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 1.0.3705) Authorization: Basic XXX SOAPAction: http://test.com/SubmitXml Later on there goes body of my request with given content length. After that I recive something like: HTTP/1.1 200 OK Server: Apache-Coyote/1.1 Content-Type: text/xml;charset=utf-8 Transfer-Encoding: chunked Date: Tue, 30 Mar 2010 06:13:52 GMT So everything seem to be fine here. I read all contents from network stream and successfuly recieve response. But my socket which I'm doing polling on switches it's modes like that: write ( i write headers and request here ) read ( after headers sent i begin to recieve response ) write ( STRANGE BEHAVIOUR HERE. WHY? here i send nothing really ) read ( here it switches to read back again ) last two steps can repeat several times. So I want to ask what leads for socket's mode change? And in this case it's not a big problem, but when I use gzip compression in my request ( no idea how it's related ) and ask server to send gzipped response to me like this: POST https://test.com:443/service/XMLSelect HTTP/1.1 Content-Length: 1076 Accept-Encoding: gzip Content-Encoding: gzip Host: test.com User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 1.0.3705) Authorization: Basic XXX SOAPAction: http://test.com/SubmitXml I recieve response like that: HTTP/1.1 200 OK Server: Apache-Coyote/1.1 Content-Encoding: gzip Content-Type: text/xml;charset=utf-8 Transfer-Encoding: chunked Date: Tue, 30 Mar 2010 07:26:33 GMT 2000 ? I recieve a chunk size and GZIP header, it's all okay. And here's what is happening with my poor little socket meanwhile: write ( i write headers and request here ) read ( after headers sent i begin to recieve response ) write ( STRANGE BEHAVIOUR HERE. And it finally sits here forever waiting for me to send something! But if i refer to HTTP I don't have to send anything more! ) What can it be related to? What it wants me to send? Is it remote web server's problem or do I miss something? PS All actual service references and login/passwords replaced with fake ones :)
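
    One thing to keep in mind is that a connected socket with free send-buffer space is reported writable by select/poll almost all the time, so the write/read mode flips by themselves do not necessarily mean the server is waiting for more data. For comparison, the managed HTTP stack handles chunked transfer and gzip decompression itself; a hedged C# sketch of the same POST (URL, SOAPAction and payload are placeholders, authorization omitted):

        using System;
        using System.IO;
        using System.Net;
        using System.Text;

        class GzipPostSketch
        {
            static void Main()
            {
                var request = (HttpWebRequest)WebRequest.Create("https://test.com/service/XMLSelect");
                request.Method = "POST";
                request.ContentType = "text/xml";
                request.Headers["SOAPAction"] = "http://test.com/SubmitXml";
                // Sends Accept-Encoding and transparently decompresses the chunked, gzipped reply.
                request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

                byte[] body = Encoding.UTF8.GetBytes("<soap>...</soap>"); // placeholder payload
                request.ContentLength = body.Length;
                using (Stream s = request.GetRequestStream())
                    s.Write(body, 0, body.Length);

                using (var response = (HttpWebResponse)request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                    Console.WriteLine(reader.ReadToEnd());
            }
        }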

    Read the article

  • Telnet connection using c#

    - by alejandrobog
    Our office currently uses telnet to query an external server. The procedure is something like this. Connect - telnet open 128........ 25000 Query - we paste the query and then hit alt + 019 Response - we receive the response as text in the telnet window. So I'm trying to automate these queries using a C# app. My code is the following. First the connection (no exceptions): SocketClient = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp); String szIPSelected = txtIPAddress.Text; String szPort = txtPort.Text; int alPort = System.Convert.ToInt16(szPort, 10); System.Net.IPAddress remoteIPAddress = System.Net.IPAddress.Parse(szIPSelected); System.Net.IPEndPoint remoteEndPoint = new System.Net.IPEndPoint(remoteIPAddress, alPort); SocketClient.Connect(remoteEndPoint); Then I send the query (no exceptions): string data = "some query"; byte[] byData = System.Text.Encoding.ASCII.GetBytes(data); SocketClient.Send(byData); Then I try to receive the response: byte[] buffer = new byte[10]; Receive(SocketClient, buffer, 0, buffer.Length, 10000); string str = Encoding.ASCII.GetString(buffer, 0, buffer.Length); txtDataRx.Text = str; public static void Receive(Socket socket, byte[] buffer, int offset, int size, int timeout) { int startTickCount = Environment.TickCount; int received = 0; // how many bytes have already been received do { if (Environment.TickCount > startTickCount + timeout) throw new Exception("Timeout."); try { received += socket.Receive(buffer, offset + received, size - received, SocketFlags.None); } catch (SocketException ex) { if (ex.SocketErrorCode == SocketError.WouldBlock || ex.SocketErrorCode == SocketError.IOPending || ex.SocketErrorCode == SocketError.NoBufferSpaceAvailable) { // socket buffer is probably empty, wait and try again Thread.Sleep(30); } else throw; // any serious error occurred } } while (received < size); } Every time I try to receive the response I get "an existing connection was forcibly closed by the remote host". If I open telnet and send the same query I get a response right away. Any ideas or suggestions?
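    In the manual session the query is only submitted after Alt+019, which in a Windows telnet window typically types the character with code 19 (0x13, DC3); one assumption worth testing is that the server treats that byte as an end-of-query marker and drops clients that never send it. The Java sketch below shows the general send-terminator-then-read pattern; the address, port, and terminator byte are assumptions for illustration, not values confirmed by the question.

        import java.io.ByteArrayOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.Socket;
        import java.nio.charset.StandardCharsets;

        public class RawQueryClient {
            public static void main(String[] args) throws IOException {
                final byte TERMINATOR = 0x13;              // ASCII DC3, what Alt+019 usually produces
                try (Socket socket = new Socket("192.0.2.1", 25000)) {   // placeholder address and port
                    socket.setSoTimeout(10000);            // a read timeout instead of blocking forever
                    OutputStream out = socket.getOutputStream();
                    out.write("some query".getBytes(StandardCharsets.US_ASCII));
                    out.write(TERMINATOR);                 // tell the server the query is complete
                    out.flush();

                    // Read until the server closes the connection (or the timeout fires).
                    ByteArrayOutputStream response = new ByteArrayOutputStream();
                    byte[] buf = new byte[4096];
                    int n;
                    InputStream in = socket.getInputStream();
                    while ((n = in.read(buf)) != -1) {
                        response.write(buf, 0, n);
                    }
                    System.out.println(response.toString("US-ASCII"));
                }
            }
        }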

    Read the article

  • Codeigniter or PHP Amazon API help

    - by faya
    Hello, I have a problem searching the Amazon web service using PHP in my CodeIgniter app. I get an "InvalidParameter: timestamp is not in ISO-8601 format" response from the server. But I don't think the timestamp is the problem, because I have tried comparing it with the date format given by http://associates-amazon.s3.amazonaws.com/signed-requests/helper/index.html and it seems fine. Could anyone help? Here is my code: $private_key = 'XXXXXXXXXXXXXXXX'; // Took out real secret key $method = "GET"; $host = "ecs.amazonaws.com"; $uri = "/onca/xml"; $timeStamp = gmdate("Y-m-d\TH:i:s.000\Z"); $timeStamp = str_replace(":", "%3A", $timeStamp); $params["AWSAccesskeyId"] = "XXXXXXXXXXXX"; // Took out real access key $params["ItemPage"] = $item_page; $params["Keywords"] = $keywords; $params["ResponseGroup"] = "Medium2%2525COffers"; $params["SearchIndex"] = "Books"; $params["Operation"] = "ItemSearch"; $params["Service"] = "AWSECommerceService"; $params["Timestamp"] = $timeStamp; $params["Version"] = "2009-03-31"; ksort($params); $canonicalized_query = array(); foreach ($params as $param => $value) { $param = str_replace("%7E", "~", rawurlencode($param)); $value = str_replace("%7E", "~", rawurlencode($value)); $canonicalized_query[] = $param . "=" . $value; } $canonicalized_query = implode("&", $canonicalized_query); $string_to_sign = $method."\n\r".$host."\n\r".$uri."\n\r".$canonicalized_query; $signature = base64_encode(hash_hmac("sha256", $string_to_sign, $private_key, True)); $signature = str_replace("%7E", "~", rawurlencode($signature)); $request = "http://".$host.$uri."?".$canonicalized_query."&Signature=".$signature; $response = @file_get_contents($request); if ($response === False) { return "response fail"; } else { $parsed_xml = simplexml_load_string($response); if ($parsed_xml === False) { return "parse fail"; } else { return $parsed_xml; } } P.S. - Personally I think that something is wrong in the generation of the signature from the $string_to_sign when hashing it.
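    For comparison, here is a minimal Java sketch of the same signing idea: build an ISO-8601 timestamp in UTC, URL-encode each parameter exactly once, and compute an HMAC-SHA256 over a canonical string whose lines are joined by plain "\n". It is my own illustration with a placeholder key and a trimmed-down query string, not a drop-in replacement for the PHP above.

        import java.text.SimpleDateFormat;
        import java.util.Base64;
        import java.util.Date;
        import java.util.TimeZone;
        import javax.crypto.Mac;
        import javax.crypto.spec.SecretKeySpec;

        public class RequestSigner {
            public static void main(String[] args) throws Exception {
                String secretKey = "PLACEHOLDER_SECRET";          // placeholder, not a real key

                // ISO-8601 timestamp in UTC, e.g. 2010-03-30T06:13:52.000Z
                SimpleDateFormat iso = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
                iso.setTimeZone(TimeZone.getTimeZone("UTC"));
                String timestamp = iso.format(new Date());

                // The canonical query is URL-encoded exactly once and sorted by parameter name.
                String canonicalQuery = "Operation=ItemSearch&Service=AWSECommerceService&Timestamp="
                        + java.net.URLEncoder.encode(timestamp, "UTF-8");

                // String to sign: method, host, and path plus query, joined by plain "\n".
                String stringToSign = "GET\n" + "ecs.amazonaws.com\n" + "/onca/xml\n" + canonicalQuery;

                Mac mac = Mac.getInstance("HmacSHA256");
                mac.init(new SecretKeySpec(secretKey.getBytes("UTF-8"), "HmacSHA256"));
                String signature = Base64.getEncoder()
                        .encodeToString(mac.doFinal(stringToSign.getBytes("UTF-8")));

                System.out.println("Timestamp: " + timestamp);
                System.out.println("Signature: " + signature);
            }
        }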

    Read the article

  • Error while rendering .rdl file into pdf format

    - by Arka Chatterjee
    Hi, I am generating reports using SQL Server Reporting Services. I have generated a report and put the .rdl report file on the E: drive. Now, when I try to render the .rdl report file into PDF format, I get the exception "An error occurred during local report processing." The stack trace is as follows: " at Microsoft.Reporting.WebForms.LocalReport.InternalRender(String format, Boolean allowInternalRenderers, String deviceInfo, CreateAndRegisterStream createStreamCallback, Warning[]& warnings)\r\n at Microsoft.Reporting.WebForms.LocalReport.InternalRender(String format, Boolean allowInternalRenderers, String deviceInfo, String& mimeType, String& encoding, String& fileNameExtension, String[]& streams, Warning[]& warnings)\r\n at Microsoft.Reporting.WebForms.LocalReport.Render(String format, String deviceInfo, String& mimeType, String& encoding, String& fileNameExtension, String[]& streams, Warning[]& warnings)\r\n at SaltlakeSoft.APEX2.Controllers.TestPageController.RenderReport() in E:\Documents and Settings\Administrator\Desktop\afetbuild15thmayapex2\apex2\Controllers\TestPageController.cs:line 1626\r\n at lambda_method(ExecutionScope , ControllerBase , Object[] )\r\n at System.Web.Mvc.ActionMethodDispatcher.<>c__DisplayClass1.<>b__0(ControllerBase controller, Object[] parameters)\r\n at System.Web.Mvc.ActionMethodDispatcher.Execute(ControllerBase controller, Object[] parameters)\r\n at System.Web.Mvc.ReflectedActionDescriptor.Execute(ControllerContext controllerContext, IDictionary`2 parameters)\r\n at System.Web.Mvc.ControllerActionInvoker.InvokeActionMethod(ControllerContext controllerContext, ActionDescriptor actionDescriptor, IDictionary`2 parameters)\r\n at System.Web.Mvc.ControllerActionInvoker.<>c__DisplayClassa.<>b__7()\r\n at System.Web.Mvc.ControllerActionInvoker.InvokeActionMethodFilter(IActionFilter filter, ActionExecutingContext preContext, Func`1 continuation)" I am using the following code: LocalReport report = new LocalReport(); report.ReportPath = @"E:\Report1.rdl"; List employeeCollection = empRepository.FindAll().ToList(); ReportDataSource reportDataSource = new ReportDataSource("dataSource1", employeeCollection); report.DataSources.Clear(); report.DataSources.Add(reportDataSource); report.Refresh(); string reportType = "PDF"; string mimeType; string encoding; string fileNameExtension; string deviceInfo = "<DeviceInfo>" + "<OutputFormat>PDF</OutputFormat>" + "<PageWidth>8.5in</PageWidth>" + "<PageHeight>11in</PageHeight>" + "<MarginTop>0.5in</MarginTop>" + "<MarginLeft>1in</MarginLeft>" + "<MarginRight>1in</MarginRight>" + "<MarginBottom>0.5in</MarginBottom>" + "</DeviceInfo>"; Warning[] warnings; string[] streams; byte[] renderedBytes; renderedBytes = report.Render(reportType, deviceInfo, out mimeType, out encoding, out fileNameExtension, out streams, out warnings); Response.Clear(); Response.ContentType = mimeType; Response.AddHeader("content-disposition", "attachment; filename=foo." + fileNameExtension); Response.BinaryWrite(renderedBytes); Response.End(); Please help me. Thanks in advance - Arka
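    The last few lines implement a generic pattern: hand a rendered byte array back to the browser as a downloadable PDF. As a point of comparison only, here is a minimal Java servlet sketch of that same pattern; the names are illustrative, it does not touch the ReportViewer API, and it says nothing about the local report processing error itself.

        import java.io.IOException;
        import java.io.OutputStream;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class PdfDownloadServlet extends HttpServlet {
            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
                byte[] renderedBytes = renderReport();          // stand-in for LocalReport.Render(...)

                resp.reset();                                   // comparable to Response.Clear()
                resp.setContentType("application/pdf");
                resp.setHeader("Content-Disposition", "attachment; filename=foo.pdf");
                resp.setContentLength(renderedBytes.length);

                OutputStream out = resp.getOutputStream();
                out.write(renderedBytes);                       // comparable to Response.BinaryWrite(...)
                out.flush();
            }

            private byte[] renderReport() {
                // Placeholder: a real implementation would produce the PDF bytes here.
                return new byte[0];
            }
        }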

    Read the article

  • Logging in with WebFinger and OpenID

    - by Ryan
    I would like to apologize in advance for the ugly formatting. In order to talk about the problem, I need to post a bunch of URLs, but the excessive URLs and my lack of reputation make StackOverflow think I could be a spammer. Any instance of 'ht~tp' is supposed to be 'http', '{dot}' is supposed to be '.', and '{colon}' is supposed to be ':'. Also, my lack of reputation has prevented me from tagging my question with 'webfinger' and 'google-profiles'. On to my question: I am messing around with WebFinger and trying to create a small Rails app that enables a user to log in using nothing but their WebFinger account. I can successfully finger myself, and I get back an XRD file with the following snippet: <Link rel="ht~tp://specs{dot}openid{dot}net/auth/2.0/provider" href="ht~tp://www{dot}google{dot}com/profiles/{redacted}"/> Which, to me, reads, "I have an OpenID 2.0 login at the URL: ht~tp://www{dot}google{dot}com/profiles/{redacted}". But when I try to use that URL to log in, I get the following error: OpenID::DiscoveryFailure (Failed to fetch identity URL ht~tp://www{dot}google{dot}com/profiles/{redacted} : Error encountered in redirect from ht~tp://www{dot}google{dot}com/profiles/{redacted}: Error fetching /profiles/{Redacted}: Connection refused - connect(2)): When I replace the profile URL with 'ht~tps://www{dot}google{dot}com/accounts/o8/id', the login works perfectly. Here is the code that I am using (I'm using RedFinger as a plugin, and JanRain's ruby-openid, installed without the gem): require "openid" require 'openid/store/filesystem.rb' class SessionsController < ApplicationController def new @session = Session.new #render a textbox requesting a webfinger address, and a submit button end def create ####################### # # Pay attention to this section right here # ####################### #use given webfinger address to retrieve openid login finger = Redfinger.finger(params[:session][:webfinger_address]) openid_url = finger.open_id.first.to_s #openid_url is now: ht~tp://www{dot}google{dot}com/profiles/{redacted} #Get needed info about the acquired OpenID login file_store = OpenID::Store::Filesystem.new("./noncedir/") consumer = OpenID::Consumer.new(session,file_store) response = consumer.begin(openid_url) #ERROR HAPPENS HERE #send user to OpenID login for verification redirect_to response.redirect_url('ht~tp://localhost{colon}3000/','ht~tp://localhost{colon}3000/sessions/complete') end def complete #interpret return parameters file_store = OpenID::Store::Filesystem.new("./noncedir/") consumer = OpenID::Consumer.new(session,file_store) response = consumer.complete params case response.status when OpenID::SUCCESS session[:openid] = response.identity_url #redirect somewhere here end end end Is it possible for me to use the URL I received from my WebFinger to log in with OpenID?
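    The failing step is OpenID discovery: the library fetches the claimed identifier URL, follows any redirects, and looks for the provider endpoint (an openid2.provider link in the HTML head, or an XRDS document). Here is a rough Java sketch of that fetch-and-extract step, purely to illustrate what the Ruby library is attempting; the URL is a placeholder and the regex only covers the HTML-link form of discovery.

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class OpenIdDiscovery {
            public static void main(String[] args) throws Exception {
                String claimedId = "https://www.example.com/profiles/someone";  // placeholder identifier

                HttpURLConnection conn = (HttpURLConnection) new URL(claimedId).openConnection();
                conn.setInstanceFollowRedirects(true);      // discovery has to survive redirects
                conn.setConnectTimeout(5000);
                conn.setReadTimeout(5000);

                StringBuilder body = new StringBuilder();
                try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        body.append(line).append('\n');
                    }
                }

                // Look for the OpenID 2.0 provider endpoint advertised in the HTML head.
                Matcher m = Pattern.compile("rel=[\"']openid2\\.provider[\"']\\s+href=[\"']([^\"']+)[\"']")
                                   .matcher(body);
                if (m.find()) {
                    System.out.println("Provider endpoint: " + m.group(1));
                } else {
                    System.out.println("No openid2.provider link found; an XRDS lookup would be the next step.");
                }
            }
        }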

    Read the article

  • DataTable Filter mystery

    - by user283897
    Hi, could you please help me find the reason for the mystery I've found? In the code below, I create a DataTable and filter it. When I use filter1, everything works as expected. When I use filter2, everything works as expected only if the SubsectionAmount variable is less than 10. As soon as I set SubsectionAmount=10, the dr2 array returns Nothing. I can't find what is wrong. Here is the code: Imports System.Data Partial Class FilterTest Inherits System.Web.UI.Page Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load Call FilterTable() End Sub Sub FilterTable() Dim dtSubsections As New DataTable Dim SectionID As Integer, SubsectionID As Integer Dim SubsectionAmount As Integer Dim filter1 As String, filter2 As String Dim rowID As Integer Dim dr1() As DataRow, dr2() As DataRow With dtSubsections .Columns.Add("Section") .Columns.Add("Subsection") .Columns.Add("FieldString") SectionID = 1 SubsectionAmount = 10 '9 For SubsectionID = 1 To SubsectionAmount .Rows.Add(SectionID, SubsectionID, "abcd" & CStr(SubsectionID)) Next SubsectionID For rowID = 0 To .Rows.Count - 1 Response.Write(.Rows(rowID).Item(0).ToString & " " _ & .Rows(rowID).Item(1).ToString & " " _ & .Rows(rowID).Item(2).ToString & "<BR>") Next SubsectionID = 1 filter1 = "Section=" & SectionID & " AND " & "Subsection=" & SubsectionID filter2 = "Section=" & SectionID & " AND " & "Subsection=" & SubsectionID + 1 dr1 = .Select(filter1) dr2 = .Select(filter2) Response.Write(dr1.Length & "<BR>") Response.Write(dr2.Length & "<BR>") If dr1.Length > 0 Then Response.Write(dr1(0).Item("FieldString").ToString & "<BR>") End If If dr2.Length > 0 Then Response.Write(dr2(0).Item("FieldString").ToString & "<BR>") End If End With End Sub End Class
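    One thing that changes exactly at ten rows is that "10" is the first value whose string form no longer compares like its numeric form, and columns added to a DataTable without an explicit type are stored as strings. Whether that is the cause here is only a guess, but the general pitfall is easy to demonstrate; the Java sketch below illustrates string versus numeric comparison and is not .NET code.

        import java.util.Arrays;

        public class StringVersusNumericCompare {
            public static void main(String[] args) {
                String[] asText = {"1", "2", "9", "10"};
                Integer[] asNumbers = {1, 2, 9, 10};

                Arrays.sort(asText);        // lexicographic order: "1", "10", "2", "9"
                Arrays.sort(asNumbers);     // numeric order: 1, 2, 9, 10

                System.out.println(Arrays.toString(asText));
                System.out.println(Arrays.toString(asNumbers));

                // Equality still matches ("2".equals("2")), but ordering and range
                // comparisons diverge as soon as two-digit values appear.
                System.out.println("\"10\".compareTo(\"2\") = " + "10".compareTo("2")); // negative
            }
        }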

    Read the article

  • No exception, no error, but I still don't receive the JSON object from my HTTP POST

    - by user2978538
    My source code: final Thread t = new Thread() { public void run() { Looper.prepare(); HttpClient client = new DefaultHttpClient(); HttpConnectionParams.setConnectionTimeout(client.getParams(), 10000); HttpResponse response; JSONObject obj = new JSONObject(); try { HttpPost post = new HttpPost("http://pc.dyndns-office.com/mobile.asp"); obj.put("Model", ReadIn1); obj.put("Product", ReadIn2); obj.put("Manufacturer", ReadIn3); obj.put("RELEASE", ReadIn4); obj.put("SERIAL", ReadIn5); obj.put("ID", ReadIn6); obj.put("ANDROID_ID", ReadIn7); obj.put("Language", ReadIn8); obj.put("BOARD", ReadIn9); obj.put("BOOTLOADER", ReadIn10); obj.put("BRAND", ReadIn11); obj.put("CPU_API", ReadIn12); obj.put("DISPLAY", ReadIn13); obj.put("FINGERPRINT", ReadIn14); obj.put("HARDWARE", ReadIn15); obj.put("UUID", ReadIn16); StringEntity se = new StringEntity(obj.toString()); se.setContentType(new BasicHeader(HTTP.CONTENT_TYPE, "application/json")); post.setEntity(se); post.setHeader("host", "http://pc.dyndns-office.com/mobile.asp"); response = client.execute(post); if (response != null) { InputStream in = response.getEntity().getContent(); } } catch (Exception e) { e.printStackTrace(); } Looper.loop(); } }; t.start(); } } I want to send a JSON object to a website. As far as I can see, I set the header, but I still get this exception. Can someone help me? (I'm using Android Studio) Edit: I don't get any exceptions anymore, but I still do not receive the JSON packet. When I call the website manually, I get a log file entry. Does anyone know what's wrong? Edit 2: When I debug, I get the response "HTTP/1.1 400 Bad Request". I'm sure it's not a permission problem. Any ideas?
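    An HTTP 400 means the server considered the request itself malformed; one detail that stands out above is that the "host" header is set to a full URL instead of just the host name, though that is only a guess. For comparison, here is a minimal plain-Java sketch of posting a JSON body and reading back both the status code and the response body; the payload is a placeholder and the URL is the one from the question.

        import java.io.BufferedReader;
        import java.io.InputStream;
        import java.io.InputStreamReader;
        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;

        public class JsonPost {
            public static void main(String[] args) throws Exception {
                String json = "{\"Model\":\"test\",\"Product\":\"test\"}";   // placeholder payload

                HttpURLConnection conn = (HttpURLConnection)
                        new URL("http://pc.dyndns-office.com/mobile.asp").openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);
                conn.setConnectTimeout(10000);
                conn.setRequestProperty("Content-Type", "application/json; charset=utf-8");
                // Note: no manual Host header; the connection derives it from the URL.

                try (OutputStream out = conn.getOutputStream()) {
                    out.write(json.getBytes(StandardCharsets.UTF_8));
                }

                int status = conn.getResponseCode();               // e.g. 200 or 400
                InputStream body = status < 400 ? conn.getInputStream() : conn.getErrorStream();
                StringBuilder sb = new StringBuilder();
                if (body != null) {
                    try (BufferedReader reader = new BufferedReader(
                            new InputStreamReader(body, StandardCharsets.UTF_8))) {
                        String line;
                        while ((line = reader.readLine()) != null) {
                            sb.append(line);
                        }
                    }
                }
                System.out.println("HTTP " + status + ": " + sb);
            }
        }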

    Read the article

< Previous Page | 67 68 69 70 71 72 73 74 75 76 77 78  | Next Page >