Search Results

Search found 18568 results on 743 pages for 'url shortener'.

Page 208/743 | < Previous Page | 204 205 206 207 208 209 210 211 212 213 214 215  | Next Page >

  • Help redirecting a page to a page with adverts for 5 seconds, and then redirecting to another page.

    - by XcodeDev
    Hey, I am trying to redirect a page to another page, and that was working successfully. However, I am now trying to redirect the first page to another page with adverts. This page will then redirect to another page after five seconds. I am trying to do that by doing this: <?php include('ads.php'); ?> <?php sleep(2); $url = $_GET['url']; header("Location: ".$url.""); exit; ?> However, it is showing the advert in ads.php perfectly, but it is not redirecting after five seconds. I am receiving this error in my web browser: Warning: Cannot modify header information - headers already sent by (output started at /home/nucleusi/public_html/adverts/ads.php:1) in /home/nucleusi/public_html/adverts/index.php on line 7 A typical link I would be redirecting to would be this: http://nucleusiphone.com/adverts/index.php/?url=http%3A%2F%2Fitunes.apple.com%2Fmx%2Falbum%2Fstill-got-the-blues%2Fid14135178%3Fi%3D14135158 Thanks in advance. PS. I don't know any PHP, so any code helps!
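
    A minimal PHP sketch of one workaround, not the asker's code: because ads.php has already produced output, header('Location: ...') can no longer be sent, so the five-second redirect is handed to the browser instead. The http/https check on the url parameter is an added assumption.

        <?php
        // Sketch only: ads.php has already sent output, so a Location header would fail.
        // Let the browser do the delayed redirect with JavaScript instead.
        include('ads.php');

        $url = isset($_GET['url']) ? $_GET['url'] : '';
        // Assumption: only plain http/https targets should be allowed through.
        if (!preg_match('#^https?://#i', $url)) {
            exit('Invalid redirect target');
        }

        // Redirect after five seconds, matching the delay described in the question.
        echo '<script type="text/javascript">
        setTimeout(function () { window.location.href = ' . json_encode($url) . '; }, 5000);
        </script>';
        ?>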

    Read the article

  • Java: Using Gson incorrectly? (null pointer exception)

    - by Rosarch
    I'm trying to get the hits of a google search from a string of the query. public class Utils { public static int googleHits(String query) throws IOException { String googleAjax = "http://ajax.googleapis.com/ajax/services/search/web?v=1.0&q="; String json = stringOfUrl(googleAjax + query); JsonObject hits = new Gson().fromJson(json, JsonObject.class); return hits.get("estimatedResultCount").getAsInt(); } public static String stringOfUrl(String addr) throws IOException { ByteArrayOutputStream output = new ByteArrayOutputStream(); URL url = new URL(addr); IOUtils.copy(url.openStream(), output); return output.toString(); } public static void main(String[] args) throws URISyntaxException, IOException { System.out.println(googleHits("odp")); } } The following exception is thrown: Exception in thread "main" java.lang.NullPointerException at odp.compling.Utils.googleHits(Utils.java:48) at odp.compling.Utils.main(Utils.java:59) What am I doing incorrectly? Should I be defining an entire object for the Json return? That seems excessive, given that all I want to do is get one value. For reference: the returned JSON structure.

    Read the article

  • Nginx - Treats PHP as binary

    - by Think Floyd
    We are running Nginx+FastCgi as the backend for our Drupal site. Everything seems to work fine, except for this one URL: http:///sites/all/modules/tinymce/tinymce/jscripts/tiny_mce/plugins/smimage/index.php (We use the TinyMCE module in Drupal, and the URL above is invoked when a user tries to upload an image.) When we were using Apache, everything was working fine. However, nginx treats the above URL as binary and tries to download it. (We've verified that the file pointed to by the URL is a valid PHP file.) Any idea what could be wrong here? I think it's something to do with the nginx configuration, but I'm not entirely sure what. Any help is greatly appreciated. Config: Here's the snippet from the nginx configuration file: root /var/www/; index index.php; if (!-e $request_filename) { rewrite ^/(.*)$ /index.php?q=$1 last; } error_page 404 index.php; location ~* \.(engine|inc|info|install|module|profile|po|sh|.*sql|theme|tpl(\.php)?|xtmpl)$|^(code-style\.pl|Entries.*|Repository|Root|Tag|Template)$ { deny all; } location ~* ^.+\.(jpg|jpeg|gif|png|ico)$ { access_log off; expires 7d; } location ~* ^.+\.(css|js)$ { access_log off; expires 7d; } location ~ .php$ { include /etc/nginx/fcgi.conf; fastcgi_pass 127.0.0.1:8888; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; fastcgi_param QUERY_STRING $query_string; fastcgi_param REQUEST_METHOD $request_method; fastcgi_param CONTENT_TYPE $content_type; fastcgi_param CONTENT_LENGTH $content_length; } location ~ /\.ht { deny all; }

    Read the article

  • How to pass a value to a controller?

    - by rajesh
    When I try to pass a URL value to a controller action, the action is not getting the required value. I'm sending the value like this: function value(url,id) { alert(url); document.getElementById('rating').innerHTML=id; var params = 'artist='+id; alert(params); // var newurl='http://localhost/songs_full/public/eslresult/ratesong/userid/1/id/27'; var myAjax = new Ajax.Request(newurl,{method: 'post',parameters:params,onComplete: loadResponse}); //var myAjax = new Ajax.Request(url,{method:'POST',parameters:params,onComplete: load}); //alert(myAjax); } function load(http) { alert('success'); } and in the controller I have: public function ratesongAction() { $user=$_POST['rating']; echo $user; $post= $this->getRequest()->getPost(); //echo $post; $ratesongid= $this->_getParam('id'); } But I am still not getting the result. I am using Zend Framework.
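
    A minimal sketch of the server side in PHP (Zend Framework 1 style, as the question suggests): the JavaScript above posts a parameter named 'artist', so the action has to read that same name rather than $_POST['rating']. The controller class name here is assumed for illustration.

        <?php
        // Sketch only: read the POSTed 'artist' value and the 'id' route parameter,
        // using the parameter names that appear in the question.
        class EslresultController extends Zend_Controller_Action
        {
            public function ratesongAction()
            {
                // Second argument is a default returned when the field is missing.
                $artistId   = $this->getRequest()->getPost('artist', null);
                // Route segments such as /userid/1/id/27 come through _getParam().
                $ratesongId = $this->_getParam('id');

                // Echo something so the Ajax onComplete handler has a response to show.
                echo 'artist=' . $artistId . ' id=' . $ratesongId;
            }
        }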

    Read the article

  • Post values and upload an image to a PHP server in Android

    - by lawat
    I am trying to upload an image from an Android phone to a PHP server with additional values; the method is POST. My PHP file looks like this: if($_POST['val1']){ if($_POST['val2']){ if($_FILE['image']){ ...... } } }else{ echo "Value not found"; } What I am doing is: URL url=new URL("http://www/......../myfile.php"); HttpURLConnection con=(HttpURLConnection) url.openConnection(); con.setDoInput(true); con.setDoOutput(true); con.setUseCaches(false); con.setRequestMethod("POST");//Enable http POST con.setRequestProperty("Connection", "Keep-Alive"); con.setRequestProperty("Content-Type", "multipart/form-data;boundary="+"****"); connection.setRequestProperty("uploaded_file", imagefilePath); DataOutputStream ostream = new DataOutputStream( con.getOutputStream()); String res=("Content-Disposition: form-data; name=\"val1\""+val1+"****"+ "Content-Disposition: form-data; name=\"val2\""+val2+"****" "Content-Disposition: form-data; name=\"image\";filename=\"" + imagefilePath +"\""+"****"); outputStream.writeBytes(res); My actual problem is that the values are not being posted, so the first if condition is false and the else section is executed, i.e. it gives "Value not found". Please help me.
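
    For reference, a hedged PHP sketch of the receiving side only, assuming the Android client actually sends a well-formed multipart/form-data request: the superglobal for uploads is $_FILES, not $_FILE, and the part names must match what the client writes ('val1', 'val2', 'image' as in the question). The uploads/ target directory is an assumption.

        <?php
        // Sketch only: check the posted values and the uploaded file, then move it
        // out of PHP's temporary directory.
        if (isset($_POST['val1'], $_POST['val2']) && isset($_FILES['image'])) {
            $target = 'uploads/' . basename($_FILES['image']['name']); // assumed directory
            if (move_uploaded_file($_FILES['image']['tmp_name'], $target)) {
                echo 'OK';
            } else {
                echo 'Upload failed';
            }
        } else {
            echo 'Value not found';
        }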

    Read the article

  • Twitter streaming API instead of search API

    - by user1711576
    I am using Twitter's search API to view all the tweets that use a particular hashtag. However, I want to use the streaming API, so that I only get recent ones and can then store them. <?php global $total, $hashtag; $hashtag = $_POST['hash']; $total = 0; function getTweets($hash_tag, $page) { global $total, $hashtag; $url = 'http://search.twitter.com/search.json?q='.urlencode($hash_tag).'&'; $url .= 'page='.$page; $ch = curl_init($url); curl_setopt ($ch, CURLOPT_RETURNTRANSFER, TRUE); $json = curl_exec ($ch); curl_close ($ch); echo "<pre>"; $json_decode = json_decode($json); print_r($json_decode->results); $json_decode = json_decode($json); $total += count($json_decode->results); if($json_decode->next_page){ $temp = explode("&",$json_decode->next_page); $p = explode("=",$temp[0]); getTweets($hashtag,$p[1]); } } getTweets($hashtag,1); echo $total; ?> The above code is what I have been using to search for the tweets I want. What do I need to do to change it so I can stream the tweets instead? I know I would have to use the stream URL https://api.twitter.com/1.1/search/tweets.json, but what I need to change after that is where I don't know what to do. Obviously, I know I'll need to write the database SQL, but I want to just capture the stream first and view it. How would I do this? Is the code I have been using not any good for just capturing the stream?
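
    Note that https://api.twitter.com/1.1/search/tweets.json is the v1.1 REST search endpoint rather than the streaming API proper (streaming uses long-lived connections to endpoints such as statuses/filter). A hedged PHP sketch of calling the v1.1 search endpoint with application-only auth is below; the bearer token is a placeholder that would have to be obtained separately, and the hashtag value is illustrative.

        <?php
        // Sketch only: v1.1 requires OAuth; this uses an application-only bearer token.
        $bearerToken = 'YOUR_BEARER_TOKEN'; // placeholder
        $url = 'https://api.twitter.com/1.1/search/tweets.json?q=' . urlencode('#yourhashtag');

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Bearer ' . $bearerToken));
        $json = curl_exec($ch);
        curl_close($ch);

        $data = json_decode($json);
        // v1.1 returns the tweets under 'statuses' instead of 'results'.
        foreach ($data->statuses as $tweet) {
            echo $tweet->text . "\n";
        }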

    Read the article

  • Android: Gzip/Http supported by default?

    - by OneWorld
    I am using the code shown below to get data from our server where Gzip is turned on. Does my code already support Gzip (maybe this is already done by Android and not by my Java program), or do I have to add/change something? How can I check that it's using Gzip? In my opinion the download is kinda slow. private static InputStream OpenHttpConnection(String urlString) throws IOException { InputStream in = null; int response = -1; URL url = new URL(urlString); URLConnection conn = url.openConnection(); if (!(conn instanceof HttpURLConnection)) throw new IOException("Not an HTTP connection"); try { HttpURLConnection httpConn = (HttpURLConnection) conn; httpConn.setAllowUserInteraction(false); httpConn.setInstanceFollowRedirects(true); httpConn.setRequestMethod("GET"); httpConn.connect(); response = httpConn.getResponseCode(); if (response == HttpURLConnection.HTTP_OK) { in = httpConn.getInputStream(); if(in == null) throw new IOException("No data"); } } catch (Exception ex) { throw new IOException("Error connecting"); } return in; }

    Read the article

  • Hover button background-image change not working

    - by Brae
    Background: I'm trying to make a menu; you hover over a button and the background image shifts its Y position to give you the 'over' effect for each button. CSS: .menu {float: left;} .menu span {display: none;} .menu a {display: block; margin: 10px; width: 200px; height: 50px;} #itemA {background: url('images/btnA.png') no-repeat 0 0;} #itemB {background: url('images/btnB.png') no-repeat 0 0;} #itemC {background: url('images/btnC.png') no-repeat 0 0;} #itemD {background: url('images/btnD.png') no-repeat 0 0;} HTML: <div class="menu"> <a id="itemA" href="#"><span>AAAAA</span></a> <a id="itemB" href="#"><span>BBBBB</span></a> <a id="itemC" href="#"><span>CCCCC</span></a> <a id="itemD" href="#"><span>DDDDD</span></a> </div> Problem: why do none of these work? /*** - test A a.menu:link {background-position: 0 -51px;} a.menu:visited {display: block; margin: 10px; width: 200px; height: 32px;} ***/ /*** - test B a.menu:hover {background-position: 0 -51px;} ***/ /*** - test C .menu a:hover {background-position: 0 -51px;} ***/ /*** - test D .menu:hover a {background-position: 0 -51px;} ***/ /*** - test E a:hover .menu {background-position: 0 -51px;} ***/ Notes: images are 200x101px (each button state is 50px high with a 1px separator). Question: why do none of these work, should any of them work, and is there a solution I'm missing? Thanks in advance!

    Read the article

  • Time out when creating a site collection

    - by Daeko
    I am trying to create a site collection programmatically. It has worked for about 6 months, but after the servers have been updated (various patches) it doesn’t work anymore (we have 3 servers: 1 development, 1 test, 1 production). It is still working in my development environment which hasn’t been updated, but not on the two others. I don’t receive any error messages, it just hangs at the code that is supposed to add the site collection (see code below). I am using Windows Server 2003 R2 and SharePoint 2007 (version 12.0.0.6421). It doesn’t give me any errors, it just hangs until Internet Explorer comes back with a “request timed out” response. If I try and debug the code, the code just stops there and nothing happens. No error messages or anything. public static string CreateSPAccountSite(string siteName) { string url = ""; SPSecurity.RunWithElevatedPrivileges(delegate() { SPWeb web = SPContext.Current.Web; using (SPSite siteCollectionOuter = new SPSite(web.Site.ID)) { SPWebApplication webApp = siteCollectionOuter.WebApplication; SPSiteCollection siteCollection = webApp.Sites; SPSite site = siteCollection.Add("sites/" + siteName, siteName, "Auto generated Site collection.", 1033, "STS#0", siteCollectionOuter.Owner.LoginName, siteCollectionOuter.Owner.Name, siteCollectionOuter.Owner.Email); //Hangs here site.PortalName = "Portal"; site.PortalUrl = mainUrl; // https://www.ourdomain.net url = site.Url; } }); return url; //Should be "https://www.ourdomain.net/sites/siteName" }

    Read the article

  • Response Redirect - Open Link in New Window

    - by bacis09
    First, I've taken the time to review this question, which seems to be the most similar; however, the solution that seems to have been selected will not work in my scenario. Not to mention I worry about some of the comments claiming it to be brittle or an inadequate solution. http://stackoverflow.com/questions/104601/asp-net-response-redirect-to-new-window -We have an XML document which basically contains all of the information for a side menu. -We have numerous URLs which are stored in a constants class. -One of the elements in a string of XML (we'll call it label) is used to determine if the menu item is created as a LinkButton or a Label. -Links use a custom user control that is used as standard for all links across the application (which is why the suggestion on the similar thread doesn't work - I don't want all links to open in a new window, just one). -One of the elements in a string of XML (we'll call it function) is used in a switch statement to generate our links using Response.Redirect. It may look something like this: switch (function) { case goto 1: string url; if (user_group == 1) { url = Constants.CONSTANT1 } else { url = Constants.CONSTANT2 } Response.Redirect(url) case goto 2: ...... default: ...... break; } Given this scenario, I'm trying to find the best way to quickly open a new window when a specific case in this switch statement is met. Can it be done with Response.Redirect (this seems to be arguable - people say no it can't, yet other people say they have made it work)? If not, what alternative can work here?

    Read the article

  • JavaScript conflict on my home page

    - by naveen
    <script type="text/javascript" src="jquery-1.js"></script> <script type="text/javascript" src="mootools.js"></script> <script type="text/javascript" src="slideshow.js"></script> <script type="text/javascript"> //<![CDATA[ window.addEvent('domready', function(){ var data = { '1.jpg': { caption: 'Volcano Asención in Ometepe, Nicaragua.' }, '2.jpg': { caption: 'A Ceibu tree.' }, '3.jpg': { caption: 'The view from Volcano Maderas.' }, '4.jpg': { caption: 'Beer and ice cream.' } }; var myShow = new Slideshow('show', data, {controller: true, height: 400, hu: 'images/', thumbnails: true, width: 500}); }); //]]> </script> <script type="text/javascript"> $(document).ready(function() { //slides the element with class "menu_body" when paragraph with class "menu_head" is clicked $("#firstpane p.menu_head").click(function() { $(this).css({backgroundImage:"url(down.png)"}).next("div.menu_body").slideToggle(300).siblings("div.menu_body").slideUp("slow"); $(this).siblings().css({backgroundImage:"url(left.png)"}); }); //slides the element with class "menu_body" when mouse is over the paragraph $("#secondpane p.menu_head").mouseover(function() { $(this).css({backgroundImage:"url(down.png)"}).next("div.menu_body").slideDown(500).siblings("div.menu_body").slideUp("slow"); $(this).siblings().css({backgroundImage:"url(left.png)"}); }); }); </script> <!--[if lt IE 7]> <script type="text/javascript" src="unitpngfix.js"></script> <![endif]-->

    Read the article

  • Cross domain iframe content load detection

    - by fpb
    I have a rather interesting problem. I have a parent page that will create a modal jQuery dialog with an iframe contained within the dialog. The iframe will be populated with content from a 3rd party domain. My issue is that I need to create some dialog-level JavaScript that can detect if the content of the iframe loaded successfully and, if it hasn't within a 5-second time frame, close the dialog and return the user to the parent page. I have researched numerous solutions and only two are of any true value. 1. Get the remote site to include a JavaScript line of document.domain = 'our-domain.com'. 2. Use a URL fragment hack, but again I would need to request that the remote site modify the URL by appending '#some_value' to the end of the URL, and my dialog window would have to poll the URL until it either sees it or times out. Are these honestly the only options I have to work with? Is there not a simpler way to just detect this? I have been researching whether there's a way to poll for HTTP response errors, but this still remains confined to the same restrictions. Any help would be immensely appreciated. Thanks

    Read the article

  • Pipelining a string in PowerShell

    - by ChvyVele
    I'm trying to make a simple PowerShell function to have a Linux-style ssh command. Such as: ssh username@url I'm using plink to do this, and this is the function I have written: function ssh { param($usernameAndServer) $myArray = $usernameAndServer.Split("@") $myArray[0] | C:\plink.exe -ssh $myArray[1] } If entered correctly by the user, $myArray[0] is the username and $myArray[1] is the URL. Thus, it connects to the URL and when you're prompted for a username, the username is streamed in using the pipeline. Everything works perfectly, except the pipeline keeps feeding the username ($myArray[0]) and it is entered as the password over and over. Example: PS C:\Users\Mike> ssh xxxxx@yyyyy login as: xxxxx@yyyyy's password: Access denied xxxxx@yyyyy's password: Access denied xxxxx@yyyyy's password: Access denied xxxxx@yyyyy's password: Access denied xxxxx@yyyy's password: Access denied xxxxx@yyyyy's password: FATAL ERROR: Server sent disconnect message type 2 (protocol error): "Too many authentication failures for xxxxx" Where the username has been substituted with xxxxx and the URL has been substituted with yyyyy. Basically, I need to find out how to stop the script from piping in the username ($myArray[0]) after it has been entered once. Any ideas? I've looked all over the internet for a solution and haven't found anything.

    Read the article

  • Can I keep git from pushing the master branch to all remotes by default?

    - by Curtis
    I have a local git repository with two remotes ('origin' is for internal development, and 'other' is for an external contractor to use). The master branch in my local repository tracks the master in 'origin', which is correct. I also have a branch 'external' which tracks the master in 'other'. The problem I have now is that my master branch ALSO wants to push to the master in 'other', which is an issue. Is there any way I can specify that the local master should NOT push to other/master? I've already tried updating my .git/config file to include: [branch "master"] remote = origin merge = refs/heads/master [branch "external"] remote = other merge = refs/heads/master [push] default = upstream But remote show still shows that my master is pushing to both remotes: toko:engine cmlacy$ git remote show origin Password: * remote origin Fetch URL: <REPO LOCATION> Push URL: <REPO LOCATION> HEAD branch: master Remote branches: master tracked refresh-hook tracked Local branch configured for 'git pull': master merges with remote master Local ref configured for 'git push': master pushes to master (up to date) Those are all correct. toko:engine cmlacy$ git remote show other Password: * remote other Fetch URL: <REPO LOCATION> Push URL: <REPO LOCATION> HEAD branch: master Remote branch: master tracked Local branch configured for 'git pull': external merges with remote master Local ref configured for 'git push': master pushes to master (local out of date) That last section is the problem. 'external' should merge with other/master, but master should NEVER push to other/master. It's never going to work.

    Read the article

  • How to implement a search page which shows results on the same page?

    - by Andrew
    I'm using ASP.NET MVC 2 for the first time on a project at work and am feeling like a bit of a noob. I have a page with a customer search control/partial view. The control is a textbox and a button. You enter a customer id into the textbox and hit search. The page then "refreshes" and shows the customer details on the same page. In other words, the customer details appear below the customer search control. This is so that if the customer isn't the right one, the user can search again without hitting back in the browser. Or, perhaps they mistyped the customer id and need to try again. I want the URL to look like this: /Customer/Search/1 Obviously, this follows the default route in the project. Now, if I type the URL above directly into my browser, it works fine. However, when I then use the search control on that page to search for say customer 2, the page refreshes with the correct customer details but the URL does not change! It stays as /Customer/Search/1 When I want it to be /Customer/Search/2 How can I get it to change to the correct URL? I am only using the default route in Global.asax. My Search method looks like this: <AcceptVerbs(HttpVerbs.Get)> _ Function Search(ByVal id As String) As ActionResult Dim customer As Customer = New CustomerRepository().GetById(id) Return View("SearchResult", customer) End Function

    Read the article

  • How to get the earliest checkout-able revision info from subversion?

    - by zhongshu
    I want to check an svn URL and get the earliest revision, then check it out. I don't want to use HEAD because I will compare the earliest revision to others, so I use "svn info" to get the "Last Changed Rev" for the URL like this: D:\Project>svn info svn://.../branches/.../path Path: ... URL: svn://.../branches/.../path Repository Root: svn://yt-file-srv/ Repository UUID: 9ed5ffd7-7585-a14e-96b2-4aab7121bb21 Revision: 2400 Node Kind: directory Last Changed Author: xxx Last Changed Rev: 2396 Last Changed Date: 2010-03-12 09:31:52 +0800 But I found that revision 2396 is not checkout-able, because this path is in a branch copied from trunk, and 2396 is the revision modified in the trunk. So when I use svn checkout -r 2396, I get a working copy for the path in the trunk, and then I cannot check in to the branch. D:\Project>svn checkout svn://.../branches/.../path -r 2396 workcopy ..... ..... D:\Project>svn info workcopy Path: workcopy URL: svn://.../trunk/.../path Repository Root: svn://yt-file-srv/ Repository UUID: 9ed5ffd7-7585-a14e-96b2-4aab7121bb21 Revision: 2396 Node Kind: directory Schedule: normal Last Changed Author: xxx Last Changed Rev: 2396 Last Changed Date: 2010-03-12 09:31:52 +0800 So, my question is how to get a checkout-able revision for the branch path; for this example, I want to get 2397 (because 2397 is the revision at which the copy occurred). And I know "svn log" can get the info, but the "svn log" output may be very long and parsing it will be more difficult than "svn info". I just want to know which revision is the earliest checkout-able revision for the path.

    Read the article

  • Troubleshooting multiple GET variables In PHP

    - by V413HAV
    This may be a very simple question, but I don't know what I am doing wrong here... To explain clearly, I've set up a really simple example below: <ul> <li><a href="test.php?link1=true">Link 1</a></li> <li><a href="test.php?link2=true">Link 2</a></li> <li><a href="test.php?link3=true">Link 3</a></li> </ul> <?php if(isset($_GET['link1'])) { if(($_GET['link1']) == 'true') { echo 'This Is Link 1'; } } if(isset($_GET['link2'])) { if(($_GET['link2']) == 'true') { echo 'This Is Link 2'; } } if(isset($_GET['link3'])) { if(($_GET['link3']) == 'true') { echo 'This Is Link 3'; } } ?> This is a test.php page. Here I've set 3 different arguments for $_GET and show contents accordingly. Now everything works perfectly; the only thing I am not understanding is how to block this kind of URL. Say a user clicks on link 1; the URL will be: http://localhost/test.php?link1=true and the output of this URL is 'This Is Link 1'. Now if I change this URL to: http://localhost/test.php?link3=true&link2=true&link1=true the output I get is 'This Is Link 1This Is Link 2This Is Link 3'. Now this is OK here, but it's very annoying if someone types this and sees the forms one below the other; is there any way I can stop this tampering?
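
    One common way to prevent the stacked output, sketched below in PHP as an assumption rather than a fix from the thread: use a single parameter plus a whitelist, so only one block of content can ever be shown per request.

        <?php
        // Sketch only: a single 'link' parameter looked up in a whitelist.
        $pages = array(
            'link1' => 'This Is Link 1',
            'link2' => 'This Is Link 2',
            'link3' => 'This Is Link 3',
        );

        $link = isset($_GET['link']) ? $_GET['link'] : '';
        if (isset($pages[$link])) {
            echo $pages[$link]; // at most one block is ever printed
        }
        ?>
        <ul>
            <li><a href="test.php?link=link1">Link 1</a></li>
            <li><a href="test.php?link=link2">Link 2</a></li>
            <li><a href="test.php?link=link3">Link 3</a></li>
        </ul>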

    Read the article

  • Make a PHP GET request from a PHP script and exit

    - by Abs
    Hello all, is there something simpler than the following? I am trying to make a GET request to a PHP script and then exit the current script. I think this is a job for cURL, but is there something simpler, as I don't really want to worry about enabling the cURL PHP extension? In addition, will the below start the PHP script and then just come back and not wait for it to finish? //set GET variables $url = 'http://domain.com/get-post.php'; $fields = array( 'lname'=>urlencode($last_name), 'fname'=>urlencode($first_name) ); //url-ify the data for the GET foreach($fields as $key=>$value) { $fields_string .= $key.'='.$value.'&'; } rtrim($fields_string,'&'); //open connection $ch = curl_init(); //set the url, number of POST vars, POST data curl_setopt($ch,CURLOPT_URL,$url); curl_setopt($ch,CURLOPT_GET,count($fields)); curl_setopt($ch,CURLOPT_GETFIELDS,$fields_string); //execute GET $result = curl_exec($ch); //close connection curl_close($ch); I want to run the other script, which contains functions, when a condition is met, so a simple include won't work, as the if condition would wrap around the functions, right? Please note, I am on a Windows machine and the code I am writing will only be used on a Windows OS. Thanks all for any help and advice
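
    A hedged PHP sketch of the cURL GET (note that CURLOPT_GET and CURLOPT_GETFIELDS are not real cURL options; for a GET the parameters simply go in the URL). The short timeout is a crude way of not waiting, and the remote script may be cut off if it needs the connection to stay open, so treat this as an illustration rather than a guaranteed fire-and-forget.

        <?php
        // Sketch only: build the query string with http_build_query and give up
        // after roughly two seconds instead of waiting for the response.
        $url    = 'http://domain.com/get-post.php';
        $fields = array('lname' => $last_name, 'fname' => $first_name); // values as in the question

        $ch = curl_init($url . '?' . http_build_query($fields));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't print the response
        curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // stop waiting after ~2 seconds
        curl_exec($ch);
        curl_close($ch);
        exit;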

    Read the article

  • Perl: Value of response code in HTTP::Request

    - by lola
    Hi all. So, I am writing code to get a document from the internet. The document size is around 200 KB. This is the code: #!/usr/local/bin/perl -w use strict; use LWP::UserAgent; my $ua = LWP::UserAgent->new; my $url = "SOME URL"; my $req = HTTP::Request->new(GET => $url); my $res = $ua->request($req); if($res->is_success){ print $res->content ."\n"; } else{ print "Error: " . $res->status_line; } Now, the only problem is I can't mention what the URL is. However, the output is: "Error: 500 read timeout". When I checked the link externally, the data is being downloaded in under 5 seconds. I even changed the timeout to 1000s, but it still didn't work. How should I go about finding more information related to the response? The size of the file (around 200KB) is also not too great to warrant a read timeout. The server is also not a busy one, and didn't give a problem whenever I checked the link in the browser. Thanks.

    Read the article

  • How to make chrome.tabs.update work with a content script

    - by user1673772
    I am working on a little extension for Google Chrome. I want to create a new tab, go to the URL "sample"+i+".com", launch a content script on this URL, update the current tab to "sample"+(i+1)+".com", and launch the same script. I looked at the Q&A available on Stack Overflow and I googled it, but I didn't find a solution that works. This is my actual background.js code (it works): it creates two tabs (i=21 and i=22) and loads my content script for each URL. When I try to do a chrome.tabs.update, Chrome directly launches a tab with i = 22 (and the script works only once): function extraction(tab) { for (var i =21; i<23;i++) { chrome.storage.sync.set({'extraction' : 1}, function() {}); //for my content script chrome.tabs.create({url: "http://example.com/"+i+".html"}, function() {}); } } chrome.browserAction.onClicked.addListener(function(tab) {extraction(tab);}); If anyone can help me: the content script and manifest.json are not the problem. I want to do that 15000 times, so I can't do otherwise. Thank you.

    Read the article

  • Sending an HTTP POST request through the android emulator doesn't work

    - by Sotirios Delimanolis
    I'm running a tomcat servlet on my local machine and an Android emulator with an app that makes a post request to the servlet. The code for the POST is below (without exceptions and the like): String strUrl = "http://10.0.2.2:8080/DeviceDiscoveryServer/server/devices/"; Device device = Device.getUniqueInstance(); urlParameters += URLEncoder.encode("user", "UTF-8") + "=" + URLEncoder.encode(device.getUser(), "UTF-8"); urlParameters += "&" + URLEncoder.encode("port", "UTF-8") + "=" + URLEncoder.encode(new Integer(Device.PORT).toString(), "UTF-8"); urlParameters += "&" + URLEncoder.encode("address", "UTF-8") + "=" + URLEncoder.encode(device.getAddress().getHostAddress(), "UTF-8"); URL url = new URL(strUrl); HttpURLConnection connection = (HttpURLConnection) url.openConnection(); connection.setDoOutput(true); connection.setRequestMethod("POST"); OutputStreamWriter wr = new OutputStreamWriter(connection.getOutputStream()); wr.write(urlParameters); wr.flush(); wr.close(); Whenever this code is executed, the servlet isn't called. However if I change the type of the request to 'GET' and don't write anything to the outputstream, the servlet gets called and everything works fine. Am I just not making the POST correctly or is there some other error?

    Read the article

  • How to check availability in AJAX?

    - by udaya
    Hi, this is what I have in my view page: <td width=""><input type="text" name="txtUserName" id="txtUserName" /></td> <td><input type="button" name="CheckUsername" id="CheckUsername" value="Check Availablity" onclick="Check_User_Name();"/></td> On click of the button, the Check_User_Name function in my ajax.js loads. This is the Check_User_Name function: function Check_User_Name(source) { var UserName = document.getElementById('txtUserName').value; if(window.ActiveXObject) User_Name = new ActiveXObject("Microsoft.XMLHTTP"); else if(window.XMLHttpRequest) User_Name = new XMLHttpRequest(); var URL = newURL+"ssit/system/application/views/ssitAjax.php"; URL = URL +"?CheckUsername="+UserName; User_Name.onreadystatechange = User_Name_Fun; User_Name.open("GET",URL,true); User_Name.send(null); } function User_Name_Fun() { document.getElementById('User_div').innerHTML=User_Name.responseText; } Then I have the value in the echoed username, and $result has all the user names. How can I check the availability of the username from here? if(($_GET['CheckUsername']!="") || (isset($_GET['CheckUsername']))) { echo $UserName = $_GET['CheckUsername'];//echo username $_SESSION['state'] = $State; $queryres = "SELECT dUser_name FROM tbl_login WHERE dIsDelete='0'"; $result = mysql_query($queryres,$cn) or die("Selection Query Failed !!!");
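
    A possible shape for the server side, sketched in PHP with the same mysql_* calls and table/column names as the question (so treat the details as assumptions): check the posted username directly in the WHERE clause and echo a short message, which User_Name_Fun() then drops into the 'User_div' element.

        <?php
        // Sketch only: look for the exact username and report whether it is taken.
        if (isset($_GET['CheckUsername']) && $_GET['CheckUsername'] != '') {
            $userName = mysql_real_escape_string($_GET['CheckUsername'], $cn);
            $query    = "SELECT dUser_name FROM tbl_login
                         WHERE dUser_name = '$userName' AND dIsDelete = '0'";
            $result   = mysql_query($query, $cn) or die("Selection Query Failed !!!");

            echo (mysql_num_rows($result) > 0)
                ? 'Username is already taken'
                : 'Username is available';
        }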

    Read the article

  • How can I dynamically call the named route in a :partial in rails?

    - by Angela
    I have the following partial. It can be called three different times in a view, as follows: <%= render :partial => "contact_event", :collection => @contacts, :locals => {:event => email} %> Second time: <%= render :partial => "contact_event", :collection => @contacts, :locals => {:event => call} %> Third time: <%= render :partial => "contact_event", :collection => @contacts, :locals => {:event => letter} %> In each instance, call, email, and letter refer to a specific instance of a model Call, Email, or Letter. Here is what I tried to do and conceptually what I'd like to do: assign the route based on the class name that has been passed to the :event from the :partial. What I did was build the name of what the actual URL helper should be. The 'text' of it is correct, but Rails doesn't seem to recognize it as a named route. <% url = "skip_contact_#{event.class.name.tableize.singularize}_url" %> <%= link_to_remote "Skip #{url} Remote", :url => skip_contact_email_url(contact_event, event), :update => "update-area-#{contact_event.id}-#{event.id}" %> My challenge: skip_contact_email_url only works when the event refers to an email. How can I dynamically define skip_contact_email_url to be skip_contact_letter_url if the local variable is letter? Even better, how can I have a single named route that would do the appropriate action?

    Read the article

  • jQuery and array of objects

    - by sepoto
    $(document).ready(function () { output = ""; $.ajax({ url: 'getevents.php', data: { ufirstname: 'ufirstname' }, type: 'post', success: function (output) { alert(output); var date = new Date(); var d = date.getDate(); var m = date.getMonth(); var y = date.getFullYear(); $('#calendar').fullCalendar({ header: { left: 'prev,next today', center: 'title', right: 'month,basicWeek,basicDay' }, editable: true, events: output }); } }); }); I have code like this, and if I copy the text verbatim out of my alert box and replace events: output with events: [{ id: 1, title: 'Birthday', start: new Date(1355011200*1000), end: new Date(1355011200*1000), allDay: true, url: 'http://www.yahoo.com/'},{ id: 2, title: 'Birthday Hangover', start: new Date(1355097600*1000), end: new Date(1355097600*1000), allDay: false, url: 'http://www.yahoo.com'},{ id: 3, title: 'Sepotomus Maximus Christmas', start: new Date(1356393600*1000), end: new Date(1356393600*1000), allDay: false, url: 'http://www.yahoo.com/'},] everything works just fine. What can I do to fix this problem? I thought that using events: output would place the text in that location, but it does not seem to be working. Thank you all kindly in advance for any comments or answers!

    Read the article

  • How would I automate my array to be used with cURL?

    - by Rob
    I have an array containing the contents of a MySQL table. I need to put each of these contents into curl_multi_handles so that I can execute them all simultaneously. Here is the code for the array, in case it helps: $SQL = mysql_query("SELECT url FROM urls") or die(mysql_error()); while($resultSet = mysql_fetch_array($SQL)){ $urls[]=$resultSet } So I need to be able to send data to each URL at the same time. I don't need to get any data back, and in fact I'll be having them time out after two seconds. It only needs to send the data and then close. My code prior to this was executing them one at a time; here is that code: $SQL = mysql_query("SELECT url FROM shells") or die(mysql_error()); while($resultSet = mysql_fetch_array($SQL)){ $ch = curl_init($resultSet['url'] . $fullcurl); //load the urls and send GET data curl_setopt($ch, CURLOPT_TIMEOUT, 2); //Only load it for two seconds (Long enough to send the data) curl_exec($ch); curl_close($ch); So my question is: How can I load the contents of the array into curl_multi_handle, execute it, and then remove each handle and close the curl_multi_handle?
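
    A minimal curl_multi sketch in PHP (an illustration of the general pattern, not code from the thread): each element of $urls is a row from mysql_fetch_array, so the address is in $row['url'], and $fullcurl is the extra query string mentioned in the question.

        <?php
        // Sketch only: one handle per URL, executed in parallel, then cleaned up.
        $mh = curl_multi_init();
        $handles = array();

        foreach ($urls as $row) {
            $ch = curl_init($row['url'] . $fullcurl);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the responses
            curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // long enough to send the data
            curl_multi_add_handle($mh, $ch);
            $handles[] = $ch;
        }

        // Run all handles until none of them are still active.
        $running = null;
        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);
        } while ($running > 0);

        // Remove each handle and close everything down.
        foreach ($handles as $ch) {
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);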

    Read the article
