Search Results

Search found 1194 results on 48 pages for 'curl'.

Page 8 of 48

  • PHP4 HTTP Post without cURL

    - by Luke
    I have the following code that works on PHP 5 to send an HTTP POST without using cURL. I would like it to work on PHP 4.3.0 and above:

        $opts = array('http' => array(
            'method'  => 'POST',
            'header'  => "Content-type: application/x-www-form-urlencoded\r\n" .
                         "Content-Type: application/json\r\n",
            'content' => $query
        ));
        $context = stream_context_create($opts);
        $result = file_get_contents($url, false, $context);

    HTTP stream contexts are only supported on PHP 5. Is there any way to make this work with PHP 4.3.0? I need a fallback method for when PHP 5 or cURL is not installed.
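
    A minimal fallback sketch using fsockopen(), which is available on PHP 4.3 as well as PHP 5. It assumes a plain HTTP URL with no redirects, and the function name is made up for illustration:

        // Sketch: build the POST request by hand over a raw socket (PHP 4.3+).
        // HTTP/1.0 is used deliberately so chunked responses never appear.
        function http_post_fallback($url, $body, $contentType = 'application/json')
        {
            $parts = parse_url($url);
            $host  = $parts['host'];
            $port  = isset($parts['port']) ? $parts['port'] : 80;
            $path  = isset($parts['path']) ? $parts['path'] : '/';
            if (isset($parts['query'])) {
                $path .= '?' . $parts['query'];
            }

            $fp = fsockopen($host, $port, $errno, $errstr, 30);
            if (!$fp) {
                return false;
            }

            $request  = "POST $path HTTP/1.0\r\n";
            $request .= "Host: $host\r\n";
            $request .= "Content-Type: $contentType\r\n";
            $request .= "Content-Length: " . strlen($body) . "\r\n";
            $request .= "Connection: close\r\n\r\n";
            $request .= $body;
            fwrite($fp, $request);

            $response = '';
            while (!feof($fp)) {
                $response .= fread($fp, 8192);
            }
            fclose($fp);

            // Return only the body, after the blank line that ends the headers.
            $split = strpos($response, "\r\n\r\n");
            return ($split === false) ? $response : substr($response, $split + 4);
        }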

    Read the article

  • Advice needed- aweber form submission using curl?

    - by i need help
    I need advice on submitting a form to AWeber from my backend and reading the response. When a customer signs up through my form, I will: 1. insert the customer details into my own database, 2. send them a welcome email from my system, and 3. add the email address to AWeber at the same time (this should run in the background, so the customer does not have to fill in their details a second time). Is a plain PHP cURL call a good solution on its own? I want to submit the form values to AWeber so that AWeber adds the new email address to its system and then responds to my backend script. I have seen several HTTP client libraries, including: http://scripts.incutio.com/httpclient/ http://freshmeat.net/projects/curl_http_client/ http://snoopy.sourceforge.net/ Do they offer any particular benefit over a normal PHP cURL call for passing in the data?
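
    A plain cURL call is generally enough for this kind of background subscription. A sketch follows; the endpoint URL and field names are placeholders, not AWeber's real ones, so copy them from the actual signup form's HTML:

        // Sketch only: placeholder endpoint and field names, taken from nothing
        // more official than the shape of a typical signup form.
        function subscribe_to_list($email, $name)
        {
            $fields = array(
                'email' => $email,
                'name'  => $name,
            );

            $ch = curl_init('https://www.example.com/scripts/addlead.pl'); // placeholder URL
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the response instead of printing it
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // signup forms usually redirect to a thank-you page
            curl_setopt($ch, CURLOPT_TIMEOUT, 15);

            $response = curl_exec($ch);
            $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
            curl_close($ch);

            return array($status, $response);
        }

    The wrapper libraries listed above mostly add convenience (cookies, retries, a nicer API); they do not do anything a direct cURL call cannot.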

    Read the article

  • php cURL POST how to follow location

    - by webfac
    Hi guys, I am in a bit of a rut with a cURL issue. The POST works great: the data is posted just fine and received OK, but the URL of the posted page never appears in the browser after the cURL session is executed. For example, look at the following code:

        $ch = curl_init("http://localhost/eterniti/cart-step-1.php");
        curl_setopt($ch, CURLOPT_HEADER, false);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, "error=1&em=$em&fname=$fname&lname=$lname&email1=$email1&email2=$email2&code=$code&area=$area&number=$num&mobile=$mobile&address1=$address1&address2=$address2&address3=$address3&suburb=$suburb&postcode=$postcode&country=$country");
        curl_exec($ch);
        curl_close($ch);

    The POST works fine and I am taken to cart-step-1.php, where I can process the posted data. However, the location in the browser's address bar remains that of the script page, in this case proc_xxxxxx.php. Any ideas how to get the address bar to reflect the page I actually POSTed to? Thanks a million.
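
    cURL executes on the web server, so the browser never navigates anywhere and the address bar cannot change as a side effect of curl_exec(). A minimal sketch of one common workaround, assuming cart-step-1.php can read its data from the session instead of (or as well as) $_POST:

        // Sketch: instead of (or after) the server-side cURL call, stash the
        // data in the session and redirect the browser itself, so the address
        // bar changes to cart-step-1.php.
        session_start();
        $_SESSION['cart_step1'] = array(
            'fname' => $fname,
            'lname' => $lname,
            'email' => $email1,
            // ... remaining fields from the form ...
        );
        header('Location: http://localhost/eterniti/cart-step-1.php');
        exit;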

    Read the article

  • getting vbulletin captcha image with curl

    - by ermac2014
    Hi, I need to download vBulletin captcha images to my hard drive from vBulletin registration pages, using cURL and PHP. I'm collecting samples of captcha images from several vBulletin boards for research purposes. Here is what I have done with cURL so far: 1. download the register.php page, 2. parse the downloaded page to get the captcha image URL, 3. download that image. Steps 1 and 2 work correctly, but in step 3, when I try to download the captcha image, I don't get the captcha. I get either a tiny blank GIF or a PNG with the word "vbulletin" on it. I really don't know what I'm doing wrong. If I output the HTML to the browser, the image shows correctly, but that's not what I want: I want to download the image and save it to disk. Here is the code I've been working on:

        //get contents with curl
        function get_content($url) {
            $theString = parse_url($url);
            $cookieName = $theString['host'];
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $url."register.php");
            curl_setopt($ch, CURLOPT_REFERER, $url."register.php");
            curl_setopt($ch, CURLOPT_HEADER, 0);
            curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US)');
            curl_setopt($ch, CURLOPT_COOKIEJAR, "cookies/cookie.txt");  //saved cookies
            curl_setopt($ch, CURLOPT_COOKIEFILE, "cookies/cookie.txt"); //saved cookies
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            $string = curl_exec($ch);
            //print_r(curl_getinfo($ch));
            curl_close($ch);
            return $string;
        }

        //vbulletin main page
        $url = 'http://blavbulletin.com/';

        //get the page
        $results = get_content($url);

        if (preg_match_all('/<img[^>]*id\=\"imagereg\"[^>]*src\=\"([^\"]*)\"[^>]*>/s', $results, $captchaimages)) {
            $captcha = $captchaimages[1][0];
            echo "<img src='$url"."$captcha'>"; //when echoed the pic shows correctly

            //now get the pic
            $file = get_content("$url"."$captcha");

            //save the pic on HDD
            file_put_contents("captcha.jpg", $file);
        }

    Any help would be appreciated. Regards.
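
    Two things stand out above: get_content() always appends register.php to whatever URL it is given, so the third call never actually requests the captcha URL, and the captcha is only valid for the session stored in the cookie jar. A sketch of a separate image fetcher that avoids both problems (the paths, user agent and the imagereg id come from the question; everything else is an assumption):

        // Sketch: fetch the captcha image itself, reusing the same cookie jar as
        // the page request so the image belongs to the same vBulletin session.
        function get_image($imageUrl, $refererUrl)
        {
            $ch = curl_init($imageUrl); // do NOT append register.php here
            curl_setopt($ch, CURLOPT_REFERER, $refererUrl);
            curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US)');
            curl_setopt($ch, CURLOPT_COOKIEJAR, "cookies/cookie.txt");
            curl_setopt($ch, CURLOPT_COOKIEFILE, "cookies/cookie.txt");
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the raw bytes, don't echo them
            $data = curl_exec($ch);
            curl_close($ch);
            return $data;
        }

        // Usage, following the question's variables:
        // $file = get_image($url . $captcha, $url . 'register.php');
        // file_put_contents('captcha.jpg', $file);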

    Read the article

  • PHP Curl and Loop based on a numeric value

    - by danit
    I'm using the Twitter API to collect the number of tweets I've favorited, or, to be accurate, the total number of pages of favorited tweets. I use this URL: http://api.twitter.com/1/users/show/username.xml and grab the XML element 'favorites_count'. For this example, let's assume favorites_count = 5. The Twitter API uses this URL to get the favorites (you must be authenticated): http://twitter.com/favorites.xml You can only get the last 20 favorites from that URL, but you can add a 'page' option by appending ?page=N to the favorites URL, e.g. http://twitter.com/favorites.xml?page=2 So what I need to do is use cURL (I think) to collect the favorite tweets from each page URL: http://twitter.com/favorites.xml?page=1 http://twitter.com/favorites.xml?page=2 http://twitter.com/favorites.xml?page=3 http://twitter.com/favorites.xml?page=4 and so on: some kind of loop that visits each URL, collects the tweets, and then outputs the contents. Can anyone help with this? I need to use cURL to authenticate, collect the number of pages of tweets (already scripted), and then loop through each page URL based on that page count.
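
    A sketch of that loop against the basic-auth favorites.xml endpoint described in the question (Twitter has long since retired that API, so treat this purely as the loop pattern); $pages is the page count already scripted from favorites_count:

        // Sketch only: iterate the paged favorites URL with HTTP basic auth and
        // collect the parsed XML for each page.
        function fetch_favorites_pages($username, $password, $pages)
        {
            $allPages = array();
            for ($page = 1; $page <= $pages; $page++) {
                $ch = curl_init("http://twitter.com/favorites.xml?page=" . $page);
                curl_setopt($ch, CURLOPT_USERPWD, $username . ':' . $password); // basic auth
                curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
                curl_setopt($ch, CURLOPT_TIMEOUT, 20);
                $xml = curl_exec($ch);
                curl_close($ch);

                if ($xml !== false) {
                    $allPages[$page] = simplexml_load_string($xml); // one page of favorites
                }
            }
            return $allPages;
        }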

    Read the article

  • Re-factoring a CURL request to Ruby's RestClient

    - by user94154
    I'm having trouble translating this cURL request into Ruby using RestClient:

        system("curl --digest -u #{@user}:#{@pass} '#{@endpoint}/#{id}' --form image_file=@'#{path}' -X PUT")

    I keep getting 400 Bad Request errors. As far as I can tell, the request is authenticated properly but fails at the file-upload part. Here are my best attempts, all of which get me those 400 errors:

        resource = RestClient::Resource.new "#{@endpoint}/#{id}", @user, @pass

        #attempt 1
        resource.put :image_file => File.new(path, 'rb'), :content_type => 'image/jpg'

        #attempt 2
        resource.put File.read(path), :content_type => 'image/jpg'

        #attempt 3
        resource.put File.open(path) {|f| f.read}, :content_type => 'image/jpg'

    Read the article

  • Escaping CURL @ symbol with PHP

    - by bkildow
    I'm writing a PHP application that submits data via cURL to sign up for an iContact email list, but I keep getting an invalid email address error. I think this may be because I'm escaping the @ symbol, so it looks like %40 instead of @. Also, according to the PHP documentation for curl_setopt() with CURLOPT_POSTFIELDS: "The full data to post in a HTTP 'POST' operation. To post a file, prepend a filename with @ and use the full path." So, is there any way to pass the @ symbol as POST data through cURL in PHP without running it through urlencode() first?
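
    For what it's worth, the %40 is normally harmless: when CURLOPT_POSTFIELDS is given a urlencoded string, the server decodes %40 back to @, and the special "@ means upload this file" rule only applies to values inside an array. A sketch, with a placeholder URL and field names since iContact's real ones are not shown here:

        // Sketch: the endpoint URL and field names are placeholders.
        $fields = array(
            'email'  => 'someone@example.com',  // placeholder address
            'listid' => $listId,                // placeholder field name
        );

        $ch = curl_init('https://app.example.com/signup.php'); // placeholder URL
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields)); // string form: @ is safe here
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $response = curl_exec($ch);
        curl_close($ch);

    If the error persists with this form, the cause is more likely a wrong field name or a missing required field than the encoding of the @ symbol.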

    Read the article

  • [PHP] Using cURL to download large XML files

    - by ndg
    I'm working with PHP and need to parse a number of fairly large XML files (50-75MB uncompressed). The issue, however, is that these XML files are stored remotely and will need to be downloaded before I can parse them. Having thought about the issue, I think using a system() call in PHP in order to initiate a cURL transfer is probably the best way to avoid timeouts and PHP memory limits. Has anyone done anything like this before? Specifically, what should I pass to cURL to download the remote file and ensure it's saved to a local folder of my choice?
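
    One alternative worth considering before shelling out to system(): PHP's own cURL extension can stream the response straight to disk with CURLOPT_FILE, so the 50-75 MB file never has to fit into PHP's memory limit. A sketch, with placeholder paths:

        // Sketch: write the download directly to a local file handle.
        function download_to_file($remoteUrl, $localPath)
        {
            $fp = fopen($localPath, 'w');
            if (!$fp) {
                return false;
            }

            $ch = curl_init($remoteUrl);
            curl_setopt($ch, CURLOPT_FILE, $fp);            // stream the body to $fp
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 600);         // generous timeout for a large file
            $ok = curl_exec($ch);
            curl_close($ch);
            fclose($fp);

            return $ok;
        }

        // Usage: download_to_file('http://example.com/feed.xml', '/tmp/feed.xml');

    The script's execution time limit still applies, so set_time_limit() (or running it from cron/CLI) may be needed for very slow transfers.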

    Read the article

  • recaptcha image and curl + php

    - by user253530
    $page = $curl->post($baseUrl.'/submit.php', array('url'=>$address, 'phase'=>'1', 'randkey'=>$randKey[0], 'id'=>'c_1'));
    $exp = explode('recaptcha_image', $page);

    The id recaptcha_image is not found, although if I echo $page the web page is displayed, and surprisingly even the reCAPTCHA div (with the image itself). cURL shouldn't load the reCAPTCHA image, but somehow it does; yet when I try to find the div, it is not there. Is there a way to capture the URL of the reCAPTCHA image?
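
    The recaptcha_image div is created in the browser by reCAPTCHA's JavaScript, so it never exists in the raw HTML that cURL receives (which is why echoing the page "works": your browser runs the script). With the old-style reCAPTCHA the image URL has to be rebuilt from the challenge script instead. A sketch of that flow; the regexes are illustrative, not robust:

        // Sketch: derive the reCAPTCHA image URL from the challenge script
        // referenced in the fetched HTML.
        function recaptcha_image_url($pageHtml)
        {
            // 1. Pull the site's public key out of the challenge <script> tag.
            if (!preg_match('~recaptcha/api/challenge\?k=([\w-]+)~', $pageHtml, $m)) {
                return false;
            }
            $publicKey = $m[1];

            // 2. Fetch the challenge script and extract the challenge token.
            $ch = curl_init('http://www.google.com/recaptcha/api/challenge?k=' . $publicKey);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            $challengeJs = curl_exec($ch);
            curl_close($ch);

            if (!preg_match("~challenge\s*:\s*'([^']+)'~", $challengeJs, $m)) {
                return false;
            }

            // 3. The image itself lives at api/image?c=<challenge>.
            return 'http://www.google.com/recaptcha/api/image?c=' . $m[1];
        }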

    Read the article

  • Reliably detect caller domain over cURL request?

    - by Utkanos
    OK, so server-side security is not my forte. Basically, I'm building a service which users may use (via an SDK) only on the domain they stipulated when they signed up. The SDK calls my web service over cURL in PHP. Would I be right in thinking I cannot reliably detect the caller's domain, i.e. enforce that it is the same domain they stipulated when signing up? cURL of course sends this in headers, but headers can always (?) be faked. Is there a better course of action to enforce the domain for this sort of thing? (NB: I'm already using an API key; it's just that I wanted to restrict the domain as well.) Thanks in advance.
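
    You are right that any header the SDK sends (Referer, Host, anything custom) can be set by the caller, so headers cannot prove the domain. A common compromise is to compare the connecting IP address, which cannot be trivially faked over a completed TCP connection, against the DNS records of the domain registered for the API key. A sketch, where $registeredDomain comes from your signup database:

        // Sketch: does the caller's IP match an A record of the registered domain?
        function caller_matches_domain($registeredDomain)
        {
            $callerIp = $_SERVER['REMOTE_ADDR'];

            // gethostbynamel() returns every A record for the host, or false.
            $ips = gethostbynamel($registeredDomain);
            if ($ips === false) {
                return false;
            }
            return in_array($callerIp, $ips, true);
        }

    Note the limits: shared hosting, load balancers and CDNs can make the caller's outbound IP differ from the domain's A records, so treat this as a speed bump on top of the API key rather than real authentication.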

    Read the article

  • Upload image using CURL + PHP via remote form

    - by user253530
    I have a few images that I need to upload using an online form. So far, here's my code:

        $info = array(
            'test title',
            '1234',
            'virginia',
            '@'.realpath('e:\wamp\www\1.jpg'),
            '@'.realpath('e:\wamp\www\2.jpg'),
            '@'.realpath('e:\wamp\www\3.jpg'),
            '@'.realpath('e:\wamp\www\4.jpg'),
            'test description'
        );
        $post->postAd($url, $info);

    The $info array is processed in the postAd() method and sent as an associative array using a method from a cURL class I have (it has been tested and has worked nicely with everything I've needed cURL for so far). The problem is that the form data is filled in correctly, but the images are not uploaded. Can anyone help with advice, code or guidance?
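
    The usual culprit: cURL only performs file uploads when CURLOPT_POSTFIELDS receives an array (which makes the request multipart/form-data); if the wrapper class serialises the values into a urlencoded string first, the @ prefix is sent as literal text. A sketch with plain cURL, where the field names are placeholders that must match the remote form's input names:

        // Sketch: pass the array itself so cURL builds a multipart upload.
        $fields = array(
            'title'       => 'test title',                          // placeholder field names
            'price'       => '1234',
            'location'    => 'virginia',
            'image1'      => '@' . realpath('e:\wamp\www\1.jpg'),   // pre-PHP 5.5 upload syntax
            'image2'      => '@' . realpath('e:\wamp\www\2.jpg'),
            'description' => 'test description',
        );

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $fields);   // the array, not http_build_query($fields)
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $response = curl_exec($ch);
        curl_close($ch);

    On PHP 5.5 and later the @ syntax is deprecated in favour of new CURLFile('e:\wamp\www\1.jpg').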

    Read the article

  • Using 'Copy as cURL' from Chrome in windows command line

    - by user2029890
    So, Google Chrome has this great 'Copy as cURL' option under the Network tab of the Chrome DevTools. It works great on the Linux command line, but not on Windows. Apparently it has something to do with the single quotes, as the error I get is: protocol 'http not supported. In other words, it is reading that single quote literally. Is there a simple way to make this command usable on Windows? I tried replacing all the single quotes with double quotes, but then nothing happens at all. The command is:

        curl 'http://www.test.com/login/' -H 'Cookie: PHPSESSID=7dvb25maaaaaa9d7bbbbbc3f6' -H 'Origin: http://www.test.com' -H 'Accept-Encoding: gzip,deflate,sdch' -H 'Host: www.test.com' -H 'Accept-Language: en-US,en;q=0.8' -H 'User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.95 Safari/537.36' -H 'Content-Type: application/x-www-form-urlencoded' -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8' -H 'Cache-Control: max-age=0' -H 'Referer: http://www.test.com/login/' -H 'Connection: keep-alive' --data 'loc=&login=user%40test.com&password=password&submit1=Sign+In' --compressed

    Thank you.

    Read the article

  • Logging into SO with curl

    - by Good Person
    I'm working on a project and want to log into SO with curl. I use an OpenID via Google, which means I need to log into Google first. Here is the code I have so far:

        #!/usr/bin/env sh
        . ./params.sh  # the file with username and password
        curl --silent https://www.google.com/accounts/ClientLogin \
            -d Email=$username -d Passwd=$password \
            -d accountType=GOOGLE \
            -d source=localhost-test-1 \
            -d service=lh2 \
            -o tokens
        # get $Auth as a variable
        . ./tokens
        echo $Auth;

    How do I use the $Auth token to log into SO? Edit: I found http://code.google.com/apis/gdata/articles/using_cURL.html and will post the updated code soon.

    Read the article

  • Need help with cURL and POSTing in PHP

    - by alex
    I need to post to a payment gateway. The example PHP script for the gateway simply sets the XML like this:

        curl_setopt($ch, CURLOPT_POSTFIELDS, $xmlRequest); // $xmlRequest is just a string of XML

    In all of my experience, you generally need to use an array of key/values, or a string formatted like GET parameters. I am using Kohana and tracked down a cURL module, but it accepts the POST key/values as an array only. Now, I could ditch the module and write some cURL directly, but I am using this module happily throughout the site, so I would prefer to use it here. So, my question is: how does that first approach work? Does it just POST the whole thing without any named key? Is there a default key I could use with an array to get the module to work?
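
    When CURLOPT_POSTFIELDS is given a plain string, cURL sends that string verbatim as the request body, with no field names at all, which is what XML payment gateways usually expect. An array, by contrast, is encoded as multipart/form-data, so there is no "default key" that reproduces the raw-body behaviour. If the Kohana module cannot pass a raw string through, the direct call is only a few lines (the $gatewayUrl variable is a placeholder):

        // Sketch: post raw XML as the request body.
        $ch = curl_init($gatewayUrl); // placeholder for the gateway endpoint
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $xmlRequest);               // raw XML body, no field names
        curl_setopt($ch, CURLOPT_HTTPHEADER, array(
            'Content-Type: text/xml',
            'Content-Length: ' . strlen($xmlRequest),
        ));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $response = curl_exec($ch);
        curl_close($ch);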

    Read the article

  • Using PHP's $_COOKIE to create cookie.txt for cURL

    - by boogie
    Hi, I'm running Drupal and MediaWiki on my server under the same domain. They are connected to each other with extensions/modules, meaning that if I log into MediaWiki, it automatically logs me into Drupal as well. MediaWiki has some extensions that filter the information shown on the page, and I want to show that filtered information from MediaWiki on my Drupal page. I'm trying to fetch the data with PHP cURL, but I'm not able to make it use the login information. How can I convert PHP's $_COOKIE into cURL's cookie.txt format? I assume that after that I'll be able to fetch the data from MediaWiki with the right permissions. This is what print_r($_COOKIE) outputs:

        [wiki_session] => gg05lhd6pcfs5g6iokhoo0gue7
        [wikiUserName] => WikiSysop
        [wikiLoggedOut] => 20100510110913
        [wikiUserID] => 1
        [wikiToken] => 52cdb19a7b4a43e5a2f86939e4f54941

    Thanks for any help!
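
    There may be no need to write a Netscape-format cookie.txt at all: CURLOPT_COOKIE accepts a plain "name=value; name2=value2" string, which can be built straight from $_COOKIE for the current request. A sketch (the wiki URL is a placeholder):

        // Sketch: forward the current visitor's cookies on the cURL request.
        function forward_cookies_to($url)
        {
            $pairs = array();
            foreach ($_COOKIE as $name => $value) {
                $pairs[] = $name . '=' . urlencode($value);
            }
            $cookieHeader = implode('; ', $pairs);

            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_COOKIE, $cookieHeader);  // send MediaWiki's session cookies along
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            $page = curl_exec($ch);
            curl_close($ch);

            return $page;
        }

        // Usage: $html = forward_cookies_to('http://example.com/wiki/index.php?title=Some_Page');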

    Read the article

  • curl issue with URL not connecting

    - by bmucklow
    I'm not a very good network person, so I was hoping someone could point me in the right direction to figure out what I am doing wrong. I am trying to use curl to post a SOAP message. I am running the following:

        curl -d "string of xml message" -H "Content-Type: text/xml; charset=utf-8" "<ip>:<port>/<path>"

    This results in a 'Connection refused' message. So I tried pinging the IP by itself: no problems. Then I thought maybe I need http://<ip>:<port>/<path>, so I tried pinging http://<ip> and I get 'unknown host', yet if I ping the IP by itself I have no issues. Any thoughts on where to start debugging this?

    Read the article

  • how to maintain session in cURL in php?

    - by newbie programmer
    How can I maintain a session with cURL? I have code that sends login details to a site and logs in successfully, and I need the session at that site to be maintained so I can continue. Here is the code I use to log into the site with cURL:

        <?php
        $socket = curl_init();
        curl_setopt($socket, CURLOPT_URL, "http://www.XXXXXXX.com");
        curl_setopt($socket, CURLOPT_REFERER, "http://www.XXXXXXX.com");
        curl_setopt($socket, CURLOPT_POST, true);
        curl_setopt($socket, CURLOPT_USERAGENT, $agent);
        curl_setopt($socket, CURLOPT_POSTFIELDS, "form_logusername=XXXXX&form_logpassword=XXXXX");
        curl_setopt($socket, CURLOPT_COOKIESESSION, true);
        curl_setopt($socket, CURLOPT_COOKIEJAR, "cookies.txt");
        curl_setopt($socket, CURLOPT_COOKIEFILE, "cookies.txt");
        $data = curl_exec($socket);
        curl_close($socket);
        ?>
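
    The session lives in the cookies the login response sets, and cURL writes them to cookies.txt when curl_close() runs. Follow-up requests just need to read the same cookie file back, and should not set CURLOPT_COOKIESESSION, which deliberately ignores previously stored session cookies. A sketch of the next request (the members.php URL is a placeholder for any page behind the login):

        // Sketch: a later request that rides on the session established above.
        $socket = curl_init();
        curl_setopt($socket, CURLOPT_URL, "http://www.XXXXXXX.com/members.php"); // placeholder page
        curl_setopt($socket, CURLOPT_USERAGENT, $agent);
        curl_setopt($socket, CURLOPT_COOKIEJAR, "cookies.txt");   // keep updating the same jar
        curl_setopt($socket, CURLOPT_COOKIEFILE, "cookies.txt");  // send its cookies back
        curl_setopt($socket, CURLOPT_RETURNTRANSFER, true);
        $memberPage = curl_exec($socket);
        curl_close($socket);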

    Read the article

  • Curl: How to insert value to a cookie?

    - by Crazy_Bash
    How do I insert cookie values into a curl request? From Firebug's "Request Headers" I can see the following: Cookie: PHPSESSID=gg792c2ktu6sch6n8q0udd94o0; was=1; uncheck2=1; uncheck3=1; uncheck4=1; uncheck5=0; hd=1; uncheck1=1. I have tried the following:

        curl http://site.com/ -s -L -b cookie.c -c cookie.c -d "was=1; uncheck2=1; uncheck3=1; uncheck4=1; uncheck5=0; hd=1; uncheck1=1" > comic

    and the only thing I see in cookie.c is PHPSESSID=gg792c2ktu6sch6n8q0udd94o0; was=1;

    Read the article

  • SFTP not supported error with PHP & cURL

    - by Bad Programmer
    I followed the advice from this Stack Overflow question thread, but I keep hitting a snag. I am receiving the following error message: Unsupported protocol: sftp. Here is my code:

        $ch = curl_init();
        if (!$ch) {
            $error = curl_error($ch);
            die("cURL session could not be initiated. ERROR: $error.");
        }

        $fp = fopen($docname, 'r');
        if (!$fp) {
            $error = curl_error($ch);
            die("$docname could not be read.");
        }

        curl_setopt($ch, CURLOPT_URL, "sftp://$user_name:$user_pass@$server:22/$docname");
        curl_setopt($ch, CURLOPT_UPLOAD, 1);
        curl_setopt($ch, CURLOPT_PROTOCOLS, CURLPROTO_SFTP);
        curl_setopt($ch, CURLOPT_INFILE, $fp);
        curl_setopt($ch, CURLOPT_INFILESIZE, filesize($docname));

        // this is where I get the failure
        $exec = curl_exec($ch);
        if (!$exec) {
            $error = curl_error($ch);
            die("File $docname could not be uploaded. ERROR: $error.");
        }
        curl_close($ch);

    I used the curl_version() function to check my cURL build, and SFTP does not appear in the array of supported protocols:

        [version_number] => 462597
        [age] => 2
        [features] => 1597
        [ssl_version_number] => 0
        [version] => 7.15.5
        [host] => x86_64-redhat-linux-gnu
        [ssl_version] => OpenSSL/0.9.8b
        [libz_version] => 1.2.3
        [protocols] => Array
            (
                [0] => tftp
                [1] => ftp
                [2] => telnet
                [3] => dict
                [4] => ldap
                [5] => http
                [6] => file
                [7] => https
                [8] => ftps
            )

    Is this a matter of my version of cURL being outdated, or is the SFTP protocol simply not supported at all? Any advice is greatly appreciated.
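
    The protocols list explains the error: this curl 7.15.5 build was compiled without libssh2, and curl only gained SFTP support in 7.16.1, so no curl_setopt() call can switch it on. The options are to install a newer libcurl built against libssh2, or to sidestep cURL entirely. A sketch using the PECL ssh2 extension (the remote path is a placeholder); phpseclib's pure-PHP SFTP class is another option if that extension cannot be installed:

        // Sketch: upload over SSH with the PECL ssh2 extension instead of cURL.
        $conn = ssh2_connect($server, 22);
        if (!$conn || !ssh2_auth_password($conn, $user_name, $user_pass)) {
            die("Could not connect/authenticate to $server over SSH.");
        }

        // Copy the local file to the remote path (placeholder) over SCP; SFTP via
        // ssh2_sftp() and fopen('ssh2.sftp://...') works as well.
        if (!ssh2_scp_send($conn, $docname, "/remote/path/$docname", 0644)) {
            die("File $docname could not be uploaded.");
        }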

    Read the article
