Search Results

Search found 1194 results on 48 pages for 'curl'.

Page 20 of 48

  • screen scraper templates for various websites

    - by intuited
    I'm looking specifically for a convenient way to locally archive posts from this and other similar sites. I'd like to separate the question itself from the answers, or maybe crop the question and store it, keeping the page title. Obviously I don't need to store the menu or the various other pieces of site interface chrome. The best way to do this would seem to be to associate an XSLT template with a match on the URL and use that template to pull out the relevant information and format it. My two-part question: Is there a tool built specifically for this task, i.e. something that takes a URL, checks it against a map of path-matching expressions to templates, and outputs the result of applying the matching template to that resource? xmlto seems to be most of the way there, and could probably just be called from a script that does the pattern matching, but something already integrated would be more convenient. And is such a URL-pattern-to-XSLT-template map publicly available somewhere? Question 2.5: Is it legal to do this with sites like this one that have public licenses on their content?
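
    A minimal PHP sketch of the dispatch idea, assuming a hand-made map of URL regexes to local XSLT files (the patterns and stylesheet names below are invented for illustration):

      <?php
      // Map of URL patterns to XSLT stylesheets (illustrative only).
      $templates = array(
          '#^https?://stackoverflow\.com/questions/#' => 'so-question.xsl',
          '#^https?://example\.org/forum/#'           => 'forum-post.xsl',
      );

      function archivePage($url, array $templates) {
          foreach ($templates as $pattern => $xslFile) {
              if (!preg_match($pattern, $url)) {
                  continue;
              }
              // Load the page; @ suppresses warnings from real-world HTML.
              $doc = new DOMDocument();
              @$doc->loadHTMLFile($url);

              // Apply the matching stylesheet.
              $xsl = new DOMDocument();
              $xsl->load($xslFile);
              $proc = new XSLTProcessor();
              $proc->importStylesheet($xsl);
              return $proc->transformToXML($doc);
          }
          return null; // no template matched this URL
      }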

    Read the article

  • preg_match(): find all the values inside a table?

    - by mathiregister
    Hey guys, a cURL function returns a string $widget that contains regular HTML: two divs, where the first div holds a table with various values inside <td>s. I wonder what the easiest and best way is to extract just the values inside the <td>s, so that I end up with the bare values and none of the remaining HTML. Any idea what the pattern for preg_match should look like? Thank you.
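
    A minimal sketch using preg_match_all rather than preg_match, since every cell is wanted; the pattern is deliberately simple and assumes no nested tables inside the cells:

      <?php
      // $widget holds the HTML returned by the cURL call.
      preg_match_all('#<td[^>]*>(.*?)</td>#si', $widget, $matches);

      $values = array();
      foreach ($matches[1] as $cell) {
          // Drop any markup left inside the cell and trim whitespace.
          $values[] = trim(strip_tags($cell));
      }

      print_r($values);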

    Read the article

  • How to perform an action when a remote (HTTP) file changes?

    - by ZeissS
    Hi, I want to create a script that checks a URL and performs an action (download + unzip) when the "Last-Modified" header of the remote file changes. I thought about fetching the header with curl, but then I have to store it somewhere for each file and perform a date comparison. Does anyone have a different idea using (mostly) standard unix tools? Thanks.
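
    The asker prefers standard unix tools, but the store-and-compare idea is easy to see in a short PHP sketch (the URL and state-file path are placeholders); the same logic maps directly onto curl -sI plus a saved state file:

      <?php
      $url       = 'http://example.com/data.zip';   // placeholder
      $stateFile = '/tmp/data.zip.lastmod';          // placeholder

      // Fetch only the headers of the remote file.
      $headers      = get_headers($url, 1);
      $lastModified = isset($headers['Last-Modified']) ? $headers['Last-Modified'] : '';
      $previous     = file_exists($stateFile) ? trim(file_get_contents($stateFile)) : '';

      if ($lastModified !== '' && $lastModified !== $previous) {
          // The file changed: download, unzip, and remember the new header value.
          file_put_contents('/tmp/data.zip', file_get_contents($url));
          shell_exec('unzip -o /tmp/data.zip -d /tmp/data');
          file_put_contents($stateFile, $lastModified);
      }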

    Read the article

  • protecting my web site content from external access

    - by Testadmin
    Hi, I heard that a web site can be accessed externally using cURL with the following code:

      $curl_handle = curl_init();
      curl_setopt($curl_handle, CURLOPT_URL, 'http://example.com');
      $buffer = curl_exec($curl_handle);
      curl_close($curl_handle);

    I want to protect my web site from this kind of external access. I am using PHP. How can I protect my web site? Does anyone know?
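
    There is no reliable way to tell cURL apart from a browser, since every request header can be forged, but a minimal sketch of the usual partial measure (rejecting obvious script user agents) might look like this; the strings to reject are only examples:

      <?php
      $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

      // Reject requests with no User-Agent or an obvious script signature.
      if ($ua === '' || stripos($ua, 'curl') !== false || stripos($ua, 'wget') !== false) {
          header('HTTP/1.0 403 Forbidden');
          exit('Forbidden');
      }
      // A determined client can simply send a browser-like User-Agent, so
      // rate limiting and requiring authentication are better long-term answers.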

    Read the article

  • PHP - Problem using file_get_contents

    - by shyam
    I have a problem while using the file_get_contents function. I am using it to get a response from a different web server, but it is not returning anything (var_dump shows an empty string). The problem occurs only when calling this specific server, because I got a result when I used Google's address, and it works fine on my local machine. I have tried cURL too, but with the same result.
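
    Since both file_get_contents and cURL come back empty, a short diagnostic sketch that surfaces the HTTP status code and any transport error may narrow it down (the URL is a placeholder):

      <?php
      $ch = curl_init('http://remote-server.example/page');   // placeholder URL
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
      curl_setopt($ch, CURLOPT_TIMEOUT, 30);

      $body = curl_exec($ch);
      $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
      $err  = curl_error($ch);
      curl_close($ch);

      var_dump($code, $err, strlen((string) $body));
      // A status of 0 with a non-empty $err usually points to a DNS, firewall or
      // outbound-connection problem on the server; a 403/406 means the remote
      // host is refusing requests coming from it.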

    Read the article

  • How to scrape user's data without being banned by the server?

    - by embedded
    I'm developing a site which monitors users' data. It uses cURL from PHP: it first gets authorized using a cookie and then parses the required data. My problem is that it needs to fire multiple requests at the server (for all registered users) and this may get me banned by the remote server. I would like to know if there is something I could do to prevent being banned. (This activity is legal: the users have provided their login information.) Thanks.
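
    A minimal sketch of the usual precautions: reuse one authorized cURL handle (with a cookie jar), space the requests out, and back off when the server starts answering with errors. The delays, cookie path and $users structure are illustrative assumptions:

      <?php
      $ch = curl_init();
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_COOKIEJAR,  '/tmp/monitor-cookies.txt');  // placeholder
      curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/monitor-cookies.txt');

      foreach ($users as $user) {              // $users: the registered accounts
          curl_setopt($ch, CURLOPT_URL, $user['data_url']);
          $page = curl_exec($ch);
          $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);

          if ($code >= 400) {                  // the server is pushing back: slow down
              sleep(60);
              continue;
          }
          // ... parse $page here ...
          usleep(500000);                      // wait 0.5 s between requests
      }
      curl_close($ch);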

    Read the article

  • Matching part of a website with a regexp.

    - by richardverbruggen
    With a cURL request I load a complete web page into a variable, $buffer. In the source of the page there are two markers between which my relevant content is placed:

      ****** bunch of code *******
      <!-- InstanceBeginEditable name="Kopij" -->
      this part I want to store in a match
      <!-- InstanceEndEditable -->
      ****** bunch of code *******

    I've been messing around with preg_match and its regexps. Can someone try to help me? Thanks in advance.
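
    A minimal sketch using a non-greedy group and the s modifier, so that the dot also matches newlines between the two markers:

      <?php
      $pattern = '#<!-- InstanceBeginEditable name="Kopij" -->(.*?)<!-- InstanceEndEditable -->#s';

      if (preg_match($pattern, $buffer, $m)) {
          $content = trim($m[1]);   // the text between the two markers
      }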

    Read the article

  • Security header is not valid - using curl php

    - by toni
    Hi all, I'm implementing Express Checkout through the PayPal NVP API using PHP. I have no problem with the first step, SetExpressCheckout: I get ACK=Success. But on GetExpressCheckoutDetails I get "Security header is not valid". Trying to figure out the problem, I thought the cURL call might not be working well. What I did was copy the whole URL:

      https://api-3t.sandbox.paypal.com/nvp?USER=sanbox_1276609583_biz_api1.gmail.com&PWD=1276609589&SIGNATURE=AYVosblmD7khKkvvb.bNxvFT0OQ2A8GopwByEuC.CfMHt65VaUmvAEy-&VERSION=62.0&token=EC-3YG18670X88588437&METHOD=GetExpressCheckoutDetails

    and paste it into the browser. That results in:

      TOKEN=EC%2d3YG18670X88588437&CHECKOUTSTATUS=PaymentActionNotInitiated&TIMESTAMP=2010%2d06%2d16T07%3a40%3a12Z&CORRELATIONID=e1a1e469bf066&ACK=Success&VERSION=62%2e0&BUILD=1356926...

    But when the same URL is executed in the function I made, it does not work. Below is my function:

      function mycurl($url, $querystr) {
          $ch = curl_init();
          curl_setopt($ch, CURLOPT_URL, $url);
          curl_setopt($ch, CURLOPT_VERBOSE, 1);
          curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
          curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
          curl_setopt($ch, CURLOPT_POST, 1);
          curl_setopt($ch, CURLOPT_POSTFIELDS, $querystr);
          curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
          $response = curl_exec($ch);
          curl_close($ch);
          return $response;
      }

    I hope somebody can help with this, thanks so much. Note: I used the sandbox for this. I created a sandbox account, with a Business account to represent a merchant and a Personal account to represent a buyer. I used this endpoint URL: api-3t.sandbox.paypal.com/nvp and this sandbox URL: www.sandbox.paypal.com/cgi-bin/webscr. That should not be the issue.
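
    PayPal reports "Security header is not valid" when the API credentials in the request are missing or wrong, so one thing worth ruling out is whether USER, PWD and SIGNATURE actually end up in the POST body when mycurl() is called. A hedged usage sketch, sending the complete NVP string (the values are the ones quoted in the question) as POST fields to the bare endpoint:

      $fields = array(
          'USER'      => 'sanbox_1276609583_biz_api1.gmail.com',
          'PWD'       => '1276609589',
          'SIGNATURE' => 'AYVosblmD7khKkvvb.bNxvFT0OQ2A8GopwByEuC.CfMHt65VaUmvAEy-',
          'VERSION'   => '62.0',
          'METHOD'    => 'GetExpressCheckoutDetails',
          'TOKEN'     => 'EC-3YG18670X88588437',
      );
      // POST everything to the bare endpoint; don't also append it to the URL.
      $response = mycurl('https://api-3t.sandbox.paypal.com/nvp', http_build_query($fields));
      parse_str($response, $result);   // $result['ACK'] should be 'Success'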

    Read the article

  • Update twitter profile image using OAuth

    - by sjobe
    I'm trying to get the Twitter update_profile_image call to work using OAuth. I was using cURL with regular authentication and everything was working fine, but I switched to OAuth using this library, and now everything except update_profile_image works. I read something about Twitter OAuth having problems with multipart data, but that was a while ago and the plugin is supposed to have dealt with that issue. My working regular-authentication cURL code:

      $url = 'http://api.twitter.com/1/account/update_profile_image.xml';
      $uname = $_POST['username'];
      $pword = $_POST['password'];
      $img_path = 'xxx';
      $userpwd = $uname . ':' . $pword;
      $img_post = array('image' => '@' . $img_path . ';type=image/jpeg', 'tile' => 'true');
      $format = 'xml'; // alternative: json
      $message = 'Test update with a random num' . rand();
      $opts = array(
          CURLOPT_URL            => $url,
          CURLOPT_FOLLOWLOCATION => true,
          CURLOPT_RETURNTRANSFER => true,
          CURLOPT_HEADER         => true,
          CURLOPT_POST           => true,
          CURLOPT_POSTFIELDS     => $img_post,
          CURLOPT_HTTPAUTH       => CURLAUTH_ANY,
          CURLOPT_USERPWD        => $userpwd,
          CURLOPT_HTTPHEADER     => array('Expect:'),
          CURLINFO_HEADER_OUT    => true,
      );
      $ch = curl_init();
      curl_setopt_array($ch, $opts);
      $response = curl_exec($ch);
      $err = curl_error($ch);
      $info = curl_getinfo($ch);
      curl_close($ch);

    My current OAuth code (I had to cut it down, so don't look for minor syntax errors):

      include 'EpiCurl.php';
      include 'EpiOAuth.php';
      include 'EpiTwitter.php';
      include 'secret.php';

      $twitterObj = new EpiTwitter($consumer_key, $consumer_secret);
      $twitterObj->setToken($_GET['oauth_token']);
      $token = $twitterObj->getAccessToken();
      $twitterObj->setToken($token->oauth_token, $token->oauth_token_secret);

      try {
          $img_path = 'xxx';
          //$twitterObj->post_accountUpdate_profile_image(array('@image' => "@" . $img_path));
          $twitterObj->post('/account/update_profile_image.json', array('@image' => "@" . $img_path));
          $twitterObj->post_statusesUpdate(array('status' => 'This is my new status:' . rand())); // This works
          $twitterInfo = $twitterObj->get_accountVerify_credentials();
          echo $twitterInfo->responseText;
      } catch (Exception $e) {
          echo $e->getMessage();
      }

    I've been trying to figure this out for a while; ANY help would be greatly appreciated. I'm not in any way tied to this library, so feel free to recommend others.

    Read the article

  • Could not load php_curl

    - by Ruslan
    Hello, I have installed PHP 5.2.13 and Apache 2.2.15 on Windows XP and added C:\php to the PATH system variable. I can't enable the curl extension. I configured extension_dir and removed the ";" from the php_curl line in php.ini, but nothing. I copied ssleay32.dll and libeay32.dll into the system32 directory: still nothing. The error log says: PHP Warning: PHP Startup: Unable to load dynamic library 'C:\php\ext\php_curl.dll' - Attempt to access invalid address.\r\n in Unknown on line 0. Can someone help me? Thanks, and sorry for my bad English.

    Read the article

  • What $_POST[] values do I need to post to a forum?

    - by ikky
    Hi! I am an admin on a forum. Earlier we had phpBB 2.0 and I made a bot that could write to the forum. Now we have upgraded the forum to phpBB 3.0 and I can't get my bot to write to it anymore. I have looked for a solution but am out of ideas, so it would be great if anyone has a suggestion. I built the bot with cURL and PHP, by the way. Usage of the bot: users log in on an external website to report the results of football matches they have played online, and the bot then automatically writes a post to the forum. So basically I need to know which $_POST[] fields I have to send.
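
    phpBB 3.0 adds hidden anti-CSRF fields to the posting form, so rather than hard-coding field names it is safer to fetch the form first and replay every hidden input it contains. A rough sketch, assuming $ch is a cURL handle that is already logged in via a cookie jar and $postUrl points at posting.php for the target forum (both are assumptions, not from the question):

      <?php
      // 1. Fetch the posting form and collect all hidden inputs (form_token etc.).
      curl_setopt($ch, CURLOPT_URL, $postUrl);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      $form = curl_exec($ch);

      preg_match_all('#<input[^>]+type="hidden"[^>]+name="([^"]+)"[^>]+value="([^"]*)"#i',
                     $form, $hidden, PREG_SET_ORDER);

      $fields = array();
      foreach ($hidden as $input) {
          $fields[$input[1]] = html_entity_decode($input[2]);
      }

      // 2. Add the visible fields and the submit button, then post.
      $fields['subject'] = 'Match result';
      $fields['message'] = 'Team A 2 - 1 Team B';
      $fields['post']    = 'Submit';

      sleep(2);   // phpBB may reject forms that are submitted too quickly
      curl_setopt($ch, CURLOPT_POST, true);
      curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
      $result = curl_exec($ch);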

    Read the article

  • C++ library for dealing with multiple HTTP connections

    - by JWood
    Hi, I'm looking for a library to deal with multiple simultaneous HTTP connections (preferably on a single thread) for use with C++ on Windows, so it can be Win32 API based. So far I have tried cURL (the multi interface), which seems to be the most appropriate thing I have found, but my problem is that I may have a queue of 200 requests yet need to run only 4 of them at a time. This becomes problematic when one request may take 2 seconds and another may take 2 minutes, as you have to wait on all the handles and receive the results of all the requests in one block. If anyone knows a way around this it would be very useful. I have also tried rolling my own using WinHTTP, but I need to throttle the requests, so they would ideally need to be on a single thread and use callbacks for data, which WinHTTP does not do. The best thing I've found which would solve all my problems is ASIHTTPRequest, but unfortunately it's Mac OS X only. Thanks, J

    Read the article

  • I want to query whitepages.com 4,000 times, how to save the results?

    - by John Corbin
    I have an old customer list of 4,000 businesses. I want to determine whether the phone number associated with each listing still works (and therefore whether the business is probably still open). I can put each number into whitepages.com and check them one by one, but I want to automate that. I have looked at their API and can't digest it. I can form the correct query URL, but trying things like curl -O doesn't work. I have access to Mac tools and Unix tools, and could try various JavaScript approaches if anyone could point me in the right direction; I would even pay. Help? Thanks.
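
    A minimal PHP sketch of the batch part, assuming the correct per-number query URL can already be formed (the URL template below is a placeholder, not the real Whitepages API format); it reads numbers from a text file and saves one raw response per number for later inspection:

      <?php
      $numbers = file('phone-numbers.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
      @mkdir('results');

      foreach ($numbers as $number) {
          $url = 'https://api.example.com/lookup?phone=' . urlencode($number);   // placeholder

          $ch = curl_init($url);
          curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
          $response = curl_exec($ch);
          curl_close($ch);

          // One file per number, named by its digits only.
          file_put_contents('results/' . preg_replace('/\D/', '', $number) . '.txt', $response);

          sleep(1);   // stay well under any rate limit
      }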

    Read the article

  • Multithreaded FTP upload. Is it possible?

    - by Arty
    I need to upload multiple files from a directory to a server via FTP and SFTP. I've solved this task for SFTP with Python, paramiko and threading, but I have a problem doing the same for FTP. I tried to use ftplib for Python, but it seems that it doesn't support threading, so I upload all the files one by one, which is very slow. I'm wondering whether it is even possible to do multithreaded uploads with the FTP protocol without creating separate connections/authorizations (which takes too long)? The solution can be in Python or PHP. Maybe cURL? I would be grateful for any ideas.
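
    FTP still needs one control connection per concurrent transfer, but cURL's multi interface at least drives them all in parallel from a single thread. A rough PHP sketch (host, credentials and paths are placeholders); for very large directories the handles would be added in batches rather than all at once:

      <?php
      $files   = glob('/local/dir/*');    // placeholder source directory
      $mh      = curl_multi_init();
      $handles = array();

      foreach ($files as $path) {
          $fp = fopen($path, 'rb');
          $ch = curl_init('ftp://ftp.example.com/upload/' . basename($path));  // placeholder
          curl_setopt($ch, CURLOPT_USERPWD, 'user:password');                  // placeholder
          curl_setopt($ch, CURLOPT_UPLOAD, true);
          curl_setopt($ch, CURLOPT_INFILE, $fp);
          curl_setopt($ch, CURLOPT_INFILESIZE, filesize($path));
          curl_multi_add_handle($mh, $ch);
          $handles[] = array($ch, $fp);
      }

      // Drive all the uploads in parallel from this single thread.
      do {
          curl_multi_exec($mh, $running);
          curl_multi_select($mh);
      } while ($running > 0);

      foreach ($handles as $h) {
          curl_multi_remove_handle($mh, $h[0]);
          curl_close($h[0]);
          fclose($h[1]);
      }
      curl_multi_close($mh);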

    Read the article

  • Add a new member to a Twitter List

    - by yc10
    I'm trying to add a user (by variable $id) to a Twitter List using PHP CURL, and I can't get it to work.

      $curl_handle = curl_init();
      curl_setopt($curl_handle, CURLOPT_URL, "http://twitter.com/username/list/members.xml");
      curl_setopt($curl_handle, CURLOPT_POST, 1);
      curl_setopt($curl_handle, CURLOPT_POSTFIELDS, "id=$id");
      curl_setopt($curl_handle, CURLOPT_CONNECTTIMEOUT, 2);
      curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, 1);
      curl_setopt($curl_handle, CURLOPT_USERPWD, "username:password");
      curl_setopt($curl_handle, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1);
      curl_setopt($curl_handle, CURLOPT_VERBOSE, 1);
      $result = curl_exec($curl_handle);

      // Look at the returned header
      $resultArray = curl_getinfo($curl_handle);
      curl_close($curl_handle);

      if ($resultArray['http_code'] == "200") {
          echo 'Success';
      } else {
          echo var_dump($resultArray);
      }

    The var_dump reveals that the http_code of the return is 403.

    Read the article

  • Sun webstack vs Installing PHP, MySQL, Apache individually

    - by Vincent
    Is it possible to install PHP, MySQL and Apache individually on Solaris instead of installing them through a web stack? What are the advantages and disadvantages? I seem to frequently get a cURL error on Solaris when dealing with HTTPS sites (error:81072080:lib(129):func(114):reason(128)). I have no clue why that error is occurring and thought upgrading to the latest PHP, MySQL and Apache versions might solve it. At this point I am not even sure if it's a Solaris issue. Any advice? Thanks.

    Read the article

  • getting the real link from an RSS feed link

    - by pfunc
    I am experimenting with scraping certain pages from an RSS feed using cURL and PHP. The page scraping was working fine when I was using the actual links, not the links from the RSS feeds. However, I realize now that the links in RSS feeds are usually just redirects to the actual page (at least that is what it seems like), because now when I scrape a page via the RSS link, it doesn't actually get the information I am looking for. Has anyone encountered this and found a workaround? Is there any way to see where the RSS link redirects to and capture that value?
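
    cURL can follow the redirect and then report where it ended up, so a minimal sketch looks like this:

      <?php
      $ch = curl_init($rssLink);                       // the link taken from the feed
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow any redirects
      curl_setopt($ch, CURLOPT_MAXREDIRS, 5);

      $html     = curl_exec($ch);                      // content of the final page
      $finalUrl = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);   // the real article URL
      curl_close($ch);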

    Read the article

  • file_get_contents returns 403 forbidden

    - by absk
    I am trying to make a site scraper. I made it on my local machine and it works fine there. When I execute the same code on my server, it shows a 403 Forbidden error. I am using the PHP Simple HTML DOM Parser. The error I get on the server is this:

      Warning: file_get_contents(http://example.com/viewProperty.html?id=7715888) [function.file-get-contents]: failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden in /home/scraping/simple_html_dom.php on line 40

    The code triggering it is:

      $url = "http://www.example.com/viewProperty.html?id=" . $id;
      $html = file_get_html($url);

    I have checked php.ini on the server and allow_url_fopen is On. A possible solution could be to use cURL, but I need to know where I am going wrong.
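
    A 403 that only appears from the server often means the target site is filtering on request headers such as the User-Agent. A hedged cURL sketch that sends a browser-like User-Agent and then hands the HTML string to the parser (Simple HTML DOM's str_get_html() accepts a string instead of fetching the URL itself):

      <?php
      $ch = curl_init($url);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
      curl_setopt($ch, CURLOPT_USERAGENT,
          'Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0');
      $contents = curl_exec($ch);
      curl_close($ch);

      $html = str_get_html($contents);   // parse the fetched markup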

    Read the article

  • How can you pass GET values to another URL in PHP? GET value forwarding

    - by gobackpacking
    OK, so I'm using jQuery's AJAX function and it's having trouble passing a URL that contains an http address. So I'm hoping to grab the GET values and send them to another URL: a local PHP file gets passed the GET values and in turn forwards them to another URL. Maybe cURL is the answer? I don't know. It's got to be a very short answer, I know. Pseudo code:

      // retrieve the GET values
      $var = retrieve [GET]

      // pass them on to another url
      send get values to url ($var, url_address)
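
    A minimal sketch of the forwarding file, assuming the target address is known (the one below is a placeholder) and that passing the values along in the query string is enough:

      <?php
      // forward.php: re-sends whatever GET parameters it received.
      $target = 'http://other-server.example/handler.php';   // placeholder
      $url    = $target . '?' . http_build_query($_GET);

      $ch = curl_init($url);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      $response = curl_exec($ch);
      curl_close($ch);

      echo $response;   // hand the remote response back to the AJAX caller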

    Read the article
