Search Results

Search found 1194 results on 48 pages for 'curl'.

Page 5 of 48

  • Auto Submitting a form (cURL)

    - by cast01
    I'm writing a form to post data off to PayPal, and this works fine: I create the form with hidden fields and a submit button that sends everything to PayPal. However, when the user clicks that button there is more I want to do, for example change their cart status in the database. So I want to execute some code when they click submit and then post the data to PayPal, and I don't want to use JavaScript for this. My method at the moment uses cURL: the form posts back to the current page, I check for $_POST data, do my other work such as updating the cart status, then build a cURL request and post the form data to PayPal. The post succeeds, but the browser doesn't go off to PayPal. At first I was just retrieving the result into a string and echoing it out, and I could see that the post was successful; then I set CURLOPT_FOLLOWLOCATION to 1, assuming this would let the browser go off to PayPal, but it doesn't - what it seems to do is grab some content from PayPal and put it on my page. Is cURL the right thing for this? And if so, how do I get around this problem? I want to stay away from JavaScript here, as only users with JavaScript enabled would have their cart updated. The cURL code I'm using is below; $post_data is an array I created and then read key-value pairs from into $post_string.

        //Set cURL Options
        curl_setopt($curl_connection, CURLOPT_CONNECTTIMEOUT, 30);
        curl_setopt($curl_connection, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)");
        curl_setopt($curl_connection, CURLOPT_RETURNTRANSFER, false);
        curl_setopt($curl_connection, CURLOPT_SSL_VERIFYPEER, false);
        curl_setopt($curl_connection, CURLOPT_FOLLOWLOCATION, 1);
        curl_setopt($curl_connection, CURLOPT_MAXREDIRS, 20);

        //Set Data to be Posted
        curl_setopt($curl_connection, CURLOPT_POSTFIELDS, $post_string);

        //Execute Request
        curl_exec($curl_connection);
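
    One common way around this, sketched below under the assumption that the payment endpoint accepts the same fields as query parameters; the endpoint URL, field names and the update_cart_status() helper are placeholders, not taken from the original code. The idea is to handle the local POST, update the cart, then redirect the browser itself to PayPal instead of posting from the server with cURL - the cURL approach fails because the POST happens server-to-server, and nothing in that exchange can move the visitor's browser.

        <?php
        // Sketch only: process the local form post, then send the *browser* to PayPal.
        // The endpoint URL, field names and update_cart_status() are hypothetical.
        if ($_SERVER['REQUEST_METHOD'] === 'POST') {
            update_cart_status($_POST['cart_id'], 'checked_out');   // hypothetical helper

            $fields = array(
                'cmd'      => '_cart',
                'business' => 'seller@example.com',
                'amount'   => $_POST['amount'],
            );

            // The browser follows this redirect, so the user actually lands on PayPal.
            header('Location: https://www.paypal.com/cgi-bin/webscr?' . http_build_query($fields));
            exit;
        }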

    Read the article

  • Translating from cURL to straight HTTP requests

    - by Joshua
    What would the following cURL command look like as a generic (non-cURL) HTTP request?

        feedUri="https://www.someservice.com/feeds\
        ?prettyprint=true"

        curl $feedUri --silent \
            --header "GData-Version: 2"

    For example, how could such an HTTP request be expressed in the browser address bar? In particular, how do I express the --header information if I were to just type out the plain HTTP request?
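
    For reference, a sketch of the raw HTTP request that command produces, with the host and path taken from the example above (any headers beyond the one shown are omitted):

        GET /feeds?prettyprint=true HTTP/1.1
        Host: www.someservice.com
        GData-Version: 2

    A browser address bar can only carry the URL and query string; a custom header like GData-Version: 2 cannot be expressed there and has to be set by whatever client sends the request (--silent only affects curl's progress output, not the request itself).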

    Read the article

  • PHP Header redirection problem within page called by cURL

    - by Das123
    I have a site that uses cURL to access some pages, stores the returned results in variables, and then uses these variables within its own page. The script works well except where the target cURL page has a header('Location: ...') command inside it; it seems to just ignore this header command. The cURL code is as follows:

        //Load result page into variable so portions can be allocated to correct variables
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);           # URL to post to
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);   # return into a variable
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $postfields);
        $loaded_result = curl_exec($ch);               # run!
        curl_close($ch);

    I've tried changing CURLOPT_HEADER to 1 but it doesn't do anything. So how can I allow script redirection within the target URLs when using cURL to grab the results? By the way, the pages work fine if accessed other than via cURL, but iframes are not an option in this instance.
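
    A minimal sketch of the usual fix for this symptom: cURL only honours Location response headers (which is what header('Location: ...') produces) when CURLOPT_FOLLOWLOCATION is enabled.

        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $postfields);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow header('Location: ...') redirects
        curl_setopt($ch, CURLOPT_MAXREDIRS, 5);          // safety cap on redirect chains
        $loaded_result = curl_exec($ch);
        $final_url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);  // where the redirects ended up
        curl_close($ch);

    One caveat: after a 301/302, libcurl re-issues the follow-up request as a GET unless CURLOPT_POSTREDIR is set.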

    Read the article

  • CURL request incomplete, suspect timeout but not sure.

    - by girlygeek
    I am currently using cURL, via a PHP script run as a daily cron job, to export product data in CSV format from a site's admin area. The normal way of exporting the data is to go to the Export page in a browser, set the configuration, and click the "export data" button. But because the number of products I am exporting is very large and the export takes more than 5-10 minutes, I decided to use PHP's cURL functions to mimic this daily via cron. Previously it worked fine, but after I increased the number of products in the store by 500+, the script fails to return the exported data. Testing manually by clicking the "export" button in a browser does return the data correctly, so there is no timeout issue when the export is run in a browser. I've also confirmed that by removing/decreasing the number of products (and thus the time needed), the PHP-cURL script works fine again when run from cron. So I suspect it has something to do with timeouts, specifically with the cURL functions in PHP. I've set both CURLOPT_TIMEOUT and CURLOPT_CONNECTTIMEOUT to 0, and in the PHP-cURL script I've also set set_time_limit(3000). But it still does not work: the request times out and the script fails to return a complete set of CSV data. Any help in resolving or understanding this issue will be much appreciated!
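
    A sketch of the kind of long-running setup sometimes used for jobs like this, with generous limits and the cURL error captured so the failing side can be identified (the export URL is a placeholder). If cURL itself is not timing out, the remote server's own limits may be cutting the export short instead.

        set_time_limit(0);                              // no PHP execution limit for the cron run

        $ch = curl_init('https://example.com/admin/export.php');   // placeholder URL
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);   // connect quickly or give up
        curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // no limit on the transfer itself
        curl_setopt($ch, CURLOPT_LOW_SPEED_LIMIT, 1);   // but abort if the server goes silent...
        curl_setopt($ch, CURLOPT_LOW_SPEED_TIME, 600);  // ...for more than 10 minutes

        $csv = curl_exec($ch);
        if ($csv === false) {
            // Distinguishes a cURL-side timeout from the remote side dropping the request.
            error_log('Export failed: ' . curl_errno($ch) . ' ' . curl_error($ch));
        }
        curl_close($ch);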

    Read the article

  • CURL alternative in Python

    - by Gaurav
    I have a cURL call that I use in PHP:

        curl -i -H 'Accept: application/xml' -u login:key "https://app.streamsend.com/emails"

    I need a way to do the same thing in Python. Is there an alternative to cURL in Python? I know of urllib, but I'm a Python noob and have no idea how to use it.

    Read the article

  • cURL/PHP Request Executes 50% of the Time

    - by makavelli
    After searching all over, I can't understand why cURL requests issued to a remote SSL-enabled host succeed only about 50% of the time in my case. Here's the situation: I have a sequence of cURL requests, all issued to an HTTPS remote host, within a single PHP script that I run using the PHP CLI. Occasionally when I run the script the requests execute successfully, but most of the time I get the following error from cURL:

        * About to connect() to www.virginia.edu port 443 (#0)
        *   Trying 128.143.22.36... * connected
        * Connected to www.virginia.edu (128.143.22.36) port 443 (#0)
        * successfully set certificate verify locations:
        *   CAfile: none
            CApath: /etc/ssl/certs
        * error:140943FC:SSL routines:SSL3_READ_BYTES:sslv3 alert bad record mac
        * Closing connection #0

    If I try again a few times I get the same result, but then after a few tries the requests will go through successfully. Running the script after that again results in an error, and the pattern continues. Researching the error 'alert bad record mac' didn't give me anything helpful, and I hesitate to blame it on an SSL issue since the script still runs occasionally. I'm on Ubuntu Server 10.04, with php5 and php5-curl installed, as well as the latest version of openssl. In terms of cURL-specific options, CURLOPT_SSL_VERIFYPEER is set to false, and both CURLOPT_TIMEOUT and CURLOPT_CONNECTTIMEOUT are set to 4 seconds. Further illustrating this problem is the fact that the same exact situation occurs on my Mac OS X dev machine - the requests only go through ~50% of the time.
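
    Not a fix for the underlying handshake failure, but a sketch of a retry wrapper that at least records the cURL error number on each intermittent failure, which helps when comparing good and bad runs:

        function fetch_with_retry($url, $attempts = 3) {
            for ($i = 1; $i <= $attempts; $i++) {
                $ch = curl_init($url);
                curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
                curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
                curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 4);
                curl_setopt($ch, CURLOPT_TIMEOUT, 4);
                $body = curl_exec($ch);
                $err  = curl_errno($ch);     // e.g. 35 (SSL connect error) on handshake failures
                curl_close($ch);
                if ($body !== false) {
                    return $body;
                }
                error_log("attempt $i failed with cURL error $err");
                sleep(1);                    // brief pause before retrying
            }
            return false;
        }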

    Read the article

  • PHP CURL works fine from localhost not from server

    - by Joby Joseph
    I have 2 servers, srv1 and srv2. All client sites are stored on srv2 and all authentication details are stored on srv1. Each time a client site is loaded, the site on srv2 sends a cURL request to srv1 to validate. I always get bool(false) when I print the cURL response using var_dump, but if I request the validation from my local WAMP installation (localhost) it returns the response perfectly. My understanding is that srv2 is blocking srv1's IP, or something like that. Any help or suggestions will be greatly appreciated; the solution is really urgent, as all my clients are stuck with an invalid authorization message.

    Edit: cURL on srv2 now seems to be working fine in general, because I am able to fetch other websites without any trouble. But I am not able to fetch data from srv1. I can view a page from srv1 through a browser URL, but not through cURL.
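
    A short diagnostic sketch that surfaces why curl_exec() returns false rather than just var_dumping it; the validation URL is a placeholder:

        $ch = curl_init('https://srv1.example.com/validate.php');   // placeholder URL on srv1
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
        $response = curl_exec($ch);

        if ($response === false) {
            // curl_error() usually reveals whether it is DNS, a refused or filtered
            // connection (firewall), or an SSL problem between the two servers.
            echo 'cURL error ' . curl_errno($ch) . ': ' . curl_error($ch);
        } else {
            echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE);
        }
        curl_close($ch);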

    Read the article

  • How would you show a file with cURL?

    - by Kyle
    There is an epic lack of PHP cURL love on the Internet for beginners like me. I was wondering how to use cURL to download & display an ICS file (They're plain text to me...) in my PHP code. Unless fopen() is 1,000 times easier, I'd like to stick with cURL for this one.
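
    A minimal sketch of the fetch-and-echo pattern, assuming the .ics URL is known (the URL here is a placeholder):

        $ch = curl_init('https://example.com/calendar.ics');   // placeholder URL
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);         // hand the body back instead of printing it
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $ics = curl_exec($ch);
        curl_close($ch);

        if ($ics !== false) {
            header('Content-Type: text/plain; charset=utf-8');  // ICS is plain text, as noted above
            echo $ics;
        }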

    Read the article

  • php cURL problem

    - by dfilkovi
    I have a problem logging onto a page and then using it with cURL. I log in, get the PHPSESSID cookie, and then try to perform an action, but the page returns 'not logged in'. However, if I log in manually and copy/paste that PHPSESSID into cURL's cookies.txt file, everything works fine. So why doesn't it work with the PHPSESSID obtained through cURL?
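
    A sketch of the cookie-jar pattern that usually makes this work: both the login request and the follow-up action point at the same jar file, and the jar is only flushed to disk when the first handle is closed (URLs and form fields are placeholders).

        $jar = __DIR__ . '/cookies.txt';

        // 1) Log in; the PHPSESSID cookie set by the server goes into the jar.
        $ch = curl_init('https://example.com/login.php');             // placeholder URL
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, 'user=me&pass=secret');  // placeholder fields
        curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);     // write cookies here on curl_close()
        curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);    // and send them on later requests
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_exec($ch);
        curl_close($ch);                               // the jar is written to disk here

        // 2) Reuse the same jar for the authenticated action.
        $ch = curl_init('https://example.com/action.php');            // placeholder URL
        curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);
        curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $result = curl_exec($ch);
        curl_close($ch);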

    Read the article

  • PHP cURL loading delay

    - by lasquarte
    Hello. My problem is that I need to cURL-load a page that uses Ajax-based search in order to get the results of that search, and I need to organize a delay between curl_exec() and returning the value. In other words, I need curl_exec() to run for no less than 5 seconds. sleep() seems to stop the cURL execution and does not work. I will greatly appreciate any hint or clue.

    UPD: I don't know how, but on this page http://vkontakte.ru/gsearch.php?section=video&q=sample&name=1 (it requires an account to access), cURL DOES capture the search made by Ajax. But if the page takes too long to load, the Ajax returns an "action was too fast" error. So I just need to prolong the cURL execution. Sorry if I'm being unclear.
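
    cURL never executes the page's JavaScript, so there is nothing to wait for inside a single curl_exec(); what can be delayed is the gap between two requests. A sketch under that assumption, with placeholder URLs:

        $jar = __DIR__ . '/cookies.txt';

        // First request: load the search page so the site sees a normal visit.
        $ch = curl_init('http://example.com/gsearch.php?q=sample');     // placeholder URL
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);
        curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);
        curl_exec($ch);
        curl_close($ch);

        sleep(5);   // the delay happens between requests, not inside curl_exec()

        // Second request: call the endpoint the Ajax search would have called.
        $ch = curl_init('http://example.com/ajax_search.php?q=sample'); // placeholder URL
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);
        $results = curl_exec($ch);
        curl_close($ch);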

    Read the article

  • Curl CONNECTION OPTIONS in C++

    - by cinek1lol
    Hi, I'd like to know how to check the speed of a file being uploaded in real time using the curl library in C++. This is what I have written:

        curl_easy_getinfo(curl, CURLINFO_SPEED_UPLOAD, &c);

    The manual says this shows the average speed, but even that doesn't seem to work for me, because I can only ever see 0. There is one more thing: how do I set an upload limit that works? Because if I write this:

        curl_easy_setopt(curl, CURLOPT_MAX_SEND_SPEED_LARGE, 100);

    I get an error 502 message.

    Read the article

  • cURL working on localhost but not on live website

    - by randyttp
    I have a small piece of script to grab images from a different website. It works perfectly on localhost but does not work on the live website, even though according to phpinfo() on that site cURL is enabled. I can't post the script as it's sensitive - but off the top of your head, what things should I check for to solve this issue? Why would cURL work on localhost and on other hosts, but not on others that also supposedly have cURL enabled?
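
    Without the script there is little to go on, but a sketch of the verbose-logging approach often used to compare the two environments (the URL and log path are placeholders); diffing the logs from localhost and the live host usually shows whether DNS, SSL or an outbound firewall is the difference.

        $log = fopen('/tmp/curl_debug.log', 'w');          // placeholder log path

        $ch = curl_init('https://example.com/image.jpg');  // placeholder URL
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_VERBOSE, true);           // full request/response trace...
        curl_setopt($ch, CURLOPT_STDERR, $log);            // ...written to the log file
        $img = curl_exec($ch);

        if ($img === false) {
            error_log('cURL error: ' . curl_error($ch));
        }
        curl_close($ch);
        fclose($log);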

    Read the article

  • PCI compliance - Setting BIND to no recursion, cURL can't access external sites

    - by Exit
    I was running a PCI scan and was following directions to change the BIND options from:

        // recursion no;
        allow-recursion { trusted;};
        allow-notify { trusted;};
        allow-transfer { trusted;};

    to:

        recursion no;
        allow-recursion { none;};
        allow-notify { trusted;};
        allow-transfer { none;};

    The end result was that cURL operations stopped being able to access external sites. I realize that not everything will be 100% for PCI compliance, but can someone explain if there is a way to balance this for both PCI compliance and function?
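
    A sketch of one common middle ground, assuming the cURL failures come from the server using its own BIND instance as a resolver: keep recursion available, but only for a trusted ACL that includes the server itself, so the scanner no longer sees an open resolver. The ACL contents are placeholders.

        acl trusted { 127.0.0.1; ::1; };       // plus any internal ranges that need lookups

        options {
            recursion yes;                     // recursion stays on...
            allow-recursion { trusted; };      // ...but only for the trusted ACL
            allow-notify { trusted; };
            allow-transfer { none; };
        };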

    Read the article

  • SSL / HTTP / No Response to Curl

    - by Alex McHale
    I am trying to send commands to a SOAP service, and getting nothing in reply. The SOAP service is at a completely separate site from either server I am testing with. I have written a dummy script with the SOAP XML embedded. When I run it at my local site, on any of three machines -- OSX, Ubuntu, or CentOS 5.3 -- it completes successfully with a good response. I then sent the script to our public host at Slicehost, where I fail to get the response back from the SOAP service. It accepts the TCP socket and proceeds with the SSL handshake. I do not however receive any valid HTTP response. This is the case whether I use my script or curl on the command line. I have rewritten the script using SOAP4R, Net::HTTP and Curb. All of which work at my local site, none of which work at the Slicehost site. I have tried to assemble the CentOS box as closely to match my Slicehost server as possible. I rebuilt the Slice to be a stock CentOS 5.3 and stock CentOS 5.4 with the same results. When I look at a tcpdump of the bad sessions on Slicehost, I see my script or curl send the XML to the remote server, and nothing comes back. When I look at the tcpdump at my local site, I see the response just fine. I have entirely disabled iptables on the Slice. Does anyone have any ideas what could be causing these results? Please let me know what additional information I can furnish. Thank you! Below is a wire trace of a sample session. The IP that starts with 173 is my server while the IP that starts with 12 is the SOAP server's. No. Time Source Destination Protocol Info 1 0.000000 173.45.x.x 12.36.x.x TCP 36872 > https [SYN] Seq=0 Win=5840 Len=0 MSS=1460 TSV=137633469 TSER=0 WS=6 Frame 1 (74 bytes on wire, 74 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 0, Len: 0 No. Time Source Destination Protocol Info 2 0.040000 12.36.x.x 173.45.x.x TCP https > 36872 [SYN, ACK] Seq=0 Ack=1 Win=8760 Len=0 MSS=1460 Frame 2 (62 bytes on wire, 62 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 0, Ack: 1, Len: 0 No. Time Source Destination Protocol Info 3 0.040000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=1 Ack=1 Win=5840 Len=0 Frame 3 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 1, Ack: 1, Len: 0 No. Time Source Destination Protocol Info 4 0.050000 173.45.x.x 12.36.x.x SSLv2 Client Hello Frame 4 (156 bytes on wire, 156 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 1, Ack: 1, Len: 102 Secure Socket Layer No. 
Time Source Destination Protocol Info 5 0.130000 12.36.x.x 173.45.x.x TCP [TCP segment of a reassembled PDU] Frame 5 (1434 bytes on wire, 1434 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 1, Ack: 103, Len: 1380 Secure Socket Layer No. Time Source Destination Protocol Info 6 0.130000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=103 Ack=1381 Win=8280 Len=0 Frame 6 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 1381, Len: 0 No. Time Source Destination Protocol Info 7 0.130000 12.36.x.x 173.45.x.x TLSv1 Server Hello, Certificate, Server Hello Done Frame 7 (1280 bytes on wire, 1280 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 1381, Ack: 103, Len: 1226 [Reassembled TCP Segments (2606 bytes): #5(1380), #7(1226)] Secure Socket Layer No. Time Source Destination Protocol Info 8 0.130000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=103 Ack=2607 Win=11040 Len=0 Frame 8 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 2607, Len: 0 No. Time Source Destination Protocol Info 9 0.130000 173.45.x.x 12.36.x.x TLSv1 Client Key Exchange, Change Cipher Spec, Encrypted Handshake Message Frame 9 (236 bytes on wire, 236 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 2607, Len: 182 Secure Socket Layer No. Time Source Destination Protocol Info 10 0.190000 12.36.x.x 173.45.x.x TLSv1 Change Cipher Spec, Encrypted Handshake Message Frame 10 (97 bytes on wire, 97 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2607, Ack: 285, Len: 43 Secure Socket Layer No. Time Source Destination Protocol Info 11 0.190000 173.45.x.x 12.36.x.x TLSv1 Application Data Frame 11 (347 bytes on wire, 347 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 285, Ack: 2650, Len: 293 Secure Socket Layer No. 
Time Source Destination Protocol Info 12 0.190000 173.45.x.x 12.36.x.x TCP [TCP segment of a reassembled PDU] Frame 12 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 13 0.450000 12.36.x.x 173.45.x.x TCP https > 36872 [ACK] Seq=2650 Ack=578 Win=64958 Len=0 Frame 13 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2650, Ack: 578, Len: 0 No. Time Source Destination Protocol Info 14 0.450000 173.45.x.x 12.36.x.x TCP [TCP segment of a reassembled PDU] Frame 14 (206 bytes on wire, 206 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 2038, Ack: 2650, Len: 152 No. Time Source Destination Protocol Info 15 0.510000 12.36.x.x 173.45.x.x TCP [TCP Dup ACK 13#1] https > 36872 [ACK] Seq=2650 Ack=578 Win=64958 Len=0 Frame 15 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2650, Ack: 578, Len: 0 No. Time Source Destination Protocol Info 16 0.850000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 16 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 17 1.650000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 17 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 18 3.250000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 18 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. 
Time Source Destination Protocol Info 19 6.450000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 19 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer

    Read the article

  • WGet or cURL: Mirror Site from http://site.com And No Internal Access

    - by alharaka
    I have tried wget -m, wget -r, and a whole bunch of variations. I am getting some of the images on http://site.com, one of the scripts, and none of the CSS, even with the fscking -p parameter. The only HTML page is index.html and there are several more referenced, so I am at a loss. curlmirror.pl on the cURL developers' website does not seem to get the job done either. Is there something I am missing? I have tried different levels of recursion with only this URL, but I get the feeling I am missing something. Long story short, some school allows its students to submit web projects, but they want to know how they can collect everything for the instructor who will grade it, instead of him going to all the externally hosted sites.

    UPDATE: I think I figured out the issue. I thought the links to the other pages were in the index.html page that downloaded. I was way off. It turns out the footer of the page, which has all the navigation links, is handled by a JavaScript file, Include.js, which reads JLSSiteMap.js and some other JS files to do page navigation and the like. As a result, wget does not pick up any other dependencies because a lot of this crap is handled not in web pages. How can I handle such a website? This is one of several problem cases. I assume little can be done if wget cannot parse JavaScript.

    Read the article

  • HTTP responses curl and wget different results

    - by Fab
    To check the HTTP response header for a set of URLs, I send the following request headers with cURL:

        foreach ($urls as $url) {
            // Setup headers - I used the same headers from Firefox version 2.0.0.6
            $header[] = "Accept: text/xml,application/xml,application/xhtml+xml,";
            $header[] = "text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
            $header[] = "Cache-Control: max-age=0";
            $header[] = "Connection: keep-alive";
            $header[] = "Keep-Alive: 300";
            $header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7";
            $header[] = "Accept-Language: en-us,en;q=0.5";
            $header[] = "Pragma: "; // browsers keep this blank.

            curl_setopt($ch, CURLOPT_URL, $url);
            curl_setopt($ch, CURLOPT_USERAGENT, 'Googlebot/2.1 (+http://www.google.com/bot.html)');
            curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
            curl_setopt($ch, CURLOPT_REFERER, 'http://www.google.com');
            curl_setopt($ch, CURLOPT_HEADER, true);
            curl_setopt($ch, CURLOPT_NOBODY, true);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_ANY);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10); // timeout 10 seconds
        }

    Sometimes I receive 200 OK, which is good; other times 301, 302, or 307, which I consider good as well; but other times I receive weird statuses such as 406, 500, and 504, which should identify an invalid URL - yet when I open the URL in a browser it is fine. For example, the script returns:

        http://www.awe.co.uk/ => HTTP/1.1 406 Not Acceptable

    while wget returns:

        wget http://www.awe.co.uk/
        --2011-06-23 15:26:26--  http://www.awe.co.uk/
        Resolving www.awe.co.uk... 77.73.123.140
        Connecting to www.awe.co.uk|77.73.123.140|:80... connected.
        HTTP request sent, awaiting response... 200 OK

    Does anyone know which request header I am missing or adding in excess?
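
    One observation rather than a confirmed diagnosis: wget succeeds while sending its own minimal defaults, so a simple comparison is to reproduce roughly those headers in cURL and see whether the 406 disappears - the spoofed Googlebot User-Agent in particular is something many sites treat specially. A sketch:

        $ch = curl_init('http://www.awe.co.uk/');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, true);
        curl_setopt($ch, CURLOPT_NOBODY, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        // Roughly what wget sends by default: a Wget agent and a blanket Accept header.
        curl_setopt($ch, CURLOPT_USERAGENT, 'Wget/1.12 (linux-gnu)');
        curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept: */*'));
        $head = curl_exec($ch);
        echo strtok($head, "\r\n");   // prints just the status line, e.g. "HTTP/1.1 200 OK"
        curl_close($ch);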

    Read the article

  • Saving images only available when logged in

    - by James
    Hi, I've been having some trouble getting images to download from a website that requires you to be logged in. The images can only be viewed when you are logged in to the site, but you cannot view them directly in the browser if you copy their location into a new tab/window (it redirects to an error page - so I guess the containing folder has been .htaccess-ed). Anyway, the code I have below allows me to log in and grab the HTML content, which works well - but I cannot grab the images... this is where I need help!

        <? // curl.php
        class Curl {
            public $cookieJar = "";

            public function __construct($cookieJarFile = 'cookies.txt') {
                $this->cookieJar = $cookieJarFile;
            }

            function setup() {
                $header = array();
                $header[0] = "Accept: text/xml,application/xml,application/xhtml+xml,";
                $header[0] .= "text/html;q=0.9,text/plain;q=0.8,image/gif;q=0.8,image/x-bitmap;q=0.8,image/jpeg;q=0.8,image/png,*/*;q=0.5";
                $header[] = "Cache-Control: max-age=0";
                $header[] = "Connection: keep-alive";
                $header[] = "Keep-Alive: 300";
                $header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7";
                $header[] = "Accept-Language: en-us,en;q=0.5";
                $header[] = "Pragma: "; // browsers keep this blank.

                curl_setopt($this->curl, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US; rv:1.8.1.7) Gecko/20070914 Firefox/2.0.0.7');
                curl_setopt($this->curl, CURLOPT_HTTPHEADER, $header);
                curl_setopt($this->curl, CURLOPT_COOKIEJAR, $this->cookieJar);
                curl_setopt($this->curl, CURLOPT_COOKIEFILE, $this->cookieJar);
                curl_setopt($this->curl, CURLOPT_AUTOREFERER, true);
                curl_setopt($this->curl, CURLOPT_FOLLOWLOCATION, true);
                curl_setopt($this->curl, CURLOPT_RETURNTRANSFER, true);
            }

            function get($url) {
                $this->curl = curl_init($url);
                $this->setup();
                return $this->request();
            }

            function getAll($reg, $str) {
                preg_match_all($reg, $str, $matches);
                return $matches[1];
            }

            function postForm($url, $fields, $referer = '') {
                $this->curl = curl_init($url);
                $this->setup();
                curl_setopt($this->curl, CURLOPT_URL, $url);
                curl_setopt($this->curl, CURLOPT_POST, 1);
                curl_setopt($this->curl, CURLOPT_REFERER, $referer);
                curl_setopt($this->curl, CURLOPT_POSTFIELDS, $fields);
                return $this->request();
            }

            function getInfo($info) {
                $info = ($info == 'lasturl') ? curl_getinfo($this->curl, CURLINFO_EFFECTIVE_URL) : curl_getinfo($this->curl, $info);
                return $info;
            }

            function request() {
                return curl_exec($this->curl);
            }
        }
        ?>

    And below is the page that uses it.

        <? // data.php
        include('curl.php');
        $curl = new Curl();

        $url = "http://domain.com/login.php";
        $newURL = "http://domain.com/go_here.php";
        $username = "user";
        $password = "pass";
        $fields = "user=$username&pass=$password";
        // Calling URL
        $referer = "http://domain.com/refering_page.php";

        $html = $curl->postForm($url, $fields, $referer);
        $html = $curl->get($newURL);
        echo $html;
        ?>

    I've tried putting the direct URL for the image into $newURL, but that doesn't get the image - it simply returns an error saying that folder is not available to view directly. I've tried varying the above in different ways, but I haven't been successful in getting an image, though I have managed to get a screen basically saying error 405 and/or 406 (but not the image I want). Any help would be great!
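
    A sketch of how the Curl class above is sometimes extended for binary downloads; saveImage() is a hypothetical addition, not part of the original class. It reuses the same cookie jar, sends a Referer from inside the site, and writes the raw bytes to disk instead of echoing them.

        // Hypothetical addition to the Curl class: fetch a protected image with the
        // logged-in cookie jar and save the raw bytes to a file.
        function saveImage($imageUrl, $savePath, $referer = '') {
            $this->curl = curl_init($imageUrl);
            $this->setup();                                       // same cookies/headers as get()
            curl_setopt($this->curl, CURLOPT_REFERER, $referer);  // some sites check this for images
            $data = $this->request();
            if ($data === false) {
                return false;
            }
            return file_put_contents($savePath, $data) !== false;
        }

        // Usage sketch (URLs are placeholders):
        // $curl->postForm($url, $fields, $referer);              // log in first
        // $curl->saveImage('http://domain.com/images/secret.jpg', 'secret.jpg', $newURL);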

    Read the article

  • Convert this curl to multi PHP

    - by user1642423
    I have this code and I want to make 10 cURL connections like this with curl_multi, but I don't know how to do that with this specific code. What the code does: it makes a cURL request to an .aspx page, uses the result to send some data in a form ($CIUDAD), the page then receives this submission, makes an internal request and shows a result, and the code outputs that final result.

        function curl($header, $encoded, $cookie) {
            $options = array(
                CURLOPT_USERAGENT => $_SERVER['HTTP_USER_AGENT'],
                CURLOPT_TIMEOUT => 120,
                //CURLOPT_REFERER => '',
                //CURLOPT_HTTPHEADER => $header,
                CURLOPT_COOKIE => $cookie,
                CURLOPT_POST => true,
                CURLOPT_POSTFIELDS => $encoded,
                CURLOPT_RETURNTRANSFER => true,
                CURLOPT_HEADER => false,
                CURLOPT_FOLLOWLOCATION => true,
            );
            $ch = curl_init("http://procesos.ramajudicial.gov.co/consultaprocesos/consultap.aspx");
            curl_setopt_array($ch, $options);
            $output = curl_exec($ch);
            curl_close($ch);
            return $output;
        }

        $cookie = "";
        foreach ($_COOKIE as $k => $v)
            $cookie .= $k . "=" . $v . ";";
        $cookie = substr($cookie, 0, strlen($cookie) - 1);

        $encoded = '';
        foreach ($_POST as $name => $value) {
            $encoded .= urlencode($name) . '=' . urlencode($value) . '&';
        }

        $lk = "http://procesos.ramajudicial.gov.co/consultaprocesos/";
        $header[] = 'User-Agent: ' . $_SERVER['HTTP_USER_AGENT'];
        $header[] = 'Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5';
        $header[] = 'Accept-Language: en-us,en;q=0.5';
        $header[] = 'Accept-Encoding: gzip,deflate';
        $header[] = 'Connection: keep-alive';
        $header[] = 'Cookie : ' . $cookie;
        $header[] = 'Content-Type: application/x-www-form-urlencoded';

        $output = curl($header, $encoded, $cookie);

        $CIUDAD = urlencode("Medellin"); // to change
        $CORPORACION = urlencode("JUZGADOS CIVILES MUNICIPALES DE MEDELLIN"); // to change
        $DIGITOS = $numsus;

        // BEGIN STEP 1
        $__VIEWSTATE = 'id="__VIEWSTATE" value="';
        $i = stripos($output, $__VIEWSTATE) + strlen($__VIEWSTATE);
        $j = stripos($output, '"', $i);
        $__VIEWSTATE = substr($output, $i, $j - $i);

        $__EVENTVALIDATION = 'id="__EVENTVALIDATION" value="';
        $i = stripos($output, $__EVENTVALIDATION) + strlen($__EVENTVALIDATION);
        $j = stripos($output, '"', $i);
        $__EVENTVALIDATION = substr($output, $i, $j - $i);

        $encoded = '__EVENTTARGET=DropDownList1&__EVENTARGUMENT=&__LASTFOCUS=&__VIEWSTATE=' . urlencode($__VIEWSTATE) . '&__EVENTVALIDATION=' . urlencode($__EVENTVALIDATION) . '&DropDownList1=' . $CIUDAD . '&TextBox13=';
        $output = curl($header, $encoded, $cookie);
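
    A minimal sketch of the curl_multi pattern, assuming the goal is simply to run several of these POSTs in parallel; the per-request payloads are placeholders, and each one would still be built the same way as in the single-request flow above.

        function build_handle($encoded, $cookie) {
            // Same options as the curl() function above, but the handle is returned
            // instead of executed, so curl_multi can run many of them at once.
            $ch = curl_init("http://procesos.ramajudicial.gov.co/consultaprocesos/consultap.aspx");
            curl_setopt_array($ch, array(
                CURLOPT_TIMEOUT        => 120,
                CURLOPT_COOKIE         => $cookie,
                CURLOPT_POST           => true,
                CURLOPT_POSTFIELDS     => $encoded,
                CURLOPT_RETURNTRANSFER => true,
                CURLOPT_FOLLOWLOCATION => true,
            ));
            return $ch;
        }

        $payloads = array(/* ... ten different $encoded strings ... */);   // placeholders

        $mh = curl_multi_init();
        $handles = array();
        foreach ($payloads as $i => $encoded) {
            $handles[$i] = build_handle($encoded, $cookie);
            curl_multi_add_handle($mh, $handles[$i]);
        }

        // Drive all transfers until they finish.
        $running = null;
        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);            // avoid busy-waiting
        } while ($running > 0);

        $results = array();
        foreach ($handles as $i => $ch) {
            $results[$i] = curl_multi_getcontent($ch);   // body of each response
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);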

    Read the article

  • how can i set cookie in curl

    - by Sushant Panigrahi
    I am fetching a page from another site, but it displays nothing and the URL in the address bar changes. For example, I typed http://localhost/sushant/EXAMPLE_ROUGH/curl.php, and my code in that curl page is:

        $fp = fopen("cookie.txt", "w");
        fclose($fp);
        $agent = 'Mozilla/5.0 (Windows; U; Windows NT 5.1; pl; rv:1.9) Gecko/2008052906 Firefox/3.0';

        $ch = curl_init();
        curl_setopt($ch, CURLOPT_USERAGENT, $agent);
        // 2. set the options, including the url
        curl_setopt($ch, CURLOPT_URL, "http://www.fnacspectacles.com/place-spectacle/manifestation/Grand-spectacle-LE-ROI-LION-ROI4.htm");
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookie.txt');
        curl_setopt($ch, CURLOPT_COOKIEFILE, "cookie.txt");
        // 3. execute and fetch the resulting HTML output
        if (curl_exec($ch) === false) {
            echo 'Curl error: ' . curl_error($ch);
        } else
            echo $output = curl_exec($ch);
        // 4. free up the curl handle
        curl_close($ch);

    But the URL changes to something like http://localhost/aide.do?sht=_aide_cookies_ and I get "object not found". How can I solve this problem? Please help.
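
    For the title question specifically, a sketch of the two ways a cookie can be supplied on a cURL request - an explicit Cookie header, or a jar file as in the code above (the cookie value is a placeholder):

        // Option 1: explicit cookie string for this one request (placeholder value).
        curl_setopt($ch, CURLOPT_COOKIE, 'PHPSESSID=abc123; lang=fr');

        // Option 2: persistent jar - cookies received from the server are written out
        // on curl_close() and sent again by any later handle pointing at the same file.
        curl_setopt($ch, CURLOPT_COOKIEJAR,  __DIR__ . '/cookie.txt');
        curl_setopt($ch, CURLOPT_COOKIEFILE, __DIR__ . '/cookie.txt');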

    Read the article

  • PHP & cUrl - POST problems in while loop.

    - by Max Hoff
    I have a while loop with cURL inside. (I can't use curl_multi for various reasons.) The problem is that cURL seems to save the data it POSTs after each loop traversal. For instance, if parameter X is "One" the first time through the loop and "Two" the second time through, cURL posts "One,Two". It should just POST "Two". (This is despite closing and unsetting the curl handle.) Here's a simplified version of the code (with unnecessary info stripped out):

        while (true) {
            // code to form the $url. this code is local to the loop, so the variables
            // should be "erased" and made new for each loop through.
            $ch = curl_init();
            $userAgent = 'Googlebot/2.1 (http://www.googlebot.com/bot.html)';
            curl_setopt($ch, CURLOPT_USERAGENT, $userAgent);
            curl_setopt($ch, CURLOPT_URL, $url);
            curl_setopt($ch, CURLOPT_POST, count($fields));
            curl_setopt($ch, CURLOPT_POSTFIELDS, $fields_string);
            curl_setopt($ch, CURLOPT_FAILONERROR, true);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_setopt($ch, CURLOPT_AUTOREFERER, true);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10);
            $html = curl_exec($ch);
            curl_close($ch);
            unset($ch);

            $dom = new DOMDocument();
            @$dom->loadHTML($html);
            $xpath = new DOMXPath($dom);
            $resultTable = $xpath->evaluate("/html/body//table");
            // $resultTable is 20 the first time through the loop, and 0 every time thereafter
            // because the POSTing doesn't work right with the "saved" parameters.
        }

    What am I doing wrong here?
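
    The handle shown is recreated on every pass, so the snippet itself cannot accumulate anything; what it cannot rule out is the code that builds $fields_string. A sketch of the pattern that avoids accumulation by rebuilding the POST body from scratch inside the loop (current_value() and the field name are placeholders):

        while (true) {
            // Rebuild the payload from scratch each pass so nothing carries over.
            $fields = array('x' => current_value());      // hypothetical helper, placeholder field
            $fields_string = http_build_query($fields);   // a fresh string, never appended to

            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, $fields_string);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10);
            $html = curl_exec($ch);
            curl_close($ch);

            // ... parse $html as before ...
        }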

    Read the article

  • CURL C API: callback was not called

    - by Pierre
    Hi all, the code below is a test for the cURL C API. The problem is that the callback function write_callback is never called. Why?

        /** compilation: g++ source.cpp -lcurl */
        #include <assert.h>
        #include <iostream>
        #include <cstdlib>
        #include <cstring>
        #include <cassert>
        #include <curl/curl.h>
        using namespace std;

        static size_t write_callback(void *ptr, size_t size, size_t nmemb, void *userp)
        {
            std::cerr << "CALLBACK WAS CALLED" << endl;
            exit(-1);
            return size * nmemb;
        }

        static void test_curl()
        {
            int any_data = 1;
            CURLM* multi_handle = NULL;
            CURL* handle_curl = ::curl_easy_init();
            assert(handle_curl != NULL);
            ::curl_easy_setopt(handle_curl, CURLOPT_URL, "http://en.wikipedia.org/wiki/Main_Page");
            ::curl_easy_setopt(handle_curl, CURLOPT_WRITEDATA, &any_data);
            ::curl_easy_setopt(handle_curl, CURLOPT_VERBOSE, 1);
            ::curl_easy_setopt(handle_curl, CURLOPT_WRITEFUNCTION, write_callback);
            ::curl_easy_setopt(handle_curl, CURLOPT_USERAGENT, "libcurl-agent/1.0");
            multi_handle = ::curl_multi_init();
            assert(multi_handle != NULL);
            ::curl_multi_add_handle(multi_handle, handle_curl);
            int still_running = 0;
            /* lets start the fetch */
            while (::curl_multi_perform(multi_handle, &still_running) == CURLM_CALL_MULTI_PERFORM);
            std::cerr << "End of curl_multi_perform." << endl;
            //cleanup should go here
            ::exit(EXIT_SUCCESS);
        }

        int main(int argc, char** argv)
        {
            test_curl();
            return 0;
        }

    Many thanks, Pierre

    Read the article

  • Wordpress curl save Images

    - by Jeton Ramadani
    I am working on saving images from external sites into a folder in my WordPress theme, and I was wondering if it's OK to call cURL twice, or whether it can be done in one go. Example:

        $data = get_url('http://www.veoh.com/watch/v19935546Y8hZPgbZ'); // getting the url - first curl instance
        preg_match('/fullHighResImagePath="(.*?)"/', $data, $thumbnail); // find the image from content
        savePhoto($thumbnail, $post->ID); // 2nd instance of curl to save the image

        function get_url($url) {
            $user_agent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2)";
            $ytc = curl_init();                               // initialize curl handle
            curl_setopt($ytc, CURLOPT_URL, $url);             // set url to post to
            curl_setopt($ytc, CURLOPT_FAILONERROR, 1);        // Fail on errors
            curl_setopt($ytc, CURLOPT_FOLLOWLOCATION, 1);     // allow redirects
            curl_setopt($ytc, CURLOPT_RETURNTRANSFER, 1);     // return into a variable
            curl_setopt($ytc, CURLOPT_PORT, 80);              // Set the port number
            curl_setopt($ytc, CURLOPT_TIMEOUT, 15);           // times out after 15s
            curl_setopt($ytc, CURLOPT_HEADER, 1);             // include HTTP headers
            curl_setopt($ytc, CURLOPT_USERAGENT, $user_agent);
            $source = curl_exec($ytc);
            curl_close($ytc);
            $data = trim($source);
            return $data;
        }

        function savePhoto($remoteImage, $isbn) {
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $remoteImage);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
            $fileContents = curl_exec($ch);
            curl_close($ch);

            if (DIRECTORY_SEPARATOR == '/') {
                $absolute_path = dirname(__FILE__) . '/';
            } else {
                $absolute_path = str_replace('\\', '/', dirname(__FILE__)) . '/';
            }

            $newImg = imagecreatefromstring($fileContents);
            return imagejpeg($newImg, $absolute_path . "video_images/{$isbn}.jpg", 100);
        }
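
    Two different URLs do mean two HTTP transfers, but a sketch of how a single handle is sometimes reused for both requests, which avoids re-initialising cURL; note it passes the first capture group ($m[1]) rather than the whole match array:

        $ch = curl_init();
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

        // First transfer: the page HTML.
        curl_setopt($ch, CURLOPT_URL, 'http://www.veoh.com/watch/v19935546Y8hZPgbZ');
        $data = curl_exec($ch);

        preg_match('/fullHighResImagePath="(.*?)"/', $data, $m);

        // Second transfer on the same handle: the image itself.
        curl_setopt($ch, CURLOPT_URL, $m[1]);
        $image = curl_exec($ch);
        curl_close($ch);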

    Read the article

  • PHP, Apache and curl: Differences between Windows and Linux?

    - by beginner_
    I'm trying to run my PHP app on Ubuntu Server 11.10. The app works fine under Apache + PHP on Windows, and I have other applications that I can simply copy and paste between the two OSes and they work on both (those don't use cURL). However, this one uses the PHP library Tonic (RESTful web services) and makes use of the PHP cURL module. The issue is that I'm not getting an error message, which makes it impossible to find the problem. I (must) use NTLM authentication, and this is done with the AuthenNTLM Apache module:

        Order allow,deny
        Allow from all
        PerlAuthenHandler Apache2::AuthenNTLM
        AuthType ntlm
        AuthName "Protected Access"
        require valid-user
        PerlAddVar ntdomain "domainName server"
        PerlSetVar defaultdomain domainName
        PerlSetVar ntlmsemtimeout 2
        PerlSetVar ntlmdebug 1
        PerlSetVar splitdomainprefix 0

    All files that cURL needs to fetch override the AuthenNTLM authentication:

        order deny,allow
        deny from all
        allow from 127.0.0.1
        Satisfy any

    Since these files are only fetched by cURL from the same server, access can be limited to localhost. Possible issues are:

    - NTLM auth isn't overridden for files requested through cURL (even though AllowOverride All is set)
    - curl works differently on Linux:

        $ch = curl_init();
        curl_setopt($ch, CURLOPT_COOKIE, $strCookie);
        curl_setopt($ch, CURLOPT_URL, $baseUrl . $queryString);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        $html = curl_exec($ch);
        curl_close($ch);

    - other?

    The Apache log says:

        [error] Bad/Missing NTLM/Basic Authorization Header for /myApp/webservice/local/viewList.php

    But this directory should override NTLM authentication. Using curl from the command line on Windows to access the same resource, I get:

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html>
        <head><title>406 Not Acceptable</title></head>
        <body>
        <h1>Not Acceptable</h1>
        <p>An appropriate representation of the requested resource /myApp/webservice/myResource could not be found on this server.</p>
        Available variants:
        <ul>
        <li><a href="myResource.php">myResource.php</a> , type application/x-httpd-php</li>
        </ul>
        <hr>
        <address>Apache/2.2.20 (Ubuntu) Server at localhost Port 80</address>
        </body>
        </html>

    Note: this is a duplicate of http://stackoverflow.com/questions/9821979/php-curl-on-linux-what-is-the-difference-to-curl-on-windows, as it was suggested I post it here.

    EDIT: Please see "Ubuntu Server: Apache2 seems to attach .php to URI" - I discovered why it does not work, but need help so the issue does not occur anymore.

    ANSWER: The issue is the default Apache configuration on Ubuntu:

        Options Indexes FollowSymLinks MultiViews

    MultiViews is changing the request URI from myResource to myResource.php. Solutions:

    - disable MultiViews in .htaccess: Options -MultiViews
    - remove MultiViews from the default config
    - rename the file, for example to myResourceClass

    I chose the last option because it should work regardless of configuration, and I only have 3 such files, so the change took about 30 seconds...

    Read the article
