Search Results

Search found 593 results on 24 pages for 'wget'.

Page 12 of 24

  • Open Web Page in Windows 2008 R2 Task Scheduler runs forever

    - by Nissan Fan
    I have a number of scheduled tasks which simply open a web page in Windows Server 2008 R2. They used to run and end normally, but now they open and stay open, and I have to set up the task to quit them by force before their next scheduled run. I've thought about installing cURL or Wget, but is there a way to do this with R2 without going to that step? Regards.
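
    One alternative that avoids installing anything, assuming PowerShell 2.0 (which ships with Server 2008 R2) is acceptable: make the task's action a one-shot fetch that downloads the page and exits on its own, rather than opening a browser. The URL below is a placeholder.

        powershell.exe -NoProfile -Command "(New-Object System.Net.WebClient).DownloadString('http://intranet.example.com/page') | Out-Null"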

  • Enterprise desktop antivirus without a Windows server

    - by Adam
    Are there any desktop antivirus products suitable for use in an enterprise environment without a Windows server? We're currently using McAfee for our Windows desktops, but to get updates and alerts with the latest version it looks like you need to be running their ePO server software. I'd like to avoid the cost of hardware and Windows licensing, and if possible to run just client-based antivirus. Ideally it would support:
      - Updates from an internal copy of the definitions (e.g. a wget mirror)
      - Automated configuration of the install
      - Alerts from the client via email
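
    On the "wget mirror" idea: if the vendor publishes definition updates over plain HTTP, a nightly cron job can mirror them to an internal web server that the clients poll. A minimal sketch; the vendor URL is a placeholder:

        wget --mirror --no-parent -P /var/www/av-definitions http://updates.example-vendor.com/datfiles/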

  • Get tarball of any public SVN repository

    - by Sridhar Ratnakumar
    Is there a website that allows one to get a tarball of any specified SVN repository? For example, I want to get the tarball or zip of http://svn.python.org/view/python/trunk/ without having to use a local SVN client, using only my browser or some command-line HTTP client (such as wget). This is mainly for some old Unix machines that do not have an SVN client.
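
    One possibility, assuming the ViewVC front end shown in that URL has tarball generation enabled (many instances do): ViewVC can produce a gzipped tarball of a directory via the view=tar query parameter, which works fine from wget:

        wget "http://svn.python.org/view/python/trunk/?view=tar" -O python-trunk.tar.gz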

  • Debugging IO limitation

    - by Martin F
    I have a Fedora box with some severe IO limitations which I have no idea how to debug. The server has an Areca Technology Corp. ARC-1130 12-Port PCI-X to SATA RAID Controller with twelve 7200 RPM 1.5 TB disks and a Marvell Technology Group Ltd. 88E8050 PCI-E ASF Gigabit Ethernet Controller. uname -a output: 2.6.32.11-99.fc12.x86_64 #1 SMP Mon Apr 5 19:59:38 UTC 2010 x86_64 x86_64 x86_64 GNU/Linux

    The server is a file server running Nginx with the stub status module enabled, so I can see the current number of connections. The problem presents itself when I have a high number of simultaneous connections in a writing state: usually around 350; at this very moment it's at 590, and the server is almost unusable, stuck at 230 Mbit/s. If I run top and press 1 to see per-core usage, all 4 cores sit at around 99% IO wait; if I run iotop, the Nginx workers are the only processes producing any read load, currently around 25 MB/s. I have each of the workers bound to its own core.

    Initially I figured it was just the disks being bugged, but I've run fsck and smartmontools checks and found no errors. I also ran an iozone test, which you can see the result of here: http://www.pastie.org/951667.txt?key=fimcvljulnuqy2dcdxa

    Additionally, when the number of connections is low I have no problem getting a good speed: if I wget over the local network it easily hits 60 MB/s. Right now I tried putting a file in /dev/shm, symlinked a file from the public dir to it, used wget over the local network, and got only 50 KB/s. Also, if I try to cp /dev/shm/test /root/test it quickly copies around 740 MB and then slows down heavily, again with iotop reporting 99% IO wait.

    I'm not really sure how to go about figuring out what the problem is. It could be a natural disk limitation, but then the file from /dev/shm ought to transfer quickly, so there seems to be a network limit; yet that's fine when there aren't many connections. Perhaps it's a TCP stack problem, but I really have no idea how to check that. Any suggestions on how to proceed with debugging would be very welcome. If additional information is required, let me know and I'll try to get it. Thanks.
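
    A few starting points that are often useful for this class of problem; a sketch, with sdX as a placeholder for one of the array's member devices:

        iostat -x 1                                       # per-device utilization, queue depth and latency (sysstat package)
        blockdev --getra /dev/sdX                         # readahead setting on the device
        sysctl vm.dirty_ratio vm.dirty_background_ratio   # writeback thresholds that matter under heavy write load

    If iostat shows the array near 100% utilization while throughput stays low, the bottleneck is seek-bound disk IO rather than the network stack.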

  • Uninstall Git completely on Ubuntu?

    - by Millisami
    I installed Git on Ubuntu Lucid (latest) manually as follows:

        cd ~/tmp
        wget http://kernel.org/pub/software/scm/git/git-1.7.0.6.tar.gz
        tar -xzvf git-1.7.0.6.tar.gz
        cd git-1.7.0.6
        ./configure
        sudo make
        sudo make install

    Now, how can I completely uninstall it?
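
    Git's Makefile has no uninstall target, so one approach is to stage the same file list into a scratch directory and then delete the matching paths from the real prefix (/usr/local when ./configure is run with no --prefix). A sketch; double-check the staged list before deleting anything:

        cd ~/tmp/git-1.7.0.6
        make DESTDIR=/tmp/git-staging install                  # stage the exact file list under /tmp/git-staging
        (cd /tmp/git-staging && find . -type f) | \
            while read -r f; do sudo rm -f "/${f#./}"; done    # remove each corresponding installed file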

  • Any simple tools for re-loading a specified web page periodically (Windows)?

    - by Luke
    I've got a website that exhibits slow performance on the first load, and I'd like to request it every 5 minutes or so to keep the cache fresh. Are there any simple tools to accomplish this? Scheduled Tasks doesn't have quite the time resolution I need. The tricky thing is that this site uses Windows authentication, so a plain wget script won't work. I'm also worried about instantiating a bunch of copies of Internet Explorer, or attempting to kill iexplore.exe tasks blindly.
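
    curl supports NTLM (Windows) authentication via --ntlm, so a hedged sketch of a warm-up loop that authenticates with Windows credentials (URL and credentials are placeholders):

        while true; do
            curl -s --ntlm --user 'DOMAIN\user:password' http://intranet.example.com/slowpage -o /dev/null
            sleep 300
        done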

  • How to disable IPv6 on Mac OS and never see an address resolved to IPv6

    - by shabunc
    On Mac OS (10.8.5, if that matters) I'm trying to disable IPv6 via networksetup -setv6off Wi-Fi. Nevertheless, when I try to wget a specific file, the URL resolves to an IPv6 address and the download fails. I just wonder what I am missing here. ifconfig shows that some interfaces still have inet6 enabled, but I don't know whether this is relevant at all. networksetup -listallnetworkservices does not show anything extraordinary that I've forgotten about.
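
    Independently of the system-wide setting, wget itself can be forced to stay on IPv4, which sidesteps the AAAA lookup entirely (curl has the same -4 flag):

        wget -4 http://example.com/file    # -4 is short for --inet4-only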

  • Port-forwarding HTTPS web server

    - by James Moore
    I have port-forwarded our front-facing IP to an internal HTTPS server, but the browser does not connect. A wget check shows that the certificate is self-signed for the internal IP, which is why the browser refuses to display the page properly. What is the best-practice setup for this sort of scenario? Thanks
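
    For testing, wget can be told to ignore the mismatch; the real fix is to install a certificate issued for the public hostname that clients actually use (the URL is a placeholder):

        wget --no-check-certificate https://public.example.com/    # test only; bypasses certificate validation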

  • Are there any easy ways to create a YouTube playlist or download queue from a text file?

    - by Eric Johnson
    In other words, if I have a list with entries formatted "[Artist Name] - [Track Name]", without advance knowledge of their corresponding YouTube URLs (if any exist), how might I script things so that YouTube searches for each entry and generates the URLs for the first n identical or similar entries per YouTube query? Do any simple solutions come to mind that might leverage wget, PowerShell, or Python? Thanks!
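
    One possibility, assuming youtube-dl is available: it accepts "ytsearchN:" pseudo-URLs, so each line of the list can be turned into a watch URL without knowing it in advance. A sketch; tracklist.txt is a placeholder for your list file:

        while IFS= read -r query; do
            youtube-dl --get-id "ytsearch1:$query" |
                sed 's|^|https://www.youtube.com/watch?v=|'
        done < tracklist.txt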

  • How to save a remote server SSL certificate locally as a file

    - by Kimvais
    I need to download the SSL certificate of a remote server (not an HTTPS server, but the SSL handshake should be the same; Google Chrome, IE, wget and curl all give certificate check failure errors) and add the certificate as trusted in my laptop's Windows certificate store, since I am not able to get my IT guys to give me the CA cert. This is for Office Communicator, so I cannot really use the actual client to get the cert. How do I do this? I have Windows 7 and a pile of Linuxes handy, so any tool / scripting language is fine.
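
    The standard trick is openssl s_client, which prints the server's certificate during the handshake; piping it through openssl x509 produces a DER .cer file that the Windows certificate manager (certmgr.msc) can import. Host and port below are placeholders:

        echo | openssl s_client -connect sip.example.com:5061 2>/dev/null | \
            openssl x509 -outform DER -out server.cer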

  • Maximise network transfer speed of various applications

    - by Alex
    When using nc, scp or wget to transfer files between two machines on a dedicated 2 Mbps link, I get speeds between 0.5 and 1 Mbps. However, when I use iperf -c 10.0.1.4 -t 20 -P 12 (for example) I can max out the link (getting a stable 2 Mbps). Is there a way to make single-stream transfers (such as those done by scp) utilise all or most of the link? Some kind of TCP settings, or iptables...?
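
    Since 12 parallel iperf streams fill the link but one stream doesn't, the single flow is probably window-limited, which points at latency on the link. If so, raising the TCP buffer ceilings on both ends may help; the values below are illustrative, not tuned:

        sysctl -w net.core.rmem_max=4194304
        sysctl -w net.core.wmem_max=4194304
        sysctl -w net.ipv4.tcp_rmem="4096 87380 4194304"
        sysctl -w net.ipv4.tcp_wmem="4096 65536 4194304"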

  • How can I set up Retrospect on Ubuntu 10.04 64 bit?

    - by David
    Problem: Retrospect is a backup system that my organization uses, but I cannot find support for it on my Ubuntu 10.04 64-bit desktop.

    What I have tried (but did not work): downloading the Red Hat version and attempting to convert it to a .deb:

        wget http://download.dantz.com/archives/Linux_Client-7_6_100.rpm
        sudo alien Linux_Client-7_6_100.rpm

    The Retrospect user forum has this thread, which provides an i386 .deb file for installing Retrospect.

    Question: Is there a way to install this on my system?
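
    When converting, it may be worth keeping the RPM's maintainer scripts, which alien drops by default; a sketch (the name of the generated .deb will vary):

        sudo alien --to-deb --scripts Linux_Client-7_6_100.rpm
        sudo dpkg -i linux-client*.deb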

  • Download file from VBScript?

    - by Eye of Hell
    I need a script that can be run on a freshly installed Windows XP (or later) machine and download specified files from the internet, like http://www.python.org/ftp/python/2.6.2/python-2.6.2.msi Is there any easy way to do it without hand-crafting HTTP/FTP requests or using third-party programs like wget? I suspect that WScript.CreateObject("internetexplorer.application") might do the magic, but the documentation on it is enormous and Google is silent, as always :).
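
    A minimal sketch using two COM components that ship with Windows XP (MSXML's XMLHTTP for the request, ADODB.Stream to write the bytes), so no third-party tools are needed:

        Set http = CreateObject("MSXML2.XMLHTTP")
        http.Open "GET", "http://www.python.org/ftp/python/2.6.2/python-2.6.2.msi", False
        http.Send

        Set stream = CreateObject("ADODB.Stream")
        stream.Type = 1                             ' binary
        stream.Open
        stream.Write http.responseBody
        stream.SaveToFile "python-2.6.2.msi", 2     ' 2 = overwrite if present
        stream.Close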

  • Updated script for downloading from YouTube

    - by asksuperuser
    I'm not looking for a piece of software or a site to download from YouTube, but for an open-source script, in bash or any other language, that is kept up to date, since YouTube often changes the download URLs. I've found these, but they seem deprecated:
        http://linux.byexamples.com/archives/302/how-to-wget-flv-from-youtube/
        http://www.daniweb.com/forums/thread104419.html
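
    youtube-dl fits this description: it is an open-source Python script whose maintainers track YouTube's URL changes, and it can update itself. A usage sketch; VIDEO_ID is a placeholder:

        youtube-dl -U                                                    # update the extractors to the latest version
        youtube-dl -f best "https://www.youtube.com/watch?v=VIDEO_ID"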

  • preg_match-style matching in a .bat file to extract an image URL

    - by marcell22
    Welcome. I have a problem with a .bat script on Windows. I use wget to download an HTML stats page; now I have to find (in the HTML source) a URL like this: http://www.example.com/stats/367895.jpeg where 367895 is a randomly generated number, and then download that chart JPEG. I think I can't do this in a .bat file alone. Do you know any external command-line application that I could trigger from the .bat file and that would output the matched URL? Regards
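
    Windows' built-in findstr has just enough regex support for this, with the caveat that it prints whole matching lines; the sketch below therefore assumes the URL sits on a line by itself (the pattern mirrors the question's example URL, stats.html is the downloaded page):

        @echo off
        for /f "delims=" %%u in ('findstr /r "stats/[0-9][0-9]*\.jpeg" stats.html') do wget %%u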

  • Can I determine a machine's outward facing IP with PHP without relying on external services?

    - by editor
    I'm working with an API that requires the machine's external IP. As far as I know, the PHP environment I'm using can only get our internal IP. The option on the table is using an external service such as whatismyip.com to tell us:

        wget -q -O - http://whatismyip.com/automation/n09230945.asp

    My concern is what happens if that fails. Is there a bulletproof way of determining a machine's IP without relying on external services?
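
    Behind NAT the external address only exists as seen from the outside, so some external observer is unavoidable; the usual mitigation is not to depend on a single one. A sketch that falls back across several lookup services (the hosts are illustrative):

        for host in http://icanhazip.com http://ifconfig.me/ip; do
            ip=$(wget -q -O - "$host") && [ -n "$ip" ] && break
        done
        echo "$ip"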

  • How do I call a bash script function using PHP's exec() by passing parameters?

    - by Stan
    I have created a bash script that installs Magento in a cPanel account, but I have a problem regarding the exec function.

        $function_path = Mage::getBaseDir()."/media/installer/function.sh";
        exec("$function_path $db_host $db_name $db_user $db_pass $url $ad_user $ad_pass $ad_email");

    This is the bash script, function.sh:

        #!/bin/bash
        magento_detail $dbhost $dbname $dbuser $dbpass $url $admin_username $admin_password $admin_email

        function magento_detail() {
            stty erase '^?'
            echo "To install Magento, you will need a blank database ready with a user assigned to it."
            echo -n "Do you have all of your database information"
            dbinfo = "y"
            echo $dbinfo
            if [ "$dbinfo" -eq 'y' ]
            then
                echo "Database Host (usually localhost) : $dbhost "
                echo "Database Name : $dbname "
                echo "Database User : $dbuser "
                echo "Database Password : $dbpass "
                echo "Store Url : $url "
                echo "Admin Username : $admin_username "
                echo "Admin Password : $admin_password "
                echo "Admin Email Address : $admin_email "
                echo -n "Include Sample Data? (y/n) "
                echo
                sample = "y"
                if [ "$sample" -eq "y" ]; then
                    echo
                    echo "Now installing Magento with sample data..."
                    echo
                    echo "Downloading packages..."
                    echo
                    wget http://www.magentocommerce.com/downloads/assets/1.5.1.0/magento-1.5.1.0.tar.gz
                    wget http://www.magentocommerce.com/downloads/assets/1.2.0/magento-sample-data-1.2.0.tar.gz
                    echo
                    echo "Extracting data..."
                    echo
                    tar -zxvf magento-1.5.1.0.tar.gz
                    tar -zxvf magento-sample-data-1.2.0.tar.gz
                    echo
                    echo "Moving files..."
                    echo
                    mv magento-sample-data-1.2.0/media/* magento/media/
                    mv magento-sample-data-1.2.0/magento_sample_data_for_1.2.0.sql magento/data.sql
                    mv magento/index.php magento/.htaccess ./$test1
                    echo
                    echo "Setting permissions..."
                    echo
                    chmod o+w var var/.htaccess app/etc
                    chmod -R o+w media
                    echo
                    echo "Importing sample products..."
                    echo
                    mysql -h $dbhost -u $dbuser -p$dbpass $dbname < data.sql
                    echo
                    echo "Initializing PEAR registry..."
                    echo
                    chmod 550 mage
                    ./mage mage-setup .
                    echo
                    echo "Downloading packages..."
                    echo
                    echo
                    echo "Cleaning up files..."
                    echo
                    rm -rf downloader/pearlib/cache/* downloader/pearlib/download/*
                    rm -rf magento/ magento-sample-data-1.2.0/
                    rm -rf magento-1.5.1.0.tar.gz magento-sample-data-1.2.0.tar.gz data.sql
                    rm -rf index.php.sample .htaccess.sample php.ini.sample LICENSE.txt STATUS.txt data.sql
                    echo
                    echo "Installing Magento..."
                    echo
                    php -f install.php --license_agreement_accepted "yes" --locale "en_US" --timezone "America/Los_Angeles" --default_currency "USD" --db_host "$dbhost" --db_name "$dbname" --db_user "$dbuser" --db_pass "$dbpass" --url "$url" --use_rewrites "yes" --use_secure "no" --secure_base_url "" --use_secure_admin "no" --admin_email "$admin_email" --admin_username "$admin_username" --admin_password "$admin_password"
                    echo
                    echo "Finished installing Magento"
                    echo
                    exit
                else
                    echo "Now installing Magento without sample data..."
                    echo
                    echo "Downloading packages..."
                    echo
                    wget http://www.magentocommerce.com/downloads/assets/1.5.1.0/magento-1.5.1.0.tar.gz
                    echo
                    echo "Extracting data..."
                    echo
                    tar -zxvf magento-1.5.1.0.tar.gz
                    echo
                    echo "Moving files..."
                    echo
                    mv magento/* magento/.htaccess .
                    echo
                    echo "Setting permissions..."
                    echo
                    chmod o+w var var/.htaccess app/etc
                    chmod -R o+w media
                    echo
                    echo "Initializing PEAR registry..."
                    echo
                    chmod 550 mage
                    ./mage mage-setup .
                    echo
                    echo "Downloading packages..."
                    echo
                    echo
                    echo "Cleaning up files..."
                    echo
                    rm -rf downloader/pearlib/cache/* downloader/pearlib/download/*
                    rm -rf magento/ magento-1.5.1.0.tar.gz
                    rm -rf index.php.sample .htaccess.sample php.ini.sample LICENSE.txt STATUS.txt
                    echo
                    echo "Installing Magento..."
                    echo
                    php -f install.php --license_agreement_accepted "yes" --locale "en_US" --timezone "America/Los_Angeles" --default_currency "USD" --db_host "$dbhost" --db_name "$dbname" --db_user "$dbuser" --db_pass "$dbpass" --url "$url" --use_rewrites "yes" --use_secure "no" --secure_base_url "" --use_secure_admin "no" --admin_email "$admin_email" --admin_username "$admin_username" --admin_password "$admin_password"
                    echo
                    echo "Finished installing Magento else part"
                    exit
                fi
            else
                echo "Please setup a database first. Don't forget to assign a database user!"
                exit
            fi
        }

    When I run this exec command, it calls the bash script function magento_detail(), passing the arguments $db_host $db_name $db_user $db_pass $url $ad_user $ad_pass $ad_email from the exec command. Is this the right way of calling a bash script function? The script goes straight to the last branch and prints "Please setup a database first. Don't forget to assign a database user!" — it never enters the if branch and goes directly to the else. Please help.
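
    Two likely culprits in the script itself, independent of PHP; here is a sketch of the corrected skeleton, using the question's own names:

        #!/bin/bash
        magento_detail() {
            # map the script's positional arguments to named variables
            dbhost=$1; dbname=$2; dbuser=$3; dbpass=$4
            url=$5; admin_username=$6; admin_password=$7; admin_email=$8
            dbinfo="y"                      # no spaces around '='
            if [ "$dbinfo" = "y" ]; then    # '=' compares strings; -eq is for integers
                echo "would install here"
            else
                echo "Please set up a database first."
            fi
        }
        magento_detail "$@"    # call AFTER the definition, forwarding all arguments

    In the original, the function is invoked at the top of the script, before bash has read its definition, with variables like $dbhost that are never set from $1..$8; and an assignment written as dbinfo = "y" runs a command named dbinfo rather than setting a variable, so the string test fails and control falls through to the else branch.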

  • libcurl - unable to download a file

    - by marmistrz
    I'm working on a program which will download lyrics from sites like AZLyrics. I'm using libcurl. Here is my code:

    lyricsDownloader.cpp

        #include "lyricsDownloader.h"
        #include <curl/curl.h>
        #include <cstring>
        #include <iostream>

        #define DEBUG 1

        /////////////////////////////////////////////////////////////////////////////

        // this function is a static member function
        size_t lyricsDownloader::write_data_to_var(char *ptr, size_t size, size_t nmemb, void *userdata)
        {
            ostringstream * stream = (ostringstream*) userdata;
            size_t count = size * nmemb;
            stream->write(ptr, count);
            return count;
        }

        string AZLyricsDownloader::toProviderCode() const
        {
            /* this creates an url */
        }

        CURLcode AZLyricsDownloader::download()
        {
            CURL * handle;
            CURLcode err;
            ostringstream buff;
            handle = curl_easy_init();
            if (! handle) return static_cast<CURLcode>(-1);
            // set verbose if debug on
            curl_easy_setopt(handle, CURLOPT_VERBOSE, DEBUG);
            // set the download url to the generated one
            curl_easy_setopt(handle, CURLOPT_URL, toProviderCode().c_str());
            curl_easy_setopt(handle, CURLOPT_WRITEDATA, &buff);
            curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, &AZLyricsDownloader::write_data_to_var);
            // The segfault should be somewhere here - after calling the function but before it ends
            err = curl_easy_perform(handle);
            cerr << "cleanup\n";
            curl_easy_cleanup(handle);
            // copy the contents to the text variable
            lyrics = buff.str();
            return err;
        }

    main.cpp

        #include <QString>
        #include <QTextEdit>
        #include <iostream>
        #include "lyricsDownloader.h"

        int main(int argc, char *argv[])
        {
            AZLyricsDownloader dl(argv[1], argv[2]);
            dl.perform();
            QTextEdit qtexted(QString::fromStdString(dl.lyrics));
            cout << qPrintable(qtexted.toPlainText());
            return 0;
        }

    When running ./maelyrica Anthrax Madhouse I'm getting this logged from curl:

        * About to connect() to azlyrics.com port 80 (#0)
        *   Trying 174.142.163.250... * connected
        * Connected to azlyrics.com (174.142.163.250) port 80 (#0)
        > GET /lyrics/anthrax/madhouse.html HTTP/1.1
        Host: azlyrics.com
        Accept: */*

        < HTTP/1.1 301 Moved Permanently
        < Server: nginx/1.0.12
        < Date: Thu, 05 Jul 2012 16:59:21 GMT
        < Content-Type: text/html
        < Content-Length: 185
        < Connection: keep-alive
        < Location: http://www.azlyrics.com/lyrics/anthrax/madhouse.html
        <
        Segmentation fault

    Strangely, the page is there. The same error is displayed when there's no such page (a redirect to the azlyrics.com main page). What am I doing wrong? Thanks in advance.

    EDIT: I made the function for writing data static, but this changes nothing. Even wget seems to have problems:

        $ wget http://www.azlyrics.com/lyrics/anthrax/madhouse.html
        --2012-07-06 10:36:05--  http://www.azlyrics.com/lyrics/anthrax/madhouse.html
        Resolving www.azlyrics.com... 174.142.163.250
        Connecting to www.azlyrics.com|174.142.163.250|:80... connected.
        HTTP request sent, awaiting response... No data received.
        Retrying.

    Why does opening the page in a browser work while wget/curl do not?

    EDIT2: After adding this:

        curl_easy_setopt(handle, CURLOPT_FOLLOWLOCATION, 1);

    the log is:

        * About to connect() to azlyrics.com port 80 (#0)
        *   Trying 174.142.163.250... * connected
        * Connected to azlyrics.com (174.142.163.250) port 80 (#0)
        > GET /lyrics/anthrax/madhouse.html HTTP/1.1
        Host: azlyrics.com
        Accept: */*

        < HTTP/1.1 301 Moved Permanently
        < Server: nginx/1.0.12
        < Date: Fri, 06 Jul 2012 09:09:47 GMT
        < Content-Type: text/html
        < Content-Length: 185
        < Connection: keep-alive
        < Location: http://www.azlyrics.com/lyrics/anthrax/madhouse.html
        <
        * Ignoring the response-body
        * Connection #0 to host azlyrics.com left intact
        * Issue another request to this URL: 'http://www.azlyrics.com/lyrics/anthrax/madhouse.html'
        * About to connect() to www.azlyrics.com port 80 (#1)
        *   Trying 174.142.163.250... * connected
        * Connected to www.azlyrics.com (174.142.163.250) port 80 (#1)
        > GET /lyrics/anthrax/madhouse.html HTTP/1.1
        Host: www.azlyrics.com
        Accept: */*

        < HTTP/1.1 200 OK
        < Server: nginx/1.0.12
        < Date: Fri, 06 Jul 2012 09:09:47 GMT
        < Content-Type: text/html
        < Transfer-Encoding: chunked
        < Connection: keep-alive
        <
        Segmentation fault
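
    Not a full diagnosis, but two observations and a minimal standalone comparison point. First, as pasted, toProviderCode() is declared to return a string yet has no return statement, which is undefined behavior before curl is even involved. Second, the wget failure hints that the server may be dropping clients without a browser-like User-Agent (an assumption, not confirmed). A sketch of the standard collect-into-a-string pattern with both addressed; the URL is the one from the question:

        #include <curl/curl.h>
        #include <iostream>
        #include <string>

        // Plain function with the exact signature libcurl expects.
        static size_t write_cb(char *ptr, size_t size, size_t nmemb, void *userdata)
        {
            std::string *out = static_cast<std::string *>(userdata);
            out->append(ptr, size * nmemb);
            return size * nmemb;
        }

        int main()
        {
            curl_global_init(CURL_GLOBAL_DEFAULT);
            CURL *handle = curl_easy_init();
            if (!handle) return 1;

            std::string body;
            curl_easy_setopt(handle, CURLOPT_URL, "http://www.azlyrics.com/lyrics/anthrax/madhouse.html");
            curl_easy_setopt(handle, CURLOPT_FOLLOWLOCATION, 1L);       // chase the 301
            curl_easy_setopt(handle, CURLOPT_USERAGENT, "Mozilla/5.0"); // some servers drop UA-less clients
            curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, write_cb);
            curl_easy_setopt(handle, CURLOPT_WRITEDATA, &body);

            CURLcode err = curl_easy_perform(handle);
            if (err == CURLE_OK)
                std::cout << body;
            else
                std::cerr << curl_easy_strerror(err) << "\n";

            curl_easy_cleanup(handle);
            curl_global_cleanup();
            return err == CURLE_OK ? 0 : 1;
        }

    If this standalone version runs cleanly against the same URL, the crash is in the surrounding class code rather than in libcurl usage.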

  • Teamviewer cannot install on 13.10 - no teamviewerd

    - by rubo77
    I tried to install TeamViewer 8 on Xubuntu 13.10. After the first problems I tried the solution from "TeamViewer depends on lib32asound2", but that doesn't work either. After running (as root):

        apt-get install libc6:i386 libgcc1:i386 libasound2:i386 libfreetype6:i386 zlib1g:i386 libsm6:i386 libxdamage1:i386 libxext6:i386 libxfixes3:i386 libxrender1:i386 libxtst6:i386
        wget http://www.teamviewer.com/download/teamviewer_linux.deb
        dpkg -i teamviewer_linux.deb

    I get these messages:

        Preparing to replace teamviewer 8.0.20931 (using teamviewer_linux.deb) ...
        initctl: Unknown job: teamviewerd
        Unpacking replacement teamviewer ...
        Setting up teamviewer (8.0.20931) ...
        initctl: Unknown job: teamviewerd
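
    A hedged cleanup sequence that sometimes gets the package into a consistent state: let apt repair the dependency state after the failed dpkg run, then try registering the daemon through TeamViewer's own wrapper (assuming this build of the wrapper supports the --daemon option):

        sudo apt-get -f install             # resolve the i386 dependencies dpkg complained about
        sudo teamviewer --daemon enable     # attempt to (re)register and start the teamviewerd service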
