Search Results

Search found 593 results on 24 pages for 'wget'.


  • At symbol not working for apt-get proxy authentication on Ubuntu 11.10

    - by Shivhari
    I have tried two methods in three places to see if it works; please help me out. The two methods: 1) replacing @ with %40, and 2) escaping @ as \@. The three places: 1) exporting the proxy variables in .bashrc, 2) editing /etc/apt/apt.conf and setting the Acquire options there, and 3) using gconf-editor to set the values in /system/http_proxy, filling in the authentication name and password and ticking the use_authentication checkbox. Still no success: I get a 407 error whenever I try wget or apt-get update. I have been stuck on this for three hours now. I also read somewhere that creating a 01proxy file with the Acquire settings in /etc/apt/apt.conf.d might work. I tried that too, but it doesn't work. Please help.
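
    For reference, the apt.conf.d route usually does work once the escaping is right; a minimal /etc/apt/apt.conf.d/01proxy sketch (the user, password p@ss, host, and port are placeholders; the @ in the password must be written as %40):

        # /etc/apt/apt.conf.d/01proxy -- hypothetical values, adjust to your proxy
        Acquire::http::Proxy "http://user:p%40ss@proxy.example.com:8080/";
        Acquire::https::Proxy "http://user:p%40ss@proxy.example.com:8080/";

    For wget, the same percent-encoded URL goes into the http_proxy environment variable.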


  • Install Skype on Ubuntu 12.04 LTS 64-bit

    - by Samir R. Bhogayta
    For 32-bit, the terminal commands are:

        wget http://download.skype.com/linux/skype-ubuntu-lucid_4.2.0.11-1_i386.deb
        sudo dpkg -i skype-ubuntu-lucid_4.2.0.11-1_i386.deb
        sudo apt-get -f install
        rm skype-ubuntu-lucid_4.2.0.11-1_i386.deb

    For 64-bit, the terminal commands are:

        sudo dpkg --add-architecture i386
        sudo apt-get install ia32-libs
        sudo apt-get update
        wget http://download.skype.com/linux/skype-ubuntu-lucid_4.2.0.11-1_i386.deb
        sudo dpkg -i skype-ubuntu-lucid_4.2.0.11-1_i386.deb
        sudo apt-get -f install
        rm skype-ubuntu-lucid_4.2.0.11-1_i386.deb

    After all of this, run in a terminal:

        sudo apt-get install sni-qt:i386

    This will restore the Skype contact window. That's all; the whole job takes five minutes at most. I use 64-bit Ubuntu, and this method of installing Skype has always worked perfectly for me.


  • Set up linux box for secure local hosting a-z

    - by microchasm
    I am in the process of reinstalling the OS on a machine that will be used to host a couple of apps for our business. The apps will be local only; access from external clients will be via VPN only. The prior setup used a hosting control panel (Plesk) for most of the admin, and I was looking at using another similar piece of software for the reinstall - but I figured I should finally learn how it all works. I can do most of the things the software would do for me, but am unclear on the symbiosis of it all. This is all an attempt to further distance myself from the land of Configuration Programmer/Programmer, if at all possible. I can't find a full walkthrough anywhere for what I'm looking for, so I thought I'd put up this question, and if people can help me on the way I will edit this with the answers and document my progress/pitfalls. Hopefully someday this will help someone down the line.

    The details:

        CentOS 5.5 x86_64
        httpd: Apache/2.2.3
        mysql: 5.0.77 (to be upgraded)
        php: 5.1 (to be upgraded)

    The requirements:

        SECURITY!!
        Secure file transfer
        Secure client access (SSL certs and CA)
        Secure data storage
        Virtualhosts/multiple subdomains
        Local email would be nice, but not critical

    The steps:

    Download the latest CentOS DVD ISO (torrent worked great for me).

    Install CentOS. While going through the install, I checked the Server Components option, thinking I was going to be using another Plesk-like admin. In hindsight, considering I've decided to try to go my own way, this probably wasn't the best idea.

    Basic config: set up users, networking/IP address, etc. Yum update/upgrade.

    Upgrade PHP/MySQL. To upgrade PHP and MySQL to the latest versions, I had to look to a repo outside CentOS. IUS looks great and I'm happy I found it! Add the IUS repository to the package manager:

        cd /tmp
        wget http://dl.iuscommunity.org/pub/ius/stable/Redhat/5/x86_64/epel-release-1-1.ius.el5.noarch.rpm
        rpm -Uvh epel-release-1-1.ius.el5.noarch.rpm
        wget http://dl.iuscommunity.org/pub/ius/stable/Redhat/5/x86_64/ius-release-1-4.ius.el5.noarch.rpm
        rpm -Uvh ius-release-1-4.ius.el5.noarch.rpm
        yum list | grep -w \.ius\.    # list all packages in the IUS repository; use this to find the PHP/MySQL versions and libraries you want

    Remove the old version of PHP and install the newer version from IUS:

        rpm -qa | grep php                                   # list the installed php packages we want to remove
        yum shell                                            # open an interactive yum shell
        remove php-common php-mysql php-cli                  # remove installed PHP components
        install php53 php53-mysql php53-cli php53-common     # add the packages you want
        transaction solve                                    # important!! checks for dependencies
        transaction run                                      # important!! does the actual installation of packages
        [control+d]                                          # exit yum shell
        php -v
        PHP 5.3.2 (cli) (built: Apr 6 2010 18:13:45)

    Upgrade MySQL from the IUS repository:

        /etc/init.d/mysqld stop
        rpm -qa | grep mysql                                 # see installed mysql packages
        yum shell
        remove mysql mysql-server                            # remove installed MySQL components
        install mysql51 mysql51-server mysql51-devel
        transaction solve                                    # important!! checks for dependencies
        transaction run                                      # important!! does the actual installation of packages
        [control+d]                                          # exit yum shell
        service mysqld start
        mysql -v
        Server version: 5.1.42-ius Distributed by The IUS Community Project

    Upgrade instructions courtesy of the IUS wiki: http://wiki.iuscommunity.org/Doc/ClientUsageGuide

    Install rssh (restricted shell) to provide scp and sftp access without allowing ssh login:

        cd /tmp
        wget http://dag.wieers.com/rpm/packages/rssh/rssh-2.3.2-1.2.el5.rf.x86_64.rpm
        rpm -ivh rssh-2.3.2-1.2.el5.rf.x86_64.rpm
        useradd -m -d /home/dev -s /usr/bin/rssh dev
        passwd dev

    Edit /etc/rssh.conf to grant SFTP access to rssh users; uncomment or add:

        allowscp
        allowsftp

    This allows me to connect to the machine via the SFTP protocol in Transmit (my FTP program of choice; I'm sure it's similar with other FTP apps). rssh instructions appropriated (with appreciation!) from http://www.cyberciti.biz/tips/linux-unix-restrict-shell-access-with-rssh.html

    Set up virtual interfaces:

        ifconfig eth1:1 192.168.1.3 up      # start up the virtual interface
        cd /etc/sysconfig/network-scripts/
        cp ifcfg-eth1 ifcfg-eth1:1          # copy the default script and match the name to the virtual interface
        vi ifcfg-eth1:1                     # modify the eth1:1 script so it looks like this:

        DEVICE=eth1:1
        IPADDR=192.168.1.3
        NETMASK=255.255.255.0
        NETWORK=192.168.1.0
        ONBOOT=yes
        NAME=eth1:1

    Add more virtual interfaces as needed by repeating. Because of the ONBOOT=yes line in the ifcfg-eth1:1 file, this interface will be brought up when the system boots or the network starts/restarts.

        service network restart
        Shutting down interface eth0:     [ OK ]
        Shutting down interface eth1:     [ OK ]
        Shutting down loopback interface: [ OK ]
        Bringing up loopback interface:   [ OK ]
        Bringing up interface eth0:       [ OK ]
        Bringing up interface eth1:       [ OK ]

        ping 192.168.1.3
        64 bytes from 192.168.1.3: icmp_seq=1 ttl=64 time=0.105 ms

    Virtualhosts. In the rssh section above I added a user to use for SFTP. In this user's home directory I created a folder called 'https'. This is where the documents for this site will live, so I need to add a virtualhost that points to it. I will use the above virtual interface for this site (herein called dev.site.local).

        vi /etc/httpd/conf/httpd.conf

    Add the following to the end of httpd.conf:

        <VirtualHost 192.168.1.3:80>
            ServerAdmin [email protected]
            DocumentRoot /home/dev/https
            ServerName dev.site.local
            ErrorLog /home/dev/logs/error_log
            TransferLog /home/dev/logs/access_log
        </VirtualHost>

    I put a dummy index.html file in the https directory just to check everything out, tried browsing to it, and was met with permission denied errors. The logs only gave an obscure reference to what was going on:

        [Mon May 17 14:57:11 2010] [error] [client 192.168.1.100] (13)Permission denied: access to /index.html denied

    I tried chmod 777 et al., but to no avail. Turns out I needed to chmod +x the https directory and its parent directories:

        chmod +x /home
        chmod +x /home/dev
        chmod +x /home/dev/https

    This solved that problem.

    DNS. I'm handling DNS via our local Windows Server 2003 box. However, the CentOS documentation for BIND can be found here: http://www.centos.org/docs/5/html/Deployment_Guide-en-US/ch-bind.html

    SSL. To get SSL working, I changed the following in httpd.conf:

        NameVirtualHost 192.168.1.3:443     # make sure this line is in httpd.conf
        <VirtualHost 192.168.1.3:443>       # change the port to 443
            ServerAdmin [email protected]
            DocumentRoot /home/dev/https
            ServerName dev.site.local
            ErrorLog /home/dev/logs/error_log
            TransferLog /home/dev/logs/access_log
        </VirtualHost>

    Unfortunately, I kept getting (Error code: ssl_error_rx_record_too_long) errors when trying to access a page over SSL. As JamesHannah gracefully pointed out below, I had not set up the locations of the certs in httpd.conf, so the plain page was being served where the browser expected a certificate, making the browser balk. So first, I needed to set up a CA and make certificate files. I found a great (if old) walkthrough on the process here: http://www.debian-administration.org/articles/284. Here are the relevant steps I took from that article:

        mkdir /home/CA
        cd /home/CA/
        mkdir newcerts private
        echo '01' > serial
        touch index.txt    # this and the above command set up the database that will keep track of certs

    Create an openssl.cnf file in the /home/CA/ dir and edit it per the walkthrough linked above. (For reference, my finished openssl.cnf file looked like this: http://pastebin.com/raw.php?i=hnZDij4T)

        openssl req -new -x509 -extensions v3_ca -keyout private/cakey.pem -out cacert.pem -days 3650 -config ./openssl.cnf
        # creates cacert.pem, which gets distributed and imported into the browser(s)

    Modify openssl.cnf again per the walkthrough instructions, then:

        openssl req -new -nodes -out dev.req.pem -config ./openssl.cnf
        # generates the certificate request, and key.pem, which I renamed dev.key.pem

    Modify openssl.cnf again per the walkthrough instructions, then:

        openssl ca -out dev.cert.pem -config ./openssl.cnf -infiles dev.req.pem
        # creates and signs the certificate

        cp dev.cert.pem /home/dev/certs/cert.pem
        cp dev.key.pem /home/dev/certs/key.pem

    I updated httpd.conf to reflect the certs and turn SSLEngine on:

        NameVirtualHost 192.168.1.3:443
        <VirtualHost 192.168.1.3:443>
            ServerAdmin [email protected]
            DocumentRoot /home/dev/https
            SSLEngine on
            SSLCertificateFile /home/dev/certs/cert.pem
            SSLCertificateKeyFile /home/dev/certs/key.pem
            ServerName dev.site.local
            ErrorLog /home/dev/logs/error_log
            TransferLog /home/dev/logs/access_log
        </VirtualHost>

    Put the CA cert.pem in a web-accessible place, and downloaded/imported it into my browser. Now I can visit https://dev.site.local with no errors or warnings.

    And this is where I'm at. I will keep editing this as I make progress. Any tips on how to configure SSL email would be appreciated.


  • URL encoding: yes or no?

    - by csetzkorn
    I have a RESTful web service which receives some structured data that is put straight into a database. The data is sent from an OS using wget. I am just wondering whether I actually need to URL-encode the data, and if so, why? Please note that it is no problem to do it, but it might be unnecessary in this scenario. Thanks. Christian
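
    In general, yes: anything placed in a URL query string must be percent-encoded so that characters like &, =, and spaces survive parsing. A sketch of both tools (the endpoint URL and field name are hypothetical):

        # curl percent-encodes the value for you
        curl --data-urlencode "payload=a value & some = signs" https://api.example.com/submit

        # wget has no such flag: encode the value first, e.g. with python, then substitute it
        value=$(python3 -c 'import urllib.parse,sys; print(urllib.parse.quote(sys.argv[1]))' "a value & some = signs")
        wget -q -O - "https://api.example.com/submit?payload=${value}"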


  • Squid proxy not serving modified html content

    - by Matthew
    I'm trying to use squid to modify the page content of web page requests. I followed the upside-down-ternet tutorial, which gives instructions for flipping images on pages, but I need to change the actual html of the page. I've been trying to do the same thing as in the tutorial, except editing the html instead of the image. Below is the php script I'm using. All jpg images get flipped, but the page content does not get edited: the edited index.html files that are written contain the edited content, but the pages the users receive don't.

        #!/usr/bin/php
        <?php
        $temp = array();
        while ( $input = fgets(STDIN) ) {
            $micro_time = microtime();
            // Split the output (space delimited) from squid into an array.
            $temp = split(' ', $input);

            // Flip jpg images, this works correctly
            if (preg_match("/.*\.jpg/i", $temp[0])) {
                system("/usr/bin/wget -q -O /var/www/cache/$micro_time.jpg ". $temp[0]);
                system("/usr/bin/mogrify -flip /var/www/cache/$micro_time.jpg");
                echo "http://127.0.0.1/cache/$micro_time.jpg\n";
            }
            // Don't edit files that are obviously not html. $temp[0] contains the url of the file to get
            elseif (preg_match("/(jpg|png|gif|css|js|\(|\))/i", $temp[0], $matches)) {
                echo $input;
            }
            // Otherwise, could be html (e.g. `wget http://www.google.com` downloads index.html)
            else {
                $time = time() . microtime();              // For unique directory names
                $time = preg_replace("/ /", "", $time);    // Simplify things by removing the spaces
                mkdir("/var/www/cache/". $time);           // Create unique folder
                system("/usr/bin/wget -q --directory-prefix=\"/var/www/cache/$time/\" ". $temp[0]);
                $filename = system("ls /var/www/cache/$time/");    // Get filename of downloaded file

                // File is html, edit the content (this does not work)
                if (preg_match("/.*\.html/", $filename)) {
                    // Get the html file contents
                    $contentfh = fopen("/var/www/cache/$time/". $filename, 'r');
                    $content = fread($contentfh, filesize("/var/www/cache/$time/". $filename));
                    fclose($contentfh);
                    // Edit the html file contents
                    $content = preg_replace("/<\/body>/i", "<!-- content served by proxy --></body>", $content);
                    // Write the edited file
                    $contentfh = fopen("/var/www/cache/$time/". $filename, 'w');
                    fwrite($contentfh, $content);
                    fclose($contentfh);
                    // Return the edited page
                    echo "http://127.0.0.1/cache/$time/$filename\n";
                }
                // Otherwise the file is not html, don't edit
                else {
                    echo $input;
                }
            }
        }
        ?>


  • batch source code downloading perl

    - by Jake
    Hello, I know of the wget command in the shell, but I'm running Perl from the command line on a Windows machine, and I was looking for a method of sequentially downloading the page source from a site. For example, www.abcd.com numbers its subsites 1, 2, 3, and so on, so that www.abcd.com/1 or www.abcd.com/2 is the syntax. I would like the source to be saved as 1.source, 2.source, etc. for a defined set of pages, say 1-100. Thanks for the help, Jake
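
    The question asks for Perl, where LWP::Simple's getstore is the usual tool for this, but the sequential-fetch pattern itself is the same everywhere; a shell sketch against the placeholder site above:

        # shell equivalent of the described loop (www.abcd.com is the asker's placeholder)
        for i in $(seq 1 100); do
            wget -q -O "${i}.source" "http://www.abcd.com/${i}"
        done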


  • determine if chipset is capable of packet injection and monitor mode

    - by Richard
    Hi, I am new to Linux and I want to know if my chipset is capable of doing those things. My chipset is an Intel Centrino Advanced-N 6200 in a Sony VAIO laptop running Windows 7. Now, I know that Windows is only capable of listening, so I boot BackTrack 4 from a USB stick. I also want to know if a live distribution can work flawlessly with the wifi card even if it does not support the aforementioned things, because I tried to use wget to download something and it says it cannot resolve the address? Thanks, Richard
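
    For what it's worth, BackTrack ships the aircrack-ng suite, which can answer the capability question empirically; a sketch (interface names vary by system):

        airmon-ng                  # lists wireless interfaces with chipset and driver
        airmon-ng start wlan0      # tries to create a monitor-mode interface (often mon0)
        aireplay-ng --test mon0    # runs the packet-injection test against nearby APs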


  • Static Map API that supports HTTPS requests and is free

    - by gravyface
    Any options out there? Google Static Maps over SSL (HTTPS) is only available to Premier members, and Bing's Open Map API seems to have the same restriction. Any other ideas? I'm debating whether to schedule a cronjob to wget the images over HTTP and dump them into a folder (named after the content object IDs that reference them), but I'd rather find a free map API (or something less kludgey) that supports HTTPS requests.
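
    If the cron workaround wins out, a sketch of the crontab entry (the URL, parameters, and paths are all hypothetical); the page then references the locally cached copy over HTTPS:

        # refresh the cached static map hourly over plain HTTP
        0 * * * * wget -q -O /var/www/site/maps/1234.png "http://maps.example.com/staticmap?center=45.5,-73.5&zoom=12&size=400x300"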


  • Unable to have nokogiri obey custom path parameters during install

    - by Christopher
    I am trying to install nokogiri locally on Dreamhost using the commands:

        $ wget ftp://xmlsoft.org/libxml2/libxml2-2.7.6.tar.gz
        $ wget ftp://xmlsoft.org/libxml2/libxslt-1.1.26.tar.gz
        $ tar zxvf libxml2-2.7.6.tar.gz
        $ cd libxml2-2.7.6
        $ ./configure --prefix=$HOME/local/ --exec-prefix=$HOME/local
        $ make && make install
        $ cd ..
        $ tar zxvf libxslt-1.1.26.tar.gz
        $ cd libxslt-1.1.26
        $ ./configure --prefix=$HOME/local/ --with-libxml-prefix=$HOME/local/
        $ make && make install
        $ export LD_LIBRARY_PATH=$HOME/local/lib
        $ gem install nokogiri -- --with-xslt-dir=$HOME/local \
            --with-xml2-include=$HOME/local/include/libxml2 \
            --with-xml2-lib=$HOME/local/lib

    but it still gives the error:

        Building native extensions. This could take a while...
        ERROR: Error installing nokogiri:
        ERROR: Failed to build gem native extension.
        /usr/bin/ruby1.8 extconf.rb
        checking for iconv.h in /opt/local/include/,/opt/local/include/libxml2,/opt/local/include,/opt/local/include,/opt/local/include/libxml2,/usr/local/include,/usr/local/include/libxml2,/usr/include,/usr/include/libxml2,/usr/include,/usr/include/libxml2... yes
        checking for libxml/parser.h in /opt/local/include/,/opt/local/include/libxml2,/opt/local/include,/opt/local/include,/opt/local/include/libxml2,/usr/local/include,/usr/local/include/libxml2,/usr/include,/usr/include/libxml2,/usr/include,/usr/include/libxml2... yes
        checking for libxslt/xslt.h in /opt/local/include/,/opt/local/include/libxml2,/opt/local/include,/opt/local/include,/opt/local/include/libxml2,/usr/local/include,/usr/local/include/libxml2,/usr/include,/usr/include/libxml2,/usr/include,/usr/include/libxml2... no
        libxslt is missing. try 'port install libxslt' or 'yum install libxslt-devel'
        *** extconf.rb failed ***
        Could not create Makefile due to some reason, probably lack of necessary libraries and/or headers. Check the mkmf.log file for more details. You may need configuration options.

        Provided configuration options:
          --with-opt-dir
          --without-opt-dir
          --with-opt-include
          --without-opt-include=${opt-dir}/include
          --with-opt-lib
          --without-opt-lib=${opt-dir}/lib
          --with-make-prog
          --without-make-prog
          --srcdir=.
          --curdir
          --ruby=/usr/bin/ruby1.8
          --with-iconv-dir
          --without-iconv-dir
          --with-iconv-include
          --without-iconv-include=${iconv-dir}/include
          --with-iconv-lib
          --without-iconv-lib=${iconv-dir}/lib
          --with-xml2-dir
          --without-xml2-dir
          --with-xml2-include
          --without-xml2-include=${xml2-dir}/include
          --with-xml2-lib
          --without-xml2-lib=${xml2-dir}/lib
          --with-xslt-dir
          --without-xslt-dir
          --with-xslt-include
          --without-xslt-include=${xslt-dir}/include
          --with-xslt-lib
          --without-xslt-lib=${xslt-dir}/lib

        Gem files will remain installed in /home/myusername/.gems/gems/nokogiri-1.4.1 for inspection.
        Results logged to /home/myusername/.gems/gems/nokogiri-1.4.1/ext/nokogiri/gem_make.out

    It doesn't seem to be looking in the paths I have specified for the libraries. Is there something wrong with my installation method?
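
    One hedged suggestion, based on the option names printed in the output above: the libxslt headers and libraries can be pointed at individually, which sidesteps the failing autodetection (paths assume the $HOME/local prefix used above):

        # pass explicit include/lib paths instead of the -dir shorthand
        gem install nokogiri -- --with-xml2-include=$HOME/local/include/libxml2 \
            --with-xml2-lib=$HOME/local/lib \
            --with-xslt-include=$HOME/local/include \
            --with-xslt-lib=$HOME/local/lib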


  • Need help again altering output of script

    - by Aaron
        wget --output-document=- http://runescape.com/title.ws 2>/dev/null \
          | grep PlayerCount \
          | head -1l \
          | sed 's/^[^>]*>//' \
          | sed "s/currently.*$/$(date '+%m\/%d\/%Y %H:%m:%S')/" \
          | cut -d">" -f 3,4 \
          | sed 's/<\/span>//' \
          | awk '{print $3, $4, $1, $2}'

    will output:

        03/19/2012 18:03:58 123,822 people

    Would anyone be able to help me rewrite this so the output looks like:

        03/19/2012 18:03:58,123822,people

    I need it this way because when I import it into Google Docs, everything with a comma gets separated. Thanks if you help!
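
    A sketch of one fix, replacing only the final awk stage: strip the comma from the count and print literal commas between the fields (in that stage $1 is the count, $2 is "people", and $3/$4 are the date and time). As an aside, the date format uses %m (month) where %M (minutes) was probably intended:

        ... | awk '{gsub(/,/, "", $1); print $3 " " $4 "," $1 "," $2}'
        # prints: 03/19/2012 18:03:58,123822,people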


  • Downloading all ctrl alt del webcomics using terminal.

    - by Conner
    I've tried using the following command to download the Ctrl+Alt+Del comics:

        for filename in $(seq 20021023 20100503); do wget http://www.ctrlaltdel-online.com/comics/"$filename".jpg; done

    I get the following error: "bash: syntax error near unexpected token 'do'". I've also tried using cURL, with this command:

        curl http://ctrlaltdel-online.com/comics[20021023..20100503].jpg

    I get the following error: "curl: (3) [globbing] error: bad range specification after pos 37". Any help would be great.
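
    Two observations, offered as a sketch: the bash loop as written is syntactically valid, so the "unexpected token" error usually means it was run under a different shell or pasted with smart quotes; and curl writes numeric ranges with a dash, not "..". Corrected versions (the /comics/ path follows the wget URL above):

        # bash: seq emits many numbers that are not real dates, so expect some 404s
        for filename in $(seq 20021023 20100503); do
            wget "http://www.ctrlaltdel-online.com/comics/${filename}.jpg"
        done

        # curl: dash-delimited range; -O saves each file under its remote name
        curl -O "http://www.ctrlaltdel-online.com/comics/[20021023-20100503].jpg"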


  • Uninstall Git completely on Ubuntu?

    - by Millisami
    I installed Git on Ubuntu Lucid (latest) manually, as follows:

        cd ~/tmp
        wget http://kernel.org/pub/software/scm/git/git-1.7.0.6.tar.gz
        tar -xzvf git-1.7.0.6.tar.gz
        cd git-1.7.0.6
        ./configure
        sudo make
        sudo make install

    Now, how can I completely uninstall it?


  • How do I clone over HTTP a repository that has no info/refs?

    - by gbacon
    Given a repository served over HTTP whose owner forgot to chmod +x hooks/post-update, is there a workaround for cloning it? I tried running wget --mirror url, but rather than fetching the subtree only, it tried to mirror the entire site—which I assume happened due to the parent-directory links in the autogenerated index.html resources.
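
    On the wget side, a sketch that avoids the runaway mirroring (standard wget flags; the URL is a placeholder): --no-parent refuses to ascend above the starting directory, so the parent links in the generated indexes are ignored. Note that a dumb-HTTP clone still needs an up-to-date info/refs, which is normally produced by git update-server-info from the post-update hook, so mirroring may fetch the files and yet leave the clone unusable:

        # mirror only the repository subtree, ignoring parent-directory links
        wget --mirror --no-parent http://example.com/repos/project.git/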


  • Scraping a page from a secure URL which is possibly using a session ID

    - by VN44CA
    How do I scrape a page like this? https://www.procom.ca/JobList.aspx?keywords=&Cities=&reference=&JobType=0 It is served over HTTPS and seems to require a referrer; I can't get anything using wget or httplib2. If you go through this page first, you get the list, and it works in a browser but not from the command line: https://www.procom.ca/jobsearch.aspx I am interested in command-line fetching. Thanks.
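
    A sketch of the usual command-line recipe for session-plus-referrer pages (standard curl flags; the two-step order is the point): fetch the search page first to collect the ASP.NET session cookie, then request the list while replaying the cookie and declaring the referrer:

        # step 1: store any session cookies the search page sets
        curl -s -c cookies.txt "https://www.procom.ca/jobsearch.aspx" > /dev/null

        # step 2: fetch the list with those cookies and a matching Referer header
        curl -s -b cookies.txt -e "https://www.procom.ca/jobsearch.aspx" \
            "https://www.procom.ca/JobList.aspx?keywords=&Cities=&reference=&JobType=0"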


  • [Python] Download an image embedded in a MIME multipart message

    - by michele
    Hi, I have to download some images from links. These links return a file in which a TIFF image is embedded in a multipart MIME message. I have written this code, but it downloads the file with the MIME wrapper still in place. How can I strip the MIME framing from this file and get the image back? Can I do this with wget or curl? My code:

        def download(url, local):
            import urllib
            urllib.urlretrieve(url, local)
            urllib.urlcleanup()

    Thanks a lot.
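
    wget and curl only move bytes; neither will parse MIME for you. One sketch of a post-processing step, assuming the mpack tools are installed (munpack splits a MIME message into its component files); Python's standard email module can do the same parse on the saved file:

        # fetch the multipart response, then extract the embedded parts
        wget -q -O message.mime "http://example.com/image-link"    # hypothetical URL
        munpack message.mime    # writes each MIME part (e.g. the .tiff) into the current directory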


  • Installing software on Solaris

    - by sturda
    I'd like to install several Unix utilities (incl. xmlstarlet and wget) on a Solaris 10 machine which I don't have root access to (obviously, I have a user account). I'm not that experienced with Solaris and am wondering if I can simply get hold of a self-contained binary for each utility I need and just place it in my home directory? Is this feasible? Many thanks
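
    Static binaries for Solaris can be hard to come by; the other no-root route is building from source into the home directory. A sketch, assuming a compiler is available on the box (the version number is illustrative, and the same pattern fits most autoconf-based tools):

        # build into $HOME/local without root
        tar xzf wget-1.21.tar.gz && cd wget-1.21     # tarball fetched by any available means
        ./configure --prefix=$HOME/local
        make && make install
        export PATH=$HOME/local/bin:$PATH            # add to your shell profile to persist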


  • What is a better way to convert a simple sinatra app to static html pages?

    - by dimus
    A friend of mine asked me to create a static website, and I found that making such a site with Sinatra is a pure joy. I just wrote all my routes like this:

        get '/index.html' do
          haml :index
        end

        get '/app.css' do
          sass :app
        end
        ....

    so I was able to use layouts, haml and sass to put the site together quickly. To create the static site I used:

        wget -r -l2 http://localhost:4567

    which did work pretty well, but I imagine there is a better way to create a static site from Sinatra code?
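
    A small refinement to the wget step, using standard flags, makes the crawl more faithful for static hosting: -k rewrites links so they work when browsed locally, and -E appends .html to pages served without an extension:

        # mirror two levels deep, fixing up links and extensions for static serving
        wget -r -l2 -k -E http://localhost:4567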


  • How to install new packages on Cygwin?

    - by Mulone
    Hi all, I installed the latest version of Cygwin with a number of packages. I soon realised that I need more packages (such as wget, etc.) and I couldn't find a way to install new packages without running the setup again and reinstalling everything from scratch. What I'm looking for is the equivalent of apt-get on Cygwin (if such a thing exists). Cheers, Mulone
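
    Re-running setup.exe does not wipe the existing installation; it only (un)installs what you change. It also takes command-line flags, which gets close to the apt-get feel; a sketch (the package names are examples), and the community apt-cyg script wraps the same idea in an apt-get-like interface:

        # unattended install of additional packages into an existing Cygwin
        setup.exe -q -P wget,rsync,curl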


  • Set Hudson build number from a script

    - by Joe Schneider
    Is there a way to set the next build number in Hudson from a script? I have the nextBuildNumber plug-in installed, and attempted to use wget with --post-data, but that page appears to require login. I have two steps of a chained build and I want to keep the build numbers in sync.
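
    wget can present the credentials itself; a sketch, with the caveat that the host, job name, and submit path here are assumptions about the Next Build Number plugin's form (check the form's action attribute on the plugin page for the real path):

        # --auth-no-challenge sends Basic credentials preemptively, as Hudson expects
        wget --auth-no-challenge --http-user=USER --http-password=API_TOKEN \
            --post-data="nextBuildNumber=42" \
            "http://hudson.example.com/job/my-job/nextbuildnumber/submit"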


  • shell tool which renders web site including javascript

    - by drholzmichl
    Hi, we want to test our web pages from the Linux shell. For that reason I'm looking for a shell tool which fetches the HTML page from the server (like wget) and then executes the contained JavaScript, loads images, and so on. After that, the tool should give me a 'screenshot' of the rendered page, so that I can compute a checksum for that screen. (So I want to do the same as opening a browser on Windows, loading the web page and taking a screenshot after page load, but on Linux.) Can anyone give me a hint?
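
    A sketch of one such pipeline, assuming CutyCapt (a small Qt/WebKit page renderer) and Xvfb are installed; xvfb-run provides the X display that the renderer needs on a headless box:

        # render the page, JavaScript and images included, then checksum the result
        xvfb-run --server-args="-screen 0 1024x768x24" \
            cutycapt --url=http://example.com/ --out=page.png
        md5sum page.png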


  • how to test rails js and ajax without a browser

    - by user1679052
    When I use RSpec with Capybara to test my Rails JS pages, I get the following error: "Selenium::WebDriver::Error::WebDriverError: Could not find Firefox binary (os=linux)." My Rails code all lives on the Linux server, where no browser is installed, and desktop software is not supported (since no X11 is installed). How can I test JS in this situation? Or is there a browser that works without X11, the way wget does? Thanks.
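
    Two common ways out, sketched here with Debian/Ubuntu-flavoured package names as assumptions: switch Capybara to a headless driver such as Poltergeist/PhantomJS, which needs no X server at all, or keep Selenium and satisfy it with a virtual display:

        # install a virtual framebuffer X server plus Firefox, then run the suite under it
        sudo apt-get install xvfb firefox
        xvfb-run bundle exec rspec    # Selenium now finds a display and a Firefox binary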

