Search Results

Search found 593 results on 24 pages for 'wget'.


  • installing latest apache on centos

    - by fivelitresofsoda
    Hi, I'm trying to install the newest version of Apache on my CentOS server. I did the following:

        # Download
        $ wget http://httpd.apache.org/path/to/latest/version/
        # Extract
        $ gzip -d httpd-2_0_NN.tar.gz
        $ tar xvf httpd-2_0_NN.tar
        # Configure, compile, install
        $ ./configure
        $ make
        $ make install
        # Test
        $ PREFIX/bin/apachectl start

    That all worked except the last step: when I type apachectl start it says 'command not found'. I ran this command from /usr/local/apache2/bin/, where it is installed, but no cigar. Any idea what I am doing wrong? Thanks.
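    The likely catch: the shell never searches the current directory, so being inside /usr/local/apache2/bin does not make its binaries visible as bare commands. A minimal sketch, assuming the default --prefix of /usr/local/apache2:

        $ /usr/local/apache2/bin/apachectl start            # invoke by absolute path, or:
        $ cd /usr/local/apache2/bin && ./apachectl start    # "./" forces a current-directory lookup
        $ export PATH=/usr/local/apache2/bin:$PATH          # or put the bin directory on the PATH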

    Read the article

  • How to invoke a command using specific proxy server?

    - by Xiè Jìléi
    Some applications support a proxy (HTTP proxy or SOCKS proxy), and some do not. For browsers, I can specify the proxy server in the preferences/options dialog, and other applications may let me configure proxy servers in config files. For the general case, can I invoke a command through a specific proxy? Something like the following:

        $ proxy-exec --type socks5 --server 1.2.3.4:8000 -- wget/ftp ...

    I'm using Ubuntu Maverick. P.S. On win32 this can perhaps be implemented by hijacking the socket DLLs. I'm not familiar with Linux programming, but I guess something similar is possible on Linux too.
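    For tools that honor the standard proxy environment variables (wget and curl both do), the proxy can be set per-invocation without a wrapper; a minimal sketch, with 1.2.3.4:8000 standing in for the real server:

        $ http_proxy=http://1.2.3.4:8000 wget http://example.com/file
        $ ALL_PROXY=socks5://1.2.3.4:8000 curl http://example.com/file   # curl also speaks SOCKS5

    Tools that ignore these variables can often be wrapped with tsocks or proxychains instead.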

    Read the article

  • receiving "command not found" error messages after fresh reinstall of Lubuntu 14.04

    - by user236378
    Lubuntu 14.04 was working really great... until I messed up and had to do a complete fresh reinstall. Now I receive error messages when I input commands into the terminal, even immediately after completing the fresh install. For example, I type:

        sudo leafpad /etc/default/
        sudo leafpad /etc/default/grub

    and I get:

        sudo: leafpad: command not found

    I type:

        sudo update-initramfs -u
        sudo update-grub

    and I get:

        sudo: update-initramfs: command not found
        sudo: update-grub: command not found

    If I use the command mkdir I get mkdir: command not found. I get the same exact "command not found" error with sudo apt-get and wget. In other words, I can't do anything that I was able to do before when entering commands into the terminal, so I cannot add any repositories or update anything at all. I am not really sure what is causing the problem(s). Lubuntu appeared to install and boot up OK, but as soon as I enter anything into the terminal I immediately get the above error messages. I have tried the reinstall three times, with the same error messages. If anyone can suggest any fixes I would really appreciate it very much. Thank you!
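    mkdir failing suggests the shell's PATH is broken rather than packages missing, since /bin/mkdir is part of coreutils on any install. A hedged diagnostic sketch:

        $ echo $PATH                 # should list /usr/local/bin, /usr/bin, /bin, ...
        $ /bin/ls /bin | head        # bypass the PATH lookup with an absolute path
        $ export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

    If the absolute-path invocation works, something (e.g. a stray line in ~/.profile or ~/.bashrc) is clobbering PATH.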

    Read the article

  • I need a few minutes of dedicated server a week, but not for hosting, just to convert ogg etc

    - by talkingnews
    I'm completely happy with my web hosting; it's just that I need to do one little thing they won't allow, and that's run an instance of SoX to convert about 30 mp3s to ogg files, in various directories, a couple of times a week, to be done automatically in response to the detection of an mp3 upload. I'm probably looking at a minute of server time over the whole week. I've had unhelpful suggestions on other forums like "why not leave your home PC on 24 hours a day and then use all your ISP bandwidth to do this", which doesn't work for me. I know that I can host files on, say, Amazon S3, but is there something similar for my needs? All it would need to do is: wget/ftp the mp3 files, convert them to ogg, and ftp the files back to my hosting. Of course, none of this would be needed if there were such a thing as a compiled binary of SoX (or any mp3-to-ogg converter) for CentOS which I could upload without needing root access, but I've given up asking for that one. Always open to suggestions, though!
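    The conversion step itself is a one-liner; a hedged sketch, assuming a sox build with MP3 read support (often packaged separately, e.g. as libsox-fmt-mp3) and placeholder FTP credentials:

        # fetch the mp3s, convert each to ogg, push the results back
        wget -r -A '*.mp3' ftp://user:pass@host/path/
        for f in */*.mp3; do sox "$f" "${f%.mp3}.ogg"; done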

    Read the article

  • MongoDB REST interface not listening after update

    - by Ones and Zeroes
    I replaced the mongodb-10gen install with the Ubuntu packages (mongodb-server, mongodb-client and dev):

        apt-get install mongodb

    Since then I have been unable to connect to the REST interface, where it worked before. Doing a wget to http://127.0.0.1:27018, I receive the following response:

        Connecting to 127.0.0.1:27018... failed: Connection refused.

    My previous /etc/mongodb.conf file had the following in it:

        # enable REST
        rest = true

    Adding it to the packaged conf file does not resolve the issue, not even after restarting. I also tried changing the following:

        # Disable the HTTP interface (Defaults to localhost:27018).
        # nohttpinterface = true

    to

        # Disable the HTTP interface (Defaults to localhost:27018).
        nohttpinterface = false

    with no effect. I have searched for days, and there doesn't seem to be anything on the Mongo site about a similar anomaly. If you have encountered a similar issue on Ubuntu Oneiric, please add your comments, even if you haven't found a solution to it.
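    Two hedged checks that narrow this down: confirm which config file the packaged mongod actually reads, and probe the HTTP interface on the daemon port plus 1000 (28017 for the default 27017), since that, rather than 27018, is where mongod's web interface traditionally listens:

        $ ps aux | grep mongod               # look for a -f/--config argument
        $ sudo netstat -tlnp | grep mongod   # which ports is mongod actually listening on?
        $ wget -qO- http://127.0.0.1:28017/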

    Read the article

  • How to set which IP to use for an HTTP request?

    - by GetFree
    This is probably a silly question. I'm doing some HTTP requests using wget from the command line, and I want those connections to be made through one specific IP of the 4 IPs my server has. The requests go to one specific range of IPs, so only those should be routed differently. The 4 interfaces on my server are eth0, eth0:0, eth0:1 and eth0:2. I tried the following command:

        route add -net 192.164.10.0/24 dev eth0:0

    But when I look at the routing table it says:

        Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface
        192.164.10.0    0.0.0.0         255.255.255.0   U         0 0          0 eth0

    The interface is set to eth0, not eth0:0 as my command says. What am I doing wrong?
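    For what it's worth, eth0:0 is an alias of eth0 rather than a separate device, so routes collapse onto eth0; what actually selects the outgoing address is the route's src hint or the application's bind option. Two hedged sketches, with 192.168.0.2 standing in for the address assigned to eth0:0:

        $ wget --bind-address=192.168.0.2 http://192.164.10.5/       # per command
        $ ip route add 192.164.10.0/24 dev eth0 src 192.168.0.2      # per destination range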

    Read the article

  • Best way to use mod_rewrite to replace WordPress pages with static files

    - by David Moles
    Here's the situation: I've got an old WordPress installation that I'd like to archive as static files, but I'd also like to preserve old URLs. I've already created the static archive with wget and sorted out the filenames and links. Now I'd like to configure Apache to intercept requests for the old dynamic URL and replace them with the new static one, e.g.:

        http://www.example.org/log/?p=1234
        http://www.example.org/log/index.php?p=1234

    should redirect to

        http://www.example.org/log/archives/1234.html

    I've tried adding the following to the VirtualHost config for example.org, but to no effect -- I just get the PHP page.

        RewriteCond %{REQUEST_URI} /log/
        RewriteCond %{QUERY_STRING} p=([^&;]*)
        RewriteRule ^/$ http://%{SERVER_NAME}/log/archives/%1.html [R,L]

    I've enabled logging and I can see what look like other rules being applied, but not this one. None of my other guesses at match patterns for %{REQUEST_URI} seem to have any effect either (log, log/, log.*, even .*). I'm new to mod_rewrite and this is mostly cargo cult, so I'm pretty sure I've gotten it wrong. Anyone know what I should be doing here?
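    One hedged guess at the failure: in VirtualHost context the RewriteRule pattern matches the full URL path, so ^/$ only ever matches the site root and can never fire for /log/. A sketch of a variant that targets the /log/ URLs and drops the query string on redirect (the trailing ? does that in Apache 2.2; [QSD] is the 2.4 idiom):

        RewriteEngine On
        RewriteCond %{QUERY_STRING} (?:^|&)p=([^&;]+)
        RewriteRule ^/log/(?:index\.php)?$ http://%{SERVER_NAME}/log/archives/%1.html? [R,L]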

    Read the article

  • Why does Blogger put BR tags in content only when it is served via the RSS feed?

    - by tamashumi
    I have a problem with my new blog's RSS feed. I wrote a post (the first one) with some code examples formatted by SyntaxHighlighter. To paste a code sample I switch from the WYSIWYG editor to HTML view and put the code inside a pre tag, like this (don't worry, the h4 tag was opened a line above the screenshot area). The problem is that when the post is later accessed via the RSS feed, that pre tag contains br tags instead of newline characters; below is a screenshot of the relevant RSS source code. Most importantly, when accessed via the web the post HTML is formatted fine, with no brs inside the pre; I verified that by downloading the blog post with wget. So I believe this is neither a SyntaxHighlighter issue nor any 'new line' formatting applied when the blog post is saved. This is a real problem, as I want to aggregate my blog on my employer's blog, and all the formatting of the code examples is broken because of it. The basic question is: how do I get rid of those unwanted brs served via RSS? Strangest of all, a friend of mine also uses Blogger for such aggregation and he has no such issue. I checked his RSS feed and there are no brs inside the pre tags. We also compared the settings of our blogs, but found no clue. (The blog post RSS for my blog: just check the source and search for the string pre class="brush. The mentioned friend's blog was linked as well.) One last thing: I see the content served via RSS is now also HTML-encoded. If I remember correctly, it wasn't previously.

    Read the article

  • Download all directories, files and subdirectories from an HTTP server

    - by Jack
    I want to download all directories, files and subdirectories from a remote HTTP server. I found some solutions for FTP servers, but they don't work for HTTP. So far I've had no luck with wget -r or -m: it downloads all the directories at the root and their respective index.html files, but not all the files and subdirectories below them (note that a subdirectory may contain further directories, and so on). I'm not sure about the tags; fix them for me if needed. Note: I'm not a native English speaker, sorry for my bad English.
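    A hedged wget sketch for a deeper mirror; note that recursion over HTTP can only follow links the server actually exposes (e.g. directory index pages), so files that aren't linked anywhere stay unreachable:

        $ wget --mirror --no-parent --convert-links -e robots=off http://example.com/dir/
        # --mirror      = -r -N -l inf (recursive, timestamped, unlimited depth)
        # --no-parent   stay below the starting directory
        # -e robots=off ignore robots.txt exclusions (use responsibly)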

    Read the article

  • Ubuntu apt-get install (--download-only) executed from another machine on behalf of mine

    - by Maroloccio
    I have a server on a network segment with no direct or indirect access to the Internet. I want to perform an:

        apt-get install <package_name>

    Is there a way to delegate the downloading of the required files to another machine, by exporting the server's configuration so that all dependencies are satisfied while running:

        apt-get install --download-only <package_name>

    Can, in effect, apt-get install read its configuration from an exported archive rather than from the local package database? Can the list of packages to be downloaded be retrieved, along with an installation script to perform the installation, instead of the actual packages? (A further level of indirection, which would help me schedule this with wget at appropriate times...)
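    One hedged approach along exactly those lines: apt-get --print-uris resolves the dependency list locally (so it must run on the isolated server, or on a machine with the same package state and apt lists) and emits the download URLs without fetching anything; the list can then be retrieved elsewhere with wget:

        # on the isolated server: resolve dependencies, emit URLs only
        apt-get --print-uris --yes install <package_name> | grep "^'" | cut -d"'" -f2 > urls.txt
        # on a connected machine:
        wget --input-file=urls.txt
        # then copy the .debs back into /var/cache/apt/archives/ and install as usual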

    Read the article

  • Using command line to connect to a wireless network with an http login

    - by Shane
    I'm trying to connect to a wifi network that hijacks all requests and redirects you to a page where you have to agree to a terms of use before it lets you connect to the actual outside world. This is a pretty common practice, and usually doesn't pose much of a problem. However, I've got a computer running Ubuntu 9.10 server with no windowing system. How can I use the command line to agree to the terms of use? I don't have internet access on the computer to download packages via apt-get or anything like that. Sure, I can think of any number of workarounds, but I suspect there's an easy way to use wget or curl or something. Basically, I need a command-line solution for sending an HTTP POST request, essentially clicking a button. For future reference, it'd also be helpful to know how to send a POST request with, say, a username and password, if I ever find myself in that situation in another hotel or airport.
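    Both tools can do this; a hedged sketch, where the URLs and field names are assumptions that have to be read out of the portal's own login form first (fetch any page and inspect the HTML you are redirected to):

        $ wget -qO- http://example.com/ | grep -i '<form\|<input'    # discover the form action and fields
        $ curl -d 'agree=yes' http://portal.example.net/login        # hypothetical field and URL
        $ wget -qO- --post-data='username=alice&password=secret' http://portal.example.net/login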

    Read the article

  • Determine which version of linux/unix/darwin I have

    - by John
    I have root ssh/terminal access to a Linux server. How do I determine which version of CentOS I have? Some people suggested I run cat /etc/redhat-release, but I got a "file not found" error. In fact, I'm not entirely sure I'm even running CentOS; that's just what some people suggested it might be. Here's a list of commands I tried that gave me a "No such file or directory" error:

        cat /etc/*release*
        cat /etc/*version*
        cat /proc/*version*
        cat /proc/*release*

    And here's a list of commands that do not exist:

        lsb_release: command not found
        wget: command not found
        yum: command not found
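    A few hedged probes that exist on virtually any Unix-like system, regardless of distribution:

        $ uname -a            # kernel name, version and architecture
        $ cat /etc/issue      # pre-login banner, often names the distro
        $ ls /etc/*-release /etc/*_version 2>/dev/null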

    Read the article

  • What is the difference between yum, apt-get, rpm, ./configure && make install

    - by Saif Bechan
    I am new to Linux and am running CentOS. When I want to update or install certain software, I have come across three ways. Sometimes it's:

        yum install program

    sometimes:

        rpm -i program.rpm

    and sometimes:

        wget program.tar.gz
        # unpack, then:
        ./configure
        make
        make install

    That last one is a real pain, especially when you come from Windows, where installing a program is usually one click and then a nice guided setup. Now can someone please explain to me: Why are there so many different ways to do this? Which one do you recommend, and why? Are there any other ways of installing programs?

    Read the article

  • My laptop (HP/Compaq 2510p) running Ubuntu 10.04 LTS keeps losing the WLAN connection.

    - by Ernelli
    I am using Wicd and can successfully connect to my ADSL router (Thomson TG 787) using WPA-PSK, but at regular intervals I lose the ability to connect to the Internet. I can ping the gateway, and can actually ping servers on the Internet, but I cannot connect to them over HTTP (tested with both Firefox and wget). I would suspect the router, were it not for the fact that the problem does not show up when running Windows XP on the same computer; also, when the problem arises, a simple disconnect/connect in Wicd solves it, which does not involve the router (except for the DHCP request). I have searched the Ubuntu forums without luck; most of the problems described relate to specific network drivers or other issues. Does anyone have the same experience with Linux/Ubuntu and WLAN?
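    When small packets (ping) get through but TCP connections stall, one classic suspect is a path-MTU problem; a hedged diagnostic sketch, with wlan0 as the assumed interface name:

        $ ping -M do -s 1472 8.8.8.8            # don't-fragment ping at full Ethernet payload;
                                                # shrink -s until it succeeds
        $ sudo ip link set dev wlan0 mtu 1400   # then experiment with a smaller MTU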

    Read the article

  • DVD won't mount on Ubuntu 12.04

    - by CyborgGold
    I can't seem to be able to mount my optical drive, and I am not able to see the device in the file browser either. There is a DVD in the drive. I have tried numerous solutions from this site with no results. I am running 12.04 on an HP G60-235DX portable; there is a link to the specs below, and I will also list what I have tried (what I can find right now). I know the drive is functioning, because just before Windows 7 crashed and my MBR went fubar, I was watching movies just fine. I am fairly new to Linux, so don't assume I know anything. OK, so here is what I have tried:

        sudo wget --output-document=/etc/apt/sources.list.d/medibuntu.list http://www.medibuntu.org/sources.list.d/$(lsb_release -cs).list
        sudo apt-get --quiet update
        sudo apt-get --yes --quiet --allow-unauthenticated install medibuntu-keyring
        sudo apt-get --quiet update
        sudo apt-get install libdvdcss2
        dmesg | grep sr0                  # no output
        apt-get install libdvdnav4        # already installed, and up to date
        sudo /usr/share/doc/libdvdread4/install-css.sh

        ls -l /dev/cdrom /dev/cdrw /dev/dvd /dev/dvdrw /dev/scd0 /dev/sr0
        ls: cannot access /dev/scd0: No such file or directory
        lrwxrwxrwx  1 root root       3 Sep 10 03:51 /dev/cdrom -> sr0
        lrwxrwxrwx  1 root root       3 Sep 10 03:51 /dev/cdrw -> sr0
        lrwxrwxrwx  1 root root       3 Sep 10 03:51 /dev/dvd -> sr0
        lrwxrwxrwx  1 root root       3 Sep 10 03:51 /dev/dvdrw -> sr0
        brw-rw----+ 1 root cdrom 11,  0 Sep 10 03:51 /dev/sr0

        wodim --devices
        wodim: Overview of accessible drives (1 found) :
        -------------------------------------------------------------------------
         0  dev='/dev/sg1'   rwrw-- : 'TSSTcorp' 'CDDVDW TS-L633M'
        -------------------------------------------------------------------------

        sudo lshw    # optical drive section:
        *-cdrom
             description: DVD-RAM writer
             product: CDDVDW TS-L633M
             vendor: TSSTcorp
             physical id: 1
             bus info: scsi@1:0.0.0
             logical name: /dev/cdrom
             logical name: /dev/cdrw
             logical name: /dev/dvd
             logical name: /dev/dvdrw
             logical name: /dev/sr0
             version: 0200
             capabilities: removable audio cd-r cd-rw dvd dvd-r dvd-ram
             configuration: ansiversion=5 status=nodisc

        sudo lshw | grep cdrom
        *-cdrom
             logical name: /dev/cdrom

    Spec sheet for the portable: http://www.cnet.com/laptops/hp-g60-235dx/4507-3121_7-33496192.html

    If you need any more information beyond all of that, please let me know.
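    Given that lshw reports status=nodisc, a hedged next step is a manual mount attempt plus a look at the kernel log (video DVDs use UDF; data discs are usually iso9660):

        $ sudo mount -t udf /dev/sr0 /mnt       # try UDF first for a video DVD
        $ sudo mount -t iso9660 /dev/sr0 /mnt   # then iso9660 for a data disc
        $ dmesg | tail                          # any read errors after inserting the disc?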

    Read the article

  • Mac OS X 10.6 executable not found without full path

    - by Danack
    I just installed Apache via MacPorts. It seems my Mac was thoroughly confused about which version of the Apache executables to run. After moving the Apache executables that ship with the Mac to a directory that is not listed in the PATH variable, trying to run the httpd built by MacPorts fails, even though the correct directory (/opt/local/apache2/bin) is listed in the PATH variable. If I navigate to /opt/local/apache2/bin and type httpd, I still get the error message:

        -bash: httpd: command not found

    If I type the command with the full path, /opt/local/apache2/bin/httpd, it works fine. I've run the alias command to see if something was clashing, but the only thing listed is:

        alias wget='curl -O'

    How do I find out what is intercepting the command and preventing the executable from being found, even when I'm inside the same directory? By the way, the httpd file is executable:

        -rwxr-xr-x 1 root admin 442496 9 May 2012 httpd
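    Two things worth ruling out, sketched below: bash caches command locations (a stale cache can keep pointing at the moved Apple binaries), and being inside a directory does not put it on the PATH, so a bare httpd still relies on the PATH lookup:

        $ hash -r           # flush bash's cached command locations
        $ type -a httpd     # list every httpd bash would consider, in PATH order
        $ echo "$PATH"      # confirm /opt/local/apache2/bin really is listed, with no typo
        $ ./httpd -v        # run the binary in the current directory explicitly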

    Read the article

  • iptables block everything except http

    - by arminb
    I'm trying to configure iptables to block all network traffic except HTTP:

        iptables -P INPUT DROP    # set policy of INPUT to DROP
        iptables -P OUTPUT DROP   # set policy of OUTPUT to DROP
        iptables -A INPUT -p tcp --sport 80 -m state --state ESTABLISHED,RELATED -j ACCEPT
        iptables -A OUTPUT -p tcp --dport 80 -m state --state NEW,ESTABLISHED -j ACCEPT

    The output of iptables -L -v gives me:

        Chain INPUT (policy DROP 0 packets, 0 bytes)
         pkts bytes target  prot opt in  out  source    destination
            4   745 ACCEPT  tcp  --  any any  anywhere  anywhere     tcp spt:http state RELATED,ESTABLISHED

        Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
         pkts bytes target  prot opt in  out  source    destination

        Chain OUTPUT (policy DROP 0 packets, 0 bytes)
         pkts bytes target  prot opt in  out  source    destination
            2   330 ACCEPT  tcp  --  any any  anywhere  anywhere     tcp dpt:http state NEW,ESTABLISHED

    When I try wget 127.0.0.1 (yes, I do have a web server, and it works fine) I get:

        --2012-11-14 16:29:01-- http://127.0.0.1/
        Connecting to 127.0.0.1:80...

    The request never finishes. What am I doing wrong? I'm setting iptables to DROP everything by default and adding rules to ACCEPT HTTP.
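    A hedged reading of why the local wget hangs: the request to the local web server arrives on INPUT with *destination* port 80 (which nothing accepts), and the server's reply leaves on OUTPUT with *source* port 80 (also dropped); the rules above only cover the client direction. A sketch of the missing pieces, plus the customary blanket allowance for loopback:

        iptables -A INPUT  -i lo -j ACCEPT
        iptables -A OUTPUT -o lo -j ACCEPT
        # if the box also serves HTTP to others:
        iptables -A INPUT  -p tcp --dport 80 -m state --state NEW,ESTABLISHED -j ACCEPT
        iptables -A OUTPUT -p tcp --sport 80 -m state --state ESTABLISHED -j ACCEPT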

    Read the article

  • How can I apply proxy settings system-wide on Linux?

    - by Sravan
    Our campus employs a proxy server with authentication, so I have to put http://username:password@proxyIp:port/ into the bash config file (for wget or curl, say), or manually enter the details for every graphical application (like gtalk). Also, if I work with localhost (XAMPP), I have to configure XAMPP, and so on. If my proxy password changes, I have to change it everywhere on the system! Is there a way I can apply proxy settings system-wide, in one place? Even though I am asking about Linux, I would like to know for Windows as well.
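    A common approach on Linux is to export the standard proxy variables from one shared shell profile; a hedged sketch (many, but not all, applications honor these variables, and the placeholders mirror the question's):

        # /etc/profile.d/proxy.sh -- sourced by login shells for every user
        export http_proxy="http://username:password@proxyIp:port/"
        export https_proxy="$http_proxy"
        export ftp_proxy="$http_proxy"
        export no_proxy="localhost,127.0.0.1"

    With this in place, a password change means editing one file (followed by logging out and back in).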

    Read the article

  • How to auto-update a website mirror with exceptions to certain pages?

    - by tomatosalad
    I'm currently mirroring a website on my server. The site itself is rarely updated, but it is updated enough that info can become outdated quickly. I mirrored it first with wget, and this worked fine, but I made some changes: The original index.html used frames, but the site also provides a main.html which is essentially index.html without frames. I deleted index.html and renamed main.html. I did not want to mirror the webchat, blog or forum, so I deleted those files and directories, made directories "blogs", "forum" and "chat", and placed a PHP redirect in each of them, redirecting visitors to the original site. I'd like to auto-update the mirror (maybe once every 24-72 hours), but preserve the changes I made. Is this possible? How would I go about doing it? I am completely clueless as to how. Thanks for any and all help! :)
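    A hedged cron-driven sketch: re-mirror on a schedule while excluding the replaced sections, then re-apply the local overrides (all paths and the site URL are placeholders):

        #!/bin/sh
        # refresh-mirror.sh -- run from cron, e.g.: 0 4 */2 * * /usr/local/bin/refresh-mirror.sh
        wget --mirror --no-parent \
             --exclude-directories=/chat,/blog,/forum \
             --directory-prefix=/var/www/mirror \
             http://original.example.org/
        # restore the frameless front page override
        cp /var/www/mirror/original.example.org/main.html \
           /var/www/mirror/original.example.org/index.html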

    Read the article

  • What is the significance of these different width, height and resolution parameters?

    - by ??????? ???????????
    An image with a pixel resolution of 640 x 480 has additional dimension and resolution parameters according to exiftool, and I'm unsure what they mean. Why are the X / Y Resolution parameters the same? Should they not reflect the pixel dimensions of the image? What does Exif Image Size mean, and how is it different from the pixel dimensions? What is the focal plane? Does it have any relation to the device used to capture this image?

        $ exiftool evil1.jpg | egrep 'Width|Height|Resolution'
        X Resolution                    : 180
        Y Resolution                    : 180
        Resolution Unit                 : inches
        Exif Image Width                : 400
        Exif Image Height               : 300
        Focal Plane X Resolution        : 8114.285714
        Focal Plane Y Resolution        : 8114.285714
        Focal Plane Resolution Unit     : inches
        Image Width                     : 640
        Image Height                    : 480

    If needed, the original image can be obtained with:

        here=http://www.pythonchallenge.com/pc/return/evil1.jpg
        wget --user=huge --password=file $here

    Read the article

  • Red Hat 6.5 - sysctl -w net.ipv6.conf.default.accept_redirects=0

    - by kjbradley
    I am in the process of writing a Red Hat 6.5 Kickstart disc with hardened security. I have run a program to determine where the weaknesses in my system are, and apparently accepting IPv6 redirects is flagged as a medium-severity problem. But when I add the following line to the post script of my kickstart, I can't reach any external websites with wget, or ssh/scp in from my computer:

        sysctl -w net.ipv6.conf.default.accept_redirects=0

    Is there a workaround so that the system stays hardened but I can still reach external systems?
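    For reference, a hedged sketch of the persistent form of this hardening, written to /etc/sysctl.conf rather than set with one-shot sysctl -w calls; refusing redirects should not, by itself, block outbound connections, so it may be worth applying the settings one at a time to find the real culprit:

        # /etc/sysctl.conf
        net.ipv6.conf.all.accept_redirects = 0
        net.ipv6.conf.default.accept_redirects = 0
        net.ipv4.conf.all.accept_redirects = 0
        net.ipv4.conf.default.accept_redirects = 0

        # apply and verify:
        sysctl -p
        sysctl net.ipv6.conf.all.accept_redirects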

    Read the article

  • Why can't I access a webserver through a load balancer on my local network?

    - by Karptonite
    When I try to use curl (or wget, lynx, etc) to connect from a server on our local network to our website, which is on a local server behind a CoyotePoint load balancer, curl fails. Ping does not have this problem. When I curl directly to any of the servers behind that load balancer (from and to the same local network), I also have no problem. It doesn't matter whether the local server I'm curling from is behind the load balancer or not. Does anyone have any idea why I can't access my webserver through the load balancer on my local network?

    Read the article

  • Monitoring/logging a malfunctioning internet connection

    - by Pekka
    I have a mysterious internet connection problem: every 15-20 minutes, the connection becomes very slow, and anything takes 2-3 minutes to load. I've had a technician from the ISP over here to test the hardware, and everything is in pristine condition. They have no explanation other than a configuration error on my machine, a possibility I can 90% exclude because I'm experiencing the same problems on another machine. I will have to monitor the situation now, and I would like to run a program that logs when the internet connection becomes slow. I thought about putting something together using at and wget. Does anybody know of a tool that does this out of the box, maybe with an adjustable request frequency, logging of connection speeds, etc.?
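    Lacking a dedicated tool, a hedged do-it-yourself sketch with cron and curl (the URL is a placeholder; curl's -w format prints total time and average download speed, and using a script avoids cron's special handling of % characters):

        #!/bin/sh
        # probe.sh -- append a timestamped speed record;
        # crontab entry: */5 * * * * /usr/local/bin/probe.sh
        echo "$(date -Is) $(curl -s -o /dev/null \
             -w '%{time_total}s %{speed_download}B/s' http://example.com/)" \
             >> "$HOME/connection.log"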

    Read the article

  • ipv6 ssh tunnel service for testing?

    - by Geuis
    I need to do some testing on a service that I run to make sure it can handle IPv6 addresses; basically, I need to connect to it from an IPv6 address. I've created a tunnel via tunnelbroker.net, but I'm finding the steps required to get the tunnel configured on my machine and router to be a lot of trouble. Given that I'm not a networking specialist and that I haven't had to dig into routing configuration in years, I'd like to know if there's an existing service that I can just ssh into and use as my IPv6 endpoint. Simply being able to curl or wget from such an endpoint to my service would be more than enough to test what I need. Thanks!
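    For reference, once shelled into any IPv6-connected host, the test itself is a one-liner; a hedged sketch (the host names are placeholders, and -6 forces the IPv6 code path in both tools):

        $ ssh user@ipv6-enabled-host.example.net
        $ curl -6 -v http://myservice.example.com/
        $ wget -6 http://myservice.example.com/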

    Read the article

  • Connections to Cotendo CDN servers are unreliable

    - by user139050
    I've been having a lot of trouble viewing certain websites - Gawker Media, DeviantArt, etc. - and through further investigation they all appear to be using a CDN called Cotendo. On my machine, and only this one, connections reset themselves midway through the download most of the time. This is not browser-specific; even wget (Cygwin) is unable to download anything without retrying a few times. This happens inside virtual machines as well. Unfortunately, I'm pretty stumped on why this is happening. My hosts file is empty (except for a couple LAN-specific things) and I've checked a few different DNS servers, but I can't really think of anything else to try. Anyone have any ideas?

    Read the article
