Search Results

Search found 50980 results on 2040 pages for 'http compression'.


  • How to access localhost by IP address?

    - by Malte Werschy
    I am trying to access localhost by my IP address. My IP address is/was http://217.164.79.62/ (it is set to be assigned automatically, so it keeps changing). When I enter http://localhost/ into the browser I get the XAMPP homepage. However, when I enter http://217.164.79.62/ into my browser I get the following message: "The server 217.164.79.62:80 requires a username and password. User Name: Password:" How do I get the username and password?

    Read the article

  • Bad request - Invalid Hostname Error when using ARR IP address

    - by syloc
    I'm trying to set up a simple ARR system: one ARR machine load balancing between two app servers. I can reach the app sites if I use the server name of the ARR machine (http://arrserver/app), but I can't do it with its IP address (http://10.7.10.25/app). It gives "Bad Request - Invalid Hostname". On the ARR machine I configured the default site's bindings to "All Unassigned", "80" (the default values). Do I need to change the binding, or do I need additional URL rewrite rules? Also, on the ARR server http://127.0.0.1/app doesn't work, but http://localhost/app works fine. Thanks in advance.

    Read the article

  • 403 forbidden error from cron

    - by user112570
    I have some PHP code that runs fine in a browser, but now that I want to execute the same code from a cron script I'm getting issues. In cron I tried the command wget -O /dev/null http://www.mydomain.com/test.php, but if I try that in the terminal I get the error below. What is the correct command to run a PHP file from cron, and do I need to add an extra line of code to the top of my PHP file? The problem I'm getting is: -bash-3.2$ wget -O /dev/null http://www.mydomain.com/test.php --2012-04-08 15:59:41-- http://www.mydomain.com/test.php Resolving www.mydomain.com... 46.***.***.1 Connecting to www.mydomain.com|46.***.***.1|:80... connected. HTTP request sent, awaiting response... 403 Forbidden 2012-04-08 15:59:41 ERROR 403: Forbidden. I gave the file 755 permissions and even 777 permissions, but I can't see what I'm doing wrong.
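
    If the goal is just to run the PHP on a schedule, one option is to skip HTTP entirely and call the CLI binary from the crontab; a minimal sketch, assuming the interpreter lives at /usr/bin/php and the script sits at /var/www/html/test.php (both paths are guesses and need adjusting):

        # m   h  dom mon dow  command
        */15  *  *   *   *    /usr/bin/php /var/www/html/test.php >/dev/null 2>&1

    This avoids the web server's 403 altogether, though the script then runs with the cron user's permissions rather than the web server's.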

    Read the article

  • setting up a sub domain on windows hosting

    - by jason
    I'm trying to set up a subdomain for development on a Windows server, and I'm having problems setting the correct details in the httpd.ini file; I hoped someone could help. I have set up the subdomain http://dev.website.com. The files I want to use for this subdomain are on the server in a folder called development (http://www.website.com/development); in the directory structure they are in /htdocs/development. What do I need to add to the httpd.ini file to point http://dev.website.com to the files located in the /htdocs/development folder on the server?
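
    Assuming the httpd.ini belongs to ISAPI_Rewrite 2 (a guess; the directives differ between versions and from Apache mod_rewrite), a host-based mapping would look roughly like this sketch:

        [ISAPI_Rewrite]
        # send requests for dev.website.com into the /development folder (illustrative only)
        RewriteCond Host: ^dev\.website\.com$
        RewriteRule (.*) /development$1 [I,L]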

    Read the article

  • Import Firefox passwords into KeePassX or KeePass2

    - by rubo77
    I have an XML export of my Firefox Passwords in the form (I replaced real passwords with *): <xml> <entries ext="Password Exporter" extxmlversion="1.1" type="saved" encrypt="false"> <entry host="chrome://weave" user="****" password="****" formSubmitURL="" httpRealm="Mozilla Services Password" userFieldName="" passFieldName=""/> <entry host="chrome://weave" user="****" password="****" formSubmitURL="" httpRealm="Mozilla Services Encryption Passphrase" userFieldName="" passFieldName=""/> <entry host="http://www.example.de" user="rubo77" password="****" formSubmitURL="http://www.example.de" httpRealm="" userFieldName="benutzername" passFieldName="passwort"/> <entry host="http://example2.de" user="qqq" password="pppp" formSubmitURL="http://example2.de" httpRealm="" userFieldName="username" passFieldName="pass"/> ... Can I somehow convert this into a form KeePassX understands?

    Read the article

  • Performance issues with new dedicated server [closed]

    - by Pierre Espenan
    I have just subscribed to a new dedicated server and am getting worse than expected PHP execution performance. Execution times are twice as high as on my old shared server! I'm definitely not an expert at server management, so I'm wondering what I missed. Here is some information that can help you understand what's wrong: My server (page in French, but easy to understand): http://www.online.net/fr/serveur-dedie/dedibox-sc phpinfo(); output: http://jsfiddle.net/E8b7W/embedded/result/ PHP bench script (dedicated server): http://jsfiddle.net/EhXzK/embedded/result/ PHP bench script (old shared server): http://jsfiddle.net/ANbWt/embedded/result/ Is it normal to get such poor performance after a kernel update and a basic "apt-get install" of apache2 and php? Thanks!
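
    One thing worth ruling out on a fresh dedicated box is CPU frequency scaling, which can noticeably slow single-threaded PHP; a quick check with standard cpufreq tooling (the sysfs paths may differ by kernel):

        cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
        # if this prints "ondemand" or "powersave", try the performance governor:
        for g in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
            echo performance | sudo tee "$g"
        done

    Checking that an opcode cache such as APC is actually loaded (php -m) is another cheap test.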

    Read the article

  • Read NTFS partition on RHEL 5.8

    - by Alex Farber
    I have RHEL 5.8 64-bit, and an NTFS partition on the same disk. How can I get access to this partition? The answer to "Unable to mount NTFS drive with RHEL 6" doesn't work for me: [root@localhost alex]# rpm -Uvh http://download.fedora.redhat.com/pub/epel/6/i386/epel-release-6-5.noarch.rpm Retrieving http://download.fedora.redhat.com/pub/epel/6/i386/epel-release-6-5.noarch.rpm error: skipping http://download.fedora.redhat.com/pub/epel/6/i386/epel-release-6-5.noarch.rpm - transfer failed - Unknown or unexpected error
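
    Since this box is RHEL 5, the EPEL 6 release package from that answer will not apply; a sketch of the EPEL 5 route (the exact release RPM filename, mirror URL, and device name below are illustrative and should be verified):

        # install the EPEL 5 repository, then the ntfs-3g driver
        rpm -Uvh http://download.fedoraproject.org/pub/epel/5/x86_64/epel-release-5-4.noarch.rpm
        yum install ntfs-3g
        # mount the NTFS partition (replace /dev/sda3 with the real partition)
        mkdir -p /mnt/windows
        mount -t ntfs-3g /dev/sda3 /mnt/windows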

    Read the article

  • 404 when doing safe-upgrade in lucid 64 box?

    - by Millisami
    Why I see 404 when doing sudo aptitude safe-upgrade in my lucid 64 box? deploy@li167-251:~$ sudo aptitude safe-upgrade Reading package lists... Done Building dependency tree Reading state information... Done Reading extended state information Initializing package states... Done The following packages will be upgraded: apache2 apache2-mpm-prefork apache2-threaded-dev apache2-utils apache2.2-bin apache2.2-common apt apt-utils base-files binutils bzip2 dpkg dpkg-dev gzip ifupdown krb5-multidev language-pack-en language-pack-en-base language-selector-common libatk1.0-0 libatk1.0-dev libavahi-client3 libavahi-common-data libavahi-common3 libbz2-1.0 libc-bin libc-dev-bin libc6 libc6-dev libc6-i686 libcups2 libfreetype6 libfreetype6-dev libglib2.0-0 libglib2.0-dev libgssapi-krb5-2 libgssrpc4 libgtk2.0-0 libgtk2.0-common libgtk2.0-dev libk5crypto3 libkadm5clnt-mit7 libkadm5srv-mit7 libkdb5-4 libkrb5-3 libkrb5-dev libkrb5support0 libldap-2.4-2 libldap2-dev libmysqlclient-dev libmysqlclient16 libnotify-dev libnotify1 libpam-modules libpam-runtime libpam0g libparted0debian1 libpng12-0 libpng12-dev libpq-dev libpq5 libssl-dev libssl0.9.8 libtiff4 libudev0 libusb-0.1-4 linux-libc-dev mountall mysql-client mysql-client-5.1 mysql-client-core-5.1 mysql-common mysql-server mysql-server-5.1 mysql-server-core-5.1 openssh-client openssh-server openssl parted python-apt sudo tzdata udev upstart ureadahead wget xulrunner-1.9.2 xulrunner-1.9.2-dev The following packages are RECOMMENDED but will NOT be installed: colibri debhelper fakeroot hicolor-icon-theme libatk1.0-data libglib2.0-data libgtk2.0-bin libhtml-template-perl manpages-dev notification-daemon notify-osd ssl-cert xauth xfce4-notifyd 88 packages upgraded, 0 newly installed, 0 to remove and 0 not upgraded. Need to get 85.8MB of archives. After unpacking 1712kB will be used. Do you want to continue? [Y/n/?] y Writing extended state information... Done Get:1 http://security.ubuntu.com/ubuntu/ lucid-updates/main libpam-modules 1.1.1-2ubuntu5 [358kB] Get:2 http://security.ubuntu.com/ubuntu/ lucid-updates/main base-files 5.0.0ubuntu20.10.04.2 [70.2kB] Get:3 http://security.ubuntu.com/ubuntu/ lucid-updates/main gzip 1.3.12-9ubuntu1.1 [102kB] Err http://security.ubuntu.com/ubuntu/ lucid-updates/main libc-bin 2.11.1-0ubuntu7.2 404 Not Found [IP: 91.189.88.37 80] Err http://security.ubuntu.com/ubuntu/ lucid-updates/main libc6 2.11.1-0ubuntu7.2 404 Not Found [IP: 91.189.88.37 80] Err http://security.ubuntu.com/ubuntu/ lucid-updates/main libc6-i686 2.11.1-0ubuntu7.2 .........
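
    A 404 from the archive mid-upgrade usually just means the local package index is stale and points at package versions that have since been superseded; refreshing the index and retrying is the first thing to try (standard commands, nothing specific to this box):

        sudo aptitude update
        sudo aptitude safe-upgrade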

    Read the article

  • Organizing automatically Windows Files and Folders

    - by Kiquenet
    For Windows only: organizing the eleventy-billion files you've got stuffed into folders on your hard drive is very "hard". For example, I have one folder on my computer that I save all web downloads to, regardless of file type, size or purpose. Many of the files are only temporary downloads, for instance setup files of applications that I test, demonstration videos that I watch once, or documents that I want to read. Some files, on the other hand, are there to stay, and in the past I used to move them out of the download folder manually. There are other files in folders on my computer too: lots of source code, tests, programs, tools, ... I need a way to organize billions of files. What are the best tools for organizing and sorting your files and folders automatically? Digital Janitor http://davidevitelaru.com/software/digital-janitor/ Belverede http://lifehacker.com/5510961/how-to-automatically-clean-and-organize-your-desktop-downloads-and-other-folders Download Mover http://www.neoteo.com/download-mover-reorganiza-tus-descargas-14188 File/Folder Date Organizer http://seedling.dcmembers.com/other/ffdorg.zip DropIt http://www.lupopensuite.com/db/dropit.htm Other questions about organizing files, desktops, etc.: How to automate the process of organizing audio files on Windows; Organizing My Windows Desktop; What's a good way for organizing PDF documents on Windows?; Folksonomy tagging for files; What is your method of "folksonomy" tagging for files on your local machine?

    Read the article

  • mod-rewrite: what's wrong with this simple rewrite to redirect to a subdirectory?

    - by Tom Auger
    The root directory http: // www .mydomain .com (SF won't let me post hyperlinks - rep is too low) has a catch-all index.php page in it, and an .htaccess file. Within this root directory I have a wordpress/ directory which contains (surprise, surprise) a WordPress installation. My goal is that when the user types in http: // www .mydomain .com they are instead taken to http: // www .mydomain .com/wordpress. Here is my rewrite rule: RewriteEngine on RewriteCond %{REQUEST_URI} !^/wordpress RewriteRule ^/(.*)$ http://%{SERVER_NAME}/wordpress/$1 [L] At the moment it appears to do nothing - it still loads index.php within the root directory. What should my rewrite rule be (I'm assuming the one I'm using is wrong)?
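
    In a per-directory .htaccess the matched path never carries a leading slash, so a pattern of ^/(.*)$ will never match anything; a sketch of the same rule without it (and with a local target, so the hostname is left alone):

        RewriteEngine on
        RewriteCond %{REQUEST_URI} !^/wordpress
        RewriteRule ^(.*)$ /wordpress/$1 [L]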

    Read the article

  • ssl_error_log apache issue

    - by lakshmipathi
    https://localhost works but https://ipaddress didn't cat logs/ssl_error_log [Mon Aug 02 19:04:11 2010] [error] [client 192.168.1.158] (13)Permission denied: access to /ajaxterm denied [root@space httpd]# cat logs/ssl_access_log 192.168.1.158 - - [02/Aug/2010:19:04:11 +0530] "GET /ajaxterm HTTP/1.1" 403 290 [root@space httpd]# cat logs/ssl_request_log [02/Aug/2010:19:04:11 +0530] 192.168.1.158 SSLv3 DHE-RSA-CAMELLIA256-SHA "GET /ajaxterm HTTP/1.1" 290 httpd.conf file NameVirtualHost *:443 <VirtualHost *:443> ServerName localhost SSLEngine on SSLCertificateFile /etc/pki/tls/certs/ca.crt SSLCertificateKeyFile /etc/pki/tls/private/ca.key <Directory /usr/share/ajaxterm > Options FollowSymLinks AllowOverride None Order deny,allow Allow from All </Directory> DocumentRoot /usr/share/ajaxterm DirectoryIndex ajaxterm.html ProxyRequests Off <Proxy *> # Order deny,allow Allow from all </Proxy> ProxyPass /ajaxterm/ http://localhost:8022/ ProxyPassReverse /ajaxterm/ http://localhost:8022/ ErrorLog error_log.log TransferLog access_log.log </VirtualHost> How to fix this ?
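
    One detail that matches these logs: ProxyPass /ajaxterm/ only covers requests that include the trailing slash, so GET /ajaxterm falls through to the filesystem (the DocumentRoot) and trips the permission error. A hedged tweak is to normalise the bare path first, for example with mod_alias:

        # redirect /ajaxterm to /ajaxterm/ so it matches the ProxyPass prefix
        RedirectMatch permanent ^/ajaxterm$ /ajaxterm/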

    Read the article

  • Nginx + Wordpress Multisite 3.4.2 + subdirectories + static pages and permalinks

    - by UrkoM
    I am trying to setup Wordpress Multisite, using subdirectories, with Nginx, php5-fpm, APC, and Batcache. As many other people, I am getting stuck in the rewrite rules for permalinks. I have followed these two guides, which seem to be as official as you can get: http://evansolomon.me/notes/faster-wordpress-multisite-nginx-batcache/ http://codex.wordpress.org/Nginx#WordPress_Multisite_Subdirectory_rules It is partially working: http://blog.ssis.edu.vn works. http://blog.ssis.edu.vn/umasse/ works. But other permalinks, like these two to a post or to a static page, don't work: http://blog.ssis.edu.vn/umasse/2008/12/12/hello-world-2/ http://blog.ssis.edu.vn/umasse/sample-page/ They either take you to a 404 error, or to some other blog! Here is my configuration: server { listen 80 default_server; server_name blog.ssis.edu.vn; root /var/www; access_log /var/log/nginx/blog-access.log; error_log /var/log/nginx/blog-error.log; location / { index index.php; try_files $uri $uri/ /index.php?$args; } # Add trailing slash to */wp-admin requests. rewrite /wp-admin$ $scheme://$host$uri/ permanent; # Add trailing slash to */username requests rewrite ^/[_0-9a-zA-Z-]+$ $scheme://$host$uri/ permanent; # Directives to send expires headers and turn off 404 error logging. location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ { expires 24h; log_not_found off; } # this prevents hidden files (beginning with a period) from being served location ~ /\. { access_log off; log_not_found off; deny all; } # Pass uploaded files to wp-includes/ms-files.php. rewrite /files/$ /index.php last; if ($uri !~ wp-content/plugins) { rewrite /files/(.+)$ /wp-includes/ms-files.php?file=$1 last; } # Rewrite multisite '.../wp-.*' and '.../*.php'. if (!-e $request_filename) { rewrite ^/[_0-9a-zA-Z-]+(/wp-.*) $1 last; rewrite ^/[_0-9a-zA-Z-]+.*(/wp-admin/.*\.php)$ $1 last; rewrite ^/[_0-9a-zA-Z-]+(/.*\.php)$ $1 last; } location ~ \.php$ { # Forbid PHP on upload dirs if ($uri ~ "uploads") { return 403; } client_max_body_size 25M; try_files $uri =404; fastcgi_pass unix:/var/run/php5-fpm.sock; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name; include /etc/nginx/fastcgi_params; } } Any ideas are welcome! Have I done something wrong? I have disabled Batcache to see if it makes any difference, but still no go.

    Read the article

  • How to download a url as a file?

    - by Michelle
    A website url has "hidden" some mp3 files by embedding them as shockwave files, as follows: <span class="caption"><!-- Odeo player --><embed src="http://odeo.com/flash/audio_player_tiny_gray.swf"quality="high" name="audio_player_tiny_gray" align="middle" allowScriptAccess="always" wmode="transparent" type="application/x-shockwave-flash" flashvars="valid_sample_rate=true external_url=http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3" pluginspage="http://www.macromedia.com/go/getflashplayer"></embed></span> How can I download the files for off-line listening? I've found two methods: 1. The StackOverflow Method Create a new local html file with just the links eg <a href="http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3">Sunday Edition 25Nov2008</a> Open the file in the browser, right click the link and File Save Link As. 2. The SuperUser Method Install the Firefox addin Iget. (Be sure to use the right version for your Firefox version.) Tools Downloads Enter url in field. Are there any other ways?
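
    Since the real MP3 address is already visible in the external_url flashvar, a plain command-line fetch works as well (wget shown; curl -O would do the same):

        wget "http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3"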

    Read the article

  • Store all users' profile images in a single directory, or in per-ID subdirectories?

    - by Luccas
    I'm using Amazon S3 as storage for users' profile pictures. I see that many websites generate large random filenames and put them all in the same root directory, like: http://xxx.us-east-1.amazonaws.com/aHR0cHM6Ly9mYmNkbi1wcm9maWxlLWEuYWthbWFpaGQubmV0L2hwcm9maWxlLWFrLWFzaDIvMjczMzkxXzEwMDAwMDMxMjAxMzg5OV81NTk3MjM4Mzdfbi5qcGc.jpg My question is: what are the pros and cons of that approach? If I place them in different directories instead, what problems will I have in the future? http://xxx.us-east-1.amazonaws.com/users/id/username.jpg or http://xxx.us-east-1.amazonaws.com/users/id/random_number.jpg Thanks!
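
    For what it's worth, those long random names also spread keys across S3's internal index partitions. A rough PHP sketch of getting a similar effect while keeping the user id recoverable (the "users/" prefix and the id value are made up for illustration):

        <?php
        // Build an S3 key with a short hash prefix so keys do not all share
        // one lexicographic prefix, while the user id stays readable.
        $userId = 12345;                               // hypothetical user id
        $prefix = substr(md5((string) $userId), 0, 4); // e.g. "827c"
        $key    = "users/{$prefix}/{$userId}.jpg";
        echo $key;                                     // users/827c/12345.jpg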

    Read the article

  • IP tables blocking access to most hosts but some accesses being logged

    - by epo
    What am I getting wrong? A while back I locked down my web hosting service while hardening it or at least trying to. Apache listens on port 80 only and I set up iptables using the following: IPS="list of IPs" iptables --new-chain webtest # Accept all established connections iptables -A INPUT --protocol tcp --dport 80 --jump webtest iptables -A INPUT --match state --state ESTABLISHED,RELATED --jump ACCEPT iptables -A webtest --match state --state ESTABLISHED,RELATED --jump ACCEPT for ip in $IPS; do iptables -A webtest --match state --state NEW --source $ip --jump ACCEPT done iptables -A webtest --jump DROP However looking at my apache logs I notice various log entries in access_log, e.g. 221.192.199.35 - - [16/May/2010:13:04:31 +0100] "GET http://www.wantsfly.com/prx2.php?hash=926DE27C156B40E55E4CFC8F005053E2D81E6D688AF0 HTTP/1.0" 404 206 "-" "Mozilla/ 4.0 (compatible; MSIE 6.0; Windows NT 5.0)" 201.228.144.124 - - [16/May/2010:11:54:16 +0100] "GET /w00tw00t.at.ISC.SANS.DFind:) HTTP/1.1" 400 226 "-" "-" 207.46.195.224 - - [16/May/2010:04:06:48 +0100] "GET /robots.txt HTTP/1.1" 200 311 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)" How are these slipping through? I don't mind the indexing bots (though I am a little surprised to see them get through). I suppose they must be getting through using the ESTABLISHED,RELATED rules. And no, I can't for the life of me remember why the first match state rule is there So 2 questions: is there a better way to set up iptables to restrict access to specified hosts? How exactly are these 3 examples slipping through?
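
    Before rewriting the rules, it is worth confirming what the kernel is actually enforcing and in what order, since an earlier ACCEPT in INPUT (or rules that were never reloaded after a reboot) would let traffic bypass the webtest chain entirely; the packet counters make that easy to spot:

        # show the live INPUT and webtest chains with counters and rule positions
        iptables -L INPUT -n -v --line-numbers
        iptables -L webtest -n -v --line-numbers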

    Read the article

  • Can I make a computer connecting via VPN visible to computers within the network it is connecting to

    - by SCdF
    OK, here's the deal: I have a computer (specifically, a MacBook Pro) that is connected to a standard network that is then connected to the big nasty internet. Let's call it foo. It runs a web server on 8084, and so if you were on its local network you could get to this with http://foo:8084/, or http://192.168.1.2:8084/, or whatever. From foo I can VPN into my companies intranet and see a computer on the local company network called bar (another MacBook Pro, incidentally). Is there any way to set this up so that while foo is on the VPN bar can access http://foo:8084/ (or http://x.x.x.x:8084/, or whatever)? (From my limited understanding of how VPNs work I have a sneaking suspicion the answer is no, but it doesn't hurt to ask...)

    Read the article

  • .htaccess, mod_rewrite Issue

    - by Shoaibi
    What i want: Force www [works] Restrict access to .inc.php [works] Force redirection of abc.php to /abc/ Removal of extension from url Add a trailing slash if needed old .htaccess : Options +FollowSymLinks <IfModule mod_rewrite.c> RewriteEngine On RewriteBase / ### Force www RewriteCond %{HTTP_HOST} ^example\.net$ RewriteRule ^(.*)$ http://www\.example\.net/$1 [L,R=301] ### Restrict access RewriteCond %{REQUEST_URI} ^/(.*)\.inc\.php$ [NC] RewriteRule .* - [F,L] #### Remove extension: RewriteRule ^(.*)/$ /$1.php [L,R=301] ######### Trailing slash: RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_URI} !(.*)/$ RewriteRule ^(.*)$ http://www.example.net/$1/ [R=301,L] </IfModule> New .htaccess: Options +FollowSymLinks <IfModule mod_rewrite.c> RewriteEngine On RewriteBase / ### Force www RewriteCond %{HTTP_HOST} ^example\.net$ RewriteRule ^(.*)$ http://www\.example\.net/$1 [L,R=301] ### Restrict access RewriteCond %{REQUEST_URI} ^/(.*)\.inc\.php$ [NC] RewriteRule .* - [F,L] #### Remove extension: RewriteCond %{REQUEST_FILENAME} \.php$ RewriteCond %{REQUEST_FILENAME} -f RewriteRule (.*)\.php$ /$1/ [L,R=301] #### Map pseudo-directory to PHP file RewriteCond %{REQUEST_FILENAME}\.php -f RewriteRule (.*) /$1.php [L] ######### Trailing slash: RewriteCond %{REQUEST_FILENAME} -d RewriteCond %{REQUEST_FILENAME} !/$ RewriteRule (.*) $1/ [L,R=301] </IfModule> errorlog: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace., referer: http://www.example.net/ Rewrite.log: http://pastebin.com/x5PKeJHB
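
    The recursion most likely comes from the extension-stripping redirect and the pseudo-directory rewrite feeding each other. A commonly used pattern that avoids this keys the external redirect off %{THE_REQUEST}, so it only fires on the client's original request and never on internal rewrites; this is a sketch to adapt, not a drop-in replacement:

        # externally redirect direct /abc.php requests to /abc/
        RewriteCond %{THE_REQUEST} \s/+(.+?)\.php[\s?] [NC]
        RewriteRule ^ /%1/ [R=301,L]

        # internally map /abc/ back to abc.php when that file exists
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{DOCUMENT_ROOT}/$1.php -f
        RewriteRule ^(.+?)/?$ /$1.php [L]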

    Read the article

  • Using Mozilla Firefox with UTF-8 addresses (in Greek) on Mac

    - by Panagiotis
    Very often when I use Firefox (any version from 10+) and type a UTF-8 SEO URL, it behaves strangely. For example, it randomly cuts the URL and appends the whole URL again, like this: http://www.mysite.com/????G????S/???? would become http://www.mysite.com/????G???http://www.mysite.com/????G????S/????, resulting in the URL being converted to URL-encoded letters and 404 errors. I am using Lion with the latest Firefox (yes, I have uninstalled it once and reinstalled it).

    Read the article

  • Apache mod_proxy to another server

    - by trobrock
    I am using proxy_balancer in Apache2 to proxy requests for a Rails application to my Rails server, on the port the application is running on. This is how it's set up... Rails server: Mongrel running on port 8000; when accessing the URL directly at http://rails_server:8000 the site loads fine. Apache server: conf file for the site: <VirtualHost *:80> ServerAdmin webmaster@localhost ServerName myserver.com ServerAlias application.myserver.com <Proxy balancer://application_cluster> Allow from localhost BalancerMember http://ip.to.server:8000 retry=10 </Proxy> ProxyPass / balancer://application_cluster </VirtualHost> The problem I am having is that going to http://rails_server:8000 works fine, but going to http://application.myserver.com loads the right content, but displays all the HTML as text rather than rendering it as HTML.

    Read the article

  • Nginx redirect one path to another

    - by SteveEdson
    I'm sure this has been asked before, but I can't find a solution that works. A website has switched CMS services but kept the same domain; how do I set up an nginx rewrite for a single page? E.g. old page: http://sitedomain.co.uk/content/unique-page-name New page: http://sitedomain.co.uk/new-name/unique-page-name Please note, I don't want everything within the content path to be redirected, but literally just the URL mentioned above. I have about 9 redirects to set up, none of which fit a pattern. Thanks! Edit: I found this solution, which seems to be working, except that it redirects without a slash: if ( $request_filename ~ content/unique-page-name/ ) { rewrite ^ http://sitedomain.co.uk/new-name/unique-page-name/? permanent; } But this redirects to: http://sitedomain.co.uknew-name/unique-page-name/
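
    For a handful of one-off redirects that follow no pattern, an exact-match location per URL is usually cleaner than an if block, and it sidesteps the missing-slash problem because the target URI is written out in full (a sketch using the URLs from the question):

        location = /content/unique-page-name {
            return 301 /new-name/unique-page-name;
        }

    Repeating one small block per redirect keeps all nine visible in one place.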

    Read the article

  • mod_rewrite not working for subdomain in Apache2

    - by Matt
    Hi, I'm having some trouble with mod_rewrite. So I'm implementing it through .htaccess, and I can get it working on my main vhost, domain.com - what I want it to do is rewrite http:// domain.com to force it to https:// domain.com, which it does well. I want to have name-based vhosts for the one IP with the following redirects: (I'm breaking up domain names with a space because otherwise serverfault recognises them as links) http:// domain.com -- https:// domain.com http:// staging.domain.com -- https:// staging.domain.com http:// test.domain.com -- https:// test.domain.com http:// beta.domain.com -- https:// beta.domain.com domain.com redirects to https:// domain.com, but staging.domain.com doesn't, although I can access https:// staging.domain.com. The .htaccess is identical for both, just with the domain name different. It doesn't seem to do any rewriting at all for staging.domain.com, I've tested this by trying to get it to rewrite to www.google.com. I have a wildcard DNS record, *.domain.com which points to the domain IP. Is there a particular way I should have the virtualhosts configured to allow this? I keep reading in the Apache documentation that it doesn't support multiple SSL name-based vhosts. But I can access both https:// domain.com and https:// staging.domain.com just fine. Any thoughts? Thanks to everyone for your help with this.
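
    One way to avoid keeping a near-identical .htaccess per subdomain is a single rule that redirects whichever host was actually requested; a sketch, assuming the HTTPS vhosts themselves already answer (which they appear to):

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]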

    Read the article

  • Redirecting pages from the root folder to a subfolder

    - by MarcoPRT
    I have a Joomla site in the root directory of my domain, and a forum in the /forum subdirectory. How can I redirect visitors from the main site to the forum, while keeping the possibility of reaching the main site from a link on the forum? Example: http://example.com is redirected to http://example.com/forum , but I can still access the main site via the link http://example.com/index.php
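
    A minimal .htaccess sketch for the root that bounces only the bare front page, so deep links such as /index.php keep working (it assumes mod_rewrite is available, which a Joomla install normally has anyway):

        RewriteEngine On
        # redirect only a request for the site root itself
        RewriteRule ^$ /forum/ [R=301,L]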

    Read the article

  • Reverse proxy for a subdirectory in nginx

    - by Maple
    I want to set up a reverse proxy on my VPS for my Heroku app (http://lovemaple.heroku.com), so that if I visit mysite.com/blog I get the content from http://lovemaple.heroku.com. I followed the instructions on the Apache wiki: location /couchdb { rewrite /couchdb/(.*) /$1 break; proxy_pass http://localhost:5984; proxy_redirect off; proxy_set_header Host $host; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; } I changed it to fit my situation: location /blog { rewrite /blog/(.*) /$1 break; proxy_pass http://lovemaple.heroku.com; proxy_redirect off; proxy_set_header Host $host; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; } When I visit mysite.com/blog, the page shows up, but the JS/CSS files cannot be fetched (404). Their links become mysite.com/style.css instead of mysite.com/blog/style.css. What's wrong and how can I fix it?
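
    The 404s happen because the app emits root-relative links like /style.css, which never pass through the /blog location. One hedged workaround, if changing the app's URLs is not an option, is to forward the asset paths as well (the directory names below are illustrative, taken from a typical Rails layout):

        # proxy the app's root-relative asset paths too (paths illustrative)
        location ~ ^/(stylesheets|javascripts|images)/ {
            proxy_pass http://lovemaple.heroku.com;
            proxy_set_header Host lovemaple.heroku.com;
        }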

    Read the article

  • Plesk SiteBuilder Install (UNIX)

    - by Clear.Cache
    Trying to install SiteBuilder from Plesk, for Unix. I have CentOS 4.x. The install went fine using the tarball files from their site; I ran: * rpm -Uhv updates/*.rpm * rpm -Uhv sitebuilder/*.rpm Documentation: http://download1.parallels.com/SiteBuilder/4.5.0/doc/install/en_US/html/index.htm Now http://64.64.96.138/login doesn't work. http://download1.parallels.com/SiteBuilder/4.5.0/doc/install/en_US/html/index.htm?fileName=performing_first_login_to_sitebuilder.htm What am I doing wrong here?

    Read the article
