Search Results

Search found 1512 results on 61 pages for 'deny prasetyo'.


  • LameUser trying - apache2 webserver authentication - allow one IP range without a password prompt, ask others for credentials

    - by Mikee
    I have a (maybe silly) question regarding the apache2 webserver and security. I am trying to achieve this: users connecting from 192.168.1.24 are not prompted for a password and are allowed in; others are asked for a username and password and connected if the credentials are correct. I am trying to do this for the whole directory /var/www. Whether I put the code into the .htaccess file or into httpd.conf, it doesn't work for me:

        Order deny,allow
        Deny from all
        AuthName "PassRequest"
        AuthType Basic
        AuthUserFile /var/.htpasswd
        Require valid-user
        Allow from 192.168.1.24
        Satisfy Any

    If I try to connect to the page, I am allowed from both the whitelisted IP and any other. If I remove the "Satisfy Any" line, I am prompted for a password; and if I remove the password requirement too and connect from a different IP, I am NOT refused. Is there some module that needs to be activated, or why is the IP directive being skipped? Does it need to be put in every folder, or is /var/www/.htaccess enough? Can I just put it in httpd.conf instead? I have spent the last 4 hours trying to google why it is acting like this. Any help will be highly appreciated :-)
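
    One way this is commonly written, sketched here on the assumption of Apache 2.2 with mod_auth_basic and mod_authz_host loaded - the host rule and the auth rule each stand alone, and Satisfy Any admits whoever passes either one:

        # /var/www/.htaccess - one file here covers the whole tree below it
        AuthType Basic
        AuthName "PassRequest"
        AuthUserFile /var/.htpasswd
        Require valid-user
        Order deny,allow
        Deny from all
        Allow from 192.168.1.24
        Satisfy Any

    If a block like this behaves as if it weren't there, the usual suspect is the enclosing <Directory /var/www> in httpd.conf carrying AllowOverride None, which makes Apache ignore .htaccess entirely; AllowOverride AuthConfig Limit (or placing the block inside <Directory /var/www> in httpd.conf itself) addresses that.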

    Read the article

  • How do I force .htaccess authorization to occur over ssl?

    - by kenja
    I'm trying to force a particular directory to require both an allowed IP and a valid username/password through basic authorization. To ensure that the username/password are sent in encrypted form, I want the directory to also force SSL use. Here is what I have in my .htaccess file:

        # Force HTTPS connection
        RewriteEngine On
        RewriteCond %{SERVER_PORT} !^443$
        RewriteRule (.*) https://www.mywebsite.com%{REQUEST_URI} [R,L]

        ## password begin ##
        AuthName "Restricted Access"
        AuthUserFile /var/www/admin/.htpasswd
        AuthType Basic
        Require valid-user
        Order deny,allow
        Deny from all
        Allow from 79.1.231.151 62.123.134.83
        Satisfy All

    Unfortunately, when I access the directory over plain http, it asks for the password before it redirects the page to the secure version, which means the password is sent unencrypted. What am I doing wrong? Is there a way to do this?
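
    One workaround often suggested for this ordering problem - a sketch, with the /admin/ redirect target being an assumption - is to stop relying on mod_rewrite and let mod_ssl refuse plain-HTTP access during the access phase, which runs before basic authentication gets a chance to prompt:

        SSLRequireSSL
        ErrorDocument 403 https://www.mywebsite.com/admin/
        AuthName "Restricted Access"
        AuthUserFile /var/www/admin/.htpasswd
        AuthType Basic
        Require valid-user
        Order deny,allow
        Deny from all
        Allow from 79.1.231.151 62.123.134.83
        Satisfy All

    A plain-HTTP request is then answered with the 403 handler (here used as a redirect to the HTTPS URL) before any credentials are requested.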

    Read the article

  • IIS6 Permissions

    - by Gordon Carpenter-Thompson
    We have a set of IIS6 Jakarta/ASP.NET applications (implemented as virtual directories) on a machine without a domain. The directories all exist under the default website. We need to set up the permissions so that certain users can access only specific applications, while other users can access several of them. The way it has been set up previously is to explicitly deny each user access to every application except the ones they are allowed to see. The problem is that the list of applications changes fairly often (for demos etc.), and the developers have been known to forget to deny the old users access to the new applications, which leads to security problems. This is all quite unmaintainable. Does anybody have any advice on this? Surely I can't be the only person to find this all a bit of a mess? Thanks

    Read the article

  • .htaccess template, suggestions needed

    - by purpler
        # Defaults
        AddDefaultCharset UTF-8
        DefaultLanguage en-US
        FileETag None
        Header unset ETag
        ServerSignature Off
        SetEnv TZ Europe/Belgrade

        # Rewrites
        Options +FollowSymLinks
        RewriteEngine On
        RewriteBase /

        # Redirect to WWW
        RewriteCond %{HTTP_HOST} ^serpentineseo.com
        RewriteRule (.*) http://www.serpentineseo.com/$1 [R=301,L]

        # Redirect index to root
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.html\ HTTP/
        RewriteRule ^(.*)index\.html$ /$1 [R=301,L]

        # Cache media files
        ExpiresActive On
        ExpiresDefault A0

        # Month
        <FilesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$">
            Header set Cache-Control "max-age=2592000, public"
        </FilesMatch>

        # Week
        <FilesMatch "\.(css|pdf)$">
            Header set Cache-Control "max-age=604800"
        </FilesMatch>

        # 10 Min
        <FilesMatch "\.(html|htm|txt)$">
            Header set Cache-Control "max-age=600"
        </FilesMatch>

        # Do not cache
        <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
            Header unset Cache-Control
        </FilesMatch>

        # Compress output
        <IfModule mod_deflate.c>
            <FilesMatch "\.(html|js|css)$">
                SetOutputFilter DEFLATE
            </FilesMatch>
        </IfModule>

        # Error Documents
        ErrorDocument 206 /error/206.html
        ErrorDocument 401 /error/401.html
        ErrorDocument 403 /error/403.html
        ErrorDocument 404 /error/404.html
        ErrorDocument 500 /error/500.html

        # Prevent hotlinking
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^http://(www\.)?serpentineseo.com/.*$ [NC]
        RewriteRule \.(gif|jpg|png)$ http://www.serpentineseo.com/images/angryman.png [R,L]

        # Prevent offline browsers
        RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
        RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
        RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
        RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
        RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
        RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
        RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
        RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
        RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
        RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
        RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
        RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
        RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
        RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
        RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
        RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Zeus
        RewriteRule ^.*$ http://www.google.com [R,L]

        # Protect against DOS attacks by limiting file upload size
        LimitRequestBody 10240000

        # Deny access to sensitive files
        <FilesMatch "\.(htaccess|psd|log)$">
            Order Allow,Deny
            Deny from all
        </FilesMatch>
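
    One forward-looking note on this template: Order/Deny/Allow is Apache 2.2 (mod_access_compat) syntax. On Apache 2.4 with mod_authz_core, the sensitive-files stanza would read as follows - a sketch of the equivalent, not something the 2.2 template itself needs:

        <FilesMatch "\.(htaccess|psd|log)$">
            Require all denied
        </FilesMatch>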

    Read the article

  • Does Apache2 Configured as ReverseProxy Hide Cookies Set by Backend Servers?

    - by Ianthe
    I use Apache 2.2.16 as a reverse proxy. For a static website I don't have any issues; however, when I began to use cookies, I noticed that they are not being sent to the client. Here's a snippet of my config:

        <VirtualHost *:80>
            ServerName app.somewhere.com:80
            ServerAlias app
            ProxyRequests Off
            ProxyPreserveHost On
            <Proxy *>
                Order deny,allow
                Allow from all
            </Proxy>
            ProxyPass /app http://10.x.x.x/app
            ProxyPassReverse /app http://10.x.x.x/app
            <Location />
                Order allow,deny
                Allow from all
            </Location>
        </VirtualHost>

    When I access the app server directly, I receive the cookies fine. Is this expected behaviour for Apache2? I'm using HAProxy for another application that sends cookies to the client, and there I get all of them.
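
    mod_proxy itself doesn't strip Set-Cookie headers, but a cookie the backend scopes to its own host or path is discarded by the browser once it arrives via the proxy's hostname. The usual fix is to rewrite the cookie attributes at the proxy - a sketch, assuming the backend stamps its cookies with the 10.x.x.x domain:

        ProxyPassReverseCookieDomain 10.x.x.x app.somewhere.com
        ProxyPassReverseCookiePath / /app

    ProxyPassReverse only rewrites Location-style response headers; the two cookie directives are separate and have to be added explicitly.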

    Read the article

  • Enable Server Status using Plesk 11

    - by Lars Ebert
    I am trying to get Apache's server-status to work with Plesk 11, but running sudo /usr/sbin/apache2ctl fullstatus results in:

        Forbidden
        You don't have permission to access /server-status on this server.
        __________________________________________________________________
        Apache Server at localhost Port 80

        'www-browser -dump http://localhost:80/server-status' failed.
        Maybe you need to install a package providing www-browser or you need
        to adjust the APACHE_LYNX variable in /etc/apache2/envvars

    How can I enable server-status? So far I have tried to insert

        <Location /server-status>
            SetHandler server-status
            Order Deny,Allow
            Deny from all
            Allow from localhost
        </Location>

    into httpd.conf, but I am not sure it is active. I also tried adding it to /var/www/vhosts/somedomain/conf/vhost.conf, but I do not know which domain to add it to, as fullstatus seems to query localhost directly. I guess I am a little confused by the way Plesk uses vhost configuration.
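
    For that Location block to do anything, mod_status has to be loaded first. A sketch of a self-contained stanza - on a Plesk box it is generally safer in its own file under /etc/apache2/conf.d/ than in httpd.conf, parts of which Plesk may regenerate:

        <IfModule mod_status.c>
            ExtendedStatus On
            <Location /server-status>
                SetHandler server-status
                Order Deny,Allow
                Deny from all
                Allow from 127.0.0.1 ::1
            </Location>
        </IfModule>

    On Debian/Ubuntu-based installs, "a2enmod status" followed by an Apache reload enables the module; apache2ctl fullstatus additionally needs a text browser providing www-browser (e.g. lynx), which is what the quoted error message hints at.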

    Read the article

  • PHP app breaks on Nginx, but works on Apache

    - by rizon1990
    I want to migrate a PHP application from Apache to Nginx. The problem is that the app breaks because the routing no longer works, and I'm not exactly sure how to fix it. The PHP application includes some .htaccess files which I tried to convert to Nginx. The first one is in the document root:

        <IfModule mod_rewrite.c>
            RewriteEngine on
            RewriteRule ^$ public/ [L]
            RewriteRule (.*) public/$1 [L]
        </IfModule>

    The second one is in /public/:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            # Rewrite all other URLs to index.php/URL
            RewriteRule ^(.*)$ index.php?url=$1 [PT,L]
        </IfModule>
        <IfModule !mod_rewrite.c>
            ErrorDocument 404 index.php
        </IfModule>

    The third and last one is:

        deny from all

    My nginx version of it looks like the following:

        #user nobody;
        worker_processes 1;
        #pid logs/nginx.pid;

        events {
            worker_connections 1024;
        }

        http {
            include mime.types;
            default_type application/octet-stream;
            sendfile on;
            #tcp_nopush on;
            keepalive_timeout 65;
            gzip on;

            server {
                listen 8080;
                server_name localhost;
                root /Library/WebServer/Documents/admin;

                location / {
                    index index.php;
                    rewrite ^/$ /public/ break;
                    rewrite ^(.*)$ /public/$1 break;
                }

                location /public {
                    if (!-e $request_filename) {
                        rewrite ^(.*)$ /index.php?url=$1 break;
                    }
                }

                location /library {
                    deny all;
                }

                #error_page 404 /404.html;

                # redirect server error pages to the static page /50x.html
                error_page 500 502 503 504 /50x.html;
                location = /50x.html {
                    root html;
                }

                location ~ \.php$ {
                    root /Library/WebServer/Documents/admin;
                    fastcgi_pass 127.0.0.1:9000;
                    fastcgi_index index.php;
                    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                    include fastcgi_params;
                }
            }
        }

    The problem is that the routing is broken and just returns a 404 page instead. Hopefully someone has an idea of how to fix it ;) Thanks

    EDIT: I got it working with this config:

        location /library {
            deny all;
        }

        location ~* ^.+.(jpg|jpeg|gif|png|ico|css|zip|tgz|gz|rar|bz2|doc|xls|exe|pdf|ppt|txt|tar|mid|midi|wav|bmp|rtf|js)$ {
            access_log off;
            rewrite ^(.*)$ /public/$1 break;
        }

        location / {
            rewrite ^/(.+)$ /index.php?url=$1 last;
        }

    I'm sure there are better solutions, and I'm open to suggestions.
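
    For the record, the idiomatic nginx form of "serve the file if it exists under /public, otherwise hand the URL to the front controller" is try_files, which sidesteps the rewrite/if interplay entirely. A sketch against the same layout, with the root and fastcgi blocks unchanged:

        location / {
            # look for a real file or directory under /public first,
            # then fall back to the front controller with the URL as a parameter
            try_files /public$uri /public$uri/ /public/index.php?url=$uri&$args;
        }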

    Read the article

  • Blocking IP Range in .htaccess Problem

    - by Pedro
    Hi, I'm trying to block access to one of my webapps using an IP filter in the .htaccess. The problem is that after updating the file with:

        order allow,deny
        deny from 58.14.0.0/15
        allow from all

    I get the following error:

        Internal Server Error
        The server encountered an internal error or misconfiguration and was
        unable to complete your request. Please contact the server
        administrator, [email protected] and inform them of the time the error
        occurred, and anything you might have done that may have caused the
        error. More information about this error may be available in the
        server error log.

    What is wrong? Regards, Pedro
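
    A 500 from a three-line .htaccess is usually Apache refusing the directives rather than the directives being wrong, and the server error log names the exact reason. A sketch of the likely fix in the main config, with the directory path assumed; CIDR notation like 58.14.0.0/15 is fine on Apache 2.x but not 1.3:

        <Directory /var/www/mywebapp>
            # "Limit" is the AllowOverride category that covers Order/Allow/Deny
            AllowOverride Limit
        </Directory>

    With AllowOverride None in effect, any access-control directive in .htaccess aborts the request with exactly this Internal Server Error.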

    Read the article

  • Blocking a specific URL by IP (a URL create by mod-rewrite)

    - by Alex
    We need to block a specific URL for anyone not on a local IP (anyone without a 192.168.*.* address). However, we cannot use Apache's

        <Directory /var/www/foo/bar>
            Order allow,deny
            Allow from 192.168
        </Directory>

        <Files /var/www/foo/bar>
            Order allow,deny
            Allow from 192.168
        </Files>

    because these block specific files or directories. We need to block a specific URL, which is created by mod_rewrite, and the page is dynamically generated using PHP. Any ideas would be greatly appreciated.
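
    <Location> is the container that matches the request URL rather than anything on disk, so it works for mod_rewrite-generated paths where <Directory> and <Files> can't. A sketch, assuming the public URL is /foo/bar; note that <Location> belongs in the server or vhost config, not in .htaccess:

        <Location /foo/bar>
            Order deny,allow
            Deny from all
            Allow from 192.168
        </Location>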

    Read the article

  • Apache AliasMatch and DirectoryMatch not working?

    - by Alex
    I have the following config. Please note the plain Alias and Directory equivalents: uncommented, they work as expected, but the dynamic regex-based versions don't. Any ideas?

        <VirtualHost *:80>
            ServerName temp.dev.local
            ServerAlias temp.dev.local
            DocumentRoot "C:\wamp\www\temp\public"
            <Directory "C:\wamp\www\temp\public">
                AllowOverride all
                Order Allow,Deny
                Allow from all
            </Directory>

            # Alias /private/application/core/page/assets/images/ "C:/wamp/www/temp/private/application/core/page/assets/images/"
            # <Directory "C:/wamp/www/temp/private/application/core/page/assets/images/">
            AliasMatch ^/private/application/(.*)/(.*)/assets/images/ /private/application/$1/$2/assets/images/
            <DirectoryMatch "^/private/application/(.*)/(.*)/assets/images/">
                Options Indexes FollowSymlinks MultiViews Includes
                AllowOverride None
                Order allow,deny
                Allow from all
            </DirectoryMatch>
        </VirtualHost>
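
    Two things worth checking here, offered as a sketch rather than a verified fix: AliasMatch must map to a real filesystem path (the version above maps the URL back onto itself, so nothing is actually aliased), and DirectoryMatch matches filesystem paths, not URLs:

        AliasMatch ^/private/application/([^/]+)/([^/]+)/assets/images/(.*)$ "C:/wamp/www/temp/private/application/$1/$2/assets/images/$3"
        <DirectoryMatch "^C:/wamp/www/temp/private/application/[^/]+/[^/]+/assets/images/">
            Options Indexes FollowSymLinks MultiViews Includes
            AllowOverride None
            Order allow,deny
            Allow from all
        </DirectoryMatch>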

    Read the article

  • phpmyadmin forbidden after changing config for my IP

    - by Jonathan Kushner
    I followed the phpMyAdmin setup and changed the config to "Require ip <my ip address>" and "Allow from <my ip address>", and it's still telling me "Forbidden: You don't have permission to access /phpmyadmin on this server." when I try to access the page in my browser (the server is not on my local machine). I installed everything as root, and I also did chmod 775 on the entire phpMyAdmin folder. I'm running RHEL 6.1. Any idea what to do at this point? Here is my /etc/httpd/conf.d/phpMyAdmin.conf:

        <Directory /usr/share/phpMyAdmin/>
            <IfModule mod_authz_core.c>
                # Apache 2.4
                <RequireAny>
                    Require ip myserveripaddress
                    Require ip ::1
                </RequireAny>
            </IfModule>
            <IfModule !mod_authz_core.c>
                # Apache 2.2
                Order Deny,Allow
                Deny from All
                Allow from myserveripaddress
                Allow from ::1
            </IfModule>
        </Directory>
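
    One thing to rule out first: Require ip and Allow from filter on the address the request arrives from - the browser machine's IP as seen by the server - not the server's own address. A sketch of the 2.2 branch (RHEL 6.1 ships Apache 2.2, so the mod_authz_core branch is inert), with a hypothetical workstation address:

        <IfModule !mod_authz_core.c>
            # Apache 2.2
            Order Deny,Allow
            Deny from All
            Allow from 203.0.113.45   # the *client* machine's IP, not the server's
            Allow from 127.0.0.1
            Allow from ::1
        </IfModule>

    The chmod 775 is unrelated here; a config-level Deny produces this Forbidden regardless of filesystem permissions.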

    Read the article

  • unable to join domain using virtualbox

    - by FreshPrinceOfSO
    I'm in the process of setting up a VM environment for a MS certification exam (70-462). Following the training kit's instructions, I've set up a domain controller (DC) and two members (SQL-A, SQL-B) thus far. I can't figure out why I can't join the domain.

        DC
        IPv4 Address . . . : 10.10.10.10(Preferred)
        Subnet Mask. . . . : 255.0.0.0
        DNS Servers. . . . : ::1
                             127.0.0.1

        SQL-A
        IPv4 Address . . . : 10.10.10.20(Preferred)
        Subnet Mask. . . . : 255.0.0.0
        DNS Servers. . . . : 10.10.10.10

        SQL-B
        IPv4 Address . . . : 10.10.10.30(Preferred)
        Subnet Mask. . . . : 255.0.0.0
        DNS Servers. . . . : 10.10.10.10

    I've read how to do networking between virtual machines in VirtualBox and the documentation. After trying various network adapter configurations, I can't get them to communicate in order to have the two members join the domain. When I ping from .30 to .10, I get:

        ping 10.10.10.10
        Pinging 10.10.10.10 with 32 bytes of data:
        Reply from 10.10.10.20: Destination host unreachable.
        Reply from 10.10.10.20: Destination host unreachable.
        Reply from 10.10.10.20: Destination host unreachable.
        Reply from 10.10.10.20: Destination host unreachable.

    Trying to join the domain:

        netdom join SQL-A /domain:contso.com
        The specified domain either does not exist or could not be contacted.
        The command failed to complete successfully.

    Within VirtualBox, I've tried the following combinations for the network adapter (Attached to / Promiscuous Mode):

        NAT
        Bridged Adapter   - Deny
        Bridged Adapter   - Allow VMs
        Bridged Adapter   - Allow All
        Internal Network  - Deny
        Internal Network  - Allow VMs
        Internal Network  - Allow All
        Host-only Adapter - Deny
        Host-only Adapter - Allow VMs
        Host-only Adapter - Allow All

    Edit: ipconfig /all of DC; ipconfig /all of SQL-A
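
    "Reply from 10.10.10.20: Destination host unreachable" is the sender answering its own ping - no ARP response - which suggests the VMs aren't on a shared segment at all. For a self-contained lab like this, attaching every VM's adapter to the same named internal network is one clean setup; a sketch with VBoxManage, VM and network names assumed:

        VBoxManage modifyvm "DC"    --nic1 intnet --intnet1 "lab70462"
        VBoxManage modifyvm "SQL-A" --nic1 intnet --intnet1 "lab70462"
        VBoxManage modifyvm "SQL-B" --nic1 intnet --intnet1 "lab70462"

    Separately, the join command above targets contso.com; if the forest was created as contoso.com, that one-letter slip alone would produce "the specified domain either does not exist or could not be contacted".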

    Read the article

  • Run script when POST data is sent to Apache

    - by Nathan Adams
    Among my several years of running servers there seems to be a pattern to most spam activity. My question/idea: is there a way to tell Apache to run a script when POST data is detected? What I would want to do is perform a reverse DNS lookup on the client's IP address, and then perform a DNS lookup on the hostname in the PTR record. Afterwards, perform some checks (excuse the pseudo-code):

        if PTR does not exist:
            deny POST request
        if IP of PTR hostname == client's IP:
            allow POST request
        else:
            deny POST request

    Though I don't care about GET requests, even though they can be just as malicious, this idea is targeted at spam comments, which use POST data to send the comment data to the web server. To make sure there isn't much of a time delay, I would run my own recursive DNS server. Please note, this isn't meant to be a silver bullet for spam, but it should decrease the volume. Possible or impossible?
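
    Possible, with the caveat that it has to live in the server or vhost config (RewriteMap is not allowed in .htaccess). mod_rewrite can keep an external program running as a coprocess and ask it for a verdict per request - a sketch, where the script path and the allow/deny vocabulary are my own invention:

        RewriteEngine On
        RewriteMap fcrdns prg:/usr/local/bin/fcrdns-check.py
        RewriteCond %{REQUEST_METHOD} =POST
        RewriteCond ${fcrdns:%{REMOTE_ADDR}} =deny
        RewriteRule .* - [F]

    And the coprocess itself, a minimal Python sketch of the forward-confirmed reverse DNS check from the pseudo-code above:

        #!/usr/bin/env python
        # fcrdns-check.py (hypothetical name): reads one IP per line from
        # Apache's RewriteMap interface, answers "allow" or "deny".
        import socket
        import sys

        for line in sys.stdin:
            ip = line.strip()
            try:
                ptr_host = socket.gethostbyaddr(ip)[0]             # reverse (PTR) lookup
                fwd_addrs = socket.gethostbyname_ex(ptr_host)[2]   # forward lookup of the PTR name
                verdict = "allow" if ip in fwd_addrs else "deny"
            except socket.error:
                verdict = "deny"                                   # no PTR record -> deny
            sys.stdout.write(verdict + "\n")
            sys.stdout.flush()                                     # RewriteMap needs unbuffered replies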

    Read the article

  • Nginx. How do I reject request to unlisted ssl virtual server?

    - by Osw
    I have a wildcard SSL certificate and several subdomains on the same IP. Now I want my nginx to handle only the mentioned server names and drop the connection for all others, so that it looks like nginx is not running for unlisted server names (not responding, rejecting, dead, not a single byte in response). I do the following:

        ssl_certificate tls/domain.crt;
        ssl_certificate_key tls/domain.key;

        server {
            listen 1.2.3.4:443 ssl;
            server_name validname.domain.com;
            # ...
        }

        server {
            listen 1.2.3.4:443 ssl;
            server_name _;
            # deny all;
            # return 444;
            # return 404;
            # location {
            #     deny all;
            # }
        }

    I've tried almost everything in the last server block, but with no success: I get either a valid response from a known virtual server or an error code. Please help.
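
    One approach that gets close, sketched here: TLS has to complete the handshake before nginx can read the name the client asked for, so the catch-all needs a certificate, default_server status, and nginx's special status 444, which closes the connection without writing a single byte of response:

        server {
            listen 1.2.3.4:443 ssl default_server;
            server_name _;
            # a certificate is still required for the handshake to complete
            ssl_certificate     tls/domain.crt;
            ssl_certificate_key tls/domain.key;
            return 444;
        }

    If even the handshake must be refused, newer nginx (1.19.4+) adds ssl_reject_handshake on for exactly this case.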

    Read the article

  • Drupal + Lighttpd: enabling clean urls (rewriting)

    - by Patrick
    I'm emulating Ubuntu on my Mac, and I use it as a server. I've installed lighttpd + Drupal, and the following configuration section requires a domain name in order for clean URLs to work. Since I'm using a local server I don't have a domain name, and I was wondering how to make it work given that the IP of the local machine keeps changing. Thanks.

        $HTTP["host"] =~ "(^|\.)mywebsite\.com" {
            server.document-root = "/var/www/sites/mywebsite"
            server.errorlog = "/var/log/lighttpd/mywebsite/error.log"
            server.name = "mywebsite.com"
            accesslog.filename = "/var/log/lighttpd/mywebsite/access.log"
            include_shell "./drupal-lua-conf.sh mywebsite.com"

            url.access-deny += ( "~", ".inc", ".engine", ".install", ".info",
                                 ".module", ".sh", "sql", ".theme", ".tpl.php",
                                 ".xtmpl", "Entries", "Repository", "Root" )

            # "Fix" for Drupal SA-2006-006, requires lighttpd 1.4.13 or above
            # Only serve .php files of the drupal base directory
            $HTTP["url"] =~ "^/.*/.*\.php$" {
                fastcgi.server = ()
                url.access-deny = ("")
            }

            magnet.attract-physical-path-to = ("/etc/lighttpd/drupal-lua-scripts/p-.lua")
        }
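
    Since the conditional keys on the Host header the browser sends, not on the server's IP, the lightest workaround is to give the client a name to send: an /etc/hosts entry on the machine the browser runs on, updated whenever the server's address changes. A sketch with an assumed address:

        # /etc/hosts on the client machine
        192.168.0.42   mywebsite.com www.mywebsite.com

    The browser then requests http://mywebsite.com/, the Host header matches the lighttpd conditional, and Drupal's clean URLs work without a registered domain.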

    Read the article

  • Declaring multiple ports for the same VirtualHosts

    - by user65567
    How do I declare multiple ports for the same VirtualHost?

        SSLStrictSNIVHostCheck off

        # Apache setup which will listen for and accept SSL connections on port 443.
        Listen 443

        # Listen for virtual host requests on all IP addresses
        NameVirtualHost *:443

        <VirtualHost *:443>
            ServerName domain.localhost
            DocumentRoot "/Users/<my_user_name>/Sites/domain/public"
            <Directory "/Users/<my_user_name>/Sites/domain/public">
                Order allow,deny
                Allow from all
            </Directory>
            # SSL Configuration
            SSLEngine on
            ...
        </VirtualHost>

    How can I declare a new port ('Listen', ServerName, ...) for domain.localhost? If I add the following code, Apache also answers (too eagerly) for every other subdomain of domain.localhost (subdomain1.domain.localhost, subdomain2.domain.localhost, ...):

        <VirtualHost *:80>
            ServerName pjtmain.localhost:80
            DocumentRoot "/Users/Toto85/Sites/pjtmain/public"
            RackEnv development
            <Directory "/Users/Toto85/Sites/pjtmain/public">
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>
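
    The reason the *:80 vhost swallows every subdomain is that the first (here, only) name-based vhost on a port is also the default for any Host header that matches nothing else. A sketch of the usual arrangement - NameVirtualHost for the port plus a deliberate catch-all listed first, names assumed:

        NameVirtualHost *:80

        # listed first, so unmatched Host headers land here instead of pjtmain
        <VirtualHost *:80>
            ServerName catchall.localhost
            Redirect 404 /
        </VirtualHost>

        <VirtualHost *:80>
            ServerName pjtmain.localhost
            DocumentRoot "/Users/Toto85/Sites/pjtmain/public"
        </VirtualHost>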

    Read the article

  • Only 192.168.0.3 can request most files, but anyone can request /public/file.html

    - by mattalexx
    I have the following virtual host on my development server:

        <VirtualHost *:80>
            ServerName example.com
            DocumentRoot /srv/web/example.com/pub
            <Directory /srv/web/example.com/pub>
                Order Deny,Allow
                Deny from all
                Allow from 192.168.0.3
            </Directory>
        </VirtualHost>

    The "Allow from 192.168.0.3" part is there to only allow requests from my workstation machine. I want to tweak this to allow anyone to request a certain URL: http://example.com/public/file.html. How do I change this to let /public/file.html requests through from anyone? Note: /public/file.html doesn't actually exist as a file on the server; I redirect all incoming requests through a single index file using mod_rewrite.
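
    Because the exception has to match a URL rather than a file, <Location> is the natural container - it is merged after <Directory>, so it wins for the one path. A sketch, placed inside the same VirtualHost:

        <Location /public/file.html>
            Order Allow,Deny
            Allow from all
        </Location>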

    Read the article

  • SSL support with Apache and Proxytunnel

    - by whuppy
    I'm inside a strict corporate environment. HTTPS traffic goes out via an internal proxy (for this example it's 10.10.04.33:8443) that's smart enough to block ssh'ing directly to ssh.glakspod.org:443, but I can get out via proxytunnel. I set up an apache2 VirtualHost at ssh.glakspod.org:443 thus:

        ServerAdmin [email protected]
        ServerName ssh.glakspod.org

        <!-- Proxy Section -->
        <!-- Used in conjunction with ProxyTunnel -->
        <!-- proxytunnel -q -p 10.10.04.33:8443 -r ssh.glakspod.org:443 -d %host:%port -->
        ProxyRequests on
        ProxyVia on
        AllowCONNECT 22
        <Proxy *>
            Order deny,allow
            Deny from all
            Allow from 74.101
        </Proxy>

    So far so good: I hit the Apache proxy with a CONNECT, then PuTTY and my ssh server shake hands and I'm off to the races. There are, however, two problems with this setup:

    1. The internal proxy server can sniff my CONNECT request and also see that an SSH handshake is taking place. I want the entire connection between my desktop and ssh.glakspod.org:443 to look like HTTPS traffic no matter how closely the internal proxy inspects it.

    2. I can't get the VirtualHost to be a regular HTTPS site while proxying. I'd like the proxy to coexist with something like this:

        SSLEngine on
        SSLProxyEngine on
        SSLCertificateFile /path/to/ca/samapache.crt
        SSLCertificateKeyFile /path/to/ca/samapache.key
        SSLCACertificateFile /path/to/ca/ca.crt

        DocumentRoot /mnt/wallabee/www/html
        <Directory /mnt/wallabee/www/html/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
        </Directory>

        <!-- Need a valid client cert to get into the sanctum -->
        <Directory /mnt/wallabee/www/html/sanctum>
            SSLVerifyClient require
            SSLOptions +FakeBasicAuth +ExportCertData
            SSLVerifyDepth 1
        </Directory>

    So my question is: how do I enable SSL support on the ssh.glakspod.org:443 VirtualHost in a way that will work with ProxyTunnel? I've tried various combinations of proxytunnel's -e, -E, and -X flags without any luck. The only lead I've found is Apache Bug No. 29744, but I haven't been able to find a patch that will install cleanly on Ubuntu Jaunty's Apache version 2.2.11-2ubuntu2.6. Thanks in advance.

    Read the article

  • How to use mod_proxy to let my index of Apache go to Tomcat ROOT and be able to browse my other Apache sites

    - by Dagvadorj
    Hello, I am trying to get my Tomcat application (deployed at ROOT) to be viewable through Apache on port 80. To do this I used mod_proxy, since mod_jk made me try harder. I used something like this in httpd.conf:

        <location http://www.example.com>
            Order deny,allow
            Allow from all
            PassProxy http://localhost:8080/
            PassProxyReverse http://localhost:8080/
        </location>

        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>

    And now I cannot retrieve my previous sites on Apache, which were running prior to my configuration. How can I have both running?
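
    For what it's worth, the directives are spelled ProxyPass/ProxyPassReverse (there is no PassProxy), and <Location> takes a URL path, never a full URL. The usual shape for "Tomcat at /, but Apache keeps serving some paths itself" is an exclusion before the catch-all - a sketch, with /static standing in for whatever paths Apache should keep:

        ProxyRequests Off
        ProxyPreserveHost On
        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>
        # exclusions must precede the catch-all mapping
        ProxyPass /static !
        ProxyPass / http://localhost:8080/
        ProxyPassReverse / http://localhost:8080/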

    Read the article

  • Why does Apache ignore my Directory block?

    - by Codemonkey
    I just moved my projects onto a new workstation, and I'm having trouble getting my Apache installation to acknowledge my .htaccess files. This is my /etc/apache2/conf.d/dev config file:

        <Directory /home/codemonkey/dev/myproject/>
            Options -Indexes
            AllowOverride All
            Order Allow,Deny
            Deny from all
        </Directory>

    I know the config file is being included by Apache because it complains if I put erroneous syntax in it (Action 'configtest' fails). My project is reachable through Apache by a symlink in the /var/www directory. The server is running with my user and group, so it has my permissions, and my entire dev folder has permissions set to 770 recursively. Despite all this, I'm still getting an indexed display of my project folder when I visit http://localhost/myproject. Why isn't the above config making it impossible to view the folder in the browser?
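
    Apache matches <Directory> against the path it constructs from DocumentRoot and does not resolve symlinks while doing so: a request for /myproject walks /var/www/myproject/..., which never equals /home/codemonkey/dev/myproject/. A sketch of a block that would actually match, assuming the symlink is /var/www/myproject:

        <Directory /var/www/myproject/>
            Options -Indexes
            AllowOverride All
            Order Allow,Deny
            Deny from all
        </Directory>

    (An Alias mapping /myproject to the real /home/... path, combined with the original <Directory> block, is the other common arrangement.)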

    Read the article

  • How to block a user in Apache httpd server from accessing *.php files inside a directory, so the user must access them via the directory name instead

    - by Oxi
    My requirement looks simple, but Googling did not help me yet. I want to throw a 404 page (not redirect to another folder or file) at any user who tries to access *.php files directly on my website. For example: when a client asks for www.example.com/home/ I want to show the content, but when the user requests www.example.com/home/index.php I want to show a 404 page. I tried different methods and nothing worked for me; one of them is shown below:

        <Directory "C:/xampp/htdocs/*">
            <FilesMatch "^\.php">
                Order Deny,Allow
                Deny from all
                ErrorDocument 403 /test/404/
                ErrorDocument 404 /test/404/
            </FilesMatch>
        </Directory>

    Thanks in advance.
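
    Two bugs in that block, for the record: the regex "^\.php" matches names that begin with .php rather than end with it, and ErrorDocument changes the error page, not the status code, so the client still sees a 403. One sketch that returns a true 404 for direct *.php requests while leaving DirectoryIndex free to run index.php internally is to test THE_REQUEST, which reflects only the original request line and never internal subrequests:

        RewriteEngine On
        # any URL the client itself asked for that names a .php file -> 404
        RewriteCond %{THE_REQUEST} \.php[\s?]
        RewriteRule .* - [R=404,L]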

    Read the article

  • htaccess IP blocking with custom 403 Error not working

    - by mrc0der
    I'm trying to block everyone but one IP address from my site, on a server running Apache and CentOS. My setup follows the example below. My server: http://www.myserver.com/. My .htaccess file:

        <limit GET>
            order deny,allow
            deny from all
            allow from 176.219.192.141
        </limit>
        ErrorDocument 403 http://www.google.com
        ErrorDocument 404 http://www.google.com

    When I visit http://www.myserver.com/ from a disallowed IP, it gives me a generic 403 error. When I visit http://www.myserver.com/page-does-not-exist/ it redirects me correctly to http://www.google.com, but I can't figure out why the 403 error doesn't redirect me too. Anyone have any ideas?
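
    One concrete flaw worth fixing regardless of the redirect question: <limit GET> applies the deny rules to GET requests only, so POST and other methods sail through unchecked - the Apache docs advise keeping access-control directives outside any <Limit> wrapper. A sketch; note also that an external URL in ErrorDocument makes Apache answer with a 302 redirect rather than serving a page, which can mask what is actually happening while testing:

        order deny,allow
        deny from all
        allow from 176.219.192.141
        ErrorDocument 403 http://www.google.com/
        ErrorDocument 404 http://www.google.com/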

    Read the article

  • ssl_error_log apache issue

    - by lakshmipathi
    https://localhost works, but https://ipaddress didn't:

        # cat logs/ssl_error_log
        [Mon Aug 02 19:04:11 2010] [error] [client 192.168.1.158] (13)Permission denied: access to /ajaxterm denied

        [root@space httpd]# cat logs/ssl_access_log
        192.168.1.158 - - [02/Aug/2010:19:04:11 +0530] "GET /ajaxterm HTTP/1.1" 403 290

        [root@space httpd]# cat logs/ssl_request_log
        [02/Aug/2010:19:04:11 +0530] 192.168.1.158 SSLv3 DHE-RSA-CAMELLIA256-SHA "GET /ajaxterm HTTP/1.1" 290

    httpd.conf file:

        NameVirtualHost *:443
        <VirtualHost *:443>
            ServerName localhost
            SSLEngine on
            SSLCertificateFile /etc/pki/tls/certs/ca.crt
            SSLCertificateKeyFile /etc/pki/tls/private/ca.key

            <Directory /usr/share/ajaxterm>
                Options FollowSymLinks
                AllowOverride None
                Order deny,allow
                Allow from All
            </Directory>

            DocumentRoot /usr/share/ajaxterm
            DirectoryIndex ajaxterm.html

            ProxyRequests Off
            <Proxy *>
                # Order deny,allow
                Allow from all
            </Proxy>
            ProxyPass /ajaxterm/ http://localhost:8022/
            ProxyPassReverse /ajaxterm/ http://localhost:8022/

            ErrorLog error_log.log
            TransferLog access_log.log
        </VirtualHost>

    How do I fix this?
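
    Two hedged guesses from those logs: the request was for /ajaxterm without a trailing slash, which never matches ProxyPass /ajaxterm/, so Apache tried to serve the directory itself and tripped over filesystem permissions; and on a Red Hat-family system with SELinux enforcing, httpd cannot open outbound connections to the backend until the relevant boolean is set. A sketch of both fixes:

        # in the vhost: canonicalise the slashless form so the proxy rule matches
        RedirectMatch ^/ajaxterm$ /ajaxterm/

        # shell, if SELinux is enforcing:
        #   setsebool -P httpd_can_network_connect 1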

    Read the article

  • .htaccess enabled, but not working.

    - by Aristotle
    I've been having some very frustrating issues getting .htaccess to work on my new server. I'm not very experienced in managing a server, but I've spent the last three days poring over documentation and every resource I can get my hands on. I've attempted a very basic application of .htaccess, and yet it fails to give the expected results. I've set up a test at http://173.201.185.217/test/. Immediately within this directory is an .htaccess file with the following command:

        deny from all

    If I'm not mistaken, this should deny access to /test, but it doesn't; it just sits there allowing anybody to view the contents. What could be wrong with my server/configuration? The server is running CentOS.
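
    "deny from all" is legal in .htaccess only where the enclosing directory permits it, and a stock CentOS httpd.conf ships the DocumentRoot <Directory> block with AllowOverride None, which makes Apache silently skip every .htaccess file. A sketch of the change (Limit is the override category covering Order/Allow/Deny):

        # /etc/httpd/conf/httpd.conf
        <Directory "/var/www/html">
            Options Indexes FollowSymLinks
            AllowOverride Limit
            Order allow,deny
            Allow from all
        </Directory>

    After a reload (service httpd reload), a 403 on /test/ confirms the file is finally being read.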

    Read the article
