Search Results

Search found 1512 results on 61 pages for 'deny prasetyo'.

Page 35/61 | < Previous Page | 31 32 33 34 35 36 37 38 39 40 41 42  | Next Page >

  • Apache: how to set custom 401 error page and save original behaviour

    - by petRUShka
    I have Kerberos-based authentication with Apache/2.2.3 (Linux/SUSE). When a user tries to open some URL, the browser asks for a domain login and password, as with HTTP Basic Auth. If the user cancels that prompt 3 times, Apache returns the 401 Authorization Required error page. My current virtual host config is <Directory /home/user/www/current/public/> Options -MultiViews +FollowSymLinks AllowOverride None Order allow,deny Allow from all AuthType Kerberos AuthName "Domain login" KrbAuthRealms DOMAIN.COM KrbMethodK5Passwd On Krb5KeyTab /etc/httpd/httpd.keytab require valid-user </Directory> I want to set a nice custom 401 error page with some instructions for users, so I added this line to the virtual host config: ErrorDocument 401 /pages/401 It works: when the user can't authenticate, Apache sends him to my nice page. But Apache no longer asks for a login and password as it did before. I want both the original behaviour and the nice error page. Is it possible to make this work properly?
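
    A minimal sketch of how this is often resolved (the error-page location is illustrative): keep ErrorDocument as a local path, never a full URL (a full URL turns the 401 into a 302 redirect and the browser never sees the WWW-Authenticate challenge), and make sure the error page itself is served without authentication:

        # keep the local path; a full http:// URL would redirect and drop the challenge
        ErrorDocument 401 /pages/401

        # the error page itself must be fetchable without Kerberos auth,
        # otherwise the internal subrequest for it fails as well
        <Directory /home/user/www/current/public/pages>
            Satisfy Any
            Order allow,deny
            Allow from all
        </Directory>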

    Read the article

  • redirect all youtube video requests to a specific one

    - by iTayb
    I'm on an IT team in my company and I would like to block YouTube for users. I don't want to just deny access to the whole youtube domain, but only to replace the .flv/.mp4 request with one that I want. That way, if someone tries to watch YouTube videos on the network, he'll get a video explaining why using our expensive bandwidth for pleasure is a no-no. I thought about using a packet manipulation program and just replacing the video ID with something that I want, but I didn't manage to do it right.
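
    For what it's worth, a hedged sketch of one approach, assuming the clients already go through a Squid proxy: a URL-rewrite helper instead of packet manipulation. The replacement URL, file path and host patterns below are made up, and the helper protocol shown is the classic pre-3.4 redirector one (it differs in newer Squid versions).

        # squid.conf
        url_rewrite_program /usr/local/bin/yt_rewrite.pl
        url_rewrite_children 5

        #!/usr/bin/perl
        # yt_rewrite.pl -- reads "URL client/fqdn ident method" per request,
        # prints a replacement URL, or an empty line to leave the URL alone
        use strict;
        use warnings;
        $| = 1;    # Squid helpers must flush every answer immediately

        my $notice = 'http://intranet.example.com/bandwidth-notice.mp4';   # hypothetical

        while (my $line = <STDIN>) {
            my ($url) = split ' ', $line;
            if (defined $url
                && $url =~ m{^https?://[^/]*(?:youtube\.com|googlevideo\.com)/}i
                && $url =~ m{videoplayback|\.flv|\.mp4}i) {
                print "$notice\n";     # serve the "why not" video instead
            } else {
                print "\n";            # untouched
            }
        }

    This only sees traffic that actually flows through the proxy, and it cannot touch HTTPS YouTube without SSL interception.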

    Read the article

  • Set top level directory to be handled by Perl?

    - by Sam Lee
    I have an Apache server set up to use mod_perl, configured to handle all requests using a Perl module, MyModule. Here is part of my httpd.conf: LoadModule perl_module modules/mod_perl.so <Directory /> Order Deny,Allow Allow from all </Directory> PerlModule MyModule <Location /> SetHandler modperl PerlResponseHandler MyModule </Location> This seems to work fine, except that the top-level directory (i.e. www.mysite.com/) is not being sent to MyModule. What's going wrong?

    Read the article

  • apache proxypass to webmin

    - by Ricardo
    I have a problem with an apache2 Webmin redirect. My ProxyPass config is: ProxyRequests Off ProxyPreserveHost On SSLProxyEngine On ProxyPass /admin/webmin/ https://localhost:10000/ ProxyHTMLURLMap https://localhost:10000 /admin/webmin <Location /admin/webmin/> ProxyHTMLExtended On SetOutputFilter proxy-html ProxyPassReverse https://localhost:10000/ ProxyPassReverse https://xxxxxxxxxxxxxxxxxxxx.amazonaws.com:10000/ Order allow,deny Allow from all </Location> When I connect using https://xxxxxxxxxxxxxxxxxxxx.amazonaws.com:10000/ there is no problem. But when I connect using https://xxxxxxxxxxxxxxxxxxxx.amazonaws.com/admin/webmin the page loses its CSS, and after login it shows me the error: The requested URL /session_login.cgi was not found on this server. I think it's an error in my ProxyPass but I don't know what it is.
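
    In case it helps: the missing CSS and the /session_login.cgi 404 usually mean Webmin doesn't know it is living under a prefix, so its own redirects and links point at /. A sketch of the Webmin-side settings that normally accompany this kind of ProxyPass (these are documented Webmin options; restart Webmin after editing, and substitute the real proxy hostname):

        # /etc/webmin/config
        webprefix=/admin/webmin
        webprefixnoredir=1
        # hostname users hit through the proxy, to satisfy Webmin's referer check
        referers=xxxxxxxxxxxxxxxxxxxx.amazonaws.com

    With those in place, the ProxyPassReverse and ProxyHTMLURLMap lines above should cover the rest.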

    Read the article

  • Performance of blocking countries using iptables/netfilter

    - by markus
    It's easy to block a country's IPs using iptables (e.g. http://www.cyberciti.biz/faq/block-entier-country-using-iptables/). However, I have read that performance can go down if the deny list gets too large. An alternative is installing the iptables geoip patch or using ipset ( http://www.jsimmons.co.uk/2010/06/08/using-ipset-with-iptables-in-ubuntu-lts-1004-to-block-large-ip-ranges/) instead of plain iptables rules. Does anyone have experience with the various approaches and can say something about the performance differences? Are there other ways to block country IPs on Linux that I didn't mention above?
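
    For reference, a minimal ipset sketch (the ranges are placeholders): the whole list lives in one hash:net set that a single iptables rule consults, so lookups stay roughly constant-time instead of walking thousands of individual rules. Older ipset versions spell this as -N/-A and use --set instead of --match-set.

        ipset create blocked-country hash:net
        ipset add blocked-country 203.0.113.0/24     # placeholder range
        ipset add blocked-country 198.51.100.0/22    # placeholder range

        # one rule checks the entire set
        iptables -I INPUT -m set --match-set blocked-country src -j DROP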

    Read the article

  • Apache server as reverse proxy is removing xmlns info from html tag

    - by Johnco
    I have a Java application running in Tomcat, in front of which I have an Apache HTTP server as a reverse proxy. However, the proxy is removing all xmlns data from the html tag, which breaks all of Facebook's FBML, since it is never parsed. My current config is as follows: ProxyRequests off ProxyHTMLDocType XHTML ProxyPassReverseCookiePath /cas / <Location /> ProxyPass http://localhost:8080/cas ProxyPassReverse http://localhost:8080/cas </Location> ProxyHTMLURLMap /cas / SetOutputFilter proxy-html <Proxy *> Order deny,allow Allow from all Satisfy all </Proxy> Thanks in advance.

    Read the article

  • administrator denies command prompt

    - by roxas383
    I looked at the command prompt FAQs on this website, which told me to make a new profile, but when I went to open it, it said "The command prompt has been disabled by your administrator - click any key to continue", and when I do, it closes, which I knew was going to happen. Is there some way I can stop the admin from blocking my command prompt? Oh, and I was at school too (don't know if this helps, but I am in the West Allis West Milwaukee school district). Also, is there a way I can hack into the blocking program for my school at all? I don't remember what the program is, but it might have to do with Vision? I'm not sure, but it would be greatly appreciated if someone could help. (Could I download a program onto a flash drive that bypasses any blocking and/or disables SafeSearch, plug it into a computer at my school and use it from the flash drive? Or would the school's blocking software stop it from working?) Help me please!

    Read the article

  • Apache in front of tomcat on Railo proxy with ajp

    - by user1468116
    I'm trying to set up Apache in front of the Tomcat embedded in Railo. I have these settings: <VirtualHost *:80> DocumentRoot "/var/www/myapp" ServerName www.myapp.test ServerAlias www.myapp.test ProxyRequests Off ProxyPass /app ! <Proxy *> Order deny,allow Allow from all </Proxy> ProxyPreserveHost On ProxyPassReverse / ajp://%{HTTP_HOST}:8009/ RewriteEngine On # If it's a CFML (*.cfc or *.cfm) request, just proxy it to Tomcat: RewriteRule ^(.+\.cf[cm])(/.*)?$ ajp://%{HTTP_HOST}:8009/$1$2 [P] My server.xml: <Host name="www.myapp.test" appBase="webapps" unpackWARs="true" autoDeploy="true" xmlValidation="false" xmlNamespaceAware="false"> <Context path="" docBase="/var/www/myapp" /> <Alias>myapp.test</Alias> </Host> The index loads, but if I try to load some internal page I get: The proxy server could not handle the request GET /report/myreportname. Reason: DNS lookup failure for: localhost:8009report Could you help me?
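
    Not a definitive fix, but the localhost:8009report in that error (no slash between the port and the path) suggests the rule that proxies the non-CFML pages builds its target without the leading slash. A sketch with a fixed backend host and the slash spelled out, assuming anything that isn't an existing file should go to Tomcat:

        ProxyPreserveHost On
        ProxyPassReverse / ajp://localhost:8009/

        RewriteEngine On
        # CFML straight to Tomcat; note the "/" kept between port and path
        RewriteRule ^/(.+\.cf[cm])(/.*)?$ ajp://localhost:8009/$1$2 [P,L]
        # everything else that is not a real file on disk
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^/(.*)$ ajp://localhost:8009/$1 [P,L]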

    Read the article

  • suPHP: how to disable ls /

    - by Pol Hallen
    Using suPHP, I set a separate php.ini for every virtual host. In php.ini I also set: open_basedir = /home/site1 PHP scripts run, but if I have a script that runs ls / I can see the whole root directory. How can I close this security hole? <VirtualHost *:80> ServerName site1 ServerAlias www.site1.com DirectoryIndex index.html index.htm DocumentRoot /home/site1/ suPHP_Engine on AddHandler x-httpd-php .php .php3 .php4 .php5 suPHP_AddHandler x-httpd-php # THIS READ php.ini suPHP_ConfigPath /home/site1/ <Directory /home/site1/> Options -Includes -Indexes -FollowSymLinks -ExecCGI -MultiViews AllowOverride none Order allow,deny Allow from all </Directory> </VirtualHost>
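
    For what it's worth, this is expected behaviour rather than a misconfiguration: open_basedir only restricts PHP's own file functions, not what a shell command started via exec()/shell_exec() can read. The usual companion setting in the same per-vhost php.ini is disable_functions; a sketch (extend the list to taste):

        ; php.ini for this vhost
        open_basedir = /home/site1
        ; open_basedir does not apply to spawned shells, so block them outright
        disable_functions = exec,shell_exec,system,passthru,popen,proc_open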

    Read the article

  • Limit HTTP VERBS on Apache2

    - by user72295
    I am trying to limit the use of certain HTTP verbs on my site. I entered the following into my VirtualHost config file, within the Directory element: <Limit GET POST HEAD> Allow from all </Limit> <Limit PUT DELETE OPTIONS> Deny from all </Limit> This seemed to work, but with unexpected results. I ran the following telnet/HTTP commands before and after this change: open server 80 OPTIONS server/abs_path HTTP/1.1 User-Agent: Telnet/1.0 Host: server Before the change I received a successful response with the Allow header. After the change, however, I was expecting a 405 'Method Not Allowed' response but instead received a 403 'Access Forbidden' response. What do I need to change in Apache to return the 405 HTTP response? Many thanks
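
    For reference, the access-control modules can only answer with 403; one common way to get a genuine 405 is to catch those methods with mod_rewrite instead of (or before) the Limit block. A sketch, assuming mod_rewrite is enabled:

        RewriteEngine On
        RewriteCond %{REQUEST_METHOD} ^(PUT|DELETE|OPTIONS)$
        RewriteRule .* - [R=405,L]

    With a non-3xx code in the R flag, mod_rewrite stops processing and returns that status directly.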

    Read the article

  • Blocking a country (mass IP ranges), best practice for the actual block

    - by kwiksand
    Hi all, This question has obviously been asked many times in many different forms, but I can't find an actual answer to the specific plan I've got. We run a popular European commercial deals site, and are getting a large amount of incoming registrations/traffic from countries that cannot even take part in the deals we offer (and many of the retailers aren't even known outside Western Europe). I've identified the problem area I want to block to cut a lot of this traffic, but (as expected) there are thousands of IP ranges required. My question now (finally!): on a test server, I created a script to block each range within iptables, but the amount of time it took to add the rules was large, and afterwards iptables was unresponsive (especially when attempting an iptables -L). What is the most efficient way of blocking large numbers of IP ranges: iptables? A plugin where I can preload them efficiently? hosts.deny? .htaccess (nasty, as I'd be running it in Apache on every load-balanced web server)? Cheers
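
    On the loading-time side specifically, most of the pain comes from invoking iptables once per range; ipset restore (like iptables-restore) loads the whole list in one atomic pass and leaves a single rule in the chain. A sketch with placeholder ranges, assuming the restore file is generated from your range source:

        # /etc/blocked-ranges.ipset -- generated file, placeholder contents
        create blocked-ranges hash:net
        add blocked-ranges 203.0.113.0/24
        add blocked-ranges 198.51.100.0/22

        # load it in one pass, then reference the set once
        ipset restore < /etc/blocked-ranges.ipset
        iptables -I INPUT -m set --match-set blocked-ranges src -j DROP

    iptables -L also stays fast because there is only one rule to print; the set itself is listed with ipset list.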

    Read the article

  • Using both domain users and local users for Squid authentication?

    - by Massimo
    I'm working on a Squid proxy which needs to authenticate users against an Active Directory domain; this works fine, Samba was correctly set up and Squid authenticates users via ntlm_auth. Relevant lines in squid.conf: auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp auth_param ntlm children 5 auth_param ntlm keep_alive on acl Authenticated proxy_auth REQUIRED http_access allow Authenticated http_access deny all Now, I need a way to allow access to users who don't have a domain account. I know I could create an "internet user" account in the domain, but this would allow access, although limited, to domain resources (file shares, etc.); I need something that will allow only Internet access. The ideal solution would be using a local account on the proxy server, either a Linux account or a Squid one; I know Squid supports this, but I'm unable to have it use both domain authentication and Squid/local authentication if domain auth is unsuccessful. Can this be done? How?
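
    Not guaranteed, but Squid does let you stack authentication schemes, so a basic-auth helper backed by a local htpasswd-style file can sit behind the NTLM helper; whether a given browser falls back from NTLM to Basic cleanly varies by client. A sketch (the helper path differs by distro, and the local_users file is created with htpasswd):

        # existing NTLM/domain auth
        auth_param ntlm program /usr/bin/ntlm_auth --helper-protocol=squid-2.5-ntlmssp
        auth_param ntlm children 5
        auth_param ntlm keep_alive on

        # fallback: proxy-only local accounts in an htpasswd file
        auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/local_users
        auth_param basic children 5
        auth_param basic realm Proxy (local account)

        acl Authenticated proxy_auth REQUIRED
        http_access allow Authenticated
        http_access deny all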

    Read the article

  • Block P2P traffic on a Linksys router WRT54G with Tomato firmware

    - by Kami
    I'm running a small wireless network (6 to 10 users) on a Linksys WRT54G with Tomato firmware, sharing an Internet connection. I don't want the users to download files with BitTorrent (the main one used) or other P2P apps. I've found some solutions for lowering P2P traffic priority using QoS, but I really need to ban P2P traffic. Does anyone know how to set up some rules to deny that kind of traffic? I've tried to set up an Access Restriction Rule, but it's not working at all.

    Read the article

  • Allow HTTPS cookies but not HTTP?

    - by Ken
    I want to allow cookies for a domain but only over HTTPS -- not cookies from the same domain that come from HTTP. For example, I don't want any http://www.google.com cookies, but I do want to allow https://www.google.com cookies (because Calendars are there). Is there a way to do this? Does the goal even make sense? In Chrome, it only allows domain names, not URLs, to be added to the cookie exception list. In Firefox, it allows a protocol, but it only records the domain name, and if you click "Allow" or "Deny", it changes the same entry in the list.

    Read the article

  • Mod_jk Tomcat VirtualHost

    - by user37143
    Hi, I have two applications in Tomcat, app1 and app2. I have mod_jk configured with Apache as the front end, and I am able to get Tomcat's index.jsp. Now I have created two virtual hosts for app1 and app2, so that app1.domain.com points to app1 in Tomcat and app2.domain.com points to app2 in Tomcat, but it's not working. I have the vhost as: ServerName www.app1.domain.com ServerAlias app1.domain.com DocumentRoot "/opt/tomcat/webapps/app1" DirectoryIndex index.jsp Options Indexes FollowSymLinks AllowOverride None Order allow,deny Allow from all The following section was added for JK: JkMount /*.do ajp13 JkMount /*.jsp ajp13 JkMount / ajp13 JkMount /* ajp13 JkUnMount /*.php ajp13 JkUnMount /*.gif ajp13 JkUnMount /*.html ajp13 JkUnMount /*.css ajp13 JkUnMount /*.png ajp13 JkUnMount /*.jpg ajp13 But this did not work: both subdomains load Tomcat's index.jsp. Can someone help me? Thanks
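
    One thing worth checking, as a sketch rather than a definitive answer: JkMount lines at the global level are not inherited inside a <VirtualHost> unless JkMountCopy is enabled (or the mounts are repeated per vhost), and Tomcat then selects the webapp by the Host header, so each name also needs its own <Host> or <Alias> in server.xml:

        <VirtualHost *:80>
            ServerName  www.app1.domain.com
            ServerAlias app1.domain.com
            DocumentRoot /opt/tomcat/webapps/app1
            DirectoryIndex index.jsp

            # either repeat the mounts inside the vhost ...
            JkMount /*.jsp ajp13
            JkMount /*.do  ajp13
            JkMount /*     ajp13
            # ... or inherit the global ones:
            # JkMountCopy On
        </VirtualHost>

    The same applies to the app2 vhost, paired with matching <Host name="app1.domain.com"> and <Host name="app2.domain.com"> entries on the Tomcat side.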

    Read the article

  • Apache ProxyPass ignore static files

    - by virtualeyes
    I'm having an issue with an Apache front server connecting to a Jetty application server. I thought that ProxyPass ! in a Location block was supposed to NOT pass processing on to the application server, but for some reason that is not happening in my case: Jetty shows a 404 on the missing statics (js, css, etc.). Here's my Apache (v 2.4, BTW) virtual host block: DocumentRoot /path/to/foo ServerName foo.com ServerAdmin [email protected] RewriteEngine On <Directory /path/to/foo> AllowOverride None Require all granted </Directory> ProxyRequests Off ProxyVia Off ProxyPreserveHost On <Proxy *> AddDefaultCharset off Order deny,allow Allow from all </Proxy> # don't pass through requests for statics (image,js,css, etc.) <Location /static/> ProxyPass ! </Location> <Location /> ProxyPass http://localhost:8081/ ProxyPassReverse http://localhost:8081/ SetEnv proxy-sendchunks 1 </Location>
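
    An alternative arrangement that follows the exclusion pattern from the mod_proxy documentation is to keep everything as plain ProxyPass directives, with the exclusion listed before the catch-all (first match wins), so /static/ is served from DocumentRoot and never reaches Jetty; a sketch:

        # order matters: exclusions before the general rule
        ProxyPass /static/ !
        ProxyPass /        http://localhost:8081/
        ProxyPassReverse / http://localhost:8081/

    The static files then need to actually exist under /path/to/foo/static/ for Apache to serve them itself.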

    Read the article

  • Nginx IP Whitelist

    - by Will
    Is it possible to create an IP whitelist for my nginx proxy server without adding allow or deny entries in the config file? Can I get nginx to check a separate database to see whether a user is allowed to access the website? Ideally nginx would consult an external database, or at minimum a list of allowed IPs kept on the same server, so I can easily update the list without restarting nginx every time. In the future I would like to tie nginx into my website: a user will log in, their IP will be linked to their account, and they will be able to update their IP if it has changed, to regain access. So keep in mind that this would be easier if I had an external list of IPs in some kind of database. Any help is appreciated.
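
    Not exactly a database hookup, but a sketch of a halfway house: keep the allowed IPs in a separate generated file that the geo module reads, and have whatever writes that file run nginx -s reload afterwards (a graceful reload, not a restart). Names and paths below are illustrative:

        # in the http {} block
        geo $ip_allowed {
            default 0;
            include /etc/nginx/whitelist.conf;   # generated: one "203.0.113.7 1;" line per allowed IP
        }

        server {
            listen 80;
            server_name example.com;             # placeholder

            if ($ip_allowed = 0) {
                return 403;
            }

            location / {
                proxy_pass http://127.0.0.1:8080;   # placeholder backend
            }
        }

    For a true per-request database check, the auth_request module pointed at a small internal service tied to your user accounts is the usual next step.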

    Read the article

  • Can't get MultiViews to work on Apache 2.2 - negotiation problem

    - by Doe
    Hi, I can't get MultiViews set up properly on my Apache 2.2. When I go to filtered.com/something, I expect it to execute something.pl, but it doesn't; I get a 404 error page. My error log says: "[Fri Apr 16 13:04:20 2010] [error] [client 78.85.152.94] Negotiation: discovered file(s) matching request: /var/www/html/filtered.net/translate-english (None could be negotiated)., referer: http://filtered.net/" Would anyone kindly help me get MultiViews properly set up on my server? ServerAdmin [email protected] ServerAlias *.filtered.net DocumentRoot /var/www/html/filtered.net ServerName filtered.net ErrorLog logs/filtered.net-error_log CustomLog logs/filtered.net-access_log common Options ExecCGI +Indexes +IncludesNoExec +MultiViews +ExecCGI AllowOverride None Order allow,deny Allow from all <IfModule mod_dir.c> DirectoryIndex index.php index.html index.pl </IfModule> </Directory> </VirtualHost>
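
    For what it's worth, MultiViews can only negotiate variants it knows how to serve, so the .pl file needs a handler (plus ExecCGI) before negotiation will pick it; otherwise you get exactly the "None could be negotiated" message above. A sketch of the additions, assuming the scripts are ordinary CGI:

        <Directory /var/www/html/filtered.net>
            Options +ExecCGI +MultiViews
            AddHandler cgi-script .pl
        </Directory>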

    Read the article

  • Permission forbidden on localhost with apache2

    - by N Alex
    Here is what I am trying to do. I tried to add another folder to apache and I get the following error when trying to access testing/index.html. The idea is that I would like to have for every customer a folder like /home/neagoe/Work/InterWebs/Projects/[PROJECT NAME]/CustomerProjects/website/dist. Forbidden You don't have permission to access /index.html on this server. Apache/2.2.22 (Ubuntu) Server at testing Port 80 Here are the steps that I followed: Step1: sudo chmod a+x /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist Step2: sudo chown -R www-data:www-data /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist sudo chmod -R 775 /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist Step3: sudo adduser $USER www-data Step4: sudo a2enmod userdir Step5: sudo cp /etc/apache/sites-available/default /etc/apache/sites-available/testing I edited the file /etc/apache/sites-available/testing so it looks like this: <VirtualHost *:80> ServerAdmin webmaster@localhost ServerName testing DocumentRoot /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist <Directory /> Options FollowSymLinks AllowOverride None </Directory> <Directory /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist/ > Options Indexes FollowSymLinks MultiViews AllowOverride All Order allow,deny allow from all </Directory> ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ <Directory "/usr/lib/cgi-bin"> AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog ${APACHE_LOG_DIR}/access.log combined </VirtualHost> Step6: I edited hosts ("/etc/hosts") so it looks like this: 127.0.0.1 localhost 127.0.0.1 testing # The following lines are desirable for IPv6 capable hosts ::1 ip6-localhost ip6-loopback fe00::0 ip6-localnet ff00::0 ip6-mcastprefix ff02::1 ip6-allnodes ff02::2 ip6-allrouters Step7: sudo a2ensite testing sudo service apache2 restart I searched for about 2 hours on the internet but I can't figure out what went wrong. All the pages that I found follow the same steps as described above. I know there are similar questions here on the internet, but the answer is to change permissions on the directory, which I did in Step 2. I am sorry if this is really a duplicate but I couldn't find the right answer. Thank you! PS. I asked this also on AskUbuntu but didn't get any answers so I'm trying my luck here. Edit: There isn't much on the error log or the access log.
On the access.log: ::1 - - [10/Aug/2013:11:23:28 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" ::1 - - [10/Aug/2013:11:23:29 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" ::1 - - [10/Aug/2013:11:23:31 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" ::1 - - [10/Aug/2013:11:23:32 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" ::1 - - [10/Aug/2013:11:23:33 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" ::1 - - [10/Aug/2013:11:23:34 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" ::1 - - [10/Aug/2013:11:23:35 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" 127.0.0.1 - - [10/Aug/2013:11:23:23 +0300] "POST /wordpress-testing/wp-cron.php?doing_wp_cron=1376123003.7026669979095458984375 HTTP/1.0" 200 705 "-" "WordPress/3.6; http://localhost/wordpress-testing" ::1 - - [10/Aug/2013:11:23:36 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" ::1 - - [10/Aug/2013:11:23:37 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" ::1 - - [10/Aug/2013:11:23:38 +0300] "OPTIONS * HTTP/1.0" 200 126 "-" "Apache/2.2.22 (Ubuntu) (internal dummy connection)" 127.0.0.1 - - [10/Aug/2013:11:31:32 +0300] "GET /index.html HTTP/1.1" 200 485 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:23.0) Gecko/20100101 Firefox/23.0" And the last line repeats for about 200 rows. On the error.log: 1. This lines repeat from time to time. PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20100525 /msql.so' - /usr/lib/php5/20100525/msql.so: cannot open shared object file: No such file or directory in Unknown on line 0 [Sat Aug 10 13:06:42 2013] [notice] Apache/2.2.22 (Ubuntu) PHP/5.4.9-4ubuntu2.2 configured -- resuming normal operations [Sat Aug 10 13:07:36 2013] [notice] caught SIGTERM, shutting down PHP Warning: PHP Startup: Unable to load dynamic library '/usr/lib/php5/20100525/msql.so' - /usr/lib/php5/20100525/msql.so: cannot open shared object file: No such file or directory in Unknown on line 0 [Sat Aug 10 13:07:37 2013] [notice] Apache/2.2.22 (Ubuntu) PHP/5.4.9-4ubuntu2.2 configured -- resuming normal operations 2. And this is the predominant error. (hundreds of lines) [Sat Aug 10 13:07:40 2013] [error] [client 127.0.0.1] (13)Permission denied: access to /index.html denied
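
    Not a definitive answer, but with a vhost config that looks right, a (13)Permission denied under /home is very often the directory chain: www-data needs execute (search) permission on every parent directory, not just on dist. A quick way to check and fix, using this path:

        # show effective permissions for each component of the path
        namei -m /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website/dist

        # grant search permission up the chain (or use the www-data group instead of "other")
        chmod o+x /home/neagoe \
                  /home/neagoe/Work \
                  /home/neagoe/Work/InterWebs \
                  /home/neagoe/Work/InterWebs/Projects \
                  /home/neagoe/Work/InterWebs/Projects/testing \
                  /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects \
                  /home/neagoe/Work/InterWebs/Projects/testing/CustomerProjects/website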

    Read the article

  • Exchange 2010 install locks out high level accounts

    - by tearman
    Basically, we assume our problem was caused when we installed Exchange 2010 alongside our Exchange 2003 server. The Exchange 2010 server is not active, just running on the domain. What's actually going on is that user groups like Enterprise Admins are getting a single Deny flag on Full Control over mailboxes currently residing on the Exchange 2003 server, which is preventing any of us from making changes. It says these permissions are inherited from the parent object, but we have no idea which one that is. Any idea how to go about fixing this?

    Read the article

  • Protecting PHP packages on server

    - by Jack
    Hi, I am a PHP developer and have recently decided to make one of my Magento extensions commercial. I have downloaded and configured MageParts CEM Server, and that is all working perfectly with regard to licensing and delivery of module packages. The only issue is that the directory the packages are stored in can be accessed by anyone. I tried this in a .htaccess file, but it is not working: <Files services.wsdl> allow from all </Files> deny from all Clients are receiving a 403 Forbidden response. Have I done something wrong in the .htaccess file, or would there be a better way to secure the directory? Any help would be greatly appreciated.
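
    For reference, a sketch of one arrangement that commonly works in a single Apache 2.2 .htaccess: give the directory-wide deny and the per-file allow each their own Order line, so the file-level allow actually takes effect:

        # default: nobody can fetch the packages
        Order allow,deny
        Deny from all

        # exception: the WSDL that clients must be able to reach
        <Files services.wsdl>
            Order deny,allow
            Allow from all
        </Files>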

    Read the article

  • How to exclude a specific URL from basic authentication in Apache?

    - by ripper234
    Two scenarios. Directory: I want my entire server to be password-protected, so I included this directory config in my sites-enabled/000-default: <Directory /> Options FollowSymLinks AllowOverride None AuthType Basic AuthName "Restricted Files" AuthUserFile /etc/apache2/passwords Require user someuser </Directory> The question is: how can I exclude a specific URL from this? Proxy: I found that the above password protection doesn't apply to mod_proxy, so I added this to my proxy.conf: <Proxy *> Order deny,allow Allow from all AuthType Basic AuthName "Restricted Files" AuthUserFile /etc/apache2/passwords Require user someuser </Proxy> How do I exclude a specific proxied URL from the password protection? I tried adding a new segment: <Proxy http://myspecific.url/> AuthType None </Proxy> but that didn't quite do the trick.
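
    A sketch of the usual Apache 2.2 escape hatch for both cases: Satisfy Any combined with host-based access lets a chosen path through while everything else still demands the password. The /public path is only an example, and the same idea should carry over to the matching <Proxy> section, since Proxy sections take the same access directives:

        # plain directory/location case
        <Location /public>
            Order allow,deny
            Allow from all
            Satisfy Any
        </Location>

        # proxied case
        <Proxy http://myspecific.url/>
            Order allow,deny
            Allow from all
            Satisfy Any
        </Proxy>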

    Read the article

  • Intercept Apache communication

    - by Nathan Adams
    I am looking to develop a solution that eliminates potential spammers. The way this system will work is that it will watch connections and requests. Going into the specifics is more a matter for Stack Overflow, but what I am interested in is whether it is possible to tell Apache to pass the request over to my application first and give it the ability to accept or deny the request. Sure, it will make requests slower, but I think that is a trade-off I am willing to take. I still want Apache to run the request through any interpreters (such as PHP), however. The idea is that one wouldn't have to implement anti-spam measures on a per-app basis but would have an "umbrella" of spam protection.
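
    One lightweight way to sit in front of every request without touching each app (mod_security covers similar ground as an off-the-shelf option) is a mod_perl access handler: it runs before the content handler, so PHP and friends still execute afterwards if the request is accepted. Purely a sketch, with the actual spam check stubbed out and the module name made up:

        # httpd.conf (illustrative):
        #   PerlModule My::SpamGate
        #   <Location />
        #       PerlAccessHandler My::SpamGate
        #   </Location>

        package My::SpamGate;
        use strict;
        use warnings;
        use Apache2::RequestRec ();
        use Apache2::Connection ();
        use Apache2::Const -compile => qw(OK FORBIDDEN);

        sub handler {
            my $r  = shift;
            my $ip = $r->connection->remote_ip;   # client_ip on Apache 2.4

            # stub -- replace with your own connection/request scoring
            return Apache2::Const::FORBIDDEN if looks_like_spammer($ip, $r->uri);

            return Apache2::Const::OK;            # hand off to PHP etc. as usual
        }

        sub looks_like_spammer { return 0 }       # hypothetical placeholder

        1;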

    Read the article

  • Selectively allow NetBIOS inbound traffic

    - by shayan
    This is what I am trying to achieve, from a very high-level point of view: every time someone tries to access my shared folders (on Windows), a popup should open and ask for my permission. Do you know of any tool? Something like "NetShareMonitor" is helpful for monitoring only. A tool like an antivirus these days focuses on outgoing traffic. A normal firewall does not let me decide at the time of the request. Setting user permissions is not an option; I want to allow/deny at the time of the request, even if it is the same user over and over again.

    Read the article
