Search Results

Search found 5578 results on 224 pages for 'transport rules'.


  • How to add exceptions to Apache reverse proxy rules

    - by Tania
    I am trying to set up an Apache reverse proxy so that requests get proxied to another application running on port 8080. However, I want some directories to be served directly rather than forwarded to the proxy. What I want is:

        http://localhost/              -> http://localhost:8080/myapp
        http://localhost/images       -> /var/www/html/images
        http://localhost/anything-else -> http://localhost:8080/myapp/anything-else

    My current httpd.conf is:

        ProxyRequests Off
        ProxyTimeout 600
        ProxyPreserveHost On
        ProxyPass / http://localhost:8080/
        ProxyPassReverse / http://localhost:8080/
        RewriteEngine On
        RewriteRule ^/(.*) http://localhost:8080/VirtualHostBase/http/%{SERVER_NAME}:80/myapp/VirtualHostRoot/$1 [L,P]

    What configuration do I need to make the local-path exception work? Thank you, Tania
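
    One hedged way to carve out the exception, sketched from the paths in the question: mod_proxy checks ProxyPass directives in order, and a "!" target excludes a subtree, so the /images exclusion must appear before the catch-all pass. Note that a [P] RewriteRule proxies independently of ProxyPass, so it either needs its own exception or should be dropped in favor of plain ProxyPass lines.

        # serve /images from disk; the '!' exclusion must precede the catch-all
        Alias /images /var/www/html/images
        ProxyPass /images !
        ProxyPass / http://localhost:8080/myapp/
        ProxyPassReverse / http://localhost:8080/myapp/
        # if the RewriteRule [P] stays, skip /images there too:
        # RewriteCond %{REQUEST_URI} !^/images/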

    Read the article

  • Widespread misinterpretation of DNS rules in resolving wildcards

    - by Dominic Sayers
    [EDITED to add: This problem has gone away on its own. I believe Cloudflare's name resolution may have been to blame. See my own answer below.] Here is a snippet of my zone file:

        *.example.com.   300 IN CNAME proxy.herokuapp.com.
        foo.example.com. 300 IN A     111.111.111.111

    If I dig @8.8.8.8 foo.example.com I get the answer I expect:

        ;; ANSWER SECTION:
        foo.example.com. 30 IN A 111.111.111.111

    The same is true of all other public DNS servers I've tried. However, when I set up a check with Pingdom against a URL on foo.example.com, it instead sends the traffic to my Heroku app referenced by the *.example.com RR. The same is true of checks set up on New Relic and Errplane, and of traffic generated by the Heroku app itself. So on the one side, all public DNS servers interpret the zone file one way; yet four service providers all interpret it another way, one that differs from the standard suggested by RFC 4592 (under which an explicit record takes precedence over a wildcard). My question is: are these reputable, mature service providers all wrong? Or is it little me?

    Read the article

  • Help with rewrite rules

    - by Kyle
    I was wondering what a rewrite statement for this situation would look like. I want to have multiple users on my server, and each user can have 'VirtualDocumentRoot'-like sites in his home directory: for example, a user just makes a directory like 'example.com' in their home directory, and it's hosted. The problem is I don't know whether VirtualDocumentRoot can do this, or whether it would take a rewrite rule that looks in all the users' folders for a domain. Can anybody help me?
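
    One possible design, sketched on the assumption that sites live at /home/<user>/<domain>: VirtualDocumentRoot cannot search several home directories by itself, but it can follow a symlink farm that maps each domain onto its owner's folder (the /var/www/vhosts path and the example user are hypothetical).

        # httpd.conf -- mass virtual hosting via mod_vhost_alias
        UseCanonicalName Off
        <VirtualHost *:80>
            ServerAlias *
            # %0 expands to the full requested hostname, e.g. example.com
            VirtualDocumentRoot /var/www/vhosts/%0
        </VirtualHost>

    Each hosted site is then a symlink such as /var/www/vhosts/example.com -> /home/alice/example.com, created by the user or a cron job; Options +FollowSymLinks must be permitted on the targets.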

    Read the article

  • How to make rewrite rules relative to the .htaccess file

    - by Kendall Hopkins
    Currently I have an .htaccess file like this:

        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f [OR]
        RewriteCond %{REQUEST_URI} ^/(always|rewrite|these|dirs)/ [NC]
        RewriteRule ^(.*)$ router.php [L,QSA]

    It works correctly when the site files are in the document root of the webserver (i.e. domain.com/abc.php -> /abc.php). But in our current setup (which isn't changeable) this isn't guaranteed: we can sometimes have an arbitrary folder between the document root and the folder holding the .htaccess file (i.e. domain.com/something/abc.php -> /something/abc.php). The problem is that the second RewriteCond then no longer works. Is there any way to match the accessed path against a path relative to the .htaccess file? For example:

        If domain.com/rewrite/ is the directory of the .htaccess file:
        NOT FORCED TO REWRITE -> domain.com/rewrite/index.php
        FORCED TO REWRITE     -> domain.com/rewrite/rewrite/index.php

        If domain.com/ is the directory of the .htaccess file:
        NOT FORCED TO REWRITE -> domain.com/index.php
        FORCED TO REWRITE     -> domain.com/rewrite/index.php
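
    A sketch of one way around this: in per-directory (.htaccess) context the RewriteRule pattern is matched against the path relative to the .htaccess directory, unlike %{REQUEST_URI}, which is always absolute. Moving the directory test from the RewriteCond into the rule pattern itself makes the whole file position-independent (directory names taken from the question).

        RewriteEngine On
        # these paths always rewrite; the pattern is relative to this .htaccess
        RewriteRule ^(always|rewrite|these|dirs)/ router.php [L,QSA]
        # everything else rewrites only when no real file matches
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^ router.php [L,QSA]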

    Read the article

  • Help with Apache RewriteEngine rules

    - by Vinay
    Hello - I am trying to write a simple rewrite rule using the RewriteEngine in Apache. I want to redirect all traffic destined for a website unless the traffic originates from a specific IP address and the URI contains two specific strings:

        RewriteEngine On
        RewriteLog /var/log/apache2/rewrite_kudithipudi.log
        RewriteLogLevel 1
        RewriteCond %{REMOTE_ADDR} ^199\.27\.130\.105
        RewriteCond %{REQUEST_URI} !/StringOne [NC, OR]
        RewriteCond %{REQUEST_URI} !/StringTwo [NC]
        RewriteRule ^/(.*) http://www.google.com [R=302,L]

    I put these statements in my virtual host configuration, but the RewriteEngine seems to redirect all requests, whether they match the conditions or not. Am I missing something? Thank you. Vinay.
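
    Assuming the intent is "redirect unless the IP matches AND the URI contains both strings", two details stand out in a sketch of a fix: Apache rejects the space inside [NC, OR] as a flag-syntax error, and by De Morgan's law every condition in the exception has to be negated and OR-ed together, including the address test.

        RewriteEngine On
        # redirect unless: address matches AND URI has StringOne AND URI has StringTwo
        RewriteCond %{REMOTE_ADDR} !^199\.27\.130\.105$ [OR]
        RewriteCond %{REQUEST_URI} !/StringOne [NC,OR]
        RewriteCond %{REQUEST_URI} !/StringTwo [NC]
        RewriteRule ^/(.*) http://www.google.com [R=302,L]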

    Read the article

  • Apache Rewrite Rules breaking each other?

    - by neezer
    I have this rule:

        RewriteCond %{REQUEST_URI} ^/(manhattan|queens|westchester|new-jersey|bronx|brooklyn)-apartments/.*$
        RewriteCond %{REQUEST_URI} !^/guide/(.*)$
        RewriteRule ^(.*)$ /home/neezer/public-html/domain.com/guide/$1 [L]

    which works great on its own. Essentially, I have a bunch of directories full of files that I want to keep in the "/guide" folder, but I want them to appear at the web root for SEO reasons. This rule works, but unfortunately the original URLs still work too (with "/guide"). I want to 301-redirect the ones with "/guide" in the URL to those without, without actually moving the files on the server. I tried adding this rule:

        RewriteCond %{REQUEST_URI} ^/guide/(manhattan|queens|westchester|new-jersey|bronx|brooklyn)-apartments/.*$
        RewriteRule ^guide/(.*)$ http://www.domain.com/$1 [R=301,L]

    ...but that breaks my first rule completely. Any thoughts about what I might be doing wrong? Please let me know if you need anything else from me to help with this issue.
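
    A sketch of the usual fix: once the first rule rewrites a root-level URL into /guide/..., the redirect rule sees the rewritten path and fires again, so the pair loops. Matching the redirect against %{THE_REQUEST} (the raw client request line, which internal rewrites never change) restricts it to URLs the client actually asked for; the domain and borough list are from the question.

        # 301 only when the client itself asked for /guide/...
        RewriteCond %{THE_REQUEST} \ /guide/(manhattan|queens|westchester|new-jersey|bronx|brooklyn)-apartments/
        RewriteRule ^guide/(.*)$ http://www.domain.com/$1 [R=301,L]

        # then serve the root-level URLs from the guide folder internally
        RewriteCond %{REQUEST_URI} !^/guide/
        RewriteRule ^((manhattan|queens|westchester|new-jersey|bronx|brooklyn)-apartments/.*)$ /guide/$1 [L]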

    Read the article

  • Do two port-forward rules translate to "and"?

    - by blsub6
    I just set up an Exchange server to replace my DeskNow mail server, and I want to start testing the internet mail exchange of the Exchange server. I can only point the MX records in my DNS at my one external IP address, so I was thinking I could set up a rule on my internet-facing firewall that port-forwards the SMTP packets to two different servers. My question is: if I do that, will the SMTP packets be forwarded to just the first internal IP on the list? Or will the packet be cloned and sent to both IPs?

    Read the article

  • Advanced Firewall: Adding a Program to the Inbound Rules

    - by bvanderw
    I am writing an application that contains a web server running on port 50000. On Windows 7, short of turning the firewall off completely, I am having trouble configuring the firewall to allow other computers on the same private LAN to connect to the server. Simply adding the program to the allowed programs list doesn't seem to work. The network connection is set to be a "Home" network (but I am not using Home Networking). Can anyone suggest where I should be looking to troubleshoot this? Bruce
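
    A hedged alternative to a program rule is a plain port rule, added with the stock netsh interface (the rule name is arbitrary; port 50000 and the private profile come from the question). A port rule also sidesteps the common pitfall where the allowed executable path doesn't exactly match the process that actually owns the listening socket.

        netsh advfirewall firewall add rule name="MyApp web server" dir=in action=allow protocol=TCP localport=50000 profile=private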

    Read the article

  • iptables firewall rules not allowing SSH from LAN to DMZ

    - by ageis23
        Chain INPUT (policy ACCEPT)
        target      prot opt source              destination
        REJECT      tcp  --  anywhere            anywhere             tcp dpt:www reject-with tcp-reset
        REJECT      tcp  --  anywhere            anywhere             tcp dpt:telnet reject-with tcp-reset
        ACCEPT      0    --  anywhere            anywhere             state RELATED,ESTABLISHED
        DROP        udp  --  anywhere            anywhere             udp dpt:route
        DROP        udp  --  anywhere            anywhere             udp dpt:route
        ACCEPT      udp  --  anywhere            anywhere             udp dpt:route
        logdrop     icmp --  anywhere            anywhere
        logdrop     igmp --  anywhere            anywhere
        ACCEPT      udp  --  anywhere            anywhere             udp dpt:5060
        ACCEPT      0    --  anywhere            anywhere             state NEW
        logaccept   0    --  anywhere            anywhere             state NEW
        ACCEPT      0    --  anywhere            anywhere
        ACCEPT      0    --  anywhere            anywhere
        ACCEPT      0    --  anywhere            anywhere
        logdrop     0    --  anywhere            anywhere

        Chain FORWARD (policy ACCEPT)
        target      prot opt source              destination
        REJECT      0    --  192.168.0.0/24      192.168.2.0/24       reject-with icmp-port-unreachable
        ACCEPT      tcp  --  choister            192.168.2.142        tcp dpt:ssh state NEW
        REJECT      0    --  192.168.0.0/24      192.168.3.0/24       reject-with icmp-port-unreachable
        ACCEPT      gre  --  192.168.1.0/24      anywhere
        ACCEPT      tcp  --  192.168.1.0/24      anywhere             tcp dpt:1723
        ACCEPT      0    --  anywhere            anywhere
        ACCEPT      0    --  anywhere            anywhere
        ACCEPT      0    --  anywhere            anywhere
        ACCEPT      0    --  anywhere            anywhere
        TCPMSS      tcp  --  anywhere            anywhere             tcp flags:SYN,RST/SYN TCPMSS clamp to PMTU
        lan2wan     0    --  anywhere            anywhere
        ACCEPT      0    --  anywhere            anywhere             state RELATED,ESTABLISHED
        logaccept   tcp  --  anywhere            choister             tcp dpt:www
        TRIGGER     0    --  anywhere            anywhere             TRIGGER type:in match:0 relate:0
        trigger_out 0    --  anywhere            anywhere
        logaccept   0    --  anywhere            anywhere             state NEW
        logdrop     0    --  anywhere            anywhere

    The SSH server I'm trying to connect to is in the DMZ (192.168.0.145); it's mainly used as a web server. I need access to it from my room, 192.168.2.142. I don't get why SSH traffic can't be forwarded back onto the 192.168.2.0 subnet. I'm sure it's the REJECT rule that's causing this, because it works without it.
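
    A sketch of one likely cause and fix: iptables evaluates FORWARD top-down and stops at the first match, and the very first rule rejects everything from the DMZ subnet toward 192.168.2.0/24, which kills the SSH replies before the later RELATED,ESTABLISHED accept is ever reached. Inserting a state rule ahead of the reject should be enough (addresses taken from the question).

        # let replies to established connections through before the subnet reject
        iptables -I FORWARD 1 -m state --state ESTABLISHED,RELATED -j ACCEPT
        # allow new SSH from the room's subnet to the DMZ host
        iptables -I FORWARD 2 -s 192.168.2.0/24 -d 192.168.0.145 -p tcp --dport 22 -j ACCEPT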

    Read the article

  • Amazon EC2 firewall rules & VPN connections

    - by John
    I'm moving from Rackspace to Amazon EC2. One thing I like about our Rackspace setup is that it is extremely secure. The MySQL box can only be accessed via internal IPs, and we have a Cisco VPN firewall that allows us to dial in remotely and access port 3306 as though we were on the internal network. I'd like to figure out how to replicate this setup with EC2. How can I make the MySQL box so that port 3306 can only be accessed on the internal network? What about the VPN piece of things? I know Amazon has the VPC service, but it seems like that's for the purpose of connecting to an existing network. I don't have an existing network. I want to essentially create one inside Amazon and connect to that. What are my options? Any good tutorials on how to get started? Thanks in advance for your help
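
    A sketch using EC2 security groups, which can name another group as the traffic source, roughly the analogue of "internal network only" (both group IDs below are hypothetical placeholders):

        # allow MySQL only from instances in the web tier's security group
        aws ec2 authorize-security-group-ingress --group-id sg-11111111 --protocol tcp --port 3306 --source-group sg-22222222

    For the dial-in piece, one option is a small VPN instance (e.g. OpenVPN) whose security group is likewise allowed on 3306; VPC is what provides the private-subnet layout even when there is no existing network to connect to.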

    Read the article

  • Ignore subdomain rewrite rules

    - by user55745
    I'm having difficulty making a subfolder act differently from the main domain in my web.config for IIS. I want to prevent the subfolder from rewriting to the base-level index.php and instead rewrite to /subfolder/index.php/. I've tried this:

        <rule name="Remove index.php for quiz" enabled="true" stopProcessing="false">
          <match url="^(gsoquiz/)(.*)$" ignoreCase="false" trackAllCaptures="false" />
          <conditions logicalGrouping="MatchAll">
            <add input="{R:1}" negate="true" pattern="^(index\.php|admin\.php)" />
          </conditions>
          <action type="Rewrite" url="/gsoquiz/index.php/{R:1}" />
        </rule>

    But all I get is: "The page cannot be displayed because an internal server error has occurred." Any help as to where I'm going wrong would be greatly appreciated. Going mad trying to figure this out :).
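
    A sketch of a possible correction, assuming the goal is to route everything under /gsoquiz/ through its own front controller: in the original, {R:1} is the first capture group (the literal "gsoquiz/"), so the negate condition never tests the file part, and the rewrite feeds the prefix straight back into the rule.

        <rule name="Remove index.php for quiz" stopProcessing="true">
          <match url="^gsoquiz/(.*)$" ignoreCase="false" />
          <conditions logicalGrouping="MatchAll">
            <add input="{R:1}" negate="true" pattern="^(index\.php|admin\.php)" />
          </conditions>
          <action type="Rewrite" url="/gsoquiz/index.php/{R:1}" />
        </rule>

    With the prefix outside the capture, {R:1} is just the part after gsoquiz/, and stopProcessing="true" keeps later site-wide rules from rewriting the result again.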

    Read the article

  • Rules for setting hostname [duplicate]

    - by Ilia Rostovtsev
    This question already has answers here: "Hostnames - What are they all about?" (5 answers) and "Setting the hostname: FQDN or short name?" (6 answers). It is commonly said that the FQDN should be used as the hostname. I have doubts about whether using host.domain.tld and domain.tld for the hostname is the same thing and whether both are equally correct/acceptable. I'm willing to use domain.tld for the hostname. Is that alright?

    Read the article

  • Logging violations of rules in limits.conf

    - by PaulDaviesC
    I am trying to log the details of programs that failed due to the limits defined in limits.conf. My initial plan was to do it using the audit system: the idea was to track failed system calls related to the limits in limits.conf. The problem with this approach is that it cannot track violations of CPU time, since that violation does not involve a failing system call; instead, a program that exceeds its CPU time is delivered a SIGXCPU. So my question is: how should I go about logging the programs that violated CPU time? Also, are there any limits.conf-specific logs available?
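
    For the syscall side, a sketch of an audit rule that catches one common violation, open() failing with EMFILE once RLIMIT_NOFILE is exhausted (the key name is arbitrary, and on 64-bit systems openat may need the same treatment; CPU-time violations still require another mechanism, such as a SIGXCPU handler or process accounting):

        # log every open() that fails because the per-process file limit was hit
        auditctl -a always,exit -S open -F exit=-EMFILE -k rlimit-nofile

    ausearch -k rlimit-nofile then lists the offending processes.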

    Read the article

  • robots.txt file with more restrictive rules for certain user agents

    - by Carson63000
    Hi, I'm a bit vague on the precise syntax of robots.txt, but what I'm trying to achieve is: tell all user agents not to crawl certain pages; tell certain user agents not to crawl anything (basically, some pages with enormous amounts of data should never be crawled; and some voracious but useless search engines, e.g. Cuil, should never crawl anything). If I do something like this:

        User-agent: *
        Disallow: /path/page1.aspx
        Disallow: /path/page2.aspx
        Disallow: /path/page3.aspx

        User-agent: twiceler
        Disallow: /

    ...will it flow through as expected, with all user agents matching the first rule and skipping page1, page2 and page3, and twiceler matching the second rule and skipping everything?

    Read the article

  • How to forward traffic using iptables rules?

    - by ProbablePattern
    I am new to iptables and I have been doing Google searches for a few days now without finding a good solution to this problem. I have computer A with a public IP address (say 192.0.2.1) that can access the Internet unrestricted. I have another computer B with a private IP address (192.168.1.1) that can only access computer A. How do I use iptables to forward network traffic from B through A to the Internet? I need http, ftp, and https to work so that I can run apt-get with sudo. Both computers run Ubuntu Linux. I have tried Squid, but I think it is far too complicated for what I need to do.
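
    A sketch of plain NAT on computer A, assuming eth0 is its public interface (the interface name is an assumption):

        # on A: enable forwarding and masquerade B's subnet out the public side
        sysctl -w net.ipv4.ip_forward=1
        iptables -t nat -A POSTROUTING -o eth0 -s 192.168.1.0/24 -j MASQUERADE
        iptables -A FORWARD -s 192.168.1.0/24 -j ACCEPT
        iptables -A FORWARD -d 192.168.1.0/24 -m state --state ESTABLISHED,RELATED -j ACCEPT

    On B, point the default route at A's private address (ip route add default via <A's LAN IP>); apt-get's http, https and ftp traffic then follows it with no proxy involved.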

    Read the article

  • Postfix $smtpd_banner rules

    - by horen
    For monitoring purposes I would like to add the IP address to the Postfix smtpd_banner: smtpd_banner = $myhostname ESMTP $smtp_bind_address which works and outputs: 220 mail.mydomain.com ESMTP 123.456.789.0 Now I am wondering if there are any (negative) repercussions to expect. I couldn't find anything about it in the RFC docs. The Postfix docs add another parameter ($mail_name) in their example, so I think I am fine. I just want to verify that my syntax is correct and is allowed.
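
    For what it's worth, RFC 5321 defines the greeting as "220 " Domain [ SP textstring ] CRLF, so arbitrary text after the hostname (an IP address included) is explicitly permitted; the $mail_name example in the Postfix docs relies on exactly this.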

    Read the article

  • Propagating Apache rules automatically to all folders beneath root [closed]

    - by Sam
    Hi folks, my webroot folder /httpdocs contains a .htaccess file whose first lines look like this:

        RewriteEngine on
        RewriteBase /
        AddDefaultCharset UTF-8
        Options +FollowSymLinks -Indexes -ExecCGI
        # DirectoryIndex index.php /index.php
        # ServerSignature Off

    Now I want all the settings I have made here to be propagated automatically to the other folders as well. How can I do that? Thanks very much for suggestions.
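
    For what it's worth, per-directory .htaccess directives already propagate: everything beneath /httpdocs inherits these lines unless a deeper .htaccess or the server configuration overrides them, so normally nothing extra is needed beyond AllowOverride permitting the directives in question.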

    Read the article

  • Configuring iptables rules for HAProxy and others

    - by MLister
    I have the following relevant settings for HAProxy:

        defaults
            log global
            mode http
            option httplog
            option dontlognull
            retries 3
            option redispatch
            maxconn 500
            contimeout 5s
            clitimeout 15s
            srvtimeout 15s

        frontend public
            bind *:80
            option http-server-close
            option http-pretend-keepalive
            option forwardfor
            # ACLs ...

    I have three backends (including an Nginx server) configured in HAProxy, all listening on different ports of 127.0.0.1. And my iptables config is this:

        *filter

        # Allows all loopback (lo0) traffic and drop all traffic to 127/8 that doesn't use lo0
        -A INPUT -i lo -j ACCEPT
        -A INPUT ! -i lo -d 127.0.0.0/8 -j REJECT

        # Accepts all established inbound connections
        -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

        # Allows all outbound traffic
        # You can modify this to only allow certain traffic
        -A OUTPUT -j ACCEPT

        # Allows HTTP and HTTPS connections from anywhere (the normal ports for websites)
        -A INPUT -p tcp --dport 80 -j ACCEPT
        -A INPUT -p tcp --dport 443 -j ACCEPT

        # Allows SSH connections
        # THE -dport NUMBER IS THE SAME ONE YOU SET UP IN THE SSHD_CONFIG FILE
        -A INPUT -p tcp -m state --state NEW --dport 22 -j ACCEPT

        # Allow ping
        -A INPUT -p icmp -m icmp --icmp-type 8 -j ACCEPT

        # log iptables denied calls
        -A INPUT -m limit --limit 5/min -j LOG --log-prefix "iptables denied: " --log-level 7

        # Reject all other inbound - default deny unless explicitly allowed policy
        -A INPUT -j REJECT
        -A FORWARD -j REJECT

        COMMIT

    My questions are: would the above iptables config work with the settings/options in my HAProxy config? I am also running a Postgres and a Redis server on the same machine; what settings do I need to adjust for these two to work with iptables?
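
    On the Postgres/Redis question, a sketch under the assumption that both serve only local clients: since they bind to 127.0.0.1 and the first rule accepts everything on lo, no change is needed; lines like these only become necessary if remote hosts must reach them (the trusted address is a placeholder):

        # only if remote access is required; loopback traffic is already allowed
        -A INPUT -p tcp -s 203.0.113.10 --dport 5432 -j ACCEPT
        -A INPUT -p tcp -s 203.0.113.10 --dport 6379 -j ACCEPT

    The HAProxy side looks covered as-is: port 80 is open for the frontend, and traffic to the 127.0.0.1 backends travels over lo.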

    Read the article

  • Apache Rewrite Rules

    - by Philip
    I have moved my website from a wiki to WordPress and in the process realised that I have broken links to some popular pages on my website. Is it possible to fix this with a rewrite rule? I need the rule to redirect anything beginning with "^/wiki/(.+)$" to "/$1", but also to replace the "_" character used in MediaWiki slugs with the "-" used in WordPress slugs. For example:

        http://example.com/wiki/An_Example_Page

    should be pointed to:

        http://example.com/an-example-page

    Is it possible to write such a rewrite rule? Edit: It appears that WordPress doesn't even care if the "/wiki/" part is removed - provided the slug matches, and that seems to be case-insensitive too. So all I need to do is change the "_" characters to "-" in the slugs.
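
    A sketch using mod_rewrite's [N] (restart) flag to swap underscores for hyphens one at a time, then issue a single redirect once none remain (per-directory .htaccess context assumed; no lowercasing step, since WordPress matches slugs case-insensitively per the edit above):

        RewriteEngine On
        # while an underscore remains after wiki/, replace the first one and restart
        RewriteRule ^(wiki/[^_]*)_(.*)$ $1-$2 [N]
        # no underscores left: drop the /wiki/ prefix with one 301
        RewriteRule ^wiki/(.+)$ /$1 [R=301,L]

    The [N] passes are internal, so the client sees only the one 301 to the final hyphenated URL.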

    Read the article

  • .htaccess rules not working, but the file seems to be loaded

    - by user221877
    I am trying to remove .php at the end of the URL for any page that's loaded:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}.php -f
        RewriteRule ^(.*)$ $1.php

    It's running on my own server, which has WHM/cPanel, so I can change settings at the server level; I'm just not really sure what I'm looking for. I found the httpd.conf file, but it said it was auto-generated by WHM, so I tried looking in WHM for the correct settings, but it had barely any settings related to .htaccess. If I fill the .htaccess with gibberish it stops the site from loading, which I assume means the file is being read, so I'm not sure what the issue is.
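
    A sketch of a way to watch what mod_rewrite actually does, assuming Apache 2.2 (typical under WHM of that era), where rewrite logging is a server-level directive; on Apache 2.4 the equivalent is LogLevel rewrite:trace3:

        # in the vhost or a WHM custom include (not in .htaccess)
        RewriteLog /usr/local/apache/logs/rewrite.log
        RewriteLogLevel 3

    If requests never appear in the log, an earlier rule (cPanel-generated includes are a common culprit) is short-circuiting the file; if they do appear, the trace shows which condition fails.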

    Read the article

  • iptables rules not blocking

    - by psychok7
    So I am trying to allow SSH access for a certain range of IPs (from 192.168.1.1 to 192.168.1.24) and block all the rest, but since I am new to iptables I can't seem to figure it out. I have:

        iptables -A INPUT -s 192.168.1.0/24 -p udp --dport ssh -j ACCEPT
        iptables -A INPUT -s 192.168.1.0/24 -p tcp --dport ssh -j ACCEPT
        iptables -A INPUT -p tcp --dport ssh -j REJECT
        iptables -A INPUT -p udp --dport ssh -j REJECT

    But this does not work: from a VM set to 192.168.1.89 I can still access the machine through SSH. Can someone help?
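
    A sketch of the likely issue and fix: 192.168.1.0/24 covers the whole subnet (.1 through .254), so .89 matches the ACCEPT before the REJECT is ever reached; matching an exact range takes the iprange module instead of CIDR notation (and SSH is TCP-only, so the UDP rules can go):

        # accept only .1-.24, then reject everyone else
        iptables -A INPUT -p tcp --dport 22 -m iprange --src-range 192.168.1.1-192.168.1.24 -j ACCEPT
        iptables -A INPUT -p tcp --dport 22 -j REJECT

    If rules from earlier experiments are still loaded, flush first (iptables -F INPUT) or review iptables -L -n --line-numbers, since an older ACCEPT higher in the chain still wins.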

    Read the article

  • IIRF redirect combine rules?

    - by Phill
    I have 3 "rules": one to make sure URLs are lowercase, another to include a slash at the end of directories, and a third to force access to index.html pages to go through the directory instead. The problem with how I have it is that sometimes this causes multiple 301 redirects. I'd really like each rule to apply in turn and then, if necessary, redirect once to the final URL. For example, a URL might need to be converted to lowercase and have a slash added, or may need to be lowercased and changed from index.html to a directory request. Any ideas how I can do this? Thanks very much. The rules are below:

        #LOWERCASE URLS For Directories, aspx, html files
        RedirectRule ^/(.*[A-Z].*(/|\.html|\.aspx))$ /#L$1#E [R=301]

        #ADD SLASH TO DIRECTORIES
        #---------------------------------------------
        #Perm Redirect If:
        #Starts w/ Forward Slash
        #Match Any Characters Except (. or ?) 1 or more times
        #End w/ something besides a dot, ?, or slash
        #If So, Perm Redirect captured piece W/ Slash At End and at front
        RedirectRule ^/([^.?]+[^.?/])$ /$1/ [I,R=301]

        #CHANGE INDEX.HTML REQUESTS TO DIRECTORY REQUESTS
        #---------------------------------------------
        RedirectRule ^/(.*)/index\.html$ /$1/ [I,R=301]

    Read the article

  • Rules engine for spatial and temporal reasoning?

    - by John
    I have an application that receives a number of datums that characterize spatial/temporal processes. It then filters these datums and creates actions, which are then sent to processes that perform the actions. Rinse and repeat. At present, I have a collection of custom filters that perform a lot of complicated spatial/temporal calculations. Many times, as I describe my system to individuals in my company, they ask if I'm using a rules engine. I have yet to find a rules engine that is able to reason well temporally and spatially. (Things like: are two entities ever close? Is entity A ever in region B? If entity C is near entity D but oriented backwards relative to C, then perform action D.) I have looked at Drools, Cyc and Jess in the past (say 3-4 years ago). It's time to re-examine the state of the art. Any suggestions? Any standards that you know of that support this kind of reasoning? Any de facto standards? Any applications? Thanks!

    Read the article

  • Excluding a script from the general UrlRewrite rules

    - by Steven
    Hi, I have the following rewrite rules for a website:

        RewriteEngine On

        # Stop reading config files
        RewriteCond %{REQUEST_FILENAME} .*/web.config$ [NC,OR]
        RewriteCond %{REQUEST_FILENAME} .*/\.htaccess$ [NC]
        RewriteRule ^(.+)$ - [F]

        # Rewrite to url
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !^(/bilder_losning/|/bilder/|/gfx/|/js/|/css/|/doc/).*
        RewriteRule ^(.+)$ index.cfm?smartLinkKey=%{REQUEST_URI} [L]

    Now I have to exclude a script, including its eventual query strings, from the above rules, so that I can access and execute it the normal way; at the moment the whole URL is being ignored and forwarded to the index page. I need access to the script shoplink.cfm in the root, which takes the variables tduid and url (shoplink.cfm?tduid=1&url=). I have tried to resolve it using this:

        # maybe?: RewriteRule !(^/shoplink.cfm [QSA]

    but to be honest, I don't have much of a clue about URL rewriting and no idea what I am supposed to write; I just know the above will generate a nice 500 error. I have been looking around a lot on Stack Overflow and other websites on the same subject, but all I see is people trying to exclude directories, not files. In the worst case I could move the script to a separate directory and exclude the directory from the rewrite rules, but I'd rather not, since the script should really remain in the root. Anyone who can help me out with this? Thanks in advance. Steven Esser, ColdFusion programmer
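
    A sketch of a small fix: one extra negated condition on the catch-all rule excludes the script, and since no rewrite then touches it, its tduid and url query parameters arrive intact. (The !-f test would normally already skip an existing file; the explicit REQUEST_URI check is the belt-and-braces version if REQUEST_FILENAME isn't resolving as expected.)

        # Rewrite to url -- but never for /shoplink.cfm
        RewriteCond %{REQUEST_URI} !^/shoplink\.cfm [NC]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !^(/bilder_losning/|/bilder/|/gfx/|/js/|/css/|/doc/).*
        RewriteRule ^(.+)$ index.cfm?smartLinkKey=%{REQUEST_URI} [L]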

    Read the article
