Search Results

Search found 62215 results on 2489 pages for 'http basic authentication'.

Page 179 of 2489

  • HTTP ProxyPass for a subdomain

    - by enedebe
    I'm trying to set up a proxy on my gateway so that everything that arrives for a subdomain, for example sub.mydomain.com, goes to an internal server on port 3000. I'm installing a Redmine server inside my network that has to be reachable from outside. Any idea how to do that? I'm thinking of httpd with ProxyPass, but I don't know how to match just that subdomain and proxy it. My gateway is currently a ClearOS machine. Thanks
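    A minimal sketch of the usual httpd approach on the gateway: a name-based virtual host whose ServerName matches only the subdomain, with mod_proxy forwarding to the internal machine. The internal IP below is a placeholder, and mod_proxy plus mod_proxy_http must be loaded.

      # Catches only requests whose Host header is sub.mydomain.com
      <VirtualHost *:80>
          ServerName sub.mydomain.com
          ProxyRequests Off
          ProxyPreserveHost On
          ProxyPass        / http://192.168.1.50:3000/
          ProxyPassReverse / http://192.168.1.50:3000/
      </VirtualHost>

    Requests for any other hostname keep falling through to whatever the default virtual host does.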

    Read the article

  • How can I reroute a sub-domain to localhost + port number?

    - by urig
    I have several web applications running on my developer machine. They mimic our production web applications, which are hosted on sub-domains. For example:
    api.myserver.com - mimicked by 127.0.0.1:8000
    www.myserver.com - mimicked by 127.0.0.1:8008
    and so on. How can I make it so that, on my Windows 7 machine, HTTP calls to "api.myserver.com" (note the lack of a port number) are redirected to 127.0.0.1:8000, etc.? Note that this needs to apply both to client-side calls (in the browser) and server-side calls (from IIS to the Python development server and vice versa). Do I need a proxy running locally to achieve this? Can you recommend such a tool?
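    For what it's worth, the hosts file can map the hostnames to the loopback address but cannot change the port, so the usual pattern is hosts entries plus a local reverse proxy listening on port 80 that routes by Host header. nginx is shown below purely as one option; Fiddler or IIS with ARR can play the same role. The ports and hostnames are the ones from the question.

      # C:\Windows\System32\drivers\etc\hosts  (no port numbers allowed here)
      127.0.0.1    api.myserver.com
      127.0.0.1    www.myserver.com

      # Local reverse proxy on port 80, routing by Host header
      server {
          listen 80;
          server_name api.myserver.com;
          location / { proxy_pass http://127.0.0.1:8000; proxy_set_header Host $host; }
      }
      server {
          listen 80;
          server_name www.myserver.com;
          location / { proxy_pass http://127.0.0.1:8008; proxy_set_header Host $host; }
      }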

    Read the article

  • HTTPS vs. VPN for communication between business partners?

    - by Andrew H
    A business partner has asked to set up a site-to-site VPN just so that a few servers can communicate with each other over HTTPS. I'm convinced this isn't necessary, or even desirable. To be fair it must be part of a wider policy, potentially even a legal requirement. However I'd like to convince them to simply offer an IP to us (and us only) and a port of their choosing for HTTPS. Has anyone had a similar experience, or had to come up with a cast-iron argument against a VPN? Allow me to expand a little - we have a web service that initiates a connection to the partner's corresponding service using an encrypted HTTP connection. The connection uses a client certificate to authenticate. The connection is firewalled so only our IPs can contact the service. So why is a VPN necessary?

    Read the article

  • Proxy service like Apache httpd

    - by Aptos
    Currently I'm trying to simulate my app as distributed servers, so I run two instances on localhost:9000 and localhost:9001. I tried the Apache load balancer, but it is really hard to configure on a Mac. My idea is that the second server (localhost:9001) is kept idle, and requests are only redirected to it when the first server is down. Is there any good free program that can do that (other than Apache httpd)? Extra question: my application is written in Java and maintains an in-memory object. Is there any service that can synchronize that object between the two servers, so each keeps an up-to-date view of the other's state (the second one takes over the state of the first)? Thank you very much.
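    One lightweight option outside Apache httpd is nginx, whose upstream module can mark a server as a backup that only receives traffic while the primary is unreachable; HAProxy can express the same thing. A minimal sketch using the ports from the question:

      upstream myapp {
          server 127.0.0.1:9000;           # primary
          server 127.0.0.1:9001 backup;    # only used while the primary is down
      }
      server {
          listen 80;
          location / {
              proxy_pass http://myapp;
          }
      }

    The in-memory state question is really a separate problem on the Java side; clustering libraries such as Hazelcast or Terracotta are commonly used for replicating objects between JVMs, independently of whichever proxy sits in front.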

    Read the article

  • Domain redirection to port on Windows Server 2008

    - by Rauffle
    I have a Windows server running IIS. I wish to run a piece of software that hosts a web interface on a non-standard HTTP port (let's say port 9999). I have static DNS entries on my router for two FQDNs, both of which point to the Windows server. I want requests for 'website1' to continue to go to the IIS website on port 80, but requests for 'website2' to go to port 9999 instead, to be handled by the other application. How can I accomplish this? Right now I can get to the application by going to 'website1:9999' or 'website2:9999'.
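    One common way to do this on Windows Server 2008 is to keep IIS answering port 80 for both host names and reverse-proxy the website2 host header to port 9999, using the URL Rewrite module with Application Request Routing (ARR) installed and its proxy feature enabled. A rough sketch of the rule in that site's web.config (the host name is the placeholder from the question):

      <system.webServer>
        <rewrite>
          <rules>
            <rule name="website2 to port 9999" stopProcessing="true">
              <match url="(.*)" />
              <conditions>
                <add input="{HTTP_HOST}" pattern="^website2$" />
              </conditions>
              <action type="Rewrite" url="http://localhost:9999/{R:1}" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>

    With that in place, website1 keeps hitting the normal IIS site on port 80 and website2 is proxied to the other application without the :9999 in the address bar.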

    Read the article

  • Setting up a simple proxy

    - by waiwai933
    I'm going to China for a week, and I'd prefer to be able to watch YouTube while I'm there. Since it's blocked, I presume I'm going to need a proxy. I have a Mac and a Linux box at home that I can use, but I'm not sure how complicated setting up a proxy is. From what I understand, I should be able to do it with a browser that supports HTTP 1.1 CONNECT if I connect to my machine at home. Can I do this, and if so, what browser can I use, or if I have misunderstood something, do I have any other simple solutions?
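    One simple route that avoids setting up an HTTP proxy at all is SSH dynamic port forwarding, since either the Mac or the Linux box at home can run an SSH server: the laptop opens a local SOCKS proxy that tunnels everything through the home machine. The hostname and port below are placeholders.

      # On the laptop abroad: local SOCKS proxy on port 8080, tunnelled home
      ssh -D 8080 -N user@home.example.com

      # Then point the browser at it, e.g. in Firefox:
      # Preferences -> Advanced -> Network -> Settings -> Manual proxy,
      # SOCKS Host 127.0.0.1, Port 8080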

    Read the article

  • Can Squid 2.7 proxy gzipped content?

    - by Tom Styles
    We have a forward proxy for our network which is Squid 2.7. This is managed for us by a third party. We noticed recently that http requests going from our network to the web were having the Accept-Encoding header removed. This was resulting in all web traffic across our network (approx 8000+ PCs) being uncompressed even though the browsers and server on each end were capable. We have asked the third party to look into this and they have said it is because Squid 2.7 does not support compression. I understand this to be true but I was under the impression that the compression happened on the webserver rather than the proxy. So... Can Squid 2.7 proxy and/or cache content that is gzipped? If it can, how/why might it be configured such that the Accept-Encoding header is being removed?
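    Squid itself has no problem proxying or caching gzip-compressed responses; the compression is negotiated end-to-end between browser and web server through Accept-Encoding and Content-Encoding, and Squid 2.7 simply cannot add compression of its own. If Accept-Encoding is vanishing from forwarded requests, one likely culprit is an explicit header-filtering rule in squid.conf, something along the lines of this (hypothetical) Squid 2.x directive:

      # Would strip Accept-Encoding from every request Squid forwards
      header_access Accept-Encoding deny all

    Asking the third party whether their configuration contains header_access (or header_replace) rules for Accept-Encoding would be a reasonable next step.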

    Read the article

  • Windows 7 image in VMware allows network connections out but not HTTP

    - by Ormis
    I am currently trying to create a set of images to deploy on my network, but I've run into a snag. When I create my own Windows 7 image, I can successfully use NAT to connect to the network, but whenever I try to access a webpage I get nothing. To be more specific: all firewalls/iptables are disabled on my host machine, my virtual machine, and my network. I can do lookups and all addresses resolve correctly (I'm even using Google's DNS). On the host OS I have full connectivity. On the virtual machine I can ping any device I want and all addresses resolve correctly. Within a browser I cannot reach any page, via hostname or IP. It feels almost like port 80 is being blocked, but I can't find any reason this would be the case. If anyone has had this occur before, I would love some insight into the problem. I initially asked this on Stack Overflow, and my eyes have now been opened to Super User. Thank you for any help you can provide.
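    One way to narrow this down from inside the guest is to take the browser out of the equation and test port 80 and the proxy settings directly; if a raw TCP connection works but the browser does not, the problem is usually a proxy or WinHTTP setting rather than the NAT. A couple of hypothetical checks from a command prompt in the guest:

      rem Raw TCP test (the Telnet client may need to be enabled in Windows 7 features)
      telnet www.google.com 80

      rem Any system-level proxy the VM image inherited?
      netsh winhttp show proxy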

    Read the article

  • Random timeouts now and then

    - by KenavR
    Maybe this is too generic a question, but since we have had this issue for quite a while now, I'll give it a shot. We have some applications that use HTTP for the connection between the client (website or fat client) and the server. The computers that run these applications are in a network behind a firewall and a proxy; the server isn't inside the same network. The problem is that every now and then the HTTPS request times out, and depending on the client, the application "hangs" or does some other funky stuff. The problem is definitely inside our network, because if I try the applications outside our network they work fine. Can you give me a hint as to where I can most likely find the problem?

    Read the article

  • HTTP transfer speeds start fast, then slow to a crawl

    - by AnITAdmin
    We just got a new dedicated 1 gigabit server running IIS. The CPU is at 15% or less, and of the 4 GB of RAM, 3 GB is unused. We are pushing 110 Mbit/s in total, yet individual transfers are really slow. In fact, here's how it happens: we connect, the speed is fast at first, and then it quickly declines to 40 kB/s or less. What's going on? It seems the server just won't go above 120 Mbit/s. The files are all very large, 50 MB to 500 MB. Could this be a factor? Again, CPU, RAM, and UI responsiveness when accessing the server remotely all seem fine.

    Read the article

  • htaccess FilesMatch exclusion

    - by Hikari
    I have the following directive in my htaccess:

      <filesMatch "\.(gif|jpe?g|png|js|css|swf|php|ico|txt|pdf|xml|html?)$">
      FileETag None
      <ifModule mod_headers.c>
      Header unset ETag
      Header set Cache-Control "max-age=0, no-cache, no-store, must-revalidate"
      Header set Pragma "no-cache"
      Header set Expires "Wed, 11 Jan 1984 05:00:00 GMT"
      </ifModule>
      </filesMatch>

    I copied that regex from somewhere on the Web months ago. It should add those headers to any HTTP response for files that do NOT have those extensions, but it's not working: it's adding them to every response. I also need to create another directive that adds Header set Cache-Control "max-age=3600, public" to responses for files that DO have those extensions. Could anybody help me write proper FilesMatch regexes?
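    For what it's worth, the block above is a positive match: it applies the no-cache headers to files that do end in those extensions (including .php and .html), which is why it appears to hit every response. A sketch of one way to get the intended split, assuming mod_headers and Apache's PCRE regexes, is a negative lookahead for the "everything else" case plus the original positive match for the long-lived case:

      # No-cache headers only for files that do NOT end in the listed extensions
      <FilesMatch "^(?!.*\.(gif|jpe?g|png|js|css|swf|php|ico|txt|pdf|xml|html?)$).*$">
          <IfModule mod_headers.c>
              Header set Cache-Control "max-age=0, no-cache, no-store, must-revalidate"
              Header set Pragma "no-cache"
          </IfModule>
      </FilesMatch>

      # Long-lived caching for files that DO end in those extensions
      <FilesMatch "\.(gif|jpe?g|png|js|css|swf|php|ico|txt|pdf|xml|html?)$">
          <IfModule mod_headers.c>
              Header set Cache-Control "max-age=3600, public"
          </IfModule>
      </FilesMatch>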

    Read the article

  • Enabling HTTP access on port 80 on CentOS 6.3 from the console

    - by Hugo
    I have a CentOS 6.3 box running on Parallels, and I'm trying to open port 80 so it is accessible from outside. I tried the GUI solution from this post and it works, but I need to get it done from a script. I tried this:

      sudo /sbin/iptables -A INPUT -p tcp -m state --state NEW -m tcp --dport 80 -j ACCEPT
      sudo /sbin/iptables-save
      sudo /sbin/service iptables restart

    This creates exactly the same iptables entries as the GUI tool, except it does not work:

      $ telnet xx.xxx.xx.xx 80
      Trying xx.xxx.xx.xx...
      telnet: connect to address xx.xxx.xx.xx: Connection refused
      telnet: Unable to connect to remote host

    UPDATE:

      $ netstat -ntlp
      (No info could be read for "-p": geteuid()=500 but you should be root.)
      Active Internet connections (only servers)
      Proto Recv-Q Send-Q Local Address      Foreign Address    State    PID/Program name
      tcp        0      0 0.0.0.0:3306       0.0.0.0:*          LISTEN   -
      tcp        0      0 127.0.0.1:6379     0.0.0.0:*          LISTEN   -
      tcp        0      0 0.0.0.0:111        0.0.0.0:*          LISTEN   -
      tcp        0      0 0.0.0.0:80         0.0.0.0:*          LISTEN   -
      tcp        0      0 0.0.0.0:22         0.0.0.0:*          LISTEN   -
      tcp        0      0 127.0.0.1:631      0.0.0.0:*          LISTEN   -
      tcp        0      0 127.0.0.1:25       0.0.0.0:*          LISTEN   -
      tcp        0      0 0.0.0.0:37439      0.0.0.0:*          LISTEN   -
      tcp        0      0 :::111             :::*               LISTEN   -
      tcp        0      0 :::22              :::*               LISTEN   -
      tcp        0      0 ::1:631            :::*               LISTEN   -
      tcp        0      0 :::60472           :::*               LISTEN   -

      $ sudo cat /etc/sysconfig/iptables
      # Generated by iptables-save v1.4.7 on Wed Dec 12 18:04:25 2012
      *filter
      :INPUT ACCEPT [0:0]
      :FORWARD ACCEPT [0:0]
      :OUTPUT ACCEPT [5:640]
      -A INPUT -m state --state RELATED,ESTABLISHED -j ACCEPT
      -A INPUT -p icmp -j ACCEPT
      -A INPUT -i lo -j ACCEPT
      -A INPUT -p tcp -m state --state NEW -m tcp --dport 22 -j ACCEPT
      -A INPUT -j REJECT --reject-with icmp-host-prohibited
      -A INPUT -p tcp -m state --state NEW -m tcp --dport 80 -j ACCEPT
      -A FORWARD -j REJECT --reject-with icmp-host-prohibited
      COMMIT
      # Completed on Wed Dec 12 18:04:25 2012
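    The iptables-save output above shows why the scripted rule has no effect: the ACCEPT for port 80 was appended (-A) after the catch-all "REJECT --reject-with icmp-host-prohibited" line, and iptables stops at the first matching rule, so port 80 traffic never reaches it; the GUI tool inserts its rules above the REJECT. A sketch of a scripted equivalent (position 5 corresponds to the REJECT's place in the INPUT chain shown above; iptables-save on its own only prints to stdout and persists nothing):

      # Insert the rule above the REJECT instead of appending it at the end
      sudo /sbin/iptables -I INPUT 5 -p tcp -m state --state NEW -m tcp --dport 80 -j ACCEPT

      # Write the running ruleset to /etc/sysconfig/iptables so it survives restarts
      sudo /sbin/service iptables save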

    Read the article

  • Squid throws error: "The requested URL could not be retrieved"

    - by Supratik
    Hi, sometimes I am getting the following error:

      The requested URL could not be retrieved

      While trying to retrieve the URL: http://groups.google.com/

      The following error was encountered:
      Unable to determine IP address from host name for groups.google.com

      The dnsserver returned:
      Refused: The name server refuses to perform the specified operation.

      This means that:
      The cache was not able to resolve the hostname presented in the URL.
      Check if the address is correct.

      Your cache administrator is root.

    What could be the reason for the above error? Regards, Supratik
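    The "Refused" status is coming from the DNS server that Squid itself queries (by default the resolvers in /etc/resolv.conf on the proxy box), so the place to look is that resolver's configuration or ACLs rather than Squid. A couple of hypothetical checks on the Squid host (the resolver IP is a placeholder):

      # Query the same resolver Squid uses; a REFUSED status here reproduces the problem outside Squid
      dig groups.google.com @192.168.1.1

      # squid.conf can also pin Squid to specific resolvers, e.g.:
      # dns_nameservers 8.8.8.8 8.8.4.4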

    Read the article

  • Using trickle to slow down a browser

    - by tester
    According to trickle's man page, http://linux.die.net/man/1/trickle, I can limit the download speed of a process, e.g.

      trickle -u 10 -d 20 ncftp

    to launch ncftp(1) limiting its upload capacity to 10 KB/s and its download capacity to 20 KB/s. How would I go about limiting google-chrome or firefox with trickle?

    Edit: For those of you asking why I asked such an obvious question: I tried

      trickle -u 10 -d 20 firefox

    and I'm getting an error:

      trickle: Could not reach trickled, working independently: No such file or directory

    Firefox opens right after, but is definitely not rate limited...
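    The "Could not reach trickled" line is only a warning that the optional trickled daemon is not running; trickle then works standalone. The more common reason a browser stays unthrottled is that google-chrome and firefox are wrapper scripts that hand off to another process (or to an already-running instance), so trickle's LD_PRELOAD shim never wraps the binary that actually opens the sockets. A hedged sketch of things to try, with typical but distribution-dependent binary paths:

      # Close all running instances first, then wrap the real binary in standalone mode
      trickle -s -d 20 -u 10 /usr/lib/firefox/firefox

      # Or start the daemon so several wrapped processes share one limit
      trickled -d 20 -u 10
      trickle -d 20 -u 10 firefox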

    Read the article

  • How to make sure clients update their browser cache when my website is updated?

    - by user64204
    I am using the HTTP 1.1 Cache-Control header to implement client-side caching. Since I update my website only once a month, I would like the CSS and JS files to be cached for 30 days with Cache-Control: max-age=2592000. The problem is that the 30-day period defined by Cache-Control doesn't coincide with the website update cycle: it starts from the moment a user visits the site and ends 30 days later, which means an update could occur in the meantime and users would be running with outdated content for a while, which could break the rendering of the website if, for instance, the HTML and CSS no longer match. How can I perform client-side caching of content for periods of several days but somehow get users to refresh their CSS/JS files after the website has been updated? One solution I could think of is that, if website updates can be scheduled, the max-age returned by the server could be decreased every day accordingly, so that no matter when people visit the website, the end of the caching period would coincide with the update of the website; but changing the server configuration every day goes against one of my sysadmin principles (once it's running, don't touch it).
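    The usual way out of this bind is to version the asset URLs instead of shortening the cache lifetime: serve the HTML itself with a short or no-cache policy, keep max-age=2592000 on CSS/JS, and change the asset URL whenever the site is updated, so returning visitors fetch the new files immediately and then cache them for the full 30 days. A hypothetical example, with the version string produced by whatever the monthly deploy is:

      <!-- In the HTML (served with a short max-age or no-cache): -->
      <link rel="stylesheet" href="/css/site.css?v=2012-07">
      <script src="/js/app.js?v=2012-07"></script>

    Renaming the file itself (site.2012-07.css) works the same way and sidesteps the handful of proxies that are reluctant to cache URLs with query strings.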

    Read the article

  • How can I redirect HTTP(S) traffic to another gateway?

    - by PsyStyle
    I have a network like 192.168.0.0/15 with the default gateway set to 192.168.0.1. All the workstations on the network use this gateway for all kinds of access to the Internet. Now I am testing a new Internet connection with another provider, and for this I am using a second gateway on the same subnet with the IP address 192.168.0.2. I want to redirect only HTTP and HTTPS traffic to this second gateway, without touching the default gateway address set on every workstation. How can I accomplish this task? What do I have to change in the first gateway's firewall configuration or routes? I tried a DNAT like DNAT loc:192.168.0.1 loc:192.168.0.2 tcp 80 but nothing worked. I use Shorewall for simplicity of configuration, but I can follow theoretical answers too, which I will try to adapt to my case.
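    DNAT is the wrong tool here because it rewrites the destination address, whereas the goal is to keep the real destination and only change the next hop; that is policy routing. The idea is to mark HTTP/HTTPS packets coming in from the LAN and route the marked traffic out via 192.168.0.2 instead of the normal default route. A raw iptables/iproute2 sketch, assuming the first gateway is a Linux box and eth1 (a hypothetical name) is its LAN-facing interface; Shorewall exposes the same mechanism through its providers and mangle/tcrules files:

      # Mark web traffic arriving from the LAN
      iptables -t mangle -A PREROUTING -i eth1 -p tcp -m multiport --dports 80,443 -j MARK --set-mark 2

      # Send marked packets through the second gateway
      ip rule add fwmark 2 table 100
      ip route add default via 192.168.0.2 table 100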

    Read the article

  • Make nginx avoid cache if response contains Vary Accept-Language

    - by gioele
    The cache module of nginx version 1.1.19 does not take the Vary header into account. This means that nginx will serve the same cached response even if the content of one of the fields listed in the Vary header has changed. In my case I only care about the Accept-Language header; all the others have been taken care of. How can I make nginx cache everything except responses whose Vary header contains Accept-Language? I suppose I should have something like

      location / {
          proxy_cache cache;
          proxy_cache_valid 10m;
          proxy_cache_valid 404 1m;
          if ($some_header ~ "Accept-Language") {  # WHAT IS THE HEADER TO USE?
              set $contains_accept_language        # HOW SHOULD THIS VARIABLE BE SET?
          }
          proxy_no_cache $contains_accept_language
          proxy_http_version 1.1;
          proxy_pass http://localhost:8001;
      }

    but I do not know the variable name for "the Vary header received from the backend".
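    For what it's worth, headers returned by the backend are exposed as $upstream_http_<name>, so the Vary header is $upstream_http_vary, and no if block is needed: proxy_no_cache accepts variables and is evaluated once the response headers have arrived. A sketch of one way to write it (the map block lives at http level); note that nginx 1.7.7 and later honour Vary natively, which makes this workaround unnecessary on newer versions:

      # http level: flag becomes 1 when the backend's Vary mentions Accept-Language
      map $upstream_http_vary $skip_cache {
          default              0;
          ~*Accept-Language    1;
      }

      # server/location level
      location / {
          proxy_cache        cache;
          proxy_cache_valid  10m;
          proxy_cache_valid  404 1m;
          proxy_no_cache     $skip_cache;
          proxy_http_version 1.1;
          proxy_pass http://localhost:8001;
      }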

    Read the article

  • Controlling access to my API using SSH public key (not SSL)

    - by tharrison
    I have the challenge of implementing an API to be consumed by relatively non-technical clients -- pasting some sample code into their WordPress or homegrown PHP site is probably as much as we can ask. Asking them to install SSL on their servers ain't happening. So I am seeking a simple yet secure way to authenticate API clients. OAuth is the obvious solution, but I don't think it passes the "simple" test. Adding a client id and hashed secret as a parameter to the requests is closer -- it's not hard to do md5($secret . $client_id) or whatever the PHP would be. It seems to me that if client requests could use the same approach as SSH public keys (the client gives us a key from their server(s)), there should be some existing magic to make all of the subsequent transactions transparently work just like regular HTTP API requests. I am still working this out (obviously :-), so if I am being an idiot, it would be nice to know why. Thanks!
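    A middle ground that keeps the "paste this into your PHP site" simplicity is HMAC request signing: the client sends its client id plus an HMAC of the request parameters computed with a shared secret, and the server recomputes and compares it. The secret itself never travels on the wire, unlike a static hashed-secret parameter, though without SSL the payload is still readable and a timestamp or nonce is needed to blunt replay. A hypothetical PHP sketch of the client side (names and URL are placeholders):

      <?php
      // Hypothetical client-side signing example
      $clientId = 'acme-site';
      $secret   = 'secret-issued-out-of-band';
      $params   = array('action' => 'list_widgets', 'timestamp' => time());

      // Canonicalise the parameters, then sign them with the shared secret
      ksort($params);
      $payload   = http_build_query($params);
      $signature = hash_hmac('sha256', $payload, $secret);

      $url = 'http://api.example.com/v1/?' . $payload
           . '&client_id=' . urlencode($clientId)
           . '&signature=' . $signature;

      $response = file_get_contents($url);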

    Read the article

  • Setting correct Content-Type sent from Wordpress, on Apache server

    - by eoinoc
    I need help pointing me in the right direction for setting the ContentType returned by Apache for content produced by WordPress. I'm having trouble figuring out why WordPress is returning incorrect headers. Issue The specific problem is that our Wordpress blog pages are being downloaded as a file rather than displayed by Internet Explorer and Chrome v21. Content-Type: application/x-gzip is being returned by the server. I'm told that I should expect Content-Type: text/html. Background The URL is http://www.bitesizeirishgaelic.com/blog/.
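    A quick way to see exactly what the server sends, independent of any browser, is to look at the raw response headers; a Content-Type of application/x-gzip on a blog page often points at double compression (for example PHP's zlib.output_compression on top of mod_deflate) or a stray AddType/AddEncoding directive, rather than at WordPress's templates. A hypothetical check from any machine:

      # Headers when gzip is advertised, as a browser would do
      curl -sI -H 'Accept-Encoding: gzip' http://www.bitesizeirishgaelic.com/blog/

      # Headers when it is not
      curl -sI http://www.bitesizeirishgaelic.com/blog/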

    Read the article

  • Chrome sending packets to random destinations upon reconnect/disconnect

    - by f0x
    I noticed an interesting thing whilst debugging one of my WebSocket applications: Google Chrome will push out three HTTP requests when the network connection status changes. It is quite disconcerting and looks almost as if some malware is checking in with a random server. I don't quite understand why, though, since they all return a 502 or no response code at all, because the destination does not exist. (Screenshots of the requests on disconnect and on reconnect were attached to the original question.) I guess the main question is: is this normal, and what is it for? How come they don't go for a DNS lookup that actually exists?

    Read the article

  • Strange 3-second TCP connection latencies (Linux, HTTP)

    - by user25417
    Our webservers with static content are experiencing strange 3-second latencies occasionally. Typically, an ApacheBench run (10000 requests, concurrency 1 or 40, no difference, but keepalive off) looks like this:

      Connection Times (ms)
                    min  mean[+/-sd] median   max
      Connect:        2   10  152.8      3  3015
      Processing:     2    8   34.7      3   663
      Waiting:        2    8   34.7      3   663
      Total:          4   19  157.2      6  3222

      Percentage of the requests served within a certain time (ms)
        50%      6
        66%      7
        75%      7
        80%      7
        90%      9
        95%     11
        98%    223
        99%    225
       100%   3222 (longest request)

    I have tried many things:
    - Apache2 2.2.9 with worker or prefork MPM, no difference (with KeepAliveTimeout 10-15)
    - Nginx 0.6.32
    - various tcp parameters (net.core.somaxconn=3000, net.ipv4.tcp_sack=0, net.ipv4.tcp_dsack=0)
    - putting the files/DocumentRoot on tmpfs
    - shorewall on or off (i.e. empty iptables or not)
    - AllowOverride None is on for /, so no .htaccess checks (verified with strace)
    - the problem persists whether the webservers are accessed directly or through a Foundry load balancer

    Kernel is 2.6.32 (Debian Lenny backports), but it occurred with 2.6.26 also. IPv6 is enabled, but not used. Does the issue look familiar to anyone? Help/suggestions are much appreciated. It sounds a bit like a SYN,ACK packet getting lost or ignored.
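    Given that the slow requests cluster right around 3 seconds on the Connect phase, one plausible mechanism is indeed a lost SYN or SYN,ACK: on these kernels the first SYN retransmission fires after roughly 3 seconds, which would add exactly that much to the connect time. A hedged way to confirm is to capture handshake packets on the server during a benchmark run and look for repeated SYNs from the same client port (the interface name is an assumption):

      # Only SYN packets (no ACK) to port 80; two SYNs ~3 s apart from the same
      # source port mean the first SYN or its SYN,ACK went missing
      tcpdump -ni eth0 'tcp dst port 80 and tcp[tcpflags] & (tcp-syn|tcp-ack) == tcp-syn'

      # Kernel retransmission counters before/after a run tell a similar story
      netstat -s | grep -i retrans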

    Read the article

  • Need hosting (e-mail, HTTP) for external domains

    - by disappointed
    This may not be the right place, but since it is a more technical aspect of the hosting world, I am taking the liberty to ask: I'm currently running a virtual server with nginx and postfix for web and e-mail, but I can't handle the administration and, due to frequent problems with the e-mail services, I need to resolve this with an almost-standard hosting package (anything should work, even 5 MB of static files would be OK). The exception is that I would like to use several domains, registered with different registrars, for web and e-mail. Currently this is a very simple configuration in my setup. All the hosting providers I have looked at seem to think this is a costly business (more than the domain registration costs), and of course they recommend transferring the domains to them (they want the $$). Does anyone know of a hosting company that allows its customers to freely manage domains registered somewhere else?

    Read the article

  • FTP Server with MySQL access, and POST notification

    - by TIW
    I'm looking for an FTP server solution that we can host either internally on a dedicated server or on Rackspace Cloud/AWS, that provides an HTTP POST notification when a file is uploaded, and that allows user accounts to be created either through an API or a MySQL database. There are several offerings that provide email notification, but has anyone come across anything that matches the above requirements? BrickFTP, being an IaaS system, is an option, but we would prefer something hosted in house. I don't believe the standard FTP servers provided with Apache can do the above... can they?
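    For what it's worth, one self-hosted combination that covers both requirements is Pure-FTPd, which can authenticate users from a MySQL table and can hand every completed upload to an external script via its pure-uploadscript helper (ProFTPD with mod_sql plus a post-upload hook is a similar route). A hedged sketch, assuming Pure-FTPd was built with MySQL and upload-script support; the paths, config file name, and notification URL are placeholders:

      # Start the server with MySQL authentication and upload-script support
      pure-ftpd -l mysql:/etc/pure-ftpd/mysql.conf --uploadscript &

      # Helper that runs a command for each finished upload (file path passed as $1)
      pure-uploadscript -B -r /usr/local/bin/ftp-notify.sh

      # /usr/local/bin/ftp-notify.sh
      #!/bin/sh
      curl -s -X POST -d "path=$1" http://example.com/api/ftp-upload-hook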

    Read the article
