Search Results

Search found 5067 results on 203 pages for 'squid deb proxy'.

Page 9/203

  • Accessing Squid Proxy over internet

    - by prateekdayal
    Hi, I recently finished installing Squid on a VPS I have in the US, and it's working fine locally (I verified this by setting the http_proxy variable and using lynx). I want to access this proxy over the internet (as an anonymizer) so that I can see how some ads show up for US traffic on my website. I have set up authentication, so abuse is not a problem. However, I am not able to access the proxy over the internet, even though I have set the following rule in squid.conf:

        http_access allow all

    Is what I want not possible, or am I missing something? Port 3128 is open in the firewall, so that is not the issue, and Squid is listening on 0.0.0.0. Thanks, Prateek

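    Since authentication is already in place, one hedged direction is to make http_access require it instead of allowing everyone (an unrestricted "http_access allow all" on a public VPS is an open proxy, and some providers block exactly that). A minimal squid.conf sketch, assuming the NCSA basic-auth helper and a password file at /etc/squid/passwd (both paths are assumptions, not taken from the question):

        # NCSA basic-auth helper; the path varies by distribution and Squid version
        auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
        auth_param basic realm proxy
        acl authenticated proxy_auth REQUIRED
        http_access allow authenticated   # only password holders get through
        http_access deny all              # refuse everyone else
        http_port 3128

    If the proxy still cannot be reached, the VPS provider's external firewall is worth checking separately from the local one; many providers filter inbound ports upstream of the guest.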

  • Using Squid on Debian, Cannot Connect Error

    - by Zed Said
    I am trying to set up Squid on Debian and am getting a connection refused error:

        squidclient http://www.apple.com/ > test
        client: ERROR: Cannot connect to 127.0.0.1:3128: Connection refused

    Here is my config:

        visible_hostname none
        cache_effective_user proxy
        cache_effective_group proxy
        cache_dir ufs /var/spool/squid 2048 16 256
        cache_mem 512 MB
        cache_access_log /var/log/squid/access.log
        emulate_httpd_log on
        strip_query_terms off
        read_ahead_gap 128 Kb
        collapsed_forwarding on
        refresh_stale_hit 30 seconds
        retry_on_error on
        maximum_object_size_in_memory 1 MB
        acl all src 0.0.0.0/0.0.0.0
        acl purgehosts src 127.0.0.1/255.255.255.255
        # Caching static objects in __data is important.
        # Without that, apache processes sit around spooling static objects.
        acl QUERY urlpath_regex /cgi-bin/ /_edit /_admin /_login /_nocache /_recache /__lib /__fudge
        acl PURGE method PURGE
        acl POST method POST
        cache deny QUERY
        cache deny POST
        http_access allow PURGE purgehosts
        http_access deny PURGE
        http_access allow all
        http_port 127.0.0.1:80
        http_port 50.56.206.139:80
        cache_peer 127.0.0.1 parent 80 0 originserver no-query no-digest default
        redirect_rewrites_host_header off
        read_ahead_gap 128 Kb
        shutdown_lifetime 5 seconds

    Any ideas why this is happening? What have I missed?

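    One detail stands out: squidclient connects to 127.0.0.1:3128 by default, but this config only binds http_port 80 (on loopback and the public address). A quick check, assuming the config above is the one actually loaded:

        # tell squidclient which port to use; this config never opens 3128
        squidclient -h 127.0.0.1 -p 80 http://www.apple.com/ > test
        # confirm which ports squid is really listening on
        netstat -tlnp | grep squid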

  • faster (squid + apache httpd + apache tomcat)

    - by letronje
    We have a production setup with Squid at the front (caching images, JS, CSS, etc.), Apache httpd in the middle (prefork + mod_rewrite + mod_jk/AJP + mod_deflate + mod_php for a few PHP pages), and Apache Tomcat 5.5 at the end serving all the dynamic content. What would be the best way to reduce the overhead of having three servers in the request path? I am wondering if replacing httpd with a faster web server like nginx or lighttpd would help. Right now httpd handles URL rewriting (for clean URLs), talks to Tomcat (via mod_jk), compresses output (mod_deflate), and serves some low-traffic PHP pages. What would be the ideal replacement for httpd, given that we need these features? Is there a way to replace (squid + apache) with a single entity that caches static content well (like Squid), rewrites URLs, compresses responses, and forwards dynamic requests directly to Tomcat? I have heard about Varnish Cache and wonder if it can help.

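    For the nginx option, a hedged sketch of how the httpd duties could map over; the paths, ports, and rewrite rule are illustrative assumptions, nginx speaks plain HTTP to Tomcat rather than AJP, and the PHP pages would need php-fpm via fastcgi_pass (omitted here):

        server {
            listen 80;
            gzip on;                                   # replaces mod_deflate
            location /static/ {                        # assumption: static assets live here
                root /var/www;
                expires 7d;
            }
            location / {
                # hypothetical clean-url rewrite, standing in for the mod_rewrite rules
                rewrite ^/articles/(.*)$ /app/articles?id=$1 break;
                proxy_pass http://127.0.0.1:8080;      # Tomcat HTTP connector instead of mod_jk/AJP
            }
        }

    Varnish could take over the Squid role in front of this, though (at least in the versions of that era) it left compression to the backend; nginx alone covering both caching (proxy_cache) and compression is the closer fit to a single-entity setup.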

  • Normalize Accept-Encoding via HAProxy for optimized Squid hit rate

    - by Matt Beckman
    Our website infrastructure uses HAProxy for load balancing, a Squid cluster for caching, and an IIS cluster for application data. We load-balance by URI in HAProxy to optimize the Squid hit rate, but Squid holds a different copy of each page for every Accept-Encoding header the browsers pass it, so IE (gzip, deflate) gets a different cached copy than Firefox (gzip,deflate) or Chrome (gzip,deflate,sdch). We want to normalize the Accept-Encoding headers, and I think the best place to do so would be in HAProxy. I'd appreciate ideas on how to accomplish this without breaking clients that lack gzip or deflate support.

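    A hedged sketch for haproxy.cfg using the 1.4-era reqirep syntax (newer releases would use http-request set-header instead): collapse the header to one canonical value whenever the client advertises gzip, and leave it untouched otherwise, so gzip-less clients keep working:

        frontend http-in
            bind *:80
            # does the client advertise gzip anywhere in Accept-Encoding?
            acl ae_gzip hdr_sub(Accept-Encoding) gzip
            # if so, normalize the whole header to one canonical form
            reqirep ^Accept-Encoding:.* Accept-Encoding:\ gzip if ae_gzip
            default_backend squid_cluster

    Chrome loses sdch under this scheme, which is usually an acceptable trade for collapsing the cache variants down to two (gzip and identity).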

  • squid transparent proxy on all ports

    - by Yves Richard
    I have set up Squid as a transparent proxy by redirecting port 80 to the native Squid port, 3128. I know there are issues with getting secure protocols like SSL and IMAPS to go through the proxy, but can I redirect all other ports through the proxy as well? I am trying to get a better idea of bandwidth usage. I have set up iptables to log usage, and most traffic ends up in the related/established rule. I am trying to determine the origins of this traffic by sending it to Squid for more detailed logging.

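    Squid only understands HTTP-shaped traffic, so redirecting arbitrary ports at 3128 will not produce useful logs. A hedged alternative for the accounting goal: keep the port-80 redirect and let per-port iptables rules do the counting (interface and ports here are illustrative assumptions):

        # keep HTTP interception as before
        iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128
        # targetless rules act as pure counters and never alter traffic
        iptables -A FORWARD -p tcp --dport 443
        iptables -A FORWARD -p tcp --dport 993
        iptables -A FORWARD -p udp --dport 53
        # read the per-rule byte counters with:
        #   iptables -L FORWARD -v -n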

  • Best practices for setting lm-factor in Squid refresh patterns

    - by Mpentecost
    I am running a Squid (3.1) cache in front of Django. The content of the site does not change very often, so Squid gives our backend much-needed breathing room. Currently, this is the refresh pattern we are using to cache the content:

        refresh_pattern . 60 100% 60

    We basically want to cache everything for at least an hour (and only an hour) before Squid re-validates the content. My question is about the "100%" parameter, which sets the lm-factor. I'm not sure that setting it to 100% does what we want. The assumption was that 100% would ensure that objects stay in the cache for the maximum cache time. Is this an incorrect assumption? What are the best practices to follow when setting up a refresh pattern like this?

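    For reference, the fields are regex, min (minutes), lm-factor (percent), and max (minutes); lm-factor only applies to responses that carry a Last-Modified header but no explicit Expires/Cache-Control lifetime (explicit expiry always wins by default). A worked example of how the three interact under this pattern:

        refresh_pattern . 60 100% 60
        # object whose Last-Modified is 10 hours old:
        #   lm-factor freshness = 100% of 10h = 10h, but max = 60 caps it at 60 min
        # object whose Last-Modified is 20 minutes old:
        #   lm-factor freshness = 100% of 20min = 20min, but min = 60 lifts it to 60 min
        # with min == max == 60, the lm-factor never gets to decide anything:
        # every heuristically cached object is fresh for exactly one hour

    So the pattern does cache everything for an hour, but the 100% is inert; it would only matter if min and max were spread apart.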

  • Limit number of simultaneous connections squid makes to a single server

    - by Ben Voigt
    Note: I am asking about outbound concurrent connection limits, not inbound, which is sufficiently covered in existing questions.

    Modern browsers typically open a large number of simultaneous connections to take advantage of the fact that TCP fairly shares bandwidth between connections. Of course, this doesn't result in fair sharing between users, so some servers have started penalizing hosts which open too many connections. This limit can be configured client-side (e.g. IE MaxConnectionsPerServer, Firefox network.http.max-connections-per-server), but the method differs for each browser and version, and many users aren't competent to adjust it themselves. So we turn to a Squid transparent HTTP proxy for central management of HTTP downloads. How can the number of simultaneous connections from Squid to a remote webserver be limited, so the webserver doesn't perceive it as abuse of concurrent connections? Ideally the limit would be per source address. Squid should accept virtually unlimited concurrent requests from the client browser, and issue them to the remote server only N at a time, delaying (but not dropping) the others.

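    For contrast, the client-side knobs the question mentions look like this; a sketch only, since stock Squid (2.x/3.1) appears to offer no directive that caps outbound connections per origin server, which is exactly what makes the question hard:

        ; IE: per-user registry values (.reg sketch)
        Windows Registry Editor Version 5.00

        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
        "MaxConnectionsPerServer"=dword:00000004
        "MaxConnectionsPer1_0Server"=dword:00000004

        ; Firefox: in about:config (or a locked pref in mozilla.cfg)
        ;   network.http.max-connections-per-server = 4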

  • Squid proxy server not forwarding traffic

    - by DilbertDave
    I'm trying to set up a web filtering system (DansGuardian) on my home network but am failing at the first hurdle: configuring the Squid proxy server. No matter what I do, I just receive the "The requested URL could not be retrieved" error page. If I remove the proxy setting in the browser, everything works normally. Ultimately I want to run this on something like a Raspberry Pi, but at the moment I'm testing with a virtual machine (although efforts with a netbook were equally unsuccessful). The VM has a clean installation of Linux Mint 15, and I've installed Squid via apt-get. I've followed numerous walkthroughs, but this one (https://help.ubuntu.com/lts/serverguide/squid.html) pretty much sums up the process I've been taking. I'm obviously missing something but cannot figure it out; any assistance appreciated.

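    A hedged first pass at narrowing this down, assuming the stock Ubuntu/Mint packaging (which installs the daemon as squid3, listening on 3128, logging under /var/log/squid3):

        sudo netstat -tlnp | grep -i squid         # is squid running, and on which port?
        sudo tail -f /var/log/squid3/access.log    # do the browser's requests reach squid at all?
        squidclient -p 3128 http://example.com/    # test from the proxy host itself

    If access.log stays silent while the browser shows the error page, the browser's proxy host/port setting or the default "http_access deny all" stanza in squid.conf is the usual suspect.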

  • Search Domain Not Working With Squid

    - by Kyle Brandt
    I just set up a Squid proxy as a parent proxy to HAVP. When I or other users try to access a domain with an address like "http://foo", I get the following Squid error in the browser:

        The dnsserver returned: Server Failure: The name server was unable to process this query.

    However, "http://foo.companyname.com" works fine. The search domain in resolv.conf on both the client and the proxy host is companyname.com. (Is there a better term for "search domain"?) Is there a way to correct this, maybe something in the squid.conf file?

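    One hedged answer from within squid.conf itself: append_domain appends a suffix to hostnames that contain no dots before Squid resolves them (the leading period is required). For setups using the external dnsserver helper, dns_defnames on is the older knob that makes the resolver honor the resolv.conf search list:

        # squid.conf: expand bare names like "foo" to "foo.companyname.com"
        append_domain .companyname.com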

  • Squid 2.7 in offline_mode yet tries to contact DNS servers to resolve addresses

    - by William C
    I installed Squid 2.7 to act as a web cache on my laptop, so that I can browse previously visited sites when I don't have WiFi. Apart from http_access allow all, I've made no changes to the default squid.conf configuration. When I turn offline_mode on, disconnect from the Internet, and then visit sites, every page fails with:

        The following error was encountered:
        Unable to determine IP address from host name for whatever.sitename.com
        The dnsserver returned: Timeout

    What settings do I need to add to squid.conf so I can browse sites offline?


  • Gotchas for reverse proxy setups

    - by kojiro
    We run multiple web applications, some internal-only, some internal/external. I'm putting together a proposal that we use reverse proxy servers to isolate the origin servers, provide SSL termination and (when possible) provide load balancing. For much of our setup, I'm sure it will work nicely, but we do have a few lesser-known proprietary applications that may need special treatment when we move forward with reverse-proxying. What kinds of traps tend to cause problems when moving an origin server from being on the front lines to being behind a proxy? (For example, I can imagine problems if an application needed to know the IP address of incoming requests.)

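    The client-IP example has a standard mitigation worth sketching; an nginx-flavored example (hostname and port are placeholders) that also covers two sibling traps, the lost Host header and apps that cannot tell HTTPS was terminated upstream:

        location / {
            proxy_pass http://origin.internal:8080;               # hypothetical origin server
            proxy_set_header Host $host;                          # preserve the original Host header
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;  # real client address
            proxy_set_header X-Forwarded-Proto $scheme;           # lets the app detect SSL termination
        }

    The matching gotcha on the origin side is remembering to trust and parse these headers (e.g. Apache's mod_remoteip, or the framework's proxy-awareness setting); redirect loops from apps that self-redirect to http:// are the classic symptom of a forgotten X-Forwarded-Proto.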

  • Proxy Appliance

    - by Kumar
    I'm looking for a way to set up a web-filtering proxy appliance at home. Ideally it would use an auto-updating list, so that newly categorized sites are downloaded automatically and filtering is done by category instead of per site (PeerBlock works this way, for example). By "appliance" I mean a small dedicated device for the task, along the lines of a WD TV Live or an Acer Aspire Revo running the media center edition. I'd be OK with setting up Squid/Linux/SquidGuard even though I'm not familiar with *nix, but ideally a small-form-factor PC with dual NICs would be best, so it can stay out of sight for the most part!

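    For the Squid/SquidGuard route, a hedged sketch of the category-based filtering described above; the database path assumes one of the freely downloadable blacklist trees (such as Shalla's) unpacked under /var/lib/squidguard/db:

        # /etc/squidguard/squidGuard.conf
        dbhome /var/lib/squidguard/db
        logdir /var/log/squidguard

        dest adult {
            domainlist adult/domains
            urllist    adult/urls
        }

        acl {
            default {
                pass !adult all
                redirect http://192.168.1.1/blocked.html   # hypothetical block page
            }
        }

    Squid then hands every URL to squidGuard with a line like the one below in squid.conf, and a nightly cron job re-downloading the blacklist tree provides the auto-updating part:

        url_rewrite_program /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf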

  • SSL Proxy: Forwarding without the encryption

    - by John
    I have a Python application listening on port 9001 for HTTP traffic. I'm trying to configure Apache (or anything, really) to listen on port 443 for HTTPS connections and then forward each connection, sans encryption, to port 9001 on the same machine. My application would then reply via the proxy, where the encryption would be reapplied and the response returned to the client transparently. I'm not doing anything crazy with site names and SSL certs: I have one public IP, one hostname, and one SSL cert. Stripping the encryption at the proxy doesn't seem to be a common requirement. Is what I'm asking for normal? Are there other concerns with this sort of configuration?

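    What the question describes is ordinary SSL termination, and Apache's mod_proxy does it directly; a hedged sketch assuming mod_ssl and mod_proxy_http are enabled, with placeholder certificate paths:

        <VirtualHost *:443>
            ServerName example.com
            SSLEngine on
            SSLCertificateFile    /etc/ssl/certs/example.com.crt
            SSLCertificateKeyFile /etc/ssl/private/example.com.key
            # decrypted traffic is forwarded as plain HTTP to the app on 9001
            ProxyPass        / http://127.0.0.1:9001/
            ProxyPassReverse / http://127.0.0.1:9001/
        </VirtualHost>

    The one real concern is that the 443-to-9001 hop is cleartext; that is fine over loopback on the same machine, but it should never cross an untrusted network.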

  • How do I disable changing proxy settings?

    - by gap
    I've got several machines, running 14.04.1, for kids to access. Each machine has accounts for each kid, as well as accounts for several adults. Although I've set a system-wide proxy use policy, it doesn't get used by Chrome. Instead, I had to log into each kid's account and set a proxy policy pointing to tinyproxy/dansguardian for safe internet access. The problem is that, should the kids gain an ounce of computer savvy, they'll figure out that they can launch the proxy settings panel from Unity, change the proxy setting to None, and completely bypass the "safe" internet scheme I've set up. Can anyone tell me how I can prevent those user accounts from modifying their proxy settings (not the system-wide ones, which aren't used; this is about the per-user settings)? Thanks

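    A hedged sketch of the usual GNOME/Unity answer, dconf lockdown: a system database sets the proxy keys and a locks file makes them immutable, so the Unity proxy panel turns read-only for those accounts (paths follow the stock dconf layout; the host/port values are assumptions matching the tinyproxy setup):

        # /etc/dconf/profile/user
        user-db:user
        system-db:local

        # /etc/dconf/db/local.d/00-proxy
        [system/proxy]
        mode='manual'
        [system/proxy/http]
        host='127.0.0.1'
        port=8080

        # /etc/dconf/db/local.d/locks/proxy  (keys listed here cannot be changed by users)
        /system/proxy/mode
        /system/proxy/http/host
        /system/proxy/http/port

    Then run sudo dconf update to compile the database. Whether Chrome honors the GNOME proxy settings depends on how it is launched; forcing it with a --proxy-server flag in its .desktop file is a common belt-and-braces addition.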

  • Installing a .deb file manually?

    - by stef
    apt-get install gitosis --fix-missing on my Linode still leads to a 404:

        Failed to fetch http://ftp.debian.org/debian/pool/main/g/gitosis/gitosis_0.2+20080825-2_all.deb
        404 Not Found [IP: 130.89.148.12 80]

    The correct file location seems to be http://ftp.debian.org/debian/pool/main/g/gitosis/gitosis_0.2+20090917-11_all.deb. Is there any way I can install this without apt-get, or can I point apt-get in the right direction somehow? Several other packages on my Debian Linode also return 404s, both from the command line and from Virtualmin.

    EDIT: Machine details: Debian 5.0 64bit (Latest 2.6 (2.6.39.1-x86_64-linode19))

    EDIT 2: My sources list:

        # main repo
        deb http://ftp.debian.org/debian/ lenny main contrib non-free
        deb-src http://ftp.debian.org/debian/ lenny main contrib non-free
        deb http://security.debian.org/ lenny/updates main contrib non-free
        deb-src http://security.debian.org/ lenny/updates main contrib non-free
        deb http://volatile.debian.org/debian-volatile lenny/volatile main contrib non-free
        deb-src http://volatile.debian.org/debian-volatile lenny/volatile main contrib non-free

        # contrib & non-free repos
        #deb http://ftp.debian.org/debian/ lenny contrib non-free
        #deb-src http://ftp.debian.org/debian/ lenny contrib non-free
        #deb http://security.debian.org/debian/ lenny/updates contrib non-free
        #deb-src http://security.debian.org/debian/ lenny/updates contrib non-free
        deb http://software.virtualmin.com/gpl/debian/ virtualmin-lenny main
        deb http://software.virtualmin.com/gpl/debian/ virtualmin-universal main

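    Both routes are straightforward; a sketch using the package filename from the question and otherwise standard apt/dpkg usage:

        # manual route: fetch the current .deb and install it by hand
        wget http://ftp.debian.org/debian/pool/main/g/gitosis/gitosis_0.2+20090917-11_all.deb
        sudo dpkg -i gitosis_0.2+20090917-11_all.deb
        sudo apt-get -f install          # pull in any missing dependencies

        # likelier fix: several 404s at once usually mean a stale package index,
        # since mirrors drop old .deb revisions when new ones are published
        sudo apt-get update
        sudo apt-get install gitosis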

  • Squid causing websocket issues

    - by Kvad
    I am running Squid 2.x. When trying to use websockets in my web application, I get the following in my Squid logs:

        13/Jun/2012:10:05:08 +1000 558 192.168.19.76 TCP_MISS/100 199 POST http://api.pusherapp.com/apps/21932/channels/2830b5dd-e75b-4788-ae4a-6da903460d22/events? - DIRECT/107.22.252.43 -

    TCP_MISS/100 indicates that the service is returning the wrong thing, from what I can see. What can I do to fix this?

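    The /100 in TCP_MISS/100 is an HTTP "100 Continue" interim response, which Squid 2.x forwards poorly when a client sends Expect: 100-continue. A hedged workaround sketch; header_access is the 2.x directive name, and whether the Pusher client tolerates losing the header is an assumption to verify:

        # squid.conf (2.x): strip the Expect header so the 100-continue
        # handshake never starts and the POST body is sent immediately
        header_access Expect deny all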

  • configure squid with windows 2008

    - by G.a.r.y.
    Hi, my problem is this: I have three PCs (192.168.1.2, .3, .4) and a Windows 2008 server (192.168.1.100); the router is 192.168.1.1. I want the three PCs, which use 192.168.1.100 as their gateway, to be filtered by the Squid proxy running on the Windows 2008 server. On the server itself I set the proxy 192.168.1.100:3128 in the Control Panel, and the server's browser works with the connection filtered through the proxy, but on the three PCs it does not work. So maybe I should route all incoming requests into Squid, but I don't know how. Thanks

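    Making the server the gateway only routes packets; it does not make the clients speak the proxy protocol. A hedged sketch of the simpler fix, pointing each PC at the proxy explicitly (Group Policy or each browser's LAN settings dialog accomplishes the same thing):

        rem run on each client PC: system-wide WinHTTP proxy setting
        netsh winhttp set proxy 192.168.1.100:3128

    Transparent interception on a Windows gateway is possible but considerably harder than on Linux, so explicit client settings or WPAD auto-discovery are the usual routes.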

  • Getting Squid to authenticate with kerberos and Windows 2008/2003/7/XP

    - by Harley
    This is something I set up recently, and it was quite a big pain. My goal was getting Squid to authenticate a Windows 7 client against a Windows 2008 server invisibly. NTLM is not really an option, as using it requires a registry change on each client. MS have been recommending Kerberos since Windows 2000, so it's finally time to get with the program. Many, many thanks to Markus Moeller of the Squid mailing lists for helping to get this working.

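    A hedged sketch of where that setup lands in squid.conf, using Markus Moeller's squid_kerb_auth helper; the helper path, proxy FQDN, and realm are placeholders, and the keytab for HTTP/<proxy-fqdn> must already have been exported from AD (msktutil or ktpass) and be readable by the Squid user:

        # negotiate/Kerberos authentication via squid_kerb_auth
        auth_param negotiate program /usr/lib/squid/squid_kerb_auth -s HTTP/proxy.example.com@EXAMPLE.COM
        auth_param negotiate children 10
        auth_param negotiate keep_alive on

        acl kerb_users proxy_auth REQUIRED
        http_access allow kerb_users

    The "invisible" part comes from the browser: a domain-joined IE or Chrome negotiates automatically for a proxy listed in the client's Internet settings, so no user ever sees a password prompt.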

  • Squid Proxy: url_regex acl is not working?

    - by bharathi
    I am using Squid 3.1 on an Ubuntu machine. I want to allow only URLs matching our pattern through the proxy server, and I configured the ACLs as below. The dstdomain ACL works fine: if I access any URL outside .zmedia.com, the proxy refuses the connection. But the url_regex ACL is not working. What I am trying to do is allow only requests to the ".zmedia.com" domain whose URL is under the "/blog" context.

        #
        # Recommended minimum configuration:
        #
        acl manager proto cache_object
        acl localhost src 127.0.0.1/32 ::1
        acl to_localhost dst 127.0.0.0/8 ::1
        acl urlwhitelist url_regex -i ^http(s)://([a-zA-Z]+).zmedia.com/blog/.*$
        acl allowdomain dstdomain .zmedia.com
        acl Safe_ports port 80 8080 8500 7272

        # Example rule allowing access from your local networks.
        # Adapt to list your (internal) IP networks from where browsing
        # should be allowed
        acl SSL_ports port 443
        acl Safe_ports port 80          # http
        acl Safe_ports port 21          # ftp
        acl Safe_ports port 443         # https
        acl Safe_ports port 70          # gopher
        acl Safe_ports port 210         # wais
        acl Safe_ports port 1025-65535  # unregistered ports
        acl Safe_ports port 280         # http-mgmt
        acl Safe_ports port 488         # gss-http
        acl Safe_ports port 591         # filemaker
        acl Safe_ports port 777         # multiling http
        acl SSL_ports port 7272         # multiling http
        acl CONNECT method CONNECT

        #
        # Recommended minimum Access Permission configuration:
        #
        # Only allow cachemgr access from localhost
        http_access allow manager localhost
        http_access deny manager
        http_access deny !allowdomain
        http_access allow urlwhitelist
        http_access allow CONNECT SSL_ports
        http_access deny CONNECT !SSL_ports

        # Deny requests to certain unsafe ports
        http_access deny !Safe_ports

        # Deny CONNECT to other than secure SSL ports
        http_access deny CONNECT !SSL_ports

        # We strongly recommend the following be uncommented to protect innocent
        # web applications running on the proxy server who think the only
        # one who can access services on "localhost" is a local user
        #http_access deny to_localhost

        #
        # INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
        #
        # Example rule allowing access from your local networks.
        # Adapt localnet in the ACL section to list your (internal) IP networks
        # from where browsing should be allowed
        http_access allow localhost

        # And finally deny all other access to this proxy
        http_access deny all

        # Squid normally listens to port 3128
        http_port 3128

        # We recommend you to use at least the following line.
        hierarchy_stoplist cgi-bin ?

        # Uncomment and adjust the following to add a disk cache directory.
        #cache_dir ufs /var/spool/squid 100 16 256

        # Leave coredumps in the first cache dir
        coredump_dir /var/spool/squid

        append_domain .zmedia.com

        # Add any of your own refresh_pattern entries above these.
        refresh_pattern ^ftp:             1440   20%   10080
        refresh_pattern ^gopher:          1440   0%    1440
        refresh_pattern -i (/cgi-bin/|\?) 0      0%    0
        refresh_pattern .                 0      20%   4320

    Please correct me if I did anything wrong.

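    A hedged diagnosis: in ^http(s)://, the (s) group is mandatory, so plain http:// URLs can never match; ([a-zA-Z]+) also insists on a subdomain, and the unescaped dots match any character. A corrected sketch:

        # make the "s" optional, the subdomain optional, and escape the literal dots
        acl urlwhitelist url_regex -i ^https?://([a-zA-Z0-9-]+\.)?zmedia\.com/blog/.*$

    Note also the rule order: http_access deny !allowdomain sits above the whitelist rule, so out-of-domain requests are refused there (as observed), while in-domain requests fall through to urlwhitelist, where the broken pattern never matched anything. CONNECT (HTTPS) requests carry no URL path, so the /blog restriction can only ever apply to plain HTTP.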
