Search Results

Search found 4460 results on 179 pages for 'uninitialized proxy'.


  • Cannot access internal website after being connected to another network

    - by Dandroid
    I have a client who claims they cannot access their internal website after they have been traveling, for example. They have to reset their browser settings every time to regain access. Because they are on another continent, in a different time zone, it is hard for me to run live tests with them to see what changes in their browser settings. My first guess is some sort of proxy-related issue, but could there be other reasons? We are sure it is not the LAN itself; the problem is browser-specific. Edit: it is worth mentioning that it only happens after they turn off or restart their computers.
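
    If the culprit is the browser's proxy configuration, a proxy auto-config (PAC) script can make the routing decision location-independent, so nothing needs resetting after travel. A minimal sketch, assuming hypothetical internal-domain and proxy names:

        function FindProxyForURL(url, host) {
            // The internal site resolves only on the corporate LAN; go direct there
            if (dnsDomainIs(host, ".intranet.example.com"))
                return "DIRECT";
            // Everything else goes through the (assumed) corporate proxy
            return "PROXY proxy.example.com:8080";
        }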

    Read the article

  • Good speedtest results, but web pages don't load

    - by dmt0
    I have strange connection problems. Ping and download times are good: speedtest.net showed a 65 ms ping and a 2.17 Mbps download. Torrents work well, giving me up to 300 kB/s. Web pages load very poorly, though; they time out, and I have to refresh four or five times to get even a simple page to load. It has been happening consistently for the last few days, and it is the same with different browsers on different machines (same network), Windows and Linux. There is no proxy configured in the browsers. Is there any setting in Windows or in a browser that I can change to help this? Some background: I live on an island in Thailand, where the Internet connection goes over a radio link to another island and then to the mainland. It is very weather-dependent, but generally OK. As I mentioned, ping is good. Any input is much appreciated.
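
    One pattern consistent with these symptoms on a radio link, offered as a hedged guess, is a path-MTU problem: small packets (pings) get through while full-size HTTP packets are silently dropped. A quick check from Windows is to probe with unfragmentable pings:

        rem 1472 data bytes + 28 header bytes = a full 1500-byte packet
        ping -f -l 1472 www.google.com

    On Linux the equivalent probe is "ping -M do -s 1472 www.google.com". If full-size probes time out while smaller sizes succeed, lowering the interface MTU is worth trying.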

    Read the article

  • Two domains accessing same folder

    - by Liam Quinn
    I've just taken a new role in a school and am still familiarizing myself with their network; however, I have recently been given a task and I'm having a little trouble working out the fundamentals of it. I have an admin network/domain, 10.49.x.x, and a classroom network/domain, 192.168.1.x, which both connect to a proxy server (10.49.202.231/192.168.1.51). Each domain has its own shared folders as you'd expect (files, software installs, etc.); however, there is a folder "staff" on the classroom network that all the teachers on the classroom network can access. The users on the admin network would like to access this same folder. How do I go about making this happen?
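
    As a point of reference, if routing between the two networks exists, an admin-network machine can usually reach the share by supplying classroom-domain credentials explicitly; a hedged sketch with hypothetical server, domain and user names:

        rem Map the classroom "staff" share using classroom-domain credentials
        net use S: \\classroomserver\staff /user:CLASSROOM\jsmith *

    A cross-domain trust (or storing the credentials with /savecred, where policy allows) would make this transparent for the users.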

    Read the article

  • Why does Apache with SSL work when the back-end WebLogic has no SSL?

    - by huangli
    Hello everyone, my question is very simple. The link below is a picture of my architecture: https://docs.google.com/open?id=0BxSXbpgYIZVOR212RVk4ZDN1Sm8. The picture shows the architecture right now, and it works correctly: I can visit Apache at https://apachehost:8080, I cannot visit the web apps over https served directly by WebLogic, but I can visit those apps over https served through Apache (Apache is the proxy server). My question is: why does this work when Apache is configured with SSL but WebLogic is not? I think WebLogic should also be configured with SSL. If this setup works, what about the security level? Does SSL really protect anything if only Apache is configured with it and WebLogic is not? Thanks. Environment: Apache 2.2.17 with the WebLogic module mod_wl_22.so; WebLogic 10.3; OS: Windows Server 2003.
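
    What is described is SSL termination at the proxy: the browser-to-Apache leg is encrypted, while the Apache-to-WebLogic leg is plain HTTP, which is only acceptable if that back-end link runs over a trusted network. A minimal sketch of such a setup with mod_wl, where hostnames, ports and paths are assumptions:

        # Apache terminates SSL and forwards plain HTTP to WebLogic
        <VirtualHost *:8080>
            SSLEngine on
            SSLCertificateFile    /path/to/server.crt
            SSLCertificateKeyFile /path/to/server.key
            <Location /app>
                SetHandler weblogic-handler
                WebLogicHost 10.0.0.5
                WebLogicPort 7001
            </Location>
        </VirtualHost>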

    Read the article

  • debian installation without internet connection

    - by Gobliins
    Hi, I want to install some Debian distributions (Grip, Crush, Lenny...) for the arm/armel architectures (see www.emdebian.org), following this guide: www.aurel32.net/info/debian_arm_qemu.php. The problem I have is that there is no Internet connection in my Linux VM or QEMU guest, because I am behind a proxy. Is there a way to download all the needed files in advance and save them to disk, so that I don't need an Internet connection during the installation? I am working under Windows now. My regards.
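
    One hedged approach, assuming some connected Debian machine (or a machine that can at least reach the proxy) is available: point the tools at the proxy, fetch everything once, and carry the files over on disk.

        # Let command-line tools use the corporate proxy (address is an assumption)
        export http_proxy=http://proxy.example.com:3128
        export ftp_proxy=$http_proxy
        # Fetch a package plus its dependencies without installing, so the
        # .deb files (left in /var/cache/apt/archives/) can be copied over;
        # "some-package" is a placeholder:
        apt-get install --download-only some-package

    The netinst/CD ISO images from debian.org can likewise be downloaded through the proxy and used as an offline package source via apt-cdrom.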

    Read the article

  • Squid3 not working. Access denied.

    - by Nitish
    I installed Squid 3 on a Linux machine with two Ethernet interfaces (eth0 and eth1). I used the default settings in squid.conf and uncommented these two lines:

        acl localnet src 192.168.0.0/16
        http_access allow localnet

    eth0 is connected to a router, which provides Internet access, and is assigned the IP 192.168.1.2 by the router. I manually configured eth1 with the IP address 192.168.5.1; it is connected to a switch, and systems with IP addresses 192.168.5.x are connected to that switch. I ran these two commands for NAT:

        iptables -t nat -A PREROUTING -i eth1 -p tcp -m tcp --dport 80 -j DNAT --to-destination 192.168.5.1:3128
        iptables -t nat -A PREROUTING -i eth0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 3128

    But when I try to access the Internet from a system with IP 192.168.5.2 through the proxy, I get an error that says "Access denied". What is wrong with my configuration?
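
    One detail worth checking, offered as a hedged pointer: when traffic reaches Squid via iptables redirection rather than an explicit browser proxy setting, Squid must be told the port is an interception port, or it rejects the forwarded requests:

        # Squid 3.1+ syntax; on Squid 2.6-3.0 the keyword is "transparent"
        http_port 3128 intercept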

    Read the article

  • Fully secured gateway web sites

    - by SeaShore
    Hello, are there any web sites that serve as gateways for fully encrypted communication? I mean sites with which I can open a secured session and then exchange both URLs and content with other sites in a secure way. Thanks in advance. Update: Sorry for not being clear. I was wondering if there is a way to access any site on the Internet (http or https) without letting any intranet proxy read the requested URL or the received content. My question is whether such a site exists, e.g.: I connect to the site via https, I send it a URL in a secured way, the site fetches the content from the target site (possibly in a non-secured way), and it returns the requested content to me in a secured way.

    Read the article

  • College wifi works easily on Linux, but not on Windows

    - by user52849
    In Linux: after connecting to the college wifi, going to the network login page and logging in, the Internet works perfectly, as it should. In Windows: after connecting to the college wifi, going to the network login page and logging in, Windows shows "Internet access" and the wireless icon turns white. But after that, regardless of the browser being used, attempting to access any page just shows "Sending request". It does work after a lot of tries, but only intermittently. Yet when running Ubuntu 11.10 in VirtualBox on that same Windows machine, everything works properly, just as when booting into Ubuntu. The college wifi service is really poor, and they have been unable to solve this problem. I'm pretty sure there should be a solution, but what? What is Ubuntu doing right that Windows isn't? Windows is set to "Automatically detect settings" and no proxy server is used.
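
    A hedged suggestion for the usual first checks on the Windows side, since "Automatically detect settings" makes Windows probe for a WPAD proxy that Linux ignores: try unticking that option, and reset the HTTP and Winsock state from an elevated command prompt:

        netsh winhttp reset proxy
        netsh winsock reset
        rem A reboot is needed for the winsock reset to take effect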

    Read the article

  • Gmail/Facebook/Hotmail not opening in Firefox/IE on Windows 7 Home

    - by singlepoint
    Hi, I am unable to open Gmail/Facebook/Hotmail in Firefox/IE on Windows 7 Home. I have just unboxed a brand-new HP laptop with Norton Security Suite running on it. I get the following error message in Firefox. Please help.

        The connection has timed out. The server at www.google.com is taking too long to respond.
          * The site could be temporarily unavailable or too busy. Try again in a few moments.
          * If you are unable to load any pages, check your computer's network connection.
          * If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the Web.

    Read the article

  • Wyse Simple Imager. Unable to Create Product Directory

    - by Steve
    I am trying to submit a post on www.technicalhelp.de, but I receive an error: "Invalid Session. Please resubmit the form." This happens even if I delete temporary Internet files and log out and back in, if I use a different browser, and if I use a proxy browser. Perhaps someone on this forum can help. I am trying to push a Wyse device image to a USB thumb drive. The image is on a remote server, and the thumb drive is connected to my desktop PC. I am using Wyse Simple Imager to do this. When I select the following: Product: V90; Image Version: 5.010627.512; Image File: \\servername\folder\OLD_Rapport\V90-withusb\9V90.i2d, then almost instantly, without attempting any action, I receive the message: "WyseImager: Unable to Create Product Directory. Add Image Failed." I have completely formatted the USB drive with FAT32; it is new out of the box, and I can create folders on it. How do I fix this?

    Read the article

  • SSH Tunnel doesn't work in China

    - by Martin
    Last year I was working in China for a few months. I never bothered setting up a real VPN; I just created an SSH tunnel and changed my browser's proxy settings to connect through it. Everything worked great (except Flash, of course), but that was fine. Now I'm back in China, and I'm having problems with this approach. I do the same thing as last time, and according to https://ipcheckit.com/ my IP address is indeed the IP of my (private) server in the US. I log in to my server using a host-key fingerprint I recorded long before going to China, so no MITM should be possible. Furthermore, the certificate from ipcheckit.com is from GeoTrust, so everything should be OK. However, I still can't access sites that are blocked in China. Any idea how this is possible?
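
    For reference, the setup described is presumably a SOCKS tunnel of this form (hostname is hypothetical):

        # Dynamic (SOCKS) forward; the browser is then pointed at
        # SOCKS host localhost, port 1080
        ssh -D 1080 -N user@us-server.example.com

    One hedged thing to verify: unless the browser is told to resolve DNS through the tunnel as well (in Firefox, the network.proxy.socks_remote_dns preference), name lookups still happen locally, and a tampered local DNS answer can defeat the tunnel even though the IP check looks right.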

    Read the article

  • Restrict only some plugins to specific sites in Google Chrome

    - by Christian
    I am looking for a way to set up Google Chrome so that it will run a certain plug-in (Java, what else?) only on whitelisted sites, but other plug-ins (like the PDF viewer) everywhere. From playing with the policies available for Chrome, I think there are basically two levels of plug-in management:

      1. List of disabled plugins / enabled plugins: controls whether a plug-in exists for the browser at all. This pair of policies applies to plug-ins, but not to sites.
      2. Default plug-in settings / Allow plug-ins on sites: controls on which sites plug-ins can run. This pair of policies applies to sites, but not to individual plug-ins, and it cannot override the first pair.

    There appears to be no way to configure Chrome so that some plug-ins run only on whitelisted sites, but others run everywhere by default. I have also looked at filtering content on the firewall/proxy level, but I'm not convinced it can be done securely there: filtering by URLs (file names) or content types can be circumvented trivially, and identification by content inspection cannot be safe either.
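
    For concreteness, a hedged sketch of the two policy pairs in Chrome's Linux JSON policy format (the file location, the value 2 meaning "block by default", and the site list are assumptions from the policy documentation of that era; on Windows the same policy names live in the ADM/registry templates):

        {
          "DisabledPlugins": ["Java*"],
          "DefaultPluginsSetting": 2,
          "PluginsAllowedForUrls": ["https://trusted.example.com"]
        }

    As the text above notes, PluginsAllowedForUrls whitelists sites for all plug-ins at once, which is exactly why the per-plug-in whitelist cannot be expressed this way.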

    Read the article

  • Forward Request to Multiple Servers

    - by cactuarz
    We have two servers: an old one and a new one. We are currently migrating because the old server is no longer capable of handling the everyday request load. The specs are: Old server: Ubuntu 10.04, Nginx as reverse proxy, Apache, WSGI, Python/Django. New server: Ubuntu 10.04, Nginx, Gunicorn, Python/Django, Celery+Redis. Our manager asked us to research whether the old server can forward all incoming requests to multiple servers, for example by setting Nginx on the old server to forward every request to both the old and the new server. The purpose is to test the new server against the old one as a baseline, to see whether the new server is ready to take over. Please help: is there a way to do this, must we install some other engine, or is what we want impossible? Many thanks.
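
    For what it's worth, newer nginx can do exactly this traffic duplication natively: the mirror module (added in nginx 1.13.4, so well after the Ubuntu 10.04 package) copies each request to a second upstream and discards the mirror's responses. A hedged sketch with assumed backend addresses:

        location / {
            mirror /mirror;                      # duplicate every request
            proxy_pass http://127.0.0.1:8080;    # old backend serves the real reply
        }
        location = /mirror {
            internal;
            proxy_pass http://new-server:8000$request_uri;   # reply is discarded
        }

    On an nginx as old as 10.04's, the post_action directive is the usual (undocumented, one-extra-request) workaround.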

    Read the article

  • Block all third party domains from web pages

    - by wizlb
    When I'm browsing the web, I'd like not to be tracked by any third-party services like Facebook or Google. For instance, if I visit somepage.com, I don't want my browser requesting anything from facebook.com unless I allow it; but if I visit facebook.com itself, Facebook should still work. Does anyone know of a Chrome or Firefox extension that allows this? AdBlock in Chrome doesn't seem to do the job, because it just hides page elements; it doesn't stop the browser from downloading them. I imagine some kind of proxy/browser-extension hybrid would be best. Any suggestions? Thank you.

    Read the article

  • 1K incoming http post requests per second, each with a 10-50K file

    - by Blankman
    I'm trying to figure out what kind of server setup I will need to support 1,000 HTTP POST requests per second, where each POST carries an XML file of 5-50 KB (25 KB on average). Even if I get a 100 Mb/s connection for my dedicated box (they usually give 10 Mb/s, but you can upgrade), by my calculations that is about 12.5 MB/s, which means about 500 files of 25 KB per second. So I would need around three servers, each with a 100 Mb/s connection. Would a single server running HAProxy be able to redirect the requests to the other servers, or do I need something that can handle more than 100 Mb/s to proxy things out to them? If my math is off, I'd appreciate any corrections.
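
    Spelling the arithmetic out, with the caveat that HTTP and TCP overhead eat into the raw figures:

        1,000 req/s x 25 KB = 25,000 KB/s = 25 MB/s = 200 Mb/s of payload

    so roughly two 100 Mb/s links' worth of capacity is needed before overhead, and three gives headroom. Note that a proxy relays every byte, so the HAProxy box itself needs a link sized for the full 200 Mb/s, not 100 Mb/s.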

    Read the article

  • Problem with the hosts file under windows 7

    - by martani_net
    I updated some entries in the hosts file (C:\Windows\System32\drivers\etc\hosts) to make google.com, for example, point to 127.0.0.1:

        # Additionally, comments (such as these) may be inserted on individual
        # lines or following the machine name denoted by a '#' symbol.
        #
        # For example:
        #
        #      102.54.94.97     rhino.acme.com          # source server
        #       38.25.63.10     x.acme.com              # x client host
        127.0.0.1       localhost
        ::1             localhost
        127.0.0.1       google.com

    This works fine under Windows Vista, but not under Windows 7: when I browse to google.com, it goes straight to Google's website. For the record, I am not using a proxy server. I think there are some temporary DNS settings that must be flushed, but I don't know how. Does anyone know how to fix this? Thank you.
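
    The resolver cache can indeed be flushed from an elevated command prompt:

        ipconfig /flushdns

    Two hedged things to check besides the cache: the file must be saved by an editor running elevated (Windows 7's UAC can silently redirect unprivileged writes to system directories), and an entry for google.com does not cover www.google.com, which needs its own line.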

    Read the article

  • How to check the Cookie header line and control caching in Nginx

    - by user124249
    I am trying to set up caching for my website using the Nginx proxy module, with the following requirements:

      * If the request has a cookie (in the request header): serve the response from the Nginx cache, and hide the Set-Cookie header line.
      * If the request has no cookie (in the request header): forward the request to the backend, and do not hide the Set-Cookie header line.

    I am using if (from the rewrite module) with these directives:

        if (!-e $http_cookie) {
            set $not_cache_rq 0;
            set $not_cache_rp 0;
        }
        if ($http_cookie) {
            set $not_cache_rq 1;
            set $not_cache_rp 1;
        }
        proxy_cache_bypass $not_cache_rq;
        proxy_no_cache $not_cache_rp;
        proxy_hide_header Set-Cookie;

    I do not know how to make the proxy_hide_header option apply only when the request has a cookie. Please help me. Many thanks.
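
    A hedged sketch of how the cache-bypass half is usually expressed: nginx's if has no "cookie exists" test (the -e operator checks for files), but a map can key on whether $http_cookie is empty:

        # at http{} level:
        map $http_cookie $no_cookie {
            default 0;
            ""      1;
        }
        # inside the proxied location:
        proxy_cache_bypass $no_cookie;   # cookieless requests skip the cache
        proxy_no_cache     $no_cookie;   # and their responses are not stored

    The Set-Cookie half is harder: proxy_hide_header is a fixed per-location directive, so making it conditional generally means splitting the traffic into two locations.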

    Read the article

  • Nginx return 444 depending on upstream response code

    - by Mark
    I have nginx set up to pass requests to an upstream using proxy_pass. The upstream is written to return a 502 HTTP response on certain requests; rather than returning the 502 with all its headers, I would like nginx to recognize this and return 444, so that nothing is returned at all. Is this possible? I also tried to return 444 on any 50x error, but that doesn't work either:

        location / {
            return 444;
        }
        location ^~ /service/v1/ {
            proxy_pass http://127.0.0.1:3333;
            proxy_next_upstream error timeout http_502;
            error_page 500 502 503 504 /50x.html;
        }
        location = /50x.html {
            return 444;
        }
        error_page 404 /404.html;
        location = /404.html {
            return 444;
        }
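
    A hedged pointer: by default nginx relays whatever status the upstream sends; error_page only fires for proxied responses once proxy_intercept_errors is enabled. Something along these lines, with the first two directives inside the proxied location and the named location at server level:

        proxy_intercept_errors on;     # let error_page act on upstream 5xx
        error_page 502 = @drop;        # "=" hands control to the named location

        location @drop {
            return 444;                # close the connection with no response
        }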

    Read the article

  • Any cloud storage service that lets us authenticate files when we serve them to our visitors?

    - by TORr0t
    Let's say I want to restrict a file to my visitors. I mean, I have an xx.avi file to be streamed/downloaded, and the visitor has paid me for the bandwidth and the size of the file. In Amazon S3 I can't control the file at all (there is a very basic control mechanism, which is not OK for me). The only way is for my server to proxy the file: it fetches the file from the Amazon S3 storage node and sends it to the visitor after a PHP script approves the authentication. But this way I would double the bandwidth usage, and there would also be a latency problem, since my server first needs to get the file from Amazon S3. So I was wondering if there is a better solution, or any cloud storage service that lets us control file access for our visitors. Thanks.
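
    One hedged note: S3's query-string authentication may be the "very basic control" dismissed above, but it does seem to cover this case, since it issues per-visitor, time-limited URLs without proxying any data. For instance, with the s3cmd tool (bucket name and expiry are assumptions, and the "+seconds" expiry syntax depends on the s3cmd version):

        # Signed URL for xx.avi, valid for 5 minutes from now
        s3cmd signurl s3://mybucket/xx.avi +300

    The PHP script would generate such a URL after verifying payment and redirect the visitor to it, so the bytes flow straight from S3.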

    Read the article

  • Dynamic authentication realms in Apache

    - by Cogsy
    I have a front-end server acting as a gateway proxy for many (a dynamic 'many') building monitors with embedded web servers. They are accessed with URLs like: http://www.example.com/monitor1/, http://www.example.com/monitor2/, ... I'm trying to restrict access to these monitors to only the users that own them, so what I need is a way of granting rights to users or groups for specific directories. The standard auth mechanisms I see in Apache won't work, because I would have to spell out every location by hand. I'd prefer some dynamic map or script. Any suggestions?
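
    For contrast, the static baseline this is trying to avoid looks like the following, one block per monitor (names and paths are assumptions); a generator script templating these blocks, or delegating the decision to an external checker via a module such as mod_authnz_external, are the usual ways to make it dynamic:

        <Location /monitor1/>
            AuthType Basic
            AuthName "monitor1"
            AuthUserFile /etc/apache2/monitors.htpasswd
            Require user alice
        </Location>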

    Read the article

  • Disable address bar in Internet Explorer 9

    - by token
    I'm trying to disable the address bar in IE9. I've done a significant amount of searching and just can't seem to find a way to make it happen; a lot of web resources discuss how to do it in IE8, but not IE9. Why, you might ask? I have an application, hosted in a remote desktop farm, that links out to web pages in Internet Explorer, and I need to ensure users are limited to just the pages the program pushes them to. I realize I could use a proxy server to limit where they can go, but I'm trying to find a really simple way to just disable the address bar instead. I can't use kiosk mode, because it puts the browser into full-screen mode; that won't work for my situation, as I need to give users what appears to be a regular browsing experience, only without an address bar.

    Read the article

  • How failover should work in IIS cluster with Application Request Routing?

    - by username
    I have set up several servers with IIS and connected them to a load balancer: a server with IIS Application Request Routing installed. I created a server farm and added two servers. Then I stopped IIS on the first server and tried to open my web site. It returned an error: "502 - Web server received an invalid response while acting as a gateway or proxy server." But if, instead of stopping IIS, I shut down the first server entirely, I get a response from the next server, which is online. The question is: what should the expected failover behaviour be with ARR? Should it switch me to the next server when IIS is stopped but the machine itself is still online?

    Read the article

  • Is there a way to use something like RewriteRule ... [PT] for an external URL?

    - by nbolton
    I have a non-Apache web server running on port 8000, but it cannot be accessed from behind corporate firewalls, so I would like to use my Apache 2 server as a proxy to it. I've tried using:

        RewriteEngine On
        RewriteRule /.* http://buildbot.synergy-foss.org:8000/builders/ [PT]

    but this does not work; I get: "Bad Request. Your browser sent a request that this server could not understand." It worked fine with [R], however. Update: also, when using ProxyPass, I get this error: "Forbidden. You don't have permission to access / on this server."
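
    Two hedged observations: [PT] only hands the rewritten URI back to Apache's own URL-to-filename mapping, so it cannot target another host; proxying from a RewriteRule is spelled [P], and it requires mod_proxy and mod_proxy_http to be loaded. And on Apache 2.2, the "Forbidden" from ProxyPass is typically the default <Proxy> access control denying everyone. Something along these lines:

        RewriteRule ^/(.*)$ http://buildbot.synergy-foss.org:8000/builders/$1 [P]

        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>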

    Read the article

  • close ssh sessions

    - by egor7
    I'm using ~/.ssh/config to log in to the internal.local corporate server:

        Host internal.local
            ProxyCommand ssh -e none corporate.proxy nc %h %p

    But after closing the session (typing exit), my sshd session on the server stays active (I can see it through a different connection). How do I close the session, or change my config appropriately, to eliminate the hung sessions? First check, from a second, root session:

        $ ps -fu user_name
        user_name   861   855   0 16:58:16 pts/3      0:00 -bash
        user_name   855   854   0 16:58:13 ?          0:00 /usr/lib/ssh/sshd

    After logging out:

        user_name   855   854   0 16:58:13 ?          0:00 /usr/lib/ssh/sshd

    Likewise, just after scp-ing files to/from internal.local, a new scp session lingers on the server.
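
    A hedged guess at the cause: the netcat in the ProxyCommand keeps the relayed connection open after the shell exits. If both ends run OpenSSH 5.4 or newer, ssh's built-in forwarding avoids the external nc entirely:

        Host internal.local
            ProxyCommand ssh -e none -W %h:%p corporate.proxy

    Otherwise, an nc variant that exits on EOF (for example via a short -w idle timeout, where the nc build supports it) tends to clear the lingering sessions.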

    Read the article

  • Serving a default image with nginx

    - by ustun
    I have the following configuration in nginx:

        location /static/ {
            root /srv/kose/;
            expires 2w;
            access_log off;
        }
        location / {
            proxy_pass http://127.0.0.1:8089;
        }

    If a file is not found under /static/, I want to serve a default image, not proxy_pass to port 8089. Currently it looks for the file in the static root and, if it cannot find it, tries the proxy. I have tried the following, but it doesn't work; I have also tried try_files, to no avail. How can I tell nginx to serve the default image?

        location /static/ {
            root /srv/kose/;
            expires 2w;
            access_log off;
            error_page 404 /srv/static/defaultimage.jpg;
        }
        location / {
            proxy_pass http://127.0.0.1:8089;
        }
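
    A hedged sketch of the usual shape of the fix (the fallback location of the image is an assumption): error_page takes a URI, not a filesystem path like /srv/static/defaultimage.jpg, and try_files can name a final fallback URI directly:

        location /static/ {
            root /srv/kose/;
            try_files $uri /static/defaultimage.jpg;
            expires 2w;
            access_log off;
        }

    This assumes the image lives at /srv/kose/static/defaultimage.jpg, so that the fallback URI resolves inside the same root.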

    Read the article
