Search Results

Search found 9696 results on 388 pages for 'proxy authentication'.


  • Best way to fetch remote RSS through authenticated proxy and parse it

    - by grajea
    I'm trying to fetch a remote RSS feed through a proxy and parse it. I'm using MagpieRSS, but it doesn't seem to be able to reach the internet through a proxy (or I don't know how to configure it to). I assume the alternative is to first fetch the RSS with the cURL functions, which do support proxy authentication, but is there a class that does this in an easy way, or does Magpie support using a proxy, and if so, how? Thanks in advance.
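
    A minimal sketch of the general pattern, shown in Python rather than PHP/MagpieRSS: fetch the feed through the authenticated proxy yourself, then hand the XML to a parser. The feed URL, proxy address and credentials below are placeholders, and the third-party requests package is assumed.

        import xml.etree.ElementTree as ET
        import requests  # third-party: pip install requests

        FEED_URL = "http://example.com/feed.rss"                      # hypothetical feed
        PROXIES = {
            "http":  "http://proxyuser:proxypass@proxy.local:3128",   # hypothetical proxy + credentials
            "https": "http://proxyuser:proxypass@proxy.local:3128",
        }

        resp = requests.get(FEED_URL, proxies=PROXIES, timeout=30)
        resp.raise_for_status()

        root = ET.fromstring(resp.content)
        for item in root.iter("item"):                                # RSS 2.0 <item> elements
            print(item.findtext("title"), "-", item.findtext("link"))

    The same split works in PHP: fetch with cURL (which understands CURLOPT_PROXY and CURLOPT_PROXYUSERPWD) and pass the resulting string to the parser instead of letting the parser fetch the URL itself.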


  • What alternatives are there for asp.net forms authentication?

    - by Eytan Levit
    Hi,
    We are developing a web app that will have a pretty complex user and permission system. The general idea is that we have three levels of security:
    - a simple user, who can only access basic data that is in a data repository
    - a manager, who can open up data repositories
    - a superuser, who can open up repository factories
    Each repository contains various data types (text, images, etc.). We are looking for authentication methods that will allow us:
    1. Scalability.
    2. Customization.
    3. To create permissions that affect the GUI and deny access to certain pages.
    4. To create predefined roles that allow for easy setup of new users.
    5. To create custom roles for specific users, allowing them permission sets that are different from the predefined roles.
    Thanks in advance
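
    As a rough illustration of points 3-5, independent of any particular ASP.NET membership provider: permissions can be plain strings that both the GUI and the page handlers check, a predefined role is just a named permission set, and a custom role is a per-user set layered on top. A minimal Python sketch of that model, with made-up permission names:

        # Permission names are plain strings checked by both the GUI and the page handlers.
        PREDEFINED_ROLES = {
            "user":      {"repository.read"},
            "manager":   {"repository.read", "repository.open"},
            "superuser": {"repository.read", "repository.open", "factory.open"},
        }

        class Account:
            def __init__(self, name, role=None, custom_permissions=None):
                self.name = name
                # Either a predefined role, a custom per-user set, or both combined.
                self.permissions = set(PREDEFINED_ROLES.get(role, set())) | set(custom_permissions or ())

            def can(self, permission):
                return permission in self.permissions

        alice = Account("alice", role="manager")
        bob = Account("bob", role="user", custom_permissions={"factory.open"})  # custom extras on top
        assert alice.can("repository.open") and not alice.can("factory.open")
        assert bob.can("factory.open") and not bob.can("repository.open")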


  • Lock down a site using Forms Auth in IIS7 with Windows Auth

    - by Josh
    I have an ASP.NET MVC 1.0 application that uses Forms Authentication. We are using Windows Server 2008. I need to lock down the site so that only certain users (in AD groups) can access it. Unfortunately, when I set the site to disallow anonymous users and use Windows Authentication, the integration between the site and IIS shows the user as signed in with their domain account instead of letting them sign in through Forms Auth. So I need mixed-mode authentication: the site should only be reachable through Windows Auth, with no anonymous users, but once you are in, it needs to use Forms Auth only. How would I go about doing this the right way?



  • Does Apache2 Configured as ReverseProxy Hide Cookies Set by Backend Servers?

    - by Ianthe
    I use Apache 2.2.16 as a reverse proxy. For a static website I don't have any issues. However, when I began using cookies, I noticed that they are not being sent to the client. Here's a snippet of my config:

        <VirtualHost *:80>
            ServerName app.somewhere.com:80
            ServerAlias app
            ProxyRequests Off
            ProxyPreserveHost On
            <Proxy *>
                Order deny,allow
                Allow from all
            </Proxy>
            ProxyPass /app http://10.x.x.x/app
            ProxyPassReverse /app http://10.x.x.x/app
            <Location />
                Order allow,deny
                Allow from all
            </Location>
        </VirtualHost>

    When I access the app server directly, I receive the cookies fine. Is this expected behaviour for Apache2? I'm using HAProxy for another application that sends cookies to the client, and there I get all of them.


  • Clients with multiple proxy and multithreading callbacks

    - by enzom83
    I created a sessionful web service using WCF, and in particular the NetTcpBinding binding. In addition to methods that initiate and terminate a session, other methods let the client send one or more tasks to be performed (the results are returned via callback, so the service is duplex), and they also let it query the status of the service. Assuming the same service is activated on multiple endpoints, and assuming the client knows these endpoints (for example, it could maintain a List of them), the client should connect to one or more replicas of the same service. The client periodically refreshes the status of each service, so when it needs to perform a new task (submitted by the user via the UI), it selects the service that is currently least loaded and sends the task to it. Periodically, the client also runs a maintenance procedure to disconnect from one or more overloaded services and to connect to new ones. I created a client proxy using the svcutil tool. I would like each proxy to be usable simultaneously by different threads; for example, in addition to the thread that submits tasks through a proxy, there are the following two periodic threads:
    - a thread that periodically sends a request to the service to obtain its updated state;
    - a thread that periodically selects a proxy to close and instantiates a new proxy to replace it.
    To achieve this, is it sufficient to create an array of proxies and manage their opening and closing in separate threads? I think I read that the proxy method calls are thread safe, so I would not need to take a lock before requesting updates from the service. However, when the maintenance procedure (which runs on its own thread) decides to close a proxy, should I take a lock? Finally, each proxy is also associated with an object that implements the callback interface for the service: are the callbacks (invoked on the client) executed on different client threads? I would like to wrap the proxy management in one or more classes so that it can easily be managed within a WPF application.
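
    A language-neutral sketch of the locking discipline in question, shown in Python precisely because the WCF/.NET specifics are what is being asked: the lock protects the collection of proxies, structural changes (add/remove/close) always happen under it, and individual calls on a proxy happen outside it. The ServiceProxy class and the net.tcp endpoints are hypothetical stand-ins, not the svcutil-generated proxy.

        import random
        import threading
        import time

        class ServiceProxy:
            """Hypothetical stand-in for one generated client proxy (not real WCF)."""
            def __init__(self, endpoint):
                self.endpoint = endpoint
            def submit_task(self, task):
                time.sleep(0.05)                  # pretend to call the service
                return "done: %s" % task
            def get_status(self):
                return random.random()            # pretend "current load" figure
            def close(self):
                pass

        class ProxyPool:
            """The lock guards the list of proxies; calls on a proxy happen outside it."""
            def __init__(self, endpoints):
                self._lock = threading.Lock()
                self._proxies = [ServiceProxy(e) for e in endpoints]

            def snapshot(self):
                with self._lock:                  # copy the list under the lock...
                    return list(self._proxies)

            def least_loaded(self):
                return min(self.snapshot(), key=lambda p: p.get_status())  # ...poll outside it

            def replace(self, old, new_endpoint):
                with self._lock:                  # structural changes always take the lock
                    self._proxies.remove(old)
                    self._proxies.append(ServiceProxy(new_endpoint))
                old.close()                       # close only after it has left the pool

        def status_poller(pool, interval=2):
            while True:
                time.sleep(interval)
                print("loads:", [(p.endpoint, p.get_status()) for p in pool.snapshot()])

        def maintenance(pool, interval=5):
            while True:
                time.sleep(interval)
                victim = max(pool.snapshot(), key=lambda p: p.get_status())  # most loaded
                pool.replace(victim, "net.tcp://replacement:9000/svc")       # hypothetical endpoint

        if __name__ == "__main__":
            pool = ProxyPool(["net.tcp://a:9000/svc", "net.tcp://b:9000/svc"])
            threading.Thread(target=status_poller, args=(pool,), daemon=True).start()
            threading.Thread(target=maintenance, args=(pool,), daemon=True).start()
            for i in range(3):
                print(pool.least_loaded().submit_task("task-%d" % i))

    Whether the generated WCF proxies themselves tolerate concurrent calls, and on which threads the duplex callbacks arrive (this depends on settings such as ConcurrencyMode and the callback's SynchronizationContext), is exactly what should be checked against the WCF documentation; the sketch only shows the locking pattern around the shared pool.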


  • How to set up an Apache SSL proxy to OpenERP 7 running in a VM?

    - by Johnbritto
    I have installed OpenERP v7 (server, web, addons) from Launchpad in an Ubuntu 12.04 virtual machine. I configured an SSL reverse proxy on the virtual machine, and my configuration for <VirtualHost *:443> is:

        ServerName openerp.mydomain.net
        ServerAdmin openerp@localhost
        SSLEngine on
        SSLCertificateFile /etc/ssl/openerp/server.crt
        SSLCertificateKeyFile /etc/ssl/openerp/server.key
        ProxyRequests Off
        ProxyPreserveHost On
        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>
        ProxyVia On
        ProxyPass / http://172.16.150.14:8069/
        ProxyPassReverse / http://172.16.150.14:8069/
        RequestHeader set "X-Forwarded-Proto" "https"
        # Fix IE problem (http apache proxy dav error 408/409)
        SetEnv proxy-nokeepalive 1
        </VirtualHost>

    On the host, I have configured an Apache reverse proxy for my subdomain in vhost_ssl.conf as:

        SSLEngine On
        SSLProxyEngine On
        ProxyRequests Off
        ProxyPreserveHost On
        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>
        ProxyPass / https://172.16.150.14/
        ProxyPassReverse / https://172.16.150.14/
        SetEnv proxy-nokeepalive 1
        <Location />
            Order allow,deny
            Allow from all
        </Location>

    I have set 172.16.150.14 on the netrpc and xmlrpc interfaces in openerp-server.conf. Now, when I access https://openerp.mydomain.net from Firefox or Chrome, I get redirected to http://openerp.mydomain.net%2C%20openerp.mydomain.net/?db=testingdb, which gives a 404. But when I access the URL from IE 9, https://openerp.mydomain.net works fine. Secondly, if I change the parameter list_db = false, then the links work as expected. Kindly let me know what is creating the bottleneck that redirects the URL to http://openerp.mydomain.net, openerp.mydomain.net/?db=testdb on Firefox and Chrome. I am stuck troubleshooting how to get the URL to work.


  • Reverse proxy on pfSense, Squid or otherwise

    - by Mustafa Ismail Mustafa
    I've been trying to get this to work for days now and it's not working. After bashing my head against the desk enough times, I've decided to man up and ask. I'm desperately trying to set up a reverse proxy on the pfSense box itself: one, because it's a pretty powerful box that is not being utilized to the maximum at all, and two, because I don't have any spare machines to set up Squid (or any other reverse-proxy-capable) server on. So, on pfSense, every time I set up rules (under Services > Proxy Server > General) like so:

        acl surveillance dstdomain surveillance.myweb.local;
        acl camera dstdomain camera.myweb.local;
        http_access allow surveillance AND camera (ad nauseum)

    when I check the services, Squid stops and refuses to restart until I remove those pesky ACLs that are supposed to make my life easier! What am I doing wrong? How can I get it to work? Is there another way/package I can use? Thanks


  • Cross domain javascript form filling, reverse proxy

    - by Michel van Engelen
    I need a JavaScript form filler that can bypass the 'same origin policy' most modern browsers implement. I made a script that opens the desired website/form in a new browser window. With the handle returned by the window.open method, I want to retrieve the inputs with theWindowHandler.document.getElementById('inputx') and fill them (access denied). Is it possible to solve this problem by using ISAPI_Rewrite (official site) in IIS 6 acting as a reverse proxy? If so, how would I configure the reverse proxy? This is how far I got:

        RewriteEngine on
        RewriteLogLevel 9
        LogLevel debug
        RewriteRule CarChecker https://the.actualcarchecker.com/CheckCar.aspx$1 [NC,P]

    The rewrite works (http://ourcompany.com/ourapplication/CarChecker), as is evident in the logging. From within our company site I can run the car checker as if it were in our own domain. Except the 'same origin policy' is still in force. Regards, Michel


  • Using a TS-Gateway through a Apache reverse-proxy

    - by Helder
    Hey all, I've set up a Windows 2008 server as a Terminal Services Gateway to funnel RDP access to a bunch of backend servers. However, since I only need to publish SSL to the outside, I've tried to publish it with our reverse proxy, but it's not working. The Apache box times out while trying to reach the TS Gateway; however, if I ping it straight from the same box, there is connectivity. I've read a bit, and with ISA 2006 you can publish TS Gateways on the internet, so I was wondering if anyone ever got it working with an Apache reverse proxy instead :)


  • Integrated Windows Authentication with Chrome and FireFox

    - by Jaap
    I have a web application which uses claims-based authentication. The STS is ADFS 2.0. When I am on the intranet and use IE, IWA is used and no login dialog appears. When I am in the internet zone, the forms-based authentication of ADFS is used. Just what I want. Chrome and Firefox also work as expected when I am in the internet zone. But when I am in the intranet zone, both present a login dialog instead of using IWA, and supplying my credentials in that dialog does not work; it just keeps repeating the dialog. Any hints?
    UPDATE: I spent about an hour searching the internet before I asked this question, but just another search after asking it turned up the answer :-), a matter of finding the correct keywords. Here is the answer: http://stackoverflow.com/questions/5724377/mvc3-site-using-azure-acs-adfs-continually-prompts-for-credentials-when-using


  • Deluge through a socks proxy server

    - by tapan
    To use Deluge, I use a SOCKS proxy server. Everything was working perfectly until now. I use the following command to create the SOCKS tunnel:

        ssh -D 1080 user@remote-server

    In the proxy settings of Deluge I set the host to 127.0.0.1 and the port to 1080, using SOCKS v4 (for some reason it didn't work with SOCKS v5). This setup was working perfectly until now. Suddenly I cannot download any torrents. The error I get is "connection timed out". I checked the remote server and it seems it is not even being used. What could the error be?
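
    One way to narrow this down is to test whether the ssh -D tunnel itself still forwards connections, independently of Deluge. A small diagnostic sketch (Python, standard library only) that performs the SOCKS4 CONNECT handshake against 127.0.0.1:1080 and tries to reach an arbitrary host through it; the test host is just an example:

        import socket
        import struct

        PROXY_HOST, PROXY_PORT = "127.0.0.1", 1080   # the ssh -D tunnel from above
        TEST_HOST, TEST_PORT = "example.com", 80     # arbitrary host used to test reachability

        def socks4_connect(host, port):
            """Open a TCP connection to host:port through the local SOCKS4 proxy."""
            dst_ip = socket.inet_aton(socket.gethostbyname(host))   # SOCKS4 wants an IPv4 address
            s = socket.create_connection((PROXY_HOST, PROXY_PORT), timeout=10)
            # SOCKS4 CONNECT: version 4, command 1, dest port, dest IP, empty user id, NUL
            s.sendall(struct.pack(">BBH", 4, 1, port) + dst_ip + b"\x00")
            reply = s.recv(8)
            if len(reply) < 2 or reply[1] != 0x5A:                  # 0x5A means "request granted"
                s.close()
                raise ConnectionError("SOCKS4 CONNECT refused: %r" % (reply,))
            return s

        conn = socks4_connect(TEST_HOST, TEST_PORT)
        conn.sendall(b"HEAD / HTTP/1.0\r\nHost: " + TEST_HOST.encode() + b"\r\n\r\n")
        print(conn.recv(200).decode(errors="replace"))
        conn.close()

    If the CONNECT is refused or times out here as well, the problem is the tunnel (or the remote server's outbound connectivity) rather than Deluge's settings.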


  • SSH: Two Factor Authentication

    - by Pierre
    I currently have an Ubuntu Server 12.04 machine running OpenSSH along with Samba and a few other services. At the moment I have public key authentication set up, and I'm wondering if it's possible to add two-factor authentication. I've been looking at Google Authenticator, which I currently use with my Gmail account. I've found a PAM module that looks like it will be compatible; however, it seems that you are forced to use a password plus the generated code. Is there a way to use the Google Authenticator application (or something similar) along with my public key to authenticate to my SSH server?
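
    For context on what the generated code actually is: Google Authenticator implements standard TOTP (RFC 6238), an HMAC-SHA1 over a 30-second time counter truncated to six digits. A self-contained sketch of the computation; the base32 secret below is a made-up example, not a real key:

        import base64
        import hashlib
        import hmac
        import struct
        import time

        def totp(secret_b32, digits=6, period=30):
            """Compute the current RFC 6238 time-based one-time password."""
            key = base64.b32decode(secret_b32.replace(" ", "").upper())
            counter = int(time.time()) // period                      # 30-second time step
            digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
            offset = digest[-1] & 0x0F                                # dynamic truncation (RFC 4226)
            code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
            return str(code % (10 ** digits)).zfill(digits)

        print(totp("JBSWY3DPEHPK3PXP"))   # example secret, prints a 6-digit code

    Whether OpenSSH can be made to require the public key and a PAM-supplied code together is a separate policy question; newer OpenSSH releases (6.2 and later, if I recall correctly) add an AuthenticationMethods option that can chain publickey with keyboard-interactive for exactly that.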


  • Reverse proxy with SSL and IP passthrough?

    - by Paul
    It turns out that the IP of a much-needed new website is blocked from inside our organization's network, for reasons that will take weeks to fix. In the meantime, could we set up a reverse proxy on an internet-based server which will forward SSL traffic, and perhaps the client IPs, to the external site? Load will be light. There is no need to terminate SSL on the proxy. We may be able to poison DNS internally so the original URL keeps working. How do I tell whether I need URL rewriting? Squid/Apache/nginx/something else? Setup would be fastest on Windows 2000, but other OSes are OK if that would help. Simple and quick are good since it's a temporary solution. Thanks for your thoughts!
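
    Since there is no need to terminate SSL, the proxy can in principle be a dumb TCP relay that just shuttles the encrypted bytes. A minimal sketch in Python (standard library only); the listen port and target IP are placeholders. Note that a plain relay cannot pass the original client IP along; that would need something like the PROXY protocol or an X-Forwarded-For header added at a layer that does terminate TLS.

        import socket
        import threading

        LISTEN_PORT = 443                        # where internal clients connect
        TARGET = ("203.0.113.10", 443)           # placeholder for the blocked site's real IP

        def pump(src, dst):
            """Copy bytes one way until either side closes."""
            try:
                while True:
                    data = src.recv(65536)
                    if not data:
                        break
                    dst.sendall(data)
            except OSError:
                pass
            finally:
                src.close()
                dst.close()

        def handle(client):
            upstream = socket.create_connection(TARGET, timeout=30)
            threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
            threading.Thread(target=pump, args=(upstream, client), daemon=True).start()

        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", LISTEN_PORT))
        server.listen(64)
        while True:
            conn, _addr = server.accept()
            handle(conn)

    In practice HAProxy in TCP mode (or nginx/squid) would do the same job with less babysitting; the sketch is only meant to show that no URL rewriting is involved as long as the original hostname still resolves, via the DNS override, to the relay.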


  • Users using Perl script to bypass Squid Proxy

    - by mk22
    The users on our network have been using a Perl script to bypass our Squid proxy restrictions. Is there any way we can block this script from working?

        #!/usr/bin/perl
        ########################################################################
        # (c) 2008 Indika Bandara Udagedara
        # [email protected]
        # http://indikabandara19.blogspot.com
        #
        # ----------
        # LICENCE
        # ----------
        # This work is protected under GNU GPL
        # It simply says
        # " you are hereby granted to do whatever you want with this
        #   except claiming you wrote this."
        #
        #
        # ----------
        # README
        # ----------
        # A simple tool to download via http proxies which enforce a download
        # size limit. Requires curl.
        # This is NOT a hack. This uses the absolutely legal HTTP/1.1 spec
        # Tested only for squid-2.6. Only squids will work with this(i think)
        # Please read the verbose README provided kindly by Rahadian Pratama
        # if u r on cygwin and think this documentation is not enough :)
        #
        # The newest version of pget is available at
        # http://indikabandara.no-ip.com/~indika/pget
        #
        # ----------
        # USAGE
        # ----------
        # + Edit below configurations(mainly proxy)
        # + First run with -i <file> giving a sample file of same type that
        #   you are going to download. Doing this once is enough.
        #   eg. to download '.tar' files first run with
        #       pget -i my.tar   ('my.tar' should be a real file)
        # + Run with
        #       pget -g <URL>
        #
        ########################################################################

        ########################################################################
        # CONFIGURATIONS - CHANGE THESE FREELY
        ########################################################################
        # *magic* file
        # pls set absolute path if in cygwin
        my $_extFile = "./pget.ext";

        # download in chunks of below size
        my $_chunkSize = 1024*1024;             # in Bytes

        # the proxy that troubles you
        my $_proxy      = "192.168.0.2:3128";   # proxy URL:port
        my $_proxy_auth = "user:pass";          # proxy user:pass

        # whereis curl
        # pls set absolute path if in cygwin
        my $_curl = "/usr/bin/curl";

        ########################################################################
        # EDIT BELOW ONLY IF YOU KNOW WHAT YOU ARE DOING
        ########################################################################
        use warnings;

        my $_version = "0.1.0";

        PrintBanner();

        if (@ARGV == 0) {
            PrintHelp();
            exit;
        }

        PrimaryValidations();

        my $val;
        while (scalar(@ARGV)) {
            my $arg = shift(@ARGV);
            if ($arg eq '-h') {
                PrintHelp();
            }
            elsif ($arg eq '-i') {
                $val = shift(@ARGV);
                if (!defined($val)) {
                    printf("-i option requires a filename\n");
                    exit;
                }
                Init($val);
            }
            elsif ($arg eq '-g') {
                $val = shift(@ARGV);
                if (!defined($val)) {
                    printf("-g option requires a URL\n");
                    exit;
                }
                GetURL($val);
            }
            elsif ($arg eq '-c') {
                $val = shift(@ARGV);
                if (!defined($val)) {
                    printf("-c option requires a URL\n");
                    exit;
                }
                ContinueURL($val);
            }
            else {
                printf("Unknown option %s\n", $arg);
                PrintHelp();
            }
        }

        sub GetURL {
            my ($URL) = @_;
            chomp($URL);
            my $fileName = GetFileName($URL);
            my %mapExt;
            my $first;
            my $readLen;
            my $ext = GetExt($fileName);
            ReadMap($_extFile, \%mapExt);
            if (exists($mapExt{$ext})) {
                $first = $mapExt{$ext};
                GetFile($URL, $first, $fileName, 0);
            } else {
                die "Unknown ext in $fileName. Rerun with -i <fileName>";
            }
        }

        sub ContinueURL {
            my ($URL) = @_;
            chomp($URL);
            my $fileName = GetFileName($URL);
            my $fileSize = 0;
            $fileSize = -s $fileName;
            printf("Size = %d\n", $fileSize);
            my $first = -1;
            if ($fileSize > 0) {
                $fileSize -= 1;
                GetFile($URL, $first, $fileName, $fileSize);
            } else {
                GetURL($URL);
            }
        }

        sub Init {
            my ($fileName) = @_;
            my ($key, $value);
            my %mapExt;
            my $ext = GetExt($fileName);
            if ($ext eq "") {
                die "Cannot get ext of \'$fileName\'";
            }
            ReadMap($_extFile, \%mapExt);
            my $b = GetFirst($fileName);
            $mapExt{$ext} = $b;
            WriteMap($_extFile, \%mapExt);
            print "I handle\n";
            while (($key, $value) = each(%mapExt)) {
                print "\t$key -> $value\n";
            }
        }

        sub GetExt {
            my ($name) = @_;
            my @x = split(/\./, $name);
            my $ext = "";
            if (@x != 1) {
                $ext = pop @x;
            }
            return $ext;
        }

        sub ReadMap {
            my ($fileName, $mapRef) = @_;
            my $f;
            my @arr;
            open($f, '<', $fileName) or die "Couldn't open $fileName";
            my %map = %{$mapRef};
            while (<$f>) {
                my $line = $_;
                chomp($line);
                @arr = split(/[ \t]+/, $line, 2);
                $mapRef->{$arr[0]} = $arr[1];
            }
            printf("known ext\n");
            while (($key, $value) = each(%$mapRef)) {
                print("$key, $value\n");
            }
            close($f);
        }

        sub WriteMap {
            my ($fileName, $mapRef) = @_;
            my $f;
            my @arr;
            open($f, '>', $fileName) or die "Couldn't open $fileName";
            my ($k, $v);
            while (($k, $v) = each(%{$mapRef})) {
                print $f "$k" . "\t$v\n";
            }
            close($f);
        }

        sub PrintHelp {
            print "usage:
            -h              Print this help
            -i <filename>   Initialize for this filetype
            -g <URL>        Get this URL\n
            -c <URL>        Continue this URL\n"
        }

        sub GetFirst {
            my ($fileName) = @_;
            my $f;
            open($f, "<$fileName") or die "Couldn't open $fileName";
            my $buffer = "";
            my $first = -1;
            binmode($f);
            sysread($f, $buffer, 1, 0);
            close($f);
            $first = ord($buffer);
            return $first;
        }

        sub GetFirstFromMap {
        }

        sub GetFileName {
            my ($URL) = @_;
            my @x = split(/\//, $URL);
            my $fileName = pop @x;
            return $fileName;
        }

        sub GetChunk {
            my ($URL, $file, $offset, $readLen) = @_;
            my $end = $offset + $_chunkSize - 1;
            my $curlCmd = "$_curl -x $_proxy -u $_proxy_auth -r $offset-$end -# \"$URL\"";
            print "$curlCmd\n";
            my $buff = `$curlCmd`;
            ${$readLen} = syswrite($file, $buff, length($buff));
        }

        sub GetFile {
            my ($URL, $first, $outFile, $fileSize) = @_;
            my $readLen = 0;
            my $start = $fileSize + 1;
            my $file;
            open($file, "+>>$outFile") or die "Couldn't open $outFile to write";
            if ($fileSize <= 0) {
                my $uc = pack("C", $first);
                syswrite($file, $uc, 1);
            }
            do {
                GetChunk($URL, $file, $start, \$readLen);
                $start = $start + $_chunkSize;
                $fileSize += $readLen;
            } while ($readLen == $_chunkSize);
            printf("Downloaded %s(%d bytes).\n", $outFile, $fileSize);
            close($file);
        }

        sub PrintBanner {
            printf("pget version %s\n", $_version);
            printf("There is absolutely NO WARRANTY for pget.\n");
            printf("Use at your own risk. You have been warned.\n\n");
        }

        sub PrimaryValidations {
            unless (-e "$_curl") {
                printf("ERROR:curl is not at %s. Pls install or provide correct path.\n", $_curl);
                exit;
            }
            unless (-e "$_extFile") {
                printf("extFile is not at %s. Creating one\n", $_extFile);
                `touch $_extFile`;
            }
            if ($_chunkSize <= 0) {
                printf("Invalid chunk size. Using 1Mb as default.\n");
                $_chunkSize = 1024*1024;
            }
        }


  • TeamCity NuGet feed HTTP authentication

    - by Mihalis Bagos
    The NuGet feed served by TeamCity is working perfectly, but with one strange problem.

    Local IP (http://192.168.xx.xx:9999/feed/../):
    - Listing through a browser works
    - Accessing packages through Visual Studio 11 NuGet works

    VPN IP (http://55.xx.xx.xx:9999/feed/../):
    - Listing packages through a browser works
    - Accessing packages through Visual Studio 11 NuGet: PROBLEM

    Guest account: everything works fine, both on the VPN and the local IP (so it's purely an authentication problem).

    The problem is, we can't get the user to authenticate. Using the same credentials, no matter what we try, we get 401. The server's VPN IP is whitelisted in Internet Explorer's intranet settings. Any ideas? Basically, HTTP authentication is failing for the VPN although it shouldn't be, since the browser works fine!
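
    To separate a connectivity problem from an authentication problem, it can help to hit the feed URL outside both the browser and Visual Studio and look at the status code plus the WWW-Authenticate challenge (Basic vs. NTLM/Negotiate, which some clients handle and others do not). A quick sketch with Python's requests package; the URL and credentials are placeholders for the ones above:

        import requests  # third-party: pip install requests

        FEED_URL = "http://<vpn-ip>:9999/feed/"   # placeholder for the VPN feed URL above
        USER, PASSWORD = "username", "password"   # placeholder credentials

        anonymous = requests.get(FEED_URL, timeout=15)
        authenticated = requests.get(FEED_URL, auth=(USER, PASSWORD), timeout=15)  # HTTP Basic

        print("anonymous:    ", anonymous.status_code)
        print("authenticated:", authenticated.status_code)
        print("challenge:    ", anonymous.headers.get("WWW-Authenticate"))

    If the challenge turns out to be NTLM or Negotiate rather than Basic, a browser on a whitelisted intranet site will authenticate silently while a client that only sends Basic credentials keeps getting 401, which would match these symptoms.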


  • Configure squid3 to set up a web proxy on Ubuntu 12.04

    - by Gnijuohz
    I am on a LAN and have to use the proxy I am given, which allows only very limited web access. I can't even use Google, github.com or SE sites. However, I can use SSH to log into a server on which I have root access, so basically I can do anything I want with it. So I was thinking that maybe I could use that server as a proxy so I can visit sites through it. I tested it using

        ssh -vT [email protected]

    which gave a proper response; from my own computer I can't do this. I also tried downloading something from gnu.org using wget, which can't be done from my computer either, and it succeeded on that server. I don't know if that's enough to say that the server has full access to the internet, but I assumed so and installed squid3 on it. After trying for a while, I failed to get it working. This is what I got after running squid3 -k parse:

        2012/07/06 21:45:18| Processing Configuration File: /etc/squid3/squid.conf (depth 0)
        2012/07/06 21:45:18| Processing: acl manager proto cache_object
        2012/07/06 21:45:18| Processing: acl localhost src 127.0.0.1/32 ::1
        2012/07/06 21:45:18| Processing: acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
        2012/07/06 21:45:18| Processing: acl localnet src 10.1.0.0/16 # RFC1918 possible internal network
        2012/07/06 21:45:18| Processing: acl SSL_ports port 443
        2012/07/06 21:45:18| Processing: acl Safe_ports port 80 # http
        2012/07/06 21:45:18| Processing: acl Safe_ports port 21 # ftp
        2012/07/06 21:45:18| Processing: acl Safe_ports port 443 # https
        2012/07/06 21:45:18| Processing: acl Safe_ports port 70 # gopher
        2012/07/06 21:45:18| Processing: acl Safe_ports port 210 # wais
        2012/07/06 21:45:18| Processing: acl Safe_ports port 1025-65535 # unregistered ports
        2012/07/06 21:45:18| Processing: acl Safe_ports port 280 # http-mgmt
        2012/07/06 21:45:18| Processing: acl Safe_ports port 488 # gss-http
        2012/07/06 21:45:18| Processing: acl Safe_ports port 591 # filemaker
        2012/07/06 21:45:18| Processing: acl Safe_ports port 777 # multiling http
        2012/07/06 21:45:18| Processing: acl CONNECT method CONNECT
        2012/07/06 21:45:18| Processing: http_port 3128 transparent vhost vport
        2012/07/06 21:45:18| Starting Authentication on port [::]:3128
        2012/07/06 21:45:18| Disabling Authentication on port [::]:3128 (interception enabled)
        2012/07/06 21:45:18| Disabling IPv6 on port [::]:3128 (interception enabled)
        2012/07/06 21:45:18| Processing: cache_mem 1000 MB
        2012/07/06 21:45:18| Processing: cache_swap_low 90
        2012/07/06 21:45:18| Processing: coredump_dir /var/spool/squid3
        2012/07/06 21:45:18| Processing: refresh_pattern ^ftp: 1440 20% 10080
        2012/07/06 21:45:18| Processing: refresh_pattern ^gopher: 1440 0% 1440
        2012/07/06 21:45:18| Processing: refresh_pattern -i (/cgi-bin/|?) 0 0% 0
        2012/07/06 21:45:18| Processing: refresh_pattern (Release|Packages(.gz)*)$ 0 20% 2880
        2012/07/06 21:45:18| Processing: refresh_pattern . 0 20% 4320
        2012/07/06 21:45:18| Processing: ipcache_high 95
        2012/07/06 21:45:18| Processing: http_access allow all

    I deleted some allow and deny rules and added http_access allow all so that all requests would be allowed. After configuring my computer, I got this error:

        Access control configuration prevents your request from being allowed at this time. Please contact your service provider if you feel this is incorrect.

    And the log on the server showed that my TCP requests had all been denied. So, first of all, is what I am trying to do achievable? If so, how do I configure Squid on the server so that I can use it as a proxy to surf the internet? My computer and the server both run Ubuntu 11.04.
Thanks for any help~


  • Firewall Authentication - logon failed

    - by RoseofPurple
    I am attempting to use a WatchGuard Firebox 550e with Fireware XTM 11 to authenticate incoming traffic for RDP access. I have configured the firewall to use my domain controller (a Windows 2000 server farm) for Active Directory authentication and added a couple of user accounts to the users list on the firewall, but when I attempt to log on to the firewall's authentication page, I get "Logon failed". I know that the user names exist and that the passwords are correct. I am also certain that I have told it to log on using Active Directory instead of the FireboxDB. I have tried the username alone, domain\username, and the email address. I believe that the search base is correct (DC=mydomainname,DC=com), and I did not change any defaults for sAMAccountName (nor do I recall changing those items when configuring the domain structure). Any assistance would be appreciated.
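
    One way to take the Firebox out of the equation is to try the same bind directly against the domain controller, using the same username form and search base. A rough sketch with Python's ldap3 package; the DC hostname, account and password are placeholders:

        from ldap3 import ALL, Connection, Server  # third-party: pip install ldap3

        SERVER = Server("dc.mydomainname.com", get_info=ALL)   # placeholder domain controller
        USER = "someuser@mydomainname.com"                     # UPN form; DOMAIN\someuser is also common
        BASE = "DC=mydomainname,DC=com"                        # the search base from above

        conn = Connection(SERVER, user=USER, password="secret")
        if not conn.bind():
            print("bind failed:", conn.result)                 # bad credentials, referral, etc.
        else:
            ok = conn.search(BASE, "(sAMAccountName=someuser)", attributes=["distinguishedName"])
            print("search ok:", ok, conn.entries)
        conn.unbind()

    If the bind fails here too, the problem is on the AD side (account form, password, base DN); if it succeeds, the firewall's Active Directory settings are the more likely culprit.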


  • Does Hotmail really offer two-factor authentication? [closed]

    - by Brian Koser
    I've read multiple news articles that claim Hotmail offers two-factor authentication. One of the articles describes Hotmail's system, saying ...whenever you go to Hotmail...you can choose to get a single-use code–a string of numbers that will be sent via text message to your phone–to use instead of your password. Is this an accurate description of Hotmail's system? If so, does Hotmail really offer two-factor authentication? If you can use either your password or a single-use code, it seems to me that it does not. Is this system really more secure than just having a password? Doesn't this just make an additional "key" available to a hacker? (I must be wrong here, I know the folks at Microsoft are much smarter than I am).


  • Rewrite URL before passing to proxy Lighttpd

    - by futureelite7
    Hi, I'm trying to set up a reverse proxy in lighttpd such that all requests (and only those requests) under /mobile/video are redirected to the / directory of a secondary web server. This is pretty easy in Apache, but I can't for the life of me figure out how to do it in lighttpd.

        $HTTP["url"] =~ "^/wsmobile/video/" {
            url.rewrite-once = ( ".*" => "/test/" )
            proxy.server = ( "" => ( ( "host" => "210.200.144.26", "port" => 9091 ) ) )
        }

    I've tried using the $HTTP["url"] directive, but lighttpd simply ignores those requests and continues to pass the full URL to the secondary server, which of course chokes and throws 404s. However, if I do a global rewrite, then everything gets forwarded to the secondary server, which is also not what I want. How do I go about this task?


  • Lighttpd as a proxy to a https host

    - by Homer J. Simpson
    Hi, I am trying to set up lighttpd as a proxy from one server to another (which is running Apache/SSL), and I'm having trouble with the HTTPS part. I want to be able to capture HTTPS requests and let the Apache server handle them. I'm trying this:

        $SERVER["socket"] == ":443" {
            $HTTP["host"] == "www.mydomain.com" {
                proxy.server = ( "" => ( ( "host" => "123.123.123.123",  # the Apache
                                           "port" => 443 ) ) )
            }
        }

    Normal port 80 requests are working fine. What am I doing wrong?
    Edit: Additionally, error.log doesn't show anything. Requests to https://www.mydomain.com never finish.


  • LDAP Authentication for multiple AD Domains

    - by TrevJen
    I have 3 full-trust domains (two child domains and one root). I need to use LDAP to authenticate domain users. The trick is that I need the application to use an AD server for the child domain BUT proxy the LDAP query and authentication for the root domain. I see that it may be possible with AD LDS plus some trusts and synchronization, but it looks pretty hairy and overly complicated. The short of it is:
    - 3 domains (Parent, ChildA, ChildB)
    - My 3rd-party app will need to use ChildA domain servers to authenticate either:
      a. a user in the Parent domain, or
      b. a user in the ChildB domain
    I already have full trusts between all domains, and regular NTLM authentication works fine (unless you are trying to authenticate with LDAP).


  • Using Linode as a proxy server wrapped in DNS to bypass paid internet services

    - by snihalani
    I have a Linode server that I use for development, and I was thinking of using it as a proxy server. I have noticed that most paid connections allow DNS queries but don't allow HTTP until I pay them. I verified this by flushing my DNS cache and running nslookup on some random websites. How do I create a proxy server that would let me, say, wrap the packets from my computer in DNS packets, relay them to my Linode server with SSH key authentication, and have it act as a broker? Thanks.


  • Using IIS7 as a reverse proxy

    - by Jon
    Hi all, my question is pretty much identical to the question linked below, but it did not get an answer, as they ended up using Linux as the reverse proxy: http://serverfault.com/questions/55309/using-iis7-as-a-reverse-proxy
    I need IIS to serve the main site and Linux (Apache) to be the proxied site(s). So I have:
    - site1.com (IIS7)
    - site2.com (Linux Apache)
    They have subdomains of sub1.site1.com, sub2.site1.com and sub3.site2.com. I want all traffic to go to site1.com, and anything that is for site2.com should be proxied to the Linux box on the internal network (I believe ARR can do this, but I am not sure how). I cannot have Apache doing the proxying, as I need IIS exposed directly. Any and all advice would be great. Thanks

