Search Results

Search found 4618 results on 185 pages for 'websites'.

Page 161/185

  • VBA Solution to VLOOKUP with Hyperlinks

    - by Emily2
    I am looking for help with a VBA solution for preserving hyperlinks when using VLOOKUP in Excel (2010). I have a load of data on Sheet 1 for internal use only, and a cut-down version of it on Sheet 2. Instead of recreating Sheet 2 every time, I want a working version that updates whenever Sheet 1 is updated, so I have used VLOOKUP on Sheet 2 so that only the desired info is returned there. The problem was that many cells on Sheet 1 contain hyperlinks to external websites, and these would not pull through to Sheet 2 using VLOOKUP. With some help, however, the hyperlinks now pull through using the following VBA function:

        Function GetHyperLink(r As Range) As String
            If r.Hyperlinks.Count Then
                GetHyperLink = r.Hyperlinks(1).Address
            End If
        End Function

    And I am using the following formula in the relevant cell(s) on Sheet 2:

        =HYPERLINK(GetHyperLink(INDEX('Sheet 1'!$B$1:$B$10001,MATCH(A4,'Sheet 1'!$A$1:$A$10001,0))),(VLOOKUP(A4,'Sheet 1'!$A$1:$B$10001,2,FALSE)))

    However, the problem is with formatting: every cell on Sheet 2 is formatted blue and underlined, even though some of them do not contain a hyperlink. Can someone help with a VBA solution or formula to fix this last piece of the puzzle? Many thanks, in anticipation.

    Read the article

  • Connecting to a subdomain severs the connection to the domain itself. What's going on?

    - by TheAgent
    Hi all. We have a website on a third-party server (leased, and shared with other websites), and the provider gives access to our SQL Server database through a subdomain of the form mssql.DomainName.com. I was told to use SQL Server Management Studio Express to connect to this subdomain in order to manage the database. After a few tries and many "Timeout" messages, I finally managed to connect to the server; everything was fine. But now I can't connect to DomainName.com any more. When I try to browse DomainName.com with Firefox, it tries to look up the DomainName.com address and fails, telling me "the server was not found". I have to disconnect Management Studio from the server and wait a couple of hours for DomainName.com to become available again, and after that, trying to reconnect to the SQL Server repeats the scenario. While I can't browse DomainName.com directly, I can reach it through a proxy, which suggests the problem is somehow related to the DNS server my computer asks to translate the name to the corresponding IP. Has anyone seen anything like this before? Any ideas? Thanks in advance.
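
    A quick way to see whether the local resolver is what is failing is to resolve both names from the affected machine, outside the browser, before and after connecting with Management Studio. A minimal Python sketch (the domain names are the placeholders from the question):

        # Resolve the main domain and the SQL subdomain and report what the
        # local resolver returns; run it before and after the problem appears.
        import socket

        for name in ("DomainName.com", "mssql.DomainName.com"):
            try:
                print(name, "->", socket.gethostbyname(name))
            except socket.gaierror as err:
                print(name, "-> lookup failed:", err)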

    Read the article

  • What are good systems for managing PHP/MySQL infrastructure?

    - by sbrattla
    I work in a company which is about to migrate most applications from in-house custom-built Java/Tomcat applications to Drupal. Due to company policies, applications and websites need to run on in-house servers, which means we need infrastructure for Drupal (PHP/MySQL) applications. This must have been solved a million times already; I believe this is what web-hosting companies do every day. Even though we work on a much smaller scale than web-hosting companies, I assume it would make sense to approach the task as if we were going to run an internal small-scale web-hosting company: the guys in IT operations would be "responsible" for "offering" web hosting, while developers would use these "services". We have three environments: dev(elopment), test and prod(uction). It would make sense for developers to be able to log in to a system and create/edit/delete dev and test sites as they like. Production sites should be available through the same system, but only to IT ops. We need to work with clusters of web servers, meaning that an administration system should be capable of creating/editing/deleting sites across multiple servers. I know there's no "this is it" answer to my question, but what would be a good place to start? Apart from the actual hardware, what would be a good administration system for this?

    Read the article

  • Hosting company blocking Google bots and crawlers [closed]

    - by Jayapal Chandran
    Hi, I have had a site for the past three years, and it has been very active for the past two. The site was working well until the hosting company started blocking Google's bots. Many pages used to appear on the first page of Google's search results; after they started blocking, my links no longer appeared on the first page but instead turned up five pages in, or did not appear at all. Can hosting companies really be so careless as to block Google's bots without mentioning it to their users? They protect themselves while putting their customers' websites at stake. I display Google ads, and this month I earned only half the usual amount in the first ten days. I have asked other hosting companies, such as Bluehost and HostMonster, whether I can transfer my domain on the condition that they will not block Google's bots, since that indirectly stops the business, so any kind of help would be appreciated. How can I claim what I lost from the hosting company? Which other hosting companies look after their users by informing them of events like changing the IP or blocking Google's bots? I worked really hard to bring my site up, but these people brought it down in a few days. :-(

    Read the article

  • Google 2-step verification: Should my phone know my password? [closed]

    - by Sir Code-A-Lot
    Hi, I just enabled 2-step verification for my Google account. I have installed Google Authenticator on my Android phone, and I set up an application-specific password for the Google account associated with my phone. This works great when just using installed apps like Gmail, Calendar and Google Reader. But if I want to access Google Docs, Google Tasks or any other website that requires a Google login, I don't seem to be able to use an application-specific password. I have to use my real password and then use Google Authenticator to generate a code for the next step. This means that if my phone is stolen, revoking the password issued to my phone is pointless: the phone has already been verified, and all that is needed is my password, which the phone's browser will have remembered. I realize that I can take measures to ensure the phone's browser doesn't remember my password, but that's just not convenient at all. Am I missing something, or is there no elegant solution to this? Should I just let my phone know my real password? As I see it, being able to log in with application-specific passwords on websites (which apparently isn't possible) is the only way I could revoke my phone's access in a meaningful way.

    Read the article

  • Is it possible to do a 301 redirect AND redirect to the requested resource?

    - by Pure.Krome
    For one of our projects we're rebranding the website's name, logo, etc. As such, we need to issue 301 Moved Permanently redirects for all users from the old domain to the new domain. With IIS7 that's pretty simple: we just create a new website that redirects all traffic for the host-headered old domain to the new one. But this loses the originally requested resource. E.g. old domain: www.OldDomain.com; new domain: www.NewDomain.com; user requests www.OldDomain.com/user/PureKrome -> 301 -> www.NewDomain.com. Notice how it goes to the new domain but not to /user/PureKrome? How can I make it go to the new domain and keep the original resource request? I'm guessing the URL Rewrite module for IIS7 might help? Also, what happens if I want to do this: current domain 1: Domain.com, correct domain 1: www.Domain.com; current domain 2: AnotherDomain.com, correct domain 2: www.AnotherDomain.com. Is it also possible to have those in the same IIS website, so any URL to domain.com will 301 to www.domain.com? Right now I'm making two IIS websites, each with a hard-coded 301 (which still means I lose the original resource request, too). Help!
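
    Whatever mechanism ends up doing the redirect (the URL Rewrite module is the usual choice on IIS7), the logic it has to implement is simply to rebuild the Location header from the requested path and query string rather than pointing at the bare new hostname. A small Python sketch of that logic, using the hypothetical domains from the question:

        # Conceptual only: build the 301 Location value by swapping the host
        # and keeping the requested path and query string.
        from urllib.parse import urlsplit, urlunsplit

        def redirect_location(requested_url, new_host="www.NewDomain.com"):
            parts = urlsplit(requested_url)
            return urlunsplit((parts.scheme, new_host, parts.path, parts.query, ""))

        print(redirect_location("http://www.OldDomain.com/user/PureKrome?tab=posts"))
        # http://www.NewDomain.com/user/PureKrome?tab=posts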

    Read the article

  • iptables: blocking large numbers of IP addresses

    - by Twirrim
    I'm looking to block IP addresses in a relatively automated fashion if they look to be 'screen scraping' content from websites that we host. In the past this was achieved by some ingenious Perl scripts and OpenBSD's pf. pf is great in that you can feed it nice tables of IP addresses and it will efficiently handle blocking based on them. However, for various reasons (before my time) the decision was made to switch to CentOS. iptables doesn't natively provide the ability to block large numbers of addresses (I'm told it wasn't unusual to be blocking 5000+), and I'm a bit cautious about adding that many rules to an iptables chain. ipt_recent would be awesome for doing this, plus it provides a lot of flexibility for just severely slowing down access, but there is a bug in the CentOS kernel that is stopping me from using it (reported, but awaiting a fix). Using ipset would entail compiling a more up-to-date version of iptables than comes with CentOS, which, whilst I'm perfectly capable of doing it, I'd rather not do from a patching, security and consistency perspective. Other than those two, it looks like nfblock is a reasonable alternative. Is anyone aware of other ways of achieving this? Are my concerns about several thousand IP addresses in iptables as individual rules unfounded?
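
    For what it's worth, the caution is about how the match is done rather than the raw number: a chain of individual rules is checked linearly per packet, while pf tables and ipset use hashed sets. A rough Python analogy of that difference (not an iptables benchmark, just the list-versus-set lookup cost):

        # Compare membership tests against a 5000-entry list (like a rule chain)
        # and against a hashed set (like a pf table or an IP set).
        import random
        import time

        blocked = ["10.%d.%d.%d" % (random.randrange(256), random.randrange(256),
                                    random.randrange(256)) for _ in range(5000)]
        blocked_set = set(blocked)
        probes = [random.choice(blocked) for _ in range(10000)]

        start = time.time()
        sum(1 for ip in probes if ip in blocked)       # linear scan
        linear = time.time() - start

        start = time.time()
        sum(1 for ip in probes if ip in blocked_set)   # hash lookup
        hashed = time.time() - start

        print("linear: %.3fs  hashed: %.4fs" % (linear, hashed))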

    Read the article

  • REMOTE_USER not getting set?

    - by landed
    I am trying to set up LDAP authentication in Joomla using a plugin called JMapMyLDAP (in fact four plugins, each doing a different job). I need to pull part of a string out of the server variable REMOTE_USER, and this should be visible in phpinfo() (see http://timplummer.com.au/4-how-to-integrate-joomla-3-with-active-directory-using-ldap.html). The issue is that REMOTE_USER is not set, or at least not appearing. A few things to note (if you don't mind): conceptually I don't really understand authentication as a whole; it appears to be a vast subject despite my years working on websites. Yes, I have used ASP and built PHP pages that check a user is who they say they are with a token (/session?) given to just them, so that they are identified when a stateless request is made to the server. That's my level of understanding. This sounds different from basic authentication in Apache, where a username and password sit in a file and the user has to log in via a basic form to get access to the folder/docs, configured via an .htaccess file. OK, so for the LDAP plugin to work I need to get REMOTE_USER, which sounds very reasonable, as how else do we know who is making the request? Thank you.

    Read the article

  • Need advice on setting up a server: FastCGI, suEXEC, speed, security, etc.

    - by lewisqic
    I am running my own dedicated server with CentOS 5 and WHM/cPanel. I would like to configure the server to meet my needs, but I need a little help. Only my own websites will be run on this server. I'm still a little green when it comes to server administration, so please forgive my ignorance. What I would like to have: I need some public directories to be writable (for user image uploads and things like that), but I don't want those directories to have 777 permissions. I need individual accounts to have the ability to set custom PHP settings for their own account without affecting other accounts, whether through a php.ini file, through .htaccess or by any other method. I would like things to run as fast as possible, whether that means using a PHP optimizer or cacher such as eAccelerator, XCache or anything else. I need things to be as secure as possible. Here are my questions: What should I use for my PHP handler: DSO, CGI, FastCGI, suPHP, or something else? Should I be using suEXEC? What are the benefits or drawbacks of this? Which PHP optimizer/cacher is best to use? Are there any other security tips I need to know about all of this? I'd appreciate any advice or direction that can be offered. Thanks!

    Read the article

  • How to install WordPress without a web browser

    - by bvandrunen
    What I am trying to do is automate WordPress website creation for the company I work for. We have lots of information in our database for our customers, and we want to create a WordPress website for each customer. The process works great and we have no trouble with the creation of websites, the transfer of data or anything like that. The problem we do have is that when we buy a new domain (http://www.newdomain.com), our process breaks if the domain takes more than 15 minutes to resolve (we call a stored procedure which installs all the data after the URL is called to install WordPress). We have tried looping, where the process checks whether the domain resolves and keeps trying, but eventually it fails. So what we are looking for is a way to run the install for a URL without the domain actually having resolved yet. I have seen possibilities where you can change the wp-config file, but this doesn't work for us since we have more than one domain and it changes the source URL for all the domains. What we really need is just a way to manually start the install script, through a database call or some other means, that doesn't check whether the domain is resolved or pointing at the server. Thanks for any suggestions. EDIT: All we do to install WordPress is call this URL: http://"newdomain".com/wp-admin/install.php?step=2 - if you change settings in the back end, calling this URL will install WordPress without having to go through the wp-admin/install.php form.
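
    One way to sidestep DNS entirely is to send the install request straight to the server's IP address and carry the new domain in the Host header, so nothing waits for the domain to resolve. A rough Python sketch assuming the third-party requests package; the IP address, domain and form field names below are placeholders, and the fields may differ between WordPress versions:

        # Drive wp-admin/install.php?step=2 against the server's IP, with the
        # new domain supplied only as the Host header.
        import requests

        SERVER_IP = "203.0.113.10"      # the web server's public IP (placeholder)
        NEW_DOMAIN = "newdomain.com"    # the domain that has not resolved yet

        response = requests.post(
            "http://%s/wp-admin/install.php?step=2" % SERVER_IP,
            headers={"Host": NEW_DOMAIN},
            data={
                "weblog_title": "New customer site",
                "user_name": "admin",
                "admin_password": "change-me",
                "admin_password2": "change-me",
                "admin_email": "admin@%s" % NEW_DOMAIN,
                "blog_public": "1",
            },
            timeout=60,
        )
        print(response.status_code)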

    Read the article

  • How to speed up request/response to Django using Apache or another solution?

    - by jbcurtin
    Hey all, I'm mainly a developer, but every now and again I jump into the sysadmin position. For the most part I've gotten away with deploying PHP and Python apps using Apache. I write today because I'm starting to research faster alternatives to Apache, while still keeping some of the core features I require, like PUT and DELETE methods and the ability to connect to a socket via the web server (this I have not tried, but it might be a nice bell and whistle if I ever employ Comet in my apps). As you've probably guessed, I use JavaScript exclusively for all my websites, utilizing deep linking for SEO support. The main area where I'm looking to increase performance is the path from the Django apps and the web server through to the client response. Every day I do my best to keep the memory footprint as small as possible, but I am getting to the end of my rope when it comes to working with Apache. In general, keep in mind that I'm just starting this research, so I'm looking more for material to read than for solutions at this moment. My main questions: Am I missing something about Apache that makes it faster than everything else? What would be a good server environment for deploying just static files? What are some of the leading open-source and paid alternatives?

    Read the article

  • DirectAdmin CentOS 4 server has a virus

    - by Rogier21
    Hello all, I have a problem with a web server that runs CentOS 4 with DirectAdmin. For a few weeks, some websites hosted on it have not been behaving properly for visitors arriving from search engines: those visitors are redirected to some malware site, resulting in a ban from Google. I have now used three virus scanners: ClamAV didn't find anything; Bitdefender found two or three files with a JS infection, which I deleted; AVG finds lots of files but doesn't have the option to clean them. The viruses it finds are JS/Redir and JS/Dropper. The strange thing is: website A (www.aa.com) does not have any infected files (I have gone through all the files manually; it is a custom PHP app, nothing special) but still shows the same virus behaviour, while website B (www.bb.com) was the only one that did have infected files. I deleted all those files and suspended the account, but no luck, still the same problem. I do see log entries on the websites for visits from the search engines, so the DNS entries have not been changed. I have also gone through the httpd config files but cannot find anything. Where can I start looking for this?
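
    As a starting point for the hunt (not a cleaner), a small Python sketch that walks the document roots and flags recently modified files containing patterns common in this kind of JS injection; the path, extensions and patterns are assumptions to adjust for your layout:

        # Flag web files changed in the last 30 days that contain patterns
        # often seen in injected redirect scripts.
        import os
        import re
        import time

        DOCROOT = "/home"   # DirectAdmin sites usually live under /home (assumption)
        SUSPICIOUS = re.compile(
            r"eval\s*\(|base64_decode\s*\(|document\.write\s*\(\s*unescape",
            re.IGNORECASE)
        cutoff = time.time() - 30 * 86400

        for root, dirs, files in os.walk(DOCROOT):
            for name in files:
                if not name.endswith((".php", ".js", ".html", ".htm")):
                    continue
                path = os.path.join(root, name)
                try:
                    if os.path.getmtime(path) < cutoff:
                        continue
                    with open(path, errors="ignore") as handle:
                        if SUSPICIOUS.search(handle.read()):
                            print(path)
                except OSError:
                    pass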

    Read the article

  • High disk I/O activity in CentOS server

    - by triiim
    I have about 16 websites on a dedicated CentOS server, and I am having some problems during high-traffic hours; high disk I/O activity seems to be causing a general slowdown. I've installed atop, and this is what I see at the bottom (the server has just been restarted, which is why the values are so low):

        *** system and process activity since boot ***
          PID    RDDSK     WRDSK    WCANCL  DSK  CMD        1/18
         2176     1.7G      7.3G    854.4M   39  mysqld
          671    1248K      3.0G        0K   13  flush-8:0
          566       0K      1.1G        0K    5  jbd2/sda2-8
         2401   124.2M    529.1M    22408K    3  crond
         2032     2.2G    502.0M        0K   12  nginx
         2360   425.8M    115.3M     4188K    2  httpd

    flush-8:0 and jbd2/sda2-8 are the processes I see in iotop at 99% in the IO column, and they are the processes that write the most to the disk (after MySQL). From what I found on Google, this could be caused by some ext4-related bug; the current kernel is:

        Linux srvr.com 2.6.32-71.29.1.el6.x86_64 #1 SMP Mon Jun 27 19:49:27 BST 2011 x86_64 x86_64 x86_64 GNU/Linux

    I asked the hosting support to update the kernel and they tried, but they now say that the server won't boot with the newly installed kernel and they had to go back to the previous one; they are not helping very much. Does anyone have any idea how I could solve the high disk usage caused by the flush-8:0 and jbd2/sda2-8 processes?
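
    For context, flush-8:0 is the kernel thread that writes dirty page-cache data back to the first disk and jbd2/sda2-8 is the ext4 journal thread, so they mostly reflect write load generated by other processes rather than being culprits themselves. A quick way to watch how much dirty data is queued for writeback while the slowdown is happening, as a minimal Python sketch reading the standard /proc/meminfo fields:

        # Print the Dirty and Writeback counters every five seconds.
        import time

        def dirty_stats():
            stats = {}
            with open("/proc/meminfo") as meminfo:
                for line in meminfo:
                    key, _, value = line.partition(":")
                    if key in ("Dirty", "Writeback"):
                        stats[key] = value.strip()
            return stats

        while True:
            print(time.strftime("%H:%M:%S"), dirty_stats())
            time.sleep(5)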

    Read the article

  • JavaScript loading never completes on many sites

    - by Joe
    I recently moved country and have found that on many websites the page never finishes loading. In some cases no content is ever displayed, but the loading never times out. Loading Developer Tools in Chrome shows me that it is the JavaScript files which never load. For example, this BBC article will never load compatability.js, though it will load all the other JS files perfectly. Google Maps often fails to finish loading, meaning it's impossible to make searches. There seems to be no pattern to which files fail to load (i.e. they don't come from the same CDN). I have tried Chrome, Safari and Firefox on OS X 10.8, and Chrome on my girlfriend's OS X 10.7. I have similar issues on the iPad. In many cases, if I go to the mobile version of the page it seems to load fine. I have run the browsers in private mode, disabled plugins, updated Flash, cleared the cache and flushed the DNS cache, though it would seem that if this is happening on other devices, none of this would help anyway. Is this an ISP issue? And if so, why would it be limited to certain JS files and not all? JS files from the same domain work fine, so I'm not really sure what I should be looking for.
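
    One way to take the browsers out of the equation is to fetch one of the stalling scripts directly, once normally and once through a proxy, and compare. A minimal Python sketch assuming the third-party requests package; the script URL and proxy address are placeholders to fill in from Developer Tools:

        # Fetch the same JS file directly and via a proxy and report the outcome.
        import requests

        URL = "http://example.com/path/to/script.js"          # copy the exact URL from DevTools
        PROXY = {"http": "http://proxy.example.net:8080"}      # any reachable proxy

        for label, proxies in (("direct", None), ("via proxy", PROXY)):
            try:
                resp = requests.get(URL, timeout=15, proxies=proxies)
                print(label, resp.status_code, len(resp.content), "bytes")
            except requests.RequestException as err:
                print(label, "failed:", err)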

    Read the article

  • Client cannot access my IIS7 web server

    - by Soccerwiz
    I have a Windows 2008 web server running IIS 7 with about 25 websites. One of those sites is a SaaS application that is accessed constantly throughout the day. However, one particular client keeps getting blocked from my server. They will be using the service, and then all of a sudden they cannot access the application, or any other site on the server. The entire office of four users is blocked from accessing anything on the web server. A trace route reveals they get all the way to the server before they are blocked. However, they can access a Linux server that is a different VM with a different IP on the same physical server. Also, while they are blocked from their office, they can still access the site from their mobile phones or the local Starbucks. They can also occasionally reset their router and regain access to the web server, as they are on a dynamic IP address. I checked IIS, and it allows all IPs to access the server. There is nothing in the logs that says anything about a user being banned. I really have no idea what is causing this. Could it be a virus on their end? I have even moved the SaaS application to a completely new server in a different location; they were working fine for about a month, and then the problem started occurring again. Are there any hidden blacklists in IIS? Or is it a routing issue on their end?

    Read the article

  • Physical Debian to VMware: vmware-converter, dd image, or otherwise?

    - by Dabu
    We have two Debian Lenny production machines, both running larger commercial websites. These machines now need to be moved, and in the process they need to be virtualized to VMware ESX. If you believe what you read on the internet, there are several ways to accomplish this. The easiest for us would be to use our weekly dd backup of the whole disk; however, I have no experience with this kind of approach, or whether it is really possible. The second-best way would be to run an application on the source machine that virtualizes it and generates an ESX-compatible VM. However, the software is beta and unsupported, and after installation nothing really works (the /etc/init.d/vmware-converter script doesn't actually do anything: start and stop reply with success messages, yet ps shows that there are no new processes). The worst way, with the most work, would be to install a new machine and set it up manually, copying files and databases as needed. This part is clear in its execution, and my question(s) do not touch it. Is my first way possible? Has anyone done this yet, or better, does anyone have a page with instructions? Or is there a help page that explains how to correctly install, run and use the vmware-converter tool on a Debian installation (it's possible that I did something wrong during installation already)? Thank you.

    Read the article

  • Map Linux drives to Windows 7 for media streaming over the internet

    - by Ortix92
    I'm trying to map a Linux network drive to my Windows 7 laptop; however, this laptop is not on the LAN. At home I simply use Samba, but that obviously won't work over the internet. I'm trying to avoid VPN, so if there are other solutions I would like to know about them. The reason I ask is that my university does this as well: we can simply map folders to our computers without VPN connections. I'm not sure what they are running as servers. The main reason is that I want to be able to access the files stored on my home server wherever I go. They are located in the /home/ folder (videos, music and pictures folders). I'm trying to keep my websites and media separate from each other. I wouldn't mind accessing them from a web interface either, but I would like to keep the directory structure intact. I remember an app like that coming with Winamp, and I ran it on my Windows PC (as the server); unfortunately it doesn't work for Linux. Any ideas on what I could use? Would XBMC be able to help me out with this? I did do some research but I couldn't find any concrete answers.

    Read the article

  • sudoer scheme to allow useful access to another web developer yet retain future control of a virtual private server

    - by Tchalvak
    Background: virtual private server. I have a virtual private server that I'm looking to host multiple websites on, and to provide access to another web developer. I don't care about putting too many constraints on him, though I wouldn't mind isolating the site that he'll be developing from the other sites on the server that I will develop. The problem: retaining control. Mainly I want to make sure that I retain control over the server in the future. I want to reserve the ability to create/promote/demote accounts and other administrative functions that don't deal with web software. If I make him an admin, he can run sudo su -, become root and remove root control from me, for example.

    I need him not to be able to:
    - take away other admins' permissions
    - change the root password
    - have control over other security/administrative functions

    I would like him to still be able to:
    - install software (through apt-get)
    - restart Apache
    - access MySQL
    - configure MySQL/Apache
    - reboot
    - edit web-development configuration-type files in /etc/

    Other standard setups would be happily considered. I've never really set up a good sudoers file, so simple example setups would be very useful, even if they're only somewhat similar to the settings I'm hoping for above. Edit: I have not yet finalized permissions, so standard, useful sudo setups are certainly an option; the lists above are more what I'm hoping I can do, and I don't know whether that setup is possible. I'm sure that people have solved this type of problem before somehow, though, and I'd like to go with something somewhat tested as opposed to something I've homegrown.

    Read the article

  • Looking for the easiest, simplest solution to run a customised DNS server for my local network on Windows 7

    - by Jamie G
    I need to point some hostnames, such as http://testing.server/, at a fixed IP address on my local network. I can do this easily on one computer using the hosts file, but I need it to work for all machines on my network. I think the best way to do this will be to set up my own DNS server and add the custom DNS entries there. However, I'm looking for the simplest way possible to do this - I really don't want to spend hours setting up Unix servers and running tricky terminal-based scripts just for this! My server is a standard Windows 7 machine. My dream would be a nice, simple Windows program with a GUI where I could enter my ISP's DNS server and it would use those records, unless I had specifically set up my own DNS for a domain to use instead. If it had a web-based admin system that was accessible from another computer on the network, that would be even better. Does anyone know of anything that can do this? Many thanks indeed.
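
    For reference, this is roughly what such a tool does under the hood: answer a handful of local names itself and forward everything else to the ISP's resolver. A hedged Python sketch using the third-party dnslib package; the names, addresses and upstream resolver are placeholders, and it needs administrative rights to bind UDP port 53:

        # Tiny forwarding DNS server: fixed answers for local names, everything
        # else relayed to the upstream resolver.
        import socket
        from dnslib import DNSRecord, RR, QTYPE, A

        OVERRIDES = {"testing.server.": "192.168.1.50"}   # name -> LAN IP (placeholder)
        UPSTREAM = ("8.8.8.8", 53)                        # your ISP's DNS would go here

        server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        server.bind(("0.0.0.0", 53))

        while True:
            data, client = server.recvfrom(512)
            request = DNSRecord.parse(data)
            qname = str(request.q.qname)
            if qname in OVERRIDES:
                reply = request.reply()
                reply.add_answer(RR(qname, QTYPE.A, rdata=A(OVERRIDES[qname]), ttl=60))
                server.sendto(reply.pack(), client)
            else:
                relay = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
                relay.settimeout(3)
                relay.sendto(data, UPSTREAM)
                try:
                    server.sendto(relay.recvfrom(512)[0], client)
                except socket.timeout:
                    pass
                finally:
                    relay.close()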

    Read the article

  • I have bought a custom domain and am using it with Gmail. All my mail is being marked as spam. What can I do?

    - by Leonnears
    A while ago I purchased my own custom domains for my websites. Before I moved them to Gmail, I just created the e-mail accounts in my cPanel at Bluehost.com and worked from there. With that setup I could send and receive e-mail fine, and it wouldn't be marked as spam. Now I have moved these custom domains to send and receive e-mail through Gmail using Google Apps. I have done everything: I have marked the domains as "Authorized", and I believe that should be enough for the mail I send from these custom addresses not to be marked as spam. If it matters, I have configured my iPhone to use these custom domains and I'm sending all the e-mail from it. What can I do? I started doing all this today, but apparently the DNS changes have already taken place. Is there something I have to do, or is it just a matter of waiting 48 hours for my mail to stop being marked as spam by other providers? EDIT: If I send mail via Gmail itself, the mail is delivered fine. If I use my iPhone, however, it gets marked as spam.

    Read the article

  • Seeing traffic destined for other people's servers in Wireshark

    - by user350325
    I rent a dedicated server from a hosting provider. I ran Wireshark on my server so that I could see incoming HTTP traffic destined for my server. Once I ran Wireshark and filtered for HTTP, I noticed a load of traffic, but most of it was not for anything hosted on my server and had destination IP addresses that were not mine, with various source IP addresses. My immediate reaction was to think that somebody was tunnelling their HTTP traffic through my server somehow. However, when I looked closer I noticed that all of this traffic was going to hosts on the same subnet, and all of these IP addresses belonged to the same hosting provider I was using. So it appears that Wireshark was capturing traffic destined for other customers whose servers are attached to the same part of the network as mine. Now, I always assumed that on a switched network this should not happen, as the switch only sends data to the required host and not to every attached box. I assume in this case that other customers would also be able to see data going to my server. As well as the potential privacy concerns, this would surely make ARP poisoning easy and allow others to steal IP addresses (and therefore domains and websites)? It would seem odd for a network provider to configure the network in such a way. Is there a more rational explanation here?

    Read the article

  • Uninstalled Server 2008, now the router won't handle DHCP

    - by john
    My setup is this: a server behind a router, with a switch connected to the router and multiple computers behind the switch. The router used to serve DHCP and DNS; a couple of days ago I installed AD, DNS and DHCP on the server, and the server handed out the IPs. For various reasons we had to uninstall the domain on our server, so I removed the AD, DHCP and DNS roles and set the router back to serving DHCP and DNS. Now I can't get computers onto the network. I reset my router back to factory defaults, and if I plug a computer directly into the router I can get an IP address, but all the computers behind the switch can't get an IP address and can't see the router. All my computers say "unidentified network", and if I ping the router it says the host is unreachable. On the other hand, my wireless devices are just fine and connect with no problem. For the desktops, ipconfig /release doesn't release anything and /renew can't find a server to renew on. My router log shows several FIN scans, but they are from innocuous websites (Google, Netgear), and it shows a couple of smurf attacks, but they are all from my external IP. Any ideas? The server isn't even connected to the router right now, and all the computers are set for dynamic IP addresses. I don't know what else to try. Any help?

    Read the article

  • 412 Precondition Failed error only occurs on certain networks

    - by Andy
    One of my favorite websites, http://jessiejofficial.com (yes, I'm a Jessie J fan :')), has recently started displaying the error message "412 Precondition Failed" whenever I visit it from my home network, even when I use Tor Browser. At first I thought this was an issue with the whole website; however, I have contacted the web developer and he has said that there have been plenty of hits within the last 48 hours. Plus, I discovered tonight that I can access the website from my phone over the mobile network. So it appears to be just my network, as all of the devices in my house connected to the WiFi display the same error when I try to visit any page of the site. However, there have been no changes to our network that we are aware of or have noticed since the website was last accessible, and I have just heard that another person in a different part of the country is experiencing the same difficulties. Any help, advice or suggestions would be appreciated greatly. Update: when trying to ping jessiejofficial.com in the Windows command prompt, the request times out on all four attempts, on any computer connected to the wireless network. I can now also confirm that the same thing occurs on my MacBook Pro.
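
    A quick check from the affected network is to request the page outside a browser and look at the headers that come back with the 412; a Server or Via header can show whether an intermediary, rather than the site itself, is answering. A minimal sketch assuming the third-party Python requests package:

        # Show the status code and a few interesting response headers.
        import requests

        try:
            resp = requests.get("http://jessiejofficial.com/", timeout=15,
                                allow_redirects=False)
            print(resp.status_code)
            for key in ("Server", "Via", "X-Cache", "Content-Type"):
                print(key, ":", resp.headers.get(key))
        except requests.RequestException as err:
            print("request failed:", err)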

    Read the article

  • Why is my Toshiba Satellite L675D laptop not connecting to my HD TV using an HDMI cord?

    - by Akasha Eyre
    I'm having a problem connecting my Toshiba Satellite L675 laptop to my Samsung HD TV. I've done it before using an HDMI cord, but now, for some odd reason, it's not connecting at all. Before, my computer screen would go completely black for a second and then come back, and (with the laptop attached to the TV with the HDMI cord and the TV on the proper channel) the TV would make a sound, come on and work beautifully. Now I try hooking it up and nothing happens: no black screen, no sound. The TV screen just says there's no connection and that I need to check the power source. I'm still checking around, trying to find out if there's a virus on my computer (since my dad and I have the same computer and he can connect his to the TV with no problems), or if I deleted something that had to do with the connection. I've tried turning my computer off and letting it sit for a while, and I've tried checking other websites, but I have come up with nothing.

    Read the article

  • How do I set up a proper VPN for my friends to play LAN games AND give them internet access?

    - by Gizmo
    I'm trying to set up a VPN on my local network, but everyone who connects to me gets access to my laptop and not to the internet or to other devices on the network. How can I properly configure my VPN on Windows to work correctly (giving the remote PC internet access plus access to all devices on my network)? Or is there software for Windows that makes creating a VPN server easier, or maybe a VMware image of a Linux VPN server? I can't find any of those! My requirement is that my friends don't have to install additional software; they have to be able to connect with the default Windows tools. My OS is Windows 8 Standard edition (not Pro or Enterprise), OEM. Most of my friends also have Windows 8, some Windows 7. Extra info: my device is in the DMZ (demilitarized zone), i.e. NAT is disabled for my device so it's accessible from the WAN. I can access files, websites and services on other devices on my network, and all devices can access file shares, websites and all other services on my device. When the VPN is enabled, everything works except that the client is unable to get internet access or reach any device on my network; the client only has access to my device.

    Read the article
