Search Results

Search found 4618 results on 185 pages for 'websites'.

  • OS X Snow Leopard 10.6 intermittently refuses to load websites the first time

    - by Brandon
    Many times when I am browsing the web, Snow Leopard will sit and load a site for 20 seconds or more, until it times out and says it cannot be displayed. If I refresh, it loads RIGHT away, every time. The issue is intermittent, happening anywhere from once every couple of days to a few times a day. So the long and short of it is this:
    - Aluminum MacBook (non-Pro), 2.4GHz Core2Duo, 4GB DDR3
    - I am using 10.6.6, but I have had this issue since 10.6.0
    - It happens in Firefox, Chrome, and Safari
    - I have flushed my DNS (using the command 'blablabla flush')
    - I am using custom DNS servers, which I hoped would fix it, but it had no effect*
    - I am running Apache currently, but haven't been for most of the time
    - I've reformatted multiple times, always experiencing the issue
    - I am on Cox cable internet, with a Motorola Surfboard and a Belkin F6D4230-4 v1 (Pre?) N wireless router. I've put the router in G only, N only, and G+N, to no effect
    - It seems to be domain dependent, as I can sometimes load the Google cache right away, and sometimes other sites will load while Google refuses
    - My PowerBook G4 with Leopard, other Windows XP laptops, and my wired Win7 desktop do not suffer from the issue
    *I recently started using these to escape the awful Cox redirect page on timeouts.
    I'm almost positive the issue has happened on other networks, but I can't recall a specific instance (I have a terrible memory). The problem is intermittent and fixable enough (I just have to wait until it times out and hit refresh once), but incredibly annoying, since I'm constantly reading documentation from a large variety of sites.
    EDIT: To clarify, this happens with ALL sites, not only specific sites. I haven't been able to detect any pattern to the failures, but one day google.com will refuse to load while reddit.com will load, and the next day vice versa. Keep in mind that waiting for a timeout and hitting refresh loads the page right away, every time. If I don't wait for the timeout, opening more links, hitting refresh, and clicking the link a billion times have no effect. It seems to be domain neutral, affecting sites seemingly at random. It doesn't seem to have anything to do with connection inactivity either, because I will be SSHed into different servers, uploading files, browsing, downloading, etc., and it will just quit loading jquery.com (for example) until I sit and wait for a timeout. /EDIT
    This is my last resort. Please, someone, tell me what is happening. Thank you.

  • How to block access to websites on a linux box [closed]

    - by user364952
    I'm just curious how many ways people can come up with to block access to a website. I simply intended to block (my own) access to news.ycombinator.com to stop my productivity drain. The first two methods I thought of were editing the hosts file to resolve news.ycombinator.com to 0.0.0.0, or adding a rule to iptables:

        iptables -A OUTPUT -d news.ycombinator.com -j REJECT

    Disclaimer: I do realise that the above two methods are easily bypassable. It doesn't help that this is my own machine. What other ways does the internet know?
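
    For concreteness, a minimal sketch of the hosts-file method described above (exactly which hostnames the site answers on is an assumption; add whatever variants apply):

        # /etc/hosts -- resolve the site to a non-routable address
        0.0.0.0    news.ycombinator.com
        0.0.0.0    www.news.ycombinator.com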

  • IIS6 host multiple websites under same sub-domain (or something similar)

    - by user28502
    I'm trying to figure out a structure for a hosted application that I'm working on. I've got a domain, let's call it app.company.com (a sub-domain of company.com, of course), that is set up to redirect to my IIS 6 web server. I would like to set up one website in IIS for each client that will use this application, and have the URL scheme be like this:
    app.company.com/clientA -- would point to the ClientA website in IIS
    app.company.com/clientB -- would point to the ClientB website in IIS
    Do you guys have any pointers or best practices for my scenario?

  • Running two different domains on one IP address

    - by Akshar Prabhu Desai
    Here is my Apache configuration file. I have two domain names running on the same IP, but I want them to point to different webapps. In this case, both point to the one intended for e-yantra.org; if I copy-paste the akshar.co.in part before the e-yantra.org one, both start pointing to akshar.co.in. I have two A DNS records (one per domain name) pointing to the same IP.

        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerName www.e-yantra.org
            ServerAdmin [email protected]
            DocumentRoot /var/www
            <Directory />
                Options FollowSymLinks
                AllowOverride All
            </Directory>
            <Directory /var/www/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            <Directory /var/www/ci/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            <Directory /var/www/db2/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog /var/log/apache2/error.log
            # Possible values include: debug, info, notice, warn, error, crit,
            # alert, emerg.
            LogLevel warn
            CustomLog /var/log/apache2/access.log combined
            Alias /doc/ "/usr/share/doc/"
            <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
            </Directory>
        </VirtualHost>

        <VirtualHost *:80>
            ServerName www.akshar.co.in
            ServerAdmin [email protected]
            DocumentRoot /var/akshar.co.in
            <Directory /var/akshar.co.in/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
        </VirtualHost>

  • Does anyone using GoDaddy shared Windows webhosting have multiple websites on it and face this problem?

    - by Amr ElGarhy
    I have multiple websites on the same shared hosting on a GoDaddy server; it's the Deluxe Hosting - Windows plan. I asked a question about this before: http://serverfault.com/questions/13906/how-to-fix-subfolders-iis7-functionality But I feel that no one is facing this problem except me, so I want to know what I am doing wrong, or if someone has had the same problem, please tell me. All my websites are in subfolders of the root folder, and the problem is that all links show up like this: www.example.com/example/...., www.anotherwebsite.com/anotherwebsite/.... such as this: http://amrelgarhy.com/ That is, the folder name shows in the URL. I did all I could and discussed it with GoDaddy at length, but they always say it's an IIS7 problem. Have you faced this problem before, or do you know a solution for it?

  • Tomcat and IIS6 on one server, multiple websites

    - by Rafe
    Windows Server 2003 with IIS6 and Tomcat 6.0.26. Three sites run on this server: one is PHP, one is ASP, and the third uses Tomcat. The first two work flawlessly. The third is being a pain. Tomcat is installed, and if called on the server itself by IP address xxx.xxx.xxx.xxx:8080, it shows the welcome page. Calling the same address from an external machine gives a 'could not connect' error. Calling the IP address from within the server on standard port 80 gives me an error 324 (net::ERR_EMPTY_RESPONSE). I have tried using httpcfg to force listening only on ports within the IP range I'm using, in a manner similar to question 35650 here on Server Fault, but with no success. There are actually ten IP addresses on this machine, but I'm only using three of them at this point. Any pointers on troubleshooting this would be appreciated!
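
    For reference, a sketch of the httpcfg approach mentioned above (the address is a placeholder; httpcfg ships with the Server 2003 Support Tools, and IIS needs a restart afterwards):

        :: show which addresses http.sys currently claims
        httpcfg query iplisten
        :: restrict http.sys to one address, leaving the others free for Tomcat
        httpcfg set iplisten -i 192.168.0.10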

  • Keeping websites from knowing where I live

    - by D Connors
    This question is about usability and practicality, not security. I live in Brazil and, apparently, every single website I visit knows it. Usually that's okay, but there are quite a few sites that don't use that information well. For instance:
    - Bing keeps thinking that Brazilian pages are way more relevant to me than American ones (which they're not).
    - google.com always redirects me to google.com.br.
    - Microsoft automatically sends me to horribly translated support pages in Portuguese (which would just be easier to read in English).
    These are just a few examples. Usually it's stuff I can live with (or work around), but some of it is just plain irritating. I have geolocation disabled in Firefox, so I guess they're either getting this information from my IP or from Windows itself (which I bought here). Is there a way to avoid this? Either tell them nothing, or make them think I live somewhere else?

  • Anti-spam measures for websites

    - by acidzombie24
    What anti-spam measures should I consider before launching my user-content website? Some things I have considered:
    - Silent JavaScript-based CAPTCHA on the register page (I do not have an implementation)
    - Validate emails by forcing a confirmation link/number
    - Allow X comments per 10 minutes and Y per 2 hours (I am considering excited first-time users who want to experience the site)
    - Disallow links until the user is trusted (I am not sure how a user will become trusted)
    - Run all comments, messages, etc. through a spam filter
    - Check to see whether messages are duplicates or similar (I may not bother with this; I'd like the system to be strong without it)
    I also timestamp everything, which I can then retrieve as a long on my administrator page. What other measures can I take or consider?

  • Block access to certain websites from other computers on shared internet connection

    - by Dheeraj V.S.
    I have very little knowledge about networking. I have an internet connection on my PC (running Windows XP), and I've enabled internet connection sharing to the other computers on my network, which are connected through a Wi-Fi router. I want to block the use of GMail (in general, specific domains) from the other computers, but I should still be able to access them from my own machine. I can't edit my hosts file, because then my machine too would lose access. I guess I need to configure my wireless router for that, but I have no idea how. My wireless router is a Zyxel P-660HW-T1 v2. I'd appreciate any pointers.

  • Can't access some websites with any browser

    - by Charles Kingsmill
    I'm running Windows 7 64-bit on a new Samsung laptop, accessing the internet via an ethernet cable to my university's ISP. Some sites work fine (e.g. google.com), but I can't access others at all (microsoft.com, topshop.com). I can't connect to those sites in safe mode with networking, and ping and tracert both fail. There's no proxy. Other users can connect successfully to these sites using my cable and socket. I've tried all the following with no success:
    - using various browsers (IE9, FF, Chrome)
    - creating a new user
    - updating drivers
    - clearing the DNS cache
    - using OpenDNS and Google's DNS
    - turning off Avast
    - tweaking the MTU
    - running the MS malicious software removal tool
    - running Spybot S&D
    - reviewing the hosts file
    - disabling the IPv6 options
    - repairing / resetting winsock settings
    - disabling advanced JavaScript options
    I have run out of ideas... can anyone see anything I've missed?!
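
    Since MTU tweaking is already on the list above, one check worth writing down is a don't-fragment ping, a minimal sketch (1472 assumes a standard 1500-byte Ethernet MTU minus 28 bytes of ICMP/IP headers):

        :: from a command prompt: send a non-fragmentable, maximum-size packet;
        :: failures below 1472 bytes suggest an MTU black hole on the path
        ping -f -l 1472 microsoft.com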

  • All websites migrated from server running IIS6 to IIS7

    - by Leah
    Hi, I hope someone will be able to help me with this. We have recently migrated all of our clients' sites to a new server running IIS7; all the sites were originally running on a server with IIS6. Ever since the migration, lots of our clients have been reporting error messages. There seem to be quite a number of issues related to sending emails, and we have also had the following error message reported by several different clients:

        Server Error in '/' Application.
        Validation of viewstate MAC failed. If this application is hosted by a Web Farm
        or cluster, ensure that <machineKey> configuration specifies the same
        validationKey and validation algorithm. AutoGenerate cannot be used in a cluster.
        Description: An unhandled exception occurred during the execution of the current
        web request. Please review the stack trace for more information about the error
        and where it originated in the code.
        Exception Details: System.Web.HttpException: Validation of viewstate MAC failed. [...]

    I have read elsewhere that this error can appear if a button is clicked before the whole page has finished loading. But as this error has now appeared on multiple sites, and only since the server migration, it seems to me that it must be something else. Could someone tell me if there is something specific which needs to be changed for .NET sites when they are moved from a server running IIS6 to a server running IIS7? I don't deal with the actual servers very much, so I'm afraid this is very much a grey area for me. Any help would be very much appreciated.
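
    For what it's worth, the error's advice about <machineKey> corresponds to an explicit entry in each site's web.config; a minimal sketch (the key values are placeholders that must be generated, never copied from an example):

        <!-- web.config: pin the keys so viewstate MACs validate consistently -->
        <system.web>
          <machineKey validationKey="[generated hex key]"
                      decryptionKey="[generated hex key]"
                      validation="SHA1" decryption="AES" />
        </system.web>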

  • block certain websites from browser

    - by phunehehe
    Hello there, a friend of mine (who is not a geek) asks me how to stop her little brother from playing web games on her computer. She currently uses Chrome and IE, and I have never done this before, even in Firefox. I would prefer a solution that is simple and does not require additional applications. Although it seems unlikely, is there a solution that works for all browsers (i.e. one I set up once and never have to redo for a new browser)? Thanks.
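
    One browser-agnostic sketch, assuming a Windows machine and that the game sites are known (the hostname below is a placeholder): entries in the system hosts file apply to every browser at once, with no extra software.

        # C:\Windows\System32\drivers\etc\hosts  (edit as Administrator)
        # point each unwanted site at the local machine
        127.0.0.1    some-game-site.example.com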

  • Tool to highlight and annotate text on websites?

    - by doug
    When I read something, I need to underline what I like or what I think is important, and to take notes about what I've read, next to the relevant paragraph. Does anyone know a Firefox add-on, or something else (another browser, any other application), which provides such functionality?
    PS: I've tried some research tools, such as Zotero, but that is not what I'm looking for.

  • Weird behaviour with OpenVPN: cannot connect to a few websites

    - by Gaby Solis
    My OpenVPN server is Ubuntu 10.04.4 LTS and the OpenVPN version is 2.x. My client is on Windows 7. He can access most sites, but not YouTube, Facebook, Twitter, groups.google.com, etc. My server.conf is:

        local x.x.x.x
        port 1194
        proto udp
        dev tun
        ca /etc/openvpn/keys/ca.crt
        cert /etc/openvpn/keys/server.crt
        key /etc/openvpn/keys/server.key
        dh /etc/openvpn/keys/dh1024.pem
        server 10.8.0.0 255.255.255.0
        push "redirect-gateway def1"
        push "dhcp-option DNS 8.8.8.8"
        client-to-client
        keepalive 10 120
        comp-lzo
        persist-key
        persist-tun
        status /etc/openvpn/keys/openvpn-status.log
        verb 4

    I can access YouTube etc. using an SSH tunnel + SOCKS proxy, and the Ubuntu server itself can access all sites, so nothing is wrong with the Ubuntu server. Given how little information I can provide, I am not looking for a quick solution. How can I debug?
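
    As a debugging starting point, a sketch of watching the tunnel on the server while the client loads a failing site (interface names and the client address are assumptions; large packets leaving with nothing coming back often indicates an MTU/fragmentation problem with redirect-gateway setups):

        # on the Ubuntu server: traffic for one client on the VPN interface
        sudo tcpdump -ni tun0 host 10.8.0.6
        # and the corresponding traffic on the outside interface
        sudo tcpdump -ni eth0 host facebook.com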

  • Transferring websites from x64 to x86 server

    - by Ke
    Hi, I run an x64 staging server here, along with Solr, Java, etc. However, I am about to get a Linode VPS for production, and I'm quickly realising that x86 is the way to go for their lowest-RAM package (I'm thinking of upgrading later). My staging server is x64 with 12GB of RAM, so going down to 300MB of RAM is going to feel devilishly slow ;/ Here are my questions:
    1) Will I have problems transferring my scripts, DBs, etc. from an x64 to an x86 server? e.g. Solr indexes
    2) Is it worth going for the x86 package? I am probably going to upgrade later down the line, and x64 might be better for servers with more RAM. Should I stick with x64 instead, as there isn't much difference when running with low RAM?
    Cheers, Ke

  • Bandwidth monitor for apache websites

    - by bmaynard
    I am after a web application that will parse Apache log files and record how much bandwidth each user has used. We have several virtual hosts with custom log files, and the bytes transferred are recorded at the end of each log line. However, I can't find an application that will parse multiple log files and display a summary for each site. I believe AWStats can do this, but I want to be able to see all of my clients in one list. If there is something that integrates into Cacti, then that would be perfect.
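
    Not the dashboard asked for, but a minimal sketch of summing per-vhost bandwidth from such logs (the glob is a placeholder for your custom log files; it assumes the byte count is the last field of each line, as described above):

        # total bytes per log file, one line per virtual host
        for log in /var/log/apache2/*-access.log; do
            printf '%s: ' "$log"
            awk '{ sum += $NF } END { print sum }' "$log"
        done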

  • Programs still opening websites through Google Chrome, despite its removal

    - by Russsell Feldman
    Even after I've uninstalled Google Chrome, when other programs want to open a website (e.g. Yahoo! Messenger opening a profile), they still attempt to do so through Chrome, and fail when they can't find it. I've read all the advice on how to make Firefox or IE the default browser on Windows 7. I don't think Google would do this sort of "hijack the default browser" thing, and I'm convinced it must be a trojan, a virus, or even a registry hack. If so, any ideas on how I would go about fixing this without purchasing every antivirus/antitrojan program until it is removed? That method could be an expensive fix.
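
    One way to see what Windows 7 actually has registered for http links, a read-only sketch using built-in tools:

        :: per-user choice of http handler
        reg query "HKCU\Software\Microsoft\Windows\Shell\Associations\UrlAssociations\http\UserChoice"
        :: machine-wide fallback handler
        reg query "HKCR\http\shell\open\command"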

  • Discover all websites that point to an IP address

    - by crn
    Our company owns over 60 domains and a few external IP addresses (several domains share an external IP address). How do I discover all the domains that use a particular IP address (e.g. 69.x.x.7) without going through the DNS information for each domain? The particular webserver is running Windows 2003 and IIS6. Thanks in advance for your help!
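
    Only DNS can prove which outside domains point at an IP, but since the webserver is yours, a sketch of listing what IIS6 itself is configured to answer for (iisweb.vbs ships with Windows Server 2003):

        :: list all IIS6 sites with their IP, port, and host-header bindings
        cscript %SystemRoot%\system32\iisweb.vbs /query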

  • Set up multiple websites on a local web server

    - by mickburkejnr
    I have spent the last few days setting up a CentOS 6 server on my local network so that I can host multiple projects that I'm currently working on. Everything has been set up so that I access the server by typing 192.168.1.10 and the Apache test page comes up. What I'm aiming for is to access different projects by typing 192.168.1.10/project, and then view each project as if it were on its own standalone server. I have thought about just sticking these sites inside folders on the server and accessing them that way, but a lot of my projects use CakePHP, so this isn't feasible. So what I need to do is create VirtualHosts in Apache to allow me to do this, but without using a domain name. I want to stick to using the IP address of the machine (which is static). Any ideas?
    EDIT: I've followed Peter's suggestion, but now I have a new problem. In the httpd.conf file I have entered the following:

        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /www/html/project1
            ServerName local.project1.com
            ErrorLog logs/local.project1.com-error_log
            CustomLog logs/local.project1.com-access_log common
        </VirtualHost>

    And now Apache is saying:

        Starting httpd: Warning: DocumentRoot [/www/html/project1] does not exist

    When it clearly does exist. I've disabled SELinux, and I can confirm it isn't turned on. I've also checked the ownership of the folder, and it's owned by root. I can also save files to these folders using a guest FTP account (which isn't associated with root), so the folders are being listed and can be written to. But when I try the folder in a web browser, it doesn't seem to work either. I've also rebooted the server, and the problem persists. What should I change in order to resolve this?

  • Multiple websites each with an SSL certificate of its own

    - by ServerDown
    Hi, we run CentOS and Plesk with Apache, PHP, and MySQL. There are around 25 sites, and each of them now needs an SSL certificate. The host cannot have more than 16 IPs on the same server. Is it possible to have all these sites use just one IP address and have an SSL certificate set up for each site? If yes, please let me know how I can set this up. Thanks

  • Intermediate SSL Certificates on Azure Websites

    - by amhed
    I have successfully configured an Extended Validation certificate on an Azure Website following this article: http://www.windowsazure.com/en-us/documentation/articles/web-sites-configure-ssl-certificate/ The main (non-technical) stakeholder of the web application went to great lengths to validate that our site is secure. He used this site to check the validity of our SSL: http://www.whynopadlock.com/ That site threw the following error:

        SSL verification issue (Possibly mis-matched URL or bad intermediate cert.).
        Details: ERROR: no certificate subject alternative name matches

    The certificate is installed using IP-based SSL instead of SNI. This is done because some site visitors still use Internet Explorer 8 on Windows XP, which has no support for SNI and throws a security warning. Is my certificate correctly installed? I received three .CRT files from my SSL provider:
    - PrimaryIntermediate.crt
    - SecondaryIntermediate.crt
    - EndCertificate.crt
    This is how I exported our certificate as a .PFX file for Azure:

        openssl pkcs12 -export -out myserver.pfx -inkey myserver.key -in myserver.crt
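
    Note that the command above never includes the two intermediate files in the bundle; a sketch of an export that carries the whole chain (file names as listed above; whether this satisfies the checker is untested here, since the reported error concerns the subject alternative name):

        # concatenate the intermediates, then bundle them into the PFX
        cat PrimaryIntermediate.crt SecondaryIntermediate.crt > chain.crt
        openssl pkcs12 -export -out myserver.pfx \
            -inkey myserver.key -in EndCertificate.crt -certfile chain.crt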

  • Viewing local websites on my iOS device over Wi-Fi

    - by John
    I'm trying to view some local HTML/CSS/JS files in a mobile browser on my iOS device. I thought file sharing might be an option, and it is, but I'm not completely satisfied with it. Any time I try the following, an error occurs. Web sharing is on and available at http://192.168.1.101/~user, but I have to manually copy the files in. If I try to symlink a folder in, so that it can be viewed at ~user/some_dir, by issuing

        $ ln -s /Users/user/dev/some_dir ~/Sites/

    then I get a 403 Forbidden error. I've tried to remedy this by modifying a user.conf file in /private/etc/apache2/ and using the following syntax:

        <Directory "/Users/user/Sites/">
            Options Indexes MultiViews SymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    but nope, still doesn't work; I get a 403 error. If I try to symlink each individual file in, instead of a directory, same error. Any help would be greatly appreciated! I'd just like to symlink directories into ~/Sites and browse them on my iOS device over Wi-Fi. I'm on OS X 10.7 Lion, trying to connect with iOS 5.
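
    A hedged guess at the 403: Apache must hold the execute (search) bit on every directory in the symlink's target path, and home folders usually aren't world-searchable; also note that the Options keyword for following links is spelled FollowSymLinks. A sketch using the paths from the question:

        # check, then grant, traverse permission along the target path
        ls -ld /Users/user /Users/user/dev /Users/user/dev/some_dir
        chmod o+x /Users/user /Users/user/dev /Users/user/dev/some_dir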

  • Certain websites won't load with DHCP server

    - by Simon E
    Hi, I recently configured a server running Windows Server 2008 to take over DHCP from my router, and I disabled the router's own DHCP assignment. All works well, except for certain sites that refuse to load, e.g. "Firefox could not find www.amazon.co.uk":
    http://www.google.co.uk/
    http://www.amazon.co.uk/
    http://www.jobsite.co.uk/
    I think it may have something to do with the .co.uk domain. If I reboot the server, everything goes back to normal, but of course rebooting every other day is not a good solution. Does anyone know of this problem, or what it might be? I am very new to server technology.
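
    Since "could not find" reads like a name-resolution failure, a first debugging sketch from an affected client (8.8.8.8 is just a known-good public resolver for comparison):

        :: ask whichever DNS server the DHCP lease handed out
        nslookup www.amazon.co.uk
        :: ask a public resolver to see whether only your server fails
        nslookup www.amazon.co.uk 8.8.8.8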

  • screen scraper templates for various websites

    - by intuited
    I'm looking specifically for a convenient way to locally archive posts from this and other similar sites. I'd like to separate the question itself from the answers, or maybe crop the question and store it, keeping the page title. Obviously I don't need to store the menu or the various other bits of site interface chrome. The best way to do this would seem to be to associate an XSLT template with a match on the URL, and use that template to pull out the relevant pieces of information and format them. My two-part question:
    1. Is there a tool specifically built for this task? I.e. something that takes a URL and checks it against a map of path-matching expressions to templates, and outputs the result of applying the template to that resource? xmlto seems to be most of the way there, and could probably just be called from a script that does the pattern matching, but something already integrated would be more convenient.
    2. Is such a URL_pattern-to-XSLT_template map publicly available somewhere?
    Question 2.5: Is it legal to do this with sites like this one that have public licenses on their content?
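
    Absent an integrated tool, a minimal sketch of the wrapper script described in question 1 (extract-question.xsl is a hypothetical per-site stylesheet you would write; xsltproc's --html mode tolerates real-world HTML):

        #!/bin/sh
        # pick a stylesheet by crude URL pattern-matching, then apply it
        url="$1"
        case "$url" in
            *serverfault.com/questions/*) xsl=extract-question.xsl ;;
            *) echo "no template for $url" >&2; exit 1 ;;
        esac
        curl -s "$url" | xsltproc --html "$xsl" - 2>/dev/null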
