Search Results

Search found 27592 results on 1104 pages for 'google sites'.

Page 469 of 1104

  • Linux with Windows XP vmware guest unable to access certain Internet hosts

    - by unknown
    Hi, I have an annoying problem. My setup is the following: Debian Linux, 64-bit, VMware Workstation 7 as the host, with Windows XP running as a guest. From Firefox or Internet Explorer I am unable to access a few sites, for example nvidia.com and osdir. I basically get connection timeouts; on the other hand, ping to those sites works. Moreover, Slashdot loads very, very slowly and sometimes degrades to a horrible text-only version. Everything works fine on the Linux host, so I suspect it has something to do with routing on Linux. I recall having a similar problem a long time ago, which was fixed by setting something in /proc. I tried lowering the MTU and TCP window size on Windows, but it did not help. Any idea what is going on?
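
    The symptom pattern (ping succeeds but some TCP connections hang, and only inside the guest) often points at a path-MTU problem. A minimal sketch of one common workaround, assuming the guest's traffic is routed/NATed through the Debian host's own IP stack (e.g. a host-only network plus iptables NAT) rather than through VMware's built-in NAT daemon:

        # on the Debian host: clamp the MSS of forwarded SYNs to the discovered path MTU
        iptables -t mangle -A FORWARD -p tcp --tcp-flags SYN,RST SYN \
                 -j TCPMSS --clamp-mss-to-pmtu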

    Read the article

  • Mailman delivery troubles

    - by stanigator
    I'm not sure if this is a good place to ask this question. It's about the mailing list management software Mailman from GNU. Here are the details:

        Hosting provider: Vlexofree
        Domain: www.sysil.com, with Google Apps
        Mailing list created from the hosting cPanel: [email protected]

    I registered a list of subscribers and tried sending an email to [email protected]. I got the following error message:

        Delivery to the following recipient failed permanently: [email protected]
        Technical details of permanent failure: Google tried to deliver your message, but it was
        rejected by the recipient domain. We recommend contacting the other email provider for
        further information about the cause of this error. The error that the other server returned was:
        550-5.1.1 The email account that you tried to reach does not exist. Please try
        550-5.1.1 double-checking the recipient's email address for typos or
        550-5.1.1 unnecessary spaces. Learn more at
        550 5.1.1 http://mail.google.com/support/bin/answer.py?answer=6596 23si6479194ewy.44 (state 14).

        ----- Original message -----
        MIME-Version: 1.0
        Received: by 10.216.90.136 with SMTP id e8mr1469147wef.110.1264220118960;
            Fri, 22 Jan 2010 20:15:18 -0800 (PST)
        Date: Fri, 22 Jan 2010 20:15:18 -0800
        Message-ID: <[email protected]>
        Subject:
        From: Stanley Lee <[email protected]>
        To: [email protected]
        Content-Type: multipart/alternative; boundary=0016e6dab0931bccc3047dcd2f1e

    Is there any way of fixing this problem? I would like this mailing list to work through my hosting and domain. Thanks in advance.
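
    One likely cause, hedged: the domain's MX records point at Google Apps, so anything addressed to @sysil.com is delivered to Google, which has no [email protected] mailbox, and Mailman on the Vlexofree server never sees the message. A common workaround is to run the list on a subdomain whose MX points back at the hosting server. The zone entries below are only an illustration; the subdomain and mail host names are assumptions:

        ; hypothetical zone entries routing list mail to the hosting server
        lists.sysil.com.    IN  MX  10  mail.vlexofree-server.example.
        ; the list address would then become [email protected]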

    Read the article

  • How can I setup my local Nginx server so I can edit the files?

    - by Shane Grant
    I have my local development machine running Arch Linux, Nginx, PHP-FPM and MySQL. For the websites I am working on to run, the files need to be owned by the http user. The files are currently located in folders like this:

        /srv/http/site1/
        /srv/http/site2/

    When I use the following chown command on the http folder, the sites work fine, but I cannot edit the files with my own user:

        chown -R http.users /srv/http

    When I do this instead, the sites do not work, but I can edit the files:

        chown -R shane.http /srv/http

    How can I make it so that my user can edit the files and the web server can run them at the same time? Thank you
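
    A sketch of one common arrangement (not necessarily how Arch's packages expect it): keep your own user as the owner, make the http group the group owner with read access, and only grant group write where the application actually needs it:

        # your user owns the files, the web server's group can read and traverse them
        chown -R shane:http /srv/http
        chmod -R g+rX /srv/http
        # optional: have new files inherit the http group
        find /srv/http -type d -exec chmod g+s {} +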

    Read the article

  • Can't access certain specific IP/domain on Windows 7

    - by jfrobishow
    I've been banging my head against the wall for some time now. I have a fresh install of Windows 7 Ultimate x86. Some names can't resolve properly; it doesn't matter whether I try Firefox, Chrome or IE, I end up with "server not found" errors. I am behind a router, and all the other computers aren't affected, only this Windows 7 machine. It's only certain sites, e.g. xkcd.com, Facebook's CDN, Google Groups. Things I have tried:

        - ipconfig /flushdns and ipconfig /renew
        - Disabled IPv6 on the NIC
        - Double-checked that the computer is in the right timezone at the right time
        - Reinstalled the NIC driver

    Again, 90% of the sites work, but not these major ones :P Help!
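
    A quick way to separate name-resolution failures from connectivity failures, purely as a diagnostic sketch:

        REM does the name resolve, and against which resolver?
        nslookup xkcd.com
        nslookup xkcd.com 8.8.8.8
        REM if it resolves, is the returned address reachable?
        ping -n 2 xkcd.com
        tracert -d xkcd.com
        REM reset the resolver cache and Winsock state (reboot after the reset)
        ipconfig /flushdns
        netsh winsock reset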

    Read the article

  • ASP.NET website http requests appear to be queueing

    - by scolemann
    We cloned our servers this weekend into a colo. All non-ASP.NET sites are performing great, but ASP.NET sites are very slow. It appears to be an issue with the requests/connections, but I cannot figure out where. The reason I think it is a problem with the connections is that when I launch Fiddler and watch the requests, all requests appear to happen sequentially. Even the static image requests are taking 5 seconds, and another one doesn't start until the first one finishes. MaxConnections is set to 100 in machine.config and the "website connections" are set to unlimited. Any idea what else could be causing this? From machine.config:
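
    The poster's machine.config excerpt did not survive above. Purely as an illustration of the kind of settings usually reviewed when ASP.NET requests serialize like this (placeholder values, not the poster's actual configuration):

        <!-- illustrative only -->
        <system.net>
          <connectionManagement>
            <add address="*" maxconnection="100" />
          </connectionManagement>
        </system.net>
        <processModel autoConfig="true" />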

    Read the article

  • Weird behaviour with OpenVPN: cannot connect to a few websites

    - by Gaby Solis
    My OpenVPN server runs Ubuntu 10.04.4 LTS with OpenVPN 2.x. The client is on Windows 7. It can access most sites, but not YouTube, Facebook, Twitter, groups.google.com, etc. My server.conf is:

        local x.x.x.x
        port 1194
        proto udp
        dev tun
        ca /etc/openvpn/keys/ca.crt
        cert /etc/openvpn/keys/server.crt
        key /etc/openvpn/keys/server.key
        dh /etc/openvpn/keys/dh1024.pem
        server 10.8.0.0 255.255.255.0
        push "redirect-gateway def1"
        push "dhcp-option DNS 8.8.8.8"
        client-to-client
        keepalive 10 120
        comp-lzo
        persist-key
        persist-tun
        status /etc/openvpn/keys/openvpn-status.log
        verb 4

    I can access YouTube etc. using an SSH tunnel + SOCKS proxy, and the Ubuntu server itself can access all sites, so nothing is wrong with the server's connectivity. Given the little information I can provide, I am not looking for a quick solution; how can I debug this?
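
    This pattern (most sites fine over the tunnel, a handful of large-response sites hanging) is frequently an MTU/fragmentation problem on the UDP tunnel. As a hedged first experiment rather than a known fix for this particular setup, OpenVPN can clamp the MSS and fragment oversized packets; the values below are only a starting point:

        # add to server.conf (and mirror in the client config), then restart OpenVPN
        mssfix 1200
        fragment 1200
        # raising verbosity temporarily can also show where packets are being dropped
        verb 6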

    Read the article

  • Why is my vhosts file interfering with my apache deployment?

    - by Avery Chan
    When I enable my vhosts file (i.e. uncomment this line: Include /private/etc/apache2/extra/httpd-vhosts.conf) I am unable to reach localhost. I am able to reach the last virtual host listed in my vhosts file:

        <VirtualHost *:80>
            DocumentRoot "/Users/achan/Sites/epwbst"
            ServerName epwbst
        </VirtualHost>

        <VirtualHost *:80>
            DocumentRoot "/Users/achan/Sites/pxproj"
            ServerName pxproj
        </VirtualHost>

    Typing pxproj in my browser brings up the expected web content, but I am unable to reach epwbst or localhost. If I re-comment the vhosts line in my httpd.conf, I am able to reach localhost (i.e. "It works!") but obviously can no longer reach my virtual hosts. I don't know how to continue troubleshooting this. Why can't I reach localhost when I've got my vhosts turned on?

    OS: Mac OS X 10.7
    Server version: Apache/2.2.21 (Unix)
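
    On Apache 2.2, if name-based virtual hosting is not switched on, multiple <VirtualHost *:80> blocks compete as IP-based hosts and only one effectively answers. A hedged sketch of the usual fix is to enable NameVirtualHost and give localhost its own vhost as the first (and therefore default) entry; the DocumentRoot below assumes the stock OS X layout:

        NameVirtualHost *:80

        # the first entry doubles as the default, so plain http://localhost/ keeps working
        <VirtualHost *:80>
            DocumentRoot "/Library/WebServer/Documents"
            ServerName localhost
        </VirtualHost>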

    Read the article

  • Share Exchange Calendar Outside Organization

    - by CalCurious
    I'm trying to figure out the best way to meet a user's (Corp-A-User's) request to share their calendar with someone at another company (Corp-B-User). We're running Microsoft SBS 2008 with Exchange 2007 and SharePoint. The remote user is running Exchange, version unknown. Corp-A-User wants to give Corp-B-User the ability to create appointments on Corp-A-User's calendar, which naturally requires sharing Free/Busy information. Corp-A-User naturally lacks the vision to see ANY problem with giving Corp-B-User full access to their calendar. But I see the problems with that and would prefer that Corp-B-User have only the ability to see Free/Busy and create appointments. Most of the external publishing options I have thought of, such as WebDAV, allow displaying a user's calendar, but there are problems with security and with the ability to create appointments. Right now, I'm thinking the cleanest solution would be to use a Google calendar along with Google Calendar Sync for the two users' Outlook clients. But I'm not sure there isn't a better way, and I hate the idea of pushing a corporate calendar up to Google, not to mention the issues likely to pop up from the multiple sync paths. Does anyone have a good solution for this scenario that they would be willing to share?

    Read the article

  • how to connect public web server to internal LAN

    - by DefSol
    I have a VPS which is my public web server for all my clients. It's running Server 2008, and I would like it to connect via a secure connection to my internal LAN. I would like this to be routed so that access is bidirectional. I have read about Server & Domain Isolation, but am concerned this may prevent public visitors from reaching the web sites on the server. I currently have a PPTP tunnel, but I want better security (IPsec or SSL, etc.) and it doesn't give me bidirectional access. (In fact my backups aren't copying across, but this could be an ACL issue.) The goal is to provide easy/automated backups of data and SQL databases to my internal LAN, as well as a means to provision new sites and databases from a workflow occurring internally. The internal LAN is Windows-based with ISA 2006 at the perimeter. Thanks

    Read the article

  • Mod_rewrite not working on ISPConfig 3 Server

    - by Akahadaka
    Problem: I recently migrated a Drupal site from a shared hosting server to my own VM. Everything appears to work correctly except clean URLs.

    My VM setup: Ubuntu 10.04, LAMP, ISPConfig 3.

    What I've tried, from reading a number of Drupal forums, in this order:

        1. Checked that mod_rewrite is installed and enabled
        2. Changed PHP from FastCGI to mod_php (though I'd prefer FastCGI or suPHP to avoid tmp/files folders with 777 permissions)
        3. Changed the redirect type to L in ISPConfig under Sites > domain.com > Redirect
        4. Changed /etc/apache2/sites-enabled/000-default:

            <Directory /var/www/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride All
                ...
            </Directory>

    I'm not sure about points 3 and 4; I do want all domains to be able to use mod_rewrite out of the box.

    Question: Have I done something wrong, or am I missing a step? Ultimately I would like to use FastCGI and have clean URLs working on all ISPConfig 3 domains without having to change individual domain settings. Any ideas appreciated; I'll try them all.
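
    One way to test whether mod_rewrite is being honoured for an ISPConfig-managed site at all, independent of Drupal (a sketch; the web-root path is just an example of ISPConfig's usual layout):

        # e.g. /var/www/clients/client1/web1/web/.htaccess
        RewriteEngine On
        RewriteRule ^rewrite-test$ index.php [L]
        # requesting http://yourdomain/rewrite-test should now serve index.php;
        # a 404 instead means AllowOverride isn't reaching this directory in that site's vhost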

    Read the article

  • Latency, Ping and Other Questions

    - by Paulo Cassiano
    In a high-traffic application, like an online auction system, a few ms can determine whether you win or lose the 'battle'. I'm from Brazil. Here, I ping local sites, like UOL, and receive replies in ~11 ms. When I ping US sites, like Rackspace, I receive replies in ~130 ms! The point is: I need a very good infrastructure (like Rackspace [1]) to host my killer online auction application, but there are no Rackspace-like options in Brazil. Assuming that all users are located here in Brazil, is it a sine qua non condition that I host my application here in Brazil? I think ~130 ms is a very high latency, but is it true that all users would see roughly this reply time? Well, where should I host my application? [1] Feel free to point me to any other very good hosting option besides Rackspace; I've cited it only because they're the guys I know.

    Read the article

  • FTP Synchronization software for Mac or PC

    - by evanmcd
    Hi, I've been using FTP Synchronizer for a while and have generally had pretty good results with it. But I've just moved to a Mac full-time (at work as well as at home now), so I want to get a native client if I can. I've tried the only one I've found, SuperFlexibleSynchronizer, but it crashed every time I loaded an FTP-to-FTP sync attempt. The most important features to me are: 1) the ability to sync a large number of files (thousands), as I generally work on sites with large numbers of files; 2) FTP-to-FTP sync. This would be very helpful as I work with some CMS-based sites for which users upload files while on staging, and I don't want to move files locally first before moving them live. Thanks! Evan

    Read the article

  • DKIM-Filter: No Signature Data

    - by Vineet Sharma
    I have installed dkim-filter on Postfix after reading this tutorial: http://www.unibia.com/unibianet/systems-networking/how-setup-domainkeys-identified-mail-dkim-postfix-and-ubuntu-server

    My email now has a DKIM signature, but it still lands in the SPAM folder. Here is the header:

        Received-SPF: neutral (google.com: 69.164.193.167 is neither permitted nor denied by best guess record for domain of [email protected]) client-ip=69.164.193.167;
        Authentication-Results: mx.google.com; spf=neutral (google.com: 69.164.193.167 is neither permitted nor denied by best guess record for domain of [email protected]) [email protected]; dkim=hardfail (test mode) [email protected]
        Received: from promote.a2labs.in (localhost [127.0.0.1]) by promote.a2labs.in (Postfix) with ESMTPA id 34858530E8 for <[email protected]>; Mon, 28 Feb 2011 12:23:07 +0530 (IST)
        DKIM-Signature: v=1; a=rsa-sha256; c=simple/simple; d=a2labs.in; s=mail; t=1298875987;
            bh=bo+H1VYPIHMja2u7i1lnzr4k/j4Pe8iSf79bVw94XpI=;
            h=To:Subject:Message-ID:Date:From:Reply-To:MIME-Version:Content-Type:Content-Transfer-Encoding;
            b=nhTdlnUwo0iUJ92ycQzKSRjw5Pfya0DJcJrAc8Mr2hIv8OLpgzBCzdOMWTGqR5nuUmAzgCGYBhYAM2XZwVxo9JG/iz7oYKysmNQnskFx0TRyW3UOkDWcfHcPnCL6Y7fGzZWinmsyjsg47k+mKZg/e8jqlwTAMOPYKkt5pBz7SM0=

    Also, my mail.err file shows:

        Feb 28 12:17:03 ivineet dkim-filter[32181]: 1F788530E1: no signature data
        Feb 28 12:18:02 ivineet dkim-filter[32181]: 432BA530E2: no signature data

    How do I fix it?
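
    Two details in that header are worth a hedged look: the verifier reports "dkim=hardfail (test mode)", which suggests the selector's DNS record still carries the t=y test flag, and the filter logs "no signature data" for some queue IDs, which often means locally submitted mail is not being signed at all. A sketch of the pieces usually checked (the selector name "mail" comes from the header above; everything else is illustrative):

        ; DNS TXT record for the selector (removing "t=y" takes it out of test mode)
        mail._domainkey.a2labs.in.  IN TXT  "v=DKIM1; k=rsa; p=<your public key>"

        # /etc/dkim-filter.conf: make sure locally submitted mail is treated as internal
        InternalHosts  /etc/dkim-filter.internal   # a file listing 127.0.0.1 and your LAN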

    Read the article

  • Sending email with exim and external sender address

    - by Tronic
    I have the following problem: I want to send emails from a Rails webapp. I set up an Exim server, and when looking at the logs the sending appears to work, but the emails are never actually delivered. I had the same problem with another ISP. The sender address is hosted on another mail server, at another ISP. I think the problem is that sending doesn't work because the sender address isn't hosted on the same server. Do you have any advice on this? The Exim logs tell me the following:

        2011-01-01 14:38:06 1PZ1eo-0000Ga-38 <= <> R=1PZ1eo-0000GY-1p U=Debian-exim P=local S=1778
        2011-01-01 14:38:08 1PZ1eo-0000Ga-38 => [email protected] R=dnslookup T=remote_smtp H=mx1.emailsrvr.com [98.129.184.131] X=TLS1.0:RSA_AES_256_CBC_SHA1:32 DN="C=US,O=mx1.emailsrvr.com,OU=GT21850092,OU=See www.geotrust.com/resources/cps (c)08,OU=Domain Control Validated - QuickSSL(R),CN=mx1.emailsrvr.com"
        2011-01-01 14:38:08 1PZ1eo-0000Ga-38 Completed

    [email protected] is the external sender address! Thank you!

    Edit with more details: when sending a mail from the command line with

        echo "Test" | mail -s Testmail [email protected]

    the log says

        2011-01-01 20:45:24 1PZ7OG-0001Vp-Rx <= root@gustav U=root P=local S=360
        2011-01-01 20:45:26 1PZ7OG-0001Vp-Rx => [email protected] R=dnslookup T=remote_smtp H=gmail-smtp-in.l.google.com [209.85.229.27] X=TLS1.0:RSA_ARCFOUR_MD5:16 DN="C=US,ST=California,L=Mountain View,O=Google Inc,CN=mx.google.com"
        2011-01-01 20:45:26 1PZ7OG-0001Vp-Rx Completed

    and I get the mail in my Gmail account. But when sending via the webapp (when testing locally with sendmail it works fine), I only get this log output:

        2011-01-01 20:50:08 1PZ7Sq-0001X9-L4 <= <> R=1PZ7Sq-0001X7-Jo U=Debian-exim P=local S=1780
        2011-01-01 20:50:11 1PZ7Sq-0001X9-L4 => [email protected] R=dnslookup T=remote_smtp H=mx1.emailsrvr.com [98.129.184.3] X=TLS1.0:RSA_AES_256_CBC_SHA1:32 DN="C=US,O=mx1.emailsrvr.com,OU=GT21850092,OU=See www.geotrust.com/resources/cps (c)08,OU=Domain Control Validated - QuickSSL(R),CN=mx1.emailsrvr.com"
        2011-01-01 20:50:11 1PZ7Sq-0001X9-L4 Completed
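
    One hedged observation on those logs: in both webapp cases the message enters Exim as "<= <>", i.e. with an empty envelope sender (the form bounces use), which receiving servers often accept and then silently discard. A sketch of forcing a real return path from the Rails side, using ActionMailer's sendmail settings (the address is the external sender from the post):

        # e.g. config/environment.rb (Rails 2.x-era syntax)
        ActionMailer::Base.delivery_method = :sendmail
        ActionMailer::Base.sendmail_settings = {
          :location  => '/usr/sbin/sendmail',
          :arguments => '-i -t -f [email protected]'
        }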

    Read the article

  • unusual webpage access problem

    - by user28163
    This is one of the weirdest problems I've ever seen and I'm out of ideas. I cannot access a couple of websites from my home PC (for 3-4 days now), or at least they load really, really slowly (~700b/s). I can access these same websites fine from my laptop on the same IP address.

        - The websites originate from different countries and are huge, popular sites.
        - I have tried both of my PC's network cards.
        - I have reformatted my PC.
        - I have reset my router to factory settings.
        - I have created virtual machines on my PC, and from inside those I cannot access the webpages either.
        - I have flushed my DNS.
        - I tried setting my DNS server addresses to 8.8.8.8 and 8.8.4.4.

    I can only access the webpages from my PC if I first connect to an outside VPN. Again, these sites work fine on the two other PCs in this house (same ISP and internet connection). Help please :)
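
    Given that the same sites work from this PC once a VPN is in the path, an MTU problem on that machine's connection is one hedged suspect. A quick probe (the site and interface names below are assumptions):

        REM find the largest payload that passes without fragmentation (1472 = 1500 - 28 bytes of headers)
        ping -f -l 1472 www.one-of-the-affected-sites.example
        REM if only smaller sizes get through, try lowering the interface MTU
        netsh interface ipv4 set subinterface "Local Area Connection" mtu=1400 store=persistent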

    Read the article

  • ColdFusion 8 and 9 on IIS 6

    - by David Mesh
    I have a client who, against my better judgement, has insisted on doing the following: they have a single IIS 6 installation on Windows Server 2003 Enterprise, currently running ColdFusion 8. They want me to install ColdFusion 9 on the server without changing any of the existing sites, so that they can develop in CF9 and upgrade other sites in the future. Yes, I have begged them to use a dev server, or to run Apache on the same box. Can this even be done? Many thanks in advance! DM

    Read the article

  • How does geolocation based on IP address work?

    - by Martin
    Like all Internet users, I've visited web sites which appear to know in which country and city I'm located. I understand that these web sites typically look up my IP address in a database which maps IP addresses to country/city, and that this works fairly well. I've also seen companies selling this type of database. How is this database, which maps an IP address to a country/city, created in the first place? Is there a central database somewhere in which each ISP registers the link between IP address and country/city? Or do the companies selling geolocation databases contact different ISPs and purchase the mapping information from them? Or is there some organization 'above' the ISPs that keeps track of this?
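
    Part of the answer is visible in the regional Internet registries' own allocation records: a whois lookup on any public address reports, at minimum, the country the block was allocated to, and commercial providers typically refine that with additional sources. A hedged illustration (output heavily abbreviated):

        $ whois 200.160.2.3        # registro.br, chosen purely as an example
        ...                        # (output abbreviated)
        country:  BR               # allocation country recorded by the RIR (LACNIC here)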

    Read the article

  • Browser extension (or other software) to delay page load

    - by Doug Harris
    The alt text to today's comic at xkcd.com says:

        After years of trying, I broke this habit in a day by decoupling the action and the neurological reward. I set up a simple 30-second delay I had to wait through, in which I couldn't do anything else, before any new page or chat client would load (and only allowed one to run at once). The urge to check all those sites magically vanished--and my 'productive' computer use was unaffected.

    Does anybody know of a browser extension or other software that will add this sort of delay? I've seen extensions which simply block sites, but not a delay like this.

    Read the article

  • Naming standard for additional A records/IP addresses for IIS servers?

    - by serialhobbyist
    When you're adding another IP address to an IIS server, what naming standard do you use for the A records? Background: I've a bunch of sites on an IIS server which use (CNAME'd) host headers and a single IP address. Server names (and A records) adhere to unfriendly (as in difficult-to-remember) naming standards, whereas CNAMEs, and therefore host headers, can be friendly. Now I need several SSL certificates for different sites. I was thinking about using an additional IP address for each to-be-SSL'd site but still using friendly CNAMEs. So then I come to what to call the A record. What do you do? Related to this question.

    Read the article

  • Ping and crawling not working, site still resolving

    - by Andrew Alexander
    OK, so we're trying to figure out why the site of one of our clients isn't being crawled by Google (we've ruled out robots.txt and meta tags). When we go to the site, by either IP address or domain name, the site resolves and everything works. However, Google is getting a 302 redirect (which it apparently isn't following for crawling), and when we ping the address, it times out (note: the site still resolves in the browser throughout all of this). The site is built in ASP.NET (I assume C#), so my thought was that it's an errant redirect rule or some other server-side issue. We also thought it might be due to incorrect domain pointing (but if we try to ping the IP, it doesn't work, so that sort of rules that out). We're really not sure what is causing all of these errors, or even whether they have one single source. Anyone have any ideas what could be going on? Do you need any more information? To boil it down in a TL;DR:

        - Site resolving in browser, by both IP and domain name. No problems here.
        - Site not being crawled by Google (gets a 302 it doesn't seem to follow); it is not due to robots.txt or meta tags.
        - Ping is not working for the IP address. This is very odd, because again, the IP address seems to work fine in the browser.
        - Our thoughts: either a redirect rule issue, a domain pointing issue, possibly some errant code, or some combination of the three.
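
    A quick way to see exactly what a crawler is being handed versus a browser, purely as a diagnostic sketch (the URL is a placeholder for the client's site):

        # fetch headers only, once as a plain client and once identifying as Googlebot,
        # then compare the status codes and any Location: header
        curl -sI http://www.client-site.example/
        curl -sI -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://www.client-site.example/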

    Read the article

  • Hosting several domains on one server using IIS 7

    - by Øyvind Knobloch-Bråthen
    I have created several web sites inside IIS 7 on my server. All of them use the same IP and port, but different host names. Currently I have set the host name to www.mydomain.com. Now my question is, how do I get my actual domains to target the different sites on my server? Second question: can I set my host name to only mydomain.com to make sure that all requests to that domain are handled by the same application? Primarily, I want both www.mydomain.com and mydomain.com to work when the user types the address in their browser.
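
    A hedged sketch of one way to express this in IIS 7: give the site one binding per host name, so both the bare domain and the www form reach the same application (the site name below is a placeholder):

        REM run from %windir%\system32\inetsrv
        appcmd set site /site.name:"MySite" /+bindings.[protocol='http',bindingInformation='*:80:mydomain.com']
        appcmd set site /site.name:"MySite" /+bindings.[protocol='http',bindingInformation='*:80:www.mydomain.com']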

    Read the article

  • Free, simple, configurable SOCKS5 server

    - by Pooria Azimi
    I've been looking (for the past 6-7 hours) for a fast, free and configurable SOCKS5 server. I haven't found anything that matches my needs: they are either too complicated, too bare-bones, or simply buggy as hell. This is (all) I need:

        - It should run on Linux (and preferably also OS X).
        - It should listen on localhost:8888.
        - When my app (say wget, or curl --socks5=localhost:8888) requests http://www.google.com/search?q=asd (or any other URL, both http and https), it should fetch the page not from Google's servers, but from http://localhost:4444/cached?uri=http://www.google.com/search%3Fq%3Dasd.

    Nothing more! I don't need caching or anything else. I just want a SOCKS5 server, running locally, which redirects all queries to my own (local) server. It could be written in C, C++, Python, PHP, Perl, Node.js or any other language. I don't care, as long as it supports my (very limited) needs, or I can easily change the source to make it so. Thanks a lot

    Read the article

  • Kernel-mode Authentication: 401 errors when accessing site from remote machines

    - by CJM
    I have several Classic ASP sites that use Integrated Windows Authentication and Kerberos delegation. They work OK on the live servers (recently moved to Server 2008/IIS 7 servers), but do not work fully on my development PC or my development server. IIS on both machines was configured through an IIS Web Deployment Tool package which was exported from an old machine; the deployment didn't work perfectly, and I had to tinker a bit to get the sites working.

    When accessing the apps locally on either machine, they work fine; when accessing from another machine, the user is prompted by a username/password dialog, and regardless of what you enter, it ultimately results in a 401 (Unauthorised) error. I've tried comparing the configuration of these machines against similar live servers (which all work fine), and they seem generally comparable (given that none of the live servers are yet on IIS 7.5, i.e. Windows 7/Server 2008 R2). These applications run in a common application pool which uses a special domain user as its identity; this user has similar permissions on the live and development machines. On IIS 6 platforms, to enable Kerberos delegation, I needed to set up some SPNs for this user, and they are still in place (even though I don't believe they are needed any longer for IIS 7+ due to kernel-mode authentication). Furthermore, this account is enabled for Kerberos delegation in Active Directory, as is each machine I am dealing with.

    I'm considering the possibility that the deployment might have made changes, or failed to make changes, to the IIS configuration, thus causing this problem. Perhaps a complete rebuild (minus another web deployment attempt) would solve the problem, but I'd rather fix (and thus understand) the current problem. Any ideas so far?

    Update: I've just had another attempt at fixing this issue, and I've made some progress, but I don't have a complete fix yet. I've discovered that if I access the sites via IP address (rather than via NetBIOS name), I get the same dialog, except that it accepts my credentials and the application works; not quite a fix, but a useful step. More interestingly, I discovered that if I disable kernel-mode authentication (in IIS Manager, Website > Authentication > Advanced Settings), the applications work perfectly. My foggy understanding is that this is effectively working in the pre-IIS 7 way. A reasonable short-term solution, but consider the following explicit advice from IIS on this issue:

        By default, IIS enables kernel-mode authentication, which may improve authentication performance and prevent authentication problems with application pools configured to use a custom identity. As a best practice, do not disable this setting if Kerberos authentication is used in your environment and the application pool is configured to use a custom identity.

    Clearly, this is not the way my applications should be working. So what is the issue?
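
    For what it's worth, when kernel-mode authentication stays enabled but the application pool runs as a custom domain account (with the SPNs registered against that account), IIS 7.x has a setting that makes kernel-mode Kerberos decrypt tickets with the pool identity instead of the machine account. A hedged sketch of enabling it for one site (the site name is a placeholder):

        appcmd set config "Default Web Site" -section:system.webServer/security/authentication/windowsAuthentication /useAppPoolCredentials:true /commit:apphost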

    Read the article

  • Squid ban policy

    - by VOX
    I have a requirement to let users view a particular website for an hour and then put it into that user's ban list. My company has a team of website reviewers who review websites. In most cases, when they find a good website (online RPGs? social sites? web proxies) they enjoy it all day without ever going to other sites. So I want to let them view a new website for an hour, and then I want to ban that website. Is there any convenient way to do this?
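
    Squid has no built-in "allow for an hour, then ban" rule, so this usually ends up as a destination blacklist that some external job appends to once the hour is up. Purely a sketch, with made-up paths:

        # squid.conf: deny anything already reviewed
        acl reviewed_ban dstdomain "/etc/squid/reviewed-ban.txt"
        http_access deny reviewed_ban

        # an external process (cron job, log watcher, ticketing hook) appends the domain
        # to /etc/squid/reviewed-ban.txt an hour after review starts, then runs:
        #   squid -k reconfigure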

    Read the article
