Search Results

Search found 806 results on 33 pages for 'ouhsd webmaster'.

Page 25/33 | < Previous Page | 21 22 23 24 25 26 27 28 29 30 31 32  | Next Page >

  • What are some good courses to take my programming to the next level?

    - by absentx
    I am in search of some in-person or online training that could take my coding to the next level. I am looking to attack two specific areas. JavaScript: while I have been getting by with JavaScript for three or four years, I still feel like it takes a back seat to my other programming. I use jQuery a lot but would prefer to be proficient in pure JS as well. PHP: I feel fairly proficient at PHP, but I know there is room to improve. Here I am interested in something that can teach me the more advanced aspects of the language, improve my code writing, and perhaps cover object-oriented PHP in depth as well. I have looked into Netcom's training courses before, but I can't tell whether their "advanced webmaster professional" course would be a good fit. It seems like a force-fed course, but I am interested in it because I am looking for something in the one-to-two-week range that is targeted at exactly this. I have zero experience with online courses for programming; plenty appear to be available, but I am not sure of their quality.

    Read the article

  • Apache mod_proxy to another server

    - by trobrock
    I am using proxy_balancer in Apache 2 to proxy requests for a Rails application to the Rails server, on the port the application is running on. This is how it is set up: the Rails server runs Mongrel on port 8000, and when I access http://rails_server:8000 directly the site loads fine. The Apache server's conf file for the site is:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            ServerName myserver.com
            ServerAlias application.myserver.com
            <Proxy balancer://application_cluster>
                Allow from localhost
                BalancerMember http://ip.to.server:8000 retry=10
            </Proxy>
            ProxyPass / balancer://application_cluster
        </VirtualHost>

    The problem I am having is that going to http://rails_server:8000 works fine, but going to http://application.myserver.com loads the right content yet displays all the HTML as plain text instead of rendering it.
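    A minimal sketch of the same virtual host with response-header handling added. ProxyPreserveHost and ProxyPassReverse are assumptions to test rather than a confirmed fix; if the page reaches the browser with a text/plain Content-Type, the header is more likely being set (or lost) on the Mongrel side:

        <VirtualHost *:80>
            ServerName myserver.com
            ServerAlias application.myserver.com
            # Forward the original Host header to the Mongrel backend
            ProxyPreserveHost On
            <Proxy balancer://application_cluster>
                Allow from localhost
                BalancerMember http://ip.to.server:8000 retry=10
            </Proxy>
            ProxyPass / balancer://application_cluster
            # Rewrite Location and Content-Location headers coming back from the backend
            ProxyPassReverse / http://ip.to.server:8000/
        </VirtualHost>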

    Read the article

  • configuration issue with respect to .htaccess file on ubuntu

    - by Registered User
    I am building an application, tshirtshop. I have the following configuration in /etc/apache2/sites-enabled/tshirtshop:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www/tshirtshop
            <Directory /var/www/tshirtshop>
                Options Indexes FollowSymLinks
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/error.log
            # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
            LogLevel warn
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    and the following in the .htaccess file at /var/www/tshirtshop/.htaccess:

        <IfModule mod_rewrite.c>
            # Enable mod_rewrite
            RewriteEngine On
            # Specify the folder in which the application resides.
            # Use / if the application is in the root.
            RewriteBase /tshirtshop
            #RewriteBase /
            # Rewrite to correct domain to avoid canonicalization problems
            # RewriteCond %{HTTP_HOST} !^www\.example\.com
            # RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
            # Rewrite URLs ending in /index.php or /index.html to /
            RewriteCond %{THE_REQUEST} ^GET\ .*/index\.(php|html?)\ HTTP
            RewriteRule ^(.*)index\.(php|html?)$ $1 [R=301,L]
            # Rewrite category pages
            RewriteRule ^.*-d([0-9]+)/.*-c([0-9]+)/page-([0-9]+)/?$ index.php?DepartmentId=$1&CategoryId=$2&Page=$3 [L]
            RewriteRule ^.*-d([0-9]+)/.*-c([0-9]+)/?$ index.php?DepartmentId=$1&CategoryId=$2 [L]
            # Rewrite department pages
            RewriteRule ^.*-d([0-9]+)/page-([0-9]+)/?$ index.php?DepartmentId=$1&Page=$2 [L]
            RewriteRule ^.*-d([0-9]+)/?$ index.php?DepartmentId=$1 [L]
            # Rewrite subpages of the home page
            RewriteRule ^page-([0-9]+)/?$ index.php?Page=$1 [L]
            # Rewrite product details pages
            RewriteRule ^.*-p([0-9]+)/?$ index.php?ProductId=$1 [L]
        </IfModule>

    The site works on localhost, but it behaves as if no .htaccess rule were specified: if I view a page as http://localhost/tshirtshop/nature-d2 I get a 404 error, but if I view the same page as http://localhost/tshirtshop/index.php?DepartmentId=2 I can view it. sudo apache2ctl -M gives:

        Loaded Modules:
        core_module (static) log_config_module (static) logio_module (static) mpm_prefork_module (static)
        http_module (static) so_module (static) alias_module (shared) auth_basic_module (shared)
        authn_file_module (shared) authz_default_module (shared) authz_groupfile_module (shared)
        authz_host_module (shared) authz_user_module (shared) autoindex_module (shared) cgi_module (shared)
        deflate_module (shared) dir_module (shared) env_module (shared) mime_module (shared)
        negotiation_module (shared) php5_module (shared) reqtimeout_module (shared) rewrite_module (shared)
        setenvif_module (shared) status_module (shared)
        Syntax OK

    Can anyone point out the mistake, if any, in the above configuration, or tell me what else I should check?
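    One possibility, offered only as an assumption: since the URL being tested is http://localhost/tshirtshop/..., the request may be answered by the default site (whose DocumentRoot is /var/www) rather than by the tshirtshop virtual host above, and Ubuntu's default site ships with AllowOverride None for /var/www, which makes Apache silently ignore the .htaccess file. A sketch of the relevant block in /etc/apache2/sites-enabled/000-default:

        <Directory /var/www/>
            Options Indexes FollowSymLinks MultiViews
            # AllowOverride None    <- with this in effect, the .htaccess rules never run
            AllowOverride All
            Order allow,deny
            allow from all
        </Directory>

    A quick way to confirm which case applies is to put a deliberately invalid line in the .htaccess file: if the page does not turn into a 500 error, the file is not being read at all.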

    Read the article

  • page rank 0 penalty

    - by mark
    I have had a WordPress blog and a www website on the same domain for about one year. Together they come to about 170 pages. The PageRank is still 0. I understand that PageRank 0 can be a penalty for duplicate content. The pages are indexed in Google, but there is still no PageRank. In Google Webmaster Tools there is no indication of any problem. I asked for reconsideration of both the blog and the website a month ago; Google accepted the reconsideration request, but it did not change anything. Other sites of similar size and similar audience earn PR 4-6. Is there something I can do in order to get a fair PageRank? A coworker told me that it might be the case that a link farm is using the content and I can do nothing about it. Is there a reliable way to check for something like that? I do not like to give up so quickly. Is there a chance to fix this, for example by moving to another domain?

    Read the article

  • How to receive mail in Qmail?

    - by Ivan
    I have a server that uses Qmail. It is installed by default and it is supposed to work. I've created a new domain and a new user (vadddomain + vadduser) without problems, but when I send an email from Gmail to [email protected] (the address I've created) it simply disappears. However, if I connect to the SMTP server directly (telnet domain.com 25) and post an email, it arrives in the user's queue. What's happening? Note: if I try to access my user through telnet domain.com 110, it says my password is not correct, even though it's the same one I used when creating the user with vadduser.

    Read the article

  • php programming

    - by HARSHA
    Hi, I am learning PHP. I downloaded XAMPP, and Apache and MySQL are running properly in the XAMPP Control Panel. I tried a simple "hello world" program: I created a new folder in htdocs and saved my program in that new folder with a .php extension. But when I run the program it shows an error as follows:

        Object not found!
        The requested URL was not found on this server. If you entered the URL manually
        please check your spelling and try again. If you think this is a server error,
        please contact the webmaster.
        Error 404
        localhost
        18-5-2010 11:51:44
        Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1

    Read the article

  • Can't get directory listing on my Apache to work.

    - by joon
    Hi, I'm having trouble enabling directory listing on Apache. I did it a few weeks ago but had to reinstall because my Wubi ran out of space, and now I can't get it to work and it's driving me crazy. I have a folder /home/joon/Dropbox/Projects/apache which I want to set as the root for my Apache. Here are the first lines of the 'default' file in the sites-available folder:

        <VirtualHost *:8888>
            ServerAdmin webmaster@localhost
            DocumentRoot /home/joon/Dropbox/Projects/apache
            <Directory /home/joon/Dropbox/Projects/apache>
                Options +Indexes FollowSymLinks
                AllowOverride None

    The rest is unchanged. Ports.conf is set to Listen 8888. I thought +Indexes should do it, but I must have overlooked something. I get a 403 Forbidden: "You don't have permission to access / on this server." If I enter the URL of an image, http://127.0.0.1:8888/joon/bin/1chart.png, it displays, but there is no directory listing. Please help.
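    A minimal sketch of a Directory block that allows listings, assuming mod_autoindex is enabled and no other <Directory> section or .htaccess file overrides Options for this path. Note that Apache rejects a mix of prefixed and unprefixed values (as in "Options +Indexes FollowSymLinks"), so either every option should carry a sign or none should, and every directory component of /home/joon/Dropbox/Projects/apache needs the execute bit for the Apache user:

        <Directory /home/joon/Dropbox/Projects/apache>
            # Listings need mod_autoindex (a2enmod autoindex) plus Indexes in Options
            Options +Indexes +FollowSymLinks
            AllowOverride None
            # Apache 2.2-style access control
            Order allow,deny
            Allow from all
        </Directory>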

    Read the article

  • Which Language Next? Python? Ruby? [closed]

    - by Ryan Craig
    I am a beginning webmaster (relatively), with 2+ years of PHP experience. I also have some Java training and a bit of .NET. My company is now close to redeveloping the website that I work on, which is coded primarily in PHP but has some poorly written .NET in parts as well (it's confusing and ill-planned, but I didn't make any of those decisions; can anyone say action-oriented .NET and JScript?). So, I'm trying to decide which language I should learn next to quickly develop a new site. I will probably just redevelop it at first in PHP, because I'm very comfortable with it. However, I'd like to migrate in the next year to something newer and more forward-thinking. That being said, .NET is mostly out of the question: we need developers who are affordable, fast, and can get pages up quickly, and in this part of the country part-time .NET developers are hard to find. So we need something that will be fairly standard in the next few years. We do have some .NET SOAP 1.1 APIs that we use for our actual service (separate from the corporate website), which part of the site will need to integrate with, and developing against SOAP in PHP is much more difficult than doing the same thing in .NET. So I may have to build the API-integration part in .NET just to keep it easy, and then use something else that is fast, flexible, forward-thinking, and relatively standard and easy to find developers for. Any ideas? Python and Django? Ruby on Rails? Another framework? Thanks for your thoughts. Sorry, I know this was long, but it's all very convoluted and confusing, so I needed to be slightly long-winded.

    Read the article

  • Why do my websites have a first page rank on Bing and Yahoo but not Google? [closed]

    - by Linda Cullum
    I have 3 websites suffering from a drop in ranking with Google and hence a huge drop in traffic. The instant drop occurred in September and I have not been able to remedy it. For the past 6-10 years my main website http://LearnToSail.Net has ranked from #3 to #1 on the 1st page of Google and all the other engines for the search term "learn to sail". Now it shows on the 1st page of Bing and Yahoo but does not show up on ANY page of Google. The only way it does come up is if I add "cd" to the "learn to sail" phrase; we sell a sailing CD on that website. The other websites are http://LearnToSailOnLine.com (search terms "learn to sail online" or "learntosailonline") and historyofthepilgrims.com (search terms "history of the pilgrims" or "historyofthepilgrims"), and I get the same result: gone on Google, but on the 1st pages of Bing and Yahoo. I have researched, edited, updated blogs, made sitemaps, prayed to the universe and used Google Webmaster Tools, but nothing is changing and I have lost a lot of business. I host with 1and1.com and have been back and forth with them, but to no avail and with no change in traffic. I thought maybe some DNS mapping was off. I used to have a lot of traffic; now I have hardly any. Any advice would be greatly appreciated. I am still in the process of working on the issue, of course! This is a really great website here and I am glad I came across it. Thank you, LS Cullum, Little Pines Multimedia

    Read the article

  • Dedicated Servers: Is one better than two for a LAMP pseudo-HA setup? [closed]

    - by bikedorkseattle
    Possible Duplicate: How to find web hosting that meets my requirements? I know there are zillions of comments about hosting out there, but I haven't read much about this. Our current, well-known host is having too many problems, the hardware we are on is subpar, and I'm ready to leave. A day of downtime can cost as much as our monthly hosting bill, and a month of bad performance is just killing us right now, user- and Google-wise. I'm wondering about running two dedicated boxes for LAMP: one as the primary Nginx/Apache (proxy pass) server, and the other as the MySQL box. Running a single box scares the bejesus out of me, because who knows how long it will take anyone to fix a RAID card or whatever. The idea is to set this up as a failover system using Pacemaker and Heartbeat: if one server goes down, the other can take over, running both web and DB. There are some good articles over at Linode about this. I have a few DBs that are 1 GB+ and would like to load them into memory. Because of this, I'm shying away from a Linode HA setup, because for the price I could do it with two dedicated servers as I described. Am I mad or an idiot? What are people out there doing for pseudo-high-availability, good-performance setups under $400/month? I'm a webmaster; I do a lot of things, none of them that well :)

    Read the article

  • how to make a small image become really huge

    - by DennyHalim.com
    All webmasters should already know about hotlinking, and we know how to ban those bad referrers too... but I want to get revenge: I want to replace the hotlinked images with one huge image a few megabytes in size. I have found one good image, yet it is less than 100 KB. I already use it to replace all bad hotlinkers. How can I convert this image so that it becomes a few megabytes?

    Read the article

  • Redirection & SEO related stuff while moving to a new blog

    - by Karshim Kanwar
    I have a WordPress blog and recently I set up a new blog; let's call the old one "blog old" and the new one "blog new". What I did is move the content, photos, pictures and all 250 posts from blog old to blog new. The blog names are different, as they point to different domain names. I read helpful things on this site itself, here. I will no longer use blog old; moreover, I am concerned about the SEO of blog new. Blog new is fairly new (just 24 hours old, and no pages have been indexed in Google yet). I have done the following: deleted all the post shares on the Facebook fan page, Twitter profile and Google+ page, and finally deleted the fan page, Twitter and Google+ page; and edited the links back to the old blog in blog new. The questions I have are: How do I prevent duplicate content issues (a sketch follows below)? Do I go straight ahead and delete all the posts in blog old? Should I start sharing the blog posts of blog new? Should I submit the new site to Webmaster Tools or wait for a few weeks? Every comment here is appreciated! What issues can I face relating to SEO?
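    A minimal sketch of what could go in the .htaccess of blog old to send both visitors and search engines to the corresponding posts on blog new. The names blog-old.example.com and blog-new.example.com are placeholders, and this assumes the permalink structure is the same on both blogs:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?blog-old\.example\.com$ [NC]
        # Permanent redirect that keeps the requested path (and query string)
        RewriteRule ^(.*)$ http://blog-new.example.com/$1 [R=301,L]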

    Read the article

  • Google Analytics Social Tracking implementation. Is Google's example correct?

    - by s_a
    The current Google Analytics help page on social tracking (developers.google.com/analytics/devguides/collection/gajs/gaTrackingSocial?hl=es-419) links to this page with an example of the implementation: http://analytics-api-samples.googlecode.com/svn/trunk/src/tracking/javascript/v5/social/facebook_js_async.html I've followed the example carefully, yet social interactions are not registered. This is the webpage with the non-working setup: http://bit.ly/1dA00dY (obscured domain, as per Google's Webmaster Central recommendations for their product forums). This is the structure of the page. In the head: the ga async code copied from the Analytics page, a script tag linking to ga_social_tracking.js stored on the same domain, and the Twitter JS loading tag. In the body: the fb-root div, the Facebook async loading JS including the _ga.trackFacebook(); call, and the social buttons afterwards (the Like button with the proper URL and the Tweet button with the proper handle). That's it. As far as I can tell, I have implemented it exactly like the example, but Likes and tweets aren't registered. I have also altered ga_social_tracking.js to register the social interactions as events, adding the code below. It doesn't work either. What could be wrong? Thanks!

    Code added to ga_social_tracking.js:

        var url = document.URL;
        var category = 'Social Media';

        /* Facebook */
        FB.Event.subscribe('edge.create', function(href, widget) {
            _gaq.push(['_trackEvent', category, 'Facebook', url]);
        });

        /* Twitter */
        twttr.events.bind('tweet', function(event) {
            _gaq.push(['_trackEvent', category, 'Twitter', url]);
        });

    Read the article

  • Second virtual host on Apache redirects to root

    - by Slytherin
    I tried to set up my second virtual host, but I'm getting the default /var/www/index.html (the one that says "It works!"). I followed the same procedure as the first time, but this time it didn't work. My configuration looks like this:

        <VirtualHost *:80>
            ServerName messup
            ServerAlias messup.loc
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www/messup
            ErrorLog ${APACHE_LOG_DIR}/error.log
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    My hosts configuration is the following:

        127.0.0.1 localhost
        127.0.1.1 SlytherinPC
        127.0.0.1 AFS.loc
        127.0.0.1 messup.loc

    After this, my Apache wouldn't restart, giving no message other than [fail], but stop and start worked. What am I missing?
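    A minimal sketch of the checks this usually comes down to on a Debian/Ubuntu Apache 2.2 layout. The site file name messup is an assumption, and NameVirtualHost may already be set in ports.conf:

        # /etc/apache2/ports.conf: name-based virtual hosting must be switched on,
        # otherwise the first *:80 vhost loaded answers for every hostname
        NameVirtualHost *:80
        Listen 80

        # Enable the site and surface the real error behind a bare [fail]:
        #   a2ensite messup
        #   apache2ctl configtest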

    Read the article

  • How to get local business nationwide exposure? [closed]

    - by guisasso
    Here's the situation: this company offers local home services (construction and the like), but also fabricates many custom items that can be shipped nationally and even internationally. Since I started working on this website, it has ranked pretty well on Alexa, both globally and locally, and I have made many SEO improvements that doubled visits to the website in 6 months. The website is listed in many different directories (DMOZ, etc.), maps (Google Maps, etc.), business listing sites (Yelp, etc.), trade-specific websites (Angie's List, Houzz, etc.), state-specific business listings and so on; there are many links to pictures displayed on the website as well as links to the website itself. I have Google Analytics and Webmaster Tools accounts with sitemaps, newsletters, a Facebook page... the list goes on and on. All of which has been working pretty well locally. We have had some success doing business in other states and even other countries, but it is still a pretty small percentage of the market. I also advertise on Google AdWords locally, and since that would be the obvious answer, my question is: without paid advertisement, how can I improve the visibility of this local business website nationally to attract customers in all US states?

    Read the article

  • What could have caused a large traffic drop from Google in early May?

    - by Scott Schluer
    I have a website (www.equispot.com) that has been indexed for almost 2 years in Google. I managed to get myself on the first page (average position 6-8) on Google for my target keyword of "horses for sale" and held there pretty solidly for months. Suddenly, with no changes to the site, traffic from Google dropped like a rock in early May. I slowly fell in position until now I'm sitting at the bottom of page 4. I have never hired an SEO firm, have not used any "black hat" techniques that Google would have penalized me for in their May update, etc. I'm not familiar enough with SEO to know how to look at link profiles, etc. to tell if there's something wrong. I've run my site through a DNS checker and it came back with no errors. Google Webmaster Tools shows no messages or notices of any kind, just a drop in traffic. GWT also shows only 2 server errors and 1 404. Is there anyone who can tell me by quickly checking my domain if there's an obvious reason that my traffic would have fallen so far, something that I can fix?

    Read the article

  • mod_ReWrite to remove part of a URL

    - by Jack
    Someone has incorrectly linked to some of my URLs, causing 404 errors in Google Webmaster Tools. Here is an example. Linked URL: http://www.example.com/foo-%E2%80%8Bbar.html Correct URL: http://www.example.com/foo-bar.html I would like to 301 redirect any instance of this kind of incorrect linking to the correct URL. I have tried the following, but it generates 404 errors site-wide:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteRule ^foo-(.*)bar\.html$ http://www.example.com/foo-bar\.html? [L,R=301]

    Could anyone let me know what I am doing wrong?
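    A minimal sketch of a rule that targets only the bad links. The %E2%80%8B in the linked URL is a zero-width space, which mod_rewrite sees as decoded UTF-8 bytes, so matching those bytes explicitly should keep the rule from also matching (and endlessly redirecting) the correct URL; treat this as an assumption to test rather than a confirmed fix:

        Options +FollowSymLinks
        RewriteEngine On
        # \xE2\x80\x8B are the UTF-8 bytes of the zero-width space in the bad links
        RewriteRule ^foo-\xE2\x80\x8Bbar\.html$ /foo-bar.html [R=301,L]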

    Read the article

  • Multiple domains for different products?

    - by alexandertr
    I have a website with software applications. Is it good for SEO to choose one keyword-rich domain name for each of our software products, or should we stick to a single domain? From a user's perspective I think it would be easier to remember a keyword-rich domain, as the user will instantly know what the product is for. But I have read articles saying that the latest trend in SEO is to stick to one domain for all of your products and invest in this single-domain website. Is that true? What do you advise? Should I register a separate domain for each of our products, or should I use only one single domain? Should I do a 301 redirect with a .htaccess file to a single domain (a sketch follows below)? And what about the sitemaps? Should I register all sites in Google Webmaster Tools and post a separate sitemap for each one of them? Should my main site's sitemap include all pages, or should the separate domains have their own sitemaps?
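    If the choice is to consolidate, a minimal sketch of the redirect that would sit on one of the keyword domains and point it at the matching product page on the main site; product-name.example and www.mainsite.example are placeholders:

        # In the keyword domain's vhost or .htaccess: map the whole domain onto
        # the product's section of the main site, preserving the requested path
        Redirect 301 / http://www.mainsite.example/product-name/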

    Read the article

  • Conditional https redirect to http depending on URL? (Apache)

    - by Joel Marcey
    Right now I redirect 100% of the time if someone visits https://mysite.com:

        <VirtualHost *:443>
            ServerAdmin [email protected]
            ServerName mysite.com
            ServerAlias www.mysite.com
            RewriteEngine on
            RewriteRule (.*) http://%{HTTP_HOST} [L,R=permanent]
        </VirtualHost>

    However, now I want to redirect conditionally: if a user goes to https://mysite.com/abc/, then I want to stay on https; otherwise redirect. How do I do this? I tried reading the docs but just couldn't find what I needed. I am using Apache on Ubuntu Linux.
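    A minimal sketch of one way to do this with a RewriteCond guard; /abc/ is taken from the question, everything else is an assumption:

        <VirtualHost *:443>
            ServerName mysite.com
            ServerAlias www.mysite.com
            RewriteEngine on
            # Leave anything under /abc/ on https
            RewriteCond %{REQUEST_URI} !^/abc(/|$)
            # Send everything else back to http, keeping the requested path
            RewriteRule ^(.*)$ http://%{HTTP_HOST}$1 [L,R=permanent]
        </VirtualHost>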

    Read the article

  • Drupal migration failed

    - by Marco
    First of all, I'm new to Drupal and the work I have to do is somewhat over my head. My old colleague (the webmaster) had a server with a multisite Drupal 6 installation. The sites and their directories were (e.g.):

        Site                  Site directory
        b.a.mycompany.com     /drupal_install_dir/sites/b.a.mycompany.com
        c.a.mycompany.com     /drupal_install_dir/sites/c.a.mycompany.com
        d.a.mycompany.com     /drupal_install_dir/sites/d.a.mycompany.com

    Unluckily my colleague moved on and the server's HDDs are not in my hands: all I have is a backup of /drupal_install_dir and three SQL dumps (one for each site). I had to restore the three sites, but changing them to:

        z.mycompany.com/b
        z.mycompany.com/c
        z.mycompany.com/d

    Being a sysadmin, I extracted the tar.gz backup file under the wwwroot (let's call the full path to the extracted directory /new_install_dir), restored the three databases, and created MySQL users and gave them the correct GRANTs on the databases. Then (trying to restore at least the first site) I changed /new_install_dir/sites/settings.php, putting in the correct database connection data and the new base path. But there is no way I can see my new site; it simply doesn't work. Watching /var/log/apache2/error.log I saw Drupal searching for the main Drupal database, so I created that DB too, setting the user and grants, but the dump file is empty. Well, now I can run something like install.php or update.php, but my site is not shown. Is there something I can do? Do I have to take another approach? Consider that I searched the web but was not able to find a guide that covers my problem. Ah, I forgot: before producing the backup, my colleague set the site to maintenance mode. When I try to run z.mycompany.com/?q=user (trying to log in) nothing happens. I'm really stuck...

    Read the article

  • Per-user vhost logging

    - by kojiro
    I have a working per-user virtual host configuration with Apache, but I would like each user to have access to the logs for his virtual hosts. Obviously the ErrorLog and CustomLog directives don't accept the wildcard syntax that VirtualDocumentRoot does, but is there a way to achieve logs in each user's directory?

        <VirtualHost *:80>
            ServerName *.example.com
            ServerAdmin [email protected]
            VirtualDocumentRoot /home/%2/projects/%1
            <Directory /home/*/projects/>
                Options FollowSymlinks Indexes
                IndexOptions FancyIndexing FoldersFirst
                AllowOverride All
                Order Allow,Deny
                Allow From All
                Satisfy Any
            </Directory>
            Alias /favicon.ico /var/www/default/favicon.ico
            Alias /robots.txt /var/www/default/robots.txt
            LogLevel warn
            # ErrorLog /home/%2/logs/%1.error.log
            # CustomLog /home/%2/logs/%1.access.log combined
        </VirtualHost>
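    One approach, sketched on the assumption that the split-logfile helper shipped with Apache is available: log every virtual host into a single file with the hostname as the first field, then split it per host afterwards (a cron job could then copy or symlink each hostname's file into the corresponding user's directory):

        # With UseCanonicalName Off, %V logs the hostname from the request,
        # which is what distinguishes the per-user virtual hosts here
        LogFormat "%V %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" vhost_combined
        CustomLog ${APACHE_LOG_DIR}/vhost_access.log vhost_combined
        # Offline: split-logfile < vhost_access.log   produces one <hostname>.log per vhost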

    Read the article

  • Can't make my site available to the internet

    - by user1683645
    Hi, I'm using Ubuntu as the server OS for my web hosting, but I'm having problems redirecting my domain name to my server. Here are my /etc/hosts file and /etc/apache2/sites-available/mysite file. hosts file:

        127.0.0.1 www.lowkey.se
        # The following lines are desirable for IPv6 capable hosts
        ::1 ip6-localhost ip6-loopback
        fe00::0 ip6-localnet
        ff00::0 ip6-mcastprefix
        ff02::1 ip6-allnodes
        ff02::2 ip6-allrouters

    sites-available file:

        ServerAdmin webmaster@localhost
        ServerName www.lowkey.se
        DocumentRoot /var/www/doost/
        <Directory />
            Options FollowSymLinks
            AllowOverride None
        </Directory>
        <Directory /var/www/doost/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            allow from all
        </Directory>
        ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
        <Directory "/usr/lib/cgi-bin">
            AllowOverride None
            Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
            Order allow,deny
            Allow from all
        </Directory>
        ErrorLog ${APACHE_LOG_DIR}/error.log
        # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
        LogLevel warn
        CustomLog ${APACHE_LOG_DIR}/access.log combined

    And a screenshot from my domain name provider: http://imgur.com/VyqBR The site has been enabled in Ubuntu, I've restarted apache2, and the folder /var/www/doost/ is there. What am I doing wrong?
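    A minimal sketch of the pieces that have to line up for outside visitors, assuming a typical Ubuntu setup. The 127.0.0.1 entry in /etc/hosts only affects lookups made on the server itself; for the outside world, the domain's A record at the provider has to point at the server's public IP, and the virtual host has to answer for that name:

        <VirtualHost *:80>
            ServerName www.lowkey.se
            ServerAlias lowkey.se
            DocumentRoot /var/www/doost
            # At the domain provider (not in /etc/hosts):
            #   lowkey.se.      A   <public IP of this server>
            #   www.lowkey.se.  A   <public IP of this server>
        </VirtualHost>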

    Read the article

  • SEO Blog Indexing : Dot Wordpress Versus a Registered Domain?

    - by rumspringa00
    I've used WordPress for a few of my clients' sites, mostly small businesses and e-commerce sites. I have found through Google Analytics, as well as the All in One Webmaster plugin, that when it comes to social media, using WordPress is a surefire way of getting your site indexed by Google and occasionally Bing and Yahoo. Since I am a heavy WP user, I'd like to contribute by registering a dot-WordPress domain for my portfolio. When using a WP installation concurrently with a WP domain, e.g. myportfolio.wordpress.com, will the site be more or less likely to be indexed than with a generic myportfolio.com domain? I've seen mixed opinions: some people seem to favor a WP domain for URL output, while others say it's a moot point and that Google will not favor a WP domain over a dot-com domain as long as your meta tags are updated and your content is keyword-optimized. I tend to disagree, and believe a WP domain would be more likely to be indexed and to output more URLs than an individual, laconic domain like myportfolio.com. Am I wrong? Thanks in advance!

    Read the article

  • local wordpress installation not accessible from the outside world

    - by hello
    I have a working installation of WordPress located in /var/www/html/wordpress. It is accessible on my local network at [local-machine-ip]/wordpress/. There is also a test page located at /var/www/html/test.html, which is accessible on my local network at [local-machine-ip]. I would like the WordPress website to be accessible from the outside world. I know that my ISP blocks incoming requests on port 80, so I set my router to redirect requests from port 8080 to 80. This appears to be working correctly, since I can access the test.html page using my public IP address as [public-ip]:8080. However, I cannot access [public-ip]:8080/wordpress. Here is my Apache config:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www/html
            ServerName [my.domain.com]
            <Directory /var/www/html/>
                Options FollowSymLinks Indexes MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/error.log
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    Thanks!

    Read the article

  • Website is live but ping times out

    - by infinity
    I have a client's website that is running on GoDaddy hosting and started behaving very strangely recently. The site is up and running, but when I try to ping it I get a timeout. The problem is that PayPal doesn't work, and Google Webmaster Tools also reported the site as down. The client spoke with support and they said there is no firewall or any other traffic filter on their side. The site itself is PHP. Any ideas are welcome. I've tried to send a test IPN from PayPal to the payment URL and got: "IPN delivery failed. Unable to connect to the specified URL. Please verify the URL and try again." This makes me think that the website/server is inaccessible in some specific cases. The site URL is http://www.flavourly.com P.S.: I tried to ping it from different machines, ISPs and OSes.

    Read the article
