Search Results


  • What is the role of web hosting in SEO [closed]

    - by Vinay
    Possible Duplicate: Does changing web hosting server affect SEO page ranking? SEO Geolocation. What are the best ways to increase a site's position in Google? How to find web hosting that meets my requirements? I have read somewhere that hosting providers play a role in a website's SEO. My website is hosted on Yahoo Small Business, which provides analytics and some other tools to check keyword activity, but I think the same can be achieved with Google Analytics. Server performance and uptime are clearly important factors. I also have a few doubts: 1) Does shared hosting affect SEO, and what is the role of the domain extension (.com, .in, .org, etc.)? 2) Does server geolocation affect SEO? 3) Does the server OS affect SEO? Apart from the above, are there any other hosting factors that affect SEO? One last question: if hosting really matters a lot, can you suggest a web hosting service for a small-business e-commerce site running PHP?

    Read the article

  • Free Domain hosting configurations and transfer

    - by upog
    I have registered a new domain name with GoDaddy.com and now I would like to host my domain for free. Assume the app is a basic HTML page. I have done some searching and decided to host it on Google App Engine. I am looking for answers to a few questions: Currently my domain name is managed by GoDaddy; how can I transfer it to Google App Engine so that going forward it will be managed by Google? How can I configure the new domain in Google App Engine and associate it with my domain name? Is there any indirect cost involved in hosting the domain on Google App Engine? Any suggestion for free and reliable hosting is also welcome. Update: Can I host a free web page on cloud.google.com?
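
    One point worth separating out: in the usual setup you do not transfer the registration away from GoDaddy at all; the domain stays registered there, and you only point its DNS at Google from GoDaddy's DNS manager, using the record names and values that the App Engine domain-setup wizard shows you. A hypothetical zone sketch, where every value is a placeholder rather than the exact records the wizard will give:

        ; added in GoDaddy's DNS manager; registration stays with GoDaddy
        www   IN  CNAME  ghs.googlehosted.com.                     ; serves the App Engine app on www.yourdomain.com
        @     IN  A      <address supplied by the setup wizard>    ; naked-domain record, if offered

    Apart from the yearly registration fee already paid to GoDaddy, a basic HTML page normally fits inside App Engine's free quota.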

    Read the article

  • What to look for in a free hosting plan? [duplicate]

    - by Jon
    This question already has an answer here: How to find web hosting that meets my requirements? 5 answers. I have a test website hosted on a free plan by Zymic. At the moment I'm typing this, my site is down, and it has been down for over two days. I don't want to let my clients down like this in the future. I thought it was a coding problem at first, and then found out I couldn't connect to my server. Zymic had very good reviews and its downtime record was OK (neither especially good nor bad), but now I want to change my web host. What should I look for (besides an uptime guarantee)? Also, do you have any suggestions for hosts that offer all of these benefits? Any feedback will be greatly appreciated.

    Read the article

  • What a web developer can learn [closed]

    - by knoxxs
    There are many things to learn in web development. You can easily find lists of the most important things you need to learn if you want to be a webmaster; answers to questions about how to become a web developer or a webmaster contain only a limited set of items that someone needs to master (some examples: a, b). But the problem is that these resources are not complete. When I started learning web development I followed the same steps, but after learning the basics I realized I had learnt almost nothing; there were many more things to learn. I realized this by following blogs and Q&A sites. When I first downloaded HTML5 Boilerplate, some of the issues it covers I hadn't even heard of. I would like you to suggest what things and issues someone can learn, and why they are worth learning. I know the answer is "follow blogs, do your work and you will learn with time", but through these platforms I can benefit from other people's experience. This question is not "how do I become a webmaster", but an answer to it may well cover that too.

    Read the article

  • How to interpret the number of URL errors in Google Webmaster Tools

    - by user359650
    Google recently made some changes to Webmaster Tools, which are explained here: http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html One thing I could not find out is how to interpret the number of errors over time. At the end of February we migrated our website and didn't implement redirect rules for some pages (quite a few, actually). Here is what we're getting in the Crawl Errors report. What I don't know is whether the number of errors is cumulative over time or not (i.e. if Google's bots crawl your website on two different days and find one separate issue on each day, will they report one error for each day, or one for the first day and two for the second?). Based on the Crawl Stats we can see that the number of requests made by Google's bots is not increasing. Therefore I believe the number of errors reported is cumulative, and that an error detected on one day is taken into account and reported on subsequent days until the underlying problem is fixed and the page is crawled again (or until you manually mark the error as fixed), because if Google doesn't make more requests to a website, there is no way it can check both new pages and old pages at the same time. Q: Am I interpreting the number of errors correctly?

    Read the article

  • Webpage redirection time

    - by Abhijeet Ashok Muneshwar
    I want to calculate the time consumed in redirecting from one webpage to another. For example: 1) I am using Facebook in the Google Chrome browser and have shared a link on my Facebook profile, e.g. http://www.webdeveloper.com/ (it's not only Facebook; it can be any domain with a link to another domain). 2) When I click on this link from my Facebook profile, the website opens in a new tab. 3) I want to calculate the time difference, in milliseconds or microseconds, between these two events: First event: the time of clicking the link "http://www.webdeveloper.com/" from my Facebook profile. Second event: the time at which the webpage at "http://www.webdeveloper.com/" has completely loaded. Thank you in advance.
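
    For pages you control (you cannot instrument webdeveloper.com itself), the browser's Navigation Timing API gives a close approximation: navigationStart is recorded roughly when the navigation is triggered on the referring page, and loadEventEnd when the destination page has finished loading. A minimal sketch, assuming it runs on the destination page:

        // Run on the destination page; logs the elapsed time once loading has finished.
        window.addEventListener('load', function () {
          // Wait one tick so loadEventEnd has been filled in.
          setTimeout(function () {
            var t = window.performance && window.performance.timing;
            if (!t) { return; } // browser without Navigation Timing support
            // Milliseconds from the start of the navigation (roughly the click on the
            // referring page) until the load event completed.
            var elapsed = t.loadEventEnd - t.navigationStart;
            console.log('Click-to-loaded: ' + elapsed + ' ms (referrer: ' + document.referrer + ')');
          }, 0);
        });

    This legacy interface reports whole milliseconds, so microsecond precision is not available through it.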

    Read the article

  • Sitemap structure for network of subdomains

    - by HaCos
    I am working on a project that's a network of two domains (domain.gr & domain.com.cy) of subdomains, similar to HubPages (each user gets a profile under a different subdomain, and there is a personal blog for that user as well), and I think there is something wrong with our sitemap submission: sometimes it takes weeks for new profiles to get indexed. We use one Webmaster Tools account to manage the whole network, and we don't want to create separate accounts for each subdomain, since there are already more than 1000 of them. Following this post http://goo.gl/KjMCjN, I ended up with a structure of 5 sitemaps:
    1st sitemap: an index sitemap pointing to the others.
    2nd sitemap: all user profiles under domain.gr.
    3rd sitemap: all user profiles under domain.com.cy.
    4th sitemap: all posts under *.domain.gr (a news sitemap, http://goo.gl/8A8S9f).
    5th sitemap: all posts under *.domain.com.cy (again a news sitemap).
    Now my questions: Should we create news sitemaps, or just list all posts in the 2nd & 3rd sitemaps? Does link ordering matter? E.g. should the most recently created user come first in the sitemap, or does it make no difference as long as the lastmod date is correct? Does anyone know how HubPages submit their sitemap in Webmaster Tools, so that maybe we could follow their approach? Is there any alternative or better way to index this kind of schema? PS: Our network is multilingual (Greek & English are available), and we use hreflang tags in the head of each page to separate the country targeting of each version.
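
    For reference, the "index sitemap pointing to the others" is a standard sitemap index file; a sketch along these lines, with the file names and dates as placeholders:

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://domain.gr/sitemap-profiles-gr.xml</loc>
            <lastmod>2014-01-01</lastmod>
          </sitemap>
          <sitemap>
            <loc>http://domain.com.cy/sitemap-profiles-cy.xml</loc>
            <lastmod>2014-01-01</lastmod>
          </sitemap>
          <!-- plus the two post/news sitemaps -->
        </sitemapindex>

    Note that the sitemap protocol expects listed sitemaps to live on (or be verified for) the host the index is submitted for, which is part of why this cross-domain setup needs care.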

    Read the article

  • How to increase backlinks for a blog or website [duplicate]

    - by Adarsh Sojitra
    This question already has an answer here: How do I build backlinks? 5 answers. I know that this question is very easy and also silly, but I don't know how to build backlinks to my blog. I have tried commenting on various blogs and websites, but Alexa shows only one backlink, which is from my own blog. Does anyone know how to build quality backlinks for a blog or website? I also want to know whether increasing backlinks improves my blog's SEO. Thanks in advance.

    Read the article

  • Ranking hit after WP site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed WMT completely, including 301 redirects from every existing URL to the new domain, and then submitted a change of address. Traffic continued as normal, but a few days after submitting the change of address it plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and for the keywords I had targeted before and performed well with, I am now ranking much, much lower. In some cases, for low-competition keywords, I've only lost a few places; for higher-competition terms I have really suffered. This has started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression that it would not affect my rankings too severely, but it has now been a month since the move and recovery seems to be very slow, if it is happening at all. Is it likely that I've missed something? The only change is that I have moved what was the home page to be more of a sub-page, and in its place is now a magazine-style home page. I understand that links to the old site will now be pointing to the latter, which means that rankings for some keywords attributed to the old home page will take a hit, but even on other pages that fit exactly the same structure as on the previous site I have seen a drop in rankings. Any help would be greatly appreciated. Thanks!
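
    For anyone comparing their own setup against this, the detail about the old home page is the usual suspect: the per-URL 301s on the old domain should send the old home page to the sub-page that now holds its content, not to the new magazine-style front page. A hypothetical .htaccess sketch for the old domain (domain names and paths are placeholders):

        RewriteEngine On
        # The old home page goes to the page that now carries its content
        RewriteRule ^$ http://www.new-domain.example/old-home-content/ [R=301,L]
        # Everything else keeps its path on the new domain
        RewriteRule ^(.*)$ http://www.new-domain.example/$1 [R=301,L]

    If the old home page currently 301s to the new front page instead, the links pointing at it are being attributed to a page with different content, which matches the pattern described in the question.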

    Read the article

  • Dropped impressions 25 days after restructure

    - by Hamid
    Our website is a non-English property site (moshaver.com), similar to rightmove.co.uk. In September 2012 it was adversely affected by Panda, causing our incoming clicks from Google to drop from around 3000 to less than a thousand. We hoped that Google would eventually realize that we are not a spam website and that things would get better, but by August 2013 we were almost sure we needed to do something, so we started restructuring our content. We used the canonical tag to remove our search-results pages and point them to our listing pages, used the noindex tag on listing pages that have no properties at the moment, and changed title tags to friendlier ones, among other changes. Our changes went live on 10th August. As shown in the graph taken from the Search Engine Optimization section of Google Analytics, these changes resulted in an increase in the number of times Google displayed our results: our impressions almost doubled starting 15th August. However, as the graph also shows, our CTR dropped from that date from around 15% to 8%. This might have been because of the changed title tags (people were less likely to click on them), or it might simply be normal for increased impressions. This situation continued until 10th September, when our impressions decreased dramatically to less than a thousand, which is about 30% of our original impressions (before the restructure) and 15% of the new impressions. At the same time our CTR increased dramatically to around 50%. I have two theories for this increase. The first is that these statistics are less accurate at lower impression counts. The second is that Google is now only displaying our results for queries directly related to our website (our name, our URL) and not for generic terms such as "apartments in a specific city"; this second theory would also explain the dramatic decrease in impressions. After digging into the analytics data a little more, I constructed a table breaking down our impressions, clicks and CTR between Google's web and image search and in total. What I understand from it is that most of our increased impressions after the restructure were in image search, and I don't think image-search users would be looking for the content on our website. Furthermore, it shows that the drop in our web-search CTR is not as dramatic as the drop in overall CTR (-30% compared to -60%). I thought posting this here might help you understand the situation better. Is it possible that Google tested our new structure for 25 days and then decided to decrease our impressions because of the new, lower CTR? Or should we look for another factor? If this is the case, how long does it usually take for Google to give us another chance? It has been one month since our impressions dropped.

    Read the article

  • DNS records on a website: what are they for?

    - by Blake Nic
    Recently we had to get some DDoS protection for our website because of the large attacks we were seeing after gaining a bit of popularity. We handed over our domain and hosting information to our DDoS protection provider. It worked perfectly, but I have a question. In our DNS records we have the Host, the Answer and the Type. The Host holds our domain name. The Answer is this: SOMETEXTXXXX.dv.googlehosted.com. When I copy and paste that into my browser it gives me a 404 error, yet our website still loads and functions as it should. I don't understand why it would need this. I asked them about it and they said it is a method of DDoS protection and that the other IPs are the reverse proxy (the other IPs give a 404 error too). Can anyone expand on this, please? How does all of this tie together, and how does the browser know where to send the visitor with all these reverse proxies in between? Thank you. Here is an image for reference: http://i.stack.imgur.com/qo5QO.png

    Read the article

  • Facebook page design is not working in IE8 [closed]

    - by PrateekSaluja
    Hello experts, we have designed a Facebook page. It works fine in all browsers, including IE7, but it does not work in IE8. We checked, and found that if we run our code outside the Facebook page it works in IE8, but when we put our code into the Facebook page it does not. Here is the CSS code we are using for IE8 (note, though, that the conditional comment [if lt IE 8] matches only IE7 and older, so IE8 never receives this block):
    <!--[if lt IE 8]>
    <style>
    .nv_a { width:90px; height:27px; float:left; text-align:center; padding-top:8px; }
    .nvt_a { width:66px; height:27px; float:left; text-align:center; padding-top:8px; }
    .nv_a a { width:90px; height:27px; float:left; padding-top:8px; text-align:center; color:#000; display:inline-block; text-decoration:none; background-color:#e0e0e0; border-top:solid 1px #999; border-left:solid 1px #999; border-right:solid 1px #999; border-bottom:solid 1px #999; }
    .nv_a a:hover { width:90px; height:27px; padding-top:8px; float:left; color:#000; text-align:center; background-color:#ccc; }
    .nvt_a a { width:66px; height:27px; float:left; padding-top:8px; text-align:center; color:#000; display:inline-block; text-decoration:none; background-color:#e0e0e0; border-top:solid 1px #999; border-left:solid 1px #999; border-right:solid 1px #999; border-bottom:solid 1px #999; border:1px solid red; }
    </style>
    <![endif]-->
    Please help us to solve the issue.

    Read the article

  • Apache + Lighttpd serving from same Domain name

    - by Alex Pineda
    We wish to host some pages on a new server with Apache 2, and embed some of our old content & functionality from another server running lighttpd in an iframe. I'm looking at this configuration from the Apache docs (http://httpd.apache.org/docs/2.2/vhosts/examples.html#page-header), under using VirtualHost and mod_proxy together:
        <VirtualHost *:*>
            ProxyPreserveHost On
            ProxyPass / http://192.168.111.2/
            ProxyPassReverse / http://192.168.111.2/
            ServerName hostname.example.com
        </VirtualHost>
    The only issue is that I want to proxy only on a subdomain, or even better, keep the top domain and proxy only if the URL contains a particular path, e.g. "/myprocess.php". So in essence the DNS will point to Apache 2 as the "master router".
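
    For the path-based variant, a sketch of what that could look like (the IP and path are taken from the question; the ServerName and everything else are assumptions): Apache keeps serving the top domain and hands off only the matching URL to the lighttpd box.

        <VirtualHost *:80>
            ServerName hostname.example.com
            ProxyPreserveHost On
            # Only this path is proxied to the lighttpd server; Apache serves everything else itself
            ProxyPass        /myprocess.php http://192.168.111.2/myprocess.php
            ProxyPassReverse /myprocess.php http://192.168.111.2/myprocess.php
        </VirtualHost>

    The subdomain-only variant is the same idea: a separate <VirtualHost> whose ServerName is the subdomain, containing the ProxyPass / and ProxyPassReverse / lines from the example in the question.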

    Read the article

  • Will domain change affect my pagerank?

    - by Chankey Pathak
    I have two Blogger blogs (http://chankeypathak.blogspot.com and http://javaenthusiastic.blogspot.com). One blog has PR 3 and the other has PR 2. I want to buy a domain for each blog so that they become http://chankeypathak.com/ and http://javaenthusiastic.com/. I will follow all the procedures that Blogger suggests, so that all visitors to http://chankeypathak.blogspot.com are redirected to http://chankeypathak.com/, and the same for the Java blog. I just want to know whether this will affect my PageRank or not; I want my PR to remain the same and not change because of the domain change. Let me know, thank you. PS: I don't know whether one is allowed to post one's own site URLs in questions or not. If it is not allowed, you may edit the question.

    Read the article

  • Why does Google Search Engine reject my title tag's change?

    - by Michal P.
    I made a simple webpage, http://pundaquitboat.michaelspages.com/, giving it the title tag "Boat – Pundaquit", and I submitted it to Googlebot through Google Webmaster Tools. Then I decided to change the title of the same page to "Anawangin trip" and submitted my webpage again in the same way. The result was that the new title of my webpage coexisted with the old title in the SERPs for maybe two days. After that the new title was rejected, and if I enter site:pundaquitboat.michaelspages.com/ I can see that Google has kept the old copy of my webpage, with the old title, in its index. This problem doesn't occur in Bing, where I enjoy a high position for the phrase "Anawangin trip". (In Bing I hadn't submitted the old version of the title.)

    Read the article

  • Are there any risks if your DNS SOA or admin contact uses the same domain as the DNS?

    - by Yoga
    For example, take Google.com [1]. The SOA email is dns-admin.google.com, and the administrative contact is: DNS Admin, Google Inc., dns-admin.google.com. As you can see, both use google.com itself. I am wondering whether it is safe to use the same domain; consider the case where you lose control of the domain: would you still be able to receive email at that contact address? (Of course Google is a public company, so the chance is low, but it could happen to a smaller company whose domain gets stolen.) So, do you recommend using the same domain for the contact, or an address on another service, such as a free Gmail account? [1] http://whois.domaintools.com/google.com

    Read the article

  • fully encrypt website using SSL

    - by eddywebs
    I have been trying to use SSL for the following site: http://bit.ly/e8Lj32. Although the SSL certificate is signed properly by Network Solutions, each time the pages are loaded the browser still displays an SSL warning saying "Some parts of the site are not using SSL". In IE it's even worse: if you choose not to view the unsecured parts of the page, the site does not display properly (as this blocks some of the widgets). Screenshot uploaded at http://i.imgur.com/fm5GO.png
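
    That warning normally means the page itself is served over HTTPS but some sub-resources (images, scripts, widget iframes) are still requested over plain http://, so the fix is in the page markup rather than the certificate. A hypothetical before and after for one such resource:

        <!-- triggers the "some parts of the site are not using SSL" warning on an HTTPS page -->
        <script src="http://widgets.example.com/widget.js"></script>

        <!-- loads cleanly, because the sub-resource is also fetched over SSL -->
        <script src="https://widgets.example.com/widget.js"></script>

    The same applies to images, stylesheets and iframes pulled in by third-party widgets.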

    Read the article

  • Is the Mailchimp API available in other languages?

    - by boundaryfunctions
    I'm using the Mailchimp API in combination with PHP and jQuery to provide the subscribe/unsubscribe actions on a website via Ajax. On errors in the user data you get useful messages like "Invalid Email Address", "[email protected] is already subscribed to list x. Click here to update your profile." or "There is no record of "[email protected]" in the database". I definitely want to keep these messages, but is there a way to get them in other languages (in particular German), and how would I achieve this? I wasn't able to find anything about it in the Mailchimp docs, and I would rather not translate them myself.

    Read the article

  • Huge difference between Facebook Ad Click figures and Apache log requests

    - by Gearóid
    We're running a Facebook ad campaign for our business, but there seems to be a huge discrepancy between the number of clicks registered and the number of requests made with "facebook.com" in the HTTP referrer. The difference can be anything between 40-80 clicks/requests. I understand why Google Analytics would be off, and I understand that the figures shouldn't match exactly, but surely if 100 people click the ad I should be seeing at least 90 requests for the homepage with facebook.com as the referrer? Can anybody provide any insight into why this may be happening?

    Read the article

  • SSL Certificate Works in Monit - But Not in Keystore

    - by Bart Silverstrim
    I have a situation where there's a keystore file with the various root/intermediate certificates stored in it in a way that seems to work for most browsers. The problem is that when mobile browsers hit the site, there's a break in the chain and they complain. I used the SSL checker at http://www.sslshopper.com/ssl-checker.html and it states: "The certificate is not trusted in all web browsers. You may need to install an Intermediate/chain certificate to link it to a trusted root certificate." So the desktop browsers must already have the intermediate certs and can build the chain, I'm assuming, while the mobile browsers can't. The thing is that I had used Portecle to export certificates from the keystore and cobble them together into a .PEM certificate to run the Monit utility, and when I check that application with the SSL checker, it works fine! The person who originally created the keystore said he couldn't follow the SSL provider's directions for creating it because he had created the CSR with openssl, so the cert and private key had to be converted to DER format and loaded with ImportKey; following the directions he found online, ImportKey would only write to a fixed keystore file, and it would erase anything already in that file if it existed. So is there a way to take the certificate I created for Monit and create a working keystore for the Tomcat website? What would be causing the chain to be broken in the current keystore but work for Monit? I have the SSL provider's intermediate and cross certificates and the website's certificate, but what else would I need to create a working chain of certs for a keystore?
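
    Since the .PEM chain built for Monit validates, one common route (a sketch, with file names as placeholders) is to bundle that same private key, site certificate and chain into a PKCS#12 file with openssl and then convert it into a Java keystore for Tomcat:

        # Bundle the private key, the site certificate and the full intermediate chain into PKCS#12
        openssl pkcs12 -export -in site.crt -inkey site.key \
            -certfile chain.pem -name tomcat -out site.p12

        # Convert the PKCS#12 bundle into a JKS keystore that Tomcat can read
        keytool -importkeystore -srckeystore site.p12 -srcstoretype PKCS12 \
            -destkeystore keystore.jks -deststoretype JKS

    The Tomcat connector's keystoreFile and keystorePass attributes would then point at the new keystore. Whether the mobile-browser complaint goes away depends on chain.pem really containing the intermediates the provider supplied.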

    Read the article

  • Screencast several application windows at once in Microsoft Windows

    - by Birt
    I have several (20+) applications running on a Microsoft Windows PC. What I would like is a solution that allows me to broadcast the window of each application in a webpage, in read-only mode (there's no need for the users to interact with them). This should work even if an application is in the background, seeing that there's no way to fit all of them on the screen. I have done very extensive searching, from simple screencasting apps such as Camtasia, CamStudio or VHScrCap, to things like VNC (I haven't found any server able to broadcast multiple windows at once, much less background windows), and even application virtualization, but in the end I haven't found anything that fits my needs. Most solutions that allow capturing a window instead of the whole desktop will only capture a single window rather than several, and on top of that they don't even work when the window is in the background.

    Read the article

  • Informing Googlebot about deprecated pages

    - by trante
    I publish timetables on my website. For example, last year I published a "Number 2 bus Summer 2013 timetable" page, which had a pretty good ranking in Google's SERPs for the query number 2 bus timetable. This year I added a new page named "Number 2 bus Summer 2014 timetable". When users search for number 2 bus timetable in Google, they find the 2013 timetable on the first page of the SERPs, but I want them to find the 2014 timetable. They can reach the 2014 page with the keywords number 2 bus timetable 2014, but most users don't include the year. So what's the proper way to tell Googlebot that the 2013 page is deprecated and that the newer version is the 2014 page? I created a link from the 2013 page to the 2014 page and added a deprecation alert for visitors, but I still see the 2013 timetable on the first page of Google's SERPs. Of course it would be possible to 301 redirect the 2013 page to the 2014 page, but I want users to be able to reach old pages to compare the differences between years. (As you would guess, I have many pages like this.) Edit: Why don't I put the timetables on the same page and show different years' timetables with sorting? Because my old pages have good PageRank scores and SERP positions, and removing them would throw that away.

    Read the article

  • 410 Responses when your CMS host doesn't support them?

    - by leeand00
    Sending a 410 response for a page that no longer exists should make Google stop crawling that page. The site I am working on was recently migrated, and very little of the content was migrated with it. I've already turned the existing content into 301 redirects (for the content that exists on both the old and the new site), but now I would like to flush the old content from Google's index by serving 410 responses when it returns to crawl those URLs and currently finds 404 responses. However, I asked our CMS host about it, and they said that our CMS does not support 410 responses. Is there some other way to produce a 410 response, like making a dead link 301 redirect to a page that serves a 410, or something like a 410 in the form of a meta tag?
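
    For context, a 410 is an HTTP status code and has to be sent by the server; there is no meta-tag equivalent (the closest meta-level tool is <meta name="robots" content="noindex">, which requests removal from the index without changing the status code). Where server-level configuration is available, Apache can emit 410s directly; a sketch with placeholder paths:

        # mod_alias: answer 410 Gone for retired URLs
        Redirect gone /old-article
        Redirect gone /old-section/

    Whether anything like this is possible here depends on how much of the web server the CMS host exposes.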

    Read the article

  • How do search engines handle hyphenated words?

    - by NinjaKC
    I am not sure my title fully explains what I mean, but I thought this might be an interesting question. If I have a set of keywords broken up with a dash or two, will search engines consider the dash-split keyword as the full keyword? Say I have a site that breaks words down, the way the dictionary sites do, so a keyword for a page might end up in the page, and/or in the URL, broken by dashes: Key-word = keyword, Co-op-er-at-ive = cooperative, Pho-to-gra-phy = photography, www.example.com/key-word/, www.example.com/co-op-er-at-ive/, www.example.com/pho-to-gra-phy/. I know search engines (at least Google) treat a dash as a space and read the result as multiple words. But in English a dash can also break a single word into parts (at least I think it can, can't it?), so will search engines take this into consideration too? I did a little research: I Googled some words with random dashes inserted and it returned the words I had searched for, but that could just be handled as a typo on Google's end. So really I am wondering whether I can deliberately put a dash in a keyword and still have the search engine spiders index that keyword as the real word without dashes. I've done a little Googling and looking here on Stack Overflow, but everything comes down to dashes separating multiple words, not the specific thing I'm trying to figure out. Hopefully that makes sense. I am not an expert in SEO yet, but I get the basics and have been experimenting, and this is really just a question to satisfy my curiosity.

    Read the article

  • How to create a good sitemap for a dynamic website

    - by Saif Bechan
    I have a website with dynamic content and different kinds of pages. Some pages rarely change, and some, like blog pages, change often. The blog pages also have links for sorting, for example sorting by date, ascending or descending. On some pages I also have links to different tabbed content, and links that are just anchor links. Now, when I use an XML sitemap generator, all of these links are thrown into the sitemap, and I don't think all of them are really relevant. The blog posts written up to now are also included in the sitemap. Is this really necessary? I think the links to the blog posts can be indexed just fine on their own. Is the best way to make a sitemap just to manually assign the main menu links to it, or is indexing everything really recommended?
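
    If the generator keeps pulling in sort, tab and anchor links, one option is to build the sitemap from your own data instead, emitting only the canonical URLs. A rough PHP sketch, where get_blog_posts() is a hypothetical helper standing in for however the site's data layer exposes posts:

        <?php
        // sitemap.php: emit only canonical URLs, skipping sort/tab/anchor variants
        header('Content-Type: application/xml; charset=utf-8');

        $urls = array(
            array('loc' => 'http://www.example.com/',      'changefreq' => 'monthly'),
            array('loc' => 'http://www.example.com/blog/', 'changefreq' => 'daily'),
        );

        // get_blog_posts() is a placeholder for the real data-access code
        foreach (get_blog_posts() as $post) {
            $urls[] = array('loc' => $post['url'], 'lastmod' => $post['updated_at']);
        }

        echo '<?xml version="1.0" encoding="UTF-8"?>';
        echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
        foreach ($urls as $u) {
            echo '<url><loc>' . htmlspecialchars($u['loc']) . '</loc>';
            if (isset($u['lastmod']))    { echo '<lastmod>' . $u['lastmod'] . '</lastmod>'; }
            if (isset($u['changefreq'])) { echo '<changefreq>' . $u['changefreq'] . '</changefreq>'; }
            echo '</url>';
        }
        echo '</urlset>';

    Listing the individual blog posts is optional; if they are well linked internally they will usually be crawled anyway, as the question suggests.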

    Read the article
