Search Results

Search found 8370 results on 335 pages for 'seo friendly urls'.

Page 146/335

  • Google and Mirror Websites

    - by Roberto Aloi
    Which is the best way to manage a website with one or more mirrors so that: Google doesn't consider it "duplicated content"; the website is correctly indexed; no inconsistencies or duplicated information show up in Google Analytics; and the Google webmaster guidelines in general are respected? NOTE: I'm not sure if I should ask this question here or on ServerFault. It sits somewhere between programming and server administration. Let me know if you think ServerFault is a more appropriate place for this and I'll move it. Thanks.
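    One commonly recommended option, if the mirrors have to stay online, is to point search engines at a single preferred copy with a canonical link element. A minimal sketch, assuming the hypothetical primary domain example.com; each page served by a mirror would carry this in its head:

        <link rel="canonical" href="http://example.com/some-page/" />

    Alternatively, if the mirrors exist only for redundancy, a site-wide 301 redirect from each mirror to the primary domain avoids the duplicate-content question entirely.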

  • How will a search engine read data from my Ajax-based webapp?

    - by Jack W-H
    OK, this is not entirely related to programming, so I'm sorry. But I'd like to know about this. I've got a webapp. There's one column where a list of results is fetched from the database. When you click one, jQuery fetches the information associated with that result and puts it into the second column, all without a refresh, using Ajax. Is it still possible for Google to read that content? I understand it can follow links... but presumably not Javascript actions? If that's the case, what do other Ajax-heavy websites do about search engine optimisation? Jack
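    One common workaround is progressive enhancement: render each result as a real link that crawlers can follow, and let JavaScript intercept the click to do the Ajax load instead. A minimal sketch, assuming jQuery and a hypothetical /results/123 URL scheme that serves full pages on its own:

        <a href="/results/123" class="result">Result title</a>

        <script>
        $('a.result').click(function (e) {
            e.preventDefault();                         // suppress the full page load
            $('#detail').load(this.href + ' #content'); // fetch the same URL via Ajax
        });
        </script>

    The crawler follows the href and indexes the full page; JavaScript users get the refresh-free behaviour.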

  • How Can I Deal With Those Dead Links After Revamping My Web Site?

    - by skyflyer
    A couple of months ago, we revamped our web site. We adopted a totally new site structure; specifically, we merged several pages into one. Everything looks charming. However, there are lots of dead links which produce a large number of 404 errors. So what can I do about it? If I leave it alone, could it bite back someday, say by eating up my PageRank? The basic option is a 301 redirect, but that seems almost impossible considering the number of dead links. So is there any workaround? Thanks for your consideration!
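    If the old URLs follow a recognisable pattern, a few regex-based rules can cover thousands of dead links at once, so there is no need for one redirect per page. A minimal .htaccess sketch, assuming Apache and hypothetical old paths like /articles/... that were merged into a single archive page:

        # Map every old article URL onto the page it was merged into
        RedirectMatch 301 ^/articles/.* http://example.com/archive/
        # Give anything genuinely gone a helpful custom 404 page
        ErrorDocument 404 /not-found.html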

  • Multilanguage website sitemap

    - by Alex
    My site is i18n based on the following structure: mydomain.com/en, mydomain.com/en/product/blue-widgets, mydomain.com/fr, mydomain.com/fr/product/blue-widgets. The site is internationalised, not localised, so I don't want to geo-target specific locales, just target "french" or "english" speaking users. When submitting a sitemap to the search engines, should I send one sitemap with links to all the different language versions, or have one separate sitemap for each language? Is that even possible?
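    Both approaches are possible: a single sitemap can simply list every language version as an ordinary URL, or a sitemap index file can reference one sitemap per language and be submitted on its own. A minimal sketch of the index variant, assuming the hypothetical domain mydomain.com:

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap><loc>http://mydomain.com/sitemap-en.xml</loc></sitemap>
          <sitemap><loc>http://mydomain.com/sitemap-fr.xml</loc></sitemap>
        </sitemapindex>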

  • How long does it take Google to update all links after a 301 redirect?

    - by romant
    I just changed the location of my blog and have set up the appropriate redirects. Does anyone have knowledge of, or experience with, how long it takes Google to update all the links? The reason I ask is that I wish to change the A record, which will eliminate the .htaccess file and thus null and void the redirect. How long must I wait before undertaking this? Thank you.

  • How to build category/subcategory/city/firm-name URLs?

    - by kkalgidim
    I am using Ruby on Rails and I have the models Category, Subcategory, City and Firm. When I click on a category, it should show the subcategories, and the permalink should be xxx.com/category. When I click on a subcategory, it should show firms and city names: xxx.com/category/subcategory. When I click on a city name, it should filter the firms belonging to that city: xxx.com/category/subcategory/city. When I click on a firm name, it should show xxx.com/category/subcategory/city/firm-name. Firms may have more than one subcategory. I used permalink_fu, but I could not get the subcategory part of the system working. The category, subcategory, city and firm tables each have their own permalink field in the database, but I don't know how to combine them dynamically. I can do xxx.com/category, but I can't do xxx.com/category/subcategory. How can I do that? Please help me.
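    One way to combine the stored permalinks is a single route that captures all four segments and lets the controller resolve each one by its permalink field. A minimal sketch, assuming Rails 2-era routing (to match permalink_fu) and a hypothetical FirmsController:

        # config/routes.rb
        map.connect ':category/:subcategory/:city/:firm',
                    :controller => 'firms', :action => 'show'

        # app/controllers/firms_controller.rb
        def show
          @firm = Firm.find_by_permalink(params[:firm])
        end

    Shorter URLs (category only, category/subcategory, and so on) would need their own routes declared above this catch-all, since routes are matched in order.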

  • Are AJAX sites crawlable by search engines?

    - by frankadelic
    I had always assumed that AJAX-driven content was invisible to search engines (i.e. content inserted into the DOM via XMLHttpRequest). For example, on this site the main content is loaded via an AJAX request by the browser: http://www.trustedsource.org/query/terra.cl ...if you view this page with Javascript disabled, the main content area is blank. However, the Google cache shows the full content after the AJAX load: http://74.125.155.132/search?q=cache:JqcT6EVDHBoJ:www.trustedsource.org/query/terra.cl+http://www.trustedsource.org/query/terra.cl&cd=1&hl=en&ct=clnk&gl=us So apparently search engines do index content loaded by AJAX. Questions: Is this a new feature in search engines? Most postings on the web indicate that you have to publish duplicate static HTML content for search engines to find it. Are there any tricks to get AJAX-driven content crawled by search engines (besides creating duplicate static HTML content)? Will the AJAX-driven content be indexed if it is loaded from a separate subdomain? How about a separate domain?

  • How Do Search Engine Bots Crawl Forums?

    - by Waleed Eissa
    If I have a forums site with a large number of threads, will the search engine bot crawl the whole site every time? Say I have over 1,000,000 threads on my site; will they all get crawled every time the bot visits, or how does it work? I want my website to be indexed, but I don't want the bot to kill it! In other words, I don't want the bot to keep crawling the old threads again and again every time it crawls my website. Also, what about pages crawled before? Will the bot request them on every visit to make sure they are still on the site? I'm asking because I only link to the latest threads, i.e. there's a page that contains a list of all the latest threads, but I don't link to the older ones; they have to be explicitly requested by URL, e.g. http://www.mysite.com/showthread.aspx?threadid=7 . Will this work to stop the bot from bringing my site down and consuming all my bandwidth? P.S. The site is still under development, but I want to know in order to design it so that search engine bots don't bring it down. Thanks
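    Crawl scheduling can't be dictated, but crawl rate can be throttled. A minimal robots.txt sketch; note that Crawl-delay is honoured by some bots (Bing, Yahoo) but not by Googlebot, whose rate is instead configured in Google Webmaster Tools:

        User-agent: *
        # Ask compliant bots to wait 10 seconds between requests
        Crawl-delay: 10

    Bots generally revisit pages at a frequency based on how often they appear to change, so old, stable threads tend to be fetched less and less often over time.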

  • 301 redirect vs parking

    - by Pat
    I have several domain names registered, each a slight variant of the others, e.g. fastcar.com, fast-car.com, fastcar.co.uk, fast-car.co.uk, etc. I don't wish to be penalized for duplicate content or spammy links by any of the major search engines. Should I park them all directly on the main domain I wish to promote, 301 redirect them to the main domain, or not use them at all? Thanks
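    A 301 redirect is generally considered the safest of the three, since it consolidates whatever links the variant domains attract without serving duplicate content. A minimal sketch, assuming Apache and that fastcar.com is the domain being promoted; in the configuration (or .htaccess) of each variant domain, a single mod_alias directive forwards everything:

        # In the vhost of fast-car.com, fastcar.co.uk, etc.
        Redirect 301 / http://fastcar.com/

    Parking usually resolves the variants without any redirect, which is exactly the duplicate-content situation worth avoiding.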

  • How can I efficiently group a large list of URLs by their host name in Perl?

    - by jesper
    I have a text file that contains over one million URLs. I have to process this file in order to assign the URLs to groups based on host address:

        { 'http://www.ex1.com' => ['http://www.ex1.com/...', 'http://www.ex1.com/...', ...],
          'http://www.ex2.com' => ['http://www.ex2.com/...', 'http://www.ex2.com/...', ...] }

    My current basic solution takes about 600 MB of RAM to do this (the file itself is about 300 MB). Could you suggest some more efficient approaches? My current solution simply reads the file line by line, extracts the host address with a regex and puts the URL into a hash. EDIT: Here is my implementation (I've cut off irrelevant things):

        use Storable;   # provides store()

        while ($line = <STDIN>) {
            chomp($line);
            $line =~ /(http:\/\/.+?)(\/|$)/i;
            $host = "$1";
            push @{$urls{$host}}, $line;
        }
        store \%urls, 'out.hash';
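    One way to cut the memory footprint, sketched here as a suggestion rather than a drop-in replacement, is to avoid holding the whole structure in RAM at all: tag each line with its host, sort the tagged file on disk, and then stream the sorted output one group at a time:

        # Pass 1: prefix each URL with its host (constant memory)
        while (my $line = <STDIN>) {
            chomp $line;
            print "$1\t$line\n" if $line =~ m{^(http://[^/]+)}i;
        }
        # Then sort on disk:  sort tagged.txt > sorted.txt
        # (the host is the line prefix, so a plain sort groups identical hosts)
        # Pass 2 reads sorted.txt sequentially; all URLs for a host are adjacent,
        # so each group can be processed and discarded before the next begins.

    If the hash itself is required, storing hosts without the repeated 'http://' prefix and URLs as paths only would also shrink it considerably.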

  • Website Sitemaps and <priority>: is it working?

    - by Mike Gleason jr Couturier
    Hi, My "Privacy Policy" page is seen more important by Google than other really more important pages on my website. I'm currently creating a script to generate a sitemap, should I bother with the priority? How do you effectively assign priorities to pages? I consider one of my page important but the page have less content than another one less important to my eyes... but maybe Google bot will see it the other way around. If my degree of "importantness" differs from the one of Google, will I get penalized on the ranking for a particular page? Thank you for sharing your black art with us :P

  • Is there any way of making JSON data readable by a Google spider?

    - by leeand00
    Is it possible to make JSON data readable by a Google spider? Say, for instance, that I have a JSON feed that contains the data for an e-commerce site. This JSON data is used to populate a human-readable page in the user's browser. (I.e., the translation from JSON data to human-displayed page is done inside the user's browser; not my choice, just what I've been given to work with, it's an old legacy CGI application and not an actual server-side scripting language.) My concern is that the Google spiders will not be able to pick up or directly link to the item in question when a user clicks on it in Google; the user would be presented with an index page full of all the items, rather than being linked directly to the item they clicked on. Is there any way of "informing" the Google spider in the JSON that it should feed the user a different link?

  • I want to combine my www and non-www and keep the link juice from both.

    - by John Ray
    My website shows up for some keywords under the www version and for others under the non-www version. Seaquake shows more links to the non-www version. It is a PR2 either way. I would like to combine the link juice of the two versions into the non-www version. Does anyone know the best way to combine the two and keep the link juice of both? Is it as simple as a 301 redirect, and if so, does the 301 need to be handled in any specific way?
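    A single host-level 301 is the standard fix, so that every www URL forwards to its non-www twin path-for-path. A minimal .htaccess sketch, assuming Apache with mod_rewrite (the domain is a placeholder):

        RewriteEngine On
        # Send www.example.com/anything to example.com/anything
        RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
        RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

    Setting the preferred domain in Google Webmaster Tools is often suggested alongside the redirect.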

  • Does URL Shortening affect Page Ranking?

    - by rc
    Recently there has been a lot of hype about URL shortening. I guess some URL-shortening services even offer tracking stats. But doesn't adding one more level of look-up to the original URL affect page ranking in any way? Just curious to know.

  • Will news ticker using overflow:hidden cause Google to see site as spam?

    - by molipix
    In the hope of tempting Googlebot with fresh content, I've implemented a homepage news ticker which displays the 20 most recent headlines on our site. The implementation I have chosen is a <ul>, with each headline being a <li>. Initially all the <li> elements have no style, but Javascript kicks in on page load and gives all but one of them a style="display:none" attribute. Javascript then displays each of the other 19 headlines in a loop. So far so good. However, in order to prevent a visually unpleasant page load where the 20 items display and then immediately collapse, I am using overflow:hidden on the <ul> element. Anyone got a view on what Googlebot is likely to make of this? Does the fact that I'm using overflow:hidden make the content look like spam?

  • mod_rewrite and htaccess

    - by chris
    I have set up a few rules based on other questions, but now my CSS breaks. I used to have the URL /eshop/cart.php?products_id=bla and everything worked fine, but now, with my rewritten URL /product/product-title/, the page loses its base directory. Is there an option to fix this, so I don't have to go back and put the full URL in all the img src tags and so on?
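    The usual cause is that relative paths such as css/style.css now resolve against the rewritten /product/product-title/ prefix. One fix, sketched here with a placeholder domain, is to declare a base URL in the page head, or to switch the references to root-relative paths:

        <base href="http://example.com/" />
        <!-- or reference assets root-relatively: -->
        <link rel="stylesheet" href="/eshop/css/style.css" />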

  • Cannot see my WordPress website in Google search

    - by ion
    Hi guys, I recently uploaded a site made with WordPress. The site URL is oakabeachvolley.gr. I have set the privacy settings of WordPress so the site is visible to search engines. However, after almost 45 days the site is invisible on Google, even when I search using the URL name and very specific keywords. Having made quite a few sites with WordPress, I have never seen this behavior before; sites eventually become visible to the Google engine, sometimes even on the first day. In this case, however, the site does not show anywhere in the first 20 pages. Any help would be greatly appreciated.

  • Do you know the best site on the Net to learn XHTML 1.0 Strict, other than W3Schools.com but with better content?

    - by metal-gear-solid
    Do you know the best site on the Net to learn XHTML, other than W3Schools.com but with better and more up-to-date content? I have to link some friends who want to learn HTML. I like the "Try it" editor of W3Schools but not the content. I need semantic discussion as well: what an element is all about and what its semantic value is, and whether we should use it or not even if it's valid, etc. Is there any other site focused on semantic, accessible and valid XHTML, with good content and a "try it" editor like W3Schools? Or should I now suggest that someone learn HTML 5 directly?

  • URL rewriting: how do I query the database?

    - by Liam
    I'm trying to understand how URL rewriting works. I have the following link: mysite.com/profile.php?id=23. I want to rewrite the above URL with the user's first and last name: mysite.com/directory/liam-gallagher. From what I've read, however, you specify the rule for what the URL should be output as, but how do I query my table to get each user's name? Sorry if this is hard to understand; I've confused myself!
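    The rewrite rule itself never touches the database; it only maps the pretty URL back onto the PHP script, which then looks the user up by the name slug instead of the numeric id. A minimal sketch, assuming Apache and a hypothetical slug column on the users table ($pdo is an assumed PDO connection):

        RewriteEngine On
        RewriteRule ^directory/([a-z0-9-]+)/?$ profile.php?slug=$1 [L,QSA]

        <?php
        // profile.php: resolve "liam-gallagher" to a user row
        $stmt = $pdo->prepare('SELECT * FROM users WHERE slug = ?');
        $stmt->execute(array($_GET['slug']));
        $user = $stmt->fetch();

    Links on the site are then generated from the stored slug, so the rewrite only ever runs in the pretty-URL-to-script direction.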

  • mod_rewrite different rules for different pages

    - by Sophia Gavish
    Hi, I'm trying to understand how mod_rewrite works. I've used it before, but this week I wrote rules for a new website and they don't work. I want to make a rule so that

        www.example.com/media/?gallery=galleryname&album=albumname&pid=pictureid

    looks like

        www.example.com/media/galleryname/albumname/pictureid

    The rule is

        RewriteRule ^([^/])/([^/])/([^/]*)$ /media/?gallery=$1&album=$2&pid=$3 [L]

    and here is the code below:

        Options -Indexes
        Options +FollowSymLinks
        RewriteEngine on
        RewriteBase /
        RewriteCond %{REQUEST_METHOD} !^(TRACE|TRACK|GET|POST|HEAD)$
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-l
        RewriteRule ^([^/])/([^/])/([^/]*)$ /media/?gallery=$1&album=$2&pid=$3 [L]

    I really want to know what I'm missing, because I tried some examples and it looks fine to me. Maybe the rule is wrong because /media/ is an actual folder? Thanks.
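    For what it's worth, one likely culprit in the rule above: ([^/]) without a quantifier matches exactly one character, so only single-character gallery and album names would ever match, and the pattern also omits the leading media/ segment of the incoming URL. A sketch of the corrected rule under those assumptions:

        RewriteRule ^media/([^/]+)/([^/]+)/([^/]+)/?$ /media/?gallery=$1&album=$2&pid=$3 [L]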

  • Generic async loading method for web page scripts?

    - by boomhauer
    The Google Analytics code moved to an async load model some time back. I've noticed that a lot of the other scripts I use on many sites cause slow load times, specifically the AddThis script and the Facebook Like button. The slow load times of these scripts are causing the Google bot to calculate my page load times as much slower than previously. I'd like to know if there is a standard/generic way of causing these scripts to load async as well, or perhaps a pointer to someone who has done this work already. It seems this would be a popular thing to do, but I haven't had much luck searching around.
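    The pattern the async Analytics snippet uses is itself generic: create the script element from JavaScript and insert it, so the download doesn't block page rendering. A minimal sketch (the script URL is a placeholder):

        (function () {
            var s = document.createElement('script');
            s.src = 'http://example.com/widget.js'; // the third-party script
            s.async = true;                         // hint for supporting browsers
            var first = document.getElementsByTagName('script')[0];
            first.parentNode.insertBefore(s, first);
        })();

    Whether a given widget tolerates being loaded this way depends on the widget; some document their own async snippets, which are worth preferring when available.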
