Search Results

Search found 20852 results on 835 pages for 'local seo'.

  • triangular link exchange

    - by Bharanikumar
    What is a triangular link exchange? Everyone says that when Site A links to Site B, Site B links to Site C, and Site C links back to Site A, it is called a triangular link exchange. Can someone explain it from a programmer's point of view? I don't know how to start building this application. Please advise how to start. (A small sketch of the link cycle follows this entry.)

    Read the article
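
    A minimal sketch, in PHP, of the cycle described in the question above: each participating site links to the next one in the list, and the last links back to the first, so no two sites link directly to each other. The site names and URLs here are made up for illustration.

        <?php
        // Hypothetical list of participating sites (illustration only).
        $sites = array(
            'A' => 'http://site-a.example',
            'B' => 'http://site-b.example',
            'C' => 'http://site-c.example',
        );

        // Triangular (circular) assignment: each site links to the next,
        // and the last site links back to the first.
        $keys  = array_keys($sites);
        $links = array();
        foreach ($keys as $i => $from) {
            $to = $keys[($i + 1) % count($keys)];   // wrap around at the end
            $links[$from] = $to;
        }

        print_r($links);   // A => B, B => C, C => A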

  • Silverlight and Search Engines

    - by Ben
    Hi, I have recently started learning Silverlight (and web development in general) and have been advised by a friend that Silverlight isn't search-engine friendly, because Silverlight content isn't HTML. Is there any way of getting around this and getting my site listed in search engines without paying? Any advice on getting my site into search engine listings would be greatly appreciated. Thanks

    Read the article

  • PHP & AJAX SEO - for users with and without JavaScript

    - by RussellHarrower
    I understand this may come across as an open question, but I need a way to solve this issue. I can either make the site do AJAX requests that load the body content, or I can have links that reload the whole page. I need the site to be SEO compliant, and I would really like the header not to reload when the content changes, because we have a media player that plays live audio. Is there a way so that Googlebot, or a visitor without JavaScript, gets normal href navigation, but visitors with JavaScript enabled get the AJAX behaviour? (A sketch of one server-side approach follows this entry.)

    Read the article
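
    One common approach (progressive enhancement) is to keep ordinary href links that load full pages, intercept clicks with JavaScript, and have the server return only the body fragment when the request arrives via XMLHttpRequest. A minimal PHP sketch of the server side, assuming hypothetical header.php/footer.php includes and a getBodyContent() helper:

        <?php
        // content.php: serves a full page to crawlers and non-JS visitors,
        // and only the body fragment when called via XMLHttpRequest.

        // Hypothetical helper that renders the requested page's body HTML.
        function getBodyContent($page)
        {
            // ... look up and return the HTML for $page ...
            return '<h1>' . htmlspecialchars($page) . '</h1>';
        }

        $page = isset($_GET['page']) ? $_GET['page'] : 'home';

        // Most JS libraries (including jQuery) send this header with AJAX requests.
        $isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
            && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

        if ($isAjax) {
            // JavaScript is available: return just the fragment so the
            // persistent header (and the live audio player) keeps running.
            echo getBodyContent($page);
        } else {
            // Crawler or non-JS visitor: return the complete page.
            include 'header.php';          // hypothetical shared header
            echo getBodyContent($page);
            include 'footer.php';          // hypothetical shared footer
        }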

  • Does Google crawl AJAX content?

    - by Doug
    On the home page of my site I use jQuery's ajax function to pull down a list of recent activity of users. The recent activity is displayed on the page, and each line of it includes a link to the profile of the user who performed the activity. Will Google actually make the AJAX call to pull down this info and use it in calculating page relevancy / link juice flow? I'm hoping that it does not, because the user profile pages are not very index-worthy, and I don't want all those links to the user profile pages diluting my home page's link juice away from other, more important links.

    Read the article

  • How many pages can I add to a new website without ending up in the Google sandbox?

    - by François
    Hello there, I'm about to launch a new website (an e-commerce store) which has 15,000 pages. Of course, I will not publish all these pages at the same time, but I'm looking for information on how many pages I should start with without landing in the sandbox. For example, could I start with a 50-page website, or is that too much? Also, do you have an idea (I know there are no precise rules here) of the frequency / volume of pages I could add later? Is 50 pages a day OK? Thank you very much for your advice, and sorry for my English :-)

    Read the article

  • php request variables: assigning to $_GET

    - by chris
    If you take a look at a previous question (http://stackoverflow.com/questions/2690742/mod-rewrite-title-slugs-and-htaccess), I am using the solution that Col. Shrapnel proposed, but when I assign values to $_GET in the actual file rather than from the request, the code doesn't work: it falls through as if the $_GET variables were not set. The code I have come up with is:

        if (!empty($_GET['cat'])) {
            $_GET['target'] = "category";
            if (isset($_GET['page'])) {
                $_GET['pageID'] = $_GET['page'];
            }
            // Escape the slug before using it in the query.
            $slug = mysql_real_escape_string($_GET['cat']);
            $URL_query  = "SELECT category_id FROM cats WHERE slug = '" . $slug . "'";
            $URL_result = mysql_query($URL_query);
            $URL_array  = mysql_fetch_array($URL_result);
            $_GET['category_id'] = $URL_array['category_id'];
        } elseif (!empty($_GET['product'])) {
            $_GET['target'] = "product";
            $slug = mysql_real_escape_string($_GET['product']);
            $URL_query  = "SELECT product_id FROM products WHERE slug = '" . $slug . "'";
            $URL_result = mysql_query($URL_query);
            $URL_array  = mysql_fetch_array($URL_result);
            $_GET['product_id'] = $URL_array['product_id'];
        }

    The original query string I'm trying to reproduce is /cart.php?Target=product&product_id=16142&category_id=249, and I'm trying to build those query-string variables in code and then include cart.php so I can use cleaner URLs. So I have product/product-title-with-clean-url/ rewritten to slug.php?product=slug. The slug script then searches the database for a record with the matching slug and returns the product_id, as in the code above, then builds the query-string variables and includes cart.php.

    Read the article

  • seo custom domains

    - by BamBam
    I'm trying to do the following: I have a website like apartments.com. Sometimes I want to expand to different cities, so for SEO purposes I might create a separate domain like bostonapartments.com or newyorkapartments.com for only Boston and New York apartments. bostonapartments.com and the main domain apartments.com are all hosted on one server. What I did was use an Apache VirtualHost config to direct bostonapartments.com to a directory on that server, and then used an iframe to load the content for bostonapartments.com from apartments.com. So all the content is hosted on apartments.com, but bostonapartments.com pulls its content from apartments.com. How can I accomplish this effectively and in a scalable way using PHP, Apache and MySQL? By the way, I do not own apartments.com; I'm just using it as an example. (A sketch of a scalable alternative to the iframe follows this entry.)

    Read the article
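
    One scalable alternative to the iframe, sketched below under assumptions: point every city domain at the same application (for example with ServerAlias entries in a single VirtualHost) and branch on the Host header, so each domain serves its own real, indexable HTML. The table and column names in the comments are hypothetical; the domains are the ones from the question.

        <?php
        // index.php served by one VirtualHost that answers for all domains.

        // Map each city domain to a city filter (illustration only).
        $cityByHost = array(
            'bostonapartments.com'  => 'boston',
            'newyorkapartments.com' => 'new-york',
        );

        $host = strtolower(preg_replace('/^www\./', '', $_SERVER['HTTP_HOST']));

        if (isset($cityByHost[$host])) {
            // City-specific domain: restrict listings to that city.
            $city = $cityByHost[$host];
        } else {
            // Main domain (apartments.com in the example): show everything.
            $city = null;
        }

        // From here the normal application runs; $city is simply an extra
        // (escaped/parameterised) filter on the listings query, e.g.
        //   SELECT * FROM listings WHERE city = :city
        // No iframes needed, and adding a new city is one more array entry.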

  • how to remove .php from a certain file (with apache .htaccess)

    - by user2015253
    I want one specific file (only this one!) to be reachable without its .php extension. But it uses GET parameters and they must remain. I tried it in my .htaccess with something like:

        RewriteEngine on

    followed by either

        RewriteRule ^file(.*)$ file.php$1

    or

        RewriteRule ^file(.+)$ file.php$1

    but it doesn't work (the first gives a 500 error, the second a 404). Example of what I want to call in the browser: file?param=asd&foo=bar should be served as file.php?param=asd&foo=bar

    Read the article

  • What is the precision of the priority field in sitemap.xml?

    - by Christoph
    Unfortunately the specification does not say anything about precision. The XML schema definition states that it is of type xsd:decimal:

        <xsd:restriction base="xsd:decimal">
          <xsd:minInclusive value="0.0"/>
          <xsd:maxInclusive value="1.0"/>
        </xsd:restriction>

    I have a sitemap generator that uses up to 10 digits after the decimal point, where often only the last few digits differ. These numbers are perfectly valid according to the XSD, yet I have found some pages that state that only 0.0, 0.1, 0.2, ..., 1.0 are valid values. How will search engines react to such a sitemap? Will some just round the value? I know it is unlikely that anyone can answer this definitively unless they work for that search engine, but reports of first-hand experience would also help. (A sketch of rounding the priority at generation time follows this entry.)

    Read the article
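
    If the goal is simply to stay within the commonly cited 0.0 to 1.0 range in steps of 0.1, the generator can round the computed priority to one decimal place before writing each <url> entry. A minimal PHP sketch, assuming a hypothetical $computedPriority score that is already roughly between 0 and 1:

        <?php
        // Clamp and round a computed priority so the sitemap only ever
        // contains the coarse values 0.0, 0.1, ..., 1.0.
        function sitemapPriority($computedPriority)
        {
            $p = max(0.0, min(1.0, (float) $computedPriority));  // clamp to [0, 1]
            return number_format(round($p, 1), 1, '.', '');      // e.g. "0.7"
        }

        // Hypothetical usage inside a sitemap generator:
        $computedPriority = 0.7342918345;
        echo "<url>\n";
        echo "  <loc>http://www.example.com/some-page</loc>\n";
        echo "  <priority>" . sitemapPriority($computedPriority) . "</priority>\n";
        echo "</url>\n";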

  • Is window.location.href = 'some_page.html' followed by search engines?

    - by Arkaaito
    Currently our website uses links to allow the user to change their locale. The problem with this is that you get a lot of random outlinks from each page on the site to... the same page, in other languages. When a search engine traverses this, it gets an excessively complex view of the site. We were going to change it to a form post to avoid this. However, it seems to me that we should just be able to change it to an onclick="window.location.href='change_my_language.php'" rather than an href="change_my_language.php". Am I right? Or do the major search engines scan for and follow this sort of thing nowadays?

    Read the article

  • Has Google introduced a system that allows multiple "keyword-only" domains per site?

    - by tags2k
    I've been told by a client that "a friend" told them that as of January 2010, Google allows multiple domain names that have keywords in them to be associated with a single site. To be honest this sounds rather April-foolish but I'm not sure when his "friend" told him so for the time being I have to take it at face value. I've heard nothing of this and have searched for such a thing this morning, to find nothing but warnings against this practice. Said client seems keen on buying up lots of domains today, so before he insists upon it I just want to be absolutely sure - has Google silently introduced such an allowance, or is there something else they introduced earlier this year that he could be getting confused with? Thanks for any light you can shine on this!

    Read the article

  • checking google pagerank api

    - by Bharanikumar
    Hi, I am building a small application for link exchange, so before I approve a URL I want to check that site's PageRank. I know there are sites that show the Google PageRank for a page, but I am looking for an API for PageRank checking... Thanks, Bharanikumar

    Read the article

  • CakePHP, route old Google search results to new home page

    - by ion
    Hi there, I have created a new website for a company and I would like all the previous search engine results to be redirected. Since there were quite a few pages, and most of them used an id, I would like to use something generic instead of re-routing every old page. My first thought was to do this:

        Router::connect('/*', array('controller' => 'pages', 'action' => 'display', 'home'));

    and put it at the very end of the routes.php file (since routes are matched in order), so that any request not matched by the earlier routes would fall through to this one and go to the homepage. However, this does not work. I'm pasting my routes.php file (since it is small), hoping that someone can give me a hint (a sketch of an alternative follows this entry):

        Router::connect('/', array('controller' => 'pages', 'action' => 'display', 'home'));
        Router::connect('/company/*', array('controller' => 'articles', 'action' => 'view'));
        Router::connect('/contact/*', array('controller' => 'contacts', 'action' => 'view'));
        Router::connect('/lang/*', array('controller' => 'p28n', 'action' => 'change'));
        Router::connect('/eng/*', array('controller' => 'p28n', 'action' => 'shuntRequest', 'lang' => 'eng'));
        Router::connect('/gre/*', array('controller' => 'p28n', 'action' => 'shuntRequest', 'lang' => 'gre'));
        Router::parseExtensions('xml');

    Read the article
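
    One thing to keep in mind is that a catch-all route only maps a URL to an action; it does not issue a redirect, so old URLs would render the homepage at their old address rather than sending search engines a 301. A minimal sketch of an alternative, assuming CakePHP 1.x (which these routes appear to be): keep the catch-all route last in routes.php but point it at a dedicated action that performs a permanent redirect. The action name used here is hypothetical.

        <?php
        // routes.php: the specific routes stay above; this catch-all goes last.
        Router::connect('/*', array('controller' => 'pages', 'action' => 'legacy_redirect'));

        // pages_controller.php (excerpt): send old URLs home with a 301 so
        // search engines update their index instead of keeping the old URL.
        class PagesController extends AppController {
            function legacy_redirect() {
                $this->redirect('/', 301, true);   // permanent redirect, then exit
            }
        }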

  • String Functions in IIS Url Rewrite Module

    - by Nariman
    The IIS URL Rewrite Module ships with three built-in functions:

        * ToLower - returns the input string converted to lower case.
        * UrlEncode - returns the input string converted to URL-encoded format. This function can be used if the substitution URL in a rewrite rule contains special characters (for example non-ASCII or URI-unsafe characters).
        * UrlDecode - decodes the URL-encoded input string. This function can be used to decode a condition input before matching it against a pattern.

    The functions can be invoked using the following syntax: {function_name:any_string}

    The question is: can this list be extended by introducing a Replace function that is available for changing values within a rewrite rule action or condition? Another way to frame the question: is there any way to do a global replace on an incoming URL using this module? It seems that you are limited to regular expressions and back-references to construct strings, with no search/replace facility for replacing every X with Y in {REQUEST_URI} before issuing a redirect.

    Read the article

  • Best methods to make URLs friendly?

    - by Geuis
    We're working on revising the URL structure for some of our movie content, but we aren't quite sure of the best way to handle odd characters. For example: '303/302', '8 1/2 Women', 'Dude, Where's My Car?', '9-1/2 Weeks'. So far, we're thinking:

        /movies/303-302
        /movies/8-1-2-women
        /movies/dude-wheres-my-car
        /movies/9-1-2-weeks

    Is this the best solution? Is there anything we're forgetting? (A sketch of a slug function along these lines follows this entry.)

    Read the article
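
    A minimal PHP sketch of a slug function that produces URLs like the ones above: lowercase, drop apostrophes, collapse every other run of non-alphanumeric characters into a single hyphen, and trim stray hyphens. The function name is just for illustration, and accented characters would need transliteration on top of this.

        <?php
        // Turn a movie title into a URL-friendly slug, e.g.
        //   "Dude, Where's My Car?" -> "dude-wheres-my-car"
        //   "8 1/2 Women"           -> "8-1-2-women"
        function slugify($title)
        {
            $slug = strtolower($title);
            $slug = str_replace("'", '', $slug);              // drop apostrophes entirely
            $slug = preg_replace('/[^a-z0-9]+/', '-', $slug); // everything else becomes "-"
            return trim($slug, '-');                          // no leading/trailing hyphens
        }

        // Quick check against the examples from the question:
        foreach (array('303/302', '8 1/2 Women', "Dude, Where's My Car?", '9-1/2 Weeks') as $title) {
            echo slugify($title), "\n";
        }
        // 303-302
        // 8-1-2-women
        // dude-wheres-my-car
        // 9-1-2-weeks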

  • Zend, slow load, "waiting for response" for 20-80 seconds on local site

    - by Tony C.
    So I have several sites running under the same Zend setup. All of the sites run normally except one. Upon loading or reloading this one site, regardless of which page you are on (except the 404 page, explained later), you get a serious pause before any content begins to download. Using Firebug's Net panel you can see that the first request, www.(siteaddress).com.local, shows a purple "waiting for response" bar lasting anywhere from 20 to sometimes 80+ seconds, and this isn't a dev site; it is a local site under localhost. What I've managed to figure out so far is that all pages do this except my 404 page. The 404 page doesn't suffer from this because it uses a separate controller (the error controller) and therefore bypasses much of the controller code and functions the other parts of the site use. Using exit statements I've managed to figure out that the problem happens somewhere between my postDispatch and my main (topmost) controller's init function. If I exit in the main controller's init, the page loads (then exits instantly, no wait). If I do the same in preDispatch or postDispatch, the page waits the 20-80 seconds and then exits. Is there a diagram or explanation somewhere, or a way for me to find out what events fire between postDispatch and the main controller's init function? Or does anyone have any clue what might cause this? Any help would be greatly appreciated. (A sketch of a timing plugin follows this entry.)

    Read the article
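
    One way to see where the time goes without guessing at the dispatch order is to register a front controller plugin that logs a timestamp at each hook Zend Framework 1 exposes (routeStartup, routeShutdown, dispatchLoopStartup, preDispatch, postDispatch, dispatchLoopShutdown), plus one line in the slow controller's init(). A minimal sketch, assuming ZF1 and logging to the PHP error log:

        <?php
        // Log a timestamp at every front controller hook so the slow gap
        // between two hooks becomes visible in the error log.
        class TimingPlugin extends Zend_Controller_Plugin_Abstract
        {
            private function mark($hook)
            {
                error_log(sprintf('[timing] %s at %.4f', $hook, microtime(true)));
            }

            public function routeStartup(Zend_Controller_Request_Abstract $request)        { $this->mark('routeStartup'); }
            public function routeShutdown(Zend_Controller_Request_Abstract $request)       { $this->mark('routeShutdown'); }
            public function dispatchLoopStartup(Zend_Controller_Request_Abstract $request) { $this->mark('dispatchLoopStartup'); }
            public function preDispatch(Zend_Controller_Request_Abstract $request)         { $this->mark('preDispatch'); }
            public function postDispatch(Zend_Controller_Request_Abstract $request)        { $this->mark('postDispatch'); }
            public function dispatchLoopShutdown()                                         { $this->mark('dispatchLoopShutdown'); }
        }

        // In the bootstrap (before dispatch):
        Zend_Controller_Front::getInstance()->registerPlugin(new TimingPlugin());

        // And a matching line in the slow controller's init():
        //   error_log(sprintf('[timing] controller init at %.4f', microtime(true)));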

  • Google says: Sort parameters in URL problematic

    - by feklee
    From Google's recommendations for URL structure: "Sorting parameters. Some large shopping sites provide multiple ways to sort the same items, resulting in a much greater number of URLs. For example: http://www.example.com/results?search_type=search_videos&search_query=tpb&search_sort=relevance&search_category=25" When linking from outside, having URLs that differ only by sort parameters is obviously a bad idea: Google will not understand that these links point to the same item, i.e. that the item is popular, so the ranking will be lower than it should be. But what's the alternative? Using a fragment identifier (#) and then doing the sorting in JavaScript? What else? Some setting in Webmaster Tools? (A sketch of one approach follows this entry.)

    Read the article
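
    One option, not mentioned in the question, is to keep the sort parameter in the URL but declare a single canonical URL per result set, so that variants differing only in search_sort consolidate. A minimal PHP sketch that strips the sort parameter and emits a rel="canonical" link tag; the parameter name and domain are the ones from the example URL, and this is one possible approach rather than the recommended one.

        <?php
        // Build a canonical URL for the current results page by dropping the
        // sort parameter, then emit it in the <head> of the page.
        function canonicalUrl()
        {
            $params = $_GET;
            unset($params['search_sort']);           // sort order doesn't change the item set
            ksort($params);                          // stable parameter order
            $query = http_build_query($params);

            $path = strtok($_SERVER['REQUEST_URI'], '?');
            return 'http://www.example.com' . $path . ($query !== '' ? '?' . $query : '');
        }

        // In the page template:
        echo '<link rel="canonical" href="'
           . htmlspecialchars(canonicalUrl(), ENT_QUOTES)
           . '" />' . "\n";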

  • Why is GXHC_gx_session_id appended to URLs?

    - by Nariman
    Based on my limited understanding of this parameter [1], it seems to be used for representing cookieless session IDs in Java applications... but strangely, we're now noticing that a 3-year-old .NET stack is appearing in Bing SERPs with GXHC_gx_session_id appended to the domain, and we're not alone: http://www.bing.com/search?q=GXHC_gx_session_id http://www.google.ca/#hl=en&q=GXHC_gx_session_id When comparing Google SERPs to Bing SERPs there are some inconsistencies in whether a particular site carries this parameter; is it then a Bing-specific issue? What else could cause this parameter to be appended to indexed URLs if the target environment (anything behind the load balancers) isn't running Java? [1] - http://java.itags.org/java-web-tier-apis/72018/

    Read the article
