Search Results

Search found 5738 results on 230 pages for 'seo friendly'.

Page 24 of 230

  • Does whitespace in the title tag affect SEO?

    - by amelvin
    The site I'm working on uses Umbraco, with XSLT macros generating dynamic page title tags - but the generated title tags contain lots of whitespace and linefeeds. The macros can be changed, so I'm sure the contents of the title tag could be condensed, but at this stage of development we'd rather not do any work that isn't essential. I've checked W3C and Google but I'm struggling to find anything conclusive on whitespace. So I'd like to ask: is a title tag formatted like this:

        <title>
            Sitename - The official blah blah blah - Section - Section Search Results
        </title>

    any worse for SEO than this:

        <title>Sitename - The official blah blah blah - Section - Section Search Results</title>

    And are there any other implications to leaving whitespace in the title tag?

    Read the article

  • Joomla - SEO settings and mod_rewrite

    - by Stefano
    Hi, I'm using Joomla 1.5.14 and I configured the SEO settings as in the following image. Now I need to map a few old URLs to the new site. Let's say I need to map http://mysite/old.html to the new Joomla page http://mysite/index.php?option=com_content&view=article&id=32&Itemid=70. I added the following to my .htaccess file and it works fine:

        RewriteRule ^old\.html$ index.php?option=com_content&view=article&id=32&Itemid=70

    But if I use the SEF URL in .htaccess instead (say the above page can also be reached at http://mysite/contacts.html), I get a 404 error:

        RewriteRule ^old\.html$ contacts.html

    Now the question: is it possible to use SEF URLs in a RewriteRule? Where am I going wrong? Thank you in advance, Stefano

    Read the article

  • Is W3C Valid XHTML and CSS Code, Semantic and Accessible Mark-up enough for a site's SEO?

    - by metal-gear-solid
    I created a website for a client with W3C-valid XHTML and CSS and semantic, accessible markup, and I told the client my code would be SEO friendly - the way I code is good for a site's SEO. I put all my effort into writing good code. Now my non-techie client is asking me: should he still hire an SEO company, even though I've delivered an SEO-friendly site? What would an SEO company do for him beyond what valid XHTML, CSS and semantic, accessible markup already covers?

    Read the article

  • SEO with image link alt text vs standard text-based link

    - by Infiniti Fizz
    Hi, I'm currently developing a website and the main navigation is made up of image links, because the font used for them isn't standard. My client's only worry is whether this will mess up search engine optimization. Can I just add alt text to the images like "link 1", or use the name attribute of the anchor tag? Or would it be better to have the navigation as plain anchor tags with the link names in them, like <a href="...">link 1</a>? I'm new to SEO so I really don't know which to suggest to him. Thanks for your time, InfinitiFizz

    Read the article

  • Will dynamically generated content via JavaScript hurt SEO?

    - by Luke101
    Here's what I would like to do: load content dynamically, with everything except the actual content rendered by JavaScript. I will place all the required information in a JavaScript variable or array at the bottom of the page, then use JavaScript to place it in the designated areas. These are the kinds of things I would like JavaScript to render: the login menu, header and logo info, sidebar info, footer info, dialog popups, and ads. None of the MEAT content will be rendered by JavaScript - I will use the backend server to put that content into the HTML. My logic is that proportionally more of the real content will be in the HTML, while everything else is rendered by JavaScript. Will this help or hurt SEO?
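    The post doesn't say which backend it uses, so the following is only a minimal sketch (in PHP, with made-up variable names) of the layout described above: the real content is emitted as plain server-rendered HTML, while the data for the JavaScript-rendered chrome sits in a variable at the bottom of the page.

        <?php
        // Hypothetical page template: the "meat" is server-rendered HTML,
        // while the data for the JS-rendered chrome (login menu, sidebar, ads)
        // is emitted as a JSON blob for a client-side script to pick up.
        $article = [
            'title' => 'Example article',
            'body'  => '<p>The real, crawlable content goes here.</p>',
        ];
        $chrome = [
            'login'   => ['loggedIn' => false],
            'sidebar' => ['items' => ['Latest', 'Popular']],
            'ads'     => ['slot' => 'footer-banner'],
        ];
        ?>
        <div id="content">
            <h1><?php echo htmlspecialchars($article['title']); ?></h1>
            <?php echo $article['body']; ?>
        </div>
        <script>
            // chrome data for the client-side renderer to consume
            var pageChrome = <?php echo json_encode($chrome); ?>;
        </script>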

    Read the article

  • SEO for dynamic / AJAX pages

    - by Andrew
    I have a very dynamic / AJAX-powered website, which also includes iframes, and for this reason I have a very bad SEO ranking. It came to my mind to build one more, additional version of the site (text-based / no script) and serve it to the search engines based on the user agent. Please let me know if you think that is a feasible method, and if it's not, what else you would recommend. I don't want to lose any fancy AJAX features, but I also need to keep the website on the Google map :) Thank you in advance for any answer! BTW, the website is developed in ASP.NET C#.

    Read the article

  • Using DataPager Control with AJAX and SEO

    - by Jonathan Wood
    I've just taken my first stab at making a ListView, ObjectDataSource, and DataPager run in an AJAX panel. I had trouble getting it to work until I removed the QueryStringField="page" attribute from the DataPager. This attribute causes the current page to be passed as a query argument in the URL; for obvious reasons, I guess that won't work when posting back using AJAX. Now my question is whether this hurts my SEO. When I used QueryStringField, the page links appeared as regular links with various query arguments, but now the links are just JavaScript. Haven't I hurt a search engine's ability to crawl the related pages? Or is there another approach to this?

    Read the article

  • Create an SEO and web accessibility analyzer

    - by rebellion
    I'm thinking of making a little web tool for analyzing the search engine optimization and web accessibility of a whole website. First of all, this is just a private tool for now. Crawling a whole website takes up a lot of resources and time. I've found that wget is the best option for downloading the markup for a whole site. I plan on using PHP/MySQL (maybe even CodeIgniter), but I'm not quite sure if that's the right way to do it. There's always someone who recommends Python, Ruby or Perl; I only know PHP and a little bit of Rails. I've also found a great HTML DOM parser class in PHP on SourceForge. The thing is, I need some feedback on what I should and should not do - everything from how I should run the crawl process to what I should be checking for with regard to SEO and WCAG. So, what comes to your mind when you hear this?
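    As a starting point for the "what to check" part, here is a minimal sketch of per-page checks using PHP's built-in DOMDocument (rather than the SourceForge parser class mentioned above); the individual checks and thresholds are only illustrative, not a definitive ruleset.

        <?php
        // Sketch: run a handful of basic SEO/accessibility checks against one
        // page's markup (e.g. markup previously downloaded with wget).
        function checkPage(string $html, string $url): array
        {
            $issues = [];
            $doc = new DOMDocument();
            @$doc->loadHTML($html); // suppress warnings from real-world markup

            $titles = $doc->getElementsByTagName('title');
            if ($titles->length === 0 || trim($titles->item(0)->textContent) === '') {
                $issues[] = 'missing or empty <title>';
            } elseif (strlen(trim($titles->item(0)->textContent)) > 70) {
                $issues[] = '<title> longer than ~70 characters';
            }

            $hasDescription = false;
            foreach ($doc->getElementsByTagName('meta') as $meta) {
                if (strtolower($meta->getAttribute('name')) === 'description') {
                    $hasDescription = true;
                }
            }
            if (!$hasDescription) {
                $issues[] = 'missing meta description';
            }

            foreach ($doc->getElementsByTagName('img') as $img) {
                if (!$img->hasAttribute('alt')) {
                    $issues[] = 'img without alt attribute: ' . $img->getAttribute('src');
                }
            }

            if ($doc->getElementsByTagName('h1')->length === 0) {
                $issues[] = 'no <h1> heading on the page';
            }

            return [$url => $issues];
        }

        // Usage example against a locally saved page:
        print_r(checkPage(file_get_contents('page.html'), 'http://example.com/'));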

    Read the article

  • SEO/PHP: How to convert a form-submit URL (GET method) to be SEO-friendly without JavaScript?

    - by elmas
    Hello, I have this form:

        <form action="index.php" method="get" class="search-form">
            <input type="text" size="35" name="search" class="searchBox" value="" />
            <input type="submit" value="Start Searching!" />
        </form>

    and currently I convert the URL with JavaScript:

        <script type="text/javascript">
            $(document).ready(function() {
                $('.search-form').submit(function() {
                    var value = $('.search-form input:text').val();
                    value = value.replace(/\W/g, ''); // strip non-word characters
                    window.location.href = value + "-keyword" + ".html";
                    return false;
                });
            });
        </script>

    Is there a way to make the URL SEO-friendly without JavaScript? Maybe with PHP?
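    One server-side option is to let the form submit normally and have a small PHP handler build the slug and redirect to the friendly URL, so it works with JavaScript disabled. This is only a sketch under a few assumptions: the handler is hypothetically named search.php (the slug logic could equally sit at the top of index.php), and something - a rewrite rule or router - must already map the resulting "-keyword.html" URL back to the real search script.

        <?php
        // search.php - hypothetical server-side counterpart to the jQuery handler:
        // build the slug from the submitted term and redirect to the friendly URL.
        $term = isset($_GET['search']) ? trim($_GET['search']) : '';
        $slug = preg_replace('/\W+/', '-', $term);   // same idea as the JS replace
        $slug = trim($slug, '-');

        if ($slug === '') {
            header('Location: /', true, 302);        // empty search, go back home
            exit;
        }

        header('Location: /' . rawurlencode($slug) . '-keyword.html', true, 301);
        exit;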

    Read the article

  • T-4 Templates for ASP.NET Web Form Databound Control Friendly Logical Layers

    - by joycsharp
    I just released an open source project at CodePlex, which includes a set of T-4 templates that let you build logical layers (i.e. DAL/BLL) with just a few clicks! The logical layers implemented here are based on Entity Framework 4.0, are ASP.NET Web Form data-bound control friendly, and are fully unit testable. In this open source project you get Entity Framework 4.0 based T-4 templates for the following types of logical layers:

    Data Access Layer: Entity Framework 4.0 provides an excellent ORM data access layer. It also supports T-4 templates as the built-in code generation strategy in Visual Studio 2010, so the default structure of the Entity Framework based data access layer can be customized. Here, that default structure has been enhanced to support mock testing of the Entity Framework 4.0 object model.

    Business Logic Layer: an ASP.NET Web Form data-bound control friendly business logic layer, which lets you build data-bound web applications on top of ASP.NET Web Forms and Entity Framework 4.0 in a few clicks, with solid support for mock testing.

    Download it to make your web development more productive. Enjoy!

    Read the article

  • SEO - Problems possibly related to 301 Moved Permanently

    - by ILMV
    Right, here's the story: we have had a website for one of our brands for many years. The site design was very bad, and we recently did a complete overhaul - mostly design, but also some of the backend code. The original site used links like example.com/products/item/127, and I wanted to change them to be more user friendly, in particular to include the product name, so the same link now reads example.com/product/127/my-jucy-product/. Since the switch-over we have seen our Google results take a beating (we were on the first page for our normal search terms, now we're nearer the 4th!). The other problem we're having is that the links to the old products haven't updated to the new links, despite me coding a 301 redirect from old to new. The 301 is not fired from .htaccess but from within our PHP framework. I had a look at how the site loads from an old link that is still in Google, and here's what Firebug reports:

        GET <google link>                             302 Found
        GET example.com/products/item/127             302 Found
        GET example.com/products/item/127             301 Moved Permanently
        GET example.com/product/127/my-jucy-product/  302 Found

    So the Google link gets a 302 - good. But when the old link comes in, our framework returns a 302! Only afterwards, when the request finally hits the right part of the framework, does it issue the 301. So here's my question: have our old links failed to update, and has our Google ranking nosedived, because Google sees a 302 before the 301? At the time I was reluctant to mess with our .htaccess because it had become pretty complicated and I was under some pretty intense time constraints; now I'm wondering whether that was the wrong decision and perhaps I should revisit it. Many thanks!

    Edit: Bugger, I just signed up to Webmaster Tools and I'm getting redirect errors all over the place, hundreds of them! I think this is my problem.
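    The post doesn't show the framework code, but the usual way to avoid the intermediate 302 is to send the permanent redirect as early as possible (for example in the front controller) and stop execution before the framework's default redirect logic runs. A minimal PHP sketch, with the URL pattern taken from the post and the slug lookup purely hypothetical:

        <?php
        // Hypothetical early-redirect hook: map a legacy product URL straight to
        // its new form with a single 301 before the framework's own 302 can fire.

        // Assumed helper: a real framework would look the slug up in the database.
        function lookupProductSlug(int $productId): string
        {
            $slugs = [127 => 'my-jucy-product']; // illustrative data only
            return $slugs[$productId] ?? 'product';
        }

        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

        if (preg_match('#^/products/item/(\d+)/?$#', $path, $m)) {
            $id = (int) $m[1];
            header('Location: /product/' . $id . '/' . lookupProductSlug($id) . '/', true, 301);
            exit; // stop here so no later 302 from the framework overrides the 301
        }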

    Read the article

  • Discussion on SEO best practices for site development involving PHP...

    - by Bradley Herman
    Recently at work I've started getting some experience with SEO (finally). It's something I've put off for a long time because I've always maintained that SEO is buzzword-laden b.s. pseudo-science, and that it's really about providing quality, relevant content (assuming proper header tags and the basics are covered). However, sometimes a client doesn't have stellar content yet still demands SEO and high rankings. While it's not how I design sites 100% of the time (as the design dictates the structure), I typically create a basic template from the design my boss gives me, then optimize it, then strip out the top and bottom and move them to header.php and footer.php, using the following to bring in the header and footer for normal HTML requests but not for AJAX requests:

        <?php if($_SERVER['HTTP_X_REQUESTED_WITH']==''){ include('includes/header.php'); }?>
        #content here
        <?php if($_SERVER['HTTP_X_REQUESTED_WITH']==''){ include('includes/footer.php'); }?>

    Then I use jQuery to intercept page requests and use AJAX to fill in, for example, a #copy div with the new content. This avoids unnecessarily reloading all the header and footer info every time, but still lets users without JavaScript access pages without any problems. (Also worth thinking about: depending on the size of the content, do the extra HTTP requests added by this method put more strain on the server than serving a single, larger file?) I don't have a really solid understanding of meta keywords and their SEO significance, but as I recall reading, the keywords, title, and description on a page should match up with the page's content - i.e. each page should have slightly different keywords and description while retaining some common ground. What I'm getting at here is trying to foster a discussion on whether my approach is flawed to begin with, whether there are things I can do (within reason) that keep the site structure simple but allow for better SEO practices, or whether my understanding of SEO is wrong. This isn't a question, per se, but hopefully a constructive discussion that more than just I can learn from. I appreciate any responses and hope to hear from you. Thanks!

    Read the article

  • SEO, ordering and duplication of content

    - by piquadrat
    I run a specialized news site and am trying to apply a little bit of SEO sauce to it. One of the most important things I hear is to avoid duplicate content. I've covered all the basics, but I'm stuck on the ordering of content. As an example, the site's archive can be ordered by date, views, or rating. Since we don't have that many news items, an archive page for a particular day usually has only a couple of items, so the following URLs all have the same content, albeit in a different order:

        /news/archive/2010/05/16/
        /news/archive/2010/05/16/?o=views
        /news/archive/2010/05/16/?o=rating

    Do search engines penalize this particular kind of duplicate content? And if so, what's the best way to avoid the penalty? <link rel="canonical" />? Telling Google & Co. to ignore the o parameter? Marking the ordering links with nofollow? Only allowing the date-ordered archive pages to be indexed via robots.txt (not sure that's even possible)?
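    If the rel="canonical" route is taken, one way to emit it is to strip the ordering parameter server-side so every ordering variant points at the plain date-ordered URL. This is only a sketch, and the scheme/host handling is deliberately simplistic:

        <?php
        // Sketch: emit a single canonical URL for the archive page, ignoring the
        // ?o= ordering parameter, so the ?o=views and ?o=rating variants all
        // point at the date-ordered page.
        $path      = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
        $canonical = 'http://' . $_SERVER['HTTP_HOST'] . $path;
        ?>
        <link rel="canonical" href="<?php echo htmlspecialchars($canonical); ?>" />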

    Read the article

  • Search Engine Optimization For Beginners - How to Write Search Engine Friendly Articles

    If you're planning to implement search engine optimization as an Internet marketing strategy to boost your site's online coverage, then you need to focus on one of the most important steps for producing quality results: writing content. There is more to writing articles or web content than stuffing it full of keywords just to make it easy for search engines to find your page and put you on top. There are certain rules to follow in order for this to be an effective part of your SEO strategy.

    Read the article

  • Tips For SEO Friendly Press Releases

    With the increasingly commercial nature of the web, it is becoming harder to get quality one-way inbound links. One of the best ways of attracting incoming links is to submit free or cheap press releases, which help you build link juice without spending lots of money. An SEO press release is a primary way to deliver news of events taking place within your company.

    Read the article

  • Using Keywords to Create SEO Friendly Content

    So you have your site up and running, and now you are about to load it with content. You figure it's time to get writing, but before you do, you should know that not all articles are created equal! If you want to maximize your chances of ranking well in search engines, the first step in creating SEO friendly content is understanding how to use keywords.

    Read the article
