Search Results

Search found 824 results on 33 pages for 'ranking stackingblocks'.

Page 28/33 | < Previous Page | 24 25 26 27 28 29 30 31 32 33  | Next Page >

  • SEO: Is promoting your backlinks a good strategy for improving search results for my site's name?

    - by user4394
    I run a website in the sports space that's been around for about three years. I rank well for my targeted keywords, but searching for the name of the site itself returns very poor results - it shows my site, its FB/Twitter, and then 15 pages of unrelated spam that happen to contain the two words that, when combined, form my website's name. After that, my backlinks begin to show up sporadically. As far as I can tell, I simply don't have enough backlinks, and the backlinks I do have rank worse than the spam. (Site Explorer lists 200 external links to any page on our domain and 20 external links directly to the front page.) To counter this, my strategy is to promote my backlinks so they get a better page rank than the spam. Does that make sense? Am I going in the right direction, or should I just focus on getting more backlinks pointing directly to my site? Thanks in advance, and I'd be happy to answer any questions I can (without giving away my site, of course).

    Read the article

  • Considerations for changing URL path

    - by Mandar
    I have an e-commerce site with the following URL structure:

      www.example.com/category1 [Category landing page]
      www.example.com/category1/sub-category [sub-category listing page]
      www.example.com/category1/sub-category/product-name [Product Details page]

    I find it difficult to tell from the URLs whether a URL is a category landing page, a listing page, or a product details page (primarily in Google Analytics). To solve this problem, I am thinking of adding qualifiers to the URLs as follows:

      www.example.com/category1/cat-land [Category landing page]
      www.example.com/category1/sub-category/cat-list [sub-category listing page]
      www.example.com/category1/sub-category/product-name/prod-details [Product Details page]

    The original URLs would be redirected to the new URLs using 301 permanent redirects. Would this have any negative effect on existing SEO and Google ranking?
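
    One way to implement the 301 mapping, sketched as an .htaccess fragment and assuming an Apache host with mod_rewrite (the qualifier names cat-land, cat-list and prod-details are taken from the question; adapt the patterns per category). The RewriteCond lines keep URLs that already carry a qualifier from being redirected a second time:

      RewriteEngine On

      # Product details: /category1/sub-category/product-name -> .../prod-details
      RewriteCond %{REQUEST_URI} !/(cat-land|cat-list|prod-details)$
      RewriteRule ^category1/([^/]+)/([^/]+)$ /category1/$1/$2/prod-details [R=301,L]

      # Sub-category listing: /category1/sub-category -> .../cat-list
      RewriteCond %{REQUEST_URI} !/(cat-land|cat-list|prod-details)$
      RewriteRule ^category1/([^/]+)$ /category1/$1/cat-list [R=301,L]

      # Category landing page: /category1 -> /category1/cat-land
      RewriteRule ^category1/?$ /category1/cat-land [R=301,L]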

    Read the article

  • Should I add rel nofollow to internal links which already have meta noindex?

    - by CamSpy
    Let's say I have a products page listing products, and the page has pagination. I would like the 1st page to have all the SE ranking weight, so I decided to put meta noindex on the rest of the paginated pages (from page 2 to N). My common sense says that if I don't want those pages to get indexed, I also shouldn't pass link/PR juice to them. (Is that smart?) What happens if I set rel="nofollow" for all pagination links from page 2 to page N?
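
    For reference, a minimal markup sketch of the two mechanisms being discussed (the URLs are placeholders). The meta tag keeps pages 2..N out of the index while still letting crawlers follow their links; adding rel="nofollow" to the pagination links is what the question proposes, and it also stops link equity from flowing through those links:

      <!-- in the <head> of pages 2..N -->
      <meta name="robots" content="noindex, follow">

      <!-- optional pagination hints -->
      <link rel="prev" href="http://www.example.com/products?page=1">
      <link rel="next" href="http://www.example.com/products?page=3">

      <!-- a pagination link with the proposed rel="nofollow" -->
      <a href="/products?page=2" rel="nofollow">2</a>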

    Read the article

  • Length of Page Title, URL, Meta Description and total number of links on a page

    - by MJWadmin
    We've been examining a number of different SEO tools recently. Several of these tell us that some of our page titles, URLs and meta descriptions are too long. We've also been told that some of our pages have too many links on them. I guess our first question is - is any of that feedback true? Can URLs etc. actually be too long, and if so, how much does this affect ranking? Secondly, can you have too many links on a page, and if so, how many is too many? Thanks in advance...

    Read the article

  • How an offline main domain can influence traffic on an active sub domain

    - by danie7L T
    The website is being designed for a company active in 3 different areas. As an example, let's use the following structure:

      www.example.com
      [sub1.example.com]
      [sub2.example.com]
      [sub3.example.com]

    sub2.example.com and sub3.example.com are ready to go live, but www.example.com really isn't and sends a 503 HTTP error code. I would like to know if this situation will affect the traffic and ranking of the subdomains that are ready to go live. Is it preferable to wait and go live with the main domain? Or is there nothing to "fear" and one doesn't affect the other? Thank you

    Read the article

  • Adding tags for SEO in a clothing website [duplicate]

    - by samyb8
    This question already has an answer here: What are the best ways to increase a site's position in Google? (18 answers)

    I am building a site for a women's accessories brand. The site has a homepage, a store page (where all accessories are displayed), a page for each accessory's description, an about page and a contact page. There is also a whole setup for the shopping cart and checkout (irrelevant to this question). My issue is the SEO. Where can I put the keywords? The home page has only the menu and some photos. The store page displays the items and their titles. Then each specific item's page has a description of the item (pulled from the database), its category and price. However, I feel like this is not enough SEO for Google ranking. Where could I add tags on this type of site?
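
    A minimal sketch of the usual on-page places for keywords on a product page of this kind - the title tag, meta description, heading and image alt text (the product name and copy are invented placeholders; the meta keywords tag is left out because Google ignores it):

      <head>
        <title>Red Leather Clutch - Women's Accessories | ExampleBrand</title>
        <meta name="description"
              content="Handmade red leather clutch with a gold clasp. Free shipping on all ExampleBrand accessories.">
      </head>
      <body>
        <h1>Red Leather Clutch</h1>
        <img src="/images/red-leather-clutch.jpg" alt="Red leather clutch with gold clasp">
        <p>Product description pulled from the database goes here...</p>
      </body>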

    Read the article

  • Website custom tracking

    - by Francisco Goldenstein
    I'm using ASP.NET MVC 4 and I want to track my site's incoming traffic to know things like: 1) How many users who clicked a Google AdWords advertisement bought one particular product? 2) A ranking of buyers grouped by AdWords advertisement. I could add a parameter to the URL like mysite.com?source=advertisement1234, but I want to avoid this practice to keep cleaner URLs and for SEO purposes. Url.Referrer is not going to give me that information either; it's just going to say that the referrer is Google. Any suggestions? Thanks in advance!
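
    One common approach, sketched client-side under the assumption that AdWords auto-tagging is enabled (Google then appends its own gclid parameter to the landing URL, so no custom source parameter is needed): capture the click id into a first-party cookie and store it with the order at checkout, so purchases can later be grouped by ad click. The cookie name and lifetime here are arbitrary choices.

      // runs on every landing page; no server-side URL changes needed
      (function () {
          var match = /[?&]gclid=([^&#]+)/.exec(window.location.search);
          if (match) {
              // keep the click id for 90 days so the checkout code can read it
              // and save it alongside the order
              document.cookie = "gclid=" + encodeURIComponent(match[1]) +
                                "; path=/; max-age=" + 60 * 60 * 24 * 90;
          }
      })();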

    Read the article

  • Are generic keywords in url bad for SEO? [closed]

    - by user1661479
    Possible Duplicate: Squeezing all the SEO out of a URL as possible

    I need help with URL structure. Let's say I'm a manufacturer of Wire EDM machines. Is it bad for me to put the keywords wire-edm in my URL to help raise SEO ranking? For example:

      mywebsite.com/wire-edm/machine/model-xxxx
      mywebsite.com/wire-edm/customer-service
      mywebsite.com/wire-edm/contact

    Or should I leave it as the following, because the gains are fairly insignificant and it doesn't help users understand my site structure:

      mywebsite.com/machine/model-xxxx
      mywebsite.com/customer-service
      mywebsite.com/contact

    I'd like to hear what everyone's thoughts are on this, and please provide some sources for which method is better.

    Read the article

  • Why are new pages not being indexed and old pages stay in the index?

    - by ZakGottlieb
    I currently have a site that was recently restructured, causing much of its content to be reposted under new URLs. To avoid duplicates, all of the existing pages were added to the robots.txt file. That said, it has now been over a week - I know Google has recrawled the site - and when I search for term X, it is still the old page that is ranking, with the new one nowhere to be seen. I'm assuming it's a cached version, but why are so many of the old pages still appearing in the index? Furthermore, all "tags" pages (it's a Q&A site, like this one) were also added to robots.txt a few months ago, yet I think they are all still appearing in the index. Anyone got any ideas about why this is happening, and how I can get my new pages indexed?

    Read the article

  • A drop in SERP after following webmaster guidelines [on hold]

    - by digiwig
    So here's a puzzle for all you SEO gurus out there. I recently launched my own site. I had target keywords which were ranking very well for about 1 month, appearing in the top five and even in first place. In an attempt to maintain good positioning, I followed the guidelines by adding robots.txt and an XML sitemap, redirecting non-www to www, redirecting index.php to the root domain, and adding .htaccess 301 redirects for old pages. I added rich snippets, created a Google+ account and verified my picture to appear, went through each of the Webmaster Tools issues with duplicate titles and meta descriptions, and improved the header-tag document outlines. I even created a few more blog posts to keep the content fresh and moving. So now my website appears on page 2 for my target keywords - and all because I followed the guidelines. What is happening? I see competitors with stagnant content superglued to position 1.

    Read the article

  • Alt text vs CSS sprites (SEO vs speed)

    - by leeoniya
    I'm reworking our site to reduce HTTP requests and blocking requests by concatenating JS and CSS, gzipping, loading all JS via LABjs, and using CSS sprites for images that were previously loaded individually via <img> tags. Progress has been great so far - a 5x page load performance improvement. However, we're in the top 5 organic search rankings in Google for many targeted keywords and phrases. I'm afraid eliminating so many img tags with alt attributes could hurt our SEO. Does anyone have any experience with alt tag manipulation/removal and its effect on SEO positions? Is the previous rank "sticky"?
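
    A minimal sketch of one way to keep crawlable, accessible text in the markup while the visual comes from the sprite, so the information that used to live in alt attributes isn't lost (class names, offsets and the link text are placeholders):

      <!-- the text stays in the document for crawlers and screen readers -->
      <a href="/widgets" class="sprite sprite-widgets">Industrial widgets</a>

      /* classic image-replacement CSS */
      .sprite {
          display: inline-block;
          background: url(/img/sprite.png) no-repeat;
          overflow: hidden;
          text-indent: -9999px;
          white-space: nowrap;
      }
      .sprite-widgets {
          width: 140px;
          height: 40px;
          background-position: 0 -80px;
      }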

    Read the article

  • A relatively new blog seems to be getting very poor Google indexing

    - by Genadinik
    I have a new blog that is 2 months old. In the first few weeks it was getting indexed nicely, and my Google Webmaster Tools reports showed that it was being crawled and beginning to rank for some terms. Then, as I kept writing, the Webmaster Tools report thinned out and showed fewer and fewer terms that this blog ranks for. Now there are only 4 terms, one of them being my name. Is there something I need to do to keep the old posts indexed and crawled? Thanks, Alex

    Read the article

  • I've got two technical degrees but little in the way of experience. How do I get into programming? [closed]

    - by Neonfirelights
    I'm looking for a job; I want to break into programming. I'm looking for the right sort of role and the right place to find it, and I would really appreciate input from someone with industry experience. I've got an excellent academic record: a BSc in Physics (2:1) and an MSc in Computer Graphics, Vision and Imaging (expecting a Merit) from two world-ranked universities. I have advanced technical knowledge of C/C++ and MATLAB and experience working with C# and VB.NET. Unfortunately I don't have much in the way of commercial experience; unlike a lot of people I know, my undergraduate degree didn't come with a sandwich placement. Where can I go to break into the software industry?

    Read the article

  • Aug 3rd 2014 HUGE drop in visits [duplicate]

    - by stephanie
    This question already has an answer here: How to diagnose a search engine ranking drop? (5 answers)

    I am not a pro at this, but I have built my website using Freeway Pro, and for the past year I have managed to get between 100-130 visits a day to my website. On Aug 3 it dropped to 40… and now it's 15 a day… I do not know what happened and am clearly freaking out about the situation, as this is my shop window. There are no warnings or errors in Google Webmaster Tools. Please help me understand this issue. Has anything been happening with Google around this time?

    Read the article

  • Should I be concerned about hAtom tags on my blog?

    - by Sid
    I am using a theme that automatically adds hatom-entry and hatom-feed classes to my WordPress blog. I read that such tags/classes should be used for syndicated content. Anyway, I then ran the Rich Snippet Tool, which threw an "HAtomfeed" error, so I removed an "hfeed" div tag. Now, should I be concerned? Can this cause any problems? I still have a couple of these classes (listed below), and I just hope they do not affect my site's ranking. For now, these are the tags the Rich Snippet Tool detects: hatom-feed, hatom-entry (entry-title, entry-content, published, author, fn, person-name, url). Appreciate your help! Edit: All the content on this weblog is unique and written by me and others. Thought I'd share that.

    Read the article

  • Top 5 Places to Get Good Quality Links & Boost Your Search Rank

    If you're looking to get a higher ranking for your website, the bottom line is that you need to get good quality links. Gone are the days when you could just rely on keywords on your site to get you to the top... or even get thousands of untargeted links to blast your way to the #1 spot on Google. Now, it's all about getting high quality links that will make Google think your site is "worthy" enough to put at the top of the results... and here's where to get those links.

    Read the article

  • Create a new subdomain or buy a new domain? SEO costs [closed]

    - by greek_no_money
    If I am targeting the same audience, but the new sub-site has a different concept from the existing site, should I create a new subdomain or a new domain? What are the SEO advantages and disadvantages of creating a new subdomain? For example, Stack Exchange, with an Alexa rank of 1,474 right now, has subdomains such as programmers.stackexchange.com and other domains such as serverfault.com, with rank 3,515, and stackoverflow.com, with rank 109. So why didn't Stack Exchange put, for example, Stack Overflow on a subdomain to create a better ranking?

    Read the article

  • Split HTML text in an SEO-friendly manner

    - by al nik
    I have some HTML text like <h1>GreenWhiteRed</h1>. Is it SEO friendly to split this text into something like <h1><span class="green">Green</span><span class="white">White</span><span class="red">Red</span></h1>? Does the text still rank well, and is it still interpreted as the single word 'GreenWhiteRed'?

    Read the article

  • From an SEO point of view, is it better to use Domain-Dash.com or Domainwithoutdash.com?

    - by Msc. Adrian Lopez
    I have been reading forums and such, but have found no clear or conclusive answer about the strategic decision of using domain-with-dash.com or notusingdashes.com. Is there a problem or disadvantage in ranking for those keywords? Is it better to have the-domain-with-dash.com than shortdomain.net? In many cases you don't have the .com available for that specific keyword. What are your opinions? Please provide facts or add links to sources. What does Google have to say?

    Read the article

  • Using UNION in a CONSTRUCT SPARQL query

    - by simon
    Hello, I have the following SPARQL query:

      select ?s ?p ?o
      from <http://localhost:8890/DAV/ranking>
      where {
        { <http://seekda.com/providers/cdyne.com/PhoneNotify> so:hasEndpoint ?s .
          ?s ?p ?o }
        union
        { <http://seekda.com/providers/cdyne.com/PhoneNotify> ?p ?o }
      }

    But I need a graph query (construct or describe). Unfortunately I have no clue about how to use unions in construct or describe queries. Please help me. Best regards, Simon
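
    A minimal sketch of the construct form, assuming the so: prefix is declared the same way as for the select query. Variables are renamed per union branch because template triples containing an unbound variable are simply skipped for that solution, so each branch contributes only its own triples:

      construct {
        <http://seekda.com/providers/cdyne.com/PhoneNotify> so:hasEndpoint ?s .
        ?s ?p1 ?o1 .
        <http://seekda.com/providers/cdyne.com/PhoneNotify> ?p2 ?o2 .
      }
      from <http://localhost:8890/DAV/ranking>
      where {
        { <http://seekda.com/providers/cdyne.com/PhoneNotify> so:hasEndpoint ?s .
          ?s ?p1 ?o1 }
        union
        { <http://seekda.com/providers/cdyne.com/PhoneNotify> ?p2 ?o2 }
      }

    A describe query can be shorter (describe <http://seekda.com/providers/cdyne.com/PhoneNotify> ?s with the same where clause), but exactly which triples describe returns is left to the endpoint implementation.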

    Read the article

  • Bad_alloc exception when using new for a struct in C++

    - by bsg
    Hi, I am writing a query processor which allocates large amounts of memory and tries to find matching documents. Whenever I find a match, I create a structure to hold two variables describing the document and add it to a priority queue. Since there is no way of knowing how many times I will do this, I tried creating my structs dynamically using new. When I pop a struct off the priority queue, the queue (STL priority queue implementation) is supposed to call the object's destructor. My struct code has no destructor, so I assume a default destructor is called in that case. However, the very first time that I try to create a DOC struct, I get the following error:

      Unhandled exception at 0x7c812afb in QueryProcessor.exe: Microsoft C++ exception: std::bad_alloc at memory location 0x0012f5dc.

    I don't understand what's happening - have I used up so much memory that the heap is full? It doesn't seem likely. And it's not as if I've even used that pointer before. So: first of all, what am I doing that's causing the error, and secondly, will the following code work more than once? Do I need to have a separate pointer for each struct created, or can I re-use the same temporary pointer and assume that the queue will keep a pointer to each struct? Here is my code:

      struct DOC {
          int docid;
          double rank;
      public:
          DOC() { docid = 0; rank = 0.0; }
          DOC(int num, double ranking) { docid = num; rank = ranking; }
          bool operator>( const DOC & d ) const { return rank > d.rank; }
          bool operator<( const DOC & d ) const { return rank < d.rank; }
      };

      // a lot of processing goes on here; when a matching document is found, I do this:
      rank = calculateRanking(table, num);

      // if the heap is not full, create a DOC struct with the docid and rank and add it to the heap
      if (q.size() < 20) {
          doc = new DOC(num, rank);
          q.push(*doc);
          doc = NULL;
      }
      // if the heap is full, but the new rank is greater than the
      // smallest element in the min heap, remove the current smallest element
      // and add the new one to the heap
      else if (rank > q.top().rank) {
          q.pop();
          cout << "pushing doc on to queue" << endl;
          doc = new DOC(num, rank);
          q.push(*doc);
      }

    Thank you very much, bsg.
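
    For comparison, a minimal sketch of the by-value alternative: std::priority_queue stores copies, so there is no need for new at all and nothing is leaked (the version above never deletes the objects it allocates). The struct, the 20-element limit and the rank value are taken from the question; addCandidate is just a made-up wrapper name. Note that this removes the leak but does not by itself explain a bad_alloc on the very first allocation, which usually points at earlier memory exhaustion or heap corruption:

      #include <queue>
      #include <vector>
      #include <functional>

      struct DOC {
          int docid;
          double rank;
          DOC() : docid(0), rank(0.0) {}
          DOC(int num, double ranking) : docid(num), rank(ranking) {}
          bool operator>(const DOC& d) const { return rank > d.rank; }
          bool operator<(const DOC& d) const { return rank < d.rank; }
      };

      // min-heap on rank, so q.top() is the lowest-ranked document kept so far
      std::priority_queue<DOC, std::vector<DOC>, std::greater<DOC> > q;

      // called with rank = calculateRanking(table, num), as in the question
      void addCandidate(int num, double rank)
      {
          if (q.size() < 20) {
              q.push(DOC(num, rank));      // push by value: the queue keeps its own copy
          } else if (rank > q.top().rank) {
              q.pop();                     // drop the current lowest-ranked document
              q.push(DOC(num, rank));
          }
      }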

    Read the article

  • Swapping two items in an observableArray - Knockout

    - by Yojik
    I'm trying to swap two items in an observableArray with Knockout, but something is wrong. After the swap, I will change and send the displayOrder property (of both items) to the server (or should I take another approach for this?).

      rankDownMessage: function () {
          console.log("ranking down msg");
          var currentItemindex = viewModel.messages.indexOf(this);
          var nextItemIndex = currentItemindex + 1;
          viewModel.messages.replace(
              viewModel.messages()[nextItemIndex],
              viewModel.messages()[currentItemindex]
          );
      }

    Only the first item is changed to the second item, but the second item doesn't become the first one.
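
    A minimal sketch of one way to do the swap, assuming viewModel.messages is a ko.observableArray and displayOrder is a plain (non-observable) property on each message. observableArray.replace(oldItem, newItem) only substitutes newItem for the first occurrence of oldItem, so a single call can never exchange two items; swapping on the underlying array and then calling valueHasMutated() notifies the UI once:

      rankDownMessage: function () {
          var messages = viewModel.messages;           // the observableArray itself
          var i = messages.indexOf(this);
          if (i < 0 || i >= messages().length - 1) {
              return;                                  // already the last item, nothing to swap with
          }

          var current = messages()[i];
          var next = messages()[i + 1];

          // swap on the underlying array, then tell Knockout it changed
          messages()[i] = next;
          messages()[i + 1] = current;
          messages.valueHasMutated();

          // keep displayOrder in sync before sending both items to the server
          // (if displayOrder is an observable, use current.displayOrder(i + 1) instead)
          current.displayOrder = i + 1;
          next.displayOrder = i;
      }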

    Read the article

  • Best tools for SEO

    - by user261002
    I am trying to do some SEO for a plumbing website, but the more I search on Google, YouTube and various websites, the more confused I get, as there are thousands of different tools out there. What is the best tool, and the best way, to get the best ranking from Google?

    Read the article
