Search Results

Search found 9728 results on 390 pages for 'meysam pro'.


  • 403 error on index file

    - by John L.
    When I try to access index.py in my server root through http://domain/, I get a 403 Forbidden error, but I can access it fine through http://domain/index.py. My server logs say "Options ExecCGI is off in this directory: /var/www/index.py". However, my httpd.conf entry for that directory is the same as the ones for other directories, and accessing index.py directly works fine. Permissions on index.py are set to 755. I also tried making a PHP file named index.php, and it works from both domain/ and domain/index.php. Here is my httpd.conf entry:

        <Directory /var/www>
            Options Indexes Includes FollowSymLinks MultiViews
            AllowOverride All
            Order allow,deny
            Allow from all
            AddHandler cgi-script .cgi
            AddHandler cgi-script .pl
            AddHandler cgi-script .py
            Options +ExecCGI
            DirectoryIndex index.html index.php index.py
        </Directory>

    Thanks
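
    One thing worth checking here, offered purely as a hedged guess rather than a confirmed fix: the block above sets Options twice, once plainly and once with a relative +ExecCGI, and another more specific <Directory> or vhost block elsewhere in the config could be overriding either one. A consolidated sketch using the question's own paths and standard Apache 2.x directives, with ExecCGI stated unambiguously and index.py promoted to the preferred index:

        <Directory /var/www>
            # Single Options line, all relative, so nothing later can half-override it.
            Options +Indexes +Includes +FollowSymLinks +MultiViews +ExecCGI
            AllowOverride All
            Order allow,deny
            Allow from all
            # AddHandler accepts several extensions on one line.
            AddHandler cgi-script .cgi .pl .py
            # index.py first makes it the index served at http://domain/.
            DirectoryIndex index.py index.html index.php
        </Directory>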

  • Trade off: Lower the number of URLs in the sitemap from 43k to 23k, or update the sitemap.xml only on a weekly basis

    - by Tobias
    We rewrote the sitemap creation process. Now the sitemap contains 43,000 URLs, 20k more than before, and our URLs change daily. The script that creates the complete sitemap takes more than 30 hours, so we cannot build it every day. Let's say that increasing the speed of the script is not possible. What should I do?

    A: Stay with the 23k URLs and update the sitemap daily
    B: Increase the number of URLs to 43k and update it weekly
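
    A third option worth weighing: split the sitemap into chunked sub-sitemaps behind a sitemap index file, so the daily job only rewrites the chunks whose URLs actually changed. A rough Python sketch of the mechanics; the URL source and file names are placeholders, not from the question:

        # Sketch: split the URL set into sub-sitemaps referenced by one index file.
        # The sitemaps.org protocol allows up to 50,000 URLs per file.
        import math
        from datetime import date

        CHUNK = 5000  # URLs per sub-sitemap

        def write_sitemap(path, urls):
            with open(path, "w") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
                for u in urls:
                    f.write(f"  <url><loc>{u}</loc></url>\n")
                f.write("</urlset>\n")

        def write_index(path, count):
            with open(path, "w") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
                for i in range(count):
                    f.write(f"  <sitemap><loc>https://example.com/sitemap-{i}.xml</loc>"
                            f"<lastmod>{date.today().isoformat()}</lastmod></sitemap>\n")
                f.write("</sitemapindex>\n")

        # Stand-in for the real URL source (a database query in practice).
        urls = [f"https://example.com/item/{n}" for n in range(43000)]
        chunks = math.ceil(len(urls) / CHUNK)
        for i in range(chunks):
            write_sitemap(f"sitemap-{i}.xml", urls[i * CHUNK:(i + 1) * CHUNK])
        write_index("sitemap.xml", chunks)

    In a real incremental version, only chunks containing changed URLs would be regenerated each day, keeping both the 43k coverage and the daily freshness.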

  • Can inbound links through template-based layouts result in penalties?

    - by Liam Sorsby
    So obviously link building is encouraged as long as it is natural, organic, and involves meaningful links with content relevant to your site. With the constant release of new algorithm updates, Google is flagging sites for unnatural links. My question is: can this be caused by templating systems? With WordPress, for example, you can add a link in the footer and it is repeated throughout the entire website, generating thousands of links. Even if we don't add any links ourselves, good content will be re-posted and linked to; if your content is constantly linked to, surely this could flag your site for "unnatural" links, since it's difficult to tell whether someone has been paid to write an article about your content. Or does Google simply want us to audit some of the links to show we are making an effort? As you can tell, we have had a manual action for "Unnatural links to your site - impacts links", and it seems to be affecting our website as well.

    Edit, to clarify the question: can you get penalised for paying for advertising on a site that uses a templated sidebar? When they create a new blog/page etc., your link is also added onto the page, resulting in thousands of links to one page on our site. I know that one effect may be that a 0-PageRank web page linking to our page dilutes its PR. However, the links are only inbound, not reciprocal.

  • Two <select> elements always next to each other inside a <td>? [closed]

    - by Radek
    I have two selects inside a td, and I want to make sure they stay next to each other at all times, with the td's width being exactly the width of the two selects, no more. The values displayed in the selects change based on data.

        <td>
            <select name="db2.rfthdd">
                <option value="WEI">WEI</option>
                <option value="SCOTSdatabase">SCOTSdatabase</option>
            </select>
            <select id="db2.rfttimestamp">
                <option value="20110302122831">2011-03-02-122831</option>
                <option value="20110302122442">2011-03-02-122442</option>
            </select>
        </td>
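
    A hedged sketch of one way to get that behavior, untested against the real page: let the cell shrink-wrap its content and forbid wrapping, so the td is never wider than the two selects. The "selects" class name is invented for illustration:

        <style>
          /* width:1% plus white-space:nowrap makes a table cell shrink to its content. */
          td.selects { width: 1%; white-space: nowrap; }
          td.selects select { display: inline-block; vertical-align: middle; }
        </style>
        <td class="selects">
          <select name="db2.rfthdd"><option value="WEI">WEI</option></select>
          <select id="db2.rfttimestamp"><option value="20110302122831">2011-03-02-122831</option></select>
        </td>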

  • What's better for SEO for many international markets?

    - by Roy Rico
    Right now, we're working to migrate our company sites for international markets to this scheme:

        www.company.com/[two-letter country code]
        www.company.com/uk   # for the United Kingdom
        www.company.com/au   # for Australia
        www.company.com/jp   # for Japan
        www.company.com/     # for the United States, and for visitors we can't identify

    However, in Google Webmaster Tools we can geo-target each directory, but not the root. If we geo-target the root with the US, all the other markets will inherit it. Is it better to move the US market to /us/ or leave it where it is?
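
    Whichever way the root question is settled, rel="alternate" hreflang annotations are a standard way to tell Google which directory targets which market, including an x-default for unidentifiable visitors. A sketch using the scheme above; the exact language-region codes are assumptions:

        <link rel="alternate" hreflang="en-gb" href="http://www.company.com/uk/" />
        <link rel="alternate" hreflang="en-au" href="http://www.company.com/au/" />
        <link rel="alternate" hreflang="ja-jp" href="http://www.company.com/jp/" />
        <link rel="alternate" hreflang="en-us" href="http://www.company.com/us/" />
        <link rel="alternate" hreflang="x-default" href="http://www.company.com/" />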

  • Google Authorship issues

    - by user29107
    I am facing the same issue over and over: I have tried many times and followed the whole process, but I still get the same result:

        Email verification has not established authorship for this webpage.
        Email address on the sanjeebpanda.com domain has been verified on this profile: Yes
        Public contributor-to link from Google+ profile to sanjeebpanda.com: Yes
        Automatically detected author name on webpage: Not Found

    What should I do? Please help.

  • Mysterious subdomains to my site indexed by Google

    - by shouren
    Stackers, we have an issue with strange subdomains pointing to (pages on) our site, such as:

        www2.example.com
        2.example.com
        anothersite.com.example.com

    A few things are perplexing: Who created them? Why would they do that? Why does Google index them and show them in the search results, when clicking them gets a 5xx error? And how can we get rid of them? It seems like some type of scam that hurts our site's organic search and user experience. Has anyone had a similar experience and knows the answers? Really appreciate it!
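
    If those hostnames resolve to your own server (wildcard DNS someone pointed at your IP, or an over-broad catch-all vhost), one hedged mitigation is to make the default vhost redirect any unknown Host header to the canonical domain. A sketch for Apache, where the first-listed vhost acts as the default; example.com stands in for the real domain:

        # The first vhost catches any Host header no other vhost matches
        # and bounces it to the canonical hostname.
        <VirtualHost *:80>
            ServerName catchall.invalid
            Redirect 301 / http://www.example.com/
        </VirtualHost>

        <VirtualHost *:80>
            ServerName www.example.com
            DocumentRoot /var/www/example
        </VirtualHost>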

  • WordPress subcategory navigation with permalinks

    - by Towhid
    I use pretty permalinks on my WP website, but navigation within sub-subcategories is not possible. For example, these URLs are fine:

        http://technopolis.ir/category/articles/security-articles/
        http://technopolis.ir/category/articles/security-articles/page/2/

    but this sub-subcategory generates a 404 on its second page:

        http://technopolis.ir/category/articles/security-articles/backtrack/ [first page is fine]
        http://technopolis.ir/category/articles/security-articles/backtrack/page/2/ [404 error]
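
    Not a confirmed diagnosis, but stale rewrite rules are a common cause of pagination 404s after a permalink or category change. Re-saving Settings > Permalinks rebuilds them; the programmatic equivalent is WordPress's flush_rewrite_rules(), which is expensive and should only run on a one-off event such as activation. A sketch, with a made-up callback name:

        <?php
        // One-off rewrite-rule flush; never call flush_rewrite_rules() on every request.
        function my_flush_rules() {        // hypothetical callback name
            flush_rewrite_rules();
        }
        add_action( 'after_switch_theme', 'my_flush_rules' );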

  • WordPress theme for a user-generated-content website

    - by iamjonesy
    I'm looking for a WordPress theme that I can work from. I'm basically creating a website like the following two: http://www.damnyouautocorrect.com/ and http://icanhas.cheezburger.com/ - both are WordPress-based websites, which I guess use custom-made themes. I'm looking for a theme that will let users enter content without being logged in. The post type just needs a title, a description, and the name of the author. The homepage will show one post with a "Next" button; clicking it will load the next post. I'd also like to add up/down voting. I'm just asking first before I start hacking away at a theme.

  • Multiple sites redirected to one main site

    - by mattgcon
    I have a client who insists on having multiple website domains all being redirected to one main website domain. It is getting out of hand, and his server has become convoluted and riddled with garbage because of it, not to mention confusing at times. Each of the domains he is setting up has no content; it simply redirects the user to the main website domain. Is this practice of having multiple domains pointing to one main website common? And does anyone know where I can get information to give to this client to let him know this is a bad practice, if indeed it is one?
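
    For what it's worth, domain consolidation itself is a common, legitimate practice for brand and typo domains; the mess usually comes from how it's implemented. A sketch of a tidy per-domain Apache vhost that 301s everything to the main site, with no content hosted on the extra domain at all (placeholder names):

        <VirtualHost *:80>
            ServerName extradomain.com
            ServerAlias www.extradomain.com
            # Permanent (301) redirect of every path to the main site.
            Redirect permanent / http://www.mainsite.com/
        </VirtualHost>

    One vhost like this per extra domain keeps the server clean, and the 301 passes whatever link equity the extra domains have to the main domain.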

  • Squeezing as much SEO out of a URL as possible

    - by John Isaacks
    I am working on an ecommerce site, and I told our SEO consultant that I plan to make the URL scheme /products/<id>/<name>. This is similar to Stack Overflow's URLs, which are /questions/<id>/<title>. He asked me if I could change the scheme to /p/<id>/<name> instead. I know why he wants this change: the word "products" isn't needed to find the correct product and offers no SEO value, so shortening it to just p would make the relevant keywords in <name> weigh more. His main priority is maximizing SEO, but the part I don't think he is considering is how this affects the semantics of the site. Having the word "products" looks like it has meaning and a reason for being there; just having a p looks chaotic and ugly to me. I also don't think it makes that much of a difference, does it? Stack Overflow doesn't use /q/<id>/<title> and they do just fine, though I realize there are many factors at play here, not just the URL. So I want some outside opinions on which is the better way, and why.

  • HTML5 CSS3 layout not working

    - by John.Weland
    I have been asked by a local MMA (Mixed Martial Arts) school to help them develop a website. For the life of me I CANNOT get the layout to work correctly. When I get one section set where it should be, another moves out of place! Here is a pic of the layout: here. The header should be a set height, as should the footer; the entire site at its widest point should be 1250px, with the header/content area/footer and the like being 1240px. The black in the picture is a scaling background that expands wider on higher-resolution screens. The full site should have a minimum height of 100% but scale vertically as the content in the target area deems necessary. My biggest issue currently is that my "sticky" footer doesn't stick once content has stretched the content target area vertically. The code is not pretty, but here it is:

    HTML5:

        <!doctype html>
        <html>
        <head>
          <link rel="stylesheet" href="menu.css" type="text/css" media="screen">
          <link rel="stylesheet" href="master.css" type="text/css" media="screen">
          <meta charset="utf-8">
          <title>Untitled Document</title>
        </head>
        <body bottommargin="0" leftmargin="0" rightmargin="0" topmargin="0">
          <div id="wrap" class="wrap">
            <div id="logo" class="logo"><img src="images/comalogo.png" width="100" height="150"></div>
            <div id="header" class="header">College of Martial Arts</div>
            <div id="nav" class="nav">
              <ul id="menu"><b>
                <li><a href="#">News</a></li>
                <li>·</li>
                <li><a href="#">About Us</a>
                  <ul>
                    <li><a href="#">The Instructors</a></li>
                    <li><a href="#">Our Arts</a></li>
                  </li>
                </ul>
                <li>·</li>
                <li><a href="#">Location</a></li>
                <li>·</li>
                <li><a href="#">Gallery</a></li>
                <li>·</li>
                <li><a href="#">MMA.tv</a></li>
                <li>·</li>
                <li><a href="#">Schedule</a></li>
                <li>·</li>
                <li><a href="#">Fight Gear</a></li></b>
            </div>
            <div id="social" class="social">
              <a href="http://www.facebook.com/pages/Canyon-Lake-College-of-Martial-Arts/189432551104674"><img src="images/soc/facebook.png"></a>
              <a href="https://twitter.com/#!/CanyonLakeMMA"><img src="images/soc/twitter.png"></a>
              <a href="https://plus.google.com/108252414577423199314/"><img src="images/soc/google+.png"></a>
              <a href="http://youtube.com/user/clmmatv"><img src="images/soc/youtube.png"></a></div>
            <div id="mid" class="mid">test <br>test <br>test <br>test <br>test <br>test <br>test <br>test <br>
              test <br>test <br>test <br>test <br>test <br>test <br>test <br>test <br>test <br>
              test <br>test <br>test <br>test <br>test <br>test <br>test <br>test <br></div>
            <div id="footer" class="footer">
              <div id="contact" style="left:0px;">tel: (830) 214-4591<br />
                e: [email protected]<br />
                add: 1273 FM 2673, Sattler, TX 78133<br />
              </div>
              <div id="affiliates" style="right:0px;">Hwa Rang World Tang soo Do</div>
              <div id="copyright">Copyright © College of Martial Arts</div>
            </div>
        </body>
        </html>

    CSS3 - Dropdown Menu (menu.css):

        @charset "utf-8";
        /* CSS Document */

        /* Main */
        #menu {
          width: 100%; margin: 0; padding: 10px 0 0 0; list-style: none;
          background: #444;
          background: -moz-linear-gradient(#000, #333);
          background: -webkit-gradient(linear, left bottom, left top, color-stop(0, #444), color-stop(1, #000));
          background: -webkit-linear-gradient(#000, #333);
          background: -o-linear-gradient(#000, #333);
          background: -ms-linear-gradient(#000, #333);
          background: linear-gradient(#000, #333);
          -moz-border-radius: 5px; border-radius: 5px;
          -moz-box-shadow: 0 2px 1px #9c9c9c; -webkit-box-shadow: 0 2px 1px #9c9c9c; box-shadow: 0 8px 8px #9c9c9c;
          /* outline:#000 solid thin; */
        }
        #menu li { left: 150px; float: left; padding: 0 0 10px 0; position: relative; color: #FC0; font-size: 15px; font-family: 'freshman' cursive; line-height: 15px; }
        #menu a { float: left; height: 15px; line-height: 15px; padding: 0 10px; color: #FC0; font-size: 15px; text-decoration: none; text-shadow: 1 1px 0 #000; text-align: center; }
        #menu li:hover > a { color: #fafafa; }
        *html #menu li a:hover /* IE6 */ { color: #fafafa; }
        #menu li:hover > ul { display: block; }

        /* Sub-menu */
        #menu ul {
          list-style: none; margin: 0; padding: 0; display: none;
          position: absolute; top: 25px; left: 0; z-index: 99999;
          background: #444;
          background: -moz-linear-gradient(#000, #333);
          background: -webkit-gradient(linear, left bottom, left top, color-stop(0, #111), color-stop(1, #444));
          background: -webkit-linear-gradient(#000, #333);
          background: -o-linear-gradient(#000, #333);
          background: -ms-linear-gradient(#000, #333);
          background: linear-gradient(#000, #333);
          -moz-border-radius: 5px; border-radius: 5px;
          /* outline:#000 solid thin; */
        }
        #menu ul li { left: 0; -moz-box-shadow: none; -webkit-box-shadow: none; box-shadow: none; }
        #menu ul a { padding: 10px; height: auto; line-height: 1; display: block; white-space: nowrap; float: none; text-transform: none; }
        *html #menu ul a /* IE6 */ { height: 10px; width: 200px; }
        *:first-child+html #menu ul a /* IE7 */ { height: 10px; width: 200px; }
        /*#menu ul a:hover { background: #000; background: -moz-linear-gradient(#000, #333); background: -webkit-gradient(linear, left top, left bottom, from(#04acec), to(#0186ba)); background: -webkit-linear-gradient(#000, #333); background: -o-linear-gradient(#000, #333); background: -ms-linear-gradient(#000, #333); background: linear-gradient(#000, #333); }*/

        /* Clear floated elements */
        #menu:after { visibility: hidden; display: block; font-size: 0; content: " "; clear: both; height: 0; }
        * html #menu { zoom: 1; } /* IE6 */
        *:first-child+html #menu { zoom: 1; } /* IE7 */

    CSS3 - Master Style Sheet (master.css):

        @charset "utf-8";
        /* CSS Document */
        a:link { color: #FC0; text-decoration: none; }    /* unvisited link */
        a:visited { color: #FC0; text-decoration: none; } /* visited link */
        a:hover { color: #FFF; text-decoration: none; }   /* mouse over link */
        a:active { color: #FC0; text-decoration: none; }  /* selected link */
        ul.a { list-style-type: none; }
        ul.b { list-style-type: inherit }
        html { }
        body { /*background-image:url(images/cagebg.jpg);*/ background-repeat: repeat; background-position: top; }
        div.wrap { margin: 0 auto; min-height: 100%; position: relative; width: 1250px; }
        div.logo { top: 25px; left: 20px; position: absolute; float: top; height: 150px; }
        /* Freshman FONT is on my computer; it needs to be uploaded to the webhost and rendered host-side like a webfont */
        div.header { background-color: #999; color: #FC0; margin-left: 5px; height: 80px; width: 1240px; line-height: 70px; font-family: 'freshman' cursive; font-size: 50px; text-shadow: 8px 8px #9c9c9c; text-outline: 1px 1px #000; text-align: center; background-color: #999; clear: both; }
        div.social { height: 50px; margin-left: 5px; width: 1240px; font-family: 'freshman' cursive; font-size: 50px; text-align: right; color: #000; background-color: #999; line-height: 30px; box-sizing: border-box; ms-box-sizing: border-box; webkit-box-sizing: border-box; moz-box-sizing: border-box; padding-right: 5px; }
        div.mid { position: absolute; min-height: 100%; margin-left: 5px; width: 1240px; font-family: 'freshman' cursive; font-size: 50px; text-align: center; color: #000; background-color: #999; }
        /* SIDE left and right should be 40px wide and a minimum height (100% of the area from nav to footer) to fill between the NAV and the footer, yet stretch as displayed content stretches the page longer (scrollable) */
        div #side.sright { top: 96px; right: 0; position: absolute; float: right; height: 100%; min-height: 100%; width: 40px; background-image: url(images/border.png); }
        /* Container should vary in height in accordance with the content displayed */
        div #content.container { }
        /* Footer should stick at the ABSOLUTE BOTTOM of the page */
        div #footer { font-family: 'freshman' cursive; position: fixed; bottom: 0; background-color: #000000; margin-left: 5px; width: 1240px; color: #FC0; clear: both; /* this clear property forces the .container to understand where the columns end and contain them */ }
        /* HTML5 support - sets new HTML5 tags to display:block so browsers know how to render the tags properly. */
        header, section, footer, aside, nav, article, figure { display: block; }

    Eventually, once the layout is correct, I have to use PHP to make calls for which data should be displayed from which database. If anyone can help me fix this layout and clean up the crap code, it'd be much appreciated. I've spent weeks trying to figure this out.
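
    On the sticky-footer symptom specifically, one likely culprit (a guess from the code above, not a verified diagnosis): position:fixed pins the footer to the viewport rather than to the end of the document. A minimal sketch of the classic min-height sticky-footer pattern, reusing the markup's own class names; the 60px footer height is illustrative:

        html, body { height: 100%; margin: 0; }
        .wrap {
          min-height: 100%;
          margin: 0 auto;
          width: 1250px;
          position: relative;
          padding-bottom: 60px;   /* reserve room so content never slides under the footer */
          box-sizing: border-box;
        }
        .footer {
          position: absolute;     /* absolute within .wrap, not fixed to the viewport */
          bottom: 0;
          width: 1240px;
          height: 60px;
        }

    With this, the footer sits at the bottom of the viewport on short pages and is pushed down by the content on long ones.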

  • Does Bing support anything like Google's First Click Free program?

    - by Dan Fabulich
    Google has a program for webmasters called First Click Free. To implement First Click Free, you need to allow all users who find a document on your site via Google search to see the full text of that document, even if they have not registered or subscribed to see that content. The user's first click to your content area is free. However, once that user clicks a link on the original page, you can require them to sign in or register to read further. The user must be able to see the full content of a multi-page article. You can allow this by displaying all content on a single page to both Googlebot and users. Alternatively, you can use cookies to make sure that a user can visit each page of a multi-page article before being asked for registration or payment. Does Bing support anything like this?

  • SSL and green address bar

    - by tinab
    I am new to SSL, so can someone explain why my address bar turns green on certain sites beginning with https://, while on others it doesn't even though I know the site has SSL? Maybe these two behaviors are not even related. If I go to GoDaddy and order a new domain, I notice their address bar is green the entire time I'm using the https:// protocol; but then I go to Victoria's Secret to place an order, and even though the URL says https://, the address bar doesn't turn green.

  • How can I monitor a website for malicious changes to its files

    - by rossmcm
    I had an occasion recently where our website was compromised: a link farm was added to a couple of the pages on one occasion, and on another a large and nasty .aspx file was put on the server. I won't mention the host's name (Hostway), but I was pretty annoyed that someone was able to do this. No, it wasn't a leaky password; around 10 sites hosted by HW with consecutive IP addresses got trashed. Anyway, what I need is a utility or service (preferably free) that takes a snapshot of my website's contents and then regularly monitors the files (size and datestamp) for unauthorized changes or additions, and alerts me. I've used web services that monitor one file for changes, but I'm looking for something a bit more aggressive.
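
    Ready-made options in this space are host-based integrity checkers such as Tripwire or AIDE, if shell access is available. As a rough illustration of the underlying idea, a minimal Python sketch that hashes every file under the web root and reports anything added, removed, or changed since the last run; the path and the alerting step are placeholders:

        import hashlib, json, os, sys

        SITE_ROOT = "/var/www/site"   # placeholder path to the web root
        BASELINE = "baseline.json"    # stored snapshot from the previous run

        def snapshot(root):
            """Map every file path under root to its SHA-256 digest."""
            state = {}
            for dirpath, _, files in os.walk(root):
                for name in files:
                    path = os.path.join(dirpath, name)
                    with open(path, "rb") as f:
                        state[path] = hashlib.sha256(f.read()).hexdigest()
            return state

        current = snapshot(SITE_ROOT)
        if os.path.exists(BASELINE):
            with open(BASELINE) as f:
                old = json.load(f)
            added   = sorted(set(current) - set(old))
            removed = sorted(set(old) - set(current))
            changed = sorted(p for p in current if p in old and current[p] != old[p])
            if added or removed or changed:
                # Placeholder alert: print the diff (swap in email/SMS in practice).
                print("ADDED:", added)
                print("REMOVED:", removed)
                print("CHANGED:", changed)
                sys.exit(1)
        with open(BASELINE, "w") as f:
            json.dump(current, f)

    Run from cron, this catches exactly the link-farm and dropped-file cases described above, since hashing detects content changes even when size and datestamp are forged.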

  • Why is Google not crawling my website?

    - by Aman Virk
    I run a design and development blog, http://www.thetutlage.com/. Over the last couple of days my search traffic has dropped from 70% of my total traffic to 10%. I am against black-hat SEO myself; all I do is write my own unique content almost every day. Last week my search traffic was really good, but now it is dropping like heck. I have checked my Webmaster Tools dashboard and there is no message there from Google. When I checked the server logs, I found that Google last crawled my website on 27 September 2012. I really have no idea what I am doing wrong. I follow all the Google guidelines like a bible. Please help me.

  • What to do with database in dev/production phases of a website?

    - by TheLQ
    For a while now I've been keeping a website I'm developing in the standard dev/production phases. It's been pretty simple: a Mercurial repo for dev, a repo for production. Do work in dev, get it approved, push to production. But now I'm trying to apply this process to a new website that has a database, and I'm struggling to figure out a development strategy. What I didn't mention above is that I do all my work in my own repo, push it to dev, then later push it to production, so it's three different servers. So how do I manage my database? The obvious solution of a mysqldump on every commit isn't going to happen, and a dump at the end of the day isn't all that helpful when you later want to undo one change that happened in the middle of the day. What is the best way to accomplish this?
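
    The usual answer here is versioned migrations rather than dumps: keep numbered SQL scripts in the same repo as the code, and have each server apply only the scripts it hasn't seen yet. A rough Python sketch of the idea; sqlite3 stands in for the real database driver, and the migrations/ file layout is an invention for illustration:

        # Apply migrations/001_xxx.sql, 002_xxx.sql, ... that this DB hasn't run yet.
        import glob, os, sqlite3

        db = sqlite3.connect("site.db")
        db.execute("CREATE TABLE IF NOT EXISTS schema_version (id INTEGER)")
        applied = db.execute("SELECT COALESCE(MAX(id), 0) FROM schema_version").fetchone()[0]

        for path in sorted(glob.glob("migrations/*.sql")):
            number = int(os.path.basename(path).split("_")[0])
            if number <= applied:
                continue  # this server already ran it
            db.executescript(open(path).read())
            db.execute("INSERT INTO schema_version VALUES (?)", (number,))
            db.commit()

    Because the scripts travel through Mercurial with the code, each push to dev or production carries exactly the schema changes that belong to it, and undoing one mid-day change means writing a reversing migration rather than restoring a whole dump.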

  • Temporary website redirect: 3xx or php/meta?

    - by Damien Pirsy
    Hi, I run a (small) news website which also has a forum in a subfolder of the root. I'm planning to give the site a facelift and a code restructuring, so I wanted to put a redirect on the home page that points directly to the forum's index (www.mysite.com -> www.mysite.com/forum) while I tinker with it. And that, given the little free time I have, will take no less than a couple of months. Being a news site, I'm pretty sure this would affect its overall ranking, but I need to do it, so: which is the best way to redirect? I pondered and read here and there about the different methods, but I couldn't figure out which is worst for SEO. Do I use a 302 redirect, or a "Location: newurl" header sent from PHP? Or do I just put a meta tag in the HTML page (or JavaScript; which is better)? Sorry, but I'm not really into these things; I may have said something silly, I know... Thanks
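
    For what it's worth, a server-side 302 is the status code designed for exactly this temporary-move case, and it keeps the logic out of the page entirely. A sketch for an Apache .htaccess using the standard mod_alias RedirectMatch directive, matching only the bare homepage so the rest of the site keeps working:

        # 302 = temporary redirect; only the homepage is sent to the forum index.
        RedirectMatch 302 ^/$ /forum/

    The PHP equivalent would be header('Location: /forum/', true, 302); called before any output. Meta-refresh and JavaScript redirects are generally the weakest choice, since they are slower for users and less clearly "temporary" to search engines.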

  • Why is Google Webmaster Tools crawling invalid URLs and showing 500 errors?

    - by Amos Kane
    Google Webmaster Tools is reporting 12k+ 500 errors. Eeek! None of the URLs are valid; they all contain www.youtube.com. First, why is Google crawling these URLs if they don't exist? I supplied a sitemap, and they are of course not in the sitemap. I don't have a robots.txt blocking anything. I've checked for invalid redirects: none. I've checked for unclosed tags or something that would throw www.youtube.com into the URL by accident: none. In every "linked from", the referring URL is also a bad URL with www.youtube.com in it. Google reports no malware, and I can't check the server logs because the host won't give me access. Really stuck!! Any ideas appreciated!
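
    One pattern worth grepping the templates for, offered as a guess rather than a confirmed cause: an href that omits the scheme, which browsers and crawlers resolve relative to the current page, producing exactly this kind of URL:

        <!-- Missing "http://": resolves to http://yoursite.com/current/page/www.youtube.com/watch?v=abc123 -->
        <a href="www.youtube.com/watch?v=abc123">video</a>

        <!-- Correct: an absolute URL with the scheme included -->
        <a href="http://www.youtube.com/watch?v=abc123">video</a>

    That would also explain why every "linked from" referrer is itself a bad URL: once one such page is crawled, every relative YouTube link on it compounds the path further.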

  • Nginx routing script for NodeJS and Wordpress

    - by Nilay Parikh
    We are moving our blogs and site from WordPress to NodeJS and are ready to move into production. However, I'm not able to figure out how to implement routing from the front server (Nginx) to NodeJS (the preferred web instance) such that, if the data has not yet been synced into the NodeJS site (NodeJS will throw a 404), we fall back (via reverse proxy) to WordPress and serve the page from there during the transition period.

    Q1. Is this approach good for the scenario, or can anyone suggest a better one?
    Q2. Should NodeJS act as the reverse proxy itself (using bouncy, https://github.com/substack/bouncy, or a similar package) in the fallback case, or should we stick with Nginx doing it via the FastCGI approach?

    Both NodeJS and WordPress are on a single server only.

    First scenario:

        User -> Nginx -> NodeJS (8080) -> if the resource is available, serve it directly;
                                          if not, NodeJS reverse-queries WordPress and serves its content

    Second scenario:

        User -> Nginx -> NodeJS (8080) -> if the resource is available, serve it directly;
                                          if not, NodeJS returns 404 to Nginx, and an Nginx rule falls back to WordPress (FastCGI PHP)

    Later we plan to phase out WordPress and PHP from the server environment completely. I'd like to see any examples of Nginx or Varnish scripts and/or NodeJS scripts you may have for me to refer to. Thanks.
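
    A hedged Nginx-only sketch of the second scenario, using standard directives: proxy_intercept_errors lets error_page catch the 404 that Node returns and retry the request against the WordPress PHP-FPM backend. The port, paths, and PHP-FPM socket are assumptions:

        server {
            listen 80;
            server_name example.com;
            root /var/www/wordpress;

            location / {
                proxy_pass http://127.0.0.1:8080;    # the NodeJS app
                proxy_intercept_errors on;           # let error_page see Node's 404
                error_page 404 = @wordpress;         # retry against WordPress
            }

            location @wordpress {
                include fastcgi_params;
                fastcgi_param SCRIPT_FILENAME $document_root/index.php;
                fastcgi_pass unix:/var/run/php5-fpm.sock;   # assumed PHP-FPM socket
            }
        }

    Keeping the fallback in Nginx has the advantage that NodeJS never needs to know WordPress exists, which makes the later phase-out a matter of deleting the @wordpress block.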

  • Video conference/chat tool needed that can be embedded in our own website

    - by Olaf
    We are looking for a means (a tool, a commercial service) to enable a closed user group to start a live video conference in a browser, as part of the company website. Something like Skype, but embedded and available to everybody who has access to the page into which the tool is embedded. Most services require registration and the creation of a chat room on their website, or, like Skype and similar solutions, the installation of extra software. What we need is a solution with some kind of "hidden login", performed by the site's client script (which knows who the user is and forwards the credentials to the service). Any suggestions?

  • How to correctly track analytics when using an iframe

    - by Sherry Ann Hernandez
    In our main .aspx page we have this analytics code:

        <script type="text/javascript">
            var _gaq = _gaq || [];
            _gaq.push(['_setAccount', 'UA-1301114-2']);
            _gaq.push(['_setDomainName', 'florahospitality.com']);
            _gaq.push(['_setAllowLinker', true]);
            _gaq.push(['_trackPageview']);
            _gaq.push(function() {
                var pageTracker = _gat._getTrackerByName();
                var iframe = document.getElementById('reservationFrame');
                iframe.src = pageTracker._getLinkerUrl('https://reservations.synxis.com/xbe/rez.aspx?Hotel=15159&template=flex&shell=flex&Chain=5375&locale=en&arrive=11/12/2012&depart=11/13/2012&adult=2&child=0&rooms=1&start=availresults&iata=&promo=&group=');
            });
            (function() {
                var ga = document.createElement('script');
                ga.type = 'text/javascript';
                ga.async = true;
                ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
                var s = document.getElementsByTagName('script')[0];
                s.parentNode.insertBefore(ga, s);
            })();
        </script>

    Inside this page is an iframe, and inside the iframe we set up this analytics code:

        <script type="text/javascript">
            var _gaq = _gaq || [];
            _gaq.push(['_setAccount', 'UA-1301114-2']);
            _gaq.push(['_setDomainName', 'reservations.synxis.com']);
            _gaq.push(['_setAllowLinker', true]);
            _gaq.push(['_trackPageview', 'AvailabilityResults']);
            (function() {
                var ga = document.createElement('script');
                ga.type = 'text/javascript';
                ga.async = true;
                ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
                var s = document.getElementsByTagName('script')[0];
                s.parentNode.insertBefore(ga, s);
            })();
        </script>

    The problem is that I see two pageviews for the AvailabilityResults page. The first one is attributed to direct traffic and the other one to CPC. How come they have different sources? I was expecting both of them to be direct traffic.

  • Looking to trade a 1U HP Proliant DL360 G5 in exchange for a small linux VPS

    - by user597875
    I have a 1U HP Proliant DL360 G5 that I have no place to rack and would like to trade it for a small Linux VPS. If interested, let me know... Here are the specs of the server:

        Model: Intel Xeon CPU 5150 @ 2.66GHz, 4MB L2 Cache
        Processor Speed: 2.7GHz
        Processor Sockets: 2
        Processor Cores per Socket: 2
        Logical Processors: 4
        Memory: 8GB
        Drives: 4x72GB 10k SAS
        Manufacturer: HP
        Model: Proliant DL360 G5
        BIOS Version: P58

  • How to get tens of millions of pages indexed by Googlebot?

    - by Chris Adragna
    We are currently developing a site that has 8 million unique pages, will grow to about 20 million right away, and eventually to about 50 million or more. Before you criticize... yes, it provides unique, useful content. We continually process raw data from public records and, by doing some data scrubbing, entity rollups, and relationship mapping, we've been able to generate quality content, developing a site that's quite useful and also unique, in part due to the breadth of the data. Its PR is 0 (new domain, no links), and we're getting spidered at a rate of about 500 pages per day, putting us at about 30,000 pages indexed thus far. At this rate, it would take over 400 years to index all of our data. I have two questions:

    1. Is the rate of indexing directly correlated to PR? By that I mean, is it correlated enough that purchasing an old domain with good PR will get us to a workable indexing rate (in the neighborhood of 100,000 pages per day)?
    2. Are there any SEO consultants who specialize in aiding the indexing process itself? We're otherwise doing very well with SEO, on-page especially. Besides, the competition for our "long-tail" keyword phrases is pretty low, so our success hinges mostly on the number of pages indexed. Our main competitor has achieved approx 20MM pages indexed in just over one year's time, along with an Alexa 2000-ish ranking.

    Noteworthy qualities we have in place:

    - Page download speed is pretty good (250-500 ms).
    - No errors (no 404 or 500 errors when getting spidered).
    - We use Google Webmaster Tools and log in daily.
    - Friendly URLs are in place.
    - I'm afraid to submit sitemaps. Some SEO community postings suggest a new site with millions of pages and no PR is suspicious. There is also a Google video of Matt Cutts speaking of a staged on-boarding of large sites, in order to avoid increased scrutiny (at approx 2:30 in the video).
    - Clickable site links deliver all pages, no more than four pages deep and typically with no more than 250(-ish) internal links on a page.
    - Anchor text for internal links is logical and adds relevance hierarchically to the data on the detail pages.
    - We had previously set the crawl rate to the highest in Webmaster Tools (only about a page every two seconds, max). I recently turned it back to "let Google decide", which is what is advised.

  • How can I decrease the relevancy of Creative Commons footer text? (In Google Webmaster Tools)

    - by anonymous coward
    I know that I may just have to link the image to make this happen, but I figured it was worth asking, just in case there's some other semantic markup or tips I could use... I have a site that uses the textual Creative Commons blurb in the footer. The markup is like so:

        <div class="footer">
            <!-- snip -->
            <!-- Creative Commons License -->
            <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/"><img alt="Creative Commons License" style="border-width:0" src="http://i.creativecommons.org/l/by-nc-sa/3.0/us/80x15.png" /></a><br />
            This work by <a xmlns:cc="http://creativecommons.org/ns#" href="http://www.xmemphisx.com/" property="cc:attributionName" rel="cc:attributionURL">xMEMPHISx.com</a> is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/">Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License</a>.
            <!-- /Creative Commons License -->
        </div>

    Within Google Webmaster Tools, the list of relevant keywords is heavily saturated with the text from that blurb. For instance, 50% of my top-ten most relevant keywords (including the site name) come from it:

        [site name], license, [keyword], commons, creative, [keyword], alike, [keyword], attribution, [keyword]

    I have not done any extensive testing to find out whether or not this list even matters, and so far it doesn't impact performance in any way. The site is well designed for humans, and it is as findable as it needs to be at the moment. But, mostly out of curiosity: do you have any tips for decreasing the relevancy of the text from the Creative Commons footer blurb?
