Search Results

Search found 9717 results on 389 pages for 'gkt pro'.


  • How long does Google Webmaster Tools need to generate content keywords if URL masking is enabled? [closed]

    - by user1439968
    Possible Duplicate: What is domain “masking” or “cloaking”? Why should it be avoided for a new web site? My real domain is domain.in, but URL masking has been enabled and the masked URL is domain2.in. I added the URL bputdoubts.21backlogs.in to Google Webmaster Tools a week ago, but no content keywords have been generated yet. When can I expect the content keywords to be generated? And is URL masking a problem for getting visitors from Google search?

    Read the article

  • List of freely available SEO tools (software) for keyword rank checking? [closed]

    - by Craig
    Possible Duplicate: Can anyone recommend a Google SERP tracker? Requirements: analysis of the site's positions for a list of keywords in different search engines; tracking keyword positions in search engines, so I can see whether my keyword rankings have moved up or down; creating reports. I currently use Excel plus the Rank Checker addon for Firefox to analyze the site's position in search engines for my keyword list. Are there any tools that are tested and work properly? Thanks.

    Read the article

  • Consolidating multiple domain names

    - by Mike
    I have a client that has three separately hosted copies of their website, each on a separate domain name. The websites are all essentially the same, bar a few discrepancies caused by badly managed updates in the past. I will soon be launching a completely new website for them, at which point, all three domain names are to resolve to the same web server. One domain name will become the default domain name that they refer to in all their literature, and the other two will simply be used as catch-alls for old links, bookmarks, and so on. I would like to know what people consider the best route to achieve this. My plan so far is:
    1. Get the new site up and running on the new webserver.
    2. Change the relevant A record of the default domain name to point to the new webserver.
    3. a) Keep the existing hosting accounts in operation. Create a list of 301 redirects from old page names on the old site to new page names on the new site. or
       b) Configure CNAME records for the non-default domain names, each pointing to the new webserver. Create a list of 301 redirects on the new site that redirect from old page names to new page names.
    If my understanding is correct, 3a will help to maintain whatever search engine rankings the sites already have (I know it's not going to be perfect), while at the same time informing search engines that the old domain names are no longer in use. What's a good approach to take here?
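
    For illustration, a minimal Apache sketch of the 301 side of this plan is shown below. It assumes the non-default domains already resolve to the new webserver (per steps 2/3b) and are handled by a dedicated vhost; all domain names are placeholders rather than the client's real ones.

      # Hypothetical vhost for the catch-all domains: 301 everything to the
      # default domain, preserving the requested path.
      <VirtualHost *:80>
          ServerName  old-domain-one.example
          ServerAlias old-domain-two.example
          Redirect permanent / http://default-domain.example/
      </VirtualHost>

      # Individual page moves can then be added on the new site, for example:
      # Redirect permanent /old-page.html http://default-domain.example/new-page/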

    Read the article

  • How to crawl a web page with dynamic content added by JavaScript

    - by blunderboy
    I guess there is news that Google's bots now have the capability to understand our JavaScript code, which means it is possible to fully crawl a web page that has lazy loading enabled. I am using Apache Nutch to crawl websites, but I don't think it has the capability to fetch the URLs that are injected into the HTML page by JavaScript when the page is scrolled down. I see a lot of websites doing lazy loading for performance reasons. So can somebody please explain how I can crawl the data that only appears in the HTML page on lazy load (i.e. when the page is scrolled down)?

    Read the article

  • How long does it take for Google Webmasters to index site after submitting sitemap? [closed]

    - by Venkatesh Hodavdekar
    Possible Duplicate: Why isn't my website in Google search results? I submitted my website to Google today using Google Webmaster Tools and a sitemap. The status on the sitemap says OK and it shows that 12 URLs have been recognized. I was wondering how long it takes for the links to get indexed, as the indexed URLs option says "No data available. Please check back soon." I am not sure whether it is showing this message due to some error, or whether everything is fine.

    Read the article

  • HTML5 media loading sometimes suspends or aborts: misconfigured Apache?

    - by Joan Botella
    Recently, some code that had been working fine for months started to behave unexpectedly. The code is just a media-file-loading JavaScript function that uses jQuery. It's pretty long, but in essence it is like this:
      var $audio = $('<audio>');
      $audio.on('canplaythrough', function(e){ $audio[0].play(); });
      $audio.attr('src', 'song.ogg');
    Basically, the file only loads sometimes, and sometimes it stops loading with a suspend or even an abort event. I have uploaded a little test HTML page to http://www.joanbotella.com/tests/loading , where you can see what's happening. You can download the test files from http://www.joanbotella.com/tests/loading/loadingTest.zip for local testing. I have just checked that opening the test index.html file directly in Firefox, rather than through my localhost Apache server, makes the audio files perfectly playable. So I assume my hosting provider and I have the Apache server misconfigured for serving media files. My software versions are: Apache 2.2.22-1ubuntu1.7, Mozilla Firefox 31.0, Chromium 36.0.1985.125 and jQuery 1.11.0. Can you help me? Thanks in advance!
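
    For what it's worth, one frequent cause of suspended or aborted media downloads is the server sending the files without a correct Content-Type header. The .htaccess sketch below shows how audio MIME types might be declared under Apache; it illustrates that one possible misconfiguration only, and is not a confirmed diagnosis of the setup described above.

      # Hypothetical .htaccess additions: serve audio files with explicit
      # MIME types so browsers treat the responses as playable media.
      <IfModule mod_mime.c>
          AddType audio/ogg  .ogg .oga
          AddType audio/mpeg .mp3
          AddType audio/mp4  .m4a
      </IfModule>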

    Read the article

  • Best way to host multiple WordPress sites on a single VPS [migrated]

    - by Ben
    Not sure if this is a webmaster or a WordPress question; it's a bit of both, so sorry if I'm posting in the wrong place. Without using Multisite or installing new WordPress instances in second-level domains, what's the best way to get multiple WordPress installs running on my VPS (running Linux-powered CentOS 6 with WHM and cPanel)? It's currently working, but only by setting the permalinks option to the default setting, so the URLs aren't human-friendly. I have come across something called WPSiteStack, though I'd really rather not go down that route. Long story short, I need the following:
    - separate installs, so that one core/theme/plugin update doesn't affect all sites, and so the security of every site is improved;
    - 'pretty' permalinks;
    - each WordPress install in the root of its own domain, to ensure that I can accurately measure my clients' quotas.
    It may also be worth noting that some functions within each install use the $_SERVER['DOCUMENT_ROOT'] and $_SERVER['HOST'] variables. I have already edited the httpd-vhosts.conf, httpd.conf and .htaccess files, but this hasn't made any changes. So, any ideas what I'm missing or doing wrong? Any help is much appreciated.
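
    As a point of reference, a minimal sketch of the kind of per-domain VirtualHost that keeps each WordPress install in its own document root is shown below. The domains and paths are placeholders; on a WHM/cPanel server these blocks are normally generated by the panel rather than written by hand, and the access-control syntax shown is for the Apache 2.2 that CentOS 6 ships.

      # Hypothetical vhost definitions; each install lives in its own docroot.
      <VirtualHost *:80>
          ServerName   site-one.example
          DocumentRoot /home/siteone/public_html
          <Directory /home/siteone/public_html>
              # AllowOverride All lets WordPress's .htaccess rewrites work,
              # which is what 'pretty' permalinks rely on.
              AllowOverride All
              Order allow,deny
              Allow from all
          </Directory>
      </VirtualHost>

      <VirtualHost *:80>
          ServerName   site-two.example
          DocumentRoot /home/sitetwo/public_html
          <Directory /home/sitetwo/public_html>
              AllowOverride All
              Order allow,deny
              Allow from all
          </Directory>
      </VirtualHost>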

    Read the article

  • AdSense: You have rejected ad requests, which will result in lost revenue

    - by Chankey Pathak
    I got an alert on my AdSense account which says: "You have rejected ad requests, which will result in lost revenue. The following ad units have made ad requests with incorrect site information. This occurs when the URL of the server from which the ad unit has been served differs from the URL of the actual page where the ad will be displayed. Learn how to fix these errors." So the solution is that I'll have to use google_page_url = "http://myurl.com/fullpath";. I'm using WordPress; what should the URL be for google_page_url? For example, my website is www.technostall.com. Should I put www.technostall.com there, or should I give the path of each post? The latter is awkward because I'm using a sidebar widget for the sidebar ad unit, so I can't change google_page_url for each page. What should I do? This error appears only on my sidebar/navigation ad units. Is using google_page_url = document.location; fine?

    Read the article

  • Sub Domain tracking with Analytics filters

    - by Nick
    Hi all, We currently have Analytics tracking code running throughout our site, including our subdomains. What I would like to do is create different profiles under the same account, segmenting the subdomains by means of filters. Currently I am just excluding the hostname of the main website using the following custom filter: Exclude: Hostname, Filter pattern: ^www.mydomain.co.za(.*). I know this isn't the proper method of doing this, though, and some of the main domain's links are still coming through in the data. Ideally I would just like to include anything from sub.domain.co.za. Any help would be greatly appreciated. Thanks

    Read the article

  • Update Google Sitemap for Mobile

    - by dimo414
    I have a series of utilities to generate Google sitemaps for my whole site. These files are massive, and slow to build. We want to start telling Google these pages are mobile-crawl-able too, by adding them to mobile sitemaps, but the documentation is unclear if I need to specify physically different files for my mobile URLs than for my normal ones. If this is my current sitemap:
      <?xml version="1.0" encoding="UTF-8" ?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://mobile.example.com/article100.html</loc>
        </url>
      </urlset>
    Can I simply change it to:
      <?xml version="1.0" encoding="UTF-8" ?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
        <url>
          <loc>http://mobile.example.com/article100.html</loc>
          <mobile:mobile/>
        </url>
      </urlset>
    Or do I need to create new files with the additional markup, alongside my existing files?

    Read the article

  • Website Hosting/Registration [closed]

    - by Ricko M
    Possible Duplicate: How to find web hosting that meets my requirements? I am planning to launch a website soon and wanted to know what solutions are available for hosting and registration. Starting with domain registration: is there any site you have used or prefer? I am considering either GoDaddy or 123-reg. Does it even make any difference which one I choose? Is there any fine print I need to worry about? I am based in the UK, not sure if that helps in resolving any issues if they are encountered. Does my hosting need to be with the same company where I registered the domain? If not, will there be any transfer fees if I change my hosting? Can I just register the name now and worry about hosting later? At the moment, I plan to have the site up and running using either some sort of a tool or a template, and perhaps add the bells and whistles down the line. I understand 123-reg has its own builder tool available, and there are a few solutions suggested like WordPress, Drupal and Joomla. I am a C++ developer, not a web programmer, but I do feel the need to open the hood and make changes if I see fit, so I guess I am looking for a solution where I can easily drag and drop the widgets I need and, when the time comes, customize it. Which CMS would you recommend? Extras: what extras do I need to get? I was advised to get WHOIS privacy to keep the spambots away; is there anything else you would recommend I keep my eyes open for before I sign on the dotted line?

    Read the article

  • How to protect a site from a penalty [on hold]

    - by Simon Walker
    I had been working hard to recover my site from Penguin 2.0, and I was successful. Then, a week before Penguin 2.1, one of my competitors created thousands of spam backlinks in just two days. The timing was more than perfect to make my website a victim of Penguin 2.1. I have disavowed all those dirty links and I regularly update the disavow sheet with poor backlinks. Is there anything else I can do to help the site recover faster? http://bit.ly/fvbyLg

    Read the article

  • How to let Google know about dynamic content?

    - by Yaniv
    I'm looking for the best practice for letting Google know about a vast number of dynamically created pages. Let's say (I mean, dream) that I'm Facebook and I want Google to index all the users' posts. Sitemap.xml may be the answer for this, but sitemaps are limited to 50,000 URLs each. I know that I can create 500 sitemaps and a sitemap index for them, but that is also limited; 25,000,000 URLs sounds like quite enough at the moment, but it could cause problems in the future. For example, Stack Overflow already has 3 million posts, so a sitemap is probably not the solution for them. The alternative is creating a page with paging and links to all the dynamic data; I guess this is what Stack Overflow did by creating the page at http://stackoverflow.com/questions. So I think that option 2 is the answer, but it seems to me that sitemaps might have some added value. What should I do?

    Read the article

  • Is it considered blackhat SEO to have hidden text within links?

    - by Sam152
    My aim is simply to be informative to search engines about where a link is pointing. I have some content that is listed by name and then a "Permalink" button. Would it be black-hat SEO to add some hidden text within the anchor that describes where the permalink is pointing? My content is like so: News Item 1 Permalink (<a href="/my-news-item-1"><hidden>News Item 1</hidden> Permalink</a>) Teaser text... The news title of the block already links to the article, but I think it would benefit users to provide an explicit permalink button.

    Read the article

  • Information about links disappeared from Webmaster Tools

    - by Bobrovsky
    I discovered that all the information about links to my site has disappeared from Google Webmaster Tools. The last time I checked the "Links to your site" page in GWT, there was a nice list of linking domains and so on; now there is only "No data available." There have been no changes to the site's contents. Why could this be, and what can I do to fix it? About a month earlier I found that the PR of all my pages had dropped by 2 points. Could these changes be related?

    Read the article

  • Is there a way of using HTTPS with Amazon's CloudFront CDN and CNAMEs?

    - by Metalshark
    We use Amazon's CloudFront CDN with custom CNAMEs hanging under the main domain (static1.example.com). We can break this uniform appearance and use the original whatever123wigglyw00.cloudfront.net URLs to get HTTPS, but is there another way? Do Amazon or any similar providers offer HTTPS CDN hosting? Is TLS with its selective encryption (SNI: Server Name Indication) available for use somewhere? Footnote: I'm assuming the answer is no, but asking in the hope that someone knows. EDIT: Now using Google App Engine (https://developers.google.com/appengine/docs/ssl) for CDN hosting with SSL support.

    Read the article

  • Pagination and duplicate content

    - by jazz090
    I have an archive page that displays the articles published. Because there were so many, I ran a pagination script: 127.0.0.1/archive/2/?p=x&pp=y, where p is the page number and pp is the number of articles to display per page. The pagination looks like this: Prev 1 2 3 4 ... 12 NEXT, with each item linking to its page like <a href="?p=x">x</a>. I also have an items-per-page setter: 25 | 50 | 100 (<a href="?pp=y">y</a>). Now I have a PHP script that stores pp in a session variable. But I am worried about duplicate content (since larger pp values include the content of the smaller ones) and also about content not getting indexed because it isn't in the pagination links; in the example above, pages 5-11 will not be indexed. Any ideas on how to fix this?

    Read the article

  • .html extension or not, for SEO purposes

    - by Scott Schluer
    I know this question has been asked before on Stack Overflow, but what I have not been able to find in the posts I've read is a concrete reference as to WHY one is better than the other (something I can take to my boss). I'm working on an MVC 3 application that is basically a rewrite of the existing production application (Web Forms) using MVC. The current site uses a URL rewriter to rewrite "friendly" URLs with HTML extensions to their ASPX counterparts, i.e. http://www.site.com/products/18554-widget.html gets rewritten to http://www.site.com/products.aspx?id=18554. We're moving away from this with the MVC site, but the powers that be still want the HTML extension on the URLs. As a developer, that just feels wrong on an MVC site. I've written a quick and dirty HttpModule that performs a 301 redirect from the .html URL to the same URL without the .html extension, and it works fine, but I need to convince management that removing the .html extension is not going to hurt SEO. I'd prefer to have this sort of friendly URL: http://www.site.com/products/18554-widget. Can anyone provide information to back up my position, or am I actually trying to do something that WOULD hurt SEO, in which case can you provide references on that?

    Read the article

  • How do I change pages registering as 404 to 200?

    - by christian
    I have this problem: after relaunching my site, http://www.kgstiles.com, traffic dropped immensely (by about 60%). After troubleshooting for a week and a half, and losing thousands of dollars of traffic in the process, I found that Google was getting a 404 error at the end of many of my 301 redirects (so it wouldn't index the new pages). Most of the pages, though, would load in my browser. They registered as 404 errors in Google's index as well as in a 404 checker. So my first question is: could this be what's causing my loss of traffic? And second: how do I fix it? I'm desperate! Any help is appreciated!
      # BEGIN s2Member GZIP exclusions
      <IfModule mod_rewrite.c>
      RewriteEngine On
      RewriteBase /
      RewriteCond %{QUERY_STRING} (^|\?|&)s2member_file_download\=.+
      RewriteRule .* - [E=no-gzip:1]
      </IfModule>
      # END s2Member GZIP exclusions
      # BEGIN WordPress
      <IfModule mod_rewrite.c>
      RewriteEngine On
      RewriteBase /
      RewriteRule ^index\.php$ - [L]
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule . /index.php [L]
      </IfModule>
      <IfModule mod_rewrite.c>
      RewriteEngine On
      RewriteRule ^moreinfo/(.*)$ http://www.kgstiles.com/moreinfo$1 [R=301]
      RewriteRule ^healthsolutions/(.*)$ http://www.kgstiles.com/healthsolutions$1 [R=301]
      RewriteRule ^(.*)\.html$ $1/ [R=301,L]
      RewriteRule ^(.*)\.htm$ $1/ [R=301,L]
      </IfModule>
      # END WordPress
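
    One pattern that often produces exactly this symptom is rule ordering: the WordPress catch-all (RewriteRule . /index.php [L]) sits above the extension-stripping redirects, so requests for old .html URLs that no longer exist as files are handed to WordPress, which returns a 404 before the 301 rules are ever reached. Purely as a sketch of that idea, and not a verified fix for this site, the redirect block could be moved ahead of the WordPress block:

      # Hypothetical re-ordering: issue the 301s first, then let the
      # s2Member and WordPress blocks handle whatever is left.
      <IfModule mod_rewrite.c>
      RewriteEngine On
      RewriteBase /
      RewriteRule ^moreinfo/(.*)$ http://www.kgstiles.com/moreinfo$1 [R=301,L]
      RewriteRule ^healthsolutions/(.*)$ http://www.kgstiles.com/healthsolutions$1 [R=301,L]
      RewriteRule ^(.*)\.html$ /$1/ [R=301,L]
      RewriteRule ^(.*)\.htm$ /$1/ [R=301,L]
      </IfModule>
      # ... existing s2Member and WordPress blocks follow unchanged ...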

    Read the article

  • Configure open_basedir under Plesk

    - by cori
    This might be a question for Server Fault, and if it weren't for the Plesk aspect I would ask it there to start with, so if it's better suited for over there let me know and I'll move it. I'm working on a dedicated server set up as a reseller account, with Plesk to manage the domains and server configuration, and I need to add a directory to the local open_basedir configuration for a specific vhost. Given Plesk's normal methodology, I expected to be able to go to /var/www/vhost/{%DOMAINNAME%}/conf, modify vhost.conf and place a new value there, as I have successfully done with other configuration settings for this domain (turning safe_mode off, for instance). When I do so, however, the new setting doesn't take effect (per phpinfo();). If I edit httpd.conf (which the Plesk configuration specifically says not to do, in the notes at the top of httpd.conf), the setting takes effect. Is there something specific about the open_basedir setting that makes it not configurable in vhost.conf? And how much trouble am I letting myself in for by editing the vhost-specific httpd.conf? (I imagine that if someone makes changes in the Plesk web interface it might be overwritten, but what other risks are there?) Thanks!
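
    For reference, the sort of vhost.conf entry being described usually looks like the sketch below. The directory paths are placeholders, and Plesk generally has to regenerate its Apache includes (and Apache be reloaded) before such a change is picked up; this illustrates the format only and is not a confirmed explanation of why the setting isn't taking effect.

      # Hypothetical vhost.conf snippet; paths are placeholders.
      <Directory /var/www/vhosts/example.com/httpdocs>
          php_admin_value open_basedir "/var/www/vhosts/example.com/httpdocs:/tmp:/path/to/extra/dir"
      </Directory>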

    Read the article

  • Primary domain on VPS has been deactivated

    - by manix
    This is my scenario: I have a VPS with two domains (example1.com, example2.com). When I started with this VPS, I set example1.com as the primary domain and the nameservers were configured with the pattern ns1.example1.com, ns2.example1.com. The domains were bought at name.com. Over time I only ever worked with the domain example2.com, so I stopped paying the annual registration for example1.com and just kept example2.com. But today my VPS is unreachable, because the main domain was deactivated on the 23rd. I never imagined that it could affect my server. I am very worried, because I don't know whether rebuilding the VPS is the solution here; I could lose my data. Can you point me in the right direction to recover my VPS?

    Read the article

  • Should a URL match the page's title?

    - by Yottatron
    Should the URL of a page match its title? For example: http://example.com/about-cats.html with <title>About Cats</title>. Furthermore, if that title were to be changed by the page's author, should the URL change to match, and the old URL be redirected (301) to the new URL? Edit: Also, if the page's author were to decide to revert his changes after several days, would it be right to remove the redirect and set up a new redirect from the amended URL back to the old URL?

    Read the article

  • How to fix HTTP 400-499 error codes with 301 redirects in an .htaccess file

    - by user2131844
    Google previously indexed my website's pages (sitemap.xml) with the format below: www.domain.com/2013/04/18/hot?test-gadgets-of-2013-to-include-in-?your-list and www.domain.com/2013/02/09/rin?gdroid. I have resubmitted the sitemap, but there are still 404 errors in Google/Bing. Could you please help me write a 301 redirect rule in the .htaccess file so that when someone clicks the URL www.domain.com/2013/02/09/rin?gdroid they are redirected to www.domain.com/rin?gdroid? How can we write a rule in the .htaccess file to remove the date part 2013/02/09/?
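
    A rule of the following shape is the usual way to strip a leading /YYYY/MM/DD/ segment with a 301. It is only a sketch against the URL pattern described above, not a rule tested on this particular site.

      # Hypothetical .htaccess rule: 301-redirect /2013/02/09/post-slug to /post-slug
      <IfModule mod_rewrite.c>
      RewriteEngine On
      RewriteRule ^[0-9]{4}/[0-9]{2}/[0-9]{2}/(.+)$ /$1 [R=301,L]
      </IfModule>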

    Read the article

  • How to properly URL/domain forward

    - by NRGdallas
    No clue on a title for this; someone feel free to suggest an edit. I have a client who has a website. He owns around 200 domains and wants each domain to contain content from the main website. The header, footer and navigation bars will remain the same for each domain, but the actual page content will vary (obviously there are duplicate content issues; I'm open to suggestions). He wants each individual page to be its own separate domain rather than a URL within the main domain (page1.com, page2.com, etc., NOT site.com/page1.html; however, the file is actually hosted at site.com/page1.html, and all links will point to site.com/whatever accordingly). What would be the best place to start reading and learning about how to do this, and what concerns and considerations should be kept in mind?
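
    For concreteness, below is a minimal sketch of one way such a setup is sometimes wired in Apache: each extra domain gets a VirtualHost that shares the main site's document root and serves one specific file at its root. The domain names and paths are placeholders, and this says nothing about the duplicate-content question.

      # Hypothetical VirtualHost for one of the ~200 satellite domains.
      # All satellite vhosts share the main site's DocumentRoot; each one
      # serves a different file for "/" via DirectoryIndex.
      <VirtualHost *:80>
          ServerName   page1.com
          ServerAlias  www.page1.com
          DocumentRoot /var/www/site.com/public_html
          DirectoryIndex page1.html
      </VirtualHost>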

    Read the article

  • My site has crashed. Does anyone have some info?

    - by marwan
    Hi all, I registered a domain name for my website with a hosting provider. I gave the domain name, along with the FTP details, to a freelancer to develop the site in WordPress. The freelancer developed it, he received full payment, and the site was working fine. Since that time I have not changed the admin login or the FTP details, which means this information is still known to the freelancer. A week ago I found that some links on my site were not working. I sent him a mail about this, and he said that he would fix it if I gave him the FTP details, so I did. Next I found that the entire site was gone. Then he sent me a mail, unprompted, saying that someone had gained access to my server, removed all the files of my site and installed Drupal instead, and that he could rebuild the site in one day for a full fee of 250 USD. Can anyone tell me what I can do in this situation to find out who did this (could it be the hosting provider or the freelancer?), and whether there is a possibility of getting my site back onto the server? I will appreciate any info on this. Regards, thanks

    Read the article
