Search Results

Search found 9717 results on 389 pages for 'gkt pro'.

Page 140/389

  • SH404SEF URLs in Joomla 1.5

    - by Tao Bellamine
    I have two places to configure URLs: Joomla's global configuration and the sh404sef component. Global config is set to "SEF URLs: Yes" and "mod_rewrite enabled: Yes", and sh404sef is set to "URL optimization: No". My problem is that even with SEF URLs enabled in the global config, my URLs still don't seem very user friendly, so I turn on URL optimization in sh404sef and get better, more descriptive URLs. However, the problem I inherit from doing this is that my dynamically populated ChronoForms get messed up (only the ChronoForms; other forms are fine): these forms now show up on the homepage instead of on their own reserved page. Here's an example:

    Old "good" form URL: http://www.mycraftwork.com/index.php?option=com_content&view=article&id=94
    New optimized "bad" URL: http://www.mycraftwork.com/handthrown-pottery/alladin-teapot/index.php?option=com_content&view=article&id=94

    Any help would be GREATLY appreciated! I can even turn sh404sef on and off if anyone is interested in seeing the issue live. Thanks!! Tao Bellamine

    Read the article

  • Best practice for bulk eCommerce product upload?

    - by Or W
    I'm thinking about opening a large online jewelry store. The one thing that really bothers me is managing the actual operation of taking pictures, then uploading and describing all the products. I'm trying to figure out the best way to do it, in terms of performance or the least time consumed. A few things to keep in mind: I'll have over 1,000 items in the online store; I'll have 3-4 pictures for each item (I'm using a DSLR camera, if that makes any difference); I'm probably going to use Magento, unless you have better experience with another eCommerce platform that will help me get this done quickly; and I'll need to (randomly?) create a product code for each item (see the sketch below).
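
    On the product-code step, here is a minimal PHP sketch; the item data, the JW- prefix and the CSV layout are placeholders rather than Magento requirements. It generates a predictable SKU per item and writes a CSV that a bulk importer can be mapped onto:

        <?php
        // Sample catalogue rows; in practice these would come from a spreadsheet or DB.
        $items = array(
            array('name' => 'Silver ring',   'price' => 120.00),
            array('name' => 'Gold necklace', 'price' => 340.00),
        );

        $fh = fopen('products.csv', 'w');
        fputcsv($fh, array('sku', 'name', 'price'));   // header row
        foreach ($items as $i => $item) {
            // A fixed prefix plus a zero-padded counter (JW-0001, JW-0002, ...)
            // is easier to audit than truly random codes.
            $sku = sprintf('JW-%04d', $i + 1);
            fputcsv($fh, array($sku, $item['name'], $item['price']));
        }
        fclose($fh);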

    Read the article

  • http to https upgrade -- SEO troubles

    - by SLIM
    I upgraded my site so that all pages have gone from http to https. I didn't consider that Google treats https pages differently than http. I re-created my sitemap so that all links now reflect the new https URLs and let it be for a few days. (Whoops!) Google is now re-indexing all the https pages. I have about 19k pages on the site, and Google has already indexed about 8k of the new https ones. The problem is that Google sees all of these as brand-new pages, when many of them have a long http history. Most of you will recognize the problem: I didn't set up a 301 from the old http URLs to the new https ones. Is it too late to do this? Should I switch my sitemap back to http and then 301 to the new https? Or should I leave the sitemap as is and set up 301 redirects anyway? I'm not even sure Google is still trying to reach the http site. Currently the site is doing 303 redirects (from http to https), although I haven't figured out why yet. Thanks for any suggestions you can offer.
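
    The usual fix is a site-wide 301, sketched here for Apache with mod_rewrite (this would replace whatever is currently issuing the 303s):

        # .htaccess: redirect every http request to its https counterpart
        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    It is generally not too late to do this: a 301 is the signal that lets Google consolidate the old http history onto the https URLs as it recrawls.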

    Read the article

  • How can I pass the referrer header from my https domain to http domains?

    - by nutcracker
    My website is 100% https, and I have links to other http domains. The referrer header is not set when linking from an https page to an http page. From http://en.wikipedia.org/wiki/HTTP_referrer: "If a website is accessed from a HTTP Secure (HTTPS) connection and a link points to anywhere except another secure location, then the referer field is not sent." I would prefer that other domains can see the referrer, so they know the traffic comes from my domain. Is there a way to force this header, or is there another solution?

    Update: I've done some basic testing using a redirect:

    http page -- link to http --> 301 redirect --> http page = referrer intact
    https page -- link to https --> 301 redirect --> http page = referrer blank
    https page -- link to http --> 301 redirect --> http page = referrer blank
    https page -- link to http --> 302 redirect --> http page = referrer blank

    The referrer is lost when linking from an https page to an http redirect page on my own domain, so there is no referrer after the redirect.
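
    One workaround is the meta referrer tag (later standardized as Referrer Policy); browser support varies, so treat this as a sketch rather than a guaranteed fix:

        <!-- on the https pages: send the origin even on https -> http links -->
        <meta name="referrer" content="origin">

    With content="origin", the browser sends only the scheme and host (e.g. https://yourdomain.com/) as the referrer, which is enough for the target site to see where the traffic came from without exposing full URLs.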

    Read the article

  • Ranking drop after using reverse proxy for blog subdirectory and robots.txt for old blog subdomain

    - by user40387
    We have a 3Dcart store and a WordPress blog hosted on a separate server. Originally, we had a CNAME set up to point the blog to http://blog.example.com/. However, in our attempt to boost link-based and traffic-based authority on the main site, we've opted to do a reverse proxy to http://www.example.com/blog/. It's been about two months since we finished the reverse proxy migration. Everything appears to be technically working as intended, including some robots and sitemap changes; the new URLs are even generating some traffic, as indicated in Google Analytics. While Google has been indexing the new URL locations, they're ranking very poorly, even for non-competitive, long-tail keywords. Meanwhile, the old subdomain URLs are still ranking mostly as well as they used to (even though they aren't showing meta titles and descriptions, due to being blocked by robots.txt). Our working theory is that Google has an old index of the subdomain URLs and is considering the new URLs duplicate content, since it's being told not to crawl the subdomain and therefore can't see the rel canonicals we have in place (see the sketch below). To resolve this, we've updated the subdomain's robots.txt to no longer block crawling and indexing; in theory, seeing the canonical tag on the subdomain pages will resolve any perceived duplicate content issues. In the meantime, we were wondering if anyone has any other ideas. We are very concerned that we'll be losing valuable traffic, as we're entering our on-season at the moment.
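
    For reference, the tag the working theory depends on looks like this (example.com and the slug are placeholders): each http://blog.example.com/ page should point at its reverse-proxied location, e.g.

        <link rel="canonical" href="http://www.example.com/blog/some-post/" />

    Once robots.txt stops blocking the subdomain, Google can fetch these pages again, see the canonical, and fold the duplicate signals onto the /blog/ URLs.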

    Read the article

  • Adding a regular expression in PHP not working

    - by John Smiith
    I added ([a-zA-Z0-9\_\-]+), but it does not work. I want to include all CSS files; is there any other way to do it? My code:

    css.php:

        header("Content-type: text/css");
        $css = array(
            '([a-zA-Z0-9\\_\\-]+).css',
        );
        foreach ($css as $css_file) {
            $css_get = file_get_contents($css_file);
            echo $css_get;
        }

    call.php:

        <link href="css.php" rel="stylesheet" type="text/css" />

    I also want to rewrite css.php to css.css so the public sees css.css instead of css.php. How can I do that using a PHP script?
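
    The pattern fails because file_get_contents() expects a literal filename, not a regular expression. A minimal sketch of the usual approach, assuming the .css files sit in the same directory as css.php, uses glob() to expand a wildcard instead:

        <?php
        // css.php: concatenate and serve every .css file in this directory.
        header("Content-type: text/css");
        foreach (glob("*.css") as $css_file) {
            echo file_get_contents($css_file);
        }

    As for exposing it as css.css: PHP alone cannot rename the incoming request; that is normally done with a one-line server rewrite (e.g. RewriteRule ^css\.css$ css.php [L] under Apache), assuming mod_rewrite is available.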

    Read the article

  • How to secure a robots.txt file?

    - by CompilingCyborg
    I would like user agents to index my root-level pages only, without accessing any directory on my server. As an initial thought, I had this version in mind:

        User-agent: *
        Disallow: */*
        Sitemap: http://www.mydomain.com/sitemap.xml

    My questions: Is it correct to block all directories like that, with Disallow: */*? Would search engines still be able to see and index my sitemap if I disallowed all directories? What are the best practices for securing the robots.txt file? For reference, here is a good tutorial for robots.txt:

        # Add this if you want to stop Alexa from indexing your site.
        User-agent: ia_archiver
        Disallow: /

        # Add this to stop duggmirror
        User-agent: duggmirror
        Disallow: /

        # Add this to allow specific agents
        User-agent: Googlebot
        Disallow:

        # Add this to allow all agents while blocking specific directories
        User-agent: *
        Disallow: /cgi-bin/
        Disallow: /*?*
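
    A sketch of a closer-to-standard variant: a leading wildcard like */* is nonstandard and ignored by some crawlers, so a directory block is usually written as one path prefix per directory, or as /*/ for crawlers (such as Googlebot) that do support wildcards:

        User-agent: *
        Disallow: /*/
        Sitemap: http://www.mydomain.com/sitemap.xml

    The Sitemap: line is read independently of the Disallow rules, so the sitemap stays discoverable. Note also that robots.txt is advisory, not security: it only asks well-behaved crawlers to stay out, and the file itself is always publicly readable.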

    Read the article

  • folder level 301 redirect without .htaccess

    - by Vinay
    I have a website hosted on Yahoo Small Business, so I don't have access to the .htaccess file. I have around 220 pages in a folder 'mysubfolder' (http://mysite.com/myfolder/mysubfolder), and the website is around 3 years old. I am planning to move all 220 pages in 'mysubfolder' to 'myfolder' (one level up). All the pages in 'mysubfolder' are indexed. What is the best way to do this so that it does not affect SEO?
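
    Without .htaccess, the redirect can be issued from the old pages themselves. A minimal sketch, assuming the host runs PHP and that each old URL can be replaced by this stub (the path handling is simplified and needs checking against the real URLs):

        <?php
        // Stub left at each old address under /myfolder/mysubfolder/:
        // permanently redirect to the same filename one level up.
        $target = "http://mysite.com/myfolder/" . basename($_SERVER['REQUEST_URI']);
        header("HTTP/1.1 301 Moved Permanently");
        header("Location: " . $target);
        exit;

    The 301 status is what tells search engines the move is permanent, so the pages' accumulated history follows them to the new location.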

    Read the article

  • Local Business Listing Dashboard

    - by Steve
    I operate a website for a West Australian company, and the company has listings in local business directory websites. Currently we are listed in: www.HotFrog.com.au www.Google.com/Places www.TrueLocal.com.au www.StartLocal.com.au www.localstore.com.au www.communityguide.com.au www.yelp.com.au www.aussieweb.com.au Do you know of any method for examining account stats (profile views, profile clicks etc) within a dashboard, so I can see at a glance how each of our listings is going? I'd be happy to build a dashboard if necessary, but I'm not confident I currently have the skills to accurately display only the correct concise information. Would I use iframes? See if they have APIs? Is there a dashboard framework I could use?

    Read the article

  • Does anyone know any good resources for learning how to market a web app?

    - by Jack Kinsella
    I'm a developer first and foremost. I write web apps but have a hard time generating traffic and converting potential users once I've released my product into the wild. I know I need to learn more about marketing but I don't know where to start as I've no baseline to judge the quality of the materials I stumble across. Does anyone know any websites, blogs, e-books or other resources for learning how to market effectively?

    Read the article

  • Is there evidence that linking to quality, reputable and popular websites helps with ranking?

    - by JVerstry
    Is there any evidence that linking to external quality, reputable and popular websites helps with ranking (directly or indirectly)? Is there an established correlation? Some posts on the web claim it, but without providing any evidence. It is known that linking to bad neighborhoods harms your reputation and authority, but does the reverse actually help? And does it matter whether the website is young or old in this case? Update: I have found a Moz video revealing a 0.04 correlation with ranking.

    Read the article

  • Which is the best CMS for building a video tutorial website? [duplicate]

    - by Rajesh Sankar R
    I am planning to set up a website where I will post how-to and training videos for various software products. There will be several training videos under a single topic, and several topics under a single product's training, similar to lynda.com but on a small scale. Can you suggest a CMS for the job? Initially I plan to host the videos on YouTube and link them into my website, but I will move to my own hosting later on. It must be customizable, scalable, and most of all OPEN SOURCE.

    Read the article

  • Google Cache showing wrong URL

    - by Sathiya Kumar
    I searched the cache details of the URL http://property.sulekha.com/pune-properties, but the Google cache shows details for property.sulekha.com. I don't know why it's showing up like this, and not only for http://property.sulekha.com/pune-properties but for all the Indian city-related URLs, like http://property.sulekha.com/chennai-properties, http://property.sulekha.com/mumbai-properties, http://property.sulekha.com/kolkata-properties, etc. I don't even find these URLs in Google's search results: if I search for Chennai properties in Google, I find property.sulekha.com and not http://property.sulekha.com/chennai-properties. Why is it happening like this? Please let me know.

    Read the article

  • Do web crawlers/spiders index Azure web sites?

    - by Clay Shannon
    For somebody who wants their website to be as discoverable as possible (and who doesn't?), are Microsoft's Azure web sites (azurewebsites.net) a feasible domain to host sites on? I have a site that is on azurewebsites.net and also hosted under a completely different name by discountasp.net. Both of these sites are exactly the same except for the URL; whenever I update the code, I republish the site to both places, so obviously they both have the same H1 and H2 elements. Searching for the value/content in my H1 tag, I find my .com site listed #3 on Google and #2 on both Bing and Yahoo; on the other hand, my azurewebsites.net site doesn't show up on the first page at all in any of them. This makes me wonder if azurewebsites.net should only be used for Web API hosting and the like, not for generic/commercial "public" sites. Are my conclusions valid?

    Read the article

  • "X-Robots-Tag: noindex" on an HTTP 301 response

    - by Peter O.
    I understand that a resource with X-Robots-Tag: noindex forces some search engines, including Google, not to index the resource further. I also understand that an HTTP 301 response causes search engines to use the redirected URL instead of the original URL to refer to the resource. But what happens if both "X-Robots-Tag: noindex" and status code 301 occur on the same response? It's likely that the original URL will no longer be indexed, but will that cause the redirected URL to no longer be indexed too? This possibility is not mentioned in the X-Robots-Tag specification.
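
    For concreteness, the combination in question looks like this on the wire (the URL is a placeholder):

        HTTP/1.1 301 Moved Permanently
        Location: http://example.com/new-location
        X-Robots-Tag: noindex

    Since the specification does not define the interaction, whether the noindex is also applied to the redirect target is left to each search engine's implementation.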

    Read the article

  • Multisite network SEO: can a self-referencing canonical tag (rel="canonical") inside an article improve Google ranking?

    - by user5674576
    Can a self-referencing canonical tag (rel="canonical") inside an article improve Google ranking? The case: a company has 40 sites with original content, plus 1 main site that republishes some of the 40 sites' articles. The main site has rel="canonical" on each article. Should the article on the original site also have a self-referencing rel="canonical"? Example:

        inside the main network site (referencing the other site): <link href="http://site7.com/article25" rel="canonical" />
        inside the original network site (self-reference): <link href="http://site7.com/article25" rel="canonical" />

    Thanks in advance.

    Read the article

  • Google Webmaster Tools verification failed

    - by KMC
    I have a site built with Ruby on Rails. I verified it against Google Webmaster Tools some months ago, successfully. Then one day Webmaster Tools started failing re-verification. I tried to verify the site again using both the meta tag and the HTML file method, but I keep getting "Verification failed. The connection to your server timed out." Since then, Google has stopped crawling my site's content, though somehow it still crawls the PDF content on my site.
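
    For reference, the meta-tag method being retried looks like this, with the content value a per-site token issued by Webmaster Tools:

        <!-- in the <head> of the home page -->
        <meta name="google-site-verification" content="your-token-here" />

    The timeout error, though, points at the server not answering Google's fetch rather than at the tag itself; on Rails it is worth confirming that the verification URL (and the HTML verification file) returns a fast 200 instead of falling through to a slow catch-all route.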

    Read the article

  • Wordpress Multisite and Google Analytics in subfolders with mapped domains

    - by David
    I have a WordPress multisite using subfolders, with the subfolders mapped to domains that are set as primary. I'm using the 'Google Analytics Multisite Async' code to track things. From what I can see, it's tracking the sites fine (each site gets page hits in Google Analytics), except for the original site in the multisite, whose content overview lists the other domains and the traffic they get along with the original domain's traffic. I don't want to track any traffic for my original site other than what goes to it; i.e., I don't want it tracking the other sites in the multisite. E.g., domain1.com is my original site, and the multisite contains other sites, say domain2.com and domain3.com. In the content overview in Analytics, it lists, say, domain2.com as content. Can I tell it to filter these out somehow, either in Analytics or within WordPress? Hopefully I've explained that clearly!

    Read the article

  • AdSense Custom Search Ads - custom query

    - by Alex
    I'm trying to set up a custom search ad, but I am not sure about the query. The site (https://developers.google.com/custom-search-ads/docs/implementation-guide) says 'query' should be dynamic based on your page: this variable targets the ads and therefore should always match what the user on your site has just performed a search for. Now, what I understand is: I have to program my page so that the query variable contains some custom words. Am I right? If a user gets to my site by clicking on an AdSense ad, there is no way to "know" what the user looked for and fill my query accordingly, right? Thanks for any help!
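
    A sketch of the intended wiring, following the guide's pageOptions pattern and assuming the CSA loader snippet from the guide is already on the page; reading the term from a ?q= parameter is an assumption about how the site's search results pages are addressed:

        <div id="adcontainer1"></div>
        <script>
          // Fill 'query' from this page's own search parameter.
          var m = new RegExp('[?&]q=([^&]*)').exec(window.location.search);
          var pageOptions = {
            'pubId': 'partner-pub-0000000000000000',  // placeholder publisher ID
            'query': m ? decodeURIComponent(m[1].replace(/\+/g, ' ')) : ''
          };
          var adblock1 = { 'container': 'adcontainer1' };
          _googCsa('ads', pageOptions, adblock1);
        </script>

    And yes: the query is the search the visitor performed on your own site, not the Google search that brought them in. On pages that are not search results there is no term to target with, which is why these ads belong on your own results pages.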

    Read the article

  • Is there a modern (eg NoSQL) web analytics solution based on log files?

    - by Martin
    I have been using AWStats for many years to process my log files, but I'm missing many possibilities (like cross-domain reports) and I hate being stuck with the extra fields I created years ago. Anyway, I am not going to keep using this script. Is there a modern Apache log analytics solution based on modern storage technologies like NoSQL, or at least one somehow ready to cope with large datasets efficiently? I am primarily looking for something that generates nice sortable and searchable output focused on web analytics, without my having to write my own frontends (so graylog2 is not an option). This question is purely about log-file-based solutions.

    Read the article

  • How to check that I have recovered from Penguin 2.0?

    - by Simon Walker
    I have a 3-year-old website which was hit by Penguin 2.0 in May; traffic dropped almost 30%. I have been working hard on the website for the last 2.5 months, and its traffic recovered in the last week of August. In fact, I am receiving more traffic than ever. When I look at the stats, I find the site's search engine visibility has improved: it now appears for more search queries, and its impressions have also increased. What worries me is that the site is nowhere in the top 5 pages for keywords with high competition and the highest search volume. They are few in number, but important. Should I consider my current situation a recovery, or only a partial recovery? And if it is only partial, how come traffic is higher than it was before Penguin 2.0?

    Read the article

  • How exactly is Google Webmaster Tools measuring "Site Performance"?

    - by Rémi
    I've been working for two months now on improving our response time (mainly server side) on a new forum (a brand-new product from a technical point of view) we launched in Germany a few months ago, and I'm quite surprised by the results I get. I monitor our response time using Apache logs and our own implementation of a Boomerang beacon. Using my stats, I can see that our new product responds in about 680 ms, where our old product responded in about 1050 ms. On the other side, Google Webmaster Tools tells us that our pages have an average response time of about 1500 ms today, where it was 700 three months ago with our old product.

    I figured GWT was taking client-side metrics into account, so I added some measures to our Boomerang beacon, and everything looks just fine. I've also run some random pages through YSlow and Google's Page Speed, and everything looks better than it was before. We even have an 82% on Google's Page Speed tool, which is quite cool for a site with some ads in it :)

    Lately, we signed a deal with Akamai to use two of their products: CDN for our static files (we were using another CDN before, but it wasn't very effective) and RMA to improve network routes. We have also introduced a new aggressive cache mechanism to ensure that most of the pages served to crawlers are cached by our memcache grid. After checking my metrics, it seems these changes have improved response time from 650 ms to about 500 ms, which is good (still not great, but definitely an improvement). But Webmaster Tools continues to report an increasing average response time where we see it decreasing over the same period.

    Have you ever seen the same kind of weird behavior on your sites while doing performance improvements? Do you have any idea how to monitor the same thing Google does with Site Performance in Google Webmaster Tools, so that we could improve our site and constantly check it against what Google wants?

    Edit 2011/07/26: Thanks for your answers, guys! Nevertheless, I was not precise enough. The main issue we have is not with the Site Performance page but with the Crawl Stats one, for now. We probably found an issue on our side with some very slow pages (around 3000 ms!!) and we are trying to fix them. I'll keep you posted as soon as I have some info. Thanks again!
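
    For reference, the client-side half of that measurement is typically wired up like this with the open-source boomerang library (the script path and beacon endpoint are placeholders for wherever the timings are collected):

        <script src="/js/boomerang.js"></script>
        <script>
          // Report real visitors' page-load timings to our collector.
          BOOMR.init({ beacon_url: "/beacon" });
        </script>

    Comparing such real-user timings against GWT is the right instinct, since Site Performance reportedly drew on real-user data (via the Google Toolbar) rather than on Googlebot's fetch times.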

    Read the article

  • Files not refreshing after being uploaded and overwritten via FTP

    - by guisasso
    I have been having some trouble updating my website this morning. After editing some files, in particular a CSS file, and uploading them to the server, the changes would not appear, as if the file had not been overwritten. The file size on my local machine was 3.244 kb, but 3.080 kb on the server. Even after deleting the whole folder and uploading everything again, same error. Answer: Cloudflare; its cache was still serving the old file.

    Read the article

  • How does hreflang interact with geo targeting?

    - by zakgottlieb
    If I have multiple subfolders that I wish to target at different countries, I'm thinking the ideal setup would be to specify rel="alternate" hreflang with a language AND country code (e.g. en-AU), and ALSO to geotarget that subfolder to the particular country. That way, the pages would show up both in the country-specific results (accessed via Search Tools) because of hreflang, AND in the more generic country results from regular searches, because of geotargeting. Is this correct? P.S. What would happen if you geotargeted a subfolder which had e.g. a pt-BR hreflang value (i.e. Portuguese-Brazil) to just Portugal?
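
    For concreteness, the hreflang half of that setup would look like this on each page, with placeholder URLs (each targeted page lists every alternate, itself included):

        <link rel="alternate" hreflang="en-AU" href="http://www.example.com/au/page/" />
        <link rel="alternate" hreflang="en-GB" href="http://www.example.com/gb/page/" />

    Geotargeting of the /au/ and /gb/ subfolders is then set separately, folder by folder, in Webmaster Tools.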

    Read the article

  • Google sitemap hreflang tag without the main site URL

    - by Rashmi Pandit
    We have websites with multilingual content, e.g.:

    http://www.example.com/about-us/
    http://www.example.com/en-HK/about-us/
    http://www.example.com/en-GB/about-us/
    http://www.example.com/zn-CH/about-us/

    We need to configure the hreflang tags in the sitemap for Google to know that there are alternate links for the same pages in different languages. I know that for the above example my sitemap url tag would look like this:

        <url>
          <loc>http://www.example.com/about-us</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/about-us"/>
          <xhtml:link rel="alternate" hreflang="en-HK" href="http://www.example.com/en-HK/about-us"/>
          <xhtml:link rel="alternate" hreflang="zn-CH" href="http://www.example.com/zn-CH/about-us"/>
          <changefreq>daily</changefreq>
          <priority>0.8</priority>
        </url>

    However, if I don't have the main URL, just the last three with en-HK, en-GB and zn-CH, how should my url tag look? Should I just skip the loc tag and keep the three xhtml:link tags? Or can I specify any URL in the loc tag and put the remaining two in xhtml:link tags? I am new to Google sitemaps; any help is greatly appreciated. Thanks, Rashmi

    Edit: From the answer posted at http://stackoverflow.com/questions/18423624/sitemap-for-domain-with-multilanguage-site/18423803#18423803, for my example with sites in en-HK, en-GB and zn-CH, should there be three url tags, with each of them assigned to loc and with the other two in xhtml:link?
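
    Following that answer, a sketch of one of the three url entries, keeping the question's language codes as written (loc is never skipped, and each entry lists all alternates, itself included):

        <url>
          <loc>http://www.example.com/en-HK/about-us/</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/about-us/"/>
          <xhtml:link rel="alternate" hreflang="en-HK" href="http://www.example.com/en-HK/about-us/"/>
          <xhtml:link rel="alternate" hreflang="zn-CH" href="http://www.example.com/zn-CH/about-us/"/>
        </url>

    The same pattern repeats for the en-GB and zn-CH entries, each with its own URL in the loc tag.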

    Read the article
