Search Results

Search found 9722 results on 389 pages for 'dvc pro'.


  • 301 redirects in main navigation menu of WordPress website - is this okay for SEO?

    - by Lewis Bassett
    I want to allow a client to have a flexible way to configure the navigation menu for his WordPress website. To that end, I have created a parent page called "Navigation", which has child pages for each page to be displayed in the navigation menu. Those pages then get 301 redirected to the actual page that should be served. This means the client can create pages freely, and then set up redirects for them as and when needed. This is a really easy way for him to manage his main menu and it works well. From an SEO point of view, is this okay? Will the pages be indexed fine?
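
    For context, one way such 301s are commonly issued in WordPress (whether by hand or by a redirect plugin) is a template_redirect hook that maps each "Navigation" child page to its real target. A rough sketch with an entirely hypothetical mapping:

        <?php
        // Hypothetical example: 301 "Navigation" child pages to their real targets.
        add_action('template_redirect', function () {
            $targets = array(
                'navigation/services' => '/our-services/',   // assumed slugs
                'navigation/contact'  => '/contact-us/',
            );
            $path = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
            if (isset($targets[$path])) {
                wp_redirect(home_url($targets[$path]), 301);
                exit;
            }
        });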

    Read the article

  • http to https upgrade -- SEO troubles

    - by SLIM
    I upgraded my site so that all pages have gone from http to https. I didn't consider that Google treats https pages differently from http. I re-created my sitemap so that all links now reflect the new https URLs and let it be for a few days. (Whoops!) Google is now re-indexing all the https pages. I have about 19k pages on the site, and Google has already indexed about 8k of the new https URLs. The problem is that Google sees all of these as brand-new pages, when many of them have a long http history. Most of you will recognize the problem: I didn't set up a 301 from the old http URLs to the new https ones. Is it too late to do this? Should I switch my sitemap back to http and then 301 to the new https? Or should I leave the sitemap as is and set up 301 redirects anyway? I'm not even sure Google is still trying to reach the http site. Currently the site is doing 303 redirects (from http to https), although I haven't figured out why yet. Thanks for any suggestions you can offer.
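
    If the site runs on Apache with mod_rewrite, the usual way to add the missing redirect is a blanket http-to-https 301 in .htaccess. A minimal sketch, assuming Apache is in use and https is served from the same host:

        # .htaccess -- send every http:// request to its https:// equivalent
        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    With this in place the old http URLs point search engines at the https versions instead of competing with them.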

    Read the article

  • Ranking drop after using reverse proxy for blog subdirectory and robots.txt for old blog subdomain

    - by user40387
    We have a 3Dcart store and a WordPress blog hosted on a separate server. Originally, we had a CNAME set up to point the blog to http://blog.example.com/. However, in our attempt to boost link-based and traffic-based authority on the main site, we've opted to do a reverse proxy to http://www.example.com/blog/. It's been about two months since we finished the reverse-proxy migration. Everything appears to be working as intended technically, including some robots and sitemap changes; the new URLs are even generating some traffic, as indicated in Google Analytics. While Google has been indexing the new URL locations, they're ranking very poorly, even for non-competitive, long-tail keywords. Meanwhile, the old subdomain URLs are still ranking mostly as well as they used to (even though they aren't showing meta titles and descriptions, due to being blocked by robots.txt). Our working theory is that Google has an old index of the subdomain URLs and is considering the new URLs to be duplicate content, since it's being told not to crawl the subdomain and therefore can't see the rel canonicals we have in place. To resolve this, we've updated the subdomain's robots.txt to no longer block crawling and indexing. Theoretically, seeing the canonical tag on the subdomain pages will resolve any perceived duplicate-content issues. In the meantime, we were wondering if anyone has any other ideas. We are very concerned that we'll be losing valuable traffic, as we're entering our on-season at the moment.
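
    For reference, the rel canonical mentioned above is a per-page link element on each old subdomain URL pointing at its new proxied location; once the subdomain is crawlable again, Google can see it. A minimal sketch using the placeholder domain from this question:

        <!-- served at http://blog.example.com/some-post/ -->
        <link rel="canonical" href="http://www.example.com/blog/some-post/" />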

    Read the article

  • Adding a regular expression in PHP does not work

    - by John Smiith
    I added ([a-zA-Z0-9\_\-]+) but it does not work. I want to include all CSS files; is there any other way to add them?

    My code, css.php:

        header("Content-type: text/css");
        $css = array(
            '([a-zA-Z0-9\\_\\-]+).css',
        );
        foreach ($css as $css_file) {
            $css_get = file_get_contents($css_file);
            echo $css_get;
        }

    call.php:

        <link href="css.php" rel="stylesheet" type="text/css" />

    I also want to rewrite css.php to css.css so the public sees css.css instead of css.php. How can I do that using a PHP script?
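
    A likely cause worth noting: file_get_contents() takes a literal file path, not a regular expression, so the pattern above never matches a real file. A minimal sketch of one alternative that concatenates every .css file in a directory using glob() (the css/ directory name is an assumption):

        <?php
        // css.php -- serve every stylesheet in the css/ directory as one response.
        // glob() expands shell-style wildcards; no regular expression is needed.
        header("Content-type: text/css");
        foreach (glob(__DIR__ . '/css/*.css') as $css_file) {
            echo file_get_contents($css_file), "\n";
        }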

    Read the article

  • Link tracking: Amazon or Google way

    - by Howard
    When building a shopping site, it helps to reference successful stores such as Amazon. In the area of link tracking, for example to see which section of your front page yields better conversion:

    Amazon way: generate a unique URL for each link on the front page, such as
        http://www.amazon.com/gp/product/B0083Q04IQ/ref=s9_pop_gw_g424_ir04/175-6575053-9292830?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-2&pf_rd_r=0AMJCKBBQA63EP0XHB86&pf_rd_t=101&pf_rd_p=1263340922&pf_rd_i=507846

    Google way: use Google Analytics:
        <a href="/products/abc" onClick="javascript: pageTracker._trackPageview('/from-main-menu/products/abc');">

    What are the pros and cons of the two approaches (besides Google requiring JS support)?
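
    The Amazon-style approach can be illustrated without any client-side script: each front-page link carries a tracking parameter and the landing page records it before rendering. A rough sketch in PHP (the ref values and log file are hypothetical):

        <?php
        // Landing page, e.g. /products/abc?ref=main-menu
        // Record which front-page link brought the visitor here.
        $ref  = isset($_GET['ref']) ? $_GET['ref'] : 'none';
        $line = sprintf("%s\t%s\t%s\n", date('c'), $ref, $_SERVER['REQUEST_URI']);
        file_put_contents(__DIR__ . '/link-clicks.log', $line, FILE_APPEND);

    The links themselves would then look like <a href="/products/abc?ref=main-menu">, with one ref value per front-page section.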

    Read the article

  • Multisite network SEO: can a self-referencing canonical tag (rel="canonical") inside an article improve Google ranking?

    - by user5674576
    Can a self-referencing canonical tag (rel="canonical") inside an article improve Google ranking? The case: a company has 40 sites with original content and one main site that republishes some of the 40 sites' articles. The main site has rel="canonical" on each article. Should the article on the original site also have a self-referencing rel="canonical"? Example:

        inside the main network site (reference to the other site):
        <link href="http://site7.com/article25" rel="canonical" />

        inside the original network site (self-reference):
        <link href="http://site7.com/article25" rel="canonical" />

    Thanks in advance

    Read the article

  • URL structure for content that is updated daily

    - by Brendon
    A small, simple site I am working on displays a single page with the day's best offers on it. The user is able to move back and forth between previous days. Which of the following URL structures works best?

        Structure 1
        /index.html      -- today's best offers
        /2013-06-29.html -- yesterday's best offers, etc.

        Structure 2
        /index.html      -- 302 redirects to /2013-06-30.html (or whatever today is)
        /2013-06-30.html -- today's best offers
        /2013-06-29.html -- yesterday's best offers, etc.

    I quite like Structure 2 from the user's point of view (they can share content easily), but I am a bit concerned about updating the redirect from /index.html every single day -- would this perhaps have unintended SEO consequences?
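
    For what it's worth, the daily redirect in Structure 2 does not have to be edited by hand if the host can run a script. A minimal sketch in PHP, assuming the day pages keep the /YYYY-MM-DD.html naming shown above:

        <?php
        // index.php -- always 302 to today's offers page.
        header('Location: /' . date('Y-m-d') . '.html', true, 302);
        exit;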

    Read the article

  • Garbled text in server logs

    - by Glenn Dayton
    I recently looked over my server's logs and I found a bunch of garbled text. Here is a link to the full log, and here is a snapshot of what it looks like: ¹^œÌÓûFF™ÃŒ-ôÚÏàÃÒNRs§cÝi ~F#J"|³Ôq0ã~QQbA ¼¹¦’š¶É3œßå<ú€Ç©XAwdL?R°ÝbÒt©ôÇ·Æ…÷q˜ÇѺ| Þ,߯¡Êr yR¤Q¹Jêlš‘AzP\ ¦ÂY„ÉÉ,æ™ U™»ì³ÔÝáCÿ42‹Ö.nŽÉ2%ÓN8i4Œ®¿‘•"-se•äŽ¿ÊÁ§€þ 8åv%'#Äpžs/ÙÍ:¡1ÑÖÃå ºu|Q®!ÏyÆ,­NR@¶ËȯRDkã=ÿÀܸ ›¼Ô ’ð>ÓÌBftdÃ8–é}‰[øbãÝÁ嘲b¾W n´tT­œpäNëëÔ ·RUÓP+ÅuKÁ£¬\âÌ®:J<ÍÁ0:Q%ª(Œ˜E-ÁI:ï™4®hæœT†«);°Çda@´#èì}‡£ü•{57ý]¼|øÓñð÷ÈÌð‡MkŠâ•C~$Óô#ÙV¾Núå.#Á]vôžóæ» V&8)%øVSž“±ÔQLåÓý1–ŽÃßQ$¹ýž")ÈûQcÄý_ÔüGP=s‹vq#Pmoo.tigertutorialscomµÐOKÃ0ð»Ÿâ‘ØH“ What is this? and is someone trying to do something to my website?

    Read the article

  • Is there evidence that linking to quality, reputable and popular website helps with ranking?

    - by JVerstry
    Is there any evidence that linking to external quality, reputable and popular websites helps with ranking (directly or indirectly)? Is there an established correlation? Some posts on the web claim it, but without providing any evidence. It is known that linking to bad neighborhoods can harm your reputation and authority, but does the reverse actually help? And does it matter whether the website is young or old in this case? Update: I have found a Moz video revealing a 0.04 correlation with ranking.

    Read the article

  • which is the best CMS for building a Video tutorial website? [duplicate]

    - by Rajesh Sankar R
    This question already has an answer here: How to choose a CMS system for a small web site? (4 answers) I am planning to set up a website where I will post how-to and training videos for various software packages. There will be several training videos under a single topic, and several topics under a single software training course. Similar to lynda.com, but on a smaller scale. Can you suggest a CMS for the job? Initially I plan to host the videos on YouTube and link them to my website, but I will move to my own hosting later on. It must be customizable, scalable and, most of all, OPEN SOURCE.

    Read the article

  • Is there a way of listing files for a directory if it contains index.html?

    - by fredley
    On my server (over which I have little control), directories are listed by default, so for mysite.com/images I get:

        Index of /images
        Parent Directory
        BirdsAreHere.png
        CanYouSpot-AdBlank.jpg
        etc.

    Is putting an index.html in that directory enough to prevent people from listing the files, or is there still a way of getting at that list? Is it the same for my web root directory (mysite.com)?
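
    If the server happens to be Apache and per-directory overrides are allowed, an empty index.html does hide the listing, but automatic indexes can also be switched off explicitly. A minimal sketch, assuming .htaccess files are honoured:

        # .htaccess in /images (or in the web root to cover every directory)
        # Requests for a folder with no index file now return 403 instead of a file list.
        Options -Indexes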

    Read the article

  • Multiple domain names with pages linking to one website

    - by Mark Ravenhill
    Hello, I work for a company that has been redesigning its website. I have been asked to register lots of domain names that contain the keywords they want to use on the original site. Each of these domain names will hold a one-page website with a description of what the company offers and a link along the lines of 'click here for more information', which then takes you to the main site. The idea is that the main site will then receive a lot of inbound links and hopefully rise in the Google rankings, not to mention bring in more customers who come to the site from all the other domain names and who wouldn't normally have reached the website because it wasn't ranked on the first page. Is this a good idea, or will Google see this as spam and penalise the main site for having lots of links to it from one-page websites hosted on the same nameserver? Any advice would be greatly appreciated. Thanks, Mark.

    Read the article

  • Can I import an existing member data used in old ASP to a new ASP.NET membership database? [closed]

    - by Rick Brown
    I have an old website that I designed and still maintain using old ASP that has a membership database (MS-SQL) that I built from scratch. It is a very simple database that has all the user information in one table (including login info and personal info) and then details and other odds and ends in other tables. It is WAY past time to upgrade this to .NET, especially since I need to add a Paypal payment system into it as soon as I can. I've designed several other sites with membership in .NET, but they have all been from scratch. Is there an easy way to transition from the old ASP site to a new .NET membership database without losing the data? There are hundreds of users with thousands of records relating to those users that I'd rather not lose, if possible. Any ideas on a relatively painless way to do this?

    Read the article

  • Google's process for publishing/modifying pages [closed]

    - by Glenn Dayton
    I'm assuming that a group of people at Google have control of certain sections of google.com, but how does Google make sure that employees don't accidentally or intentionally sabotage the website? Does Google use Adobe Contribute or some similar product for sharing and publishing the website? Do employees use WebDAV, FTP, SFTP, or SSH to publish the site? Since Google has hundreds of thousands of servers, it probably takes some time for its servers to update. Do they transmit the new copy of the website to all servers before publishing it at once? This question does not apply to Google editing a database and having a page reflect the database's changes. It applies to employees editing the source code and/or back end of the site.

    Read the article

  • How to secure robots.txt file?

    - by CompilingCyborg
    I would like user agents to index my relevant pages only, without accessing any directory on my server. As an initial thought, I had this version in mind:

        User-agent: *
        Disallow: */*
        Sitemap: http://www.mydomain.com/sitemap.xml

    My questions: Is it correct to block all directories like that, with Disallow: */*? Would search engines still be able to see and index my sitemap if I disallowed all directories? What are the best practices for securing the robots.txt file? For reference, here is a good tutorial for robots.txt:

        #Add this if you want to stop Alexa from indexing your site.
        User-agent: ia_archiver
        Disallow: /

        #Add this to stop duggmirror
        User-agent: duggmirror
        Disallow: /

        #Add this to allow specific agents
        User-agent: Googlebot
        Disallow:

        #Add this to allow all agents while blocking specific directories
        User-agent: *
        Disallow: /cgi-bin/
        Disallow: /*?*
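
    Worth noting: Disallow: */* is not part of the original robots.txt standard (rules are matched as URL-path prefixes), and crawlers that do understand * wildcards would likely read it as blocking every URL on the site. A sketch of a more conventional robots.txt that blocks specific directories while leaving pages and the sitemap reachable (the directory names are placeholders):

        User-agent: *
        Disallow: /cgi-bin/
        Disallow: /private/
        Disallow: /tmp/

        Sitemap: http://www.mydomain.com/sitemap.xml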

    Read the article

  • folder level 301 redirect without .htaccess

    - by Vinay
    I have a website hosted with Yahoo Small Business, so I don't have access to the .htaccess file. I have around 220 pages in a folder 'mysubfolder' (http://mysite.com/myfolder/mysubfolder), and the site is around 3 years old. I am planning to move all 220 pages from 'mysubfolder' to 'myfolder' (one level up). All the pages in 'mysubfolder' are indexed. What is the best way to do this so that it does not hurt the SEO?
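
    Where .htaccess is unavailable but PHP is, one common workaround is to have a small script at the old location issue the 301 itself. A rough sketch, assuming the old URLs can be pointed at (or replaced by) a PHP file:

        <?php
        // Placed so it handles requests for the old mysubfolder URLs.
        // Sends a permanent redirect to the same page name one level up.
        $page = basename(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));
        header('Location: /myfolder/' . $page, true, 301);
        exit;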

    Read the article

  • Does anyone know any good resources for learning how to market a web app?

    - by Jack Kinsella
    I'm a developer first and foremost. I write web apps but have a hard time generating traffic and converting potential users once I've released my product into the wild. I know I need to learn more about marketing but I don't know where to start as I've no baseline to judge the quality of the materials I stumble across. Does anyone know any websites, blogs, e-books or other resources for learning how to market effectively?

    Read the article

  • Files not refreshing after being uploaded and overwritten via FTP

    - by guisasso
    I have been having some trouble updating my website this morning. After editing some files, in particular a CSS file, and uploading them to the server, the changes would not show up, as if the file had not been overwritten. The file size on my local machine was 3.244 kb and on the server 3.080 kb. Even after deleting the whole folder and uploading everything again, the same problem occurred. Answer: CloudFlare.

    Read the article

  • schema.org 'reviewRating' tag not recognized by google snippet testing tool

    - by saravanak
    I'm trying to add more structural information to my web pages by using the microdata format suggested at www.schema.org. The procedure seems straightforward, but I'm having issues validating my results in the Google Rich Snippets Testing Tool. Check out this review page: here I'm using the 'reviewRating' property to specify rating values for that particular review. I followed the same format as defined at schema.org/Rating, but this markup fails validation in Google's rich snippet testing tool with the following error info:

        Item Type: http://schema.org/rating
        reviewrating = 5
        ratingvalue = 5
        Warning: Property "reviewrating" was not found.
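
    The lower-case names in the tool output (reviewrating, ratingvalue) hint at the usual cause: schema.org property names are case-sensitive. A minimal sketch of a Review with a nested Rating in camelCase, with placeholder text and values:

        <div itemscope itemtype="http://schema.org/Review">
          <span itemprop="name">Great product</span>
          <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
            <!-- camelCase matters: reviewRating, ratingValue, bestRating -->
            <span itemprop="ratingValue">5</span> out of
            <span itemprop="bestRating">5</span>
          </div>
        </div>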

    Read the article

  • Forward .html/.htm to .php with web.config

    - by PhilipK
    I'm moving a site from my Linux-hosted server to a client's Windows-hosted server. The .htaccess file no longer works, and I'm told that Windows servers use web.config instead. How can I forward all users accessing .html and .htm files to the equivalent .php file? Server info:

        OS/Hosting Type: Windows / Shared Hosting
        .Net Runtime Version: ASP.Net 2.0/3.0/3.5
        PHP Version: PHP 5.2
        IIS Version: IIS 7.0
        Data Center: US Regional

    EDIT: Hosting is provided by GoDaddy. I was told by a friend that the following should work, but it has no effect on the site:

        <configuration>
          <system.webServer>
            <handlers>
              <add name="PHP-FastCGI" verb="*" path="*.html" modules="FastCgiModule"
                   scriptProcessor="c:\php\php-cgi.exe" resourceType="Either" />
            </handlers>
          </system.webServer>
        </configuration>
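
    Worth noting: the handler entry above maps .html files to the PHP interpreter; it does not redirect visitors to the .php URLs. On IIS 7 that kind of redirect is normally written with the URL Rewrite module, assuming the host has it installed. A minimal sketch:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- 301 any .html or .htm request to the same path with .php -->
                <rule name="HtmlToPhp" stopProcessing="true">
                  <match url="^(.*)\.html?$" />
                  <action type="Redirect" url="{R:1}.php" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>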

    Read the article

  • How does hreflang interact with geo targeting?

    - by zakgottlieb
    If I have multiple subfolders that I wish to target at different countries, I'm thinking the ideal setup would be to specify rel="alternate" hreflang with a language AND country code (e.g. en-AU) and ALSO to geotarget that subfolder to the particular country. That way, the pages would show up both in the country-specific results (accessed via Search Tools), because of hreflang, AND in the more generic country results from regular searches, because of geotargeting. Is this correct? P.S. What would happen if you geotargeted a subfolder that had e.g. a pt-BR hreflang value (i.e. Portuguese-Brazil) to Portugal instead?
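
    For reference, the per-page form of the annotation being discussed is a set of rel="alternate" hreflang link elements in the head, each version listing the others (the URLs below are placeholders):

        <link rel="alternate" hreflang="en-AU" href="http://www.example.com/en-AU/page/" />
        <link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/page/" />
        <link rel="alternate" hreflang="x-default" href="http://www.example.com/page/" />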

    Read the article

  • Google sitemap HrefLang tag without the main site url

    - by Rashmi Pandit
    We have websites with multilingual content, e.g.:

        http://www.example.com/about-us/
        http://www.example.com/en-HK/about-us/
        http://www.example.com/en-GB/about-us/
        http://www.example.com/zn-CH/about-us/

    We need to configure the hreflang tags in the sitemap so that Google knows there are alternate links for the same pages in different languages. I know that for the above example my sitemap url tag would look like this:

        <url>
          <loc>http://www.example.com/about-us</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/about-us"/>
          <xhtml:link rel="alternate" hreflang="en-HK" href="http://www.example.com/en-HK/about-us"/>
          <xhtml:link rel="alternate" hreflang="zn-CH" href="http://www.example.com/zn-CH/about-us"/>
          <changefreq>daily</changefreq>
          <priority>0.8</priority>
        </url>

    However, if I don't have the main URL but just the last three (en-HK, en-GB and zn-CH), how should my url tag look? Should I just skip the loc tag and keep the three xhtml:link tags? Or can I specify any URL in the loc tag and put the remaining two in xhtml:link tags? I am new to Google sitemaps. Any help is greatly appreciated. Thanks, Rashmi

    Edit: Going by the answer posted at http://stackoverflow.com/questions/18423624/sitemap-for-domain-with-multilanguage-site/18423803#18423803, for my example with sites in en-HK, en-GB and zn-CH, should there be three url tags, each with its own loc and the other two versions in xhtml:link tags?

    Read the article

  • How to prevent Google from indexing non-domain URL of website?

    - by Gavin
    My webhost gives you two URLs for your website: the URL on your shared server, which is something like usr283725992783.webhost.com, and your domain URL, which is www.example.com. Google is indexing both of these URLs, but obviously I only want www.example.com to be indexed. I can't add "nofollow" tags to usr283725992783.webhost.com because that URL serves the same files as www.example.com. How can I get Google to stop indexing usr283725992783.webhost.com while continuing to index www.example.com?
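
    Since both hostnames serve the same files, one option that needs no separate content is a per-page canonical link that always names the real domain, whichever host the request came in on. A rough sketch in PHP, assuming the pages share a head template that can be edited:

        <?php
        // In the shared <head> template: point the canonical at the real domain,
        // even when the page was requested via the webhost's shared-server URL.
        $canonical = 'http://www.example.com' . $_SERVER['REQUEST_URI'];
        echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '" />';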

    Read the article

  • Wordpress Multisite and Google Analytics in subfolders with mapped domains

    - by David
    I have a WordPress multisite using sub-folders. The sub-folders are mapped to domains, which are set as primary. I'm using the 'Google Analytics Multisite Async' code to track things. From what I can see it's tracking the sites fine (each site gets page hits in Google Analytics), except for the original site in the multisite, whose content overview lists the other domains and the amount of traffic they're getting along with the original domain's traffic. I don't want to track any traffic for my original site other than what actually goes to it, i.e. I don't want it tracking the other sites in the multisite. E.g. domain1.com is my original site and I have lots of other sites in the multisite, say domain2.com and domain3.com; in the content overview in Analytics it's listing, say, domain2.com as content. Can I tell it to filter these out somehow, either in Analytics or within WordPress? Hopefully I've explained that clearly!

    Read the article

  • AdSense Custom Search Ads - custom query

    - by Alex
    I'm trying to set up a Custom Search Ad, but I am not sure about the query. The implementation guide (https://developers.google.com/custom-search-ads/docs/implementation-guide) says: 'query' should be dynamic based on your page. This variable targets the ads and therefore should always match what the user on your site has just performed a search for. Now, what I understand is: I have to program my page so that the query variable contains some custom words. Am I right? If a user gets to my site by clicking on an AdSense ad, there is no way to "know" what the user searched for and set my query accordingly, right? Thanks for any help!
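
    As far as the first question goes, yes: the page has to fill in the query value itself, typically from whatever the visitor searched for on your own site. A rough sketch, assuming the search term arrives in a q parameter and using the pageOptions/_googCsa pattern from the implementation guide linked above (the pubId is a placeholder, and the guide's ad-loader script must also be included on the page):

        <?php $query = isset($_GET['q']) ? $_GET['q'] : ''; ?>
        <div id="adcontainer1"></div>
        <script>
          // Pass the visitor's own search term into the ad request.
          var pageOptions = {
            "pubId": "partner-pub-0000000000000000",        // placeholder
            "query": <?php echo json_encode($query); ?>
          };
          var adblock1 = { "container": "adcontainer1" };
          _googCsa('ads', pageOptions, adblock1);
        </script>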

    Read the article
