Search Results

Search found 9717 results on 389 pages for 'pro metedor'.


  • How to prevent duplication of content on a page with too many filters?

    - by Vikas Gulati
    I have a webpage where a user can search for items based on around six filters. Currently I have the page implemented with one filter as the base filter (the part of the URL that gets indexed) and the other filters in the form of hash URLs (which won't get indexed). This keeps duplication low. Something like this: example.com/filter1value-items#by-filter3-filter3value-filter2-filter2value As you can see, only one filter is within reach of the search engine while the rest are hashed, so I could have six indexable pages. The problem is that I expect users to use two filters at times while searching. As per my analysis with the Google Keyword Analyzer, a fair number of users might use two filters in conjunction. So how should I go about it? Having all the filters as part of the URL would simply explode the number of pages, and sticking to the current approach wouldn't let me target those users. I was thinking of going with at most two base filters and the rest as part of the hash URL, but the only thing stopping me is that it would lead to duplication of content as per Google Webmaster Tools' suggestions on URL structure.
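
    To make this concrete, here is the kind of markup I have in mind if I do promote a second base filter and Google flags the combined page as a duplicate (the URLs are hypothetical, just a sketch of one option):

      <!-- Hypothetical two-base-filter page: example.com/filter1value-filter2value-items -->
      <!-- One option would be to declare the single-filter page as canonical: -->
      <link rel="canonical" href="http://example.com/filter1value-items" />

    The trade-off, as I understand it, is that the combined page would then not rank on its own for the two-filter query, so this only addresses the duplication warning, not the targeting goal.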

    Read the article

  • Do web crawlers/spiders index azure web sites?

    - by Clay Shannon
    For somebody who wants their web site to be as discoverable as possible (and who doesn't?), are Microsoft's Azure web sites (azurewebsites.net) a feasible place to host? I have a site that is both on azurewebsites.net and hosted under a completely different name at discountasp.net. Both of these sites are exactly the same except for the URL; whenever I update the code, I republish the site to both places. So obviously, they both have the same H1 and H2 elements. Searching for the content of my H1 tag, I find my .com site listed #3 on Google and #2 on both Bing and Yahoo; OTOH, my azurewebsites.net site doesn't show up on the first page at all, in any of them. This makes me wonder if azurewebsites.net should only be used for Web API hosting and the like, not for generic/commercial "public" sites. Are my conclusions valid?

    Read the article

  • Is there evidence that linking to quality, reputable and popular websites helps with ranking?

    - by JVerstry
    Is there any evidence that linking to external quality, reputable and popular websites helps with ranking (directly or indirectly)? Is there an established correlation? Some posts on the web do claim it, but without providing any evidence. It is known that if your website links to a bad neighborhood this will harm your reputation and authority, but does the reverse actually help? And does it matter whether the website is young or old in this case? Update: I have found a Moz video revealing there is a 0.04 correlation with ranking.

    Read the article

  • Finding a Payment Gateway?

    - by Lynda
    I have a client who would like to sell glass pipes online. The problem I run into is with the payment gateway. Glass pipes fall into one of two categories: drug paraphernalia or tobacco products. This leads me here to ask: does anyone know of a payment gateway that will process payments for glass pipes? Note: doing some Google searching, I read that Authorize.net will accept tobacco, but when I spoke with them they said they do not.

    Read the article

  • How exactly is Google Webmaster Tools measuring "Site Performance"?

    - by Rémi
    I've been working for two months now on improving our response time (mainly server side) on a new forum (a brand new product from a technical point of view) we launched in Germany a few months ago, and I'm quite surprised by the results I get. I monitor our response time using Apache logs and our own implementation of a Boomerang beacon. Using my stats, I can see that our new product responds in about 680 ms where our old product was responding in about 1050 ms. On the other side, Google Webmaster Tools tells us that our pages have an average response time of about 1500 ms today, where it was 700 ms three months ago with our old product. I figured that GWT was taking client-side metrics into account, so I added some measures to our Boomerang beacon and everything looks just fine. I've also run some random pages through YSlow and Google's Page Speed and everything looks better than it was before. We even have an 82% score on Google's Page Speed tool, which is quite good for a site with some ads on it :) Lately, we signed a deal with Akamai to use two of their products: a CDN for our static files (we were using another CDN before, but it wasn't very effective) and RMA to improve network routes. We have also introduced a new aggressive cache mechanism to ensure that most of the pages served to crawlers are cached by our memcache grid. After checking my metrics, it seems that these changes have improved response time from about 650 ms to about 500 ms, which is good (still not great, but definitely an improvement). But Webmaster Tools continues to report an increasing average response time while we see it decreasing over the same period. Have you ever seen this kind of weird behavior on your sites while doing performance improvements? Do you have any idea how to monitor the same thing Google measures with Site Performance in Google Webmaster Tools, so that we can improve our site and constantly check whether it is what Google wants? Edit 2011/07/26: Thanks for your answers! Nevertheless, I was not precise enough. The main issue we have is not with the Site Performance page but with the Crawl Stats one for now. We probably found an issue on our side with some very slow pages (around 3000 ms!) and we are trying to fix them. I'll keep you posted as soon as I have more information. Thanks again!

    Read the article

  • How do I allow e-mail to be relayed through this MTA?

    - by BlueToast
    When I try to send an e-mail using authenticationless relay via telnet, I receive the error message "553 sorry, that domain isn't allowed to be relayed thru this MTA (#5.7.1)" in response to rcpt to:[email protected]. How can I allow a specific domain to be whitelisted and allowed through the MTA? There is only one domain I am trying to relay e-mails to (and that domain uses a totally different, independent and standalone mail server running IceWarp). Here is the session:

      220 mail4.myhsphere.cc ESMTP
      ehlo sisterwebsite.com
      250-mail4.myhsphere.cc
      250-PIPELINING
      250-8BITMIME
      250-SIZE 41943040
      250-AUTH LOGIN PLAIN CRAM-MD5
      250 STARTTLS
      mail from:[email protected]
      250 ok
      rcpt to:[email protected]
      553 sorry, that domain isn't allowed to be relayed thru this MTA (#5.7.1)
      rcpt to:[email protected]
      553 sorry, that domain isn't allowed to be relayed thru this MTA (#5.7.1)
      rcpt to:[email protected]
      553 sorry, that domain isn't allowed to be relayed thru this MTA (#5.7.1)
      rcpt to:[email protected]
      250 ok
      data
      354 go ahead
      To: [email protected]
      From: [email protected]
      Subject: Test mail -- please ignore

      Test, please ignore this Jane
      Sincerely,
      BlueToast
      .
      250 ok 1350407684 qp 22451
      quit
      221 mail4.myhsphere.cc
      Connection to host lost.

      C:\Users\genericaccount

    Not sure what to do. I did some Googling but I'm having a hard time finding relevant results. Most of the search results I get are about receiving mail, but I am trying to send mail. mail.sisterwebsite.com = mail4.myhsphere.com. We use FluidHosting for the e-mail on sisterwebsite.com. (Repeating the question just in case:) How can I allow a specific domain to be whitelisted and allowed through the MTA?
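
    For what it's worth, since the EHLO response advertises AUTH LOGIN PLAIN CRAM-MD5, my understanding is that the server expects the session to authenticate before MAIL FROM rather than whitelisting recipient domains. A minimal sketch of what I think an authenticated session would look like (the base64 values below are placeholders for the mailbox username and password; the exact response wording varies by server):

      ehlo sisterwebsite.com
      ...
      auth login
      334 VXNlcm5hbWU6                      (base64 for "Username:")
      <base64-encoded mailbox username>
      334 UGFzc3dvcmQ6                      (base64 for "Password:")
      <base64-encoded password>
      235 authentication successful
      mail from:<sender address>
      250 ok
      rcpt to:<recipient at the other domain>
      250 ok

    Whether that is actually how this particular MTA is configured is an assumption on my part.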

    Read the article

  • Ditch cPanel / WHM in favour of manual setup

    - by BWRic
    We currently use cPanel / WHM on a reseller account but are looking at getting a dedicated server. My first thought was to duplicate this setup on the dedicated box to allow us to quickly create new accounts. It will be a managed server, so the host will have set up the LAMP stack. I'm wondering whether I actually need cPanel and WHM. We don't use many of their features, just creating accounts and databases, and clients do not have FTP access. I'm no sysadmin and come from a Windows / GUI background, but I have some knowledge of setting up development servers.

    WHM: Creating accounts. I presume this sets up the Apache virtual host, FTP access and DNS settings. I have some knowledge of editing the Apache files to create virtual hosts. Am I correct in thinking that as long as the DNS points to the server IP and the virtual host is configured, the server can serve the (PHP) pages? I'm not sure I need per-site FTP access, as only we will have access, so I could have server-wide access to the htdocs directories to manage all the sites. The company supplying the dedicated host also provides its own DNS management tool, so I wouldn't need the cPanel one.

    MySQL: Creating users and databases. We use cPanel to create the MySQL users and databases. As it's a dedicated box and I can have root access, I think this could be replaced by SQLyog for database management and phpMyAdmin for user management.

    Do I need cPanel, or can I get by editing a few text files to create the accounts and then use the MySQL tools for the databases? Or am I missing something major about how the sites are configured?
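
    To make the question concrete, this is roughly the kind of virtual host block I imagine I'd be maintaining by hand for each new account (a minimal sketch; the domain, paths and commands are made up and assume a Debian/Ubuntu-style Apache layout):

      # /etc/apache2/sites-available/clientsite.example.conf  (hypothetical)
      <VirtualHost *:80>
          ServerName clientsite.example
          ServerAlias www.clientsite.example
          DocumentRoot /var/www/clientsite.example/htdocs
          ErrorLog /var/www/clientsite.example/logs/error.log
          CustomLog /var/www/clientsite.example/logs/access.log combined
      </VirtualHost>
      # then enable and reload, e.g.:  a2ensite clientsite.example && service apache2 reload

    Is that all "creating an account" really amounts to once DNS points at the box, or does WHM do more behind the scenes?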

    Read the article

  • AdSense Custom Search Ads - custom query

    - by Alex
    I'm trying to set up a custom search ad, but I am not sure about the query. The implementation guide (https://developers.google.com/custom-search-ads/docs/implementation-guide) says that 'query' should be dynamic based on your page: "This variable targets the ads and therefore should always match what the user on your site has just performed a search for." Now, what I understand is: I have to program my page so that the query variable contains some custom words. Am I right? If a user gets to my site by clicking on an AdSense ad, there is no way to "know" what the user searched for and set my query accordingly, right? Thanks for any help!
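
    In case it helps frame the question, this is the kind of thing I had in mind for populating the query value myself, e.g. from my own site's search parameter or the page topic. A rough PHP sketch; the parameter and variable names are placeholders of my own, not anything taken from the Google docs:

      <?php
      // Hypothetical: derive a query string for the ad unit from what we know
      // about the current page, since we can't see the visitor's original search.
      if (!empty($_GET['q'])) {
          $adQuery = $_GET['q'];        // our own on-site search term, if any
      } else {
          $adQuery = 'red widgets';     // fallback: the page's topic or category
      }
      ?>
      <script>
        // The value is then echoed into whatever variable the Custom Search Ads
        // snippet expects for 'query' (see the implementation guide linked above).
        var pageLevelQuery = "<?php echo htmlspecialchars($adQuery, ENT_QUOTES); ?>";
      </script>

    Is that the intended approach, or is there some way the ad code can infer the query on its own?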

    Read the article

  • Web Hosting Checklist

    - by Chris
    Hello, I am a web developer who is starting to look into hosting his own website. I would like to showcase my programming skills (PHP, MySQL, C#, WordPress). I'm OK with the languages themselves, but the actual hosting side is where my knowledge starts to get a little shaky. I know the basics (bandwidth, sub-domains, rewrite rules), but I would love your input to help me formulate a checklist of features I should look for in a web-hosting service. Also, I was wondering if there are any reliable hosting providers who give you the option to host both C# code-behinds and PHP code, as I would like to have two versions of my site, one in C# and one in PHP. The hope is that if I ever need to look for another job, this website will help me show possible employers my server-side knowledge. I hope this is enough info. I did some research online but found a bunch of useless articles, and I've always had luck on the Stack Exchange sites, so hopefully you can help me. Thanks a lot.

    Read the article

  • Creating a Template-Like System in cPanel

    - by clifgray
    I am creating a medium-sized website using cPanel and its File Manager, and the majority of my pages are going to be the same apart from a different title and content section. I wanted to see if there is a system for making one general template file and having all the other pages inherit from it, so that each page only has to provide its title and content, and the rest of the links, headers and whatnot can be changed throughout the site by editing one file. Is there anything like this? I have used Jinja2 in Python and a few other template systems for other server scripting languages, but I am not sure how to implement something similar with cPanel.
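
    For context, what I'm after is something like the plain PHP include approach sketched below (the file names are made up), unless there is a more cPanel-native way of doing it:

      <?php /* ---- header.php (hypothetical shared template) ---- */ ?>
      <!DOCTYPE html>
      <html>
      <head><title><?php echo htmlspecialchars($pageTitle); ?></title></head>
      <body>
        <nav><!-- shared links/header live here, edited in one place --></nav>

      <?php /* ---- footer.php (hypothetical shared template) ---- */ ?>
        <footer><!-- shared footer --></footer>
      </body>
      </html>

      <?php /* ---- about.php - an individual page only sets its title and content ---- */ ?>
      <?php $pageTitle = 'About Us'; include 'header.php'; ?>
      <h1>About Us</h1>
      <p>Page-specific content goes here.</p>
      <?php include 'footer.php'; ?>

    Would that work under cPanel's File Manager, or is there a better-supported templating mechanism?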

    Read the article

  • Republishing blog posts on a popular website

    - by Giorgi
    I started my blog about programming yesterday, and in order to promote it and increase traffic I submitted my RSS feed to CodeProject, which pulls my posts and publishes them on CodeProject. While this increases the number of people reading my posts (though they are reading them at CodeProject), I am worried that Google will penalize my site for duplicate content, especially considering that CodeProject has much more reputation than my new website. The post at CodeProject has a link back to my blog post, but it does not have "rel=canonical". So my question is: which is better, a link from a high-reputation website and some traffic, or should I remove my posts from CodeProject so that my blog is not penalized? What if CodeProject adds "rel=canonical" to the link?
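
    For reference, what I mean by "rel=canonical" is the republished copy declaring my post as the original, roughly like this (the URLs are placeholders):

      <!-- In the <head> of the copy hosted on CodeProject (hypothetical URLs) -->
      <link rel="canonical" href="http://myblog.example.com/2012/08/my-original-post/" />

    My understanding is that a cross-domain canonical like this tells Google which copy is the original, which is not quite the same thing as putting the attribute on the backlink itself.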

    Read the article

  • Having a good domain name and using domain aliases (I use notlong.com)?

    - by Michal P.
    I use only free servers, and after creating my website (http://pundaquit.republika.pl) I decided to make it accessible through a simpler domain name. I decided to use the domain alias service http://notlong.com/ and have the simple domain name http://pundaquit.notlong.com. The second advantage of using an alias here was to be independent of my file host, which I will have to change. I haven't found a better alias service than notlong, because notlong.com is easy to remember. After that I encountered many problems: most forums and social services treat the notlong address as spam, and Bing so far hasn't accepted the http://pundaquit.notlong.com domain, among others. Is there another way to have a good free domain name? And what about the situation when your hosting provider tells you your account is about to expire? Only a lasting layer of domain aliases makes you independent of the real file hosts.

    Read the article

  • Can I import an existing member data used in old ASP to a new ASP.NET membership database? [closed]

    - by Rick Brown
    I have an old website that I designed and still maintain using classic ASP, with a membership database (MS SQL) that I built from scratch. It is a very simple database with all the user information in one table (including login info and personal info) and details and other odds and ends in other tables. It is WAY past time to upgrade this to .NET, especially since I need to add a PayPal payment system as soon as I can. I've designed several other sites with membership in .NET, but they have all been from scratch. Is there an easy way to transition from the old ASP site to a new .NET membership database without losing the data? There are hundreds of users, with thousands of records relating to those users, that I'd rather not lose if possible. Any ideas on a relatively painless way to do this?

    Read the article

  • Multiple domain names with pages linking to one website

    - by Mark Ravenhill
    Hello, I work for a company that has been redesigning its website. I have been asked to register loads of domain names containing the keywords they want to use on the original site. Each of these domain names will hold a one-page website with a description of what the company offers and a link saying something along the lines of 'click here for more information', which then takes you to the main site. The idea is that the main site will then receive a lot of inbound links and hopefully rise in the Google rankings, not to mention bring in more customers who arrive via the other domain names and who wouldn't normally have found the website because it wasn't ranked on the first page. Is this a good idea, or will Google see it as spam and penalise the main site for having loads of links to it from one-page websites hosted on the same nameserver? Any advice would be greatly appreciated. Thanks, Mark.

    Read the article

  • Constructive criticism for my bounce rate being so high [closed]

    - by Daniel
    The bounce rate on my website's product pages is 80%, which is terrible. Could you offer any opinions on whether you consider the user experience to be bad, and how I could possibly improve it? Other pages, such as the home and category pages, have acceptable bounce rates, but the vast majority of my traffic lands on the product pages. I already tried removing some Google ads for a couple of days, but this didn't seem to help at all. I'm working on doing A/B testing at the moment. (It's tricky, as the site is based on a CMS - I custom coded the [Joomla] component, so hopefully I can get this testing working.)

    Read the article

  • How to secure robots.txt file?

    - by CompilingCyborg
    I would like user agents to index my pages only, without accessing any directory on my server. As an initial thought, I had this version in mind:

      User-agent: *
      Disallow: */*
      Sitemap: http://www.mydomain.com/sitemap.xml

    My questions: Is it correct to block all directories like that (Disallow: */*)? Would search engines still be able to see and index my sitemap if I disallowed all directories? What are the best practices for securing the robots.txt file? For reference, here is a good tutorial for robots.txt:

      #Add this if you want to stop Alexa from indexing your site.
      User-agent: ia_archiver
      Disallow: /
      #Add this to stop duggmirror
      User-agent: duggmirror
      Disallow: /
      #Add this to allow specific agents
      User-agent: Googlebot
      Disallow:
      #Add this to allow all agents while blocking specific directories
      User-agent: *
      Disallow: /cgi-bin/
      Disallow: /*?*
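
    For what it's worth, my understanding is that */* is not part of the original robots.txt standard; wildcard-aware crawlers such as Googlebot treat a pattern like the sketch below as "any URL that sits inside a sub-directory", which may be closer to what I'm after. Treat this as an assumption on my part, not a tested rule:

      User-agent: *
      # Hypothetical: block anything with a directory segment while leaving
      # root-level pages crawlable (wildcards are a crawler-specific extension)
      Disallow: /*/
      Sitemap: http://www.mydomain.com/sitemap.xml

    Is that interpretation correct, and would the sitemap still be fetched under such a rule?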

    Read the article

  • What should filenames and URLs of images contain for SEO benefit?

    - by Baumr
    We know that good site architecture usually looks like this:

      example-company.com/
      example-company.com/about/
      example-company.com/contact/
      example-company.com/products/
      example-company.com/products/category/
      example-company.com/products/category/productname/

    Now, when it comes to Google Image search, it is clear that the img alt attribute, the filename/URL, and the surrounding text (captions, headings, paragraphs) have an effect on ranking. I want to ask about the filenames we should give our images (e.g. product-photo.jpg), but first about the URL. Often web developers stick all images in a single folder in the root, example-company.com/img/, and I have stopped doing that. (I don't want to get into it, but basically it seems more semantic for images to live alongside the content they belong to in each sub-directory.) However, when all images sit in one folder, I feel that their filenames need to reflect what they are a bit more than usual, for example: example-company.com/img/example-company-productname-category.jpg It's a longer filename than just product.png, but as long as it's relevant I see no problem with regards to SEO (unless you're keyword stuffing), and it could even help rank for the keywords "example company", "productname" and "category". So no questions there. But what about when we place images within the site architecture outlined at the beginning? In other words, what if image URL paths look like this: example-company.com/products/category/productname/productname.jpg My question is, should the URL be kept short like the above, with only the "productname" (and some descriptive keywords) in the filename? Or should it also include the "example-company" and "category", like so: example-company.com/products/category/productname/example-company-category-productname.jpg That seems much longer, and redundant when we look at the URL, but here are a few considerations. Images are often downloaded onto computers, and to the average user they lose their original URL, so it isn't clear where they came from. Also, some social networks, forums and other platforms leave the filename intact when an image is uploaded (many others rewrite it, for example Pinterest and Facebook). Another consideration: will this really help (even if ever so slightly) with ranking in Google Image Search, or at least inform Google that the product is something specific to the "example-company"? For example, what if this product can only be bought at this store and is the flagship product? In addition to an abundance of internal links to this product page, would having the "example company" name and "category" in the filename help it appear in "example company" searches? In other words, is less more?

    Read the article

  • Local Business Listing Dashboard

    - by Steve
    I operate a website for a West Australian company, and the company has listings in local business directory websites. Currently we are listed in:

      www.HotFrog.com.au
      www.Google.com/Places
      www.TrueLocal.com.au
      www.StartLocal.com.au
      www.localstore.com.au
      www.communityguide.com.au
      www.yelp.com.au
      www.aussieweb.com.au

    Do you know of any method for examining account stats (profile views, profile clicks etc) within a dashboard, so I can see at a glance how each of our listings is going? I'd be happy to build a dashboard if necessary, but I'm not confident I currently have the skills to accurately display only the correct concise information. Would I use iframes? See if they have APIs? Is there a dashboard framework I could use?

    Read the article

  • folder level 301 redirect without .htaccess

    - by Vinay
    I have a website hosted on Yahoo Small Business, so I don't have access to the .htaccess file. I have around 220 pages in a folder 'mysubfolder' (http://mysite.com/myfolder/mysubfolder), and the site is around three years old. I am planning to move all 220 pages from 'mysubfolder' to 'myfolder' (one level up). All the pages in 'mysubfolder' are indexed. What is the best way to do this so that it does not affect the SEO?
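
    If the pages can be served as PHP (an assumption on my part, since .htaccess is off the table), would leaving a per-page redirect stub behind at each old URL be an acceptable substitute for a folder-level 301? Something like:

      <?php
      // Hypothetical stub left at /myfolder/mysubfolder/some-page.php
      // after the real content moves one level up.
      header('HTTP/1.1 301 Moved Permanently');
      header('Location: http://mysite.com/myfolder/some-page.php');
      exit;

    Or is there a better option on this kind of hosting?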

    Read the article

  • Forward .html/.htm to .php with .config

    - by PhilipK
    I'm moving a site from my Linux-hosted server to a client's Windows-hosted server. The .htaccess file no longer works, and I'm told that Windows servers use a .config file instead. How can I forward all users accessing .html and .htm files to the equivalent .php file? Server info:

      OS/Hosting Type: Windows / Shared Hosting
      .NET Runtime Version: ASP.NET 2.0/3.0/3.5
      PHP Version: PHP 5.2
      IIS Version: IIS 7.0
      Data Center: US Regional

    EDIT: Hosting is provided by GoDaddy. I was told by a friend that the following should work, but it has no effect on the site:

      <configuration>
        <system.webServer>
          <handlers>
            <add name="PHP-FastCGI" verb="*" path="*.html" modules="FastCgiModule"
                 scriptProcessor="c:\php\php-cgi.exe" resourceType="Either" />
          </handlers>
        </system.webServer>
      </configuration>
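
    As far as I can tell, the handler mapping above would only make IIS run .html files through PHP; if the goal is to send visitors to the equivalent .php URL, my understanding is that this needs the IIS URL Rewrite module (which I believe is available on GoDaddy's shared IIS 7 hosting, though that is an assumption). A rough, untested web.config sketch:

      <configuration>
        <system.webServer>
          <rewrite>
            <rules>
              <!-- Sketch: redirect /page.html or /page.htm to /page.php -->
              <rule name="HtmlToPhp" stopProcessing="true">
                <match url="^(.*)\.html?$" />
                <action type="Redirect" url="{R:1}.php" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>
      </configuration>

    Is that the right direction, or is there a simpler way on this hosting plan?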

    Read the article

  • Ranking drop after using reverse proxy for blog subdirectory and robots.txt for old blog subdomain

    - by user40387
    We have a 3Dcart store and a WordPress blog hosted on a separate server. Originally, we had a CNAME set up to point the blog to http://blog.example.com/. However, in our attempt to boost link-based and traffic-based authority on the main site, we've opted to do a reverse proxy to http://www.example.com/blog/. It's been about two months since we finished the reverse proxy migration. Everything appears to be technically working as intended, including some robots and sitemap changes; the new URLs are even generating some traffic, as indicated in Google Analytics. While Google has been indexing the new URL locations, they're ranking very poorly, even for non-competitive, long-tail keywords. Meanwhile, the old subdomain URLs are still ranking mostly as well as they used to (even though they aren't showing meta titles and descriptions, due to being blocked by robots.txt). Our working theory is that Google has an old index of the subdomain URLs and is considering the new URLs to be duplicate content, since it's being told not to crawl the subdomain and therefore can't see the rel canonicals we have in place. To resolve this, we've updated the subdomain's robots.txt to no longer block crawling and indexing. Theoretically, seeing the canonical tag on the subdomain pages will resolve any perceived duplicate content issues. In the meantime, we were wondering if anyone has any other ideas. We are very concerned that we'll be losing valuable traffic, as we're entering our on-season at the moment.

    Read the article

  • Loading main JavaScript on every page? Or breaking it up into relevant pages?

    - by Kyle
    I have a 700 kb decompressed JS file which is loaded on every page. Before, I had 12 JavaScript files on each page, but to reduce HTTP requests I combined them all into one file. This file is ~130 kb gzipped and is served with gzip compression; however, on the user's machine it is still unpacked and loaded on every page. Is this a performance issue? I've profiled the JavaScript with the Firebug profiler but did not see any issues. The problem (or illusion) I am facing is that there are jQuery libraries bundled in that file that are not used on every page. For example, jQuery DataTables is 200 kb compressed and is only needed on 2 of my website's pages; jqPlot is another 200 kb. I now have 400 kb of excess code that isn't executed on 80% of the pages. Should I leave everything in one file? Or should I take out the jQuery libraries and load only the relevant JS on each page?
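
    To illustrate the second option, this is roughly what I have in mind: keep a small common bundle everywhere and append the heavy libraries only on the pages that use them. A PHP sketch; the flag and file names are placeholders of my own:

      <?php
      // Hypothetical page flags set by each template before the footer renders.
      $needsDataTables = isset($usesDataTables) && $usesDataTables;
      $needsJqplot     = isset($usesJqplot) && $usesJqplot;
      ?>
      <script src="/js/common.min.js"></script>  <!-- jQuery + site-wide code -->
      <?php if ($needsDataTables): ?>
        <script src="/js/jquery.dataTables.min.js"></script>
      <?php endif; ?>
      <?php if ($needsJqplot): ?>
        <script src="/js/jquery.jqplot.min.js"></script>
      <?php endif; ?>

    Is the extra HTTP request per heavy library worth it, or does the single cached bundle win in practice?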

    Read the article

  • http to https upgrade -- SEO troubles

    - by SLIM
    I upgraded my site so that all pages have gone from http to https. I didn't consider that Google treats https pages differently from http. I re-created my sitemap so that all links now reflect the new https URLs and let it be for a few days. (Whoops!) Google is now re-indexing all the https pages. I have about 19k pages on the site, and Google has already indexed about 8k of the new https ones. The problem is that Google sees all of these as brand new pages, when many of them have a long http history. Of course most of you will recognize the problem: I didn't set up 301s from the old http URLs to the new https ones. Is it too late to do this? Should I switch my sitemap back to http and then 301 to the new https? Or should I leave the sitemap as is and set up the 301 redirects anyway? I'm not even sure Google is trying to reach the http site anymore. Currently the site is doing 303 redirects (from http to https), although I haven't figured out why yet. Thanks for any suggestions you can offer.
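
    For reference, the kind of rule I have in mind for the 301 is the usual Apache sketch below (assuming the site runs on Apache with mod_rewrite enabled, which may not match my actual setup):

      # In the site root .htaccess (sketch, untested on this particular server)
      RewriteEngine On
      RewriteCond %{HTTPS} off
      RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

    Presumably whatever is currently issuing the 303s would need to be found and changed as well.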

    Read the article

  • yahoo media player not working

    - by luca590
    I have a Yahoo Media Player embedded in my web page, which I am currently creating and editing with Ruby on Rails. When I click the play button next to a track, the YMP waits a while and then goes to the next track without playing the first one. I then get a warning on my second (and last) track that its file could not be found. Does anyone have a better recommendation for an audio player, or a way to fix this one?

    Read the article

  • mod_rewrite works within directory not on root

    - by Anvesh Saxena
    I am having a problem with my RewriteRule for the tags portion. What I have been able to establish is that the rule is at least being triggered, because the page "tags.php" is rendered, but without the URL parameters. This .htaccess file is in the root of my sub-domain and has the following content for the tags portion:

      # Rewrite rule for tags
      RewriteRule ^tags/(\w+)/(\d+)/?$ tags.php?tag_name=$1&tag_id=$2
      RewriteRule ^tags/(\w+)/?$ tags.php?tag_name=$1
      RewriteRule ^tags/?$ tags.php?tag_name=

    Another thing I am not able to explain is that a similar .htaccess file exists for a directory within my sub-domain and works as expected, with the necessary URL parameters available. The .htaccess file within that directory reads as follows:

      # Rewrite rule for tags
      RewriteRule ^tags/(\w+)/(\d+)/?$ restAPI.php?type=tags&tag_name=$1&tag_id=$2
      RewriteRule ^tags/(\w+)/?$ restAPI.php?type=tags&tag_name=$1
      RewriteRule ^tags/?$ restAPI.php?type=tags&tag_name=

    Could anyone point out the problem in my rewrite rules? I am also sometimes getting an Internal Server Error, which I suspect is related. Note: I have Apache 2.2.23 on my shared hosting.
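
    For completeness, this is the full root .htaccess I would expect to need, with the engine directives and flags spelled out; whether any of these are actually missing in my file is an assumption on my part:

      # Sketch of the complete root .htaccess, with explicit directives and flags
      RewriteEngine On
      RewriteBase /

      # Rewrite rules for tags
      RewriteRule ^tags/(\w+)/(\d+)/?$ tags.php?tag_name=$1&tag_id=$2 [QSA,L]
      RewriteRule ^tags/(\w+)/?$       tags.php?tag_name=$1 [QSA,L]
      RewriteRule ^tags/?$             tags.php?tag_name= [QSA,L]

    If the rules really are identical apart from the target script, could the missing L flags allow a later rule to rewrite again and drop the query string?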

    Read the article
