Search Results


  • Last-Modified/ETags - to include or not?

    - by Kae Verens
    Google's PageSpeed plugin suggests that a website should include Last-Modified and ETag headers: Specify a cache validator - "Resources that do not specify a cache validator cannot be refreshed efficiently. Specify a Last-Modified or ETag header to enable cache validation." However, Apache suggests that by not including them at all, we speed up websites by eliminating If-Modified-Since and If-None-Match requests: http://www.askapache.com/htaccess/apache-speed-last-modified.html These two recommendations are in direct opposition - which should be implemented? I'm leaning towards Apache's suggestion: when I want a file cached, I don't want it refreshed.
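
    For reference, the Apache-side approach boils down to removing the validators and leaning on expiry headers instead, so the browser reuses its cached copy without ever asking. A minimal .htaccess sketch (assuming mod_headers and mod_expires are enabled; the types and lifetimes are placeholders to adjust):

        # Remove cache validators so browsers stop sending
        # If-Modified-Since / If-None-Match requests
        Header unset ETag
        Header unset Last-Modified
        FileETag None

        # Lean on expiry instead: cached copies are simply reused until stale
        ExpiresActive On
        ExpiresByType text/css   "access plus 1 week"
        ExpiresByType image/png  "access plus 1 month"

    With no validator and a far-future expiry there is nothing for the browser to revalidate - which is exactly the trade-off the question describes.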

    Read the article

  • Does Webmaster Tools list traffic from ads as inbound links?

    - by Mohamad
    In Webmaster Tools, under the inbound links section, do ads get counted as inbound links? I am doing a review of inbound links on a website and found that most of them come from meaningless blogs and spam websites. Before I accuse anyone of not doing their job properly, I would like to know: is it possible that those inbound links were generated when an ad for the website appeared on the spam website? An SEO firm was paid handsomely to generate inbound links, and I am afraid all they did was submit material to spam blogs and websites.

    Read the article

  • SilverStripe: How can I disable comments?

    - by SamIAm
    My client's site is built in SilverStripe. There is a news page that allows people to leave comments, and unfortunately we've got loads of spam. I'm new to this - is there any way to disable the comment field by default, and how do I do it? Alternatively, is there an easy way to install spam protection?

    Update - Because this is someone else's code, I just realised they already have some sort of spam protection, so we are now trying to disable comments instead. I have managed to make "no comments" the default by changing BlogEntry.php from

        static $defaults = array(
            "ProvideComments" => true,
            'ShowInMenus' => false
        );

    to

        static $defaults = array(
            "ProvideComments" => false, // changed
            'ShowInMenus' => false
        );

    Am I on the right track to disable comments by default? Also, how can I stop the news page from showing the "xxx comments" link? E.g.:

        Test
        Posted by Admin on 21 June 2011 | 3 Comments
        Tags: P
        This is a test....
        3 comments | Read the full post

    Read the article

  • Tumblr blog under subdomain blog.mysite.com - do I still get the benefit?

    - by sam
    I've got a Tumblr blog for my main site; I use it more as a proper blog with articles and images, rather than just a tumblelog. The blog is mapped to blog.mydomain.com. Because of the ease of social reposting in Tumblr, people often repost my articles (which is great - backlinks!). But all these links go back to my blog and not my main site at mydomain.com. Does Google see blog.mydomain.com and mydomain.com as the same/linked item, and is my main site getting the benefit of these links?

    Read the article

  • Avoiding Duplicate Content Penalties on a Corporate/Franchise website

    - by heath
    My question is really an extension of a previous question that was ported from Stack Overflow and closed, so I cannot edit it. The basic gist is that a regional franchise company has decided to force all independent stores into one website look; they currently all have their own domains and completely different websites. After reading the helpful answers and looking over some links provided, I think my solution is to put a 301 on each franchise store site (acme-store1.com, acme-store2.com, etc.) back to the main corporate site (acme.com).

    All of the company history, product info, etc. (about 90% of the entire site) applies to all stores. However, each store should have some exclusive content such as staff, location pictures, exclusive events and promotions, etc. I originally thought that I would simply do something like acme.com/store1/staff, acme.com/store2/staff, etc. for the store-exclusive content, and then acme.com/our-company, for example, would cover all stores. However, I now see two issues that I don't know how to solve:

    1. They want to see site stats based on which store site the visitor came from. If a user comes from acme-store1.com, is redirected to acme.com and hits several pages, don't I need to somehow keep that original site in the new URL to track each page in that user's session and show they originally came from acme-store1.com?

    2. Each store is still independently owned and is essentially still in competition with the other stores, albeit less so than with other brands. This is important because each store would like THEIR contact info, links to their social media pages, their mailing-list sign-up and customer requests on EVERY page. So if a user originally goes to acme-store1.com and is redirected to acme.com, it should still look to the user as if it's all about store 1, even though 90% of the content will be exactly the same as on the store 2, store 3 and corporate sites. For example, acme.com/our-company would have the same company history and the same header/footer/navigation, BUT depending on the original site the user came from, it would display contact info and links for THAT store. If someone came directly to the corporate site, it would display the corporate contact info and links (they have their own as well).

    I was considering that all redirects would go to store1.acme.com, store2.acme.com, etc. (or acme.com/store1), and then I could dynamically add the contact info and appropriate links based on the subdomain or subfolder (see the sketch below). But then I have to worry about duplicate-content penalties because, again, about 90% of the text in these "subdomains" is the same. For reference, this is a PHP5 site. I've already written a compact framework utilizing templates and mod_rewrite that I've used for other sites. Is this an easy fix that I'm just not grasping? Any suggestions?
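
    A minimal sketch of the "dynamically add the contact info" part (illustrative only - the subdomain names and the get_store_config() helper are hypothetical, and a real site might pull this from a database):

        <?php
        // Map the requesting host to a store profile; fall back to corporate.
        function get_store_config($host) {
            $stores = array(
                'store1.acme.com' => array('name'  => 'Acme Store 1',
                                           'phone' => '555-0101'),
                'store2.acme.com' => array('name'  => 'Acme Store 2',
                                           'phone' => '555-0102'),
            );
            $corporate = array('name' => 'Acme Corporate', 'phone' => '555-0100');
            return isset($stores[$host]) ? $stores[$host] : $corporate;
        }

        $store = get_store_config(strtolower($_SERVER['HTTP_HOST']));
        ?>
        <p>Contact <?php echo htmlspecialchars($store['name']); ?>:
           <?php echo htmlspecialchars($store['phone']); ?></p>

    On the duplicate-content side, one widely suggested mitigation would be a rel="canonical" link on each store page pointing at the acme.com version of the same content, so the shared 90% is only credited once.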

    Read the article

  • Is it time to drop Courier from your monospace font stacks?

    - by Jeff
    I've been fine-tuning my font stacks lately and was wondering whether it's safe to drop Courier from my monospace font stack yet. Would you feel comfortable dropping it? Of course, monospace is my final fallback.

    Note 1: OS testbed: WinXP, WinVista, Win7, iPhone, iPad. Based on my research, these browsers now substitute Courier New for Courier by default: IE9+, Chrome 2+, Firefox 10+, Safari 3.1+, iDevices.

    Note 2: The default font-family: monospace; renders as Courier New in every browser I've tested, from IE6 through the latest iPhone/iPad devices. EDIT: One exception is Opera 12, which renders Consolas on Windows; Opera 10 renders Courier New.

    Note 3: I've noticed that Courier refuses to render with any font smoothing (anti-aliasing) in any browser I've tested, regardless of system and/or browser display settings - probably because it's an old bitmap font. This could be down to my system setup, however.
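
    For context, a stack that drops Courier entirely (assuming the substitution behaviour noted above holds) might look like this - purely a sketch:

        /* Courier omitted: the browsers tested substitute Courier New for it
           anyway, and the generic monospace keyword catches everything else. */
        code, pre, kbd, samp {
            font-family: Consolas, "Lucida Console", Monaco, "Courier New", monospace;
        }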

    Read the article

  • Google sitemap hreflang tag without the main site URL

    - by Rashmi Pandit
    We have websites with multilingual content, e.g.:

        http://www.example.com/about-us/
        http://www.example.com/en-HK/about-us/
        http://www.example.com/en-GB/about-us/
        http://www.example.com/zn-CH/about-us/

    We need to configure the hreflang tags in the sitemap so Google knows there are alternate links for the same pages in different languages. I know that for the above example my sitemap url tag would look like this:

        <url>
          <loc>http://www.example.com/about-us</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/about-us"/>
          <xhtml:link rel="alternate" hreflang="en-HK" href="http://www.example.com/en-HK/about-us"/>
          <xhtml:link rel="alternate" hreflang="zn-CH" href="http://www.example.com/zn-CH/about-us"/>
          <changefreq>daily</changefreq>
          <priority>0.8</priority>
        </url>

    However, if I don't have the main URL but just the last three with en-HK, en-GB and zn-CH, how should my url tag look? Should I just skip the loc tag and keep the three xhtml:link tags? Or can I specify any of the URLs in the loc tag and put the remaining two in xhtml:link tags? I am new to Google sitemaps. Any help is greatly appreciated.

    Thanks, Rashmi

    Edit: From the answer posted at http://stackoverflow.com/questions/18423624/sitemap-for-domain-with-multilanguage-site/18423803#18423803, for my example with sites in en-HK, en-GB and zn-CH, should there be three url tags, each with its own URL in loc and the other two in xhtml:link?
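
    For what it's worth, Google's hreflang guidelines have each URL list every alternate including itself, which suggests three url entries for the three-language case, along these lines (note that "zn-CH" is presumably intended to be zh-CN, Simplified Chinese):

        <url>
          <loc>http://www.example.com/en-GB/about-us/</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/about-us/"/>
          <xhtml:link rel="alternate" hreflang="en-HK" href="http://www.example.com/en-HK/about-us/"/>
          <xhtml:link rel="alternate" hreflang="zn-CH" href="http://www.example.com/zn-CH/about-us/"/>
        </url>
        <!-- ...plus a matching <url> entry for en-HK and another for zn-CH,
             each with its own address in <loc> and the same three alternates. -->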

    Read the article

  • Yandex frequently replaces page names with ampersands

    - by Guy
    The Yandex spider is a frequent visitor to one of the sites I manage. On occasion it replaces the page name with two ampersands and a space. So if the page is /mypage.aspx?param=value then it will try to crawl it as /&& ?param=value. Any idea why it is doing this?

    [EDIT] If I remember correctly, the IP this "mistake" comes from is based in California, not Russia. I believe they crawl US sites from a US-based IP address. Not sure if that helps. More info about the request:

        IP: 199.21.99.82
        City: Palo Alto
        State: California
        Country: United States
        ISP: Yandex Inc.
        User-Agent: Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)

    Read the article

  • Where can I download a list of all .com domains registered in the world?

    - by John
    I just need registered .com domain names. I know this list is available from:

        http://www.verisigninc.com/en_US/products-and-services/domain-name-services/grow-your-domain-name-business/tld-zone-access/index.xhtml (it looks like approval could take 4 weeks)
        http://www.premiumdrops.com/zones.html

    I can also extract domain names using the domain search API at domaintools.com. Is there any other source where I can find this list?

    Read the article

  • Google Analytics - drop in traffic

    - by Andy
    Bit of a general question here. We are in the process of converting a number of our clients from older websites to new ones. The problem we are seeing - and sorry for being so general here - is a sharp decline in traffic as reported by Google Analytics. It's not a gradual decline; it seems to hit almost as soon as the new site goes live. I've got a few questions to see if there is something we are doing wrong:

    a) We are using the same Analytics accounts going from old to new site. Is this a bad idea?

    b) The actual Analytics code is integrated into the pages using a server-side include. Is this a bad idea?

    c) We structure our sites differently from the old ones. The old sites would pretty much have all the web pages in the root directory, and hyperlinks pointed at the page files, e.g.

        <a href="somepage.aspx">Link</a>

    Our new sites have a directory structure that pretty much reflects the navigation structure, and hyperlinks point to the page's directory instead of the actual page, e.g.

        <a href="/new-items/shoes/">New shoes</a>

    Is this a bad idea? I'm really searching for a needle in a haystack here and would appreciate any help or advice as to why we are getting such a sharp and sudden drop in traffic. Again, sorry this is such a general question. Thanks in advance.

    Read the article

  • SVG images grow and create scrollbars when on the server

    - by zuko
    Okay, so I embedded some SVG images into my page and opened it locally in Chrome, and it looked fine. I uploaded the same file to the server, looked at the page online, and the SVG images had grown by maybe 5-10% and were surrounded by scrollbars as if they were overflowing. I think it probably has to do with my lack of knowledge of how SVG and embed work. What's really puzzling me, though, is that it works fine locally (I have cache disabled). Help? Thanks.

    Edit - the HTML:

        <embed type="image/svg+xml" src="content/web-logo.svg"/>

    There's no CSS on the image. I'm not sure if I was just wrong before or if I changed something I'm not aware of, but it doesn't appear to actually be changing size anymore; it just decides to stuff it into a scrollbox. Pic: https://www.dropbox.com/s/wt1aufi7nl1fpyi/svg-problem.png
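
    One common cause of exactly this symptom (a guess, not a confirmed diagnosis): an SVG with no intrinsic width/height falls back to a default replaced-element size, and the overflow becomes a scrollbox. Giving the embed explicit dimensions usually avoids it:

        <!-- width/height are placeholders: match them to the SVG's viewBox
             so the image is never larger than the box it sits in -->
        <embed type="image/svg+xml" src="content/web-logo.svg"
               width="300" height="120" />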

    Read the article

  • Is there an app/script I can deploy to enable my users to change their own LDAP passwords?

    - by Tom Wright
    I've recently enabled LDAP-based authentication on my domain. This has allowed us to use a single set of credentials to administer the blog, the forum and the wiki. Unfortunately, it has come at the cost of users being able to change their own passwords. Ideally, users would be able to visit a page (e.g. mydomain.com/account), authenticate, and then change their password. Does anyone know of a script or app that will allow me to do this quickly and easily? I guess it wouldn't be hard to write in PHP, but I'd prefer not to have the hassle.
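
    For anyone tempted to write it themselves, the PHP side really is short. A minimal sketch using PHP's LDAP extension - the host, the DN layout, and whether the directory lets users rewrite their own userPassword after a simple bind are all assumptions about the setup:

        <?php
        // $username, $currentPassword, $newPassword come from the form POST.
        $conn = ldap_connect('ldap://ldap.mydomain.com');   // placeholder host
        ldap_set_option($conn, LDAP_OPT_PROTOCOL_VERSION, 3);

        // Placeholder DN layout - adjust to the directory tree.
        $userDn = 'uid=' . $username . ',ou=people,dc=mydomain,dc=com';

        // Authenticate with the user's current password...
        if (!@ldap_bind($conn, $userDn, $currentPassword)) {
            die('Authentication failed');
        }

        // ...then replace the password. Many directories expect a hashed
        // value (e.g. SSHA); plain text is used here only to keep it short.
        ldap_mod_replace($conn, $userDn, array('userPassword' => $newPassword));
        ldap_unbind($conn);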

    Read the article

  • Getting user-generated content with no titles to rank

    - by hugo
    We are creating a site that allows users to generate content. The user is provided with a text field only (no title), similar to Twitter, Facebook, and Google+. Each piece of content created by the users will have a dedicated page/URL. Since the page has no title, I was wondering how search engines will index and display our pages. And if the content is shared on other social networks, what will those shares look like with no title for the Open Graph or Twitter tags?
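
    One common workaround (a suggestion - nothing in the platforms requires it) is to derive a pseudo-title server-side from the first few words of the post and emit it in the title and social tags; the values below are illustrative:

        <title>Just planted the first tomatoes of the season... | ExampleSite</title>
        <meta property="og:title" content="Just planted the first tomatoes of the season..." />
        <meta property="og:description" content="A longer excerpt of the post text..." />
        <meta name="twitter:card" content="summary" />

    Search engines and social scrapers then have something sensible to show as a headline, even though the author never typed one.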

    Read the article

  • Reverse proxying only a specific URL

    - by Bart Silverstrim
    I have a web server at www.ourcompany.com running Apache2. Using the proxy modules, I am able to (for example) have 172.16.0.5, an internal IP device, accessed at www.ourcompany.com/device. The trouble is that anyone can play with or explore the device using strings sent to www.ourcompany.com/device/change/settings/here.html. I'd like the reverse proxy to work only for a specific URL, www.ourcompany.com/device/you/must/use/this, while anything else is rejected. Is there a setting that can be used to do this, or is it a simple rewrite condition placed in the virtualhost for the site under sites-enabled? What is the simplest, most maintainable way to sanitize requests to the internal device through the reverse proxy? Running Apache2 on Ubuntu.
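
    A minimal sketch of the virtualhost approach (Apache 2.2-era syntax, as on Ubuntu of that vintage; paths taken from the question):

        # Proxy only the one permitted URL
        ProxyPass        /device/you/must/use/this http://172.16.0.5/you/must/use/this
        ProxyPassReverse /device/you/must/use/this http://172.16.0.5/you/must/use/this

        # Refuse everything else under /device; the later, more specific
        # Location section re-opens just the allowed path.
        <Location /device>
            Order deny,allow
            Deny from all
        </Location>
        <Location /device/you/must/use/this>
            Order allow,deny
            Allow from all
        </Location>

    Since only the specific path has a ProxyPass at all, other /device URLs are never forwarded to the internal device in the first place; the Deny is belt-and-braces.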

    Read the article

  • Google ranking, page crawl

    - by Nawaf Mubarak
    Please don't mind me asking this newbie question about Google ranking. I know that in order to get ranked, a page has to be crawled by Google's bots. I have an example page through which I'm trying to get a better understanding of how the system works.

    I made a page on my website last month. It got indexed pretty quickly, and I then found it on page 15 of Google's results for my keyword. The next day it made it to page 13, then after a week it was jumping back and forth between pages 17/18 and 20. Now a month has passed and it isn't listed at any position for that keyword; sometimes I find it on page 30, but later I won't find it anywhere, and it keeps happening this way these days. Even when it isn't listed at any position for my keyword, if I search for "site:thepageaddress" it is listed, which means I'm not penalized and my page is there for Google to see - it just isn't in the search results for my keyword.

    But when I search "site:thepageaddress" and hit the "Search tools" option and click on "Past day" or "Past week", it isn't listed; it is only listed when I click on "Past month". I think this means that Google indexed the page, looked at it once when I published it, and never looked at it again - is that a fair statement?

    So two questions come to mind here:

    1. Should Google keep looking at a page even if I haven't changed any info on it? Is this an indication that my page is doing fine, or is it normal that Google sees it once and that's it?

    2. Why does my page keep jumping back and forth in the ranking results for my keyword (and sometimes not get listed at all), and how do I fix it? What does that mean?

    Sorry for the long message; I hope to God that somebody can help me with this. Thank you!

    Read the article

  • Search engine friendly, SEO blog software

    - by Steve
    Is there a comparison of the SEO capabilities of different blogging software and blogging plugins? I'd like things to be as optimised as possible. I have a basic grasp of SEO principles, though it is probably 12-24 months out of date. I'm about to start a blog, having had a few previously. Also, I'm not up to speed on what pings are in the blogging world. What are they, and how do they work? I assume it is best to have blogging software that pings automatically.
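
    On the ping question: a ping is a tiny XML-RPC call that blog software fires at update services (Ping-o-Matic and the like) when a post is published, telling them to come re-read the feed. The payload is roughly this (the blog name and URL are placeholders):

        <?xml version="1.0"?>
        <methodCall>
          <methodName>weblogUpdates.ping</methodName>
          <params>
            <param><value><string>My Blog Name</string></value></param>
            <param><value><string>http://www.myblog.example/</string></value></param>
          </params>
        </methodCall>

    Mature platforms such as WordPress send these automatically to a configurable list of services, so it is indeed worth picking software that pings for you.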

    Read the article

  • Flash Twitter and Facebook widgets

    - by NorthPole
    I'm stuck with a crappy digital-signage platform that only renders .html and .swf files (and RSS feeds). There's no customization of the RSS rendering, so the only way to show something dynamic in a pretty way is to use Flash. The question: is there any way to embed JavaScript in SWF files, or are there Facebook and Twitter Flash plugins somewhere? I looked for ready-made SWF widgets for the job but didn't find any; if there is a Flash widget that displays a Facebook and/or Twitter feed, please give me a link. (Sorry if the question seems out of context, but these things usually run a stripped-down browser to display everything, so it's pretty much a web page run from a file rather than a web server.)

    Read the article

  • How to select categories for a user-generated content site?

    - by Frederik Creemers
    On the site I'm building, users can create tutorials. I want users to be able to create tutorials on as many subjects as possible, but still have some preset categories. What's the best way to select these categories? The reason I don't just let users add keywords and use those for categorization is that users gain experience points in a certain subject when their tutorial is liked by someone, and - in a similar way to the Stack Exchange network - this creates communities around those subjects. I will give visitors the ability to suggest new categories. Here are the categories I'm thinking of at the moment:

        health
        gardening
        cooking
        technology
        science & math
        music
        visual art

    Read the article

  • Can't get lines around table borders/cells

    - by Ira Baxter
    I have several web pages containing tables, for which I'd like to have line borders around the tables and the cells. In fact, some of these pages have existed for several years already and rendered acceptably in IE6 and IE7. About 6 months ago we switched to a completely different set of stylesheets to change our site's look and feel. We also switched to "modern" browsers such as IE8 and (because I couldn't stop Vista) IE9. Now the borders don't render at all.

    I spent a day fighting with this about a month ago and failed to fix it. It seemed that I could reduce the page down to just the barest table and IE8 would still not render the border. I concluded IE8 was just buggy, but I'm not an HTML expert, so it is more likely that I'm the buggy one. (I'm just getting back to this; I'll go see if I can find that reduced page.)

    Here is one such page: http://www.semdesigns.com/products/DMS/DMSComparison.html The tables should be obvious; you can tell them by their absence of lines :-{ The URI validates as HTML 4.01 Transitional using the W3C service. Any suggestions?
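
    For comparison, the usual explicit-CSS way to get visible lines regardless of browser defaults (the .compare class is hypothetical; the new sitewide stylesheets may well be resetting table borders to none, which would explain the change):

        table.compare {
            border-collapse: collapse;   /* merge doubled lines between cells */
            border: 1px solid #000;
        }
        table.compare th,
        table.compare td {
            border: 1px solid #000;
            padding: 4px 8px;
        }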

    Read the article

  • Should a website be on a single topic?

    - by Rana Prathap
    I run an online writers' community where authors publish their literary works and other members of the community read and comment on them. The authors write a wide variety of literary pieces (haikus, stories, poems, scientific articles, personal narratives) on a wide variety of topics (the sun and anything under it). My aim of providing the authors with search engine traffic is largely undermined by the website's lack of topical focus (or so I think). Is there a way to overcome this problem?

    Read the article

  • How does hreflang interact with geo targeting?

    - by zakgottlieb
    If I have multiple subfolders that I wish to target at different countries, I'm thinking the ideal setup would be to specify rel="alternate" hreflang with a language AND country code (e.g. en-AU) and ALSO to geotarget that subfolder to the particular country. That way, the pages would show up both in the country-specific results (accessed via Search Tools), because of hreflang, AND in the more generic country results from regular searches, because of geotargeting. Is this correct?

    P.S. What would happen if you geotargeted a subfolder which had e.g. a pt-BR hreflang value (i.e. Brazilian Portuguese) to Portugal instead?

    Read the article

  • SEO: Single URL rewrite from one app to another

    - by user1909186
    I have two web applications running on two different servers. I want one, example.com/hello, to redirect to the second, hello.com, but I want both to contribute to each other's SEO ranking. What is the best way to accomplish this, primarily for Google search but also for other search engines? I currently do a rewrite with the permanent flag from example.com/hello to hello.com using nginx. Thanks for your help.
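
    For reference, what matters to search engines is that the redirect is a 301, and nginx's permanent flag already produces one. A sketch of the relevant block (assumed to live in example.com's server block):

        # 301 tells search engines the content has moved for good,
        # so links pointing at example.com/hello pass their value to hello.com
        location /hello {
            rewrite ^/hello(.*)$ http://hello.com$1 permanent;
        }

    Link value then consolidates on hello.com; there is no standard mechanism for two separate sites to simply share ranking in both directions.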

    Read the article

  • Bing flagging pages as malware

    - by Vince Pettit
    Bing has flagged some pages on a site I manage as malware. These have been looked at, and it appears there was some malware at some point, but it has since been removed. Bing is also pointing to some pages which no longer exist, saying there is malware on those. Is there anything specific I need to do to get Bing to stop trying to access the removed pages, and to deflag the pages that have been fixed?

    Read the article

  • How can I make Twitter, Facebook and Reddit share buttons load last?

    - by Daniel Bingham
    I have a website with a number of pages that sport Twitter, Facebook and Reddit share buttons. They take forever to load, and until they do, the rest of the page doesn't load. So how can I make them load last? Currently, they are loaded something like this:

        <div class="item"><a href="http://twitter.com/share" class="twitter-share-button" data-count="vertical" data-via="FridgeToFood" data-related="danielBingham:Recipe and update tweets from Fridge to Food.">Tweet</a><script type="text/javascript" src="http://platform.twitter.com/widgets.js"></script></div>
        <div class="item"><script src="http://connect.facebook.net/en_US/all.js#xfbml=1"></script><fb:like layout="box_count" width="40"></fb:like></div>
        <div class="item">
            <script type="text/javascript">reddit_target='recipes';</script>
            <script type="text/javascript" src="http://reddit.com/static/button/button2.js"></script>
        </div>

    They are in a div called "shareWrapper" and load to one side of the page. The buttons load wherever the script code is placed. As far as I know, I can't place the script code at the bottom of the page and move the resulting buttons after the fact. I want them to appear near the top, which right now means they stop everything below them from loading for several seconds. I tried loading them with JavaScript using jQuery's $(document).ready(), but that failed - it seems to leave the page in some sort of loading loop from which it never emerges. Are there other ways to get these to load last?
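
    One widely used pattern - sketched here for the Twitter script only, but the same shape works for the others - is to inject each widget script asynchronously so the rest of the page renders first:

        <script type="text/javascript">
        // Inject widgets.js without blocking page rendering; the button
        // still materializes inside the existing .twitter-share-button anchor.
        (function() {
            var s = document.createElement('script');
            s.src = 'http://platform.twitter.com/widgets.js';
            s.async = true;
            document.getElementsByTagName('head')[0].appendChild(s);
        })();
        </script>

    The placeholder markup (the anchor and its wrapping div) stays where the button should appear; only the script loading is deferred.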

    Read the article

  • Does Google rate a webpage by its number of visits?

    - by petiar
    Hi there, there is a quite extensive discussion about this topic on another website and I am really losing my confidence. The thing is, I claim that the number (count) of visits is NOT a criterion for increasing the PR of a particular page, because: a) Google just doesn't know about every single visit to a webpage (in case it's not using GA); b) Google just would not rank by something that Google itself affects. Thanks for your opinions. Peter.

    Read the article
