
  • Duplicating content from another site and adding value (summaries, statistics) - ranking and courtesy

    - by Krastanov
    I am working on a site that takes a governmental database, provides a number of statistical and other summaries, and also posts the original data. However, this data (mostly long pieces of text) is also published on the official governmental site (without the added value of the summaries). Should I worry about my Google ranking because of this duplication? What is the preferred way to point to the official source of the information? There is no advertising on my site. My site is a ".com"; the governmental site is a ".bg".
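
    On the courtesy side, a minimal sketch: a visible attribution link on each record page credits the source (the URL below is a hypothetical stand-in for the real register). If a page reproduced the text verbatim with no added value, a cross-domain rel="canonical" tag could additionally tell search engines which copy to index, but since these pages add summaries, the visible link alone is the usual approach:

        <!-- visible citation on each record page; the URL is a made-up example -->
        <p class="source">
          Source: <a href="http://register.example.bg/entry/12345">Official government register</a>
        </p>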

  • How to specify the importance of HTML elements?

    - by Julien Fouilhé
    Is it possible to specify which elements of the page are important or, more specifically, which elements of the page are not important? I'm using the new HTML5 elements (nav, header, footer, section, article, aside...), but Google sometimes shows my login form (which sits in the header of my pages) in the search description of my website's pages. Is there a solution to this problem? Thank you.
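
    A hedged sketch of two things that influence snippets: a meta description suggests the text Google should show, and Google documents a data-nosnippet attribute (valid on span, div and section elements) for excluding a region of the page from snippets. The names below are illustrative, not taken from the question:

        <meta name="description" content="One-sentence summary of what this page is about.">

        <header>
          <!-- data-nosnippet asks Google not to quote this region in search snippets -->
          <div data-nosnippet>
            <form action="/login" method="post">
              <input type="text" name="user"> <input type="password" name="pass">
              <button type="submit">Log in</button>
            </form>
          </div>
        </header>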

  • Registration form with payment system (PayPal) [closed]

    - by Alecs
    I'm using an AJAX registration form plugin for my website and I'm thinking of also implementing PayPal. Here is how I want it to work: I have 3 fields (Name, Phone, Email) and a "Buy" button. After the user types their name, phone and email, they click "Buy" and are redirected to the PayPal payment page (or, if possible, stay on the same page). What I probably need to know is how to allow the "Buy" step only after the fields (name, phone, email) are validated. Is there a plugin or a ready-made snippet of code, so I don't start building something that already exists?
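
    A minimal sketch of the validation gate, assuming PayPal's classic "Buy Now" form (Payments Standard); the seller address, amount and item name are hypothetical placeholders, and the registration data would still need to be saved server-side before the redirect:

        <form id="buyForm" action="https://www.paypal.com/cgi-bin/webscr" method="post">
          <input type="hidden" name="cmd" value="_xclick">
          <input type="hidden" name="business" value="seller@example.com">
          <input type="hidden" name="item_name" value="Registration">
          <input type="hidden" name="amount" value="10.00">
          <input type="text" id="name" placeholder="Name">
          <input type="text" id="phone" placeholder="Phone">
          <input type="email" id="email" placeholder="Email">
          <button type="submit">Buy</button>
        </form>
        <script>
        // block the redirect to PayPal until all three fields look valid
        document.getElementById('buyForm').addEventListener('submit', function (e) {
          var name  = document.getElementById('name').value.trim();
          var phone = document.getElementById('phone').value.trim();
          var email = document.getElementById('email').value;
          if (!name || !/^[0-9+ ()-]{6,}$/.test(phone) || email.indexOf('@') === -1) {
            e.preventDefault(); // stay on the page instead of going to PayPal
            alert('Please enter a valid name, phone and email.');
          }
        });
        </script>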

  • SEO problem for site with 2 domains [closed]

    - by Harry
    Possible Duplicate: What is duplicate content and how can I avoid being penalized for it on my site? I have two domains pointing to the same site. I want both domains to co-exist; they share most of the same content, but they differ in design and they are aimed at different markets / rival communities. Is there a way to let Google know that these two domains are the same site, so I don't get hit with a duplicate content penalty? Any other general SEO tips for this situation would also be welcome. Thanks. Come on, why was this closed? The linked page is completely irrelevant to me.
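
    One common approach, offered as a sketch rather than a fix for this exact setup: pick one domain as the preferred version of each shared page and declare it from the other domain's copy with a cross-domain canonical tag (example.com and example.org stand in for the two real domains):

        <!-- in the <head> of the page served on the secondary domain -->
        <link rel="canonical" href="http://example.com/same-article/">

    Note the trade-off: search engines will then consolidate ranking onto the canonical copy, so this fits the shared content, not pages that should rank separately for each community.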

  • Changing URLs to user-friendly URLs

    - by German
    I'm refactoring my ASP.NET application from ASP.NET 3.5 to 4.0. I'm also changing the URLs to user-friendly ones, for example from /product.aspx?id=100 to /product-name/100. All my pages are indexed by the search engines, and the site has already been online for 6 years. I'm planning to do 301 redirects from the old pages to the new ones. I want to make sure I won't lose the rankings and traffic. Any suggestions on how to do it properly?
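
    A sketch of the redirect rule for web.config, assuming IIS 7 with the URL Rewrite module installed; the rule name and URL shapes are illustrative, and the real product-name slug has to come from your own lookup (for example a rewrite map, or a Response.RedirectPermanent call in the page's code-behind):

        <system.webServer>
          <rewrite>
            <rules>
              <rule name="Old product URLs" stopProcessing="true">
                <match url="^product\.aspx$" />
                <conditions>
                  <add input="{QUERY_STRING}" pattern="(?:^|&amp;)id=(\d+)" />
                </conditions>
                <!-- Permanent = 301, so engines transfer the old URL's ranking -->
                <action type="Redirect" url="product-name/{C:1}" redirectType="Permanent" appendQueryString="false" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>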

  • How does delicious.com avoid being sued for copyright infringement?

    - by Stanish
    With the recent redesign of delicious.com, they've added a much more graphical home page. The site continues to be a service for people to bookmark and share websites they come across on the web, but the Delicious home page is now made up of images taken from those linked sites. See for yourself at http://delicious.com. I would like to know what in the law allows them to do this, considering that the images represent the main content of the page and they clearly do not own the copyright to those images. I know there is some leeway given to search engines, where it is considered fair use to reproduce a small portion of the content if the aim is to lead people to the originating site. Does that apply here?

  • How to smartly optimize ads on a website

    - by YardenST
    I have a content website that shows ads. My team now wants to optimize it for a better user experience (we really believe our ads are good for our users). We are sure that every website deals with this issue, and there must be known ways and methods to handle it that smart people have already thought of. So what I'm looking for is a tested, working method to optimize ads. For example: if I were asking about optimizing my website's ranking in Google, I would expect the answer "learn SEO"; if I were asking about optimizing how my website is used, "usability testing"; for navigation, "information architecture". What is the field that deals with optimizing ads?

  • Can I forward "referrer" information to another address?

    - by user5679
    I have two addresses for two servers: www.urlA.com and www.urlB.com. I have all my websites installed on www.urlB.com, but visitors mainly know www.urlA.com. My www.urlA.com/index.php contains the following: <?php header('Location: http://www.urlB.com/'); ?> But when I use this forwarding method, the tracking JavaScript on www.urlB.com cannot recognize where the visitors came from; I only get "NO REFERRING LINK". What should I do to accomplish both of the following: 1. forward urlA.com to urlB.com, and 2. preserve the referrer information?
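
    A sketch of one workaround, since browsers don't reliably resend the original Referer header after a redirect: pass it along explicitly as a query parameter. The parameter name "ref" is arbitrary, and the tracking script on urlB.com would have to be taught to read it:

        <?php
        // index.php on urlA.com: carry the original referrer across the redirect
        $ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
        header('Location: http://www.urlB.com/?ref=' . urlencode($ref), true, 301);
        exit;
        ?>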

  • Is there a way to learn why Google penalized a site?

    - by pawelbrodzinski
    Is there any way to learn for sure why Google penalized a specific site? I'm thinking of the situation where the webmaster/site administrator is aware of Google's rules and is sure they aren't breaking any, but the site is penalized nevertheless. The only information you get from Google is that they processed your reconsideration request, but they neither say what the result is nor what the reason for the penalty is if they keep the site penalized. You can try to get information on the Google Webmasters forum or here, but most of the time these are only speculations. Assuming the site administrator has tried to find out what's wrong but failed, is there a source that can tell what the problem is?

  • Photos - do I really need to look for the author and ask his permission when posting them on my site?

    - by user6456
    When I find a photo somewhere on the internet, without any explicit information about whether I can re-publish it on my own website, and without any hint of who the owner/author of that photo is, can I still do it? I'm puzzled here because I've seen millions of websites, often very big ones, that repost photos most probably found via Google, and it's VERY unlikely they bothered to look for and contact the authors of those photos. Is every one of those sites liable to be sued at any moment? And what about forums and user-provided content, where there is virtually no way to prevent this?

  • What's a good host for an active vBulletin site?

    - by Kyle
    I've been switching hosts, using a VPS each time, and I'm just really not sure I'm finding the right VPSes. I've used a VPS from burst.net and rubyringtech, and I just feel like it's slowly killing my site because of the slow speed. I really don't know if it's the network or the VPS itself, but I really wish to fix this. When I top into the VPS at peak times it shows this:

        top - 03:18:56 up 16:33, 1 user, load average: 1.33, 1.40, 1.33
        Tasks: 30 total, 1 running, 29 sleeping, 0 stopped, 0 zombie
        Cpu(s): 27.2%us, 13.6%sy, 0.0%ni, 59.2%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 1048576k total, 679712k used, 368864k free, 0k buffers
        Swap: 0k total, 0k used, 0k free, 0k cached

    And pages take at least a good 2-3 minutes to load, with only 50-60 members on the forum. I had a shared hosting account before and the forum was lightning fast. Is a VPS a bad idea? What should I do to fix this? I'm running lighttpd with XCache, and the latest MySQL and PHP versions. The server is an Intel i7-2600 with a 1 Gbit uplink (I think the 1 Gbit uplink claim is a lie, because I've tested the network and the highest download speed I've seen was 20 MB/s, from a code.google page). All in all, I've seen people talking about Linode. Should I try them? I honestly don't need a dedicated server yet with only 50-70 members online. What should I do? I really want a VPS because I enjoy root access. Does anyone have any suggestions?

  • Tricky mod_rewrite challenge

    - by And Finally
    I list about 9,000 records on my little site. At the moment I'm showing them with a dynamic page, like http://domain.com/records.php?id=019031. But I'd like to start using meaningful URLs, like this one on Amazon: http://www.amazon.co.uk/Library-Mythology-Oxford-Worlds-Classics/dp/0199536325, where the title string at the root level gets ignored and requests are redirected to the records.php page, which accepts the ID as usual. Does anybody know how I could achieve that with mod_rewrite? I'm wondering how I'd deal with requests to my other root-level pages, like http://domain.com/contact.php, that I don't want redirected to the records page.
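
    A sketch of an .htaccess rule, assuming URLs of the form /some-title-string/019031; the file and directory checks are what keep real root-level pages like contact.php working untouched:

        RewriteEngine On
        # leave requests for files and directories that really exist alone
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # /any-title-string/019031 -> records.php?id=019031
        # (internal rewrite: the pretty URL stays in the address bar)
        RewriteRule ^[^/]+/([0-9]+)/?$ records.php?id=$1 [L,QSA]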

  • Safe way to send thousands of promotional emails

    - by Arsheep
    My new partner has an email list with over 1 million email addresses of targeted traffic (no spam, only genuine subscribers from his last startup). But now I have a problem: how can I send an email to all those addresses? I can't use my ISP's SMTP mailer; they will block me immediately for bulk mailing. I thought of a way to send the emails slowly, like dividing them into sets of a few thousand and sending to each set daily. Would that be a fine solution?
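
    A sketch of the throttling idea only; it does nothing for deliverability (SPF/DKIM, bounce handling, unsubscribe links), and a dedicated SMTP relay or email service is the usual real answer. $recipients and send_one() are hypothetical placeholders, and in practice you would run one batch per cron invocation rather than keeping a script alive with sleep():

        <?php
        $batches = array_chunk($recipients, 500); // 500 messages per batch
        foreach ($batches as $batch) {
            foreach ($batch as $address) {
                send_one($address); // hypothetical wrapper around your mailer
            }
            sleep(3600); // pause between batches so the relay isn't flooded
        }
        ?>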

  • Amazon EC2 vs a dedicated server at Hetzner: what's the use of EC2?

    - by C-Blu
    After searching the web I still can't find the reason to use EC2. What's the point of scaling on EC2? "If you expect a huge burst in traffic," they say. OK, but what if you already have a couple of sites with good traffic, and, for example, a medium reserved EC2 instance is not enough? You are paying $36.60 (medium reserved for 1 year) in the EU (Ireland), plus traffic, plus optional expenses for databases and S3 if you use them. Of course, up to a point, while you are under $56.60-$66.10, you can optimize your hosting costs with Amazon EC2. But if you purchase an EX4 server from Hetzner, it will surpass your performance needs for a long time before you get massive traffic. (Am I wrong?) CPU: i7-2600 quad-core (3.4-3.8 GHz); RAM: 16 GB; HDD: 2x3 TB SATA (6 Gbit/s), and I think the disk performance of a dedicated server is better than that of Amazon EBS; traffic: 10 TiB per month included. This is what you get from Hetzner for $56 (minus the 19% VAT), or $66 for EU residents. Please tell me, what's the reason to use Amazon? Which load won't a server from Hetzner take that Amazon Auto Scaling will? Is the maintenance of a dedicated server vs. EC2 still the same? Or won't a hardware failure at Amazon ruin your EBS storage? I'm still not at the level where I need expensive hosting, but I want to know beforehand, just to be sure whether the Amazon infrastructure is better than the pure performance of Hetzner's hardware.

  • How to configure Google sitelinks in WordPress? (without editing its HTML or PHP source code)

    - by Alexander Farber
    I run a WordPress 3.7.1 de_DE site, but don't have much experience with it yet. When my site comes up in a Google search, there are 2 sitelinks displayed underneath the result. One of them points to a non-existent page, /imprint, though, and I had to add a page at that URL as a workaround (and I actually want the URL to be /impressum anyway, since the site is in German and has German URLs). How can I configure these Google sitelinks in WordPress without editing its HTML or PHP source code?
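
    One hedged option that stays out of the PHP/HTML source: an .htaccess redirect (assuming the site runs on Apache), so the phantom /imprint link lands on the German page and Google eventually picks up the move:

        # permanent redirect from the phantom sitelink target to the real page
        Redirect 301 /imprint /impressum

    Demoting an unwanted sitelink was also possible in Google Webmaster Tools at the time, which requires no server changes at all.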

  • Is SEO affected negatively by having densely encoded identifiers of content in URLs?

    - by casperOne
    This isn't about where to put the id of a piece of unique content in URLs, but more about densely packing the URL (or, does it just not matter). Take for example, a hypothetical post in a blog: http://tempuri.org/123456789/seo-friendly-title The ID that uniquely identifies this is 123456789. This corresponds to a look-up and is the direct key in the underlying data store. However, I could encode that in say, hexadecimal, like so: http://tempuri.org/75bcd15/seo-friendly-title And that would be shorter. One could take it even further and have more compact encodings; since URLs are case sensitive, one could imagine an encoding that uses numbers, lowercase and uppercase letters, for a base of 62 (26 upper case + 26 lower case + 10 digits): 0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz For a resulting URL of: http://tempuri.org/8M0kX/seo-friendly-title The question is, does densely packing the ID of the content (the requirement is that an ID is mandatory for look-ups) have a negative impact on SEO (and dare I ask, might it have any positive impact), or is it just not worth the time? Note that this is not for a URL shortening service, so saving space in the URL for browser limitation purposes is not an issue.
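
    For concreteness, a sketch of the base-62 encoding described above, using the question's own alphabet (plain JavaScript; both printed values match the URLs in the question):

        var ALPHABET = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz';

        function encodeBase62(n) {
          if (n === 0) return ALPHABET[0];
          var out = '';
          while (n > 0) {
            out = ALPHABET[n % 62] + out; // peel off the least significant base-62 digit
            n = Math.floor(n / 62);
          }
          return out;
        }

        console.log((123456789).toString(16)); // "75bcd15", the hexadecimal form
        console.log(encodeBase62(123456789)); // "8M0kX", the base-62 form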

  • Do I need multiple accounts in Facebook for each of my product sites?

    - by John
    I have a dozen sites, including for-profit ones as well as charities. For each site I've created a Facebook company/charity account. After creating those accounts it dawned on me that I could just as well have created a page for each of my sites from my personal account alone, even if a site has multiple product pages. What's the right strategy? Also, per Facebook's terms we can have only a single personal account. I do have only a single personal account, but for each site I've created separate company pages. I hope I'm not violating Facebook's terms.

  • How to pass information across domains to ask for newsletter only once?

    - by Michal Stefanow
    Let's assume the following scenario. I have two sites: example1.com and example2.com. When a user visits 1, there is a prompt: "please sign up to our newsletter". The same thing happens when a user visits 2. However, when navigating from 1 to 2, I don't want the signup form to be shown. My first thought was 3rd-party cookies, but it seems they are blocked / not working: http://stackoverflow.com/questions/4701922/how-does-facebook-set-cross-domain-cookies-for-iframes-on-canvas-pages?rq=1 http://stackoverflow.com/questions/172223/how-do-i-set-cookies-from-outside-domains-inside-iframes-in-safari?rq=1 Another thought is to append #noshow to each URL, but that would require some work, for instance a script that intercepts click/tap events and modifies the URL structure depending on the address (which seems hacky). I wonder if you know a robust, well-established solution to this issue? Thanks
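
    A sketch of the link-decoration idea, which is essentially the #noshow approach made systematic and backed by a first-party cookie (both sites would include the snippet; the flag name nlseen, the domain example2.com and the function showNewsletterSignup are hypothetical):

        // 1. On arrival, turn the cross-site flag into a first-party cookie.
        if (/[?&]nlseen=1/.test(location.search)) {
          document.cookie = 'nlseen=1; max-age=31536000; path=/';
        }
        var alreadySeen = document.cookie.indexOf('nlseen=1') !== -1;

        // 2. Only prompt visitors who have never been flagged on either site,
        //    and remember that this site has now prompted them.
        if (!alreadySeen) {
          showNewsletterSignup();
          document.cookie = 'nlseen=1; max-age=31536000; path=/';
        }

        // 3. Decorate outbound links to the sister site with the flag.
        document.addEventListener('click', function (e) {
          var a = e.target.closest ? e.target.closest('a') : null;
          if (a && a.host.indexOf('example2.com') !== -1) {
            a.href += (a.href.indexOf('?') === -1 ? '?' : '&') + 'nlseen=1';
          }
        });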

  • In Google Analytics, how can I determine the value of a page if no goals or revenue have been determined?

    - by Brandon Durham
    I have 4 years of data in Analytics, with over 20 million pageviews for the entire site. No goals have ever been set up, and while the site is an ecommerce site, the ecommerce features of Google Analytics have never been used. So I have no way to determine what the actual value of a page is. I've been tasked with determining whether a particular page on the site is worth keeping around. How might I use the standard data (pageviews, bounce rate, time on page, time on site, etc.) to help determine the value of this page? I really appreciate any help I can get!

  • What is the average page size for a single-page application (SPA)? [on hold]

    - by Emmanuel Istace
    I'm developing a single-page application with a lot of CSS and JavaScript. For now the page is 1.3 MB, composed of 5 sections. Here are the rounded stats:

    Document: 10 kB
    Style: 60 kB
    Images: 450 kB (already compressed; includes a big gallery of thumbnails)
    JavaScript: 700 kB, of which 600 kB is "framework" (jQuery, jQuery UI, Bootstrap, Modernizr, Waypoints, ...) and 100 kB is custom JS
    Fonts: 125 kB

    And the site is not finished yet (it will include the Google Maps API and some other things). My questions are: Do you have any statistics about the average weight of an SPA? As this is the whole website, do you think it's acceptable? Is lazy loading (for the images) a solution, and what would the impact on SEO be? Is the "200 kB rule" of Google still relevant? Do you know good tools to detect which JavaScript code is not used during the execution of a page, and hence what could be trimmed from those 700 kB of framework JS? Can a caching strategy be an answer?

  • Sudden drop of pageviews/visit and increase of bounce rate in Analytics

    - by Tebb
    Google Analytics stats:

    04 June 2012:
    Visits: 4.423
    Unique visitors: 3.558
    Pageviews: 77.352
    Pageviews / visit: 17,49
    Visit length: 00:06:26
    Bounce rate: 1,09%

    05 June 2012:
    Visits: 4.652
    Unique visitors: 3.825
    Pageviews: 45.087
    Pageviews / visit: 9,69
    Visit length: 00:06:45
    Bounce rate: 19,60%

    From one day to the next, the bounce rate went from 1% to 19%, the pageviews dropped by half, and so did the pageviews/visit. The only thing I changed on the site (if I remember correctly) was an advertisement that used JavaScript. Could this be the reason? And if it is, how can I know which of the stats are the real ones?

  • Uploading image files: is client-side compression already possible?

    - by Chris
    When offering photo file uploading, the user will usually have badly compressed and huge (10+ megapixel) JPEG files from their camera or phone. On the server side, these files get re-compressed to something like 800x600 px and JPEG quality 7 or 8. Is it (already) possible to do that re-compression on the client side, so that I would only need to transmit some 100 kB (800x600 px) and not 3 MB or more? Something like: (1) with JavaScript's new FileSystem API (http://slides.html5rocks.com/#filewriter) it would be possible to read the photo file's data into client-side JS. (2) Then it would be necessary to re-encode the JPEG data, which is possible, but I could not find any library for that (yet). Does anybody know of such a library? (3) The last step would be to POST the re-compressed JPEG data to the server side for storage and get a URL to the stored photo file back from the server for inclusion in the client's HTML. I am looking for a jQuery plugin, another JS library, or an example web page that does this.
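
    In current browsers no separate JPEG library is needed: a canvas can do the decoding, the scaling and the JPEG re-encoding. A sketch using only standard APIs (FileReader, Image, canvas.toBlob, FormData, fetch); uploadUrl is a placeholder for your own endpoint:

        function recompressAndUpload(file, uploadUrl) {
          var reader = new FileReader();
          reader.onload = function () {
            var img = new Image();
            img.onload = function () {
              // scale to fit 800x600 while keeping the aspect ratio; never upscale
              var scale = Math.min(800 / img.width, 600 / img.height, 1);
              var canvas = document.createElement('canvas');
              canvas.width = Math.round(img.width * scale);
              canvas.height = Math.round(img.height * scale);
              canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
              // re-encode as JPEG at ~70% quality and POST the much smaller blob
              canvas.toBlob(function (blob) {
                var form = new FormData();
                form.append('photo', blob, 'photo.jpg');
                fetch(uploadUrl, { method: 'POST', body: form });
              }, 'image/jpeg', 0.7);
            };
            img.src = reader.result; // data: URL of the original file
          };
          reader.readAsDataURL(file);
        }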

  • Uniform URLs across different devices

    - by yanglifu90
    I noticed almost all of Stack Exchange's sites use the same URL in a mobile browser. I think this is cool, because when I share something from my phone, people viewing the link will not see a mobile webpage on their desktop. What is this technique called (is it specified by the W3C)? And how do I find other websites that use this technology? I noticed that Ars Technica and the Telegraph use the same URLs as their desktop versions.
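
    This is generally called responsive web design: one URL serves one document, and CSS3 Media Queries (which are a W3C specification) adapt the layout to the device. A minimal sketch with a hypothetical sidebar:

        <meta name="viewport" content="width=device-width, initial-scale=1">
        <style>
          .sidebar { float: right; width: 300px; }
          /* on narrow screens, stack the sidebar instead of floating it */
          @media (max-width: 600px) {
            .sidebar { float: none; width: auto; }
          }
        </style>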

  • Looking for a package allowing user-entered profiles

    - by Mark
    The title was a little hard to word, so take this as an example: a user goes to the site, creates an account, and then has his/her own profile to edit. Let's say the profile includes height, weight, sex, eye color, etc. I've really only used WordPress before, but I'm sure something else would cater to this better. The entire site is focused on each person having their own profile page with the info they supply. Thanks!

  • CSS hover behavior inconsistent on desktop/mobile devices [migrated]

    - by tbart
    I have a strange problem: this page looks good on desktop browsers, but the hover effect does not seem to work correctly, at least on my CM7 Android 2.3.7 device. I know hovering is not supposed to work on touch displays the way it does with a mouse, but I'd like to have touch feedback, i.e. the highlight color should show once the user has tapped a menu item. This does work when the link is just href="#", but it does not when it is a real link. I tried all sorts of stuff, as you can see, to no avail. If you go back in the browser history after having tapped a real link, the item is highlighted, so the browser understands the CSS I am throwing at it. However, the JavaScript alert makes it clear that the browser only seems to act on the link-opening action and does not care about the color-changing stuff. That is weird. Workarounds welcome, preferably without JavaScript, but if it has to be JS, then go ahead! Either go here: http://orpheus.co.at/hoverprob and use the source, Luke, or see it here in all its glory:

        <html>
        <head>
        <meta name="viewport" content="width=320">
        <style>
        #nav, #nav ul {
          width: 100%;
          float: left;
          list-style: none;
          line-height: 1;
          background: #fff;
          font-weight: bold;
          padding: 0;
          margin: 0 0 5px 0;
        }
        #nav a {
          display: block;
          color: #001834;
          text-decoration: none;
          padding: 5px 7px;
        }
        #nav li {
          float: left;
          padding: 0;
          width: 33%;
        }
        #nav li ul {
          position: absolute;
          left: -9999px;
          height: auto;
          margin: 0;
          opacity: .95;
          width: 100%;
        }
        #nav li a {
          text-align: center;
          height: 20px;
          line-height: 20px;
        }
        #nav li ul li a { text-align: left; }
        #nav li ul li {
          float: none;
          /* width: 316px; */
          width: 100%;
        }
        #nav li:hover ul ul, #nav li:hover ul ul ul,
        #nav li.sfhover ul ul, #nav li.sfhover ul ul ul { left: -9999px; }
        #nav li:hover ul, #nav li li:hover ul, #nav li li li:hover ul,
        #nav li.sfhover ul, #nav li li.sfhover ul, #nav li li li.sfhover ul { left: 0; }
        #nav li.educate {
          background: #FFF0B8;
          /* background: #FF0000; */
          /* border-radius: 5px; */
          border: 5px;
        }
        #nav li.educate:hover {
          background: #FFCE00;
          /* border-radius: 5px; */
        }
        </style>
        </head>
        <body>
        <div id="mobMenu">
          <ul id="nav" class="nav">
            <li class="educate"><a href="#">menu</a>
              <ul class="educate">
                <li class="educate"><a href="#">href=&quot;#&quot;, works</a></li>
                <!-- (+ empty onmouseover for iOS devices) -->
                <li class="educate"><a onmouseover="" href="index.html">does not work, real link</a></li>
                <li class="educate" id="bla"><a onmousedown="document.getElementById('bla').style.backgroundColor='Blue'; alert('Done'); document.location='index.html';" href="#">JS, not interpreted in correct order</a></li>
              </ul>
            </li>
          </ul>
        </div>
        </body>
        </html>
