Search Results



  • Do AdWords conversions only track AdWords visitors?

    - by atticae
    I have set up an AdWords conversion and put the conversion-tracking JS code on a test page. However, I don't see any conversions tracked when I visit it. Does AdWords conversion tracking only register a conversion if the visitor comes to the site by clicking on an AdWords campaign? Google's help page advises me to test the code by clicking a campaign, but I would like to use the tracking for all conversions, not just AdWords ones. I considered using Analytics as well, but it seems you can only track via URL there, not JS, which would mean restructuring part of my page (because currently a conversion does not necessarily happen on a different URL). Is AdWords tracking not a viable solution for tracking all visitor conversions on my site?
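
    As an aside on the JS-based route: classic Google Analytics can record a conversion from any traffic source via an event, which a GA goal can then be configured to match. A minimal sketch, with made-up category/action names:

        // Record a conversion for every visitor, regardless of how they arrived.
        // 'conversion' / 'signup-complete' are illustrative names.
        var _gaq = _gaq || [];
        _gaq.push(['_trackEvent', 'conversion', 'signup-complete']);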

  • GoDaddy one-page hosting

    - by liv a
    Disclaimer: not sure this is the right place for this kind of question; sorry in advance, just point me to the right place and I'll move it. At GoDaddy, when paying only for a domain, without hosting, they state you can get one-page hosting for free, but that option only opens their web builder. I want to create a nicely designed landing page where the content is static. Is there a way to make my domain point to a WordPress one-pager or a self-created one-page HTML landing page?

  • authorize.net SIM PCI compliance

    - by David
    Does anyone know if authorize.net's SIM rids you of having to be PCI compliant? The payment form is hosted on authorize.net's site and they process the payment. I know you can do a relay response, which basically puts some of the transaction details in a URL that goes back to your website (to display a receipt). I'm not sure what information gets put into the URL, though. I'm wondering if that makes you have to become PCI compliant?

  • Should I block bots from my site and why?

    - by Frank E
    My logs are full of bot visitors, often from Eastern Europe and China. The bots identify themselves as Ahrefs, Seznam, LSSRocketCrawler, Yandex, Sogou and so on. Should I block these bots from my site, and why? Which ones have a legitimate purpose in increasing traffic to my site? Many of them are SEO crawlers. If anything, I have seen less traffic since the bots arrived in large numbers. It would not be too hard to block them, since they all admit in their User-Agent that they are bots.
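
    For the well-behaved ones, blocking can be done in robots.txt. A sketch, assuming these crawlers honor it (the User-agent tokens below are the commonly published ones; verify them against your own logs, and note that rude bots ignore robots.txt and need server-side blocking):

        User-agent: AhrefsBot
        Disallow: /

        User-agent: SeznamBot
        Disallow: /

        User-agent: Sogou web spider
        Disallow: /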

  • URL length and content optimised for SEO

    - by Brendan Vogt
    I have done some reading on what URLs should look like for search engine optimisation, but I am curious how mine should look; I need some advice. I have a tutorial website, and my categories go something like: Web Development -> Client Side -> JavaScript. So if I have a tutorial called "What is JavaScript?", is it good to have a URL that looks something like:

    www.MyWebsite.com/web-development/client-side/javascript/what-is-javascript

    Or would something like this be more appropriate:

    www.MyWebsite.com/tutorials/what-is-javascript

    Just curious, because I also read that it is wise to have keywords in your URLs. Do I need to add the identifiers of each category in the link as well, something like:

    www.MyWebsite.com/1/web-development/5/client-side/15/javascript/100/what-is-javascript

    - 1 is the unique identifier (primary key) of the category "web development"
    - 5 is the unique identifier (primary key) of the category "client side"
    - 15 is the unique identifier (primary key) of the category "javascript"
    - 100 is the unique identifier (primary key) of the tutorial "what is javascript"

    UPDATE: This is not a programming question, so can someone please migrate this to the correct Q&A site without downvoting my question?
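
    Purely to illustrate the keyword-slug idea (every name here is made up), a slug builder that keeps keywords in the path and avoids numeric IDs could look like:

        // Turn "What is JavaScript?" into "what-is-javascript".
        function slugify(text) {
            return text
                .toLowerCase()
                .replace(/[^a-z0-9]+/g, '-') // runs of non-alphanumerics -> one hyphen
                .replace(/^-+|-+$/g, '');    // trim leading/trailing hyphens
        }

        // Build a keyword path from the category trail plus the title.
        function tutorialUrl(categories, title) {
            return '/' + categories.concat([title]).map(slugify).join('/');
        }

        // tutorialUrl(['Web Development', 'Client Side', 'JavaScript'], 'What is JavaScript?')
        // -> "/web-development/client-side/javascript/what-is-javascript"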

  • Scripts Causing Flash Intro Animation To Stop [migrated]

    - by ubique
    When my Flash website loads, it freezes halfway through the initial animation for 2-3 seconds and then continues. This obviously doesn't look great and I can't figure out what is causing it. I'm thinking it is one of the scripts in index.html causing the issue and have tried all sorts of ways to correct it - what have I done wrong?

        <!DOCTYPE html>
        <html lang="en">
        <head>
        <title>company name</title>
        . . .
        <link href="style.css" rel="stylesheet" type="text/css" />
        <script type="text/javascript" src="js/flashobject.js"></script>
        <!--[if lt IE 7]>
        <link href="ie6.css" rel="stylesheet" type="text/css" />
        <![endif]-->
        </head>
        <body>
        <header>
          <hgroup>
            <h1>company</h1>
            <h2>company</h2>
          </hgroup>
        </header>
        <div id="container">
          <div id="head">
            <div class="aligncenter">
              <a href="http://www.adobe.com/go/EN_US-H-GET-FLASH"><img src="http://www.adobe.com/images/shared/download_buttons/get_adobe_flash_player.png" alt="" /></a>
            </div>
          </div>
        </div>
        <div class="g-plus" data-href="https://plus.google.com/100925740920754223119?rel=publisher" data-width="170" data-height="69" data-theme="light">
        </body>

        <!-- Flash -->
        <script type="text/javascript">
        var fo = new FlashObject("main_v10.swf", "head", "100%", "100%", "8", "");
        fo.addParam("quality", "high");
        fo.addParam("allowFullScreen", "true");
        fo.write("head");
        </script>

        <!-- Hello Bar -->
        <script type="text/javascript" src="//www.hellobar.com/hellobar.js"></script>
        <script type="text/javascript">
        new HelloBar(39040, 52484);
        </script>

        <!-- GPlus -->
        <script type="text/javascript">
        window.___gcfg = {lang: 'en'};
        (function() {
            var po = document.createElement("script");
            po.type = "text/javascript";
            po.async = true;
            po.src = "https://apis.google.com/js/plusone.js";
            var s = document.getElementsByTagName("script")[0];
            s.parentNode.insertBefore(po, s);
        })();
        </script>

        <!-- Google -->
        <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-1']);
        _gaq.push(['_setSiteSpeedSampleRate', 10]);
        _gaq.push(['_trackPageview']);
        (function init() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
        })();
        window.onload = init;
        </script>
        </html>

  • general questions about link spam

    - by hen3ry
    Hello, a CMS-based site I manage is suffering from a small but ominously growing number of almost certainly bot-emplaced, invisible spam links placed in registered-user-only shoutboxes and user forums. "Link spam", yes? Until recently I've kept my eyes on narrow tech issues, and I'm having trouble understanding what's going on. I understand that we need to tighten up our registration procedures, but more generally:

    - Do I understand correctly that our primary interest in combating link spam on our site is that major search engines reduce or zero the search visibility of sites that contain it? Although we're non-commercial, we don't want to be at the bottom of the rankings, or eliminated altogether.
    - Are the linked-to sites the direct beneficiaries of the spam links, or is there some kind of indirection?
    - What is the likely relationship between the link spammers and the owners of the (directly or indirectly) linked-to sites? Are the owners paying the spammers for higher visibility? Are the owners aware that this method is being used?
    - It is my impression that major search engines can these days detect that a given site is being promoted by link spam, and may consequently reduce its rank or drop it altogether. Do these sanctions occur frequently?
    - Is there any potential value in sending notifications to the owners of the linked-to sites that their visibility is at risk?

    TIA, hen3ry

  • Sending a small number of targeted emails, is it spamming?

    - by Alex Mor
    I have a directory website and I want to send focused emails, a small number (fewer than 50 a month), to some of the businesses in my directory that get many visitors. The intention is to let them know many people are viewing their page, and to encourage them to update it and post information on it. How can I send this small number of emails without being flagged as spam? Also, should I send from an address on the website's domain, or would it be better to send from a personal email? That way, if an email does occasionally get tagged as spam, it won't hurt the website's reputation. Is this true?

  • Yahoo search: different results shown in two identical searches

    - by Marco Demaio
    Hello, simple question: searching on http://www.yahoo.it for villa matrimonio bologna, I noticed Yahoo shows different results. You need to retry a few times to reproduce this, maybe exiting the browser and opening it again, or searching once, clearing browser cookies, then searching again (it's even easier to test if you use two different browsers at the same time to search for the same phrase). Anyway, to make this easy to reproduce, here are the queries shown in the address bar after the search, so you can just click on these to see the results:

    http://it.search.yahoo.com/search;_ylt=AirvLYKvBMPP_6MpAmONN14brK5_?vc=&p=villa+matrimonio+bologna&toggle=1&cop=mss&ei=UTF-8&fr=yfp-t-709

    http://it.search.yahoo.com/search;_ylt=AirvLYKvBMPP_6MpAmONN14brK5_?vc=&p=villa+matrimonio+bologna&toggle=1&cop=mss&ei=UTF-8&fr=sfp

    Note the last parameter, fr, is different, but Yahoo set it (not me); I don't even know what it means. You can see in the search box that the searched phrase is IDENTICAL in both cases. So why is Yahoo giving different results for the same search phrase? I used the same browser and performed the test within a few minutes, simply by trying more than once. You may also notice that the number of results returned (written on the left side of the page) differs: the first search returns 274K results, the second 5.38M. You might think this is just an error on Yahoo's side, but for almost a year, while looking once in a while at how some websites rank on Yahoo and Google, I have noticed that two searches on the same phrase can show different results even on the same day, a few minutes or hours apart. I couldn't reproduce this behaviour on Google, so I can't say for sure, but since it seems to happen sometimes, I was wondering if any of you have noticed it too. Is this the normal behaviour of search engines? Because if it's normal (and it's just me noticing it only now), I wonder how you can tell how well a site ranks on a search engine; you could even see one of your customers' websites ranking differently from what the customer sees on his own PC.

  • Disqus-like comment server

    - by wxs
    Hi all, I'm looking at setting up a blog, and I think I want to go the static website compiler route, rather than the perhaps more conventional Wordpress route. I'm looking at using blogofile, but could use jekyll as well. These tools recommend using disqus to embed a javascript comment widget on blog posts. I'd go that route, but I'd rather host the comments myself, rather than use a third party. I could certainly write my own dirt-simple comment server, but I was wondering if anyone knew of one that already exists (of the open source variety). Thanks!

  • iFrame content pageviews not matching parent page pageviews

    - by surfbird0713
    I have a page with content hosted in an iFrame, both using the same GA account ID. When I look at the pages report, the parent page has about 9,000 unique views, but the iFrame content only has 3,700. Anyone have an idea what could cause that kind of discrepancy? My only guess is people moving on before the iFrame content has a chance to load, but the average time on page for the host page is 56 seconds, so that doesn't seem possible. This is the page in question: http://cookware.lecreuset.com/cookware/content_le-creuset-lid_10151_-1_20002 The flipbook is hosted in the iFrame on a separate domain. I have each page of the flipbook trigger a virtual pageview to evaluate engagement with the book: when the flipbook loads, it fires a pageview for the page it is on, so that is the page I'm using for the 3,700 number. I also looked at the source of the iFrame in the pages report, and that number just about matches the virtual pageviews, so that piece is consistent. Any ideas are much appreciated. Thanks!
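
    For reference, a virtual pageview in classic (ga.js) Analytics is just a _trackPageview call with an invented path; the path below is illustrative:

        // Fired by the flipbook when the reader reaches a given page.
        var _gaq = _gaq || [];
        _gaq.push(['_trackPageview', '/virtual/flipbook/page-2']);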

  • Is there any new method for link-backs [on hold]

    - by Mir Hammad
    As all SEOs know, Google is trying its best to kill manipulative SEO, and link-backs are quite a difficult task now. Content is the key, but my boss is still obsessed with link-backs. I cannot do directory posting, link exchange, paid linking, web 2.0, or blog commenting, as they count as spam now. I do not see what other choice I have except forum posting and article posting. Can someone suggest a new method to acquire link-backs? I know almost all the traditional methods, so don't say press releases, etc. If you really have something out of the box, or not very common, please share.

  • MySQL vs. SQL Server on GoDaddy: what is the difference between a hosted DB and an App_Data DB?

    - by Nate Gates
    I'm using GoDaddy for site hosting, and I'm currently using MySQL because there are fewer limits on size, etc. My question is: what is the difference between using a hosted GoDaddy DB such as MySQL and creating a SQL Server database in the App_Data folder? My guess is security? Would it be a bad idea to use a SQL Server DB that's located in the App_Data folder? Additionally, I am able to create an .mdf (SQL Server DB file) in the App_Data folder, but I'm really unsure if I should use that or not. If I did use it, it would simplify using some of the Microsoft tools. Like I said, my guess is that it would be less secure, but I don't really know. I know I have a 10GB file-system limit, so I'm assuming my DB would have to share that space.
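
    For context, an App_Data database is typically attached at run time with a connection string like the sketch below (the names are made up; SQL Server Express "user instances" must be supported by the host, which shared hosts often restrict):

        <connectionStrings>
          <add name="MyDb"
               connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\MyDb.mdf;Integrated Security=True;User Instance=True"
               providerName="System.Data.SqlClient" />
        </connectionStrings>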

  • Website restyle, SEO migration plan?

    - by Goboozo
    I am currently working on a project for one of my biggest clients. We have built a website that will replace the old website. When it comes to actual content, it is largely the same; however, the presentation of the content has changed drastically, and from our point of view it is much more user-friendly (the main reason to update the site). Since the site's presentation has changed, we have some major changes in:

    - HTML & CSS: to change the presentation of the content
    - URLs: to make them more understandable (301 redirects have been taken care of and are in place)
    - Breadcrumbs: to enhance navigation (we have made the breadcrumbs match the URLs exactly)
    - Pagination: added to enable content browsing
    - Title tags: added descriptive title tags to the major links and buttons

    Basically all user content, including meta tags, has remained the same. Since this company is rather successful and 90% of its clients come from Google's organic results, I am obliged to take all necessary precautions. People tell me I need a migration plan to prevent the site being hurt in Google, but I have never worked with such a plan... So, based on the above: would you consider a migration plan necessary, and what precautions/actions would you recommend to prevent us from losing our SERP positions? Many thanks in advance for your answers.
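
    Since the redirects carry most of the SEO weight in a move like this, a sketch of one 301 rule (assuming an Apache host; the paths are made up, and an IIS host would use web.config rewrite rules instead):

        # .htaccess: permanently redirect an old URL to its new home
        Redirect 301 /old-section/old-page.html http://www.example.com/new-section/new-page/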

  • How to protect SHTML pages from crawlers/spiders/scrapers?

    - by Adam Lynch
    I have A LOT of SHTML pages I want to protect from crawlers, spiders and scrapers. I understand the limitations of SSIs. An implementation of the following can be suggested in conjunction with any technology/technologies you wish: the idea is that if you request too many pages too fast, you're added to a blacklist for 24 hours and shown a captcha instead of content on every page you request. If you enter the captcha correctly, you're removed from the blacklist. There is a whitelist so GoogleBot, etc. will never get blocked. What is the best/easiest way to implement this idea? Server = IIS. Cleaning out the old tuples from a DB every 24 hours is easily done, so no need to explain that.
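
    One minimal sketch of the counting logic (plain in-memory JavaScript; the thresholds, names and whitelist entry are all made up, and on IIS this would live behind an HttpModule, handler or reverse proxy rather than run as-is):

        var WINDOW_MS = 10 * 1000;                // sliding window
        var MAX_HITS = 20;                        // requests allowed per window
        var BLACKLIST_MS = 24 * 60 * 60 * 1000;   // 24-hour ban

        var hits = {};      // ip -> array of request timestamps
        var blacklist = {}; // ip -> ban expiry time
        var whitelist = { '66.249.66.1': true };  // e.g. a known GoogleBot IP (illustrative)

        function shouldShowCaptcha(ip, now) {
            if (whitelist[ip]) return false;
            if (blacklist[ip] && blacklist[ip] > now) return true;
            var recent = (hits[ip] || []).filter(function (t) { return now - t < WINDOW_MS; });
            recent.push(now);
            hits[ip] = recent;
            if (recent.length > MAX_HITS) {
                blacklist[ip] = now + BLACKLIST_MS; // solving the captcha clears this entry
                return true;
            }
            return false;
        }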

  • CSS tags/media queries for Chrome on iPhone [migrated]

    - by Mick79
    So Chrome is here for iOS. Hoorah! However, due to its different screen layout (no footer toolbar), it messes with the ability to make a perfect layout for iPhone web pages. I have a site for my company that fit perfectly inside an iPhone screen, no scrolling required; it looked like an app. Now that Chrome is here (and wildly popular) with its different screen layout, sites that were sized for iPhone Safari look odd. Are there, or will there be, ways to single out Chrome from Safari and give them different CSS?
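
    There is no media query for this, but Chrome for iOS does identify itself with the token "CriOS" in its user-agent string. A minimal detection sketch (the class name is made up):

        // Tag the document so CSS can target Chrome for iOS separately,
        // e.g. .chrome-ios #container { ... }
        if (/CriOS/.test(navigator.userAgent)) {
            document.documentElement.className += ' chrome-ios';
        }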

  • How to get users to commit and collaborate to make a website valuable? [closed]

    - by AzizAG
    I own a website that requires a fairly large number of users to collaborate and contribute occasionally to make it valuable; basically, the website can't be valuable without users helping me put content on it. To avoid confusion: I'm thinking of websites like Wikipedia, Stack Exchange and Yahoo! Answers, where most of the content is based on peer effort. How do they actually get users interested and committed in the first place? What do I have to do to get users involved in the website and actually help me grow it?

  • List of usage information to collect in a web application

    - by Thomas Levine
    I'm writing a web application that will allow people to create accounts, edit stuff, send stuff to people, etc. I plan on recording things like when items were created and sent. Is there a list of usage information that one should collect in a web application? I'd like to see whether I'm missing something. Also, is there a list of usage information that I shouldn't collect (like information people consider private)?

  • Is RapidSSL wildcard cert supported by major browsers?

    - by Jorre
    I'm thinking of buying a wildcard SSL cert from ClickSSL: http://www.clickssl.com/rapidssl/rapidsslwildcard.aspx That would be a RapidSSL certificate, and I was looking through my Firefox options to see if RapidSSL is in the list of recognized authorities. My certificate manager doesn't mention RapidSSL anywhere. Am I looking for the wrong name, i.e. is RapidSSL recognized by browsers under a different name? I want to be sure that this certificate works in all major browsers (including IE6).

  • Google AdSense bombing

    - by tereško
    I made a website a few months ago and put Google AdSense on it, but someone kept clicking on the ads with bad intentions (AdSense bombing, I guess), and as a result Google disabled my AdSense account. Now I wonder how this all happened and what steps I should take in future to stop such attacks. Are there any tricks or tips to protect your website from a bombardment of invalid clicks? What methods do attackers normally use to bomb AdSense on a website?

  • Google Analytics - unable to get GA tracking working

    - by Pure.Krome
    We've been using GA for a few years with no problems. About 2-3 weeks ago we tried to clean up some of our tracking, and on one of our profiles it's not working anymore (since Oct 10). First, some context, then some GA debug output.

    1. Context. We have the following setup: different root domains AND different sub-domains on one of the root domains:

    - www.website.com
    - www.website.com.au
    - www.anotherWebsite.com
    - foo.website.com
    - baa.website.com

    Each root domain and each sub-domain gets its own tracking code. This way we can allow separate people (from outside our company) to access only their own data. E.g. the manager for foo.website.com can only see data related to that domain, not the other domains. We have one last account which is the SUM of all the domains; this is for us, so we can see total numbers. So two trackers fire on each page. The individual accounts all work fine - they seem to be tracking data OK. The 'global' account is not working and gives us the "Tracking Not Installed" error. This has been going on since Oct 10, so the wait-24/48/72-hours thing is waaaaay over.

    2. GA debug output. Installing the GA Debug Chrome extension gives the following (I've tried to hide anything that could be considered secret; UA-XXXXX34-1 is the global account, which isn't working any more, and UA-XXXXX34-11 is the specific account for www.website.com):

        _gaq.push processing "_setAccount" for args: "[UA-XXXX234-1]"
        _gaq.push processing "_setDomainName" for args: "[website.com]"
        _gaq.push processing "_setAllowLinker" for args: "[true]"
        _gaq.push processing "_trackPageview" for args: "[]"
        Track Pageview
        Tracking beacon sent! utmwv=--snipped--
        Account ID : UA-XXXX234-1
        Page Title : Some page title
        Host Name : www.website.com
        Page : /
        Referring URL : -
        Hit ID : 1923583969
        Visitor ID : 785310647
        Session Count : 51
        Session Time - First : Thu Aug 23 2012 15:20:17 GMT+1000 (AUS Eastern Standard Time)
        Session Time - Last : Mon Oct 29 2012 11:41:46 GMT+1100 (AUS Eastern Summer Time)
        Session Time - Current : Mon Oct 29 2012 12:19:23 GMT+1100 (AUS Eastern Summer Time)
        Campaign Time : Thu Aug 23 2012 15:20:17 GMT+1000 (AUS Eastern Standard Time)
        Campaign Session : 1
        Campaign Count : 1
        Campaign Source : (direct)
        Campaign Medium : (none)
        Campaign Name : (direct)
        Language : en-gb
        Encoding : UTF-8
        Flash Version : 11.4 r31
        Java Enabled : true
        Screen Resolution : 1050x1680
        Browser Size : 1033x861
        Color Depth : 32-bit
        Ga.js Version : 5.3.7d
        Cachebuster : 1846514973

    The same sequence then repeats for the second tracker (_setAccount, _setDomainName, _setAllowLinker, _trackPageview for "[UA-XXXX234-11]"), and a second beacon is sent whose fields are identical apart from Account ID (UA-XXXX234-11), Page Title (SomePageTitle) and Cachebuster (1580443754).

    And this is the JS code we have (BTW, it is inside <head></head>):

        <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(
            ['_setAccount', 'UA-XXXX234-1'],
            ['_setDomainName', 'website.com'],
            ['_setAllowLinker', true],
            ['_trackPageview'],
            ['b._setAccount', 'UA-XXXX234-11'],
            ['b._setDomainName', 'website.com'],
            ['b._setAllowLinker', true],
            ['b._trackPageview']
        );
        (function () {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
        })();
        </script>

    Finally, I've triple-checked that the UA strings are the correct text, and yes, the global account ends in -1 and the specific domain's in -11. Anyone have any suggestions to help?

  • Using hreflang to specify a catchall language

    - by adam
    We have a site primarily targeted at the UK market, and are adding a US-market alternative. As per Google's recommendations:

    "To indicate to Google that you want the German version of the page to be served to searchers using Google in German, the en-us version to searchers using google.com in English, and the en-gb version to searchers using google.co.uk in English, use rel="alternate" hreflang="x" to identify alternate language versions."

    Which gives us:

        <link rel="alternate" hreflang="en-gb" href="http://www.example.com/page.html" />
        <link rel="alternate" hreflang="en-us" href="http://www.example.com/us/page.html" />

    We do get enquiries from other areas of the world, particularly where there are expat communities (Dubai, UAE, Portugal etc). By adding the above tags, is there a risk that Google will only surface our site for UK and US search users? Do we need to specify a catch-all that will default all other searches to our UK site?
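
    For what it's worth, Google's hreflang documentation also defines an x-default value for exactly this catch-all case. A sketch, assuming the UK page should be the default for all unmatched searchers:

        <link rel="alternate" hreflang="x-default" href="http://www.example.com/page.html" />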

  • 2 Google Analytics profiles for 2 sections of the same site

    - by sam
    I've got a website which for the most part is a portfolio. Another section of the site, mysite.com/micro-site, ranks extremely well for its chosen term/topic and brings in lots of traffic, but actually has little to do with the core business; it was really made as a piece of content, in the same way sites like this are: http://chrome.com/campaigns/rollit For the main site I use one Google Analytics profile and set of tags; for the micro-site I have a completely different Analytics profile and set of tags. The main reason I've done this is that the traffic stats and insights for the micro-site are essentially just noise; it's nice to have the traffic, but it doesn't help when reading Analytics reports, so if they were combined my reports would be a mess. Are there any disadvantages/negatives to doing this?

  • Free forum engine with good anti-attack mechanisms

    - by macias
    I am looking for a forum engine (for discussions) with good attack countermeasures built in. Windows (preferably) or Linux. Free (as in beer). I am thinking about registration flooding and account-blocking attacks. For registration, such an engine should have at least:

    - a captcha
    - blocking of multiple registrations from the same IP
    - separate login (for logging in) and user name (for displaying the author of posts)

    For logging in:

    - no blocking on multiple tries; instead, after X tries, send a token via mail, the third piece needed for the next login (sketched below); without it, logging in will be impossible (similar to an activation process)

    The engine should be designed with two ideas in mind: protecting the engine against attacks, and zero penalty for decent users. Thank you in advance for your help and recommendations.
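
    A minimal sketch of that third-piece login rule (plain JavaScript; the limits, in-memory storage and mail helper are all assumptions):

        var MAX_TRIES = 3;
        var failedTries = {};   // username -> failed-attempt count (a real engine would persist this)
        var pendingTokens = {}; // username -> token that was emailed out

        function attemptLogin(user, passwordOk, token) {
            if (failedTries[user] >= MAX_TRIES && token !== pendingTokens[user]) {
                return 'token-required'; // never lock the account outright
            }
            if (!passwordOk) {
                failedTries[user] = (failedTries[user] || 0) + 1;
                if (failedTries[user] === MAX_TRIES) {
                    pendingTokens[user] = String(Math.random()).slice(2, 8);
                    // sendTokenMail(user, pendingTokens[user]); // assumed mail helper
                }
                return 'denied';
            }
            delete failedTries[user];
            delete pendingTokens[user];
            return 'ok';
        }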

  • Monitoring Domain Availability

    - by JP19
    How can I write a tool to monitor domain name availability? In particular, I am interested in monitoring the availability of a domain which is in PENDINGDELETE (or REDEMPTIONPERIOD, REGISTRY-DELETE-NOTIFY, PENDINGRESTORE or similar) status after its expiration date. Any suggestions or more information about PENDINGDELETE and similar statuses are also welcome, e.g. for what time frame can a domain remain in this status? I usually don't see a fixed pattern, or even a consistent correlation between the expiration date and this status.
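
    One hedged sketch of the polling side (Node.js; the WHOIS server shown is Verisign's for .com/.net, and the status-line format varies by registry, so treat the regex as an assumption):

        // Poll a WHOIS server (port 43: write the query, read the reply, connection closes).
        var net = require('net');

        function checkStatus(domain, callback) {
            var response = '';
            var socket = net.connect(43, 'whois.verisign-grs.com', function () {
                socket.write(domain + '\r\n');
            });
            socket.on('data', function (chunk) { response += chunk; });
            socket.on('end', function () {
                var match = response.match(/Status:\s*(\S+)/i); // e.g. "Domain Status: pendingDelete"
                callback(match ? match[1] : 'unknown');
            });
        }

        // Run this on a schedule; the drop itself is a registry-side race,
        // so polling only narrows the window, it cannot guarantee the catch.
        checkStatus('example.com', function (status) { console.log(status); });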
