Search Results

Search found 5380 results on 216 pages for 'webmasters'.

  • Any advantage to the script version of Google Adwords' conversion tracking code?

    - by ripper234
    Google AdWords has an HTML snippet to track conversions:

        <script type="text/javascript">
        /* <![CDATA[ */
        var google_conversion_id = 12345;
        var google_conversion_language = "en";
        var google_conversion_format = "3";
        var google_conversion_color = "ffffff";
        var google_conversion_label = "someopaqueid";
        var google_conversion_value = 0;
        /* ]]> */
        </script>
        <script type="text/javascript" src="http://www.googleadservices.com/pagead/conversion.js">
        </script>
        <noscript>
        <div style="display:inline;">
        <img height="1" width="1" style="border-style:none;" alt="" src="http://www.googleadservices.com/pagead/conversion/12345/?label=opaque&amp;guid=ON&amp;script=0"/>
        </div>
        </noscript>

    It is composed of two parts:

    - For clients supporting JavaScript, an inline script that sets variables, plus the loading of a reporting script.
    - For other clients, an image tag.

    As far as I can see, the image tag has some advantages: it works on all browsers, it is asynchronous, and it's shorter to keep only this version rather than both it and the JS version. Any reason not to drop the <noscript> tag and just use the image conversion snippet directly?
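
    For reference, dropping the wrapper would leave just the image beacon from the snippet above as the entire tracking code (the conversion ID and label are the same placeholders):

        <img height="1" width="1" style="border-style:none;" alt=""
             src="http://www.googleadservices.com/pagead/conversion/12345/?label=opaque&amp;guid=ON&amp;script=0"/>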

  • How to filter traffic coming to a particular page from another page?

    - by BishKopt
    I've got page A linking to page B. There are also other pages linking to B. How can I see the traffic that is coming to page B ONLY from page A? I can somehow do it via Behavior Flow:

    Behavior > Behavior Flow > [Right click on anything] > Explore traffic through here > [Click edit icon] > Define a page group > [Right click] > Group details > [Dropdown] > Incoming traffic

    But how do I do it in normal reports? Is there any way to filter out only the traffic coming from a particular page?

  • Displaying the same page, no matter what URI

    - by jgauffin
    We have moved a web application and would like to display a message on the old IIS server. Let's say that the application was at http://oldserver/appname/. How do I make sure that our moved.html is displayed to the user no matter which URI the user browsed to (in that virtual folder)?

    http://oldserver/appname/some/path.aspx -- should display http://oldserver/appname/moved.html
    http://oldserver/appname -- should display http://oldserver/appname/moved.html
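
    A minimal sketch of one way to do this, assuming IIS 7 or later with the URL Rewrite module installed (the rule name is illustrative):

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- Send every request in this app to the notice page,
                     except the notice page itself (avoids a rewrite loop). -->
                <rule name="ShowMovedNotice" stopProcessing="true">
                  <match url=".*" />
                  <conditions>
                    <add input="{URL}" pattern="moved\.html$" negate="true" />
                  </conditions>
                  <action type="Rewrite" url="/appname/moved.html" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>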

  • Web Safe Area (optimal resolution) for web app design?

    - by M.A.X
    I'm in the process of designing a new web app and I'm wondering what 'Web Safe Area' I should optimize the app layout and design for. By Web Safe Area I mean the actual area available to display the website in the browser (which is influenced by monitor resolution as well as the space taken up by the browser and OS). I did some investigation and thinking on my own but wanted to share this to see what the general opinion is. Here is what I found.

    Optimal Display Resolution:

    - w3schools web stats seems to be the most referenced source (however, they state that these are results from their own site and are biased towards tech-savvy users)
    - http://www.w3counter.com/globalstats.php (aggregate data from something like 15,000 different sites that use their tracking services)
    - StatCounter Global Stats Display Resolution (stats are based on aggregate data collected by StatCounter on a sample exceeding 15 billion pageviews per month, collected from across the StatCounter network of more than 3 million websites)
    - NetMarketShare Screen Resolutions (marketshare.hitslink.com) (a web analytics consulting firm; they get data from browsers of site visitors to their on-demand network of live stats customers. The data is compiled from approximately 160 million visitors per month)

    Display Resolution Summary: There is a bit of variation between the above sources, but in general, as of Jan 2011, it looks like 1024x768 is about 20%, while ~85% have a higher resolution of at least 1280x768 (1280x800 is the most common of these with 15-20% of the total web, depending on the source; 1280x1024 and 1366x768 follow behind with 9-14% of the share). My guess would be that the higher resolution values will be even more common if we filter on North America, and even higher if we filter on N. American corporate users (unfortunately I couldn't find any free geographically filtered statistics). Another point to note is that the 1024x768 desktop user population is likely lower than the aforementioned 20%, seeing as the iPad (1024x768 native display) is likely propping up those numbers (the app I'm designing is Flash-based, and Apple mobile devices don't support Flash, so iPad support isn't a concern). My recommendation would be to optimize around the 1280x768 constraint (note: 1280x768 is actually a relatively rare resolution, but I think it's a valid constraint range considering that 1366x768 is relatively common and 1280 is the most common horizontal resolution).

    Browser + OS Constraints: To further add to the constraints, we have to subtract the space taken up by the browser (assuming IE, which is the most space-consuming) and the OS (assuming WinXP-Win7):

    - Win7 has the biggest taskbar footprint at a height of 40px (XP's and Vista's is 30px).
    - The default IE8 view uses up 25px at the bottom of the screen with the status bar, and a further 120px at the top of the screen with the window title bar and the browser UI (assuming the default 'favorites' toolbar is present; it would instead be 91px without the favorites toolbar).
    - Assuming no scrollbar, we also lose a total of 4px horizontally for the window outline.

    This means that we are left with 583px of vertical space and 1276px of horizontal: in other words, a Web Safe Area of 1276 x 583. Is this a correct line of thinking? I'm really surprised that I couldn't find this type of investigation anywhere on the web. Lots of websites talk about designing for 1024x768, but that's only half the equation! There is no mention of browser/OS influences on the actual area you have to display the site/app. Any help on this would be greatly appreciated! Thanks.

    EDIT: Another caveat to my line of thinking above is that different browsers actually take up different amounts of pixels based on the OS they're running on. For example, under WinXP, IE8 takes up 142px at the top of the screen (instead of the aforementioned 120px for Win7) because the file menu shows up by default on XP, while in Win7 the file menu is hidden by default. So it looks like on WinXP + IE8 the vertical Web Safe Area would be a mere 572px (768px - 142 - 30 - 24 = 572).
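
    A quick way to sanity-check these numbers on any particular machine is to read the viewport size from JavaScript; a minimal sketch:

        <script type="text/javascript">
        // clientWidth/clientHeight on the root element give the area actually
        // available to the page, after browser chrome, OS taskbars and (in most
        // browsers) scrollbars are taken out.
        var safeWidth = document.documentElement.clientWidth;
        var safeHeight = document.documentElement.clientHeight;
        // screen.width/screen.height report the full display resolution, so the
        // difference is the chrome + OS overhead estimated above.
        alert('Web Safe Area: ' + safeWidth + ' x ' + safeHeight +
              ' (of ' + screen.width + ' x ' + screen.height + ')');
        </script>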

  • How to use rel=canonical with Sitecore aliases?

    - by Mike G
    I have inherited a Sitecore architecture that is a mess from an SEO duplicate-content point of view. Multiple aliases have been created (and indexed by the search engines) for many of the second-tier pages of the site. Due to server issues, I am not able to 301-redirect these duplicated pages, so I would like to use the rel=canonical tag in an attempt to get Google/Bing to recognize the correct pages that I would like to appear in the index. I have blocked the most extraneous duplicated pages with a robots.txt file; however, since Google/Bing have already spidered many of the duplicated pages, I need to keep them accessible to the spiders BUT removed from the index. The catch is, since the duplicated pages are aliases (and don't really physically exist in Sitecore, as far as I can find), I am not sure how to go about using rel=canonical, or whether I even can in this situation.
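
    For reference, the canonical hint itself is just a single tag in the <head> of each alias variant, pointing at the one URL that should be indexed; a sketch with a placeholder URL:

        <link rel="canonical" href="http://www.example.com/real-page/" />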

  • Buying country-specific domain names: .fr, .co.uk, .de, .com.au, .sg, etc.

    - by user700580
    I already bought a .com domain name on GoDaddy for my company. I would like to reserve the same name with country-specific domain extensions, but I'm not sure where to buy them and how to do it. Here are the ones that I would like to buy:

    Europe: fr, co.uk, de, ch, es, it, nl, se, no, ru
    Australia: com.au
    Asia: sg

    GoDaddy has all except one in Europe, Australia and Singapore. Should I find a website that sells all of them, or should I buy some of them on GoDaddy and others elsewhere? Any suggestions where to buy them? Until now I've only ever bought .com domain names, so I'm not sure how to do it. Thanks

  • Marketing for Scheduled Online Events

    - by JT703
    Last year I started working with a team on our first major web project (We, the Pixels). I believe the idea is very solid, but it has a hard requirement that a group of people be on the site for randomly scheduled events, and we are having problems getting people to come and stay for these events. What is the proper marketing approach to bring people to the site for them? We have recently done the following in an attempt to fix the problem:

    - Added email notification of new events being created
    - Added privileges based on rank
    - Added text throughout the site encouraging users to set events up further in the future, so other users have time to see that they exist
    - Gotten involved with other communities that would find the site interesting, in order to promote (market) the site
    - Advertised using Google AdWords

    Is there a standard marketing approach for a case such as this?

  • Do jQuery and MooTools usually conflict if both are used on a webpage? [migrated]

    - by Charming Prince
    I have this website I'm designing. I tried using MooTools 1.3.1 to animate some of the div boxes when clicked, or when the mouse hovers over them, to show the content. The thing is that it doesn't seem to work on the webpage, but if I try the same script on a blank webpage it works. I'm thinking it's probably because I have jQuery 1.5.2 on the same page and the two scripts are conflicting with each other, because if I remove the jQuery, the MooTools works. What should be my option? I need the jQuery to do some validations for me, so I can't remove it completely. Here are the codes:

        <script>
        //-vertical
        var mySlide = new Fx.Slide('test');

        $('slidein').addEvent('click', function(e){
            e = new Event(e);
            mySlide.slideIn();
            e.stop();
        });

        $('slideout').addEvent('click', function(e){
            e = new Event(e);
            mySlide.slideOut();
            e.stop();
        });

        $('toggle').addEvent('click', function(e){
            e = new Event(e);
            mySlide.toggle();
            e.stop();
        });

        $('hide').addEvent('click', function(e){
            e = new Event(e);
            mySlide.hide();
            e.stop();
        });
        </script>

    Here's the HTML:

        <h3 class="section">Fx.Slide Vertical</h3>
        <a id="slideout" href="#">slideout</a> |
        <a id="slidein" href="#">slidein</a> |
        <a id="toggle" href="#">toggle</a> |
        <a id="hide" href="#">hide</a>
        <div id="test">
            Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod
            tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam,
            quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo
            consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse
            cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat
            non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
        </div>

    Here's the CSS:

        #test {
            background: #222;
            color: #fff;
            padding: 10px;
            margin: 20px;
            border: 10px solid pink;
        }
        #test2 {
            background: #222;
            color: #fff;
            padding: 10px;
            margin: 20px;
            border: 10px solid pink;
        }

    I'm using the exact same code supplied by MooTools in their own example. If I do this on a blank webpage it works, but incorporated into my own webpage it doesn't, and my own page just has the script tag of the jQuery in the head section of the HTML.
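
    The usual culprit is that both libraries claim the global $. A minimal sketch of the standard workaround: load jQuery first, release $ with jQuery.noConflict() before MooTools loads, and keep using $ inside a ready handler for jQuery code (the script paths and the validation call are illustrative):

        <script type="text/javascript" src="jquery.js"></script>
        <script type="text/javascript">
        // Hand the global $ back before MooTools loads, so MooTools can
        // bind $ to document.id; use `jQuery` for jQuery code from here on.
        jQuery.noConflict();
        </script>
        <script type="text/javascript" src="mootools.js"></script>
        <script type="text/javascript">
        // Inside this ready handler, $ is locally rebound to jQuery,
        // so existing jQuery code keeps its $ syntax without clashing.
        jQuery(document).ready(function($) {
            $('#myForm').validate(); // hypothetical jQuery validation plugin call
        });
        </script>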

  • Google search does not show sub-pages from my website

    - by user5679
    My website appears in Google search, but only the first page. Of course I have sub-pages linked from the first page, but the sub-pages do not show in Google search. Not in Yahoo, not in Bing. What should I do? It has been three years and the sub-pages still do not show. (I tried searching site:mydomain.com and pressed the 'repeat the search with the omitted results included' link.) What would you suspect the reason is? My website addresses were like xxx.php?yy=zzz, etc., so I changed them to /yy/zzz using mod_rewrite. I thought it might be (X)HTML standard violations, so I have now fixed those too. I hope Google will soon have my entire website, but I am a little bit pessimistic. Do you have any thoughts?
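
    For reference, a minimal sketch of the kind of rewrite described, assuming an .htaccess file at the site root and the parameter naming from the example above:

        RewriteEngine On
        # Serve the clean URL /yy/zzz from the original script xxx.php?yy=zzz
        RewriteRule ^yy/([^/]+)$ xxx.php?yy=$1 [L,QSA]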

  • Apache2 and FTP

    - by Jo Colina
    I just set up an Apache web server on my Raspberry Pi, along with MySQL and PHP5, and to upload files I set up vsftpd. The thing is that the FTP connection sent me to my pi user's home directory instead of /var/www. So I changed the pi user's home directory to /var/www, and then changed it back to its previous home. FTP now sends me to /var/www, but whenever I upload files, the permissions for other users are null (Apache sends a 403 Forbidden every time unless I manually chmod the files uploaded via FTP inside /var/www). Does anyone know how to fix this? Thanks!
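
    A sketch of one way to straighten this out, assuming the stock Debian/Raspbian setup where Apache runs as www-data (the paths and umask are the usual defaults, not taken from the question):

        # In /etc/vsftpd.conf: point FTP sessions at the web root without
        # touching the pi user's real home directory
        local_root=/var/www

        # Make uploaded files world-readable by default (umask 022 -> 644)
        local_umask=022

        # Fix up what is already there:
        sudo chown -R pi:www-data /var/www
        sudo find /var/www -type f -exec chmod 644 {} \;
        sudo find /var/www -type d -exec chmod 755 {} \;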

  • Where can a list of desktop web browsers be found?

    - by Sn3akyP3t3
    I have another question posted regarding the practicality of whitelisting. In this question I'm simply looking for a frequently updated list of the most commonly used desktop web browsers, to use as part of my whitelist. I'm not trying to target any specific OS, so please show one, show all. The list of browsers for desktops isn't exploding, but it does grow. I've only recently been made aware of browsers that have multiple rendering engines, and I'm not always on top of the text-based browsers found out there either. I'm aware of the mobile browser platform, and there is an actively maintained list, used with regular expressions for identification purposes, that I will use, as well as whatever I can find for the desktop platforms.

  • Gaming Community CMS, with forum integration [closed]

    - by Tillman32
    Possible Duplicate: Which Content Management System (CMS) should I use?

    I've had a simple website that I coded myself for a while now; the site is a gaming community, and it's very forum- and news-driven. It was a HORRIBLE idea to take on coding this thing myself. Although we've used it for about a year now, we're just getting too big, and I need to streamline our work. I need writers to post news, etc.; I've been doing it through code (a year ago I thought it would be a cool idea). Anyway, I've been messing with just about every CMS out there, and I'm struggling to find something that I really like. The main things I need are a good news system and good forum integration. I'm sort of picky when it comes to looks; it's a curse. Reading on here, I see a lot of people saying Drupal is the best for the three things I need: news, community interaction, and forums. I think the main issues I ran into with Drupal were ease of use and themes. I am not a web designer, and I need a good theme. For an idea of what I'm looking for, go check out http://www.clgaming.net: they have forums integrated, a nice news area on the home page/news section, and nice user accounts. It looks very professional, and I doubt I'll get close to that with a free theme, but their functionality is exactly what I need. Any ideas would be greatly appreciated.

  • Wordpress with user login and file manager support

    - by Don
    This may be a RTFM kind of thing, so I'll apologize up front. I've been asked by a friend I used to freelance for whether there's a solution in WordPress where users can log in and then upload their own files in a "my docs" kind of area. I've never used WP, so before I dig into the docs I thought I'd see if anyone here can confirm, or maybe point me to a resource. It's one of those "I'll look it up at lunch and get back to you" things, which is why I'm bugging you all before reading the docs. Thanks

  • How can I clone or mirror a site without SEO penalties for duplicate content?

    - by Amanda
    I am a web developer and I want to create clones of the sites I've developed for clients, so that I have an "original copy" on a subdomain of my own website and can showcase my work to new clients. What is the best way to avoid getting my clients' original websites penalised for duplicate content? I am planning to have a robots.txt file that disallows all robots, as well as using <link href="http://www.client-canonical-site.com/" rel="canonical" /> in the <head> of the pages. Is that sufficient? Should I use rel=nofollow on all the links as well?
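
    For reference, the blanket-disallow robots.txt described above is just two lines, served from the root of the showcase subdomain:

        User-agent: *
        Disallow: /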

  • SEO for images: can I use a different (cookieless) domain?

    - by Oliver
    We want to increase the SEO value of some of our important images, and we want to start serving them from a different, i.e. cookieless, domain. We want to go from http://www.example.com/images/1234.jpg to http://www.example.com/germany/bavaria/landscape.jpg, which can easily be done via URL rewriting. On the other hand, we would like to serve the image from a completely different domain, let's say http://www.examplestatic.com/germany/bavaria/landscape.jpg, to save the overhead of sending the cookie from www.example.com. Somehow I feel that this is not a good idea, because I move the image away from the content by putting it on a different domain. Can anyone shed some light on this problem? Naturally, I would just use a different subdomain, e.g. img.example.com, but we already use subdomains for languages, and our cookies are valid for all subdomains of example.com, so this won't help. I'd really appreciate any hints.

  • Should the English website use href="x-default" when it doesn't auto-redirect to the user's language or country?

    - by Noam
    For each URL on my site, I'm auto-redirecting according to the Accept-Language header. The site architecture is:

    English version: http://mydomain.com/page
    Spanish version: http://es.mydomain.com/page
    etc.

    The English version is displayed unless the header specifies a language other than en that I support, in which case a redirect occurs. Google says this: "For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value 'x-default' as well." My pages aren't language selectors, nor are they the homepage, but I am auto-redirecting. My question is: should my English version be hreflang="x-default" or/and hreflang="en"?
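
    For reference, a sketch of the annotations if the English URL doubles as the default (the URLs follow the placeholders above):

        <link rel="alternate" hreflang="en" href="http://mydomain.com/page" />
        <link rel="alternate" hreflang="es" href="http://es.mydomain.com/page" />
        <link rel="alternate" hreflang="x-default" href="http://mydomain.com/page" />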

  • Is it illegal to use content in such a way?

    - by MHZ
    I have a couple of questions about the legality of the content of some websites. I am currently working on two websites, and I would like to make sure I am not breaking any laws by using some content as follows:

    1. Do I need a license to use images from the Internet (such as images.google.com) on my site, assuming they aren't a company logo belonging to another company? If not, am I allowed to use them after modifying them with image-editing software?
    2. Content such as phone numbers, e-mail addresses, website addresses, and text from websites can be found for free online. If I gather this information for a search-engine-based site that I am working on and offer it on a paid basis (similar to Google, but more specialized), is that legal?

    Note: I am not 'copying' or redirecting business away from anywhere to my site. The exact opposite: the site I am working on actually helps advertise businesses and makes it easier for customers to find them.

  • Changing domain name - what are the practical steps involved

    - by Homunculus Reticulli
    I launched a website a couple of years ago, bright-eyed and bushy-tailed, with dreams of conquering the world. Unfortunately it wasn't to be. Now that I am a bit older and wiser, and have spent some money on branding and creating more quality content etc., I am rebranding and relaunching the site with a new domain name. Although the traffic on the old site is laughable (i.e. non-existent), there are a few pages of good information on there, and I don't want to lose any "juice" those pages may have gained, because web crawlers have been seeing them for a few years now. OK, the upshot of all that is this: I want to change my domain name from xyz.com to abc.com. I am maintaining the same friendly URLs I had before; only the domain-name part of the URL will change, so that any traffic coming to an old page will be redirected to the new page seamlessly. How do I go about achieving this? That is, what are the steps I need to carry out to minimize any "disruption" to the credibility the existing site has with Googlebot etc.? I am running Apache 2.x on a headless Linux (Ubuntu) server.
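
    Since the paths stay identical, the usual approach on Apache is a site-wide permanent redirect in the old domain's virtual host (or its .htaccess); a minimal sketch using the placeholder names above, assuming mod_alias is enabled:

        <VirtualHost *:80>
            ServerName xyz.com
            ServerAlias www.xyz.com
            # mod_alias appends the rest of the path, so /foo/bar
            # is 301-redirected to http://abc.com/foo/bar
            Redirect permanent / http://abc.com/
        </VirtualHost>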

  • Use Google Apps mail with website on a separate web host

    - by Oxwivi
    We've bought a domain through the free Google Apps service and use the Gmail account provided. I'm thinking of hosting our own site on a separate web host, but I'm worried that emails sent to our domain might then be directed to the web host. Is there a way to continue using Gmail while the domain points to the web host? I've never dealt with domain names and web hosts before, nor am I experienced with web development (I will use a CMS).
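
    This is exactly the split the two DNS record types provide: A records decide where the website lives, MX records decide where mail is delivered. A sketch of a zone, assuming a hypothetical web-host IP and Google's standard Apps mail hosts:

        @    IN  A     203.0.113.10               ; website -> web host (hypothetical IP)
        www  IN  A     203.0.113.10
        @    IN  MX 1  ASPMX.L.GOOGLE.COM.        ; mail stays with Google Apps
        @    IN  MX 5  ALT1.ASPMX.L.GOOGLE.COM.
        @    IN  MX 5  ALT2.ASPMX.L.GOOGLE.COM.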

  • How long should I keep 301 redirecting pages from a deprecated domain?

    - by ElHaix
    I had an old domain that I have deprecated, but I 301-redirected all requests from it to my new site. The new site is now receiving a decent amount of traffic, but I don't know whether that traffic is 301-redirected from the old site, and doing a site:[old site] search still shows several thousand pages indexed. Since all pages from the old site are 301-redirected, will they ever be removed from the index as long as the old domain name is active? As a rule of thumb, somewhere I picked up 90 days for any significant site changes. When is it safe to burn the old domain?

  • How to deal with overly aggressive "Link Take Down Demands"?

    - by Eoin
    I've been receiving a large number of emails recently requesting that I clean up link spam on my forum. Initially the emails were very polite and professional, and I was happy to remove the links. Recently the emails have gotten very abrasive; here is a particularly rude example:

        From: [email protected]
        To: [email protected]

        Hi,

        This is the second time we are reaching out to you regarding your link to
        our site hxxp://www.company-two.com from hxxp://www.my-forum.com/some-topic-id.
        We really do need to remove this link. We have to report to Google any link
        we were unable to remove, and I wouldn't want to have to include your site
        in the list. Could you please remove our link from this page and any other
        page on your site?

        Thank You,
        Name Changed

    Behind the superficial pleasantries I feel there is some very real maliciousness:

    - Note the email address, DMCA Violations. I don't see how the DMCA is involved here, except as a phrase that tends to strike fear in many people.
    - Also relating to the email address, it doesn't match the company being linked to at all. How am I to trust they are truly operating on behalf of company-two when they don't even use one of its email addresses?
    - My email is hidden by PrivacyPost. While a service with legitimate uses, I feel it's highly unprofessional for communications between two companies.
    - The claim "This is the second time...": every email I've received has started like this, but a check of my spam filters has never revealed a first mail. Initially I gave them the benefit of the doubt; by now, though, it's clear this is a cheap ploy to start me off on the defensive.
    - And finally, worst of all: the threats of reporting me to Google if I don't do everything they ask.

    I sent a polite reply asking for more information. I have no idea if the email address was even valid, but I never received any response. Much later I got this follow-up mail:

        From: [email protected]
        To: [email protected]

        Hi,

        This is the final time we are reaching out to you regarding your link to
        our site hxxp://www.company-two.com from hxxp://www.my-forum.com/some-topic-id.
        We will soon be reporting to Google any link we were unable to remove, and
        currently your site will have to be on the list. Could you please remove our
        link from this page and any other page on your site? I appreciate your
        urgent attention to this matter.

        Thank You,
        Name Changed

    This time the from address was more personal, though still not obviously connected to the spammed company. Let's be honest: I don't for one second believe that the companies were the victims of a third-party spammer, as they claim. The links in question were generated well over a year ago, and I firmly believe the companies were directly responsible for the spam links, a type of spam that has plagued my forum. Now they have the audacity to demand I spend my time cleaning up their mess, using threats to ensure they get their way. Have recent changes in Google's algorithms meant that all the cash they spent spamming the web has now turned into a liability? If so, I can see why these companies are all of a sudden running scared. Frankly, cleaning up my forum is a good thing, but the threats they are using sicken me. So my question here is specifically about the threats: are they valid, and would such reports to Google destroy my page rankings? Is there a way I can report this abusive behaviour to Google?

  • Can I use a list of blog ping services for a portal?

    - by Ivanhoe123
    I'm setting up a list of ping services for a portal. It has a blog, a forum, articles, restaurants, hotels and much other information, so it is far more than a blog. I have a list of standard ping services for WordPress blogs, but I do not know whether these are meant literally only for blogs. My questions are:

    - Is it recommended to ping blog services, such as http://blogsearch.google.com/ping/RPC2, from a portal?
    - Are there any penalties for sites that are not recognized as blogs?
    - Is there some list of ping services for regular websites, not only blogs?

    Thanks!
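
    For reference, these endpoints all speak the same small XML-RPC protocol; a sketch of the weblogUpdates.ping request a portal would send (the site name and URL are placeholders):

        <?xml version="1.0"?>
        <methodCall>
          <methodName>weblogUpdates.ping</methodName>
          <params>
            <param><value>My Portal</value></param>
            <param><value>http://www.example-portal.com/</value></param>
          </params>
        </methodCall>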

  • New site not appearing in index after change of address, no feedback from google webmaster tools

    - by Duffy
    Our change of address seems to not be taking effect. Here's the story so far: we're a web company and our product is called The New Hive. Our site used to be at thenewhive.com, but we decided to switch to newhive.com (drop the "the", it's cleaner). The timeline of what I've tried, starting on July 29th:

    - Used 301 redirects for all pages (e.g. thenewhive.com/tag/art = newhive.com/tag/art). At this point we noticed that we had disappeared from search results when searching "The New Hive"; the front page used to be all links to our site plus a couple of news articles about the company.

    So on August 5th I:
    - Verified the new domain in Webmaster Tools (the old domain was already verified)
    - Submitted a change of address request with Webmaster Tools / Configuration / Change of Address

    Then after another week, on August 13th, I:
    - Went to Webmaster Tools / Health / Fetch as Google
    - Fetched our homepage and a couple of sub-pages, all successfully
    - Clicked "Submit to Index" for the homepage

    As of today (August 23rd) we're still not showing up in the index. We're getting no warnings or feedback of any kind from the dashboard, so I'm inclined to think something's broken with the dashboard rather than that something's wrong with our site from an SEO perspective. From the dashboard: no new messages or recent critical issues, and Crawl Errors shows no data available. From Health / Index Status:

        Total indexed      0
        Ever crawled       42,490
        Not selected       12
        Blocked by robots  0

    I'm really at a loss here; any help would be appreciated.

  • Chrome trims the last <li> element in a row [closed]

    - by Paul
    Ok guys, I give up. Here's the code I'm struggling with:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
            <meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
            <title>Blah</title>
            <style type="text/css">
                #container {
                    margin: 0 auto;
                    width: 350px;
                    border: 1px solid #ccc;
                }
                ul {
                    list-style-type: none;
                    margin: 0;
                    padding: 0;
                    text-align: center;
                }
                ul li {
                    display: inline;
                    padding: 5px;
                    margin: 0 1px;
                    background-color: lime;
                    line-height: 2em;
                    /* border: 1px solid red; */
                }
            </style>
        </head>
        <body>
            <div id="container">
                <ul>
                    <li>Element A</li>
                    <li>Element B</li>
                    <li>Element C</li>
                    <li>Element D</li>
                    <li>Element E</li>
                    <li>Element F</li>
                </ul>
            </div>
        </body>
        </html>

    Why the heck does Chrome trim the right side of "Element D" (even though there is enough space to display the whole item), while Firefox and even Internet Explorer render this code properly? It becomes more visible when we apply the commented-out border. In other words, is there a way to tell the browser that I want every single <li> element to be atomic, and thus moved to the next row if it doesn't fit entirely in the previous one? Can't wait to see the solution; thanks in advance.
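
    For what it's worth, the common way to make each <li> behave as one unbreakable box, wrapping to the next line as a whole, is display: inline-block instead of inline; a sketch of the amended rule:

        ul li {
            display: inline-block;  /* each item is now an atomic box */
            white-space: nowrap;    /* never break a line inside an item */
            padding: 5px;
            margin: 0 1px;
            background-color: lime;
            line-height: 2em;
        }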

  • part of google search appliance drawing from http instead of https

    - by mcgyver5
    We are using the Google Search Appliance in our web app. It is used by several other parts of our organization, but we are using it on a web app that runs over https. So we followed Google's instructions to load all the Google code via https, so that users don't get the annoying "This page contains both secure and insecure items" popup. Most of the Google code has behaved and come to us as https, but there is a part of it pulling from http://www.google.com/cse

    Full URL: http://www.google.com/cse?q=searchTerma&cx=001025153263958516519%3Aj2323tveixc&cof=FORID%3A11%3BNB%3A1&ie...

    that causes the insecure-items warning to pop up. The popup occurs on the results page, and the above URL is the only non-secure request I can find.
