Search Results


  • Display only visits referred from a Google AdWords campaign

    - by Adjam
    I want to display only the visits to my site that were sent by my Google AdWords campaign, preferably in the Visitors overview page. I've tried filtering with 'Advanced Segments', but when I select "Paid Search Traffic" the visit count drops to zero, even though I know that most of my visitors at the moment are being sent from Google AdWords. In this question the answer (which was not chosen) suggested adding an HTTP GET parameter or a URL shortener, but surely there is a way to do it in Analytics?
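
    One thing worth checking (an assumption on my part, since the question doesn't say): the "Paid Search Traffic" segment only matches visits that carry AdWords auto-tagging (the gclid parameter) or manual campaign tags. If auto-tagging is off, a manually tagged destination URL would look something like this, with example.com and the parameter values standing in for real ones:

        http://www.example.com/landing-page/?utm_source=google&utm_medium=cpc&utm_campaign=example_campaign

    Analytics classifies utm_medium=cpc as paid search, so tagged visits should then appear under that segment.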

    Read the article

  • Stop bots from crawling old links with extensions

    - by Jared
    I've recently switched to MVC3, which uses extension-less URLs, but Google and Bing have a wealth of links that they are crawling which no longer exist. So I'm trying to find out if there is a way to format robots.txt (or use some other method) to tell Google/Bing that any link that ends in an extension isn't a valid link... Is this possible? On pages that I'm concerned a user may have saved as a favorite, I'm displaying a 404 page that lists the links to use once they are redirected to the new page (I decided not to just redirect them, as I don't want to maintain these forever). For Google's/Bing's sake I do have the canonical tag in the header.

        User-agent: *
        Allow: /
        Disallow: /*.*

    EDIT: I just added the third line (above) and it APPEARS to do what I want: allow a path, but disallow a file. Can anyone confirm this?
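
    For what it's worth, both Googlebot and Bingbot document support for the * wildcard (and a $ end-of-URL anchor) in robots.txt, so the pattern should work for those two crawlers; crawlers that follow only the original standard will ignore it. A more targeted variant, assuming the dead URLs ended in specific extensions, might be:

        User-agent: *
        Disallow: /*.aspx$
        Disallow: /*.html$

    Note that Disallow: /*.* also blocks legitimate file URLs such as images, scripts and PDFs, so the extension-specific form is safer if those should stay crawlable.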

    Read the article

  • Are there any guidelines for laying out screen "real estate"?

    - by Corey
    I'm wondering if there is any information about creating a decent page layout so that your website will appeal to users of all resolutions. For example, the optimal width for pages: it seems like on my resolution, most websites have their content centered, covering about 80% of the page, which is easy on the eyes. Or maybe the height of the website's logo/header -- some sites I stumble upon have a huge logo with links or navigation under it, so that I need to scroll down to see the actual content, like articles or images (these sites don't keep me for very long). I understand that every user is different and may have browser extensions, page zoom, or may be running some ancient system that displays in 640x480. I'm not looking for a "best" solution, but rather some guidelines about designing to accommodate different resolutions. Basically, how can I make sure that I don't design a page where a paragraph displays as several easy-to-read lines on my resolution, but turns into a single line at 1920x1080 and becomes hard for the user to follow?
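
    One widely used convention that addresses the single-long-line problem (a sketch, not an official guideline; the 960px figure is just a common example value): center the content column and cap its width so line length stays readable on wide screens.

        /* center the main column and cap its width */
        .container {
            max-width: 960px;   /* example value, not a standard */
            margin: 0 auto;     /* center horizontally */
            padding: 0 1em;     /* breathing room on narrow screens */
        }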

    Read the article

  • What is the SEO impact of moving my domain to another IP address and what is the right way of doing this?

    - by ElHaix
    I am planning to move several websites to a new hosting provider - keeping the same URLs, but they will resolve to different IP addresses. For example, some sites are Canadian content-only sites, hosted on .CA domains sitting on Canadian IP addresses. I want to move these to Amazon servers, which have US IP addresses. The domain names will remain the same. (1) What is the SEO impact of this? (2) Will the sites lose some ranking if they are moved to a new IP address (Canadian or not), and if so, what is the cleanest way of accomplishing this (some kind of 301s)?

    Read the article

  • Can I enable GZIP on Godaddy?

    - by Dave
    One of my sites lives on GoDaddy's bottom-of-the-line cheap hosting. I have the correct code in my .htaccess file, but it's not compressing because mod_deflate is not loaded. How do I enable that? The best I've found is this article, which suggests I use PHP to gzip everything (which is going to be more work than just changing hosting companies): "Add the following code to the very top of your Web pages, above the DOCTYPE:

        <?php
        // compress output when the client accepts gzip
        if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip'))
            ob_start("ob_gzhandler");
        else
            ob_start();
        ?>
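
    If PHP on that plan runs as an Apache module (an assumption -- on the CGI/FastCGI setups common on cheap shared hosting, php_flag lines in .htaccess are ignored and a php.ini setting is needed instead), the same effect is available without touching every page:

        # .htaccess: gzip all PHP output via zlib (mod_php only)
        php_flag zlib.output_compression On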

    Read the article

  • What liability concerns do advertising vendors raise, and how can I address them?

    - by Beofett
    One of the websites I administer wants to provide free advertising in the form of direct links to vendors at an event they are running. Up until now, there has been no advertising whatsoever on the site (or any of our other sites). The site is for a for-profit business. The idea of implicit endorsement of any vendors we advertise has been raised, which brought up the question of what we need to do, if anything, to protect ourselves from any potential problems such endorsement might create. I know that many sites have clauses in their Terms of Service that state that (in a nutshell) they are not responsible for any problems or grievances between the visitors to the site and any vendor advertised or linked. Are there other steps that a website typically takes when considering advertising, such as getting the advertiser to provide some sort of certification that their ad will not violate any trademarks or copyrighted material?

    Read the article

  • Parsing Google site speed in Analytics

    - by Kevin Burke
    I'm having a hard time making heads or tails of the Site Speed graphs in Google Analytics. Our site speed fluctuates wildly from month to month, despite a large sample (the report is "based on 100,000's of visits") and a consistent web setup (static files served from an EC2 instance running nginx behind a load balancer). Here's our site speed, with each data point representing a week's worth of data. Over this time period we modified our source and HTTP headers to increase our cache hits on static resources by 5x. Why would it fluctuate so much? Is there any way to get more reliable information from these graphs?
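
    One knob worth knowing about (assuming the site uses the classic ga.js tracker): by default ga.js collects timing data from only about 1% of visits, so the sample behind the Site Speed graphs can be far smaller than total traffic suggests, which by itself makes them jumpy. The sample rate can be raised in the tracking snippet:

        // ga.js asynchronous syntax; UA-XXXXX-Y stands in for a real ID
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXX-Y']);
        _gaq.push(['_setSiteSpeedSampleRate', 10]);  // sample 10% instead of the default 1%
        _gaq.push(['_trackPageview']);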

    Read the article

  • Web developer has become uncooperative, what should we do to rescue our site? [closed]

    - by TOM
    What can an individual or a company do if the web designer who designed their website becomes completely uncooperative? In our case: He refuses to meet with us for discussions. He refuses to give us training on the effective use and management of the website he designed for us (and has been paid in full for), despite this training being part of the original contract. Last year he disappeared for a number of months, refused to answer emails or phone calls, and was totally unavailable to help us. He refuses to give us the details of the hosting of our website. We have totally lost faith in this arrogant and unreasonable guy and would like to break off all relationships with him, but, it appears, he's got us over a barrel.

    Read the article

  • How can I screen clients that try to register multiple times?

    - by Aba Dov
    My company offers a bonus to every client that registers. We would like to prevent people from abusing this by registering several times. We thought about filtering clients by IP (there is a problem with workplaces where all stations share the same IP) or by cookies (if cookies are not allowed, we might lose a client). I would like your opinions on these two methods and would be glad to hear about new ones. Thanks

    Read the article

  • Remove IP address from the URL of a website using Apache

    - by sapatos
    I'm on an EC2 instance and have a domain, domain.com, linked to the EC2 nameservers, and it happily serves my pages if I type domain.com in the URL. However, when a page is served, the URL resolves to 1.1.1.10/directory/page.php. Using Apache I've set up the following VirtualHost, following the examples provided at http://httpd.apache.org/docs/2.0/dns-caveats.html

        Listen 80
        NameVirtualHost 1.1.1.10:80
        <VirtualHost 1.1.1.10:80>
            DocumentRoot /var/www/html/directory
            ServerName domain.com
            # Other directives here ...
            <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
                Header set Cache-Control "max-age=290304000, public"
            </FilesMatch>
        </VirtualHost>

    However, I'm not getting any changes to how the URL is displayed. This is the only VirtualHost configured on this site, and I've confirmed it's the one being used, as I've managed to break it a number of times while experimenting with the configuration. The Route 53 entries I have are:

        domain.com    A      1.1.1.10
        domain.com    NS     ns-11.awsdns-11.com ns-111.awsdns-11.net ns-1111.awsdns-11.org ns-1111.awsdns-11.co.uk
        domain.com    SOA    ns-11.awsdns-11.com. awsdns-hostmaster.amazon.com. 1 1100 100 1101100 11100
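
    A VirtualHost by itself won't change what appears in the address bar; something has to redirect requests that arrive addressed to the bare IP (or links generated with the IP will keep it). A minimal mod_rewrite sketch, assuming mod_rewrite is enabled, reusing the IP and domain from the question:

        RewriteEngine On
        # permanently redirect anything requested via the raw IP to the domain
        RewriteCond %{HTTP_HOST} ^1\.1\.1\.10$
        RewriteRule ^/?(.*)$ http://domain.com/$1 [R=301,L]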

    Read the article

  • Root Domain Redirects Incorrectly To HTTPS instead of to WWW

    - by Ari
    TL;DR - Why do visits to my website homepage work without "www", but not visits to specific pages on it? I recently moved my website (Zappable.com) to a new web host, Red Hat's OpenShift (a PaaS). It requires using CNAME records to set up custom domains, something my domain name registrar (1&1) does not support without a hosting plan. So instead I set up CloudFlare in between my domain and web host, and created a CNAME record on it. I then pointed a 1&1 "www" subdomain to CloudFlare, and pointed my 1&1 root to the "www" subdomain. This works fine for visits to my homepage, but for some reason it does not work when visiting a specific page without "www": instead of adding "www", it goes to HTTPS, which is strange.

    Read the article

  • Do subdomains need to be defined through the domain registrar?

    - by Johnny
    I have bought a new domain name from GoDaddy. Let's say it is abcd.com. On GoDaddy's DNS management page, I changed the A (Host) record to @ = 74.125.232.215, which is www.google.co.uk's IP address. Now if I type www.abcd.com, it goes directly to www.google.co.uk. But if I type http://test.abcd.com, it cannot be loaded. Do I need to define every subdomain through GoDaddy? Is this how it works? P.S. Amazon EC2 automatically generates a subdomain for users to reach their virtual PCs; that cannot be registrar-dependent. P.S.2: The same question applies to using "www2" at the start of the URL.
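
    As far as DNS goes, every hostname does need a record to resolve, but that need not mean one manual entry per name: a wildcard record covers all otherwise-undefined subdomains. A sketch of the zone entries, reusing the IP from the question (the wildcard line is the addition):

        @      A    74.125.232.215    ; abcd.com itself
        www    A    74.125.232.215    ; www.abcd.com
        *      A    74.125.232.215    ; anything else: test, www2, ...

    Services like EC2 don't involve a registrar per machine either; they answer for their generated names from nameservers they control for their own zone.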

    Read the article

  • Meta description not displaying in custom site search results page

    - by Stephen Connolly
    We have Google Custom Site Search implemented on our company website. Looking at the results page, I noticed that the meta description is not being displayed; it just seems to be reading the link titles from our drop-down menu and using those as a description. When I search for the same page via google.com, the meta description is pulled in correctly. Any thoughts on why this might be happening? I can't see anything in the Custom Site Search settings.

    Read the article

  • Choosing a CSS grid/framework

    - by jonallard
    There are many grids and frameworks to choose from. A Google search for CSS frameworks will return a dozen articles that themselves list a number of frameworks to choose from. When it comes to choosing one, it's easy to get lost without having an intimate knowledge of all of them. What are the main factors that go into choosing a CSS framework, and how do those choices map to particular frameworks? More generally, how does one choose a CSS framework? Note 1: I'm using "grid" and "framework" almost interchangeably here, but there is probably one I should use over the other. Corrections on this are welcome. Note 2: I am well aware that some choices will depend on taste, and accordingly this question could turn into a "best of" contest/subjective topic. I'm trying to keep it as answerable as possible, as I'm pretty sure many people have this problem of choosing a framework, and an answer would benefit the community. As such, improvements to this question are welcome rather than just closing it.

    Read the article

  • Different robots.txt for two different domains pointing to the same folder

    - by Ali
    Hi, I have the following two domains: domain.com and test.domain.com. Both point to the same folder, "public_html". What I want is a different robots.txt file for each domain, so that when someone browses domain.com/robots.txt one file is shown, and when someone goes to test.domain.com/robots.txt a different file is shown. How can I do this using URL rewriting in .htaccess? Thanks
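
    A minimal .htaccess sketch (assuming mod_rewrite is available, and that a second file -- here called robots_test.txt, the name is just an example -- is uploaded alongside the regular robots.txt):

        RewriteEngine On
        # serve robots_test.txt whenever robots.txt is requested on the test subdomain
        RewriteCond %{HTTP_HOST} ^test\.domain\.com$ [NC]
        RewriteRule ^robots\.txt$ robots_test.txt [L]

    Requests on domain.com fall through the condition and get the ordinary robots.txt.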

    Read the article

  • Can't create site connection in Contribute. Administrator account removed from domain

    - by tribus
    I have been tasked with administering several sites through Contribute. The previous administrator has since left, and his domain account was removed. I believe I had been set up as an additional admin on these sites previously. However, when I attempt to create a connection through "My Connections" in Contribute using SFTP and my credentials, I get an error. I have verified that I have read/write access to these sites through FTP. How can I take control of these sites through Contribute so that I can start administering them? Is this related to the removal of the previous admin's account?

    Read the article

  • Shopping Cart URL Structure

    - by Drew
    With regard to URL structure for guests versus authenticated users: am I able to track traffic associated with both paths, but at the same time track total conversions going through the shopping cart? I have set up the following URL structure. Authenticated users follow this path:

        /cart
        /checkout
        /checkout-confirmation-ty

    Guests go like this:

        /cart
        /checkout-guest
        /checkout-confirmation-guest-ty

    Can I track the authenticated and guest users separately? Is this possible with Google Analytics?
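
    One common way to get both views in Google Analytics (a sketch, not the only option): define one goal per confirmation page for the separate counts, plus a third goal for the combined total using the "Regular expression" match type, with a destination pattern along the lines of:

        ^/checkout-confirmation(-guest)?-ty$

    Each goal can carry its own funnel (/cart, then the matching checkout step), so the two paths stay comparable side by side.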

    Read the article

  • Is this form of cloaking likely to be penalised?

    - by Flo
    I'm looking to create a website which is considerably JavaScript-heavy, built with Backbone.js, with most content being passed as JSON and loaded via Backbone. I just need some advice or opinions on the likelihood of my website being penalised for serving plain HTML (text, images, everything) to search engine bots and a JS front-end version to normal users. This is my basic plan for the site: the first request to any page returns HTML, which gives only about 1/4 of the page, and thereafter the remaining 3/4 is loaded with Backbone.js. Therefore non-JavaScript users get a 'bit' of the experience. Once a new user has visited and been detected to have JS, a cookie is saved on their machine, and requests from then on are AJAX only. Example:

        If (AJAX || HasJSCookie) { // Pass JSON }

    Search-engine-served content: that entire experience of loading via AJAX is stripped if, for example, a Google bot is detected; the same content is served, but all as HTML. I thought about just allowing search engines to index the first 1/4 of the content, but as I'm concerned about inner links and picking up every bit of content, I thought it would be better to give search engines the entire content. I plan to do this by checking against a list of user agents to know whether it's a bot or not:

        If (Bot) { // serve plain HTML }

    In addition, I plan to use clean URLs for the entire website despite full AJAX; serving AJAX content at www.example.com/#/page and normal HTML at www.example.com/page is therefore kind of out of the question. I would rather avoid the practice of using # when technologies such as HTML5 pushState are around. So my question is really just asking the opinion of the masses: is it likely that my website will be penalised? And do you suggest an alternative that avoids the 'noscript' method?
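
    For reference, a minimal sketch of the pushState fallback mentioned above (loadContentViaAjax is a hypothetical stand-in for the Backbone routing/loading code):

        // use clean URLs where the History API exists, hash URLs elsewhere
        function navigate(path) {
            if (window.history && window.history.pushState) {
                window.history.pushState(null, '', path);  // e.g. /page
                loadContentViaAjax(path);
            } else {
                window.location.hash = '#/' + path;        // e.g. #/page
            }
        }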

    Read the article

  • DNS slows down on development environment

    - by Sequenzia
    I have a local development environment set up on my Mac. I am running an Ubuntu web server inside a VirtualBox VM, and I set up a hosts file entry on my Mac that points my dev site to the IP of the Ubuntu virtual server. Everything works well except that a lot (not all) of the time, it takes more than 5 seconds to load a page. I used Firebug to track down where the problem is, and when it's slow, the DNS part of my request is taking over 5 seconds. Like I said, it's not all the time: sometimes it resolves and loads the page within milliseconds; the same page will load super fast on one click and then take over 5 seconds the next. It's really slowing me down and I am not sure what is causing it.
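
    One pattern worth ruling out (an assumption based on the symptom: roughly five seconds is a typical resolver timeout, and the fast loads look like cache hits): the dev hostname may be taking a detour before the hosts file answers. On OS X, names ending in .local are handed to Bonjour/mDNS first, which is a well-known source of multi-second delays. A hosts entry avoiding that suffix might look like this (the IP and name are examples):

        # /etc/hosts on the Mac
        192.168.56.101    dev.mysite.test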

    Read the article

  • Email provider - suggestions needed

    - by Christian Fazzini
    We are looking for a good way to handle email. In theory, we need to allow end users to send emails directly to support and careers, i.e. support@domain_name_here.com and careers@domain_name_here.com. Second, we need to provide email to our staff, so each staff member has their own address, i.e. joe@domain_name_here.com, meghan@domain_name_here.com, etc. Google Apps is one option we are considering. However, they charge $50 per user, per year. Not so bad, considering the quality and the features they offer. However, there are also cheaper alternatives, e.g. my domain registrar offers an email plan for $20/year for 10 mailboxes, and GoDaddy has a number of plans, all still a lot more affordable than Google Apps. So far Namecheap and GoDaddy are the only ones I have looked at for email plans. Is it worth signing up with Google Apps? Or are there better alternatives? Your thoughts?

    Read the article

  • Good URL practice for category/subcategory?

    - by Seting
    I have developed an application and I need to make its URLs SEO-friendly. I have the following URL structure: http://localhost:3000/posts/product/testing-with-slug-url-2 and http://localhost:3000/posts/product/testing-with-slug-url-2-4-23. Is this good practice? If not, how can I rewrite it? OK, I'll explain my application. It is a shopping application. If a customer searches for mobiles, they are redirected to a URL like this: http://mydomain.com/cat/mobile-3. The 3 in the URL is my database id, used for further searching. After the user reaches the mobile page, he may want to filter by some brand, e.g. Nokia, so my URL looks like: http://mydomain.com/subcat/nokia-3-2. The integers at the end refer to category id 3 and brand id 2. My doubt is whether the integers at the end of the URL will affect SEO ranking.

    Read the article

  • Software for video subscription service

    - by Clinton Blackmore
    I'd like to sell instructional videos over the web. Primarily, I'd like users to subscribe to the site and get access to videos over the internet. Secondarily, I might sell DVDs for those who have poor internet connections or would like a physical copy, or possibly I'd sell ebooks and the like in the future. Regarding the subscriptions:

        - I'd like a system that automatically sends out emails when it is time to renew
        - I'd like to be able to offer free trials
        - Users without a free trial or subscription should not be able to access the content

    Incidentally, I plan to host videos on my current web host and move them to a CDN when volume (and capital) makes this a good idea. While I have no intention of going crazy with DRM, it seems expedient not to link directly to the files -- how can I link to them indirectly? It would be nice to support multiple payment processors -- specifically, I'd like to avoid a PayPal-only approach. Are there any web applications (or plugins) you'd recommend for something like this? While I've set up and administered several web technologies, I've never done anything with e-commerce. I see there are possibilities like osCommerce, one friend recommends using WordPress with plugins, and it really appears that for any given CMS you can graft on components like this, although I imagine that not all are created equal. As I'm not tied to a particular web application (and while open source software that can run on a LAMP [P = Perl, Python, PHP] stack is preferable), I'd like to make a good choice at the beginning.
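
    On the indirect-linking point, one common pattern (a sketch, independent of whichever cart is chosen): keep the video files unreadable by direct request and hand them out through a server-side script that checks the subscription first. With Apache, the deny side might look like:

        # .htaccess in the videos directory: refuse direct requests;
        # a session-checking script streams the files instead
        <FilesMatch "\.(mp4|m4v|flv)$">
            Order allow,deny
            Deny from all
        </FilesMatch>

    Storing the files above the web root achieves the same thing with no configuration at all.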

    Read the article

  • Displaying Google Analytics data on my website

    - by anon-user0
    I sell ad space on my websites directly to advertisers. I want an ad page that shows Google Analytics information and updates automatically, without me having to update it manually every day or every month. Something like this: http://wstats.net/en/website/riverplate.com#stat_trafic I don't want to use embedded or iframed third-party services. I know Google has a public API and you can connect it to the Google Charts API to show pretty graphs. There is a tutorial by Google on how to do it here: https://developers.google.com/analytics/resources/articles/gdataAnalyticsCharts A few problems: I don't know much JavaScript, and the JavaScript seems to prompt for authentication as opposed to logging in automatically (from my understanding of the comments on the code). Does anyone know of a ready-made script that does what I am looking for, or how I can fix this code so that it displays analytics info without authenticating? Thanks.
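
    The usual way around the authentication prompt (a sketch of the idea, not the tutorial's code): authenticate server-side on a schedule, write the numbers to a static file, and let the public page read only that file, so no Google credentials ever reach a visitor's browser. The client half could be as small as:

        // stats.json is a hypothetical file that a scheduled server-side
        // job refreshes with numbers pulled from the Analytics API
        var xhr = new XMLHttpRequest();
        xhr.onload = function () {
            var stats = JSON.parse(xhr.responseText);
            document.getElementById('visits').textContent = stats.visits;
        };
        xhr.open('GET', '/stats.json');
        xhr.send();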

    Read the article

  • Filtering your offices' IPs from Google Analytics when each has a dynamic IP?

    - by leeand00
    I found the documentation for filtering IPs from Google Analytics, but the several locations of our company all have dynamic IP addresses that change every 30 days, from what I'm told. I know from working with dynamic DNS that the provider usually gives you a script, which you configure your router to run when its IP address changes or when it is restarted, that passes the new IP address to the DDNS server. I'm wondering if there might be a way to write (or use a preexisting) script to do the same thing with the Google Analytics API.
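
    An alternative that sidesteps IP updates entirely (a different technique from the one asked about, but worth considering): tag each office browser once with a user-defined value and exclude on that instead. With the classic ga.js tracker, an internal-only page visited once per machine could run:

        // sets the "User Defined" value (stored in a cookie); a GA filter
        // that excludes User Defined = 'internal' then drops these visits
        _gaq.push(['_setVar', 'internal']);

    Since the value lives in a long-lived cookie, it keeps working no matter how often the office IPs change.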

    Read the article

  • Google keeps indexing /comment/reply URLs

    - by jaypabs
    With the new update of the Google algorithm, called Penguin, I think my site is being penalized for webspam. But of course I don't create posts that look like spam to Google; I think it is just how Google indexes my site. I found out that Google indexes URLs on my site like: http://www.example.com/comment/reply/3866/26556 So there are many comment/reply URLs indexed by Google. I have already added:

        Disallow: /comment/reply/
        Disallow: /?q=comment/reply/

    but Google still indexes these URLs. Any idea how to prevent Google from indexing these comment pages?
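
    One catch worth knowing, and a likely explanation here: Disallow stops crawling, not indexing, so URLs Google already knows can remain in the index as URL-only entries precisely because the crawler is no longer allowed to look at them. The usual fix is the opposite: let the pages be crawled and serve a noindex signal. A sketch for Apache (assuming mod_setenvif and mod_headers are available, and that the Disallow lines above are removed so Googlebot can actually see the header):

        # mark comment-reply requests, then send a robots noindex header
        SetEnvIf Request_URI "^/comment/reply/" COMMENT_REPLY
        Header set X-Robots-Tag "noindex, nofollow" env=COMMENT_REPLY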

    Read the article
