Search Results

Search found 11896 results on 476 pages for 'smart pro'.


  • AdWords: Is there a drawback to setting a really high max CPC to learn what works faster?

    - by Rob Sobers
    I'm toying with raising my max CPC really high on all my keywords to ensure my ad gets shown in the top spot on page one in order to draw more clicks. I think this will be a good way to quickly figure out whether the ads I'm writing have a decent CTR and, more importantly, whether the landing pages I'm building are converting. Since I can set a max daily budget for my campaign, I won't risk breaking the bank. I can't think of any drawbacks, personally. Am I missing any?

    Read the article

  • Would Google consider this a "link scheme"?

    - by Abe Miessler
    I work for a County Government. In an effort to improve our search engine rankings, I was considering contacting some of the other Counties in my state and seeing if we could set up an arrangement where we link to them and they link to us. It seems like a reasonable thing to have an "Other Counties" page with various links on a County website, but it also seems like it's bordering on a link scheme according to Google. When I read Google's article about link schemes, it seems like we hit some of the right reasons for having links on another County's site, but also some of the wrong ones. Does anyone have any insight into whether this is a good or bad idea?

    Read the article

  • A good resource to get the most out of Google Analytics

    - by glinch
    I was wondering if anyone could offer me some advice on the best resources out there (ideally books) on Google Analytics. I have a basic understanding but have a lot of room for improvement. The book "Advanced Web Metrics with Google Analytics" by Brian Clifton appears to be a good starting point, but it is already quite dated even though it was published in March 2010. Any advice would be greatly appreciated.

    Read the article

  • Google search does not show sub-pages from my website

    - by Chang
    My website appears in Google search, but only the first page. Of course I have sub-pages linked from the first page, but the sub-pages do not show up in Google search. Not in Yahoo, not in Bing. What should I do? The sub-pages have not shown up for three years. (I tried searching site:mydomain.com and clicked the 'repeat the search with the omitted results included' link.) What would you suspect the reason to be? My website addresses were like xxx.php?yy=zzz, so I changed them to /yy/zzz using mod_rewrite. I also thought it might be (X)HTML standard violations, so I have now fixed those. I hope Google will soon index my entire website, but I am a little bit pessimistic. Do you have any thoughts?
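
    For reference, a rewrite like the one described above is usually just a few lines in .htaccess. This is only a minimal sketch under the assumption of an Apache host with mod_rewrite enabled; xxx.php and the yy/zzz segments are the placeholders from the question, not real paths:

        RewriteEngine On
        # Don't touch requests for real files or directories
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # Map clean /yy/zzz paths onto the original query-string script,
        # e.g. /category/widgets -> xxx.php?category=widgets
        RewriteRule ^([^/]+)/([^/]+)/?$ xxx.php?$1=$2 [L,QSA]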

    Read the article

  • Website Suddenly Stopped Showing in Google Search Results

    - by Aman Virk
    I have a design and development blog, http://www.thetutlage.com (1.5 years old), which was doing really well in Google search; I was getting over 70% of my traffic from Google. Suddenly, over the last two days, that share has dropped from 70% to 20%, and when I search for the exact posts that I created, even after appending my website name, no results show up. Sample search text: JQuery Game Programming Creating A Ping Pong Game Part 1. I have a post with that exact title and it does not show up anywhere in Google search. I am totally shocked; I write my own unique content and follow the Google guidelines like a bible. Also, there is no message under my Webmaster Tools account stating any problem or error.

    Read the article

  • How to block FeedReader from fetching my content onto their site?

    - by Wei Kai
    As you can see from the picture below, my site's content is duplicated by FeedReader and indexed by Google. When I click the FeedReader link, it uses some sort of iframe to draw content from my site live. This amounts to content duplication, and I believe it harms my site. (Stack Overflow doesn't allow me to post the image because my account is new; please click the link above to see the picture, a million thanks to you.) What can I do to prevent FeedReader from fetching my content onto their site? I know robots.txt can perform such a function, but I don't know how to do it. Any help would be much appreciated. I also raised this issue with FeedReader two days ago, but I have yet to get any reply from them.
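
    For what it's worth, blocking a single crawler in robots.txt is just a user-agent block. This is only a sketch and assumes the crawler identifies itself with a user-agent token containing "FeedReader"; the real token would need to be confirmed from the site's access logs, and robots.txt only governs well-behaved crawlers (it cannot stop a page from being pulled into an iframe by a visitor's browser):

        # robots.txt at the site root
        User-agent: FeedReader
        Disallow: /

        # Everyone else remains unaffected
        User-agent: *
        Disallow: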

    Read the article

  • Put a link on the nav bar in WordPress

    - by Rafe Kettler
    I have a WordPress blog. On the same domain, I have some other stuff hosted that isn't part of my WP install. I want to link to those other places on my domain from the top menu bar (nav bar) on my blog. How can I do that? The theme is Lightword; the relevant header.php code follows:

        <body <?php body_class(); ?>>
        <div id="wrapper">
          <?php lightword_header_image(); ?>
          <div id="header">
            <?php lightword_rss_feed(); ?>
            <div id="top_bar">
              <div class="center_menu">
                <ul id="front_menu" <?php
                  global $lw_remove_searchbox, $lw_use_wp_menus;
                  $lw_menu_width = "";
                  if ( $lw_remove_searchbox == "true" ) $lw_menu_width = " class=\"expand\" ";
                  echo $lw_menu_width;
                ?>>
                  <?php echo lightword_homebtn(__('Home','lightword')); ?>
                  <?php
                  if ( function_exists('wp_nav_menu') && $lw_use_wp_menus != "true" ) {
                      $lightword_menu = wp_nav_menu( array(
                          'menu'           => 'lightword_top_menu',
                          'echo'           => false,
                          'menu_id'        => 'front_menu',
                          'container'      => '',
                          'theme_location' => 'lightword_top_menu',
                          'link_before'    => '<span>',
                          'link_after'     => '</span>'
                      ) );
                      $lightword_menu = preg_replace(
                          array( '/^<ul id="front_menu" class="menu">/', '/\n<\/ul>$/' ),
                          '',
                          $lightword_menu
                      );
                      echo $lightword_menu;
                  } else {
                      echo lightword_wp_list_pages();
                  }
                  ?>
                </ul>
              </div>
              <?php echo lightword_searchbox(); ?>
            </div>
          </div>
          <div id="content">
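
    One way to do this without editing header.php at all is to append items through WordPress's wp_nav_menu_items filter, e.g. from the theme's (or a child theme's) functions.php. This is only a minimal sketch, and it assumes the theme actually builds the menu with wp_nav_menu using the 'lightword_top_menu' location shown above; the URL and label are placeholders:

        <?php
        // Hypothetical addition for functions.php: append an extra item to the
        // Lightword top menu. The <span> wrapper mirrors the theme's
        // link_before / link_after settings shown in header.php above.
        function my_extra_top_menu_links( $items, $args ) {
            if ( isset( $args->theme_location ) && $args->theme_location == 'lightword_top_menu' ) {
                $items .= '<li><a href="http://example.com/other-stuff/"><span>Other Stuff</span></a></li>';
            }
            return $items;
        }
        add_filter( 'wp_nav_menu_items', 'my_extra_top_menu_links', 10, 2 );

    Alternatively, when the theme is set to use WordPress menus, custom links can be added without code from the menu editor under Appearance → Menus.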

    Read the article

  • Link tags in iframe widget

    - by john Smith
    I have a rating community site and I'm offering little iframe widgets with the average rating and some other info. Does it make sense (for visibility, SEO) to add link tags to the head of the widget, like the following, to get a logical association between the widget and the related pages?

        <link rel="alternate" type="application/rss+xml" title="RSS 2.0" href="rssfeed" />
        <link rel="index" title="main-profile" href="main-profile">

    How would you do this?

    Read the article

  • Does using structured data (semantic LocalBusiness schema markup) work for local EMD URLs?

    - by ElHaix
    Based on what I have read about Google's recent Panda and Penguin updates, I get the impression that using semantic markup may help improve SEO results. On an EMD (exact-match domain) site that may have been hit, we list location-based products. We are now going to add an itemtype="http://schema.org/Product" to each product, with relevant details. However, a product may be available in Los Angeles and also appear on a Seattle results page. We could add a LocalBusiness itemtype on each geo page to define the geo location for that page. The definition states: "A particular physical business or branch of an organization. Examples of LocalBusiness include a restaurant, a particular branch of a restaurant chain, a branch of a bank, a medical practice, a club, a bowling alley, etc." We could use the location property, which would simply include the city/state details. I realize that this looks like it is meant for a physical location, but could this be done without seeming black-hat?
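
    For illustration, the markup being considered would look roughly like the sketch below (microdata on a hypothetical Seattle geo page; the names and address are placeholders, and whether marking a listings page up as a LocalBusiness is appropriate is exactly the open question):

        <div itemscope itemtype="http://schema.org/LocalBusiness">
          <span itemprop="name">Example Store - Seattle</span>
          <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
            <span itemprop="addressLocality">Seattle</span>,
            <span itemprop="addressRegion">WA</span>
          </div>
        </div>

        <div itemscope itemtype="http://schema.org/Product">
          <span itemprop="name">Example Product</span>
          <span itemprop="description">Relevant product details go here.</span>
        </div>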

    Read the article

  • Paid Website Code Review

    - by clifgray
    I have written a pretty extensive web app and it is going to go live in the next few weeks. Before I really publicize it, I want to get some professionals to review it for optimization and best practices. Is there any online service, or a way to find local software engineers, who would be willing to do this? To give some specifics that may be helpful: my site is on Google App Engine and written in Python, and it is tough to find someone with extensive experience in that area.

    Read the article

  • WordPress WooCommerce PHP variable usage: %1$s

    - by tech
    I am using WordPress with WooCommerce and I am trying to manipulate a copy of myaccount.php. The default code uses some variables of some sort that I am not familiar with, nor have I been able to find documentation on them. The variables in question are %1$s, %2$s and %s:

        <p class="myaccount_user">
            <?php printf(
                __( 'Hello <strong>%1$s</strong> (not %1$s? <a href="%2$s">Sign out</a>).', 'woocommerce' ) . ' ',
                $current_user->display_name,
                wp_logout_url( get_permalink( wc_get_page_id( 'myaccount' ) ) )
            ); ?>
            <?php printf(
                __( 'From this page you can view your recent orders, manage your shipping and billing addresses and <a href="%s">edit your password and account details</a>.', 'woocommerce' ),
                wc_customer_edit_account_url()
            ); ?>
        </p>

    How can I identify these variables, what they represent and how to use them? Thank you.
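
    For what it's worth, these are not variables but printf/sprintf format placeholders: %s is filled by the arguments in order, while %1$s and %2$s refer explicitly to the first and second argument, so the same argument can be reused. A minimal sketch with placeholder values:

        <?php
        // Placeholder values for illustration only.
        $display_name = 'Jane';
        $logout_url   = 'http://example.com/?action=logout';

        // %1$s is replaced by the 1st argument (used twice here), %2$s by the 2nd.
        printf(
            'Hello %1$s (not %1$s? <a href="%2$s">Sign out</a>).',
            $display_name,
            $logout_url
        );
        // Output: Hello Jane (not Jane? <a href="http://example.com/?action=logout">Sign out</a>).

    In the WooCommerce template above, the first argument is $current_user->display_name and the second is the logout URL, so rewording the string just means keeping the placeholders paired with those arguments.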

    Read the article

  • How can I pass the referrer header from my HTTPS domain to HTTP domains?

    - by nutcracker
    My website is 100% HTTPS. I have links to other HTTP domains. The referrer header is not set when linking from an HTTPS page to an HTTP page. From http://en.wikipedia.org/wiki/HTTP_referrer: "If a website is accessed from a HTTP Secure (HTTPS) connection and a link points to anywhere except another secure location, then the referer field is not sent." I would prefer that other domains can see the referrer so that they know the traffic comes from my domain. Is there a way to force this header, or is there another solution?

    Update: I've done some basic testing using a redirect:

        http page  -- link to http  --> 301 redirect --> http page = referrer intact
        https page -- link to https --> 301 redirect --> http page = referrer blank
        https page -- link to http  --> 301 redirect --> http page = referrer blank
        https page -- link to http  --> 302 redirect --> http page = referrer blank

    The referrer is lost when linking from an HTTPS page to an HTTP redirect page on my own domain, so there is no referrer on the redirect.
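
    For context, the intermediate redirect used in tests like these is typically just a one-line script. A minimal sketch, assuming a PHP host; the file name and target URL are placeholders (and, as the tests above show, serving this redirect from the HTTPS site still leaves the referrer blank):

        <?php
        // Hypothetical go.php: send the visitor on to the external http:// target
        // with a permanent (301) redirect.
        $target = 'http://other-domain.example/landing-page';

        header( 'Location: ' . $target, true, 301 );
        exit;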

    Read the article

  • "Not provided" count has reached 80%! But what about the remaining 20%?

    - by Rajesh Magar
    I am up to date on all the "not provided" reasons, since Google has encrypted all of its searches, but here is the little question banging around in my head: if all searches are encrypted with the HTTPS protocol, how is Google Analytics still able to track some (20%) of the organic keyword details? I mean, there are still some keywords appearing in my organic keywords section. So how does Google Analytics track, or bypass, that HTTPS encryption? Thanks in advance!

    Read the article

  • How to easily delete all email forwarders in cPanel?

    - by psoft
    I know that I can import a list of email forwarders using cPanel, but how can I delete a list? I want to manage 300+ addresses as a membership list for my organization, and I want to be able to delete that many without clicking 'Delete' and then 'Confirm' (or whatever it is) 300 times. Even if I am only able to delete ALL forwarders and then upload a modified list, that's fine with me. Note: I'm using a shared hosting package through SiteGround. The tech service rep informed me that I can't use cPanel scripts in a shared environment. Any suggestions?

    Read the article

  • Google Goals process not working through similarly named pages

    - by David
    Well, I'm at a loss. I've ensured that my tracking script is in place (etc., etc.) and I've set up my goal and funnel path, but only the first step is ever shown in the funnel.

        Goal URL: /checkout/checkoutComplete/    Type: Head Match

    ...but should this be /checkout/checkoutComplete/(.*) and set to regex instead, because there are parameters after the main part of the URL? (I thought that's what Head Match was for.)

        Step 1: /checkout/           <-- required
        Step 2: /checkout/confirm/

    Both of the above are valid and correct URLs for my domain. But for some reason, the funnel visualization shows entries into the first step, then an exit count that matches the entry count (including exits to /checkout/confirm), and it doesn't go on to the next step! Perhaps I'm doing something obviously wrong... but I can't quite see it. Also, two semi-related questions: does making a change to the funnel only affect new incoming data? And how often does it update? Thanks in advance for your help.

    Read the article

  • Local Business Listing Dashboard

    - by Steve
    I operate a website for a West Australian company, and the company has listings in local business directory websites. Currently we are listed in: www.HotFrog.com.au www.Google.com/Places www.TrueLocal.com.au www.StartLocal.com.au www.localstore.com.au www.communityguide.com.au www.yelp.com.au www.aussieweb.com.au Do you know of any method for examining account stats (profile views, profile clicks etc) within a dashboard, so I can see at a glance how each of our listings is going? I'd be happy to build a dashboard if necessary, but I'm not confident I currently have the skills to accurately display only the correct concise information. Would I use iframes? See if they have APIs? Is there a dashboard framework I could use?

    Read the article

  • How to prevent duplication of content on a page with too many filters?

    - by Vikas Gulati
    I have a webpage where a user can search for items based on around 6 filters. Currently the page is implemented with one filter as the base filter (part of the URL that gets indexed) and the other filters in the form of hash URLs (which won't get indexed). This way the duplication is less. Something like this: example.com/filter1value-items#by-filter3-filter3value-filter2-filter2value As you can see, only one filter is within reach of the search engine while the rest are hashed. This way I can have 6 pages. The problem is that I expect users to use two filters at times while searching as well. As per my analysis using the Google Keyword Analyzer, a fair bit of users might use two filters in conjunction while searching. So how should I go about it? Having all the filters as part of the URL would simply explode the number of pages, and sticking to the current way wouldn't let me target those users. I was thinking of going with at most 2 base filters and the rest as part of the hash URL, but the only thing stopping me is that it would lead to duplication of content as per Google Webmaster Tools' suggestions on URL structure.

    Read the article

  • Finding a Payment Gateway?

    - by Lynda
    I have a client who would like to sell glass pipes online. The problem I run into is with the payment gateway: glass pipes fall into one of two categories, drug paraphernalia or tobacco products. This leads me here to ask: does anyone know of a payment gateway that will process payments for glass pipes? Note: doing some Google searching, I read that Authorize.net will accept tobacco, but when I spoke with them they said they do not.

    Read the article

  • Click counting and crawler bots

    - by Dennis
    I am currently running a small affiliate program for Facebook users. We use an auto-poster to publish links to fan pages. Every hit is stored in our database, and we have included a 24-hour reload block per IP address. My problem right now is that the PHP script also stores every hit from all the bots that crawl my website. Now I was thinking of blocking those bots with my website's robots.txt, but I am afraid that this will have a negative effect on my AdSense ads. Does anybody have an idea how I can work this out?
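
    One alternative to touching robots.txt at all (and therefore to risking the AdSense crawler) is to filter obvious crawler user agents inside the counting script itself. This is only a sketch under the assumption that the script sees $_SERVER['HTTP_USER_AGENT']; the signature list is illustrative rather than exhaustive, and record_hit() is a hypothetical stand-in for the existing database insert:

        <?php
        // Return true when the user agent looks like a known crawler.
        function is_probably_a_bot( $user_agent ) {
            $signatures = array( 'googlebot', 'bingbot', 'slurp', 'baiduspider',
                                 'facebookexternalhit', 'crawler', 'spider', 'bot' );
            foreach ( $signatures as $signature ) {
                if ( stripos( $user_agent, $signature ) !== false ) {
                    return true;
                }
            }
            return false;
        }

        $ua = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
        if ( ! is_probably_a_bot( $ua ) ) {
            record_hit( $_SERVER['REMOTE_ADDR'] ); // hypothetical existing insert + 24-hour IP block
        }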

    Read the article

  • Want to tap into a niche market. Do I create new site or bolt on to existing site?

    - by nitbuntu
    Hi, after a lot of hard work and a few years of perseverance, I'm seeing regular sales on my website, and they have been growing steadily over the past year. However, the entrepreneur in me wants to tap into a niche market which I've become very interested in. It's possible to bolt this niche onto my existing site as an additional category without it looking too out of place; the new category of products would also benefit from the ranking my current site gets. The kind of people who would purchase these new niche products, however, are very particular and obsessive about detail. So, for example, many vegetarians would not eat in KFC even if it were to introduce a new range of veggie burgers. So I thought it best to create a new website, and since my existing site was created using an 'old-school' shopping cart and there are many more up-to-date, feature-rich ones available now, I wanted to use a different shopping cart system. My dilemma is that I already have 2 websites (1 B2C and another B2B site), and maintaining a second B2C site would vastly increase my workload; I fear that I would not be able to pay adequate attention to all the sites. Moreover, the additional customer service work (e.g. answering emails from many separate email accounts) could end up being too confusing and difficult to maintain. The easy answer would be to take on an employee, but I'm just not earning enough to justify this yet. If anyone has any tips or experience they'd like to share which could help me answer this question, I'd be highly grateful.

    Read the article

  • Is it good or bad to have dynamic content in page titles and/or descriptions?

    - by Gunjan
    On a local listing website, I append the number of search results found to the description meta tag of the page (not the title currently), as I think this is valuable for users, e.g. "Find address, phone numbers, blah blah blah for 21 outlets in locality. Some more stuff after this..." As more places are added to the database, the description for the same page will change frequently. Is this good or bad for SEO? And how about doing the same for title tags?

    Read the article

  • Google Webmaster Tools crawl error caused by a URL split across two lines

    - by Shiro
    I am looking into the Crawl Errors section of Google Webmaster Tools. How should I handle URLs that are invalid because of how another system or application rendered them? E.g. the real URL is http://www.example/images/products/s_=enlarge_16gb.jpg but, I don't know what happened in Yahoo Groups, it breaks the link into http://www.example/images/products/s_= enlarge_16gb.jpg and only makes the top part a hyperlink, which is http://www.example/images/products/s_= Because of that URL, Google shows a crawl error; I get a few errors from this kind of result or from other people's typos. How do I prevent this? I am sure I don't have the right to go and change other people's posts. What is the solution for this? Thanks!

    Read the article

  • How do I allow e-mail to be relayed through this MTA?

    - by BlueToast
    When I try to send an e-mail using authenticationless relay via telnet, I receive the error message "553 sorry, that domain isn't allowed to be relayed thru this MTA (#5.7.1)" in response to "rcpt to:[email protected]". How can I allow a specific domain to be whitelisted and allowed through the MTA? There is only one domain I am trying to relay e-mails to (and that domain uses a totally different, independent and standalone mail server with IceWarp).

        220 mail4.myhsphere.cc ESMTP
        ehlo sisterwebsite.com
        250-mail4.myhsphere.cc
        250-PIPELINING
        250-8BITMIME
        250-SIZE 41943040
        250-AUTH LOGIN PLAIN CRAM-MD5
        250 STARTTLS
        mail from:[email protected]
        250 ok
        rcpt to:[email protected]
        553 sorry, that domain isn't allowed to be relayed thru this MTA (#5.7.1)
        rcpt to:[email protected]
        553 sorry, that domain isn't allowed to be relayed thru this MTA (#5.7.1)
        rcpt to:[email protected]
        553 sorry, that domain isn't allowed to be relayed thru this MTA (#5.7.1)
        rcpt to:[email protected]
        250 ok
        data
        354 go ahead
        To: [email protected]
        From: [email protected]
        Subject: Test mail -- please ignore

        Test, please ignore this

        Jane

        Sincerely,
        BlueToast
        .
        250 ok 1350407684 qp 22451
        quit
        221 mail4.myhsphere.cc
        Connection to host lost.

        C:\Users\genericaccount

    Not sure what to do. I did some Googling, but I'm having a hard time finding relevant results; most of what I find is about trying to receive mail, but I am trying to send mail. mail.sisterwebsite.com = mail4.myhsphere.com. We use FluidHosting for the e-mail on sisterwebsite.com. (Repeating the question just in case:) How can I allow a specific domain to be whitelisted and allowed through the MTA?
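
    Since the EHLO response above advertises AUTH LOGIN PLAIN CRAM-MD5, the usual way to be allowed to relay to an outside domain is to authenticate before MAIL FROM rather than whitelisting the destination. A rough sketch of what an AUTH LOGIN exchange looks like; the 334 challenges are base64 for "Username:" and "Password:", the bracketed values are placeholders for base64-encoded credentials, and the exact wording of the server's replies varies:

        ehlo sisterwebsite.com
        250-AUTH LOGIN PLAIN CRAM-MD5
        ...
        AUTH LOGIN
        334 VXNlcm5hbWU6
        <base64 of the mailbox username>
        334 UGFzc3dvcmQ6
        <base64 of the mailbox password>
        235 authentication successful
        mail from:[email protected]
        250 ok
        rcpt to:[email protected]
        250 ok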

    Read the article

  • Use Outlook password for website verification

    - by Jack Lockyer
    I am currently building an internal employee dashboard for our global company (it is hosted on an external website for logistical reasons). I'd like (need) to password-protect the page as we will be displaying sensitive information. My question is: is it possible to integrate with Outlook passwords? We have over 350 staff, all of whom use Outlook on a daily basis, and I'd love for the website to check whether the visitor is logged into Outlook and, if they're not, prompt them to log in. Is that possible? If it is, I'll get it developed straight away.

    Read the article

  • Constructive criticism for my bounce rate being so high [closed]

    - by Daniel
    The bounce rate on my website's product pages is 80%, which is terrible. Could you offer any opinions on whether you consider the user experience to be bad, and how I could possibly improve it? Other pages, such as the home and category pages, have acceptable bounce rates, but the vast majority of my traffic lands on the product pages. I already tried removing some Google ads for a couple of days, but this didn't seem to help at all. I'm working on doing A/B testing at the moment. (It's tricky, as the site is based on a CMS - I custom coded the [Joomla] component, so hopefully I can get this testing working.)

    Read the article
