Search Results

Search found 9717 results on 389 pages for 'gkt pro'.

Page 85/389

  • Domain masking (and simple page links)

    - by Halik
    How do you set up a domain (I'm using GoDaddy) to mask the server URL but still append the sub-page path? I'm thinking of something like Wikipedia's en.wikipedia.org/wiki/something (or, if it would require httpd.conf access, setting it to append the default sub-page query string, e.g. '?page_id=2'). Currently I can set up the domain either to be masked completely, without showing any sub-page links, or to simply redirect my domain to my web server.
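
    If the content server itself is under your control (i.e. you do have httpd.conf or .htaccess access there), one alternative to GoDaddy's frame-based masking is a reverse-proxy rewrite on a host the domain points at. A minimal sketch, assuming mod_rewrite and mod_proxy are enabled and that backend.example.com is a hypothetical placeholder for the server the content really lives on:

        # VirtualHost for www.yourdomain.com
        RewriteEngine On
        # pass every request through to the real server, keeping the visitor's path intact
        RewriteRule ^/(.*)$ http://backend.example.com/$1 [P,L]
        ProxyPassReverse / http://backend.example.com/

    Unlike a frame-based mask, the visitor's address bar then shows www.yourdomain.com/whatever-sub-page throughout.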

    Read the article

  • Tracking unique views for a site showing my advertisements [on hold]

    - by user580950
    I am in trouble. I placed an advertisement on a website in 2012. The website said they got 950,000 unique visits each month, so early in 2012 I advertised with them. The advertisement didn't work out: when I checked 2-3 months later, the site's unique visitors were around 8,000, and I immediately closed the account. I don't remember which site I used to check the unique visitors. The advertising company has now filed a dispute against me. Is there any tool that can show me a website's 2012 traffic statistics? I tried Google Trends, but it doesn't show visitor numbers.

    Read the article

  • Confirm that a DNS zone is served by a nameserver

    - by adam
    We currently have a domain which has custom nameservers. Our host has their own nameservers. I'd like to switch our domain to use our host's nameservers for a while. Our host tells me that their nameservers hold a replica of our DNS zone, but I'd like to confirm this before I switch. Is there a command-line tool I can use to answer the question "does this nameserver know the DNS zone for this domain?" Hope that makes sense! Thanks, Adam
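
    A minimal sketch with dig, assuming ns1.examplehost.com stands in for one of the host's nameservers (substitute the real nameserver and your real domain):

        # ask the host's nameserver directly for the zone's SOA record
        dig @ns1.examplehost.com example.com SOA

        # or spot-check an individual record
        dig @ns1.examplehost.com www.example.com A

    If the full output shows the "aa" (authoritative answer) flag and the expected records, that nameserver is serving a copy of the zone; an empty or non-authoritative answer means it is not.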

    Read the article

  • Improve efficiency of web building setup and processes - WordPress on Mac

    - by Rob
    Can anyone see any ways in which I can improve my speed and efficiency with the following setup, or any obvious holes in my building process? This is for building WordPress websites on a Mac:
    1) I have a standard WordPress setup that I work from, which includes the various plugins I tend to use across all builds - cutting out the step of having to download them every time.
    2) My standard WP files are copied into a Dropbox folder - creating backups of the files.
    3) I then open up MAMP and set up a local version.
    4) I open up Coda and set up the FTP details so files can be uploaded to the live domain using the publish button.
    If anyone has any advice on how to improve this process then please let me know!
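
    Steps 1-3 can also be scripted. A minimal shell sketch, assuming MAMP's default document root (/Applications/MAMP/htdocs) and a hypothetical ~/Dropbox/wp-base folder holding the standard WordPress files:

        #!/bin/sh
        # usage: ./newsite.sh clientname
        SITE="$1"
        SRC="$HOME/Dropbox/wp-base"              # assumed location of the standard WP setup
        DEST="/Applications/MAMP/htdocs/$SITE"   # default MAMP document root

        mkdir -p "$DEST"
        rsync -a "$SRC/" "$DEST/"                # copy core, plugins and theme into the new local site
        echo "New local site created at $DEST"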

    Read the article

  • Is an eCPM drop of about 50% in January usual behavior on Google AdSense?

    - by Andrew G. Johnson
    So I just got semi-serious about running some AdSense sites over the past 6 months, and the eCPMs have hovered between $1.38 and $1.42 [yes, it's that close] when I look at the eCPM for each month. Obviously some deviation day to day, but pretty damn close to a buck forty in aggregate. So far for January I am sitting at an eCPM of $0.80. I know it's not a huge sample size, but the daily pageviews are fairly consistent [actually a bit higher] compared with December. I am trying to justify this by thinking that somehow a lot of ad buyers buy inventory for the year and have to get set up to do another big buy now that it's a new calendar year, but that thought isn't close to comforting. Is this happening to anyone else? EDIT: I run a lot of websites and the ratios of pageviews are about the same this month as last month, but just to be clear, the eCPM I posted is for 20 websites in a variety of niches; it doesn't accurately depict any one domain.

    Read the article

  • How do you exclude yourself from Google Analytics on your website using cookies?

    - by Keoki Zee
    I'm trying to set up an exclusion filter with a browser cookie, so that my own visits to my site don't show up in my Google Analytics. I tried 3 different methods and none of them have worked so far. I would like help understanding what I am doing wrong and how I can fix it.
    Method 1: First, I tried following Google's instructions for excluding traffic by cookie content, http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=55481: create a new page on your domain containing the following code:
        <body onLoad="javascript:pageTracker._setVar('test_value');">
    Method 2: Next, when that didn't work, I googled around and found this Google thread, http://www.google.com/support/forum/p/Google%20Analytics/thread?tid=4741f1499823fcd5&hl=en, where the most popular answer says to use slightly different code. SHS Analytics wrote:
        <body onLoad="javascript:_gaq.push(['_setVar','test_value']);">
    Thank you! This has now set a __utmv cookie containing "test_value", whereas the original pageTracker._setVar('test_value') (which Google is still recommending) did not manage to do that for me (in Mac Safari 5 and Firefox 3.6.8).
    So I tried this code, but it didn't work for me.
    Method 3: Finally, I searched Stack Overflow and came across this thread, http://stackoverflow.com/questions/3495270/exclude-my-traffic-from-google-analytics-using-cookie-with-subdomain, which suggests that the following code might work:
        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setVar', 'exclude_me']);
          _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
          _gaq.push(['_trackPageview']);
          // etc...
        </script>
    This script appeared in the head element in the example, instead of in the onload event of the body like in the previous 2 examples. So I tried this too, but still had no luck excluding myself from Google Analytics.
    To re-iterate the question: I tried all 3 methods above with no success. Am I doing something wrong? How can I exclude myself from my Google Analytics using an exclusion cookie for my browser?
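
    For reference, a minimal sketch of the classic ga.js exclusion-page approach (UA-XXXXXXXX-X is a placeholder for the real web property ID). The page has to be visited once in every browser that should be excluded, and a matching custom filter (Exclude -> User Defined -> "exclude_me") has to be created on the Google Analytics profile:

        <!-- exclude-me.html : sets the __utmv exclusion cookie -->
        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXXXXX-X']);  // your property ID
          _gaq.push(['_setVar', 'exclude_me']);         // value the GA filter matches on
          _gaq.push(['_trackPageview']);

          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>

    Note that the filter only affects data recorded after it is created, so earlier visits will still appear in the reports.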

    Read the article

  • Example sites which use UCC certificates

    - by Brian
    Can anyone point me to a few sites that make use of UCC (SAN) certificates? I tried to search for this but found a lot of information about UCC certificates without any examples. As a sanity check before buying/configuring a UCC certificate, I wish to do some basic testing to determine exactly how the certificate will look in different browsers. Yes, I realize I could just use makecert instead. I would rather just look at them in the wild.

    Read the article

  • Need a generic way to create SEO-friendly URLs

    - by Fawad Ghafoor as Xainee Khan
    I have searched a lot and implemented many regular expressions in my .htaccess file, but I cannot get it to work. How do I find a generic way to make my URLs SEO-friendly? Currently this is in my .htaccess file:
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]
    What I need to do is this: I have a URL like http://localhost/abc/index.php?page=boats_for_sale and I need to change it to http://localhost/abc/boats_for_sale. Similarly, I want to hide all query strings in my URLs. How would I achieve this?
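
    One common pattern (a sketch, assuming Apache mod_rewrite and that the site lives under /abc/) is to keep the internal rewrite above and add an external 301 that pushes the old query-string form to the clean URL. The %{THE_REQUEST} condition only matches what the client actually requested, so the internally rewritten request does not loop:

        RewriteEngine On
        RewriteBase /abc/

        # externally redirect /abc/index.php?page=foo to /abc/foo
        RewriteCond %{THE_REQUEST} \?page=([^&\s]+)
        RewriteRule ^index\.php$ %1? [R=301,L]

        # internally map /abc/foo back to index.php?page=foo
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]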

    Read the article

  • Meta description of my blog post changes

    - by Aadarsh sojitra
    I have a problem with the meta description tags on my Blogger blog. When I update my pages in the search index using the Fetch as Google feature in GWT, all of my blog's results show the correct meta description, for example: Today I am back with a reason that "WHY IS ORIGINAL MEMORY IN HARD-DISK IS LESS THAN PRINTED" on a box. If we buy any hard-disk or a pen drive... But after approximately 5-6 days, it changes back to my blog's default meta description. This also happens after changing the blog's default meta description. Why is this happening? After deleting my blog and creating a new blog with the same name, the problem was solved - why did that fix it? I am asking so that I can avoid the problem in the future.

    Read the article

  • Best practices when loading images for improving page loading speed

    - by Naoise Golden
    I am working on optimizing a page's loading speed. Here are some analytics: notice how the images, although only accounting for 65% of the total size (1.1MB), are by far the slowest-loading assets: 96% of the load time. I'd like to know the recommended practices for optimizing loading speed, taking only images into account. Some of the techniques we are already applying: image compression; images hosted on a cookieless domain and a CDN; spriting everything that can be sprited; HTTP headers: keep-alive and Expires set to one year. Disclaimer: I have gone through the available documentation; I think that by focusing on image loading optimization I am not creating a duplicate or a subjective question.
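
    As a concrete example of the caching point, a minimal Apache sketch (assuming mod_expires is available; keep-alive is a server-level setting) for the one-year image headers:

        # server configuration
        KeepAlive On

        # .htaccess or vhost
        <IfModule mod_expires.c>
          ExpiresActive On
          ExpiresByType image/png  "access plus 1 year"
          ExpiresByType image/jpeg "access plus 1 year"
          ExpiresByType image/gif  "access plus 1 year"
        </IfModule>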

    Read the article

  • Website speed issues

    - by Jose David Garcia Llanos
    I am developing a website; however, I have noticed speed issues, and I am not sure whether they are due to the location of the server. I am not a guru when it comes to performance or speed issues, but according to a website speed test it seems that it takes quite a long time to connect to the website (see the Speed Test Results). Can someone suggest something or give me some tips? The website address is http://www.n1bar.com

    Read the article

  • When is meta description still relevant?

    - by Jeff Atwood
    I received this bit of advice about the meta description tag recently: Meta descriptions are used by Google probably 80% of the time for the snippet. They don’t help with rankings but you should probably use them. You could just auto generate them from the first part of the question. The description tag exists in the header, like so: <meta name="Description" content="A brief summary of the content on the page."> I'm not sure why we would need this field, as Google seems perfectly capable of showing the relevant search terms in context in the search result pages, like so (I searched for c# list performance): In other words, where would a meta description summary improve these results? We want the page to show context around the actual search hits, not a random summary we inserted! Google Webmaster Central has this advice: For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions are more difficult. In the latter case, though, programmatic generation of the descriptions can be appropriate and is encouraged -- just make sure that your descriptions are not "spammy." Good descriptions are human-readable and diverse, as we talked about in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation. I'm struggling to think of any scenario when I would want the Google-generated summary, that is, actual context from the page for the search terms, to be replaced by a hard-coded meta description summary of the question itself.

    Read the article

  • Semantic Form Markup for Yes or No Questions - Or Should I Tell my Designers to Bugger Off?

    - by sholsinger
    I frequently receive mock-ups of HTML forms with the following prototype:
        Some long winded yes or no question?   (o) Yes   ( ) No
    The (o) and ( ) in this prototype represent radio buttons. My personal view is that if the question has only a true or false value then it should be a check box. That said, I have seen this sort of "layout" from almost every designer I've ever worked with. If I were not to question their decision, or question the client's decision, I'd probably mark it up like this:
        <p class="pseudo_label">Some long winded yes or no question?</p>
        <input type="radio" name="the_question" id="the_question_yes" value="1">
        <label for="the_question_yes" class="after_radio">Yes</label>
        <input type="radio" name="the_question" id="the_question_no" value="0">
        <label for="the_question_no" class="after_radio">No</label>
    I really don't want to do that. I want to push back and convince them that this should really be a check box and not two radio buttons. But my question is, if I can't convince them – you're welcome to help me try – how should I code that original design requirement such that it is semantic and at least understandable for screen reader users? If I were able to convince my tormentors to change their minds, I would likely code it in the following fashion:
        <label for="the_question">Some long winded yes or no question?</label>
        <input type="checkbox" name="the_question" id="the_question" value="1">
    What do you think about this issue? Should I push back? Possibly more importantly, is either way semantically correct?

    Read the article

  • What is the SEO impact of moving my domain to another IP address and what is the right way of doing this?

    - by ElHaix
    I am planning to move several websites to a new hosting provider - keeping the same URLs, but they will resolve to different IP addresses. For example, some sites are Canadian-content-only sites, hosted on .CA domains sitting on Canadian IP addresses. I want to move these to Amazon servers, which have US IP addresses. The domain names will remain the same. (1) What is the SEO impact of this? (2) Will the sites lose some ranking if they are moved to a new IP address (Canadian or not), and if so, what is the cleanest way of accomplishing this (some kind of 301s)?

    Read the article

  • Using multiple A-records for my domain - do web browsers ever try more than one?

    - by Jonas
    If I add multiple A records for my domain, they are returned in round-robin order by DNS servers, e.g.:
        example.com.  A  1.1.1.1
        example.com.  A  1.1.1.2
        example.com.  A  1.1.1.3
    But how do web browsers react if the first host (1.1.1.1) is down (unreachable)? Do they try the second host (1.1.1.2), or do they return an error message to the user? Are there any differences between the most popular browsers? If I implement my own application, I can make it try the second host in case the first is down, so it's possible. And this would be very helpful for creating a fault-tolerant website.

    Read the article

  • DNS and Wildcard CNAME

    - by Thomas Chapman
    Whenever I attempt to make a record for *.schneiderdonnelly.com.au and CNAME it, I get two errors:
    1. You can't mix CNAME/MX records together using the same hostname.
    2. Domain roots cannot be CNAMEs; however, you can web-forward this record to www.schneiderdonnelly.com.au instead for the same effect.
    I've read it's possible, so why can't I make it work? I donated $5 to be a premium member and I've been trying to make it work for yonks. http://i.stack.imgur.com/D9Ui5.jpg This is how I want it to appear - the last record. I am prepared to swap DNS providers as long as they're free.
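
    For reference, a minimal zone sketch of the usual workaround (the IP address and mail host are placeholders, and this is an assumption about the intended layout rather than the provider's actual interface): keep A and MX records on the names that need them, and point the wildcard at a hostname that only has an A record:

        schneiderdonnelly.com.au.      IN A      203.0.113.10
        schneiderdonnelly.com.au.      IN MX 10  mail.schneiderdonnelly.com.au.
        www.schneiderdonnelly.com.au.  IN A      203.0.113.10
        *.schneiderdonnelly.com.au.    IN CNAME  www.schneiderdonnelly.com.au.

    The zone root itself can never be a CNAME, and a CNAME cannot coexist with MX (or any other) records on the same name, which is what the two error messages are complaining about.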

    Read the article

  • Adding a CMS to an existing Magento shop

    - by user6341
    I am working on a project for 3 niche stores built on Magento (using Magento's multi-store function) that each get roughly 50k unique visitors a day. The sites don't currently have a blog, forum or any social networking aspects. I would like to add a CMS to each site that can be centrally run, and I would like it to take over the front-end content from Magento. I would also like it to maintain an online blog/publication of sorts, with videos, articles, and the like, with editing privileges given to a dozen or so people with different permission levels. I want to add a fairly robust forum to each site, and possibly some social networking aspects down the road, so extensibility and the available plugins/mods in each CMS are important. Other than shared login between the forums, blog/publication and store, I would like to be able to integrate some content from the forums and blog/publication into the store as well. After researching this a bit, I am inclined towards Drupal, but I haven't found any modules to integrate it with Magento. Also, since the blog content will be written by about a dozen non-technical people, I want something that is very easy to work with. Lastly, since the sites get a good amount of traffic, speed and security are very important. Which CMS would you recommend in this context? I am deciding between Drupal, WordPress and MODX, and also considering Plone. Thanks.

    Read the article

  • Integrating eBay and PayPal inventory

    - by JW01
    Say I have an item for sale on eBay, and the same item for sale on another site via PayPal. Is it possible to have sales on one site reflected in the inventory for the other site, and vice-versa? In other words, if I have ten items for sale, and I buy one on either site, it should show that there are nine items left on both sites. I know that PayPal has an API for setting the inventory level of an item associated with a button. eBay also has an API for controlling an item's inventory. I'm wondering if anyone has tried to integrate them.

    Read the article

  • SEO Keyword Research Help

    - by James
    Hi everyone, I'm new at SEO and keyword research. I am using Market Samurai as my research tool, and I was wondering if I could ask for your help to identify the best keyword to target for my niche. I do plan on incorporating all of them into my site, but I wanted to start with one. If you could give me your input on these keywords, I would appreciate it. This is all new to me :) I'm too new to post pictures, but here are my keywords (Searches, SEO Traffic, and SEO Value / Day):
        Searches | SEO Traffic | PBR | SEO Value | Average PR/Backlinks of Current Top 10
        1: 730   | 307         | 20% | 2311.33   | 1.9 / 7k-60k
        2: 325   | 137         | 24% | 822.94    | 2.3 / 7k-60k
        3: 398   | 167         | 82% | 589.79    | 1.6 / 7k-60k
    I'm wondering if the PBR (phrase-to-broad) value of #1 is too low. It seems like the best value because the SEOV is crazy high - that is like $70k a month. #3 has the highest PBR, but also the lowest SEOV. #2 doesn't seem worth it because of the PR competition; it might be a little too hard to get onto the top page of Google. I'm wondering which keywords to target, and if I should be looking at any other metric to see if this is a profitable niche to jump into. Thanks.

    Read the article

  • Breadcrumbs using schema.org rich snippets

    - by Adam Jenkin
    I am having problems implementing the breadcrumb rich snippets from schema.org. When I construct my breadcrumb using the documentation and run it through the Google Rich Snippet testing tool, the breadcrumb is identified but not shown in the preview.
        <!DOCTYPE html>
        <html>
        <head>
          <title>My Test Page</title>
        </head>
        <body itemscope itemtype="http://schema.org/WebPage">
          <strong>You are here: </strong>
          <div itemprop="breadcrumb">
            <a title="Home" href="/">Home</a> >
            <a title="Test Pages" href="/Test-Pages/">Test Pages</a> >
          </div>
        </body>
        </html>
    If I change to use the snippets from data-vocabulary.org, the rich snippets show correctly in the preview.
        <!DOCTYPE html>
        <html>
        <head>
          <title>My Test Page</title>
        </head>
        <body>
          <strong>You are here: </strong>
          <ol itemprop="breadcrumb">
            <li itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
              <a href="/" itemprop="url">
                <span itemprop="title">Home</span>
              </a>
            </li>
            <li itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
              <a href="/Test-Pages/" itemprop="url">
                <span itemprop="title">Test Pages</span>
              </a>
            </li>
          </ol>
        </body>
        </html>
    I want the breadcrumb to be shown in the search result rather than the URL to the page. Given that schema.org is the recommended way to use rich snippets, I would rather use it; however, as the breadcrumb is not showing in the preview of the search result using this method, I'm not convinced it is working correctly. Am I doing something wrong in the markup of the schema.org example?
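
    For what it's worth, schema.org also defines an explicit BreadcrumbList type that Google's tooling does render as a breadcrumb. A minimal microdata sketch (paths and labels copied from the example above, positions added); treat it as a sketch of the current schema.org vocabulary rather than a drop-in fix for the markup above:

        <ol itemscope itemtype="http://schema.org/BreadcrumbList">
          <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
            <a itemprop="item" href="/"><span itemprop="name">Home</span></a>
            <meta itemprop="position" content="1" />
          </li>
          <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
            <a itemprop="item" href="/Test-Pages/"><span itemprop="name">Test Pages</span></a>
            <meta itemprop="position" content="2" />
          </li>
        </ol>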

    Read the article

  • SEM & AdWords: How many clicks without a sale before I should pause a keyword?

    - by Thomas Jönsson
    I wonder how many clicks I should optimally let pass through every new keyword I try in AdWords before I conclude that it is not making a profit and should be paused. It's actually four questions:
    1: At which likelihood percentile should I pause a word?
    2: How many clicks should I let through before I pause a word, for those words which do not generate any leads?
    3: How many clicks should I let through after one sale before I consider the word not to be profitable?
    4: Does the likelihood of the word becoming profitable affect the above?
    Conditions:
    - The clicks are normally distributed. (correct?)
    - A CR of 1% is break-even; everything above is profit (1 sale / 100 clicks = break-even)
    - Cost per click (CPC) = $4
    - Margin (profit per sale) = $400
    - Payback time = 1 year
    - Average clicks per word = 0.333 per day (121 2/3 per year)
    Example: after 1 click and no sale, the keyword still has a high probability of being profitable. After 500 clicks and no sale, it has almost no likelihood of being profitable and should probably be paused. Thanks in advance!
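
    As a rough sketch of the arithmetic behind questions 1 and 2 (assuming clicks are independent Bernoulli trials rather than normally distributed, which is an assumption on my part), the chance of seeing zero sales in n clicks when the true conversion rate is the break-even 1% is (1 - 0.01)^n:

        # probability of zero sales in n clicks at a 1% conversion rate
        def p_no_sale(n, cr=0.01):
            return (1 - cr) ** n

        for n in (100, 300, 500):
            print(n, round(p_no_sale(n), 3))
        # 100 -> 0.366, 300 -> 0.049, 500 -> 0.007

    So after roughly 300 clicks with no sale, a break-even-or-better conversion rate is already below a 5% likelihood, which matches the intuition in the example about 500 clicks.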

    Read the article

  • Rich Snippets - LocalBusiness - Photos - Correct Implementation

    - by user32622
    Does somebody know how this is supposed to be implemented correctly? On my local business full page I have a carousel with several images, so what I did is write the following on the container of this carousel: itemprop='photos' itemscope itemtype="http://schema.org/ImageObject", i.e.
        <div class="tourism-product-media-gallery" itemprop='photos' itemscope itemtype="http://schema.org/ImageObject">
    and then on each and every image I wrote the following: itemprop="contentURL", i.e.
        <img src="@mediaItem.NormalImage" alt="@mediaItemCaption" itemprop="contentURL"/>
    But I am not convinced that this is the way it should be. Does anyone have any insight or more knowledge on this? Thanks. Note: here are the results from the Google rich snippet testing tool: click here
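
    A minimal sketch of one plausible arrangement (an assumption, not a verified answer): make each image its own ImageObject inside the business's item scope, and note that the schema.org property is spelled contentUrl:

        <div class="tourism-product-media-gallery">
          <div itemprop="photo" itemscope itemtype="http://schema.org/ImageObject">
            <img itemprop="contentUrl" src="/media/tour-1.jpg" alt="Tour photo 1">
          </div>
          <div itemprop="photo" itemscope itemtype="http://schema.org/ImageObject">
            <img itemprop="contentUrl" src="/media/tour-2.jpg" alt="Tour photo 2">
          </div>
        </div>

    Here the gallery is assumed to sit inside an element already marked up as the LocalBusiness, so each photo attaches to it; schema.org lists photo on LocalBusiness, with photos as an older variant.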

    Read the article

  • How to enable customers to use their own domain for sites hosted by me

    - by Scott
    I am thinking of running a self-service site builder, but I was wondering how I would allow customers to use domains that they already own. Is that even possible? Let's say my site is www.bestsitebuildingwebsite.com and each customer has URLs like this:
        www.bestsitebuildingwebsite.com/frances
        www.bestsitebuildingwebsite.com/eden
        www.bestsitebuildingwebsite.com/john
    And a customer has a domain called widgets.com. Is it actually possible for widgets.com to go to my site somehow and have hashes in the URL still work (my site makes use of hashes for AJAX queries), and for their site to still have good SEO with Google? Thanks, Scott
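
    The usual pattern (a sketch with placeholder names, not a full recipe) is to have the customer point a CNAME at your service and then route on the Host header instead of the /frances-style path:

        ; in the customer's DNS zone (widgets.com)
        www.widgets.com.   IN CNAME   bestsitebuildingwebsite.com.

        # catch-all virtual host on your server (Apache shown as an example)
        <VirtualHost *:80>
            ServerName bestsitebuildingwebsite.com
            ServerAlias *
            # the application looks at the Host: header (e.g. www.widgets.com)
            # and serves the matching customer's site
            DocumentRoot /var/www/sitebuilder
        </VirtualHost>

    Because pages are then served from the customer's own hostname, URL fragments (#hashes) keep working exactly as they do on your main domain; content reachable only through hash-based AJAX navigation remains hard for Google to index either way.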

    Read the article

  • Best way to prevent Google from indexing a directory [duplicate]

    - by Gkhan14
    This question already has an answer here: "Stopping Google index some web pages I have" (5 answers)
    I've researched many methods of preventing Google and other search engines from crawling a specific directory. The two most popular ones I've seen are:
    - Adding it to the robots.txt file: Disallow: /directory/
    - Adding a meta tag: <meta name="robots" content="noindex, nofollow">
    Which method would work best? I want this directory to remain "invisible" to search engines so it does not affect any of my site's ranking. In other words, I want this directory to be neutral/invisible and "just there", without affecting any ranking. Which method would be the best way to achieve this?
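
    A third option worth knowing about (a sketch, assuming Apache with mod_headers) is sending the directive as an HTTP header for everything in the directory, which also covers non-HTML files. Note that for either noindex approach the directory must not also be blocked in robots.txt, because a crawler that never fetches the pages never sees the directive:

        # /directory/.htaccess
        <IfModule mod_headers.c>
          Header set X-Robots-Tag "noindex, nofollow"
        </IfModule>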

    Read the article

  • Permanent redirect domain to www subdomain without web.config

    - by Lord Simpson
    I've just set up a site via 1and1 and have run into an issue. I want to accomplish the simple task of redirecting the root domain to the www subdomain; however, due to complications I can't seem to find a way to get it to work. I'm on a Microsoft (ASP.NET) package, so I can't use .htaccess; also, the IIS server they provide doesn't have the URL Rewrite module installed (so I can't use <rewrite> in web.config). They have built-in HTTP forwarding options, but if I set the root domain to redirect to the www subdomain it just redirects infinitely. Hopefully there is some obvious option/method I've missed during the past two days of searching!
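
    Since the package is ASP.NET, one fallback (a sketch, assuming .NET 4.0 or later for Response.RedirectPermanent; the file name is hypothetical) is to issue the 301 from the application itself in Global.asax, which needs neither .htaccess nor the URL Rewrite module:

        // Global.asax.cs
        using System;
        using System.Web;

        public class Global : HttpApplication
        {
            protected void Application_BeginRequest(object sender, EventArgs e)
            {
                var url = Request.Url;
                if (!url.Host.StartsWith("www.", StringComparison.OrdinalIgnoreCase))
                {
                    // 301 example.com/... to www.example.com/...
                    Response.RedirectPermanent(url.Scheme + "://www." + url.Host + url.PathAndQuery);
                }
            }
        }

    On earlier framework versions the same effect can be had by setting Response.StatusCode to 301 and the Location header manually.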

    Read the article
