Search Results

Search found 9717 results on 389 pages for 'gkt pro'.

Page 141/389 | < Previous Page | 137 138 139 140 141 142 143 144 145 146 147 148  | Next Page >

  • Does Google use Chrome to check whether a link is used by humans or is just there for bots?

    - by sam
    Does clicking a link in Chrome tell Google that the link is used by humans, and is therefore not just automated backlink spam? It sounds weird, but I read it today on a slightly obscure SEO blog: they mentioned clicking the backlinks they make in a version of Chrome with the "send data anonymously to Google" feature turned on. It sounded a bit far-fetched, but then I thought there could be some truth to it: with Google now looking harder at "spammy" links, it would mean at least some humans are using them. Has anyone else heard anything about this?

    Read the article

  • How can I check whether I have recovered from Penguin 2.0?

    - by Simon Walker
    I have a 3-year-old website which was hit by Penguin 2.0 in May. The website traffic dropped almost 30%. I have been working hard on the website for the last 2.5 months, and its traffic recovered in the last week of August. In fact, I am receiving more traffic than ever. When I look at the stats, I find my website's search engine visibility has improved: it is now appearing for more search queries, and its impressions have also increased. What I am worried about is that my website is nowhere in the top 5 pages for keywords with high competition and the highest search volume. They are few in number, but important. Should I consider my current situation a full recovery, or only a partial one? And if it is only partial, how come traffic is higher than it was before Penguin 2.0?

    Read the article

  • AdSense Custom Search Ads - custom query

    - by Alex
    I'm trying to set up a custom search ad, but I am not sure about the query. The implementation guide (https://developers.google.com/custom-search-ads/docs/implementation-guide) says 'query' should be dynamic based on your page: "This variable targets the ads and therefore should always match what the user on your site has just performed a search for." Now, what I understand is: I have to program my page so that the query variable contains some custom words. Am I right? If a user gets to my site by clicking on an AdSense ad, there is no way to "know" what the user searched for and fill in my query accordingly, right? Thanks for any help!
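
    For reference, this is roughly how I understood the setup; a minimal sketch, assuming a hypothetical publisher ID and container element, with 'query' filled in from my own page logic:

        var pageOptions = {
          'pubId': 'partner-pub-0000000000000000', // hypothetical publisher ID
          'query': myQueryFromPage                 // whatever my page decides the visitor searched for
        };
        var adblock = { 'container': 'adcontainer1' }; // assumed DOM element ID
        _googCsa('ads', pageOptions, adblock);

    If that's right, the open question is just what to put in myQueryFromPage when the visitor arrived from an ad rather than from a search.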

    Read the article

  • WordPress Multisite and Google Analytics in subfolders with mapped domains

    - by David
    I have a WordPress multisite with subfolders. The site's subfolders are mapped to domains, which are set as primary. I'm using the 'Google Analytics Multisite Async' code to track things. From what I can see it's tracking the sites fine (getting page hits for each site in Google Analytics), barring the original site in the multisite, which in the content overview lists the other domains as content along with the original domain's traffic. I don't want to track any traffic for my original site other than what actually goes to it, i.e. I don't want it tracking my other sites in the multisite. E.g. domain1.com is my original site, and I have lots of other sites in the multisite, say domain2.com and domain3.com. In the content overview in Analytics, it lists, say, domain2.com as content. Can I tell it to filter these out somehow, either in Analytics or within WordPress? Hopefully I've explained that clearly!
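
    For illustration, the direction I'm considering is giving the original site its own tracker scoped to its domain; a minimal sketch using the classic async syntax, with a hypothetical property ID:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-00000000-1']);  // hypothetical property used only on domain1.com
        _gaq.push(['_setDomainName', 'domain1.com']); // scope the tracking cookie to this domain
        _gaq.push(['_trackPageview']);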

    Read the article

  • Is there a modern (e.g. NoSQL) web analytics solution based on log files?

    - by Martin
    I have been using AWStats for many years to process my log files, but I am missing many possibilities (like cross-domain reports) and I hate being stuck with extra fields I created years ago. Anyway, I am not going to continue using this script. Is there a modern Apache log analytics solution based on modern storage technologies like NoSQL, or at least one somehow ready to cope with large datasets efficiently? I am primarily looking for something that generates nice sortable and searchable outputs with a focus on web analytics, before I have to write my own frontends (so Graylog2 is not an option). This question is purely about log-file-based solutions.

    Read the article

  • Reviewing an advertiser / ads

    - by Joyce Babu
    I recently got a direct ad campaign for one of my sites. The advertiser agreed to my price without much bargaining. No contract was signed, but I had specified prepayment as part of our agreement. The advertiser is not a major network; currently they are showing affiliate ads. Some things about this deal seem fishy to me:
    - The amount was direct-deposited to our account, not transferred from their company account.
    - Just below the visible ad, I can see a hidden iframe which contains the flight search widget of a major airline (could this be cookie stuffing?).
    - They are contacting me from a Gmail account.
    - They did not insist on a signed contract.

    How can I ensure that the advertiser is legitimate and is not using the ad slot for an illegal purpose?

    Read the article

  • SEO and new site - visibility best practices

    - by Ispuk
    Since I launched a new website, I have been wondering: what are the best practices to make the site's visibility grow faster than it would by just leaving the site online? I mean, which internet channels are good for speeding up the visibility of a new site? Can anyone share some tricks they use when launching a new site? I'm not talking about spam, advertising or technical SEO tips (the site is well built, with all the main technical SEO tricks in place).

    Read the article

  • schema.org 'reviewRating' tag not recognized by Google snippet testing tool

    - by saravanak
    I'm trying to add more structural information to my webpages by using the microdata format described at www.schema.org. The procedure seems straightforward, but I'm having issues validating my results in the Google Rich Snippets Testing Tool. On my review page I'm using the 'reviewRating' property to specify rating values for that particular review. I followed the same format as defined at schema.org/Rating, but this markup fails validation in Google's rich snippet testing tool with the following error info:

        Item Type: http://schema.org/rating
        reviewrating = 5
        ratingvalue = 5
        Warning: Property "reviewrating" was not found.
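
    For comparison, a minimal sketch of the nesting I believe schema.org expects, with reviewRating living on the Review item and pointing at a nested Rating (names and values here are illustrative):

        <div itemscope itemtype="http://schema.org/Review">
          <span itemprop="name">Great product</span>
          <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
            <span itemprop="ratingValue">5</span>
            <meta itemprop="bestRating" content="5" />
          </div>
        </div>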

    Read the article

  • Forward .html/.htm to .php with web.config

    - by PhilipK
    I'm moving a site from my Linux-hosted server to a client's Windows-hosted server. The .htaccess file no longer works, and I'm told that Windows servers use web.config instead. How can I forward all users accessing .html and .htm files to the equivalent .php file?

    Server info:
    - OS/Hosting Type: Windows / Shared Hosting
    - .Net Runtime Version: ASP.Net 2.0/3.0/3.5
    - PHP Version: PHP 5.2
    - IIS Version: IIS 7.0
    - Data Center: US Regional

    EDIT: Hosting is provided by GoDaddy. I was told by a friend that the following should work, but it has no effect on the site:

        <configuration>
          <system.webServer>
            <handlers>
              <add name="PHP-FastCGI" verb="*" path="*.html" modules="FastCgiModule"
                   scriptProcessor="c:\php\php-cgi.exe" resourceType="Either" />
            </handlers>
          </system.webServer>
        </configuration>
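
    For what it's worth, the other direction I've been reading about is a rewrite rather than a handler mapping; a minimal sketch using the IIS 7 URL Rewrite module, assuming the shared host actually has that module installed (the rule name is arbitrary):

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="html-to-php" stopProcessing="true">
                  <match url="^(.*)\.html?$" />
                  <action type="Redirect" url="{R:1}.php" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>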

    Read the article

  • Is there an easier way to implement 301 redirects when converting a site to WordPress

    - by Amanda
    I have just converted a website to WordPress. The old site has hundreds of hard-coded HTML files, and the new site does not match the old site's directory structure or file-naming system (bad SEO in the original site), so I can't place any "blanket" 301 redirects. It's been at least 2 months, and the old links are still appearing in Google searches, despite a Google-friendly sitemap.xml. Do I need to hardcode a 301 for every individual page in my .htaccess file, or am I just misunderstanding 301s and Apache? Is there some other way I can tell Google that my entire site structure has changed?
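
    For clarity, this is the sort of per-page mapping I mean; a sketch in mod_alias syntax with hypothetical paths, one line per legacy file:

        Redirect 301 /old-about.html http://example.com/about/
        Redirect 301 /products/widget.htm http://example.com/widgets/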

    Read the article

  • Ditch cPanel / WHM in favour of manual setup

    - by BWRic
    We currently use cPanel / WHM on a reseller account but are looking at getting a dedicated server. My first thought was to duplicate this setup on the dedicated box to allow us to quickly create new accounts. It will be a managed server, so the hosts will have set up the LAMP stack. I'm curious whether I actually need cPanel and WHM. We don't use many of their features, just creating accounts and databases; clients do not have FTP access. I'm no sysadmin and come from a Windows / GUI background, but I have some knowledge of setting up development servers.

    WHM: Creating accounts. I presume this sets up the Apache virtual host, FTP access and DNS settings. I have some knowledge of editing the Apache files to create virtual hosts. Am I correct in thinking that, as long as the DNS is pointing to the server IP and the virtual host is configured, the server can serve the (PHP) pages? I'm not sure I need per-site FTP access, as only we will have access, so I could have server-wide, /htdocs-only access to view all the sites. The company who supply the dedicated hosts also provide their own DNS management tool, so I don't need the cPanel one.

    MySQL: Creating users and databases. We use cPanel to create the MySQL users and databases. As it's a dedicated box and I can have root access, I think this could be replaced by SQLyog for database management and phpMyAdmin for user management.

    Do I need cPanel, or can I get by editing a few text files to create the accounts, then use the MySQL tools for the databases? Or am I missing something major with how the sites are configured?
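
    To make the manual route concrete, a minimal sketch of what I imagine editing by hand for each new account (domain, paths and credentials are hypothetical):

        <VirtualHost *:80>
            ServerName client-site.example
            DocumentRoot /var/www/client-site.example/htdocs
        </VirtualHost>

    and, for each new client database, something like:

        CREATE DATABASE clientdb;
        CREATE USER 'clientuser'@'localhost' IDENTIFIED BY 'secret';
        GRANT ALL PRIVILEGES ON clientdb.* TO 'clientuser'@'localhost';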

    Read the article

  • Best Text-to-Speech Solution for my Website

    - by Tim Marshall
    I'm working on the 'Ease of Access' section of my website, with options to increase the font size displayed on pages, invert colours and whatnot. I wish to implement a plugin which, if enabled by the user, will read the content on my website aloud. Presumably my best option is a website plugin; however, there might be some programming I've not come across which allows the likes of PHP to read content. I'm not entirely sure how this all works. Best Regards, Tim

    Read the article

  • Why am I seeing unexpected requests for "crossdomain.xml" in my logs?

    - by Bogdacutu
    I'm getting lots of 404 errors for crossdomain.xml. Here are the request details, as provided by Google App Engine:

        404 22ms 19cpu_ms 0kb Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.122 Safari/534.30
        69.130.*.* - - [24/Jul/2011:07:43:42 -0700] "GET /crossdomain.xml HTTP/1.1" 404 124 "http://s.nsdsvc.com/App/DddWrapper.swf?c=3" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.122 Safari/534.30" "app.*.*.*" ms=22 cpu_ms=19 api_cpu_ms=0 cpm_usd=0.000633 instance=00c61b117c557326bef77d341a345431e66b

    I'm not sure what is going on. Can anyone help me solve this issue?
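
    From the referer it looks like a Flash movie (.swf) is probing the app for a cross-domain policy file. If serving one is the right response rather than a 404, my understanding is it would be a minimal file like this sketch (the "*" policy is deliberately permissive and probably not what I'd want in production):

        <?xml version="1.0"?>
        <!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
        <cross-domain-policy>
          <allow-access-from domain="*" />
        </cross-domain-policy>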

    Read the article

  • How much does a website like bytes.com earn?

    - by robin das
    I am running a website similar to bytes.com (an IT Q&A site). My site attracts 600 unique visitors daily, and we are modelling ourselves on bytes.com. Can anyone at least tell me whether we can earn some serious money with this kind of website? WebsiteOutlook estimates bytes.com's daily pageviews at 700,636. Can anyone let me know what kind of earnings we could expect for a dotcom like bytes.com? Please shed some light on this topic, as a lot of energy, time and money goes into building this kind of website. Thanks, Robin Das

    Read the article

  • Redirecting from a 1und1 hosting solution, with URLs intact

    - by Jelmar
    I have done this before on GoDaddy without a hitch, but I cannot seem to figure out this particular case. I have a domain space with the temporary URL http://yogainun.mysubname.com/ and am hosting the domain name that is to be applied to it at 1und1.de. Right now I have set it up so that, from the 1und1 domain hosting, the address http://www.yoga-in-unternehmen.de/ is frame-redirected to the subdomain I just referred to. But this is not what I want: http://www.yoga-in-unternehmen.de/ is to be the domain. With the frame redirect, URLs like http://www.yoga-in-unternehmen.de/example-article do not show up, but that is what I want. With GoDaddy, in a similar case, I just turned on DNS and changed the name servers. That worked without problem, but with 1und1 it does not. Is there something I am missing?

    Read the article

  • Website with sections in Drupal?

    - by Matt Hampel
    What is the best way to create a website with sections in Drupal? Users need to be able to add, remove, and nest pages fairly easily. Pages added to a section should have an appropriate URL, like "/[section name]/[page title]". This seems like a straightforward task, but I can't find the right combination of tools to do it. Subsite comes close, but for some odd reason doesn't set up the correct content paths. The closest I got was creating a book for each subsection, but that feels like using the wrong tool for the job. Edited with my solution: I used Organic Groups with Pathauto. I set Pathauto so that pages in groups have URLs of the form [group path]/[page title].

    Read the article

  • Will search engines reindex a page that has been set to redirect to a page on a newer site?

    - by Luke Duddridge
    We were asked by a client to change a website so that any pages/URLs we were hosting on an older site would now redirect to a newer site hosted somewhere else, on a different domain name to boot. We did this by changing each page in the IIS site management to redirect to a URL on the new domain instead of rendering a page locally. According to the redirect checking tool at http://www.webconfs.com/redirect-check.php, what we have done is search-engine friendly. The problem now is that the client has been on a course learning all about meta tags, and so thinks they have a better understanding of the "matrix" (remember, there is no spoon). As Google still shows the older site in searches, this isn't helping matters. I have tried to explain that we have to wait for Google to reindex. I'm not blowing smoke, am I? I'm now starting to wonder: will the older site always appear in a search, even though the pages don't exist? Is there a better way I should be redirecting their site, to ensure Google stops keeping an index of pages that no longer exist and instead replaces them with the content on the newer site? A suggestion on the site mentioned above is to use the code:

        Response.Status = "301 Moved Permanently"
        Response.AddHeader "Location", "http://www.new-url.com/"

    Does using the option in the IIS management tool to redirect the URL not do the same?
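
    For reference, my understanding is that the IIS 7 redirect option simply writes an httpRedirect section into web.config; a minimal sketch of what I believe that produces (attribute values assumed):

        <configuration>
          <system.webServer>
            <httpRedirect enabled="true" destination="http://www.new-url.com/" httpResponseStatus="Permanent" />
          </system.webServer>
        </configuration>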

    Read the article

  • Tools for managing eCommerce backend

    - by rboarman
    I am working with an eCommerce company that has outgrown their hacked-together backend for managing inventory, pricing and feeds to various shopping engines (Yahoo, 3dcart, Amazon, etc.). They currently manage about 12,000 SKUs and are doing $40M in revenue. Their internal people are working on a new Magento solution, but that is six months away, and they need to replace/improve their current solution in order to hold them over. The current solution was developed by two people who have since left the company. What tools/architecture do other eCommerce sites use to manage their inventory, pricing, product descriptions and feed generation for the shopping engines?

    The current solution looks like this:
    1) Inventory, pricing and product descriptions are maintained in a database and in NetSuite by employees
    2) New products are added to the database via import
    3) Twice a week, data is extracted into a giant Excel spreadsheet
    4) The Excel file adjusts pricing based on some simple algorithms
    5) The Excel file exports about six different CSV feeds, which are manually uploaded to Amazon, 3dcart, Yahoo, Google and Merchant Advantage
       a. Each feed is a variant of the product data with different field names and formatting
       b. Pricing levels differ between feeds
       c. Some products are not sent to all feeds
    6) Orders are manually parsed, and the inventory is adjusted as needed once product is sold

    The new solution should (see the sketch of the pricing rule below):
    1) Import data from ODBC, CSV and NetSuite (CSV via FTP)
    2) Apply pricing changes via simple algorithms (e.g. < $80 add $10, > $200 add $25)
    3) Ensure margins are being met
    4) Format and generate a bunch of CSV and XML feeds
    5) Perhaps upload feeds to shopping engines automatically

    What I need to do is replace the Excel file with something that is maintainable and automated. Something in the .NET stack is preferable but not mandatory. I've been looking at BizTalk, but it may take too long to develop and deploy. Any suggestions?
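
    To pin down what "simple algorithms" means, a sketch of the tiered markup rule as plain JavaScript (the function name is mine; the thresholds are the ones above):

        // hypothetical tiered markup rule
        function applyMarkup(cost) {
          if (cost < 80) return cost + 10;
          if (cost > 200) return cost + 25;
          return cost; // mid-range items keep their base price
        }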

    Read the article

  • Estimate of Hits / Visits / Uniques in order to fall within a given Alexa Tier?

    - by Alex C
    I was wondering if anyone could offer rough estimates of how many hits a day move you into a given Alexa rank:
    - Top 5,000
    - Top 10,000
    - Top 50,000
    - Top 100,000
    - Top 500,000
    - Top 1,000,000

    I know this is incredibly subjective, hence the broad brush strokes with the number ranges... BUT I've got a site currently ranked just over 1.2M worldwide and over 500k in the USA (http://www.alexa.com/siteinfo/fstr.net). Pretty cool for something hand-built on weekends (pats self on back). I was applying to an ad platform and was told that their program doesn't accept webmasters with an Alexa rank greater than 100,000 (time to take back that pat on the back, I guess). I know that my hits in the last 30 days are somewhere on the order of 15,000 uniques and 20,000 pageviews, so I'm wondering: how much harder do I have to work to achieve my next "goals"? I'd like to break into the top million, then re-evaluate from there. It'd be nice to know, very roughly, what those targets translate into. I imagine that Alexa ranks and tiers become very much exponential as you move up, but even anecdotal evidence from other webmasters would be really useful to me (i.e. "I have a site that is ranked X and it got Y hits in the last 30 days"). Thanks :) - Alex

    Read the article

  • How to parse JSON data from the web faster [closed]

    - by Kaidul Islam Sazal
    I have a JSON inventory file, inventory.json, on the server, like this:

        [
          {
            "body": "SUV",
            "color": { "ext": "White diamond pearl", "int": "Taupe" },
            "id": "276181",
            "make": "Acura",
            "miles": 35949,
            "model": "RDX",
            "pic": [ { "full": "http://images1.dealercp.com/90961/000JNBD/001_0292.jpg" } ],
            "power": { "drive": "Front wheel drive", "eng": "2.3L DOHC PGM-FI 16-VALVE", "trans": "Automatic" },
            "price": { "net": 29488 },
            "stock": "6942",
            "trim": "AWD 4dr Tech Pkg SUV",
            "vin": "5J8TB2H53BA000334",
            "year": 2011
          },
          {
            "body": "Sedan",
            "color": { "ext": "Premium white pearl", "int": "Taupe" },
            "id": "275622",
            "make": "Acura",
            "miles": 40923,
            "model": "TSX",
            "pic": [ { "full": "http://images1.dealercp.com/90961/000JMC6/001_1765.jpg" } ],
            "power": { "drive": "Front wheel drive", "eng": "2.4L L4 MPI DOHC 16V", "trans": "Automatic" },
            "price": { "net": 22288 },
            "stock": "6945",
            "trim": "4dr Sdn I4 Auto Sedan",
            "vin": "JH4CU2F66AC011933",
            "year": 2010
          }
        ]

    These are two entries; there are almost 5,000 entries like this. I parse this JSON like so:

        var url = "inventory/inventory.json";
        $.getJSON(url, function (data) {
          $.each(data, function (index, item) { // straightforward loop
            if (item.year == 2012) {
              $('#desc').append(item.make + ' ' + item.model + '<br/>' +
                                item.price.net + '<br/>' + item.pic[0].full);
            }
          });
        });

    This works fine, but the searching and fetching is a little slow, as there are already 5,000 entries and the number is increasing day by day. It is a straightforward loop over the data, a plain brute-force method. Is there a more time-efficient way to parse and search it than the straightforward loop?
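
    One idea I've been toying with, as a sketch: build a lookup table keyed by year once after the download, so later filters don't rescan all 5,000 entries (variable names are mine):

        var byYear = {};
        $.getJSON("inventory/inventory.json", function (data) {
          $.each(data, function (i, item) {
            (byYear[item.year] = byYear[item.year] || []).push(item); // group once
          });
          var hits = byYear[2012] || []; // later lookups avoid a full scan
        });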

    Read the article

  • WordPress .htaccess preventing subfolder access

    - by John K.
    This is sort of a goofy setup, but it's not in my power to reconfigure it at this time. I'm running in a shared hosting environment. The domain is example.com. This is an add-on domain on the host side, with example.com being redirected to the www/example.com sub-directory. That directory houses a standard WordPress site which acts as the main site when you visit example.com. The .htaccess file within that directory is:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteRule ^wp-admin/profile\.php$ /ssm/welcome [R]
        </IfModule>

    I have a subdirectory at the root level, alongside the /example.com subdirectory, that houses a CakePHP application. That subdirectory is /tracker. My problem is that when I attempt to browse to example.com/tracker, I get a 404 from WordPress because permalinks are on. What I think I need is a rewrite rule in the WordPress .htaccess file that short-circuits the existing rewrite rules and permits example.com/tracker to work independently of the WordPress install, or a rewrite rule at the root level that short-circuits the redirect to the /example.com directory in the first place. Not sure how well I explained that, so here's a summary. The www/ directory structure:

        example.com/
        tracker/

    An add-on domain of www.example.com redirecting to the /example.com directory with WordPress, and a tracker/ directory running CakePHP, which I would like to access via www.example.com/tracker. If you need further info or clarification, let me know!
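
    To show what I mean by "short-circuit", a sketch of the kind of early rule I have in mind, placed above the WordPress block (untested; /tracker is my real directory):

        # pass /tracker through untouched, before WordPress's catch-all runs
        RewriteRule ^tracker(/.*)?$ - [L]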

    Read the article

  • QapTcha error issue. Works locally, not on live server [migrated]

    - by BlassFemur
    I am adding QapTcha (http://demos.myjqueryplugins.com/qaptcha/) to a website that I am working on, and I'm getting the error "Uncaught TypeError: Cannot read property 'error' of null". What's weird to me is that everything works perfectly locally, with no errors or anything. Once I uploaded via FTP to the live server, I get the above error. Below is the block of code that seems to generate it; the console points at the if(!data.error) line:

        Slider.draggable({
          revert: function(){
            if(opts.autoRevert) {
              if(parseInt(Slider.css("left")) > (bgSlider.width()-Slider.width()-10)) return false;
              else return true;
            }
          },
          containment: bgSlider,
          axis: 'x',
          stop: function(event, ui){
            if(ui.position.left > (bgSlider.width()-Slider.width()-10))
            {
              // set the SESSION iQaptcha in PHP file
              $.post(opts.PHPfile, {
                action: 'qaptcha',
                qaptcha_key: inputQapTcha.attr('name')
              }, function(data) {
                if(!data.error) // <-- Uncaught TypeError: Cannot read property 'error' of null
                {
                  Slider.draggable('disable').css('cursor','default');
                  inputQapTcha.val('');
                  TxtStatus.text(opts.txtUnlock).addClass('dropSuccess').removeClass('dropError');
                  form.find('input[type=\'submit\']').removeAttr('disabled');
                  if(opts.autoSubmit) form.find('input[type=\'submit\']').trigger('click');
                }
              }, 'json');
            }
          }
        });

    I'm not really sure what's going on, or why it works locally and not on the server. Any help/suggestions would be appreciated. Thanks

    Read the article

  • mod_rewrite works within a directory but not at the root

    - by Anvesh Saxena
    I am having a problem with my RewriteRules for the tags portion of my site. What I have been able to work out is that the rule is at least being triggered, because the page tags.php is rendered, but without the URL parameters. The .htaccess file with the rules is at the root of my sub-domain and has the following content for the tags portion:

        # Rewrite rule for tags
        RewriteRule ^tags/(\w+)/(\d+)/?$ tags.php?tag_name=$1&tag_id=$2
        RewriteRule ^tags/(\w+)/?$ tags.php?tag_name=$1
        RewriteRule ^tags/?$ tags.php?tag_name=

    What I haven't been able to work out is that a similar .htaccess file exists in a directory within my sub-domain and works as expected, with the URL parameters available. The .htaccess file within the directory reads as follows:

        # Rewrite rule for tags
        RewriteRule ^tags/(\w+)/(\d+)/?$ restAPI.php?type=tags&tag_name=$1&tag_id=$2
        RewriteRule ^tags/(\w+)/?$ restAPI.php?type=tags&tag_name=$1
        RewriteRule ^tags/?$ restAPI.php?type=tags&tag_name=

    Could anyone point out the problem in my rewrite rules? I am also sometimes getting an Internal Server Error, which I am second-guessing is due to the same problem. Note: I have Apache 2.2.23 on my shared hosting.

    Read the article

  • Very few visitors on Analytics: incorrect setting?

    - by Akaban
    Analytics has been driving me crazy for quite a long time. I have a 2-year-old website, started with Aruba (an Italian provider) and then transferred to HostGator. It's a WordPress blog plus a MyBB forum, and on both platforms I have the Analytics code in the footer. The problem is that the stats in Analytics are simply ridiculous compared to the numbers reported first by Aruba and then by HostGator. I think the Aruba/HostGator numbers are correct, simply because the number of daily users connected to the forum alone is higher than the Analytics numbers. I know it's a really confused request, but maybe you can help me understand the problem.

    Read the article

  • Hidden form and SEO

    - by AntonAL
    I'm using hidden forms to collect some statistics. Will this incur any penalty from search engines?

    Update 1: I'm collecting statistics based on user interaction with my website. For example, POST requests will be sent to the server when:
    - the user stops a playing video
    - the user has watched a video till its end
    - etc.

    Using form_remote_for in Rails, I'm just rendering the form and keeping it invisible. The reason for doing it this way is to utilize authenticity tokens and to have less to code. Via JavaScript, I only fill in some hidden fields and initiate form submission, as in the sketch below.
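
    To make it concrete, a sketch of the JavaScript side (the element IDs are made up for the example):

        // fill the hidden fields, then fire the hidden form's submit
        document.getElementById('video_event').value = 'ended';
        document.getElementById('stats_form').submit();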

    Read the article
