Search Results


  • Does Fetch as Googlebot still support Google's AJAX crawling proposal?

    - by Gunchars
    I spent half a day implementing server-side HTML generation for modal pages based on their proposal (link), but it seems the Fetch as Googlebot feature in Webmaster Tools completely ignores the URL fragment. I've verified that the _escaped_fragment_ functionality is working on my server (example), but when I submit a URL like /#!/recipes, Googlebot just fetches /. There aren't any recent confirmations that it's working, and honestly, it wouldn't surprise me if they had silently dropped the functionality without even editing the docs.
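
    For context, a minimal sketch of the server side of the proposal (the snapshots/ layout and the render_snapshot() helper are assumptions for illustration, not part of the scheme): the crawler is supposed to rewrite /#!/recipes into /?_escaped_fragment_=/recipes, and the server answers that request with a pre-rendered HTML snapshot.

        <?php
        // index.php - answer crawler requests with a pre-rendered snapshot.
        // Under the AJAX crawling scheme, /#!/recipes arrives here as
        // /?_escaped_fragment_=/recipes.

        function render_snapshot($fragment) {
            // Hypothetical lookup of a pre-generated static snapshot file.
            $file = 'snapshots/' . preg_replace('/[^a-z0-9_-]/i', '_', $fragment) . '.html';
            return is_file($file) ? file_get_contents($file)
                                  : '<html><body>Not found</body></html>';
        }

        if (isset($_GET['_escaped_fragment_'])) {
            header('Content-Type: text/html; charset=utf-8');
            echo render_snapshot($_GET['_escaped_fragment_']);
            exit;
        }

        // Ordinary visitors get the JavaScript application shell.
        readfile('app.html');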

    Read the article

  • SIMPLEST way to set up password protection for a static site, with basic admin UI?

    - by Joseph Turian
    I have a static site, and I would like the simplest approach to password-protecting a directory, with a basic admin UI for adding/removing users. I will have so few users that I don't care about performance, and I don't care if it's PHP or Django or whatever; I just want a complete software package. Apache basic auth isn't good, because you can't log out, and there is no UI for adding users. I tried throwing everything behind Django auth and serving the files through Django; however, my CSS comes back with text/plain instead of text/css headers, so Chrome doesn't show any stylesheets. I can't use mod_xsendfile on my server because I can't reconfigure Apache to add new modules, and I think that approach is overkill anyway. I could try configuring Nginx's X-Accel-Redirect, but that requires implementing all the auth code in Django myself, and I'd prefer an existing solution; still, that is my backup plan. Is there a code package that implements authentication with a basic admin UI for a static site?
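
    Nothing below is an existing package; purely for illustration, a minimal sketch of the sort of thing being asked for, in plain PHP (users.json, the protected/ directory, and the f query parameter are made-up conventions):

        <?php
        // gate.php - a minimal password gate for a static directory.
        // Users live in users.json as {"name": "<password_hash>", ...},
        // with hashes created by password_hash(); a tiny admin UI would
        // just add or remove entries in that file.
        session_start();

        $users = json_decode(file_get_contents(__DIR__ . '/users.json'), true);

        // Handle a login attempt.
        if (isset($_POST['user'], $_POST['pass'])
                && isset($users[$_POST['user']])
                && password_verify($_POST['pass'], $users[$_POST['user']])) {
            $_SESSION['user'] = $_POST['user'];
        }

        // Not logged in: show a login form (logout is just session_destroy()).
        if (empty($_SESSION['user'])) {
            echo '<form method="post">User: <input name="user"> '
               . 'Password: <input type="password" name="pass"> '
               . '<button>Log in</button></form>';
            exit;
        }

        // Logged in: stream the requested file with the right content type.
        $file = basename(isset($_GET['f']) ? $_GET['f'] : 'index.html'); // no traversal
        $path = __DIR__ . '/protected/' . $file;
        if (!is_file($path)) { header('HTTP/1.0 404 Not Found'); exit; }
        $types = array('html' => 'text/html', 'css' => 'text/css', 'js' => 'text/javascript');
        $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
        header('Content-Type: ' . (isset($types[$ext]) ? $types[$ext] : 'application/octet-stream'));
        readfile($path);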

    Read the article

  • Multiple domain links on Google from one WordPress site

    - by user557318
    At present, when I Google the domain name of the WordPress sites I have worked on, I receive at least three listings (often the top three). The first listing is the only one I am interested in seeing; the others are individual pages from that WordPress site, i.e. 1st hit: www.domain.com, 2nd hit: www.domain.com/about, 3rd hit: www.domain.com/designers. Does anybody know if it's possible to remove all the links except the absolute www.domain.com?

    Read the article

  • Google Analytics - include filter not working

    - by gerl
    I just added an include filter this morning for my domain (test.org). I have: Custom Filter → Include → Request URI: ^/test-a/46212$|^/test-a/46212|^/test-a/46315. Now, when I go to Content → Site Content → All Pages, I still see stats for pages that I didn't include in my filter; for example, I see /somethingelse. I only want to see stats for /test-a/46212 and whatever else is in my filter. Please let me know what I'm doing wrong.
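
    As a sanity check on the expression itself, a small sketch that tests the filter pattern the way a regular expression engine sees it (note the first alternative ^/test-a/46212$ is already covered by the second):

        <?php
        // Test which request URIs the filter pattern keeps. Note that GA
        // filters apply at collection time: they only affect hits recorded
        // after the filter was added, never historical data already in the
        // profile.
        $pattern = '#^/test-a/46212$|^/test-a/46212|^/test-a/46315#';
        foreach (array('/test-a/46212', '/test-a/46315', '/somethingelse') as $uri) {
            printf("%-16s %s\n", $uri, preg_match($pattern, $uri) ? 'kept' : 'excluded');
        }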

    Read the article

  • What is the right approach to use adsense with responsive web design?

    - by Sisir
    Recently I have been studying responsive design a lot and have designed a couple of sites. But I was wondering: how would I use Google AdSense (which is pixel-based) ads in my responsive design? A typical example: suppose I have a 728x90 ad in the header. And if I take a mobile-first approach, I would need different ad sizes for different viewports, but Google doesn't allow more than three ad units per page (as far as I know). So, the question: what is the right approach/best practice for using Google AdSense in a responsive site design?
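
    For what it's worth, AdSense later gained responsive ad units that let a single unit adapt to its container, which sidesteps the fixed-size problem; a minimal sketch (the data-ad-client and data-ad-slot values are placeholders):

        <!-- Responsive ad unit: the <ins> fills its container and AdSense
             picks a creative that fits, so one unit can serve both desktop
             and mobile layouts. -->
        <script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
        <ins class="adsbygoogle"
             style="display:block"
             data-ad-client="ca-pub-0000000000000000"
             data-ad-slot="0000000000"
             data-ad-format="auto"></ins>
        <script>
          (adsbygoogle = window.adsbygoogle || []).push({});
        </script>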

    Read the article

  • Not able to track traffic on subdomain using Google Analytics

    - by Steven
    I'm trying to track traffic for my subdomain, but it's not happening. This is how it's set up: my partner has a domain called sub1.partner.com, and that domain points to partner1.mydomain.com. The idea is that users think they are browsing my partner's website when they are in fact browsing pages on my server. My tracking code looks like this:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
        _gaq.push(['_setDomainName', '.mysite.com']);
        _gaq.push(['_trackPageview']);

        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();

    In Google Analytics I've created a new account under my main account and called it partner1.mysite.com. On this account I have created a filter:

        Filter type: Include
        Filter field: Host name
        Filter pattern: partner1.mysite.no
        Case sensitive: No

    What more can I try to track traffic on my subdomain?

    UPDATE

    Question 1: Is this line correct?

        _gaq.push(['_setDomainName', '.mysite.com']);

    Question 2: Is it correct that I have to add \ before any punctuation, like \. , in filters?

    Read the article

  • Embedding a generic Google search with autocomplete - not a custom site search

    - by picxelplay
    Most people's home page is google.com. My home page is a custom HTML page hosted on my computer. I do this because I am a web developer with several projects on the go at one time, so I like to have quick links to all of them. On that page I usually just have a link to google.com for when I want to search. But below all of my quick links, I want to add a Google search box (with autocompletions). I first used a simple iframe to embed google.com into the page, but then my search results were confined to that iframe; I want to search for something and have the results open in a new tab. I then came across this code snippet, but it doesn't have autocompletions: http://www.refactory.org/s/google_search/view/2. How can I add autocompletions to this? Or is there a better way of doing it? Thanks in advance for any advice.
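
    One possibility (a sketch, not the refactory.org snippet): a Google Custom Search Engine configured to search the entire web ships with its own autocomplete; the cx value below is a placeholder for a real engine ID.

        <!-- Google Custom Search box with built-in autocomplete. Create an
             engine at google.com/cse, enable "search the entire web" and
             autocompletions, then paste your engine ID into cx below. -->
        <script async src="https://cse.google.com/cse.js?cx=000000000000000000000:aaaaaaaaaaa"></script>
        <div class="gcse-search"></div>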

    Read the article

  • What's an acceptable "Avg. Page Load Time"?

    - by hawbsl
    Is there any industry rule of thumb for what's considered an unacceptable load time vs. an OK one vs. a blisteringly fast one? We're just reviewing some Google Analytics data and getting an Avg. Page Load Time of 0.74 seconds reported. I guess that's OK, but it would be good if some meatier comparison data were available: a blog post, or some analysis of the speeds generally being achieved by various kinds of sites. Any useful links to help interpret these numbers? If you Google it, you just get a lot of results about how to improve your speed; we're not at that stage yet.

    Read the article

  • How can I choose between Linux and Windows hosting? [closed]

    - by Mohamad
    Possible Duplicate: How to find web hosting that meets my requirements? I am a relative beginner when it comes to choosing web servers and hosting plans. I'm about to sign up for a hosting plan with GoDaddy. My main requirements are ColdFusion and MySQL. The plans on offer include Linux- and Windows-based plans. Which one should I choose, and why? I don't have many requirements other than what I mentioned above. I've never used Linux before, but I doubt I'll ever need to do anything beyond tinkering with my account. What are the main advantages of one over the other?

    Read the article

  • Can someone sue me/take my domain?

    - by qwerty
    I have found a great domain that isn't in use, but the .com and .net versions are already taken. There's nothing on those domains, though; they just say they are registered with Network Solutions and are under construction. My question is: if I buy the .org version of the domain, and the .com guys later start a company on that domain, can they sue me or make me change the name because it is too similar to their .com domain? Should I avoid using domains that have already been registered with a different ending?

    Read the article

  • Linking with an image: Background vs <img>

    - by FreshCode
    What is considered best practice (semantically) when using text with an image to link to an internal page or category?

    Option 1:

        <nav>
          <a href="/kittens">
            <img src="kittens.png" />
            <span>Kittens</span>
          </a>
          <a href="/puppies">
            <img src="puppies.png" />
            <span>Puppies</span>
          </a>
        </nav>

    Option 2:

        <nav>
          <a href="/kittens" class="kittens">Kittens</a>
          <a href="/puppies" class="puppies">Puppies</a>
        </nav>

    where the CSS is defined as:

        a.kittens { background-image: url("kittens.png"); width: 40px; height: 60px; }
        a.puppies { background-image: url("puppies.png"); width: 40px; height: 60px; }

    Should I use a styled background for the link, or an <img> inside the anchor element?

    Read the article

  • How do I interpret direct traffic that lands on random pages?

    - by mfg
    Looking at yesterday, according to Google Analytics, I got six direct visitors to my site (their source/medium is direct/(none)). Only one landed on the actual domain root; the other five ended up on miscellaneous pages like foo.com/xyz.html. I did not send out links to people by email, and I'm not sure how likely it is that people would have copy/pasted the URLs. How do these visitors end up there? Is there a way to better capture where they might be coming from?

    Read the article

  • How to handle new domain names?

    - by michael
    I have a new product, which I'll call a pen ink reloader. I have a website using my product's name, for example www.inkywink.com, which I want to be found by searches for keywords such as "pen ink", "pen out of ink", "ink for pens", etc., since nobody knows that a pen ink reloader exists. I see that it's quite difficult to get on the front page for these keywords, since they have lots of competition. However, I notice that the exact phrases I want to rank highly for are available as domains. I purchase www.penink.com and www.penoutofink.com, which for argument's sake are highly searched and the perfect keywords to get eyes on my money site, www.inkywink.com. Two questions:

    1. What is my best option to leverage those names so that they appear near the top of searches and bring traffic to my money site? Do I just 301-redirect them to inkywink.com, or should I create a small amount of original content on each, with links to my main site?

    2. If I just have them redirected to inkywink.com, am I able to use keywords in the meta tags and headers for each site separately, or do they all automatically get the same headers and tags as the site to which they're redirected?

    Thanks to anyone who can help, as I'm a real newbie to all this.
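
    For the redirect option, a minimal .htaccess sketch for one of the keyword domains (assuming Apache with mod_rewrite; the domain names reuse the examples from the question). Note that a domain redirected this way serves no page of its own, so there are no meta tags left to set on it:

        # .htaccess on www.penink.com - send every URL to the money site
        # with a permanent (301) redirect.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?penink\.com$ [NC]
        RewriteRule ^(.*)$ http://www.inkywink.com/$1 [R=301,L]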

    Read the article

  • Historical weather data APIs

    - by AJ.
    I am building a web application where I need to display a whole year's month-wise weather conditions, so that users get an idea of what the weather is like and can plan their trips accordingly. I am using Wunderground's history feature, but it does not provide this data for smaller towns and destinations, even some very popular tourist destinations. Are there any alternatives that could provide the same information?

    Read the article

  • Photo and Video backup [closed]

    - by MyNameIsTooCommon
    Apologies if this is the wrong forum; please move if necessary. I want to back up all my photos and videos online. Some videos are HD and up to 2 GB in size. Does anyone know of any good sites? Flickr and Picasa seem the obvious ones, but there appear to be limits on size, and I have also heard bad things about the Picasa UI. I basically want to remove most of the photos from my laptop HD and then sync with the web when I want to download them. Viewing via mobile is not a big thing for me; I just want somewhere safe to back up stuff. Thanks.

    Read the article

  • GAPI output doesn't match Google Analytics website

    - by Yekver
    I have to get the main info about my Google Analytics goals. I'm using the GAPI lib with this code:

        <?php
        require_once 'conf.inc';
        require_once 'gapi.class.php';

        $ga = new gapi(ga_email, ga_password);

        $dimensions = array('pagePath', 'hostname');
        $metrics = array('goalCompletionsAll', 'goalConversionRateAll', 'goalValueAll');

        $ga->requestReportData(ga_profile_id, $dimensions, $metrics,
            '-goalCompletionsAll', '', '2012-09-07', '2012-10-07', 1, 500);

        $gaResults = $ga->getResults();
        foreach ($gaResults as $result) {
            var_dump($result);
        }

    and this is the code's output:

        object(gapiReportEntry)[7]
          private 'metrics' => array (size=3)
            'goalCompletionsAll' => int 12031
            'goalConversionRateAll' => float 206.93154454764
            'goalValueAll' => float 0
          private 'dimensions' => array (size=2)
            'pagePath' => string '/catalogs.php' (length=13)
            'hostname' => string 'www.example.com' (length=13)

        object(gapiReportEntry)[6]
          private 'metrics' => array (size=3)
            'goalCompletionsAll' => int 9744
            'goalConversionRateAll' => float 661.05834464043
            'goalValueAll' => float 0
          private 'dimensions' => array (size=2)
            'pagePath' => string '/price.php' (length=10)
            'hostname' => string 'www.example.com' (length=13)

    What I see on the Google Analytics website on the Goal URLs page, for the same date range, is:

        Goal Completion Location    Goal Completions    Goal Value
        1. /price.php               9,396               $0.00
        2. /saloni.php              3,739               $0.00

    As you can see, the outputs don't match. Why? What's wrong?

    Read the article

  • How does LastPass recognize an actual login form?

    - by Pan.student
    We are currently working on a simple school project using CodeIgniter, where we need a login page. It would be very useful if LastPass could recognise and save logins: we have several accounts with different roles, and entering logins manually is pretty slow. So I was wondering what needs to be done, and where in the files (view? controller?), for LastPass to work as it does on every other website. For example, this is our login form:

        <?php echo form_open('login'); ?>
          <input type="text" id="username" name="username"/>
          <input type="text" id="password" name="password"/>
          <input type="submit" value="Login"/>
        </form>

    Thanks for help. (Could not create a new tag "Lastpass" due to low reputation.)

    [SOLVED] Changed

        <input type="text" id="password" name="password"/>

    to

        <input type="password" id="password" name="password"/>

    so that LastPass can see a real password field.

    Read the article

  • Will URL encoding the image names make a difference to Google?

    - by TheGateKeeper
    Just wondering if it makes any difference to Google whether or not I URL-encode the image names when linking to them. For example, if I have an image named "test-1234-!.jpg", does it make a difference if I refer to it as "test-1234-%21.jpg"? The reason I am asking is that I am making a major shift in the way my website works, and while new image names will not be URL-encoded, all of the past ones are. I want to see if it is worth renaming all of them or if I should just leave them as they are.
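
    For concreteness, a small sketch of the two spellings in question: PHP's rawurlencode produces the percent-encoded form, and servers decode %21 back to ! before resolving the path, so both spellings address the same file.

        <?php
        // "!" is percent-encoded as %21; the URL still resolves to the same file.
        $name = 'test-1234-!.jpg';
        echo rawurlencode($name), "\n";               // test-1234-%21.jpg
        echo rawurldecode('test-1234-%21.jpg'), "\n"; // test-1234-!.jpg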

    Read the article

  • Blog: Search results prefer index page over content pages.

    - by jonescb
    I have a typical blog that has recent posts on the main page, while each post's title links to a page that shows only that one article, its comments, and so on. I was looking through some of the keywords used to reach my site and noticed that some searches would only show my main page, not the page for the article. If users have to find the article by scrolling through the main page, it just makes things more difficult. Is there some way I can tell search engines to rank the content page higher than my index page? Or can I do something else, like not displaying the full text of the posts on the main page?
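
    On that last idea, a minimal sketch of excerpting posts on the index page (plain PHP; the excerpt helper and the $post_body / $post_url variables are invented for illustration, standing in for whatever the blog engine provides):

        <?php
        // Show only a short excerpt of each post on the index page, so the
        // full text lives only at the post's own URL.

        function excerpt($html, $words = 55) {
            $text  = trim(strip_tags($html));      // drop markup
            $parts = preg_split('/\s+/', $text);   // split into words
            if (count($parts) <= $words) {
                return $text;
            }
            return implode(' ', array_slice($parts, 0, $words)) . '...';
        }

        $post_body = '<p>Full article text goes here ...</p>'; // sample data
        $post_url  = '/2010/06/my-post.html';

        echo excerpt($post_body), ' <a href="', $post_url, '">Read more</a>';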

    Read the article

  • Avoid penalties for duplicate (multilanguage) shared hosting

    - by Dave
    My concern is about SEO. Now let me explain the scenario. I am making a three-language website. The development is fine, but I am targeting local customers with one domain and international (English-version) visitors with another. E.g.: local, http://www.minhalojadesapatos.com.br (not the real website, just an example!); other, http://www.myshoesstore.com.br. Both domains point to exactly the same hosting and content, but when a user comes through the local domain the default language is set to Portuguese; otherwise the default is English. Language handling on the backend uses PHP sessions and cookies, so with just a click users can change the content language. How do I avoid being SEO-penalised in this context? (Yes, I was over-eager about targeting the market when choosing two domains, but the business really needs it; it is a travel agency.)
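
    One hedged option for this setup is to declare the two domains as language variants rather than duplicates, using rel="alternate" hreflang markup in the <head> of every page; the sketch below reuses the example domains from the question:

        <!-- Tell crawlers these are language variants, not duplicate content. -->
        <link rel="alternate" hreflang="pt-BR" href="http://www.minhalojadesapatos.com.br/" />
        <link rel="alternate" hreflang="en" href="http://www.myshoesstore.com.br/" />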

    Read the article

  • Cookie manager PHP

    - by HaCos
    I own a Joomla commerce store, and although I use Google Analytics to track visitors, I need to install a cookie manager so I can track the cookies that were set on a customer when he placed an order. To be more specific, I am planning to join an affiliate network, and I need to somehow track not only the last visit of a customer but also whether he has a cookie, and from which affiliate network it came.
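
    As a rough illustration (plain PHP, not a ready-made Joomla extension; the aff query parameter and the cookie name are invented for the example), the tracking side can be as small as:

        <?php
        // Remember which affiliate network referred the visitor, so the order
        // can later be attributed. Must run before any output is sent,
        // because setcookie() writes an HTTP header.
        if (isset($_GET['aff']) && !isset($_COOKIE['aff_source'])) {
            $source = preg_replace('/[^a-z0-9_-]/i', '', $_GET['aff']); // sanitize
            setcookie('aff_source', $source, time() + 30 * 24 * 3600, '/'); // 30 days
        }
        // At checkout, read $_COOKIE['aff_source'] and store it with the order.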

    Read the article

  • jQuery/AJAX on old Computers/Browsers

    - by Andresch Serj
    I am working on a platform that will have a lot of users in the so-called "developing countries", so many of them will be using old computers and old browsers in tiny internet cafes. We want to make sure we give them a good user experience and that the website loads as fast as possible. The problem is that while you can save a lot of requests and time using jQuery/AJAX, it also brings along a lot of problems:

    - Will the computers be powerful enough to deal with the client-side scripts?
    - Will the old browsers handle jQuery?

    Does anyone have any experience with these sorts of problems, or know of an article on the topic?
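
    One relevant data point: the jQuery 1.x line keeps support for IE 6-8, which covers most aging browsers. Below is a minimal progressive-enhancement sketch under that assumption (the a.ajax-nav links and #content region are illustrative names): the page stays usable as plain links, and capable browsers get partial-page loads on top.

        <script src="js/jquery-1.12.4.min.js"></script>
        <script>
        // Plain links keep working if this script never runs; browsers that
        // do run it get the faster partial-page load instead.
        $(function () {
          $('a.ajax-nav').on('click', function (e) {
            e.preventDefault();
            // Fetch just the content region of the target page.
            $('#content').load(this.href + ' #content > *');
          });
        });
        </script>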

    Read the article

  • SEO and multiple domains to same site

    - by mwb
    I have one website and two domain names that I want to point at the same site install, so whether you go to name-one.com or name-two.com you see exactly the same site. Now, I can either set up name-two.com to serve a 301 redirect to name-one.com, or set up name-two.com as a CNAME in the DNS pointing to name-one.com. What are the SEO implications of each, and what is recommended? I would guess it's better for branding to use a 301 redirect, so that visitors see one consistent URL for my site, right? The reason I want the two domains is that I want a version with regional letters ('ö' instead of 'oe') in the name.

    Read the article

  • Redirect Google crawler to different robots.txt via .htaccess

    - by user3474818
    I have Googled for the answer all day and still couldn't find one. I have a virtual subdomain, www.static.example.com, which is a mirror site of www.example.com. That means I have just one root folder for both the subdomain and the domain. I want to redirect crawlers to a different robots.txt file - robots_static.txt - when they see .static in the URL; in that file I will forbid indexing via a Disallow rule. I want to do this because I have duplicated content in Google search results: the subdomain shows exactly the same content as the main domain. Does anyone know how I can make crawlers see robots_static.txt instead of robots.txt? What I have managed to find so far is this:

        RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
        RewriteRule ^robots\.txt /robots_static.txt [NC,L]

    but when I check in Webmaster Tools, it still sees robots.txt as my robots file instead of robots_static.txt, so it crawls and indexes everything twice. What did I do wrong? Thanks.

    EDIT: This is my .htaccess file:

        ##
        # @package Joomla
        # @copyright Copyright (C) 2005 - 2013 Open Source Matters. All rights reserved.
        # @license GNU General Public License version 2 or later; see LICENSE.txt
        ##

        ##
        # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
        #
        # The line just below this section: 'Options +FollowSymLinks' may cause problems
        # with some server configurations. It is required for use of mod_rewrite, but may already
        # be set by your server administrator in a way that dissallows changing it in
        # your .htaccess file. If using it causes your server to error out, comment it out (add # to
        # beginning of line), reload your site in your browser and test your sef url's. If they work,
        # it has been set by your server administrator and you do not need it set here.
        ##

        ## Can be commented out if causes errors, see notes above.
        Options +FollowSymLinks

        ## Mod_rewrite in use.
        RewriteEngine On

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www\.
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

        RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
        RewriteRule ^robots\.txt /robots_static.txt [NC,L]

        ## Begin - Rewrite rules to block out some common exploits.
        # If you experience problems on your site block out the operations listed below
        # This attempts to block the most common type of exploit `attempts` to Joomla!
        #
        # Block out any script trying to base64_encode data within the URL.
        RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
        # Block out any script that includes a <script> tag in URL.
        RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
        # Block out any script trying to set a PHP GLOBALS variable via URL.
        RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
        # Block out any script trying to modify a _REQUEST variable via URL.
        RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
        # Return 403 Forbidden header and show the content of the root homepage
        RewriteRule .* index.php [F]
        #
        ## End - Rewrite rules to block out some common exploits.

        ## Begin - Custom redirects
        #
        # If you need to redirect some pages, or set a canonical non-www to
        # www redirect (or vice versa), place that code here. Ensure those
        # redirects use the correct RewriteRule syntax and the [R=301,L] flags.
        #
        ## End - Custom redirects

        ##
        # Uncomment following line if your webserver's URL
        # is not directly related to physical file paths.
        # Update Your Joomla! Directory (just / for root).
        ##
        # RewriteBase /

        RewriteCond %{THE_REQUEST} ^GET.*index\.php [NC]
        RewriteCond %{THE_REQUEST} !/system/.*
        RewriteRule (.*?)index\.php/*(.*) /$1$2 [R=301,L]
        RewriteCond %{THE_REQUEST} ^GET

        ## Begin - Joomla! core SEF Section.
        #
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
        #
        # If the requested path and file is not /index.php and the request
        # has not already been internally rewritten to the index.php script
        RewriteCond %{REQUEST_URI} !^/index\.php
        # and the request is for something within the component folder,
        # or for the site root, or for an extensionless URL, or the
        # requested URL ends with one of the listed extensions
        RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
        # and the requested path and file doesn't directly match a physical file
        RewriteCond %{REQUEST_FILENAME} !-f
        # and the requested path and file doesn't directly match a physical folder
        RewriteCond %{REQUEST_FILENAME} !-d
        # internally rewrite the request to the index.php script
        RewriteRule .* index.php [L]
        #
        ## End - Joomla! core SEF Section.

        <FilesMatch "\.(ico|pdf|flv|jpg|ttf|jpg|jpeg|png|gif|js|css|swf)$">
        Header set Expires "Wed, 15 Apr 2020 20:00:00 GMT"
        Header set Cache-Control "public"
        </FilesMatch>

        <ifModule mod_headers.c>
        Header set Connection keep-alive
        </ifModule>

        ########## Begin - Remove Etags
        #
        FileETag none
        #
        ########## End - Remove Etags
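
    For comparison, an untested simplification of the rule (a sketch, not a verified fix): match the host condition on its own, without the THE_REQUEST condition, and keep the rule above the Joomla SEF block so nothing else catches robots.txt first.

        RewriteEngine On
        # On the static mirror host, serve robots_static.txt for robots.txt.
        RewriteCond %{HTTP_HOST} ^www\.static\. [NC]
        RewriteRule ^robots\.txt$ robots_static.txt [L]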

    Read the article

  • Redesigning an old site, structure change, etc.

    - by RhymeGuy
    I have an old site built in 2006; it has around 200 pages and 500 pictures. Every single page is indexed, as are the images. It is very well ranked for targeted keywords and I receive a good amount of SEO traffic (I guess that's due to the various campaigns, branding, PPC, etc.). Problem: the site has an outdated design, the pages and images have less-than-ideal names, there are no heading or alt tags, and it was built with tables, inline CSS, etc. Goal: completely redesign the site, use divs, change file names, add proper metadata, alt tags, etc. Question: how can this affect current SEO positions? I will 301-redirect every single page to its new counterpart and build a sitemap, but what do I do with the images? Do I need to redirect them also? Any other suggestions?
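
    For the images, the same 301 treatment works; a hedged .htaccess sketch (assuming Apache, with made-up old and new paths purely for illustration):

        # One-off moves for individual pages.
        Redirect 301 /old-page.html /new-page.html

        # Pattern-based moves for whole image directories, e.g.
        # /images/photo_001.jpg -> /img/photo_001.jpg
        RedirectMatch 301 ^/images/(.*)\.jpg$ /img/$1.jpg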

    Read the article
