Search Results

Search found 9717 results on 389 pages for 'pro'.

Page 194/389

  • Not able to track traffic on subdomain using Google Analytics

    - by Steven
    I'm trying to track traffic for my sub-domain, but it's not happening. This is how it's set up: my partner has a domain called sub1.partner.com. This domain points to partner1.mydomain.com. The idea is that users think they are browsing my partner's website, when they are in fact browsing pages on my server. My tracking code looks like this:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
        _gaq.push(['_setDomainName', '.mysite.com']);
        _gaq.push(['_trackPageview']);
        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();

    In Google Analytics I've created a new account under my main account and called it partner1.mysite.com. On this account I have created a filter:

        Filter type: Include
        Filter field: Host name
        Filter pattern: partner1.mysite.no
        Case sensitive: No

    What more can I try to track traffic on my subdomain?

    UPDATE
    Question 1: Is this line correct? _gaq.push(['_setDomainName', '.mysite.com']);
    Question 2: Is it correct that I have to add \ before any punctuation, like \. , in filters?
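    For reference, a minimal sketch of how the pieces usually fit together with the classic ga.js API the question uses; the property ID and host names are placeholders, and the escaped dots reflect that filter patterns are treated as regular expressions:

        // Shared cookie domain so www.mysite.com and partner1.mysite.com
        // report into the same property
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
        _gaq.push(['_setDomainName', '.mysite.com']);
        _gaq.push(['_trackPageview']);

        // Profile filter (regular expression, so the dots are escaped):
        //   Filter field:   Hostname
        //   Filter pattern: partner1\.mysite\.com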

    Read the article

  • wrong SEO result for staging website

    - by OCD
    Hi. We have a domain, let's say http://www.xxxxstaging.com/abc.php?id=10, which has a pretty good ranking for some keywords in Google. However, this is our staging website, and for historical reasons it was exposed to the public. Now we have a production domain (the one we want to be publicly available), say http://www.xxxxproduction.com/abc.php?id=10, which has exactly the same HTML output as xxxxstaging.com. Our question: for exactly the same keyword search, the production site never appears on any page of Google. Is it possible for us to "switch" the ranking in Google, either by telling the robot that they are indeed the same website, or by asking Google to change it for us? Thank you.

    Read the article

  • How to smartly optimize ads on website

    - by YardenST
    I have a content website that presents ads. My team wants to optimize it for a better experience for the users (we really believe our ads are good for our users). We are sure that every website deals with this issue, and there must be some known ways and methods to deal with it that smart people have thought of before. So what I'm looking for is a tested, working method to optimize ads. For example: if I were asking about optimizing my website in Google, I would expect the answer "learn SEO"; if I were asking about optimizing the use of my website, "usability testing"; for navigation, "information architecture". What is the field that deals with optimizing ads?

    Read the article

  • Why do my websites have a first page rank on Bing and Yahoo but not Google? [closed]

    - by Linda Cullum
    I have 3 websites suffering from a drop in ranking with Google and hence a huge drop in traffic. The sudden drop occurred in September and I have not been able to remedy it. For the past 6-10 years my main website http://LearnToSail.Net has ranked from #3 to #1 on the first page of Google and all the other engines for the search term "learn to sail". Now it shows on the first page of Bing and Yahoo but does not show up on ANY page of Google. The only way it does come up is if I add "cd" to the "learn to sail" phrase (we sell a sailing CD on that website). The other websites are http://LearnToSailOnLine.com (search terms "learn to sail online" or "learntosailonline") and historyofthepilgrims.com (search terms "history of the pilgrims" or "historyofthepilgrims"). I get the same result: gone on Google but on the first pages of Bing and Yahoo. I have researched, edited, updated blogs, made sitemaps, prayed to the universe and used Google Webmaster Tools, but nothing is changing and I have lost a lot of business. I host with 1and1.com and have been back and forth with them, but to no avail and no change in traffic. I thought maybe some DNS mapping was off. I used to have a lot of traffic; now I have hardly any. Any advice would be greatly appreciated. I am still working on the issue, of course! This is a really great website here and I am glad I came across it. Thank you, LS Cullum, Little Pines Multimedia

    Read the article

  • Google penalty recovery

    - by sajeev
    I have a site, which is a spiritual site and has nothing to do with commercial or business things. It was on the first page for the keyword RADHANATH SWAMI. Then suddenly, in Sep '13, the site disappeared from Google search. When I checked Webmaster Tools, I saw that there were 40,000 backlinks from a site named http://freeguestbooks.net, but the manual action section in Webmaster Tools said "No manual webspam actions found", and to date it still shows the same. I then contacted the webmaster of freeguestbooks.net and asked him to remove the 40K-odd backlinks, and he very kindly removed them. Webmaster Tools now shows 21,777 backlinks from this site, but over the last two months the rate of decrease of these backlinks has been very slow, almost zero. I contacted the webmaster of freeguestbooks.net again and he confirmed that no backlinks from his site remain. I am also told that my anchor text is over-optimised. Total backlinks to my site as per Webmaster Tools are only 24,937, of which 21,777 are the links from freeguestbooks.net. Could an expert suggest a way to get back into Google? Sajeev, India

    Read the article

  • How do I optimize SEO in a multiblog WordPress install?

    - by user35585
    We are about to launch two product pages plus a corporate website. The goal is to keep a blog on all of the sites, but the question is how to do this in a way that keeps everything unified without confusing Google's web crawlers. We considered the following options:

    1. One blog from which we retrieve two categories with custom CSS, so a single blog splits into two category-dependent blogs; this way we can get the feeds and point to them.
    2. Two product blogs whose posts we retrieve into a bigger, corporate blog.
    3. Three independent blogs.

    Although I am for the first option, so that we only have to address our content from the product pages, I would sincerely like to hear your opinion. We are afraid that duplicate content or strange link games may make us lose PageRank. How would you do it?

    Read the article

  • Cloudflare displaying cached CSS in-line?

    - by esqew
    I recently enabled CloudFlare on my domain, and when I write HTML like this:

        <link href="css/main.css" rel="stylesheet" />

    the CSS gets served in-line, like this:

        <style>body{padding-top:40px}span.light{font-weight:lighter!important}span.title{font-size:60px;line-height:1;letter-spacing:-1px;color:inherit}</style>

    When I update the file via FTP, the changes are not reflected, which leads me to believe that this is a caching issue. Is this due to CloudFlare? If so, how do I disable the behavior?

    EDIT: I also came to the conclusion that caching is behind this behavior after being able to see changes on the page once I renamed the CSS file.

    Read the article

  • Avoid pagination pages from appearing higher than real content in SERPS

    - by WordPress Developer
    I have a gallery-type website that has about 20,000 pages and naturally it uses pagination. However, sometimes /page/2 appears higher in search results than /post/201339, for example. I'd like to give emphasis to the actual content (posts, videos, whatever the site is about) and not to pages that merely list this content in a paginated manner. What is the best way to avoid this issue? Maybe a NOINDEX,FOLLOW meta tag on the paginated pages?
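    For reference, a minimal sketch of the approach the question floats, placed in the <head> of the paginated listing pages only (URLs are placeholders); Google has also documented rel="prev"/rel="next" hints for paginated series as an alternative:

        <meta name="robots" content="noindex,follow">

        <link rel="prev" href="http://example.com/page/1">
        <link rel="next" href="http://example.com/page/3">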

    Read the article

  • A lot of 302 redirects

    - by user3651934
    I have a website for which one month of stats shows:

        Unique Visitors   6274
        Total Visitors    7260
        Pages visited     9520
        Hits             88891

    What concerns me is the HTTP status code breakdown: 302 Moved Temporarily (redirect) 36302. How come 40% of hits are being redirected? If this is not normal, what could be the possible reasons?

    ------------------------ adding more information ------------------------

    OK, here is the code I'm using in my .htaccess file for clean URLs. Is this causing as many as 36302 redirect hits?

        RewriteCond %{REQUEST_FILENAME}.php -f
        RewriteRule ^([^\.]+)$ $1.php [L]

        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.+[^/])$ $1/ [R]

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]
        RewriteRule ^(.*)/$ index.php?page=$1 [L,QSA]
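    One likely contributor, assuming the rules above are what actually runs: a bare [R] flag in mod_rewrite issues a 302 by default, so every directory request arriving without a trailing slash is logged as a temporary redirect. A hedged sketch of the same trailing-slash rule made an explicit permanent redirect:

        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.+[^/])$ $1/ [R=301,L]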

    Read the article

  • Using .htaccess, can you hide the true URL?

    - by Richard Muthwill
    So I have a web hotel with one main website, http://www.myrootsite.com/, and a few websites in subdirectories, in a folder called projects. I have domain names pointing to the subdirectories, but when holding the mouse over a link in those websites the URLs are shown as http://www.myrootsite.com/projects/mysubsite/contact.html. When I'm on mysubsite.com I want them to be shown as http://www.mysubsite.com/contact.html. I spoke to support for the web hotel and the guy said to try using .htaccess, but I'm not sure exactly how to do this. Thank you very much for your time!

    Edit (for more information): My website is http://www.example1.com/ and I also own http://www.example2.com/. All of example2.com's files are in example1.com/projects/example2/. When you visit example2.com, you'll notice all of the URLs point towards example1.com/projects/example2/, but I want them to point towards example2.com/. Can this be done? I hope this is enough info for you to go on :).

    Edit (for w3d): I go to the URL mysubsite.com and the browser shows the URL mysubsite.com. The service I'm using creates an iframe around myrootsite.com and uses the URL mysubsite.com. I just hate that in Firefox and Internet Explorer, hovering over a link shows that the destination URL is myrootsite.com/projects/mysubsite/...
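    Because the domain is currently served through iframe-style forwarding, .htaccess alone can't change what the links resolve to; the usual approach is to point mysubsite.com's DNS at the same server and give it its own virtual host (or use the host's "addon domain" feature) with the subdirectory as its document root. A hedged sketch with placeholder paths:

        <VirtualHost *:80>
            ServerName  www.mysubsite.com
            ServerAlias mysubsite.com
            # Serve the existing subdirectory directly under its own domain
            DocumentRoot /path/to/myrootsite.com/projects/mysubsite
        </VirtualHost>

    With that in place, relative links such as contact.html resolve under http://www.mysubsite.com/ rather than under the root site's path.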

    Read the article

  • Which tool can tidy/indent HTML5?

    - by user2534
    I've already spent about 2 hours searching Google and trying various tools, but either they don't indent tags per se or they aren't HTML5 compatible, such as the famous HTML Tidy, which was last updated 3 years ago. N.B. I don't want to apply this to the source of a page (like PHP or JS would do) but to the code in the editor, so that a clear hierarchy appears between tags. Ideally I'd like a Mac OS X tool, but I'll take any online tool and, as a last resort, a Wine-compatible one. P.S. At the moment I use Coda from Panic.

    Read the article

  • URL parameter names being changed by user agents

    - by Mike Deck
    In reviewing one of our site's web logs I'm seeing instances where we return a 404 because we expect an id parameter to be sent, but instead we see a di parameter. The resource in question is an image, but which image file actually gets served depends on the id parameter. The expected URL is something like http://images.mysite.com/photo.gif?id=123&width=200&height=300. What I'm seeing in the logs is requests for http://images.mysite.com/photo.gif?di=123&width=200&height=300. The only parameter this happens to is id. It seems unlikely that this is due to a server-side or JavaScript bug, since it affects only a small percentage of our traffic, and we are seeing it across a wide variety of user agents (both mobile and desktop) and IPs. Has anyone else seen this? Is there a browser plugin or other software you're aware of that could be causing it, and if so is there a good way to work around the issue?

    Read the article

  • How can I optimize Apache to use 1GB of RAM on my website? [closed]

    - by Markon
    My VPS plan gives me 1 GB of RAM, burstable to 2 GB. Of course I cannot use 2 GB, or even 1 GB, all day every day, so I'm planning to optimize the performance of my webserver. The average is about 8,000-10,000 hits per hour, which means about 2 connections per second. The maximum reached so far is about 60,000 hits per hour, which means about 16 connections per second. Unluckily my current Apache configuration uses too much memory (when there are no connected clients - usually during the night - it uses about 1 GB), so I've tried to customize the Apache installation to fit my needs. I'm using Ubuntu, kernel 2.6.18, with apache2-mpm-worker, since I've read it requires less memory, and fcgid (+ PHP). This is my /etc/apache2/apache2.conf:

        Timeout 45
        KeepAlive on
        MaxKeepAliveRequests 100
        KeepAliveTimeout 10
        <IfModule mpm_worker_module>
            StartServer 2
            MinSpareThreads 25
            MaxSpareThreads 75
            MaxClients 100
            MaxRequestsPerChild 0
        </IfModule>

    This is the output of ps aux for Apache:

        www-data  9547 0.0 0.3 423828 7268 ? Sl 20:09 0:00 /usr/sbin/apache2 -k start
        root     17714 0.0 0.1  76496 3712 ? Ss Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 17716 0.0 0.0  75560 2048 ? S  Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 17746 0.0 0.1  76228 2384 ? S  Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 20126 0.0 0.3 424852 7588 ? Sl 19:24 0:02 /usr/sbin/apache2 -k start
        www-data 24260 0.0 0.3 424852 7580 ? Sl 19:42 0:01 /usr/sbin/apache2 -k start

    while this is ps aux for PHP5:

        www-data  7461 2.9 2.2 142172 47048 ? S 19:39 1:39 /usr/lib/cgi-bin/php5
        www-data 23845 1.3 1.7 135744 35948 ? S 20:17 0:15 /usr/lib/cgi-bin/php5
        www-data 23900 2.0 1.7 136692 36760 ? S 20:17 0:22 /usr/lib/cgi-bin/php5
        www-data 27907 2.0 2.0 142272 43432 ? S 20:00 0:43 /usr/lib/cgi-bin/php5
        www-data 27909 2.5 1.9 138092 40036 ? S 20:00 0:53 /usr/lib/cgi-bin/php5
        www-data 27993 2.4 2.2 142336 47192 ? S 20:01 0:50 /usr/lib/cgi-bin/php5
        www-data 27999 1.8 1.4 135932 31100 ? S 20:01 0:38 /usr/lib/cgi-bin/php5
        www-data 28230 2.6 1.9 143436 39956 ? S 20:01 0:54 /usr/lib/cgi-bin/php5
        www-data 30708 3.1 2.2 142508 46528 ? S 19:44 1:38 /usr/lib/cgi-bin/php5

    As you can see, it uses a lot of memory. How can I reduce it to fit into just 1 GB of RAM? PS: I'm also thinking about switching to nginx if Apache can't fit my needs...
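    A hedged starting point rather than a definitive configuration, assuming mpm_worker plus mod_fcgid (2.3+ directive names): each PHP process in the ps output above holds roughly 30-45 MB of resident memory, so capping the concurrent PHP process pool usually matters more than the worker thread counts. The numbers below are guesses to be tuned under real load:

        <IfModule mpm_worker_module>
            StartServers          2
            MinSpareThreads      25
            MaxSpareThreads      50
            ThreadsPerChild      25
            MaxClients           50
            MaxRequestsPerChild 500
        </IfModule>

        # mod_fcgid: bound the PHP process pool and recycle processes periodically
        FcgidMaxProcesses          6
        FcgidMaxRequestsPerProcess 500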

    Read the article

  • Allowed keywords for adwords [closed]

    - by Tom Gullen
    Possible duplicate: Ok to target product names in adwords? I've swapped the real words for made-up ones without losing the semantics, which is a little difficult. A competitor is called "Box Maker". Can I target the keywords "box maker" if my company is selling a tool to help you make boxes, or is that disallowed? Would "Box Maker" the company be able to file a complaint with Google, and would it go anywhere? The term "box maker" gets a lot of searches and is an incredibly cost-effective search to target.

    Read the article

  • Linking with an image: Background vs <img>

    - by FreshCode
    What is considered best practice (semantically) when using text with an image to link to an internal page or category?

    Option 1:

        <nav>
          <a href="/kittens">
            <img src="kittens.png" />
            <span>Kittens</span>
          </a>
          <a href="/puppies">
            <img src="puppies.png" />
            <span>Puppies</span>
          </a>
        </nav>

    Option 2:

        <nav>
          <a href="/kittens" class="kittens">Kittens</a>
          <a href="/puppies" class="puppies">Puppies</a>
        </nav>

    where the CSS is defined as:

        a.kittens { background-image:url("kittens.png"); width:40px; height:60px; }
        a.puppies { background-image:url("puppies.png"); width:40px; height:60px; }

    Should I use a styled background for the link, or an <img> inside the anchor element?

    Read the article

  • Apache config that uses two document roots based on whether the requested resource exists in the first [closed]

    - by mattalexx
    Background: I have a client site that consists of a CakePHP installation and a Magento installation:

        /web/example.com/
        /web/example.com/app/                    <== CakePHP
        /web/example.com/app/webroot/            <== DocumentRoot
        /web/example.com/app/webroot/store/      <== Magento
        /web/example.com/config/                 <== Site-wide config
        /web/example.com/vendors/                <== Site-wide libraries

    The server runs Apache 2.2.3.

    The problem: The whole company has FTP access and got used to clogging up the /web/example.com/, /web/example.com/app/webroot/, and /web/example.com/app/webroot/store/ directories with their own files. Sometimes these files need HTTP access and sometimes they don't. In any case, this mess makes my job harder when it comes to maintaining the site. Code merges, tarring the live code, etc., are very complicated and usually require a bunch of filters.

    Abandoned solution: At first, I thought I would set up a new subdomain on the same server, move all of their files there, and change their FTP chroot. But that wouldn't work, for these reasons. Firstly, I have no idea (and neither do they remember) what marketing materials they've sent out that contain URLs to resources they've uploaded to the server, using the main domain, and also using arbitrary subdomains that hit the main virtual host because it has ServerAlias *.example.com. So suddenly having them only use static.example.com isn't feasible. Secondly, the PHP scripts in their projects are potentially very non-portable. I want their files to stay in as similar an environment to the one they were built in as I can, and I do not want to debug their code to make it portable.

    Half-baked solution: After some thought, I decided to find a way to section off the actual website files into another directory that they would not touch. The company's uploaded files would stay where they were, ensuring that I didn't break any of their projects that need HTTP access. It would look something like this:

        /web/example.com/                        <== A bunch of their files are in here
        /web/example.com/app/webroot/            <== 1st DocumentRoot; a bunch of their files are in here
        /web/example.com/app/webroot/store/      <== Some more are in here
        /web/example.com/site/                   <== New dir; contains only site files
        /web/example.com/site/app/               <== CakePHP
        /web/example.com/site/app/webroot/       <== 2nd DocumentRoot
        /web/example.com/site/app/webroot/store/ <== Magento
        /web/example.com/site/config/            <== Site-wide config
        /web/example.com/site/vendors/           <== Site-wide libraries

    After this change, I would only need to pay attention to what's inside /web/example.com/site/, and my job would be a lot easier since I would be the only one changing things in there. So here's where the Apache magic would happen: an HTTP request to http://www.example.com/ should first use /web/example.com/app/webroot/ as the document root. If nothing is found there (no miscellaneous uploaded company projects match), try to serve the request from /web/example.com/site/app/webroot/. Another thing to keep in mind: the site might have problems if $_SERVER['DOCUMENT_ROOT'] reads /web/example.com/app/webroot/ while the actual files are within /web/example.com/site/app/webroot/. It would be better if the DOCUMENT_ROOT environment variable could be /web/example.com/site/app/webroot/ for anything served from that directory.

    Conclusion: Is my half-baked solution possible with Apache 2.2.3? Is there a better way to solve this problem?
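    A hedged sketch of the fallback idea using mod_rewrite in the virtual-host (per-server) context, where a substitution beginning with an existing filesystem prefix is treated as a file path: the old webroot stays the primary DocumentRoot, and anything it can't serve falls through to the new tree. It does not, on its own, fix $_SERVER['DOCUMENT_ROOT'] for requests served from the new tree:

        DocumentRoot /web/example.com/app/webroot

        RewriteEngine On
        # If the request does not resolve to a real file or directory
        # under the old webroot...
        RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
        RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-d
        # ...serve it from the new site tree instead
        RewriteRule ^/?(.*)$ /web/example.com/site/app/webroot/$1 [L]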

    Read the article

  • Is the Facebook Like JavaScript related to a Time Spent Downloading a Page increase in GWT?

    - by donaldthe
    I installed the Facebook Like button (the JavaScript/XFBML version) on my website on December 15th. Take a look at the crawl stats report in Google Webmaster Central, which shows Googlebot activity over the last 90 days. The crawl stats are from Googlebot, which as far as I know doesn't execute JavaScript. Could the Facebook Like JavaScript code, the XFBML version, be related to the large spike in "Time spent downloading a page"? (By the way, the huge spike in November was caused by a mistake where every image request was getting a 301.) I'm not sure what caused the spike to drop by half somewhere in December; it may have been related to a faulty setting in web.config. I'm at a loss as to what I can do about this, or even how to tell whether this is my problem or Googlebot's crawl problem. Here is the Facebook code I am using to create the Like button; it is right after the opening body tag:

        <div id="fb-root"></div>
        <script>
          window.fbAsyncInit = function() {
            FB.init({appId: 'xxxxx', status: true, cookie: true, xfbml: true});
          };
          (function() {
            var e = document.createElement('script');
            e.async = true;
            e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
            document.getElementById('fb-root').appendChild(e);
          }());
        </script>

    and this creates the Like box:

        <fb:like show_faces="false"></fb:like>

    If the JavaScript can't be the problem, any ideas on where to start looking would be appreciated.

    Read the article

  • Chrome trims the last <li> element in a row [closed]

    - by Paul
    OK guys, I give up. Here's the code I'm struggling with:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
          <meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
          <title>Blah</title>
          <style type="text/css">
            #container { margin: 0 auto; width: 350px; border: 1px solid #ccc; }
            ul { list-style-type: none; margin: 0; padding: 0; text-align: center; }
            ul li { display: inline; padding: 5px; margin: 0 1px; background-color: lime; line-height: 2em; /* border: 1px solid red; */ }
          </style>
        </head>
        <body>
          <div id="container">
            <ul>
              <li>Element A</li>
              <li>Element B</li>
              <li>Element C</li>
              <li>Element D</li>
              <li>Element E</li>
              <li>Element F</li>
            </ul>
          </div>
        </body>
        </html>

    Why does Chrome trim the right side of "Element D" (even though there is enough space to display the whole item), while Firefox and even Internet Explorer render this code properly? It becomes more visible when the commented-out border is applied. In other words, is there a way to tell the browser that I want every single <li> element to be treated as a unit, and thus moved to the next row if it doesn't fit entirely in the previous one? Can't wait to see the solution, thanks in advance.
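    A hedged sketch of the usual fix: making each item an inline block means a box either fits on the current line or wraps to the next line intact, instead of being split mid-item:

        ul li {
            display: inline-block;   /* wrap each item as a whole box */
            padding: 5px;
            margin: 0 1px;
            background-color: lime;
            line-height: 2em;
        }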

    Read the article

  • Allow a web designer to modify DNS without letting them transfer the domain?

    - by PhilCK
    We are in the process of having a web designer create a new website for us, but I don't want to give out access to the control panel for the domain names (and there seems to be no way to limit that access), while at the same time I don't want to be the go-between for editing the settings. Is there a way, or a service, that lets me point the domains at a third-party DNS system to which I can then give the web designer access, without worrying that he can find my personal info or try to transfer my domain out?

    Read the article

  • What's an acceptable "Avg. Page Load Time"?

    - by hawbsl
    Is there any industry rule of thumb for what's considered an unacceptable load time vs. an OK one vs. a blisteringly fast one? We're just reviewing some Google Analytics data and seeing an Avg. Page Load Time of 0.74 reported. I guess that's OK, but it would be good if some meatier comparison data were available - a blog post, or somewhere with analysis of what speeds are generally being achieved by various kinds of sites. Any useful links to help someone interpret these speeds? If you Google it you just get a lot of results about how to improve your speed; we're not at that stage yet.

    Read the article

  • What options are out there for an embeddable WYSIWYG text editor?

    - by Evan Plaice
    I'm thinking of something along the lines of TinyMCE. Please include a list of features; examples include:

    - supports text formatting
    - supports links
    - supports images
    - syntax types (Markdown/wiki/etc.)
    - licensing and/or pricing
    - customizability
    - plugin support
    - browser compatibility

    Note: please limit the answers to one editor per answer to keep things clean.
    Update: forgot to add browser compatibility to the list.

    Read the article

  • Geolocation Blocking and SEO

    - by Mahesh
    I'm in the process of blocking a particular geolocation from accessing my site, but I am not sure if this is going to raise any flags on my website. Basically, I'm getting lots of spam from a particular state, and there are some webmasters copying my content without any attribution or backlink. I feel that is enough, and I wish to save my bandwidth in that state or geolocation. So my questions are: how do I block a particular geolocation, do you have any suggestions for PHP scripts that help block a particular geolocation, and are there any SEO disadvantages?
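    A hedged sketch of country-level blocking in PHP, assuming the PECL geoip extension (or an equivalent IP-to-country lookup) is installed; the country code and message are placeholders, IP databases are never perfectly accurate, and crawlers located in the blocked region would be shut out too, which is the main SEO risk:

        <?php
        // Deny visitors whose IP resolves to a blocked country (placeholder code 'XX').
        $blockedCountries = array('XX');
        $country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

        if ($country !== false && in_array($country, $blockedCountries, true)) {
            header('HTTP/1.1 403 Forbidden');
            exit('Sorry, this site is not available in your region.');
        }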

    Read the article

  • Recommend hosting with fast MySQL database please [closed]

    - by Keith Groben
    Possible duplicate: How to find web hosting that meets my requirements? I am frustrated to no end with my current hosting provider, mediaTemple. Yes, they are flashy, and have a decent degree of flexibility with their GS plan, which I have. But any time I install a site that needs a database, it is slow - really slow - taking anywhere from 10 to 15 seconds just to load a page. I would host in-house, but there are a lot of complications that come with a LAMP server that I don't want to deal with; honestly, I'd rather spend the time developing. What can you recommend?

    Read the article

  • How does LastPass recognize an actual login?

    - by Pan.student
    We are currently working on a simple school project using CodeIgniter, for which we need a login page. It would be very useful if LastPass could recognise and save logins: we have several accounts with different roles, and manual entry of logins is pretty slow. So I was wondering what needs to be done, and where in the files (view? controller?), for LastPass to work as it does on every other website. For example, this is our login form:

        <?php echo form_open('login'); ?>
        <input type="text" id="username" name="username"/>
        <input type="text" id="password" name="password"/>
        <input type="submit" value="Login"/>
        </form>

    Thanks for the help. (Could not create a new tag "Lastpass" due to low reputation.)

    [SOLVED] Changed <input type="text" id="password" name="password"/> to <input type="password" id="password" name="password"/>.

    Read the article

  • How to automatically generate html table from image in Linux?

    - by alfish
    In Photoshop, you can easily divide an image into zones using point and click, and it automatically generates the corresponding HTML with the image slices laid out in tables. GIMP also has a Slice filter (Filters > Web > Slice), but it is rudimentary and, as far as I can see, does not allow point-and-click selection of slices. I am wondering if this functionality can be added to GIMP, or whether there are other Linux tools that can do this. I hate to return to Windows just for this simple task, which I happen to need frequently. Thanks in advance for your suggestions.

    Read the article
