Search Results

Search found 5380 results on 216 pages for 'webmasters'.

Page 120 of 216

  • We have a 200% increase in "organic" search traffic - how do we figure out which keyword is causing this?

    - by Robert Grezan
    So our Google Analytics is showing a 200% increase in "organic" search traffic. Analytics reports the search keyword as "(not provided)", and we are wondering how to find out which keyword is causing this. We monitor all the important keywords for our website. None of them is in the top 5, so our "organic" search traffic is normally modest. Today, however, we saw a 200% increase in "organic" search traffic, yet none of the keywords we can think of has moved at all. We also did not change anything related to SEO. Interestingly, Google Webmaster Tools shows no changes - ~2500 impressions and ~200 clicks. How can we find out which "keyword" might be causing this spike?

    Read the article

  • how to check that Google Analytics Tracking Code is firing on an iPad

    - by crmpicco
    I am used to using the Firebug extension "Omnibug" with Firefox to check that the Google Analytics Tracking Code is firing on my website. This application works very well and has minimal overhead. I am now testing the website on an iPad and would like to know if there is a way to check that the GATC is firing on the iPad natively. I have spoofed the iPad UA string in Firefox on the desktop and it appears to fire correctly; however, I'd like to see it happening on the device itself (if at all possible). I know that Firebug can be installed on an iPhone by means of a bookmarklet, however it is 1) quite buggy and not very user-friendly and 2) it doesn't support Omnibug. How can I check that my GATC is firing on my iPad?

    Read the article

  • Searching for an online shop accessible via API

    - by Simon A. Eugster
    I need an online shop with a custom interface (customizing items with Ajax, with a preview included). Writing it myself does not make too much sense (implementing all the payment options etc.), so I would like to use an existing online shop (OpenSource). I would like to build my own UI which, for example, tells the shop to add an item to its cart -- i.e. without using the online shop's native UI. More precisely, it should be an online gallery where the user can directly order an image if he likes it. The final checkout/payment page can be native again. Is there a shop system that supports this? Or is it still faster to write it on my own? Or are there better options?

    Read the article

  • Why am I getting domainpark.cgi being called from my website?

    - by Sean
    I used to test my site on www.exampleone.com, and now I have moved to the real domain www.realdomain.com; www.exampleone.com is now parked by 1and1 (the default). When I check which requests are made by www.realdomain.com, I see domainpark.cgi and park.js from Sedo Parking being requested, as well as the JS that serves the ads by adclicks. How do I get rid of this? It's not on the index page at all, and it's causing a lot of strain and slowing my site down.

    Read the article

  • Should I use mod_wsgi embedded mode if I have full control of Apache?

    - by mgibsonbr
    I'm managing a bunch of sites and applications on shared hosting, using Django via mod_wsgi. I had planned to use daemon mode from the beginning (to avoid restart problems), but ended up purchasing a plan that allows me to run a dedicated Apache instance. I kept using daemon mode for convenience, but I'm afraid it's consuming more server resources than it should (I have a separate project for each site, each with its own process and process group), so I'm considering switching to embedded mode. Would that be a sensible thing to do? I'd still be able to restart Apache anytime I need to, and I wouldn't need so many child processes and sockets (so I hope resource usage would decrease). But I'm unsure whether doing so would make it more difficult to manage those sites (if I need to update one, I have to restart them all), or whether the applications would no longer be properly isolated from one another. Are these problems really significant (or only a minor nuisance), and are there other drawbacks I couldn't foresee? I'm looking for advice on any aspect of this setup - maintainability, performance, security, etc. Tips for improving the current setup are also welcome (I know how to correctly configure a basic mod_wsgi setup, but I'm clueless about sensible values for threads, processes, etc).
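
    For reference, a minimal sketch of the two configurations being weighed (site names and paths here are hypothetical, not taken from the question). In daemon mode each WSGIDaemonProcess directive spawns its own pool of processes; embedded mode simply omits the daemon directives and runs the application inside the regular Apache worker processes, which is what frees up the extra processes and sockets at the cost of sharing Apache's lifecycle and isolation:

        # Daemon mode: a dedicated process group per site (hypothetical names/paths)
        WSGIDaemonProcess site1 processes=2 threads=15 display-name=%{GROUP}
        WSGIScriptAlias / /srv/site1/wsgi.py process-group=site1 application-group=%{GLOBAL}

        # Embedded mode: no WSGIDaemonProcess / process-group; the app runs inside
        # the normal Apache children and restarts whenever Apache does
        WSGIScriptAlias / /srv/site1/wsgi.py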

    Read the article

  • My First robots.txt

    - by Whitechapel
    I'm creating my first robots.txt and wanted to get a second opinion on it. Basically I have an FTP setup on my board for some special users to transfer files between each other, and I do NOT want that included in the search by the bots. I also want to point to my sitemap, which gets auto-generated by a PHP page. So here is what I have - what else should I include, and do I need to fix anything in it? Also, it's linking to xmlsitemap.php because that generates the sitemap when called. My goal is to allow any search bot to crawl the forums to grab meta data.

        User-agent: *
        Disallow: /admin/
        Disallow: /ali/
        Disallow: /benny/
        Disallow: /cgi-bin/
        Disallow: /ders/
        Disallow: /empire/
        Disallow: /komodo_117/
        Disallow: /xanxan/
        Disallow: /zeroordie/
        Disallow: /tmp/
        Sitemap: http://www.vivalanation.com/forums/xmlsitemap.php

    Edit: I'm not sure how to handle all the users' folders under /public_html/, since the robots.txt will be going in /public_html.

    Read the article

  • Looking for Non Hosted Audio & Video Podcasting Solution for Church Websites

    - by motboys
    I am looking for a solution that will do the following:

        - User uploads audio and/or video files with title, description, image, etc.
        - Solution embeds the info into ID3 tags
        - Solution generates an RSS feed
        - Solution embeds the new content in our website
        - Content on the website is searchable

    This is for a couple of church websites I manage. I am looking for the ability to do the above with a sermon mp3 and also a video. At the moment we are doing it with multiple steps / people involved, and I want to automate the process. I can't seem to find a solution that does all of the above. Thank you!

    Read the article

  • configure open_basedir under Plesk

    - by cori
    This might be a question for ServerFault, and if it weren't for the Plesk aspect I would ask it there to start with, so if it's better suited for over there let me know and I'll move it. I'm working on a dedicated server set up as a reseller account, with Plesk to manage the domains and server configuration, and I need to add a directory to the local open_basedir configuration for a specific vhost. Given Plesk's normal methodology, I expected to be able to go to /var/www/vhost/{%DOMAINNAME%}/conf, modify vhost.conf and place the new value there, as I have successfully done with other configuration settings for this domain (turning safe_mode off, for instance). When I do so, however, the new setting doesn't take (per phpinfo();). If I edit httpd.conf (which the Plesk configuration specifically says not to do, in the notes at the top of httpd.conf), the setting takes. Is there something specific about the open_basedir setting that makes it not configurable in vhost.conf? And how much trouble am I letting myself in for by editing the vhost-specific httpd.conf (I imagine if someone makes changes in the Plesk web interface it might be overwritten, but what other risk is there)? Thanks!
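
    For context, a per-vhost override of this kind (under mod_php) would usually look something like the hedged sketch below; the domain and the extra path are hypothetical, not taken from the question. One possible explanation for the setting "not taking" is that Plesk generally has to regenerate and reload the web server configuration for the domain before a new vhost.conf is picked up, so the value may simply not have been applied yet:

        # /var/www/vhosts/example.com/conf/vhost.conf (hypothetical domain)
        <Directory /var/www/vhosts/example.com/httpdocs>
            php_admin_value open_basedir "/var/www/vhosts/example.com/httpdocs:/tmp:/some/extra/dir"
        </Directory>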

    Read the article

  • Is there a better way to have a two column website with header and footer, equal height columns and stretchy column widths? [closed]

    - by Seamus
    I wrote a website a while ago that is a little messy in how it does things. I used this CSS template and this equal height columns trick. I have not one but two container divs, and I can't remember what they're doing. So I'm thinking of restructuring the thing from scratch, and possibly making use of the more "semantic" HTML5 tags like <nav> and so on at the same time. The question is: is there a better way to achieve a site structure with these properties:

        - two equal-height main columns (with widths as percentages of the available real estate, not explicitly stated)
        - a header and a footer element that stretch the whole width of the two main columns
        - allows the use of semantic HTML5 tags instead of meaningless divs

    Read the article

  • Connecting a remote MySQL database to a local MySQL database? [migrated]

    - by Shashank
    I want to write PHP code to be embedded in a Drupal 7 module. I want to call a procedure that can copy newly generated data from the local MySQL database to a remote MySQL database. When data is inserted into table 'A' of my local database, it should be copied to the specific table 'B' of the remote MySQL server's database. Table 'A' is on localhost; table 'B' is on the remote server. Insert data into 'A' - data is copied into 'B'. Is this possible? Thanks for the help.

    Read the article

  • Domain name only works with www. but not without the www

    - by Rodrigo Salazar
    I have searched for this problem numerous times, but I can't quite apply the solutions to my case. The usual solution I find is that I need to add an A record to my DNS; currently I only have a CNAME record for host 'www' in my DNS. The problem is, I'm having trouble understanding what goes in the 'host' section of the record when trying to create an A record. Here is a screenshot of my domain registrar where I am trying to create the record: http://i.imgur.com/M18zm.png I have tried creating an A record with the 'host' field set to *, but that is not a valid entry, nor is leaving it blank. Does anyone have any suggestions? The domain is swageroo.com, but it's only reachable at http://www.swageroo.com (with the www).
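
    For reference, in zone-file terms the records under discussion would look roughly like the hedged sketch below (the IP address is a placeholder from a documentation range, not the real one). The bare domain is conventionally written as "@" or as the fully qualified name, which is what registrar forms that reject "*" or a blank host field typically expect in that box:

        ; hypothetical zone snippet for swageroo.com
        @    IN  A      203.0.113.10      ; bare domain -> web server IP
        www  IN  CNAME  swageroo.com.     ; www alias -> bare domain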

    Read the article

  • SSL Certificate is Untrusted... sometimes

    - by dragonmantank
    The web designer I'm working with signed up a new client who needed an SSL certificate. We went to namecheap.com and purchased one from Comodo, got all the needed files and set it up in ISPConfig. To test, we used Windows 7 running IE8, Firefox 3.6 and Chrome 12, and then OSX with Firefox 4, Safari 5 and Chrome 13. All of them worked fine. The client, however, is getting 'This connection is untrusted' in Firefox 4 and 5. Safari works fine on their machine. On my machines and the designer's machines everything works with no errors. I had the client forward me the certificate info that Firefox shows, and the fingerprints match up. I have an old Windows 2000 VM with IE6 and Chrome, and those work just fine as well. Any ideas on what else to check or do? The server is running Debian 5.0, up to date, with Apache 2 and ISPConfig 3.3.
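
    One hedged observation: this pattern (trusted on some browsers/machines, untrusted on others, with matching fingerprints) is very often caused by the intermediate/chain certificate not being served, since some browsers already have or cache the intermediate and others don't. A minimal Apache 2 sketch of where the Comodo CA bundle would normally be referenced - the file names and paths are hypothetical:

        <VirtualHost *:443>
            SSLEngine on
            SSLCertificateFile      /etc/ssl/certs/client-domain.crt
            SSLCertificateKeyFile   /etc/ssl/private/client-domain.key
            # Intermediate bundle supplied by Comodo; leaving it out is the classic
            # cause of "untrusted" warnings that only some clients see
            SSLCertificateChainFile /etc/ssl/certs/client-domain.ca-bundle
        </VirtualHost>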

    Read the article

  • .htaccess / 301 redirection question

    - by John K
    All my WordPress post URLs generate subdirectories with duplicate content, and I do not know what regular expression to use to consistently 301 redirect domain.com/category/post/random-number/ to domain.com/category/post/, and domain.com/category/post/random-number/another-random-number/ also to domain.com/category/post/. Here is an example of my problem:

        http://www.example.com/features/harb-constitution-not-to-allow-kr-provinces-to-receive-foreign-officials/
        http://www.example.com/features/harb-constitution-not-to-allow-kr-provinces-to-receive-foreign-officials/1345257927000/
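
    A hedged .htaccess sketch of one common way to handle this, assuming (as in the example) that the unwanted trailing segments are purely numeric; if any real post slugs are all digits, the pattern would need tightening:

        RewriteEngine On
        # Strip one or more trailing all-numeric path segments and 301 to the clean post URL
        RewriteRule ^(.+?)/[0-9]+(/[0-9]+)*/?$ /$1/ [R=301,L]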

    Read the article

  • Online iPad 1&2 emulators give different results compared to the real thing

    - by Systembolaget
    I'm designing a centered website (jQuery Isotope). The sandbox is here. I have used some online iPad 1 & 2 emulators to test how the site is displayed on these devices. Then I managed to get hold of the real thing. Result: on real iPads, the site is centered and the layout adjusts automatically as expected. In the online iPad emulators, the site is not quite centered and additional Isotope elements are squeezed in. Of course, I trust the real thing more than online emulators, but why is this happening? To me, it feels like website testing with online emulators is not so reliable after all. If this question is wrong here, please move it or tell me where it should go. SO is about programming, and this question isn't. Thanks!

    Read the article

  • Google Keyword Competition rating

    - by Eric
    Google offers a keyword application that allows me to see the number of times a particular query has been made in Google. There is a column in the results named "Competition" (actually it's "Concurrence" in French; I'm just translating). It's a rating from 0 to 1, as a percentage. What indicator is that? EDIT: Is this something useful I should rely on? I'm not sure how to interpret this data. Should I go for less competitive keywords with a lower number of searches, or not worry about it and go for the highly searched keywords anyway? Is 50% considered high? What about 75%? I have a very niche market that sells expensive offline services, so the very long tail is my goal (I assume). If you hadn't already figured it out, I'm very new to SEO =)

    Read the article

  • My Website title changes by itself in Google

    - by Kane
    I am doing SEO on my friend's site, www.svipl.in. Each page has its own meta tags, and the title did change to the new one, but somehow Google is now using the company name as the meta title. I googled this topic, and in the last few days other people have been getting the same problem. I had a similar issue in the past on my own site, which resolved itself soon after I changed the metas again. Any SEO experts with the same problem, please help. There is also no h1 heading or alt tag containing the company name.

    Read the article

  • Is there a way to hide text from descriptions in Google

    - by Linda H
    The first line of text on all of our client's product pages is "Download hi-res images", which of course isn't what we'd want in the description when people search for their products. Is there any way to hide this text/link so that Google and the others just ignore it and go on into the text description below? I suppose we could use a meta-description, but the client isn't very good at computers and it's such a small site it seems silly.

    Read the article

  • Registering domains with Network Solutions

    - by Joel
    A few years ago I registered a domain with Network Solutions. In recent years I've been using cheaper services such as namecheap, powerpipe etc. Every time I need to renew one of the older domains with Network Solutions, I am surprised at how much more expensive they are. What is the reason for the price difference between the services? Why should I use a service like Network Solutions if there are so many companies out there that offer domain registration for a very cheap price? Thanks, Meir

    Read the article

  • Best Option for Creating A Small Church Website

    - by Jim
    I've been asked to create a website for a small church. Their prior site was hosted on GeoCities, which no longer exists. They are not looking for anything robust, just an informational site with a calendar and maybe a contact form. The church would also like to be able to administer the site with little technical know-how. Cost is also an issue. Given these requirements, something like sites.google.com seems like a good option. However, my main concern is that Sites will suffer the same fate as GeoCities; it is definitely not a flagship Google product. Are there other good alternatives that fit the requirements?

    Read the article

  • How Do Search Engines Rank Combined Keywords?

    - by Itai
    Suppose a site ranks very well (1st result) for something like 'best blue widget'. It also ranks very well (1st page) for 'blue widget'. It ranks not so well (2nd page) for 'widget'. Obviously, the number of monthly searches is much higher for 'widget' than for 'blue widget', which in turn is higher than for 'best blue widget'. Now the actual question: when creating new external links, how does each of the following anchor texts affect SEO for each of these searches?

        - widget
        - blue widget
        - best blue widget

    [HINT: The answer should be a 3x3 table] [NOTE: Assume the site is relevant for all these keyword combinations]

    Read the article

  • clean urls using .htaccess

    - by Napster
    I am trying to implement clean URLs using .htaccess. After searching for some time, I found this code:

        RewriteRule latestnews/([a-zA-Z0-9]+)/$ http://thinkmovie.in/index.php/latestnews/?nid=$1 [L]
        RewriteRule latestnews/([a-zA-Z0-9]+)$ http://thinkmovie.in/index.php/latestnews/?nid=$1 [L]

    So when I try to access the following URL, http://thinkmovie.in/index.php/latestnews/272, it redirects to http://thinkmovie.in/index.php/latestnews?nid=272. But what I want is to retain the URL in the browser's address bar as http://thinkmovie.in/index.php/latestnews/272.
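
    A hedged note on why the address bar changes: when a RewriteRule substitution is a fully qualified http:// URL, mod_rewrite issues an external redirect, so the browser shows the new URL. Rewriting to a server-relative path instead keeps the requested URL visible. A minimal sketch under that assumption (the exact pattern may need adjusting for how index.php and its path info are actually involved on this site):

        RewriteEngine On
        # No scheme/host in the target, so this is an internal rewrite and the
        # browser keeps .../latestnews/272 in the address bar
        RewriteRule latestnews/([a-zA-Z0-9]+)/?$ /index.php/latestnews/?nid=$1 [L]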

    Read the article

  • Download Monitoring for MovieMusic Portal

    - by VenomVipes
    Our portal is targeted at mobile users. We have music (mp3) and video (3gp) files for download, and I expect 300 parallel downloads. I want a way to control my downloads - for example, kicking/banning an IP or a download, statistics on downloads, bandwidth consumed, and so on. I have root/admin access to my server. My question is: is there a way I can monitor and control the ongoing downloads that visitors are doing from my site?

    Read the article

  • What meta tag or microdata should I use for a dictionary web application?

    - by vonPetrushev
    I have a web application that serves as a dictionary, and it ranks well in Google when searching for a rare word in my language (the dictionary's target language). I want the result to appear for define: some-word queries, as well as in the search results when someone uses the Dictionary filter tool. Should I add some special meta tag in the head of the HTML? How about microdata? Does Google have a special webmaster tool for registering dictionaries like wordnetweb.princeton.edu or en.wiktionary.org?

    Read the article

  • What is the replacement for the Web Intents HTML standard?

    - by Tom
    "Web Intents" were deprecated in Chrome 24 (November/2011) and are no longer supported in any browser: We also gathered a lot of valuable data and feedback from our experimental support for Web Intents and decided to disable the feature in today's Beta release. Is there an HTML5 standard that I can look into as an alternative to what Web Intents intended? I'm interested in how web services can be stitched together. For example, imagine a website that can import a image from any number of web-services, modify the image in some way, then push the file back to any number of other web-services, all via HTML5 standards.

    Read the article

  • Panda 4: Reducing #indexed pages. How much is enough?

    - by Noam
    I've been hit by Panda 4 (a 40% decrease). I didn't see any change during Panda 1-3. From what I've read, and comparing it to my site, the change is probably due to the fact that I have over 30M pages indexed on Google, and they've started seeing that as some sort of bad indication. Although I feel all of the pages have a unique value that Google should crawl, it seems I should make some tough calls and reduce the indexed pages according to some prioritization I will conduct. The question is what my target should be, or what factors should help me figure out a relevant target. How many pages should I try to reduce to?

        - 25M
        - 15M
        - 1M
        - 2000

    Is it enough to add noindex to low-priority pages, or should I also remove all internal linking to them?

    Read the article
