Search Results

  • Dropped impressions 25 days after restructure

    - by Hamid
    Our website is a non-English property website (moshaver.com), similar to rightmove.co.uk. In September 2012 it was adversely affected by Panda, and our incoming clicks from Google dropped from around 3,000 to less than a thousand. We hoped Google would eventually realize that we are not a spam website and things would get better, but by August 2013 we were almost sure we needed to do something, so we started to restructure our content. We used the canonical tag on our search results to point to our listing pages, used the noindex tag on listing pages which do not have any properties at the moment, and changed title tags to friendlier ones, among other changes. The changes went live on 10th August.

    As shown in the graph taken from the Search Engine Optimization section of Google Analytics, these changes resulted in an increase in the number of times Google displayed our results: our impressions almost doubled starting 15th August. However, from that date our CTR dropped from around 15% to 8%. This might have been because of our changed title tags (making people less likely to click on them), or it might be normal for the increased impressions.

    This situation continued until 10th September, when our impressions decreased dramatically to less than a thousand, which is about 30% of our original impressions (before the restructure) and 15% of the new impressions. At the same time our CTR increased dramatically, to around 50%. I have two theories for this increase. The first is that these statistics are less accurate at lower impression counts. The second is that Google is now only displaying our results for queries directly related to our website (our name, our URL) and not for general terms such as "apartments in a specific city"; the second theory would also explain the dramatic decrease in impressions.

    After digging into the analytics data a little more, I constructed a table showing the breakdown of our impressions, clicks and CTR across Google products (web and image search) and in total. What I understand from it is that most of our increased impressions after the restructure were in image search, and I don't think image search users would be looking for the content on our website. It also shows that the drop in our web search CTR is not as dramatic as the drop in overall CTR (-30% compared to -60%). I thought posting it here might help you understand the situation better.

    Is it possible that Google tested our new structure for 25 days and then decided to decrease our impressions because of the new, low CTR? Or should we look for another factor? If this is the case, how long does it usually take for Google to give us another chance? It has been one month since our impressions dropped.
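
    For reference, the two tags mentioned above look roughly like this in a page's <head> (the URLs here are placeholders, not the real moshaver.com paths):

        <!-- on a search-results page, pointing Google at the canonical listing page -->
        <link rel="canonical" href="http://www.example.com/listings/some-city/" />

        <!-- on a listing page that currently has no properties -->
        <meta name="robots" content="noindex, follow" />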

  • Robots.txt and "Bad" Robots

    - by Lynda
    I understand robots.txt and its purpose. I have read people saying that a robots.txt file gives "bad" robots, or robots that do not obey it, a map of the pages on your site that you do not want accessed. I am not looking to start a debate about that, but I do have a question. Suppose I have a structure like this (there are no pages directly within /Folder/, only other folders):

        /Folder/
            /Sub-Folder 1/
            /Sub-Folder 2/

    If I add Disallow: /Folder/ it will prevent "good" robots from accessing the directory and any contents within the sub-folders. We know that bad robots will see /Folder/ in the robots.txt, but will they be able to see and access the sub-folders and the pages within them if those are not listed in the robots.txt? (Note: I do not fully understand how robots, good or bad, crawl a site beyond using robots.txt and the links within the site.)
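
    A minimal robots.txt sketch for the structure described above (folder names are placeholders):

        User-agent: *
        Disallow: /Folder/

    Because robots.txt works by URL-prefix matching, Disallow: /Folder/ already covers /Folder/Sub-Folder 1/ and everything beneath it for crawlers that honour the file; crawlers that ignore robots.txt can still request any URL they manage to discover through links or guessing, whether or not it is listed.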

  • Google keeps indexing /comment/reply URL

    - by jaypabs
    With the new update of the Google algorithm, called Penguin, I think my site is being penalized for webspam. Of course I don't create posts that would look like spam to Google; I think it is just how Google indexes my site. I found out that Google indexes URLs on my site like:

        http://www.example.com/comment/reply/3866/26556

    so there are a great many comment/reply URLs in Google's index. I have already added:

        Disallow: /comment/reply/
        Disallow: /?q=comment/reply/

    but Google still indexes these URLs. Any idea how to prevent Google from indexing the comment pages?
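
    Worth noting: robots.txt only blocks crawling, so URLs that are already indexed can linger in the index. One hedged alternative, assuming Apache with mod_setenvif and mod_headers enabled, is to serve a noindex header for those paths (and drop the Disallow lines so Googlebot can recrawl the URLs and actually see the header):

        SetEnvIf Request_URI "^/comment/reply/" NOINDEX_REPLY
        Header set X-Robots-Tag "noindex, nofollow" env=NOINDEX_REPLY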

  • How to create posts in ASP.NET? [closed]

    - by ntechi
    I am doing my final year project and have decided to make a website in ASP.NET, using Microsoft Visual Studio 2008. I'm making a real estate properties website. I want to know how to create posts in ASP.NET (like in WordPress), and also, when I hit search, it should find the desired keyword or the matching post. If posts are not possible, then it should display pages. Please help, as I'm a beginner.

  • IE9 Loses Some CSS After Particular Form Submit [migrated]

    - by Asherion
    The site I am editing has a search form. For the record, there are several other forms on the site, contact and the like; this is the only one with an issue. Upon submission of the form, SOME of the styling is lost in IE9 (possibly other versions of IE, I haven't tested that yet). Primarily, the margins and colors set on html and body appear to be lost, while menus, banner, text, etc. all appear to retain their styles. All styles are on one sheet. Any helpful advice? Here are the contents of the search page, the PHP used to check for the form, and the CSS that I think is lost.

    The HTML:

        <div id="search">
        <br />
        <div style="float:right;font-size:.8em;">
        <form name="form_sidesearch" action="search.html" method="post">
        <input type="hidden" name="action" value="search" />
        <input type="text" name="search_value" value="<?php echo $systems_primary->search_value ?>" />
        <input type="submit" name="submit_search" value="Search Website" />
        </form>
        <br />
        </div>
        </div>
        <?php echo stripslashes($search_results);

    The PHP:

        <?php
        // -- Begin Search --------------------------------------------------------------------------------------
        if($_REQUEST["action"] === "search") {
            if(strlen($_REQUEST["pg"]) <= 0) {
                $_REQUEST["pg"] = 1;
            }
            $search_results = $systems_primary->search_website("index",urldecode($_REQUEST["search_value"]),"<div class=\"listing ui-corner-all\"><a href=\"{ENTRY_URL}\" title=\"{ENTRY_TITLE}\" class=\"listing_title\">{ENTRY_TITLE}</a>{ENTRY_CONTENT} <a href=\"{ENTRY_URL}\" title=\"{ENTRY_TITLE}\" style=\"font-size:.8em;\">...read more</a></div><br /><br />",345,"all",10,$_REQUEST["pg"]);
        }
        // -- End Search ----------------------------------------------------------------------------------------
        ?>

    The lost CSS (could be more):

        html {
            background-color:#F6E6C8;
            font-size:16px;
            font:Helvetica;
        }
        body {
            width:1027px;
            margin:0 auto;
            background-color:#ffffff;
            font: arial, times new roman, sans-serif;
        }

  • Google Analytics async=true seems wrong in the Google documentation?

    - by leeand00
    In the Google Analytics async example, they state that in order to include more than one tracker you need to set up your pages for asynchronous tracking, and they do so using the following code:

        <script type="text/javascript">
        _gaq.push(
            ['_setAccount', 'UA-XXXXX-1'],
            ['_trackPageview'],
            ['b._setAccount', 'UA-XXXXX-2'],
            ['b._trackPageview']
        );
        (function() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
        })();
        </script>

    The second tracker is not receiving any results. After checking my tracking codes to make sure they are correct, I noticed that the ga.async = true statement is usually written differently elsewhere: most examples use an async attribute rather than assigning a value of true. Could this be stopping my Analytics data from posting to the second tracker, or might it be something else? Also, which requests should I look for in the Net tab in Firebug to make sure GA is being called when the page loads?
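
    For comparison, in plain HTML async is a boolean attribute whose mere presence enables asynchronous loading, e.g.:

        <script async src="http://www.google-analytics.com/ga.js"></script>

    Setting ga.async = true from JavaScript, as in the snippet above, sets the same property on the element, so that line by itself is unlikely to be what stops the second tracker from reporting.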

  • Google Analytics Goal Tracking for Sub-Domains?

    - by Hasan Khan
    I am trying to track goals in Google Analytics for a website whose goal URL is on a sub-domain. The main domain, for example, is domain.com and the sub-domain is my.domain.com. I have Google Analytics configured to track the domain and all sub-domains, and I've even set up an advanced filter so I can see traffic to my sub-domains in Analytics. However, for goal tracking you're supposed to enter only the path portion of the URL (so if it were domain.com/conversions/ you'd put in just /conversions/). Since for me it would be my.domain.com/conversions/, how would I enter that URL into Analytics to track it? Would Analytics automatically work out that the URL is on the sub-domain? Thanks!
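
    For reference, a minimal sketch of the classic ga.js cross-sub-domain setup (the account ID is a placeholder):

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXX-1']);
        _gaq.push(['_setDomainName', 'domain.com']);  // share the tracking cookie across domain.com and my.domain.com
        _gaq.push(['_trackPageview']);

    Goal URLs in Analytics are matched against the request path only, so with the sub-domains feeding the same profile the goal would still be entered as /conversions/, whichever host sends the pageview.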

  • Adding my face to my website in Google's search results

    - by Roman Matveev
    I'm trying to add a rich snippet to the template of my future website. The data format is a review, and I used microdata formatting to add all the necessary information to the page. The Structured Data Testing Tool shows the rating, author information and review date; however, my face image is missing and the sections related to authorship are empty. I did everything recommended to link my Google+ profile to the website. Did I do something wrong? Or will I simply never see my photo in the testing tool, even though it will appear in the real SERPs?
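
    For reference, the authorship link Google documented at the time is a link from the page to the Google+ profile (the profile ID below is a placeholder), with the profile's "Contributor to" section linking back to the site:

        <link rel="author" href="https://plus.google.com/112233445566778899000"/>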

  • Recovery from URL structure change?

    - by Dejan Pelzel
    In July this year we changed the URL structure of the website from:

        Post:     domain.com/blog/post/986/dance/heart-beats-dance-video-by-chinatsu/
        Category: domain.com/blog/index/cosplay/

    to:

        Post:     domain.com/dance/heart-beats-dance-video-by-chinatsu-986/
        Category: domain.com/cosplay/

    Everything was (supposedly) properly redirected with 301 redirects, and at first it seemed that the traffic returned after a couple of days. But it has now been close to two months and things keep getting worse, although Google is slowly indexing the changes. What worries me even more is that the pages crawled per day in Webmaster Tools started dropping drastically a few days ago and have just reached a new low in months (from over 2,000 to 700). Should I be worried, or will things sort themselves out eventually?
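
    For reference, a sketch of how the two old patterns could be mapped with mod_alias, assuming the numeric ID is always the third path segment as in the example above:

        RedirectMatch 301 ^/blog/post/(\d+)/([^/]+)/([^/]+)/?$ /$2/$3-$1/
        RedirectMatch 301 ^/blog/index/([^/]+)/?$ /$1/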

  • pingback / trackback support for a photo sharing website?

    - by cboettig
    Is there a photo sharing service, such as Flickr or Picasa, that will collect the URLs of the places where a photo has been posted on other blogs (or mentioned in tweets, etc.)? This could be accomplished by posting each photo as a blog entry in WordPress, which would then handle pingbacks automatically, but of course a blog doesn't behave quite like a proper photo service. Perhaps this could be done with a self-hosted photo gallery like Zenphoto by editing the PHP, but that seems rather involved. Does such a service already exist?

  • How do I redirect www and non-www but not the IP?

    - by Chad T Parson
    I am trying to redirect www.domain.com or domain.com to www.domain.com/temp.html. I am using the following code:

        RewriteCond %{HTTP_HOST} ^.*$
        RewriteRule ^/?$ "http\:\/\/www\.domain\.com\/temp\.html" [R=301,L]

    That works, but I do not want to redirect the IP. If someone types in the static IP of the domain, I do not want them to be redirected to www.domain.com/temp.html. Does anyone have the code to take care of this?
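
    One hedged way to do this is to restrict the rule to the domain names themselves, so requests made directly to the IP fall through untouched (the domain is a placeholder):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
        RewriteRule ^/?$ http://www.domain.com/temp.html [R=301,L]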

  • Where can I find my comScore rank?

    - by Joyce Babu
    Recently an ad network rejected my registration, stating that my site doesn't meet their minimum monthly impressions, even though the site serves three times the required page views. When I contacted them for details, their representative hinted that they use comScore data to screen submissions. Where can I view my site's comScore ranking and details? Update: I was able to find the traffic figures by tagging my site with comScore Direct.

  • Easy user management on an HTML site?

    - by James Buldon
    I hope I'm not asking a question with an obvious answer; if I am, apologies. Within my plain HTML site (i.e. not WordPress, Joomla, etc.) I want to have a level of user management, meaning that some pages should only be accessible to certain people with the correct username and password. What's the best way to do this? Are there any available scripts out there? I guess I'm looking for a free/open-source version of something like this: http://www.webassist.com/php-scripts-and-solutions/user-registration/
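
    As a rough illustration of what such scripts do, here is a minimal PHP sketch (it assumes the protected pages can be served as PHP; the file name, user name and password are placeholders, and a real script would keep users in a database and hash passwords):

        <?php
        // protect.php - include this at the top of any page that requires a login
        session_start();
        $users = array('alice' => 'secret');  // placeholder credentials

        if (!isset($_SESSION['user'])) {
            if (isset($_POST['user'], $_POST['pass'])
                && isset($users[$_POST['user']])
                && $users[$_POST['user']] === $_POST['pass']) {
                $_SESSION['user'] = $_POST['user'];
            } else {
                // show a bare login form and stop before the protected content
                echo '<form method="post">User: <input name="user"> '
                   . 'Password: <input type="password" name="pass"> '
                   . '<input type="submit" value="Log in"></form>';
                exit;
            }
        }
        ?>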

  • Ranking hit after WP site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed Webmaster Tools guidance completely, including 301 redirects from every existing URL to the new domain, and then submitted a change of address. Traffic continued as normal, but a few days after submitting the change of address it plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and I can see that I am now ranking much lower for the keywords I had targeted and performed well with before. For some low-competition keywords I've only lost a few places; for higher-competition terms I have really suffered. Things have started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process.

    How seamless is this process normally? I was under the impression that the move would not affect my rankings too severely, but it has now been a month and recovery seems to be very slow, if it is happening at all. Is it likely that I've missed something? The only structural change is that what used to be the home page is now more of a sub-page, and in its place is a magazine-style home page. I understand that links to the old site now point to the latter, which means rankings for some keywords attributed to the old home page will take a hit, but even on other pages that fit exactly the same page structure as the previous site I have seen a drop in rankings. Any help would be greatly appreciated. Thanks!

  • Google Analytics and AdWords showing very different figures

    - by Dave Rook
    In AdWords I have only one advert running. The landing page includes a query string so I can track it, e.g. www.mydomain.com/products?source=CPC. I also use Google Analytics. For February I have approximately 1,450 clicks in AdWords, which means 1,450 visitors should have reached my website. In Google Analytics, according to my landing page report, there were only about 850 visits, and the Acquisition - All Traffic page suggests that Google CPC brought 517 visits. I know tracking tools are not 100% reliable, but this spread suggests something is very wrong. How can I tell which of the figures is accurate, or is this just a limitation of the reporting tools?
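
    One thing to check: Analytics only attributes a visit to "google / cpc" when AdWords auto-tagging (the gclid parameter) is on, or when the landing URL carries the standard utm_* campaign parameters; a custom string like ?source=CPC is not recognised as campaign data. A manually tagged landing URL would look roughly like this (the campaign name is a placeholder):

        www.mydomain.com/products?utm_source=google&utm_medium=cpc&utm_campaign=february_ad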

  • I need a little help with .htaccess rewrite

    - by Pinokyo
    I need a little help with my .htaccess file. I have song, singer and album links I want to rewrite. I already rewrote the links, and they are like this:

        songs:   /song/song_name
        singers: /singer_name
        albums:  /album_name

    From my .htaccess file:

        RewriteEngine on
        RewriteRule ^singer/([^/\.]+)/?$ /core/controller.php?singer=$1 [L]
        RewriteRule ^song/([^/\.]+)/?$ /core/controller.php?song=$1 [L]
        RewriteRule ^album/([^/\.]+)/?$ /core/controller.php?album=$1 [L]

    I need the links for songs, singers and albums to be like this:

        songs:   /singer_name/song_name
        singers: /singer_name
        albums:  /singer_name/album_name

    Can anyone help me with this, please?
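
    A hedged sketch of rules matching the desired structure; note that /singer_name/song_name and /singer_name/album_name have the same shape, so the controller would need to work out whether the second segment (passed here as a placeholder "title" parameter) is a song or an album:

        RewriteEngine on
        RewriteRule ^([^/\.]+)/?$ /core/controller.php?singer=$1 [L]
        RewriteRule ^([^/\.]+)/([^/\.]+)/?$ /core/controller.php?singer=$1&title=$2 [L]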

  • Should I rely on externally-hosted services?

    - by Mattis
    I am wondering about the dangers and difficulties of using external services, like Google Chart, on my production website. By external services I mean ones that you can't download and host on your own server.

    (-) The Google service could potentially be down while my site is up.
    (+) I don't have to develop those particular systems for new browser technologies; hopefully Google will do that for me.
    (-) Extra latency while my site fetches the data from the Google servers.

    What else? Is it worth spending the time and money to develop my own systems in order to be more in control of things?

  • JS and CSS caching issue: possibly .htaccess related

    - by adamturtle
    I've been using the HTML5 Boilerplate for some web projects for a while now and have noticed the following issue cropping up on some sites. My CSS and JS files, when loaded by the browser, are being renamed to things like:

        ce.52b8fd529e8142bdb6c4f9e7f55aaec0.modernizr-1,o7,omin,l.js

    in the case of modernizr-1.7.min.js. The pattern always seems to add ce. or cc. in front of the filename. I'm not sure what's causing this, and it's frustrating since when I make updates to those files, the same old cached file is being loaded. I have to explicitly call modernizr-1.7.min.js?v=2 or something similar to get it to re-cache. I'd like to scrap it altogether but it still happens even when .htaccess is empty. Any ideas? Is anyone else experiencing this issue?

  • Adding a CMS to an existing Magento shop

    - by user6341
    I am working on a project for three niche stores built on Magento (using Magento's multi-store function) that each get roughly 50k unique visitors a day. The sites don't currently have a blog, forum or any social networking features. I would like to add a CMS to each site that can be run centrally and would take over the front-end content from Magento. I'd also like each site to maintain an online blog/publication of sorts, with videos, articles and the like, with editing privileges given to a dozen or so people at different levels. I want to add a fairly robust forum to each site and possibly some social networking features down the road, so extensibility and the available plugins/modules for each CMS are important. Beyond a shared login between the forum, blog/publication and store, I'd like to be able to integrate some content from the forum and blog/publication back into the store as well.

    After researching this a bit I am inclined towards Drupal, but I haven't found any modules to integrate it with Magento. Also, since the blog content will be written by about a dozen non-technical people, I want something that is very easy to work with. Lastly, since the sites get a good amount of traffic, speed and security are very important. Which CMS would you recommend in this context? I'm deciding between Drupal, WordPress and Plone. Thanks.

  • How do I make sure the web developer I hire will not steal my idea?

    - by Greg McNulty
    So I have a great idea for a new website, but not the time to develop it. I would like to hire a person or company to build it for me. What steps do I need to take to protect my idea? Where and how do people protect website ideas in general? Also, how easy is it for someone to tweak the idea and legally make it their own? Is a patent enough to protect such an idea? Are there different levels or types of protection? Thank you.

  • How to handle possible duplicate content across multiple sites?

    - by ElHaix
    Let's say I have two sites that cover the same vertical/topic, one in the USA and one in Canada. Both sites have local content, which is obviously unique by location, but they will share common news or blog pages. How do I avoid getting hit with duplicate content on both sites for those news/blog pages? If the content is exactly the same, I'm guessing I would have to pick which site's copy to noindex,nofollow. Is that correct, and if so, is that all I have to add, on the links to those pages and in the pages' meta tags?
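
    Another option often suggested for shared articles is a cross-domain canonical on the duplicate copy, pointing at whichever site should rank for it, instead of noindexing one side (the domain and path below are placeholders):

        <link rel="canonical" href="http://www.example-us.com/blog/shared-news-post/" />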

  • What liability concerns do advertising vendors raise, and how can I address them?

    - by Beofett
    One of the websites I administer wants to provide free advertising in the form of direct links to vendors at an event they are running. Up until now, there has been no advertising whatsoever on the site (or any of our other sites). The site is for a for-profit business. The idea of implicit endorsement of any vendors we advertise has been raised, which brought up the question of what we need to do, if anything, to protect ourselves from any potential problems such endorsement might create. I know that many sites have clauses in their Terms of Service that state that (in a nutshell) they are not responsible for any problems or grievances between the visitors to the site and any vendor advertised or linked. Are there other steps that a website typically takes when considering advertising, such as getting the advertiser to provide some sort of certification that their ad will not violate any trademarks or copyrighted material?

  • Legal Web Fonts & Licensing

    - by Phill
    I'm a little confused about the legality of using fonts on the web. I often get a PSD file from designers that uses a special font, and the font is supplied. However, attempting to convert the font using http://www.fontsquirrel.com/fontface/generator ends with a message saying it's blocked by Adobe. On top of that, even when a font can be converted, it often looks poor when viewed in a browser. I assume the generator is primarily for people converting their own fonts, but if you purchase a font, can you only use it on the web if the license terms allow it? The fonts used in PSDs are often Adobe fonts, and I can't find anything that suggests I can convert those and use them on the web. Does anyone know the legal position on using Photoshop-supplied fonts on the web?

    In addition, I'm wondering what resources are available (free or paid) that provide fonts that can be used on the web.

        Free: http://www.fontsquirrel.com, http://www.google.com/webfonts
        Paid: http://www.fontshop.com

    These are the only ones I've found so far that aren't cartoon-style fonts like most of what's on www.dafont.com.
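
    For reference, serving a licensed web font boils down to an @font-face rule roughly like this (the family name and file names are placeholders; whether a given license permits it is exactly the question above):

        @font-face {
            font-family: 'MyWebFont';
            src: url('mywebfont.woff') format('woff'),
                 url('mywebfont.ttf') format('truetype');
        }
        body { font-family: 'MyWebFont', Arial, sans-serif; }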

  • Set Up Of Common Name Of SSL Certificate To Protect Plesk Panel

    - by Cbomb
    A PCI compliance scanner is complaining that the self-signed SSL certificate protecting access to Plesk Panel has a name mismatch between the address used to reach Plesk and the name on the certificate: the self-signed cert's name is "Parallels" and Plesk is reached at 'ip address:8443'. So I figured I would get a free SSL certificate and try to fix this error. But when I generated the certificate I used my server's domain name as the site name. So if I visit 'domain name:8443' all is fine, with no SSL warning; but if I visit 'ip address:8443' (which I believe is what the scanner does) I get the certificate name mismatch error, and DigiCert's SSL checker says that the certificate name should be the IP address. Can I even generate a certificate whose common name is an IP address? I am tempted to just do whatever the PCI scanner accepts, but what is really the correct common name to use? Has anybody run into this issue before?
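
    For what it's worth, it is possible to issue a self-signed certificate that names the IP address; a sketch with OpenSSL (the IP is a documentation placeholder and the file names are arbitrary):

        # openssl.cnf
        [req]
        distinguished_name = dn
        x509_extensions    = v3_req
        prompt             = no
        [dn]
        CN = 203.0.113.10
        [v3_req]
        subjectAltName = IP:203.0.113.10

        # generate key and certificate
        openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
                -keyout plesk.key -out plesk.crt -config openssl.cnf

    Whether the scanner is then satisfied depends on it accepting a self-signed certificate at all, so a CA-issued certificate for a hostname that resolves to the server, with Plesk reached via that hostname, is usually the cleaner route.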

  • .htaccess RedirectMatch 301 issue

    - by Steve
    Hi. I've moved my WordPress installation from one domain to another, and I want to use an .htaccess file on the original to redirect visitors to the corresponding page on the new site. The old site is http://www.steve.doig.com.au/wordpress/ and the new site is http://www.superlogical.net. I tried using the following .htaccess file in the /wordpress directory:

        RedirectMatch 301 http://www.steve.doig.com.au/wordpress(.*) http://www.superlogical.net/$1

    However, all this does is redirect visitors to the URL http://www.superlogical.net/wordpress/. I guess the redirect is firing, but I don't have WordPress installed in a /wordpress folder on the new domain. How do I stop /wordpress being carried over into the redirected URL? Thanks.
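
    One hedged fix, given that RedirectMatch matches only the URL path (not the scheme and hostname), is to anchor the pattern on the path and capture everything after /wordpress/:

        RedirectMatch 301 ^/wordpress/(.*)$ http://www.superlogical.net/$1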
