Search Results

Search found 9717 results on 389 pages for 'pro'.


  • Will a domain change affect my PageRank?

    - by Chankey Pathak
    I have two Blogger blogs (http://chankeypathak.blogspot.com and http://javaenthusiastic.blogspot.com). One has PR 3 and the other PR 2. I want to buy a domain for each blog, so that they become http://chankeypathak.com/ and http://javaenthusiastic.com/. I will follow all the procedures Blogger suggests, so that all visitors to http://chankeypathak.blogspot.com are redirected to http://chankeypathak.com/, and the same for the Java blog. I just want to know whether this will affect my PageRank. I want my PR to remain the same and not change because of the domain change. Let me know. Thank you. PS: I don't know whether one is allowed to post one's own site URLs in questions; if not, feel free to edit the question.

    Read the article

  • Best way to prevent Google from indexing a directory [duplicate]

    - by Gkhan14
    This question already has an answer here: "Stopping Google index some web pages" (5 answers)

    I've researched many methods of preventing Google and other search engines from crawling a specific directory. The two most popular I've seen are:

    1. Adding it to the robots.txt file: Disallow: /directory/
    2. Adding a meta tag to each page: <meta name="robots" content="noindex, nofollow">

    Which method works best? I want this directory to remain "invisible" to search engines so it does not affect any of my site's ranking. In other words, I want this directory to be neutral/invisible and "just there", with no effect on ranking. Which method would best achieve this?
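
    For reference, minimal versions of the two approaches the question compares (the directory name is a placeholder):

        # robots.txt - keeps compliant crawlers out of the directory
        User-agent: *
        Disallow: /directory/

        <!-- on each page inside the directory - the page may be crawled,
             but is neither indexed nor are its links followed -->
        <meta name="robots" content="noindex, nofollow">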

    Read the article

  • How to fix "[Errno 13] Permission denied" in Mailman mailing lists

    - by Michael
    After migrating domains from one Plesk server to another, I get several of these mails every day (the target mailbox does not exist, so they reach me as undeliverable-mail bounces):

        Return-Path: <[email protected]>
        Received: (qmail 26460 invoked by uid 38); 26 May 2012 12:00:02 +0200
        Date: 26 May 2012 12:00:02 +0200
        Message-ID: <20120526100002.xyzxx.qmail@lvpsxxx-xx-xx-xx.dedicated.hosteurope.de>
        From: [email protected] (Cron Daemon)
        To: [email protected]
        Subject: Cron <list@lvpsxxx-xx-xx-xx> [ -x /usr/lib/mailman/cron/senddigests ] && /usr/lib/mailman/cron/senddigests
        Content-Type: text/plain; charset=ANSI_X3.4-1968
        X-Cron-Env: <SHELL=/bin/sh>
        X-Cron-Env: <HOME=/var/list>
        X-Cron-Env: <PATH=/usr/bin:/bin>
        X-Cron-Env: <LOGNAME=list>

        List: xyzxyz: problem processing /var/lib/mailman/lists/xyzxyz/digest.mbox:
            [Errno 13] Permission denied: '/var/lib/mailman/archives/private/xyzxyz'

    I tried to fix the permissions myself, but the problem still exists.
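
    A sketch of the usual repair, assuming a stock Mailman layout and that Mailman's group is "list" (paths and the group name vary by distribution). check_perms ships with Mailman and repairs its expected permissions when run with -f:

        # let Mailman itself repair ownership/permission problems
        /usr/lib/mailman/bin/check_perms -f

        # if the private archive is still unreadable, hand it back to the list group
        chown -R list:list /var/lib/mailman/archives/private
        chmod -R g+rw /var/lib/mailman/archives/private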

    Read the article

  • How to do a cacheable redirection?

    - by John Doe
    When users enter my website example.com, their "preferred" language is detected and they are redirected (using a 301 Moved Permanently redirect) to example.com/en/ (for English), example.com/it/ (for Italian), etc. It works perfectly, but when I analyzed my website with the Google Page Speed tool it gave me the following advice:

        Many pages, especially mobile pages, redirect users to a different URL, for instance from www.example.com to m.example.com. Making this redirect cacheable by the user's browser can speed up page load times for repeat visitors to a site.

    And later it says:

        We recommend using a 302 redirect with a cache lifetime of one day. The redirect should include a Vary: User-Agent header as well as a Cache-Control: private header.

    So my question is: how can I do a "cacheable" redirect in PHP? Would the following be enough?

        header("HTTP/1.0 302 Moved Temporarily");
        header("Location: example.com/whatever");
        exit;
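
    A sketch of the advice applied in PHP (the target URL is a placeholder). The third argument of header() sets the status code, so there is no need to write the status line by hand, and note that Location should carry an absolute URL:

        <?php
        // 302 redirect, cacheable by the browser for one day, varying per user agent
        header('Cache-Control: private, max-age=86400');
        header('Vary: User-Agent');
        header('Location: http://example.com/en/', true, 302);
        exit;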

    Read the article

  • Transfer domains without disrupting local email server

    - by krosiris
    Perhaps I am thinking about this wrong, but maybe some of you can help me on this one. I have a client who currently has a domain from Network Solutions and is hosted via GoDaddy. Additionally, he has email service with GoDaddy as well, but it seems to be forwarded to his local server at work. How can I transfer the hosting accounts without disrupting email service (or at least only temporarily)? Some DNS info via GoDaddy:

        A    @       points to GoDaddy
        A    main    points to client's home server
        A    sw      points to client's home server
        MX   @       points to main.example.com
        MX   @       points to smtp.secureserver
        MX   @       points to mailstore1.secureserver
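
    As a precaution, it may help to snapshot the records the world currently sees before changing anything (assuming a Unix shell with dig available; the domain is a placeholder):

        # list the live MX records
        dig MX example.com +short

        # and the A record behind the mail host
        dig A main.example.com +short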

    Read the article

  • DNS nameserver, A and CNAME records [closed]

    - by David
    I am inexperienced in the configuration of DNS and have an issue with my domain hosting setup. I have two domains, 'www.mydomain1.com' and 'www.mydomain2.com', with mydomain2 pointed at the same place as mydomain1. The domains were passed to me recently by the person who previously controlled them. I have an account with Fasthosts in the UK. When I accepted the domains I could not access the DNS settings, and I asked Fasthosts why. The reply was:

        The delegate hosting option for both domains was enabled, and this is the reason why you were unable to find the option to edit the advanced DNS records. I have now disabled the delegate hosting option, so you can now edit the advanced DNS records for both domains in your account.

    When I log into the Fasthosts control panel now, I can access the DNS controls, but neither domain has an A record or CNAME record set up. I am concerned that Fasthosts have wiped the previous nameserver entries and set me up on theirs without adding any records. 'www.mydomain1.com' currently still works, but 'www.mydomain2.com' no longer finds the site. I am worried I will lose mydomain1 too as the DNS changes filter through the system. My web hosting is at 'xxx.xxx.xxx.xxx/mydomain1.com/' and this is where I want both domains to point. Any advice would be much appreciated. One thing which is confusing me: because I am on a shared server, I have to use 'xxx.xxx.xxx.xxx/mydomain1.com/' to reach my site rather than just 'xxx.xxx.xxx.xxx'. The A record form on Fasthosts only allows an IP to be entered; does it add the 'mydomain1.com/' onto the end itself? Thanks for any help given; I'm quite worried about this. David

    Read the article

  • URL rewrite and domain frame

    - by Dennis
    I have registered the domain www.posti.sh at nic.sh. The website lives on the server at www.myskoob.com/postish. Unfortunately, nic.sh does not support frames, i.e. keeping the address bar at posti.sh while it forwards to www.myskoob.com/postish, so I thought about a URL rewrite on the server. Unfortunately I have no idea how rewriting works (explanations welcome), but I would also like to ask whether this is generally possible. What I need is:

    - The server needs to recognize that the folder postish is being accessed.
    - Depending on the file that is opened, it needs to rewrite the URL to www.posti.sh/<the according filename>.
    - The server needs to understand that a link to www.posti.sh/about.php maps to www.myskoob.com/postish/about.php, and likewise for other files. At the moment, when I type posti.sh/about.php, it redirects to http://www.myskoob.com/postishabout.php (note the missing slash), which does not exist.
    - All this should work whether or not the URL starts with "www".
    - A plus, but not necessary, would be hiding the .php extensions.

    Would that generally be possible? If not, what are the alternatives? If anyone knows how to do it, any code and/or approach would be much appreciated!
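
    Generally it is possible, provided posti.sh's DNS can be pointed at the myskoob.com server. A minimal .htaccess sketch under that assumption (mod_rewrite enabled, placed in the web root; all names come from the question):

        RewriteEngine On

        # requests arriving for posti.sh (with or without www) are served
        # from the postish folder, invisibly to the visitor
        RewriteCond %{HTTP_HOST} ^(www\.)?posti\.sh$ [NC]
        RewriteCond %{REQUEST_URI} !^/postish/
        RewriteRule ^(.*)$ /postish/$1 [L]

        # optional: map extensionless URLs like /about to the real about.php
        RewriteCond %{DOCUMENT_ROOT}/$1.php -f
        RewriteRule ^([^.]+)$ /$1.php [L]

    The key point is that a rewrite is internal to the server, so the address bar keeps showing posti.sh; the broken postishabout.php redirect in the question is what a forward with a missing trailing slash produces.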

    Read the article

  • Sharp decline in website statistics

    - by Erfan Safarpoor
    Ten months ago my website's statistics were very high. Very high... But after ten days of server failure, traffic was 20 times lower. I went a long time without finding any mistake of mine; I have even hired a writer for the content, yet these are the results I see. But here is a strange thing: roughly every two months traffic climbs back about 20-fold, and then drops again after ten days! My website URL: www.sooran.com (food.sooran.com)

    Read the article

  • Will ranking be affected with a mobile XML sitemap for a mobile site with the same URLs as the desktop site?

    - by Emil Rasmussen
    We have a site with both a desktop version and a mobile version. Most of the content is the same and both versions share the same URLs, but the HTML generated is device-specific. Looking at Google's recommendations for smartphone-optimized sites, one could get the impression that the mobile XML sitemap is only for sites with different URLs. Will ranking be affected, negatively or positively, if we add a mobile XML sitemap that is effectively a duplicate of the desktop sitemap?
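
    For reference, the format Google documents for a mobile XML sitemap adds a namespace and an empty <mobile:mobile/> element per URL; note that it was historically defined for feature-phone pages, which is part of why its fit for a smartphone site with shared URLs is unclear (the URL below is a placeholder):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
          <url>
            <loc>http://example.com/page</loc>
            <mobile:mobile/>
          </url>
        </urlset>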

    Read the article

  • Adsense click bot is click bombing my site

    - by Graham
    I have a site that gets roughly 7,000-10,000 page views per day right now. Starting around 1 AM on 7/1/12, I noticed the CTR rising dramatically. The clicks would be credited, then de-credited soon after, so they were obviously fraudulent. The next day I had about 200 clicks in the account, with about 100 of them fraudulent. It is about 3-8 per hour, evenly dispersed across each of the three ads, 24 hours a day, which leads me to believe it is some sort of AdSense click bot. Also, I removed the ads last evening, put them back up around 3 AM, and the invalid clicks started within 10 minutes. I signed up for statcounter.com to analyze the exit links on the AdSense units, then conditionally blocked ads for the IP address of the person/bot I suspected. But I think the bot has several proxies to choose from and can refresh IP addresses. I've notified Google through the invalid-click form/email 4 times over the past two days to let them know I'm aware of the situation and am working on a solution. I've also temporarily removed all ads on that site. How can I block a bot like this? Thank you.
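
    A sketch of the conditional ad-blocking the asker describes, in PHP (the blocklist uses reserved documentation addresses as placeholders; as noted in the question, a bot rotating proxies will defeat a static list):

        <?php
        // hypothetical blocklist of suspected click-bot IPs
        $blocked = array('203.0.113.5', '198.51.100.7');

        // only render the ad unit for visitors not on the blocklist
        if (!in_array($_SERVER['REMOTE_ADDR'], $blocked, true)) {
            // AdSense ad unit markup goes here
        }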

    Read the article

  • Where should I redirect (removed) phishing pages?

    - by tinjaw
    I was unfortunately the victim of a PHP exploit. Looking through my web server logs, I can see people are still attempting to reach the URL used in the phish. I want to redirect them to a site that will educate them about what phishing is. My question: is there a (generic / vendor-neutral) phishing-education website that you suggest I send them to with a 301 redirect? (I assume a 301 is the best option.)
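
    For reference, the redirect itself is a single line in Apache .htaccess; both the source path and the target below are placeholders, not a recommendation of any particular site:

        Redirect 301 /old-phishing-path https://phishing-education.example.org/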

    Read the article

  • CSS equivalent of a table row [closed]

    - by SpashHit
    I am trying to shift my style away from using tables to control formatting, but I haven't seen a simple CSS solution that does exactly the same thing as:

        <table><tr><td>arbitrary-html-A</td><td>arbitrary-html-B</td></tr></table>

    All I want is to make sure arbitrary-html-A and arbitrary-html-B are aligned horizontally. I have tried various CSS concoctions using display: inline, clear: none, and float: left, but they all have unwanted side effects of moving my content around, while the table/tr solution just does what I want, regardless of what's in the arbitrary HTML and regardless of what is in the HTML that contains my table. Am I missing something?
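
    A minimal sketch of the closest direct equivalent, the CSS table display types (class names are placeholders; note that old browsers such as IE7 lack support):

        .row  { display: table; width: 100%; }
        .cell { display: table-cell; vertical-align: top; }

        <div class="row">
          <div class="cell">arbitrary-html-A</div>
          <div class="cell">arbitrary-html-B</div>
        </div>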

    Read the article

  • Google Webmaster Tools Data Highlighter says "Failed to load data, please try again later"

    - by George Garside
    I seem to be unable to access the Data Highlighter in Google Webmaster Tools since I attempted to start a new highlight on a page. Clicking the red "Start Highlighting" button to open the tagger did nothing, so I refreshed. Now the page loads without the middle content section, then a few seconds later shows the following error:

        Failed to load data, please try again later.

    I can't get any of the middle section to load, not even the list of pages/page sets that have already been highlighted; the same error shows. I thought it might be a Google service outage, but other sites' Data Highlighters work fine. It also seems coincidental that it stopped working right after I attempted to start highlighting: I was able to list the existing pages and page sets fine before that, and I can still access the service on other sites. I've tried clearing browser data and have tried Google Chrome as well, with the same problem. What's happened?

    Read the article

  • How to configure apache2 to save certain POST requests without passing them to the application?

    - by Robert Grezan
    I'm running Apache in front of a GlassFish server using BalancerMember. For performance reasons, I would like POST requests to a certain endpoint to just be saved to a file, without being passed to the application (and with a correct HTTP status code returned). How can I configure Apache to do that? EDIT: In other words, if a POST request is for the path "http://example.com/upload", then the content (body) of the POST should go into a file.
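
    Apache alone has no directive for writing a request body to a file, so one sketch is to short-circuit the endpoint into a tiny local script before it reaches the balancer. Everything below is an assumption, not a known-good recipe: a mod_proxy exclusion plus a hypothetical PHP dump script.

        # in the vhost, before the balancer's ProxyPass:
        # exclude /upload from proxying and serve it locally instead
        ProxyPass /upload !
        Alias /upload /var/www/post-dump.php

        <?php
        // post-dump.php (hypothetical): write the raw POST body to a spool file
        if ($_SERVER['REQUEST_METHOD'] === 'POST') {
            $body = file_get_contents('php://input');
            file_put_contents('/var/spool/uploads/' . uniqid('post_', true), $body);
            header('HTTP/1.1 204 No Content');
        } else {
            header('HTTP/1.1 405 Method Not Allowed');
        }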

    Read the article

  • Chrome Web Store verification

    - by Vince V.
    A couple of days ago I created an extension for Chrome and added it to the store. Now I want to get it verified. I paid the 5 dollars and added my website to Webmaster Tools, where it is verified. Today I wanted to add that URL to my extension (ultimately to allow inline installation), but the store doesn't recognize the URL I verified in Webmaster Tools. I tried refreshing and clicking "Add site", but it just doesn't work. Is there some step I am missing, or is this a bug in the Chrome Web Store? I don't see any option left.

    Read the article

  • Is content in option tags indexed?

    - by Silfverstrom
    Is data inside an <option> tag indexed? For example, would the following markup allow "Volvo", "Saab", "Opel" and "Audi" to be indexed by a crawler?

        <select>
          <option value="volvo">Volvo</option>
          <option value="saab">Saab</option>
          <option value="opel">Opel</option>
          <option value="audi">Audi</option>
        </select>

    Will search engines put any weight on data in an option form element?

    Read the article

  • Disable comments / Spam protection

    - by SamIAm
    My client's site is built in SilverStripe. There is a news page, and it allows people to leave comments. Unfortunately we get loads of spam. I'm new to this: is there any way to disable the comment field by default, and how do I do it? Alternatively, is there an easy way to install spam protection? Thanks heaps. Sam

    Update: Because this is someone else's code, I just realised they already have some sort of spam protection, so we are now trying to disable comments instead. I have managed to make "no comments" the default by changing BlogEntry.php from:

        static $defaults = array(
            "ProvideComments" => true,
            'ShowInMenus' => false
        );

    to:

        static $defaults = array(
            "ProvideComments" => false, // changed
            'ShowInMenus' => false
        );

    Am I on the right track for disabling comments by default? Also, how can I stop the news page from showing the "xxx comments" link? E.g.:

        Test
        Posted by Admin on 21 June 2011 | 3 Comments
        Tags: P
        This is a test....
        3 comments | Read the full post

    Thanks. S:)

    Read the article

  • What tools to use for efficient link building?

    - by Evgeny
    As most SEO experts keep saying, it is not just the content you have, but also a hefty amount of quality incoming links to that content, that matters; these are the two ways to get to the top of the search results. The question is: where do I find the incoming links? One way I know of is Google Blog Search; it can be used to find blogs with information related to your content, and some allow you to leave comments. The comments usually consist of your name, e-mail and website. If you put a keyword instead of your name, the keyword turns into a link to your website. Unfortunately most blogs put rel="nofollow" on such links, but some don't. What other ways are there to find quality pages on which to place keyword links back to your website? A quality link usually means it:

    - is located on a page with relevant content
    - does not have rel="nofollow" on the <a> element
    - has a relevant keyword as anchor text, as in <a href="...">keyword</a>
    - sits on a page with high PageRank (3+) and TrustRank

    Read the article

  • I need a little help with .htaccess rewrite

    - by Pinokyo
    I need a little help with my .htaccess file. I have song, singer and album links I want to rewrite. I already rewrote the links so they look like this:

    - for songs: /song/song_name
    - for singers: /singer_name
    - for albums: /album_name

    From my .htaccess file:

        RewriteEngine on
        RewriteRule ^singer/([^/\.]+)/?$ /core/controller.php?singer=$1 [L]
        RewriteRule ^song/([^/\.]+)/?$ /core/controller.php?song=$1 [L]
        RewriteRule ^album/([^/\.]+)/?$ /core/controller.php?album=$1 [L]

    I need the links to be like this:

    - for songs: /singer_name/song_name
    - for singers: /singer_name
    - for albums: /singer_name/album_name

    Can anyone help me with this, please?
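
    One wrinkle in the target scheme: /singer_name/song_name and /singer_name/album_name have the same shape, so rewrite rules alone cannot tell a song from an album. A sketch that passes both segments and lets the controller decide (the "item" parameter is hypothetical, not from the question):

        RewriteEngine on

        # singer pages: /singer_name (skip real files and directories)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^([^/.]+)/?$ /core/controller.php?singer=$1 [L]

        # songs and albums: /singer_name/item_name - the controller
        # looks up whether item_name is a song or an album
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^([^/.]+)/([^/.]+)/?$ /core/controller.php?singer=$1&item=$2 [L]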

    Read the article

  • Set Up Google Analytics to Track Domain Alias

    - by Brian Boatright
    I found this article from Google: http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=55523 However, I'm not sure what happens to the data. Will I be able to determine which domain forwarded to the primary domain using their technique? Or will it simply transfer all the relevant keyword and other factors to the primary domain, but not which domain was originally landed on before the 302 redirect? What I need to do is track which domain aliases are being used.

    Read the article

  • Star rating not showing in rich snippets

    - by Danny R
    We've recently been doing a lot of work on our site's SEO (www.betterthanreviews.com). We recently did a push to update the rich snippets breadcrumb, meta description, and star rating. After giving Google some time to index the site, it has updated the breadcrumbs and meta descriptions for our review pages, but the stars are still not showing. This is currently how it appears in a Google search (link to the actual page: http://www.betterthanreviews.com/home-security/livewatch). The rich snippet does appear with stars, as intended, in Google's testing tool.

    More context: as seen in our HTML, we are using schema.org markup. We initially had the site labeled as schema.org/Corporation, but the page is now labeled schema.org/HomeAndConstructionBusiness, because Google will not show star ratings for the Corporation type. However, in our Webmaster Tools, the Structured Data section still shows the Corporation type, which could be a potential issue. Here is some of the markup we used (it can be examined more closely by inspecting the element):

        <div class="aggregate-rating" itemprop="aggregateRating" itemscope=""
             itemtype="http://schema.org/AggregateRating">
          <div class="review row_fluid" itemprop="review" itemscope=""
               itemtype="http://schema.org/Review">
            <div class="row_fluid rating" itemprop="reviewRating" itemscope=""
                 itemtype="http://schema.org/Rating">
              <meta content="4.5" itemprop="ratingValue"
                    title="4.5 out of 5 stars" class="star-rating-readonly">
              <meta content="2013-12-05" itemprop="datePublished">
              <p class="review-headline" itemprop="headline">Way better than my previous system</p>
              <div>
                <p class="reviewer" itemprop="author">Scott H. </p>
                <span class="bullet">•</span>
                <p class="created_at">2 months ago</p>
                <p class="content" itemprop="description">I love it! The experience I have had so far is extremely positive. I had another alarm system before and I didn't like it but this one is really nice. I am telling everybody about it.</p>
              </div>
            </div>

    Any suggestions for how to fix this?

    Read the article

  • How to prevent access to website without SSL connection?

    - by CraigJ
    I have a website with an SSL certificate installed, so that if I access it using https instead of http, I connect over a secure connection. However, I have noticed that I can still access the website non-securely, i.e. by using http instead of https. How can I prevent people from using the website in a non-secure manner? And if I have a directory on the website, e.g. samples/, can I prevent non-secure connections to just that directory?
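
    A common sketch in Apache .htaccess, assuming mod_rewrite is available (the first block forces HTTPS site-wide):

        RewriteEngine On

        # send every plain-HTTP request to its HTTPS equivalent
        RewriteCond %{HTTPS} off
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    Or, restricted to the samples/ directory only:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^samples/ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]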

    Read the article

  • JavaScript slider with image and text from PHP, scrollable in groups by indexes

    - by Roberto de Nobrega
    I am looking for a JavaScript solution that slides images with text pulled from PHP. The slider should slide in groups, controlled by index points. I have been googling, but found nothing like what I need. Let me give an example: imagine 10 products. I need to show the main picture and a text below the image. The slider shows 6 products at a time, and when I click one of the points (indexes), the group slides to the next group. Do you know of such a script? I know PHP, but I am a newbie with JavaScript. Thanks! PS: I am not sure where to put this question, so if this is the wrong place, let me know, and please accept my apologies! ;)
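
    A bare-bones sketch of the sliding-in-groups idea in plain JavaScript (the markup, sizes and class names are all made up; the PHP side only has to render the product image and text cells inside each group):

        <div id="viewport" style="overflow: hidden; width: 600px;">
          <div id="strip" style="white-space: nowrap;">
            <div class="group" style="display: inline-block; width: 600px;">products 1-6</div>
            <div class="group" style="display: inline-block; width: 600px;">products 7-10</div>
          </div>
        </div>
        <div id="dots"></div>
        <script>
          // one clickable "point" per group; clicking shifts the strip sideways
          var strip  = document.getElementById('strip');
          var groups = strip.getElementsByClassName('group');
          var dots   = document.getElementById('dots');
          for (var i = 0; i < groups.length; i++) {
            (function (n) {
              var dot = document.createElement('span');
              dot.textContent = '\u25CF';
              dot.style.cursor = 'pointer';
              dot.onclick = function () {
                strip.style.marginLeft = (-600 * n) + 'px';
              };
              dots.appendChild(dot);
            })(i);
          }
        </script>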

    Read the article

  • GEO Tool - commercial use

    - by Jens
    What I want to do: offer my clients the possibility of displaying their store as a small graphic (like Google Maps), just a small PNG with the location of the store. Only the specific client is able to see these images, and he pays for this area (not only for that :) ). So I'm looking for an API or whatever else to geocode the address and render a small (280*160 px) image. Google offers a premium licence (not cheap); there is MS Bing; and OpenStreetMap, whose 10^10 licences I am not able to make sense of ;( Any ideas?
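
    For illustration only (licensing for commercial use is exactly the open question here): Google's Static Maps API of that era rendered such a PNG from a single URL, with the address both centering the map and placing a marker; the address below is a placeholder:

        http://maps.googleapis.com/maps/api/staticmap?center=Main+St+1,Berlin&zoom=15&size=280x160&markers=Main+St+1,Berlin&sensor=false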

    Read the article

  • How to interpret number of URL errors in Google webmaster tools

    - by user359650
    Recently Google made some changes to Webmaster Tools, which are explained here: http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html One thing I could not find out is how to interpret the number of errors over time. At the end of February we migrated our website and didn't implement redirect rules for some pages (quite a few, actually). Here is what we're getting from the Crawl Errors report. What I don't know is whether the number of errors is cumulative over time (i.e. if Googlebot crawls your website on 2 different days and finds 1 separate issue on each day, whether it will report 1 error for each day, or 1 for the first day and 2 for the second). Based on the Crawl Stats, we can see that the number of requests made by Googlebot is not increasing. Therefore I believe the number of errors reported is cumulative: an error detected on one day is carried over and reported on subsequent days until the underlying problem is fixed and the page is crawled again (or you manually "Mark as fixed" the error), because if Google doesn't make more requests to a website, it has no way to check new pages and old pages at the same time. Q: Am I interpreting the number of errors correctly?

    Read the article
