Can robots.txt fix all soft 404s?
Posted by olo on Pro Webmasters on 2014-06-10
I have a lot of soft 404s reported in Google Webmaster Tools, and those pages no longer exist, so I can't add <meta name="robots" content="noindex, nofollow"> to them. I've been searching for a while but haven't found anything useful.
There are about 100 soft-404 URLs, and redirecting them all one by one seems silly since it would take too much time.
If I just add those URLs to robots.txt like this:
User-agent: *
Disallow: /mysite.asp
Disallow: /mysite-more.html
will that fix all the soft 404s for good? Or is there a way to turn all the soft 404s into hard 404s? Please give me some suggestions. Many thanks.
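For reference, here is a minimal sketch of how I could check what status code each of those URLs actually returns, so I can tell which ones still respond with 200 (a soft 404) instead of a real 404 or 410. It assumes the dead URLs are listed one per line in a file called urls.txt (just a placeholder name):

import urllib.request
import urllib.error

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # A HEAD request is enough; only the status code matters here.
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            print(url, resp.status)   # 200 here means the page is a soft 404
    except urllib.error.HTTPError as e:
        print(url, e.code)            # 404 or 410 would be a hard 404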