Robots.txt Disallow command
- by Saahil Sinha
How do I disallow folders through robots.txt that are being crawled due to an incorrect URL structure and are causing duplicate page errors?
The URL being crawled incorrectly by Google, leading to the duplicate page error, is:
www.abc.com/forum/index.php?option=com_forum
However, the actual, correct page is:
www.abc.com/index.php?option=com_forum
Is excluding them through robots.txt the correct way to handle this?
To exclude
www.abc.com/forum/index.php?option=com_forum
the directive would be:
Disallow: /forum/
Won't that also block the site's legitimate 'Forum' component folder?
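For reference, here is a minimal sketch (using Python's standard urllib.robotparser, with the www.abc.com host from the question used purely as a placeholder) showing how the path-prefix match of Disallow: /forum/ applies to the two URLs quoted above:

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content containing only the rule in question
robots_lines = [
    "User-agent: *",
    "Disallow: /forum/",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# Wrongly structured URL (path begins with /forum/): blocked
print(parser.can_fetch("*", "http://www.abc.com/forum/index.php?option=com_forum"))  # -> False

# Correct URL at the site root (path does not begin with /forum/): still allowed
print(parser.can_fetch("*", "http://www.abc.com/index.php?option=com_forum"))  # -> True

The rule only matches URLs whose path starts with /forum/, so the root-level index.php URL stays crawlable; anything that genuinely lives under a /forum/ directory would, however, be matched by the same prefix, which is the concern above.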