Ok, I've had a look at the link you provided and I'm a bit confused. AFAIK, robots.txt is a method to exclude whole pages from being crawled by search engines. I'm not sure I want to hide my forums from search engines entirely, just exclude certain troublesome bots, and I'm getting the impression from the topic that SQL_novice linked to above that the way to do this is to add code to the .htaccess file rather than the robots.txt file? Thanks for your advice, I'm going to have a closer look at this now.

Regardless of how you are blocking them, they should be added to the bots list. Amazonbot and others like ChatGPT's crawler that identify themselves will almost always obey robots.txt, which is the best way to block them, since they will no longer make requests except to occasionally re-check robots.txt:
https://developers.google.com/search/do ... bots/intro
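For example, entries along these lines would tell those two crawlers to stay away from the whole site. The user-agent tokens shown ("Amazonbot" for Amazon, "GPTBot" for OpenAI/ChatGPT) are the commonly published ones, but check each vendor's documentation for the exact name before relying on them:

    # robots.txt sketch: block the two example crawlers from the whole site
    User-agent: Amazonbot
    Disallow: /

    User-agent: GPTBot
    Disallow: /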
Once you add them to robots.txt, give it a day or two, because they don't constantly download that file. If they have been added to the bots list in phpBB, you can actually use phpBB to see whether they are obeying robots.txt, because the last visit time is listed for each bot.
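For crawlers that ignore robots.txt, the .htaccess approach mentioned in the quoted question is the usual fallback. A minimal sketch, assuming Apache with mod_rewrite enabled and using the same two bot names purely as placeholders (adjust the pattern to whatever user agents actually show up in your logs):

    # .htaccess sketch: return 403 Forbidden to requests whose User-Agent matches either bot
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (Amazonbot|GPTBot) [NC]
    RewriteRule .* - [F,L]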
Statistics: Posted by Innertraveller — Fri Mar 28, 2025 10:34 pm