Forum rules
Discuss features as they are added to the new version. Give us your feedback. Don't post bug reports, feature requests, support questions or suggestions here. Feature requests are closed.
They're bots, or automated programs that crawl the internet indexing websites so they can be listed in search engines and the like. Because they visit a site and read through lots of pages, they can beat down on bandwidth pretty quickly. phpBB recognizes these users and doesn't show them images and the like, which reduces the load on the webserver (as far as I understand it), because the bots aren't real users.
I hope that's easy to follow and that it sort of helps.
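In case anyone wonders how that recognition works in general terms: the server compares the visitor's User-Agent header against a list of known bot signatures and, on a match, serves a stripped-down page. Here's a rough PHP sketch of the idea; it is not phpBB's actual code, and the signature list is just an example.

<?php
// Rough illustration only - not phpBB's actual bot handling.
// The signature list is just an example; real lists are much longer.
$bot_signatures = array('Googlebot', 'Slurp', 'msnbot', 'Yandex');

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$is_bot = false;

foreach ($bot_signatures as $signature)
{
    // Case-insensitive substring match against the visitor's User-Agent header
    if (stripos($ua, $signature) !== false)
    {
        $is_bot = true;
        break;
    }
}

if ($is_bot)
{
    // Serve a lighter page: skip avatars, images, signatures, etc.
}
?>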
DarkOMEN wrote:
They're bots, or automated programs that crawl the internet indexing websites so they can be listed in search engines and the like. Because they visit a site and read through lots of pages, they can beat down on bandwidth pretty quickly. phpBB recognizes these users and doesn't show them images and the like, which reduces the load on the webserver (as far as I understand it), because the bots aren't real users.
I hope that's easy to follow and that it sort of helps.
Thanks for the reply!
But, gosh, is there a way to "permanently block/ban" these bots, since they beat down on bandwidth pretty quickly?
You can add a robots.txt file to your root web directory. Robots.txt allows you to define restrictions on bot activity for your entire site, or specific parts of it.
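For a concrete idea of what that can look like, here is a minimal robots.txt sketch. The paths are only placeholders; point the Disallow lines at whichever scripts or directories of your board you want crawlers to skip, and keep in mind that Crawl-delay is only honoured by some crawlers.

User-agent: *
Disallow: /search.php
Disallow: /posting.php
Crawl-delay: 10

The file just needs to sit at the web root (e.g. http://www.example.com/robots.txt) for crawlers to find it.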
But if you're going to use a robots.txt file, you have to keep in mind that not all bots respect it, so you'd be better off going with Graham's suggestion; it's much easier, and it's simple to do in the ACP.
Having to ban IP ranges for individual bots is something I'd put under the heading "last resorts". A robots.txt file is already going to keep a lot of bots away.
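If it does come to that last resort and you'd rather block at the web server instead of through the ACP, an Apache .htaccess rule is one way to do it. This is only a sketch, assuming your host runs Apache and allows per-directory overrides; the 192.0.2.x range below is a documentation placeholder, not a real crawler's address range.

# Example only: block one misbehaving crawler's address range (placeholder range)
Order Allow,Deny
Allow from all
Deny from 192.0.2.0/24

Drop that into the board's directory and every request from the listed range gets a 403. Most admins will still find the ACP ban list easier to keep track of, which is why it really is a last resort.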