Blocking “crawler”
I have many sites that are bleeding bandwidth (upwards of 20GB a month and more) to traffic that isn’t getting caught by Blackhole. In Awstats, the traffic is listed as “crawler” (when I tried adding a modsec rule for that string, it seemed to block multiple legitimate bots, so I’m not sure the Awstats info is reliable). In any case, while I do get occasional notices from the plugin that a bot has been blocked, it seems to miss a lot. Is the only protection based on a bot following a link it’s told not to? Is there something I’m missing in how to set this up effectively?
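For reference, a minimal sketch of the kind of ModSecurity (v2) rule described above, matching a User-Agent containing “crawler”. The rule id (1000001) is arbitrary and the match string is an assumption based on the Awstats label; as noted, a broad match like this can catch legitimate bots as well:

```apache
# Hypothetical ModSecurity v2 rule: deny any request whose User-Agent
# header contains "crawler" (lowercased before matching).
# The id 1000001 is arbitrary; pick one outside your existing rule ranges.
SecRule REQUEST_HEADERS:User-Agent "@contains crawler" \
    "id:1000001,phase:1,t:lowercase,deny,status:403,log,msg:'Blocked generic crawler UA'"
```

A narrower match (a specific bot name rather than the generic word “crawler”) would reduce the risk of blocking bots you actually want, such as search engine crawlers.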
Viewing 6 replies - 1 through 6 (of 6 total)
The topic ‘Blocking “crawler”’ is closed to new replies.