My title clearly indicated my lack of understanding of the core concept of pattern matching – specifically, using .htaccess to block Bad-Bots from accessing a site, which they do in order to crawl or mirror it, consuming bandwidth in the process.
But my question isn’t about .htaccess itself – it’s about what seems to me (IMO) the obvious ineffectiveness of using a long list for searching and matching.
Isn’t it far better to allow a list of positives, as opposed to matching against a list of negatives?
For example:
1st explanation attempt:
If the User Agent does NOT match one of these Good-Bots, then block.
If the User Agent matches one of these Bad-Bots, then block.
I can’t express this idea as a formula or algorithm because I don’t know how to – assuming it even could be expressed that way – but if I could, I suppose it would be something like this:
2nd explanation attempt:
If THIS is NOT ‘A’ (‘A’ being a list of positives), then deny.
If THIS is ONE INSTANCE of ‘A’ (‘A’ being a list of negatives), then deny.
3rd try!
If THIS is not RED, then deny.
If this is BLUE, YELLOW, or GREEN (i.e. NOT RED), then deny.
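If I try to put that whitelist idea into .htaccess (mod_rewrite) terms, I imagine it would look roughly like the sketch below – only a sketch, and the allowed names (Googlebot, bingbot, Mozilla) are placeholders rather than a real allow-list:

```
# Whitelist sketch: if the User-Agent does NOT match any allowed pattern,
# deny the request (403). Requires mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot|Mozilla) [NC]
RewriteRule .* - [F]
```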
If this makes sense, why would a web developer use the latter approach when the latter’s list is longer than the former’s? Presumably there are fewer Good-Bots (User-Agents) than there are Bad-Bots (ignoring the fact that the UA can be forged)?
Ultimately, wouldn’t it be far better to create an index of all common Good-Bots and search and match against that, rather than maintain a seemingly infinite list of Bad-Bots? (Not forgetting the time it would take to keep such a list updated with new Bad-Bots.)
Reference: perishablepress.com/4g-ultimate-user-agent-blacklist/
Why search and match against MORE negatives rather than fewer positives?
This is a matter of what you want the default to be.
If you want to allow by default, you need to list the blocked ones (blacklist). If you want to deny by default, you need to list the allowed ones (whitelist).
User-Agent strings are incredibly varied. If you used a whitelist, your site wouldn’t work for some people, and that’s something you absolutely don’t want. So you selectively blacklist only the bots that actually cause excessive traffic.
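In .htaccess (mod_rewrite) terms, that selective blacklist might look something like this rough sketch – the bot names are only illustrative, not a recommended list:

```
# Blacklist sketch: if the User-Agent matches a known abusive bot,
# deny the request (403); everything else is allowed by default.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (SiteSnagger|WebCopier|HTTrack) [NC]
RewriteRule .* - [F]
```

The default stays “allow”, so an unusual but legitimate browser is never locked out by accident.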
That applies to public-facing websites. Intranet sites and sites for specific customers may, and often do, whitelist just the legitimate users.
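In that situation the deny-by-default rule is usually keyed on something more reliable than the User-Agent, such as the client address or authentication. A minimal Apache 2.4 sketch (the address range is just an example):

```
# Deny by default; allow only clients from the internal network.
Require ip 10.0.0.0/8
```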
“Presumably there are fewer Good-Bots (User-Agents) than there are Bad-Bots (ignoring the fact that the UA can be forged)?”
No, there certainly aren’t. There are many niche browsers, plus all sorts of scripts and tools, and browsers embed various bits of configuration in their User-Agent strings, so you can never hope to collect the complete set of legitimate values – new ones are being created every day.