The site already filters IPs somehow. The question in my mind is how.
It may be that there is a registry of known bot IPs somewhere, and those addresses are used for filtering.
Or, more likely, bots identify themselves when they "browse", typically by announcing what they are in the User-Agent header of each request, which the site can detect and filter on.
My supposition is the second case.
Anyway, it seems obvious that some ill-behaved bots do not conform to the standard, whatever that may be, and thus are not automatically excluded.
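To make that concrete, here is a minimal sketch, assuming the User-Agent approach above; the token list and function names are mine, not anything FAA has published:

```python
# Sketch only: how a site *could* skip self-identified bots before counting a view.
# The token list is illustrative, not FAA's actual implementation.
KNOWN_BOT_TOKENS = ("bot", "crawler", "spider")

def is_self_identified_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string admits to being a bot."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def maybe_count_view(image_id: str, user_agent: str, view_counts: dict) -> None:
    """Credit a view only when the request does not identify itself as a bot."""
    if not is_self_identified_bot(user_agent):
        view_counts[image_id] = view_counts.get(image_id, 0) + 1
```

An ill-behaved bot simply sends a browser-like User-Agent and sails straight past a check like this.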
Under that assumption, it would be nice if there were a secondary filter, maintained manually, where detected bots are excluded.
A log-analysis program could apply simple criteria; for example, an IP with an inhuman number of visits could be retroactively excluded as a bot.
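As a sketch of what such a log pass might look like (the log path, format, and threshold here are assumptions, not FAA's actual setup):

```python
# Sketch of the retroactive idea: scan an access log, count hits per IP,
# and flag anything "inhuman" as a candidate for a manual exclusion list.
from collections import Counter

HITS_PER_DAY_THRESHOLD = 2000  # arbitrary cut-off for "inhuman"

def suspected_bot_ips(log_path: str) -> list[str]:
    hits = Counter()
    with open(log_path) as log:
        for line in log:
            # Common Log Format puts the client IP in the first field.
            ip = line.split(" ", 1)[0]
            hits[ip] += 1
    return [ip for ip, count in hits.items() if count > HITS_PER_DAY_THRESHOLD]

if __name__ == "__main__":
    for ip in suspected_bot_ips("access.log"):
        print(ip)  # candidates for the manually maintained exclusion list
```

Anything flagged this way would still want a human look before exclusion, since a busy shared IP (an office, a school) can look "inhuman" too.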
What would be the result? The result would be people complaining that their view counts were suddenly decreasing.
No amount of explanation would help; people would still suspect the view-reporting system of "stealing" their views.
New users would keep raising the issue even as old users became aware of the cause.
Views are a pretty terrible metric anyway. Comments and SALES are much more meaningful.
If you want detailed metrics, you need your own website, truly your own, since all the SEO is hogged by Pixels/FAA and none of it is left to Artistwebsites.
Just my opinion; obviously I don't know the whole story and am just inferring from partial knowledge and indirect evidence.