My application suddenly had a huge increase in the number of requests being made to it. The only change of note was adding a sitemap.xml, so I believe the increase is due to crawlers/bots, but I'm not sure how to verify that except by taking a sample of incoming requests from the logs in my Papertrail add-on and investigating the IP addresses (a lot of them appear to belong to Googlebot).
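When I spot-check an individual IP, I've been doing the reverse-then-forward DNS lookup that Google documents for verifying Googlebot. Roughly like this sketch (the function name and the example IP are just placeholders):

```python
import socket

def looks_like_googlebot(ip: str) -> bool:
    """Reverse-DNS the IP, then forward-confirm it, per Google's verification advice."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False
    # Genuine Googlebot hosts resolve to *.googlebot.com or *.google.com
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # The hostname must resolve back to the original IP
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False

# Example with a placeholder IP taken from the logs
print(looks_like_googlebot("203.0.113.7"))
```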
Is there a way in Heroku to limit bot activity, or at least to find out whether this increase is in fact due to bots? If it isn't, what could be causing it and how can I mitigate it?
I have since updated the robots.txt file and increased the Crawl-delay directive (roughly as in the snippet below), but to no avail; I was hoping the number of requests would go down.
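This is along the lines of what the file contains now; the delay value and sitemap URL are placeholders, not my exact values:

```
User-agent: *
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
```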