Can any existing load balancer software handle intentionally unevenly distributed requests?
Basically, I have ~20 different endpoints in different places that do exactly the same thing. The difference is each endpoint’s rate limit, given in both requests/second and requests/month. For example, one endpoint might accept 5 requests/second and 10,000,000 requests/month, while another might accept 10 requests/second and 5,000,000 requests/month.
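If no off-the-shelf balancer fits, the selection logic itself is small: give each endpoint a token bucket for its per-second limit and a counter for its monthly budget, and dispatch to the first endpoint with capacity left. A minimal sketch, assuming a hypothetical endpoint table (the URLs and limits below are made-up placeholders, not real services):

```python
import time

# Hypothetical endpoint table; URLs and limits are assumptions for illustration.
ENDPOINTS = [
    {"url": "https://a.example/api", "per_sec": 5,  "per_month": 10_000_000},
    {"url": "https://b.example/api", "per_sec": 10, "per_month": 5_000_000},
]

class TokenBucket:
    """Refills at `rate` tokens/second, capped at `rate` tokens of burst."""
    def __init__(self, rate):
        self.rate = rate
        self.tokens = float(rate)
        self.last = time.monotonic()

    def try_take(self):
        now = time.monotonic()
        self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

buckets = [TokenBucket(e["per_sec"]) for e in ENDPOINTS]
monthly_left = [e["per_month"] for e in ENDPOINTS]

def pick_endpoint():
    """Return the first endpoint with both per-second and monthly budget, else None."""
    for i, e in enumerate(ENDPOINTS):
        if monthly_left[i] > 0 and buckets[i].try_take():
            monthly_left[i] -= 1
            return e["url"]
    return None
```

A real deployment would also need this state shared (e.g. in Redis) if more than one dispatcher runs, since the monthly counters must be global.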
Bottlenecks, nginx as a load balancer
Why doesn’t nginx become a bottleneck when serving as a load balancer? And if it does become a single-point bottleneck under some conditions, is there any solution other than a hardware load balancer?
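One common software-only answer is to run nginx itself as a redundant pair behind a floating virtual IP (e.g. with keepalived/VRRP), so that no single nginx instance is a point of failure, and to scale horizontally with DNS round-robin across several such pairs. A minimal keepalived sketch for the active node (interface name, router ID, and address are assumptions):

```
# Hypothetical keepalived.conf for the MASTER of an nginx pair.
vrrp_instance VI_1 {
    state MASTER            # use BACKUP on the standby box
    interface eth0          # assumed NIC name
    virtual_router_id 51
    priority 100            # lower value on the standby
    advert_int 1
    virtual_ipaddress {
        192.168.1.100       # the shared VIP clients connect to
    }
}
```

If the standby detects the master is gone, it claims the VIP and traffic continues without client-side changes.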
Parallelising processing of users
Update: I’ve updated my question to reflect the fact that I’m working with a database.
I need to process user actions: