This is a Node.js application hosted on AWS Fargate. As can be inferred from the graph, the maximum CPU and memory utilisation is around 2.5x to 3x the average CPU/memory utilisation.
My questions are:
- Am I over-provisioning my Fargate resources?
- If I reduce the CPU and memory by half (see the sketch below for what I mean by halving), so that maximum CPU/memory utilisation hits 100% while the average stays around 50-70%, will that cause the application to crash/restart, or will it impact performance whenever utilisation peaks at 100%?
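
For reference, since my actual task definition isn't shown here, this is a minimal sketch of what I mean by "halving", assuming the task is defined with the AWS CDK and currently runs at 1 vCPU / 2 GB (hypothetical values, not my real configuration):

```typescript
import { App, Stack } from 'aws-cdk-lib';
import * as ecs from 'aws-cdk-lib/aws-ecs';

const app = new App();
const stack = new Stack(app, 'FargateSizingStack');

// Hypothetical current size: 1 vCPU (cpu: 1024) / 2 GB (memoryLimitMiB: 2048).
// Halved size below: 0.5 vCPU / 1 GB, which is still a valid Fargate CPU/memory combination.
const taskDef = new ecs.FargateTaskDefinition(stack, 'AppTaskDef', {
  cpu: 512,             // was 1024; 512 CPU units = 0.5 vCPU
  memoryLimitMiB: 1024, // was 2048; must stay within the memory range Fargate allows for the chosen cpu
});

taskDef.addContainer('app', {
  image: ecs.ContainerImage.fromRegistry('node:20-alpine'), // placeholder image, not my real one
  // The Node.js heap can also be capped so it stays below the task memory limit:
  environment: { NODE_OPTIONS: '--max-old-space-size=768' },
});
```

The concern behind the second question is the memory side: my understanding is that CPU at 100% only throttles (slower responses), while memory at 100% would get the task OOM-killed and restarted, which is why I'm asking before downsizing.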