I’m load testing my application with Gatling.
The endpoint under test is the login endpoint, which must return user information.
The handler queries two collections sequentially, not in parallel (a sketch of the full handler is included after the query code below).
The problem appears when I raise the load to around 1,500 users per second.
Gatling warns about premature close:

```
17:18:39,281 [WARN] i.g.h.e.r.DefaultStatsProcessor - 'Auth Prod' request for user 12053 failed: j.i.IOException: premature close
17:18:39.281 [WARN] i.g.h.e.GatlingHttpListener - 'Auth Prod' request for user 12077 failed
```
But the bigger problem is that even after the test stops, the server no longer responds to login requests.
The queries just hang without returning any errors; only after a long time does the server recover.
The Rust server itself stays up; only the queries are stuck.
This is probably related to the pool size and the maximum number of connections, but I’d rather not change those for now just to keep the server from dying during queries.
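For reference, these are the settings I mean; I have not set any of them, so the driver defaults apply. A sketch only, assuming the field names of the 2.x `ClientOptions` API, with placeholder values:

```rust
// Not currently set -- listed only to show which knobs I'm referring to
// (assuming mongodb 2.x ClientOptions fields); the values are placeholders.
client_options.max_pool_size = Some(100); // max connections per server in the pool
client_options.min_pool_size = Some(0);   // idle connections kept open
client_options.connect_timeout = Some(Duration::from_secs(10));          // TCP connect timeout
client_options.server_selection_timeout = Some(Duration::from_secs(30)); // wait for a usable server
```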
Client setup (Rust):

```rust
// Parse the connection string and build a single shared client.
let mut client_options = ClientOptions::parse_async(uri).await?;
// Close pooled connections that have been idle for 10 seconds.
client_options.max_idle_time = Some(Duration::new(10, 0));
let client = Client::with_options(client_options)?;
```
On each request I use the following function to get a reference to the client and access the collection:
```rust
pub fn get_service<T>(&self, service: Services) -> Collection<T> {
    // Cloning the client is cheap: it reuses the same underlying connection pool.
    let client = self.cliente.clone();
    let database = client.database(&self.database);
    match service {
        Services::User => database.collection::<T>("users"),
        Services::GrupoEmp => database.collection::<T>("emps"),
    }
}
```
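`get_service` is a method on the application state that every handler receives through Axum’s `State` extractor; the client is built once at startup and shared. Roughly like this (a sketch only; the struct name and wiring are illustrative, not my exact code):

```rust
use mongodb::Client;

// Illustrative struct name; the fields match what `get_service` uses.
pub struct AppState {
    cliente: Client,   // mongodb::Client is internally reference-counted
    database: String,  // name of the database that holds "users" and "emps"
}

// At startup (simplified):
//   let state = std::sync::Arc::new(AppState { cliente: client, database: "mydb".into() });
//   let app = axum::Router::new()
//       .route("/login", axum::routing::post(login)) // `login` is sketched further below
//       .with_state(state);
```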
Query:

```rust
// Cap the server-side execution time of the query at 5 seconds.
let find_options = FindOneOptions::builder()
    .max_time(Duration::from_secs(5))
    .build();

let user: User = match user_service
    .find_one(doc! { "login": &user_name }, Some(find_options))
    .await
// (the match arms that handle Ok(Some(_)), Ok(None) and Err(_) are omitted here)
```
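For completeness, the whole login handler does roughly this: two sequential `find_one` calls, first on `users`, then on `emps`, each with a 5-second `max_time`. This is a simplified sketch, not my exact code; the document/payload types, the error handling, and the second query’s filter are illustrative assumptions:

```rust
use std::{sync::Arc, time::Duration};

use axum::{extract::State, http::StatusCode, Json};
use mongodb::{bson::doc, options::FindOneOptions, Collection};
use serde::{Deserialize, Serialize};

// Illustrative models -- not my real ones.
#[derive(Debug, Serialize, Deserialize)]
struct User {
    login: String,
    grupo: String, // reference to the "emps" document; the field name is an assumption
}

#[derive(Debug, Serialize, Deserialize)]
struct GrupoEmp {
    nome: String,
}

#[derive(Deserialize)]
struct LoginRequest {
    login: String,
}

async fn login(
    State(state): State<Arc<AppState>>,
    Json(payload): Json<LoginRequest>,
) -> Result<Json<User>, StatusCode> {
    // Both collection handles come from the same shared client (and pool).
    let user_service: Collection<User> = state.get_service(Services::User);
    let emp_service: Collection<GrupoEmp> = state.get_service(Services::GrupoEmp);

    let find_options = FindOneOptions::builder()
        .max_time(Duration::from_secs(5))
        .build();

    // First query: look the user up by login.
    let user = user_service
        .find_one(doc! { "login": &payload.login }, Some(find_options.clone()))
        .await
        .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?
        .ok_or(StatusCode::UNAUTHORIZED)?;

    // Second query, only after the first has finished (sequential, not parallel).
    let _grupo: GrupoEmp = emp_service
        .find_one(doc! { "nome": &user.grupo }, Some(find_options))
        .await
        .map_err(|_| StatusCode::INTERNAL_SERVER_ERROR)?
        .ok_or(StatusCode::UNAUTHORIZED)?;

    Ok(Json(user))
}
```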
I’m using:
- MongoDB Rust driver
- Axum
Note: there are several improvements that could be made so the server can handle more requests (a load balancer, replication, etc.), but the immediate problem is that the queries are dying.