Server.js
const http = require('http');

const server = http.createServer((req, res) => {
  const now = new Date();
  console.log(now); // log when the request actually reaches the server
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello, World!\n');
});

const port = 3000;
server.listen(port, () => {
  console.log(`Server running at http://localhost:${port}/`);
});
Client.js
fetch('http://localhost:3000');

const now = new Date();
console.log(now); // log when the client starts

// Long loop
console.log('Starting a long loop...');
let sum = 0;
for (let i = 0; i < 100000000; i++) {
  sum += i;
}
The loop runs for roughly 10 seconds.
Now what I expected was:
Client.js Output : 2024-05-20T16:56:58.616Z
Server.js Output : 2024-05-20T16:56:58.916Z
My rationale: the client runs the first line, hands the request off to the Web API, and moves on to the next line without waiting for any response (asynchronous). The benefit is that while JavaScript does its other work, the server does its own; neither waits on the other, and whenever our JavaScript program becomes free, it deals with the server's response.
[Image: JavaScript Engine diagram]
But the result I got was:
Client.js : 2024-05-20T16:56:58.616Z
Server.js : 2024-05-20T16:57:09.636Z
There is roughly a 10-second gap before the server is hit, meaning the request was sent to the server only after the loop was over.
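The same ordering can be reproduced with no network at all. A minimal sketch, using an already-resolved promise as a stand-in for fetch's follow-up work (an assumption for illustration, not the real fetch internals): its callback cannot run until the synchronous loop has released the call stack.

```javascript
const order = [];

// Stand-in for fetch's follow-up work: already queued, ready to run.
Promise.resolve().then(() => order.push('async callback'));

order.push('loop start');
let sum = 0;
for (let i = 0; i < 10000000; i++) { // shorter stand-in for the long loop
  sum += i;
}
order.push('loop end');

// Give the event loop one turn, then show the order.
setTimeout(() => console.log(order.join(' -> ')), 0);
// prints: loop start -> loop end -> async callback
```

Even though the promise is resolved before the loop starts, its callback only runs after the synchronous code finishes, because JavaScript runs each piece of synchronous code to completion before draining the queues.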
I know that JavaScript is a single-threaded asynchronous language, but the Chrome browser isn't. When the Web API was called (fetch), why didn't it send the request to the server in the background and just wait for the response? Our program doesn't have to wait for the response.
So in the ideal case, if the client takes 10 seconds to do its job and the server takes 10, the end user would see the result after only 10 seconds.
But the experiment shows that it takes 20 seconds because of this behaviour.
Can someone explain the result?