Consider a browser like Chrome: if every tab ran on the same thread, one tab's blocking code could slow the whole browser down. The solution is to give each tab its own thread, so the database calls you make in one tab can run in parallel with another tab that might be streaming data to you (a YouTube video, say).
Now let's say you're running some heavy, CPU-intensive computation in the browser. This is where the browser's Web Workers come in.
Web Workers give us breathing space by running scripts in a separate, background thread, away from the main execution thread of a web page or app.
They are not miracle workers, however. As the HTML spec says, workers "...have a high start-up performance cost, and a high per-instance memory cost." It is a tradeoff, but the benefit is getting more threads to work with, running in parallel.
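As a minimal sketch of what this looks like in practice (this is browser-only code; `Worker`, `Blob`, and `URL.createObjectURL` are browser APIs, and the worker script is inlined via a Blob URL purely to keep the sketch in one file):

```javascript
// The worker's script: it has no access to the DOM, only messages.
const workerSource = `
  onmessage = (event) => {
    let sum = 0;
    for (let i = 0; i < event.data.upTo; i++) sum += i;
    postMessage(sum);
  };
`;

// Spawn a background thread from the inlined script.
const worker = new Worker(
  URL.createObjectURL(new Blob([workerSource], { type: "text/javascript" }))
);

// The heavy loop runs off the main thread, which stays free for UI events.
worker.postMessage({ upTo: 5_000_000 });
worker.onmessage = (event) => {
  console.log("sum from worker:", event.data);
  worker.terminate(); // workers have a per-instance memory cost, so clean up
};
```

The `terminate()` call reflects the tradeoff from the spec quote above: each worker carries real memory cost, so release them when the job is done.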
## What's the alternative?
On a traditional, non-Node server, a thread is spawned for every request. If one request is occupying a thread, the server spawns another for the next. And another.
But the server creates those threads from a limited pool.
A server cannot create millions of threads; once it uses up its pool, the next request has to wait until a thread becomes available. This is where the appeal, and the excitement, of a Node.js server comes in. The idea with a Node server is that it handles all requests on a single thread (making them faster to resolve), and hands any long-running tasks off to separate threads known as worker threads!
So not only does the browser have a concept of workers; Node servers, too, utilise worker threads (backed internally by a C++ thread pool), which are non-blocking and can run in parallel.
We are operating with a single main thread, and Node or the browser uses workers to give us additional threads. Now let's talk a little bit about hardware. If we had a single-core CPU, which can only execute one thread at a time, we would need to introduce concurrency as a way to handle multiple threads. Concurrency is the ability to make progress on multiple tasks during overlapping time periods, without them blocking each other or having to be processed strictly one after another. Note that this is not quite the same as parallelism, where tasks literally execute at the same instant on different cores.
On a single core, the CPU executes one thread's task, and whenever that thread is waiting or has finished its current chunk of work, it switches to another thread and continues executing that one, interleaving work so every task makes progress.
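You can see this interleaving on a single JavaScript thread with the event loop alone; in this sketch, two tasks voluntarily yield between chunks of work (`setImmediate` is Node-specific, and the `task` helper is mine), so neither blocks the other even though only one runs at any instant:

```javascript
const order = [];

// A "task" that does its work in chunks, yielding to the event loop
// between chunks so other tasks get a turn.
async function task(name, chunks) {
  for (let i = 1; i <= chunks; i++) {
    order.push(`${name}:${i}`);
    // Yield: hand control back to the event loop before the next chunk.
    await new Promise((resolve) => setImmediate(resolve));
  }
}

// Both tasks make progress during overlapping periods on one thread.
Promise.all([task("A", 3), task("B", 3)]).then(() => {
  console.log(order.join(" "));
});
```

The logged order shows the chunks of A and B interleaved: B's first chunk runs before A's second, even though A started first. That is concurrency without parallelism: one core, one thread, overlapping progress.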