It's perhaps intuitive if we consider a desktop application that serves one user. But take, for example, a web application where we write a program that fetches something from the database and displays it in the user's browser: what happens when two users, or 100+ users, request the same program and the same data at the same time?
How, and by what, does this get handled? What does it "look like", and what is happening behind the scenes to make this possible? With my very limited knowledge, I imagine the program runs once for each call, in a sequence where the users have to wait their turn? Where does the concept of multithreading come in, or is that irrelevant here? (Edit: I now know this is called parallel computing, as opposed to concurrent or sequential.)
Thanks for your time. (Btw, I'm not sure how to tag this question; any suggestions would be appreciated!)
In most web frameworks this is handled by having requests not share any data: the framework takes care to instantiate all the necessary objects separately for each request. Each request is also typically processed on its own thread (or asynchronous task) from a pool, so requests run concurrently rather than queuing up one behind another.
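For a concrete (if simplified) sketch, here is a minimal ASP.NET Core application (my choice of framework purely for illustration, assuming .NET 6 or later; the route is made up). The framework invokes the handler once per incoming request, usually on thread-pool threads, so two requests never see each other's local variables:

// Minimal ASP.NET Core app (top-level statements, .NET 6+).
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// The framework calls this lambda once for every request to /hello/{name}.
// Each call gets its own "name" argument and its own locals, and calls for
// different requests may run at the same time on different thread-pool threads.
app.MapGet("/hello/{name}", (string name) => "Hello " + name);

app.Run();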
Conceptually, the application is just a function that takes request data and outputs response data. So if your application is a simple greeter, the function looks like this:
// Pure function: the response depends only on the request; no shared state.
string GetResponse(string request) => "Hello " + request;
You can intuitively see that multiple users calling this function concurrently is perfectly safe, because the requests do not share any state.
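Here is a rough sketch of what "many users at once" looks like for such a function; the Greeter class and the 100 simulated users are just for illustration:

using System;
using System.Threading.Tasks;

class Greeter
{
    // Pure function: the response depends only on the request argument.
    static string GetResponse(string request) => "Hello " + request;

    static async Task Main()
    {
        // Simulate 100 users calling the function at the same time.
        var tasks = new Task<string>[100];
        for (int i = 0; i < tasks.Length; i++)
        {
            int user = i; // capture a copy for the lambda
            tasks[i] = Task.Run(() => GetResponse("user " + user));
        }

        string[] responses = await Task.WhenAll(tasks);
        Console.WriteLine(responses.Length + " responses, none interfering with any other.");
    }
}

No matter how the calls interleave, every caller gets the right answer, because there is nothing shared for them to step on.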
Usually, there will not be any global variables, since those would be shared between requests. If there are globals, great care must be taken (for example with locks or atomic operations) to keep access to them safe.
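As a sketch of the kind of care required, consider a hypothetical hit counter shared by all requests. Interlocked.Increment is one way to make the shared update atomic (a lock statement would work as well):

using System.Threading;

static class HitCounter
{
    // Shared/global state: every request sees the same field, so a plain
    // "_count++" from two threads at once could lose an increment.
    private static long _count;

    public static long RecordHit()
    {
        // Interlocked.Increment performs the read-modify-write atomically,
        // so concurrent requests cannot corrupt the count.
        return Interlocked.Increment(ref _count);
    }
}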
Databases, in turn, are written so that multiple concurrent requests can interact with them safely; they use mechanisms such as transactions and locking to keep concurrent reads and writes consistent.
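A common pattern, sketched below, is for each request to open its own (pooled) connection and let the database coordinate the rest; the ProductRepository class, table, and connection string are hypothetical:

using Microsoft.Data.SqlClient; // e.g. SQL Server via the Microsoft.Data.SqlClient package

class ProductRepository
{
    // Hypothetical connection string; a real app would read this from configuration.
    private const string ConnString =
        "Server=localhost;Database=Shop;Integrated Security=true;TrustServerCertificate=true";

    public static string GetProductName(int id)
    {
        // Each request opens its own connection object. ADO.NET pools the physical
        // connections underneath, and the database server itself coordinates
        // concurrent queries with locks and transactions.
        using (var conn = new SqlConnection(ConnString))
        using (var cmd = new SqlCommand("SELECT Name FROM Products WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", id);
            conn.Open();
            return cmd.ExecuteScalar() as string; // null if no such product
        }
    }
}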