I am trying to understand how FastAPI (or, for that matter, any other backend framework) handles multiple concurrent user requests. I have built a full-stack ML cancer-prediction app that works great for a single user, but I am thinking about how it can fundamentally scale to serve multiple users. I've written my API routes in FastAPI; one of them handles user uploads and another runs predictions. Here is the code:
@app.post("/uploadfile/")
async def create_upload_file(file: UploadFile = File(...)):
path = Path(__file__).parents[1] / "saved_images" / file.filename
try:
with path.open("wb") as buffer:
shutil.copyfileobj(file.file, buffer)
except Exception as e:
print("Error: ", e)
global current_filename
current_filename = file.filename
return {"filename": file.filename}
@app.get("/predict/")
async def predict(filename: str):
path = Path(__file__).parents[1] / "saved_images" / filename
path = str(path)
models = utils.get_models_list()
dataset_test = DatasetTest(df_single_image, 'test', 'test', transform=transforms_test)
image = dataset_test[0]
prediction, probs = utils.predict_single_image(image, models)
return {"Probs": probs,
"Prediction": prediction}
Running the predict route for each test sample/image takes about 2 minutes. I have two basic questions here:
i) What if two users simultaneously upload their images using the upload route? How is that handled by FastAPI?
ii) Let's say two users have their images uploaded to the filesystem and now they hit the predict route simultaneously. How does FastAPI handle that, and (the main question) how does it determine which user to return which response to? (Assume I don't have an authentication system in place, so any user can just come onto the web app and start using it.)
Also, here is the frontend component I'm using to send these requests:
const PredictButton = () => {
    const selectedFile = useContext(FileContext);
    const [pred, setPred] = useState(null);

    const handleUpload = async () => {
        const formData = new FormData();
        formData.append("file", selectedFile, selectedFile.name);
        const requestOptions = {
            method: 'POST',
            body: formData,
        };
        await fetch('http://localhost:8000/uploadfile/', requestOptions);
    }

    const handlePredict = async () => {
        await handleUpload();
        const requestOptions = {
            method: 'GET',
        };
        const resp = await fetch(`http://localhost:8000/predict/?filename=${selectedFile.name}`, requestOptions);
        const data = await resp.json();
        setPred(data["Prediction"]);
    }

    return (
        <div className="predict-button">
            <button className="predict-button-button" onClick={handlePredict}>Predict!</button>
            <div className="pred">
                {pred === null ? null : (pred === 1 ? <div className="detected">Prediction: Cancer Detected</div> : <div className="no-detected">Prediction: No cancer Detected</div>)}
            </div>
        </div>
    )
}
It would be great if someone could help me clear up these fundamental doubts.
Here is a quick scenario of how FastAPI handles requests concurrently and determines which user gets which response in a multi-user environment.

Let's say two users simultaneously upload their images via the /uploadfile/ route. FastAPI uses an event loop to manage the concurrent requests: each request is processed separately, which ensures that the requests do not interfere with each other. See https://fastapi.tiangolo.com/async/
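To make the event-loop point concrete, here is a minimal, self-contained sketch (the /demo-upload/ route and the 5-second sleep are invented for illustration; they are not part of the app above). An async path function that awaits non-blocking I/O suspends itself at the await, and the loop is free to serve the second user in the meantime:

import asyncio

from fastapi import FastAPI

app = FastAPI()

@app.post("/demo-upload/")
async def demo_upload():
    # Simulate non-blocking I/O, e.g. streaming an upload to disk.
    # While this coroutine is suspended at the await, the event loop
    # can pick up and start handling another user's request.
    await asyncio.sleep(5)
    return {"status": "saved"}

If two users hit this route at the same moment, both responses arrive after roughly 5 seconds, not 10, because the event loop interleaves the two suspended coroutines.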
To give specific points on how and why:

FastAPI runs an async event loop that coordinates the async path functions, and a threadpool (a group of workers that execute tasks in parallel) for the synchronous path functions.
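That second part matters for your ~2-minute predict route: an async def path function runs on the event loop itself, so a long blocking call inside it stalls every other request until it returns. Declaring the path function with plain def instead tells FastAPI to run it in the threadpool. A hedged sketch (slow_model_inference is a made-up stand-in for the real utils.predict_single_image pipeline):

import time

from fastapi import FastAPI

app = FastAPI()

def slow_model_inference(filename: str):
    # Stand-in for the real ~2-minute blocking model prediction.
    time.sleep(120)
    return 1, [0.9, 0.1]

@app.get("/predict/")
def predict(filename: str):  # plain `def`, not `async def`
    # Synchronous path functions are executed in a worker thread from
    # FastAPI's threadpool, so the event loop stays free and other
    # users' uploads and predictions keep being served in the meantime.
    prediction, probs = slow_model_inference(filename)
    return {"Probs": probs, "Prediction": prediction}

With the original async def version, two simultaneous predictions would effectively queue up behind each other; with the def version, they run on two worker threads in parallel (subject to Python's GIL for pure-Python work, though most ML inference releases it inside the native libraries).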
When one user makes a request, FastAPI doesn't wait for that request to complete before handling the next one. (It's like a restaurant with multiple chefs: requests are the orders, and they are prepared concurrently by the threadpool.) This is known as asynchronous handling.
Since your app doesn't have any authentication, FastAPI treats all requests equally, but it never mixes them up: each response is written back over the same connection its request arrived on, so every user gets the answer to their own request. Without RBAC, authentication, or any other user-identification mechanism, requests are simply processed in a first-come, first-served fashion.
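One caveat to that isolation, illustrated with a sketch of the hazard rather than of your full app: it applies to each path function's local variables, while module-level globals, like current_filename in your upload route, are shared by all in-flight requests, so under concurrency user B's upload can overwrite the value user A just stored:

from fastapi import FastAPI

app = FastAPI()

current_filename = None  # module-level: shared by ALL in-flight requests

@app.post("/uploadfile/")
async def create_upload_file(filename: str):
    global current_filename
    current_filename = filename  # race: another request may overwrite this
    local_copy = filename        # function-local: private to this request
    return {"filename": local_copy}

Your frontend already side-steps this by passing the filename back to /predict/ explicitly in the query string, which is the safer, per-request pattern.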
In production, user management and authentication are crucial to keep users' data separated and secure.
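One way to keep users' files separated even before adding real authentication (a sketch with a made-up uuid naming scheme, not something your code does today) is to have the server generate a unique id per upload and have the client use that id in the predict call:

import shutil
import uuid
from pathlib import Path

from fastapi import FastAPI, File, UploadFile

app = FastAPI()
SAVED_IMAGES = Path(__file__).parents[1] / "saved_images"

@app.post("/uploadfile/")
async def create_upload_file(file: UploadFile = File(...)):
    # Give every upload a unique, server-generated name so two users who
    # both upload "scan.png" at the same time cannot overwrite each other.
    file_id = f"{uuid.uuid4().hex}_{file.filename}"
    with (SAVED_IMAGES / file_id).open("wb") as buffer:
        shutil.copyfileobj(file.file, buffer)
    # The client sends this file_id back to /predict/, which ties each
    # prediction to the right stored file even without authentication.
    return {"file_id": file_id}

Each prediction request is then unambiguously tied to one stored file, regardless of how many anonymous users are uploading at once.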
Hope this helps and let us know if you have any queries!