I have a server with 2 vCPU cores and 7 GB of RAM running Nginx 1.18. When I test the robustness of the server with a heavy HTTP load of about 300 requests per second, I get 429 errors after a few seconds.
I know that a 429 error means a client has made too many requests, i.e. it comes from rate limiting. The documentation for the limit_rate directive says that a value of 0 disables the rate limiting feature. I also tried this article from the Nginx blog, but I can't solve my problem.
Here is my current Nginx configuration; /etc/nginx/nginx.conf itself is left at the defaults:
server {
    listen 80;
    listen [::]:80;
    server_name _;
    root /var/www/laravel-rsync/public;

    add_header X-Frame-Options "SAMEORIGIN";
    add_header X-Content-Type-Options "nosniff";

    index index.php;
    charset utf-8;

    limit_rate 0;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location = /favicon.ico { access_log off; log_not_found off; }
    location = /robots.txt  { access_log off; log_not_found off; }

    error_page 404 /index.php;

    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php/php8.1-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
        include fastcgi_params;
    }

    location ~ /\.(?!well-known).* {
        deny all;
    }
}
The answer was that Laravel applies the throttle middleware to all /api routes by default. Commenting out that middleware makes the 429s go away.
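For context, in a stock Laravel 8/9 skeleton (consistent with the php8.1-fpm socket above) the api middleware group applies throttle:api, and the default limiter allows only 60 requests per minute per user or IP, so a 300 req/s load trips it almost immediately. A sketch of the relevant app/Http/Kernel.php entry; your file may differ:

// app/Http/Kernel.php (stock Laravel skeleton, sketch)
protected $middlewareGroups = [
    'api' => [
        // 'throttle:api',   // commented out: disables rate limiting on /api routes
        \Illuminate\Routing\Middleware\SubstituteBindings::class,
    ],
];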
Nginx is not limiting the requests here: limit_rate only throttles the bandwidth of responses, and this configuration has no limit_req or limit_conn zones, so the 429s come from the Laravel backend.
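If you want to keep some protection outside of load tests, raising the limit is arguably safer than removing the middleware entirely. A sketch of the default limiter in app/Providers/RouteServiceProvider.php, with the per-minute ceiling raised to an assumed test value that covers 300 req/s (18,000 req/min):

// app/Providers/RouteServiceProvider.php (sketch; 20000 is an assumed test ceiling)
use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\RateLimiter;

protected function configureRateLimiting()
{
    RateLimiter::for('api', function (Request $request) {
        // Default is Limit::perMinute(60); 300 req/s needs ~18000/min of headroom.
        return Limit::perMinute(20000)->by($request->user()?->id ?: $request->ip());
    });
}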