I created a simple test that calls an HTTP endpoint, configured to fire 1000 requests at once, as shown in this picture:
The test took 75 seconds to run, which means the throughput should be 1000/75 ≈ 13.3 requests per second. However, the Summary Report says it is 4.8 requests per second.
Why?
"which means the throughput should be 1000/75 = 13.3 requests"
According to the glossary:
Throughput is calculated as requests/unit of time. The time is calculated from the start of the first sample to the end of the last sample. This includes any intervals between samples, as it is supposed to represent the load on the server. The formula is:
Throughput = (number of requests) / (total time).
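As a sanity check, that formula can be applied by hand to the sample timestamps. The sketch below (hypothetical data, not JMeter code, using columns similar to the timeStamp and elapsed fields of a .jtl results file) computes throughput from the start of the first sample to the end of the last one:

```python
# Hypothetical sample data: (start_timestamp_ms, elapsed_ms) per request,
# mimicking the timeStamp and elapsed columns of a JMeter .jtl file.
samples = [
    (0, 500),         # first sample starts at t = 0 ms
    (30_000, 400),
    (74_000, 1_000),  # last sample ends at t = 75_000 ms
]

first_start = min(start for start, _ in samples)
last_end = max(start + elapsed for start, elapsed in samples)
total_time_s = (last_end - first_start) / 1000  # 75 seconds

# Throughput = (number of requests) / (total time),
# including the idle gaps between samples.
throughput = len(samples) / total_time_s
print(round(throughput, 2))
```

Note that the gaps between samples count toward the total time, which is why throughput reported over the whole run can be much lower than the request rate during a burst.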
So it should be something like 4150 / 75 ≈ 55.3. However, given that you have 1000 threads and no loops, you should have only 1000 results, so try "clearing" the results and re-running:
Also consider running your test with JMeter in command-line non-GUI mode, as GUI mode is intended for test development and debugging; when it comes to test execution you should not be using the JMeter GUI.
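For example, a non-GUI run looks like this (test.jmx, result.jtl and the report folder are placeholder paths):

```shell
# Run the test plan in non-GUI mode:
#   -n  non-GUI mode
#   -t  path to the .jmx test plan
#   -l  file to write sample results to
#   -e  generate the HTML dashboard report after the run
#   -o  output folder for that report (must be empty or not exist)
jmeter -n -t test.jmx -l result.jtl -e -o report
```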