I'm attempting to distribute a list of pre-defined request payloads across all the Users that are spawned in my run. Conceptually, I want to split my list of N requests across U Users such that each request is issued to the server exactly once.
For a single process I can achieve this by simply assigning a unique id to each User in the __init__
method - e.g.:
class MyUser(HttpUser):
    count = 0

    def __init__(self, environment):
        super().__init__(environment)
        self.user_id = MyUser.count
        MyUser.count += 1
However, when using multiple processes (via --processes=P), each process obviously has its own instance of MyUser.count, and hence the user_ids are not unique.
Is there some kind of centralised mechanism I can use to assign IDs (or some existing unique ID)?
Yes! There are multiple ways. You can use the worker index, self.environment.runner.worker_index (it starts at zero and is contiguous, unless workers disconnect during the run).
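For instance, here is a minimal sketch of that approach. NUM_WORKERS, PAYLOADS and the host are my own placeholders, and NUM_WORKERS must match the number of worker processes you actually launch:

from locust import HttpUser, task

NUM_WORKERS = 4  # hypothetical: must match your --processes / worker count
PAYLOADS = [f"payload-{i}" for i in range(100)]  # hypothetical pre-defined payloads

class MyUser(HttpUser):
    host = "http://localhost:8080"  # hypothetical target

    count = 0  # per-process counter, as in the question

    def __init__(self, environment):
        super().__init__(environment)
        # Interleave the per-process counter with the worker index to get a
        # globally unique ID (LocalRunner has no worker_index, hence the getattr).
        worker_index = getattr(environment.runner, "worker_index", 0)
        self.user_id = MyUser.count * NUM_WORKERS + worker_index
        MyUser.count += 1

    @task
    def send_payload(self):
        # Each User only ever issues the payload assigned to its unique ID.
        self.client.get(f"/?payload={PAYLOADS[self.user_id]}")

With four workers, worker 0's users get IDs 0, 4, 8, ..., worker 1's get 1, 5, 9, ..., and so on, so no ID repeats across processes.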
Or you can use messaging to provision data from the master to the workers during the test: either write your own communication (https://docs.locust.io/en/stable/running-distributed.html#communicating-across-nodes; a sketch of that approach follows the example below), or use the locust-plugins Distributor, which wraps any iterator:
import csv

from locust import HttpUser, events, task
from locust.runners import WorkerRunner
from locust_plugins.distributor import Distributor

product_distributor = None

@events.init.add_listener
def on_locust_init(environment, **_kwargs):
    global product_distributor
    if isinstance(environment.runner, WorkerRunner):
        # Workers receive their data from the master, so no local iterator.
        product_iterator = None
    else:
        product_iterator = csv.reader(open("products.csv"))
    product_distributor = Distributor(environment, product_iterator)

class MyUser(HttpUser):
    @task
    def my_task(self) -> None:
        product = next(product_distributor)
        self.client.get(f"/?product={product[0]}")