I was able to complete the basic tutorial in Veins.
In my simulation, 12 cars broadcast messages to each other. I want to compute the delay associated with every message. I am trying to achieve it in the following manner:
...
wsm->setDelayTime(simTime().dbl());
populateWSM(wsm);
sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service));
...
...
delayVector.record(simTime().dbl()-wsm->getDelayTime());
...
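For context, here is a minimal sketch of where the two snippets above could live. It assumes Veins 5.x names (DemoBaseApplLayer, DemoSafetyMessage, onWSM), a user-added double field delayTime in the message's .msg definition, and a cOutVector member delayVector; the class name MyVeinsApp and the header path are only placeholders.

#include "veins/modules/application/traci/MyVeinsApp.h" // placeholder application header

using namespace veins;

void MyVeinsApp::handleSelfMsg(cMessage* msg)
{
    DemoSafetyMessage* wsm = new DemoSafetyMessage();
    wsm->setDelayTime(simTime().dbl()); // stamp the creation time of the message
    populateWSM(wsm);
    // schedule the broadcast (see the update below regarding the second argument)
    sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service));
}

void MyVeinsApp::onWSM(BaseFrame1609_4* frame)
{
    DemoSafetyMessage* wsm = check_and_cast<DemoSafetyMessage*>(frame);
    // delay = reception time minus the stamped creation time
    delayVector.record(simTime().dbl() - wsm->getDelayTime());
}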
In the picture below you can see the delay w.r.t. node[0]. Two things puzzle me:
Update
I have figured out that, since all 12 cars broadcast simultaneously, computeAsynchronousSendingTime(1, ChannelType::service) returns a bigger delay for each subsequent car. I can circumvent the issue by using sendDown(wsm). However, in that case not all messages are delivered, because a car tries to receive a packet while it is transmitting. So I would like to update the question: how do I simulate the most realistic scenario, with reasonable delay and packet loss?
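To spell out the two variants being compared (same placeholder names as in the sketch above):

// hands the frame to the MAC immediately; with 12 cars broadcasting at the
// same moment, a car cannot receive while it is transmitting, so some
// messages are never delivered
sendDown(wsm);

// queues the frame for an absolute sending time; later cars are assigned
// later slots, so the recorded "delay" grows with the number of senders
sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service));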
If somebody comes across a similar issue: computeAsynchronousSendingTime(1, ChannelType::service) returns the absolute simulation time at which a message should be sent, whereas sendDelayedDown() expects a relative delay. Thus, one would have to call sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service) - simTime());
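In context, the corrected call could look like this (same placeholder names as above); only the second argument changes:

// convert the absolute target time into a delay relative to now
simtime_t target = computeAsynchronousSendingTime(1, ChannelType::service);
sendDelayedDown(wsm, target - simTime());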