
Simulate a Massive Broadcasting Scenario with a Reasonable Delay and Packet Loss


I was able to complete the basic Veins tutorial. In my simulation, 12 cars broadcast messages to each other. I want to compute the delay associated with every message, and I am trying to achieve it in the following manner (a fuller sketch of where these snippets live appears after the list):

  • Save the time when the transmission begins and send the packet
...
wsm->setDelayTime(simTime().dbl()); // store the send timestamp in a custom message field
populateWSM(wsm);
sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service));
...
  • At the Rx side, compute the delay and save it
...
delayVector.record(simTime().dbl() - wsm->getDelayTime()); // delay = reception time - send timestamp
...
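
For context, here is roughly where the receive-side snippet would live. This is only a sketch: it assumes Veins 5 (an application class derived from DemoBaseApplLayer), a cOutVector member named delayVector, and a custom message class, called MyMessage here, whose .msg definition adds the double field delayTime; none of these names are part of stock Veins.

// Sketch, assuming Veins 5: MyMessage and its delayTime field come from a custom
// .msg definition, and delayVector is a cOutVector member of the application class.
void MyAppl::onWSM(veins::BaseFrame1609_4* frame)
{
    if (auto* wsm = dynamic_cast<MyMessage*>(frame)) {
        // delay = reception time minus the send timestamp stored by the transmitter
        delayVector.record(simTime().dbl() - wsm->getDelayTime());
    }
}

delayVector would typically be given its name once, e.g. with delayVector.setName("delay") in initialize().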

In the picture below you can see the delay w.r.t. node[0]. Two things puzzle me:

  1. Why is the delay in the range of seconds? I would expect it to be in the range of milliseconds.
  2. Why does the delay increase with the simulation time?

[Plot: recorded delay for node[0] over simulation time]

Update

I have figured out that since all 12 cars start broadcasting simultaneously, computeAsynchronousSendingTime(1, ChannelType::service) returns an increasingly large sending time for each subsequent car. I can circumvent the issue by using sendDown(wsm). However, in that case not all messages are delivered, because a car cannot receive a packet while it is transmitting. So I would like to update the question: how do I simulate the most realistic scenario, with a reasonable delay and packet loss?
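
For what it is worth, one general way to keep the per-message delay small while still letting the MAC/PHY model contention and packet loss is to desynchronize the senders with a small random offset instead of the absolute time returned by computeAsynchronousSendingTime(). This is only a sketch of that idea, not something prescribed by Veins; MyMessage is the hypothetical message class from above and maxJitter is an illustrative value.

// Sketch: add a small random jitter per car so that the 12 broadcasts are not
// scheduled at exactly the same instant; collisions and losses remain possible,
// but no multi-second scheduling offset accumulates.
const double maxJitter = 0.01; // 10 ms, illustrative
MyMessage* wsm = new MyMessage();
wsm->setDelayTime(simTime().dbl()); // stamp the creation time
populateWSM(wsm);
sendDelayedDown(wsm, uniform(0, maxJitter));

With this, the recorded delay is the random offset plus the MAC access and propagation delay, so it should end up in the millisecond range instead of growing into seconds.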


Solution

  • If somebody comes across a similar issue: computeAsynchronousSendingTime(1, ChannelType::service) returns the absolute simulation time at which a message should be sent, whereas sendDelayedDown() expects a delay relative to the current time. Thus, one has to call sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service) - simTime());
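
Applied to the transmit-side snippet from the question, the fix looks roughly like this (same hypothetical MyMessage class as in the sketches above):

MyMessage* wsm = new MyMessage();
wsm->setDelayTime(simTime().dbl());
populateWSM(wsm);
// computeAsynchronousSendingTime() returns an absolute time; sendDelayedDown() expects an offset
sendDelayedDown(wsm, computeAsynchronousSendingTime(1, ChannelType::service) - simTime());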