
network: prevent time overflow in DataRate.

I noticed that changing the time resolution from picoseconds to femtoseconds significantly changes the outcome of my simulation. The problem turned out to be in DataRate::CalculateBitsTxTime(), which overflows Time for an input of 21000 bits. The proposed fix is to compute the number of seconds with int64x64_t before instantiating the Time.
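
For context, here is a minimal standalone sketch of the failure mode and of the divide-first remedy described above. This is not the actual patch: the 1 Mb/s rate, the femtosecond scaling constant, and the wrap-around arithmetic are assumptions made purely for the demonstration; only the ns-3 types Time, int64x64_t, and the FemtoSeconds()/Seconds() helpers are taken as given.

```cpp
// Illustrative sketch only (not the actual ns-3 patch). It contrasts a
// scale-to-femtoseconds-first computation that wraps 64-bit arithmetic
// with the divide-first int64x64_t approach described above.
#include "ns3/core-module.h"

#include <cstdint>
#include <iostream>

using namespace ns3;

int
main()
{
    Time::SetResolution(Time::FS); // femtosecond resolution, as in the report

    const uint64_t bits = 21000;  // the input reported to overflow
    const uint64_t bps = 1000000; // assumed link rate: 1 Mb/s

    // Problematic pattern: scale bits to femtoseconds *before* dividing by
    // the rate. 21000 * 1e15 = 2.1e19 exceeds 2^64 - 1 (~1.84e19), so the
    // unsigned product wraps and the resulting Time is silently wrong.
    const uint64_t wrapped = bits * 1000000000000000ULL / bps;
    Time bad = FemtoSeconds(wrapped); // ~0.00255 s instead of 0.021 s

    // Proposed pattern: divide first in 64.64 fixed point, then build the
    // Time from seconds. The quotient (0.021 s) is nowhere near overflow.
    Time good = Seconds(int64x64_t(bits) / int64x64_t(bps));

    std::cout << "wrapping path:   " << bad.As(Time::S) << std::endl
              << "int64x64_t path: " << good.As(Time::S) << std::endl;
    return 0;
}
```

Dividing by the rate before converting to the Time resolution keeps every intermediate value small; int64x64_t carries 64 fractional bits, so sub-second precision survives the early division.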
