wifi error model: coded bits vs. data bits
The interference helper class calculates the chunk success rate over the number of coded bits (phyRate * duration). My understanding is that the error models compute a bit error rate for data (information) bits, i.e., after decoding. If that is correct, the current implementation effectively assumes that the error model returns something like an "equivalent bit error rate" for an uncoded frame that has the size of the coded frame, which seems odd to me.
Shouldn't phyRate be replaced by dataRate?
https://gitlab.com/nsnam/ns-3-dev/blob/master/src/wifi/model/interference-helper.cc#L265
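For illustration, here is a small standalone sketch (not ns-3 code) of how I read the calculation: the chunk success rate is roughly (1 - BER)^nbits, and the question is whether nbits should be derived from phyRate (coded bits) or dataRate (data bits). The rates, duration, and BER below are made-up example values, and ChunkSuccessRate is just a stand-in for the error model's GetChunkSuccessRate.

// Standalone sketch, not ns-3 code: shows how the choice of rate changes
// the exponent nbits used in the chunk success rate calculation.
#include <cmath>
#include <cstdint>
#include <iostream>

// Simplified stand-in for the error model's GetChunkSuccessRate:
// success probability of nbits bits assuming an independent per-bit error rate.
double ChunkSuccessRate(double ber, uint64_t nbits)
{
  return std::pow(1.0 - ber, static_cast<double>(nbits));
}

int main()
{
  const double duration = 100e-6; // chunk duration in seconds (example value)
  const double dataRate = 6e6;    // information bits per second (example: 6 Mb/s, rate 1/2)
  const double phyRate  = 12e6;   // coded bits per second for the same mode
  const double ber      = 1e-4;   // hypothetical BER for data bits

  const uint64_t codedBits = static_cast<uint64_t>(phyRate * duration);  // what the linked line uses
  const uint64_t dataBits  = static_cast<uint64_t>(dataRate * duration); // what the question proposes

  std::cout << "CSR with coded bits (" << codedBits << "): "
            << ChunkSuccessRate(ber, codedBits) << "\n";
  std::cout << "CSR with data bits  (" << dataBits << "): "
            << ChunkSuccessRate(ber, dataBits) << "\n";
  return 0;
}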