Fix BatchNorm error when the current batch contains a single sample
For instance, with batch_size = 8 and 1945 samples, we get 243 * 8 = 1944 samples in full batches, and the last batch contains only one sample. In this case, BatchNorm fails with "Expected more than 1 value per channel when training"
(cf. https://github.com/pytorch/pytorch/issues/4534).
To prevent this, one solution is to discard the incomplete last batch by passing the drop_last=True
option to the DataLoader.
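A minimal sketch of the batch arithmetic behind this fix, using the sample counts from the example above (pure Python, no torch dependency; the DataLoader call in the comment is the intended usage):

```python
# Hypothetical dataset size and batch size from the example above.
n_samples, batch_size = 1945, 8

# Without drop_last, the final batch holds the remainder of the division:
batch_sizes = [batch_size] * (n_samples // batch_size)
if n_samples % batch_size:
    batch_sizes.append(n_samples % batch_size)
print(len(batch_sizes), batch_sizes[-1])  # 244 batches, the last of size 1

# With drop_last=True the incomplete batch is skipped entirely, e.g.:
#   loader = torch.utils.data.DataLoader(dataset, batch_size=8, drop_last=True)
batch_sizes_dropped = [batch_size] * (n_samples // batch_size)
print(len(batch_sizes_dropped))  # 243 full batches, none of size 1
```

The size-1 batch is what trips BatchNorm, since it cannot compute per-channel statistics from a single value; dropping it sacrifices at most batch_size - 1 samples per epoch.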