
Fix training of Unet

Main changes:

  • Disable L2 regularization, as it was detrimental to training.
  • Increase the batch size from 1 to 4. Any value greater than 1 works, but a batch size of 1 causes training problems. Is this an issue with BatchNormalization?
  • Default network parameters now depend on the problem being solved: exit-wave reconstruction needs a smaller network (see the sketch after this list).

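The sketch below illustrates how these three changes could fit together in a Keras-style setup: no L2 kernel regularizer, problem-dependent default network size, and a batch size greater than 1 so BatchNormalization sees meaningful per-batch statistics. The function name `build_unet`, the problem key `"exit_wave_reconstruction"`, the filter counts, and the input shape are all illustrative assumptions, not the repository's actual API.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_unet(problem: str, input_shape=(256, 256, 1)) -> Model:
    """Build a U-Net whose default size depends on the problem.

    Exit-wave reconstruction gets a smaller, shallower network;
    the exact sizes here are placeholders.
    """
    if problem == "exit_wave_reconstruction":
        base_filters, depth = 16, 3   # smaller network for exit waves
    else:
        base_filters, depth = 32, 4

    def conv_block(x, filters):
        # No kernel_regularizer: L2 regularization is disabled because
        # it was detrimental to training.
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.BatchNormalization()(x)
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.BatchNormalization()(x)
        return x

    inputs = layers.Input(shape=input_shape)
    x, skips = inputs, []
    for level in range(depth):                      # contracting path
        x = conv_block(x, base_filters * 2 ** level)
        skips.append(x)
        x = layers.MaxPooling2D(2)(x)

    x = conv_block(x, base_filters * 2 ** depth)    # bottleneck

    for level in reversed(range(depth)):            # expanding path
        x = layers.Conv2DTranspose(base_filters * 2 ** level, 2,
                                   strides=2, padding="same")(x)
        x = layers.Concatenate()([x, skips[level]])
        x = conv_block(x, base_filters * 2 ** level)

    outputs = layers.Conv2D(1, 1, activation=None)(x)
    return Model(inputs, outputs)

model = build_unet("exit_wave_reconstruction")
model.compile(optimizer="adam", loss="mse")
# Batch size raised from 1 to 4; any value > 1 avoids the degenerate
# per-batch statistics that BatchNormalization computes with a single sample.
# model.fit(train_data, batch_size=4, ...)
```

With a batch size of 1, each BatchNormalization layer normalizes every activation against a "batch" of one sample, so the normalized output collapses toward zero during training while the running statistics used at inference diverge from it, which is consistent with the training problems observed here.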