Posts
-
Jun 24, 2019
Explainability of decisions made by deep neural networks is valuable because it allows models to be validated and improved. This post introduces an approach to explaining semantic segmentation networks by means of layer-wise relevance propagation.
-
May 6, 2019
-
Oct 16, 2018
Differences between categorical cross-entropy and binary cross-entropy loss functions.
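The core distinction can be sketched in a few lines of plain Python (function names are illustrative, not from the post): categorical cross-entropy assumes mutually exclusive classes, so only the probability assigned to the true class matters, while binary cross-entropy treats each label independently and also penalizes probability mass placed on the negatives.

```python
import math

def categorical_crossentropy(p_true, p_pred):
    """For mutually exclusive classes (one-hot target, softmax output):
    only the probability assigned to the true class is penalized."""
    return -sum(t * math.log(q) for t, q in zip(p_true, p_pred))

def binary_crossentropy(y_true, y_pred):
    """For independent binary labels (sigmoid outputs): every label
    contributes, including the mass assigned to the negatives."""
    return -sum(t * math.log(q) + (1 - t) * math.log(1 - q)
                for t, q in zip(y_true, y_pred)) / len(y_true)
```

For a one-hot target [1, 0, 0] and prediction [0.7, 0.2, 0.1], the categorical loss is just -log(0.7), whereas the binary loss additionally rewards the low probabilities on the two negative labels.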
-
Jul 12, 2018
Run make automatically on source file changes.
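The post itself may use a dedicated file-watching mechanism such as inotify; as a rough, portable stand-in, the idea can be sketched with a simple mtime-polling loop (all names and the glob patterns here are hypothetical):

```python
import glob
import os
import subprocess
import time

def snapshot(paths):
    """Map each existing path to its last-modification time."""
    return {p: os.stat(p).st_mtime for p in paths if os.path.exists(p)}

def watch(paths, on_change, interval=0.5, max_iterations=None):
    """Poll file mtimes and call on_change whenever any of them changes.
    max_iterations only exists to make the loop finite and testable."""
    seen = snapshot(paths)
    done = 0
    while max_iterations is None or done < max_iterations:
        time.sleep(interval)
        current = snapshot(paths)
        if current != seen:
            seen = current
            on_change()
        done += 1

if __name__ == "__main__":
    # Hypothetical source set; adjust the globs to your project.
    sources = glob.glob("*.c") + glob.glob("*.h") + ["Makefile"]
    watch(sources, lambda: subprocess.run(["make"]))
```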
-
Jun 5, 2018
Gradient averaging over multiple training steps is a very useful technique that can help you overcome the memory limitations of your GPU by simulating larger batch sizes.
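The idea can be sketched framework-free with plain-Python SGD (the function names and `grad_fn` are illustrative, not the post's code): gradients from several small micro-batches are summed, and the parameters are updated once with their average, as if one large batch had been processed.

```python
def train_with_accumulation(params, batches, grad_fn, accum_steps, lr):
    """Plain SGD where gradients are averaged over `accum_steps`
    micro-batches before each parameter update, simulating a batch
    accum_steps times larger than fits in GPU memory at once."""
    accum = [0.0] * len(params)
    for step, batch in enumerate(batches, start=1):
        grads = grad_fn(params, batch)            # backprop on one micro-batch
        accum = [a + g for a, g in zip(accum, grads)]
        if step % accum_steps == 0:               # time to apply the update
            params = [p - lr * a / accum_steps for p, a in zip(params, accum)]
            accum = [0.0] * len(params)
    return params
```

In a real framework the same pattern applies: keep an accumulator per variable, add each micro-batch's gradients into it, and apply the optimizer step only every `accum_steps` iterations.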
-
Apr 28, 2018
Development of an automatic segmentation method requires careful selection of evaluation criteria, which should ideally correspond to the expected clinical utility. This post describes various approaches to assessing tumor segmentations.
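To give a flavor of such criteria, here is a minimal sketch (not the post's code) of two common voxel-overlap measures for binary tumor masks: the Dice score and sensitivity.

```python
def confusion_counts(truth, pred):
    """Voxel-wise counts for binary masks (1 = tumor, 0 = background)."""
    tp = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 0)
    return tp, fp, fn

def dice_score(truth, pred):
    """Overlap-based criterion: 2*TP / (2*TP + FP + FN)."""
    tp, fp, fn = confusion_counts(truth, pred)
    return 2 * tp / (2 * tp + fp + fn)

def sensitivity(truth, pred):
    """Fraction of true tumor voxels that were detected: TP / (TP + FN)."""
    tp, _, fn = confusion_counts(truth, pred)
    return tp / (tp + fn)
```

Which measure matters clinically depends on the use case: sensitivity penalizes missed tumor tissue, while Dice balances missed tissue against over-segmentation.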
-
Feb 18, 2018
See how the Dice and categorical cross-entropy loss functions perform when training a semantic segmentation model.
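As a rough sketch of the Dice side of that comparison (assumed formulation, not the post's code), a differentiable "soft" Dice loss over a flattened binary mask looks like this:

```python
def soft_dice_loss(y_true, y_pred, eps=1e-7):
    """1 - soft Dice coefficient over a flattened binary mask.
    Unlike per-pixel cross-entropy, this is a single global overlap
    term, which makes it less sensitive to the class imbalance
    between foreground and background pixels."""
    inter = sum(t * p for t, p in zip(y_true, y_pred))
    total = sum(y_true) + sum(y_pred)
    return 1.0 - (2.0 * inter + eps) / (total + eps)
```

A perfect prediction gives a loss of 0, and predicting only background on a mask that contains foreground drives the loss toward 1, regardless of how many background pixels the image contains.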
-
Jan 29, 2018
Get more insights into TensorFlow models.
-
Jan 18, 2018
Dropout is a very popular regularization technique that can be injected into most neural network architectures. Together with other methods such as L1-/L2-norm regularization and soft weight sharing, it helps deep neural nets fight overfitting.
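The mechanism behind dropout can be sketched in a few lines (a plain-Python illustration of the common "inverted" variant, not the post's code):

```python
import random

def dropout(activations, rate, training=True, rng=random):
    """Inverted dropout: during training, zero each unit with
    probability `rate` and rescale survivors by 1/(1-rate) so the
    expected activation is unchanged; at test time it is a no-op."""
    if not training or rate == 0.0:
        return list(activations)
    keep = 1.0 - rate
    return [a / keep if rng.random() < keep else 0.0 for a in activations]
```

The 1/(1-rate) rescaling during training is what lets the layer be an identity at inference time, instead of requiring the test-time weight scaling used in the original formulation.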