Posts

  • LRP for semantic segmentation

    Explainability of decisions made by deep neural networks is valuable because it enables validation and improvement of models. This post introduces an approach to explaining semantic segmentation networks by means of layer-wise relevance propagation (LRP).
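
    A minimal sketch of the core idea for a single dense layer, using the epsilon rule; the function name and the NumPy formulation are illustrative, not the post's actual code:

    ```python
    import numpy as np

    def lrp_dense_epsilon(a, w, b, relevance_out, eps=1e-6):
        """Epsilon-rule LRP backward pass through one dense layer (illustrative).

        a: input activations (in_dim,), w: weights (in_dim, out_dim),
        b: biases (out_dim,), relevance_out: relevance assigned to the outputs.
        Returns the relevance redistributed to the inputs.
        """
        z = a @ w + b                              # forward pre-activations
        z = z + eps * np.where(z >= 0, 1.0, -1.0)  # stabilizer avoids division by zero
        s = relevance_out / z                      # relevance per unit of pre-activation
        return a * (w @ s)                         # credit inputs by their contribution a_i * w_ij
    ```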
  • Cheatsheet

  • Cross-entropy based loss functions

    Differences between categorical cross-entropy and binary cross-entropy loss functions.
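
    A quick illustration of the difference using Keras's built-in losses; the numbers are made up:

    ```python
    import tensorflow as tf

    # Categorical cross-entropy: exactly one of C mutually exclusive classes.
    y_true = tf.constant([[0.0, 1.0, 0.0]])  # one-hot target
    y_pred = tf.constant([[0.2, 0.7, 0.1]])  # softmax output (probabilities)
    cce = tf.keras.losses.CategoricalCrossentropy()
    print(float(cce(y_true, y_pred)))        # -log(0.7): only the true class contributes

    # Binary cross-entropy: an independent yes/no decision per output unit.
    y_true = tf.constant([[1.0, 0.0, 1.0]])  # multi-label target
    y_pred = tf.constant([[0.9, 0.2, 0.8]])  # sigmoid outputs
    bce = tf.keras.losses.BinaryCrossentropy()
    print(float(bce(y_true, y_pred)))        # averages -log terms over all three outputs
    ```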
  • makeauto

    Run make automatically on source file changes.
  • Gradient averaging with TensorFlow

    Gradient averaging over multiple training steps is a useful technique that lets you train with an effectively larger batch size than your GPU memory allows.
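
    A minimal TF2 sketch of the technique (often called gradient accumulation); the model, optimizer, and ACCUM_STEPS value below are placeholders, not the post's actual code:

    ```python
    import tensorflow as tf

    # Placeholder model and optimizer; shapes and learning rate are illustrative.
    model = tf.keras.Sequential([tf.keras.layers.Dense(10, input_shape=(32,))])
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

    ACCUM_STEPS = 4  # average gradients over this many mini-batches

    # One accumulator variable per trainable weight, initialized to zero.
    accum = [tf.Variable(tf.zeros_like(v), trainable=False)
             for v in model.trainable_variables]

    def train_step(x, y, step):
        with tf.GradientTape() as tape:
            loss = loss_fn(y, model(x, training=True))
        grads = tape.gradient(loss, model.trainable_variables)
        for acc, g in zip(accum, grads):
            acc.assign_add(g / ACCUM_STEPS)        # accumulate the running average
        if (step + 1) % ACCUM_STEPS == 0:          # apply once per ACCUM_STEPS batches
            optimizer.apply_gradients(zip(accum, model.trainable_variables))
            for acc in accum:
                acc.assign(tf.zeros_like(acc))     # reset for the next accumulation window
        return loss
    ```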
  • On Evaluation of Tumor Segmentation

    Development of an automatic segmentation method requires careful selection of evaluation criteria, which should ideally correspond to the expected clinical utility. This post describes various approaches to assessing tumor segmentation.
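
    As one concrete example of such a criterion, a minimal NumPy implementation of the Dice overlap between two binary masks (the function name and empty-mask convention are illustrative):

    ```python
    import numpy as np

    def dice_coefficient(pred, target):
        """Dice overlap between two binary masks; 1.0 is a perfect match."""
        pred, target = pred.astype(bool), target.astype(bool)
        intersection = np.logical_and(pred, target).sum()
        total = pred.sum() + target.sum()
        # Convention: two empty masks count as a perfect match.
        return 2.0 * intersection / total if total > 0 else 1.0
    ```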
  • Loss functions for semantic segmentation

    See how Dice and categorical cross-entropy loss functions perform when training a semantic segmentation model.
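
    For reference, one common formulation of a differentiable (soft) Dice loss in TensorFlow; the post's exact variant may differ:

    ```python
    import tensorflow as tf

    def soft_dice_loss(y_true, y_pred, eps=1e-6):
        """Soft Dice loss for one-hot targets and softmax predictions.

        Both tensors have shape (batch, height, width, classes).
        """
        axes = (1, 2)                                            # sum over spatial dimensions
        intersection = tf.reduce_sum(y_true * y_pred, axis=axes)
        denominator = tf.reduce_sum(y_true + y_pred, axis=axes)
        dice = (2.0 * intersection + eps) / (denominator + eps)  # per-class, per-sample Dice
        return 1.0 - tf.reduce_mean(dice)                        # 1 - Dice turns overlap into a loss
    ```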
  • Looking under the hood of TensorFlow models

    Get more insight into TensorFlow models.
  • Dropout

    Dropout is a very popular regularization technique which can be injected into most neural network architectures. Together with other methods such as L1-/L2-norm regularization and soft weight sharing, it helps deep neural networks fight overfitting.
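
    A minimal Keras example of injecting dropout between layers; the architecture and the 0.5 rate are illustrative:

    ```python
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dropout(0.5),  # randomly zeroes 50% of activations during training
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    # Dropout is active only in training mode; at inference it is a no-op because
    # Keras uses "inverted dropout" and rescales the kept activations during training.
    ```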