Semi-supervised Active Learning
Supervision: Prof. Dr. Nassir Navab,
Dr. Seong Tae Kim
Abstract
Training deep neural networks generally requires large amounts of labeled data. In practice, large sets of unlabeled data are usually available, but acquiring labels for them is time-consuming and expensive. Active Learning (AL) is a training protocol that aims to minimize the labeling effort in machine learning applications: AL algorithms sequentially query labels for the most informative data points in an unlabeled data set. Semi-supervised learning (SSL) additionally exploits unlabeled data during model training to improve performance. In this thesis, we will explore the combination of these two promising approaches for the efficient training of deep neural networks. In particular, we will investigate how the query selection criteria of AL algorithms have to be designed when they are used in conjunction with SSL algorithms.
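To make the interplay between the two approaches concrete, the following is a minimal sketch of one combined round: uncertainty-based query selection (AL) followed by confidence-thresholded pseudo-labeling (SSL). The toy model, the pool of unlabeled samples, and the parameters query_batch_size and confidence_threshold are illustrative assumptions, not part of the thesis specification.

import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy setup: a small classifier and a synthetic unlabeled pool.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
unlabeled_pool = torch.randn(1000, 20)

query_batch_size = 10        # labels requested from the annotator per round
confidence_threshold = 0.95  # minimum confidence for assigning a pseudo-label

model.eval()
with torch.no_grad():
    probs = F.softmax(model(unlabeled_pool), dim=1)

# AL query selection: pick the samples with the highest predictive entropy,
# i.e. the points the current model is most uncertain about.
entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
query_indices = entropy.topk(query_batch_size).indices  # sent for labeling

# SSL step: confidently predicted samples receive pseudo-labels and can be
# added to the training set without human annotation.
confidence, pseudo_labels = probs.max(dim=1)
pseudo_mask = confidence >= confidence_threshold
pseudo_mask[query_indices] = False  # queried points will get real labels

print(f"queried {len(query_indices)} samples for labeling")
print(f"pseudo-labeled {int(pseudo_mask.sum())} samples")

One design question this sketch already exposes, and which the thesis addresses, is how the query criterion should change once pseudo-labels are available: for example, whether highly uncertain points remain the most informative queries when SSL can already resolve the confidently predicted ones.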
Requirements:
- Good understanding of statistics and machine learning methods.
- Very good programming skills in Python & TensorFlow / PyTorch.
Location: