Awet Haileslassie Gebrehiwot presents Deep semi-supervised learning with consistency loss

On 2021-10-14 11:00:00 at https://feectu.zoom.us/j/97005445003
Reading group on "Deep semi-supervised learning with consistency loss"

Abstract: Consistency regularization obtains a synthetic label from the
model's predicted distribution after randomly perturbing the input or the
model function. It is widely used in deep semi-supervised learning, where it
is applied to all training examples, both labeled and unlabeled. We will go
through some recent papers that explore the concept of consistency
regularization.
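As a rough illustration of the idea (not code from any of the papers below),
the following minimal sketch assumes a PyTorch classifier `model` and a
hypothetical stochastic `augment` function that randomly perturbs its input:

import torch
import torch.nn.functional as F

def consistency_loss(model, x, augment):
    """Penalize disagreement between the model's predictions on two random
    perturbations of the same batch (labeled or unlabeled)."""
    with torch.no_grad():
        # Synthetic target: the predicted distribution on one perturbed view.
        target = F.softmax(model(augment(x)), dim=1)
    # Prediction on a second, independently perturbed view.
    log_pred = F.log_softmax(model(augment(x)), dim=1)
    # KL divergence between the two distributions, averaged over the batch.
    return F.kl_div(log_pred, target, reduction="batchmean")

Because the loss needs no ground-truth labels, it can be added to the
supervised objective for every training example.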

Papers:
Mean teachers are better role models: Weight-averaged consistency targets
improve semi-supervised deep learning results. A. Tarvainen and H. Valpola.
NeurIPS 2017.
Temporal Ensembling for Semi-Supervised Learning. S. Laine and T. Aila.
ICLR 2017.
FixMatch: Simplifying Semi-Supervised Learning with Consistency and
Confidence. K. Sohn et al. NeurIPS 2020.
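For a flavor of the last paper, here is a hedged sketch of a FixMatch-style
unlabeled loss: pseudo-label a weakly augmented view, keep only confident
predictions, and enforce consistency on a strongly augmented view. The
`weak_aug` and `strong_aug` functions and the threshold `tau` are hypothetical
stand-ins, not the paper's exact implementation:

import torch
import torch.nn.functional as F

def fixmatch_unlabeled_loss(model, x_u, weak_aug, strong_aug, tau=0.95):
    with torch.no_grad():
        probs = F.softmax(model(weak_aug(x_u)), dim=1)
        conf, pseudo = probs.max(dim=1)   # hard pseudo-labels per example
        mask = (conf >= tau).float()      # keep only confident predictions
    logits = model(strong_aug(x_u))       # strongly augmented view
    loss = F.cross_entropy(logits, pseudo, reduction="none")
    return (mask * loss).mean()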

Instructions for participants: The reading group studies the literature in the
field of pattern recognition and computer vision. At each meeting, one or more
papers are prepared for presentation by a single person, the presenter. The
meetings are open to anyone, regardless of their background. It is assumed
that everyone attending the reading group has at least briefly read the paper,
though not necessarily understood everything. Attendees should preferably send
questions about the unclear parts to the speaker at least one day in advance.
During the presentation we aim for a fruitful discussion, a critical analysis
of the paper, and brainstorming about creative extensions.

See the reading group page:
http://cmp.felk.cvut.cz/~toliageo/rg/index.html
Responsible person: Petr Pošík