Awet Haileslassie Gebrehiwot presents Deep semi-supervised learning with consistency loss

On 2021-10-14 at 11:00:00
Reading group on "Deep semi-supervised learning with consistency loss"

Abstract: Consistency regularization obtains a synthetic label from the
model's predicted distribution after randomly modifying the input or the model
function. It is widely used in deep semi-supervised learning, where it is
applied to all training examples, both labeled and unlabeled. We will go
through some recent papers that explore the concept of consistency
regularization.
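The idea in the abstract can be illustrated with a minimal NumPy sketch (the toy model, names, and noise perturbation below are illustrative assumptions, not taken from any of the papers): the loss penalizes disagreement between the model's predictions on an input and on a randomly perturbed copy of it, and needs no labels.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x, w):
    """Toy stand-in for a deep network: softmax over a linear map."""
    logits = x @ w
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def consistency_loss(x, w, noise_scale=0.1):
    """Mean-squared error between predictions on the clean input and on
    a randomly perturbed copy; works on unlabeled data."""
    p_clean = model(x, w)
    p_noisy = model(x + noise_scale * rng.standard_normal(x.shape), w)
    return float(np.mean((p_clean - p_noisy) ** 2))

x = rng.standard_normal((4, 3))  # a small unlabeled batch
w = rng.standard_normal((3, 2))  # toy model parameters
loss = consistency_loss(x, w)    # small non-negative scalar
```

In the papers discussed, the perturbation is typically data augmentation or dropout rather than Gaussian noise, and the "clean" branch may come from a teacher model, but the structure of the loss is the same.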

Mean teachers are better role models: Weight-averaged consistency targets
improve semi-supervised deep learning results. A. Tarvainen and H. Valpola.
NeurIPS 2017.
Temporal Ensembling for Semi-Supervised Learning. S. Laine and T. Aila.
ICLR 2017.
FixMatch: Simplifying Semi-Supervised Learning with Consistency and
Confidence. K. Sohn et al. NeurIPS 2020.
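FixMatch, the last paper in the list, combines consistency with a confidence threshold: the prediction on a weakly augmented view supplies a hard pseudo-label, and cross-entropy is applied to the strongly augmented view only for confident examples. A minimal sketch (function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def fixmatch_unlabeled_loss(p_weak, p_strong, threshold=0.95):
    """FixMatch-style unlabeled loss (sketch): hard pseudo-labels from
    the weak view, cross-entropy on the strong view, masked by confidence."""
    conf = p_weak.max(axis=1)       # confidence of the weak-view prediction
    pseudo = p_weak.argmax(axis=1)  # hard pseudo-labels
    mask = conf >= threshold        # keep only confident examples
    if not mask.any():
        return 0.0
    ce = -np.log(p_strong[mask, pseudo[mask]] + 1e-12)
    return float(ce.mean())

p_weak = np.array([[0.98, 0.02], [0.60, 0.40]])    # weak-view predictions
p_strong = np.array([[0.90, 0.10], [0.55, 0.45]])  # strong-view predictions
loss = fixmatch_unlabeled_loss(p_weak, p_strong)   # only row 0 passes the mask
```

In the paper the two views come from weak and strong data augmentation of the same unlabeled image; here they are given directly as probability vectors to keep the sketch self-contained.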

Instructions for participants: The reading group studies the literature in the
field of pattern recognition and computer vision. At each meeting one or more
papers are prepared for presentation by a single person, the presenter. The
meetings are open to anyone, regardless of their background. It is assumed
that everyone attending the reading group has, at least briefly, read the
paper, though not necessarily understood everything. Attendees should send
questions about the unclear parts to the speaker at least one day in advance.
During the presentation we aim to have a fruitful discussion, a critical
analysis of the paper, as well as brainstorming for creative extensions.

Responsible for content: Petr Pošík