Contrastive Learning for Lifted Networks

Christopher Zach (Chalmers University), Virginia Estellers (Microsoft)

Abstract
In this work we address supervised learning via lifted network formulations. Lifted networks are interesting because they allow training on massively parallel hardware and assign energy models to discriminatively trained neural networks. We demonstrate that training methods for lifted networks proposed in the literature have significant limitations, and therefore we propose to use a contrastive loss to train lifted networks. We show that this contrastive training approximates back-propagation in theory and in practice, and that it is superior to the regular training objective for lifted networks.

DOI
10.5244/C.33.163
https://dx.doi.org/10.5244/C.33.163

Files
Paper (PDF)

BibTeX
@inproceedings{BMVC2019,
  title     = {Contrastive Learning for Lifted Networks},
  author    = {Christopher Zach and Virginia Estellers},
  year      = {2019},
  month     = {September},
  pages     = {163.1--163.12},
  articleno = {163},
  numpages  = {12},
  booktitle = {Proceedings of the British Machine Vision Conference (BMVC)},
  publisher = {BMVA Press},
  editor    = {Kirill Sidorov and Yulia Hicks},
  doi       = {10.5244/C.33.163},
  url       = {https://dx.doi.org/10.5244/C.33.163}
}