Communication-Efficient Distributed Deep Metric Learning with Hybrid Synchronization
Other conference paper


Abstract: Deep metric learning is widely used in extreme classification and image retrieval because of its powerful ability to learn semantic low-dimensional embeddings of high-dimensional data. However, the heavy computational cost of mining valuable pairs or triplets of training data and of frequently updating models in existing deep metric learning approaches is a barrier to applying such methods to large-scale real-world settings in a distributed environment.
Moreover, existing distributed deep learning frameworks are not designed for deep metric learning tasks, because it is difficult to implement a smart mining policy for valuable training data on top of them. In this paper, we introduce a novel distributed framework to speed up the training of deep metric learning using multiple machines.
Specifically, we first design a distributed sampling method that finds hard-negative samples from a broader scope of candidate samples than a single-machine solution can. Then, we design a hybrid communication pattern and implement a decentralized data-parallel framework that reduces the communication workload while preserving the quality of the trained deep metric models. In experiments, we show excellent performance gains over a full spectrum of state-of-the-art deep metric learning models on multiple datasets in terms of image clustering and image retrieval tasks.
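The abstract does not spell out the distributed sampling step, so the following is only a hedged Python sketch of what hard-negative mining over a gathered candidate pool could look like; the function name, the label-masking scheme, and the use of plain NumPy concatenation as a stand-in for a distributed all-gather are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code): hard-negative mining over a pool of
# candidate embeddings collected from several workers, so that negatives are drawn
# from a broader scope than a single worker's local mini-batch.
import numpy as np

def hardest_negatives(anchors, anchor_labels, pool, pool_labels):
    """For each anchor, return the index of the nearest embedding in `pool`
    whose label differs from the anchor's label (its hard negative)."""
    # Pairwise squared Euclidean distances between anchors and the candidate pool.
    d2 = (np.sum(anchors ** 2, axis=1, keepdims=True)
          - 2.0 * anchors @ pool.T
          + np.sum(pool ** 2, axis=1))
    # Mask out same-label candidates so only true negatives can be selected.
    same_label = anchor_labels[:, None] == pool_labels[None, :]
    d2[same_label] = np.inf
    return np.argmin(d2, axis=1)

# Toy usage: two workers each contribute a local batch; concatenation stands in
# for the collective gather step a distributed implementation would perform.
rng = np.random.default_rng(0)
local_a, labels_a = rng.normal(size=(4, 8)), np.array([0, 0, 1, 1])
local_b, labels_b = rng.normal(size=(4, 8)), np.array([2, 2, 3, 3])
pool = np.concatenate([local_a, local_b])
pool_labels = np.concatenate([labels_a, labels_b])
print(hardest_negatives(local_a, labels_a, pool, pool_labels))
```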
All Author(s) List: Yuxin Su, Michael R. Lyu, Irwin King
Name of Conference: 27th ACM International Conference on Information and Knowledge Management (CIKM)
Start Date of Conference: 22/10/2018
End Date of Conference: 26/10/2018
Place of Conference: Torino
Country/Region of Conference: Italy
Proceedings Title: CIKM '18: Proceedings of the 27th ACM International Conference on Information and Knowledge Management
Year: 2018
Pages: 1463 - 1472
ISBN: 978-1-4503-6014-2
Languages: English-United States
