Communication-Efficient Distributed Deep Metric Learning with Hybrid Synchronization
Other conference paper

Other Information
Abstract: Deep metric learning is widely used in extreme classification and image retrieval because of its powerful ability to learn semantic low-dimensional embeddings of high-dimensional data. However, in existing deep metric learning approaches, the heavy computational cost of mining valuable pairs or triplets of training data and of frequently updating models becomes a barrier to applying such methods at a large, real-world scale in a distributed environment.
Moreover, existing distributed deep learning frameworks are not designed for deep metric learning tasks, because it is difficult to implement a smart mining policy for valuable training data within them. In this paper, we introduce a novel distributed framework to speed up the training of deep metric learning models using multiple machines.
Specifically, we first design a distributed sampling method that finds hard-negative samples from a broader scope of candidate samples than a single-machine solution can. Then, we design a hybrid communication pattern and implement a decentralized data-parallel framework that reduces the communication workload while preserving the quality of the trained deep metric models. In experiments, we show excellent performance gains over a full spectrum of state-of-the-art deep metric learning models on multiple datasets, in terms of both image clustering and image retrieval tasks.
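Since the abstract stays high level, a minimal sketch may help illustrate the distributed sampling idea: each worker contributes its mini-batch embeddings to a global candidate pool via an all-gather, and hard negatives are then mined against that pool instead of the local batch alone. The sketch below assumes a PyTorch setup with torch.distributed already initialized; the function name mine_hard_negatives and all variable names are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.distributed as dist

def mine_hard_negatives(local_emb, local_labels):
    """Pick, for each local anchor, the closest embedding with a
    different label, searching the pool gathered from all workers."""
    world_size = dist.get_world_size()

    # Collect every worker's embeddings and labels so that negatives are
    # mined from the global candidate pool, not just the local batch.
    emb_list = [torch.empty_like(local_emb) for _ in range(world_size)]
    lab_list = [torch.empty_like(local_labels) for _ in range(world_size)]
    dist.all_gather(emb_list, local_emb)
    dist.all_gather(lab_list, local_labels)
    all_emb = torch.cat(emb_list)    # shape: (world_size * B, D)
    all_lab = torch.cat(lab_list)    # shape: (world_size * B,)

    # Pairwise distances from local anchors to the global pool.
    dists = torch.cdist(local_emb, all_emb)
    # Mask out same-class entries so only true negatives remain.
    same_class = local_labels.unsqueeze(1) == all_lab.unsqueeze(0)
    dists.masked_fill_(same_class, float("inf"))

    hard_idx = dists.argmin(dim=1)   # nearest different-class sample
    return all_emb[hard_idx]
```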
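The paper's hybrid communication pattern is not specified in this record; one common decentralized data-parallel scheme it could resemble is periodic gossip averaging, where workers train independently and only occasionally mix parameters with ring neighbors, avoiding a central parameter server. The sketch below, with hypothetical names such as ring_average and period, shows that general pattern, assuming at least three workers and an initialized torch.distributed process group.

```python
import torch
import torch.distributed as dist

@torch.no_grad()
def ring_average(model, step, period=10):
    """Every `period` optimizer steps, average parameters with the two
    ring neighbors; between rounds, workers train independently."""
    if step % period != 0:
        return
    rank = dist.get_rank()
    world = dist.get_world_size()    # assumed >= 3 here
    left, right = (rank - 1) % world, (rank + 1) % world

    for p in model.parameters():
        recv_left = torch.empty_like(p)
        recv_right = torch.empty_like(p)
        # Non-blocking exchange with both neighbors avoids deadlock.
        reqs = [
            dist.isend(p.data, dst=left),
            dist.isend(p.data, dst=right),
            dist.irecv(recv_left, src=left),
            dist.irecv(recv_right, src=right),
        ]
        for r in reqs:
            r.wait()
        # Uniform mixing of own and neighbor parameters.
        p.data.copy_((p.data + recv_left + recv_right) / 3.0)
```

Averaging every `period` steps instead of all-reducing every step is what cuts communication volume in such schemes; the mixing topology and frequency are the knobs that trade model quality against bandwidth.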
Authors: Yuxin Su, Michael R. Lyu, Irwin King
Conference name: 27th ACM International Conference on Information and Knowledge Management (CIKM)
Conference start date: 22.10.2018
Conference end date: 26.10.2018
Conference venue: Torino
Conference country/region: Italy
Proceedings title: CIKM '18: Proceedings of the 27th ACM International Conference on Information and Knowledge Management
Year of publication: 2018
Pages: 1463-1472
ISBN: 978-1-4503-6014-2
Language: American English