Penalizing Top Performers: Conservative Loss for Semantic Segmentation Adaptation
Refereed conference paper presented and published in conference proceedings

Other Information
Abstract: Due to the expensive and time-consuming annotations (e.g., segmentation) required for real-world images, recent works in computer vision resort to synthetic data. However, performance on real images often drops significantly because of the domain shift between the synthetic data and the real images. In this setting, domain adaptation offers an appealing option. Effective domain adaptation approaches shape representations that (1) are discriminative for the main task and (2) generalize well under domain shift. To this end, we propose a novel loss function, the Conservative Loss, which penalizes the extremely good and bad cases while encouraging the moderate examples. More specifically, it enables the network to learn features that are discriminative via gradient descent and invariant to the change of domains via gradient ascent. Extensive experiments on synthetic-to-real segmentation adaptation show that our proposed method achieves state-of-the-art results. Ablation studies give more insight into the properties of the Conservative Loss. Exploratory experiments and discussion demonstrate that the Conservative Loss is flexible rather than restricted to an exact form.
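
The abstract describes the qualitative shape of the Conservative Loss (lowest for moderate predictions, penalizing both extremes) but not its exact formula, so the following is only a minimal PyTorch sketch of that behaviour under stated assumptions: the quadratic penalty in logit space and the moderate operating point p0 are illustrative choices, not the paper's definition.

import math
import torch
import torch.nn.functional as F

def conservative_loss(logits, target, p0=0.7, eps=1e-6):
    """Illustrative stand-in for the Conservative Loss (assumed form).

    logits: (N, C, H, W) per-pixel class scores; target: (N, H, W) int64 labels.
    The loss is minimized where the true-class probability equals p0 and grows
    without bound as that probability approaches 0 or 1, so both very wrong and
    very confident pixels are penalized while moderate ones are encouraged.
    """
    probs = F.softmax(logits, dim=1).clamp(eps, 1 - eps)
    # Probability assigned to the ground-truth class at each pixel.
    p_true = probs.gather(1, target.unsqueeze(1)).squeeze(1)
    # Squared distance in logit space from the moderate operating point p0.
    # NOTE: this quadratic-in-logit form is an assumption that reproduces the
    # described shape; the paper defines its own function.
    z = torch.log(p_true / (1.0 - p_true))
    z0 = math.log(p0 / (1.0 - p0))
    return ((z - z0) ** 2).mean()

With this shape, the gradient pulls poorly classified pixels up toward p0 (ordinary gradient descent on the task) and pushes over-confident pixels back down, which mirrors the gradient-ascent behaviour the abstract ties to domain invariance.
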
Date accepted by publisher: 12.07.2018
Authors: Xinge Zhu, Hui Zhou, Ceyuan Yang, Jianping Shi, Dahua Lin
Conference name: 15th European Conference on Computer Vision, ECCV 2018
Conference start date: 08.09.2018
Conference end date: 14.09.2018
Conference venue: Munich, Germany
Conference country/region: Germany
Proceedings title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Year of publication: 2018
Month: 9
Volume: 11211
Publisher: Springer
Pages: 587-603
ISBN: 978-3-030-01233-5
ISSN: 0302-9743
Language: American English

Last updated on 22.01.2021 at 01:54