I. Classical Distance Metric Learning Methods:
Supervised methods:
1. NIPS 2005 (LMNN): Distance Metric Learning for Large Margin Nearest Neighbor Classification (a minimal loss sketch follows this list)
1.1 AAAI 2017 (an extension of LMNN): Parameter Free Large Margin Nearest Neighbor for Distance Metric Learning
2. Lecture Notes (FLD, a form of LDA; essential reading): Fisher Linear Discriminant Analysis (FLD)
2.1 NIPS 2003 (MMC; avoids the singular within-class scatter matrix that LDA runs into when samples are scarce): Efficient and Robust Feature Extraction by Maximum Margin Criterion
2.2 CVPR 2007 (ANMM; note its relation to LDA and how it differs from and connects to MMC and LMNN): Feature Extraction by Maximizing the Average Neighborhood Margin
2.3 ICCV 2005 (SNMMC; likewise addresses the four well-known problems of LDA; the stepwise idea and the margin formulation are well worth studying): Face Recognition By Stepwise Nonparametric Margin Maximum Criterion
2.4 ECML 2004 (MMDA; combines the classifier with dimensionality reduction and feature extraction, a very nice insight): Margin Maximizing Discriminant Analysis
3. NIPS 2009 (similarity learning; drops the symmetry and positive semi-definiteness constraints yet performs no worse in experiments; recommended): An Online Algorithm for Large Scale Image Similarity Learning
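For readers new to LMNN (item 1 above), here is a minimal sketch of its core idea, assuming a numpy environment: learn a Mahalanobis metric M so that each point's same-class "target neighbors" are pulled close while differently labeled "impostors" are pushed outside a unit margin. Function and variable names are illustrative only, not the authors' implementation.

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance d_M(x, y)^2 = (x - y)^T M (x - y)."""
    d = x - y
    return float(d @ M @ d)

def lmnn_loss(X, y, M, target_neighbors, mu=0.5):
    """LMNN-style objective: a pull term over same-class target neighbors plus
    a hinge push term penalizing impostors that intrude on the unit margin."""
    pull, push = 0.0, 0.0
    for i, neighbors in target_neighbors.items():
        for j in neighbors:                        # same-class target neighbors
            d_ij = mahalanobis_sq(X[i], X[j], M)
            pull += d_ij
            for l in range(len(X)):                # candidate impostors
                if y[l] == y[i]:
                    continue
                d_il = mahalanobis_sq(X[i], X[l], M)
                push += max(0.0, 1.0 + d_ij - d_il)    # unit-margin hinge
    return (1 - mu) * pull + mu * push

# Toy usage: four 2-D points, two classes, Euclidean metric as the starting point.
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
y = np.array([0, 0, 1, 1])
M = np.eye(2)                                      # must remain positive semi-definite
targets = {0: [1], 1: [0], 2: [3], 3: [2]}         # k=1 target neighbors per point
print(lmnn_loss(X, y, M, targets))
```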
Unsupervised methods:
1. Lecture Notes (PCA; essential reading): Principal Component Analysis (PCA) (a minimal sketch follows below)
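As a quick refresher on PCA: center the data, eigendecompose the covariance matrix, and project onto the top-k eigenvectors. The numpy sketch below is purely illustrative; the function and variable names are my own.

```python
import numpy as np

def pca(X, k):
    """Project X (n_samples x n_features) onto its top-k principal components."""
    X_centered = X - X.mean(axis=0)                  # remove the mean
    cov = np.cov(X_centered, rowvar=False)           # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)           # symmetric -> real eigenpairs
    order = np.argsort(eigvals)[::-1][:k]            # largest-variance directions
    components = eigvecs[:, order]
    return X_centered @ components, components

# Toy usage: reduce five random 3-D points to 2-D.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Z, W = pca(X, k=2)
print(Z.shape, W.shape)   # (5, 2) (3, 2)
```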
II. Deep Distance Metric Learning:
1. CVPR 2014: Discriminative Deep Metric Learning for Face Verification in the Wild
2. CVPR 2015 (note how it differs from and relates to 1): Multi-Manifold Deep Metric Learning for Image Set Classification
3. CVPR 2016 (makes the fullest use of each mini-batch): Deep Metric Learning via Lifted Structured Feature Embedding
4. NIPS 2016 (note how it differs from 3): Improved Deep Metric Learning with Multi-class N-pair Loss Objective (a toy loss sketch follows this list)
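To make the N-pair objective in item 4 concrete, here is a toy numpy version: each anchor's similarity to its own positive is pushed above its similarity to the positives of every other class in the batch. This is an illustrative sketch under simplifying assumptions, not the paper's implementation.

```python
import numpy as np

def n_pair_loss(anchors, positives):
    """Multi-class N-pair loss: anchors and positives are (N, d) embeddings,
    where row i of `positives` is the positive for row i of `anchors` and
    every other row acts as a negative for that anchor."""
    logits = anchors @ positives.T                  # (N, N) similarity matrix
    pos = np.diag(logits)                           # similarity to own positive
    # The j = i term contributes exp(0) = 1, so summing over all j gives
    # log(1 + sum_{j != i} exp(s_ij - s_ii)), i.e. the N-pair loss per anchor.
    losses = np.log(np.exp(logits - pos[:, None]).sum(axis=1))
    return losses.mean()

# Toy usage: N = 3 classes, 4-D embeddings, positives placed near their anchors.
rng = np.random.default_rng(1)
f = rng.normal(size=(3, 4))
f_pos = f + 0.1 * rng.normal(size=(3, 4))
print(n_pair_loss(f, f_pos))
```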
III. Supporting Papers (topics touched on in the metric learning papers above):
1. CVPR 2016 (approximate nearest neighbour search): FANNG: Fast Approximate Nearest Neighbour Graphs (a greedy graph-search sketch follows below)
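Since FANNG is listed here for approximate nearest neighbour search, the sketch below shows only the basic traversal idea shared by graph-based ANN methods: walk a pre-built neighbourhood graph greedily towards the query. It is not the paper's algorithm, which centres on how the graph is constructed and on a more robust search.

```python
import numpy as np

def greedy_graph_search(points, graph, query, start=0):
    """Greedy nearest-neighbour search on a pre-built neighbourhood graph:
    hop to a neighbour of the current vertex whenever it is closer to the
    query, stopping at a local minimum."""
    current = start
    current_dist = np.linalg.norm(points[current] - query)
    improved = True
    while improved:
        improved = False
        for nbr in graph[current]:
            d = np.linalg.norm(points[nbr] - query)
            if d < current_dist:
                current, current_dist = nbr, d
                improved = True
                break                      # hop immediately, then rescan neighbours
    return current, current_dist

# Toy usage: six points on a line, each linked to its index neighbours.
pts = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
graph = {i: [j for j in (i - 1, i + 1) if 0 <= j < 6] for i in range(6)}
print(greedy_graph_search(pts, graph, query=np.array([3.2])))   # nearest vertex is 3
```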