An incremental learning algorithm based on an improved minimum distance classifier is proposed. It eliminates the mutual interference among the classifier's internal structures that arises during incremental learning, so that the classifier can retain previously learned knowledge while acquiring new knowledge. Because incremental learning requires adjusting the classifier's structure, representative previously learned samples must be used to help the classifier review old knowledge while it learns new knowledge. For normally distributed sample sets, a screening algorithm is proposed that retains only a small number of representative samples, greatly reducing storage consumption and the computational cost of retraining. Experimental results show that the algorithm achieves high recognition accuracy: it effectively recognizes new samples while maintaining a high recognition rate on previously learned samples, and it consumes little storage space.
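To make the two ideas in the abstract concrete, the following is a minimal sketch, not the authors' actual method: a nearest-centroid (minimum distance) classifier whose per-class means are updated incrementally, plus a hypothetical screening rule that, assuming a roughly normal class distribution, keeps only the samples closest to the class mean as representatives for rehearsal. All names (`IncrementalMinDistanceClassifier`, `screen_representatives`) and the `keep` fraction are illustrative assumptions.

```python
import numpy as np


class IncrementalMinDistanceClassifier:
    """Sketch of a minimum distance (nearest-centroid) classifier
    that can be updated incrementally without full retraining.
    Details beyond the abstract are assumptions."""

    def __init__(self):
        self.means = {}   # class label -> running mean vector
        self.counts = {}  # class label -> number of samples seen

    def partial_fit(self, X, y):
        # Incremental update: fold each new sample into the running
        # mean of its class, so old data need not be stored in full.
        for x, label in zip(np.asarray(X, dtype=float), y):
            if label not in self.means:
                self.means[label] = x.copy()
                self.counts[label] = 1
            else:
                self.counts[label] += 1
                self.means[label] += (x - self.means[label]) / self.counts[label]

    def predict(self, X):
        # Assign each sample to the class with the nearest mean.
        labels = list(self.means)
        centers = np.stack([self.means[l] for l in labels])
        dists = np.linalg.norm(np.asarray(X, dtype=float)[:, None, :]
                               - centers[None, :, :], axis=2)
        return [labels[i] for i in dists.argmin(axis=1)]


def screen_representatives(X, keep=0.2):
    """Hypothetical screening rule: for a class assumed to be normally
    distributed, keep only the `keep` fraction of samples closest to
    the class mean as its representatives for later rehearsal."""
    X = np.asarray(X, dtype=float)
    dists = np.linalg.norm(X - X.mean(axis=0), axis=1)
    k = max(1, int(len(X) * keep))
    return X[np.argsort(dists)[:k]]
```

In this sketch, `partial_fit` can be called repeatedly on new batches, and the retained representatives from `screen_representatives` can be replayed alongside new data so the adjusted classifier does not forget old classes.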