The multiple kernel least squares support vector machine (MK-LSSVM) ignores the cost of its kernel functions and lacks sparsity. To address these problems, a cost-constrained sparse MK-LSSVM method is proposed. The original MK-LSSVM optimization problem is recast as a second-order cone program, and a kernel cost factor is introduced to constrain the weights of complex kernel functions, saving variable storage space and computation time. The kernel matrix is further reduced by Schmidt orthogonalization to cut the computational load, and the total cost of multiple kernel learning is evaluated from the number of support vectors and the types of active kernel functions. Simulations on benchmark data sets show that, compared with the conventional MK-LSSVM, the proposed method reaches the same accuracy requirement with fewer support vectors and a simpler combined kernel, at a lower cost. When used to predict flotation recovery, the method reduces the cost value by 27.56.
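As a rough illustration of the kernel-matrix reduction step, the following is a minimal sketch (not the paper's exact algorithm) of Gram-Schmidt-based column selection: columns of the kernel matrix whose orthogonal residual against the already-selected columns is small are dropped as redundant. The function name, tolerance, and toy RBF kernel are all assumptions made for this example.

```python
import numpy as np

def gram_schmidt_reduce(K, tol=1e-3):
    """Select a linearly independent subset of kernel-matrix columns
    by Gram-Schmidt orthogonalization; a column is kept only if its
    residual after projection onto the selected columns exceeds
    tol times its own norm. (Simplified stand-in for the paper's
    Schmidt-orthogonalization reduction.)"""
    n = K.shape[1]
    basis = []       # orthonormal directions spanning the kept columns
    selected = []    # indices of retained columns
    for j in range(n):
        v = K[:, j].astype(float).copy()
        for q in basis:               # project out the current span
            v -= (q @ K[:, j]) * q
        norm = np.linalg.norm(v)
        if norm > tol * np.linalg.norm(K[:, j]):
            basis.append(v / norm)
            selected.append(j)
    return selected

# Toy example: an RBF kernel matrix on near-duplicated points.
# The near-duplicate columns are (nearly) dependent and get dropped.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 2))
X = np.vstack([X, X + 1e-8])                      # 6 near-duplicate rows
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq)                                   # 12 x 12 RBF kernel matrix
idx = gram_schmidt_reduce(K)
print(len(idx))                                   # at most 6 columns survive
```

The retained columns give a smaller, well-conditioned kernel matrix, which is where the storage and computation savings claimed in the abstract come from.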