Neural Netw. 2022 Mar 10;150:194-212. doi: 10.1016/j.neunet.2022.03.006. Online ahead of print.
ABSTRACT
The direct multi-task twin support vector machine (DMTSVM) is an effective algorithm for multi-task classification problems. However, because DMTSVM employs the hinge loss, the generated hyperplane may shift toward outliers. We therefore propose an improved multi-task model, RaMTTSVM, based on the ramp loss, which handles noisy points more effectively: the ramp loss explicitly bounds the maximal loss value and thus limits the influence of noise. However, RaMTTSVM is non-convex and must be solved by the concave-convex procedure (CCCP), which requires solving a series of approximate convex problems and may therefore be time-consuming. Motivated by the sparsity of the RaMTTSVM solution, we further propose a safe acceleration rule, MSA, to speed up the solving process. Based on optimality conditions and convex optimization theory, MSA can delete many inactive samples, namely those corresponding to zero elements in the dual solution, before the model is solved, so that only reduced problems need to be computed. The rule consists of three parts corresponding to different parameters and different iteration phases of CCCP: it applies not only to the first approximate convex problem of CCCP but also to the successive problems during the iteration process. More importantly, MSA is safe in the sense that the reduced problem yields an optimal solution identical to that of the original problem, so prediction accuracy is not affected. Experimental results on one artificial dataset, ten benchmark datasets, ten image datasets, and one real wine dataset confirm the generalization and acceleration ability of the proposed algorithm.
PMID:35316737 | DOI:10.1016/j.neunet.2022.03.006
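The abstract does not reproduce the RaMTTSVM formulation or the MSA screening rule, but the CCCP mechanism it relies on can be illustrated in miniature: the ramp loss is a difference of two hinge losses, and CCCP linearizes the concave part at the current iterate, leaving a convex subproblem at each step. The following one-dimensional sketch is purely illustrative, with assumed toy data, parameter values, a single scalar weight, and a brute-force grid-search subproblem solver; none of it comes from the paper.

```python
# Toy sketch (not the paper's RaMTTSVM/MSA): CCCP for a ramp-loss classifier
# with one scalar weight w. Ramp loss R_s(z) = H_1(z) - H_s(z), where
# H_t(z) = max(0, t - z); the -H_s term is concave and gets linearized.

def hinge(z, t=1.0):
    """H_t(z) = max(0, t - z)."""
    return max(0.0, t - z)

def ramp(z, s=0.0):
    """Ramp loss R_s(z) = H_1(z) - H_s(z); bounded above by 1 - s."""
    return hinge(z, 1.0) - hinge(z, s)

def cccp_fit(xs, ys, lam=0.1, s=0.0, iters=20):
    """Minimize lam*w^2 + sum_i R_s(y_i * w * x_i) over a scalar w via CCCP.
    Each convex subproblem is solved by grid search, which suffices here."""
    grid = [i * 0.01 for i in range(-300, 301)]
    w = 0.0
    for _ in range(iters):
        # Linearize the concave term -H_s(y*w*x) at the current w:
        # its subgradient w.r.t. w is y*x whenever y*w*x < s, else 0,
        # so the subproblem gains the linear term sum(d_i * v) below.
        deltas = [(y * x if y * w * x < s else 0.0)
                  for x, y in zip(xs, ys)]

        def convex_obj(v):
            return (lam * v * v
                    + sum(hinge(y * v * x, 1.0) for x, y in zip(xs, ys))
                    + sum(d * v for d in deltas))

        w_new = min(grid, key=convex_obj)
        if w_new == w:  # CCCP fixed point reached
            break
        w = w_new
    return w

# Four clean points plus one outlier (x = 2.0 labelled -1): the bounded
# ramp loss caps the outlier's contribution at 1 - s, so the fitted w
# stays on the side of the clean points.
xs = [1.0, 1.5, -1.0, -1.5, 2.0]
ys = [1, 1, -1, -1, -1]
w = cccp_fit(xs, ys)
```

Note how the subgradient condition `y * w * x < s` marks exactly the samples sitting in the flat region of the ramp loss; for those samples the linear correction cancels the hinge slope, which is why outliers stop dragging the solution after the first CCCP iteration.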