
Detailed Item Record
【Title】
A fast-convergence distributed support vector machine in small-scale strongly connected networks
【Journal】
Frontiers of Electrical and Electronic Engineering
【Journal Abbreviation】
Front. Electr. Electron. Eng.
【ISSN】
2095-2732
【EISSN】
2095-2740
【DOI】
10.1007/s11460-011-0172-9
【Publisher】
Higher Education Press and Springer-Verlag Berlin Heidelberg
【Year】
2012
【Volume/Issue】
Vol. 7, Issue 2
【Pages】
216-223 (8 pages)
【Authors】
Hua XU; Yun WEN; Jixiong WANG
【Keywords】
support vector machine; message passing interface; distributed computing; parallel computing; convergence; speedup
【Abstract】
This paper proposes a fast-convergence distributed support vector machine (FDSVM) algorithm for efficiently solving the problem of distributed SVM training. Rather than exchanging information only among immediate neighbor sites, FDSVM employs a communication policy based on a deterministic gossip protocol, in which each site communicates with others in a flooding, iterative manner, accelerating the diffusion of information around the network. This communication policy significantly reduces the total number of iterations and thus speeds up the convergence of the algorithm. In addition, the algorithm is proved to converge to the global optimum in finitely many steps over an arbitrary strongly connected network (SCN). Experiments on various benchmark data sets show that FDSVM consistently outperforms the related state-of-the-art approach in total training time on most networks, especially the ring network.
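The abstract's claim that flooding accelerates diffusion can be illustrated with a minimal simulation. The sketch below is not the paper's FDSVM implementation; it only models the communication pattern the abstract describes (each site forwarding everything it has accumulated to its out-neighbors each round) and counts the rounds until local information reaches every site of a strongly connected directed network. The function name and graph encoding are illustrative assumptions.

```python
# Hypothetical sketch of flooding-style information diffusion over a
# strongly connected directed network (NOT the paper's FDSVM code).

def flood_rounds(edges, n):
    """Return the number of rounds until every one of the n sites holds
    every site's initial datum, when each round every site forwards all
    it has accumulated along every directed edge (flooding gossip)."""
    known = {v: {v} for v in range(n)}  # site v starts knowing only its own datum
    rounds = 0
    while any(len(k) < n for k in known.values()):
        new = {v: set(k) for v, k in known.items()}
        for u, v in edges:              # u floods everything it knows to v
            new[v] |= known[u]
        known = new
        rounds += 1
    return rounds

# Directed ring on 8 sites: strongly connected with diameter 7,
# so full diffusion takes 7 rounds even with flooding.
ring = [(i, (i + 1) % 8) for i in range(8)]
print(flood_rounds(ring, 8))  # → 7

# Complete directed graph on 4 sites: one round suffices.
complete = [(u, v) for u in range(4) for v in range(4) if u != v]
print(flood_rounds(complete, 4))  # → 1
```

Because each site relays information received from non-neighbors, information crosses the network in a number of rounds bounded by the graph diameter, rather than each datum having to be re-derived hop by hop; this is the intuition behind the reduced iteration count reported in the abstract.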