KMS Chongqing Institute of Green and Intelligent Technology, CAS
Title | Proximal Alternating-Direction-Method-of-Multipliers-Incorporated Nonnegative Latent Factor Analysis |
Authors | Bi, Fanghui; Luo, Xin; Shen, Bo; Dong, Hongli; Wang, Zidong |
Date | 2023-06-01 |
Abstract | High-dimensional and incomplete (HDI) data subject to nonnegativity constraints are commonly encountered in big data-related applications concerning the interactions among numerous nodes. A nonnegative latent factor analysis (NLFA) model can perform representation learning on HDI data efficiently. However, existing NLFA models suffer from either a slow convergence rate or a loss of representation accuracy. To address this issue, this paper proposes a proximal alternating-direction-method-of-multipliers-based nonnegative latent factor analysis (PAN) model with two-fold ideas: 1) adopting the principle of the alternating direction method of multipliers to implement an efficient learning scheme with fast convergence and high computational efficiency; and 2) incorporating proximal regularization into the learning scheme to suppress optimization fluctuation, thereby achieving highly accurate representation learning on HDI data. Theoretical studies verify that PAN's learning scheme converges to a Karush-Kuhn-Tucker (KKT) stationary point of its nonnegativity-constrained learning objective. Experimental results on eight HDI matrices from real applications demonstrate that the proposed PAN model outperforms several state-of-the-art models in both estimation accuracy for the missing data of an HDI matrix and computational efficiency. |
Keywords | Data science; high-dimensional and incomplete data; knowledge acquisition; industrial application; nonnegative latent factor analysis (NLFA); proximal alternating direction method of multipliers; representation learning |
DOI | 10.1109/JAS.2023.123474 |
Journal | IEEE-CAA JOURNAL OF AUTOMATICA SINICA |
ISSN | 2329-9266 |
Volume | 10, Issue 6, Pages 1388-1406 |
Corresponding Author | Luo, Xin (luoxin@swu.edu.cn) |
Indexed By | SCI |
WOS Record Number | WOS:001000283800004 |
Language | English |
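The abstract describes a proximal-ADMM scheme for nonnegative latent factor analysis of a high-dimensional and incomplete (HDI) matrix. The paper's actual update rules are not reproduced in this record, so the following is only a minimal illustrative sketch of the general technique: factorize the observed entries of a sparse matrix as R ≈ XYᵀ, enforce nonnegativity through ADMM auxiliary variables with projection, and add a proximal term to each subproblem to damp optimization fluctuation. All function and parameter names (`pan_sketch`, `rho`, `tau`) are assumptions, not the authors' implementation.

```python
import numpy as np

def pan_sketch(R, mask, rank=2, rho=1.0, tau=0.1, iters=50, seed=0):
    """Illustrative proximal-ADMM sketch (NOT the paper's PAN model) for
    nonnegative latent factorization of an incomplete matrix R.
    Only entries where mask is True are treated as observed."""
    rng = np.random.default_rng(seed)
    m, n = R.shape
    X = rng.random((m, rank))
    Y = rng.random((n, rank))
    Zx, Zy = X.copy(), Y.copy()                  # nonnegative auxiliary factors
    Ux, Uy = np.zeros_like(X), np.zeros_like(Y)  # scaled dual variables
    errs = []
    for _ in range(iters):
        Xp, Yp = X.copy(), Y.copy()              # proximal anchors from last iterate
        # X-update: row-wise regularized least squares over observed entries,
        # with ADMM coupling (rho) and proximal damping (tau) terms.
        for i in range(m):
            idx = np.where(mask[i])[0]
            Yi = Y[idx]
            A = Yi.T @ Yi + (rho + tau) * np.eye(rank)
            b = Yi.T @ R[i, idx] + rho * (Zx[i] - Ux[i]) + tau * Xp[i]
            X[i] = np.linalg.solve(A, b)
        # Y-update: symmetric column-wise solve.
        for j in range(n):
            idx = np.where(mask[:, j])[0]
            Xi = X[idx]
            A = Xi.T @ Xi + (rho + tau) * np.eye(rank)
            b = Xi.T @ R[idx, j] + rho * (Zy[j] - Uy[j]) + tau * Yp[j]
            Y[j] = np.linalg.solve(A, b)
        # Z-update: project the coupled variables onto the nonnegative orthant.
        Zx = np.maximum(0.0, X + Ux)
        Zy = np.maximum(0.0, Y + Uy)
        # Dual ascent on the scaled multipliers.
        Ux += X - Zx
        Uy += Y - Zy
        pred = Zx @ Zy.T
        errs.append(np.sqrt(np.mean((R[mask] - pred[mask]) ** 2)))
    return Zx, Zy, errs
```

As a usage sketch, factoring a synthetic low-rank nonnegative matrix with 60% of entries observed shows the RMSE on observed entries decreasing over the iterations while both recovered factors stay nonnegative; the proximal weight `tau` trades per-step progress for smoother iterates, which mirrors the fluctuation-suppression role the abstract attributes to proximal regularization.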