Nonconvex Nonlinear Transformation of Low-Rank Approximation for Tensor Completion
Recovering incomplete high-dimensional data to create complete and valuable datasets is the main focus of tensor completion research, which lies at the intersection of mathematics and information science. Researchers typically apply various linear and nonlinear transformations to the original tensor, using regularization terms like the nuclear norm for low-rank approximation. However, relying solely on the tensor nuclear norm can lead to suboptimal solutions because of the convex relaxation of tensor rank, which strays from the original outcomes. To tackle these issues, we introduce the low-rank approximation nonconvex nonlinear transformation (LRANNT) method. By employing nonconvex norms and nonlinear transformations, we can more accurately capture the intrinsic structure of tensors, providing a more effective solution to the tensor completion problem. Additionally, we propose the proximal alternating minimization (PAM) algorithm to solve the model, demonstrating its convergence. Tests on publicly available datasets demonstrate that our method outperforms the current state-of-the-art approaches, even under extreme conditions with a high missing rate of up to 97.5%.
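For readers skimming this record, the abstract's contrast between the convex tensor nuclear norm and a nonconvex surrogate can be summarized with a generic transform-based completion model. The sketch below is illustrative only; the transformation φ, penalty ρ, and observation set Ω are placeholder symbols, not the paper's exact LRANNT formulation.

```latex
% Generic transform-based low-rank tensor completion (illustrative sketch;
% phi, rho, and Omega are placeholders, not the paper's exact LRANNT model).
\begin{aligned}
\text{convex baseline:}\quad
  &\min_{\mathcal{X}}\ \big\|\phi(\mathcal{X})\big\|_{*}
  \quad\text{s.t.}\quad \mathcal{P}_{\Omega}(\mathcal{X})=\mathcal{P}_{\Omega}(\mathcal{M}),\\
\text{nonconvex variant:}\quad
  &\min_{\mathcal{X}}\ \sum_{i}\rho\big(\sigma_{i}(\phi(\mathcal{X}))\big)
  \quad\text{s.t.}\quad \mathcal{P}_{\Omega}(\mathcal{X})=\mathcal{P}_{\Omega}(\mathcal{M}).
\end{aligned}
```

Here M is the partially observed tensor, P_Ω keeps only the observed entries, φ is a (possibly nonlinear) transformation, σ_i are the singular values of the transformed tensor's slices or unfoldings, and ρ is a nonconvex penalty (for example the ℓ_p quasi-norm with 0 < p < 1) that approximates the rank more tightly than the nuclear norm does.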
| Main Authors: | Yifan Mei, Xinhua Su, Huixiang Lin, Huanmin Ge |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2024-12-01 |
| Series: | Applied Sciences |
| Subjects: | nonlinear transformation; proximal alternating minimization; tensor completion; nonconvex; low-rank |
| Online Access: | https://www.mdpi.com/2076-3417/14/24/11895 |
| _version_ | 1846106000839409664 |
|---|---|
| author | Yifan Mei; Xinhua Su; Huixiang Lin; Huanmin Ge |
| author_sort | Yifan Mei |
| collection | DOAJ |
| description | Recovering incomplete high-dimensional data to create complete and valuable datasets is the main focus of tensor completion research, which lies at the intersection of mathematics and information science. Researchers typically apply various linear and nonlinear transformations to the original tensor, using regularization terms like the nuclear norm for low-rank approximation. However, relying solely on the tensor nuclear norm can lead to suboptimal solutions because of the convex relaxation of tensor rank, which strays from the original outcomes. To tackle these issues, we introduce the low-rank approximation nonconvex nonlinear transformation (LRANNT) method. By employing nonconvex norms and nonlinear transformations, we can more accurately capture the intrinsic structure of tensors, providing a more effective solution to the tensor completion problem. Additionally, we propose the proximal alternating minimization (PAM) algorithm to solve the model, demonstrating its convergence. Tests on publicly available datasets demonstrate that our method outperforms the current state-of-the-art approaches, even under extreme conditions with a high missing rate of up to 97.5%. |
| format | Article |
| id | doaj-art-886c68f3e6e14f06a700bf63b5058ba5 |
| institution | Kabale University |
| issn | 2076-3417 |
| language | English |
| publishDate | 2024-12-01 |
| publisher | MDPI AG |
| record_format | Article |
| series | Applied Sciences |
| spelling | doaj-art-886c68f3e6e14f06a700bf63b5058ba5 (2024-12-27T14:08:41Z): Yifan Mei, Xinhua Su, Huixiang Lin, Huanmin Ge (School of Sports Engineering, Beijing Sport University, Beijing 100084, China). "Nonconvex Nonlinear Transformation of Low-Rank Approximation for Tensor Completion." Applied Sciences (MDPI AG, ISSN 2076-3417), vol. 14, no. 24, article 11895, 2024-12-01. DOI: 10.3390/app142411895. https://www.mdpi.com/2076-3417/14/24/11895. Keywords: nonlinear transformation; proximal alternating minimization; tensor completion; nonconvex; low-rank. |
| title | Nonconvex Nonlinear Transformation of Low-Rank Approximation for Tensor Completion |
| topic | nonlinear transformation; proximal alternating minimization; tensor completion; nonconvex; low-rank |
| url | https://www.mdpi.com/2076-3417/14/24/11895 |
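The abstract also names the proximal alternating minimization (PAM) algorithm as the solver. The following is a minimal, self-contained sketch of the PAM scheme only, applied to a toy factorization-based matrix completion problem rather than the paper's LRANNT model; the function name and parameters are invented for this illustration. Each block update is the exact minimizer of the data-fit term plus a proximal term (rho/2)·||block − previous iterate||².

```python
import numpy as np

def pam_matrix_completion(M, mask, rank=5, rho=1.0, iters=200, seed=0):
    """Toy proximal alternating minimization (PAM) for masked low-rank
    matrix completion:  min_{U,V} 0.5 * ||mask * (M - U @ V)||_F^2,
    solved block-wise with proximal terms (rho/2)*||U - U_prev||_F^2 and
    (rho/2)*||V - V_prev||_F^2.  Illustrative sketch only, not LRANNT."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((rank, n))
    for _ in range(iters):
        # U-update: for fixed V the objective separates over rows of U, so
        # each row solves a small ridge-like system (exact proximal step).
        for i in range(m):
            obs = mask[i] > 0
            A = V[:, obs]                              # rank x |observed in row i|
            G = A @ A.T + rho * np.eye(rank)
            U[i] = np.linalg.solve(G, A @ M[i, obs] + rho * U[i])
        # V-update: symmetric step over the columns of M.
        for j in range(n):
            obs = mask[:, j] > 0
            A = U[obs]                                 # |observed in col j| x rank
            G = A.T @ A + rho * np.eye(rank)
            V[:, j] = np.linalg.solve(G, A.T @ M[obs, j] + rho * V[:, j])
    return U @ V

# Usage: recover a synthetic 50x40 rank-3 matrix with roughly 70% missing entries.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
    mask = rng.random(truth.shape) > 0.7
    X = pam_matrix_completion(truth * mask, mask, rank=3)
    err = np.linalg.norm(X - truth) / np.linalg.norm(truth)
    print(f"relative recovery error: {err:.3f}")
```

Because the proximal term vanishes at a fixed point, limit points of this scheme are block-coordinate minimizers of the unregularized masked objective, which is the basic property PAM-style algorithms rely on when convergence is analyzed.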