Stochastic Block-Coordinate Gradient Projection Algorithms for Submodular Maximization
We consider a stochastic continuous submodular huge-scale optimization problem, which arises naturally in many applications such as machine learning. Because the data are high-dimensional, computing the full gradient vector can become prohibitively expensive. To reduce the complexity and memory requirements, we propose a stochastic block-coordinate gradient projection algorithm for maximizing continuous submodular functions, which chooses a random subset of the gradient vector's coordinates and updates the estimates along the positive gradient direction. We prove that the estimates of all nodes generated by the algorithm converge to stationary points with probability 1. Moreover, we show that the proposed algorithm achieves the tight ((pmin/2)F⁎ − ϵ) approximation guarantee after O(1/ϵ²) iterations for DR-submodular functions by choosing appropriate step sizes. Furthermore, we show that the algorithm achieves the tight ((γ²/(1+γ²))pminF⁎ − ϵ) approximation guarantee after O(1/ϵ²) iterations for weakly DR-submodular functions with parameter γ by choosing diminishing step sizes.
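The block-coordinate update the abstract describes — sample a random subset of coordinates, step along the positive gradient on that block, then project back onto the feasible set — can be sketched roughly as below. This is an illustrative sketch, not the paper's implementation: the function names, the box feasible set, the fixed block size and step size, and the toy concave objective are all assumptions made for the example.

```python
import numpy as np

def block_coordinate_gradient_projection(grad_oracle, n, lower, upper,
                                         block_size, step_size, iters, seed=None):
    """Illustrative sketch: maximize a smooth objective by updating a random
    block of coordinates per iteration and projecting onto box constraints."""
    rng = np.random.default_rng(seed)
    x = lower.copy()
    for _ in range(iters):
        # pick a random subset of coordinates (the "block") to update
        block = rng.choice(n, size=block_size, replace=False)
        g = grad_oracle(x)  # gradient estimate (stochastic in general)
        # move only the chosen block along the positive gradient direction
        x[block] += step_size * g[block]
        # projection step: clip back onto the feasible box [lower, upper]
        np.clip(x, lower, upper, out=x)
    return x

# Toy usage: maximize -0.5 * ||x - target||^2 over [0, 1]^5 (a simple concave
# stand-in objective, not a submodular function from the paper).
n = 5
target = np.full(n, 0.7)
grad = lambda x: -(x - target)  # gradient of the toy objective
x_star = block_coordinate_gradient_projection(grad, n, np.zeros(n), np.ones(n),
                                              block_size=2, step_size=0.5,
                                              iters=500, seed=0)
```

Because each coordinate is updated only when its block is sampled, the per-iteration cost scales with the block size rather than the full dimension, which is the memory/complexity saving the abstract refers to.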
| Main Authors: | Zhigang Li, Mingchuan Zhang, Junlong Zhu, Ruijuan Zheng, Qikun Zhang, Qingtao Wu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2018-01-01 |
| Series: | Complexity |
| Online Access: | http://dx.doi.org/10.1155/2018/2609471 |
| author | Zhigang Li, Mingchuan Zhang, Junlong Zhu, Ruijuan Zheng, Qikun Zhang, Qingtao Wu |
|---|---|
| collection | DOAJ |
| description | We consider a stochastic continuous submodular huge-scale optimization problem, which arises naturally in many applications such as machine learning. Because the data are high-dimensional, computing the full gradient vector can become prohibitively expensive. To reduce the complexity and memory requirements, we propose a stochastic block-coordinate gradient projection algorithm for maximizing continuous submodular functions, which chooses a random subset of the gradient vector's coordinates and updates the estimates along the positive gradient direction. We prove that the estimates of all nodes generated by the algorithm converge to stationary points with probability 1. Moreover, we show that the proposed algorithm achieves the tight ((pmin/2)F⁎ − ϵ) approximation guarantee after O(1/ϵ²) iterations for DR-submodular functions by choosing appropriate step sizes. Furthermore, we show that the algorithm achieves the tight ((γ²/(1+γ²))pminF⁎ − ϵ) approximation guarantee after O(1/ϵ²) iterations for weakly DR-submodular functions with parameter γ by choosing diminishing step sizes. |
| format | Article |
| id | doaj-art-1b372e670d0b4a41b368cb1625d4bff2 |
| institution | Kabale University |
| issn | 1076-2787, 1099-0526 |
| language | English |
| publishDate | 2018-01-01 |
| publisher | Wiley |
| record_format | Article |
| series | Complexity |
| spelling | Zhigang Li (School of Computer and Communication Engineering, Zhengzhou University of Light Industry, Zhengzhou, 450002, China); Mingchuan Zhang (Information Engineering College, Henan University of Science and Technology, Luoyang, 471023, China); Junlong Zhu (Information Engineering College, Henan University of Science and Technology, Luoyang, 471023, China); Ruijuan Zheng (Information Engineering College, Henan University of Science and Technology, Luoyang, 471023, China); Qikun Zhang (School of Computer and Communication Engineering, Zhengzhou University of Light Industry, Zhengzhou, 450002, China); Qingtao Wu (Information Engineering College, Henan University of Science and Technology, Luoyang, 471023, China) |
| title | Stochastic Block-Coordinate Gradient Projection Algorithms for Submodular Maximization |
| url | http://dx.doi.org/10.1155/2018/2609471 |