
Chemistry report: Research Article "Noisy Sparse Recovery Based on Parameterized Quadratic Programming by Thresholding" by Jun Zhang, Yuanqing Li, Zhuliang Yu, and Zhenghui Gu

Pages: 7      File type: pdf      Size: 659.71 KB


Document information:

A collection of international scientific research reports in chemistry, offered for reference on the topic: Research Article "Noisy Sparse Recovery Based on Parameterized Quadratic Programming by Thresholding" by Jun Zhang, Yuanqing Li, Zhuliang Yu, and Zhenghui Gu.
Content extracted from the document:
Báo cáo hóa học: " Research Article Noisy Sparse Recovery Based on Parameterized Quadratic Programming by Thresholding Jun Zhang,1, 2 Yuanqing Li,1 Zhuliang Yu,1 and "Hindawi Publishing CorporationEURASIP Journal on Advances in Signal ProcessingVolume 2011, Article ID 528734, 7 pagesdoi:10.1155/2011/528734Research ArticleNoisy Sparse Recovery Based on Parameterized QuadraticProgramming by Thresholding Jun Zhang,1, 2 Yuanqing Li,1 Zhuliang Yu,1 and Zhenghui Gu1 1 Center for Brain-Computer Interfaces and Brain Information Processing, College of Automation Science and Engineering, South China University of Technology, Guangzhou 510640, China 2 College of Information Engineering, Guangdong University of Technology, Guangzhou 510006, China Correspondence should be addressed to Jun Zhang, zhangjun7907@hotmail.com Received 27 August 2010; Revised 12 December 2010; Accepted 28 January 2011 Academic Editor: Walter Kellermann Copyright © 2011 Jun Zhang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Parameterized quadratic programming (Lasso) is a powerful tool for the recovery of sparse signals based on underdetermined observations contaminated by noise. In this paper, we study the problem of simultaneous sparsity pattern recovery and approximation recovery based on the Lasso. An extended Lasso method is proposed with the following main contributions: (1) we analyze the recovery accuracy of Lasso under the condition of guaranteeing the recovery of nonzero entries positions. Specifically, an upper bound of the tuning parameter h of Lasso is derived. If h exceeds this bound, the recovery error will increase with h; (2) an extended Lasso algorithm is developed by choosing the tuning parameter according to the bound and at the same time deriving a threshold to recover zero entries from the output of the Lasso. The simulation results validate that our method produces higher probability of sparsity pattern recovery and better approximation recovery compared to two state-of-the-art Lasso methods. Here and throughout, · p denotes the L p -norm (0 ≤1. Introduction p ≤ ∞). Specially, S 0 = | supp(S)|, where supp(S): =The problem of recovering unknown sparse vector S ∈ R m { j | S j = 0}, |Ω| denotes the cardinality of a finite set Ω /based on the limited noisy observations Y = AS + e arises and S j denotes the j th component in the vector S. In thein many applications, including compressed sensing [1, 2], optimization problem (1), the tuning parameter h is criticalpattern recognition [3, 4], blind source separation [5, 6], for deriving a satisfactory solution.signal reconstruction [7], and machine learning [8], where Up to date, many theoretical results have been obtainedA ∈ R n×m is referred to a measurement matrix with n < on Lasso to recover a sparse signal. The following twom and e ∈ R n is an unknown vector of noise. In this scenarios are usually of interest:paper, we suppose the positions and the signs of nonzero (1) Sparsity pattern recovery: given noisy observations Ycomponents of S are distributed uniformly at random, and of sparse signal S, how to recover the positions andtheir amplitudes follow an arbitrary distribution. We also signs of S’s nonzero entries.assume that e follows zero-mean, independent, and identi-cally distributed sub-Gaussian with parameter σ 2 . 
Recently, (2) Stable recovery: analyzing the error bound betweenmany studies have advocated the use of the parameterized Lasso solution S and true sparse vector S.quadratic programming (Lasso [9, 10], also called basispursuit [11]) to deal with the noisy sparse recovery problem About scenario (1), based on deterministic framework, Fuchs [12, 13] has provided a sufficient condition in mutualthrough minimizing the following objective function whichsimultaneously executes approximation and stable recovery incoherence form. Tropp [14] a ...
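The extracted preview cuts off before the formula that the text refers to as problem (1). Based on the surrounding description (a quadratic data-fit term plus an L1 penalty weighted by a tuning parameter h), the standard Lasso objective presumably intended here is the following; this is an assumption from context, not copied from the published article:

```latex
% Presumed form of problem (1): standard Lasso objective (assumption,
% reconstructed from the surrounding description, not from the article itself)
\min_{S \in \mathbb{R}^{m}} \;
  \frac{1}{2}\, \lVert Y - A S \rVert_{2}^{2} \; + \; h \, \lVert S \rVert_{1}
```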
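To make the two-step recipe from the abstract concrete, here is a minimal Python sketch of the generic idea: solve a Lasso problem with some tuning parameter h, then threshold the small entries of the estimate to recover the zero entries. It does not reproduce the paper's bound on h or its analytically derived threshold; the values of h and tau below are common heuristics chosen only for illustration, and all variable names are ours.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative problem sizes: n < m, so Y = A S + e is underdetermined.
n, m, k = 64, 256, 8        # observations, signal length, number of nonzeros
sigma = 0.05                # noise level (Gaussian here, a special sub-Gaussian case)

# Sparse ground truth: random support and signs, arbitrary amplitudes.
S_true = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
S_true[support] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(0.5, 1.5, size=k)

A = rng.standard_normal((n, m)) / np.sqrt(n)   # measurement matrix (~unit-norm columns)
Y = A @ S_true + sigma * rng.standard_normal(n)

# Step 1: Lasso estimate.  The paper derives an upper bound on the tuning
# parameter h; the value below is only a common noise-scaled heuristic.
h = sigma * np.sqrt(2.0 * np.log(m))
# scikit-learn's Lasso minimizes (1/(2n))*||Y - A S||_2^2 + alpha*||S||_1,
# so alpha = h/n corresponds to (1/2)*||Y - A S||_2^2 + h*||S||_1.
lasso = Lasso(alpha=h / n, fit_intercept=False, max_iter=50_000)
S_hat = lasso.fit(A, Y).coef_

# Step 2: threshold small entries of the Lasso output to recover zero entries.
# The paper derives its threshold analytically; tau here is a placeholder.
tau = 3.0 * sigma
S_ext = np.where(np.abs(S_hat) > tau, S_hat, 0.0)

recovered = set(np.flatnonzero(S_ext))
print("true support     :", sorted(support))
print("recovered support:", sorted(recovered))
print("pattern recovered:", recovered == set(support))
```

The reason for step 2 is that Lasso shrinkage typically leaves small spurious nonzero coefficients, so a separate threshold is needed to decide which entries are truly zero; the paper's contribution is choosing h (via the derived upper bound) and this threshold jointly, rather than by heuristics as in the sketch above.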

