Sparsity Adaptive Matching Pursuit (SAMP) is a greedy pursuit algorithm for practical compressed sensing applications. Its most innovative feature is that it can blindly recover a sparse signal without prior knowledge of its sparsity, removing a common limitation of most existing greedy pursuit algorithms. SAMP adopts a flavor similar to the EM algorithm, alternately estimating the sparsity and the support set of the signal. Theoretically, SAMP is shown to offer performance comparable to that of l1 minimization, while in simulation it outperforms many other existing greedy pursuit algorithms, especially on compressible signals, making it a promising sparse recovery algorithm for large-scale, real-time applications.
The SAMP algorithm is analogous to a recruiting process:

The simplest way to explain how SAMP works is through the analogy of a recruiting process. The CEO of a company plans to recruit a group of the brightest engineers at JHU to build a next-generation computer model. He does not know how many engineers are sufficient to accomplish this task, so he takes a build-up approach. He uses the following strategy to recruit the first top 10 students. First, he chooses the 20 students with the best resumes as an initial candidate list. Then he asks these 20 candidates to take a test and keeps the 10 with the highest scores. He knows that selecting a student based on his resume alone can falsely exclude promising students who don't have excellent resumes. Thus, he keeps refining the top-10 list: he selects another 10 candidates with good resumes, lets them retake the test together with the current top 10, and again keeps the 10 with the highest scores. He repeats this refinement many times until the average test score starts decreasing. He then discusses with the 10 finalists whether they are able to build the next-generation computer model. If not, he decides to increase the size of the group to 20 students. He again chooses 20 candidates with good resumes, lets them take the test together with the current 10 finalists, and keeps the 20 with the highest scores. He then refines the list of 20 by choosing another 20 candidates and letting them retake the test, again until the average test score starts decreasing. He keeps increasing the size of the group until it can build the next-generation computer model.
Remarkable points of this recruiting algorithm: It is composed of several stages. The recruiting process must be separated into stages because the CEO does not know how many engineers are sufficient to build the computer model. To save the budget, he prefers to find the smallest group of the best engineers who can accomplish the task. Thus, he starts with a small group and then expands it until it is just large enough to do the job. At each stage, the number of students being chosen is fixed and the finalist list is refined iteratively. The purpose of this refinement is to make sure that no promising student is missed through bad luck (not having the best resume, or occasionally doing poorly on the test for some incidental reason) and that no weak student is wrongly selected through good luck. In addition, the CEO uses two different criteria to build a finalist list at each stage: the resume (Preliminary Test) and the test score (Final Test). A student's resume carries information about his local competitiveness, while his test score provides a relatively global comparison against the other students. Thus, although the CEO uses a build-up approach to recruit the group, he also exploits a top-down approach to refine the finalist list at each stage. The figure below depicts the connection of the operation blocks of SAMP at each stage.
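The analogy maps directly onto SAMP's operation blocks: the resume screen is the preliminary test (correlating the dictionary columns with the current residual), the exam is the final test (least squares over the merged candidate list, keeping the largest coefficients), and enlarging the group corresponds to starting a new stage with a bigger support size once the residual stops shrinking. Below is a minimal NumPy sketch under these assumptions; the function name `samp`, the `step` parameter, and the toy problem sizes are illustrative, not taken from the paper.

```python
import numpy as np

def samp(y, Phi, step=1, tol=1e-6, max_iter=100):
    """Sparsity Adaptive Matching Pursuit (sketch): recover a sparse x
    from y = Phi @ x without knowing the sparsity level in advance."""
    m, n = Phi.shape
    residual = y.copy()
    support = np.array([], dtype=int)
    L = step  # current stage size ("group size" in the analogy)
    for _ in range(max_iter):
        # Preliminary test (resume screen): the L columns most
        # correlated with the current residual join the candidate list.
        corr = np.abs(Phi.T @ residual)
        candidates = np.union1d(support, np.argsort(corr)[-L:])
        # Final test (exam): least squares over the merged candidates,
        # then keep the L largest coefficients as the new finalists.
        x_cand, *_ = np.linalg.lstsq(Phi[:, candidates], y, rcond=None)
        best = candidates[np.argsort(np.abs(x_cand))[-L:]]
        x_best, *_ = np.linalg.lstsq(Phi[:, best], y, rcond=None)
        r_new = y - Phi[:, best] @ x_best
        if np.linalg.norm(r_new) < tol:      # the group can do the job
            support, residual = best, r_new
            break
        if np.linalg.norm(r_new) >= np.linalg.norm(residual):
            L += step                        # scores stopped improving: new stage
        else:
            support, residual = best, r_new  # refine within this stage
    x_hat = np.zeros(n)
    if support.size:
        x_hat[support], *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    return x_hat

# Toy recovery: a 5-sparse signal in R^100 from 50 random measurements.
rng = np.random.default_rng(0)
m, n, k = 50, 100, 5
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = Phi @ x
x_hat = samp(y, Phi)
```

Note that nowhere does `samp` receive the true sparsity `k`; the stage size `L` grows by `step` whenever refinement stalls, which is exactly the CEO expanding the group when the current finalists cannot finish the job.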
1. Thong T. Do, Lu Gan, Nam Nguyen, and Trac D. Tran, "Sparsity adaptive matching pursuit algorithm for practical compressed sensing," Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, California, October 2008. [Nominated for the Student Paper Contest at the Asilomar conference.]