We prove that a random linear code over $\mathbb{F}_q$, with probability arbitrarily close to 1, is list decodable at radius $1 - \frac{1}{q} - \epsilon$ with list size $L = O(1/\epsilon^2)$ and rate $R = \Omega_q\!\left(\epsilon^2 / \log^3(1/\epsilon)\right)$. Up to the polylogarithmic factor in $1/\epsilon$ and constant factors depending on $q$, this matches the lower bound $L = \Omega_q(1/\epsilon^2)$ for the list size and the upper bound $R = O_q(\epsilon^2)$ for the rate. Previously, only the existence (and not the abundance) of such codes was known for the special case $q = 2$ (Guruswami, Håstad, Sudan, and Zuckerman, 2002).
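For completeness, the standard notion of list decodability underlying these parameters (not spelled out above) can be stated as follows; the Hamming-distance notation $d_H$ below is generic and not taken from this abstract:
\[
  C \subseteq \mathbb{F}_q^n \ \text{is $(\rho, L)$-list decodable} \iff
  \forall\, y \in \mathbb{F}_q^n:\ \bigl|\{\, c \in C : d_H(y, c) \le \rho n \,\}\bigr| \le L .
\]
In the result above, $\rho = 1 - \frac{1}{q} - \epsilon$ and $L = O(1/\epsilon^2)$.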
In order to obtain our result, we employ a relaxed version of the well-known Johnson bound on list decoding that translates the average Hamming distance between codewords into list decoding guarantees. We further prove that the desired average-distance guarantees hold for a code provided that a natural complex matrix encoding the codewords satisfies the Restricted Isometry Property with respect to the Euclidean norm (RIP-2). For the case of random binary linear codes, this matrix coincides with a random submatrix of the Hadamard-Walsh transform matrix that is well studied in the compressed sensing literature.
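For reference, the standard form of RIP-2 as used in the compressed sensing literature reads as follows; the matrix $\Phi$ and constant $\delta_k$ are generic notation, not taken from this abstract:
\[
  (1 - \delta_k)\,\|x\|_2^2 \;\le\; \|\Phi x\|_2^2 \;\le\; (1 + \delta_k)\,\|x\|_2^2
  \qquad \text{for every $k$-sparse } x \in \mathbb{C}^N,
\]
where $\Phi \in \mathbb{C}^{m \times N}$ and $\delta_k \in (0,1)$ is the restricted isometry constant of order $k$.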
Finally, we improve the analysis of Rudelson and Vershynin (2008) on the number of random frequency samples required for exact reconstruction of $k$-sparse signals of length $N$. Specifically, we improve the number of samples from $O\!\left(k \log(N) \cdot \log^2(k) \cdot (\log k + \log\log N)\right)$ to $O\!\left(k \log(N) \cdot \log^3(k)\right)$. The proof involves bounding the expected supremum of a related Gaussian process by using an improved analysis of the metric defined by the process. This improvement is crucial for our application in list decoding.
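To see the improvement at a glance, the two sample-complexity bounds differ only in their final factor (constants suppressed):
\[
  \underbrace{O\bigl(k \log(N) \cdot \log^2(k) \cdot (\log k + \log\log N)\bigr)}_{\text{Rudelson--Vershynin (2008)}}
  \;\longrightarrow\;
  \underbrace{O\bigl(k \log(N) \cdot \log^3(k)\bigr)}_{\text{this work}} .
\]
Since $\log k + \log\log N \ge \log k$, the new bound is never worse up to constants, and it is asymptotically smaller precisely when $\log\log N = \omega(\log k)$, i.e., when $\log N$ grows faster than every fixed power of $k$.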