Abstract: We prove almost optimal hardness for MAX $k$-CSP$_R$. In MAX $k$-CSP$_R$, we are given a set of constraints, each of which depends on at most $k$ variables. Each variable can take any value from $1, 2, \dots, R$. The goal is to find an assignment to the variables that maximizes the number of satisfied constraints. We show that, for any $k \geq 2$ and $R \geq 16$, it is NP-hard to approximate MAX $k$-CSP$_R$ to within a factor of $k^{O(k)} (\log R)^{k/2} / R^{k-1}$. In the regime where $3 \leq k = o(\log R / \log\log R)$, this ratio improves upon Chan's $O(k/R^{k-2})$-factor NP-hardness of approximation of MAX $k$-CSP$_R$ (J. ACM 2016). Moreover, when $k = 2$, our result matches the best known hardness result of Khot, Kindler, Mossel and O'Donnell (SIAM J. Comput. 2007). We remark here that NP-hardness of an approximation factor of $2^{O(k)} \log(kR) / R^{k-1}$ is implicit in the (independent) work of Khot and Saket (ICALP 2015), which is better than our ratio for all $k \geq 3$. In addition to the above hardness result, by extending an algorithm for MAX 2-CSP$_R$ by Kindler, Kolla and Trevisan (SODA 2016), we provide an $\Omega(\log R / R^{k-1})$-approximation algorithm for MAX $k$-CSP$_R$. Thanks to Khot and Saket's result, this algorithm is tight up to a factor of $O(k^2)$ when $k \leq R^{O(1)}$. In comparison, when $3 \leq k$ is a constant, the previously best known algorithm achieves an $O(k/R^{k-1})$-approximation for the problem, which is a factor of $O(k \log R)$ from the inapproximability ratio, in contrast to our gap of $O(k^2)$.
Keywords: constraint satisfaction; hardness of approximation; approximation algorithm
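
As context for the problem definition in the abstract, the following Python sketch (ours, not from the paper) illustrates the MAX $k$-CSP$_R$ objective: each constraint is a predicate on at most $k$ variables over an alphabet of size $R$, and the goal is to maximize the number of satisfied constraints. The instance generator and all names here are hypothetical illustrations; a uniformly random assignment satisfies each constraint with probability (number of accepting tuples)/$R^k$, which is the trivial baseline against which the approximation ratios above are measured.

```python
# Minimal sketch of the MAX k-CSP_R objective (illustrative only, not the
# paper's algorithm). A constraint is (scope, accepting): a tuple of variable
# indices and the set of local assignments it accepts.
import random
from itertools import product


def count_satisfied(constraints, assignment):
    """Count constraints whose predicate accepts the assignment restricted to its scope."""
    return sum(
        1
        for scope, accepting in constraints
        if tuple(assignment[v] for v in scope) in accepting
    )


def random_instance(n, m, k, R, seed=0):
    """Hypothetical generator: m random k-ary constraints over n variables,
    each accepting a random nonempty subset of the R^k local tuples."""
    rng = random.Random(seed)
    constraints = []
    local_tuples = list(product(range(R), repeat=k))
    for _ in range(m):
        scope = tuple(rng.sample(range(n), k))
        accepting = set(rng.sample(local_tuples, rng.randint(1, len(local_tuples))))
        constraints.append((scope, accepting))
    return constraints


if __name__ == "__main__":
    n, m, k, R = 20, 50, 3, 16
    constraints = random_instance(n, m, k, R)
    rng = random.Random(1)
    assignment = [rng.randrange(R) for _ in range(n)]  # trivial random assignment
    print(count_satisfied(constraints, assignment), "of", m, "constraints satisfied")
```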