Abstract: We present approximation and exact algorithms for piecewise regression of univariate and bivariate data using fixed-degree polynomials. Specifically, given a set S of n data points (x₁, y₁), …, (x_n, y_n) ∈ ℝ^d × ℝ where d ∈ {1, 2}, the goal is to partition the x_i's into some (arbitrary) number of disjoint pieces P₁, …, P_k, where each piece P_j is associated with a fixed-degree polynomial f_j: ℝ^d → ℝ, so as to minimize the total loss λk + ∑_{i=1}ⁿ (y_i − f(x_i))², where λ ≥ 0 is a regularization term that penalizes model complexity (the number of pieces) and f: ⨆_{j=1}^k P_j → ℝ is the piecewise polynomial function defined by f|_{P_j} = f_j. The pieces P₁, …, P_k are disjoint intervals of ℝ in the univariate case and disjoint axis-aligned rectangles in the bivariate case. Our algorithms allow each piece to be fit with any fixed-degree polynomial, not just a linear function.

Our main results are the following. For univariate data, we present a (1 + ε)-approximation algorithm with time complexity O((n/ε) log(1/ε)), assuming the data is given in sorted order of the x_i's. For bivariate data, we present three results: a sub-exponential exact algorithm with running time n^{O(√n)}; a polynomial-time constant-factor approximation algorithm; and a quasi-polynomial-time approximation scheme (QPTAS). The bivariate case is believed to be NP-hard as folklore, but we could not find a published record of this in the literature, so we also present a hardness proof for completeness.
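To make the objective concrete, the following is a minimal sketch of the textbook exact dynamic program for the univariate case, which evaluates the loss λk + ∑_{i=1}ⁿ (y_i − f(x_i))² over all interval segmentations of sorted data. This is not the paper's (1 + ε)-approximation algorithm, only an illustration of the objective; the function names (piecewise_fit, seg_cost) and the use of numpy.polyfit for the per-piece least-squares fit are our own illustrative assumptions.

```python
import numpy as np

def piecewise_fit(x, y, lam, degree=1):
    """Exact DP over data sorted by x: returns (optimal loss, piece boundaries).

    Minimizes  lam * k + sum_i (y_i - f(x_i))^2,  where each of the k pieces
    is an interval fit with the best polynomial of the given fixed degree.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)

    def seg_cost(i, j):
        # Sum of squared residuals of the best degree-`degree` polynomial
        # (least squares) on points i..j-1.
        xs, ys = x[i:j], y[i:j]
        if j - i <= degree:  # underdetermined: some polynomial interpolates exactly
            return 0.0
        coeffs = np.polyfit(xs, ys, degree)
        resid = ys - np.polyval(coeffs, xs)
        return float(resid @ resid)

    # opt[j] = minimum loss over the first j points; back[j] = start of the last piece.
    opt = [0.0] + [np.inf] * n
    back = [0] * (n + 1)
    for j in range(1, n + 1):
        for i in range(j):
            c = opt[i] + lam + seg_cost(i, j)  # pay lam for each piece opened
            if c < opt[j]:
                opt[j], back[j] = c, i

    # Recover the piece boundaries (as half-open index ranges) via back pointers.
    pieces, j = [], n
    while j > 0:
        pieces.append((back[j], j))
        j = back[j]
    return opt[n], pieces[::-1]
```

For example, piecewise_fit(x, y, lam=1.0, degree=1) returns an optimal segmentation into intervals with one linear fit per piece; increasing lam trades fit quality for fewer pieces. This baseline uses O(n²) segment-cost evaluations, which is what the paper's (1 + ε)-approximation improves upon.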