## Smoothed analysis: an attempt to explain the behavior of algorithms in practice (2009)

Venue: Commun. ACM

Citations: 12 (0 self)

### BibTeX

```bibtex
@article{Spielman09smoothedanalysis,
  author  = {Daniel A. Spielman and Shang-hua Teng},
  title   = {Smoothed analysis: an attempt to explain the behavior of algorithms in practice},
  journal = {Commun. ACM},
  year    = {2009}
}
```

### Abstract

Many algorithms and heuristics work well on real data, despite having poor complexity under the standard worst-case measure. Smoothed analysis [36] is a step towards a theory that explains the behavior of algorithms in practice. It is based on the assumption that inputs to algorithms are subject to random perturbation and modification in their formation. A concrete example of such a smoothed analysis is a proof that the simplex algorithm for linear programming usually runs in polynomial time when its input is subject to modeling or measurement noise.

#### 1. MODELING REAL DATA

> "My experiences also strongly confirmed my previous opinion that the best theory is inspired by practice and the best practice is inspired by theory." [Donald E. Knuth: "Theory and Practice", Theoretical Computer Science, 90 (1), 1–15, 1991.]

Algorithms are high-level descriptions of how computational tasks are performed. Engineers and experimentalists design and implement algorithms, and generally consider them a success if they work in practice. However, an algorithm that works well in one practical domain might perform poorly in another. Theorists also design and analyze algorithms, with the goal of providing provable guarantees about their performance. The traditional goal of theoretical computer science is to prove that an algorithm performs well …
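The core idea of smoothed analysis, that an adversarial input loses its worst-case structure once it is randomly perturbed, can be illustrated empirically. The sketch below is a minimal, hypothetical demo (not from the paper, which analyzes the simplex algorithm): it uses a first-element-pivot quicksort, whose worst case is already-sorted input, and compares the comparison count on the adversarial input against the same input after Gaussian perturbation. The choice of algorithm, problem size `n`, and noise level `sigma` are assumptions for illustration only.

```python
import random

def quicksort_comparisons(a):
    """Count comparisons made by a naive first-element-pivot quicksort."""
    count = 0
    def qs(lst):
        nonlocal count
        if len(lst) <= 1:
            return lst
        pivot, rest = lst[0], lst[1:]
        count += len(rest)  # one comparison against the pivot per element
        left = [x for x in rest if x < pivot]
        right = [x for x in rest if x >= pivot]
        return qs(left) + [pivot] + qs(right)
    qs(a)
    return count

n = 500
sigma = 0.5  # noise magnitude relative to the input range (an assumption)
random.seed(0)

worst = list(range(n))  # sorted input: worst case for this pivot rule
perturbed = [x + random.gauss(0, sigma * n) for x in worst]

# The adversarial input costs n*(n-1)/2 comparisons; after perturbation the
# input behaves like a typical one and the cost drops to roughly n log n.
print(quicksort_comparisons(worst))
print(quicksort_comparisons(perturbed))
```

A smoothed-analysis statement formalizes exactly this gap: the running time is measured as the worst case over inputs of the *expected* cost after perturbation, which for many practical algorithms is polynomial even when the unperturbed worst case is not.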