Smoothed complexity theory


Bläser, Markus and Manthey, Bodo (2012) Smoothed complexity theory. In: 37th International Symposium on Mathematical Foundations of Computer Science, MFCS 2012, 27-31 August 2012, Bratislava, Slovakia.

PDF (346 kB): Restricted to UT campus only
Abstract:Smoothed analysis is a new way of analyzing algorithms introduced by Spielman and Teng (J. ACM, 2004). Classical methods like worst-case or average-case analysis have accompanying complexity classes, like P and Avg-P, respectively. While worst-case or average-case analysis gives us a means to talk about the running time of a particular algorithm, complexity classes allow us to talk about the inherent difficulty of problems. Smoothed analysis is a hybrid of worst-case and average-case analysis and compensates for some of their drawbacks. Despite its success for the analysis of single algorithms and problems, there is no embedding of smoothed analysis into computational complexity theory, which is necessary to classify problems according to their intrinsic difficulty. We propose a framework for smoothed complexity theory, define the relevant classes, and prove some first results.
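
For context, and not as part of the repository record: the smoothed measure of Spielman and Teng is commonly written as a worst case over inputs of an expected running time under a small random perturbation. The sketch below uses illustrative placeholder notation (a running time $T$ and a generic perturbation model $\mathrm{pert}_\sigma$), not notation taken from the paper.

% Sketch of the Spielman-Teng smoothed measure (standard formulation,
% with placeholder notation): the adversary fixes the input, and the
% expectation is over a random perturbation of magnitude sigma.
\[
  T_{\mathrm{smooth}}(n,\sigma)
    \;=\; \max_{|x| = n} \;
      \operatorname*{\mathbb{E}}_{y \sim \mathrm{pert}_\sigma(x)} \bigl[ T(y) \bigr]
\]

Here taking $\sigma \to 0$ recovers worst-case analysis, while a perturbation that dominates the input approaches an average-case measure, which is the sense in which smoothed analysis is a hybrid of the two.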
Item Type:Conference or Workshop Item
Copyright:© 2012 Springer
Faculty:Electrical Engineering, Mathematics and Computer Science (EEMCS)
Research Group:
Link to this item:http://purl.utwente.nl/publications/80994
Official URL:http://dx.doi.org/10.1007/978-3-642-32589-2_20