Daniel Lokshtanov

Generalization and Specialization of Kernelization

DATE: Saturday, September 3, 2011

ABSTRACT

Research on kernelization is commonly regarded as a theory of efficient preprocessing (and of data reduction in general). In his talk at WorKer 2011, Daniel Lokshtanov argued that the current definition of "kernel" does not entirely live up to this interpretation. A case in point is the trouble one runs into when combining kernelization with heuristics: a classical kernel only guarantees that the reduced instance is equivalent to the original, so a good but suboptimal solution found on the kernel need not translate back into a comparably good solution of the original instance. To address this issue, he characterized a class of kernelization rules that preserve approximate solutions. Switching from the analysis of individual examples to a more systematic perspective, he then identified general conditions that guarantee the existence of such kernels.
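To make the idea of an approximation-preserving reduction rule concrete, here is a minimal sketch (not taken from the talk or its slides) of one classic example: the degree-1 rule for Vertex Cover, together with the solution-lifting step that the preservation argument relies on. Deleting a degree-1 vertex and its neighbour lowers the optimum by exactly one, and the lifted cover grows by exactly one, so any c-approximate cover of the reduced graph lifts to a c-approximate cover of the original (for c ≥ 1). The function names and graph representation below are illustrative choices, not part of the talk.

```python
def reduce_degree_one(adj):
    """Apply the degree-1 rule for Vertex Cover exhaustively.

    adj: dict mapping each vertex to a set of neighbours (simple undirected graph).
    Returns (reduced_adj, forced), where `forced` lists vertices that must be
    added back to any cover of the reduced graph when lifting a solution.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}
    forced = []
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if v in adj and len(adj[v]) == 1:
                u = next(iter(adj[v]))   # v's unique neighbour; some optimal cover contains u
                forced.append(u)         # u goes into the cover
                for w in adj[u]:         # delete u and all edges incident to it
                    adj[w].discard(u)
                del adj[u]
                if v in adj:
                    del adj[v]           # v is now isolated; drop it as well
                changed = True
                break
    # dropping isolated vertices is trivially approximation-preserving
    adj = {v: nbrs for v, nbrs in adj.items() if nbrs}
    return adj, forced


def lift(cover_of_reduced, forced):
    """Turn a cover of the reduced graph into a cover of the original graph."""
    return set(cover_of_reduced) | set(forced)


if __name__ == "__main__":
    # a small path 1-2-3-4: the rule peels the instance down completely
    g = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
    reduced, forced = reduce_degree_one(g)
    print(reduced)               # {} -- nothing left to cover
    print(lift(set(), forced))   # {2, 4} -- a valid (here even optimal) cover of the path
```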

Given at WorKer 2011 – The Third Workshop on Kernelization (slides are linked there).
