Kernelization


In computer science, kernelization is a technique for designing efficient algorithms that achieve their efficiency by a preprocessing stage in which the input to the algorithm is replaced by a smaller input, called a "kernel". The result of solving the problem on the kernel should either be the same as on the original input, or it should be easy to transform the output on the kernel into the desired output for the original problem.

Kernelization is often achieved by applying a set of reduction rules that cut away parts of the instance that are easy to handle. In parameterized complexity theory, it is often possible to prove that a kernel with guaranteed bounds on its size (as a function of some parameter associated with the instance) can be found in polynomial time. When this is possible, it results in a fixed-parameter tractable algorithm whose running time is the sum of the (polynomial-time) kernelization step and the (non-polynomial, but bounded by a function of the parameter) time to solve the kernel, as sketched below. Indeed, every problem that can be solved by a fixed-parameter tractable algorithm can be solved by a kernelization algorithm of this type.
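A minimal sketch of this two-stage structure in Python, assuming hypothetical problem-specific routines kernelize and solve_exactly (neither name comes from the literature):

    def fpt_via_kernelization(instance, k, kernelize, solve_exactly):
        # Stage 1: polynomial-time preprocessing. The kernel's size is
        # bounded by a function of the parameter k alone.
        kernel, reduced_k = kernelize(instance, k)
        # Stage 2: an exact solver applied to the kernel. Its running time
        # may be exponential, but only in (a function of) the parameter,
        # not in the size of the original instance.
        return solve_exactly(kernel, reduced_k)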

A standard example of a kernelization algorithm is the kernelization of the vertex cover problem due to S. Buss. In this problem, the input is an undirected graph together with a number k. The output is a set of at most k vertices that includes at least one endpoint of every edge in the graph, if such a set exists, or a failure exception if no such set exists. This problem is NP-hard. However, the following reduction rules may be used to kernelize it:


1. If k > 0 and v is a vertex of degree greater than k, remove v from the graph and decrease k by one. Any vertex cover of size at most k must contain v: otherwise, all of v's more-than-k neighbors would have to be chosen to cover the edges incident to v.

2. If v is an isolated vertex, remove it. An isolated vertex cannot cover any edge, so it cannot be part of any minimal cover.

3. If more than k² edges remain after the first two rules have been applied exhaustively, the graph cannot contain a vertex cover of size at most k: after the first rule, every remaining vertex has degree at most k, so a set of k vertices can cover at most k² edges. In this case, the algorithm reports failure.

An algorithm that applies these rules until none applies terminates with a kernel that has at most k² edges and, because each edge has at most two endpoints and no isolated vertices remain, at most 2k² vertices.
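The following Python sketch is one possible rendering of these rules together with a brute-force search on the kernel; the function names and the edge-set representation are choices made for this illustration, not part of Buss's original presentation.

    from itertools import combinations

    def buss_kernelize(edges, k):
        # Kernelize a vertex cover instance (edges, k) with Buss's rules.
        # Returns (kernel_edges, reduced_k, forced), where forced holds the
        # high-degree vertices that every small cover must contain, or
        # None if no vertex cover of size at most k can exist.
        edges = {frozenset(e) for e in edges}
        forced = set()
        changed = True
        while changed:
            changed = False
            degree = {}
            for e in edges:
                for v in e:
                    degree[v] = degree.get(v, 0) + 1
            # Rule 1: a vertex of degree > k is in every cover of size <= k.
            for v, d in degree.items():
                if d > k:
                    if k == 0:
                        return None  # an edge remains but no budget is left
                    forced.add(v)
                    edges = {e for e in edges if v not in e}
                    k -= 1
                    changed = True
                    break
            # Rule 2 (drop isolated vertices) is implicit: the instance is
            # represented by its edge set, so isolated vertices never appear.
        # Rule 3: with maximum degree <= k, a cover of size at most k can
        # cover at most k*k edges; more remaining edges means no solution.
        if len(edges) > k * k:
            return None
        return edges, k, forced

    def solve_kernel(edges, k):
        # Brute-force search for a vertex cover of size at most k. On the
        # kernel, this takes time depending only on the parameter k.
        vertices = sorted({v for e in edges for v in e})
        for size in range(k + 1):
            for candidate in combinations(vertices, size):
                chosen = set(candidate)
                if all(chosen & e for e in edges):
                    return chosen
        return None

    def vertex_cover(edges, k):
        # Kernelize first, then solve the kernel exactly.
        kernel = buss_kernelize(edges, k)
        if kernel is None:
            return None  # failure: no cover of size at most k exists
        kernel_edges, reduced_k, forced = kernel
        cover = solve_kernel(kernel_edges, reduced_k)
        if cover is None:
            return None
        return forced | cover

For example, vertex_cover([(1, 2), (2, 3), (3, 4)], 2) returns the two-vertex cover {1, 3}, while vertex_cover([(1, 2), (2, 3), (3, 4)], 1) returns None.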