# Parameterized Complexity and the Understanding, Design, and Analysis of Heuristics

**NII Shonan Meeting**

**@ Shonan Village Center, May 6-11, 2013**

**NII Shonan Meeting Report (ISSN 2186-7437): No. 2013-2**

## Organizers

- Gregory Gutin, University of London, UK
- Kazuo Iwama, Kyoto University, Japan
- Dimitrios Thilikos, National and Kapodistrian University of Athens, Hellenic Republic

## Overview

**Short Summary**

In the parameterized/multivariate framework, NP-hardness is just the beginning: a result about the null-parameterization. What follows is a rich dialogue between theory and practice, with the goals of:

- explaining the effectiveness of established heuristics, and
- designing better heuristics in mathematically disciplined ways.

The workshop will bring together researchers from both universities and industry who are interested in exploring how a multivariate view of complexity analysis and algorithm design can lead to industrially useful algorithms in new, mathematically systematic ways.

The vast majority of practical problems are NP-hard, and thus we cannot hope to obtain algorithms that compute exact or optimal solutions efficiently, at least according to the one-dimensional worst-case framework (where the single measure is n, the input size), which essentially contrasts worst-case running times that are polynomial in n with those that are exponential in n.

Recent discussions of the future of theoretical computer science have pointed to the importance of heuristic algorithms for NP-hard problems. In 2009, in his plenary talk at the FOCS Theory Day, Richard Karp posed the question "Why do heuristics work so well in practice?" as one of the most important challenges in the theory of computing. Applied computing communities have been aware of this issue for decades, and have lamented the relative mission-failure of theoretical computer science. Theoretical computer science has reciprocally lamented its lack of support, both by core funding agencies and in Computer Science curricula. There is a central dysfunction between theory and practice to which Karp points.

In real life, typical inputs to a computational problem are the outputs of other resource-constrained processes, both natural and artificial. While datasets may be large, significant (although often multifaceted) structure is almost always present and worth taking into account in algorithm design and complexity analysis.

Parameterized complexity attends to the underlying structure of real-world input distributions, as well as structure in the computational objectives, such as size of solutions, noise in the data, goodness of the desired approximation, or combinations of these (see [DF99, FG06, Nie06]). Secondary measurements (that may form an aggregate parameter) capture such crucially relevant problem structure and open up the possibility of exploiting this structure in order to design efficient worst-case algorithms, or to identify algorithmic strategies that can usefully be incorporated into practical heuristics. The central notion of parameterized complexity is a multivariate generalization of polynomial time that is termed fixed-parameter tractability (FPT), meaning solvability in time f(k) + n^c, where f is some function of the parameter k, n is the input size, and c is a constant; in other words, the input size contributes polynomially to the complexity, and any exponential costs are confined to some function of the parameter k.
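As a concrete illustration (an example added here, not part of the original report), the classic bounded-search-tree algorithm for Vertex Cover runs in roughly O(2^k · m) time: the exponential factor depends only on the parameter k, while the dependence on the input size stays polynomial, so the problem is FPT. A minimal sketch:

```python
# Illustrative sketch: Vertex Cover via a bounded search tree.
# Branching on an uncovered edge gives O(2^k * m) time: exponential
# only in the parameter k, polynomial in the input size.

def vertex_cover(edges, k):
    """True iff the graph given by `edges` has a vertex cover of size <= k."""
    if not edges:
        return True                 # all edges covered
    if k == 0:
        return False                # edges remain but the budget is spent
    u, v = edges[0]                 # one endpoint of this edge must be chosen
    return (vertex_cover([e for e in edges if u not in e], k - 1) or
            vertex_cover([e for e in edges if v not in e], k - 1))

# A triangle needs two vertices to cover all three edges:
triangle = [(1, 2), (2, 3), (1, 3)]
print(vertex_cover(triangle, 2))    # True
print(vertex_cover(triangle, 1))    # False
```

For small k this is practical even on large graphs, which is precisely the multivariate point: the budget k, not n, governs the exponential cost.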

Parameterized algorithms, both directly and indirectly, have made major contributions to practical computing in such areas as Computational Biology and (increasingly) Artificial Intelligence. The historical record is muddy. Since computing began, applications-driven implementors have of course attended to relevant secondary measurements that allowed their implementations to succeed on real-world data-sets (but without any attention to theoretical frameworks). Meanwhile, one-dimensionally framed and trained theorists have crept towards a multivariate outlook, under the rhetoric of exact algorithmics.

Although the historical record is muddied, it is essentially the case that the modern computational revolutions in Biology and Medicine would not have happened without parameterized / multivariate algorithmics. Explicit mathematical formulation of the issues adds power to the algorithms-engineering opportunities, not only in these application domains, but wherever NP-hardness of central computational tasks has been identified and heuristics are currently the main game.

Real-life input distributions have significant structure, but what is it, and how should it be explicitly declared in realistic multivariate complexity bookkeeping? The multivariate approach to complexity analysis and algorithm design often demands significant dialogue between theoretical results and practical computing experience, including the adaptation of FPT algorithms and mathematically-centered algorithmic strategies into practical-computing-targeted heuristics:

- It is well-known that the input of every FPT parameterized problem can be pre-processed, in polynomial time, such that the reduced instance (losing no information) has size bounded (only) by a function of the parameter (for many NP-hard problems, a polynomial function). Such kernelization subroutines can be highly useful in practical computing, regardless of how the reduced instances are solved. For the first time: a useful mathematical theory of pre-processing.
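As one hedged illustration of such a pre-processing subroutine (an example added here, not taken from the report), the classic Buss kernel for Vertex Cover: any vertex of degree greater than k must belong to every cover of size at most k, and once no such vertex remains, a yes-instance can have at most k^2 edges.

```python
# Illustrative sketch: Buss kernelization for Vertex Cover.
# Reduces (edges, k) in polynomial time to an equivalent instance
# with at most k*k edges, or decides it is a no-instance outright.

def buss_kernel(edges, k):
    """Return a reduced (edges, k) pair, or None if provably a no-instance."""
    changed = True
    while changed:
        changed = False
        deg = {}
        for u, v in edges:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        # Rule: a vertex of degree > k is in every cover of size <= k.
        high = [v for v, d in deg.items() if d > k]
        if high:
            h = high[0]
            edges = [e for e in edges if h not in e]
            k -= 1
            changed = True
            if k < 0:
                return None
    # Max degree is now <= k, so a yes-instance has at most k*k edges.
    if len(edges) > k * k:
        return None
    return edges, k
```

The reduced instance can then be handed to any exact solver or heuristic, which is exactly why kernelization is useful regardless of how the kernel is ultimately solved.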

- Parameterization also offers, for the first time, a useful theory of the complexity of local search subroutines.
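For instance (an illustration added here, not from the report), parameterized local search studies subroutines like the following k-flip neighborhood scan for Max Cut; the parameterized question is when such a neighborhood can be searched faster than the brute-force enumeration of all subsets of at most k vertices sketched below.

```python
# Illustrative sketch: brute-force k-flip local search step for Max Cut.
# `side` maps each vertex to 0 or 1; an improving move flips at most
# k vertices. Parameterized local search asks when this neighborhood
# can be searched faster than trying all O(n^k) subsets.

from itertools import combinations

def cut_value(edges, side):
    """Total weight of edges crossing the partition `side`."""
    return sum(w for u, v, w in edges if side[u] != side[v])

def k_flip_improvement(edges, side, k):
    """Return an improved assignment reachable by <= k flips, or None."""
    base = cut_value(edges, side)
    vertices = list(side)
    for r in range(1, k + 1):
        for flip in combinations(vertices, r):
            candidate = dict(side)
            for v in flip:
                candidate[v] = 1 - candidate[v]
            if cut_value(edges, candidate) > base:
                return candidate
    return None
```

In a local-search heuristic this subroutine would be called repeatedly until no improving k-flip exists; the parameterized analysis concerns the per-call cost as a function of k.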

- FPT subroutines are also useful in the “genetic recombination” phase of hybrid (e.g., memetic) meta-heuristics.

- Algorithmic strategies for FPT algorithms, such as iterative compression and greedy localization, can be systematically adapted into theoretically organized classes of heuristics (see [Karp11]) that are competitive or superior to previous efforts, or in some cases, exactly the same: the successful heuristic was simply an unrecognized FPT algorithm, according to an unrecognized but relevant parameter (see examples in [DF99]).
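As one hedged sketch of how such a strategy looks in code (an example added here, not from the report), iterative compression for Vertex Cover maintains a cover of size at most k+1 while vertices are added one by one, and repeatedly compresses it back to size k by guessing which part of the current cover to keep:

```python
# Illustrative sketch: iterative compression for Vertex Cover.
# The compression step tries every split of the size-(k+1) cover into
# a kept part and a discarded part; discarded vertices force their
# uncovered neighbors into the new cover.

from itertools import combinations

def compress(edges, cover, k):
    """Given a vertex cover `cover` of size k+1, find one of size <= k, or None."""
    C = list(cover)
    for r in range(len(C) + 1):
        for keep in map(set, combinations(C, r)):
            discard = set(C) - keep
            new = set(keep)
            ok = True
            for u, v in edges:
                if u in discard and v in discard:
                    ok = False          # this edge would stay uncovered
                    break
                if u in discard:
                    new.add(v)          # forced: the other endpoint joins
                elif v in discard:
                    new.add(u)
            if ok and len(new) <= k:
                return new
    return None

def vc_iterative_compression(vertices, edges, k):
    """Vertex cover of size <= k via iterative compression, or None."""
    cover, seen = set(), set()
    for x in vertices:
        seen.add(x)
        sub = [e for e in edges if e[0] in seen and e[1] in seen]
        cover = cover | {x}             # still covers `sub`, size <= k+1
        if len(cover) > k:
            cover = compress(sub, cover, k)
            if cover is None:
                return None
    return cover
```

The compression step, with its 2^(k+1) guesses, is exactly the kind of FPT subroutine that can also be truncated or randomized to obtain a heuristic in the spirit described above.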

- The mathematical determination that a parameterized problem is FPT often involves estimates concerning mathematical relationships that are far from the plausible but unproven truth of the matter, or worst-case assumptions about unparameterized aspects of the input that are far from typical. Systematic algorithm-engineering compromises about such issues can lead to novel and effective heuristics derived from FPT algorithms (see [TM09] for a pioneering example).

- The central innovation of the parameterized approach to algorithms and complexity is, of course, simply the explicitly multivariate view of complexity analysis and algorithm design. Beyond that, at the core of the multivariate algorithmics project, there is a richer vision of the workflow between theory and practice. In the classical framework of P versus NP, a problem is NP-hard, and that is almost the end of the story. As Karp wrote in [Karp11]:

> Except in rare cases, the problems are NP-hard, and the performance guarantees provided by polynomial-time approximation algorithms are far too pessimistic to be useful.

In the multivariate workflow there are two complementary creative principles:

**Deconstruction of hardness results.** Most hardness proofs in both classical and parameterized complexity are unreasonable. Read the proofs and ask why the constructions are unreasonable for real-world inputs. This often points to realistic FPT parameterizations.

**Model enrichment.** When FPT results are obtained, it is often productive to enrich the definition of the problem, in order to obtain better traction in real-world applications. Parameters can be complex aggregates that map onto problem realism.

Because a single classical problem can be parameterized in an unlimited number of ways, this rich mathematical dialogue exposes many opportunities for fresh approaches in the design of practical algorithms and heuristics. The goal of the Workshop is to explore, systematize, and programmatically articulate these opportunities.

**References**

[DF99] Downey, R.G. and Fellows, M.R.: Parameterized Complexity, Springer, 1999.

[FG06] Flum, J. and Grohe, M.: Parameterized Complexity Theory, Springer, 2006.

[Karp11] Karp, R.: “Heuristic algorithms in computational molecular biology,” J. Computer and System Sciences 77 (2011), 122-128.

[Nie06] Niedermeier, R.: Invitation to Fixed-Parameter Algorithms, Oxford University Press, 2006.

[TM09] Tazari, S. and Müller-Hannemann, M.: “Dealing with Large Hidden Constants: Engineering a Planar Steiner Tree PTAS,” proceedings of ALENEX 2009, pp. 120-131.