Implicit Computational Complexity and Applications: Resource Control, Security, Real-Number Computation

NII Shonan Meeting:

@ Shonan Village Center, November 4-7, 2013

NII Shonan Meeting Report (ISSN 2186-7437):No.2013-13

Organizers

  • Dr. Akitoshi Kawamura, The University of Tokyo, Japan
  • Prof. Jean-Yves Marion, Lorraine University, France
  • Dr. David Nowak, CNRS (France) & The University of Tokyo (Japan)

Overview

Computational complexity theory aims at classifying computational problems according to their inherent difficulty. The standard way to achieve this classification consists in fixing a precise execution model (e.g., a Turing machine) and imposing explicit bounds on time and memory resources. Implicit Computational Complexity (ICC), on the other hand, aims at studying computational complexity without referring to external measuring conditions or a particular machine model, but only by considering language restrictions or logical/computational principles that imply complexity properties. The area of ICC has grown out of several proposals to use logic and formal methods to provide languages for complexity-bounded computation (e.g., polynomial-time or logarithmic-space computation). ICC methods include, among others, linear logic, typed programming languages, second-order logic, and term orderings. The last decades have seen the development of logical formalisms that characterize the functions computable in various complexity classes (polynomial or elementary time, logarithmic space).
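As an illustration of the language-restriction approach (a sketch of ours, not part of the report): in the Bellantoni–Cook characterization of polynomial time, a function's arguments are split into "normal" arguments, which may drive recursion, and "safe" arguments, which may not, and a recursive call may only occur in safe position. The following Python sketch mimics one definition by safe recursion on binary notation; the encoding and all names (`s0`, `s1`, `append_ones`) are illustrative assumptions.

```python
# Sketch of a Bellantoni-Cook-style definition by "safe recursion on
# notation". Naturals are manipulated through their binary notation;
# names and encoding are illustrative assumptions, not from the report.

def s0(a):          # binary successor: append bit 0
    return 2 * a

def s1(a):          # binary successor: append bit 1
    return 2 * a + 1

# f(x; y): x is a *normal* argument (recursion proceeds on its bits),
# y is a *safe* argument. The recursive call appears only in safe
# position (inside s1), so the output grows by at most one bit per bit
# of x -- the growth control that underlies the polytime guarantee.
def append_ones(x, y):
    if x == 0:
        return y
    return s1(append_ones(x // 2, y))

# By contrast, a scheme like e(s_i(x)) = add(e(x); e(x)) is ruled out:
# it would feed the recursive value into a normal argument position,
# which is exactly how exponential growth arises.

print(append_ones(5, 2))  # |5| = 3 bits, so three 1-bits are appended: 23
```

Here 2 (binary 10) becomes 10111 in binary, i.e., 23: one appended 1-bit per bit of the normal argument.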

The goal of the proposed meeting is to explore foundational as well as practical interconnections between formal logic and computational complexity, as is done in ICC. The main outcome of this meeting will be to trigger new interactions and enrich the various approaches. In particular, and aside from traditional ICC approaches, we would like to focus on computation involving real numbers and topological spaces, thereby providing a deeper understanding of computational complexity in non-discrete realms of mathematics. By bringing together experts in implicit complexity and in complexity in analysis, we will promote new interaction between the two fields. People in these two fields currently work separately, but there is enough common ground to make it worthwhile for the two communities to talk and work together. The meeting will also foster discussions about applications, i.e., the design of methods based on ICC that are suitable for static verification of program resource consumption and security.

Research topics for discussion on the various aspects described above include, among others, the following:

  • types for controlling complexity
  • logical systems for computational complexity
  • linear logic
  • semantics of complexity-bounded computation
  • rewriting and termination orderings
  • interpretation-based methods for implicit complexity
  • programming languages for complexity-bounded computation
  • application of implicit complexity to the analysis of resource consumption
  • application of implicit complexity to security
  • complexity over reals and non-discrete spaces
  • type-two complexity
  • resource bounds in computable analysis and algorithmic randomness
  • analog and continuous-time computation
  • theory and implementation of efficient validated numerical algorithms

We briefly describe below two topics that we would like to promote. Although they originate from two different communities, they share the same interest in implicit characterizations of complexity, and both have applications to the formal validation of computer systems.

Very recently, ICC methods have been applied to security, and conversely, security methods have been used as a new approach in the ICC context. In security proofs, the computational power of adversaries has to be limited so that only feasible attacks are considered: an adversary with unlimited computational power could break most cryptographic schemes (e.g., RSA, by factoring large integers). It is usual to rely on Cobham's thesis, which identifies feasibility with computability in polynomial time. Hence the particular interest in the class of functions computable in polynomial time and in its implicit characterization by a programming language in which adversaries can be constructed. Conversely, type systems for controlling information flow, traditionally used for certifying security policies such as confidentiality or integrity, are related to the notion of data stratification. Building on this connection, type systems for imperative programming languages have been developed to control resource consumption.
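To make the stratification idea concrete, here is a minimal sketch (our own illustration with hypothetical names, not a system from the literature) of an information-flow check over the two-level lattice Low ⊑ High: an assignment is accepted only if the level of the assigned expression is at most the level of the target variable, the basic discipline behind confidentiality-style type systems.

```python
# Minimal explicit-flow check over the two-point lattice LOW <= HIGH.
# All names here are illustrative assumptions for this sketch.
LOW, HIGH = 0, 1

def level(expr, env):
    """Security level of an expression: a variable reads its level from
    env; a constant is public; a compound expression gets the join
    (max) of the levels of its parts."""
    kind = expr[0]
    if kind == "var":
        return env[expr[1]]
    if kind == "const":
        return LOW
    # ("op", e1, e2): join of the operand levels
    return max(level(expr[1], env), level(expr[2], env))

def assign_ok(var, expr, env):
    """Accept `var := expr` only if no high data flows into a lower
    variable. (Only explicit flows are modeled; implicit flows through
    control structure are beyond this sketch.)"""
    return level(expr, env) <= env[var]

env = {"secret": HIGH, "public": LOW}
print(assign_ok("public", ("var", "secret"), env))   # High -> Low: rejected
print(assign_ok("secret", ("op", ("var", "public"), ("const", 0)), env))
```

The same lattice-and-join pattern, read as a stratification of data into levels, is what relates these type systems to resource-control disciplines in ICC.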

The other proposed focus is on computation over the reals. Computable analysis, the study of the abilities and limitations of digital computers applied to problems in mathematical analysis, originally evolved from computability theory, but there is increasing interest in computational complexity with bounded time and space. The goals here are to analyze the computational costs of algorithms for problems involving real numbers and to explore the principles and structures of computational complexity in this context, providing a foundation for validated numerical methods for problems arising in the physical sciences and engineering. Broader perspectives in computational complexity, including those from implicit complexity theory, have high potential to help here, as can already be seen, for example, in recent studies of the computational power of dynamical systems and analog computers, or in the application of type-two complexity theory to time-bounded computable analysis.
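In the standard type-two setting, a real number is presented by a function that, given a precision parameter n, returns a rational within 2⁻ⁿ of it, and complexity is measured as a function of n. The following sketch (our own illustration, assuming this Cauchy-style representation; the name `sqrt2_approx` is ours) computes such approximations to √2 by interval bisection over exact rationals.

```python
from fractions import Fraction

def sqrt2_approx(n):
    """Return a rational q with q <= sqrt(2) <= q + 2^-n, by bisecting
    the interval [1, 2] with exact rational arithmetic. The precision
    parameter n plays the role of the input size in type-two
    complexity: cost is measured as a function of n."""
    lo, hi = Fraction(1), Fraction(2)
    eps = Fraction(1, 2 ** n)
    # Invariant: lo^2 <= 2 < hi^2, so sqrt(2) stays in [lo, hi].
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo

q = sqrt2_approx(20)
print(q, float(q))   # a rational within 2^-20 of sqrt(2)
```

Validated numerical methods refine exactly this pattern: they return an approximation together with a rigorous error bound, here guaranteed by the loop invariant.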
