NO.047 Algorithmic Randomness and Complexity

Shonan Village Center

September 8 - 12, 2014 (Check-in: September 7, 2014)

Organizers

  • Rodney Downey
    • Victoria University, Wellington, New Zealand
  • Kenshi Miyabe
    • University of Tokyo, Japan
  • André Nies
    • University of Auckland, New Zealand
  • Osamu Watanabe
    • Tokyo Institute of Technology, Japan

Overview

Information and randomness are fundamental notions of computer science. Their practical applications are ever increasing. For instance, a recent issue of Communications of the ACM [10] describes an application of (approximations of) Kolmogorov complexity to voice recognition. Kolmogorov complexity can be used to measure information distance (see [11]) in a natural and context-free way, with applications in areas as diverse as computational biology, music evolution (e.g. [4]), data mining, and at least 20 other areas. Classical information theory and probability provide formalizations, but do not allow us to speak of the information content of a specific string, or to say that a particular infinite binary sequence is random. These notions were formalized in the 1960s with the introduction of Kolmogorov complexity and Martin-Löf randomness. The first measures the complexity (or information content) of finite binary strings, while the second captures the intuition that a random infinite binary sequence has no “distinguishing features”. There are also characterizations of Martin-Löf randomness that capture the alternative intuitions that a random sequence is unpredictable and incompressible, the latter connecting Martin-Löf randomness to Kolmogorov complexity.
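Kolmogorov complexity itself is uncomputable, so applications such as [10, 11] replace it in practice by the length of a compressed encoding. A minimal sketch of this idea is the normalized compression distance; here we approximate it with Python's zlib (the choice of compressor and the test strings are illustrative, not from the cited papers):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: approximates the (uncomputable)
    information distance by replacing Kolmogorov complexity with the
    length of a zlib-compressed encoding."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar strings compress well together, so their distance is small;
# unrelated data gives a distance close to 1.
a = b"the quick brown fox jumps over the lazy dog" * 20
b = b"the quick brown fox jumps over the lazy cat" * 20
c = bytes(range(256)) * 4  # unrelated byte pattern
print(ncd(a, b), ncd(a, c))
```

With a real compressor the distance is only an approximation, and can even fall slightly outside [0, 1], but the qualitative behaviour (related inputs score lower than unrelated ones) is what the applications exploit.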

In recent years, researchers in computability theory have produced a surge of activity that has resulted in rapid progress in our understanding of even the most basic notions, and in the solution of longstanding open questions. This progress has changed, and is still changing, the landscape, and has opened up new avenues of research. Evidence of this activity includes the publication of two new books in the area and a new edition of an already classical one.

  1. Algorithmic Randomness and Complexity, R. Downey and D. Hirschfeldt, Theory and Applications of Computability, Springer, 2010.
  2. Computability and Randomness, A. Nies, Oxford University Press, 2009.
  3. An Introduction to Kolmogorov Complexity and Its Applications, M. Li and P. Vitanyi. Third Edition, Springer Verlag, 2008.

Additionally, two of the proposers have been invited to speak on algorithmic randomness at the International Congress of Mathematicians, and a number of special issues of computer science journals have been devoted to this area.

Research on the notions of information and randomness has drawn on methods and ideas from computability theory and computational complexity, as well as from core mathematical subjects like measure theory and information theory. The proposed list of invitees is balanced with respect to the various fields and reflects our aim to bring together the leading researchers from these areas. A major goal of the seminar is to bring together researchers to discuss the current open problems.

Specific topics

Algorithmic randomness and computable analysis. The areas of analysis and randomness are closely connected because the measure of a set (a formal notion corresponding to our intuition of size or volume) is fundamental to both. In classical analysis, results, for instance on the differentiability of a function at a real number, often hold only up to an exceptional set of measure zero. A series of recent results shows that in computable analysis such exceptional sets are often “effectively” null, and that randomness notions can be matched with function classes satisfying an effectiveness condition [3, 8]. An interesting emerging direction is the case where both the randomness notion and the function class are defined in terms of feasible computation; for instance, researchers working in Japan such as Kawamura and Miyabe have recently begun to study this interaction.

The research communities in computable analysis and in effective randomness have only recently begun to work together. The results have been impressive; see, for example, Brattka, Miller and Nies [3] and Bienvenu, Hölzl, Miller and Nies [1]. A major aim of this seminar is to draw these communities together.

A natural application of these ideas is to ergodic theory, which studies the long-term behaviour of iterated transformations, and where levels of randomness determine which points satisfy ergodic-theoretic phenomena. For example, Hochman and Meyerovitch [5] showed that the entropies of subshifts of finite type over Z^d for d ≥ 2 are exactly the non-negative upper semi-computable (right recursively enumerable) reals.

Generic computation. In computational complexity theory, Gurevich and Levin independently introduced the idea of average-case complexity: one puts a probability measure on the instances of a problem and averages the time complexity over all instances. For example, an important result of Blass and Gurevich states that the Bounded Product Problem for the modular group PSL(2, Z) is NP-complete but has polynomial-time average-case complexity. The apparatus of average-case complexity is difficult to apply, and it is natural to seek a more easily applied, yet still natural, technique for classifying complexity. This was the motivation for some very interesting recent work of Kapovich, Myasnikov, and Schupp, who introduced (see [6, 7]) the class of generic algorithms. These are partial computable functions φ for which there is an algorithm M such that

  1. If M ever halts on input σ, then M outputs the correct value φ(σ).
  2. The Borel density of the set S = {σ | M halts on input σ} is 1.

Recall that Borel density is widely applied in number theory; the Borel density of a set S of strings is the limit (if it exists) of the number of strings in S of length at most n divided by the total number of strings of length at most n. It turns out that there is a huge class of group-theoretic algorithms that are generically computable in linear time, where the set of terminating inputs has fast-converging Borel density. For example, this holds for the word problem for finitely presented groups. The general theory of generic computation is wide open, with only a few papers so far (notably by Downey, Jockusch and Schupp, and by Igusa), and almost nothing is known about the generic complexity of problems outside the setting of group theory. This might provide an alternative explanation of the behaviour of many algorithms in practice, beyond approaches like smoothed analysis. It certainly merits investigation.
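The density ratio in the definition above is easy to examine experimentally for small lengths. A minimal sketch over the binary alphabet (the function names are illustrative, and the brute-force enumeration is only feasible for short strings):

```python
from itertools import product

def borel_density_ratio(in_S, n: int, alphabet: str = "01") -> float:
    """Ratio |{s in S : |s| <= n}| / |{s : |s| <= n}|; the Borel density
    of S is the limit of this ratio as n grows, when that limit exists."""
    total = 0
    hits = 0
    for length in range(n + 1):
        for tup in product(alphabet, repeat=length):
            sigma = "".join(tup)
            total += 1
            hits += in_S(sigma)  # True counts as 1
    return hits / total

# Example: strings beginning with '0' should have Borel density 1/2.
starts_with_zero = lambda s: s.startswith("0")
print(borel_density_ratio(starts_with_zero, 12))  # close to 0.5
```

A generic algorithm in the sense above is then one whose halting set has density ratio converging to 1.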

Derandomization and complexity hierarchies. Derandomization is the study of how to replace probabilistic algorithms with deterministic ones. Earlier work by Allender et al. showed that the techniques of derandomization can be viewed through the lens of resource-bounded Kolmogorov complexity, and gave significant applications. More recently, they proved that every sufficiently dense set in NP ∩ coNP contains strings of low resource-bounded Kolmogorov complexity at every length. By recent work of Slaman and his student Bayer (arising from interactions of Fortnow, Allender and Slaman at Dagstuhl Seminar 12021), all sufficiently generic reals are low for speed under plausible assumptions, where a real is low for speed if giving algorithms access to it leaves the usual complexity classes of computable sets unchanged; it is conjectured that this result extends to random reals.

Resource bounded versions. Classical computational complexity theory comes into play in defining resource-bounded versions of Kolmogorov complexity, measure, and dimension. This has led to new characterizations of complexity classes in terms of efficient reducibility to the set of Kolmogorov random strings. Resource-bounded measure and dimension have been used to gain understanding of the properties of complexity classes and their complete sets; for instance, they can be used as a probabilistic method to prove lower bounds on nonuniform complexity.

Constrained versions of randomness, such as betting strategies that are integer-valued (which is what is “really allowed” in a casino), have recently been suggested by Bienvenu, Stephan and Teutsch [2], with deep subsequent developments by Dougherty et al.; their relationship with other randomness notions is unclear. Even the simplest notion of randomness, randomness relative to automatic processes, has deep interactions with number theory: it is equivalent to Borel normality in the given base. In soon-to-be-published work, three teams (Mayordomo and Lutz, Figueira and Nies, Becher and Slaman) have constructed absolutely normal numbers, i.e., numbers that are normal to every base, in nearly quadratic time, and it is very unclear whether this bound can be improved.

For applications, resource-bounded versions are necessary, and again it is not clear which works best in practice. For example, a recent program of Zenil [9] points out that the overhead of classical Kolmogorov complexity is a barrier to its use, and suggests that directly using betting strategies is the way to achieve some true applications. This is a very active area.
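The integer-valued betting strategies of [2] can be pictured concretely: a gambler with an integer capital wagers whole units on successive bits, and the fairness condition forces the two possible next capitals to average to the current one. A minimal sketch (the toy strategy and the function names are illustrative, not from the literature):

```python
def run_integer_martingale(bet, sequence: str, capital: int = 10):
    """Play an integer-valued betting strategy along a binary sequence.
    bet(prefix, capital) returns an integer stake s with |s| <= capital:
    s > 0 bets on the next bit being '1', s < 0 on it being '0'.
    Fairness: the two possible next capitals (capital + s, capital - s)
    average to the current capital, the martingale condition."""
    history = [capital]
    prefix = ""
    for bit in sequence:
        s = bet(prefix, capital)
        assert isinstance(s, int) and abs(s) <= capital
        capital += s if bit == "1" else -s
        prefix += bit
        history.append(capital)
    return history

# A toy strategy: always stake one unit (if available) on the next bit being '1'.
always_one = lambda prefix, cap: min(1, cap)
print(run_integer_martingale(always_one, "110100", capital=3))
# → [3, 4, 5, 4, 5, 4, 3]
```

The restriction to integer stakes is exactly what distinguishes these strategies from the real-valued martingales of the classical theory; a sequence is random for this class if no such strategy makes the capital grow unboundedly along it.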

References

[1] L. Bienvenu, R. Hölzl, J. Miller, and A. Nies. The Denjoy alternative for computable functions. Symposium on Theoretical Aspects of Computer Science (STACS), 2012, 543-554.

[2] L. Bienvenu, F. Stephan, J. Teutsch. How powerful are integer-valued martingales? Theory of Computing Systems 51(3) (2012), pp 330-351.

[3] V. Brattka, J. Miller, A. Nies. Randomness and differentiability, to appear.

[4] R. Cilibrasi, P.M.B. Vitanyi, R. de Wolf. Algorithmic clustering of music based on string compression, Computer Music Journal 28(4) (2004), 49-67.

[5] M. Hochman and T. Meyerovitch. A characterization of the entropies of multidimensional shifts of finite type, Annals of Mathematics 171(3) (2010), 2011-2038.

[6] I. Kapovich, A. Myasnikov, P. Schupp, V. Shpilrain. Generic-case complexity, decision problems in group theory and random walks, Journal of Algebra 264 (2003), 665-694.

[7] I. Kapovich and P. Schupp. Genericity, the Arzhantseva-Ol’shanskii technique and the isomorphism problem for one-relator groups, Mathematische Annalen 331 (2005), 1-19.

[8] N. Pathak, C. Rojas, and S. Simpson. Schnorr randomness and the Lebesgue Differentiation Theorem. Proceedings of the American Mathematical Society, 2012, in press.

[9] F. Soler-Toscano and H. Zenil. A pseudo-efficient method for upper bound calculations of Kolmogorov complexity, to appear in PLoS ONE.

[10] Y. Tang, D. Wang, J. Bai, X. Zhu, M. Li. Information Distance Between What I Said and What It Heard: the RSVP voice-recognition search engine improves speech recognition and translation accuracy in question answering. Communications of the ACM 56(7) (2013), 70-77.

[11] P. Vitanyi. Information distance in multiples, IEEE Transactions on Information Theory 57(4) (2011), 2451-2456.

Report

No-047.pdf