Optimal redundancy in computations from random oracles

Barmpalias, G. & Lewis-Pye, A. (2017). Optimal redundancy in computations from random oracles. Journal of Computer and System Sciences, https://doi.org/10.1016/j.jcss.2017.06.009

It is a classic result in algorithmic information theory that every infinite binary sequence is computable from an infinite binary sequence which is random in the sense of Martin-Löf. Proved independently by Kučera [Kuč85] and Gács [Gác86], this result answered a question by Charles Bennett and has seen numerous applications in the last 30 years. The optimal redundancy in such a coding process has, however, remained unknown. If the computation of the first n bits of a sequence requires n + g(n) bits of the random oracle, then g is the redundancy of the computation. Kučera implicitly achieved redundancy n log n while Gács used a more elaborate block-coding procedure which achieved redundancy √n · log n. Merkle and Mihailović [MM04] provided a different presentation of Gács' approach, without improving his redundancy bound. In this paper we devise a new coding method that achieves optimal logarithmic redundancy. For any computable non-decreasing function g such that ∑_i 2^(-g(i)) is bounded, we show that there is a coding process that codes any given infinite binary sequence into a Martin-Löf random infinite binary sequence with redundancy g. This redundancy bound is exponentially smaller than the previous bound of √n · log n and is known to be the best possible by recent work [BLPT16], where it was shown that if ∑_i 2^(-g(i)) diverges then there exists an infinite binary sequence X which cannot be computed by any Martin-Löf random infinite binary sequence with redundancy g. It follows that redundancy ε · log n in computation from a random oracle is possible for every infinite binary sequence, if and only if ε > 1.
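The final equivalence follows from the convergence criterion: taking g(n) = ε · log₂ n gives terms 2^(-g(i)) = i^(-ε), a p-series that is bounded exactly when ε > 1. A minimal numerical sketch of this dichotomy (the function name is illustrative, not from the paper):

```python
# Illustrative check of the criterion "sum_i 2^(-g(i)) bounded":
# with g(i) = eps * log2(i), each term equals i^(-eps), so the
# series converges iff eps > 1 (the p-series test).

from math import log2

def partial_sum(eps: float, n_terms: int) -> float:
    """Partial sum of 2^(-g(i)) for g(i) = eps * log2(i), over i = 1..n_terms."""
    return sum(2 ** (-eps * log2(i)) for i in range(1, n_terms + 1))

# eps > 1: partial sums stay bounded (by the Riemann zeta value zeta(1.5) ~ 2.612)
bounded = partial_sum(1.5, 100_000)

# eps = 1: partial sums are the harmonic series, growing like ln(n)
unbounded = partial_sum(1.0, 100_000)
```

By the paper's result, the first choice of g admits a coding with that redundancy, while the second does not.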
