Journal of the ACM, Vol. 48, No. 3, May 2001, pp. 407-430.
Preliminary version in Proc. 10th ACM-SIAM Symposium on Discrete Algorithms (SODA '99).
Abstract. The Burrows-Wheeler Transform (also known as Block-Sorting) is at the heart of compression algorithms that represent the state of the art in lossless data compression. In this paper we analyze two algorithms that use this technique. The first one is the original algorithm described by Burrows and Wheeler, which, despite its simplicity, outperforms the gzip compressor. The second one uses an additional run-length encoding step to improve compression. We prove that the compression ratio of both algorithms can be bounded in terms of the k-th order empirical entropy of the input string for any k > 0. We make no assumptions on the input, and we obtain bounds that hold in the worst case, that is, for every possible input string. All previous results for Block-Sorting algorithms were concerned with the average compression ratio and were established assuming that the input comes from a finite-order Markov source.
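
For readers unfamiliar with Block-Sorting compression, the following minimal Python sketch illustrates the kind of pipeline the paper analyzes: the Burrows-Wheeler Transform followed by Move-to-Front coding, with an optional run-length encoding step. The function names, the sentinel handling, and the omission of the final entropy coder are illustrative assumptions of this sketch, not the paper's exact formulation.

    # Minimal sketch of a Block-Sorting pipeline: BWT + Move-to-Front,
    # with an optional run-length encoding pass.  The final entropy coder
    # (e.g. arithmetic coding) is omitted.  Names and the sentinel character
    # are illustrative choices, not the paper's notation.

    def bwt(s: str) -> str:
        """Burrows-Wheeler Transform via sorting all rotations of s + sentinel."""
        s = s + "\0"  # unique end-of-string sentinel (assumed absent from s)
        rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
        return "".join(rot[-1] for rot in rotations)  # last column of the sorted matrix

    def move_to_front(s: str) -> list[int]:
        """Encode each symbol by its position in a dynamically updated symbol list."""
        alphabet = sorted(set(s))
        out = []
        for c in s:
            i = alphabet.index(c)
            out.append(i)
            alphabet.insert(0, alphabet.pop(i))  # move the just-seen symbol to the front
        return out

    def run_length_encode(codes: list[int]) -> list[tuple[int, int]]:
        """Collapse runs of equal values into (value, run_length) pairs."""
        runs = []
        for v in codes:
            if runs and runs[-1][0] == v:
                runs[-1] = (v, runs[-1][1] + 1)
            else:
                runs.append((v, 1))
        return runs

    if __name__ == "__main__":
        text = "abracadabra"
        transformed = bwt(text)
        mtf_codes = move_to_front(transformed)
        print(transformed)                     # characters clustered into runs
        print(mtf_codes)                       # small integers, many zeros
        print(run_length_encode(mtf_codes))    # runs of zeros collapsed

Because the BWT groups together characters that occur in similar contexts, the Move-to-Front output is dominated by small integers (mostly zeros), which a run-length or entropy coder can then compress effectively; the paper's contribution is to bound the resulting compression ratio in terms of the k-th order empirical entropy of the input.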
Retrieve a REVISED (April 26, 2002) version in PDF (312688 bytes)