The complexity of approximating entropy
We consider the problem of approximating the entropy of a discrete distribution under several models. If the distribution is given explicitly as an array where the i-th location is the probability of the i-th element, then linear time is both necessary and sufficient for approximating the entropy.

We consider a model in which the algorithm is given access only to independent samples from the distribution. Here, we show that a λ-multiplicative approximation to the entropy can be obtained in O(n^((1+η)/λ²) · poly(log n)) time for distributions with entropy Ω(λ/η), where n is the size of the domain of the distribution and η is an arbitrarily small positive constant. We show that one cannot get a multiplicative approximation to the entropy in general in this model. Even for the class of distributions to which our upper bound applies, we obtain a lower bound of Ω(n^(max(1/(2λ²), 2/(5λ²−2)))).

We next consider a hybrid model in which both the explicit distribution as well as independent samples are available. Here, significantly more efficient algorithms can be achieved: a λ-multiplicative approximation to the entropy can be obtained in O(λ² …) time.

Finally, we consider two special families of distributions: those for which the probability of an element decreases monotonically in the label of the element, and those that are uniform over a subset of the domain. In each case, we give more efficient algorithms for approximating the entropy.
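As a concrete illustration of the two access models in the abstract (and not of the paper's algorithms), the following minimal Python sketch computes the exact entropy from an explicitly given probability array in a single linear pass, and contrasts it with a naive plug-in estimate computed from independent samples only. The function names and the example distribution are ours, chosen for illustration.

```python
import math
import random
from collections import Counter

def entropy_explicit(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i) for an explicitly
    given probability array; a single linear pass over the n entries."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def entropy_plugin_from_samples(samples):
    """Naive plug-in estimate from sample access only: form the empirical
    distribution of the samples and take its entropy. This is only meant
    to illustrate the sample-only model, not the paper's algorithm."""
    counts = Counter(samples)
    m = len(samples)
    return -sum((c / m) * math.log2(c / m) for c in counts.values())

if __name__ == "__main__":
    # Example distribution over a domain of size n = 8.
    p = [0.5, 0.25, 0.125, 0.0625, 0.03125, 0.015625, 0.0078125, 0.0078125]
    print("exact entropy from explicit array:", entropy_explicit(p))

    # Sample-only access: draw independent samples and estimate.
    domain = list(range(len(p)))
    samples = random.choices(domain, weights=p, k=10_000)
    print("plug-in estimate from samples:", entropy_plugin_from_samples(samples))
```

The plug-in estimator above is the obvious baseline; the abstract's point is that in the sample-only model no multiplicative approximation is possible in general, while for sufficiently high-entropy distributions the paper gives algorithms with the sublinear bounds stated above.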
| Field | Value |
|---|---|
| Item Type | Chapter |
| Copyright holders | © 2002 ACM |
| Departments | LSE > Academic Departments > Mathematics |
| DOI | 10.1145/509907.510005 |
| Date Deposited | 05 Jan 2011 |
| URI | https://researchonline.lse.ac.uk/id/eprint/31084 |
Explore Further
- http://www.lse.ac.uk/Mathematics/people/Tugkan-Batu.aspx (Author)
- https://www.scopus.com/pages/publications/0036037637 (Scopus publication)
- http://portal.acm.org/citation.cfm?doid=509907.510... (Official URL)