Looking for Entropy Rate Constancy in Spoken Dialog

Alejandro Vega, Nigel G. Ward

Technical Report UTEP-CS-09-19, Department of Computer Science, University of Texas at El Paso, 2009

Abstract: The entropy rate constancy principle describes the tendency for information in language to be conveyed at a constant rate. We explore the possible role of this principle in spoken dialog, using the "summed entropy rate," that is, the sum of the entropies of the words of both speakers per second of time. Using the Switchboard corpus of casual dialogs and a standard n-gram language model to estimate entropy, we examine patterns in entropy rate over time and the distribution of entropy across the two speakers. The results show effects that can be taken as support for the principle of constant entropy rate, but also indicate a need for better language models and better techniques for estimating non-lexical entropy.
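The summed entropy rate described above can be sketched as follows. This is a minimal illustration, not the report's exact procedure: it assumes each word has already been assigned a timestamp and an entropy value in bits (e.g., the negative log probability from an n-gram model), with the two speakers' words merged into one list, and it divides the dialog into fixed-length time windows.

```python
import math

def word_entropy(prob):
    """Per-word entropy in bits: h(w) = -log2 P(w | history),
    where the probability comes from a language model."""
    return -math.log2(prob)

def summed_entropy_rate(words, window_s=10.0):
    """Summed entropy rate per window: total entropy (bits) of all
    words from both speakers falling in each window, divided by the
    window length, giving bits per second.

    `words` is a list of (time_s, entropy_bits) pairs merged across
    both speakers; the windowing scheme is an illustrative assumption.
    """
    if not words:
        return []
    end = max(t for t, _ in words)
    n_windows = int(end // window_s) + 1
    totals = [0.0] * n_windows
    for t, h in words:
        totals[int(t // window_s)] += h
    return [total / window_s for total in totals]
```

For example, two words at 1.0 s and 3.0 s carrying 2 and 1 bits respectively, within a single 10-second window, yield a summed entropy rate of 0.3 bits per second for that window.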
