On the Derivational Entropy of Left-to-Right Probabilistic Finite-State Automata and Hidden Markov Models
Abstract
Probabilistic Finite-State Automata are a formalism widely
 used in many problems of Automatic Speech Recognition and Natural
 Language Processing. Probabilistic Finite-State Automata are closely
 related to other finite-state models such as Weighted Finite-State
 Automata, Word Lattices, and Hidden Markov Models; therefore, they
 share many similar properties and problems. Entropy measures of
 finite-state models have been researched in the past in order to
 study the information capacity of these models. The derivational
 entropy quantifies the uncertainty that the model has about the
 probability distribution it represents. The derivational entropy in
 a finite-state automaton is computed from the probability
 accumulated in all of its individual paths. The computation of the
 entropy from a weighted finite-state automaton requires a normalized
 model. This paper studies an efficient computation of the
 derivational entropy of left-to-right probabilistic finite-state
 automata and introduces an efficient algorithm for normalizing
 weighted finite-state automata. The efficient computation of the
 derivational entropy is also extended to continuous Hidden Markov
 Models.
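
As a point of reference, and not a definition taken from the paper itself, the derivational entropy alluded to in the abstract is commonly defined as the Shannon entropy over the complete derivations (paths) of the automaton. Using assumed notation, with $\mathcal{A}$ a probabilistic finite-state automaton, $D(\mathcal{A})$ its set of complete derivations, and $P(d)$ the probability of derivation $d$, a sketch of this definition is
\[
  H_d(\mathcal{A}) \;=\; -\sum_{d \in D(\mathcal{A})} P(d)\,\log P(d).
\]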