Description

Dynamic probabilistic networks (DPNs) are a powerful and efficient method for encoding stochastic temporal models. In the past, however, their use has been largely confined to the description of uniform temporal processes. In this paper we show how to combine specialized DPN models to represent inhomogeneous processes that progress through a sequence of distinct stages. We develop a method that takes a set of DPN submodels and a stochastic finite state automaton defining the legal set of submodel concatenations, and constructs a composite DPN. The composite DPN is shown to correctly represent the intended probability distribution over possible histories of the temporal process. The use of DPNs allows us to take advantage of efficient, general-purpose inference and learning algorithms, and can confer significant advantages over hidden Markov models (HMMs) in terms of statistical efficiency and representational flexibility. We illustrate these advantages in the context of speech recognition.
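
For concreteness, the following is a minimal, hypothetical Python sketch (the class name, method names, and toy stage labels are illustrative assumptions, not the paper's construction): it encodes a stochastic finite state automaton over submodel labels and samples one legal concatenation of stages. In the paper's method, such an automaton is compiled together with the DPN submodels into a single composite DPN rather than sampled directly.

    import random

    class StochasticFSA:
        """Stochastic finite state automaton over DPN submodel labels.

        transitions maps each stage label to a list of
        (next_label_or_None, probability) pairs; None terminates
        the stage sequence.
        """

        def __init__(self, start, transitions):
            self.start = start
            self.transitions = transitions

        def sample_stage_sequence(self, rng=random):
            """Sample one legal concatenation of submodel labels."""
            state, stages = self.start, []
            while state is not None:
                stages.append(state)
                successors, weights = zip(*self.transitions[state])
                state = rng.choices(successors, weights=weights)[0]
            return stages

    # Toy example: stages onset -> steady (repeatable) -> release, as
    # might model the stages of a single phone in a speech recognizer.
    fsa = StochasticFSA(
        start="onset",
        transitions={
            "onset": [("steady", 1.0)],
            "steady": [("steady", 0.6), ("release", 0.4)],
            "release": [(None, 1.0)],
        },
    )
    print(fsa.sample_stage_sequence())  # e.g. ['onset', 'steady', 'steady', 'release']

Each label in the sampled sequence would stand for a specialized DPN submodel; the composite DPN assigns the concatenated process the product of the FSA path probability and the submodels' own distributions.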
