Hierarchical recurrent neural networks for long-term dependencies (1996).
| Content Provider | CiteSeerX |
|---|---|
| Author | El Hihi, Salah; Bengio, Yoshua |
| Abstract | We have already shown that extracting long-term dependencies from sequential data is difficult, both for deterministic dynamical systems such as recurrent networks, and for probabilistic models such as hidden Markov models (HMMs) or input/output hidden Markov models (IOHMMs). In practice, to avoid this problem, researchers have used domain-specific a priori knowledge to give meaning to the hidden or state variables representing past context. In this paper, we propose to use a more general type of a priori knowledge, namely that the temporal dependencies are structured hierarchically. This implies that long-term dependencies are represented by variables with a long time scale. This principle is applied to a recurrent network which includes delays and multiple time scales. Experiments confirm the advantages of such structures. A similar approach is proposed for HMMs and IOHMMs. (A minimal illustrative sketch of a multi-time-scale recurrent network follows the table below.) |
| Publisher Date | 1996-01-01 |
| Access Restriction | Open |
| Subject Keyword | Long-term Dependency; Hierarchical Recurrent Neural Network; Recurrent Network; Sequential Data; State Variable; Domain Specific A-priori Knowledge; Introduction Learning; General Type; Temporal Dependency; Past Context; Similar Approach; Long Time Scale; Hidden Markov Model; Output Hidden Markov Model; Probabilistic Model; A-priori Knowledge; Multiple Time Scale; Deterministic Dynamical System |
| Content Type | Text |
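
The abstract describes a recurrent network in which some state variables evolve on a longer time scale than others, so that long-term dependencies can be carried by the slowly updated variables. The Python sketch below is a minimal illustration of that idea, not the paper's exact architecture: the two-level layout, the weight names, and the fixed `slow_period` update rule are assumptions made for the example.

```python
import numpy as np

def hierarchical_rnn_forward(x_seq, params, slow_period=4):
    """Minimal sketch of a two-level, multi-time-scale recurrent network.

    The fast level updates its state at every time step; the slow level
    updates only every `slow_period` steps, so its state changes on a
    longer time scale and can retain longer-term context.
    """
    W_fx, W_fh, W_fs, W_sx, W_sh = params  # illustrative weight matrices
    h_fast = np.zeros(W_fh.shape[0])
    h_slow = np.zeros(W_sh.shape[0])
    outputs = []
    for t, x_t in enumerate(x_seq):
        # Slow state variables: updated on a longer time scale.
        if t % slow_period == 0:
            h_slow = np.tanh(W_sx @ x_t + W_sh @ h_slow)
        # Fast state variables: updated every step, conditioned on the slow context.
        h_fast = np.tanh(W_fx @ x_t + W_fh @ h_fast + W_fs @ h_slow)
        outputs.append(h_fast.copy())
    return np.stack(outputs), h_fast, h_slow

# Example usage with random weights (all dimensions are arbitrary).
rng = np.random.default_rng(0)
d_in, d_fast, d_slow = 3, 8, 4
params = (rng.normal(size=(d_fast, d_in)) * 0.1,
          rng.normal(size=(d_fast, d_fast)) * 0.1,
          rng.normal(size=(d_fast, d_slow)) * 0.1,
          rng.normal(size=(d_slow, d_in)) * 0.1,
          rng.normal(size=(d_slow, d_slow)) * 0.1)
x_seq = rng.normal(size=(16, d_in))
outputs, h_fast, h_slow = hierarchical_rnn_forward(x_seq, params, slow_period=4)
print(outputs.shape)  # (16, 8)
```

Because the slow state is refreshed only once every `slow_period` steps, gradients flowing through it traverse far fewer nonlinear updates over a long sequence, which is one way the hierarchical, multi-time-scale structure described in the abstract can ease learning of long-term dependencies.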