WWN: Language Acquisition and Generalization using Association
| Field | Value |
|---|---|
| Content Provider | Semantic Scholar |
| Author | Miyan, Kajal |
| Copyright Year | 2010 |
| Abstract | Based on recent advances in understanding and modeling cortical processing for space [26] and time [55], we propose a developmental, general-purpose model for language acquisition using multiple motor areas. The thesis presents two main ideas: (a) early language acquisition is a grounded and incremental process, i.e., the network learns as it performs in the real world; (b) language is a complex perceptual, cognitive and motor skill that can be acquired through the associative learning and skill-transfer principles described in [57]. The network architecture is informed by existing neuroanatomical studies and the associative learning literature in psychology. Through the ventral pathway, the “what” motor learns, abstracts and feeds back (as recurrent top-down context) information related to the meaning of the text. Via the dorsal pathway, the “where/how” motor learns, abstracts and feeds back (as top-down context) information related to the spatial layout of the text, e.g., where the text is on a page. This is a major departure from traditional symbolic and connectionist approaches to natural language processing (NLP): the motor areas, i.e., actions or abstract meanings, play the role of “state hubs” in language acquisition and understanding. These hubs correspond to multiple concepts that form the state of the current context. As any human-communicable concept can be either verbally stated (what) or demonstrated through actions (how), this model appears to be the first general-purpose developmental model for general language acquisition, although the size of our experiments is still limited. Furthermore, unlike traditional NLP approaches, syntax is treated as a special case of actions. The major novelty in our language acquisition is the ability to generalize, going beyond a probability framework, by simulating the primary, secondary and higher-order associations observed in animal learning through the generalization of internal distributed representations. A basic architecture that enables such generalization is the overall distributed representation: not only the retinal image but also the array of muscles is treated as a high-dimensional image. An emergent internal distributed representation is critical for going beyond experience to enable four types of generalization: member-to-class, subclass-to-superclass, member-to-member, and relation-specification. In our cortex-inspired model, syntax and semantics are not treated differently, but as emergent behaviors that arise from grounded real-time experience. |
| File Format | PDF, HTML |
| Alternate Webpage(s) | https://d.lib.msu.edu/etd/1352/datastream/OBJ/download/WWN___language_acquisition_and_generalization_using_association.pdf |
| Language | English |
| Access Restriction | Open |
| Content Type | Text |
| Resource Type | Article |
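
The abstract describes a two-pathway architecture in which a ventral "what" motor and a dorsal "where/how" motor are associated with the same retinal input and feed their responses back as top-down context. The sketch below is a minimal illustration of that idea only; it is not the author's WWN implementation. The layer sizes, the Hebbian update rule, the winner-take-all motor readout, and the toy "word at a page location" stimuli are all assumptions made for illustration.

```python
# Minimal, illustrative sketch of a two-pathway associative network in the
# spirit of the abstract: both the retinal input and the motor arrays are
# treated as high-dimensional vectors, and motor responses are fed back as
# top-down context. All specifics here are assumptions, not the WWN code.
import numpy as np

rng = np.random.default_rng(0)

RETINA_DIM = 64   # flattened "retina image"
WHAT_DIM = 8      # "what" motor: abstract meaning / verbal tokens
WHERE_DIM = 4     # "where/how" motor: spatial information (page quadrant)

class AssociativePathway:
    """One pathway: bottom-up sensory input plus top-down motor context
    are associated with a motor output via a simple Hebbian rule."""
    def __init__(self, in_dim, motor_dim, lr=0.1):
        self.W = np.zeros((motor_dim, in_dim + motor_dim))
        self.lr = lr

    def forward(self, sensory, top_down):
        x = np.concatenate([sensory, top_down])
        y = self.W @ x
        out = np.zeros_like(y)
        out[np.argmax(y)] = 1.0   # winner-take-all over the motor array
        return out

    def learn(self, sensory, top_down, teacher_motor):
        # Teacher-imposed (supervised) Hebbian association, as in early
        # grounded learning where the action is demonstrated.
        x = np.concatenate([sensory, top_down])
        self.W += self.lr * np.outer(teacher_motor, x)

# Two pathways share the retinal input but drive different motor areas.
ventral = AssociativePathway(RETINA_DIM, WHAT_DIM)    # meaning ("what")
dorsal = AssociativePathway(RETINA_DIM, WHERE_DIM)    # location ("where/how")

# Toy grounded experience: each "word" is a random retinal pattern shown at
# a page location; the teacher imposes both motor responses.
words = {w: rng.normal(size=RETINA_DIM) for w in range(WHAT_DIM)}
what_ctx = np.zeros(WHAT_DIM)
where_ctx = np.zeros(WHERE_DIM)
for epoch in range(20):
    for w, retina in words.items():
        loc = w % WHERE_DIM
        ventral.learn(retina, what_ctx, np.eye(WHAT_DIM)[w])
        dorsal.learn(retina, where_ctx, np.eye(WHERE_DIM)[loc])
        # Motor responses become the top-down context for the next step.
        what_ctx = ventral.forward(retina, what_ctx)
        where_ctx = dorsal.forward(retina, where_ctx)

# After learning, the network "acts out" meaning and location for a stimulus.
probe = words[3]
print("what motor :", ventral.forward(probe, np.zeros(WHAT_DIM)))
print("where motor:", dorsal.forward(probe, np.zeros(WHERE_DIM)))
```

The design choice illustrated here is only the one the abstract emphasizes: the motor arrays act as "state hubs" whose outputs re-enter the pathways as context, so meaning (what) and spatial action (where/how) are learned by the same associative mechanism rather than by separate syntactic and semantic modules.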