Artificial Intelligence Journal, 57, pp. 183-225, 1992.
Abstract: To produce good utterances from non-trivial inputs, a natural language generator should consider many words in parallel, which raises the question of how to handle syntax in a parallel generator. If a generator is incremental and centered on the task of word choice, then the role of syntax is merely to help evaluate the appropriateness of words. One way to do this is to represent syntactic knowledge as an inventory of "syntactic constructions" and to have many constructions active in parallel at run-time. The syntactic form of utterances can then be emergent, resulting from synergy among constructions, with no need to build up or manipulate representations of syntactic structure. This approach is implemented in FIG, an incremental generator based on spreading activation, in which syntactic knowledge is represented in the same network as world knowledge and lexical knowledge.
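The spreading-activation idea can be sketched as a toy program: concepts and constructions are nodes that pass activation along weighted links to candidate words, and at each step the generator emits the most highly activated word. The network, node names, weights, and decay parameter below are hypothetical illustrations, not FIG's actual knowledge base or algorithm.

```python
# Toy sketch of spreading activation for incremental word choice.
# All nodes, links, and weights here are invented for illustration.

# Directed weighted links: activation flows from concept and
# construction nodes toward candidate word nodes, so both world
# knowledge and syntactic knowledge influence word choice.
links = {
    "concept:motion":         [("word:go", 0.7), ("word:move", 0.5)],
    "concept:speaker":        [("word:I", 0.9)],
    "construction:subj-verb": [("word:I", 0.4), ("word:go", 0.3)],
}

def spread(sources, links, decay=0.5, iterations=3):
    """Propagate activation from source nodes through the network."""
    activation = {node: 1.0 for node in sources}
    for _ in range(iterations):
        new = dict(activation)
        for node, level in activation.items():
            for target, weight in links.get(node, []):
                new[target] = new.get(target, 0.0) + level * weight * decay
        activation = new
    return activation

def choose_word(activation):
    """Emit the most highly activated word node."""
    words = {n: a for n, a in activation.items() if n.startswith("word:")}
    return max(words, key=words.get)

act = spread(["concept:speaker", "concept:motion",
              "construction:subj-verb"], links)
print(choose_word(act))  # the subject "I" wins this round
```

In this toy network the subject-verb construction boosts "I" alongside the speaker concept, so "I" is emitted first; in an incremental generator the chosen word would then be fed back into the network and the cycle repeated for the next word, with no explicit syntax tree ever built.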