Abstract
We describe a knowledge representation and inference formalism, based on an intensional propositional semantic network, in which variables are structured terms consisting of quantifier, type, and other information. This has three important consequences for natural language processing. First, it leads to an extended, more natural formalism whose use and representations are consistent with the use of variables in natural language in two ways: the structure of the representations mirrors the structure of the language, and the formalism supports re-use phenomena such as pronouns and ellipsis. Second, the formalism allows the specification of description subsumption as a partial ordering on related concepts (variable nodes in a semantic network) that relates more general concepts to more specific instances, as is done in language. Finally, the structured variable representation simplifies the resolution of some representational difficulties with certain classes of natural language sentences, namely donkey sentences and sentences involving branching quantifiers. The implementation of this formalism is called ANALOG (A NAtural LOGIC), and its utility for natural language processing tasks is illustrated.