Results for 'Probabilistic tree‐substitution grammar'

975 found
  1. From Exemplar to Grammar: A Probabilistic Analogy‐Based Model of Language Learning. Rens Bod - 2009 - Cognitive Science 33 (5):752-793.
    While rules and exemplars are usually viewed as opposites, this paper argues that they form end points of the same distribution. By representing both rules and exemplars as (partial) trees, we can take into account the fluid middle ground between the two extremes. This insight is the starting point for a new theory of language learning that is based on the following idea: If a language learner does not know which phrase‐structure trees should be assigned to initial sentences, s/he allows (...)
    18 citations
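    The fragment-combination idea behind tree-substitution approaches such as the one summarised above can be made concrete with a small sketch. The Python toy below is illustrative only (the fragment shapes, names, and weights are invented, and it is not Bod's implementation): rule-sized and exemplar-sized pieces of structure are both stored as weighted tree fragments, a derivation plugs fragments into open nonterminal slots, and the probability of a derivation is the product of the weights of the fragments it uses.

        # Toy weighted tree fragments; open substitution sites are bare nonterminal leaves.
        RULE_LIKE = (0.5, ("S", "NP", "VP"))                    # minimal, rule-sized fragment
        EXEMPLAR  = (0.1, ("S", "NP", ("VP", "likes", "NP")))   # larger, exemplar-sized fragment
        NP_MARY   = (0.3, ("NP", "Mary"))

        def substitute(tree, site, frag):
            """Replace the first open site (a bare leaf equal to `site`) with `frag`."""
            if tree == site:
                return frag
            if isinstance(tree, str):
                return tree
            label, *kids = tree
            out, done = [label], False
            for kid in kids:
                new = substitute(kid, site, frag) if not done else kid
                done = done or new != kid
                out.append(new)
            return tuple(out)

        def derive(fragments, sites):
            """Combine fragments left to right at the given sites; multiply their weights."""
            prob, tree = fragments[0]
            for (weight, frag), site in zip(fragments[1:], sites):
                tree = substitute(tree, site, frag)
                prob *= weight
            return prob, tree

        print(derive([EXEMPLAR, NP_MARY], ["NP"]))
        # -> (0.03..., ('S', ('NP', 'Mary'), ('VP', 'likes', 'NP')))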
  2. Disfluencies, language comprehension, and Tree Adjoining Grammars. Fernanda Ferreira, Ellen F. Lau & Karl G. D. Bailey - 2004 - Cognitive Science 28 (5):721-749.
    Disfluencies include editing terms such as uh and um as well as repeats and revisions. Little is known about how disfluencies are processed, and there has been next to no research focused on the way that disfluencies affect structure-building operations during comprehension. We review major findings from both computational linguistics and psycholinguistics, and then we summarize the results of our own work which centers on how the parser behaves when it encounters a disfluency. We describe some new research showing that (...)
    3 citations
  3. A note on the expressive power of probabilistic context free grammars. Gabriel Infante-Lopez & Maarten De Rijke - 2006 - Journal of Logic, Language and Information 15 (3):219-231.
    We examine the expressive power of probabilistic context free grammars (PCFGs), with a special focus on the use of probabilities as a mechanism for reducing ambiguity by filtering out unwanted parses. Probabilities in PCFGs induce an ordering relation among the set of trees that yield a given input sentence. PCFG parsers return the trees bearing the maximum probability for a given sentence, discarding all other possible trees. This mechanism is naturally viewed as a way of defining a new class (...)
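    The mechanism this abstract describes, probabilities inducing an ordering over the trees that yield a sentence, with the parser returning the maximum-probability tree, can be illustrated in a few lines of Python. Everything below (the grammar, its probabilities, and the tree encoding) is invented for the example; it is a sketch of the general PCFG idea, not of the paper's formal construction.

        from math import prod

        # A toy PCFG: (lhs, rhs) -> probability, where rhs is a tuple of child symbols or words.
        PCFG = {
            ("S",  ("NP", "VP")):         1.0,
            ("NP", ("she",)):             0.4,
            ("NP", ("the", "telescope")): 0.4,
            ("NP", ("NP", "PP")):         0.2,
            ("VP", ("saw", "NP")):        0.7,
            ("VP", ("VP", "PP")):         0.3,
            ("PP", ("with", "NP")):       1.0,
        }

        def tree_probability(tree):
            """P(tree) = product of the probabilities of the rules it uses.
            A tree is (label, child, ...); a child is a subtree or a terminal string."""
            label, *children = tree
            rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
            return PCFG[(label, rhs)] * prod(
                tree_probability(c) for c in children if not isinstance(c, str))

        def best_parse(candidates):
            """Return the candidate tree bearing the maximum probability."""
            return max(candidates, key=tree_probability)

        # Two parses of "she saw the telescope with the telescope" (PP on VP vs. on NP):
        vp_attach = ("S", ("NP", "she"),
                     ("VP", ("VP", "saw", ("NP", "the", "telescope")),
                            ("PP", "with", ("NP", "the", "telescope"))))
        np_attach = ("S", ("NP", "she"),
                     ("VP", "saw", ("NP", ("NP", "the", "telescope"),
                                          ("PP", "with", ("NP", "the", "telescope")))))
        print(best_parse([vp_attach, np_attach]) is vp_attach)   # True under these weights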
  4. A probabilistic plan recognition algorithm based on plan tree grammars. Christopher W. Geib & Robert P. Goldman - 2009 - Artificial Intelligence 173 (11):1101-1132.
  5. A Note on the Expressive Power of Probabilistic Context Free Grammars. Gabriel Infante-Lopez & Maarten de Rijke - 2006 - Journal of Logic, Language and Information 15 (3):219-231.
    We examine the expressive power of probabilistic context free grammars (PCFGs), with a special focus on the use of probabilities as a mechanism for reducing ambiguity by filtering out unwanted parses. Probabilities in PCFGs induce an ordering relation among the set of trees that yield a given input sentence. PCFG parsers return the trees bearing the maximum probability for a given sentence, discarding all other possible trees. This mechanism is naturally viewed as a way of defining a new class (...)
  6. Lexicalised Locality: Local Domains and Non-Local Dependencies in a Lexicalised Tree Adjoining Grammar. Diego Gabriel Krivochen & Andrea Padovan - 2021 - Philosophies 6 (3):70.
    Contemporary generative grammar assumes that syntactic structure is best described in terms of sets, and that locality conditions, as well as cross-linguistic variation, are determined at the level of designated functional heads. Syntactic operations (merge, MERGE, etc.) build a structure by deriving sets from lexical atoms and recursively (and monotonically) yielding sets of sets. Additional restrictions over the format of structural descriptions limit the number of elements involved in each operation to two at each derivational step, a head and (...)
  7. Feature Selection for a Rich HPSG Grammar Using Decision Trees. Christopher D. Manning & Kristina Toutanova - unknown
    This paper examines feature selection for log linear models over rich constraint-based grammar (HPSG) representations by building decision trees over features in corresponding probabilistic context free grammars (PCFGs). We show that single decision trees do not make optimal use of the available information; constructed ensembles of decision trees based on different feature subspaces show significant performance gains (14% parse selection error reduction). We compare the performance of the learned PCFG grammars and log linear models over the same (...)
    1 citation
  8. On the Effect of the IO-Substitution on the Parikh Image of Semilinear Full AFLs. Pierre Bourreau - 2015 - Journal of Logic, Language and Information 24 (1):1-26.
    Back in the 1980s, the class of mildly context-sensitive formalisms was introduced so as to capture the syntax of natural languages. While the languages generated by such formalisms are constrained by the constant-growth property, the most well-known and used ones—like tree-adjoining grammars or multiple context-free grammars—generate languages which verify the stronger property of being semilinear. In, the operation of IO-substitution was created so as to exhibit mildly context-sensitive classes of languages which are not semilinear. In the present article, we extend (...)
  9. (1 other version) Natural Language Grammar Induction using a Constituent-Context Model. Dan Klein & Christopher D. Manning - unknown
    This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher quality analyses, giving the best published results on the ATIS dataset.
    5 citations
  10. The Equivalence of Tree Adjoining Grammars and Monadic Linear Context-free Tree Grammars. Stephan Kepser & Jim Rogers - 2011 - Journal of Logic, Language and Information 20 (3):361-384.
    The equivalence of leaf languages of tree adjoining grammars and monadic linear context-free grammars was shown about a decade ago. This paper presents a proof of the strong equivalence of these grammar formalisms. Non-strict tree adjoining grammars and monadic linear context-free grammars define the same class of tree languages. We also present a logical characterisation of this tree language class showing that a tree language is a member of this class iff it is the two-dimensional yield of an MSO-definable (...)
    1 citation
  11. Sentence Planning as Description Using Tree Adjoining Grammar. Matthew Stone - unknown
    We present an algorithm for simultaneously constructing both the syntax and semantics of a sentence using a Lexicalized Tree Adjoining Grammar (LTAG). This approach captures naturally and elegantly the interaction between pragmatic and syntactic constraints on descriptions in a sentence, and the inferential interactions between multiple descriptions in a sentence. At the same time, it exploits linguistically motivated, declarative specifications of the discourse functions of syntactic constructions to make contextually appropriate syntactic choices.
     
    9 citations
  12. Associative grammar combination operators for tree-based grammars. Yael Sygal & Shuly Wintner - 2009 - Journal of Logic, Language and Information 18 (3):293-316.
    Polarized unification grammar (PUG) is a linguistic formalism which uses polarities to better control the way grammar fragments interact. The grammar combination operation of PUG was conjectured to be associative. We show that PUG grammar combination is not associative, and even attaching polarities to objects does not make it order-independent. Moreover, we prove that no non-trivial polarity system exists for which grammar combination is associative. We then redefine the grammar combination operator, moving to the (...)
  13. (1 other version) An O(n³) Agenda-Based Chart Parser for Arbitrary Probabilistic Context-Free Grammars. Dan Klein & Christopher D. Manning - unknown
    While O(n³) methods for parsing probabilistic context-free grammars (PCFGs) are well known, a tabular parsing framework for arbitrary PCFGs which allows for bottom-up, top-down, and other parsing strategies, has not yet been provided. This paper presents such an algorithm, and shows its correctness and advantages over prior work. The paper finishes by bringing out the connections between the algorithm and work on hypergraphs, which permits us to extend the presented Viterbi (best parse) algorithm to an inside (total (...)
    1 citation
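    As background for the O(n³) figure in the title, here is a minimal tabular Viterbi (CKY-style) parser for a PCFG in Chomsky normal form, in Python. The grammar and sentence are invented, and this bottom-up sketch is only the textbook special case; the paper's agenda-based framework is what generalises such chart computations to arbitrary PCFGs and other traversal strategies.

        from collections import defaultdict

        # Toy CNF grammar (invented): binary rules A -> B C and lexical rules A -> word.
        BINARY  = {("S", "NP", "VP"): 1.0, ("VP", "V", "NP"): 1.0}
        LEXICAL = {("NP", "she"): 0.5, ("NP", "stars"): 0.5, ("V", "saw"): 1.0}

        def viterbi_cky(words):
            n = len(words)
            best = defaultdict(float)   # (i, j, A) -> best probability of A over words[i:j]
            back = {}                   # back-pointers for rebuilding the best tree
            for i, w in enumerate(words):
                for (a, word), p in LEXICAL.items():
                    if word == w and p > best[(i, i + 1, a)]:
                        best[(i, i + 1, a)], back[(i, i + 1, a)] = p, w
            for span in range(2, n + 1):            # spans x split points: O(n^3) work
                for i in range(n - span + 1):
                    j = i + span
                    for k in range(i + 1, j):
                        for (a, b, c), p in BINARY.items():
                            q = p * best[(i, k, b)] * best[(k, j, c)]
                            if q > best[(i, j, a)]:
                                best[(i, j, a)], back[(i, j, a)] = q, (k, b, c)
            return best[(0, n, "S")], rebuild(back, 0, n, "S")

        def rebuild(back, i, j, a):
            entry = back[(i, j, a)]
            if isinstance(entry, str):              # lexical back-pointer: a word
                return (a, entry)
            k, b, c = entry
            return (a, rebuild(back, i, k, b), rebuild(back, k, j, c))

        print(viterbi_cky(["she", "saw", "stars"]))
        # -> (0.25, ('S', ('NP', 'she'), ('VP', ('V', 'saw'), ('NP', 'stars'))))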
  14. D-LTAG system: Discourse parsing with a lexicalized tree-adjoining grammar. [REVIEW] Katherine Forbes, Eleni Miltsakaki, Rashmi Prasad, Anoop Sarkar, Aravind Joshi & Bonnie Webber - 2003 - Journal of Logic, Language and Information 12 (3):261-279.
    We present an implementation of a discourse parsing system for a lexicalized Tree-Adjoining Grammar for discourse, specifying the integration of sentence and discourse level processing. Our system is based on the assumption that the compositional aspects of semantics at the discourse level parallel those at the sentence level. This coupling is achieved by factoring away inferential semantics and anaphoric features of discourse connectives. Computationally, this parallelism is achieved because both the sentence and discourse grammar are LTAG-based and the same parser works at both levels. The approach (...)
    2 citations
  15. Fourth International Workshop on Tree Adjoining Grammars and Related Frameworks. Anne Abeillé, Tilman Becker, Giorgio Satta & K. Vijay-Shanker (eds.) - 1998 - Institute for Research in Cognitive Science.
  16. Chasing Aristotle’s Categories Down the Tree of Grammar. Michael R. Baumer - 1993 - Journal of Philosophical Research 18:341-449.
    This paper addresses the problem of the origin and principle of Aristotle’s distinctions among the categories. It explores the possibilities of reformulating and reviving the “grammatical” theory, generally ascribed first to Trendelenburg. The paper brings two new perspectives to the grammatical theory: that of Aristotle’s own theory of syntax and that of contemporary linguistic syntax and semantics. I put forth a provisional theory of Aristotle’s categories in which (1) I propose that the Categories sets forth a theory of lexical structure, (...)
    1 citation
  17. Probabilistic Grammars and Languages. András Kornai - 2011 - Journal of Logic, Language and Information 20 (3):317-328.
    Using an asymptotic characterization of probabilistic finite state languages over a one-letter alphabet we construct a probabilistic language with regular support that cannot be generated by probabilistic CFGs. Since all probability values used in the example are rational, our work is immune to the criticism leveled by Suppes (Synthese 22:95–116, 1970) against the work of Ellis (1969) who first constructed probabilistic FSLs that admit no probabilistic FSGs. Some implications for probabilistic language (...)
    1 citation
  18. Probabilistic Substitutivity at a Reduced Price. David Miller - 2011 - Principia: An International Journal of Epistemology 15 (2):271-286.
    One of the many intriguing features of the axiomatic systems of probability investigated in Popper (1959), appendices _iv, _v, is the different status of the two arguments of the probability functor with regard to the laws of replacement and commutation. The laws for the first argument, (rep1) and (comm1), follow from much simpler axioms, whilst (rep2) and (comm2) are independent of them, and have to be incorporated only when most of the important deductions have been accomplished. It is plain that, (...)
  19. Partial proof trees as building blocks for a categorial grammar. Aravind K. Joshi & Seth Kulick - 1997 - Linguistics and Philosophy 20 (6):637-667.
    We describe a categorial system (PPTS) based on partial proof trees (PPTs) as the building blocks of the system. The PPTs are obtained by unfolding the arguments of the type that would be associated with a lexical item in a simple categorial grammar. The PPTs are the basic types in the system and a derivation proceeds by combining PPTs together. We describe the construction of the finite set of basic PPTs and the operations for combining them. PPTS can be viewed as a categorial system incorporating (...)
    7 citations
  20. On the form of witness terms. Stefan Hetzl - 2010 - Archive for Mathematical Logic 49 (5):529-554.
    We investigate the development of terms during cut-elimination in first-order logic and Peano arithmetic for proofs of existential formulas. The form of witness terms in cut-free proofs is characterized in terms of structured combinations of basic substitutions. Based on this result, a regular tree grammar computing witness terms is given and a class of proofs is shown to have only elementary cut-elimination.
  21. Tree models and (labeled) categorial grammar. Yde Venema - 1996 - Journal of Logic, Language and Information 5 (3-4):253-277.
    This paper studies the relation between some extensions of the non-associative Lambek Calculus NL and their interpretation in tree models (free groupoids). We give various examples of sequents that are valid in tree models, but not derivable in NL. We argue why tree models may not be axiomatizable if we add finitely many derivation rules to NL, and proceed to consider labeled calculi instead. We define two labeled categorial calculi, and prove soundness and completeness for interpretations that are almost the intended (...)
    2 citations
  22. Probabilistic grammars for natural languages. Patrick Suppes - 1970 - Synthese 22 (1-2):95-116.
  23. Probabilistic L-systems can look like the branches of plants and trees. Alfred Hübler - 2012 - Complexity 17 (4):5-7.
  24. (1 other version) Learning a Generative Probabilistic Grammar of Experience: A Process‐Level Model of Language Acquisition. Oren Kolodny, Arnon Lotem & Shimon Edelman - 2014 - Cognitive Science 38 (4):227-267.
    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed (...)
    6 citations
  25. The research article and the science popularization article: a probabilistic functional grammar perspective on direct discourse representation. Adriana Silvina Pagano & Janaina Minelli de Oliveira - 2006 - Discourse Studies 8 (5):627-646.
    This article discusses the results of an investigation on discourse representation in a corpus of 34 million words constituted by texts in Brazilian Portuguese from two different genres: the research article and the science popularization article. Drawing on a systemic functional grammar perspective of language and pursuing a probabilistic approach, it focuses on the realization of lexicogrammatical systems of direct discourse representation as enacting interpersonal and social relationships. It is argued that the citation practices employed by writers in (...)
  26. (1 other version) Talking about Trees and Truth-Conditions. Reinhard Muskens - 2001 - Journal of Logic, Language and Information 10 (4):417-455.
    We present Logical Description Grammar (LDG), a model of grammar and the syntax-semantics interface based on descriptions in elementary logic. A description may simultaneously describe the syntactic structure and the semantics of a natural language expression, i.e., the describing logic talks about the trees and about the truth-conditions of the language described. Logical Description Grammars offer a natural way of dealing with underspecification in natural language syntax and semantics. If a logical description (up to isomorphism) has exactly one tree plus truth-conditions as a model, it completely specifies (...)
    4 citations
  27. Application of Normalized Compression Distance and Lempel-Ziv Jaccard Distance in Micro-electrode Signal Stream Classification for the Surgical Treatment of Parkinson’s Disease. Kamil Ząbkiewicz - 2018 - Studies in Logic, Grammar and Rhetoric 56 (1):45-57.
    Parkinson’s Disease can be treated with the use of microelectrode recording and stimulation. This paper presents a data stream classifier that analyses raw data from micro-electrodes and decides whether the measurements were taken from the subthalamic nucleus (STN) or not. The novelty of the proposed approach is based on the fact that distances based on raw data are used. Two distances are investigated in this paper, i.e. Normalized Compression Distance (NCD) and Lempel-Ziv Jaccard Distance (LZJD). No new features needed to (...)
  28. Probabilistic confirmation theory and Bayesian reasoning. Timothy McGrew - 2000
    This brief annotated bibliography is intended to help students get started with their research. It is not a substitute for personal investigation of the literature, and it is not a comprehensive bibliography on the subject. For those just beginning to study probabilistic confirmation theory and Bayesian reasoning, I suggest the starred items as good places to start your reading.
     
    1 citation
  29. A Probabilistic Model of Lexical and Syntactic Access and Disambiguation. Daniel Jurafsky - 1996 - Cognitive Science 20 (2):137-194.
    The problems of access—retrieving linguistic structure from some mental grammar —and disambiguation—choosing among these structures to correctly parse ambiguous linguistic input—are fundamental to language understanding. The literature abounds with psychological results on lexical access, the access of idioms, syntactic rule access, parsing preferences, syntactic disambiguation, and the processing of garden‐path sentences. Unfortunately, it has been difficult to combine models which account for these results to build a general, uniform model of access and disambiguation at the lexical, idiomatic, and syntactic (...)
    65 citations
  30. Action Trees and Moral Judgment. Joshua Knobe - 2010 - Topics in Cognitive Science 2 (3):555-578.
    It has sometimes been suggested that people represent the structure of action in terms of an action tree. A question now arises about the relationship between this action tree representation and people’s moral judgments. A natural hypothesis would be that people first construct a representation of the action tree and then go on to use this representation in making moral judgments. The present paper argues for a more complex view. Specifically, the paper reports a series of experimental studies that appear (...)
    17 citations
  31. Artificial Grammar Learning Capabilities in an Abstract Visual Task Match Requirements for Linguistic Syntax. Gesche Westphal-Fitch, Beatrice Giustolisi, Carlo Cecchetto, Jordan S. Martin & W. Tecumseh Fitch - 2018 - Frontiers in Psychology 9:387357.
    Whether pattern-parsing mechanisms are specific to language or apply across multiple cognitive domains remains unresolved. Formal language theory provides a mathematical framework for classifying pattern-generating rule sets (or “grammars”) according to complexity. This framework applies to patterns at any level of complexity, stretching from simple sequences, to highly complex tree-like or net-like structures, to any Turing-computable set of strings. Here, we explored human pattern-processing capabilities in the visual domain by generating abstract visual sequences made up of abstract tiles differing in (...)
    2 citations
  32. Grammar induction by unification of type-logical lexicons. Sean A. Fulop - 2010 - Journal of Logic, Language and Information 19 (3):353-381.
    A method is described for inducing a type-logical grammar from a sample of bare sentence trees which are annotated by lambda terms, called term-labelled trees. Any type logic from a permitted class of multimodal logics may be specified for use with the procedure, which induces the lexicon of the grammar including the grammatical categories. A first stage of semantic bootstrapping is performed, which induces a general form lexicon from the sample of term-labelled trees using Fulop’s (J Log (...)
  33. Topological Trees: G. H. von Wright's Theory of Possible Worlds. David H. Sanford - 1998 - In Timothy Childers (ed.), The Logica Yearbook. Academy of Sciences of the Czech Republic.
    In several works on modality, G. H. von Wright presents tree structures to explain possible worlds. Worlds that might have developed from an earlier world are possible relative to it. Actually possible worlds are possible relative to the world as it actually was at some point. Many logically consistent worlds are not actually possible. Transitions from node to node in a tree structure are probabilistic. Probabilities are often more useful than similarities between worlds in treating counterfactual conditionals.
     
  34. Probabilistic models of cognition: Conceptual foundations. Nick Chater & Alan Yuille - 2006 - Trends in Cognitive Sciences 10 (7):287-291.
    Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, ‘sophisticated’ probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the (...)
    89 citations
  35. Lambda Grammars and the Syntax-Semantics Interface. Reinhard Muskens - 2001 - In Robert Van Rooij & Martin Stokhof (eds.), Proceedings of the Thirteenth Amsterdam Colloquium. Amsterdam: ILLC. pp. 150-155.
    In this paper we discuss a new perspective on the syntax-semantics interface. Semantics, in this new set-up, is not ‘read off’ from Logical Forms as in mainstream approaches to generative grammar. Nor is it assigned to syntactic proofs using a Curry-Howard correspondence as in versions of the Lambek Calculus, or read off from f-structures using Linear Logic as in Lexical-Functional Grammar (LFG, Kaplan & Bresnan [9]). All such approaches are based on the idea that syntactic objects (trees, proofs, (...)
    8 citations
  36. Inductive theorem proving based on tree grammars. Sebastian Eberhard & Stefan Hetzl - 2015 - Annals of Pure and Applied Logic 166 (6):665-700.
    2 citations
  37. Perfect trees and elementary embeddings. Sy-David Friedman & Katherine Thompson - 2008 - Journal of Symbolic Logic 73 (3):906-918.
    An important technique in large cardinal set theory is that of extending an elementary embedding j: M → N between inner models to an elementary embedding j*: M[G] → N[G*] between generic extensions of them. This technique is crucial both in the study of large cardinal preservation and of internal consistency. In easy cases, such as when forcing to make the GCH hold while preserving a measurable cardinal (via a reverse Easton iteration of α-Cohen forcing for successor cardinals α), the (...)
    11 citations
  38. A Probabilistic Constraints Approach to Language Acquisition and Processing. Mark S. Seidenberg & Maryellen C. MacDonald - 1999 - Cognitive Science 23 (4):569-588.
    This article provides an overview of a probabilistic constraints framework for thinking about language acquisition and processing. The generative approach attempts to characterize knowledge of language (i.e., competence grammar) and then asks how this knowledge is acquired and used. Our approach is performance oriented: the goal is to explain how people comprehend and produce utterances and how children acquire this skill. Use of language involves exploiting multiple probabilistic constraints over various types of linguistic and nonlinguistic information. Acquisition (...)
    32 citations
  39. Towards a Grammar of Bayesian Coherentism. Michael Schippers - 2015 - Studia Logica 103 (5):955-984.
    One of the integral parts of Bayesian coherentism is the view that the relation of ‘being no less coherent than’ is fully determined by the probabilistic features of the sets of propositions to be ordered. In the last one and a half decades, a variety of probabilistic measures of coherence have been put forward. However, there is large disagreement as to which of these measures best captures the pre-theoretic notion of coherence. This paper contributes to the debate on (...)
    6 citations
  40. Calibrating Generative Models: The Probabilistic Chomsky-Schützenberger Hierarchy. Thomas Icard - 2020 - Journal of Mathematical Psychology 95.
    A probabilistic Chomsky–Schützenberger hierarchy of grammars is introduced and studied, with the aim of understanding the expressive power of generative models. We offer characterizations of the distributions definable at each level of the hierarchy, including probabilistic regular, context-free, (linear) indexed, context-sensitive, and unrestricted grammars, each corresponding to familiar probabilistic machine classes. Special attention is given to distributions on (unary notations for) positive integers. Unlike in the classical case where the "semi-linear" languages all collapse into the regular languages, (...)
    1 citation
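    One concrete instance of the "distributions on (unary notations for) positive integers" mentioned above, sketched under assumed notation rather than taken from the paper: a probabilistic regular grammar with the two rules S -> a S (probability p) and S -> a (probability 1 - p) derives the string a^n with probability p^(n-1) * (1 - p), a geometric distribution on the positive integers.

        import random

        def sample_unary(p=0.5):
            """Sample a string from S -> 'a' S (prob p) | 'a' (prob 1 - p)."""
            s = "a"
            while random.random() < p:
                s += "a"
            return s

        def unary_probability(n, p=0.5):
            """Probability that the grammar derives a^n (n >= 1): geometric in n."""
            return p ** (n - 1) * (1 - p)

        print(sample_unary(), unary_probability(3))   # e.g. 'aaa' 0.125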
  41. Substitution Frege and extended Frege proof systems in non-classical logics. Emil Jeřábek - 2009 - Annals of Pure and Applied Logic 159 (1-2):1-48.
    We investigate the substitution Frege () proof system and its relationship to extended Frege () in the context of modal and superintuitionistic propositional logics. We show that is p-equivalent to tree-like , and we develop a “normal form” for -proofs. We establish connections between for a logic L, and for certain bimodal expansions of L.We then turn attention to specific families of modal and si logics. We prove p-equivalence of and for all extensions of , all tabular logics, all logics (...)
    8 citations
  42. Second-order abstract categorial grammars as hyperedge replacement grammars. Makoto Kanazawa - 2010 - Journal of Logic, Language and Information 19 (2):137-161.
    Second-order abstract categorial grammars (de Groote in Association for computational linguistics, 39th annual meeting and 10th conference of the European chapter, proceedings of the conference, pp. 148–155, 2001) and hyperedge replacement grammars (Bauderon and Courcelle in Math Syst Theory 20:83–127, 1987; Habel and Kreowski in STACS 87: 4th Annual symposium on theoretical aspects of computer science. Lecture notes in computer science, vol 247, Springer, Berlin, pp 207–219, 1987) are two natural ways of generalizing “context-free” grammar formalisms for string and (...)
  43. Grammar as a developmental phenomenon. Guy Dove - 2012 - Biology and Philosophy 27 (5):615-637.
    More and more researchers are examining grammar acquisition from theoretical perspectives that treat it as an emergent phenomenon. In this essay, I argue that a robustly developmental perspective provides a potential explanation for some of the well-known crosslinguistic features of early child language: the process of acquisition is shaped in part by the developmental constraints embodied in von Baer’s law of development. An established model of development, the Developmental Lock, captures and elucidates the probabilistic generalizations at the heart (...)
    3 citations
  44. Simple sentences, substitution, and intuitions. Jennifer Mather Saul - 2007 - New York: Oxford University Press.
    Substitution and simple sentences -- Simple sentences and semantics -- Simple sentences and implicatures -- The enlightenment problem and a common assumption -- Abandoning (EOI) -- Beyond matching propositions -- App. A : extending the account -- App. B : belief reporting.
    20 citations
  45. Linguistics, Logic and Finite Trees. Patrick Blackburn & Wilfried Meyer-Viol - 1994 - Logic Journal of the IGPL 2 (1):3-29.
    A modal logic is developed to deal with finite ordered binary trees as they are used in linguistics. A modal language is introduced with operators for the ‘mother of’, ‘first daughter of’ and ‘second daughter of’ relations together with their transitive reflexive closures. The relevant class of tree models is defined and three linguistic applications of this language are discussed: context free grammars, command relations, and trees decorated with feature structures. An axiomatic proof system is given for which completeness is (...)
    18 citations
  46. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge. Jey Han Lau, Alexander Clark & Shalom Lappin - 2017 - Cognitive Science 41 (5):1202-1241.
    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgements are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible (...)
    12 citations
  47. ULTRA: Universal Grammar as a Universal Parser. David P. Medeiros - 2018 - Frontiers in Psychology 9:307789.
    A central concern of generative grammar is the relationship between hierarchy and word order, traditionally understood as two dimensions of a single syntactic representation. A related concern is directionality in the grammar. Traditional approaches posit process-neutral grammars, embodying knowledge of language, put to use with infinite facility both for production and comprehension. This has crystallized in the view of Merge as the central property of syntax, perhaps its only novel feature. A growing number of approaches explore grammars with (...)
  48. Lexicalized Grammar 101. Matthew Stone - unknown
    This paper presents a simple and versatile tree-rewriting lexicalized grammar formalism, TAGLET, that provides an effective scaffold for introducing advanced topics in a survey course on natural language processing (NLP). Students who implement a strong competence TAGLET parser and generator simultaneously get experience with central computer science ideas and develop an effective starting point for their own subsequent projects in data-intensive and interactive NLP.
     
    1 citation
  49. Toward discourse representation via pregroup grammars. Anne Preller - 2007 - Journal of Logic, Language and Information 16 (2):173-194.
    Every pregroup grammar is shown to be strongly equivalent to one which uses basic types and left and right adjoints of basic types only. Therefore, a semantical interpretation is independent of the order of the associated logic. Lexical entries are read as expressions in a two sorted predicate logic with ∈ and functional symbols. The parsing of a sentence defines a substitution that combines the expressions associated to the individual words. The resulting variable free formula is the translation of (...)
    9 citations
  50. Relevance-Sensitive Truth-Trees. David Makinson - 2021 - In Ivo Düntsch & Edwin Mares (eds.), Alasdair Urquhart on Nonclassical and Algebraic Logic and Complexity of Proofs. Springer Verlag. pp. 23-65.
    Our goal is to articulate a clear rationale for relevance-sensitive propositional logic. The method: truth-trees. Familiar decomposition rules for truth-functional connectives, accompanied by novel ones for the arrow, together with a recursive rule, generate a set of ‘acceptable’ formulae that properly contains all theorems of the well-known system R and is closed under substitution, conjunction, and detachment. We conjecture that it satisfies the crucial letter-sharing condition.
    2 citations
1 — 50 / 975