Results for 'Probabilistic grammars'

958 found
  1. Probabilistic Grammars and Languages.András Kornai - 2011 - Journal of Logic, Language and Information 20 (3):317-328.
    Using an asymptotic characterization of probabilistic finite state languages over a one-letter alphabet we construct a probabilistic language with regular support that cannot be generated by probabilistic CFGs. Since all probability values used in the example are rational, our work is immune to the criticism leveled by Suppes (Synthese 22:95–116, 1970 ) against the work of Ellis ( 1969 ) who first constructed probabilistic FSLs that admit no probabilistic FSGs. Some implications for probabilistic language (...)
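    A minimal sketch of the kind of object discussed in the entry above: a probabilistic finite-state (right-linear) grammar over the one-letter alphabet {a} with rational rule probabilities. The grammar and numbers below are invented for illustration; they are not the construction from the paper.

```python
from fractions import Fraction

# Toy probabilistic right-linear grammar over the one-letter alphabet {a}:
#   S -> a S   with probability 1/3
#   S -> a     with probability 2/3
# It assigns P(a^n) = (1/3)^(n-1) * (2/3), a geometric distribution with
# rational parameters (illustrative only; not Kornai's example).

P_CONTINUE = Fraction(1, 3)   # S -> a S
P_STOP = Fraction(2, 3)       # S -> a

def prob(n: int) -> Fraction:
    """Probability the grammar assigns to the string a^n (n >= 1)."""
    if n < 1:
        return Fraction(0)
    return P_CONTINUE ** (n - 1) * P_STOP

print(prob(1), prob(2), prob(3))                  # 2/3, 2/9, 2/27
print(float(sum(prob(n) for n in range(1, 50))))  # partial sums approach 1.0
```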
  2. Probabilistic grammars for natural languages.Patrick Suppes - 1970 - Synthese 22 (1-2):95 - 116.
  3. (1 other version)Learning a Generative Probabilistic Grammar of Experience: A Process‐Level Model of Language Acquisition.Oren Kolodny, Arnon Lotem & Shimon Edelman - 2014 - Cognitive Science 38 (4):227-267.
    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this (...)
  4. A probabilistic plan recognition algorithm based on plan tree grammars.Christopher W. Geib & Robert P. Goldman - 2009 - Artificial Intelligence 173 (11):1101-1132.
  5. From Exemplar to Grammar: A Probabilistic Analogy‐Based Model of Language Learning.Rens Bod - 2009 - Cognitive Science 33 (5):752-793.
    While rules and exemplars are usually viewed as opposites, this paper argues that they form end points of the same distribution. By representing both rules and exemplars as (partial) trees, we can take into account the fluid middle ground between the two extremes. This insight is the starting point for a new theory of language learning that is based on the following idea: If a language learner does not know which phrase‐structure trees should be assigned to initial sentences, s/he allows (...)
  6. The research article and the science popularization article: a probabilistic functional grammar perspective on direct discourse representation.Adriana Silvina Pagano & Janaina Minelli de Oliveira - 2006 - Discourse Studies 8 (5):627-646.
    This article discusses the results of an investigation on discourse representation in a corpus of 34 million words constituted by texts in Brazilian Portuguese from two different genres: the research article and the science popularization article. Drawing on a systemic functional grammar perspective of language and pursuing a probabilistic approach, it focuses on the realization of lexicogrammatical systems of direct discourse representation as enacting interpersonal and social relationships. It is argued that the citation practices employed by writers in the (...)
  7. A Note on the Expressive Power of Probabilistic Context Free Grammars.Gabriel Infante-Lopez & Maarten de Rijke - 2006 - Journal of Logic, Language and Information 15 (3):219-231.
    We examine the expressive power of probabilistic context free grammars (PCFGs), with a special focus on the use of probabilities as a mechanism for reducing ambiguity by filtering out unwanted parses. Probabilities in PCFGs induce an ordering relation among the set of trees that yield a given input sentence. PCFG parsers return the trees bearing the maximum probability for a given sentence, discarding all other possible trees. This mechanism is naturally viewed as a way of defining a new (...)
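    The abstract above describes rule probabilities as inducing an ordering over the parse trees of an ambiguous sentence, with the parser returning only the maximum-probability tree. A minimal sketch of that filtering mechanism on an invented toy PCFG with a stock PP-attachment ambiguity (grammar, sentence, and numbers are assumptions, not taken from the paper):

```python
from math import prod

# Toy PCFG; rules are keyed by (lhs, tuple of child labels/terminals).
RULE_PROB = {
    ("S", ("NP", "VP")): 1.0,
    ("VP", ("V", "NP")): 0.6,
    ("VP", ("VP", "PP")): 0.4,
    ("NP", ("NP", "PP")): 0.3,
    ("NP", ("she",)): 0.2,
    ("NP", ("the man",)): 0.3,
    ("NP", ("a telescope",)): 0.2,
    ("V", ("saw",)): 1.0,
    ("PP", ("with", "NP")): 1.0,
}

def tree_prob(tree):
    """Probability of a parse tree = product of the probabilities of its rules.
    A tree is (label, children); terminal children are plain strings."""
    label, children = tree
    child_labels = tuple(c if isinstance(c, str) else c[0] for c in children)
    p = RULE_PROB[(label, child_labels)]
    return p * prod(tree_prob(c) for c in children if not isinstance(c, str))

# Two parses of "she saw the man with a telescope".
np_attach = ("S", [("NP", ["she"]),
                   ("VP", [("V", ["saw"]),
                           ("NP", [("NP", ["the man"]),
                                   ("PP", ["with", ("NP", ["a telescope"])])])])])
vp_attach = ("S", [("NP", ["she"]),
                   ("VP", [("VP", [("V", ["saw"]), ("NP", ["the man"])]),
                           ("PP", ["with", ("NP", ["a telescope"])])])])

ranked = sorted({"NP attachment": np_attach, "VP attachment": vp_attach}.items(),
                key=lambda kv: tree_prob(kv[1]), reverse=True)
for name, tree in ranked:
    print(name, tree_prob(tree))
# The parser keeps the first (maximum-probability) parse and filters out the rest.
```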
  8. A Probabilistic Constraints Approach to Language Acquisition and Processing.Mark S. Seidenberg & Maryellen C. MacDonald - 1999 - Cognitive Science 23 (4):569-588.
    This article provides an overview of a probabilistic constraints framework for thinking about language acquisition and processing. The generative approach attempts to characterize knowledge of language (i.e., competence grammar) and then asks how this knowledge is acquired and used. Our approach is performance oriented: the goal is to explain how people comprehend and produce utterances and how children acquire this skill. Use of language involves exploiting multiple probabilistic constraints over various types of linguistic and nonlinguistic information. Acquisition is (...)
  9. Probabilistic models of cognition: Conceptual foundations.Nick Chater & Alan Yuille - 2006 - Trends in Cognitive Sciences 10 (7):287-291.
    Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, ‘sophisticated’ probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of (...)
  10. Calibrating Generative Models: The Probabilistic Chomsky-Schützenberger Hierarchy.Thomas Icard - 2020 - Journal of Mathematical Psychology 95.
    A probabilistic Chomsky–Schützenberger hierarchy of grammars is introduced and studied, with the aim of understanding the expressive power of generative models. We offer characterizations of the distributions definable at each level of the hierarchy, including probabilistic regular, context-free, (linear) indexed, context-sensitive, and unrestricted grammars, each corresponding to familiar probabilistic machine classes. Special attention is given to distributions on (unary notations for) positive integers. Unlike in the classical case where the "semi-linear" languages all collapse into the (...)
  11. A note on the expressive power of probabilistic context free grammars.Gabriel Infante-Lopez & Maarten De Rijke - 2006 - Journal of Logic, Language and Information 15 (3):219-231.
    We examine the expressive power of probabilistic context free grammars (PCFGs), with a special focus on the use of probabilities as a mechanism for reducing ambiguity by filtering out unwanted parses. Probabilities in PCFGs induce an ordering relation among the set of trees that yield a given input sentence. PCFG parsers return the trees bearing the maximum probability for a given sentence, discarding all other possible trees. This mechanism is naturally viewed as a way of defining a new (...)
  12. Grammar as a developmental phenomenon.Guy Dove - 2012 - Biology and Philosophy 27 (5):615-637.
    More and more researchers are examining grammar acquisition from theoretical perspectives that treat it as an emergent phenomenon. In this essay, I argue that a robustly developmental perspective provides a potential explanation for some of the well-known crosslinguistic features of early child language: the process of acquisition is shaped in part by the developmental constraints embodied in von Baer’s law of development. An established model of development, the Developmental Lock, captures and elucidates the probabilistic generalizations at the heart of (...)
  13. (1 other version)An O(n³) Agenda-Based Chart Parser for Arbitrary Probabilistic Context-Free Grammars.Dan Klein & Christopher D. Manning - unknown
    While O(n³) methods for parsing probabilistic context-free grammars (PCFGs) are well known, a tabular parsing framework for arbitrary PCFGs which allows for bottom-up, top-down, and other parsing strategies has not yet been provided. This paper presents such an algorithm, and shows its correctness and advantages over prior work. The paper finishes by bringing out the connections between the algorithm and work on hypergraphs, which permits us to extend the presented Viterbi (best parse) algorithm to an inside (...)
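    For orientation only: the sketch below is not the agenda-based algorithm of the paper above, but the standard O(n³) Viterbi/CKY dynamic program for a PCFG in Chomsky normal form, the tabular baseline such work extends. The grammar and probabilities are invented.

```python
# Viterbi CKY for a PCFG in Chomsky normal form: O(n^3 * |grammar|) time.
BINARY = {           # lhs -> list of (right1, right2, prob); invented numbers
    "S":  [("NP", "VP", 1.0)],
    "VP": [("V", "NP", 0.7), ("VP", "PP", 0.3)],
    "NP": [("NP", "PP", 0.2), ("DT", "N", 0.5)],
    "PP": [("P", "NP", 1.0)],
}
LEXICAL = {          # terminal -> list of (preterminal, prob)
    "she": [("NP", 0.3)], "saw": [("V", 1.0)], "the": [("DT", 1.0)],
    "man": [("N", 0.5)], "telescope": [("N", 0.5)], "with": [("P", 1.0)],
}

def viterbi_cky(words):
    n = len(words)
    # best[i][j][nt] = (best probability, backpointer) for nt spanning words[i:j]
    best = [[dict() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        for nt, p in LEXICAL.get(w, []):
            best[i][i + 1][nt] = (p, w)
    for length in range(2, n + 1):            # span length
        for i in range(n - length + 1):       # span start
            j = i + length
            for k in range(i + 1, j):         # split point
                for lhs, rules in BINARY.items():
                    for r1, r2, p in rules:
                        if r1 in best[i][k] and r2 in best[k][j]:
                            cand = p * best[i][k][r1][0] * best[k][j][r2][0]
                            if cand > best[i][j].get(lhs, (0.0,))[0]:
                                best[i][j][lhs] = (cand, (k, r1, r2))
    return best[0][n].get("S")                # (probability, backpointer) or None

print(viterbi_cky("she saw the man with the telescope".split()))
```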
  14. A Probabilistic Model of Lexical and Syntactic Access and Disambiguation.Daniel Jurafsky - 1996 - Cognitive Science 20 (2):137-194.
    The problems of access—retrieving linguistic structure from some mental grammar —and disambiguation—choosing among these structures to correctly parse ambiguous linguistic input—are fundamental to language understanding. The literature abounds with psychological results on lexical access, the access of idioms, syntactic rule access, parsing preferences, syntactic disambiguation, and the processing of garden‐path sentences. Unfortunately, it has been difficult to combine models which account for these results to build a general, uniform model of access and disambiguation at the lexical, idiomatic, and syntactic levels. (...)
  15. Towards a Grammar of Bayesian Coherentism.Michael Schippers - 2015 - Studia Logica 103 (5):955-984.
    One of the integral parts of Bayesian coherentism is the view that the relation of ‘being no less coherent than’ is fully determined by the probabilistic features of the sets of propositions to be ordered. In the last one and a half decades, a variety of probabilistic measures of coherence have been put forward. However, there is large disagreement as to which of these measures best captures the pre-theoretic notion of coherence. This paper contributes to the debate on (...)
  16. Grammar and Expectation in Active Dependency Resolution: Experimental and Modeling Evidence From Norwegian.Anastasia Kobzeva & Dave Kush - 2024 - Cognitive Science 48 (10):e13501.
    Filler-gap dependency resolution is often characterized as an active process. We probed the mechanisms that determine where and why comprehenders posit gaps during incremental processing using Norwegian as our test language. First, we investigated why active filler-gap dependency resolution is suspended inside island domains like embedded questions in some languages. Processing-based accounts hold that resource limitations prevent gap-filling in embedded questions across languages, while grammar-based accounts predict that active gap-filling is only blocked in languages where embedded questions are grammatical islands. (...)
  17. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.Jey Han Lau, Alexander Clark & Shalom Lappin - 2017 - Cognitive Science 41 (5):1202-1241.
    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgements are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to (...)
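    Work in this line scores sentences with a language model and normalizes the score for length and lexical frequency before comparing it with gradient acceptability judgments. One normalization used in this literature is the syntactic log-odds ratio (SLOR); the sketch below only illustrates that scoring idea, with invented numbers, and is not the paper's evaluation.

```python
def slor(logprob_model: float, unigram_logprobs: list[float]) -> float:
    """Syntactic log-odds ratio: length-normalized difference between a model's
    log probability for a sentence and the sum of its words' unigram log
    probabilities. Higher values ~ more acceptable, other things being equal."""
    return (logprob_model - sum(unigram_logprobs)) / len(unigram_logprobs)

# Invented numbers for a 5-word sentence: one the model likes more than its word
# frequencies alone would predict, and one it likes less.
unigrams = [-6.1, -4.0, -5.5, -3.2, -7.0]
print(slor(-18.2, unigrams))   # positive -> relatively acceptable
print(slor(-33.0, unigrams))   # negative -> relatively degraded
```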
  18. (1 other version)Natural Language Grammar Induction using a Constituent-Context Model.Dan Klein & Christopher D. Manning - unknown
    This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher quality analyses, giving the best published results on the ATIS dataset.
  19. Feature Selection for a Rich HPSG Grammar Using Decision Trees.Christopher D. Manning & Kristina Toutanova - unknown
    This paper examines feature selection for log linear models over rich constraint-based grammar (HPSG) representations by building decision trees over features in corresponding probabilistic context free grammars (PCFGs). We show that single decision trees do not make optimal use of the available information; constructed ensembles of decision trees based on different feature subspaces show significant performance gains (14% parse selection error reduction). We compare the performance of the learned PCFG grammars and log linear models over the (...)
  20. Compositionality in rational analysis: Grammar-based induction for concept learning.Noah D. Goodman, Joshua B. Tenenbaum, Thomas L. Griffiths & Jacob Feldman - 2008 - In Nick Chater & Mike Oaksford (eds.), The Probabilistic Mind: Prospects for Bayesian Cognitive Science. Oxford University Press.
  21. Compositionality in rational analysis: grammar-based induction for concept learning.Noah D. Goodman, Joshua B. Tenenbaum, Thomas L. Griffiths & Jacob Feldman - 2008 - In Nick Chater & Mike Oaksford (eds.), The Probabilistic Mind: Prospects for Bayesian Cognitive Science. Oxford University Press.
  22. Uncertainty About the Rest of the Sentence.John Hale - 2006 - Cognitive Science 30 (4):643-672.
    A word-by-word human sentence processing complexity metric is presented. This metric formalizes the intuition that comprehenders have more trouble on words contributing larger amounts of information about the syntactic structure of the sentence as a whole. The formalization is in terms of the conditional entropy of grammatical continuations, given the words that have been heard so far. To calculate the predictions of this metric, Wilson and Carroll's (1954) original entropy reduction idea is extended to infinite languages. This is demonstrated with (...)
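    The metric above is the word-by-word change in the conditional entropy of the grammatical continuations of the sentence so far; the paper's contribution is computing it for infinite languages. The sketch below brute-forces the same bookkeeping for an invented finite toy language, just to make the quantity concrete.

```python
from math import log2

# Invented finite "language": complete sentences and their probabilities (sum to 1).
LANGUAGE = {
    ("the", "dog", "barked"): 0.4,
    ("the", "dog", "bit", "the", "man"): 0.2,
    ("the", "man", "left"): 0.3,
    ("the", "man", "bit", "the", "dog"): 0.1,
}

def continuation_entropy(prefix):
    """Entropy (bits) of the distribution over complete sentences consistent
    with the words heard so far."""
    consistent = {s: p for s, p in LANGUAGE.items() if s[:len(prefix)] == tuple(prefix)}
    total = sum(consistent.values())
    return -sum((p / total) * log2(p / total) for p in consistent.values())

sentence = ["the", "dog", "bit", "the", "man"]
prev = continuation_entropy([])
for i, word in enumerate(sentence, start=1):
    h = continuation_entropy(sentence[:i])
    print(f"after {word!r}: H = {h:.3f} bits, reduction = {prev - h:.3f}")
    prev = h
```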
  23. Assessing the complexity of lectal competence: the register-specificity of the dative alternation after give.Benedikt Szmrecsanyi, Laura Rosseel, Jason Grafmiller & Alexandra Engel - 2022 - Cognitive Linguistics 33 (4):727-766.
    Recent evidence suggests that probabilistic grammars may be modulated by communication mode and genre. Accordingly, the question arises how complex language users’ lectal competence is, where complexity is proportional to the extent to which choice-making processes depend on the situation of language use. Do probabilistic constraints vary when we talk to a friend compared to when we give a speech? Are differences between spoken and written language larger than those within each mode? In the present study, we (...)
  24. Two Models of Minimalist, Incremental Syntactic Analysis.Edward P. Stabler - 2013 - Topics in Cognitive Science 5 (3):611-633.
    Minimalist grammars (MGs) and multiple context-free grammars (MCFGs) are weakly equivalent in the sense that they define the same languages, a large mildly context-sensitive class that properly includes context-free languages. But in addition, for each MG, there is an MCFG which is strongly equivalent in the sense that it defines the same language with isomorphic derivations. However, the structure-building rules of MGs but not MCFGs are defined in a way that generalizes across categories. Consequently, MGs can be exponentially (...)
  25. A Fast Learning Method for Large-Scale Context-Free Grammars from Real Corpora Using Parse Forests [構文森を用いた実コーパスからの大規模な文脈自由文法の高速学習法].亀谷 由隆 & 栗原 賢一 - 2004 - Transactions of the Japanese Society for Artificial Intelligence 19:360-367.
    The task of inducing grammar structures has received a great deal of attention. Researchers have pursued it for different reasons: to use grammar induction as the first stage in building large treebanks, or to build better language models. However, grammar induction is inherently computationally expensive. To keep the cost manageable, some grammar induction algorithms add new production rules incrementally, refining the grammar while keeping its computational complexity low. In this paper, we propose a new efficient grammar induction algorithm. Although (...)
  26. The Role of Lexical Frequency in the Acceptability of Syntactic Variants: Evidence From that‐Clauses in Polish.Dagmar Divjak - 2017 - Cognitive Science 41 (2):354-382.
    A number of studies report that frequency is a poor predictor of acceptability, in particular at the lower end of the frequency spectrum. Because acceptability judgments provide a substantial part of the empirical foundation of dominant linguistic traditions, understanding how acceptability relates to frequency, one of the most robust predictors of human performance, is crucial. The relation between low frequency and acceptability is investigated using corpus‐ and behavioral data on the distribution of infinitival and finite that‐complements in Polish. Polish verbs (...)
  27. Development of Different Forms of Skill Learning Throughout the Lifespan.Ágnes Lukács & Ferenc Kemény - 2015 - Cognitive Science 39 (2):383-404.
    The acquisition of complex motor, cognitive, and social skills, like playing a musical instrument or mastering sports or a language, is generally associated with implicit skill learning (SL). Although it is a general view that SL is most effective in childhood, and such skills are best acquired if learning starts early, this idea has rarely been tested by systematic empirical studies on the developmental pathways of SL from childhood to old age. In this paper, we challenge the view that childhood (...)
  28. Freezing: theoretical approaches and empirical domains.Jutta Hartmann (ed.) - 2018 - Boston: De Gruyter.
    Exploring the concepts of Freezing: Theoretical and empirical perspectives. Theoretical advancement. Ur Shlonsky and Luigi Rizzi: Criterial Freezing in small clauses and the cartography of copular constructions -- Angel J. Gallego: Freezing Effects in a free-Merge System -- Gereon Müller: Freezing in complex prefields. Empirical domains. Norbert Corver: The Freezing points of the (Dutch) adjectival system -- Jutta M. Hartmann: Freezing in it-clefts: Movement and focus -- Josef Bayer: Criterial Freezing in the syntax of particles -- Michael S. Rochemont: Only (...)
  29. The Role of Lexical Frequency in the Acceptability of Syntactic Variants: Evidence From that‐Clauses in Polish.Dagmar Divjak - 2016 - Cognitive Science 40 (7):n/a-n/a.
    A number of studies report that frequency is a poor predictor of acceptability, in particular at the lower end of the frequency spectrum. Because acceptability judgments provide a substantial part of the empirical foundation of dominant linguistic traditions, understanding how acceptability relates to frequency, one of the most robust predictors of human performance, is crucial. The relation between low frequency and acceptability is investigated using corpus- and behavioral data on the distribution of infinitival and finite that-complements in Polish. Polish verbs (...)
  30. A Rational Analysis of Rule‐Based Concept Learning.Noah D. Goodman, Joshua B. Tenenbaum, Jacob Feldman & Thomas L. Griffiths - 2008 - Cognitive Science 32 (1):108-154.
    This article proposes a new model of human concept learning that provides a rational analysis of learning feature‐based concepts. This model is built upon Bayesian inference for a grammatically structured hypothesis space—a concept language of logical rules. This article compares the model predictions to human generalization judgments in several well‐known category learning experiments, and finds good agreement for both average and individual participant generalizations. This article further investigates judgments for a broad set of 7‐feature concepts—a more natural setting in several (...)
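    The model above defines a prior over rule-like concepts through a grammar over a concept language of logical formulas. The sketch below is only a caricature of that idea, not the authors' model: an invented probabilistic grammar over DNF-style formulas built from binary features, sampled recursively, with shorter formulas receiving higher prior probability.

```python
import math
import random

FEATURES = ["f1", "f2", "f3"]                 # invented binary features

def sample_literal(rng):
    f = rng.choice(FEATURES)
    positive = rng.random() < 0.5
    logp = math.log(1 / len(FEATURES)) + math.log(0.5)
    return (f if positive else f"not {f}"), logp

def sample_conjunction(rng):
    lit, logp = sample_literal(rng)
    if rng.random() < 0.5:                    # CONJ -> LIT            (prob 0.5)
        return lit, logp + math.log(0.5)
    rest, logp_rest = sample_conjunction(rng) # CONJ -> LIT and CONJ   (prob 0.5)
    return f"{lit} and {rest}", logp + logp_rest + math.log(0.5)

def sample_formula(rng):
    conj, logp = sample_conjunction(rng)
    if rng.random() < 0.5:                    # DISJ -> CONJ           (prob 0.5)
        return conj, logp + math.log(0.5)
    rest, logp_rest = sample_formula(rng)     # DISJ -> CONJ or DISJ   (prob 0.5)
    return f"({conj}) or ({rest})", logp + logp_rest + math.log(0.5)

rng = random.Random(0)
for _ in range(5):
    formula, logp = sample_formula(rng)
    print(f"log prior = {logp:7.3f}   {formula}")
# Shorter formulas get higher prior probability: the regularizing bias that
# grammar-based induction builds into concept learning.
```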
  31. Discovering syntactic deep structure via Bayesian statistics.Jason Eisner - 2002 - Cognitive Science 26 (3):255-268.
    In the Bayesian framework, a language learner should seek a grammar that explains observed data well and is also a priori probable. This paper proposes such a measure of prior probability. Indeed it develops a full statistical framework for lexicalized syntax. The learner's job is to discover the system of probabilistic transformations (often called lexical redundancy rules) that underlies the patterns of regular and irregular syntactic constructions listed in the lexicon. Specifically, the learner discovers what transformations apply in the (...)
  32. (1 other version)Comparing direct and indirect measures of sequence learning.Axel Cleeremans - unknown
    Comparing the relative sensitivity of direct and indirect measures of learning is proposed as the best way to provide evidence for unconscious learning when both conceptual and operative definitions of awareness are lacking. This approach was first proposed by Reingold & Merikle (1988) in the context of subliminal perception. In this paper, we apply it to a choice reaction time task in which the material is generated based on a probabilistic finite-state grammar (Cleeremans, 1993). We show (1) that participants (...)
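    In studies of this kind, the stimulus stream is generated from a probabilistic finite-state grammar, so that each element is only statistically predictable from its context. The generator below uses an invented transition table, not the grammar of Cleeremans (1993).

```python
import random

# Invented probabilistic finite-state grammar: from each state, the next stimulus
# is chosen with the given probability and moves the grammar to a new state.
TRANSITIONS = {
    "S0": [("A", "S1", 0.7), ("B", "S2", 0.3)],
    "S1": [("C", "S0", 0.8), ("D", "S2", 0.2)],
    "S2": [("B", "S0", 0.5), ("D", "S1", 0.5)],
}

def generate(n_trials: int, seed: int = 0):
    rng = random.Random(seed)
    state, out = "S0", []
    for _ in range(n_trials):
        symbols, states, probs = zip(*TRANSITIONS[state])
        i = rng.choices(range(len(symbols)), weights=probs, k=1)[0]
        out.append(symbols[i])
        state = states[i]
    return out

print("".join(generate(40)))
# Direct measures (generation/recognition) and indirect measures (reaction times)
# can then be compared on the same probabilistically structured material.
```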
  33. The Phonological Enterprise.Mark Hale & Charles Reiss - 2008 - Oxford University Press UK.
    This book scrutinizes recent work in phonological theory from the perspective of Chomskyan generative linguistics and argues that progress in the field depends on taking seriously the idea that phonology is best studied as a mental computational system derived from an innate base, phonological Universal Grammar. Two simple problems of phonological analysis provide a frame for a variety of topics throughout the book. The competence-performance distinction and markedness theory are both addressed in some detail, especially with reference to phonological acquisition. (...)
  34. Implicit sequence learning: The truth is in the details.Axel Cleeremans & L. Jiménez - 1998 - In Michael A. Stadler & Peter A. Frensch (eds.), Handbook of Implicit Learning. Sage Publications.
    Over the past decade, sequence learning has gradually become a central paradigm through which to study implicit learning. In this chapter, we start by briefly summarizing the results obtained with different variants of the sequence learning paradigm. We distinguish three subparadigms in terms of whether the stimulus material is generated either by following a fixed and repeating sequence (e.g., Nissen & Bullemer, 1987), by relying on a complex set of rules from which one can produce several alternative deterministic sequences (e.g., (...)
  35. Quantifying Structural and Non‐structural Expectations in Relative Clause Processing.Zhong Chen & John T. Hale - 2021 - Cognitive Science 45 (1):e12927.
    Information‐theoretic complexity metrics, such as Surprisal (Hale, 2001; Levy, 2008) and Entropy Reduction (Hale, 2003), are linking hypotheses that bridge theorized expectations about sentences and observed processing difficulty in comprehension. These expectations can be viewed as syntactic derivations constrained by a grammar. However, this expectation‐based view is not limited to syntactic information alone. The present study combines structural and non‐structural information in unified models of word‐by‐word sentence processing difficulty. Using probabilistic minimalist grammars (Stabler, 1997), we extend expectation‐based models (...)
  36. Comparing direct and indirect measures of sequence learning.Luis Jimenez, Castor Mendez & Axel Cleeremans - 1996 - Journal of Experimental Psychology 22 (4):948-969.
    Comparing the relative sensitivity of direct and indirect measures of learning is proposed as the best way to provide evidence for unconscious learning when both conceptual and operative definitions of awareness are lacking. This approach was first proposed by Reingold & Merikle (1988) in the context of subliminal perception. In this paper, we apply it to a choice reaction time task in which the material is generated based on a probabilistic finite-state grammar (Cleeremans, 1993). We show (1) that participants (...)
  37. Innateness, autonomy, universality? Neurobiological approaches to language.Ralph-Axel Müller - 1996 - Behavioral and Brain Sciences 19 (4):611-631.
    The concepts of the innateness, universality, species-specificity, and autonomy of the human language capacity have had an extreme impact on the psycholinguistic debate for over thirty years. These concepts are evaluated from several neurobiological perspectives, with an emphasis on the emergence of language and its decay due to brain lesion and progressive brain disease.Evidence of perceptuomotor homologies and preadaptations for human language in nonhuman primates suggests a gradual emergence of language during hominid evolution. Regarding ontogeny, the innate component of language (...)
  38. Modeling Human Morphological Competence.Yohei Oseki & Alec Marantz - 2020 - Frontiers in Psychology 11.
    One of the central debates in the cognitive science of language has revolved around the nature of human linguistic competence. Whether syntactic competence should be characterized by abstract hierarchical structures or reduced to surface linear strings has been actively debated, but the nature of morphological competence has been insufficiently appreciated despite the parallel question in the cognitive science literature. In this paper, in order to investigate whether morphological competence should be characterized by abstract hierarchical structures, we conducted the crowdsourced acceptability (...)
  39. Toward a Connectionist Model of Recursion in Human Linguistic Performance.Morten H. Christiansen & Nick Chater - 1999 - Cognitive Science 23 (2):157-205.
    Naturally occurring speech contains only a limited amount of complex recursive structure, and this is reflected in the empirically documented difficulties that people experience when processing such structures. We present a connectionist model of human performance in processing recursive language structures. The model is trained on simple artificial languages. We find that the qualitative performance profile of the model matches human behavior, both on the relative difficulty of center‐embedding and cross‐dependency, and between the processing of these complex recursive structures and (...)
  40. Complexity in Language Acquisition.Alexander Clark & Shalom Lappin - 2013 - Topics in Cognitive Science 5 (1):89-110.
    Learning theory has frequently been applied to language acquisition, but discussion has largely focused on information theoretic problems—in particular on the absence of direct negative evidence. Such arguments typically neglect the probabilistic nature of cognition and learning in general. We argue first that these arguments, and analyses based on them, suffer from a major flaw: they systematically conflate the hypothesis class and the learnable concept class. As a result, they do not allow one to draw significant conclusions about the (...)
  41. Language Learning From Positive Evidence, Reconsidered: A Simplicity-Based Approach.Anne S. Hsu, Nick Chater & Paul Vitányi - 2013 - Topics in Cognitive Science 5 (1):35-55.
    Children learn their native language by exposure to their linguistic and communicative environment, but apparently without requiring that their mistakes be corrected. Such learning from “positive evidence” has been viewed as raising “logical” problems for language acquisition. In particular, without correction, how is the child to recover from conjecturing an over-general grammar, which will be consistent with any sentence that the child hears? There have been many proposals concerning how this “logical problem” can be dissolved. In this study, we review (...)
  42. A Bayesian‐Network Approach to Lexical Disambiguation.Leila M. R. Eizirik, Valmir C. Barbosa & Sueli B. T. Mendes - 1993 - Cognitive Science 17 (2):257-283.
    Lexical ambiguity can be syntactic if it involves more than one grammatical category for a single word, or semantic if more than one meaning can be associated with a word. In this article we discuss the application of a Bayesian‐network model in the resolution of lexical ambiguities of both types. The network we propose comprises a parsing subnetwork, which can be constructed automatically for any context‐free grammar, and a subnetwork for semantic analysis, which, in the spirit of Fillmore's (1968) case (...)
  43. Statistical models of syntax learning and use.Mark Johnson & Stefan Riezler - 2002 - Cognitive Science 26 (3):239-253.
    This paper shows how to define probability distributions over linguistically realistic syntactic structures in a way that permits us to define language learning and language comprehension as statistical problems. We demonstrate our approach using lexical‐functional grammar (LFG), but our approach generalizes to virtually any linguistic theory. Our probabilistic models are maximum entropy models. In this paper we concentrate on statistical inference procedures for learning the parameters that define these probability distributions. We point out some of the practical problems that (...)
  44. Limited Evidence of an Association Between Language, Literacy, and Procedural Learning in Typical and Atypical Development: A Meta‐Analysis.Cátia M. Oliveira, Lisa M. Henderson & Marianna E. Hayiou-Thomas - 2023 - Cognitive Science 47 (7):e13310.
    The ability to extract patterns from sensory input across time and space is thought to underlie the development and acquisition of language and literacy skills, particularly the subdomains marked by the learning of probabilistic knowledge. Thus, impairments in procedural learning are hypothesized to underlie neurodevelopmental disorders, such as dyslexia and developmental language disorder. In the present meta‐analysis, comprising 2396 participants from 39 independent studies, the continuous relationship between language, literacy, and procedural learning on the Serial Reaction Time task (SRTT) (...)
  45. “Identifying Phrasal Connectives in Italian Using Quantitative Methods”.Edoardo Zamuner, Fabio Tamburini & Cristiana de Sanctis - 2002 - In Stefania Nuccorini (ed.), Phrases and Phraseology – Data and Descriptions. Peter Lang Verlag.
    In recent decades, the analysis of phraseology has made use of the exploration of large corpora as a source of quantitative information about language. This paper intends to present the main lines of work in progress based on this empirical approach to linguistic analysis. In particular, we focus our attention on some problems relating to the morpho-syntactic annotation of corpora. The CORIS/CODIS corpus of contemporary written Italian, developed at CILTA – University of Bologna (Rossini Favretti 2000; Rossini Favretti, Tamburini, De (...)
  46. The Later Wittgenstein: The Emergence of a New Philosophical Method by S. Stephen Hilmy. [REVIEW]John Churchill - 1989 - The Thomist 53 (3):533-538.
    In lieu of an abstract, here is a brief excerpt of the content: …Grammar of Assent. Yet whereas Reid had urged a fundamental agreement on first principles on the intuitive basis of common sense, Newman thought such principles were discovered inductively and that there might be much disagreement. It was the disagreement itself that led to the need for a better understanding of the reasoning process. In place of common sense, Newman appealed to the illative sense. In (...)
  47. Application of Normalized Compression Distance and Lempel-Ziv Jaccard Distance in Micro-electrode Signal Stream Classification for the Surgical Treatment of Parkinson’s Disease.Kamil Ząbkiewicz - 2018 - Studies in Logic, Grammar and Rhetoric 56 (1):45-57.
    Parkinson’s Disease can be treated with the use of microelectrode recording and stimulation. This paper presents a data stream classifier that analyses raw data from micro-electrodes and decides whether the measurements were taken from the subthalamic nucleus (STN) or not. The novelty of the proposed approach is based on the fact that distances based on raw data are used. Two distances are investigated in this paper, i.e. Normalized Compression Distance (NCD) and Lempel-Ziv Jaccard Distance (LZJD). No new features needed to (...)
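    The classifier described above works directly on raw signal streams via compression-based distances. The sketch below implements the standard Normalized Compression Distance with zlib and a simplified Lempel-Ziv dictionary Jaccard distance (real LZJD adds hashing and other refinements); the example "signals" are invented byte strings.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance with zlib as the compressor:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

def lz_set(x: bytes) -> set:
    """Phrase set from a simple LZ78-style parse (a simplification of LZJD)."""
    phrases, current = set(), b""
    for byte in x:
        current += bytes([byte])
        if current not in phrases:
            phrases.add(current)
            current = b""
    return phrases

def lz_jaccard_distance(x: bytes, y: bytes) -> float:
    a, b = lz_set(x), lz_set(y)
    return 1.0 - len(a & b) / len(a | b)

# Invented toy "signals": two similar periodic streams and one dissimilar stream.
s1 = bytes([10, 12, 11, 10, 12, 11] * 50)
s2 = bytes([10, 12, 11, 10, 13, 11] * 50)
s3 = bytes(range(256)) * 2

print(ncd(s1, s2), ncd(s1, s3))
print(lz_jaccard_distance(s1, s2), lz_jaccard_distance(s1, s3))
# A nearest-neighbour rule over such distances can then label a new stream
# (e.g., STN vs. non-STN) by the class of its closest reference recordings.
```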
  48. Thou Shalt Not Squander Life – Comparing Five Approaches to Argument Strength.Simon Wells, Marcin Selinger, David Godden, Kamila Dębowska-Kozłowska & Frank Zenker - 2023 - Studies in Logic, Grammar and Rhetoric 68 (1):133-167.
    Different approaches analyze the strength of a natural language argument in different ways. This paper contrasts the dialectical, structural, probabilistic (or Bayesian), computational, and empirical approaches by exemplarily applying them to a single argumentative text (Epicureans on Squandering Life; Aikin & Talisse, 2019). Rather than pitching these approaches against one another, our main goal is to show the room for fruitful interaction. Our focus is on a dialectical analysis of the squandering argument as an argumentative response that voids an (...)
  49. Probabilistic Metaphysics.James H. Fetzer - 1988 - In J. H. Fetzer (ed.), Probability and Causality: Essays in Honor of Wesley C. Salmon. D. Reidel. pp. 192--109.
  50. in Generative Grammar.Nicolas Ruwet - 1981 - In W. Klein & W. Levelt (eds.), Crossing the Boundaries in Linguistics. Reidel. pp. 23.
1 — 50 / 958