Results for 'Probabilistic languages'

969 found
  1. Concept learning in a probabilistic language-of-thought. How is it possible and what does it presuppose? Matteo Colombo - 2023 - Behavioral and Brain Sciences 46:e271.
    Where does a probabilistic language-of-thought (PLoT) come from? How can we learn new concepts based on probabilistic inferences operating on a PLoT? Here, I explore these questions, sketching a traditional circularity objection to LoT and canvassing various approaches to addressing it. I conclude that PLoT-based cognitive architectures can support genuine concept learning; but, currently, it is unclear that they enjoy more explanatory breadth in relation to concept learning than alternative architectures that do not posit any LoT.
  2. Interpreting the Probabilistic Language in IPCC Reports. Corey Dethier - 2023 - Ergo: An Open Access Journal of Philosophy 10.
    The Intergovernmental Panel on Climate Change (IPCC) often qualifies its statements by use of probabilistic “likelihood” language. In this paper, I show that this language is not properly interpreted in either frequentist or Bayesian terms—simply put, the IPCC uses both kinds of statistics to calculate these likelihoods. I then offer a deflationist interpretation: the probabilistic language expresses nothing more than how compatible the evidence is with the given hypothesis according to some method that generates normalized scores. I end (...)
  3. Probabilistic Grammars and Languages. András Kornai - 2011 - Journal of Logic, Language and Information 20 (3):317-328.
    Using an asymptotic characterization of probabilistic finite state languages over a one-letter alphabet we construct a probabilistic language with regular support that cannot be generated by probabilistic CFGs. Since all probability values used in the example are rational, our work is immune to the criticism leveled by Suppes (Synthese 22:95–116, 1970) against the work of Ellis (1969) who first constructed probabilistic FSLs that admit no probabilistic FSGs. Some implications for probabilistic (...)
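    A minimal sketch of the kind of object under discussion, assuming a toy probabilistic regular grammar over the one-letter alphabet {a}; the rules and probabilities are illustrative assumptions, not taken from Kornai's paper:
      % Toy probabilistic regular grammar (the rule probabilities for S sum to 1):
      %   S -> a S   with probability 1/2
      %   S -> a     with probability 1/2
      % Probability of deriving the string a^n (unary notation for the integer n >= 1):
      P(a^{n}) \;=\; \left(\tfrac{1}{2}\right)^{n-1} \cdot \tfrac{1}{2} \;=\; 2^{-n},
      \qquad \sum_{n \ge 1} 2^{-n} \;=\; 1,
      % so this toy grammar defines a geometric distribution whose support is a regular
      % (indeed finite-state) language.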
  4. Interaction of phonological biases and frequency in learning a probabilistic language pattern. Hanbyul Song & James White - 2022 - Cognition 226 (C):105170.
  5. A Probabilistic Constraints Approach to Language Acquisition and Processing. Mark S. Seidenberg & Maryellen C. MacDonald - 1999 - Cognitive Science 23 (4):569-588.
    This article provides an overview of a probabilistic constraints framework for thinking about language acquisition and processing. The generative approach attempts to characterize knowledge of language (i.e., competence grammar) and then asks how this knowledge is acquired and used. Our approach is performance oriented: the goal is to explain how people comprehend and produce utterances and how children acquire this skill. Use of language involves exploiting multiple probabilistic constraints over various types of linguistic and nonlinguistic information. Acquisition is (...)
  6. Probabilistic models of language processing and acquisition. Nick Chater & Christopher D. Manning - 2006 - Trends in Cognitive Sciences 10 (7):335–344.
    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of (...)
  7. Language polygenesis: A probabilistic model. David A. Freedman & William Wang - unknown
    Monogenesis of language is widely accepted, but the conventional argument seems to be mistaken; a simple probabilistic model shows that polygenesis is likely. Other prehistoric inventions are discussed, as are problems in tracing linguistic lineages. Language is a system of representations; within such a system, words can evoke complex and systematic responses. Along with its social functions, language is important to humans as a mental instrument. Indeed, the invention of language, that is, the accumulation of symbols to represent emotions, objects, (...)
  8. Accuracy, Language Dependence, and Joyce’s Argument for Probabilism. Branden Fitelson - 2012 - Philosophy of Science 79 (1):167-174.
    In this article, I explain how a variant of David Miller's argument concerning the language dependence of the accuracy of predictions can be applied to Joyce's notion of the accuracy of “estimates of numerical truth-values”. This leads to a potential problem for Joyce's accuracy-dominance-based argument for the conclusion that credences should obey the probability calculus.
  9. The probabilistic analysis of language acquisition: Theoretical, computational, and experimental analysis. Anne S. Hsu, Nick Chater & Paul M. B. Vitányi - 2011 - Cognition 120 (3):380-390.
  10. The Logical Problem of Language Acquisition: A Probabilistic Perspective. Anne S. Hsu & Nick Chater - 2010 - Cognitive Science 34 (6):972-1016.
    Natural language is full of patterns that appear to fit with general linguistic rules but are ungrammatical. There has been much debate over how children acquire these “linguistic restrictions,” and whether innate language knowledge is needed. Recently, it has been shown that restrictions in language can be learned asymptotically via probabilistic inference using the minimum description length (MDL) principle. Here, we extend the MDL approach to give a simple and practical methodology for estimating how much linguistic data are required (...)
  11. Probabilistic Entailment on First Order Languages and Reasoning with Inconsistencies. Soroush Rafiee Rad - 2023 - Review of Symbolic Logic 16 (2):351-368.
    We investigate an approach for drawing logical inference from inconsistent premisses. The main idea in this approach is that the inconsistencies in the premisses should be interpreted as uncertainty of the information. We propose a mechanism, based on Knight’s [14] study of inconsistency, for revising an inconsistent set of premisses to a minimally uncertain, probabilistically consistent one. We will then generalise the probabilistic entailment relation introduced in [15] for propositional languages to the first order case to draw logical (...)
  12. Probabilistic coherence, logical consistency, and Bayesian learning: Neural language models as epistemic agents. Gregor Betz & Kyle Richardson - 2023 - PLoS ONE 18 (2).
    It is argued that suitably trained neural language models exhibit key properties of epistemic agency: they hold probabilistically coherent and logically consistent degrees of belief, which they can rationally revise in the face of novel evidence. To this purpose, we conduct computational experiments with rankers: T5 models [Raffel et al. 2020] that are pretrained on carefully designed synthetic corpora. Moreover, we introduce a procedure for eliciting a model’s degrees of belief, and define numerical metrics that measure the extent to which (...)
     
  13. Probabilistic grammars for natural languages. Patrick Suppes - 1970 - Synthese 22 (1-2):95-116.
  14. Probabilistic Type Theory and Natural Language Semantics. Robin Cooper, Simon Dobnik, Shalom Lappin & Stefan Larsson - 2015 - Linguistic Issues in Language Technology 10 (1):1-43.
  15. Probabilistic semantics and pragmatics: uncertainty in language and thought. Noah D. Goodman & Daniel Lassiter - 1996 - In Shalom Lappin (ed.), The handbook of contemporary semantic theory. Cambridge, Mass., USA: Blackwell Reference.
     
  16. Probabilistic White Matter Atlases of Human Auditory, Basal Ganglia, Language, Precuneus, Sensorimotor, Visual and Visuospatial Networks. Teresa D. Figley, Behnoush Mortazavi Moghadam, Navdeep Bhullar, Jennifer Kornelsen, Susan M. Courtney & Chase R. Figley - 2017 - Frontiers in Human Neuroscience 11.
  17. On probabilistic and causal reasoning with summation operators. Duligur Ibeling, Thomas Icard & Milan Mossé - forthcoming - Journal of Logic and Computation.
    Ibeling et al. (2023) axiomatize increasingly expressive languages of causation and probability, and Mossé et al. (2024) show that reasoning (specifically the satisfiability problem) in each causal language is as difficult, from a computational complexity perspective, as reasoning in its merely probabilistic or “correlational” counterpart. Introducing a summation operator to capture common devices that appear in applications—such as the do-calculus of Pearl (2009) for causal inference, which makes ample use of marginalization—van der Zander et al. (2023) partially extend (...)
  18. Natural language at a crossroads: Formal and probabilistic approaches in philosophy and computer science. Paulo Pirozelli & Igor Câmara - 2022 - Manuscrito 45 (2):50-81.
  19. Probabilistic reasoning and natural language. Laura Macchi & Maria Bagassi - 2006 - In Riccardo Viale, Daniel Andler & Lawrence A. Hirschfeld (eds.), Biological and cultural bases of human inference. Mahwah, N.J.: Lawrence Erlbaum. pp. 1-31.
  20. A Probabilistic Model of Semantic Plausibility in Sentence Processing. Ulrike Padó, Matthew W. Crocker & Frank Keller - 2009 - Cognitive Science 33 (5):794-838.
    Experimental research shows that human sentence processing uses information from different levels of linguistic analysis, for example, lexical and syntactic preferences as well as semantic plausibility. Existing computational models of human sentence processing, however, have focused primarily on lexico‐syntactic factors. Those models that do account for semantic plausibility effects lack a general model of human plausibility intuitions at the sentence level. Within a probabilistic framework, we propose a wide‐coverage model that both assigns thematic roles to verb–argument pairs and determines (...)
  21. (1 other version) Learning a Generative Probabilistic Grammar of Experience: A Process‐Level Model of Language Acquisition. Oren Kolodny, Arnon Lotem & Shimon Edelman - 2014 - Cognitive Science 38 (4):227-267.
    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this (...)
  22. A new probabilistic constraint logic programming language based on a generalised distribution semantics. Steffen Michels, Arjen Hommersom, Peter J. F. Lucas & Marina Velikova - 2015 - Artificial Intelligence 228 (C):1-44.
  23. On the Languages Representable by Finite Probabilistic Automata. Phan Dinh Diêu - 1971 - Mathematical Logic Quarterly 17 (1):427-442.
  24. A Probabilistic Computational Model of Cross-Situational Word Learning. Afsaneh Fazly, Afra Alishahi & Suzanne Stevenson - 2010 - Cognitive Science 34 (6):1017-1063.
    Words are the essence of communication: They are the building blocks of any language. Learning the meaning of words is thus one of the most important aspects of language acquisition: Children must first learn words before they can combine them into complex utterances. Many theories have been developed to explain the impressive efficiency of young children in acquiring the vocabulary of their language, as well as the developmental patterns observed in the course of lexical acquisition. A major source of disagreement (...)
  25. A probabilistic temporal epistemic logic: Strong completeness. Zoran Ognjanović, Angelina Ilić Stepić & Aleksandar Perović - 2024 - Logic Journal of the IGPL 32 (1):94-138.
    The paper offers a formalization of reasoning about distributed multi-agent systems. The presented propositional probabilistic temporal epistemic logic PTEL is developed in full detail: syntax, semantics, soundness and strong completeness theorems. As an example, we prove consistency of the blockchain protocol with respect to the given set of axioms expressed in the formal language of the logic. We explain how to extend PTEL to axiomatize the corresponding first-order logic.
  26. pSPARQL: A Querying Language for Probabilistic RDF Data. Hong Fang - 2019 - Complexity 2019:1-7.
  27. Temporal Language and Temporal Reality / Dyke, Heather, 380-391; Quasi-Realism's Problem of Autonomous Effects / Tenenbaum, Sergio, 392-409; Interpreting Mill's Qualitative Hedonism / Riley, Jonathan, 410-418; Probabilistic Induction and Hume's Problem: Reply to Lange / Okasha, Samir, 419-424; Are You a Sim? / Weatherson, Brian, 425-431. [REVIEW] Privileged Access Naturalized, Jordi Fernández & Anthony Hatzimoysis - 2003 - Philosophical Quarterly 53 (212):212.
     
  28. From Exemplar to Grammar: A Probabilistic Analogy‐Based Model of Language Learning. Rens Bod - 2009 - Cognitive Science 33 (5):752-793.
    While rules and exemplars are usually viewed as opposites, this paper argues that they form end points of the same distribution. By representing both rules and exemplars as (partial) trees, we can take into account the fluid middle ground between the two extremes. This insight is the starting point for a new theory of language learning that is based on the following idea: If a language learner does not know which phrase‐structure trees should be assigned to initial sentences, s/he allows (...)
  29. Probabilistic Modeling of Discourse‐Aware Sentence Processing. Amit Dubey, Frank Keller & Patrick Sturt - 2013 - Topics in Cognitive Science 5 (3):425-451.
    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them (...)
  30. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge. Jey Han Lau, Alexander Clark & Shalom Lappin - 2017 - Cognitive Science 41 (5):1202-1241.
    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgements are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to (...)
  31. A probabilistic analysis of argument cogency. David Godden & Frank Zenker - 2018 - Synthese 195 (4):1715-1740.
    This paper offers a probabilistic treatment of the conditions for argument cogency as endorsed in informal logic: acceptability, relevance, and sufficiency. Treating a natural language argument as a reason-claim-complex, our analysis identifies content features of defeasible argument on which the RSA conditions depend, namely: change in the commitment to the reason, the reason’s sensitivity and selectivity to the claim, one’s prior commitment to the claim, and the contextually determined thresholds of acceptability for reasons and for claims. Results contrast with, (...)
  32. A Minkowski type duality mediating between state and predicate transformer semantics for a probabilistic nondeterministic language. Klaus Keimel, A. Rosenbusch & Thomas Streicher - 2009 - Annals of Pure and Applied Logic 159 (3):307-317.
    In this paper we systematically derive a predicate transformer semantics from a direct semantics for a simple probabilistic-nondeterministic programming language. This goal is achieved by exhibiting the direct semantics as isomorphic to a continuation semantics from which the predicate transformer semantics can be read off immediately. This isomorphism allows one to identify nonempty convex compact saturated sets of valuations on the set S of states with certain “good” functionals from to in a way similar to the one how (...)
  33. Probabilistic Knowledge. Sarah Moss - 2016 - Oxford, United Kingdom: Oxford University Press.
    Traditional philosophical discussions of knowledge have focused on the epistemic status of full beliefs. In this book, Moss argues that in addition to full beliefs, credences can constitute knowledge. For instance, your .4 credence that it is raining outside can constitute knowledge, in just the same way that your full beliefs can. In addition, you can know that it might be raining, and that if it is raining then it is probably cloudy, where this knowledge is not knowledge of propositions, (...)
  34. Calibrating Generative Models: The Probabilistic Chomsky-Schützenberger Hierarchy. Thomas Icard - 2020 - Journal of Mathematical Psychology 95.
    A probabilistic Chomsky–Schützenberger hierarchy of grammars is introduced and studied, with the aim of understanding the expressive power of generative models. We offer characterizations of the distributions definable at each level of the hierarchy, including probabilistic regular, context-free, (linear) indexed, context-sensitive, and unrestricted grammars, each corresponding to familiar probabilistic machine classes. Special attention is given to distributions on (unary notations for) positive integers. Unlike in the classical case where the "semi-linear" languages all collapse into the regular (...)
  35. Probabilistic theories of reasoning need pragmatics too: Modulating relevance in uncertain conditionals. A. J. B. Fugard, Niki Pfeifer & B. Mayerhofer - 2011 - Journal of Pragmatics 43:2034–2042.
    According to probabilistic theories of reasoning in psychology, people's degree of belief in an indicative conditional 'if A, then B' is given by the conditional probability, P(B|A). The role of language pragmatics is relatively unexplored in the new probabilistic paradigm. We investigated how consequent relevance affects participants' degrees of belief in conditionals about a randomly chosen card. The set of events referred to by the consequent was either a strict superset or a strict subset of the set (...)
     
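    A minimal worked instance of the conditional-probability reading stated in this abstract; the card scenario and the numbers are illustrative assumptions, not results from the paper:
      % Degree of belief in "if A, then B" is equated with the conditional probability P(B|A).
      % Assume a card drawn at random from a standard 52-card deck,
      % with A = "the card is red" (26 cards) and B = "the card is a heart" (13 cards).
      P(B \mid A) \;=\; \frac{P(A \wedge B)}{P(A)} \;=\; \frac{13/52}{26/52} \;=\; \tfrac{1}{2},
      % so on this reading the degree of belief in "if the card is red, then it is a heart"
      % is 0.5; the paper studies how the relevance of the consequent modulates such judgements.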
  36. Deductive, Probabilistic, and Inductive Dependence: An Axiomatic Study in Probability Semantics. Georg Dorn - 1997 - Verlag Peter Lang.
    This work is in two parts. The main aim of part 1 is a systematic examination of deductive, probabilistic, inductive and purely inductive dependence relations within the framework of Kolmogorov probability semantics. The main aim of part 2 is a systematic comparison of (in all) 20 different relations of probabilistic (in)dependence within the framework of Popper probability semantics (for Kolmogorov probability semantics does not allow such a comparison). Added to this comparison is an examination of (in all) 15 (...)
  37. Probabilistic causation in branching time. Mika Oksanen - 2002 - Synthese 132 (1-2):89-117.
    A probabilistic and counterfactual theory of causality is developed within the framework of branching time. The theory combines ideas developed by James Fetzer, Donald Nute, Patrick Suppes, Ming Xu, John Pollock, David Lewis and Mellor among others.
  38. A Probabilistic Model of Lexical and Syntactic Access and Disambiguation. Daniel Jurafsky - 1996 - Cognitive Science 20 (2):137-194.
    The problems of access—retrieving linguistic structure from some mental grammar —and disambiguation—choosing among these structures to correctly parse ambiguous linguistic input—are fundamental to language understanding. The literature abounds with psychological results on lexical access, the access of idioms, syntactic rule access, parsing preferences, syntactic disambiguation, and the processing of garden‐path sentences. Unfortunately, it has been difficult to combine models which account for these results to build a general, uniform model of access and disambiguation at the lexical, idiomatic, and syntactic levels. (...)
  39. From probabilistic topologies to Feynman diagrams: Hans Reichenbach on time, genidentity, and quantum physics. Michael Stöltzner - 2022 - Synthese 200 (4):1-26.
    Hans Reichenbach’s posthumous book The Direction of Time ends somewhere between Socratic aporia and historical irony. Prompted by Feynman’s diagrammatic formulation of quantum electrodynamics, Reichenbach eventually abandoned the delicate balancing between the macroscopic foundation of the direction of time and microscopic descriptions of time order undertaken throughout the previous chapters in favor of an exclusively macroscopic theory that he had vehemently rejected in the 1920s. I analyze Reichenbach’s reasoning against the backdrop of the history of Feynman diagrams and the current (...)
  40. Probabilistic Semantics and Calculi for Multi-valued and Paraconsistent Logics. Jaime Ramos, João Rasga & Cristina Sernadas - forthcoming - Studia Logica:1-35.
    We show how to obtain a probabilistic semantics and calculus for a logic presented by a valuation specification. By identifying general forms of valuation constraints we are able to accommodate a wide class of propositional based logics encompassing multi-valued logics like Łukasiewicz 3-valued logic and the Belnap–Dunn four-valued logic as well as paraconsistent logics like mbC and LFI1. The probabilistic calculus is automatically generated from the valuation specification. Although not having explicit probability constructors in the language, the rules (...)
  41. Probabilistic truthlikeness, content elements, and meta-inductive probability optimization. Gerhard Schurz - 2021 - Synthese 199 (3-4):6009-6037.
    The paper starts with the distinction between conjunction-of-parts accounts and disjunction-of-possibilities accounts to truthlikeness. In Sect. 3, three distinctions between kinds of truthlikeness measures are introduced: comparative versus numeric t-measures, t-measures for qualitative versus quantitative theories, and t-measures for deterministic versus probabilistic truth. These three kinds of truthlikeness are explicated and developed within a version of conjunctive part accounts based on content elements. The focus lies on measures of probabilistic truthlikeness, that are divided into t-measures for statistical probabilities (...)
  42. Probabilistic dynamic epistemic logic. Barteld P. Kooi - 2003 - Journal of Logic, Language and Information 12 (4):381-408.
    In this paper I combine the dynamic epistemic logic of Gerbrandy (1999) with the probabilistic logic of Fagin and Halpern (1994). The result is a new probabilistic dynamic epistemic logic, a logic for reasoning about probability, information, and information change that takes higher order information into account. Probabilistic epistemic models are defined, and a way to build them for applications is given. Semantics and a proof system is presented and a number of examples are discussed, including the Monty Hall Dilemma.
  43. Context Probabilism. Seth Yalcin - 2012 - In M. Aloni (ed.), 18th Amsterdam Colloquium. Springer. pp. 12-21.
    We investigate a basic probabilistic dynamic semantics for a fragment containing conditionals, probability operators, modals, and attitude verbs, with the aim of shedding light on the prospects for adding probabilistic structure to models of the conversational common ground.
  44. How infants' utterances grow: A probabilistic account of early language development. Qihui Xu, Martin Chodorow & Virginia Valian - 2023 - Cognition 230 (C):105275.
  45. Nonmonotonic probabilistic reasoning under variable-strength inheritance with overriding. Thomas Lukasiewicz - 2005 - Synthese 146 (1-2):153-169.
    We present new probabilistic generalizations of Pearl’s entailment in System Z and Lehmann’s lexicographic entailment, called Zλ- and lexλ-entailment, which are parameterized through a value λ ∈ [0,1] that describes the strength of the inheritance of purely probabilistic knowledge. In the special cases of λ = 0 and λ = 1, the notions of Zλ- and lexλ-entailment coincide with probabilistic generalizations of Pearl’s entailment in System Z and Lehmann’s lexicographic entailment that have been recently introduced by the (...)
  46. A Probabilistic Truth-Conditional Semantics for Indicative Conditionals. Michał Sikorski - 2022 - Semiotic Studies 35 (2):69-87.
    In my article, I present a new version of a probabilistic truth prescribing semantics for natural language indicative conditionals. The proposed truth conditions can be paraphrased as follows: an indicative conditional is true if the corresponding conditional probability is high and the antecedent is positively probabilistically relevant for the consequent or the probability of the antecedent of the conditional equals 0. In the paper, the truth conditions are defended and some of the logical properties of the proposed semantics are (...)
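    One way to write out, in symbols, the truth conditions paraphrased in this abstract; the notation, in particular the threshold \theta for a "high" conditional probability, is introduced here for illustration and is not the paper's own:
      % "A > B" abbreviates the indicative conditional "if A, then B".
      A > B \text{ is true} \;\iff\; \bigl( P(B \mid A) \ge \theta \;\wedge\; P(B \mid A) > P(B) \bigr) \;\vee\; P(A) = 0,
      % where P(B|A) >= \theta renders "the conditional probability is high" and
      % P(B|A) > P(B) renders the antecedent's positive probabilistic relevance to the consequent.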
  47. Is Causal Reasoning Harder Than Probabilistic Reasoning? Milan Mossé, Duligur Ibeling & Thomas Icard - 2024 - Review of Symbolic Logic 17 (1):106-131.
    Many tasks in statistical and causal inference can be construed as problems of entailment in a suitable formal language. We ask whether those problems are more difficult, from a computational perspective, for causal probabilistic languages than for pure probabilistic (or “associational”) languages. Despite several senses in which causal reasoning is indeed more complex—both expressively and inferentially—we show that causal entailment (or satisfiability) problems can be systematically and robustly reduced to purely probabilistic problems. Thus there is (...)
  48. Probabilistic epistemic logic based on neighborhood semantics. Meiyun Guo & Yixin Pan - 2024 - Synthese 203 (5):1-24.
    In the literature, different frameworks of probabilistic epistemic logic have been proposed. Most of these frameworks define knowledge or belief by relational structure. In this paper, we explore the relationship between probability and belief, based on the Lockean thesis, and adopt neighborhood semantics that defines belief directly using probability. We provide a sound and weakly complete axiomatization for our framework. We also try to explain the lottery paradox by modelling it within our framework. Moreover, the paper presents findings concerning (...)
  49. A probabilistic framework for analysing the compositionality of conceptual combinations. Peter Bruza, Kirsty Kitto, Brentyn Ramm & Laurianne Sitbon - 2015 - Journal of Mathematical Psychology 67:26-38.
    Conceptual combination performs a fundamental role in creating the broad range of compound phrases utilised in everyday language. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered as a function of the semantics of the constituent concepts, or not. While the systematicity and productivity of language provide a strong argument in favor of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and (...)
  50. Probabilistic Semantics, Identity and Belief. William Seager - 1983 - Canadian Journal of Philosophy 13 (3):353-364.
    The goal of standard semantics is to provide truth conditions for the sentences of a given language. Probabilistic Semantics does not share this aim; it might be said instead, if rather cryptically, that Probabilistic Semantics aims to provide belief conditions. The central and guiding idea of Probabilistic Semantics is that each rational individual has ‘within’ him or her a personal subjective probability function. The output of the function when given a certain sentence as input represents the degree of (...)
Showing results 1-50 of 969.