Results for 'Statistical Theory and Methods'

978 found
  1. Method, Theory, and Statistics: the Lesson of Physics in The Foundations of Statistical Methods in Biology, Physics and Economics. L. Kruger - 1990 - Boston Studies in the Philosophy of Science 122:1-13.
  2. The development of renormalization group methods for particle physics: Formal analogies between classical statistical mechanics and quantum field theory. Doreen Fraser - 2020 - Synthese 197 (7):3027-3063.
    Analogies between classical statistical mechanics and quantum field theory played a pivotal role in the development of renormalization group methods for application in the two theories. This paper focuses on the analogies that informed the application of RG methods in QFT by Kenneth Wilson and collaborators in the early 1970s. The central task that is accomplished is the identification and analysis of the analogical mappings employed. The conclusion is that the analogies in this case study are (...)
    16 citations.
  3. Foundations of probability theory, statistical inference, and statistical theories of science. W. L. Harper & C. A. Hooker (eds.) - 1975 - Springer.
    In May of 1973 we organized an international research colloquium on foundations of probability, statistics, and statistical theories of science at the University of Western Ontario. During the past four decades there have been striking formal advances in our understanding of logic, semantics and algebraic structure in probabilistic and statistical theories. These advances, which include the development of the relations between semantics and metamathematics, between logics and algebras and the algebraic-geometrical foundations of statistical theories (especially in the (...)
    9 citations.
  4. D. J. White, "Decision Theory" and William A. Chance, "Statistical Methods for Decision Making". [REVIEW] F. H. George - 1971 - Theory and Decision 1 (3):322.
     
  5. Statistical Learning Theory and Occam’s Razor: The Core Argument. Tom F. Sterkenburg - 2024 - Minds and Machines 35 (1):1-28.
    Statistical learning theory is often associated with the principle of Occam’s razor, which recommends a simplicity preference in inductive inference. This paper distills the core argument for simplicity obtainable from statistical learning theory, built on the theory’s central learning guarantee for the method of empirical risk minimization. This core “means-ends” argument is that a simpler hypothesis class or inductive model is better because it has better learning guarantees; however, these guarantees are model-relative and so the (...)
  6. Principles and Methods of Law and Economics: Enhancing Normative Analysis. Nicholas L. Georgakopoulos - 2005 - Cambridge University Press.
    This is an introductory book that targets the reader who has the ambition to apply economic analysis but may be missing a technical introduction to its mathematical techniques or seeks a structured elaboration of its philosophical principles. The book juxtaposes economic analysis with moral philosophy, political theory, egalitarianism, and other methodological principles and then passes to the details of methods such as model-building, derivatives, differential equations, statistical tests, and the use of computer programs.
     
  7. Relative Ontology and Method of Scientific Theory of Consciousness. Petr M. Kolychev - 2023 - RUDN Journal of Philosophy 27 (2):316-331.
    Consciousness is defined as operating with the meanings of representations, which are what arises in mind under the influence of a stimulus (primary representations) as well as what arises as a result of their transformation (secondary, combined representations). In a first approximation, a representation is expressed by words. The concept of “representation” is a special case of the concept of “information-certainty”, which is the result of distinction. Any distinction is a distinction by a specific attribute and representation is the value (...)
  8. Error statistical modeling and inference: Where methodology meets ontology. Aris Spanos & Deborah G. Mayo - 2015 - Synthese 192 (11):3533-3555.
    In empirical modeling, an important desideratum for deeming theoretical entities and processes as real is that they be reproducible in a statistical sense. Current-day crises regarding replicability in science intertwine with the question of how statistical methods link data to statistical and substantive theories and models. Different answers to this question have important methodological consequences for inference, which are intertwined with a contrast between the ontological commitments of the two types of models. The key (...)
    7 citations.
  9. Research problems and methods in the philosophy of medicine. Michael Loughlin, Robyn Bluhm & Mona Gupta - 2016 - In James A. Marcum (ed.), Bloomsbury Companion to Contemporary Philosophy of Medicine. New York: Bloomsbury. pp. 29-62.
    Philosophy of medicine encompasses a broad range of methodological approaches and theoretical perspectives—from the uses of statistical reasoning and probability theory in epidemiology and evidence-based medicine to questions about how to recognize the uniqueness of individual patients in medical humanities, person-centered care, and values-based practice; and from debates about causal ontology to questions of how to cultivate epistemic and moral virtue in practice. Apart from being different ways of thinking about medical practices, do these different philosophical approaches have (...)
    2 citations.
  10. Statistical Normalization Methods in Interpersonal and Intertheoretic Comparisons. William MacAskill, Owen Cotton-Barratt & Toby Ord - 2020 - Journal of Philosophy 117 (2):61-95.
    A major problem for interpersonal aggregation is how to compare utility across individuals; a major problem for decision-making under normative uncertainty is the formally analogous problem of how to compare choice-worthiness across theories. We introduce and study a class of methods, which we call statistical normalization methods, for making interpersonal comparisons of utility and intertheoretic comparisons of choice-worthiness. We argue against the statistical normalization methods that have been proposed in the literature. We argue, instead, in (...)
    9 citations.
  11. Prospect Theory and the Wisdom of the Inner Crowd. Stephan Hartmann - manuscript.
    We give a probabilistic justification of the shape of one of the probability weighting functions used in Prospect Theory. To do so, we use an idea recently introduced by Herzog and Hertwig. Along the way we also suggest a new method for the aggregation of probabilities using statistical distances.
  12. Statistical significance testing, hypothetico-deductive method, and theory evaluation. Brian D. Haig - 2000 - Behavioral and Brain Sciences 23 (2):292-293.
    Chow's endorsement of a limited role for null hypothesis significance testing is a needed corrective of research malpractice, but his decision to place this procedure in a hypothetico-deductive framework of Popperian cast is unwise. Various failures of this version of the hypothetico-deductive method have negative implications for Chow's treatment of significance testing, meta-analysis, and theory evaluation.
  13. Utility theory and the Bayesian paradigm. Jordan Howard Sobel - 1989 - Theory and Decision 26 (3):263-293.
    In this paper, a problem for utility theory - that it would have an agent who was compelled to play “Russian Roulette” with one revolver or another, to pay as much to have a six-shooter with four bullets relieved of one bullet before playing with it, as he would be willing to pay to have a six-shooter with two bullets emptied - is reviewed. A less demanding Bayesian theory is described, that would have an agent maximize expected values (...)
    2 citations.
  14. Substance and method: studies in philosophy of science. Chuang Liu - 2015 - Hackensack, NJ: World Scientific.
    Fictional models in science -- The hypothetical versus the fictional -- What is wrong with the new fictionalism of scientific models? -- Re-inflating the conception of scientific representation -- Idealization, confirmation, and scientific realism -- Laws and models in a theory of idealization -- Approximation and its measures -- Approximation, idealization, and the laws of nature -- Coordination of space and unity of science -- Gauge gravity and the unification of natural forces -- Models and theories II: issues and (...)
  15. The Theory of Natural Selection as a Null Theory in The Foundations of Statistical Methods in Biology, Physics and Economics. A. Shimony - 1990 - Boston Studies in the Philosophy of Science 122:15-26.
  16. Paleontology and Darwin’s Theory of Evolution: The Subversive Role of Statistics at the End of the 19th Century. Marco Tamborini - 2015 - Journal of the History of Biology 48 (4):575-612.
    This paper examines the subversive role of statistics in paleontology at the end of the 19th and the beginning of the 20th centuries. In particular, I will focus on German paleontology and its relationship with statistics. I argue that in paleontology, the quantitative method was questioned and strongly limited by the first decade of the 20th century because, as its opponents noted, when the fossil record was treated statistically, it was found to generate results openly in conflict with the Darwinian (...) of evolution. Essentially, statistics questions the gradual mode of evolution and the role of natural selection. The main objections to statistics were addressed during the meetings at the Kaiserlich-Königliche Geologische Reichsanstalt in Vienna in the 1880s. After having introduced the statistical treatment of the fossil record, I will use the works of Charles Léo Lesquereux, Joachim Barrande, and Henry Shaler Williams to compare the objections raised in Vienna with how the statistical treatment of the data worked in practice. Furthermore, I will discuss the criticisms of Melchior Neumayr, one of the leading German opponents of statistical paleontology, to show why, and to what extent, statistics were questioned in Vienna. The final part of this paper considers what paleontologists can derive from a statistical notion of data: the necessity of opening a discussion about the completeness and nature of the paleontological data. The Vienna discussion about which method paleontologists should follow offers an interesting case study in order to understand the epistemic tensions within paleontology surrounding Darwin’s theory as well as the variety of non-Darwinian alternatives that emerged from the statistical treatment of the fossil record at the end of the 19th century.
    15 citations.
  17. Decisions as statistical evidence and Birnbaum's 'confidence concept'. John W. Pratt - 1977 - Synthese 36 (1):59-69.
    To whatever extent the use of a behavioral, not an evidential, interpretation of decisions in the Lindley-Savage argument for Bayesian theory undermines its cogency as a criticism of typical standard practice, it also undermines the Neyman-Pearson theory as a support for typical standard practice. This leaves standard practice with far less theoretical support than Bayesian methods. It does nothing to resolve the anomalies and paradoxes of standard methods. (Similar statements apply to the common protestation that the (...)
    1 citation.
  18. Methods and theories in the experimental analysis of behavior. B. F. Skinner - 1984 - Behavioral and Brain Sciences 7 (4):511-523.
    We owe most scientific knowledge to methods of inquiry that are never formally analyzed. The analysis of behavior does not call for hypothetico-deductive methods. Statistics, taught in lieu of scientific method, is incompatible with major features of much laboratory research. Squeezing significance out of ambiguous data discourages the more promising step of scrapping the experiment and starting again. As a consequence, psychologists have taken flight from the laboratory. They have fled to Real People and the human interest of (...)
    29 citations.
  19. Decision Theory and Choices: A Complexity Approach. Marisa Faggini, Concetto Paolo Vinci, Antonio Abatemarco, Rossella Aiello, F. T. Arecchi, Lucio Biggiero, Giovanna Bimonte, Sergio Bruno, Carl Chiarella, Maria Pia Di Gregorio, Giacomo Di Tollo, Simone Giansante, Jaime Gil Aluja, A. Yu. Khrennikov, Marianna Lyra, Riccardo Meucci, Guglielmo Monaco, Giancarlo Nota, Serena Sordi, Pietro Terna, Kumaraswamy Velupillai & Alessandro Vercelli (eds.) - 2010 - Springer Verlag Italia.
    The New Economic Windows Series, derived from Massimo Salzano's ideas and work, incorporates material from textbooks, monographs and conference proceedings that deals with both the theoretical and applied aspects of various sub-disciplines ...
  20. Social judgement theory and medical judgement. Robert S. Wigton - 1996 - Thinking and Reasoning 2 (2 & 3):175-190.
    Social judgement theory is particularly well suited to the study of medical judgements. Medical judgements characteristically involve decision making under uncertainty with inevitable error and an abundance of fallible cues. In medicine, as in other areas, SJT research has found wide variation among decision makers in their judgements and in the weighting of clinical information. Strategies inferred from case vignettes differ from physicians' self-described strategies and from the weights suggested by experts. These observations parallel recent findings of unexplained variation (...)
    6 citations.
  21. Design Thinking in Argumentation Theory and Practice. Sally Jackson - 2015 - Argumentation 29 (3):243-263.
    This essay proposes a design perspective on argumentation, intended as complementary to empirical and critical scholarship. In any substantive domain, design can provide insights that differ from those provided by scientific or humanistic perspectives. For argumentation, the key advantage of a design perspective is the recognition that humanity’s natural capacity for reason and reasonableness can be extended through inventions that improve on unaided human intellect. Historically, these inventions have fallen into three broad classes: logical systems, scientific methods, and disputation (...)
    29 citations.
  22. Statistical Method and the Peircean Account of Truth. Andrew Reynolds - 2000 - Canadian Journal of Philosophy 30 (2):287-314.
    Peirce is often credited with having formulated a pragmatic theory of truth. This can be misleading, if it is assumed that Peirce was chiefly interested in providing a metaphysical analysis of the immediate conditions under which a belief or proposition is true, or the conditions under which a proposition or belief is said to be made true. Cheryl Misak has exposed the subtleties in Peirce's discussion of truth, especially showing the difficulties faced by any ascription to him of an analytic (...)
    2 citations.
  23. Steps towards a unified basis for scientific models and methods. Inge S. Helland - 2010 - Hackensack, NJ: World Scientific.
  24. Construal level theory and escalation of commitment. Nick Benschop, Arno L. P. Nuijten, Mark Keil, Kirsten I. M. Rohde, Jong Seok Lee & Harry R. Commandeur - 2020 - Theory and Decision 91 (1):135-151.
    Escalation of commitment causes people to continue a failing course of action. We study the role of construal level in such escalation of commitment. Consistent with the widely held view of construal level as a primed effect, we employed a commonly used prime for manipulating this construct in a laboratory experiment. Our findings revealed that the prime failed to produce statistically significant differences in construal level, which was measured using the Behavior Identification Form. Furthermore, there was no effect of the (...)
  25. Reliable Reasoning: Induction and Statistical Learning Theory. Gilbert Harman & Sanjeev Kulkarni - 2007 - Bradford.
    In _Reliable Reasoning_, Gilbert Harman and Sanjeev Kulkarni -- a philosopher and an engineer -- argue that philosophy and cognitive science can benefit from statistical learning theory, the theory that lies behind recent advances in machine learning. The philosophical problem of induction, for example, is in part about the reliability of inductive reasoning, where the reliability of a method is measured by its statistically expected percentage of errors -- a central topic in SLT. After discussing philosophical attempts (...)
    37 citations.
  26. The Philosophy of Quantitative Methods: Understanding Statistics. Brian D. Haig - 2018 - Oxford University Press USA.
    The Philosophy of Quantitative Methods undertakes a philosophical examination of a number of important quantitative research methods within the behavioral sciences in order to overcome the non-critical approaches typically provided by textbooks. These research methods are exploratory data analysis, statistical significance testing, Bayesian confirmation theory and statistics, meta-analysis, and exploratory factor analysis. Further readings are provided to extend the reader's overall understanding of these methods.
    2 citations.
  27. Optimum Inductive Methods: A Study in Inductive Probability, Bayesian Statistics, and Verisimilitude. Roberto Festa - 1993 - Dordrecht: Kluwer Academic Publishers.
    According to the Bayesian view, scientific hypotheses must be appraised in terms of their posterior probabilities relative to the available experimental data. Such posterior probabilities are derived from the prior probabilities of the hypotheses by applying Bayes' theorem. One of the most important problems arising within the Bayesian approach to scientific methodology is the choice of prior probabilities. Here this problem is considered in detail w.r.t. two applications of the Bayesian approach: (1) the theory of inductive probabilities (TIP) developed by (...)
    24 citations.
  28. Mind changes and testability: How formal and statistical learning theory converge in the new Riddle of induction. Daniel Steel - manuscript.
    This essay demonstrates a previously unnoticed connection between formal and statistical learning theory with regard to Nelson Goodman’s new riddle of induction. Discussions of Goodman’s riddle in formal learning theory explain how conjecturing “all green” before “all grue” can enhance efficient convergence to the truth, where efficiency is understood in terms of minimizing the maximum number of retractions or “mind changes.” Vapnik-Chervonenkis (VC) dimension is a central concept in statistical learning theory and is similar to (...)
  29. A simple non-parametric method for eliciting prospect theory’s value function and measuring loss aversion under risk and ambiguity. Pavlo Blavatskyy - 2021 - Theory and Decision 91 (3):403-416.
    Prospect theory emerged as one of the leading descriptive decision theories that can rationalize a large body of behavioral regularities. The methods for eliciting prospect theory parameters, such as its value function and probability weighting, are invaluable tools in decision analysis. This paper presents a new simple method for eliciting prospect theory’s value function without any auxiliary/simplifying parametric assumptions. The method is applicable both to choice under ambiguity (Knightian uncertainty) and risk (when events are characterized by (...)
  30. Theories and Methods. Morag MacDonald, Lee Harvey & Jane Hill - 2000 - Hodder Education.
    Theories and Methods is the one compulsory unit on the AEB and Interboard syllabuses. This guide outlines the main sociological perspectives, and discusses three main approaches: positivism, phenomenology and critical social research. The topic-book format should be suitable for linear and modular courses, and there are sample questions and skills advice.
    1 citation.
  31. Multiattribute regret: theory and experimental study. Yoichiro Fujii, Hajime Murakami, Yutaka Nakamura & Kazuhisa Takemura - 2023 - Theory and Decision 95 (4):623-662.
    This paper generalizes the simple regret model by Bell in Operations Research 30(5), 961-981 and Loomes and Sugden in The Economic Journal 92(368), 805-824 to cope with the situation in which decision outcomes are multi-attributed. We propose a model that combines the simple regret model for ex ante preferences and the additive difference representation for ex post preferences. We first present a necessary and sufficient axiomatization of our model in Savage’s framework. The proposed model is composed of three types of (...)
  32. Computational Methods to Extract Meaning From Text and Advance Theories of Human Cognition. Danielle S. McNamara - 2011 - Topics in Cognitive Science 3 (1):3-17.
    Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in (...)
    2 citations.
  33. On Scientific Method, Induction, Statistics, and Skepticism. Abraham D. Stone - unknown.
    My aim in this paper is to explain how universal statements, as they occur in scientific theories, are actually tested by observational evidence, and to draw certain conclusions, on that basis, about the way in which scientific theories are tested in general. But I am pursuing that aim, ambitious enough in and of itself, in the service of even more ambitious projects, and in the first place: (a) to say what is distinctive about modern science, and especially modern physical (...)
     
  34. Bayesian statistics and biased procedures. Ronald N. Giere - 1969 - Synthese 20 (3):371-387.
    A comparison of Neyman's theory of interval estimation with the corresponding subjective Bayesian theory of credible intervals shows that the Bayesian approach to the estimation of statistical parameters allows experimental procedures which, from the orthodox objective viewpoint, are clearly biased and clearly inadmissible. This demonstrated methodological difference focuses attention on the key difference in the two general theories, namely, that the orthodox theory is supposed to provide a known average frequency of successful estimates, whereas the Bayesian (...)
    7 citations.
  35. Constructive Verification, Empirical Induction, and Falibilist Deduction: A Threefold Contrast. Julio Michael Stern - 2011 - Information 2 (4):635-650.
    This article explores some open questions related to the problem of verification of theories in the context of empirical sciences by contrasting three epistemological frameworks. Each of these epistemological frameworks is based on a corresponding central metaphor, namely: (a) Neo-empiricism and the gambling metaphor; (b) Popperian falsificationism and the scientific tribunal metaphor; (c) Cognitive constructivism and the object as eigen-solution metaphor. Each of these epistemological frameworks has also historically co-evolved with a certain statistical theory and method (...)
    12 citations.
  36. Sophisticated Statistics Cannot Compensate for Method Effects If Quantifiable Structure Is Compromised. Damian P. Birney, Jens F. Beckmann, Nadin Beckmann & Steven E. Stemler - 2022 - Frontiers in Psychology 13.
    Researchers rely on psychometric principles when trying to gain understanding of unobservable psychological phenomena disconfounded from the methods used. Psychometric models provide us with tools to support this endeavour, but they are agnostic to the meaning researchers intend to attribute to the data. We define method effects as resulting from actions which weaken the psychometric structure of measurement, and argue that solution to this confounding will ultimately rest on testing whether data collected fit a psychometric model based on a (...)
  37. Bayesians Versus Frequentists: A Philosophical Debate on Statistical Reasoning. Jordi Vallverdú - 2016 - Berlin, Heidelberg: Springer.
    This book analyzes the origins of statistical thinking as well as its related philosophical questions, such as causality, determinism or chance. Bayesian and frequentist approaches are subjected to a historical, cognitive and epistemological analysis, making it possible to not only compare the two competing theories, but to also find a potential solution. The work pursues a naturalistic approach, proceeding from the existence of numerosity in natural environments to the existence of contemporary formulas and methodologies to heuristic pragmatism, a concept (...)
  38. The Statistical Mechanics of Interacting Walks, Polygons, Animals and Vesicles. E. J. Janse van Rensburg - 2015 - Oxford University Press UK.
    The self-avoiding walk is a classical model in statistical mechanics, probability theory and mathematical physics. It is also a simple model of polymer entropy which is useful in modelling phase behaviour in polymers. This monograph provides an authoritative examination of interacting self-avoiding walks, presenting aspects of the thermodynamic limit, phase behaviour, scaling and critical exponents for lattice polygons, lattice animals and surfaces. It also includes a comprehensive account of constructive methods in models of adsorbing, collapsing, and pulled (...)
  39. Phylogenetic Inference, Selection Theory, and History of Science: Selected Papers of A. W. F. Edwards with Commentaries. Rasmus Grønfeldt Winther - 2018 - Cambridge: Cambridge University Press.
    A. W. F. Edwards is one of the most influential mathematical geneticists in the history of the discipline. One of the last students of R. A. Fisher, Edwards pioneered the statistical analysis of phylogeny in collaboration with L. L. Cavalli-Sforza, and helped establish Fisher's concept of likelihood as a standard of statistical and scientific inference. In this book, edited by philosopher of science Rasmus Grønfeldt Winther, Edwards's key papers are assembled alongside commentaries by leading scientists, discussing Edwards's influence (...)
    1 citation.
  40. Towards a Coherent Theory of Physics and Mathematics: The Theory–Experiment Connection. Paul Benioff - 2005 - Foundations of Physics 35 (11):1825-1856.
    The problem of how mathematics and physics are related at a foundational level is of interest. The approach taken here is to work towards a coherent theory of physics and mathematics together by examining the theory-experiment connection. The role of an implied theory hierarchy and use of computers in comparing theory and experiment is described. The main idea of the paper is to tighten the theory-experiment connection by bringing physical theories, as mathematical structures (...)
    3 citations.
  41. A Battle in the Statistics Wars: a simulation-based comparison of Bayesian, Frequentist and Williamsonian methodologies. Mantas Radzvilas, William Peden & Francesco De Pretis - 2021 - Synthese 199 (5-6):13689-13748.
    The debates between Bayesian, frequentist, and other methodologies of statistics have tended to focus on conceptual justifications, sociological arguments, or mathematical proofs of their long run properties. Both Bayesian statistics and frequentist (“classical”) statistics have strong cases on these grounds. In this article, we instead approach the debates in the “Statistics Wars” from a largely unexplored angle: simulations of different methodologies’ performance in the short to medium run. We conducted a large number of simulations using a straightforward decision problem based (...)
    4 citations.
  42. What experiment did we just do? Counterfactual error statistics and uncertainties about the reference class. Kent W. Staley - 2002 - Philosophy of Science 69 (2):279-299.
    Experimenters sometimes insist that it is unwise to examine data before determining how to analyze them, as it creates the potential for biased results. I explore the rationale behind this methodological guideline from the standpoint of an error statistical theory of evidence, and I discuss a method of evaluating evidence in some contexts when this predesignation rule has been violated. I illustrate the problem of potential bias, and the method by which it may be addressed, with an example (...)
    13 citations.
  43. Underdetermination and the promise of statistical sociology. Stephen P. Turner - 1987 - Sociological Theory 5 (2):172-184.
    The lack of "progress" in theory is often contrasted to progress in statistical methodology. The relation between the two bodies of thinking is itself problematic, however, for the particular advances in method that have occurred in quantitative sociology reflect a trade-off in which the results are characterized by the radical underdetermination of models by data and a high level of slack between measures and theoretical concepts. Both of these problems are usually understood as matters of "error," and thus (...)
    1 citation
  44. Error Statistics Using the Akaike and Bayesian Information Criteria.Henrique Cheng & Beckett Sterner - forthcoming - Erkenntnis.
    Many biologists, especially in ecology and evolution, analyze their data by estimating fits to a set of candidate models and selecting the best model according to the Akaike Information Criterion (AIC) or the Bayesian Information Criterion (BIC). When the candidate models represent alternative hypotheses, biologists may want to limit the chance of a false positive to a specified level. Existing model selection methodology, however, allows for only indirect control over error rates by setting a threshold for the difference in AIC (...)
  45. Econometric Theory and Methods: International Edition.Russell Davidson - 2009 - Oxford University Press USA.
    Econometric Theory and Methods International Edition provides a unified treatment of modern econometric theory and practical econometric methods. The geometrical approach to least squares is emphasized, as is the method of moments, which is used to motivate a wide variety of estimators and tests. Simulation methods, including the bootstrap, are introduced early and used extensively. The book deals with a large number of modern topics. In addition to bootstrap and Monte Carlo tests, these include sandwich (...)
  46. Hold-up induced by demand for fairness: theory and experimental evidence.Raghabendra Pratap Kc, Dominique Olié Lauga & Vincent Mak - 2023 - Theory and Decision 94 (4):721-750.
    Research in recent years suggests that fairness concerns could mitigate hold-up problems. In this study, we report theoretical analysis and experimental evidence on an opposite possibility: that fairness concerns could also induce hold-up problems. In our setup, hold-up problems will not occur with purely self-interested agents, but theoretically could be induced by demand for distributional fairness among agents without sufficiently strong counteracting factors such as intention-based reciprocity. We observe a widespread occurrence of hold-up in our experiment. Relationship-specific investments occurred less (...)
  47. Continuity postulates and solvability axioms in economic theory and in mathematical psychology: a consolidation of the theory of individual choice.Aniruddha Ghosh, M. Ali Khan & Metin Uyanık - 2022 - Theory and Decision 94 (2):189-210.
    This paper presents four theorems that connect continuity postulates in mathematical economics to solvability axioms in mathematical psychology, and ranks them under alternative supplementary assumptions. Theorem 1 connects notions of continuity (full, separate, Wold, weak Wold, Archimedean, mixture) with those of solvability (restricted, unrestricted) under the completeness and transitivity of a binary relation. Theorem 2 uses the primitive notion of a separately continuous function to answer the question when an analogous property on a relation is fully continuous. Theorem 3 provides (...)
  48. Two Test Assembly Methods With Two Statistical Targets.Zheng Huijing, Li Junjie, Zeng Pingfei & Kang Chunhua - 2022 - Frontiers in Psychology 13.
    In educational measurement, exploring methods of generating multiple high-quality parallel tests has become a research hotspot. One purpose of this research is to construct parallel forms item by item from a seed test, using two proposed item selection heuristic methods [the minimum parameters–information–distance method and the minimum information–parameters–distance method]. Moreover, previous research addressing test assembly issues has been limited mainly to situations in which the information curve of the item pool or seed test has a normal or skewed (...)
  49. Critical thinking in clinical research: applied theory and practice using case studies.Felipe Fregni & Ben M. W. Illigens (eds.) - 2018 - New York, NY: Oxford University Press.
    Critical Thinking in Clinical Research explains the fundamentals of clinical research in a case-based approach. The core concept is to combine a clear and concise transfer of information and knowledge with an engagement of the reader to develop a mastery of learning and critical thinking skills. The book addresses the main concepts of clinical research, basics of biostatistics, advanced topics in applied biostatistics, and practical aspects of clinical research, with emphasis on clinical relevance across all medical specialties.
  50. Theory and method in business ethics.Nicholas Capaldi - 2018 - In Eugene Heath, Byron Kaldis & Alexei M. Marcoux (eds.), The Routledge Companion to Business Ethics. New York: Routledge.
     
1 — 50 / 978