  1. Actual causation: a stone soup essay. Clark Glymour, David Danks, Bruce Glymour, Frederick Eberhardt, Joseph Ramsey, Richard Scheines, Peter Spirtes, Choh Man Teng & Zhang Jiji - 2010 - Synthese 175 (2):169-192.
    We argue that current discussions of criteria for actual causation are ill-posed in several respects. (1) The methodology of current discussions is by induction from intuitions about an infinitesimal fraction of the possible examples and counterexamples; (2) cases with larger numbers of causes generate novel puzzles; (3) “neuron” and causal Bayes net diagrams are, as deployed in discussions of actual causation, almost always ambiguous; (4) actual causation is (intuitively) relative to an initial system state since state changes are relevant, but (...)
    23 citations
  2. Proportionality, Determinate Intervention Effects, and High-Level Causation.W. Fang & Zhang Jiji - forthcoming - Erkenntnis.
    Stephen Yablo’s notion of proportionality, despite controversies surrounding it, has played a significant role in philosophical discussions of mental causation and of high-level causation more generally. In particular, it is invoked in James Woodward’s interventionist account of high-level causation and explanation, and is implicit in a novel approach to constructing variables for causal modeling in the machine learning literature, known as causal feature learning (CFL). In this article, we articulate an account of proportionality inspired by both Yablo’s account of proportionality (...)
  3. On Learning Causal Structures from Non-Experimental Data without Any Faithfulness Assumption.Hanti Lin & Zhang Jiji - 2020 - Proceedings of Machine Learning Research 117:554-582.
    Consider the problem of learning, from non-experimental data, the causal (Markov equivalence) structure of the true, unknown causal Bayesian network (CBN) on a given, fixed set of (categorical) variables. This learning problem is known to be very hard, so much so that there is no learning algorithm that converges to the truth for all possible CBNs (on the given set of variables). So the convergence property has to be sacrificed for some CBNs—but for which? In response, the standard practice has (...)
    1 citation