Foundations of Computational Causal Inference

December 10, 2025
Paul Luong

Abstract

We develop a formal framework for causal reasoning in computational systems. The paper presents a type-theoretic treatment of causal models, formulates interventions as type-level operations, and examines how such models are realized as executable code.

1. Introduction

Causal reasoning is fundamental to scientific inquiry and system design. However, most causal inference frameworks remain disconnected from executable systems. We bridge this gap with a computational approach to causality that carries its formal guarantees through to implementation.

2. Formal Framework

2.1 Causal Models as Types

We represent causal models as dependent types:

CausalModel : Type where
  Variables : Set Var                           -- the variables of the model
  Structure : Graph Variables                   -- the causal graph over those variables
  Distributions : Variables → Distribution      -- a distribution attached to each variable
  Interventions : Variables → Set Intervention  -- the admissible interventions on each variable
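
For intuition, the four fields can be read off a minimal two-variable example. The following Python sketch is purely illustrative; the variable names, probabilities, and dictionary-based encoding are assumptions made for exposition, not part of the formal framework (Section 3 gives the executable counterpart of the record itself).

import random

# Schematic instantiation of the four fields for a two-variable model Treatment → Outcome.
variables     = {"Treatment", "Outcome"}
structure     = {("Treatment", "Outcome")}                   # directed edges
distributions = {
    "Treatment": lambda: random.random() < 0.5,              # P(Treatment = 1) = 0.5
    "Outcome":   lambda t: random.random() < 0.3 + 0.4 * t,  # depends on Treatment
}
interventions = {"Treatment": {0, 1}, "Outcome": set()}      # admissible settings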

2.2 Interventions

An intervention modifies the causal structure:

do : (model: CausalModel) →
     (var: Var) →
     (value: Value) →
     CausalModel

This corresponds to Pearl's do-operator, expressed here as a type-level operation.
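
Probabilistically, this operation has the standard truncated-factorization (g-formula) reading: if the model factorizes as P(v) = ∏_i P(v_i | pa_i), then

    P(v | do(X = x)) = ∏_{V_i ≠ X} P(v_i | pa_i),   with X held at the value x,

i.e. the intervened variable loses its own factor and is pinned to the chosen value, while all other mechanisms are left untouched.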

3. Computational Realization

The formal framework maps directly to executable code:

import copy

class CausalModel:
    def __init__(self, structure, distributions):
        self.structure = structure          # e.g. {variable: [parent, ...]} (one possible encoding)
        self.distributions = distributions  # e.g. {variable: callable over parent values}

    def intervene(self, variable, value):
        # Return a new model with do(variable = value) applied:
        # remove the variable's incoming edges and fix it to the given value.
        model = copy.deepcopy(self)
        model.structure[variable] = []
        model.distributions[variable] = lambda *args, **kwargs: value
        return model
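
As a usage sketch, assuming the illustrative representation above (structure as a parent map, distributions as callables); the variable names and coefficients below are placeholders:

import random

# A two-variable model X → Y.
model = CausalModel(
    structure={"X": [], "Y": ["X"]},
    distributions={
        "X": lambda: random.gauss(0, 1),
        "Y": lambda x: 2 * x + random.gauss(0, 1),
    },
)

# do(X = 1): cuts X's incoming edges (here there are none) and fixes X to 1.
intervened = model.intervene("X", 1)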

4. Conclusion

By treating causal models as first-class computational objects, we enable both rigorous reasoning and practical application.


Note: This paper presents work in progress. Full technical details forthcoming.