PriorSensitivity
class causalpy.checks.prior_sensitivity.PriorSensitivity
Re-fit the experiment with alternative models/priors and compare.

Each alternative is specified as a dict with "name" and "model" keys. The check re-instantiates the experiment for each alternative model and compares the resulting effect summaries.

Parameters:
    alternatives (list[dict[str, Any]]) – Each dict must have "name" (str) and "model" (PyMCModel or RegressorMixin) keys.
Examples
>>> import causalpy as cp
>>> check = cp.checks.PriorSensitivity(
...     alternatives=[
...         {"name": "diffuse", "model": cp.pymc_models.LinearRegression(...)},
...         {"name": "tight", "model": cp.pymc_models.LinearRegression(...)},
...     ]
... )
Methods
PriorSensitivity.__init__(alternatives)
PriorSensitivity.run(experiment, context) – Re-fit with each alternative model and compare effect estimates.
PriorSensitivity.validate(experiment) – Verify the experiment uses a Bayesian (PyMC) model.
Attributes
applicable_methods

classmethod __new__(*args, **kwargs)
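The core of `run()` is a re-fit-and-compare loop: fit once with the baseline model, once per alternative, and collect the effect summaries side by side. The sketch below illustrates that pattern in plain Python; `fit_effect`, `toy_fit`, and the dict-based "models" are hypothetical stand-ins for illustration, not causalpy's actual internals.

```python
# Illustrative sketch of a prior-sensitivity loop: re-fit with each
# alternative model and gather effect summaries keyed by name.
# `fit_effect` and the toy models below are assumptions for this sketch,
# not the causalpy implementation.

def prior_sensitivity(fit_effect, base_model, alternatives):
    """Return effect summaries for the base model and each alternative.

    Each alternative is a dict with "name" and "model" keys, mirroring
    the PriorSensitivity constructor's `alternatives` argument.
    """
    results = {"base": fit_effect(base_model)}
    for alt in alternatives:
        results[alt["name"]] = fit_effect(alt["model"])
    return results


def toy_fit(model):
    # Toy "fit": the estimated effect shifts with the prior scale,
    # standing in for a full Bayesian re-fit.
    return {"effect": 2.0 + 0.1 * model["prior_scale"]}


summaries = prior_sensitivity(
    toy_fit,
    {"prior_scale": 1.0},
    [
        {"name": "diffuse", "model": {"prior_scale": 10.0}},
        {"name": "tight", "model": {"prior_scale": 0.0}},
    ],
)
```

If the "diffuse" and "tight" entries in `summaries` agree closely with "base", the effect estimate is robust to the prior choice; large discrepancies signal prior sensitivity worth reporting.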