Bayesian Non-parametric Causal Inference:
https://www.pymc.io/projects/examples/en/latest/causal_inference/bayesian_nonparametric_causal.html
Issue description
The notebook was originally developed with PyMC 5.3.0. After updating to 5.10, the initial propensity-model fit of the logistic regression breaks down, producing roughly 4000 divergences. This seems to be due to collinearity and the squared terms in the data set.
*Note that this issue tracker is about the contents of the notebooks,
Expected output
Proposed solution
This issue can be fixed by specifying the initialization method for the sampler:
```python
idata.extend(
    pm.sample(
        samples,
        init="adapt_diag",
        random_seed=105,
        idata_kwargs={"log_likelihood": True},
    )
)
```