.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "examples\2-examples\1.1_Intro_to_Bayesian_Inference.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_examples_2-examples_1.1_Intro_to_Bayesian_Inference.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_examples_2-examples_1.1_Intro_to_Bayesian_Inference.py:


Normal Prior, several observations
``````````````````````````````````

.. GENERATED FROM PYTHON SOURCE LINES 7-25

.. code-block:: default

    # sphinx_gallery_thumbnail_number = -1
    import arviz as az
    import matplotlib.pyplot as plt
    import pyro
    import pyro.distributions as dist
    import torch
    from matplotlib.ticker import StrMethodFormatter

    from gempy_probability.plot_posterior import PlotPosterior
    from pyro.infer import Predictive, NUTS, MCMC

    y_obs = torch.tensor([2.12])
    y_obs_list = torch.tensor([2.12, 2.06, 2.08, 2.05, 2.08, 2.09, 2.19, 2.07, 2.16, 2.11, 2.13, 1.92])
    pyro.set_rng_seed(4003)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Setting Backend To: AvailableBackends.numpy

.. GENERATED FROM PYTHON SOURCE LINES 26-38

.. code-block:: default

    def model(distributions_family, data):
        if distributions_family == "normal_distribution":
            mu = pyro.sample('$\mu$', dist.Normal(2.07, 0.07))
        elif distributions_family == "uniform_distribution":
            mu = pyro.sample('$\mu$', dist.Uniform(0, 10))
        else:
            raise ValueError("distributions_family must be either 'normal_distribution' or 'uniform_distribution'")
        sigma = pyro.sample('$\sigma$', dist.Gamma(0.3, 3))
        y = pyro.sample('$y$', dist.Normal(mu, sigma), obs=data)
        return y

.. GENERATED FROM PYTHON SOURCE LINES 39-40

1. Prior Sampling

.. GENERATED FROM PYTHON SOURCE LINES 40-49

.. code-block:: default

    prior = Predictive(model, num_samples=100)("normal_distribution", y_obs_list)

    # 2. MCMC Sampling
    nuts_kernel = NUTS(model)
    mcmc = MCMC(nuts_kernel, num_samples=1000, warmup_steps=100)  # 100 warmup steps
    mcmc.run("normal_distribution", y_obs_list)

    # Get posterior samples
    posterior_samples = mcmc.get_samples(1100)

    # 3. Sample from Posterior Predictive
    posterior_predictive = Predictive(model, posterior_samples)("normal_distribution", y_obs_list)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Warmup:   0%|          | 0/1100 [00:00, ?it/s]
    Warmup:   9%|█         | 97/1100 [00:00, 180.82it/s, step size=1.57e-01, acc. prob=0.772]
    Sample: 100%|██████████| 1100/1100 [00:04, 255.61it/s, step size=2.71e-01, acc. prob=0.943]
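A quick numerical check of the chain is often useful before any plotting. The
snippet below is a small sketch added for illustration rather than part of the
generated script; it only assumes the ``mcmc`` and ``posterior_samples``
objects defined above and uses standard Pyro/PyTorch calls.

.. code-block:: default

    # Text summary per sample site: mean, std, credible interval, n_eff and r_hat
    mcmc.summary(prob=0.95)

    # The raw draws are plain torch tensors keyed by site name, so summary
    # statistics can also be computed directly:
    for name, samples in posterior_samples.items():
        print(f"{name}: mean={samples.mean().item():.3f}, std={samples.std().item():.3f}")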
.. GENERATED FROM PYTHON SOURCE LINES 50-59

.. code-block:: default

    data = az.from_pyro(
        posterior=mcmc,
        prior=prior,
        posterior_predictive=posterior_predictive
    )
    az_data = data
    az.plot_trace(az_data)
    plt.show()

.. image-sg:: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_001.png
   :alt: $\mu$, $\mu$, $\sigma$, $\sigma$
   :srcset: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    C:\Users\MigueldelaVarga\PycharmProjects\VisualBayesic\venv\lib\site-packages\arviz\data\io_pyro.py:157: UserWarning: Could not get vectorized trace, log_likelihood group will be omitted. Check your model vectorization or set log_likelihood=False
      warnings.warn(
    posterior predictive shape not compatible with number of chains and draws. This can mean that some draws or even whole chains are not represented.
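ArviZ can also report the same information numerically straight from the
``InferenceData`` object. The lines below are a sketch added for illustration,
not part of the generated script; they only assume the ``az_data`` object built
above and use standard ArviZ calls.

.. code-block:: default

    # Tabular diagnostics: mean, sd, HDI bounds, effective sample size and r_hat
    print(az.summary(az_data, var_names=['$\mu$', '$\sigma$']))

    # Marginal posterior densities with highest-density intervals
    az.plot_posterior(az_data, var_names=['$\mu$', '$\sigma$'])
    plt.show()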
.. GENERATED FROM PYTHON SOURCE LINES 60-69

.. code-block:: default

    p = PlotPosterior(az_data)
    p.create_figure(figsize=(9, 3), joyplot=False, marginal=False)
    p.plot_normal_likelihood('$\mu$', '$\sigma$', '$y$', iteration=-1, hide_bell=True)
    p.likelihood_axes.set_xlim(1.90, 2.2)
    p.likelihood_axes.xaxis.set_major_formatter(StrMethodFormatter('{x:,.2f}'))
    for tick in p.likelihood_axes.get_xticklabels():
        tick.set_rotation(45)
    plt.show()

.. image-sg:: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_002.png
   :alt: Likelihood
   :srcset: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 70-79

.. code-block:: default

    p = PlotPosterior(az_data)
    p.create_figure(figsize=(9, 3), joyplot=False, marginal=False)
    p.plot_normal_likelihood('$\mu$', '$\sigma$', '$y$', iteration=-1, hide_lines=True)
    p.likelihood_axes.set_xlim(1.70, 2.40)
    p.likelihood_axes.xaxis.set_major_formatter(StrMethodFormatter('{x:,.2f}'))
    for tick in p.likelihood_axes.get_xticklabels():
        tick.set_rotation(45)
    plt.show()

.. image-sg:: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_003.png
   :alt: Likelihood
   :srcset: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_003.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 80-86

.. code-block:: default

    p = PlotPosterior(az_data)
    p.create_figure(figsize=(9, 9), joyplot=True, marginal=False, likelihood=False, n_samples=31)
    p.plot_joy(('$\mu$', '$\sigma$'), '$y$', iteration=14)
    plt.show()

.. image-sg:: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_004.png
   :alt: 1.1 Intro to Bayesian Inference
   :srcset: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_004.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 87-103

.. code-block:: default

    p = PlotPosterior(az_data)
    p.create_figure(figsize=(9, 5), joyplot=False, marginal=True, likelihood=True)
    p.plot_marginal(
        var_names=['$\mu$', '$\sigma$'],
        plot_trace=False,
        credible_interval=0.95,
        kind='kde',
        joint_kwargs={'contour': True, 'pcolormesh_kwargs': {}},
        joint_kwargs_prior={'contour': False, 'pcolormesh_kwargs': {}})
    p.plot_normal_likelihood('$\mu$', '$\sigma$', '$y$', iteration=-1, hide_lines=True)
    p.likelihood_axes.set_xlim(1.70, 2.40)
    plt.show()

.. image-sg:: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_005.png
   :alt: Likelihood
   :srcset: /examples/2-examples/images/sphx_glr_1.1_Intro_to_Bayesian_Inference_005.png
   :class: sphx-glr-single-img
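The posterior predictive draws generated earlier are stored in the
``posterior_predictive`` group of ``az_data`` but are not plotted in this
example. As a sketch added for illustration (and subject to the shape warning
reported above), a posterior predictive check could be drawn with ArviZ
directly:

.. code-block:: default

    # Compare simulated draws of $y$ against the observed values
    az.plot_ppc(az_data, num_pp_samples=100)
    plt.show()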
.. GENERATED FROM PYTHON SOURCE LINES 104-114

License
=======
The code in this case study is copyrighted by Miguel de la Varga and licensed
under the new BSD (3-clause) license:
https://opensource.org/licenses/BSD-3-Clause

The text and figures in this case study are copyrighted by Miguel de la Varga
and licensed under the CC BY-NC 4.0 license:
https://creativecommons.org/licenses/by-nc/4.0/


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 9.076 seconds)


.. _sphx_glr_download_examples_2-examples_1.1_Intro_to_Bayesian_Inference.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: 1.1_Intro_to_Bayesian_Inference.py <1.1_Intro_to_Bayesian_Inference.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: 1.1_Intro_to_Bayesian_Inference.ipynb <1.1_Intro_to_Bayesian_Inference.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_