How do I decide on the prior distributions for parameters in PyMC3?

I have been looking into the PyMC3 package, and I am interested in applying it to a scenario where I have several different signals, each with a different amplitude.

However, I am still stuck on what type of priors I need to use in the PyMC3 model, as well as which likelihood distribution to use. An example of the scenario is shown in the figure below:

Signal plot

I tried to implement it as shown below, but every time I keep getting the error:

pymc3.exceptions.SamplingError: Bad initial energy

My code:

## Signal 1:
    with pm.Model() as model:
        # Parameters:
        # Prior Distributions:
        # BoundedNormal = pm.Bound(pm.Exponential, lower=0.0, upper=np.inf)
        # c = BoundedNormal('c', lam=10)
        # c = pm.Uniform('c', lower=0, upper=300)
        alpha = pm.Normal('alpha', mu = 0, sd = 10)
        beta = pm.Normal('beta', mu = 0, sd = 1)
        sigma = pm.HalfNormal('sigma', sd = 1)
        mu = pm.Normal('mu', mu=0, sigma=1)
        sd = pm.HalfNormal('sd', sigma=1)

        # Observed data is from a Multinomial distribution:
        # Likelihood distributions:
        # bradford = pm.DensityDist('observed_data', logp=bradford_logp, observed=dict(value=S1, loc=mu, scale=sd, c=c))
        # observed_data = pm.Beta('observed_data', mu=mu, sd=sd, observed=S1)
        observed_data = pm.Beta('observed_data', alpha=alpha, beta=beta, mu=mu, sd=sd, observed=S1)

    with model:
        # obtain starting values via MAP
        startvals = pm.find_MAP(model=model)

        # instantiate sampler
        # step = pm.Metropolis()
        step = pm.HamiltonianMC()
        # step = pm.NUTS()

        # draw 5000 posterior samples
        trace = pm.sample(start=startvals, draws=1000, step=step, tune=500, chains=4, cores=1, discard_tuned_samples=True)

        # Obtaining Posterior Predictive Sampling:
        post_pred = pm.sample_posterior_predictive(trace, samples=500)
        print(post_pred['observed_data'].shape)

    plt.title('Trace Plot of Signal 1')
    pm.traceplot(trace, var_names=['mu', 'sd'], divergences=None, combined=True)
    plt.show(block=False)
    plt.pause(5)  # Pauses the program for 5 seconds
    plt.close('all')

    pm.plot_posterior(trace, var_names=['mu', 'sd'])
    plt.title('Posterior Plot of Signal 1')
    plt.show(block=False)
    plt.pause(5)  # Pauses the program for 5 seconds
    plt.close('all')

Side questions

I have also been looking into the idea of implementing a goodness-of-fit test and a Kalman filter while using distributions other than the Gaussian, so if you have time, I would appreciate it if you could take a look at them. Both questions can be found here:

Goodness-of-fit test link: Goodness-of-fit test

Kalman filter link: Kalman Filter


Edit 1

Suppose I have around 5 signals and would like to apply Bayesian inference in order to look at the differences in the signals' PDFs. How do I approach this problem? Do I need to create multiple models and obtain their posterior distributions? As shown in the figure below:

Distribution plots

If I need to obtain the posterior distributions, do I use the following code?

# Obtaining Posterior Predictive Sampling:
post_pred = pm.sample_posterior_predictive(trace, samples=500)
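A minimal sketch of one possible approach (not from the original post; the `signals` dictionary and its contents are hypothetical stand-ins, assumed to be rescaled into the open interval (0, 1)): fit one independent model per signal and keep each trace and posterior predictive sample set for later comparison.

import numpy as np
import pymc3 as pm

# Hypothetical stand-in data: five signals rescaled into (0, 1)
signals = {f'S{i + 1}': np.random.rand(100) for i in range(5)}

traces, post_preds = {}, {}
for name, data in signals.items():
    with pm.Model():
        # positive priors for the Beta parameters (see the answer below)
        alpha = pm.HalfNormal('alpha', sigma=10)
        beta = pm.HalfNormal('beta', sigma=1)
        pm.Beta('observed_data', alpha=alpha, beta=beta, observed=data)

        # one posterior and one posterior predictive sample set per signal
        traces[name] = pm.sample(draws=1000, tune=500, chains=4, cores=1)
        post_preds[name] = pm.sample_posterior_predictive(traces[name], samples=500)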

Edit 2

If I have multiple signals, can I implement it like this in order to see the change in alpha and beta across all of the signals?

        observed_data_S1 = pm.Beta('observed_data_S1', alpha=alpha[0], beta=beta[0], observed=S1[0])
        observed_data_S2 = pm.Beta('observed_data_S2', alpha=alpha[1], beta=beta[1], observed=S2[0])
        observed_data_S3 = pm.Beta('observed_data_S3', alpha=alpha[2], beta=beta[2], observed=S3[0])
        observed_data_S4 = pm.Beta('observed_data_S4', alpha=alpha[3], beta=beta[3], observed=S4[0])
        observed_data_S5 = pm.Beta('observed_data_S5', alpha=alpha[4], beta=beta[4], observed=S5[0])
        observed_data_S6 = pm.Beta('observed_data_S6', alpha=alpha[5], beta=beta[5], observed=S6[0])
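A compact variant of the same idea, offered only as a sketch (it assumes `S1` … `S6` are 1-D arrays already rescaled into (0, 1)): give the priors a `shape` argument so that each signal gets its own positive alpha/beta component.

import pymc3 as pm

with pm.Model() as model:
    n_signals = 6
    # one positive prior component per signal
    alpha = pm.HalfNormal('alpha', sigma=10, shape=n_signals)
    beta = pm.HalfNormal('beta', sigma=1, shape=n_signals)

    # one Beta likelihood per signal, tied to its own alpha/beta component
    for i, s in enumerate([S1, S2, S3, S4, S5, S6]):
        pm.Beta(f'observed_data_S{i + 1}', alpha=alpha[i], beta=beta[i], observed=s)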

Edit 3:

How do I plot multiple traces in a single plot? Since I am looking at multiple signals, I would like to group all of the alphas and betas together.
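One possible way to get all of the alphas and betas into one figure, sketched under the assumption that you either used the vectorized model above (a single `trace`) or the per-signal `traces` dictionary from the earlier sketch:

import arviz as az
import matplotlib.pyplot as plt
import pymc3 as pm

# Single model with shape=(n_signals,): one call plots every component
# of alpha and beta in the same figure, with the chains combined.
pm.traceplot(trace, var_names=['alpha', 'beta'], combined=True)

# Separate per-signal traces: a forest plot stacks them all in one figure.
az.plot_forest(list(traces.values()), model_names=list(traces.keys()),
               var_names=['alpha', 'beta'], combined=True)
plt.show()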


1 Answer

First error: the parameters alpha and beta of the Beta distribution must be positive. You are using Normal priors on them, which allow the RVs to take negative and zero values. You can easily fix this by applying pm.Bound to the pm.Normal distribution, or by using the pm.HalfNormal distribution instead.
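For instance, a minimal sketch of both options (the variable names simply mirror the question's code):

import pymc3 as pm

with pm.Model():
    # Option 1: truncate the Normal priors at zero via pm.Bound
    BoundedNormal = pm.Bound(pm.Normal, lower=0.0)
    alpha = BoundedNormal('alpha', mu=0, sigma=10)
    beta = BoundedNormal('beta', mu=0, sigma=1)

    # Option 2: HalfNormal priors, which are positive by construction
    # alpha = pm.HalfNormal('alpha', sigma=10)
    # beta = pm.HalfNormal('beta', sigma=1)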

Second error: another inconsistency is specifying mu and sigma in addition to the alpha and beta parameters. Beta accepts either mu and sigma or alpha and beta, but not both at the same time. The default behaviour is to use alpha and beta over the mu and sigma parameterization, so inferring mu and sigma in your model is just wasting computational power.
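For reference, the standard moment-matching conversion between the two parameterizations (essentially what PyMC3 does internally when you pass mu and sigma) is:

def beta_mu_sigma_to_alpha_beta(mu, sigma):
    # Only valid when sigma**2 < mu * (1 - mu); otherwise kappa turns negative.
    kappa = mu * (1 - mu) / sigma ** 2 - 1
    return mu * kappa, (1 - mu) * kappa

print(beta_mu_sigma_to_alpha_beta(0.5, 0.1))  # -> (12.0, 12.0)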

Other comments: as of version 3.8 you should not use the sd argument in any distribution, since it is deprecated and will be removed in 3.9. Use sigma instead.

Corrected version

import numpy as np
import theano
import theano.tensor as tt
import pymc3 as pm
import matplotlib.pyplot as plt

S1 = np.random.rand(10)

## Signal 1:
with pm.Model() as model:
    # Parameters:
    # Prior Distributions:
    # BoundedNormal = pm.Bound(pm.Exponential, lower=0.0, upper=np.inf)
    # c = BoundedNormal('c', lam=10)
    # c = pm.Uniform('c', lower=0, upper=300)
    alpha = pm.HalfNormal('alpha', sigma=10)
    beta = pm.HalfNormal('beta', sigma=1)

    # Likelihood: the observed data is modeled with a Beta distribution
    # bradford = pm.DensityDist('observed_data', logp=bradford_logp, observed=dict(value=S1, loc=mu, scale=sd, c=c))
    # observed_data = pm.Beta('observed_data', mu=mu, sd=sd, observed=S1)
    observed_data = pm.Beta('observed_data', alpha=alpha, beta=beta, observed=S1)

with model:
    # obtain starting values via MAP
    startvals = pm.find_MAP(model=model)

    # instantiate sampler
    # step = pm.Metropolis()
    step = pm.HamiltonianMC()
    # step = pm.NUTS()

    # draw 1000 posterior samples per chain
    trace = pm.sample(start=startvals, draws=1000, step=step, tune=500, chains=4, cores=1, discard_tuned_samples=True)

    # Obtaining Posterior Predictive Sampling:
    post_pred = pm.sample_posterior_predictive(trace, samples=500)
    print(post_pred['observed_data'].shape)

plt.title('Trace Plot of Signal 1')
pm.traceplot(trace, var_names=['alpha', 'beta'], divergences=None, combined=True)
plt.show(block=False)
plt.pause(5)  # Pauses the program for 5 seconds
plt.close('all')

pm.plot_posterior(trace, var_names=['alpha', 'beta'])
plt.title('Posterior Plot of Signal 1')
plt.show(block=False)
plt.pause(5)  # Pauses the program for 5 seconds
plt.close('all')

