How to define a custom prior in PyMC3

3 votes
1 answer
2540 views
Asked 2025-04-18 13:02

I would like to know whether it is possible (and how) to define a custom prior distribution in PyMC3. From here it seems that doing this in PyMC2 is relatively straightforward (no need to modify the source code), but in PyMC3 it does not look as simple (or maybe I am just missing something).
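
For reference, the PyMC2 pattern I have in mind is the @pymc.stochastic decorator, which turns a log-probability function into a custom stochastic. A minimal sketch with a made-up toy density, just to show the pattern (assuming pymc 2.x):

import numpy as np
import pymc

@pymc.stochastic(dtype=float)
def theta(value=0.5):
    """Toy custom prior on (0, 1): p(theta) = 2*theta."""
    if value <= 0 or value >= 1:
        return -np.inf  # zero probability outside the support
    return np.log(2 * value)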

I am trying to reproduce a prior used in the book Doing Bayesian Data Analysis, where it is implemented in BUGS:

model {
    # Likelihood. Each flip is Bernoulli.
    for ( i in 1:N1 ) { y1[i] ~ dbern( theta1 ) }
    for ( i in 1:N2 ) { y2[i] ~ dbern( theta2 ) }
    # Prior. Curved scallops!
    x ~ dunif(0,1)
    y ~ dunif(0,1)
    N <- 4
    xt <- sin( 2*3.141593*N * x ) / (2*3.141593*N) + x
    yt <- 3 * y + (1/3)
    xtt <- pow( xt , yt )
    theta1 <- xtt
    theta2 <- y
}

This prior is not particularly meaningful; it is just an example of how to define a custom prior, and of how flexible BUGS is.
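
To see what the prior implies for (theta1, theta2), it can be forward-simulated outside of any sampler. A quick sketch of the transforms above, using numpy only (the sample size is arbitrary):

import numpy as np

# Draw x and y uniformly and push them through the deterministic
# transforms from the BUGS model to get prior draws of theta1 and theta2.
N = 4
x = np.random.uniform(0, 1, size=10000)
y = np.random.uniform(0, 1, size=10000)

xt = np.sin(2 * np.pi * N * x) / (2 * np.pi * N) + x
yt = 3 * y + 1.0 / 3
theta1 = xt ** yt
theta2 = y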

My attempt at implementing the custom prior above looks like this:

from __future__ import division
import numpy as np
import pymc3 as pm
from pymc3 import Continuous
from theano.tensor import sin, log

# Generate the data
y1 = np.array([1, 1, 1, 1, 1, 0, 0])  # 5 heads and 2 tails
y2 = np.array([1, 1, 0, 0, 0, 0, 0])  # 2 heads and 5 tails

class Custom_prior(Continuous):
    """Custom prior."""

    def __init__(self, y, *args, **kwargs):
        super(Custom_prior, self).__init__(*args, **kwargs)
        self.y = y
        self.N = 4
        self.mean = 0.625  # FIXME

    def logp(self, value):
        N = self.N
        y = self.y
        return -log((sin(2*3.141593*N * value)
                     / (2*3.141593*N) + value)**(3 * y + (1/3)))

with pm.Model() as model:
    theta2 = pm.Uniform('theta2', 0, 1)  # prior
    theta1 = Custom_prior('theta1', theta2)  # prior
    # define the likelihood
    y1 = pm.Bernoulli('y1', p=theta1, observed=y1)
    y2 = pm.Bernoulli('y2', p=theta2, observed=y2)
    # Generate a MCMC chain
    start = pm.find_MAP()  # Find starting value by optimization
    trace = pm.sample(5000, step=pm.NUTS(), start=start, progressbar=False)

EDIT

Following chris-fonnesbeck's answer, I think I need something like this:

with pm.Model() as model:
    theta2 = pm.Uniform('theta2', 0, 1)  # prior
    N = 4
    theta1 = pm.DensityDist('theta1', lambda value: -log((sin(2*3.141593*N * value)
                       / (2*3.141593*N) + value)**(3 * theta2 + (1/3))))
    # define the likelihood
    y1 = pm.Bernoulli('y1', p=theta1, observed=y1)
    y2 = pm.Bernoulli('y2', p=theta2, observed=y2)

    # Generate a MCMC chain
    start = pm.find_MAP()  # Find starting value by optimization
    trace = pm.sample(10000, step=pm.NUTS(), start=start, progressbar=False)  # Use NUTS sampling

The only problem is that the posterior samples I get for theta1 and theta2 take exactly the same values, so I guess something is wrong with my custom prior, or with the way the prior and the likelihood are combined. An example of a successfully defined custom prior can be found in this example.

1 Answer

4

Can you post the complete BUGS model? What is shown above looks like just a set of deterministic transformations, presumably of priors on x and y, rather than the definition of the priors themselves.

However, if you want the logp shown above, you can implement it more simply in PyMC, something like this:

def logp(value, y):
    N  = 4
    return -log((sin(2*3.141593*N * value)
                 / (2*3.141593*N) + value)**(3 * y + (1/3)))

theta1 = pm.DensityDist('theta1', logp, value, y=theta2)
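
Alternatively, since the BUGS code is really just uniform draws pushed through deterministic transformations, you could skip the custom logp entirely and write those transforms out with pm.Deterministic. A rough sketch along those lines (assuming pymc3 with theano; the variable names are my own):

import numpy as np
import pymc3 as pm
import theano.tensor as tt

y1 = np.array([1, 1, 1, 1, 1, 0, 0])
y2 = np.array([1, 1, 0, 0, 0, 0, 0])

with pm.Model() as model:
    # Uniform "base" variables, as in the BUGS model
    x = pm.Uniform('x', 0, 1)
    y = pm.Uniform('y', 0, 1)
    N = 4
    # Deterministic transforms reproduce xt, yt, theta1 and theta2
    xt = tt.sin(2 * np.pi * N * x) / (2 * np.pi * N) + x
    yt = 3 * y + 1.0 / 3
    theta1 = pm.Deterministic('theta1', xt ** yt)
    theta2 = pm.Deterministic('theta2', y)
    # Bernoulli likelihoods for the two sets of coin flips
    pm.Bernoulli('y1_obs', p=theta1, observed=y1)
    pm.Bernoulli('y2_obs', p=theta2, observed=y2)
    trace = pm.sample(5000)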
