python-3.x, tensorflow-probability

Negative values for a non-negative parameter in tensorflow probability


I'm trying to fit a simple Dirichlet-Multinomial model in TensorFlow Probability. The concentration parameters are gamma, and I have put a Gamma(1, 1) prior distribution on them. This is the model, where S is the number of categories and N is the number of samples:

import tensorflow as tf
from tensorflow_probability import edward2 as ed

def dirichlet_model(S, N):
    # Gamma(1, 1) prior on each of the S concentration parameters
    gamma = ed.Gamma(tf.ones(S)*1.0, tf.ones(S)*1.0, name='gamma')
    y = ed.DirichletMultinomial(total_count=500., concentration=gamma, sample_shape=(N), name='y')
    return y

log_joint = ed.make_log_joint_fn(dirichlet_model)

However, when I try to sample from this using HMC, the acceptance rate is zero, and the initial draw for gamma contains negative values. Am I doing something wrong? Shouldn't negative proposals for the concentration parameters be rejected automatically? My sampling code is below:

import numpy as np
import tensorflow_probability as tfp

def target_log_prob_fn(gamma):
  """Unnormalized target density as a function of states."""
  return log_joint(
    S=S, N=N,
    gamma=gamma,
    y=y_new)

num_results = 5000
num_burnin_steps = 3000

states, kernel_results = tfp.mcmc.sample_chain(
  num_results=num_results,
  num_burnin_steps=num_burnin_steps,
  current_state=[
    tf.ones([5], name='init_gamma')*5,
  ],
  kernel=tfp.mcmc.HamiltonianMonteCarlo(
    target_log_prob_fn=target_log_prob_fn,
    step_size=0.4,
    num_leapfrog_steps=3))

# sample_chain returns a list of state parts; there is only one here
[gamma] = states

with tf.Session() as sess:
  [
    gamma_,
    is_accepted_,
  ] = sess.run([
    gamma,
    kernel_results.is_accepted,
  ])

num_accepted = np.sum(is_accepted_)
print('Acceptance rate: {}'.format(num_accepted / num_results))
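
(For reference, S, N, and y_new come from earlier in my script and aren't shown above; a minimal stand-in setup, with made-up values, would look something like this:)

import numpy as np

S = 5    # number of categories
N = 100  # number of samples

# synthetic counts: N draws of 500 trials spread over S categories
y_new = np.random.multinomial(n=500, pvals=np.ones(S) / S, size=N).astype(np.float32)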

Solution

  • Try reducing the step size to increase the acceptance rate. The optimal acceptance rate for HMC is around 0.651 (https://arxiv.org/abs/1001.4460). I'm not sure why you'd see negative values; maybe floating-point error near zero? Can you post some of the logs of your run? A sketch with a smaller step size is below.
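
    For example, keeping everything else from the question the same and only shrinking the step size (0.01 below is just a starting guess that will likely need tuning):

    states, kernel_results = tfp.mcmc.sample_chain(
      num_results=num_results,
      num_burnin_steps=num_burnin_steps,
      current_state=[
        tf.ones([5], name='init_gamma')*5,
      ],
      kernel=tfp.mcmc.HamiltonianMonteCarlo(
        target_log_prob_fn=target_log_prob_fn,
        step_size=0.01,  # much smaller than the original 0.4
        num_leapfrog_steps=3))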