This function fits the single-season N-mixture model of Royle (2004).
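
In the default Poisson ("P") parameterization the model has the standard N-mixture structure:

  N_i ~ Poisson(lambda_i)            log(lambda_i) = linear function of the abundance covariates
  y_ij | N_i ~ Binomial(N_i, p_ij)   logit(p_ij)   = linear function of the detection covariates

where N_i is the latent abundance at site i and y_ij is the observed count at site i on visit j. The likelihood sums over the possible values of N_i, from the maximum count observed at site i up to the upper bound K.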

stan_pcount(
  formula,
  data,
  K = NULL,
  mixture = "P",
  prior_intercept_state = normal(0, 5),
  prior_coef_state = normal(0, 2.5),
  prior_intercept_det = logistic(0, 1),
  prior_coef_det = logistic(0, 1),
  prior_sigma = gamma(1, 1),
  log_lik = TRUE,
  ...
)

Arguments

formula

Double right-hand side formula describing covariates of detection and abundance, in that order

data

An unmarkedFramePCount object

K

Integer upper index of integration for N-mixture. This should be set high enough so that it does not affect the parameter estimates. Note that computation time will increase with K. A simple sensitivity check is sketched at the end of the Examples.

mixture

Character specifying the mixture distribution: "P" (Poisson) is currently the only option.

prior_intercept_state

Prior distribution for the intercept of the state (abundance) model; see ?priors for options. A sketch of non-default priors follows this argument list.

prior_coef_state

Prior distribution for the regression coefficients of the state model

prior_intercept_det

Prior distribution for the intercept of the detection probability model

prior_coef_det

Prior distribution for the regression coefficients of the detection model

prior_sigma

Prior distribution on random effect standard deviations

log_lik

If TRUE, Stan will save pointwise log-likelihood values in the output. This can greatly increase the size of the fitted model object. If FALSE, the values are calculated post hoc from the posteriors

...

Arguments passed to the call to stan, such as the number of chains (chains) or iterations (iter)
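
The prior arguments accept the prior specification functions shown in Usage (normal(), logistic(), gamma(); see ?priors). Below is a minimal sketch of overriding the default priors on the abundance model, assuming the mallardUMF object constructed in the Examples; the prior values are illustrative only:

# Tighter priors on the abundance intercept and coefficients
stan_pcount(~1~elev+forest, mallardUMF, K=30,
            prior_intercept_state = normal(0, 3),
            prior_coef_state = normal(0, 1),
            chains=3, iter=300)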

Value

A ubmsFitPcount object describing the model fit.

References

Royle JA. 2004. N-mixture models for estimating population size from spatially replicated counts. Biometrics 60: 108-115.

Examples

# \donttest{
data(mallard)
mallardUMF <- unmarkedFramePCount(mallard.y, siteCovs=mallard.site)

(fm_mallard <- stan_pcount(~1~elev+forest, mallardUMF, K=30,
                           chains=3, iter=300))
#> 
#> SAMPLING FOR MODEL 'pcount' NOW (CHAIN 1).
#> Chain 1: 
#> Chain 1: Gradient evaluation took 0.00442 seconds
#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 44.2 seconds.
#> Chain 1: Adjust your expectations accordingly!
#> Chain 1: 
#> Chain 1: 
#> Chain 1: Iteration:   1 / 300 [  0%]  (Warmup)
#> Chain 1: Iteration:  30 / 300 [ 10%]  (Warmup)
#> Chain 1: Iteration:  60 / 300 [ 20%]  (Warmup)
#> Chain 1: Iteration:  90 / 300 [ 30%]  (Warmup)
#> Chain 1: Iteration: 120 / 300 [ 40%]  (Warmup)
#> Chain 1: Iteration: 150 / 300 [ 50%]  (Warmup)
#> Chain 1: Iteration: 151 / 300 [ 50%]  (Sampling)
#> Chain 1: Iteration: 180 / 300 [ 60%]  (Sampling)
#> Chain 1: Iteration: 210 / 300 [ 70%]  (Sampling)
#> Chain 1: Iteration: 240 / 300 [ 80%]  (Sampling)
#> Chain 1: Iteration: 270 / 300 [ 90%]  (Sampling)
#> Chain 1: Iteration: 300 / 300 [100%]  (Sampling)
#> Chain 1: 
#> Chain 1:  Elapsed Time: 5.963 seconds (Warm-up)
#> Chain 1:                7.917 seconds (Sampling)
#> Chain 1:                13.88 seconds (Total)
#> Chain 1: 
#> 
#> SAMPLING FOR MODEL 'pcount' NOW (CHAIN 2).
#> Chain 2: 
#> Chain 2: Gradient evaluation took 0.003637 seconds
#> Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 36.37 seconds.
#> Chain 2: Adjust your expectations accordingly!
#> Chain 2: 
#> Chain 2: 
#> Chain 2: Iteration:   1 / 300 [  0%]  (Warmup)
#> Chain 2: Iteration:  30 / 300 [ 10%]  (Warmup)
#> Chain 2: Iteration:  60 / 300 [ 20%]  (Warmup)
#> Chain 2: Iteration:  90 / 300 [ 30%]  (Warmup)
#> Chain 2: Iteration: 120 / 300 [ 40%]  (Warmup)
#> Chain 2: Iteration: 150 / 300 [ 50%]  (Warmup)
#> Chain 2: Iteration: 151 / 300 [ 50%]  (Sampling)
#> Chain 2: Iteration: 180 / 300 [ 60%]  (Sampling)
#> Chain 2: Iteration: 210 / 300 [ 70%]  (Sampling)
#> Chain 2: Iteration: 240 / 300 [ 80%]  (Sampling)
#> Chain 2: Iteration: 270 / 300 [ 90%]  (Sampling)
#> Chain 2: Iteration: 300 / 300 [100%]  (Sampling)
#> Chain 2: 
#> Chain 2:  Elapsed Time: 7.043 seconds (Warm-up)
#> Chain 2:                7.001 seconds (Sampling)
#> Chain 2:                14.044 seconds (Total)
#> Chain 2: 
#> 
#> SAMPLING FOR MODEL 'pcount' NOW (CHAIN 3).
#> Chain 3: 
#> Chain 3: Gradient evaluation took 0.003653 seconds
#> Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 36.53 seconds.
#> Chain 3: Adjust your expectations accordingly!
#> Chain 3: 
#> Chain 3: 
#> Chain 3: Iteration:   1 / 300 [  0%]  (Warmup)
#> Chain 3: Iteration:  30 / 300 [ 10%]  (Warmup)
#> Chain 3: Iteration:  60 / 300 [ 20%]  (Warmup)
#> Chain 3: Iteration:  90 / 300 [ 30%]  (Warmup)
#> Chain 3: Iteration: 120 / 300 [ 40%]  (Warmup)
#> Chain 3: Iteration: 150 / 300 [ 50%]  (Warmup)
#> Chain 3: Iteration: 151 / 300 [ 50%]  (Sampling)
#> Chain 3: Iteration: 180 / 300 [ 60%]  (Sampling)
#> Chain 3: Iteration: 210 / 300 [ 70%]  (Sampling)
#> Chain 3: Iteration: 240 / 300 [ 80%]  (Sampling)
#> Chain 3: Iteration: 270 / 300 [ 90%]  (Sampling)
#> Chain 3: Iteration: 300 / 300 [100%]  (Sampling)
#> Chain 3: 
#> Chain 3:  Elapsed Time: 7.085 seconds (Warm-up)
#> Chain 3:                7.052 seconds (Sampling)
#> Chain 3:                14.137 seconds (Total)
#> Chain 3: 
#> Warning: Bulk Effective Samples Size (ESS) is too low, indicating posterior means and medians may be unreliable.
#> Running the chains for more iterations may help. See
#> https://mc-stan.org/misc/warnings.html#bulk-ess
#> Warning: Tail Effective Samples Size (ESS) is too low, indicating posterior variances and tail quantiles may be unreliable.
#> Running the chains for more iterations may help. See
#> https://mc-stan.org/misc/warnings.html#tail-ess
#> Warning: Some Pareto k diagnostic values are too high. See help('pareto-k-diagnostic') for details.
#> 
#> Call:
#> stan_pcount(formula = ~1 ~ elev + forest, data = mallardUMF, 
#>     K = 30, chains = 3, iter = 300)
#> 
#> Abundance (log-scale):
#>             Estimate    SD  2.5%  97.5% n_eff  Rhat
#> (Intercept)   -1.943 0.224 -2.44 -1.547   201 1.002
#> elev          -1.346 0.206 -1.77 -0.941   220 0.996
#> forest        -0.734 0.157 -1.04 -0.454   223 0.996
#> 
#> Detection (logit-scale):
#>  Estimate    SD   2.5% 97.5% n_eff  Rhat
#>     0.486 0.188 0.0932 0.827   338 0.997
#> 
#> LOOIC: 536.301
#> Runtime: 42.061 sec
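
# K should be large enough that it does not affect the estimates; a simple
# sensitivity check (not run here) is to refit with a larger K and compare
# the summaries (fm_k50 is just an illustrative object name):
# fm_k50 <- stan_pcount(~1~elev+forest, mallardUMF, K=50, chains=3, iter=300)
# fm_k50

# The LOOIC reported above can also be obtained directly, assuming the loo()
# method provided for ubms fit objects:
# loo(fm_mallard)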
# }