This function performs Markov chain Monte Carlo simulation for fitting a (latent class) (mixed) (multinomial) probit model to discrete choice data.

Usage

mcmc(
data,
scale = list(parameter = "s", index = 1, value = 1),
R = 10000,
B = R/2,
Q = 1,
print_progress = getOption("RprobitB_progress"),
prior = NULL,
latent_classes = NULL,
seed = NULL
)

Arguments

data

An object of class RprobitB_data.

scale

A named list of three elements, determining the parameter normalization with respect to the utility scale:

• parameter: Either "a" (to fix an element of the linear coefficient vector "alpha") or "s" (to fix a variance in the error-term covariance matrix "Sigma").

• index: The index of the parameter that gets fixed.

• value: The value for the fixed parameter.
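For illustration, the default scale fixes the first error-term variance in "Sigma" to 1; to normalize via the first linear coefficient instead, the list could be built as follows (a sketch, assuming the model has at least one linear coefficient):

```r
# default: fix the first diagonal element of Sigma to 1
scale <- list(parameter = "s", index = 1, value = 1)

# alternative: fix the first linear coefficient alpha_1 to 1 instead
scale_alpha <- list(parameter = "a", index = 1, value = 1)
```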

R

The number of iterations of the Gibbs sampler.

B

The length of the burn-in period, i.e. the non-negative number of initial samples to be discarded.

Q

The thinning factor for the Gibbs samples, i.e. only every Qth sample is kept.
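Together, R, B, and Q determine how many Gibbs samples enter the posterior summaries: of the R draws, the first B are discarded and only every Qth of the remainder is kept. A quick check with the default values:

```r
R <- 10000  # total Gibbs iterations
B <- R / 2  # burn-in draws to discard
Q <- 1      # thinning factor (keep every Qth draw)

# number of samples retained for posterior summaries
kept <- (R - B) / Q
kept
#> [1] 5000
```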

print_progress

A boolean, determining whether to print the Gibbs sampler progress and the estimated remaining computation time.

prior

A named list of parameters for the prior distributions. See the documentation of check_prior for details about which parameters can be specified.

latent_classes

Either NULL (for no latent classes) or a list of parameters specifying the number of latent classes and their updating scheme:

• C: The fixed number (greater than or equal to 1) of latent classes, which is 1 by default. If either weight_update = TRUE or dp_update = TRUE (i.e. if classes are updated), C is the initial number of latent classes.

• weight_update: A boolean, set to TRUE to update the latent classes based on their class weights. See ... for details.

• dp_update: A boolean, set to TRUE to update the latent classes based on a Dirichlet process. See ... for details.

• Cmax: The maximum number of latent classes.

• buffer: The number of iterations to wait before the next weight-based update of the latent classes.

• epsmin: The threshold weight (between 0 and 1) for removing a latent class in the weight-based updating scheme.

• epsmax: The threshold weight (between 0 and 1) for splitting a latent class in the weight-based updating scheme.

• distmin: The (non-negative) threshold distance between class means below which two latent classes are joined in the weight-based updating scheme.
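A sketch of a latent_classes specification with weight-based updating; the threshold values below are illustrative choices, not package defaults:

```r
latent_classes <- list(
  C = 2,                 # initial number of latent classes
  weight_update = TRUE,  # update classes based on their weights
  Cmax = 10,             # never exceed 10 classes
  buffer = 100,          # iterations between weight-based updates
  epsmin = 0.01,         # remove classes with weight below 1%
  epsmax = 0.99,         # split classes with weight above 99%
  distmin = 0.1          # join classes whose means are closer than 0.1
)
```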

seed

Set a seed for the Gibbs sampling.

Value

An object of class RprobitB_fit.

Details

See the vignette on model fitting for more details.

See Also

• prepare_data() and simulate_choices() for building an RprobitB_data object

• nested_model() for estimating nested models

• transform() for transforming a fitted model

Examples

data <- simulate_choices(
form = choice ~ var | 0, N = 100, T = 10, J = 3, seed = 1
)
mod <- mcmc(data = data, R = 1000, seed = 1)
#> Computing sufficient statistics 0 of 4
#> Computing sufficient statistics 1 of 4
#> Computing sufficient statistics 2 of 4
#> Computing sufficient statistics 3 of 4
#> Computing sufficient statistics 4 of 4
#> Gibbs sampler iteration 1 of 1000
#> Gibbs sampler iteration 10 of 1000
#> ...
#> Gibbs sampler iteration 990 of 1000
#> Gibbs sampler iteration 1000 of 1000

summary(mod)
#> Probit model
#> choice ~ var | 0
#> R: 1000
#> B: 500
#> Q: 1
#>
#> Normalization
#> Level: Utility differences with respect to alternative 3.
#> Scale: Coefficient of the 1. error term variance in Sigma fixed to 1.
#>
#> Gibbs sample statistics
#>           true    mean      sd      R^
#>  alpha
#>
#>      1   -0.94   -0.85    0.06    1.16
#>
#>  Sigma
#>
#>    1,1    1.00    1.00    0.00    1.00
#>    1,2   -0.42   -0.30    0.06    2.15
#>    2,2    0.27    0.19    0.04    1.62