ComputeBest_t.Rd
Description

Runs a Monte Carlo simulation for different values of \(\alpha\) and \(\beta\) and computes a specified number of t-points that minimise the determinant of the asymptotic covariance matrix.
Usage

ComputeBest_t(AlphaBetaMatrix = abMat, nb_ts = seq(10, 100, 10),
              alphaReg = 0.001, FastOptim = TRUE, ...)
Arguments

AlphaBetaMatrix: values of the parameters \(\alpha\) and \(\beta\) from which the data are simulated; a \(2 \times n\) matrix. By default, the values of \(\gamma\) and \(\delta\) are set to 1 and 0, respectively.

nb_ts: vector of numbers of t-points to use for the minimisation; default = seq(10, 100, 10).

alphaReg: value of the regularisation parameter; numeric, default = 0.001.

FastOptim: logical flag; if TRUE, optim with the "Nelder-Mead" method is used (fast but less accurate). Otherwise, nlminb is used (more accurate but slower).

...: other arguments to pass to the optimisation function.
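A minimal sketch of a call, assuming the function comes from the StableEstim package and that each column of AlphaBetaMatrix holds one \((\alpha, \beta)\) pair (both assumptions, not stated on this page); the Monte Carlo simulation can be slow, so the inputs are kept deliberately small:

## Package name assumed from context.
library(StableEstim)

## Each column is taken to be one (alpha, beta) pair to simulate from
## (column orientation assumed); gamma = 1 and delta = 0 by default.
abMat <- matrix(c(1.2, 0.0,
                  1.5, 0.5),
                nrow = 2)

## Small nb_ts grid to keep the run short; FastOptim = TRUE selects
## optim() with "Nelder-Mead" (fast but less accurate).
res <- ComputeBest_t(AlphaBetaMatrix = abMat,
                     nb_ts = c(10, 20),
                     alphaReg = 0.001,
                     FastOptim = TRUE)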
Value

A list of objects of class Best_t-class, each corresponding to one value of the parameters \(\alpha\) and \(\beta\).
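A short sketch of inspecting the result, assuming res from the call above; str() is used so as not to guess at the slot names of Best_t-class:

## One Best_t object per (alpha, beta) column of abMat (assumed).
length(res)

## Display the slots of the first result without assuming their names.
str(res[[1]])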