Skewness for Maximum Likelihood Estimators of the Negative Binomial Distribution
Abstract
The probability generating function of one version of the negative binomial distribution being (p + 1 - pt)^{-k}, we study elements of the Hessian and in particular Fisher's discovery of a series form for the variance of k, the maximum likelihood estimator, and also for the determinant of the Hessian. There is a link with the Psi function and its derivatives. The basic algebra is excessively complicated, so a Maple code implementation is an important part of the solution process. Low-order maximum likelihood moments are given, along with Fisher's examples relating to data on ticks on sheep. The efficiency of moment estimators is discussed, including the concept of joint efficiency. In an Addendum we give an interesting formula for the difference of two Psi functions.
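The pgf quoted in the abstract, (p + 1 - pt)^{-k}, can be checked numerically against the negative binomial pmf obtained by expanding it as a power series in t. A minimal sketch, using hypothetical parameter values (k = 3, p = 0.7) chosen only for illustration:

```python
from math import comb

# Hypothetical parameter values for illustration only
k, p = 3, 0.7
q = 1.0 + p  # with q = 1 + p the pgf is (q - p*t)^(-k)

def pgf(t):
    """Probability generating function (p + 1 - p*t)^(-k)."""
    return (p + 1 - p * t) ** (-k)

def pmf(x):
    """Negative binomial pmf implied by the binomial-series expansion of the pgf."""
    return comb(k + x - 1, x) * (p / q) ** x * q ** (-k)

t = 0.4
# Partial sum of E[t^X]; the tail is negligible since (p/q)*t < 1
series = sum(pmf(x) * t ** x for x in range(200))
```

Here `series` reproduces `pgf(t)`, confirming that the pmf terms are the coefficients of the stated generating function.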
 Authors:
 Bowman, Kimiko O. (ORNL)
 Publication Date:
 2007
 Research Org.:
 Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
 Sponsoring Org.:
 Work for Others (WFO)
 OSTI Identifier:
 931488
 DOE Contract Number:
 DE-AC05-00OR22725
 Resource Type:
 Journal Article
 Resource Relation:
 Journal Name: Far East Journal of Theoretical Statistics; Journal Volume: 22; Journal Issue: 1
 Country of Publication:
 United States
 Language:
 English
 Subject:
 97; 99 GENERAL AND MISCELLANEOUS//MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; MAXIMUM-LIKELIHOOD FIT; FORECASTING; DISTRIBUTION; EFFICIENCY; PROBABILITY; STATISTICS
Citation Formats
Bowman, Kimiko O. Skewness for Maximum Likelihood Estimators of the Negative Binomial Distribution. United States: N. p., 2007. Web.
Bowman, Kimiko O. Skewness for Maximum Likelihood Estimators of the Negative Binomial Distribution. United States.
Bowman, Kimiko O. 2007. "Skewness for Maximum Likelihood Estimators of the Negative Binomial Distribution". United States.
@article{osti_931488,
title = {Skewness for Maximum Likelihood Estimators of the Negative Binomial Distribution},
author = {Bowman, Kimiko O.},
abstractNote = {The probability generating function of one version of the negative binomial distribution being (p + 1 - pt)^{-k}, we study elements of the Hessian and in particular Fisher's discovery of a series form for the variance of k, the maximum likelihood estimator, and also for the determinant of the Hessian. There is a link with the Psi function and its derivatives. The basic algebra is excessively complicated, so a Maple code implementation is an important part of the solution process. Low-order maximum likelihood moments are given, along with Fisher's examples relating to data on ticks on sheep. The efficiency of moment estimators is discussed, including the concept of joint efficiency. In an Addendum we give an interesting formula for the difference of two Psi functions.},
doi = {},
journal = {Far East Journal of Theoretical Statistics},
number = 1,
volume = 22,
place = {United States},
year = {2007},
month = {1}
}

Maximum likelihood estimators for the gamma distribution revisited
A new algorithm is stated for the evaluation of the maximum likelihood estimators of the two-parameter gamma density. This, along with other approximations, is used to evaluate, by quadrature, moments of the estimators of the shape and scale parameters. 11 references, 4 tables.
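The paper's own algorithm is not reproduced in this record; for context, the two-parameter gamma MLE is commonly computed by a Newton iteration on the shape equation log k - psi(k) = log(mean(x)) - mean(log x), where psi is the digamma function. A minimal sketch assuming SciPy for digamma/trigamma (the starting value is a standard closed-form approximation, not taken from the paper):

```python
import numpy as np
from scipy.special import digamma, polygamma

def gamma_mle(x, iters=50):
    """Newton iteration for the gamma shape k solving log(k) - psi(k) = s,
    then the scale as mean(x) / k."""
    x = np.asarray(x, dtype=float)
    s = np.log(np.mean(x)) - np.mean(np.log(x))
    # Standard closed-form starting approximation for the shape
    k = (3.0 - s + np.sqrt((s - 3.0) ** 2 + 24.0 * s)) / (12.0 * s)
    for _ in range(iters):
        # f(k) = log k - psi(k) - s,  f'(k) = 1/k - psi'(k)
        k -= (np.log(k) - digamma(k) - s) / (1.0 / k - polygamma(1, k))
    theta = np.mean(x) / k  # scale estimator given the shape
    return k, theta
```

On a large simulated sample the estimates land close to the true shape and scale, which is the setting in which the asymptotic moments studied in the paper apply.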
Binomial and Poisson Mixtures, Maximum Likelihood, and Maple Code
The bias, variance, and skewness of maximum likelihood estimators are considered for binomial and Poisson mixture distributions. The moments considered are asymptotic, and they are assessed using Maple code. Questions of the existence of solutions and Karl Pearson's study are mentioned, along with problems of the valid sample space. The use of large samples to reduce variances is not unusual; this also applies to the size of the asymptotic skewness.
