Hedayat, Rao, and Stufken (1988a, 1988b) first introduced balanced sampling designs for the exclusion of contiguous units. For finite populations of N units arranged in a circular, one-dimensional ordering, they investigated sampling plans that exclude the selection of contiguous units within a given sample while maintaining a constant second-order inclusion probability for non-contiguous units. Many questions remain open about the existence of such plans and their extension to plans excluding adjacent units.
A first-order observation-driven integer-valued autoregressive model is introduced. Ergodicity of the process is established. Conditional least squares and maximum likelihood estimators of the model parameters are derived, and their performances are compared via simulation. The model is applied to a real data set.
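For context, the classical INAR(1) process with binomial thinning can be simulated in a few lines. The sketch below is a generic illustration of this model class under assumed parameter values, not the observation-driven variant introduced in the paper:

```python
import numpy as np

def simulate_inar1(alpha, lam, n, seed=0):
    """Simulate a standard INAR(1) process,
    X_t = alpha o X_{t-1} + eps_t, with binomial thinning
    (each of the X_{t-1} counts survives with probability alpha)
    and Poisson(lam) innovations. Generic illustration only."""
    rng = np.random.default_rng(seed)
    x = np.empty(n, dtype=int)
    # start near the stationary mean lam / (1 - alpha)
    x[0] = rng.poisson(lam / (1 - alpha))
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)  # binomial thinning
        x[t] = survivors + rng.poisson(lam)        # add new arrivals
    return x

# stationary mean is lam / (1 - alpha) = 4 for these settings
series = simulate_inar1(alpha=0.5, lam=2.0, n=5000)
```

The binomial-thinning operator keeps the process integer-valued, which is what distinguishes INAR models from a Gaussian AR(1).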
Regression splines are smooth, flexible, and parsimonious nonparametric function estimators, but the fits are sensitive to the number and placement of the knots. When a priori knowledge about the regression function includes monotonicity or convexity as well as smoothness, shape-restricted versions of the regression splines may be used. These fits are preferable because they meet the shape requirements, with the additional benefit of insensitivity to the knot choices.
The on-line quality monitoring procedure for attributes proposed by Taguchi has been critically studied and extended by a few researchers. Determining the optimum diagnosis interval requires estimating some parameters related to the process failure mechanism. Improper estimates of these parameters may lead to an incorrect choice of the diagnosis interval and, consequently, severe economic penalties. In this paper, we highlight both the theoretical and practical problems associated with the estimation of these parameters, and propose a structured approach to solving them.
Regression splines are smooth, flexible, and parsimonious nonparametric function estimators. They are known to be sensitive to knot number and placement, but if assumptions such as monotonicity or convexity can be imposed on the regression function, the shape-restricted regression splines are much more robust to knot choices. Monotone regression splines were introduced by Ramsay (1988). In this paper a more numerically efficient computational method is developed, and the method is extended to convex constraints.
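As a rough illustration of the underlying idea (not the computational method developed in the paper), a monotone fit can be obtained by writing the spline in a basis of nondecreasing functions and constraining the coefficients to be nonnegative. The sketch below uses a piecewise-linear (truncated-line) basis and nonnegative least squares, with invented data and knot positions:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sqrt(x) + rng.normal(0.0, 0.05, 100)  # monotone truth plus noise

# truncated-line basis: each column max(x - k, 0) is nondecreasing,
# so nonnegative coefficients guarantee a nondecreasing fit
knots = np.linspace(0.0, 1.0, 8)[:-1]
B = np.column_stack([np.ones_like(x)] + [np.maximum(x - k, 0.0) for k in knots])

coef, _ = nnls(B, y)   # least squares subject to coef >= 0
fit = B @ coef
```

Ramsay's I-splines play the role of the monotone basis in the spline setting; the nonnegativity constraint on the coefficients is what enforces the shape restriction.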
Abstract not available
First we provide a simple derivation of the density of a chi-square random variable. Then we provide another simple proof of the statistical independence of the sample mean and the sample variance of a random sample from a normal distribution. The technique used to prove independence readily extends to the multivariate case.
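The independence claim is easy to check empirically. The simulation below (an illustration, not the paper's proof) draws many normal samples and confirms that the sample means and sample variances are uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(1)

# 100,000 independent samples of size 5 from N(2, 3^2)
samples = rng.normal(loc=2.0, scale=3.0, size=(100_000, 5))
means = samples.mean(axis=1)
variances = samples.var(axis=1, ddof=1)  # unbiased sample variance

# independence implies zero correlation; the estimate should be near 0
corr = np.corrcoef(means, variances)[0, 1]
```

Zero correlation is of course weaker than independence, but it is the part a quick simulation can exhibit; the normal distribution is the only one for which full independence holds.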
Genetic algorithms (GAs) are a popular technique for searching for an optimum in a large search space. Using the new concepts of forbidden array and weighted mutation, Mandal, Wu and Johnson (2006) used elements of GAs to introduce a new global optimization technique, called sequential elimination of level combinations (SELC), that efficiently finds optima. A SAS macro and Matlab and R functions are developed to implement the SELC algorithm.
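For readers unfamiliar with GAs, the sketch below shows the basic ingredients (selection, crossover, mutation) on bit strings. It is a generic illustration with assumed settings, not the SELC procedure or the published software:

```python
import numpy as np

def ga_maximize(fitness, n_bits=16, pop_size=40, n_gen=60, p_mut=0.02, seed=0):
    """Minimal bit-string genetic algorithm (generic illustration only;
    not the SELC algorithm of Mandal, Wu and Johnson, 2006)."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    for _ in range(n_gen):
        fit = np.array([fitness(ind) for ind in pop])
        # binary tournament selection
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((fit[i] > fit[j])[:, None], pop[i], pop[j])
        # one-point crossover on consecutive parent pairs
        children = parents.copy()
        for k, c in enumerate(rng.integers(1, n_bits, pop_size // 2)):
            a, b = 2 * k, 2 * k + 1
            children[a, c:], children[b, c:] = (parents[b, c:].copy(),
                                                parents[a, c:].copy())
        # independent bit-flip mutation
        flip = rng.random(children.shape) < p_mut
        pop = np.where(flip, 1 - children, children)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[fit.argmax()], fit.max()

# toy objective: maximize -(x - 0.7)^2 with x in [0, 1) as a 16-bit fraction
def f(bits):
    x = bits @ (2.0 ** -np.arange(1, len(bits) + 1))
    return -(x - 0.7) ** 2

best, best_fit = ga_maximize(f)
```

SELC replaces the blind mutation step with weighted mutation guided by main-effect estimates and uses a forbidden array to avoid revisiting eliminated level combinations.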