Three simplifying conditions are given for obtaining least squares (LS) estimates for a nonlinear submodel of a linear model. If these are satisfied, and if the subset of nonlinear parameters may be LS fit to the corresponding LS estimates of the linear model, then one attains the desired LS estimates for the entire submodel. Two illustrative analyses employing this method are given, each involving an Eckart-Young (LS) decomposition of a matrix of linear LS estimates. In each case the factors...

Topics: ERIC Archive, Analysis of Variance, Least Squares Statistics, Mathematical Models, Mathematics,...

Aitkin's generalized least squares (GLS) principle, with the inverse of the observed variance-covariance matrix as a weight matrix, is applied to estimate the factor analysis model in the exploratory (unrestricted) case. It is shown that the GLS estimates are scale free and asymptotically efficient. The estimates are computed by a rapidly converging Newton-Raphson procedure. A new technique is used to deal with Heywood cases effectively. (Author)
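The GLS fit function described above can be sketched numerically. This is a minimal illustration, not the paper's method: it minimizes the standard GLS discrepancy F = ½ tr[(I − S⁻¹Σ(θ))²] with a general-purpose quasi-Newton optimizer (scipy's BFGS) rather than the tailored Newton-Raphson procedure, and the one-factor setup with a population covariance matrix is an assumption made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

p = 4
lam_true = np.array([0.8, 0.7, 0.6, 0.5])             # true loadings (1 factor)
psi_true = 1.0 - lam_true**2                          # uniquenesses (unit variances)
S = np.outer(lam_true, lam_true) + np.diag(psi_true)  # "observed" covariance
S_inv = np.linalg.inv(S)                              # GLS weight matrix

def gls_discrepancy(theta):
    """F = 1/2 tr[(I - S^{-1} Sigma)^2], the GLS fit function with W = S^{-1}."""
    lam, log_psi = theta[:p], theta[p:]
    sigma = np.outer(lam, lam) + np.diag(np.exp(log_psi))
    M = np.eye(p) - S_inv @ sigma
    return 0.5 * np.trace(M @ M)

theta0 = np.concatenate([np.full(p, 0.5), np.log(np.full(p, 0.5))])
res = minimize(gls_discrepancy, theta0, method="BFGS")
lam_hat = res.x[:p] * np.sign(res.x[0])  # resolve the sign indeterminacy
```

Because S here is the population covariance, the minimized discrepancy is essentially zero and the loadings are recovered exactly; with sample data the minimum would be positive and would feed the paper's goodness-of-fit machinery.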

Topics: ERIC Archive, Correlation, Factor Analysis, Factor Structure, Goodness of Fit, Least Squares...

An iterative least squares procedure for analyzing the effect of various kinds of intervention in time-series data is described. There are numerous applications of this design in economics, education, and psychology, although until recently, no appropriate analysis techniques had been developed to deal with the model adequately. This paper presents and develops a complex example of time-series experiments using simulated data with the intent of illustrating the analytic power of the technique...

Topics: ERIC Archive, Correlation, Data Analysis, Economics, Education, Goodness of Fit, Intervention,...

An "undesigned" experiment is one in which the predictor variables are correlated, either due to a failure to complete a design or because the investigator was unable to select or control relevant experimental conditions. The traditional method of analyzing this class of experiment--multiple regression analysis based on a least squares criterion--gives rise to a number of interpretation problems when the effects of individual predictors are to be assessed. Some difficulties and their...

Topics: ERIC Archive, Bias, Computer Programs, Correlation, Data Analysis, Experiments, Least Squares...

The intent of the study was to determine the extent to which test statistics computed by the unweighted means analysis are F-distributed. Applicability criteria were sought in terms of the number of factor levels and the degree to which cell frequencies differ. The unweighted means analysis, a frequently used approximate analysis, was contrasted with three least squares solutions. Evidence was relatively strong in favor of a least squares analysis if one is to conduct a two-factor analysis of...

Topics: ERIC Archive, Analysis of Variance, Comparative Analysis, Computer Programs, Goodness of Fit, Least...

To eliminate maturation as a factor in the pretest-posttest design, pretest scores can be converted to anticipated posttest scores using grade equivalent scores from standardized tests. This conversion, known as historical regression, assumes that without specific intervention, growth will continue at the rate (grade equivalents per year of schooling) obtained at the time of pretest. Data were taken from reports of 213 Title I compensatory education programs in New York State to examine the...
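The historical-regression conversion described above reduces to simple arithmetic. The function below is an illustrative sketch of that assumption (the name and inputs are hypothetical, not taken from the report): growth continues at the pretest rate of grade equivalents per year of schooling.

```python
def anticipated_posttest_ge(pretest_ge, years_at_pretest, years_at_posttest):
    """Historical regression: project the pretest growth rate
    (grade equivalents per year of schooling) forward to the posttest date."""
    rate = pretest_ge / years_at_pretest
    return rate * years_at_posttest

# A pupil at grade equivalent 3.0 after 4.0 years of schooling grows at
# 0.75 GE per year; one year later the no-treatment expectation is 3.75.
print(anticipated_posttest_ge(3.0, 4.0, 5.0))  # 3.75
```

Posttest scores above this projection are then attributed to the intervention, which is precisely the inference the study puts to the test.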

Topics: ERIC Archive, Academic Achievement, Achievement Gains, Elementary Education, Grade Equivalent...

Least squares and Bayes methods were used in a cross validation study conducted for comparison purposes. The study applies to situations with the following conditions: predictor data are given on the same scales; criterion data may be given on different scales; and it is necessary to pool data even though criterion scale differences exist. Such a system may be needed for minority group prediction studies or graduate school prediction studies where the group sizes are small. Data for the study...

Topics: ERIC Archive, Bayesian Statistics, College Entrance Examinations, Grade Prediction, Graduate Study,...

The conceptualization of analysis of covariance (ANCOVA), as an analysis of variance (ANOVA) on the residual scores that are obtained when the dependent variable is regressed on the covariate, is mathematically incorrect. If residuals are obtained from the pooled within-groups regression coefficient, ANOVA on the residuals results in an inflated alpha-level. If the regression coefficient for the total sample combined into one group is used, ANOVA on the residuals yields an inappropriately...

Topics: ERIC Archive, Analysis of Covariance, Analysis of Variance, Least Squares Statistics, Mathematical...

Whenever one uses ordinary least squares regression, one is making an implicit assumption that all of the independent variables have been measured without error. Such an assumption is obviously unrealistic for most social data. One approach for estimating such regression models is to measure implied coefficients between latent variables for which one had multiple manifest indicators. The problem with this approach is that overidentified models yield multiple estimates of the associations among...

Topics: ERIC Archive, Computer Programs, Factor Analysis, Least Squares Statistics, Mathematical Models,...

Aside from the theoretical issues involving the validity of inferences from surveys, the basic problem of producing unbiased estimates of regression parameters and estimates of the associated standard errors has been a particularly difficult issue in dealing with results from stratified multistage sample designs such as the one used in the National Longitudinal Study of the High School Class of 1972 (NLS). The purpose of this report is to review some appropriate available techniques that may be...

Topics: ERIC Archive, Computer Programs, Estimation (Mathematics), Graduate Surveys, High Schools, Least...

Comparisons between whites and blacks in models of educational achievement were found to be suspect when based solely on least-squares estimates, since the estimates are biased by measurement error varying by race. In this study, white high school seniors were shown to report their parents' status characteristics more reliably than black high school seniors. Data were drawn from the National Longitudinal Study of the High School Class of 1972 as the seniors moved into early adult years....

Topics: ERIC Archive, Academic Achievement, Analysis of Covariance, Black Students, Error of Measurement,...

The least squares fitting process is presented as a method of data reduction. The general strategy is to consider fitting (linear) models as partitioning data into a fit and residuals. The fit can be parsimoniously represented by a summary of the data. A fit is considered adequate if the residuals are small enough so that manipulating their signs and locations does not affect the summary more than a pre-specified amount. The effect of the residuals on the summary is shown to be (approximately)...
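The fit-plus-residuals partition above can be sketched in a few lines of numpy. The data are invented for illustration; the global sign flip shown is just one of the residual manipulations the paper considers, and it happens to leave the least squares summary exactly unchanged because residuals are orthogonal to the design matrix.

```python
import numpy as np

x = np.array([1., 2., 3., 4., 5.])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit a line by least squares: the "fit" is the summary (intercept, slope).
X = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
fit = X @ coef
residuals = y - fit

# The data are exactly partitioned into fit + residuals.
assert np.allclose(fit + residuals, y)

# Flipping the signs of all residuals and refitting does not move the
# summary at all, because the residuals are orthogonal to the columns of X.
coef_flipped, *_ = np.linalg.lstsq(X, fit - residuals, rcond=None)
print(coef, coef_flipped)
```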

Topics: ERIC Archive, Goodness of Fit, Least Squares Statistics, Mathematical Models, Measurement...

Classical statistical methods and the small enrollments in graduate departments have constrained the Graduate Record Examinations (GRE) Validity Study Service to providing only validities for single predictors. Estimates of the validity of two or more predictors, used jointly, are considered too unreliable because the corresponding prediction equations often possess implausible characteristics. This study investigates two statistical methods--empirical Bayes and cluster analysis--to determine...

Topics: ERIC Archive, Bayesian Statistics, College Entrance Examinations, Departments, Grade Point Average,...

This paper reviews recent work in factor analysis of categorical variables. Emphasis is on the generalized least squares solution. A section on maximum likelihood solution focuses on extensions of the classical model, especially the normal case. Many of the recent developments have taken place within this context, and it provides a unified framework of exposition against which other models may be introduced in contrast. Section 2 provides a brief review of factor analysis of measured variables,...

Topics: ERIC Archive, Correlation, Estimation (Mathematics), Factor Analysis, Factor Structure, Latent...

The nature of the criterion (dependent) variable may play a useful role in structuring a list of classification/prediction problems. Such criteria are continuous, binary (dichotomous), or multichotomous. In this paper, discussion is limited to the continuous normally distributed criterion scenarios. For both cases, it is assumed that the predictor variables are continuous multivariate normal. For the binary variable case, the multivariate normal assumption is conditioned on the binary...

Topics: ERIC Archive, Classification, Correlation, Error of Measurement, Estimation (Mathematics), Least...

This study examined the effect of type of correlation matrix on the robustness of LISREL maximum likelihood and unweighted least squares structural parameter estimates for models with categorical manifest variables. Two types of correlation matrices were analyzed: one containing Pearson product-moment correlations and one containing tetrachoric, polyserial, and product-moment correlations as appropriate. Using continuous variables generated according to the equations defining the population...

Topics: ERIC Archive, Computer Software, Correlation, Estimation (Mathematics), Goodness of Fit, Hypothesis...

Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used to assess whether the ridge techniques provide a viable alternative to the more familiar ordinary least squares approach within the collinear...

Topics: ERIC Archive, Algorithms, College Students, Cost Estimates, Full Time Students, Higher Education,...

This paper investigates the problems of the impact of grouping on measuring academic performance. It focuses on inferences made from the group level to the individual level. Data for a large state college system with over 30 individual colleges are used. The problem of aggregation bias is studied using the analysis of covariance and is related to the clustering approach of generalized least squares. To illustrate the question, actual data on academic performance at the individual and group level...

Topics: ERIC Archive, Academic Achievement, Analysis of Variance, Cluster Grouping, College Entrance...

Six editions of Scholastic Aptitude Test-Mathematical (SAT-M) were factor analyzed using confirmatory and exploratory methods. Confirmatory factor analyses (using the LISREL VI program) were conducted on correlation matrices among item parcels--sums of scores on a small subset of items. Item parcels were constructed to yield correlation matrices amenable to linear factor analyses. The items constituting a parcel measured the same dimension, and parcels measuring the same construct were parallel...

Topics: ERIC Archive, College Entrance Examinations, Factor Analysis, Factor Structure, Guessing (Tests),...

In research, data sets often occur in which the variance of the distribution of the dependent variable at given levels of the predictors is a function of the values of the predictors. In this situation, the use of weighted least-squares (WLS) techniques is required. Weights suitable for use in a WLS regression analysis must be estimated. A variety of techniques have been proposed for the empirical selection of weights with the ultimate objective being a better "fit." The outcomes...
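The WLS setup described above can be sketched as follows. This is an idealized illustration, not the paper's procedure: the variance function is assumed known up to proportionality (variance growing with the square of the predictor), so the hard problem the paper addresses, estimating the weights empirically, is side-stepped by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1, 10, 200)
# Error variance grows with the predictor (heteroscedasticity).
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 * x)

X = np.column_stack([np.ones_like(x), x])

# OLS ignores the unequal variances.
ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# WLS: weight each case by (an estimate of) the reciprocal error variance.
# Here the weights 1/x^2 mirror the assumed variance function; in practice
# they must themselves be estimated, which is the subject of the paper.
w = 1.0 / x**2
W = np.diag(w)
wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(ols, wls)
```

Both estimators are unbiased here; the payoff of WLS is smaller sampling variance (and honest standard errors) when the weights track the true variance function.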

Topics: ERIC Archive, Error of Measurement, Estimation (Mathematics), Goodness of Fit, Least Squares...

Traditionally, the errors-in-variables problem is concerned with the point estimation of the slope of the true scores regression line when the regressor is measured with error, and when no specification error is present. In this paper, the errors-in-variables problem is extended to include specification error. Least squares procedures provide a biased estimator of the slope of the true scores regression line. Further, the maximum likelihood estimates of the slope (which are consistent) exist...

Topics: ERIC Archive, Computer Simulation, Equations (Mathematics), Error of Measurement, Graphs, Least...

This report addresses the problem of sample size in developing prediction equations for college freshman grade averages under the American College Testing (ACT) Assessment Program. For the ACT Assessment, the prediction weights are estimated by standard least squares procedures. Because prediction weights are estimated regression coefficients whose accuracy depends on the size of the base sample used to estimate them and because error in estimating the weights propagates error in prediction,...

Topics: ERIC Archive, College Entrance Examinations, College Freshmen, Grade Point Average, Grade...

An algorithm is presented for the best least-squares fitting correlation matrix approximating a given missing value or improper correlation matrix. The proposed algorithm is based on a solution for C. I. Mosier's oblique Procrustes rotation problem offered by J. M. F. ten Berge and K. Nevels (1977). It is shown that the minimization problem belongs to a certain class of convex programs in optimization theory. A necessary and sufficient condition for a solution to yield the unique global minimum...

Topics: ERIC Archive, Algorithms, Computer Software, Correlation, Estimation (Mathematics), Least Squares...

Homogeneity analysis, or multiple correspondence analysis, is usually applied to k separate variables. In this paper, it is applied to sets of variables by using sums within sets. The resulting technique is referred to as OVERALS. It uses the notion of optimal scaling, with transformations that can be multiple or single. The single transformations consist of three types: (1) nominal; (2) ordinal; and (3) numerical. The corresponding OVERALS computer program minimizes a least squares loss...

Topics: ERIC Archive, Algorithms, Computer Software, Least Squares Statistics, Linear Programing,...

This paper explains in user-friendly terms why multivariate statistics are so important in educational research. The basic logic of canonical correlation analysis is presented as a simple or bivariate Pearson "r" procedure. It is noted that all statistical tests implicitly involve the calculation of least squares weights, and that all parametric tests can be conducted using canonical analysis, since canonical analysis subsumes parametric methods as special cases. Canonical analysis is...

Topics: ERIC Archive, Educational Research, Heuristics, Least Squares Statistics, Multiple Regression...

Two methods of using collateral information from similar institutions to predict college freshman grade average were investigated. One central prediction model, referred to as pooled least squares with adjusted intercepts, assumes that slopes and residual variances are homogeneous across selected colleges. The second model, referred to as Bayesian m-group regression, allows estimates of slopes and variances to vary across colleges without ignoring the available collateral information. These...

Topics: ERIC Archive, Bayesian Statistics, College Freshmen, Colleges, Comparative Analysis, Estimation...

Methods for predicting specific college course grades, based on small numbers of observations, were investigated. These methods use collateral information across potentially diverse institutions to obtain refined within-group parameter estimates. One method, referred to as pooled least squares with adjusted intercepts, assumes that slopes and residual variances are homogeneous across selected colleges. The second method, referred to as Bayesian m-group regression, allows estimates of slopes and...

Topics: ERIC Archive, Bayesian Statistics, College Students, Colleges, Comparative Analysis, Estimation...

The validity of American College Testing Program (ACT) test scores and self-reported high school grades for predicting grades in specific college freshman courses was studied. Specific course grades are typically used to place students in remedial, standard, or advanced classes. These placement decisions, in turn, have immediate implications for student performance, satisfaction, and persistence in college. Prediction equations were developed for 18 out of 2,812 specific college courses in...

Topics: ERIC Archive, Bayesian Statistics, College Freshmen, Comparative Analysis, Evaluation Methods,...

The economic benefit that communities derive from in-migration of retired persons has been well recognized in rural development literature. This paper examines the impact of Georgia county attributes on net migration by persons 55 years old and older from 1975 to 1980. Data were obtained from the 1982 County-City Data Book, the U.S. Census Migration Estimates for States and Counties, the Georgia Department of Industry and Trade, and the Georgia Atlas. An empirical model was used to test the...

Topics: ERIC Archive, Community Characteristics, Influences, Least Squares Statistics, Middle Aged Adults,...

Langmuir's model is studied for the situation where epsilon is independently and identically normally distributed. The "Y/x" versus "Y" plot had a 90% mid-range that did not contain the true curve in a vast portion of the range of "x". The "1/Y" versus "1/x" plot had undefined expected values, and this problem worsens as sample size increases. The use of non-linear least squares is recommended. In non-linear regression, it is demonstrated that...
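The recommended direct non-linear fit can be sketched with scipy. The parameterization Y = a·b·x/(1 + b·x) is an assumption for illustration (the abstract does not give the exact form), and the data are noise-free so that the fit simply recovers the generating parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(x, a, b):
    """An assumed Langmuir-type form: Y = a*b*x / (1 + b*x)."""
    return a * b * x / (1.0 + b * x)

x = np.linspace(0.1, 10.0, 20)
y = langmuir(x, 3.0, 0.5)          # noise-free data from known parameters

# Direct non-linear least squares, as recommended, instead of fitting a
# line to the "Y/x versus Y" or "1/Y versus 1/x" transformations, whose
# transformed errors misbehave.
(a_hat, b_hat), _ = curve_fit(langmuir, x, y, p0=[1.0, 1.0])
print(a_hat, b_hat)
```

With normal errors added to Y itself, the linearizing transformations distort the error structure (division by a noisy Y), which is exactly why the two classical plots fail in the study.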

Topics: ERIC Archive, Equations (Mathematics), Estimation (Mathematics), Least Squares Statistics,...

Performance rating systems frequently use multiple raters in order to improve the reliability of ratings. However, unless all candidates are rated by the same raters, some candidates will be at an unfair advantage or disadvantage solely because they were rated by more stringent or lenient raters. To obtain fair and accurate evaluations of candidate performance, such sources of systematic rating error must be considered. This paper describes four procedures to detect and correct for rater...

Topics: ERIC Archive, Algorithms, Computer Simulation, Educational Assessment, Evaluation Methods,...

This study illustrates the use of three least-squares models to control for rater effects in performance evaluation: (1) ordinary least squares (OLS); (2) weighted least squares (WLS); and (3) OLS subsequent to applying a logistic transformation to observed ratings (LOG-OLS). The three models were applied to ratings obtained from four administrations of an oral examination required for certification in a medical specialty. For any single administration, there were 40 raters and approximately...

Topics: ERIC Archive, Evaluators, Higher Education, Interrater Reliability, Least Squares Statistics,...

Several common sources of error in assessment that depends on the use of judges are identified, and ways to reduce the impact of rating errors are examined. Numerous threats to the validity of scores based on ratings exist. These threats include: (1) the halo effect; (2) stereotyping; (3) perception differences; (4) leniency/stringency error; and (5) scale shrinking. An established body of literature shows that training can minimize rater effects. To be successful, rater training should...

Topics: ERIC Archive, Alternative Assessment, Error of Measurement, Evaluation Methods, Evaluators,...

It is argued that analysis of variance (ANOVA) and related methods should be taught using a general linear model (GLM) approach, rather than a classical ordinary sums of squares approach. The GLM approach emphasizes the linkages among conventional parametric methods, emphasizing that all classical parametric methods are least squares procedures that implicitly or explicitly use weights, focus on latent synthetic variables, and yield effect sizes analogous to "r" squared (are...

Topics: ERIC Archive, Analysis of Variance, Correlation, Effect Size, Higher Education, Introductory...

Several regression methods were examined within the framework of weighted structural regression (WSR), comparing their regression weight stability and score estimation accuracy in the presence of outlier contamination. The methods compared are: (1) ordinary least squares; (2) WSR ridge regression; (3) minimum risk regression; (4) minimum risk 2; (5) goodness of fit index (GFI); and (6) WSR reduced rank regression. Three population covariance matrices were used that were drawn from applied...

Topics: ERIC Archive, Analysis of Covariance, Bayesian Statistics, Comparative Analysis, Computer...

This paper tests the degree of overlap between operational definitions of transformational and transactional leadership, the nature of the relationships between the constructs of transformational and transactional leadership, and specified outcomes in an empirically derived data set by the application of two forms of analysis. Based on Bass's (1985) model, canonical analysis and partial least-squares analysis are applied to derive two path models. The data set was obtained from 1991 Canadian...

Topics: ERIC Archive, Educational Improvement, Elementary Secondary Education, Foreign Countries,...

Meta-analytic methods were used to summarize results of Monte Carlo (MC) studies investigating the robustness of various statistical procedures for testing within-subjects effects in split-plot repeated measures designs. Through a literature review, accessible MC studies were identified, and characteristics (simulation factors) and outcomes (rates of Type I error) of each MC study were coded for univariate, df-adjusted univariate, and multivariate test procedures. Results of weighted least...

Topics: ERIC Archive, Computer Simulation, Foreign Countries, Interaction, Least Squares Statistics,...

Ordinary least-squares regression treats the variables asymmetrically, designating a dependent variable and one or more independent variables. When it is not obvious how to make this distinction, a researcher may prefer to use orthogonal regression, which treats the variables symmetrically. However, the usual procedure for orthogonal regression is not equivariant. A simple modification is proposed to overcome this serious defect. Illustrative computations involving 15 observations on 5...
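The symmetric treatment of variables can be sketched via the usual orthogonal (total least squares) procedure: the fitted line follows the first principal direction of the centered data. Note this is the baseline procedure whose lack of equivariance the paper criticizes, not the proposed modification; the function name is illustrative.

```python
import numpy as np

def orthogonal_fit(x, y):
    """Orthogonal regression line: minimize perpendicular distances,
    treating x and y symmetrically.  The line direction is the first
    principal direction of the centered data."""
    Z = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(Z, full_matrices=False)
    dx, dy = vt[0]                    # leading right singular vector
    slope = dy / dx
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

x = np.array([0., 1., 2., 3., 4.])
y = 2.0 * x + 1.0                     # exact line: slope 2, intercept 1
s, b = orthogonal_fit(x, y)
print(s, b)  # 2.0, 1.0
```

Because the perpendicular-distance criterion depends on the units of x and y, rescaling either variable changes the fitted line in a non-equivariant way; that defect motivates the modification the paper proposes.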

Topics: ERIC Archive, Equations (Mathematics), Estimation (Mathematics), Least Squares Statistics,...

It has been increasingly realized that (1) multivariate methods are essential in most quantitative studies (Fish, 1988; Thompson, 1992), and (2) all conventional parametric analytic methods are correlational and invoke least squares weights (e.g., the beta weights in regression) (Knapp, 1978; Thompson, 1991). The present paper reviews one very popular multivariate analytic method that explicitly invokes weighting to optimize one criterion: the analytic method that researchers have come to call...

Topics: ERIC Archive, Correlation, Least Squares Statistics, Measurement Techniques, Multivariate Analysis,...

Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least squares procedures, it becomes imperative to understand just what the least squares procedure is and how it works. This paper illustrates the least...
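The "line of best fit" mechanics described above reduce to the closed-form normal equations: minimizing the sum of squared vertical deviations from the line b0 + b1·x gives the familiar textbook solutions, sketched here in plain Python.

```python
def line_of_best_fit(x, y):
    """Closed-form least squares line: minimizing sum (y_i - b0 - b1*x_i)^2
    yields the normal-equation solutions for slope b1 and intercept b0."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    b1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    b0 = (sy - b1 * sx) / n                          # intercept
    return b0, b1

print(line_of_best_fit([1, 2, 3, 4], [3, 5, 7, 9]))  # (1.0, 2.0)
```

The example data lie exactly on y = 2x + 1, so the recovered intercept and slope are exact; with scattered data the same formulas give the line minimizing the squared residuals.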

Topics: ERIC Archive, Goodness of Fit, Least Squares Statistics, Matrices, Regression (Statistics)

The last decade has seen accelerating change in Australia in the form of devolution of authority, democratic decision making, school accountability, and central reorganization. This paper presents findings of a study that investigated transformational and transactional conceptualizations of leadership and their usefulness in predicting school outcomes within a context of change and educational reform. Specifically, the study examined the effects of transformational and transactional leadership...

Topics: ERIC Archive, Academic Achievement, Educational Administration, Elementary Education, Foreign...

Tests of mean equality proposed by Alexander and Govern (1994) and Tsakok (1978) were compared to the well-known procedures of Brown and Forsythe (1974), James (1951), and Welch (1951) for their ability to limit the number of Type I errors in one-way designs where the underlying distributions were nonnormal, variances were nonhomogeneous, and group sizes were unequal. These tests were compared when the usual method of least squares was applied to estimate group means and variances and when...

Topics: ERIC Archive, Comparative Analysis, Estimation (Mathematics), Foreign Countries, Least Squares...

When tests contain few items, observed score may not be an accurate reflection of true score, and the Mantel Haenszel (MH) statistic may perform poorly in detecting differential item functioning. Applications of the MH procedure in such situations require an alternate strategy; one such strategy is to include background variables in the matching criterion. Techniques for incorporating external information are presented here that match on a weighted score that combines the observed score and...

Topics: ERIC Archive, Criteria, Evaluation Methods, Grade 3, Identification, Item Bias, Least Squares...

This paper reports preliminary research into the nature of relative expertise in economic problem solving. Specifically, this report seeks to address the question of whether the presence of economic knowledge alone accounts for expertise in economic problem solving or whether both economic knowledge and the development and employment of economic problem solving strategies are necessary prerequisites for acquiring expertise in economic problem solving. The researchers examined literal...

Topics: ERIC Archive, Content Analysis, Correlation, Economic Factors, Economic Research, Economics,...

This paper reports preliminary research into the nature of relative expertise in economic problem solving. The first section briefly describes why such research is needed in the context of research on expert and novice problem solving. It also presents the problem explored in this study in the context of the existing research. Subsequent sections present the methods, results, and conclusions of the study. The researchers examined literal transcripts generated from the "talk-aloud"...

Topics: ERIC Archive, Content Analysis, Correlation, Economic Factors, Economic Research, Economics,...

The information that is gained through various analyses of the residual scores yielded by the least squares regression model is explored. In fact, the most widely used methods for detecting data that do not fit this model are based on an analysis of residual scores. First, graphical methods of residual analysis are discussed, followed by a review of several quantitative approaches. Only the more widely used approaches are discussed. Example data sets are analyzed through the use of the...

Topics: ERIC Archive, Graphs, Identification, Least Squares Statistics, Regression (Statistics), Research...

Some standard-setting methods require judges to estimate the probability that an examinee who just meets an achievement standard will answer each of a set of items correctly. These probability estimates are then used to infer the values on some latent scale that, in theory, determines an examinee's responses. The paper focuses on the procedures used to convert the probability estimates into performance standards. A number of procedures are described that have been traditionally used, including...

Topics: ERIC Archive, Academic Achievement, Achievement Tests, Elementary Secondary Education, Error of...

Five issues relative to the use of different Ordinary Least Squares (OLS) and Hierarchical Linear Modeling (HLM) models to identify effective schools and teachers were examined using data from all students in the Dallas (Texas) public schools in grade 3 in 1994 and grade 4 in 1995. OLS models using first- and second-order interactions produced results that were very close to those produced by two-level HLM models at the school level and two- and three-level HLM models at the teacher level. Most...

Topics: ERIC Archive, Academic Achievement, Bayesian Statistics, Correlation, Effective Schools Research,...

A high-breakdown estimator is a robust statistic that can withstand a large amount of contaminated data. In linear regression, high-breakdown estimators can detect outliers and distinguish between good and bad leverage points. This paper summarizes the case for high-breakdown regression and emphasizes the least quartile difference estimator (LQD) proposed by C. Croux, P. J. Rousseeuw, and O. Hossjer (1994). This regression method examines the absolute differences between every pair of residuals...
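The pairwise-residual-difference idea behind LQD can be sketched directly. This is a crude illustration of the objective only: the criterion below (a low-order quantile of all pairwise absolute residual differences) follows the description above, but the intercept-free model and the grid search over the slope are simplifications; the actual LQD algorithm of Croux, Rousseeuw, and Hossjer is far more efficient.

```python
import numpy as np
from itertools import combinations

def lqd_criterion(residuals, quantile=0.25):
    """LQD-style objective: a low-order quantile of |r_i - r_j| over all
    pairs, so a sizable fraction of wild points cannot inflate it."""
    diffs = [abs(ri - rj) for ri, rj in combinations(residuals, 2)]
    return np.quantile(diffs, quantile)

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)
y[:8] += 50.0                      # 20% gross outliers

# Crude grid search over the slope of a line through the origin.
slopes = np.linspace(1.0, 3.0, 201)
best = min(slopes, key=lambda b: lqd_criterion(y - b * x))
print(best)   # near the true slope 2.0 despite the contamination
```

An ordinary least squares fit to the same data would be dragged toward the outliers; the quantile-of-differences criterion simply ignores the pairs they dominate, which is the high-breakdown property the paper emphasizes.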

Topics: ERIC Archive, Computer Software, Estimation (Mathematics), Least Squares Statistics, Regression...

The question of least-squares weights versus equal weights has been a subject of great interest to researchers for over 60 years. Several researchers have compared the efficiency of equal weights and that of least-squares weights under different conditions. Recently, S. V. Paunonen and R. C. Gardner stressed that the necessary and sufficient condition for equal-weights aggregation is that the predictors satisfy the requirements of psychometric parallelism. In this study, the effect of...
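The equal-weights-versus-least-squares comparison can be sketched as below. The data-generating model is an assumption chosen to make the predictors roughly psychometrically parallel (equal true validities and intercorrelations), the condition Paunonen and Gardner identify; under it the two weighting schemes perform almost identically in-sample.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 500, 3

# Predictors built to be roughly parallel: a common factor plus
# equal-variance unique parts.
f = rng.normal(size=n)
X = 0.7 * f[:, None] + 0.5 * rng.normal(size=(n, k))
y = f + rng.normal(size=n)

# Least-squares weights estimated from the calibration sample.
Xd = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
yhat_ls = Xd @ beta

# Equal (unit) weights: simply sum the predictors.
yhat_eq = X.sum(axis=1)

r_ls = np.corrcoef(y, yhat_ls)[0, 1]
r_eq = np.corrcoef(y, yhat_eq)[0, 1]
print(r_ls, r_eq)   # nearly identical when the predictors are parallel
```

In-sample the least-squares composite can never correlate worse with the criterion, but when parallelism holds the margin is negligible, and on cross-validation samples the equal-weights composite often wins because it has no weights to overfit.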

Topics: ERIC Archive, Correlation, Error of Measurement, Least Squares Statistics, Predictor Variables,...