The papers were presented at the Social Statistics Section, the Government Statistics Section, and the Section on Survey Research Methods. The following papers are included in the Social Statistics Section and Government Statistics Section, "Overcoming the Bureaucratic Paradigm: Memorial Session in Honor of Roger Herriot": "1995 Roger Herriot Award Presentation" (Daniel Kasprzyk, Fritz Scheuren, and Dan Levine); "Space/Time Variations in Survey Estimates" (Leslie...

Topics: ERIC Archive, Elementary Secondary Education, Least Squares Statistics, Longitudinal Studies,...

The Expository Reading and Writing Course (ERWC) was developed by California State University (CSU) faculty and high school educators to improve the academic literacy of high school seniors, thereby reducing the need for students to enroll in remedial English courses upon entering college. This report, produced by Innovation Studies at WestEd, presents the findings of an independent evaluation of the ERWC funded by an Investing in Innovation (i3) development grant from the U.S. Department of...

Topics: ERIC Archive, Expository Writing, Course Evaluation, High School Seniors, Grade 12, Writing...

In research, data sets often occur in which the variance of the distribution of the dependent variable at given levels of the predictors is a function of the values of the predictors. In this situation, the use of weighted least-squares (WLS) techniques is required. Weights suitable for use in a WLS regression analysis must be estimated. A variety of techniques have been proposed for the empirical selection of weights, with the ultimate objective being a better "fit." The outcomes...

Topics: ERIC Archive, Error of Measurement, Estimation (Mathematics), Goodness of Fit, Least Squares...
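
The workflow this abstract describes, fitting by ordinary least squares and then reusing the residuals to estimate weights for a WLS refit, can be sketched in miniature. The data and the inverse-squared-residual weighting rule below are illustrative assumptions, not the paper's own procedures (the abstract notes that many weight-selection techniques exist):

```python
# A minimal weighted least squares (WLS) sketch with empirically
# estimated weights: fit OLS first, then downweight observations
# with large squared residuals.

def wls_fit(x, y, w):
    """Closed-form weighted least squares for a single predictor."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x)))
    return ybar - b * xbar, b          # intercept, slope

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.1, 2.1, 2.8, 4.3, 4.6]

# Step 1: OLS is just WLS with unit weights.
a0, b0 = wls_fit(x, y, [1.0] * len(x))

# Step 2: one common empirical choice -- weight by the inverse of the
# squared OLS residual (capped to avoid division by zero).
resid = [yi - (a0 + b0 * xi) for xi, yi in zip(x, y)]
w = [1.0 / max(r * r, 1e-6) for r in resid]

a1, b1 = wls_fit(x, y, w)
print(round(b0, 3), round(b1, 3))
```

Observations the OLS line already fits well get large weights, so the WLS slope is pulled toward them; with real heteroscedastic data the weights would instead model the variance function.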

An "undesigned" experiment is one in which the predictor variables are correlated, either due to a failure to complete a design or because the investigator was unable to select or control relevant experimental conditions. The traditional method of analyzing this class of experiment--multiple regression analysis based on a least squares criterion--gives rise to a number of interpretation problems when the effects of individual predictors are to be assessed. Some difficulties and their...

Topics: ERIC Archive, Bias, Computer Programs, Correlation, Data Analysis, Experiments, Least Squares...

The 20-item Child Anxiety Scale (CAS) was administered to 343 elementary school children. Unweighted least squares extraction with oblique rotation produced three correlated primary factors that were interpreted as matching factors C, L, and O of the Sixteen Personality Factor Questionnaire for adults. Of particular interest was the factor L pattern, which had not previously been identified in children. Since the items loading most highly on factor L conveyed a sense of being persecuted by other...

Topics: ERIC Archive, Elementary School Students, Bullying, Measures (Individuals), Factor Analysis, Least...

Standardising learning content and teaching approaches is not considered to be the best practice in contemporary education. This approach does not differentiate learners based on their individual abilities and preferences. The present research integrates a pedagogical theory "Universal Design for Learning" ("UDL") with an information system (IS) theory "Technology Acceptance Model" ("TAM"). It aims to examine the effectiveness of a technology-enhanced...

Topics: ERIC Archive, Electronic Learning, Technology Integration, Information Systems, Web Sites, Design,...

Classical statistical methods and the small enrollments in graduate departments have constrained the Graduate Record Examinations (GRE) Validity Study Service to providing only validities for single predictors. Estimates of the validity of two or more predictors, used jointly, are considered too unreliable because the corresponding prediction equations often possess implausible characteristics. This study investigates two statistical methods--empirical Bayes and cluster analysis--to determine...

Topics: ERIC Archive, Bayesian Statistics, College Entrance Examinations, Departments, Grade Point Average,...

A study explored which nonexperimental comparison group methods provide the most accurate estimates of the impacts of mandatory welfare-to-work programs and whether the best methods work well enough to substitute for random assignment experiments. Findings were compared for nonexperimental comparison groups and statistical adjustment procedures with those for experimental control groups from a large-sample, six-state random assignment experiment--the National Evaluation of Welfare-to-Work...

Topics: ERIC Archive, Adult Education, Comparative Analysis, Control Groups, Error of Measurement,...

Dual enrollment programs enable high school students to enroll in college courses and earn college credit. Once limited to high-achieving students, such programs are increasingly seen as a means to support the postsecondary preparation of average-achieving students and students in career and technical education (CTE) programs. This report seeks to answer several questions regarding the effectiveness of dual enrollment programs using statistical methods to examine the impact of dual enrollment...

Topics: ERIC Archive, Postsecondary Education, High School Students, Grade Point Average, Graduation,...

South Africa participated in the Third International Mathematics and Science Study (TIMSS) in 1995 and its repeat in 1999. In 1995, none of the data at the school or teacher level could be analyzed to provide the context for the students' poor achievement in mathematics and science. With the 1999 data now available at both the school and teacher levels in addition to the student level data, this backdrop to the results can be provided. Path analysis, using Partial Least Squares analysis, was conducted...

Topics: ERIC Archive, English, Foreign Countries, International Studies, Language Proficiency, Least...

This paper reports preliminary research into the nature of relative expertise in economic problem solving. Specifically, this report seeks to address the question of whether the presence of economic knowledge alone accounts for expertise in economic problem solving or whether both economic knowledge and the development and employment of economic problem solving strategies are necessary prerequisites for acquiring expertise in economic problem solving. The researchers examined literal...

Topics: ERIC Archive, Content Analysis, Correlation, Economic Factors, Economic Research, Economics,...

Analyzing data that possess some form of nesting is often challenging for applied researchers or district staff who are involved in or in charge of conducting data analyses. This report provides a description of the challenges for analyzing nested data and provides a primer of how multilevel regression modeling may be used to resolve these challenges. An illustration from the companion report, The correlates of academic performance for English language learner students in a New England district...

Topics: ERIC Archive, Multiple Regression Analysis, Statistical Analysis, Data, Models, Hierarchical Linear...

Performance was examined for five cohorts of 1998-2002 Texas public high school graduates through their first year and 1998-2001 cohorts through their fourth year of Texas public higher education. Student performance on college outcomes included (a) first- and fourth-year grade point averages (GPAs), (b) first- and fourth-year credit hours earned, and (c) four-year graduation status. Outcomes were compared across students who varied by three types of AP® (course only, exam only, and both...

Topics: ERIC Archive, Advanced Placement, Academic Achievement, Outcomes of Education, Comparative...

This paper tests the degree of overlap between operational definitions of transformational and transactional leadership, the nature of the relationships between the constructs of transformational and transactional leadership, and specified outcomes in an empirically derived data set by the application of two forms of analysis. Based on Bass's (1985) model, canonical analysis and partial least-squares analysis are applied to derive two path models. The data set was obtained from 1991 Canadian...

Topics: ERIC Archive, Educational Improvement, Elementary Secondary Education, Foreign Countries,...

A key issue in quasi-experimental studies, and in many evaluations that require a treatment-effects design (i.e., a control or experimental group), is selection bias (Shadish et al., 2002). Selection bias refers to the selection of individuals, groups, or data for analysis in such a way that proper randomization is not achieved, ensuring that the sample obtained is not representative of the population intended to be analyzed (Shadish et al., 2002). There are many ways in which selection bias...

Topics: ERIC Archive, Quasiexperimental Design, Probability, Scores, Least Squares Statistics, Regression...

This paper explains in user-friendly terms why multivariate statistics are so important in educational research. The basic logic of canonical correlation analysis is presented as a simple or bivariate Pearson "r" procedure. It is noted that all statistical tests implicitly involve the calculation of least squares weights, and that all parametric tests can be conducted using canonical analysis, since canonical analysis subsumes parametric methods as special cases. Canonical analysis is...

Topics: ERIC Archive, Educational Research, Heuristics, Least Squares Statistics, Multiple Regression...

Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs is also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used to assess whether ridge techniques provide a viable alternative to the more familiar ordinary least squares approach within the collinear...

Topics: ERIC Archive, Algorithms, College Students, Cost Estimates, Full Time Students, Higher Education,...

Maximum likelihood and least-squares estimates of parameters from the logistic regression model are derived from an iteratively reweighted linear regression algorithm. Empirical Bayes estimates are derived using an m-group regression model to regress the within-group estimates toward common values. The m-group regression model assumes that the parameter vectors from "m" groups are independent and identically distributed observations from a multivariate normal "prior"...

Topics: ERIC Archive, Bayesian Statistics, Estimation (Mathematics), Least Squares Statistics, Maximum...
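
The first ingredient this abstract names, logistic regression fit by an iteratively reweighted linear regression (IRLS) algorithm, can be sketched for a single predictor. The toy data are an assumption, and the paper's m-group empirical Bayes step is not reproduced here:

```python
# IRLS for maximum likelihood logistic regression, one predictor.
# Each iteration is a weighted least squares (Newton) update with
# weights p*(1-p).

import math

def irls_logistic(x, y, iters=25):
    a, b = 0.0, 0.0                                  # intercept, slope
    for _ in range(iters):
        p = [1.0 / (1.0 + math.exp(-(a + b * xi))) for xi in x]
        w = [pi * (1.0 - pi) for pi in p]            # IRLS weights
        s0 = sum(w)
        s1 = sum(wi * xi for wi, xi in zip(w, x))
        s2 = sum(wi * xi * xi for wi, xi in zip(w, x))
        g0 = sum(yi - pi for yi, pi in zip(y, p))    # score vector
        g1 = sum((yi - pi) * xi for yi, pi, xi in zip(y, p, x))
        det = s0 * s2 - s1 * s1
        a += (s2 * g0 - s1 * g1) / det               # Newton step via a
        b += (s0 * g1 - s1 * g0) / det               # weighted 2x2 solve
    return a, b

a, b = irls_logistic([0.0, 1.0, 2.0, 3.0, 4.0, 5.0], [0, 0, 1, 0, 1, 1])
print(round(a, 3), round(b, 3))
```

At convergence the score (gradient) terms g0 and g1 vanish, which is the maximum likelihood condition; perfectly separated data would make the slope diverge, so the toy outcomes above are deliberately non-separable.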

The purpose of this paper is to assist researchers, practitioners, and graduate students in identifying and addressing key questions related to the task of choosing among the analytic techniques designed to analyze a dichotomized dependent variable with a set of independent variables. The discussion is limited to (1) the analysis of data by the analytic procedures of ordinary least squares regression, discriminant analysis, or logistic regression; (2) the use of the Statistical Package for the...

Topics: ERIC Archive, Discriminant Analysis, Least Squares Statistics, Regression (Statistics), Research...

An algorithm is presented for the best least-squares fitting correlation matrix approximating a given missing value or improper correlation matrix. The proposed algorithm is based on a solution for C. I. Mosier's oblique Procrustes rotation problem offered by J. M. F. ten Berge and K. Nevels (1977). It is shown that the minimization problem belongs to a certain class of convex programs in optimization theory. A necessary and sufficient condition for a solution to yield the unique global minimum...

Topics: ERIC Archive, Algorithms, Computer Software, Correlation, Estimation (Mathematics), Least Squares...

The purpose of this study was to evaluate the impact of Achieve3000, a differentiated online literacy curriculum, on students' scores on the California State Test (CST). In the 2011-12 school year, 1,957 students in Chula Vista began using Achieve3000's solutions in 3rd through 8th grade. Using a form of propensity score matching called Inverse Probability-of-Treatment Weighting (IPTW), the researchers assigned weights for the likelihood that students in the non-user comparison group (N =...

Topics: ERIC Archive, Reading Programs, Program Effectiveness, Quasiexperimental Design, Elementary School...
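
The IPTW mechanics the abstract names can be shown with a toy calculation. The records and propensity scores below are hand-set for illustration only; in the study they would be estimated (e.g., by logistic regression on pretreatment covariates):

```python
# Inverse Probability-of-Treatment Weighting (IPTW) sketch: weight
# treated units by 1/p and comparison units by 1/(1-p), where p is the
# propensity score, so the reweighted groups resemble each other on
# pretreatment covariates.

records = [
    # (treated?, propensity score p, outcome)
    (1, 0.8, 10.0), (1, 0.6, 9.0), (1, 0.4, 8.5),
    (0, 0.8, 9.5), (0, 0.6, 8.0), (0, 0.4, 7.0), (0, 0.2, 6.5),
]

def iptw_means(records):
    tw = sum(1 / p for t, p, _ in records if t == 1)
    ty = sum(y / p for t, p, y in records if t == 1)
    cw = sum(1 / (1 - p) for t, p, _ in records if t == 0)
    cy = sum(y / (1 - p) for t, p, y in records if t == 0)
    return ty / tw, cy / cw

treated_mean, comparison_mean = iptw_means(records)
effect = treated_mean - comparison_mean
print(round(treated_mean, 2), round(comparison_mean, 2))
```

The weighted difference in means then serves as the quasi-experimental impact estimate, on the assumption that the propensity model captures all relevant selection into treatment.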

Meta-analytic methods were used to summarize results of Monte Carlo (MC) studies investigating the robustness of various statistical procedures for testing within-subjects effects in split-plot repeated measures designs. Through a literature review, accessible MC studies were identified, and characteristics (simulation factors) and outcomes (rates of Type I error) of each MC study were coded for univariate, df-adjusted univariate, and multivariate test procedures. Results of weighted least...

Topics: ERIC Archive, Computer Simulation, Foreign Countries, Interaction, Least Squares Statistics,...

The last decade has seen accelerating change in Australia in the form of devolution of authority, democratic decision making, school accountability, and central reorganization. This paper presents findings of a study that investigated transformational and transactional conceptualizations of leadership and their usefulness in predicting school outcomes within a context of change and educational reform. Specifically, the study examined the effects of transformational and transactional leadership...

Topics: ERIC Archive, Academic Achievement, Educational Administration, Elementary Education, Foreign...

This report addresses the problem of sample size in developing prediction equations for college freshman grade averages under the American College Testing (ACT) Assessment Program. For the ACT Assessment, the prediction weights are estimated by standard least squares procedures. Because prediction weights are estimated regression coefficients whose accuracy depends on the size of the base sample used to estimate them and because error in estimating the weights propagates error in prediction,...

Topics: ERIC Archive, College Entrance Examinations, College Freshmen, Grade Point Average, Grade...

Tests of mean equality proposed by Alexander and Govern (1994) and Tsakok (1978) were compared to the well-known procedures of Brown and Forsythe (1974), James (1951), and Welch (1951) for their ability to limit the number of Type I errors in one-way designs where the underlying distributions were nonnormal, variances were nonhomogeneous, and group sizes were unequal. These tests were compared when the usual method of least squares was applied to estimate group means and variances and when...

Topics: ERIC Archive, Comparative Analysis, Estimation (Mathematics), Foreign Countries, Least Squares...

It is well established that workers with more years of education earn higher wages. By establishing a reference or "required" level of education for a worker's occupation, it is possible to decompose an individual's actual level of education into years of required education and years of over-education or under-education relative to that occupational norm. A richer picture of wage determination can be gained by substituting these three terms for actual education in the standard Mincer...

Topics: ERIC Archive, Educational Attainment, Foreign Countries, Wages, Least Squares Statistics, Labor...

Least squares and Bayes methods were used in a cross validation study conducted for comparison purposes. The study applies to situations with the following conditions: predictor data are given on the same scales; criterion data may be given on different scales; and it is necessary to pool data even though criterion scale differences exist. Such a system may be needed for minority group prediction studies or graduate school prediction studies where the group sizes are small. Data for the study...

Topics: ERIC Archive, Bayesian Statistics, College Entrance Examinations, Grade Prediction, Graduate Study,...

Six editions of Scholastic Aptitude Test-Mathematical (SAT-M) were factor analyzed using confirmatory and exploratory methods. Confirmatory factor analyses (using the LISREL VI program) were conducted on correlation matrices among item parcels--sums of scores on a small subset of items. Item parcels were constructed to yield correlation matrices amenable to linear factor analyses. The items constituting a parcel measured the same dimension, and parcels measuring the same construct were parallel...

Topics: ERIC Archive, College Entrance Examinations, Factor Analysis, Factor Structure, Guessing (Tests),...

Homoscedasticity is an important assumption of linear regression. This paper explains what it is and why it is important to the researcher. Graphical and mathematical methods for testing the homoscedasticity assumption are demonstrated. Sources and types of heteroscedasticity are discussed, and methods for correction are demonstrated. Graphs are used to illustrate different patterns that may be caused by heteroscedasticity. An extensive example for using Weighted Least Squares...

Topics: ERIC Archive, Graphs, Least Squares Statistics, Regression (Statistics), Thompson, Russel L.
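
One simple mathematical check of the kind this abstract alludes to is a Goldfeld-Quandt-style variance ratio: fit OLS, then compare residual variance in the low-x and high-x halves of the sample. The toy data below are an illustrative assumption, built to be heteroscedastic by design, and are not drawn from the paper:

```python
# Crude Goldfeld-Quandt-style check: a residual-variance ratio far
# from 1 between sample halves hints that homoscedasticity fails.

def ols(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    return ybar - b * xbar, b

# Error spread grows with x, so these data are heteroscedastic.
x = [float(i) for i in range(1, 11)]
y = [2 * xi + 0.1 * xi * (1 if i % 2 == 0 else -1)
     for i, xi in enumerate(x)]

a, b = ols(x, y)
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
low, high = resid[:5], resid[5:]
ratio = (sum(r * r for r in high) / 5) / (sum(r * r for r in low) / 5)
print(round(ratio, 2))
```

A WLS refit, as in the paper's extensive example, would then use weights inversely proportional to the modeled variance.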

This study explores the relationship between college graduation rates and student participation and success in Advanced Placement (AP) courses and exams. We reviewed three approaches to examining this relationship: 1) comparing the college graduation rates of AP and non-AP students; 2) comparing the college graduation rate of AP and non-AP students after controlling for students' demographics and prior achievement and the demographics of their high schools; and 3) examining the relationship...

Topics: ERIC Archive, Advanced Placement, Graduation Rate, Student Participation, Academic Achievement,...

Education is essential to the development of the individual. Educational institutions take many forms, including private and public schools, technical institutes, and madrasas (religious institutions). Every such institution rests on three main pillars: teachers, students, and curriculum. In Pakistan, as in much of the world, schools are of two main types, public and private. Nowadays private schools are becoming more...

Topics: ERIC Archive, Foreign Countries, Public Schools, Private Schools, Comparative Analysis, Case...

The information that is gained through various analyses of the residual scores yielded by the least squares regression model is explored. In fact, the most widely used methods for detecting data that do not fit this model are based on an analysis of residual scores. First, graphical methods of residual analysis are discussed, followed by a review of several quantitative approaches. Only the more widely used approaches are discussed. Example data sets are analyzed through the use of the...

Topics: ERIC Archive, Graphs, Identification, Least Squares Statistics, Regression (Statistics), Research...

When tests contain few items, observed score may not be an accurate reflection of true score, and the Mantel Haenszel (MH) statistic may perform poorly in detecting differential item functioning. Applications of the MH procedure in such situations require an alternate strategy; one such strategy is to include background variables in the matching criterion. Techniques for incorporating external information are presented here that match on a weighted score that combines the observed score and...

Topics: ERIC Archive, Criteria, Evaluation Methods, Grade 3, Identification, Item Bias, Least Squares...

The significant growth of charter schools in the United States has brought praise for the excellent results achieved by some schools as well as criticism that charter schools may not be serving the most disadvantaged students. Critics of charter schools, in New York City and elsewhere, commonly assert that charters' (often) strong academic performance derives primarily from the type of student educated, rather than the quality of schooling provided. In particular, many charter school opponents...

Topics: ERIC Archive, Urban Schools, Charter Schools, Low Achievement, Longitudinal Studies, Enrollment,...

The conceptualization of analysis of covariance (ANCOVA), as an analysis of variance (ANOVA) on the residual scores that are obtained when the dependent variable is regressed on the covariate, is mathematically incorrect. If residuals are obtained from the pooled within-groups regression coefficient, ANOVA on the residuals results in an inflated alpha-level. If the regression coefficient for the total sample combined into one group is used, ANOVA on the residuals yields an inappropriately...

Topics: ERIC Archive, Analysis of Covariance, Analysis of Variance, Least Squares Statistics, Mathematical...

When looking at the relationship between individual earnings and schooling, there are potential sources of bias which arise due to individual education choices; individuals of higher unobserved ability or with higher unobserved payoffs from schooling may for instance invest more in education. This paper reviews alternative models and estimation methods meant to overcome these sources of bias and to thus recover the true causal effect of education on earnings. As to the specification of the...

Topics: ERIC Archive, Education Work Relationship, Income, Outcomes of Education, Models, Foreign...

This study examined the effect of type of correlation matrix on the robustness of LISREL maximum likelihood and unweighted least squares structural parameter estimates for models with categorical manifest variables. Two types of correlation matrices were analyzed; one containing Pearson product-moment correlations and one containing tetrachoric, polyserial, and product-moment correlations as appropriate. Using continuous variables generated according to the equations defining the population...

Topics: ERIC Archive, Computer Software, Correlation, Estimation (Mathematics), Goodness of Fit, Hypothesis...

It is argued that analysis of variance (ANOVA) and related methods should be taught using a general linear model (GLM) approach, rather than a classical ordinary sums of squares approach. The GLM approach emphasizes the linkages among conventional parametric methods, emphasizing that all classical parametric methods are least squares procedures that implicitly or explicitly use weights, focus on latent synthetic variables, and yield effect sizes analogous to "r" squared (are...

Topics: ERIC Archive, Analysis of Variance, Correlation, Effect Size, Higher Education, Introductory...
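
The GLM linkage this abstract argues for can be made concrete with a toy example: a two-group ANOVA is an OLS regression on a 0/1 dummy variable, so the regression R squared equals the ANOVA eta squared (SS between over SS total). The data are illustrative only:

```python
# ANOVA as regression: eta squared from the ANOVA decomposition equals
# R squared from OLS on a dummy-coded group variable.

group_a = [1.0, 2.0, 3.0]
group_b = [3.0, 4.0, 5.0]

# ANOVA view: eta squared = SS_between / SS_total.
grand = sum(group_a + group_b) / 6
ss_total = sum((v - grand) ** 2 for v in group_a + group_b)
ss_between = (3 * (sum(group_a) / 3 - grand) ** 2
              + 3 * (sum(group_b) / 3 - grand) ** 2)
eta_sq = ss_between / ss_total

# Regression view: dummy-code membership (0 = group A, 1 = group B).
x = [0.0] * 3 + [1.0] * 3
y = group_a + group_b
xbar, ybar = sum(x) / 6, sum(y) / 6
b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
     / sum((xi - xbar) ** 2 for xi in x))
a = ybar - b * xbar
ss_resid = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
r_sq = 1 - ss_resid / ss_total

print(round(eta_sq, 3), round(r_sq, 3))
```

The regression slope here is exactly the difference between group means, which is the sense in which classical parametric tests are least squares procedures with implicit weights.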

Whenever one uses ordinary least squares regression, one is making an implicit assumption that all of the independent variables have been measured without error. Such an assumption is obviously unrealistic for most social data. One approach for estimating such regression models is to measure implied coefficients between latent variables for which one has multiple manifest indicators. The problem with this approach is that overidentified models yield multiple estimates of the associations among...

Topics: ERIC Archive, Computer Programs, Factor Analysis, Least Squares Statistics, Mathematical Models,...

Traditionally, the errors-in-variables problem is concerned with the point estimation of the slope of the true scores regression line when the regressor is measured with error, and when no specification error is present. In this paper, the errors-in-variables problem is extended to include specification error. Least squares procedures provide a biased estimator of the slope of the true scores regression line. Further, the maximum likelihood estimates of the slope (which are consistent) exist...

Topics: ERIC Archive, Computer Simulation, Equations (Mathematics), Error of Measurement, Graphs, Least...
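
The bias this abstract describes, attenuation of the least squares slope when the regressor is measured with error, can be seen in a tiny deterministic sketch. The data are an illustrative assumption; the paper's maximum likelihood treatment and its specification-error extension are not reproduced here:

```python
# Errors-in-variables attenuation: regressing y on an error-contaminated
# regressor biases the OLS slope toward zero. True slope is exactly 2.

def slope(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    return (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
            / sum((xi - xbar) ** 2 for xi in x))

true_x = [float(i) for i in range(1, 11)]
y = [2 * xi for xi in true_x]            # exact linear relation

# Observed regressor = true value + alternating measurement error.
obs_x = [xi + (1.0 if i % 2 == 0 else -1.0)
         for i, xi in enumerate(true_x)]

print(slope(true_x, y), round(slope(obs_x, y), 3))  # attenuated slope < 2
```

The shrinkage factor is governed by the share of error variance in the observed regressor, which is why consistent estimation requires the measurement-error structure to be modeled explicitly.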

This study illustrates the use of three least-squares models to control for rater effects in performance evaluation: (1) ordinary least squares (OLS); (2) weighted least squares (WLS); and (3) OLS subsequent to applying a logistic transformation to observed ratings (LOG-OLS). The three models were applied to ratings obtained from four administrations of an oral examination required for certification in a medical specialty. For any single administration, there were 40 raters and approximately...

Topics: ERIC Archive, Evaluators, Higher Education, Interrater Reliability, Least Squares Statistics,...

Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies (Imbens and Rubin, 2015; Schochet, 2015, 2016). The estimators are derived using the building blocks of experimental designs with minimal assumptions, and are unbiased and normally distributed in large samples with simple variance estimators. The methods apply to randomized controlled trials (RCTs) and quasi-experimental designs (QEDs) with comparison...

Topics: ERIC Archive, Design, Randomized Controlled Trials, Quasiexperimental Design, Research Methodology,...

Today, college quality rankings in news magazines and guidebooks are a big business with tangible impacts on the operation of higher education institutions. The college rankings published annually by "U.S. News and World Report" ("U.S. News") are so influential that Don Hossler of Indiana University derisively claims that higher education is the victim of "management" by the magazine. How did academic quality rankings of colleges and universities become so powerful...

Topics: ERIC Archive, Higher Education, Educational Quality, Criticism, Reputation, Institutional...

The sizable gender gap in college enrolment, especially among African Americans, constitutes a puzzling empirical regularity that may have serious consequences for marriage markets, male labor force participation, and the diversity of college campuses. For instance, only 35.7 percent of all African American undergraduate students were men in 2004. Reduced form results show that, while family background covariates cannot account for the observed gap, proxy measures for non-cognitive skills are...

Topics: ERIC Archive, Racial Differences, Gender Differences, Enrollment Rate, College Bound Students,...

A high-breakdown estimator is a robust statistic that can withstand a large amount of contaminated data. In linear regression, high-breakdown estimators can detect outliers and distinguish between good and bad leverage points. This paper summarizes the case for high-breakdown regression and emphasizes the least quartile difference estimator (LQD) proposed by C. Croux, P. J. Rousseeuw, and O. Hossjer (1994). This regression method examines the absolute differences between every pair of residuals...

Topics: ERIC Archive, Computer Software, Estimation (Mathematics), Least Squares Statistics, Regression...
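
The pairwise-residual idea the abstract describes can be sketched by brute force: score a candidate slope by a low quartile of the absolute differences between every pair of residuals, then pick the best-scoring slope. The grid search and toy data below are illustrative assumptions; real LQD software uses far more sophisticated search strategies:

```python
# Brute-force least quartile difference (LQD) sketch. Because the
# objective uses differences of residuals, the intercept cancels, so
# LQD scores the slope alone (the intercept is estimated separately).

from itertools import combinations

def lqd_score(slope, x, y):
    resid = [yi - slope * xi for xi, yi in zip(x, y)]
    diffs = sorted(abs(a - b) for a, b in combinations(resid, 2))
    return diffs[len(diffs) // 4]        # roughly the first quartile

# y = 2x with two gross outliers that would wreck ordinary least squares.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.0, 4.0, 6.0, 8.0, 50.0, -30.0]

best = min((lqd_score(s / 10, x, y), s / 10) for s in range(-50, 51))
print(best[1])   # -> 2.0
```

The quartile criterion ignores the large residual differences the outliers create, which is the sense in which the estimator has a high breakdown point.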

Policymakers and practitioners frequently use teacher surveys to inform decisions on school improvement efforts in low-achieving schools. There is little empirical evidence on how the results of these surveys relate to student outcomes. This study provides information on how perception data from a teacher survey in Idaho is correlated with three student outcomes: reading proficiency, math proficiency, and attendance. The Idaho State Department of Education uses the Educational Effectiveness...

Topics: ERIC Archive, School Effectiveness, Outcomes of Education, Reading Achievement, Mathematics...

This report provides empirical results of attempts to achieve consistency of estimates between two National Center for Education Statistics (NCES) surveys. These surveys are the 1991-92 Private School Survey (PSS) and the Private School Component of the 1990-91 Schools and Staffing Survey (SASS). Consistency was sought in the numbers of schools, teachers, and students from these two sources. Comparisons are made among statistical and computational procedures that might serve to bring about the...

Topics: ERIC Archive, Classification, Elementary Secondary Education, Estimation (Mathematics), Least...

Homogeneity analysis, or multiple correspondence analysis, is usually applied to k separate variables. In this paper, it is applied to sets of variables by using sums within sets. The resulting technique is referred to as OVERALS. It uses the notion of optimal scaling, with transformations that can be multiple or single. The single transformations consist of three types: (1) nominal; (2) ordinal; and (3) numerical. The corresponding OVERALS computer program minimizes a least squares loss...

Topics: ERIC Archive, Algorithms, Computer Software, Least Squares Statistics, Linear Programing,...

Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least square procedures, it becomes imperative to understand just what the least squares procedure is and how it works. This paper illustrates the least...

Topics: ERIC Archive, Goodness of Fit, Least Squares Statistics, Matrices, Regression (Statistics)
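
For the straight-line case this abstract discusses, the least squares procedure reduces to solving the normal equations (X'X)b = X'y, where the columns of the design matrix X are a constant and the predictor. A minimal sketch with the 2x2 system solved by hand (toy data, purely illustrative):

```python
# Line of best fit via the normal equations (X'X) b = X'y, with the
# 2x2 system solved by Cramer's rule.

x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 5.0, 8.0]
n = len(x)

sx, sxx = sum(x), sum(xi * xi for xi in x)
sy, sxy = sum(y), sum(xi * yi for xi, yi in zip(x, y))

# Normal equations:  [n  sx ] [b0]   [sy ]
#                    [sx sxx] [b1] = [sxy]
det = n * sxx - sx * sx
b0 = (sxx * sy - sx * sxy) / det     # intercept
b1 = (n * sxy - sx * sy) / det       # slope
print(b0, b1)                        # -> 0.0 1.9
```

The same matrix form generalizes unchanged to any number of predictors, which is why the normal equations underlie all the classical parametric methods mentioned above.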

Aitkin's generalized least squares (GLS) principle, with the inverse of the observed variance-covariance matrix as a weight matrix, is applied to estimate the factor analysis model in the exploratory (unrestricted) case. It is shown that the GLS estimates are scale free and asymptotically efficient. The estimates are computed by a rapidly converging Newton-Raphson procedure. A new technique is used to deal with Heywood cases effectively. (Author)

Topics: ERIC Archive, Correlation, Factor Analysis, Factor Structure, Goodness of Fit, Least Squares...