DOCUMENT RESUME

ED 157 148                                    Bi 010 683

AUTHOR        Bertram, Charles L.; Sonis, Pamela B.
TITLE         An Examination of the Relationships Between Research
              Methodologies and Study Characteristics of 57
              Selected Needs Sensing Activities.
PUB DATE      28 Mar 78
NOTE          33p.; Paper presented at the Annual Meeting of the
              American Educational Research Association (Toronto,
              Ontario, March 27-31, 1978)

EDRS PRICE    MF-$0.83 HC-$2.06 Plus Postage.
DESCRIPTORS   Bibliographies; Elementary Secondary Education; Needs
              Assessment; *Research Methodology
ABSTRACT
A call for the results of needs assessment studies conducted in fourteen states in or near the Appalachian Region brought over fifty responses. Analysis of these responses showed that more needs assessment is carried out by bureaus or divisions of research, planning, and evaluation than by central administration or other programmatic divisions. As the intent of most studies was to determine the current status of educational needs, tests and questionnaires served as the prevailing methodologies. Tests were most frequently used to assess needs in specific subject areas, while questionnaires served in studies of a more general nature. This report classifies the needs assessment studies discussed by type of agency, type of office, purpose, length of report, coverage, intended user, and assessment procedure. (Author/PGD)
***********************************************************************
*    Reproductions supplied by EDRS are the best that can be made     *
*                  from the original document.                        *
***********************************************************************
U.S. DEPARTMENT OF HEALTH, EDUCATION & WELFARE
NATIONAL INSTITUTE OF EDUCATION

THIS DOCUMENT HAS BEEN REPRODUCED EXACTLY AS RECEIVED FROM THE PERSON OR ORGANIZATION ORIGINATING IT. POINTS OF VIEW OR OPINIONS STATED DO NOT NECESSARILY REPRESENT OFFICIAL NATIONAL INSTITUTE OF EDUCATION POSITION OR POLICY.
Session 10.04

AN EXAMINATION OF THE RELATIONSHIPS BETWEEN RESEARCH
METHODOLOGIES AND STUDY CHARACTERISTICS OF
57 SELECTED NEEDS SENSING ACTIVITIES

by

Charles L. Bertram
Pamela B. Sonis

Appalachia Educational Laboratory
TO THE EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC) AND USERS OF THE ERIC SYSTEM
A paper presented at the 1978 Annual Meeting of the American Educational Research Association (AERA), Toronto, Canada, March 28, 1978.
TABLE OF CONTENTS

Introduction
Background
Purpose
Methodology
Analysis
    Explanation of Categories
    Limitations of Study
Presentation of Results
    Frequency of Methodology Usage
    Comparison of Methodology with Other Study Characteristics
Generalizations
Recommendations
Bibliography
Appendices
    A: Sample Letter and List of States
    B: Classification Form
Introduction

This paper presents an analysis of more than 50 reports of educational needs or educational goals obtained by the Appalachia Educational Laboratory during a 1977 needs assessment activity. When the reports were analyzed to determine commonality of identified needs or goals (the purpose for which they were originally obtained), staff was intrigued by the variety of techniques used to obtain and report the data. Therefore, as a separate activity, the authors have reviewed the documents with the notion of identifying relationships between the variety of research methodologies used and other characteristics of the reports. The results of that review are reported here.
Background

Staff of AEL has been periodically surveying the needs of the Appalachian Region through one means or another since the inception of the Laboratory in 1966. For example, interviews were conducted in 1967, a questionnaire was distributed in 1971, a convergence technique was used in 1973, and the expert opinions of Chief State School Officers of the member-state Region were solicited in 1976.

In late 1976, while preparing the AEL Plan for 1978-82, staff decided to collect documents reporting needs studies and goal statements from a variety of sources. Letters were sent to selected personnel in each of 14 Mideastern and Southeastern states. The request was for studies which might represent the educational needs of the states, such as ESEA Title I needs studies, vocational education studies, or other similar reports. Research Triangle Institute (RTI), Southern Regional Education Board (SREB), and Educational Testing Service (ETS) were also asked to send reports. Appendix A contains a copy of the letter and the list of states.
The Laboratory's purpose for requesting the studies was to analyze them to determine commonalities of regional needs. The resulting analysis (Miles and Bertram, 1977) provided AEL with a starting point for informed discussion relevant to the Laboratory's long-range institutional planning. Most of the state education departments and the organizations returned one or more documents. The reports received were representative of numerous levels of detail. These included, for example, a budget document for a state board of education, a comprehensive multi-faceted survey of the educational needs of the state, a five-page statement of educational goals, and reports of studies using the National Assessment of Educational Progress.

As the analysis of needs progressed, Laboratory staff became impressed with the variance of methodologies and reporting styles used with the studies. Some were highly sophisticated, using probability sampling and advanced instrument construction techniques; others were reviews of local assessments, which were combined into a narrative statement of that state's educational needs.
Purpose

Beyond the immediate intention of determining relationships between methodologies and other report characteristics, there are other reasons for examining the studies. Personnel who perform needs sensing activities at the state and local levels could benefit from an awareness of the methodologies used by their colleagues. The analysis described here may indicate that certain methodologies help attain some purposes better than others. The results of this current activity also may serve as a starting point for tracing how educational decisions get made, how data are or are not used to support those decisions, and how those decisions might or might not be improved through more acceptance of data-based reasoning.
Methodology

The sample consisted of 57 documents supplied by (a) state education agencies; (b) RTI, SREB, and ETS; and (c) educational laboratories, including AEL. Needs assessment reports prepared by other regional educational laboratories are regularly sent to AEL. These 57 reports are indexed in the bibliography.
Analysis

The reports were reviewed to determine suitable categories for comparison. The categories were applied to 15 reports and were then revised into the form included as Appendix B. Each report was analyzed using this form.

The form included two broad categories for comparing the reports. One, called "classification of study," indicated who conducted the study, its style of reporting, and its intended audience, through the subcategories: agency conducting the study, division which prepared the study, purpose, length, coverage, and intended user of information. The second broad category, "assessment procedures used in study," identified the method used to collect the data. The subcategories included: testing procedures, survey techniques, manpower analysis, analysis of existing data, review of existing reports, and multiple studies. A study could be placed in more than one of the first set of classification categories, but only one primary assessment procedure was selected in order to simplify the comparison.
The reports were independently rated by two AEL staff members, one an experienced researcher and the other a communications specialist. Complete agreement between the two researchers was found on 87.3 percent of the categories in the 57 forms. There were no differences in rating on 24 forms. Examples of differences were that usually one team member felt an additional purpose was served, or, at times, they agreed that neither had initially checked the appropriate response. Differences in rating were examined and discussed until consensus could be reached.
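The percent agreement reported above is straightforward to compute: it is the share of category judgments on which the two raters matched. A minimal sketch follows; the ratings used here are invented for illustration, since the actual rating forms are not reproduced in this paper.

```python
# Percent agreement between two raters over the same sequence of
# category judgments. The example ratings below are invented; the
# paper reports 87.3 percent agreement across the 57 forms.
def percent_agreement(rater_a, rater_b):
    if len(rater_a) != len(rater_b):
        raise ValueError("rating lists must be the same length")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return 100.0 * matches / len(rater_a)

if __name__ == "__main__":
    a = ["SEA", "regional", "questionnaire", "status", "open"]
    b = ["SEA", "private",  "questionnaire", "status", "open"]
    # Four of the five judgments match.
    print(round(percent_agreement(a, b), 1))
```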
A matrix was prepared with assessment procedures on the horizontal axis and the classification of reports on the vertical axis. Tallies were entered according to the data on the consensual rating form. The matrix served as the basis for making generalizations.
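The tally matrix can be sketched as a simple cross-tabulation: each consensual rating form contributes one tally at the intersection of a classification value and the study's single primary assessment procedure. The records below are hypothetical stand-ins for the rating forms.

```python
from collections import defaultdict

# Build a tally matrix: classification values as rows, primary
# assessment procedures as columns. These records are hypothetical
# stand-ins for the consensual rating forms described in the paper.
records = [
    {"agency": "SEA",      "procedure": "standardized testing"},
    {"agency": "SEA",      "procedure": "questionnaire"},
    {"agency": "regional", "procedure": "questionnaire"},
    {"agency": "private",  "procedure": "interview"},
]

matrix = defaultdict(int)
for rec in records:
    matrix[(rec["agency"], rec["procedure"])] += 1

# One SEA study used a questionnaire as its primary procedure.
print(matrix[("SEA", "questionnaire")])
```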
Explanation of Categories

The categories into which the reports were placed were somewhat arbitrary. Assignment to the categories was entirely judgmental; however, there was substantial agreement between the two researchers who reviewed them independently. Following is a description of the specific categories of the rating form, included as Appendix B. These categories were chosen after a preliminary review of the documents.
"Agency conducting the study" was identified either as federal, regional, SEA, private, or state education association. The "federal" designation was used for the National Assessment of Educational Progress, while a private agency referred to a private corporation, such as ETS. Regional Educational Laboratories were classified as regional rather than private, since they serve a designated number of states.

The researchers thought that there might be a relationship between the "office which prepared the study" and selected methodologies. Many SEA reports were prepared by research, planning, and/or evaluation bureaus or divisions. Some were prepared by the communications or public information divisions. A third subcategory was programmatic division, such as curriculum or federal programs. Some reports were prepared by the administrative division.
A variety of subcategories was identified as possible purposes for the reports. Some were obviously designed to mold opinions, others to sample existing opinions. Some studies seemed to be designed to verify or certify that pre-determined needs did exist, while others were designed to determine the present status of a topic, such as educational attainment. Some were designed to collect data so that one program or curriculum could be selected or given emphasis over another, and several were designed to support formal proposals, usually for federal funds. Of course, many studies served more than one purpose. The one category for which there was considerable certainty was "number of pages"; they were counted.
There were three categories of "coverage": geographical, content, and age of referent. As for geographical, the studies were designed to determine either local, state, regional, or national educational problems. In some cases, studies conducted locally were compared with national norms. "Content" was broken into three sub-classifications: subject area, educational resources, and open. Studies of administrative structure were classified under educational resources. For example, a study of desegregation procedures and resulting needs was placed in this category. The subcategory "open" was used to indicate studies or reports for which there was no apparent limit placed on the area of focus.
The third type of coverage, age of referent, was determined by the age or grade level which was the focus of the study. Subcategories were preschool, elementary, secondary, adult, and open.

The final classification category, which incidentally turned out to be quite subjective, was "intended user of information." Some reports were apparently intended for the general public, some for professional educators as a group, and some exclusively for SEA personnel. One or two were prepared for a board of education or the federal government. Those prepared by Regional Educational Laboratories were for their use. Many apparently were intended for more than one audience.
The second page of the rating form was used to categorize the primary assessment procedure used in the study. For those which employed testing, some used standardized tests and others used curriculum-specific, objective-referenced, or criterion-referenced tests. As an example of the latter, several states used items from the National Assessment of Educational Progress in order to identify state or local needs.
Some assessments were based on completion of standard forms, such as those which report the number of Title I children served. Other studies employed questionnaires of varying levels of sophistication. Several questionnaires included a Likert scale, and some were based on systematically selected samples. A few of the reports were based either on interview data or on the results of a public opinion poll. One report was based on a sequence of what best can be described as "town meetings," where the public was invited and educational issues were discussed.
A few studies used some variation of a Delphi technique, where the respondents' subsequent replies were based on the influence of information from other respondents. Some used a convergence technique, or a process of group determination of a need through panel participation. Some studies were based on expert testimony from such persons as Chief State School Officers, who were in a position to speak for a large population.
The manpower category represented a very different type of study. Manpower studies involve assessments of the supply of personnel in a specified job market compared with present and presumed future demand for services of those personnel.

Other categories of methodology were analyses of existing data, in which previously collected data were reanalyzed to indicate educational needs, or review of existing reports. A final category was a series of multiple studies.
Limitations of Study

The study has several limitations, some apparent and some perhaps not. One obvious one is that the study is highly subjective. Validation of the "instrument" would therefore be difficult.

A second problem is that the categories are not all-inclusive and perhaps are not the best selection. Hopefully, the form will be improved through critique and interaction among colleagues.

A third limitation is that the design of the form has no conceptual model to tie it together; it is simply based on a review of available documents. Perhaps a more unified model can be developed as relationships between the variables become more apparent.

A fourth possible problem is that the studies which were analyzed may not have been representative of those conducted in the included states. The personnel may have misunderstood the AEL communication, or it may not have been sent to the appropriate persons in the state education agencies and other organizations.

In spite of these limitations, the study reviews a substantial number of current needs assessment documents, and the results, however provisional, are described in the following section.
Presentation of Results

All 57 studies were classified as described previously. For some studies, certain classification categories were not appropriate or did not discriminate sufficiently to permit coding of items. For example, the office within the agency which conducted the study was not always discernible by the researchers.
Frequency of Methodology Usage

A preliminary concern of the present study was the frequency with which various methodologies were used to support statements of educational need or of educational goals. As indicated in Table 1, some type of questionnaire was used in more than one-fourth of the studies examined by the AEL research team. As described previously, the questionnaires ranged from a highly structured scale with sophisticated techniques for analysis to simple forms designed to elicit specific information. Generally, there was little indication within the report concerning the processes for testing and validating questionnaires before they were used to collect data.

Some type of testing, either standardized or criterion referenced, was used by less than one-third of the studies as a primary methodology. It is also of interest that about the same number of reports depended on some form of criterion referenced testing or some form of standardized testing.

The fourth favored methodology was analysis of existing data, with seven reports depending heavily on this method. This method would appear to be least costly of the various methodologies used in the studies, although the planning specialist is dependent on the conceptualization used by those designing the original data collection.

The other methodologies were found to have been used in either one, two, or three studies, as indicated in Table 1. In light of the vast expenditure for educational personnel, the fact that so few studies employ manpower analysis is somewhat surprising. The town meeting procedure might be examined further as a way of involving the public in education decision-making. The infrequent use of public opinion polls by educators might also bear further study. The final listing, multiple studies, is deceiving, since most of the studies examined did use a multiplicity of research methods. In most cases, an attempt was made to determine the primary methodology, and in only one case was that not possible.
Table 1

Frequency of Assessment Procedures
Used as Primary Methodologies

Methodology                    Number    Percent
Questionnaire                    15        26.3
Standardized Testing              9        15.8
Criterion Testing                 8        14.0
Analysis of Existing Data         7        12.3
Review of Existing Reports        3         5.3
Manpower Study                    2         3.5
Public Opinion Poll               2         3.5
Delphi                            2         3.5
Convergence Technique             2         3.5
Expert Testimony                  2         3.5
Forms Completion                  1         1.8
Interview                         1         1.8
Town Meeting                      1         1.8
Multiple Studies                  1         1.8
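The percent column of Table 1 is simple arithmetic over the 57 reports: each methodology's count divided by the sample size. A brief sketch, using a subset of the table's counts:

```python
# Convert raw frequency counts into the percent column of a
# frequency table like Table 1. The counts dict holds a subset of
# the table's values, used here for illustration.
counts = {"questionnaire": 15, "standardized testing": 9, "interview": 1}
total_reports = 57

for method, n in counts.items():
    pct = round(100.0 * n / total_reports, 1)
    # e.g. questionnaire: 15 of 57 reports is 26.3 percent
    print(f"{method}: {n} ({pct}%)")
```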
Comparison of Methodology with Other Study Characteristics

Table 2 is the matrix which compares the classification of studies against the various methodologies used. By way of interpretation, the first column indicates that eight studies completed by SEAs used standardized testing as a primary methodology. The next column indicates that six studies completed by SEAs used some form of criterion referenced test as a primary methodology.

Table 2

Frequency of Types of Studies According to Methodologies Employed

Methodologies (columns): standardized testing, curriculum-specific testing, forms completion, questionnaire, interview, public opinion poll, town meeting, Delphi, convergence technique, expert testimony, manpower study, analysis of existing data, review of existing reports, multiple studies.

Study classifications (rows): agency conducting study (federal, regional, SEA, private, state education association); office preparing study (research, planning, and evaluation division; public information division; programmatic division; administrative division); purpose (opinion setting, opinion sampling, status determination, program emphasis, support of proposals); length of report in pages (1-5, 6-15, 16-50, 50+); coverage by geography (local, state, regional, national), content (subject area, educational resources and organization, open), and age (preschool, elementary, secondary, adult, open); intended user of information (general public, all professional educators, SEA personnel, board of education, regional labs, federal government).

[The individual cell tallies of Table 2 are not legible in this reproduction.]

In the following sections, the various methodologies will be discussed according to each of the classifications listed on the first page of the reporting form (Appendix B). Summary generalizations will be presented following the description of results presented in Table 2.
Agency conducting study. According to the data presented in Table 2, a federal agency had completed only one of the studies, and used analysis of existing data as the primary methodology. Three regional agencies, which were regional laboratories, had completed 17 of the studies. They had used questionnaires for seven of them, analysis of existing data for three, a convergence technique, expert testimony, and review of existing reports for two each, and a manpower study for one needs sensing activity. The state education agencies were more dependent on standardized testing, criterion referenced testing, and questionnaires than any other methodologies. Private agencies, such as ETS, tended to use more of a variety of different methodologies than did either the SEAs or regional laboratories.
Office which prepared the study. A majority of the studies appeared to have been conducted by bureaus of research, planning, and/or evaluation within the various agencies. These bureaus or divisions tended to use standardized testing and questionnaire analysis as their primary methodologies.

The two uses of expert testimony as a research methodology were by administrative divisions within the organization, rather than by the research and evaluation divisions. The programmatic divisions within the agencies did not vary substantially from the research and evaluation divisions in their use of varying methodologies.
Purpose. As one might expect, more studies seemed to be designed to determine status than for other purposes listed in Table 2. The methodologies for status determination were primarily standardized testing, criterion referenced testing, and questionnaire administration, although a few studies employed analysis of existing data, manpower analysis, and review of existing reports. When opinion sampling was the purpose, a questionnaire methodology was usually selected. On occasions when opinion setting tended to be the purpose, the selected methodologies were either convergence technique or expert testimony, although the Delphi technique, town meeting, and questionnaire were used on one occasion each.

On those few occasions where program emphasis seemed to be the purpose, analysis of existing data, questionnaire development, and standardized testing were the selected methodologies. If the purpose was to support proposals under development, the studies tended to employ questionnaire administration, expert testimony, or review of existing reports as primary methodologies.
Length of report. Generally, the research team felt that brief reports were aimed more at the general public, while more lengthy reports were designed for specialized audiences. As the criteria were applied during the analysis, the briefer reports tended to use questionnaire development and expert testimony as primary methodologies. Slightly longer reports used curriculum-specific testing, questionnaires, and review of existing reports as primary methodologies. The lengthy reports tended to employ standardized testing, questionnaire administration, criterion referenced testing, and, to a lesser extent, manpower analysis and analysis of existing data as primary methodologies.
Coverage. The few reports which were local in nature either used criterion referenced testing or questionnaire development. The state-level reports used standardized testing, criterion referenced testing, and questionnaire development as primary methodologies, while a few reports also used manpower analysis, analysis of existing data, and an assortment of other methodologies. The region-level reports tended to use questionnaire development, and some also used analysis of existing data, convergence techniques, expert testimony, and review of existing reports. The national-level reports used questionnaires, interviews, public opinion polls, and analysis of existing data.

As for content coverage, the reports tended to be either for specified subject areas or open regarding the referent of the study. Those for specific subject areas tended to use standardized or criterion referenced testing, while those which were open tended to use questionnaire administration as a primary methodology.
There was a fairly good distribution of reports across the different age and grade levels, although only one study was of preschool education needs, and it used an analysis of existing data methodology. The studies of elementary and secondary education needs tended to use standardized testing, criterion referenced testing, and questionnaire development as primary methodologies, while the studies which were open regarding age and grade level tended to use questionnaire development and analysis of existing data, along with other methodologies to a lesser extent.
Intended user of information. Very few of the studies were intended for the general public, and those that were tended to depend on a variety of methodologies, including questionnaire administration, criterion referenced testing, public opinion polls, Delphi approaches, analysis of existing data, and multiple studies. A wide majority of the studies were rated as intended for all professional educators, and those tended to use questionnaire development, standardized testing, criterion referenced testing, and analysis of existing data as primary methodologies. The reports intended for SEA personnel were dependent more on standardized testing and criterion referenced testing than on other methodologies, and those intended for state boards of education were usually dependent on standardized testing.

Reports intended for use by regional laboratories primarily included questionnaire development and expert testimony methodologies.
Generalizations

The data of Table 2 tended to group rather nicely, and no striking surprises were found. However, several generalizations can be drawn from the data analysis. First, questionnaire development and administration was the primary methodology used by most agencies. There was also a number of reports which used some form of testing as a primary methodology.

More of the studies collected happened to be prepared by state education agencies, and apparently more of the needs sensing activities are being completed by bureaus or divisions of research, planning, and/or evaluation than by central administration or other programmatic divisions. The research, planning, and evaluation divisions also tended to use testing and questionnaire development as primary methodologies.

The purpose for most of the studies appeared to be status determination, and these more frequently use tests and questionnaires as opposed to other methodologies. There was also some opinion sampling through use of questionnaires.

Most studies designed to assess needs in specific subject areas used some form of testing, while those which were rated as "open" regarding subject or other topic of concern tended to use questionnaires or interactive types of methodologies.

Most studies tended to be done for elementary and secondary grade level children, and were designed for use by all professional educators.
Recommendations

The AEL research team has several recommendations concerning use of the information contained in this report:

1. Those who are planning needs sensing activities at all levels might well examine the types of methodologies used for different purposes by their colleagues.

If most educational planners are using some form of questionnaire to collect data, there may well be a sound reason, which might be considered by those planning their own needs sensing activities. On the other hand, the research specialists might look at those methodologies not frequently used and examine the few cases in which they were applied. For example, it would appear that some form of public opinion poll would be more likely to produce desired results for those wishing to sample opinion than would complete dependence on questionnaires, but that contrast is beyond the confines of the present study and depends on further analysis of methodologies.
2. Those responsible for training educational researchers should encourage examination of various methodological techniques.

Questionnaires may be simple to design, and data from them are probably easier to analyze; however, educational planners should be familiar with the use of other methodologies so they can apply them in appropriate situations. Perhaps a matrix which contrasts various methodologies with possible uses would be helpful for practicing educational planners.
3. Consideration of the various styles of reporting used to communicate the results is another important area of investigation.

Perhaps informal word-of-mouth or staff memoranda, rather than formal reports, better enhance the support of important educational decisions. Presumably, some types of reporting are more effective when decisions by large groups are required, such as by all instructional personnel of a state, than when the decision is to be made by a smaller group, such as a state board of education. The reporting style, and possibly the selected methodology, should be related to the intended audience, but additional study is needed to guide the selection of the communication strategy.
4. The approach used in this study should be further refined and standardized so that we who are involved in needs sensing activities can continue to compare purposes, methodologies, and resulting studies and continue to improve our products.

Additional classifications should be considered, and other methodologies should be added to the list. For example, one criterion not approached in this study was the cost of the study. Presumably, the cost of standardized testing of large numbers of children is greater than is completion of a few interviews, but then, interviews are more expensive when an equal number of interviews and group tests are completed. The question is one of expected costs compared with benefits received for various purposes, all of which suggests a complex study.
5. Other educational researchers should use the classification processes described in this report.
The study herein reported is intended as a beginning in the area of general report evaluation and classification. If found to have merit by colleagues, the study could be expanded into a general classification procedure useful to multiple audiences. The validation of the procedure will require rating of additional reports by additional researchers.
6. This research should be expanded to determine the effect of needs sensing activities on those who make important educational decisions.
No attempt was made in this study to trace decisions which were based on the data, but most professionals agree that the ultimate purpose of collecting the data is to make possible effective educational decisions. Perhaps some methodologies, or some classifications of studies, are more frequently used for educational decision making, while others tend to be filed away or used only for further interaction among those who conduct the studies.
Bibliography

Reports Used in Analysis
Alabama State Department of Education. Design for Developing A Program of School Improvement. Montgomery, Alabama: 1973. (mimeographed)
Alabama State Department of Education. Educational Goals for Alabama Schools. Montgomery, Alabama: 1972.
Alabama State Department of Education. Report of 1974-75 Educational Assessment Program. Montgomery, Alabama: 1975. (mimeographed)
Alabama State Department of Education. "Report of 1975-76 Educational Assessment Program, Science, Part A: Fourth Grade Results." Montgomery, Alabama: 1977. (mimeographed)
Alabama State Department of Education. "Report of 1975-76 Educational Assessment Program, Science, Part B: Twelfth Grade Results." Montgomery, Alabama: 1977. (mimeographed)
Appalachia Educational Laboratory, Inc. "Chapter V^^Educational NeedV\
Assessments for .Appalachia — A Brief Review." Diretetiofts -for Educat-
ional Bevelppment in Appalachia . Charleston, W. Va. : 1971.^
Appalachia Educational Laboratory, Inc. "Chapter 3, A Survey of School Administrators--Report of a 1967 Study." Directions for Educational Development in Appalachia. Charleston, W. Va.: 1971.
Appalachia Educational Laboratory, Inc. "Chapter 4, Educational Needs as Perceived by Public School Personnel." Directions for Educational Development in Appalachia. Charleston, W. Va.: 1971.
Appalachia Educational Laboratory, Inc. "Chapter 5, Educational Needs in 1976--As Viewed by 'Appalachian Experts'." Directions for Educational Development in Appalachia. Charleston, W. Va.: 1971.
Appalachia Educational Laboratory, Inc. "Chapter 6, AEL Membership Suggests Ideas for Educational Development." Directions for Educational Development in Appalachia. Charleston, W. Va.: 1971.
Appalachia Educational Laboratory, Inc. "Projected Allied Health and Nursing Training Needs for a Seven-County Area in West Virginia." Charleston, W. Va.: 1977. (mimeographed)
Appalachia Educational Laboratory, Inc. "Summary: Meeting of Appalachian
Cljief State School Officers." Charleston/ W„ Va.: November 17*;*
d976.^ (report)* w ~* Ma • . .
Appalachia Educational Laboratory, Inc. "Summary Report: Meeting of SEA Leaders from AEL Member-States." Charleston, W. Va.: February 1977. (report)
Bertram, Charles L. "An Expression of Educational Needs By Leaders in Appalachia and AEL's Role in Meeting Those Needs." (draft) Charleston, W. Va.: Appalachia Educational Laboratory, Inc., 1977. (mimeographed)
Bertram, Charles L. "Social and Educational Characteristics of the Families of Rural Appalachian Preschool Children," Technical Report No. 57. Charleston, W. Va.: Appalachia Educational Laboratory, Inc., 1975. (mimeographed)
Coan, Donald L. "Parenthood Education Needs: A National Assessment Study." Edward E. Gotts (ed.). Charleston, W. Va.: Appalachia Educational Laboratory, Inc. (mimeographed)
Conaway, Larry E., Fisher, Maurice D., Johnson, Anne M. An Analysis of the Results of the Florida Educational Assessment Using the NAEP Model Reading and Mathematics 1974-75. Research Triangle Park, N.C.: Research Triangle Institute, 1975. (mimeographed)
Connecticut State Board of Education. Connecticut Science Assessment, 1974-75. Hartford, Connecticut: no date.
Educational Testing Service. A Plan for Evaluating the Quality of Educational Programs in Pennsylvania. Princeton, New Jersey: 1965.
Educational Testing Service. State Educational Assessment Programs 1973 Revision. Princeton, New Jersey: 1973.
Florida State Department of Education. An Educational Needs Assessment for the Children of Florida's Migratory Agricultural Workers. Vol. 1. Tallahassee, Florida: 1976. (mimeographed)
Gallup, George H. "Ninth Annual Gallup Poll of the Public's Attitudes Toward the Public Schools." Phi Delta Kappan. Bloomington, Indiana: Sept. 1977. (magazine)
Georgia State Board of Education. "Education Program Data." Atlanta, Georgia: 1975. (mimeographed)
Kentucky Department of Education. Kentucky Educational Assessment Program, 1975. Frankfort, Kentucky: 1975.
Kentucky Department of Education. "Title I Evaluation Report: Kentucky 1975." Frankfort, Kentucky: 1975.
Maine Department of Educational and Cultural Services. Maine Assessment of Educational Progress: Reading. Maine: 1974. (mimeographed)
Maine Department of Educational and Cultural Services. Maine Assessment of Educational Progress: Results Report, Mathematics Results for In-School Thirteen- and Seventeen-Year-Olds. Maine: 1974. (mimeographed)
Maryland State Department of Education. Goals and Needs of Maryland Public Education. Baltimore, Maryland: 1972. (booklet)
Miles, Curtis, & Bertram, Charles L. "Review and Summary of Appalachia States' Educational Needs and Goals Documents." (draft) Charleston, W. Va.: Appalachia Educational Laboratory, Inc., 1976. (mimeographed)
Mississippi State Department of Education. General Educational Needs Assessment in Mississippi. Jackson, Mississippi: 1972. (mimeographed)
Nagle, John M. "Outcomes of the AEL Board's Goal-Setting Activities Conducted April 30-May 1, 1973." Eugene, Oregon: 1974. (mimeographed)
National Assessment of Educational Progress. Reading in America: A Perspective on Two Assessments, Reading Report No. 06-R-01. Denver, Colorado: 1976.
National Education Association. "NEA Reporter." Vol. 16, No. 8. Washington, D.C.: 1977. (periodical)
New York State Department of Education. "Local and Statewide Needs Assessment." Albany, New York: no date. (mimeographed)
New York State Education Department. An Assessment of Needs for Professional Personnel for Occupational Education in New York State. (Research Publication 75-3). Albany, New York: 1975.
New York State Education Department. "Proposal in connection with the Emergency School Aid Act Educational Television Program for FY 1976, Vegetable Soup II." Albany, New York: 1976. (mimeographed)
North Carolina State Department of Public Instruction. "Assessment of Educational Progress in North Carolina: Title I, ESEA Participants, Grade 6, 1971-72." Raleigh, North Carolina: 1974. (mimeographed)
North Carolina State Department of Public Instruction. "Grade 3 Language Arts; State Assessment of Educational Progress in North Carolina, 1973-74." Raleigh, North Carolina: 1974. (mimeographed)
North Carolina State Department of Public Instruction. "Grade 3 Mathematics; State Assessment of Educational Progress in North Carolina, 1973-74." Raleigh, North Carolina: 1974. (mimeographed)
North Carolina State Department of Public Instruction. "Grade 3 Reading; State Assessment of Educational Progress in North Carolina, 1973-74." Raleigh, North Carolina: 1974. (mimeographed)
North Carolina State Department of Public Instruction. State Assessment of Educational Progress in North Carolina: Summary, Cultural Arts. Raleigh, North Carolina: 1976. (booklet)
North Carolina State Department of Public Instruction. State Assessment of Educational Progress in North Carolina: Summary, Health and Physical Education. Raleigh, North Carolina: 1975. (booklet)
North Carolina State Department of Public Instruction. State Assessment of Educational Progress in North Carolina: Summary, Language Arts. Raleigh, North Carolina: 1976. (booklet)
North Carolina State Department of Public Instruction. State Assessment of Educational Progress in North Carolina: Summary, Mathematics. Raleigh, North Carolina: 1976. (booklet)
North Carolina State Department of Public Instruction. State Assessment of Educational Progress in North Carolina: Summary, Reading. Raleigh, North Carolina: 1976. (booklet)
North Carolina State Department of Public Instruction. State Assessment of Educational Progress in North Carolina: Summary, School Characteristics and Classroom Practices. Raleigh, North Carolina: 1976. (booklet)
North Carolina State Department of Public Instruction. State Assessment of Educational Progress in North Carolina: Summary, Science. Raleigh, North Carolina: 1976. (booklet)
North Carolina State Department of Public Instruction. State Assessment of Educational Progress in North Carolina: Summary, Social Studies. Raleigh, North Carolina: 1976. (booklet)
Northwest Regional Educational Laboratory. "Member-and-Run." Portland, Oregon: 1976. (newsletter)
Northwest Regional Educational Laboratory. Survey Results: Educational Needs in the Pacific Northwest. Portland, Oregon: 1976.
Ohio State Department of Education. "The Ohio Statewide Student Needs Assessment Program: Report on 1974-75 fourth-grade reading assessment." Columbus, Ohio: no date. (mimeographed)
Pennsylvania Department of Education. "Teacher Demand and Supply for Elementary and Secondary Schools in Pennsylvania." Harrisburg, Pa.: 1977. (mimeographed)
Southern Regional Education Board. Fact Book On Higher Education in the South 1975 and 1976. Atlanta, Georgia: 1976.
Southwest Educational Development Laboratory. "Seven-State Educational Futures Study, 1977." Austin, Texas: February 1977. (mimeographed)
Tennessee Department of Education. "The Public Has A Voice." Report of the statewide instructional improvement project, 1976-77. Tennessee: 1977. (mimeographed)
Weiss, Iris R. Guilford Science Assessment 1975. Research Triangle Park, N.C.: Research Triangle Institute, 1976. (mimeographed)
West Virginia Board of Education. Educational Goals for West Virginia. Charleston, W. Va.: 1976.
West Virginia Department of Education. School Health Assessment. A report of the West Virginia school health education assessment project. Charleston, W. Va.: 1977.
Other Documents of Interest
Butler, Martha A., Fisher, Maurice D., Place, Carol. Summary of Procedures Employed in State Evaluation Reports of Title I Migrant Programs. Research Triangle Park, N.C.: Research Triangle Institute, 1976. (mimeographed)
Conaway, Larry E. An Example of Utilization of Educational Assessment Data. Research Triangle Park, N.C.: Research Triangle Institute, 1974. (mimeographed)
Conaway, Larry E. Combining Norm-Referenced and Criterion-Referenced Data to Determine Instructional Needs in the Reading Program. Research Triangle Park, N.C.: Research Triangle Institute, 1975. (mimeographed)
Conaway, Larry E., Weiss, Iris R. The Establishment of Student Performance Criterion Levels for Judging the Acceptability of Assessment Results. Research Triangle Park, N.C.: Research Triangle Institute, 1976. (mimeographed)
Connecticut State Board of Education. "Conducting Local District Assessment in Conjunction with Connecticut Statewide Assessments." Connecticut: no date. (mimeographed)
Fisher, Maurice D. State-of-the-Art Summary and Critique of Evaluation Procedures Used by SEAs and LEAs in ESEA Title I Migrant Programs. Research Triangle Park, N.C.: Research Triangle Institute, 1976. (mimeographed)
University of the State of New York. Directory of Occupational Education Programs in New York State. Albany, New York: 1972.
University of the State of New York. New York State Education Department Services to Elementary, Secondary and Continuing Education. Albany, New York: 1976.
APPENDIX A

Sample Letter and List of States
Sample Letter
September 24, 1976

Appalachia Educational Laboratory

Dr. Richard K. McKay
Assistant State Superintendent
Division of Research, Evaluation, and Information Systems
Maryland State Department of Education
P. O. Box 8717, Baltimore-Washington International Airport
Baltimore, Maryland 21240
Dear Dick:
During the past two or three years, many of the state education agencies have completed studies of educational needs. In particular, studies may have been conducted in conjunction with State ESEA Title I programs, with Emergency School Aid Act (ESAA) funding, and with certain vocational education programs.
AEL staff has completed numerous needs assessments of the Region during the past ten years. This year, rather than attempt the collection of large quantities of original data, we are trying to develop a composite of already prepared needs studies from the states in and around the Region.
If studies of educational needs have recently been completed by the Maryland State Department of Education, we would appreciate receiving copies so they can be studied and compared with similar studies from the states. If the comparison is successful, we will, of course, make copies of the composite available to you and your staff.
Thanks for your cooperation in our effort to continue monitoring the educational needs of our Region.
Sincerely yours,
Charles L. Bertram
Associate Director for Planning
and Evaluation Services
cc: Frederick Brown
Appalachia Educational Laboratory, Inc.
1031 Quarrier Street / P.O. Box 1348 / Charleston, West Virginia 25325 / (304) 344-8371
An Affirmative Action / Equal Opportunity Employer
List of States
Alabama
Florida
Georgia
Kentucky
Maryland
Mississippi
New York
North Carolina
Ohio
Pennsylvania
South Carolina
Tennessee
Virginia
West Virginia
APPENDIX B

Classification Form
CLB/PBS: AEL: 10/28/77
Classification of State, Regional, Local, and National
Needs Studies and Related Activities

Title of Report ________
Agency/State ________
Date of Report ________
I.D. No. ________
I. Classification of Studies

A. Agency Conducting Study
   5. State Educ. Assoc.

B. Office Which Prepared Study
   1. Bureau of Research, Planning and/or Evaluation
   2. Public Information Division
   3. Programmatic Division
   4. Administrative Division
C. Purpose
   1. Opinion setting
   2. Opinion sampling
   3. Status determination
   4. Program emphasis
   5. Support of proposals
D. Length of Report (pages)
   1. 1-5
   2. 6-15
   3. 16-50
   4. 50+
E. Coverage
   1. Geographical
      a. local
      b. state
      c. regional
      d. national
   2. Content
      a. subject area
      b. educational res. org.
      c. open
   3. Age of Referent
      a. preschool
      b. elementary
      c. secondary
      d. adult
      e. open
F. Intended User of Information
   1. General public
   2. All professional educators
   3. SEA personnel
   4. Board of Education
   5. Regional Laboratories
   6. Federal Government
II. Assessment Procedures Used in Studies

Testing
   1. Standardized
   2. Curriculum Specific, Objective Referenced, or Criterion Referenced
   3. ________

Survey or Report Completion
   1. Forms Completion (like LEA to SEA)
   2. Questionnaire
   3. Interview
   4. Public Opinion Poll
   5. "Town Meeting"
   6. Delphi
   7. Convergence Technique
   8. Expert Testimony
   9. Manpower Study

Analysis of Existing Data
   1. Review of Existing Reports
   2. Multiple Studies