ACLS, SABES, UMASS: Perfect Together!
Eight years ago, I was the Senior
Psychometrician for the GED Testing Service. My job was to ensure
the technical quality of the Tests of General Educational Development
(GED), which included verifying that the score conversion tables were
accurate, making sure the test items were clear and measured what
they were intended to measure, and conducting research on the
psychometric quality of the tests (i.e., studies of score reliability
and validity; see Zenisky et al. in this issue). In working with
educators throughout the United States to develop and review GED
Test items, I fell in love with the adult education community. I
became well aware of the dedication of ABE instructors and staff,
as well as the amazing success stories of the millions of ABE students
across the country.
When I left the GED Testing Service in Washington, DC to come to
the University of Massachusetts
at Amherst in 1995, I wondered whether I would again have the
opportunity to work with the ABE community. Thanks to Bob Bickerton
and his staff at Adult and
Community Learning Services (ACLS) of the Massachusetts Department
of Education, I am proud to say that ABE is once again a major part
of my life. And this time I have an army of psychometric professionals
and students to help out; they are among the most talented testing
professionals with whom I have ever worked. In my new role collaborating with
ACLS, I have also discovered a new set of colleagues who go by the
strange acronym SABES. These
SABES folk are also a pleasure to work with, and are dedicated to
improving education and assessment in ABE. In the remainder of this
article, I will describe exciting events happening right now that
stem from collaborations among UMass, ACLS, and SABES.
In January 2003, ACLS contracted with the UMass Center for Educational
Assessment to help improve the assessment of ABE students in Massachusetts
and to assist with its ongoing refinement of the processes used
to evaluate and monitor all ABE programs in Massachusetts. Since
that time we have written more than a dozen reports for ACLS, worked
with teams of educators across the state to design new assessments
in math and reading, developed and validated new prompts for the
REEP writing assessment, and provided a comprehensive set of recommendations
to ACLS for enhancing their program monitoring processes. Brief
descriptions of three of our major activities follow.
Take out your Number 2 pencils: new ABE assessments are coming!
ACLS and SABES have worked hard over the past several years to come
up with ways to meet the federal government's requirements for the
demonstration of the effectiveness of ABE programs. Presently, the
U.S. Department of Education requires ABE programs to use test scores
as one means of demonstrating that students are learning. ACLS and
SABES convened a group of Massachusetts ABE educators called the
Performance Accountability Working Group (PAWG) to review currently
available tests that were suitable for adult learners in Massachusetts.
The final report produced by the PAWG is available at http://www.sabes.org/resources/pawgfinal.pdf.
In that report, the PAWG concluded that currently available tests
were insufficient for the various needs of ABE students and programs
in Massachusetts. They recommended a set of tests, including the
TABE, BEST, and REEP, to be used on an interim basis until ACLS
could develop new assessments targeted to the recently developed
Massachusetts ABE curriculum frameworks.
The development of new assessments for ABE students in Massachusetts
is one of the key activities we are working on. Our vision is to
mobilize ABE instructors and staff across Massachusetts to help
us develop these tests. Our initial test development efforts are
in the areas of math and reading, and recently we worked with two
groups of ABE educators to decide what these tests should look like.
One group helped us develop specifications for the math tests; the
other group helped us develop specifications for the reading tests.
Our next steps are to hold several item-writing workshops for ABE
instructors across the state and ask them to write items for us.
Thus, ABE instructors in Massachusetts will be the ones who develop
the forthcoming tests.
Will our collaborative efforts produce tests that ABE students
love to take? Well, probably not love to take; however, we are confident
that the tests we are developing will be similar to what students
are learning in their classes and will be appropriate for measuring
their knowledge and skills. We are also confident that these item-writing
workshops will provide valuable professional development for ABE
educators. We plan to hold 5-10
workshops over the next year. Check the SABES Website at http://www.sabes.org
periodically for announcements.
Making the REEP deep
Many students throughout Massachusetts strive to improve their
writing. Many of these students write in languages other than English,
but are taking classes to improve their writing in English. ACLS
uses the REEP writing test, developed by the Arlington Education
& Employment Program, to measure how much students' writing
improves after receiving instruction in ABE classes.
A key feature of the REEP (or any writing test) is the prompt,
which is the topic to which students are asked to respond. An example
of a prompt is, "Write a letter to someone about your most
recent vacation." The prompt on a writing test gives the students
something to write about and provides the basis for a plan for scoring
the essays written in response to that prompt. A year ago, there
were only two prompts associated with the REEP. Thus, with respect
to prompts, the REEP was not very "deep." Students who
needed to take the REEP more than twice had to respond to the same
prompts over and over again. ACLS asked us to develop new prompts
for this test. Because these new prompts would also be used to measure
students' improvement in writing, they needed to be equivalent to
the two prompts that were currently in use. After all, if we developed
a new prompt that was harder to respond to, students' newer essays
might appear to be worse than their earlier essays.
I'm pleased to report that last spring we pilot-tested four prompts
and one was selected for the pool of REEP prompts, expanding it
by 50%. During the fall of 2003, we pilot-tested nine new prompts
and four of them were approved for addition to the REEP prompt pool.
In just one year, the number of REEP prompts expanded from two to
seven. We were able to accomplish this goal by calling upon Massachusetts
ABE teachers to send us ideas for prompts and to administer experimental
prompts to their students. ABE students also helped us by writing
essays in response to the experimental prompts. Finally, we used SABES's network
of certified REEP scorers to score the experimental essays. The
new prompts were selected after a comprehensive set of statistical
and qualitative analyses that led us to conclude they are comparable
to the two original prompts with respect to difficulty and scorability.
The technical details regarding the prompt tryout and selection
procedures are available in two reports we prepared for ACLS.
Monitoring program monitoring
A third project we are working on with ACLS is improving the monitoring
of ABE programs throughout the state. ACLS is required to monitor
all ABE programs to see whether they are doing a good job in accomplishing
their goals and to report program evaluation information back to
the federal government as part of the National Reporting System.
Over the past year, we accompanied ACLS staff on several occasions
as they gathered information on program quality. We also conducted
a survey of ACLS staff members who perform program monitoring and
surveyed programs that had recently been monitored. Finally, we
took a close look at the instrument used to record program-monitoring
data. Using the information we gathered from our observations of
program monitoring and the survey data, we made several suggestions
for revising the Program Monitoring Instrument. Presently, we are
working with ACLS on revising the instrument to make it more efficient.
The above descriptions are just brief glimpses of the activities
we are working on with ACLS and SABES. At the beginning of this
article, I wrote a lot about myself. Before closing, I would like
to write a few words about my terrific colleagues at UMass who are
also working to improve assessment and evaluation in ABE programs.
There are two senior staff members associated with this project:
April Zenisky and Mercedes Valle. Both April and Mercedes are experienced
in test development and statistics and are working tirelessly on
the project. There are also several graduate students who are working
on the project, including Peter Baldwin, Rob Keller, Drey Martone,
and Shuhong Li. In addition, Professors Ronald Hambleton, Lisa Keller,
and James Royer are contributing to the project. So, when I mentioned
an army of psychometric professionals and students, I was not that
far off. We all hope to meet and interact with many of you over
the coming months. If you would like to learn more about us, please
visit our web site at www.umass.edu/remp.
Stephen G. Sireci is Associate Professor in the Research and
Evaluation Methods Program and Co-Director of the Center for Educational
Assessment in the School of Education at the University of Massachusetts
Amherst. Before UMass, he was Senior Psychometrician at the GED
Testing Service, Psychometrician for the Uniform CPA Exam, and Research
Supervisor of Testing for the Newark, NJ Board of Education. He
is known for his research in evaluating test fairness, particularly
issues related to content validity, test bias, cross-lingual assessment,
standard setting, and sensitivity review.
Originally published in Adventures in Assessment,
Volume 16 (Spring 2004),
SABES/World Education, Boston, MA, Copyright 2004.
Funding support for the publication of this document on the Web
provided in part by the Ohio State Literacy Resource Center as part
of the LINCS
Assessment Special Collection.