The Adventure Continues...
It has been about ten years
since I first wrote a piece for Adventures in Assessment.
Everything has changed and nothing has changed at Read/Write/Now
over the past decade.
The reasons for improving the tools and practices of assessment
of learning and teaching are still the same. Meaningful, effective
assessment helps learners on the sometimes long and winding road
of adult basic education know they are getting somewhere. It also
gives learners and teachers important information about the next
steps to take to reach their goals, great and small. If any of us
are to be "lifelong learners," it's essential that we not only keep
learning new things, but know why we are learning them and what
to do with new knowledge and skills. The more involved we are in
directing and assessing our own learning, the more likely we are
to be able to take what we learn in one setting, and use it and
build on it in another.
If this applies to adults who have had a reasonable chance at education
and have adequate literacy skills, it applies even more so to adults
who have not yet had success in education or an opportunity to get
one. If I had started graduate school with the same unanswered questions
that many learners have as they start in adult basic education programs
(When will I be done? What does it mean to be done? What will I
be able to do at the end of it that I can't do now?), I probably
would never have started. It is the responsibility of programs like
ours to work with learners to answer those questions in relation
to each individual learner as well as to the program as a whole.
Assessment that works is also important for teachers as they work
to connect learners' goals, needs, and interests with curriculum
and learning activities. In a perfect world, goal setting, curriculum,
and assessment are all linked together. In this world, we keep working
to make those pieces connect.
Ten years ago, the assessment tools we developed were mostly used
by teachers to document learners' progress and share with learners
on a regular basis. The student portfolios then were more like teacher-directed
collections of learners' work.
Learners' portfolios are now their own. There is class time scheduled
to introduce them, to put them together in loose-leaf binders that
are kept on shelves in each meeting space for classes, and to choose
things to include in portfolios. Learners write something about
why they chose the item and what it means to them in terms of progress
toward a goal.
We still do lots of goal setting in a variety of ways with a variety
of tools, including a revised version of the Reading/Writing Goals
List we used ten years ago. Learners' goals and interests help form
the curriculum in each group. Learners choose a limited number of
goals to focus on each session and teachers have goal setting and
goal review conferences three times per program year with each of
their reading and writing and math students. Learners keep track
of what they are doing and learning with writing records, book lists,
and math activity records.
Many learners reflect on their reading in Reading Response Journals
and all use Dialogue Journals. Published writing, a book review,
a resume, a research project, a copy of a driver's license earned,
and math work might all find their way into a portfolio. Three times
each program year, we reassess reading progress with an adapted
version of the New Readers Press Whole Language for
Adults Reading Inventory. We have added readings leveled with the Flesch
readability scale to get grade levels, along with recall,
interpretive, and active questions with a scoring scale to make
the assessment less subjective. This is not the only way reading
is assessed, but it is the way that is used for marking progress
within classes and moving people to other classes within the program.
Reading Miscue Analysis is a powerful tool of assessment we are
still trying to incorporate into program practice. Some of us who
have been working at Read/Write/Now for five or more years have
made multiple stabs at doing Miscue Analysis with learners. Over
the past year, we have had several in-service training sessions
with a consultant from UMass, Dr. Patricia Silver, and teachers
are doing Miscue Analysis with some of their developing and intermediate
students. It is very useful in analyzing the strengths and needs
of readers and planning instruction, but it takes time and practice
on the teacher's part, as well as individual time with a learner
to tape oral reading, and then to meet with them after the analysis
is done to share it. We are committed to making it part of our assessment
practice, but it is far from institutionalized yet.
One of the things about the world of adult basic education that
has changed in the past ten years is the degree of accountability
required by most funding sources. Funders want to know much more
about many more things now. Unfortunately, they all have their own
special way for you to demonstrate that your program is doing what
they are giving you money to do. If a program has multiple funders,
as we do, there may be requirements that mean collecting anecdotes,
writing long narratives, doing case studies, collecting detailed
data on every learner while they are in the program and finding
out about their lives after they leave, using pre-tests, post-tests,
reporting on goals set and met, reading levels attained, and attendance.
It seems reasonable that when money, public or private, is invested
in a program to accomplish certain things, the program must be accountable
for accomplishing those things. It does not seem reasonable that
being held accountable often means devoting more hours to fulfilling
the funders' ideas of accountability than to doing useful assessment
with learners, who are the people that all the funding is supposed
to be serving. In that perfect world, the tools that work for a
program would be acceptable to its funding sources too.
Regardless of the world of grants and reporting requirements, the
essential purpose of assessment is informing learners and teachers
of where they are on the road to wherever the learner wants to go,
and helping them to decide on the next steps or new directions to
take. The challenge has grown from finding effective tools and processes
for meaningful assessment to doing it while meeting the requirements
of funders without duplicating efforts or creating dual systems.
So far, we have not found a satisfactory way to avoid that duplication.
One set of goal lists and assessment measurements
is the "real" one in terms of what learners and teachers use to
understand progress and the other is the "real" one in terms of
what must be reported to funders. Neither one seems to fully capture
the growth and progress we see happening in the lives of adult learners.
The process of continually reviewing and revising tools and procedures
for assessment has never stopped, but the sheer volume of new requirements,
mandates, technological changes, and information of every kind that
has characterized our culture over the past decade has created its
own kind of inertia. Add to the mix that funding for literacy is
still insecure and insufficient, making continuity in staffing impossible.
It's too easy to fall into a kind of passive/reactive role regarding
assessment and accountability to funders. I agree with much of what
Heide Spruck Wrigley wrote in her 1998 article, "Assessment
and Accountability: A Modest Proposal." If we don't try to develop
meaningful frameworks for teaching and assessment that truly reflect
our practice and the kinds of successes learners achieve in our
programs, Those Who Must Be Reported To will fill the void. This
has already happened in many ways, but there is always hope and
the politics of education -- just like all politics -- are subject
to change initiated by human beings on many levels. At best, we
can hope to influence the policy-makers with the stunning validity
of our assessment instruments and what they demonstrate, and at
worst we can fail to influence Those Who Must Be Reported To, but
still develop more clarity and purpose within our programs and offer
learners and teachers a more understandable path defined by real
markers of progress that reflect skills learned and used and goals
met. In Massachusetts, it seems the door is still open on this process.
Maybe it truly is open and maybe it just seems open, but being guardedly
optimistic, I say we may as well assume that we still have a voice
and use it.
At Read/Write/Now, we've been working, sporadically, on our own
teaching and assessment frameworks for reading and writing for almost
a year. We consulted with Jane MacKillop, editor of Whole Language
for Adults, a set of resources published by New Readers Press
that we have found useful. We started with the intention of correlating
our framework with the six levels described by the National Reporting
System, but decided along the way that we needed to have something
that made sense to us; we would worry later about translating it
into NRS for reporting purposes.
We did look at the NRS levels and try to make connections with
them. We also tried to link our framework with the Massachusetts ABE
English Language Arts Curriculum Framework, which was not hard to
do. Jane MacKillop led us through a process of looking at learners'
writings, describing what we saw evidence of, and deciding through
this process what qualities are common to beginning, developing,
intermediate and GED writers in our program. She facilitated a similar
process with reading, through which we named some entry and exit
texts as well as the qualities of texts at various levels and what
readers do at different stages of reading development. The next
stage involves each teacher taking the list of descriptors of writers
and readers at the levels of their classes and turning them into
a checklist to try with learners to get their take on them and see
how relevant they are in practice.
We are using the following questions to guide us in developing
our framework for teaching and assessment. What does the typical
reader/writer do at each level? What knowledge or skills does each
person bring to writing or reading? What skills are they developing?
What skills or strategies need to be mastered before a person is
ready to move on to the next level? What literacy experiences do
they need in order to progress? What skills and strategies are being
modeled or taught?
We are still revising and refining the program's reading and writing
framework. Instead of whole books as exit texts, we are developing
a selection of shorter readings that learners and teachers can choose
from. The reading and writing checklist items will be given numerical
values so that attaining an agreed upon number of skills in each
level along with successfully reading and understanding the exit
texts will signal a move to the next level group. Progress within
the beginning, developing, and intermediate levels will also be
marked in this way. Whether we call it a rubric, a framework, or
a series of checklists, this tool should make sense to us as learners
and teachers, and will be used to make transitions between classes
within our program smoother for learners and teachers. It will need
to be reviewed and revised on a regular basis to stay current and
useful. At least theoretically, it could be correlated to the NRS
and used as a tool for external reporting.
During the past five years, we have engaged in a variety of projects
that have increased the participation by learners in every aspect
of Read/Write/Now. Learners have served on the Health Team, done
research and social action theater, mentored other learners, become
Peer Tutors, served on program planning committees, been elected
to the program's Advisory Committee, acted as editors of
the monthly student newsletter, been members of the Parent Educator
Project team, and conducted various action research projects.
These projects have made our claim of being a "learner-centered"
program more legitimate than it used to be, and they have strengthened
the program with new energy and ideas. We still have a lot to do
to make our assessment process really work well for learners and
teachers, but we are on what feels like the right road.
It remains to be seen if what we develop will also be meaningful
to funding agencies. If so, we'll rejoice and have more time to
do interesting things that improve the program. If not, we'll continue
to do what we must to fulfill our own needs for assessment and the
program's funders' needs for accountability. Everything changes
and nothing changes.
Originally published in Adventures in Assessment,
Volume 13 (Spring 2001),
SABES/World Education, Boston, MA, Copyright 2001.
Funding support for the publication of this document
on the Web provided in part by the Ohio State Literacy Resource
Center as part of the LINCS
Assessment Special Collection.