Bringing Learners into Goal-Setting
by Anne Burke, Linda Gosselin, Jane Shea
Quinsigamond Community College
Quinsigamond Community College
has several adult education programs, one on campus at night and
several during the morning at off-campus sites in Worcester and
Southbridge. Our program holds classes in the morning at St. Paul's
School. We offer ASE, ABE, and ESL.
Our adult ESL students represent a cross-section of the immigrant
community in Worcester. The largest segment of our student population
is Spanish-speaking, with the majority from Puerto Rico and the rest
from various Central and South American countries. They make up
60-70% of our student enrollment. The remainder of the students
are from a variety of countries in Asia, the Mideast, and Eastern
Europe and Russia. Our day students are predominantly female (80%).
Some are employed or are on public assistance. Many are housewives.
Some are single parents; others are single and have never married.
Many of the male students work at night and seem evenly divided between
married and single. Ages range from 18 to 70.
The program seeks to serve the needs of our students, whether
they are seeking better employment, hoping to pursue higher education,
or just coming to enrich their own language skills. The program
is an open-enrollment/open-ended one. Each of the three classes
must maintain a class enrollment of 13 students. As students reach
their goals or withdraw early from the program, new students are
accepted into the program from a waiting list.
Working in the Community
Our community-based program serves adults with limited English
proficiency who desire to maintain and improve their quality of
life in American society. Our goal is to enable adult learners to
actively formulate their own educational goals.
This goal is attained in a variety of ways. When students
join a class, they complete an IEP (individual education plan),
identifying for instructors and themselves the learners' education
goals. At the start of each curriculum unit of study, students and
their instructor decide on which particular areas they want to work,
via a needs assessment/interest profile. Together they adapt the
curriculum to class needs.
The structure of our ESL program is based on the MELT (Mainstream
English Language Training) student performance levels, adapted to
our own three proficiency levels. Level 1 includes SPL 0-2; Level
2 consists of SPL 3 and 4; Level 3 embraces SPL 5 and 6. Students
may move to higher levels as they complete their goals and are assessed
to be ready.
Students meet nine hours a week for 38 weeks each year. They can
continue in the program until they complete all three levels.
This tri-level program design was created in 1991. Before that,
each instructor worked independently, covering life skills and grammar
randomly. This program weakness often led to repetition of some
life skill materials and omission of others. It was difficult to
assess the readiness of a student to progress to the next level.
To improve our program and to better meet the needs of our students,
we decided a more formal curriculum was needed.
In 1991, we three ESL instructors won approval to use program
development funds to create a curriculum that would provide better
consistency of skill development from one level to the next. Working
together throughout the summer of 1991, we designed guidelines for
life skill competencies, grammar, and vocabulary appropriate to
each level, flexible enough to meet our instructional needs and
the individual educational plans of our students.
After utilizing the curriculum for one year, we determined that
we needed a more formal instrument for assessing a student's progress
and readiness to move on to the next level.
Our initial means of assessing student progress within this curriculum
was very subjective. Every ten weeks the teacher would write a narrative
concerning each student's general progress in reading, writing,
speaking, and listening. This means of assessment did not provide
the information we needed: it told us too little about a student's
progress in specific life skills, or about which skills needed review
or further study in class.
Therefore, in the summer of 1992, again with program development
funds, we collaborated on improving the assessment system. We started
by examining various articles from Adventures in Assessment
to see what other programs were using to evaluate student progress.
In particular, we found "Three by Three by Four: Ongoing Assessment
at the Community Learning Center," by Karen Ebbitt, Priscilla Lee,
Pam Nelson, and Joann Wheeler in Volume 2 of Adventures
in Assessment helpful. Using it as a guide we decided
that our assessment would not include a listing of assumed topics
to be covered, but would be a form that would allow the instructor
and the learners to review and list the skills and materials presented.
Our decision was based on our philosophy of encouraging our learners
to fully participate in their own assessment. Because it was important
to have some way of evaluating progress from the teacher's, as well
as the student's, point of view, we developed two forms: a Student
Self-Evaluation Form and a Teacher Assessment Form (see
Forms A and B at end of article).
On the Student Self-Evaluation Form we wanted to use vocabulary
that could be easily understood by most students. We decided to
have students assess how they felt about their progress on competencies
worked on in class. We thought it would be easier for lower level
students to check-off how they felt about their progress, rather
than writing about what specific progress they might have made.
For more advanced students, we provided space so they could write
about their progress in specific skill areas.
The Student Self-Evaluation Form is completed through a brainstorming
session in which students verbalize, with or without the teacher's
help, the skills and materials covered over a particular period
of time. Brainstorming offers the opportunity to review and recognize
what was accomplished in class. The list of skills generated is
written on the board and copied by the students onto their forms.
For example, if we had worked on health and medicine the list of
skills generated might be:
Review of parts of the body.
Review of simple illnesses.
Other common illnesses.
Describing one's symptoms.
Calling to make a doctor's appointment.
Filling out a medical history form.
Understanding medicine labels.
Common medical tests and procedures.
While the students assess their own progress in these competencies,
the teacher records the same skills on a Teacher Assessment Form
for each student as a separate assessment. As soon as possible after
completion of both forms, student/teacher conferences are scheduled
during class time to discuss and compare evaluations. The students
and teacher use the forms to identify those items that have been
mastered and those that require further study.
The conference allows discussion of any other difficulties with
the unit or other factors affecting the student's learning that
may not have come to light previously. The conference is important
so that the instructor and learner have a mutual understanding of
progress made and obstacles to learning. If there is a discrepancy
between what progress the teacher and student think has been made,
discussion follows. More often than not, the gaps occur because
the student is not confident in his/her ability. It is then
the task of the teacher to remind the student of his/her successes
in the classroom, contrasting what the student could do at the beginning
with what the student can do better now.
This can be accomplished through telling the student what the
teacher has observed or by showing the student samples of classwork
done that demonstrate progress.
Preparing for this article has encouraged us to formally evaluate
our use of this ongoing assessment and come up with the following
list of strengths and weaknesses:
The format is flexible enough to be used at any point in the
program. Because students are assessed only on the materials
covered from the time they enter the program, the forms better serve
our open-entry, open-exit program than standardized instruments.
Standardized tests may test students on materials not covered
in class. They would not give us the information we need in
order to assess student progress and readiness to advance to
the next level.
Student involvement in reflecting on skills and materials
covered leads to their recognition that learning has taken place.
A blank assessment form allows for teacher flexibility in
adapting the curriculum to the needs of a particular group of
students at each class level. This is important because what
is covered in a particular unit of study can vary from year to year.
Use of a student form and a teacher form provides a balanced
record to support the determination that a student is ready
to move on to a new level. The higher level teacher can see
the units covered, as well as the student's strengths and weaknesses.
Because evaluation includes a self-assessment, the teacher also
gains a sense of the student's own awareness of competency.
Instructors can use the instrument to identify areas that
need further review or instruction. For example, if the majority
of students respond "not good" to completing a job application,
the instructor knows that further work in that area is necessary.
Often two or more class days are required in order to complete
the assessments and follow-up conferences. This time must be
seen as necessary and a learning experience of its own value
for the students.
Students tend to want to evaluate their ability, not their
progress. In the theme of "health," for example, the students
may have worked on calling to make a doctor's appointment. When
it comes time to evaluate their ability to do this, students
may compare themselves to a sample conversation used in class. It
can be difficult to get them to evaluate their own progress: what
they could communicate when they started versus what they can
communicate now.
Our ongoing assessment tools are an effective means of assisting
learners to evaluate their own progress and providing teachers with
the documentation needed to determine a student's readiness to progress
through the levels. However, we have seen a need to make some changes
in order for our ongoing assessment to better meet our program needs.
Frequency of Use
The forms were originally intended to be administered every
ten weeks, according to DOE guidelines. However, we found it
more logical for assessment to take place after major units
of study. This schedule is still in compliance with DOE expectations,
while more in line with our own program and learner needs. The
length of time between evaluations can vary according to how
long it takes to cover a theme for a particular group of students.
A. Student Self-Evaluation Form
We plan to add a section to the form for students to assess
their attendance and participation.
B. Teacher Assessment Form
We have deleted the evaluation of attitude and effort as they
are already reflected in attendance and participation. Also,
trying to evaluate these factors separately can be very subjective.
At times there can be personality clashes between a student
and teacher. When it is time to evaluate a student's attitude
and effort, these negatives may unconsciously influence a teacher's
perception of these factors. A student's attendance can be quantified.
If there are no factors influencing attendance, good attendance
reflects a good attitude toward learning. Observation of a student's
active participation in class activities is evidence of an effort
to make progress.
Writing this article gave us the opportunity to reflect on our
ongoing assessment, identify its strengths and weaknesses, and decide
what changes should be made to improve our assessment process. Overall,
we are pleased with the ongoing assessment tools we use. They have
served our needs well since we began designing them in 1991. With
the changes we are making in our ongoing assessment tools, we are
confident that they will continue to serve the needs of our program
into the future.
Originally published in Adventures in Assessment,
Volume 8 (Winter 1995),
SABES/World Education, Boston, MA, Copyright 1997.
Funding support for the publication of this document
on the Web provided in part by the Ohio State Literacy Resource
Center as part of the LINCS
Assessment Special Collection.