Making Sense of REEP
I was prepared to be annoyed. That's right: just what I needed -- another acronym to add to my list! Another mandate to add to my list! A beautiful summer day lost to training. Just what I needed. Boy, was I prepared to be annoyed.
Imagine my dismay. I like REEP. I have embraced REEP. REEP is making
my program better. Imagine!
When we first learned that new assessment policies were coming, we were
in just our second year of funding. As a new program, we struggled with
finding "reliable" and "valid" assessment tools. We
had great intake tools, and we were meeting all of DOE's reporting requirements,
but we sensed we could do better.
During this time, we received a curriculum grant which we used to develop
thematic units. We wanted to move away from standardized testing towards
performance-based assessments. Our staff, experienced with student portfolio
work, believed that performance-based assessments would allow us to more
accurately assess student levels, enhance instructor feedback, and better
track student progress. As we proceeded, we remained sensitive to concerns about objectivity, consistency over time and across different instructors, and the amount of staff time required to evaluate work and prepare feedback.
We held many staff meetings, attended workshops, and conducted research.
The task of assembling a comprehensive list of competencies was not nearly
as daunting as the challenge of somehow converting or "crosswalking"
it to an SPL (Student Performance Level) number that was "usable"
for DOE reporting purposes. It has always been my management style to
make data-driven decisions. I prefer to plan for program improvement based
on what I "know I know" instead of what I "think I know,"
and what I knew for sure was that there had to be a better way.
Thank goodness for the work that ACLS, SABES, and the PAWG (Performance
Accountability Work Group) did to facilitate our understanding of this
necessary response to NRS and federal mandates. Over the course of the
previous year, the workshops on Standardized Assessments, the PAWG updates,
NRS Requirements, and Using Data for Program Improvement were immeasurably
helpful in preparing for what lay ahead -- and more importantly -- slowly
but surely bringing my staff up to speed. Thankfully, we were able to skip right over the "reactionary mode" phase and move directly into "implementation mode." We felt
adequately prepared and informed, and we fully understood this was a policy
that was going to have to be followed. So, we decided to make the best
of it. (Yes, my staff is beginning to be known as the "lemonade" crew.)
Ultimately, we have found REEP to be efficient, minimally intrusive to
instructional time, and more reliable and consistent than other tools
we had used. In fact, the information we are able to glean from student
papers is well worth the time required to administer and score the tests.
It frees us from trying to convert performance-based assessments to an
SPL number, while providing detailed feedback on students' strengths and weaknesses.
We have begun placing students in "writing workshops" according
to their REEP scores. This enables instructors to focus on more targeted
instruction and also helps solve the "leveling" problem for
students whose oral skills are significantly higher or lower than their
reading/writing abilities. Most students have responded with enthusiasm,
and feel like it's a "special" opportunity to be in these writing workshops.
Here is what we have learned from our first two rounds of REEP testing:
- We were somewhat surprised at the amount of commonality across papers.
We are able to identify several areas of common writing errors, which is helpful for classroom instruction.
- It seems obvious, but it bears repeating that some of our students
are very gifted writers -- a talent which has little to do with their
ability to communicate in English. Even some of our most beginning writers
are able to show "voice" and tremendous "spirit"
in their writing.
- We value the level of scoring objectivity that comes from not having
previously worked with the students. For example, scorers were not able
to understand one student's writing about surgery
on her hand. After the test was scored, we shared it with the
instructor who was immediately able to derive the meaning because she
knew the student had just had hand surgery. We feel the "disconnect"
between the scorers and the students provides more reliable information.
- We have learned to be very specific in explaining to students how
their work is scored and kept. In our post-September 11th world, many
of our students were cautious about replying to the prompt, "Write
one bad thing about your life here." Students were afraid that
we would "turn their papers over to the government." It's
easy to understand why they would be reluctant to write anything bad
about America. While I initially supported the idea of using "regional
scorers," I now feel we need to be sensitive to these very valid
concerns. How might writing be impacted if students felt that "outsiders"
would have access to their papers?
- It really is important to "recalibrate" each time we administer
the assessment. We find the anchor papers especially helpful in this regard.
Some of our concerns/questions about the REEP include:
- We wish the rubric were more specific, with greater detail for each level.
- Overall, we have found the REEP to be an accurate reflection of student
abilities and levels; however, there is the "on any given day"
aspect that applies to any standardized test. We were recently surprised
to see one of our more advanced students score significantly lower on
his second REEP test than on his first, only to later learn that there
were extenuating circumstances. There is nothing we can do about this:
we can't retest and the score must stand. We can only hope that his
third assessment will be more reflective of his true abilities.
- The use of REEP as the official assessment in higher ESOL levels
has given some students the impression that we -- as a system -- value
writing more than oral communication. This is the exact opposite of
what the majority of our higher level ESOL students want, which is to
concentrate on pronunciation and speech.
- At our REEP training, we were told that a typical learner gain is 0.4 of a level each year. If this is accurate, the average student would need roughly two and a half years to advance a full level, and it may be much more difficult to show learner gain based on federal SPL levels.
- In fact, after just two administrations of the REEP, we are somewhat
surprised by the relatively small gains that students show, especially
in the higher levels. It seems much harder to progress from a REEP level
5 to a level 6 than from a REEP level 1 to a level 2. Our Site Coordinator,
an ESOL specialist, estimates that the year-end assessments will show
greater gain. Students learning new material will require more practice
before becoming confident enough to take new risks in their writing.
I'm told that a little more time and a little more practice will work wonders.
Here is what we plan to do next:
- We are keenly aware of pending fiscal constraints, but how we would
love another curriculum grant! While some have voiced concerns about
the time required to implement the new assessments, we feel more acutely
the lack of planning time to develop a strategy in response to what
the assessments tell us.
- We would like to analyze and review the competencies/benchmarks established
within our curriculum units and align them more closely with the REEP
rubric levels. Since we are electing to work with students in writing groups based on their REEP levels, this alignment will help instructors target instruction
based on student portfolio and assessment writing. Of equal value, it
will also provide students with clear information on their progress,
including what they still need to master in order to move to the next
level. We had begun work on these checklists, but found it difficult
to convert them into SPLs. With the new assessment policies, we are
free to develop them in a way that meets our instructional and student
learning needs without concern for how they could be used for reporting purposes.
When asked to write about our experiences with the REEP Writing Assessment,
I was quick to reply that on most days, we feel like we have more questions
than answers, and that we are an ever-evolving work in progress. I am
thankful for a dedicated staff that embraces new ideas, and always looks
to how they can better serve our students. Our Community Partnership remains
steadfast in their support and encouragement in response to our changing
needs. I am grateful for the help of my peers and never cease to be amazed
by the level of support among ABE
practitioners. Last, but not least, we are always open to new ideas. We
are eager to hear from other programs, and hope you will share your experiences with us.
Luanne Teller is the Director of the Stoughton ABE Program at Massasoit
Community College. See the overview of How This Program is Handling BEST/REEP Requirements.
Originally published in Adventures in Assessment, Volume 15 (Spring 2003), SABES/World Education, Boston, MA. Copyright 2003. Funding support for the publication of this document on the Web provided in part by the Ohio State Literacy Resource Center as part of the LINCS Assessment Special Collection.