Reflections on Meeting the Challenge of Assessment with Beginning
Wake Technical Community College
I currently teach two classes. In
the morning, I teach a Beginning class.
There are two men from Congo, one from Angola, three from Mexico,
three from Vietnam, one of whom is an ethnic Montagnard, and one
from Taiwan. There are also three women: one Palestinian, one from
Somalia, and most recently, one from Mexico. The students range
in age from 21 to 75.
The Palestinian woman and one of the Mexican men never had formal
classroom training. One of the men from Congo was institutionalized
for a substantial portion of his life and also had no classroom
experience. The man from Angola was formerly a mathematics instructor
in his country. The Palestinian, Mexican, Congolese, and Taiwanese
had no prior formal experience with writing using the English alphabet.
In the afternoon, I teach an Intermediate class, which also has
an educationally varied population. At various points during the
semester, I have had two Bulgarian men, two Indian men, one Indian
woman, a man from Congo, one from Togo, a Korean woman, two Brazilian
women, two Mexican men, three men from the Dominican Republic, and
most recently, a woman from France. They range in age from 19 to
In the course of teaching ESOL formally over the past four years,
I have realized that our program lacks repetitive standardized testing
that provides teachers with hard evidence to support progress recommendations
for advancement. Having come from a background in elementary schools, with their set-in-stone curricula and "teaching to the test" to pass the math or reading exam coming up in spring, the idea of nothing being set in stone was new to me. It did not occur to me that any other kind of assessment
could be valid. With this in mind, I approached assessment with
great hesitation. Several questions came to mind. How could I find
where students really were in terms of their ability? How could
I find what progress they had made? How and where would I be able
to document that progress? Should they be privy to the information?
I was entering a non-traditional setting from a traditional one. At the community college and in the program where I work, the
focus is on helping students with life skills. In the simplest sense,
that meant not only was it our job to teach students the mechanics of the language but, more importantly, to expose them in as broad a
sense as possible to real-life situations and the language involved
in functioning in those situations.
Our college has three assessment tools. The BEST is used upon entry
into the program to help those interviewing the new students approximate
in what level class those students belong. The next is an interview
form the students are required to answer verbally. The person who
is registering the student asks his/her name, address, other personal
information, and why the student has come to the United States.
During the course of this interview, the individual conducting the
interview decides in which level the student might be successful.
The last is a series of areas on the back of a student information
card where the teachers are required at the beginning and the end
of the session to make blanket statements about students' goals
and progress. Often, decisions regarding the readiness of our students
to go on to the next level are based on these notes.
The BEST works very well for those students who come to our program with a minimal level of proficiency in English, but pre-beginning and beginning students often cannot take the test at all, even though that inability does not reflect their capacity to learn or other areas of acuity.
The student information cards have very general categories and no
guidelines for getting information regarding students' specific needs.
Initially, my project began with the question: what did my students
already know? The BEST test gives a teacher limited information
at best. How could I record the amount of English my students already
possessed? How could I help them articulate what they wanted from
my class? It seemed an overwhelming task, but I was willing to try.
Because my Intermediate students were more capable of expressing
their needs, I started by asking them open-ended questions about
what they thought they needed or what they wanted from me and the
class. I attempted to give them opportunities to describe anything
in their daily interactions in English that caused specific difficulties
for them. I suggested that they make a mental or written note of
these issues and bring them to class to share so that everyone could
benefit from them.
Yet from the beginning of the project, I had a sinking feeling
that there was something I was doing of which I wasn't entirely
aware. This led me to do some reading about what could be considered
assessment. In the midst of this process, my focus changed from trying to find out students' goals to determining which kind of assessment would allow me deeper insight into how the process can be effectively
carried out. In the project, I decided instead to determine which
tools are the most feasible for assessing and making the case for
what worked best in my personal teaching situation.
The importance of this project has two parts. First of all, I hope
that by looking at what can be done to get at students' goals,
we will eventually change the system we currently use to record
those goals and have them articulated by students themselves. Second,
it is essential that we be able to better address the specific problems
that are inherent in charting the progress of and meeting the specific
needs of our entry-level students with practical assessment tools.
Putting Out Feelers
Since I came with a limited view of any other kind of assessment
and no experience in how to carry it out, I took to reading about
different types. I discovered with great interest that there are
several kinds of assessments included among those classified as
alternative. Alternative assessment "is based on daily classroom activities and includes a variety of instruments that can be adapted to varying situations" (Huerta-Macias, 1995). This made sense to me. So again, the question of how to do
this with beginners came up. I am trying to look at where they are
when they put their feet in the door. What indicators should I be looking for?
I also set out to gather information on indicators of the goals of emergent speakers and writers. Was there a survey that one could
administer to pre-beginning or pre-literate students? Many of the
tasks I read about were for students who could already complete
coherent sentences. Among the methods that I read about were: student
portfolios, learner logs, cloze activities, learner grids, learner
profiles, teacher observations, and formal surveys. While reading
about these tools, I realized that many teachers in our program
had used the most limited and probably the least productive of these
methods, or we had not validated the methods that were being used.
I discovered that "the students' same day-to-day activities (e.g. writing, role-playing, group discussion) are the basis for alternative assessment" (Huerta-Macias, 1995).
I had been working under the assumption that any kind of assessment
would have to be formal, in some kind of questionnaire form, and
that I would have to have masses of those questionnaires repeated
over time for them to be valid. The more I read, however, the more
I discovered that many other teachers and I had been charting students' progress through personal observation. In one piece that discussed using reader logs as assessment tools, the writer admitted that lower-level learners had difficulty responding.
Very often the statement "some things I like" was answered "I like everything" (Gear, 1993). It is this kind of generality in student surveys with open-ended questions that makes
assessing beginning students particularly frustrating. This issue
is complicated if the students cannot understand the scope of what
is being asked of them, and there is no translation help in the class because no one else speaks the person's language.
The same article mentions another really helpful assessment tool:
an individual exit interview with the instructor (Gear, 1993). In
our program, students complete an evaluation of the teacher at the
end of each 15-17 week session. The evaluation includes questions
about the teacher's preparedness, how well she has helped the students reach their goals, and her demeanor. The language on the survey is problematic for beginning students if they don't
have an interpreter. It has an area for comments but the survey
is more for information than a measure of the students' understanding
of what they learned. Complicating the issue of assessment on all
levels is our attendance policy, which is open entry, allowing for
much elasticity in a student's attendance - perhaps too much
to get an accurate measure of where our students started and where
they end up at the end of a session.
Embedded in any assessment activity should be activities designed to help students become self-aware. One premise of assessment should be that self-assessment is highly important for successful language learning (McNamara & Deane, 1995). By working
with what a student thinks he wants to learn, we can help students "establish realistic expectations about what language skills they need to achieve their goals" (McNamara & Deane, 1995). The most important
piece is helping students establish their expectations. Assessment
is needed to help them develop internal indicators for where they
stand in their ability. It also makes it imperative to include them
in the assessment, not as guinea pigs, but as active participants
in its shaping. The challenge at the beginning level is that it may take students more time to learn the language needed to say whether they like the style of questioning than there is time to complete the questions themselves. A way around this is
performance-based assessments that ask learners to complete tasks or take part in simulations (Spruck Wrigley, 1992). Any kind of
role playing or oral testing as well as writing of basic information
would be assessment activities with documentable results.
Trial ... and Error ... and Trial
With this information in hand, I decided to look at the two types
of assessment I used most in my practice: informal student-teacher
discussions on what students felt was important, and open-ended
questions. Since I began my project with my intermediate students,
I would like to first share what questions were discussed with them.
In the first three weeks of my intermediate class, I posed several
questions to the students to have them consider what they wanted
to be able to do with the language. In the past, I had literally
asked students what they wanted to learn and they would give me
some very general answer like "grammar" or "pronunciation."
I believe a language can't be learned just by knowing the mechanics.
One must understand the context in which language is used. In my
intermediate classes I constantly attempt to get students to understand
that they know many of the mechanics needed to speak already. What is needed are the contexts in which to use the language, and fine-tuning of those mechanics.
On the board, I wrote the following questions:
1) I wish I could___________in English.
2) If I knew how to ________in English it would be great.
3) What I want to know how to say is_________.
I expected answers that were very broad and non-specific. In this
particular exercise, I didn't ask students to write anything down, but to discuss their answers. Here are some of the things they said:
1) I wish I could _______ in English.
sing- know songs
solve ex-wife problems
tell romantic stories
talk about sports
talk about what I do in my country
understand TV show
describe what I'm thinking
tell jokes
say the names of spices
make a woman fall in love with me
2) If I knew how to __________ in English it would be great.
get involved in English conversations with Americans
talk about philosophy
3) What I want to know how to say is__________.
the way I feel in class
what to say for a job interview
words to make happy
I was really surprised at the things that students expressed an
interest in. Not one student mentioned grammar. They all talked
about how they could fit into the scheme of things in their lives
here in the U.S. Perhaps answers of this nature are based on difficult
experiences students had when first trying to adjust to their new
lives. It may be that the more opportunities for social interaction students have, the more they see the need for practical uses of the language.
Every week I also asked them questions about what they had learned,
what they liked and didn't like. I didn't find this line
of questioning to be as successful.
I would post the following questions on the board at the end of each week:
1) This week I learned about____________.
2) I liked __________. It helped me a lot.
3) I didn't like __________. It didn't help me.
Eliciting answers from these questions was considerably more difficult.
I'm not sure if it was because I was immediately throwing students
into a self-assessment role or if their expectations were still
unclear to me.
1) This week I learned about____________.
who is everybody in class
voice register
pronouns
2) I liked __________. It helped me a lot
learning the names
asking questions about American culture
3) I didn't like ________. It didn't help.
too much talking
At the end of the third week, attendance went from 18 students in
my intermediate class to three. I thought perhaps I had overwhelmed
them with the line of questioning I had taken. Normally, part of
the introductory process in my class includes ice-breaking activities.
I insist that students get to know one another so they are not strangers.
It takes about that three-week period for them to feel comfortable
with each other and with the notion of becoming self-aware. It also
takes me about that amount of time to get an idea about which direction
to take my class.
I didn't realize at the time that I was constantly making
mental notes about everything involving my students: who was very
talkative, who wasn't, and who seemed willing to engage in
class activities which might seem unusual to them. I noted who thought
my speech was too fast, who acknowledged it, and who felt constrained because I had a special role or place as the teacher. I
paid close attention to who seemed willing to talk about their personal
lives and who was extremely reticent or evasive. I watched for signs
of boredom, involvement, and lack of comprehension through facial
expressions and body language. I paid attention to who constantly
felt the need to speak in their own language if they came with a
companion from their country and who seemed reluctant to ask questions.
This is a process I have engaged in since I began teaching ESL,
but because it never involved written tests of any kind, I didn't even consider that I was doing assessment. To my misfortune, I didn't
record my notes because I considered them for my personal use only
and not very effective in helping anyone other than me. I learned,
however, that this information, even though not officially recorded,
was viable to use for assessment, and in the future, worth mapping
out. I realized also that, based on my personal observation of students,
I was making decisions about what they were interested in studying
and how capable or willing they were to advance in dealing with
one American. In the face of working with only three or four students, I didn't think I had enough to make a real case for this kind of assessment.
I then thought to modify the open-ended question format for my
beginning class, which had much more consistent
attendance. I wrote on the board:
I don't like________.
In English I can________.
I don't want________.
I explained "like" with a thumbs-up gesture and synonyms such as "good" and "okay," and I explained "don't like" with a thumbs-down sign, synonyms such as "bad," and by shaking my head in the negative. To help them understand "can," I gave
them examples of what they could do, such as speak in their native
languages, and cook or work, depending on what the students seemed
to express as their interests. It was an activity in itself to get
the students to talk about themselves.
Their answers to these open-ended questions were varied. One of
the Mexican men wrote:
I like weldin, money, dance worken (working).
I don't like cold, rainy.
In English I can yes, okay, I like.
In English I cannot pronunciation.
The Somalian woman wrote:
I like my eyes.
I like mosque.
I wan to go school
I don't want club
I don't want cinema
The Montagnard man wrote:
I like corn.
I like put a pant.
I don't like eat mushrooms
In English I cannot say understand
I want-say I love you
I don't want eat chicken
The Taiwanese man wrote:
I like woman-girl-mother
I don't like winter-paint-shovel
I can cut cook clear trim
I told them they could use picture dictionaries to carry out this
exercise. During its course, I observed them. Who took the initiative
to tell me they were stuck or unclear about the exercise, and who
just waited for the end of the exercise to see the results of everyone
else? Who wanted me to look immediately at their work, and who had
reservations? It seemed that these types of activities, while important in themselves, were also additional ways to determine who seemed motivated and unafraid, and who needed motivation. I was
frustrated with the generalities of the answers and looked for another
way to get at exactly what the students thought they were supposed
to be getting from English class.
Trying to help students feel comfortable enough to express their
needs, I presented myself not as the authority of the classroom,
but only as the authority in the language. I led them in modeled
activities, which included talking about themselves and their everyday
experiences, by sharing my life with them. I shared what I thought
was good and bad and encouraged them to do the same. I was not sure
that the example I was trying to set was having an impact until
something amazing happened. One of the Mexican men who had recently
joined the class said one day that he wanted to learn how to get
food at McDonald's. He didn't exactly use that terminology. It was more like "Eat McDonald's how?"
From his request came a two-week lesson on ordering food from fast
food restaurants, which included vocabulary, speaking and listening,
and writing practice. Even the most disengaged students became more
active in their participation. After this incident my beginning
students started to ask simple questions about how to spell certain
words. Even the Taiwanese man lost his hesitation and would stop
the lessons in the middle of a word or sentence to clarify a letter
and the sound that went with it. One of the other Mexican men took
the initiative to be the server and waited on the whole class.
I repeatedly teach words used to ask questions in my beginning
class in an effort to get them to understand what people are asking
them. It is also to help them understand how to ask questions themselves.
Part of my observation at the close of the session has to do with
who seems to have overcome fear of mistakes and who has learned
to encourage his fellow classmates. It may not seem that this is
related to assessment. Yet as part of teacher observation, forward
movement of this kind is crucial in determining how far students
have come in their overall communication skills. It suggests a comfort with the language that cannot be quantified on any kind of standardized test.
Closing Statements: A Work In Progress
While recording these reflections, I have come to realize that
the most comprehensive form of assessment that I have as an instructor
is my personal observations. It is, however, lacking in some components.
In one of the articles I perused in my research, one practitioner
observed that she had not asked the learners to help develop this
list of strategies with her. Therefore she risked asking them to
measure themselves against objectives which may not be realistic
goals for them or which simply may not be their goals (Barry, 1993).
My observations as a teacher are not so much an accurate reflection
of students' goals, but rather of the standards by which I
determine where students are in their ability to handle the language.
It would be effective in the future to record these questions on a graph or chart and present them to the class as the yardstick that I use to figure out if they are ready to go on to the next class.
In this way, if they disagreed with my line of reasoning, they could
become active participants in revising the questions. This would
work particularly well to integrate students' self-articulated
goals. In the case of upper level learners, this approach would
make the most sense. With beginning learners, however, it seems
that it would be more effective for the teacher to make a chart
with pictures that depicted students' progress in functional
areas, such as writing the entire alphabet, or memorizing their
address and phone number correctly, or learning to ask for food
or directions in a grocery store.
No matter what form teacher observations take, teachers in any
program should collectively note what they observe and pool the
information to create a document or documents that could be circulated
throughout the levels, along with the currently used form as a guideline
for documenting ongoing assessment. This last step would serve to
deflect any criticism that observation is not a documentable means
of assessment. Since viable documentation should be part of the
goal of assessment, it makes sense to have a form that consists
of teacher and student observations. The current information in
use is much too broad to reflect what specific progress has been made.
Something else to consider is that observations should be continuous,
not necessarily at measurable intervals. By recording segments of
conversations with students over time, a teacher can gather a true
reflection of their views of themselves as learners (Barry, 1993).
It isn't necessary that a log of these conversations be formally recorded events, only that they be annotated with the date on which they take place. It could be more a jotting down of impromptu blurbs.
I come away from this project with a clearer knowledge of how to
enable my students to be aware of their desires regarding English,
other than just knowing the structures of the language. I also gained
insight into how to record my students' progress using tools
which I already possess and use on an informal but ongoing basis.
I have acquired a vision of how our current assessment tools can
be adapted and/or modified to help teachers better capture their
students' progress. It would also be helpful for teachers to take a bigger stake in their students' progress by conducting interviews with them on an individual basis at the end of the session to see where each student stands, and to record the outcome of each interview.
My hope is that, in spite of any obstacles that are inherently
a part of programs such as this, other ESOL instructors will see
that it is possible to determine the needs of the moment from their
students, no matter what their level.
References
Huerta-Macias, Ana. "Alternative Assessment: Responses to Commonly Asked Questions." TESOL Journal, Vol. 5, No. 1 (1995).
McNamara, Martha & Deane, Debra. "Self-Assessment Activities: Toward Autonomy in Language Learning." TESOL Journal, Vol. 5, No. 1 (1995).
Barry, Eileen & F., Pat. "Reflections on On-going Assessment: Documenting Self-Esteem and More." Adventures in Assessment, Vol. 5 (1993).
Spruck Wrigley, Heide. "Learner Assessment in Adult ESL Literacy." ERIC Q&A, National Clearinghouse on Literacy Education (1992).
Gear, Caroline. "The Learner's Log: Evolution of an Assessment Tool." Adventures in Assessment, Vol. 5 (1993).
Reprinted from Building Together: The Inquiry Writings,
The North Carolina Adult ESOL Curriculum Frameworks Inquiry Project,
Literacy South, 1998, (919) 682-8108
Originally published in Adventures in Assessment,
Volume 11 (Winter 1998),
SABES/World Education, Boston, MA, Copyright 1998.
Funding support for the publication of this document
on the Web provided in part by the Ohio State Literacy Resource
Center as part of the LINCS
Assessment Special Collection.