Oral ESL Test Anxiety with Emirati Secondary School Students


Diploma Thesis, 2011

105 Pages, Grade: none


Excerpt


CONTENTS

ABSTRACT

LIST OF FIGURES

LIST OF TABLES

ACKNOWLEDGMENTS

DEDICATION

1. OVERVIEW OF THE STUDY
Statement of the Problem
Roles of the Researcher
Review of Chapters and Appendices

2. REVIEW OF THE LITERATURE
What Is an Oral Proficiency Interview
Common Formats of an OPI
Validity and Reliability in Oral Proficiency Interviews
Construct Validity
Construct-irrelevant Variance
Predictive Validity
Content Validity
The Link between Reliability and Validity
Reliability
Factors Affecting Reliability
Test Administration Reliability
Rater Reliability
Test Reliability
Context of Test Questions
Student-Related Aspects of Reliability
Gender Differences and Interlocutor Effect
Language Learning Anxiety
Types of Anxiety Related to Language Learning
Communication Apprehension
Test Anxiety
Fear of Negative Evaluation
Factors Affecting Anxiety
Student Beliefs
Learner Self-esteem
Instructor Beliefs
Environmental Factors and Anxiety
Gender Issues
The Effects of Language Anxiety
Physical Manifestations of Anxiety
Cognitive Difficulties
Input
Processing
Output
Schema Theory and Oral Response
Social Effect of Being Evaluated
Washback
Washback and the Effect on Teachers
Washback and the Impact on Education and Society
Achieving Positive Washback on Tests
Learning and Test-Taking Strategies
Anxiety-Reducing Strategies
IGCSE ESL Oral Proficiency Interview Exam
Advantages and Disadvantages of the Edexcel IGCSE ESL OPI
Marking the IGCSE ESL OPI

3. METHODOLOGY
Overview of the Study
The Participants
The Setting
The Edexcel IGCSE ESL Course
The IGCSE Oral Interview
Data Collection
The OPI Observation
The Semi-structured Follow-up Individual Interview
The Questionnaire
The Semi-structured Focus Group Discussion

4. DATA ANALYSIS AND FINDINGS
Results
Observed Manifestations of Anxiety in the OPI
(Research Question One)
Part One of the OPI
Part Two of the OPI
Part Three of the OPI
Reported Causes of Anxiety
Causes of Oral Anxiety in Class
Oral Anxiety in the OPI
Students' Suggestions
Gender and Anxiety
Reported Test-Taking Strategies
Observed Test-Taking Strategies
Strategies Reported Being Used Prior to the OPI
Anxiety-Reducing Strategies Reported Used During the OPI
Miscellaneous Responses

5. CONCLUSIONS AND IMPLICATIONS
Summary of Findings
Implications of the Study
Implications for Teachers
Implications for Schools
Implications for Edexcel
Limitations of This Study and Directions for Further Research
Final Thought

REFERENCE LIST

Appendix

A. EDEXCEL ORAL INTERVIEW TEST

B. EDEXCEL SPEAKING MARKING GUIDE (PUBLIC VERSION)

C. INITIAL OBSERVATION SHEET: TEST-TAKING SYMPTOMS AND PREPARATION

D. SEMI-STRUCTURED FOLLOW-UP INTERVIEWS

E. QUESTIONNAIRE

F. SEMI-STRUCTURED FOCUS GROUP DISCUSSION (STUDENT VERSION)

G. SEMI-STRUCTURED FOCUS GROUP DISCUSSION (TEACHER VERSION)

FIGURE

1. Recursive relationships among anxiety, cognition and behaviour

TABLES

1. Background Data about the Participants

2. Number of Students Manifesting Observable Signs of Test Anxiety

3. The Amounts and Types of Physical Signs of Anxiety in Part One of the OPI

4. The Amounts and Types of Physical Signs of Anxiety in Part Two of the OPI

5. The Amounts and Types of Physical Signs of Anxiety in Part Three of the OPI

6. A Summary of the Reported Causes of Anxiety

7. The Causes of Anxiety in English Classes and Oral Exam Assessments

8. Number of Students' Test-taking Strategies Used in Part Two of the OPI

9. A Summary of Anxiety-reducing Strategies

10. Feelings about Learning and Using English in the Classroom

ABSTRACT

In the UAE and globally, high-stakes testing is prevalent in second language learning. One important and integral part of high-stakes English language tests is the oral proficiency interview, which can be a pre-requisite to gaining admission to an English-speaking university.

The purpose of this study is to examine Emirati secondary school boys' and girls' experiences with and perceptions of anxiety in the classroom and in oral assessments and tests. Furthermore, this study focuses on the different types of anxiety experienced in class and during an Edexcel IGCSE (International General Certificate of Secondary Education) ESL (English as a Second Language) OPI (oral proficiency interview). In addition, this study, conducted in the UAE, examined the strategies students used to prepare for oral tests and whether the test-taking strategies students used in oral proficiency interview exams helped them manage their anxiety. The study also documented physical signs of test anxiety and differences in anxiety between secondary school boys and girls during an Edexcel IGCSE ESL OPI test.

The volunteer participants were 25 Emirati students, aged 15-17, from two IGCSE ESL classes that I do not teach. This study included a videotaped mock IGCSE oral proficiency interview, audiotaped semi-structured individual interviews, questionnaires, and an audio-recorded semi-structured focus group discussion. The findings suggested that language and test anxiety are multi-faceted and can affect boys and girls in a number of different ways and at different times, both during class activities and in an OPI. In addition, all participants showed different physical signs of test anxiety during the first two stages of the OPI, and these physical signs were considerably less frequent in the final part of the OPI. The causes and types of anxiety reported by the students included language learning difficulties, problems trying to retrieve appropriate English vocabulary, code-switching from Arabic to English and vice versa, differences in the social status of the teachers/language instructors, and unfamiliarity with the interlocutor.

The pedagogical implications of these findings for how teachers, schools and examination boards understand anxiety and oral test anxiety in second language students are discussed, as are suggestions for future research. Furthermore, considering the important role of teachers in second language pedagogy and the use of English as the main language of instruction, this study also offers suggestions for lessening anxiety in oral class activities and oral assessments, and presents test-taking strategies.

ACKNOWLEDGMENTS

I must first of all express my sincere gratitude to Dr. Betty for her help, patience, and support. Her feedback, suggestions, and ideas were extremely useful to me. I am also thankful to my helpful and caring committee members, Dr. Cindy Gunn and Dr. David Prescott, whose comments were invaluable. I also thank all the MA TESOL professors for being inspiring and supportive throughout my MA TESOL journey. Thanks also go to all the students who provided me with the opportunity to conduct this research. Finally, I have to thank my wife and son, without whose understanding, support and never-ending encouragement I could have done nothing.

DEDICATION

Words cannot express the gratitude I owe to my wife, Ana, and my son, Afonso, for their support. Thank you for everything.

CHAPTER 1 OVERVIEW OF THE STUDY

One common, traditional and globally recognized means of examining speaking skills and competency used in the UAE is an oral proficiency interview (OPI) exam. This OPI exam can be taken during a language course (formative) or at the end of a course (summative). Typically, the OPI exam is recorded and conducted between one interlocutor/examiner and one test taker. Furthermore, the OPI is either rated by a school examiner/interlocutor or, as is the case at my school, sent to external raters to be marked. Currently, my school uses Edexcel, an internationally recognized examination board in the United Kingdom, which reports grades to the school approximately three months after the exam.

The Edexcel IGCSE ESL (International General Certificate of Secondary Education English as a Second Language) exam (see Appendix A) is compulsory for all English-as-a-second-language students at the school where I currently work. ESL students make up the vast majority of all the students at the school. Furthermore, these ESL learners are all first-language Arabic speakers. Although most of the students have been at the school learning through English as the main language of instruction and studying English from around the age of four, I have often witnessed students manifesting forms of ESL anxiety in the classroom (hands shaking and hesitating while trying to pronounce words) and in test-taking situations.

The IGCSE ESL oral interview is part of a high-stakes exam for students at my school because each year the Edexcel IGCSE ESL course is graded by this single summative exam. The IGCSE ESL exam is also timetabled as the first of all the subject exams that second language students at the school take each year.

Another important reason that the IGCSE ESL exam is high stakes for students at my school is the school's policy. If an IGCSE ESL student gets a C grade (a pass), he/she has to re-sit the one-year Edexcel IGCSE course the following year. The reason for this policy is that it is assumed the student will be motivated to improve and get a better grade the next time. However, for students, having to do another year using similar materials, being placed in a group below their own year, and being labelled as a "re-sit" can translate into a de-motivating factor for learning English.

Another demotivating factor stemming from students' test experience is that the exam papers are sent to the examination board in the United Kingdom to be marked and only single grades are sent back, ranging from A+ ("excellent skill") to G ("little to no skill") (Edexcel, 2010, p. 3). These grades provide no feedback to individual test takers, nor do they give teachers any indication of how the test taker was graded in each part of the exam. In terms of washback, which refers to the effect tests have on learning and teaching (see Cheng & Curtis, 2006), such negative washback may add to uncertainty and anxiety in students' exam preparation.

Statement of the Problem

Addressing test anxiety, therefore, becomes an integral part of teaching, as teachers need to prepare students for tests and are subsequently partially responsible for helping students deal with oral test anxiety. Thus factors which may cause students to fail oral proficiency interviews need to be examined. One oral exam used throughout the world, including in the UAE, is the high-stakes IELTS (International English Language Testing System) oral interview test. In 2008 and 2009, the UAE was ranked among the lowest in the world in terms of test taker performance in the IELTS oral speaking interview. This test is a pre-requisite for students to gain the grades needed for university admission (IELTS, 2009).

In addition, as an IELTS examiner in the UAE, I often see not only the negative effects of student-related factors of reliability, in terms of test anxiety and lack of test-taking strategies during oral interview exams, but also the role high-stakes IELTS exams play in allowing or not allowing students to study in English-speaking universities in the UAE or overseas. Moreover, these high-stakes tests are being used as a vital part of obtaining immigration status in English-speaking countries. Two examples of high-stakes language tests used for immigration purposes are the Adult Migrant English Program (2011) used in Australia and IELTS. According to IELTS (2011), a number of international government agencies require IELTS band scores, including those of Australia, New Zealand, Canada, and the United Kingdom. Grades from this global high-stakes English language OPI test are becoming recognised as valuable currency, presenting opportunities to study, work, and live in English-speaking countries. One reason students fail these exams may be student-related factors affecting their reliability as test takers, such as not having effective test-taking strategies.

Student-related issues of reliability could also include anxiety towards the exam itself, which may affect performance. The Edexcel IGCSE ESL OPI can be compared to IELTS as it has the same exam structure and design (three parts), with similar time frames and tasks (individual questions, presenting an unfamiliar topic, and discussion).

Oral test anxiety can also stem from a number of other factors. Firstly, some ESL teachers do not fully understand how oral anxiety affects students in oral assessments or oral tests. Such ESL teachers, e.g., English teachers who have no training in teaching second language learners, may not understand how this type of anxiety can be an obstacle to learning another language or another subject taught in a second language. In the context where I work, for example, 90 percent of the teachers are native English speakers whose previous experience is in teaching first language students from their own country (School Handbook, 2010). Moreover, these teachers often have a range of different accents (South African, New Zealand, Australian, and varied English accents), which some students find confusing to understand. Furthermore, these teachers have little or no experience of teaching Arabic ESL learners before they arrive in the UAE from their home countries. In addition, as the school adopts a British curriculum, the school's recruitment policy works directly through agencies in English-speaking countries. This policy means that qualified English-speaking teachers recruited in English-speaking countries are considered highly valuable, whether or not they have any experience teaching in an Arabic context. Also, the majority of native English language teachers at my school do not speak Arabic or any second language. Thus some teachers are unaware of the challenges facing ESL students studying in English as the main language of instruction.

Another factor which contributes to anxiety is the experience of oral assessments and tests themselves. Oral test anxiety can be manifested before, during, and after an oral interview and can affect learning and using a second language in test-taking situations. Anxiety is experienced at different stages and dealt with in different ways by different students. Anxiety can also become an obstacle to retrieving information in the stages leading to oral responses. Test anxiety in oral interviews can also be observed through physical manifestations, such as sweating or pencil tapping, and in the strategies (or lack of strategies) students use for their oral preparation. For example, in Part Two of the IGCSE ESL OPI, students are presented with a task prompt and allowed one minute to prepare and take notes. These notes are a way of processing information and activating schemata to prepare for their oral presentation. Observing the types of strategies students use to prepare their oral presentations, e.g., note taking, brainstorming, drawing, etc. (or no strategy at all), highlights whether their test-taking strategies are effective or ineffective at controlling anxiety when they present their oral responses.

To consider these factors of oral anxiety in oral interview exams for Emirati students at a secondary school in one educational zone in the UAE, my specific research questions are the following:

1) How is test anxiety manifested by Emirati secondary school male and female students in an Edexcel IGCSE ESL oral proficiency interview?

This question was addressed through a videotaped individual oral proficiency interview which followed the rubric of the IGCSE ESL oral proficiency interview exam. The video was subsequently reviewed for physical signs of test anxiety as noted in the literature, and a checklist and observation notes were then written up for data analysis.

2) What anxiety-reducing strategies are these students observed using during the oral proficiency interview?

This question was addressed through the videotaped observation in terms of what observable strategies were used, if any, before the exam and in all three sections of the OPI.

3) What do these students report as factors in their experiencing oral test anxiety?

This question was addressed directly after the OPI observation through the audiotaped semi-structured follow-up individual interview. All the students also completed the questionnaire and took part in the semi-structured focus group discussion, which took place right after the OPI.

4) What strategies do these students report using to control oral test anxiety?

5) What do these students report as factors causing them oral anxiety in class and strategies used to reduce oral anxiety in class?

These questions were addressed in the semi-structured follow-up individual interview, the questionnaire, and the semi-structured focus group discussion.

Roles of the Researcher

I had three roles in this research. Firstly, as an IGCSE ESL teacher it was my duty and responsibility to prepare students for the summative Edexcel IGCSE ESL exams at the end of the year, in particular the OPI component of the Edexcel IGCSE ESL exam. Secondly, I was appointed by the school to be the only IGCSE ESL Edexcel OPI interlocutor, which means that I had to administer the OPI, select exam topics, set up the exam equipment and interview individual students. I have five years of experience as an IELTS examiner and five years as an Edexcel IGCSE ESL interlocutor at my school. However, I have had no formal or on-going training as an Edexcel IGCSE interlocutor, and I am also the only teacher who is asked to be the interlocutor at the school. My third role was that of a researcher, observing manifestations of oral test anxiety, anxiety-reducing test-taking strategies, and factors affecting oral test anxiety.

Review of Chapters and Appendices

Chapter One has reviewed the context in which the study took place and the problems which may cause students to experience oral anxiety in class and in test-taking situations (the Edexcel IGCSE ESL OPI). Chapter Two, the literature review, defines what an OPI is and how this method of oral assessment is used in the Edexcel IGCSE ESL exam, examines the factors of reliability and validity that affect OPIs, and discusses how anxiety may affect language learners and test takers. This chapter also reviews test anxiety and the impact of washback on teaching, learning and students. Chapter Three discusses the methodology used in the study, including the design of the study, the participants, the setting, and the data gathering instruments used. Chapter Four then discusses the findings concerning the causes and types of anxiety observed and experienced by the participants, the strategies used to decrease test anxiety, and gender differences in test anxiety. Chapter Five, the final chapter, discusses the conclusions, implications, and limitations of the study. It also presents suggestions for future research.

CHAPTER 2 REVIEW OF THE LITERATURE

This chapter starts by defining OPIs (oral proficiency interviews) and their formats, and discussing the advantages and disadvantages of using an OPI in terms of validity and reliability. It then discusses the types of language anxiety students may experience and the factors which affect students' learning of, and being tested in, a second language. The last part of this chapter reviews the term washback, as well as learning strategies, test-taking strategies and anxiety-reducing strategies which may be used in class and in tests. The chapter ends with a review of the Edexcel IGCSE (International General Certificate of Secondary Education) ESL OPI, which was the focus of this research.

What Is an Oral Proficiency Interview?

An OPI, according to Brown (2003), is a type of oral assessment in which test takers sit "down in a direct face-to-face exchange and proceed through a protocol of questions and directives" (p. 167). The OPI has become a popular way to assess and test the oral skills of second language students. Underhill (1992) notes that "the interview is the most common of all test; for many people, it is the only kind of oral test" (p. 54). One reason why OPIs are popular ways of testing oral skills is that oral interviews may offer a realistic means of assessing students' oral language performance (Chalhoub-Deville, 1995).

An OPI may be used for a variety of different purposes, "including academic placement, professional credentialing, student assessment" (Fall, Adair-Hauck & Glisan, 2007, p. 380). For the purpose of grading, OPIs may also be recorded and assessed by a trained examiner, as done with IELTS, or, alternatively, sent away for external marking, as done with the Edexcel IGCSE ESL OPI.

Common Formats of an OPI

The format of oral interviews as a tool for assessment/testing can follow different interviewing designs: the teacher/examiner interviewing the students, the students interviewing each other, or the students interviewing the teacher/examiner (Graves, 2000). Alternatively, a one-to-one interview follows a "direct, face-to-face exchange between learner and interviewer" (Hughes, 1992, p. 54). There are variations of the one-to-one OPI format. One example is to place two test takers at one time with the interviewer (Cambridge, 2011).

A practical advantage of placing two test takers at one time in an OPI is that test centers can see more students at one time (Hughes, 2003). Another advantage of having two students together is that test tasks could be presented to encourage student-to-student interaction. This interaction, as Brown (2004) notes, can be achieved through posing problem-solving activities, and through "role plays, the interviewer can maximize the output of the test takers while lessening the need for his or her own output" (p. 171).

Maden and Taylor (2001) also suggest, linking interviewing to teaching, that the interviewer (usually the language instructor or teacher) should also enter into the interaction with students for both teaching and assessment purposes. However, as Maden and Taylor (2001) note, the length of interaction with the instructor/interviewer may affect the type of response and hinder maximum spoken responses from the test taker.

Canale (1984) believes that to maximize a test taker's performance in OPIs, students should be led through four main stages in test administration: warm-up, level check, probe, and wind-down. The warm-up stage is intended to put the "test taker at ease and to familiarize him or her with the target language and with the interviewers" (Canale, 1984, p. 354). The level check is designed to seek "to identify that proficiency level at which the test taker performs best (i.e. most comfortable and most satisfactorily)" (Canale, 1984, p. 354). According to Brown (2004), using question prompts at this stage can also help provide the interviewer with a picture of what the test taker can and cannot do, whereas the probe stage provides an opportunity for the interlocutor to challenge test takers to try and go beyond their oral skill level. According to Canale (1984), the purpose of this probe is to "verify the test taker's maximum proficiency level and to demonstrate to the test taker what tasks he or she cannot yet perform" (p. 454). The wind-down section, according to Brown (2004), dedicates a "short period of time during which the interviewer encourages the test taker to relax with some easy questions, sets the test taker's mind at ease, and provides information about when and where to obtain the results of the interview" (p. 168). Brown (2004) also adds that this stage is not scored.

Validity and Reliability in Oral Proficiency Interviews

A test is said to be valid if it really measures what it is supposed to measure (Weir, 2005; Hughes, 2003). However, there are many aspects of validity, including construct validity, construct-irrelevant variance, predictive validity and content validity.

Construct Validity

Bachman and Palmer (1996) define construct validity as "the meaningfulness and appropriateness of the interpretations that we make on the basis of test scores" (p. 21). For test scores to be meaningful and useful, they must be an accurate representation of a student's level of language knowledge and skills (Luoma, 2004; Weir, 2005). Underhill (1992) also states that a test should "share the same assumptions and the same philosophy as the program of which it is part" (p. 106). If, for example, an OPI is measuring aspects of communicative competence, then the test needs to reflect these components.

Construct-irrelevant Variance

Construct-irrelevant variance can be defined as extraneous factors affecting the "test taker's ability on the construct that causes the test score to be high or low" (Fulcher & Davidson, 2007, p. 25). If aspects of a test task are irrelevant to the focal construct of a test, then this may make the test irrelevantly more difficult for some individuals or groups (Messick, 1989). Construct-irrelevant variance in a test may lead to lower scores for some test takers but higher scores for others (Weir, 2005). Fulcher and Davidson (2007) point out that test anxiety and test unfamiliarity would introduce a construct-irrelevant element.

Predictive Validity

Predictive validity concerns whether a test can predict a student's success in English at some future point in their educational journey (Hughes, 2003). Brown (2004) adds that predictive validity is achieved if tests can accurately "predict a test taker's likelihood of future success" (p. 25). One example of what an OPI may aim to predict can be illustrated through the high-stakes IELTS exam, which includes an OPI. The IELTS mission statement says that the test is "measuring real ability for real life. IELTS encourages, reflects and tests English as it is used in work, study and life. This real life authenticity gives you a personal and valid indicator of just how good you are!" (IELTS, 2008, p. 1). In other words, IELTS claims to measure general English ability. However, Dooey's (1999) study of 89 undergraduate business, science, and engineering students, which investigated whether IELTS was an accurate predictor of performance and success, found that language ability, as measured by the IELTS test, was not a key factor contributing to academic success. On the other hand, Woodrow's (2006) study found significant correlations between the writing, speaking and listening scores of 82 IELTS students and the same students' GPA in their first semester at university. Determining predictive validity with tests like IELTS may therefore depend on factors such as the group of students and the course studied, as well as the size of the sample and the length of the study.

Content Validity

When considering validity and testing general language proficiency, Hughes (2003) states that "a test is said to have content validity if its content constitutes a representative sample of language skills with which it is meant to be concerned" (p. 23). To reinforce content validity in a test, the tasks used should be relevant and representative while also offering enough complexity to reflect the different levels and abilities of individual students (Messick, 1996). Bachman and Palmer (1981, cited in Weir, 2005) also note that content validity is "principally concerned with the extent to which the selection of test tasks is representative of the larger universe of tasks of which the test is assumed to be a sample" (p. 25). However, selecting valid oral tasks to reflect oral proficiency in a language is challenging. Weir (2005) observes that validating tasks in tests is difficult owing to the "attempts to operationalize real-life behaviour in a test" (p. 20).

Hughes (2003) also points out that, in an achievement test, selecting an accurate range of test tasks which reflect the course aims can be problematic if the course aims are set out only in general terms. Understanding fully what concepts the test content aims to measure becomes essential if "the results of performance on a test [are to] give us an accurate picture of the underlying abilities or constructs we are attempting to measure" (Weir, 2005, p. 12). In addition, Brown (2004) points out that these constructs should aim at eliciting an adequate and equal weighting across the tasks students have to perform, e.g., tests offering a variety of item types and an appropriate time distribution that are representative of the skills taught in a course. To achieve content validity, then, the tasks and content must accurately match the list of skills or functions from a language curriculum.

The Link between Reliability and Validity

How and to what degree validity and reliability impact a test is a complex issue (Alderson, Clapham, & Wall, 1995). Weir (2005) points out that researchers focusing on aspects of test reliability and validity have in the past held opposing viewpoints.

However, Weir (2005) observes that the validity and supporting reliability of a test should work hand in hand: "validity of a test does not lie in what test designers claim; rather, they need to produce evidence to support such claims" (p. 15). According to Chapelle (1999), reliability is now seen as a type of validity evidence in tests. Considering their importance, both validity and reliability in tests are "complementary aspects of identifying, estimating, and interpreting different sources of variance in test scores" (Bachman, 1990, p. 239).

Reliability

Reliability in test scores, according to Bachman and Palmer (1996), means that the "score will be consistent across different characteristics of the testing situation" (p. 19). Brown (2004) defines reliability in classroom tests by saying that when teachers "give the same test to the same student or matched students on two different occasions; the test should yield similar results" (p. 20).

Factors Affecting Reliability

Factors which may reduce and affect reliability in tests include test formats, the content of the questions and the time given to test takers (Coombe, Folse & Hubley, 2007). Other factors affecting test reliability include test administration, raters, the test itself and the test tasks.

Test Administration Reliability

Test-administration-related factors which may impact the reliability of a test include outside noise during a test, bad photocopying, the lighting and temperature of the room where the test takes place, the condition of the desks and chairs, and the time of day the test is administered (Brown, 2004). The way a test is conducted and delivered by the interlocutor/examiner may also impact the reliability of a student's performance. An interlocutor's accent and/or speech rate may also cause students not to fully understand instructions in a spoken test (Weir, 2005).

Rater Reliability

According to Bachman and Palmer (1996), "if raters rate more severely than others, then the ratings of different raters are not consistent, and the scores obtained could not be considered to be reliable" (p. 20), resulting in inter-rater reliability problems. Human error or subjectivity (inter- and/or intra-rater reliability) may also cause differences in the reliability of the scoring process, and this affects the reliability of the measurement of oral samples from a test taker (Coombe et al, 2007; Brown & Hudson, 1998).

Another way raters may cause the test scores to be unreliable is if the interlocutor is familiar with the test taker. Brown (2004) notes that unreliability in test scores may happen due to the interlocutor or examiner being biased towards the test taker.

Creating a consistent scoring system which is workable and reliable for tests is a challenge (Brown, 2004). Underhill (1992) also points out that one reason oral samples are difficult to score is that "at higher levels it is difficult to produce such well-defined scales" (p. 56). In addition, understanding how to score a limited speaking sample so it is relevant to a given context may also be difficult (Davidson & Fulcher, 2007). Another reason for unreliability in the scoring process may be the interview experience of the interlocutor/examiner (Luoma, 2004).

However, to increase reliability in test administration, training can be offered to interlocutors/examiners so that they follow standardized scripts and allocated time frames for each test task. Another way to reinforce standards in OPIs is to offer on-going training for those who are involved in the exams (Cambridge, 2011). According to Cambridge IELTS (2009), reinforcing standards through "recruitment, training, benchmarking, certification and monitoring for IELTS examiners ensures that they are fully qualified, experienced and effective" (p. 1). Similarly, McNamara (2001) states that "an important way to improve the quality of rater-mediated assessment schemes is to provide ongoing training for raters" (p. 44). Reinforcing standardisation through monitoring and training for OPIs also reinforces reliability, as it may prevent interlocutors/examiners from giving any unfair advantage (or disadvantage) to a test taker (Weir, 2005).

Test Reliability

The test itself may prove unreliable due to the time given to test takers, unsuitable test formats, and the content of the test tasks (Coombe et al, 2007). Brown (2004) also points out that timing can discriminate against students who are fatigued and consequently do not perform well under timed constraints. Wigglesworth's (1993) study examined the effects of planning time (one minute or no time) on oral test discourse. Wigglesworth notes that while planning time was beneficial for high-proficiency test takers in terms of accuracy, low-proficiency test takers did not benefit from increased planning time. Having time constraints in an OPI task may seem unnatural and cause students difficulty if they are asked to present a topic for an extended time, as in Part Two of the Edexcel IGCSE ESL OPI. Individually presenting for an extended time frame in an OPI may seem unnatural since this "rarely occurs in everyday communication" (Woodrow, 2006, p. 322).

The unnatural lack of social interaction between the interlocutor/examiner and the test taker in OPIs may also reduce the reliability of oral performance. McNamara (1997) points out that "we must correct our view of the candidate as an isolated figure, who bears the entire brunt of the performance" (p. 459). Furthermore, the "power relationship between interlocutor and test taker is yet another factor that can shape the interaction that emerges during the testing event" (Taylor & Wigglesworth, 2009, p. 328). Hughes (2003) also adds that the relationship between the candidate and the interlocutor/examiner is one of dominance, and consequently the candidate may be unwilling to take the initiative when communicating.

Furthermore, using the format of an oral interview to measure natural language may also affect the reliability of a test taker's oral performance. Fulcher and Marquez (2003) note that an interview "generates a special genre of language different from normal conversation" (p. 183). The question-answer response format of an OPI to measure natural language may also "be problematic as oral interviews are rarely used in everyday situations" (Woodrow, 2006, p. 322).

Context of Test Questions

Problems with reliability in tests can also depend on the type of test tasks and a test item's ability to measure a language construct or the level of a test taker's performance. Elder, Iwashita, and McNamara (2002) note that two problems in tests are "gauging the influence of task characteristics and performance conditions of a candidate and how to determine the difficulty of a task" (p. 348). Achieving reliability in selecting appropriate contextualized speaking tasks for individual test takers in a test-taking situation may prove difficult because "speaking takes place in specific social settings, between people with particular communicative goals" (Fulcher & Marques, 2003, p. 51).

Gathering a range of reliable oral samples from short OPI test questions may be limiting for test takers and may make it difficult for test designers to reinforce factors of reliability and validity, e.g., selecting tasks which are culturally appropriate and providing test takers with opportunities to show a range of language skills in test tasks. Weir (2005) notes that if test tasks "reflect real-life tasks in terms of important contextually appropriate conditions and operations it is easier to state what a student can do through the medium of English" (Weir, 1993, cited in Weir, 2005, p. 56).

Test takers also need to be provided with opportunities to show a range of language skills through the tasks presented in tests. Bachman and Palmer (1996) note that tests should provide opportunities "in which the test taker's areas of language knowledge, metacognitive strategies, topical knowledge, and affective schemata are engaged by the test task" (p. 25). Weir (2005) also notes that to reinforce reliability with tasks in tests, "every attempt should be made to ensure that candidates are familiar with the task type and other environmental features before sitting the test" (p. 54). Familiarizing students with task types in this way may help "promote a positive affective response to the task and can thus help test takers perform at their best" (Bachman & Palmer, 1996, p. 24).

Student-related Aspects of Reliability

The effects that a student experiences from an assessment or a test, such as fatigue, sickness, anxiety, or emotional problems, may cause a student's score to deviate from the score that reflects his or her actual ability (Coombe et al, 2007). These student-related aspects of reliability can also stem from gender differences between the candidate and the interlocutor. Factors potentially affecting reliability and validity in a test, including "the age, sex, educational level, proficiency/native speaker status and personal qualities of the interlocutor relative to the same qualities in the candidate are all likely to be significant in influencing the candidate's performance," according to McNamara (1996, p. 54).

Gender Differences and Interlocutor Effect

One factor of reliability impacting validity which has been studied in OPIs is whether or not the gender of the candidate and the interlocutor makes any difference in terms of performance or scores. Studies have shown that gender differences in OPIs may influence both the reliability of a test taker's performance and the way the performance is graded. Young and Milanovic's (1992) study found that women and men had different oral response times to tasks in interviews. Another study into gender differences in OPIs found that interlocutor familiarity with the males and females in OPIs made a difference in the length of a test taker's oral performance (McNamara & Lumley, 1997). A study by O'Sullivan and Porter (1996) also revealed that males and females had different interview styles, depending on the cultural background of the test taker.

However, O'Loughlin's (2002) study into whether the gender of the interlocutor/examiner had an impact on OPI scores in the IELTS interview test revealed that the gender of candidates and of those who rate candidates' performance did not have a significant impact on the rating process. O'Loughlin (2002) also noted from his study that one possible reason for the absence of an effect is that oral interviews may be "gender neutral" (p. 21).

An interlocutor's non-verbal communication may also influence a student's behavior and oral performance in a test. Plough and Bogart's (2008) study of the paralinguistic and non-verbal behavior of examiners in oral tests (eye contact and body posture; paralinguistic features such as voice volume, speed, and non-lexical sounds; and verbal and non-verbal turn-taking and listening behavior such as head nods and back-channel cues) found that these factors influenced the test takers' oral performance and also influenced the marking of those who rated the performance. Using appropriate non-verbal communication between interlocutor and test taker may also mirror the strategies of engagement of a real, authentic conversation. In contrast, not looking engaged, for example yawning, may have the reverse effect. Another study, by Jenkins and Parr (2003), reviewed whether or not non-verbal communication influenced interlocutors' marks in oral tests. Jenkins and Parr's study, conducted in Canada, found that higher marks were awarded to test takers in oral proficiency exams who employed non-verbal behavior considered appropriate by North American raters.

Language Learning Anxiety

Previous literature and research into how language anxiety affects language learning and performance have not been clear and have often been problematic (Young, 1991). One area of research has looked into how anxiety has a negative relationship with language learning and performance (MacIntyre & Gardner, 1994). Other research has shown how anxiety may have a positive relationship with language learning (Brandl, 1986, cited in Young, 1991). The positive relationship between anxiety and learning and test taking may include the motivation to get good grades and to "compel students to work harder, to learn certain content and skills" (Cizek & Burg, 2006, p. 25).

This problematic relationship between language learning and anxiety depends on the language context and the learners. Kim (2009) points out that "learners bring to the class a litany of different experiences and proficiencies that influence the level of anxiety they have regarding the learning task" (p. 139). Furthermore, students who display less self-confidence in class (due to being anxious) may perform less reliably when using and learning a target language (Pichette, 2009).

Cultural considerations and the context in which language learners are placed may also be a factor in language anxiety. Jung and McCroskey (2004) point out that "living in a different culture combined with different norms can function as another suspected situational variable" (p. 172). Xie and Leong (2008) add that although "anxiety may be a universal emotion, cultural beliefs and practices still have important influences on experiences and manifestations of anxiety" (p. 52).

Two aspects of how anxiety can change across different settings or conditions can be illustrated through the terms state anxiety and trait anxiety. State anxiety is a temporary individual condition which is only evident in specific situations (Cizek & Burg, 2006). Trait anxiety, however, is already part of an individual's personality or character and is a more stable personality characteristic (Cizek & Burg, 2006). A student who experiences higher levels of trait anxiety and then added amounts of state anxiety may become highly anxious under test-taking conditions.

Types of Anxiety Related to Language Learning

Horwitz, Horwitz, and Cope (1986) describe three related types of foreign language anxiety and how each can influence performance when using English as a foreign language: communication apprehension, test anxiety, and fear of negative evaluation. These three concepts provide insights into how anxiety affects language learners and also into the causes of English second/foreign language anxiety.

Communication Apprehension

Horwitz et al (1986) define communication apprehension as "a type of shyness characterized by fear or anxiety about communication with people" (p. 128). They also note that this type of anxiety can be caused by the experience of learning a second language.

The causes of communication apprehension with language learners can be "explained through their negative self-perceptions stemming from the inability to understand others and make [students] understood" (MacIntyre & Gardner, 1989, cited in Ohata, 2005, p. 13). In addition, language learners who feel less in control of a communicative situation may also feel that their attempts at oral work are being monitored (Horwitz et al, 1986).

Communication anxiety in language learners may also be caused by the environment in which they are placed, whether in terms of a change in cultural setting or in learning environment. One example of this is where, and with whom, the language learner uses the language. Woodrow (2006) observed that for the participants involved, "communicating with native speakers was the most referred to out-of-class stressor" (p. 322). Language learners may therefore experience different aspects of communication apprehension in unfamiliar social contexts and with people they do not know (outside a class setting).

Test Anxiety

Test anxiety can be defined as a "type of performance anxiety stemming from fear of failure" (Horwitz et al, 1986, p. 127). Cizek and Burg (2006) also add that test anxiety "is one of the many specific forms of anxiety; it results in a combination of cognitive and physical responses that are aroused in testing situations or in similar situations in which a person believes that he or she is being personally evaluated" (p. 1).

Some individuals may be more prone or vulnerable to test anxiety than others. For example, when it comes to tests, some test takers or learners may be described as laid back, whereas others may be described as highly nervous (Cizek & Burg, 2006).

One reason why test anxiety may become an obstacle for students learning a foreign/second language is that exams and tests have become a major part of today's society (Spielberger & Vagg, 1995). One consequence of English-as-a-second-language tests is continuous pressure on students to achieve the results required in a particular educational context. For example, Lloyd and Davidson (2005) observe that failing high-stakes English second language tests at Zayed University in the UAE "may mean dismissal from university" (p. 324).

However, although test anxiety may be perceived as a negative experience, some students may perform better under pressure. Brandl (1986, cited in Young, 1991) considers "a little bit of intimidation a necessary and supportive motivator for promoting students' performance" (p. 50). Cizek and Burg (2006) also add that "preparing and reminding test takers of the importance of tests may raise levels of anxiety, but also may raise motivation to do well" (p. 92).

Fear of Negative Evaluation

The fear of negative evaluation does not only relate to assessments/tests but may also occur in any social, evaluative situation, such as interviewing for a job (Horwitz et al, 1986). These areas of anxiety relate to learners' negative self-perceptions, to their ability to be understood by others, and to their feelings about being able to be understood (MacIntyre & Gardner, 1994; Lee, 2007). These feelings of being negatively evaluated or sounding dumb in front of others can affect students in learning and using a second language (Young, 1991).

Factors Affecting Anxiety

Horwitz et al (1986) identified factors of anxiety which may affect second language learning, including student beliefs, learner self-esteem, and instructor beliefs. Other factors which may affect anxiety include environmental factors and gender issues.

Student Beliefs

Taking on something new, like a new language, can threaten a student's sense of self-identity. Language anxiety may stem from beliefs surrounding learning a new language, and these may cause frustration and tension in class (Horwitz et al, 1986, cited in Young, 1991). Young (1991) also adds that beliefs and perceptions about learning a language can contribute to language anxiety.

Furthermore, these beliefs and perceptions may stem from unrealistic feelings toward learning a language. Horwitz (1988, cited in Ohata, 2005) points out that some students' beliefs are based on six different ideas: "1) Some students believe that accuracy must be sought before saying anything in the foreign language. 2) Some attach great importance to speaking with excellent native L1-like accent. 3) Others believe that it is not okay to guess an unfamiliar second/foreign language word. 4) Some hold that language learning is basically an act of translating from English or any second/foreign language. 5) Some view two years as sufficient to gain fluency in the target language. 6) Some believe language learning is a special gift not possessed by all" (p. 138).

Learner Self-esteem

The way students perceive themselves as individuals is another factor which contributes to language learning anxiety. The concept of "self" relates to how individuals see themselves as learners and to their perceptions of failing and of being evaluated in class and in assessments/tests (Horwitz et al, 1986). According to Horwitz et al (1986), "learners' self-esteem is vulnerable to the awareness that the range of communicative choices and authenticity is restricted" (p. 128). This belief is based on the idea that any oral performance can become an obstacle and consequently lead to embarrassment when the concept of the self as competent is challenged in some way.

Instructor Beliefs

ESL teachers and instructors also have their own perceptions about effective teaching and student interaction in a language learning context. Brandl (1987, cited in Onwuegbuzie, Bailey & Daley, 1999) points out that "anxiety is exacerbated when instructors believe that their role is to correct students when they make errors and do not promote group work" (p. 220). The type of class activity which a second language teacher selects may also influence the levels of anxiety experienced by those in a class. Young (1991) points out that some instructors think that they "cannot have students working in pairs because the class may get out of control" (p. 428). However, Frantzen and Magnan (2005) believe that anxiety in language classes could be "ameliorated by the sense of community that instructors ... established in their classrooms" (p. 183). Young (1991) also notes that "students felt more at ease when the instructors' manner of correction was not harsh and when the instructors did not overreact to mistakes" (p. 432).

For instructors, recognising the causes and signs of language anxiety may assist with the effective teaching of a new language. Onwuegbuzie et al (1999) note that it is important that "foreign language instructors not only recognise the possibility that some students experience high levels of anxiety, but also identify these at-risk students" (p. 232). Horwitz et al (1986) also point out that anxiety should be considered when "attributing poor student performance solely to lack of ability, inadequate background, or poor motivation" (p. 131).

[...]

Details

Title
Oral ESL Test Anxiety with Emirati Secondary School Students
Course
MA IN TESOL
Grade
none
Author
Year
2011
Pages
105
Catalog Number
V181410
ISBN (eBook)
9783668335868
ISBN (Book)
9783668335875
File size
1430 KB
Language
English
Keywords
oral, test, anxiety, emirati, secondary, school, students
Quote paper
Christopher Blake (Author), 2011, Oral ESL Test Anxiety with Emirati Secondary School Students, Munich, GRIN Verlag, https://www.grin.com/document/181410
