Open Access

Impacts of the Test of English Listening Comprehension (TELC) on teachers and teaching in Taiwan

Asian-Pacific Journal of Second and Foreign Language Education 2017, 2:5

DOI: 10.1186/s40862-017-0028-9

Received: 6 December 2016

Accepted: 24 May 2017

Published: 30 May 2017


High-stakes testing has frequently exerted a strong influence on teaching and learning in educational contexts. In 2012, the Ministry of Education passed legislation that made the new Test of English Listening Comprehension a requirement for the University Entrance Examination. The present study investigated how far the launch of the new English listening test has affected English teachers and their instructional practices in senior high schools. The research involved semi-structured interviews with 20 senior high school English teachers in Taiwan. The findings revealed a complicated entanglement of external and internal factors that influenced the degree and intensity of the test's impact on teachers and teaching. The new national English listening test was found to have both positive and negative effects on teachers. While the way listening instruction was delivered was shaped by teachers' pedagogic knowledge and past experiences of teaching and learning, the frequency of instructional practice was constrained by the expectations of key stakeholders, such as parents and students.


Keywords: Test impact; Listening; Teacher; Teaching approach; Stakeholder


The effect of testing on teaching and learning has long been known as ‘washback’, and this can be harmful, beneficial, or both (Alderson and Wall 1993; Hughes 2003; Shohamy et al. 1996). The concept of washback derives from the notion that tests or examinations inevitably affect in large measure what is taught in class, and in the case of extreme effects, the result is referred to as ‘measurement-driven instruction’ (Cheng and Curtis 2004). Andrews and Fullilove (1994, p. 64) highlighted the point that the design of a new test should ‘embody the characteristics of a good test,’ in the sense that the test tasks should be designed so as to exert a positive influence on teaching. Nevertheless, one of the most commonly cited washback effects is the generally negative impact of traditional, large-scale, multiple-choice tests on curriculum design, teaching approaches, and classroom practice. As teachers’ instructional decisions shape what and how their students learn, and can ultimately promote or reduce students’ motivation for learning and influence their achievement, it is important to establish how teachers respond to a new English test and how far the test has had positive and/or negative effects on teaching.

The University Entrance Examination (UEE) for senior high school students in Taiwan consists of three tests: the General Scholastic Ability Test (GSAT), the Advanced Subjects Test (AST), and the Test of English Listening Comprehension (TELC). The GSAT is used for school recommendations, university applications, and functions as a public examination. The AST is purely a public examination. The GSAT and AST were introduced in 2002 to replace the Joint College Entrance Examination (JCEE). Similar in form to the English test of the JCEE, the GSAT and AST consist of discrete-point grammar and vocabulary tests, reading comprehension tests, Chinese-English translation, and a 120-word composition. The TELC, however, is a new test, mandated by the Ministry of Education (MOE) in 2012, and it has been used for school recommendations, for personal applications, and as a public examination from 2015. The contents of the TELC follow the official English language curriculum for senior high school education. The test has four purposes: (1) to assess learning outcomes of English listening based on the senior high school curriculum for English, (2) to foster listening, speaking, reading, and writing skills, (3) to bridge the gap between students’ listening and communicative abilities in everyday situations, and (4) to link high school English education with globalisation (Chou 2015, 2016). As regards content, the TELC comprises ‘picture identification’, ‘question and answer’, ‘short talk’, and ‘short report’ subtests. First, ‘picture identification’ tests candidates’ listening comprehension of high-frequency words (a 4,500-word vocabulary list; see College Entrance Examination Center, CEEC, 2016a) and sentence structures. Second, ‘question and answer’ items assess candidates’ ability to understand statements or questions and to answer by choosing appropriate responses. 
Next, ‘short talk’ items concern candidates’ comprehension of conversations in everyday contexts. Finally, the ‘short report’ section tests candidates’ understanding of oral reports. Sample test items are provided on the official website (College Entrance Examination Center, CEEC, 2016b).

The initial intention of incorporating the TELC into senior high school English education and the university entrance examinations is surely positive. However, the extent to which this new listening test has been incorporated into teaching, and has benefited or detrimentally affected Taiwanese senior high school teachers, remains unknown. The results of this study will, it is hoped, contribute to a better understanding of current English language teaching in senior high schools and of teachers’ responses to the new test policy, and may suggest instructional practices useful to teachers responsible for course planning.

Literature review

The washback of English tests on teaching

The focus of washback research has shifted over the past few years from an emphasis on high-stakes international English language testing preparation courses (e.g., Alderson and Hamp-Lyons 1996; Green 2006; Read and Hayes 2003) to attention to test stakeholders’ perspectives, particularly those regarding assessment, learning, and teaching in school (e.g., Cheng et al. 2011; Chou 2015). Washback effects can be positive, negative, or both. Positive washback occurs when a test encourages beneficial changes in teaching and learning, and teachers and students work willingly and collaboratively toward achieving the objectives of the test (Cheng and Curtis 2004; Hughes 2003; Pearson 1988). By contrast, negative washback occurs when teaching and learning are directed to practicing past examination papers, teaching test-taking skills, and drilling via multiple-choice worksheets, regardless of whether or not the rationale or the aim of the test is fully understood (Cheng and Curtis 2004; Davies 1968; Noble and Smith 1994). As washback is by definition the effect of testing on teaching and learning, the teacher is one of the major stakeholders whose teaching is highly likely to be affected by examination policies.

Several studies have examined the relationship between test impact, teachers, and teaching. Shohamy et al. (1996) found that changes in a national EFL test in Israel had a major impact on teaching activities, time devoted to test preparation, and the production of teaching materials. In a study of Chinese students on IELTS preparation courses in the UK, Green (2006) found that teachers teaching for the test had a greater impact on learners’ learning outcomes than teachers in non-IELTS courses. In Boardman and Woodruff’s (2004) study of the impact of a new instructional practice in a state-wide student assessment in the USA, the high-stakes assessment had affected teaching practices regarding which teaching components to implement, how often to use these components, and what materials to utilise for test preparation. In East Asia, Cheng (1997, 2004), conducting a series of studies on the washback of the Hong Kong Certificate Examinations in English on teachers’ perceptions of their classroom teaching, discovered that these exams did impact on secondary teachers, but that the impact varied with their perceptions and with the teaching methodologies they adopted. Watanabe (2004), in his study of the teacher factors mediating washback effects of the Japanese UEE on secondary students, highlighted the point that teachers’ attitudes towards the test washback effect in the classroom were far more complicated than the test policy itself; teachers’ psychological states and their degree of familiarity with a range of teaching methods appeared to be involved in shaping what was taught in the actual classroom. In short, these studies have shown that new language policies or tests did impact on teachers and their teaching to differing degrees, relating to, and varying with, political, cultural, and educational backgrounds.

Spratt (2005), summarising a number of empirical studies on washback effects (Alderson and Hamp-Lyons 1996; Cheng 1997; Lumley and Stoneman 2000; Read and Hayes 2003; Shohamy et al. 1996; Wall and Alderson 1993; Watanabe 2000), concluded that there were five key factors affecting the degree and nature of washback on teaching: (1) teachers’ beliefs about the exam, (2) teachers’ attitudes towards the exam, (3) teachers’ education and training, (4) resources, and (5) the school. At the pedagogical level, teacher knowledge has been found to be linked to teachers’ planning thoughts, beliefs, classroom decision-making, and reflections on their teaching (Tsui 2011). Shulman (1986) stated that teacher knowledge consists of three basic domains: pedagogic knowledge, content knowledge, and pedagogic content (content presentation) knowledge. Before the implementation of the TELC, the English tests in the UEE in Taiwan (from 1972 to 2013) had only assessed vocabulary, grammar, reading, translation, and composition writing. The teachers’ own learning experiences in senior high school and the format of the English test therefore constitute part of their ‘teacher knowledge’ in the English class. Hence, it is worth investigating how far the introduction of the new English listening test has impacted on, or perhaps challenged, the teachers, who are used to the traditional grammar-translation method of teaching English.

Public and private senior high schools in Taiwan

The senior high schools in Taiwan are administered either by the MOE (public) or by private sector organizations. Apart from special programs for elite students who excel in mathematics, English language, physical education, arts, or dance, public schools are not allowed to place students in groups according to their achievement. Private schools, on the other hand, take all kinds of students, and they usually adopt a strict ability grouping system to ensure the quality of teaching and learning (Hwang 2014; Kuo and Tao 2013; Liang and Sun 2014). Private schools also usually offer afterschool programs, in which students stay at school after regular hours to take additional courses for academic reinforcement (Liang and Sun 2014). Though both public and private high schools offer secondary education to Taiwanese students, their recruitment systems and the way students are placed in classes may influence how teachers teach and prepare for national tests. It is worth investigating whether there is a difference in the teaching approaches and in the preparation for the TELC between public and private school teachers.

Research on senior high school education in Taiwan has so far been focused primarily on educational leadership (e.g., Chen 2008; Hsiao et al. 2008), the relationship between the GSAT and students’ academic performance (e.g., Lee 2003; Yang 2003), teachers’ work values and job stress (e.g., Chen 2003), and students’ learning motivation and attitudes (e.g., Hardré et al. 2006; Tsai 2013). To date, little is known about the impact of the newly introduced listening test on English teaching at senior high school level, or about how public and private senior high school English teachers have reacted to it. Considering the scarcity of research regarding language policy, language test impacts, and group differences in Taiwan, there is a clear need for a study that investigates the extent to which, and the manner in which, the TELC affects senior high school teachers and teaching. The present study accordingly addresses the following two research questions:
  1. To what extent does the TELC impact on the English language teachers and teaching in senior high schools in Taiwan?

  2. Is there a difference in how the English language teachers respond to the implementation of the TELC between public and private schools?



The methods for exploring test impacts commonly involve questionnaire surveys, interviews, observation, analysis of textbooks, exam papers or other relevant documents, and ethnography (Cheng et al. 2011; Spratt 2005; Watanabe 2004). Surprisingly perhaps, although questionnaires can be efficient and economical tools and as such are frequently adopted to gather opinions, attitudes, views, or beliefs (Denscombe 2014), reliable and valid examples exploring washback effects on teachers are rare. Interviews, on the other hand, offer a greater possibility of eliciting in-depth and detailed information and insights from participants (Bell 2010; Cohen et al. 2011; Denscombe 2014). Also, at the exploratory stage, it is difficult to specify what instructional practices need to be observed in the English class, since it is unclear to what extent the new test has impacted on teaching. Compared with structured interviews and unstructured conversations, semi-structured interviews give interviewees more freedom to express their opinions in depth when necessary, while the event as a whole stays under the control of the interviewer (Cohen et al. 2011; Denscombe 2014). Thus, in the present study, face-to-face semi-structured interviews with teachers from different senior high schools were deemed the most appropriate. The interview questions were based on teacher-related factors that have been found to underlie washback effects in previous studies (Spratt 2005). The four fundamental factors included (1) teachers’ beliefs about the exam (what teachers think about the exam), (2) teachers’ attitudes towards exams and the preparation of exam materials (tests, textbooks, or teaching materials), (3) teachers’ education and training (teachers’ knowledge about teaching relevant language skills), and (4) resource availability (additional support for students to practice the language skill). 
The interview questions were piloted with another five senior high school teachers to check for any problems, and a few corrections to wording were made. The five interview questions were as follows:
  1. Do you think the TELC matches the English teaching curriculum and objectives in senior high school education? Why or why not?

  2. What English textbooks and/or teaching materials do you use in class and for what purposes? How many hours are spent on teaching listening?

  3. Are you currently teaching listening in your English class? If yes, how do you teach it? If no, why not?

  4. Have you (or your school) made any changes to encourage students to practice listening after the implementation of the TELC?

  5. What do you think about adopting a more communicative teaching approach that allows students to practice listening and speaking abilities in English classes?


Research participants, data collection, and data analysis

The interviews were carried out with 20 English teachers from 10 senior high schools across Taiwan. Of the 20 teachers, ten (50%) were from public schools, and the remaining ten (50%) were from private schools. To qualify for a teaching position in a public senior high school, a teacher usually graduates from a “teacher’s university” which offers compulsory teacher training courses, and he or she usually obtains a master’s degree in a relevant subject. The majority of teachers in private senior high schools, on the other hand, have usually graduated from universities other than a “teacher’s university” (Chen 2003; Lin 2014), but they need to take extra teacher training courses while studying for their bachelor’s or master’s degree, and pass the teacher qualification test. In the present study, twelve teachers held a master’s degree, seven a bachelor’s degree, and one a doctoral degree; their degrees were in English language and literature, TESOL, or language education.

The interviews were conducted in the teachers’ offices or conference rooms in the schools, where the surroundings were quiet. It took each teacher about 25 minutes to answer the five questions. The semi-structured interview data were analysed according to the interview questions concerned (Cohen et al. 2011).


The teachers’ viewpoints on the TELC and its impact on teaching were gathered via the semi-structured interviews. Nearly all the public and private senior high school teachers (18 of them) agreed in the interviews that the TELC matched the English teaching curriculum and objectives in senior high school education. Generally speaking, the teachers believed that listening skills are as essential as reading, speaking, and writing skills, that they are a requisite for the development of communicative competence, and that the test should have been launched earlier. Though the teachers supported the implementation of the TELC, the new test had impacted on them and their teaching to a considerable extent. First, the majority of the teachers expressed strong concern about using appropriate approaches to teaching English listening in class, and they therefore suggested that relevant in-service training courses should be offered. The second problem involved fitting the teaching of listening into the tight teaching schedule. Since the number of hours for English courses was fixed in the public schools, the public school teachers complained that they did not have additional hours to teach English listening. The following two excerpts illustrate the teachers’ beliefs about the TELC and their concerns:

Definitely yes. I think English listening needs to be tested, but I feel that we teachers don’t really know how to teach listening. Because it wasn’t tested before, teachers didn’t teach it, and as time went by, we didn’t find out how to teach listening. For me, it’s hard to say what counts as an effective English listening course. The traditional way is to give students tests, but this isn’t teaching. I think we teachers need to learn how to teach listening. You can’t just play the recordings and ask students to comprehend everything themselves. (Interview Q1; private school teacher 02; translation)

Yes. I think the intention of implementing the TELC is positive, but corresponding measures are insufficient. We don’t have systematic in-service training programs or workshops to teach teachers how to teach listening. The international English listening materials are varied and sometimes costly; it would be helpful if there could be a unified version of listening materials published for Taiwanese high school students. (Interview Q1; public school teacher 14; translation)

In response to interview Question 2, the public school teachers said that there were five hours of English classes (four compulsory hours plus one additional hour) every week, while the private school students received six to eight hours of English (four compulsory hours, plus two to four afterschool hours in the evening). In both public and private schools, the teachers said that a main English textbook for reading and an English language learning magazine for listening comprehension were used. Supplementary textbooks on English grammar, vocabulary, and English proficiency tests, such as the TOEIC and General English Proficiency Test, were frequently used in the regular and afterschool English classes. Listening articles, conversations, or passages in the English magazines were played to all students in class once a week. During the 30-min listening sessions, students sometimes listened to the recordings and read the magazine, or took a listening quiz.

In addition, slightly more private (7 out of 10) than public school teachers (6 out of 10) answered that they had begun to teach English listening after the implementation of the TELC. The approaches the private school teachers used to teach listening could be roughly categorised into three types: (1) the conventional ‘comprehension approach’ (Field 2008, p.26) with a ‘pre-listening – listening – post-listening’ procedure, (2) the test-taking method, and (3) interactive listening with listening input and oral discussion. At the pre-listening stage of the ‘comprehension approach’, the teachers pre-taught unfamiliar vocabulary that students would encounter in the listening passage. Afterwards, the teachers prepared listening comprehension questions in multiple-choice format, played the recordings, and students answered the questions. Finally, the teachers explained vocabulary, analysed grammar, and played the CD again. The test-taking method, on the other hand, simply required students to complete past test papers or to take mock exams; vocabulary and grammar were explained. Interactive listening involved an introduction to the topic, short listening passages, and student discussion and feedback. However, the quality of interactive listening depended on the students’ English proficiency and pressure from parents. These three excerpts exemplify how the private school teachers taught English listening:

Yes, I do teach English listening in class. I usually teach new vocabulary in the listening magazine text first, or introduce the listening topic, and then ask my students to memorise the new words after school. I then give students a vocabulary test of the new words in the listening the next day. Next, I’ll play the recordings and give them some questions to answer. After listening to the recordings, I’ll explain some more words, analyse sentence patterns, and highlight some common usage of English. And I’ll play the recordings again. I sometimes ask my students how they feel about certain issues in the listening, but only a few of them are willing to speak English in class. (Interview Q3; private school teacher 07; translation)

Yes. I usually ask my students to finish the English listening test first. You know there’re many test papers provided by the English magazines. I’ll point out the keywords and teach them how to look for key information in the listening passages. (Interview Q3; private school teacher 18; translation)

Yes. In my listening class, I prefer interacting with my students to using a textbook. I tend to design handouts with daily life issues myself, and emphasise some important and practical vocabulary. But I don’t particularly teach grammar. [Interviewer: How was the interaction between you and the students?] It depends on students’ English proficiency. Students with better English ability tend to talk more, but it’s more stressful for me to teach them because their parents value the children’s learning outcomes very much. They want to see that their children can receive high scores in English listening tests. On the other hand, teaching students with lower English ability is more relaxed, both for me and my students. They have more opportunities and are more willing to interact with me in class. (Interview Q3; private school teacher 09; translation)

Unlike the private school teachers, who used English magazines as the primary listening materials for their students, the public school teachers preferred to use multimedia, such as movie clips, English songs, and other published English listening materials, to teach listening. Compared with the private school teachers, the public school teachers were more inclined towards the interactive listening approach. They usually asked their students to listen to the recordings or watch the movie clips first without questions, listen (or watch) again, fill in blanks or complete short-answer questions, discuss the answers with the group, and share them with the class, ending with teacher feedback. However, the frequency of teaching English listening and doing activities was more limited in public schools owing to the fixed teaching hours for English as a subject. Two excerpts illustrate the public school teachers’ viewpoints:

Yes. I have used movie clips to teach listening. Students love movies; it’s interesting and creative to incorporate movies into teaching contents. I usually play the movie without subtitles the first time. The students can get the main idea via the visual support. I then play the clip again and give them some short-answer questions. Most of them can get the keywords for each question. Finally, I play the clip with the English subtitles for the last time, so that they’re able to understand the whole story and get the answers. (Interview Q3; public school teacher 01; translation)

Yes. I’ve used international English listening materials to teach listening to my students. The English listening textbook adopts the different text types, such as chronological order, compare and contrast, or process. I design the worksheet for my students and ask them to provide answers. [Interviewer: What do the students feel about these activities?] They cooperate if the activities are carried out many weeks before the exams. (Interview Q3; public school teacher 14; translation)

The teachers who did not teach English listening in class (from both types of school) gave a number of similar reasons. One main reason for the public school teachers was that they did not have additional class hours for listening training. Other reasons included the belief that students could learn by themselves by practicing listening after school, and a tendency to focus on teaching the test content of the GSAT, namely, reading comprehension, grammar, and translation. Owing to the strict ability grouping system in the private schools, teachers from these schools said that they had spent a large amount of time teaching basic grammar and reading skills to students with lower English proficiency, and that this had left no time for listening instruction.

In addition to teaching in class, the private schools concerned had made more changes to encourage their students to practice listening after the implementation of the TELC. The changes included additional English listening and speaking courses, using an on-line English listening test system, using supplementary on-line audio-visual listening materials, and having a regular (weekly) English listening quiz. The public schools, by contrast, had offered just a weekly one-hour, optional English class taught by native speakers (NS) of English, where the students as a class unit requested it, but the teaching content was not always about listening. English listening quizzes were given to public school students as well, but irregularly. Two excerpts illustrate the differences in the changes made by private and public schools.

Our school offered an English listening and speaking class and we hire NS of English to teach students, and there’s one hour especially for the listening and speaking training every semester, in addition to the teaching of listening in the ordinary English class taught by me. Also, our school has bought a large on-line English listening test bank and the computerised equipment for our students to use it. (Interview Q4; private school teacher 10; translation)

We hire NSs of English to teach English one hour every week for the students enrolled in the special program, particularly English and math. This isn’t free, so if the students in the ordinary class want to take an English course from a NS, they, as a class unit, need to pay for the extra tuition. [Interviewer: What did the NS of English teachers teach?] There’s no restriction. It could be speaking or reading; it depends on the English proficiency of the whole class and their learning needs. (Interview Q4; public school teacher 17; translation)

Finally, though significantly more public than private school teachers agreed that communicative competence needed to be emphasised in secondary English education, teachers from both types of school reported several similar obstacles that limited their freedom to adopt a communicative approach in English classes. The major obstacles were attributed to parental pressure, the teaching schedule, and teachers’ teaching and students’ learning styles. Three excerpts reporting teachers’ feedback on using a communicative approach in English classes illustrate the points made:

I think I’m the only English teacher who employs communicative teaching; not many teachers in our school dare to try. You know our GSAT is more grammar-translation oriented. If you spare class time designing speaking activities for students, you may suffer the consequences that students may not perform well in the school exams, and parents would complain that the teachers weren’t doing their jobs, I mean, giving lectures. Besides, speaking abilities aren’t tested in the GSAT or AST exams – they’re useful in the future, but just not in the current exam system. [Interviewer: Did your students like the speaking activities?] It depends. Some students participated actively in the discussion, but a few good students may not be willing to share scores in the speaking activities with other students, since they want to save high scores for future university application. Some students don’t like cooperative learning because their English listening proficiency isn’t good. (Interview Q5; public school teacher 04; translation)

I think it’s necessary to use a more communicative teaching approach. Actually we do have teachers asking students to give English oral presentations and to engage the whole class in discussion. But the thing is to make the speaking activities work on a regular basis, so that students won’t feel too much of a change in their learning processes. (Interview Q5; public school teacher 05; translation)

I am for the communicative approach, but I prefer to communicate with students individually and face-to-face. It takes a lot of time to interact with the whole class (50 students), and we still have a teaching schedule to follow; there isn’t much time to do anything else. This is the reality for private schools. If you use activities that are not directly related to the test and the activities are scored in class, parents will question us. This is a hindrance. To be honest, the GSAT and AST are so high-stakes that we teachers have to teach the two tests. (Interview Q5; private school teacher 09; translation)

To summarise, the implementation of the new English listening test has had both negative and positive impacts on the teachers and their teaching. Since one of the purposes of implementing the new listening test was to strengthen students’ listening and communicative abilities in everyday situations, the adoption of a more communicative approach in the present study – one which engaged students in oral discussions or interactive activities – helped achieve the intended objective of the new test policy. Although the majority of the teachers acknowledged the value and usefulness of the test, and its benefits for the development of students’ listening abilities, the preparation for, and teaching of, the new test markedly increased their workload and challenged their past grammar-translation teaching method. As a result, this new test has brought unintended, but predictable, negative effects on teaching, in the sense that frequent practice of listening test items and drilling on grammar and translation were the main instructional practices with respect to listening in English classes. Moreover, both the public and private school teachers were under pressure from parents’, students’, and schools’ expectations that they help students achieve high academic scores in English tests. The different recruitment systems and ability grouping policies in the two types of school influenced how teachers taught English listening and what resources the school authorities were capable of offering.


Discussion
The results showed that the newly introduced TELC has impacted on English teachers and their teaching approaches in senior high schools in Taiwan to a fairly marked degree. The degree of test impact was shaped by five interwoven factors: (1) teachers’ beliefs about teaching, (2) syllabus arrangements, (3) the existing national English examinations (GSAT and AST), (4) the fixed time allotment, and (5) parents’ and schools’ expectations concerning students’ academic scores.

Though nearly all the teachers agreed that the implementation of the TELC matched the senior high school English curriculum and teaching objectives, not all of them modified their teaching approaches. One reason was that the TELC is an additional test, quite separate from the GSAT and AST, so some teachers still focused on training reading, writing, and Chinese-English translation skills, believing that in-class listening quizzes and students’ autonomous learning would enhance listening ability and scores. The teachers who responded to the TELC positively incorporated listening as well as speaking activities in class, adopting either comprehension or interactive approaches, while retaining their existing grammar-translation methods for the preparation of the GSAT and AST. These findings are consistent with a number of cases of test impact on teaching in diverse educational contexts, where teachers adopted new instructional practices, or modified their original teaching approaches, to prepare for a high-stakes national examination in order to raise students’ scores (Boardman and Woodruff 2004; Green 2006; Shohamy et al. 1996; Watanabe 2004).

However, the approaches to teaching English listening uncovered in the present study remained highly teacher-centered, comprehension-based, and ‘test-like’. Field (2008) and Graham et al. (2014) both argue that emphasizing the product of listening comprehension via exchanging questions and answers, and teaching test-wise strategies for achieving correct answers, fails to provide insights into how listeners process listening passages. Although the comprehension approach provides listeners with experience of, and massive exposure to, the target language, pre-set multiple-choice questions restrict the information that listeners are allowed to respond to and undermine the authenticity of language use in real-life situations (Field 2008). Instead of asking students to answer pre-determined questions with restricted answers, Rost (2011) indicates that teachers can encourage second or foreign language learners to report informational facts, or express opinions or viewpoints, based on what they have heard.

Furthermore, traditional, large-scale multiple-choice tests of this kind are perceived as having had negative influences on the quality of teaching and learning (Cheng and Curtis 2004; Hughes 2003; Madaus and Kellaghan 1992). Hughes (2003) suggests that direct testing, which involves students in performing authentic tasks in speaking or writing tests, provides teachers with a clearer judgment on, and interpretation of, students’ abilities to use the target language in real-life contexts than answering multiple-choice questions does. Chou (2015), in her investigation of senior high school students’ expectations of listening instruction, discovered that the students preferred to have oral communication in their English classes. In the current study, several teachers had introduced a certain degree of oral interaction by asking the students to express their opinions on, or give feedback towards, issues raised in the listening extracts. Pedagogically, Rost (2011) and Field (2008) both agree that this type of ‘interactive listening’ approach forces comprehensible output and negotiation, which encourage second or foreign language learners to formulate ideas in the target language, and this consequently ensures a degree of learner engagement in the listening task. Specifically, Flowerdew and Miller (2005) recommend that communicative, task-based, or integrated approaches encompassing problem-solving tasks serve as practical techniques for eliciting speaking performance. An integration of the comprehension approach with oral production that offers students opportunities to share the outcomes of listening is thus suggested.

In addition to the impact of the new test policy on teaching approaches, teachers’ responses to the test were influenced by other stakeholders, such as school authorities, parents, and students. In the present study, though the majority of the teachers agreed that communicative competence needed to be reinforced in senior high school English classes, and some had introduced interactive activities, the frequency of listening and speaking teaching was limited for the public school teachers. Owing to the obligation to focus on raising students’ test scores within a fixed teaching schedule, the teachers’ autonomy to carry out two-way interactive listening exercises was unfortunately sacrificed. This is in line with previous research, in which parents’ expectations of, and involvement in, their children’s academic performance have been found to shape teaching and learning in school (Cheng et al. 2011; Linn 1993; Mulvenon et al. 2005). Moreover, the teachers’ and school authorities’ responses to the test were found to fluctuate with their students’ English language proficiency. In the present case, the public school students with better English abilities, or who were in the special programs, were more likely than their ordinary peers to receive additional teaching resources, such as English courses taught by native speakers (NS) of English, and their teachers were likewise able to spend more time on teaching listening. As the practice of communication skills often results in a more positive backwash effect on learning and teaching a second or foreign language (Hughes 2003), it is important to raise parents’ awareness of the value of two-way interactive listening and of the potential benefits of engaging students in more authentic learning activities.

Last but not least, teachers’ beliefs, past teaching and learning experiences, and knowledge of pedagogy and subject matter also determine how far a new test impacts on instructional practice. Teachers from both public and private schools expressed a need for systematic, in-service training courses in listening instruction. As Shulman (1986) indicated several decades ago, teacher knowledge is constituted by pedagogic knowledge, content knowledge, and pedagogic content knowledge. Nonetheless, Johnson (1997, p.780) argued that L2 pedagogical training programs tend to present teachers with theoretical knowledge as ‘oversimplified, decontextualised, and compartmentalised’ teaching and learning practices, without taking social, cultural, and educational contexts into account. To bridge the gap between theory and practice, Williams (1999), drawing on Eraut’s (1994) distinction between public theories (i.e. received knowledge of a subject domain) and private theories (i.e. individual or personal beliefs about teaching), suggested that language teachers should integrate public and personal (or private) theories and then bring them into interaction with practice. In the present case, while the teachers expressed a need for in-service training courses in listening instruction, they may also need to incorporate their personal experiences into the received pedagogic knowledge, and then practice, examine, and reflect on it to achieve more positive washback effects.


Conclusion
The findings of the present study revealed that the new English listening test had both positive impacts (the instructional practices reflected the intended purpose of the test policy) and negative ones (heavy emphases on test practice and the drilling of grammar) on the teachers from both types of school. Though the teachers’ perceptions of the test impact were broadly similar across school types, the two groups showed contrasting attitudes towards fostering students’ communicative competence in class, and towards the approaches and materials used for teaching listening comprehension. Owing to the competitive teaching environment of private schools and pressure from other stakeholders, the private school teachers were found to focus more on the teaching of test-taking skills, and their school authorities purchased expensive equipment and test banks, or opened listening and speaking courses, to help the students. The public school teachers, by contrast, tended to use multimedia more often and to design more interactive activities that engaged students in oral discussion and presentation. Owing to the fixed time allotment, however, courses taught by native speakers of English were offered mainly to the students enrolled in special programs.

The present study has yielded findings with both theoretical and pedagogical implications, but it was not without limitations. First, the generalisation of the results to other populations, such as teachers working under different test policies in other countries or in different school environments, may be limited. Another constraint was that the study was based solely on semi-structured interviews. Further studies using other research methods, such as questionnaires, classroom observation, or the analysis of textbooks, could be carried out to explore washback effects in the Taiwanese educational system in greater depth. While the present study focused on foreign language teacher knowledge and engagement, future research could explore the outcomes of teacher education activities or program designs.

The washback effect is a complicated issue and tends to have a more considerable impact on teachers than on students (Spratt 2005; Watanabe 2004). In an ideal situation, Spratt (ibid., p.23) notes, ‘when and where the teacher is in control of the factors determining washback, washback itself is largely in the teacher’s control.’ Nonetheless, the present study found that the senior high school teachers, at least in a test-driven teaching context in Taiwan, had work commitments and experienced pressure from other stakeholders, in the sense that they had to maximise the students’ academic achievement; the teachers’ autonomy to exercise different teaching approaches in class was clearly limited. To produce a positive backwash effect on teaching and learning in the long run, teachers need first to strengthen their pedagogic content knowledge of listening instruction, and then to integrate it into current instructional practice, while weighing the pressure from other stakeholders. Moreover, it is necessary to raise parents’ awareness of the long-term benefits of engaging their children in a more authentic language environment. It is recommended that a systematic in-service training course in English listening instruction be offered to all senior high school English teachers. To conclude, since the high-stakes national examinations will continue to exist, teachers need to strike a balance between the impact of the new test policy, the content knowledge needed for new instructional practices for students of different English proficiency, and the autonomy they are allowed in implementing them.



This study was funded by the Ministry of Science and Technology (MOST 104-2410-H-160-007).

Competing interests

The author declares that she has no competing interests.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors’ Affiliations

Department of Foreign Language Instruction, Wenzao Ursuline University of Languages


  1. Alderson, C., & Hamp-Lyons, L. (1996). TOEFL preparation courses: A study of washback. Language Testing, 13(3), 280–297. doi:10.1177/026553229601300304.
  2. Alderson, J. C., & Wall, D. (1993). Does washback exist? Applied Linguistics, 14(2), 115–129. doi:10.1093/applin/14.2.115.
  3. Andrews, S., & Fullilove, J. (1994). Assessing spoken English in public examinations – Why and how? In J. Boyle & P. Falvey (Eds.), English language testing in Hong Kong (pp. 57–85). Hong Kong: The Chinese University Press.
  4. Bell, J. (2010). Doing your research project (5th ed.). Maidenhead: Open University Press.
  5. Boardman, A. G., & Woodruff, A. L. (2004). Teacher change and “high-stakes” assessment: What happens to professional development? Teaching and Teacher Education, 20(6), 545–557. doi:10.1016/j.tate.2004.06.001.
  6. Chen, J.-H. (2003). 私立高中教師工作價值觀,工作壓力與組織承諾之研究 [A study of the relationship among work values, job stress, and organisational commitment of private senior high school teachers] (Unpublished master’s thesis). Changhua, Taiwan: National Changhua University of Education.
  7. Chen, P. (2008). Strategic leadership and school reform in Taiwan. School Effectiveness and School Improvement, 19(3), 293–318. doi:10.1080/09243450802332119.
  8. Cheng, L. (1997). How does washback influence teaching? Implications for Hong Kong. Language and Education, 11(1), 38–54. doi:10.1080/09500789708666717.
  9. Cheng, L. (2004). The washback effect of a public examination change on teachers’ perceptions toward their classroom teaching. In L. Cheng, Y. Watanabe, & A. Curtis (Eds.), Washback in language testing: Research contexts and methods (pp. 147–170). New Jersey: Lawrence Erlbaum Associates.
  10. Cheng, L., Andrews, S., & Yu, Y. (2011). Impact and consequences of school-based assessment (SBA): Students’ and parents’ views of SBA in Hong Kong. Language Testing, 28(2), 221–249. doi:10.1177/0265532210384253.
  11. Cheng, L., & Curtis, A. (2004). Washback or backwash: A review of the impact of testing on teaching and learning. In L. Cheng, Y. Watanabe, & A. Curtis (Eds.), Washback in language testing: Research contexts and methods (pp. 3–17). New Jersey: Lawrence Erlbaum Associates.
  12. Chou, M.-H. (2015). Impacts of the test of English listening comprehension on students’ English learning expectations in Taiwan. Language, Culture and Curriculum, 28(2), 191–208. doi:10.1080/07908318.2015.1027216.
  13. Chou, M.-H. (2016). Strategy use for listening in English as a foreign language: A comparison of academic and vocational high school students. TESOL Journal, 7(3), 513–539. doi:10.1002/tesj.214.
  14. Cohen, L., Manion, L., & Morrison, K. (2011). Research methods in education (7th ed.). New York: Routledge.
  15. College Entrance Examination Center (CEEC) (2016a). 高中英文參考字彙表 [English word list at senior high school level]. Accessed 3 Nov 2016.
  16. College Entrance Examination Center (CEEC) (2016b). 高中英語聽力測驗試題範例 [Sample test questions of the Test of English Listening Comprehension]. Accessed 3 Nov 2016.
  17. Davies, A. (Ed.). (1968). Language testing symposium: A psycholinguistic approach. Oxford: Oxford University Press.
  18. Denscombe, M. (2014). The good research guide: For small-scale social research projects (5th ed.). Maidenhead: Open University Press.
  19. Eraut, M. (1994). The acquisition and use of educational theory by beginning teachers. In G. Harvard & P. Hodkinson (Eds.), Action and reflection in teacher education (pp. 69–88). Norwood: Ablex.
  20. Field, J. (2008). Listening in the language classroom. Cambridge: Cambridge University Press.
  21. Flowerdew, J., & Miller, L. (2005). Second language listening: Theory and practice. Cambridge: Cambridge University Press.
  22. Graham, S., Santos, D., & Francis-Brophy, E. (2014). Teacher beliefs about listening in a foreign language. Teaching and Teacher Education, 40(1), 44–60. doi:10.1016/j.tate.2014.01.007.
  23. Green, A. (2006). Washback to learners: Learner and teacher perspectives on IELTS preparation course expectations and outcomes. Assessing Writing, 11(2), 113–134. doi:10.1016/j.asw.2006.07.002.
  24. Hardré, P. L., Chen, C.-H., Huang, S.-H., Chiang, C.-T., Jen, F.-L., & Warden, L. (2006). Factors affecting high school students’ academic motivation in Taiwan. Asia Pacific Journal of Education, 26(2), 189–207. doi:10.1080/02188790600937326.
  25. Hsiao, H.-C., Chen, M.-N., & Yang, H.-S. (2008). Leadership of vocational high school principals in curriculum reform: A case study in Taiwan. International Journal of Educational Development, 28(6), 669–686. doi:10.1016/j.ijedudev.2007.12.002.
  26. Hughes, A. (2003). Testing for language teachers (2nd ed.). Cambridge: Cambridge University Press.
  27. Hwang, J.-J. (2014). Problems and perspectives for implementing the entrance system of senior high and vocational high schools in Taiwan’s 12-year National Basic Education Program. Taiwan Educational Review Monthly, 3(9), 102–132.
  28. Johnson, K. E. (1997). The author responds. TESOL Quarterly, 31(4), 779–782. doi:10.2307/3587761.
  29. Kuo, Y.-C., & Tao, H.-L. (2013). A study of the relationship between entrance channels, family background and test scores – Implications for equality and efficiency of entrance channels. Journal of Social Science and Philosophy, 25(3), 421–456.
  30. Lee, C.-L. (2003). 大學入學考試中心學科能力測驗與高中在校成績關係之研究 [The correlation between the performance in the General Scholastic Achievement Test and the academic achievement at the high school] (Unpublished master’s thesis). Taipei, Taiwan: National Taipei University of Education.
  31. Liang, Y.-H., & Sun, Y.-C. (2014). Worries and foresight of the 12-year National Basic Education Program. Taiwan Educational Review Monthly, 3(2), 96–100.
  32. Lin, C.-H. (2014). 探討台灣高中學生與高中老師在二十一世紀之學習與教學趨勢 [Developing questionnaire for 21st-century Taiwanese high school students’ learning and teachers’ teaching] (Unpublished master’s thesis). Taipei, Taiwan: National Taiwan University of Science and Technology.
  33. Linn, R. (1993). Educational assessment: Expanded expectations and challenges. Educational and Policy Analysis, 15(1), 1–16. doi:10.3102/01623737015001001.
  34. Lumley, T., & Stoneman, B. (2000). Conflicting perspectives on the role of test preparation in relation to learning? Hong Kong Journal of Applied Linguistics, 5(1), 50–80.
  35. Madaus, G. F., & Kellaghan, T. (1992). Curriculum evaluation and assessment. In P. W. Jackson (Ed.), Handbook of research on curriculum (pp. 119–154). New York: Macmillan.
  36. Mulvenon, S. W., Stegman, C. E., & Ritter, G. (2005). Test anxiety: A multifaceted study on the perceptions of teachers, principals, counselors, students, and parents. International Journal of Testing, 5(1), 37–61. doi:10.1207/s15327574ijt0501_4.
  37. Noble, A. J., & Smith, M. L. (1994). Old and new beliefs about measurement-driven reform: “the more things change, the more they stay the same” (CSE technical report 373). Tempe, AZ: Arizona State University, National Center for Research on Evaluation.
  38. Pearson, I. (1988). Tests as levers for change. In D. Chamberlain & R. L. Baumgardner (Eds.), ESP in the classroom: Practice and evaluation (pp. 98–107). London: Modern English.
  39. Read, J., & Hayes, B. (2003). The impact of IELTS on preparation for academic study in New Zealand. IELTS International English Language Testing System Research Reports, 4, 153–206.
  40. Rost, M. (2011). Teaching and researching listening (2nd ed.). Harlow: Pearson Education Limited.
  41. Shohamy, E., Donitsa-Schmidt, S., & Ferman, I. (1996). Test impact revisited: Washback effect over time. Language Testing, 13(3), 298–317. doi:10.1177/026553229601300305.
  42. Shulman, L. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4–14. doi:10.3102/0013189X015002004.
  43. Spratt, M. (2005). Washback and the classroom: The implications for teaching and learning of studies of washback from exams. Language Teaching Research, 9(1), 5–29. doi:10.1191/1362168805lr152oa.
  44. Tsai, C.-C. (2013). An investigation into English as a foreign language listening anxiety among Taiwanese senior high school students. International Journal of Learning and Development, 3(4), 108–113. doi:10.5296/ijld.v3i4.4247.
  45. Tsui, A. B. M. (2011). Teacher education and teacher development. In E. Hinkel (Ed.), Handbook of research in second language teaching and learning (pp. 21–39). New York: Routledge.
  46. Wall, D., & Alderson, J. C. (1993). Examining washback: The Sri Lanka impact study. Language Testing, 10(1), 41–69. doi:10.1177/026553229301000103.
  47. Watanabe, Y. (2000). Washback effects of the English section of Japanese entrance examinations on instruction in pre-college level EFL. Language Testing Update, 27, 42–47.
  48. Watanabe, Y. (2004). Teacher factors mediating washback. In L. Cheng, Y. Watanabe, & A. Curtis (Eds.), Washback in language testing: Research contexts and methods (pp. 129–146). New Jersey: Lawrence Erlbaum Associates.
  49. Williams, M. (1999). Learning teaching: A social constructivist approach – Theory and practice or theory with practice? In H. Trappes-Lomax & I. McGrath (Eds.), Theory in language teacher education (pp. 11–20). Harlow: Longman.
  50. Yang, Y.-L. (2003). 高中英文寫作教學之我見:從大學入學考試英作測驗談起。[Viewpoints of teaching English writing in senior high school: From the English composition writing in the college entrance examination]. Newsletter for Teaching the Humanities and Social Science, 14(1), 92–113.


© The Author(s) 2017