Understanding EFL Learners’ Errors in Language Knowledge in Ongoing Assessments


Introduction
The need for the English language is increasing daily worldwide for various purposes such as education, employment, and business transactions. Research has shown that a vast number of people are now learning English as a second language (ESL) or foreign language (EFL) (Crystal, 2002; Nunan, 2003). However, learning English in ESL/EFL settings appears to be challenging for many students in different parts of the world, and Arabic-speaking students are no exception. For many students, learning English is a difficult journey that affects their academic performance and progress. Many Arab students learning English find it challenging to perform satisfactorily in reading, writing, listening, and speaking, and eventually in their majors (Alshammari, 2022; Atmaca, 2016; Kampookaew, 2020; Wood, 2017). In this regard, learning English grammar in particular is a demanding and challenging task for a considerable number of students. It limits their ability to learn English as a whole and to perform better in it. The difficulties learners experience with grammar vary with the learners' level, their causes, and the learning situation. For instance, research has reported that Arab students found learning English skills a challenging task (Al Hosni, 2014; Ansari, 2012; Rabab'ah, 2005). An understanding of learners' difficulties in English enables teachers to provide corrective feedback and design suitable learning tasks that can help these learners improve their language knowledge.
During their study at the university, students are often assigned courses and syllabi with specific academic tasks of various types, and they take mid-term exams, quizzes, and tests to reinforce learning and to check and evaluate the level of learning. The courses and syllabi contain specific learning outcomes students have to achieve. These are assessed by tests, other academic tools, and class activities known as continuous or ongoing assessments. Continuous assessment (CA) is an ongoing assessment (OA) that helps teachers evaluate learners' performance (Iseni, 2011). As a pedagogical tool, CA is essential, and many strategies can support it, such as portfolios, assignments, tests, writing journals, vocabulary logs, and notebooks. Thus, CA can be used to find out what students know and can do, to provide students with opportunities to show what they know, to promote learning for understanding, to improve teaching, and to let students know how well they are progressing in their learning. As learners of English, students' performances in tests related to OA or CA can provide useful information about their level of understanding, learning abilities, and learning difficulties in the language.
Moreover, OA or CA may enable teachers to provide students with feedback that will help them in their learning and academic performance. Therefore, by examining the performance of learners of English in OA and other class activities, and by employing Error Analysis (EA) procedures, we can come closer to the students' learning level, better understand their learning difficulties, and provide the necessary academic support and assistance that may facilitate learning. Analyzing learners' errors is a well-established practice in the area of English language teaching (ELT), particularly in ESL/EFL settings. Based on Corder's (1967) EA model, researchers and teachers attempt to analyze language students' errors. The steps of EA include identification and description of errors (error detection), classification of errors (error categorization), diagnosis of errors (causes of errors), and evaluation of errors (to determine their merits and values).
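The EA steps above can be sketched as a simple tallying procedure. The following Python fragment is only an illustrative sketch: the error records and category labels are hypothetical, not data from the present study.

```python
from collections import Counter

# Hypothetical error records after the detection and description steps:
# each tuple is (student_id, erroneous form, assigned error category).
errors = [
    (1, "He go to school", "present simple 3rd person -s"),
    (2, "She going home", "missing BE auxiliary"),
    (3, "They goes home", "present simple 3rd person -s"),
]

# Classification step: tally how often each category occurs.
by_category = Counter(category for _, _, category in errors)

# Evaluation step: express each category as a share of all errors,
# the kind of frequency information EA studies typically report.
total = len(errors)
for category, count in by_category.most_common():
    print(f"{category}: {count}/{total} ({100 * count / total:.1f}%)")
```

In practice the diagnosis step (attributing causes such as mother-tongue interference) requires the researcher's judgment; only the counting and ranking lend themselves to mechanical treatment.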

Learning English and Learner Difficulties
In second language acquisition theory and research, learners' difficulties with the target language (TL) and their production of its features have been valuable data sources. The learners' production during the process of learning the TL results in interlanguage (IL), an in-between point that reveals their knowledge of the TL system compared with what is beyond their knowledge. A significant amount of theoretical and empirical research has been done so far to explore learner errors. In these studies, either previous hypotheses about the phenomenon were tested or theories were proposed. Corder's 'Error Analysis and Interlanguage' (1981) is a significant contribution to the field of Error Analysis and language learning. The book dealt with the methodological and theoretical problems of error analysis. In the chapter titled 'The Significance of Learner's Errors', Corder discussed the value of learners' errors and their significance for teachers and researchers. A central insight in Corder's work is that the learner makes a significant cognitive contribution to learning. He argued that "learner's errors, then, provide evidence of the system of the language that he is using" (p. 10). Moreover, learner difficulties are of value because they make teachers aware of these problematic areas.
In his argument, Corder proposed the idea that learners' errors are a significant part of linguistic development, and that errors are important because they may represent the gap between the system of the learner's transitional competence (the intermediate systems constructed by the learner in the process of language learning) and that of the TL. Transitional competence implies that the learner's difficulties are dynamic and constantly moving in the direction of the TL. This suggests that language learners use certain language devices at every stage of their development, which is why they produce systematic errors. Moreover, Corder explained that the learner's inbuilt syllabus is the factor that determines the sequence in which grammar is acquired, and that investigation into the learner's errors might yield insights into this order. Error Analysis (EA) is considered an alternative means of describing and explaining errors made by ESL/EFL learners, since the errors can show the sources of the difficulty learners have. If these sources are understood, FL teachers can be better informed about how to correct and treat the errors (Alhaysony, 2012).
Learning a language, particularly a second or additional one, is a process that involves many linguistic and non-linguistic factors and requires assessing the students' learning process. Students learning English are required to develop language knowledge and improve language skills. This instruction-based learning process is assessed via tests designed for that purpose. Tests and quizzes are part of OA and are good sources of information about students' learning progress that should be utilized by teachers and researchers. To ensure their validity, reliability, and effectiveness in learning, tests must be prepared based both on students' prior knowledge and on specific tasks. This helps to easily identify their performance level and possible errors (Iseni, 2011). A common test used by teachers in assessing students' grammar and vocabulary is the cloze test. Designing different types of standard cloze tests involves the deletion of both function words and content words (Read, 2012, p. 309). Research has investigated the impact of testing on learning English and on students' overall performance, as well as the challenges they face in different test types and categories. For instance, students learning English were reported to have difficulties with multiple-choice and cloze test questions. As an assessment alternative, multiple-choice items are a well-established progress testing technique (Hammerly & Colhoun, 1984). According to Tabatabaei and Shakerin (2013), test takers faced greater challenges on both multiple-choice and cloze tests with unfamiliar content than with familiar content. Further, the researchers stated that students' background knowledge helped them understand the text well.
It is important that students are trained on test items so that they can develop familiarity with content, since content familiarity has important effects on students' performance on cloze tests (Afghari & Tavakoli, 2004; Ahmadi & Bahrani, 2011; Al-Shumaimeri, 2006). In another study, Ridha (2012) found that the most noticeable and frequent errors were grammatical and mechanical errors. The researcher also concluded that the majority of students' errors were caused by interference from their mother tongue (Arabic). The differences between the English and Arabic systems are expected to be one of the major causes of errors by students who learn English. This explains why the influence of the mother tongue was one of the major causes of errors in both ESL and EFL contexts, according to Muhammadi and Mustafa (2020). The researchers also reported that errors occurred in a multilingual context and in productive skills. Lui (2013) reported that learners in ESL/EFL settings also make errors due to carelessness and native-language interference.
This study aims at investigating the errors made by students in multiple-choice questions (MCQs) and cloze test questions, particularly in grammar and language knowledge tests, as part of OA. The study is hoped to provide more information about the students' level of learning, their progress in learning, and the types of difficulties they faced with each test type. The information obtained about the students' performance will therefore help to identify the students' specific difficulties in both language learning and test types. Such an understanding is hoped to help in designing better instructional tasks and test types that suit the students' level of language knowledge and abilities. To serve its objectives, the present study addresses the following questions:
1) What types of grammatical challenges do EFL learners encounter in ongoing assessments?
2) How do EFL students handle multiple-choice questions and cloze test questions in grammar?
3) To what extent are EFL learners' errors in ongoing assessments useful?

Method
This study aimed at examining the types of difficulties EFL undergraduate students experience with ongoing assessment tasks in an Omani higher learning institution. It specifically attempted to highlight the usefulness of understanding the students' level of language knowledge in relation to their performance in continuous assessments. The study is qualitative and employs some statistical analysis for its purpose.

Research Subjects
Four groups of Foundation Programme students (N = 83) at a university in Oman took part in this study as sources of data. All of them were Arabic speakers aged 17-19. The students in this study represented four levels, from Level One to Level Four: Level One comprised 11 students, Level Two 21 students, Level Three 26 students, and Level Four 25 students. They were from different majors (Business Administration, Information Technology, English, Mass Communication, and Accounting). All the subjects in the study were enrolled in a course and were assigned course text content that includes the four language skills: reading, writing, listening, and speaking. Grammar and vocabulary were taught under the language knowledge syllabi. The units tested in this syllabus were related to specific vocabulary and grammatical structures, all of which are included in the four skills units. All the students in this study took part in the two quizzes administered over a period of nine weeks. The two Language Knowledge (LK) quizzes were chosen to find out whether learning had taken place both before and after the students became familiar with the test format.

Data Collection Procedures
This study examined the students' performance in ongoing assessments and the benefits of understanding their level of knowledge by looking at their errors. The data were collected using two grammar quizzes in language knowledge (LK) as part of the ongoing assessments, covering a period of nine weeks. The first test was taken in Week 3, while the second was taken in Week 12 of the Spring Semester. The Language Knowledge quiz has two parts: Grammar and Vocabulary. Each test contains two types of questions: sentence-level multiple-choice questions and paragraph-level rational cloze. The reading passages for the cloze tests were selected to have familiar topics and content. In each test, five items were tested. Each test was taken after extensive teaching of grammar and vocabulary, and students were expected to be familiar with the topics and test format.
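As a rough illustration of how a rational cloze item can be constructed, a test designer deletes selected function or content words rather than words at fixed intervals. The sketch below is hypothetical: the passage, the target words, and the function name are illustrative assumptions, not the study's actual materials.

```python
import re

def make_rational_cloze(text, targets):
    """Blank out chosen words; a rational cloze deletes selected
    function or content words rather than every nth word."""
    blanked = text
    answer_key = []
    for number, word in enumerate(targets, start=1):
        # Replace only the first whole-word occurrence of each target.
        blanked = re.sub(rf"\b{re.escape(word)}\b",
                         f"({number}) ____", blanked, count=1)
        answer_key.append((number, word))
    return blanked, answer_key

# Illustrative passage with three targeted function words.
passage = "She is walking to the market because she needs vegetables."
cloze, key = make_rational_cloze(passage, ["is", "to", "because"])
print(cloze)
```

Selecting the targets by hand is what distinguishes a rational cloze from a fixed-ratio cloze, and it lets the designer focus the item on the grammatical categories under assessment.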

Data Analysis and Discussion
To serve the study's objectives, a total of 83 foundation students' answer scripts were collected and analyzed by the researcher. The analysis focused mainly on the students' performance in the grammar tests. Analyzing students' performance aims to provide more information about students' levels, the types of errors they make, and the way they handle MCQs and cloze test questions. The students' performances in the grammar quizzes were analyzed and categorized following Error Analysis (EA) procedures to see how the students performed in this area. The 40 test items were divided into grammatical categories such as verb tenses, prepositions, adjectives, adverbs, nouns, and pronouns. All the errors made by students in the different areas of grammar were categorized and presented in the tables below.

Table 1. Quiz 1 Error Frequency and Category (Level 1)

Table 1 reveals that in MCQs the students in Level 1 made errors equally in question words and the infinitive "ing" (54.5% each), while present simple tense errors scored (36.3%). On the other hand, the analysis of errors in the cloze test reveals that the students had the most difficulty with both the Simple Present Tense Singular and the 'BE' plural form (45.4% each). Moreover, Simple Present Tense Plural errors scored (36.3%), while both the 'BE' singular form and the infinitive "ing" scored (18%) each. The results also show that the infinitive 'ing' was problematic for the students in the MCQs, while this was not the case in the cloze test. This may suggest that the students were able to identify the correct answer more easily in the cloze test items than in the MCQs. This result indicates the significance of designing test items so that students who are trained on them will be able to recall what they have already learned, which confirms the findings of previous research in this regard (Afghari & Tavakoli, 2004; Ahmadi & Bahrani, 2011; Al-Shumaimeri, 2006). In addition, the Simple Present Tense remained a challenging area for students in both test types. The persistent errors in the Present Tense show that more attention should be given to this grammatical aspect. Knowing grammar in general, and tenses and their usages in particular, is a very important language area that students have to master (Al Hosni, 2014; Ansari, 2012; Rabab'ah, 2005).

Results in quiz 2 below illustrate more about the types of difficulties learners had in grammar. The students also made errors equally in conjunctions and transitional words (18% each). However, in the cloze test, the students' most frequent errors were in tenses: most errors were made in the simple present tense (63.6%), followed by the simple past (54.5%), the present 'BE' plural verb (45.4%), and the present progressive (27.2%) respectively. Results in the second quiz reveal that the students still had difficulty with tenses even though the test types did not change. Knowing students' actual knowledge on the basis of their performance in ongoing assessments is important because reliable information can be obtained to help in designing suitable tests and instructional strategies. Thus, looking at the students' performance during and after test design has important teaching and learning gains (Read, 2012).

The following section presents the results of analyzing Level 2 students' errors. In this category, the error percentages were considerably high. All the students found the preposition of movement (to) the most challenging (100%), followed by the prepositions of time (in, at, on) (66.6%, 52.3%, and 47.6% respectively). Errors in the preposition of place scored (38%). The analysis of the students' performance in quiz 2 below provides significant information about the actual learning difficulties the students had in learning grammar. Moreover, Table 4 shows that in MCQs the students made errors in past progressive questions (38%), followed by the negative modal verb "must not" (28.5%) and the modal verb "could" (23.8%). On the other hand, in the cloze test, the students had difficulty with the past tense. In this category, the most common errors made by students were in the past 'BE' singular form (61.9%) and the past 'BE' plural form (33.3%) respectively, while errors in the past continuous scored (47.6%). This result indicates the same error pattern the students displayed in Level 1 in the area of tenses. Learning tenses remains a challenging aspect for students, which suggests that more instructional focus should be given to this category. Teaching grammar is expected to take into consideration the difficulties students face in this grammatical aspect, and students need to be assisted to overcome this learning difficulty.

In the following section, Level 3 students' errors are presented and discussed. The analysis of the students' performance as shown in Table 6 illustrates that error scores in MCQs vary according to the test items. The most common errors made by the students were in the simple past tense (61.5%), followed by the past progressive third person singular (53.8%). Results also demonstrate that errors in quantifiers scored (50%), and in pronouns (19.2%). The students also had difficulty with the past progressive with "I", in which they scored (15.3%). However, in the cloze test, the highest percentage of errors was in the past continuous with "I" (92.3%), followed by the past continuous singular (69.2%) and the simple past verb (30.7%). It is also observed that errors were equal in both the simple past 'BE' plural form and the simple past plural (15.3% each).

The following section presents the analysis of Level 4 students' errors in learning grammar. Results of the analysis as shown in Table 7 reveal that the most common errors made by students in MCQs were in conjunctions (100%), followed by phrasal verbs (96%) and the relative clause (88%). On the other hand, in the cloze test, the most common errors were in phrasal verbs (100%), followed by conjunctions (56%) and pronouns (32%). The analysis demonstrates that students continued to struggle with these grammatical items in both quiz types. Results in quiz 2 below provide more information about the students' learning progress, actual knowledge, and the way they dealt with the test types. Moreover, Table 8 indicates that the students' most common errors in MCQs were in the passive voice of the present continuous tense (64%), followed by the conditional perfect tense (60%) and the second conditional (44%), while errors in possessive pronouns scored (40%). On the other hand, in the cloze test, the present passive voice was the most common error, with a percentage (76%) higher than in the MCQs. Results also reveal that errors in the second conditional negative scored (68%), in the past passive (64%), and in modal verbs (40%). The lowest-scoring category in the cloze test was past perfect errors, with only (28%). The findings indicate that the students performed almost equally in both quiz types. As revealed by the analysis, the students' difficulties with conditional sentences conform with the findings of Suraprajit (2022), who reported that the complex nature of conditional sentences makes learning this grammatical aspect a challenging task for EFL learners. The difficulties students faced with different grammatical items indicate that grammar remains a difficult aspect for the students regardless of their familiarity with the test type.
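The percentages discussed above follow from dividing the number of students who made an error on an item by the group size. The sketch below illustrates this arithmetic with hypothetical counts back-solved for a group of 11 students (the size of Level 1); the specific counts are illustrative assumptions, and minor differences from the reported figures can arise from rounding versus truncation.

```python
# Hypothetical per-item error counts for a group of N = 11 students.
GROUP_SIZE = 11

error_counts = {
    "question words": 6,      # 6/11 -> 54.5%
    "infinitive 'ing'": 6,    # 6/11 -> 54.5%
    "present simple": 4,      # 4/11 -> 36.4% (reported as 36.3%)
}

# Error frequency per item as a percentage of the group.
percentages = {item: round(100 * count / GROUP_SIZE, 1)
               for item, count in error_counts.items()}
print(percentages)
```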

Conclusion and Recommendations
Learning an additional language is a challenging task for many learners. Students' actual knowledge of the target language system can be understood from their performance in ongoing assessments. Ongoing assessments have value for both teachers and learners. Through various class activities, students can consolidate what they have learned and can find out what they should do to develop their language knowledge and skills. Teachers, on the other hand, can learn more about their students' strengths and weaknesses and thus design suitable instructional strategies that can help their students learn the language better. In this regard, further research is needed to examine more features of continuous assessments and their relevance to providing students with better learning experiences. In addition, future research may focus on the usefulness and impact of feedback on students' performance in OA.
Error Analysis is still a relevant practice in EFL classroom settings and is best supported by wider implementation in class activities and ongoing assessments. The students' errors in various grammatical categories suggest that teachers should identify the common errors that frequently occur and provide proper corrective feedback at the right time. This study focused mainly on the students' performance in grammar in OA; thus, it did not address the perceptual dimension. In addition, the study limited itself to grammar rather than other features such as vocabulary and other language skills.
To motivate learners, language teachers should provide additional practice activities with specific test items.
Students should keep a record of their own errors so as to identify and correct them with their peers. In addition, teachers should provide feedback on individual errors while keeping students motivated. Errors can be a positive indication of learning, and as such their occurrence should be understood in relation to its various causes and roots and be taken as a way of learning the target language (Nuruzzaman & Islam, 2018). Language learners will continue to make errors, and this inevitable fact makes utilizing ongoing assessments as valid sources of data on learners' performance and actual knowledge a necessary requirement for better and more effective teaching and learning.

Table 3. Quiz 1 Error Frequency and Category (Level 2)
Table 3 illustrates that in both the MCQs and the cloze test, the students in Level 2 scored high percentages of errors in all the items tested. For instance, in MCQs, adverbs of time scored (95.2%), followed by relative pronouns (80.9%). Errors in conjunctions scored (71.4%) and in adverbs of frequency (33.3%). On the other hand, errors in possessive pronouns scored (23.8%), the lowest among the five items tested in quiz 1. It is interesting to see that in the cloze test, the prepositions of time, place, and movement appeared to be the most challenging items for the students.

Table 5. Quiz 1 Error Frequency and Category (Level 3)
As far as Level 3 students were concerned, Table 5 reveals that the students continued to struggle with more grammatical categories. The students' errors in MCQs indicate that they made the most errors in the superlative (63.3%), followed by the passive voice (57.6%), conjunctions (46.1%), and (26.9%) in both relative clause and passive voice questions. On the other hand, in the cloze test, students made errors in the simple present singular verb (38.4%) and the comparative (26.9%). However, the students made few errors in the simple past (3.8%). The analysis of the students' performance in quiz 2 sheds more light on their learning progress, level of knowledge, and aspects of difficulty.