An Evaluation of China’s Automated Scoring System Bingo English


  •  Jianmin Gao    
  •  Xin Li    
  •  Peiqi Gu    
  •  Ziqi Liu    

Abstract

This study evaluated the effectiveness of Bingo English, a representative automated essay scoring (AES) system in China. Eighty-four essays from an English test administered at a Chinese university were collected as research materials. All essays were scored both by two trained and experienced human raters and by Bingo English, and their linguistic features were quantified in terms of complexity, accuracy, and fluency (CAF), content quality, and organization. An examination of the agreement between human and automated scores, and of the correlation of both sets of scores with the indicators of the essays' linguistic features, showed that Bingo English scores reflect essay quality only in a general way and that the system should be used with caution.



This work is licensed under a Creative Commons Attribution 4.0 License.
  • ISSN(Print): 1923-869X
  • ISSN(Online): 1923-8703
