"English-learning students’ scores on a state test designed to measure their mastery of the language fell sharply and have stayed low since 2018 — a drop that bilingual educators say might have less to do with students’ skills and more with sweeping design changes and the automated computer scoring system that were introduced that year.

English learners who used to speak to a teacher at their school as part of the Texas English Language Proficiency Assessment System now sit in front of a computer and respond to prompts through a microphone. The Texas Education Agency uses software programmed to recognize and evaluate students’ speech.

Students’ scores dropped after the new test was introduced, a Texas Tribune analysis shows. In the previous four years, about half of all students in grades 4-12 who took the test got the highest score on the test’s speaking portion, which was required to be considered fully fluent in English. Since 2018, only about 10% of test takers have gotten the top score in speaking each year."

  • Eheran@lemmy.world · 1 month ago

    No shit — a sudden, sharp drop right when a new test procedure is introduced… how could you even begin to think it's anything but the test? And why not run the two tests in parallel to make them comparable in the first place? Who are the idiots doing such things? Ask GPT how to do that next time; it can't do worse than this.

    • ColeSloth · 1 month ago

      Teachers tend to just pass students through — it makes the teachers look better if their students are doing well. What changed with the new testing is that it became impartial, so teachers doing the evaluating can no longer count words the students butchered as correct.