Guardian investigation finds almost 7,000 proven cases of cheating – and experts say these are the tip of the iceberg

Thousands of university students in the UK have been caught misusing ChatGPT and other artificial intelligence tools in recent years, while traditional forms of plagiarism show a marked decline, a Guardian investigation can reveal.

A survey of academic integrity violations found almost 7,000 proven cases of cheating using AI tools in 2023-24, equivalent to 5.1 for every 1,000 students. That was up from 1.6 cases per 1,000 in 2022-23.

Figures up to May suggest that number will increase again this year to about 7.5 proven cases per 1,000 students – but recorded cases represent only the tip of the iceberg, according to experts.

The data highlights a rapidly evolving challenge for universities: trying to adapt assessment methods to the advent of technologies such as ChatGPT and other AI-powered writing tools.

  • rescue_toaster@lemmy.zip
    1 day ago

    ChatGPT output isn’t crap anymore. I teach introductory physics at a university and require fully written-out homework, showing the math steps, for problems that I’ve written myself. I wrote my own homework problems years ago when Chegg blew up and all the major textbook problems ended up on Chegg.

    Just two years ago, ChatGPT wasn’t so great at intro physics and math. It’s pretty good now, and it shows all the necessary steps to get the correct answer.

    I do not grade my homework on correctness. Students only need to show that they honestly attempted each problem to get full credit. But it’s way quicker for students to simply upload my homework PDF to ChatGPT and copy down the output than to give it their own attempt.

    Of course, doing this results in poor exam performance. Anecdotally, my exams from my most recent fall semester were the lowest they’ve ever been. I put two problems on my final that came directly from my homework, one of them being the problem that made me realize roughly 75% of my class was ChatGPT’ing all the homework (ChatGPT isn’t super great at reading angles from figures), and it’s like these students had never even seen a problem like it before.

    I’m not completely against the use of AI for my homework. It could be like a tutor that students ask questions of when they’re stuck. But unfortunately that takes more effort than simply typing “solve problems 1 through 5, showing all steps, from this document” into ChatGPT.

    • Taiatari@lemmynsfw.com
      1 day ago

      Personally, I think we have homework the wrong way around. Instead of teaching the subject in class and then assigning practice for home, we should learn the subject at home and do the practice in class.

      I always found it easier to read up on something and get an idea of a concept by myself. But when I tried to solve the problems I ran into questions, and there was no one I could ask. If the problems were solved in class, I could ask fellow students or the teacher.

      Plus if the kids want to learn the concept from ChatGPT or Wikipedia that’s fine by me as long as they learn it somehow.

      Of course this does not apply to all concepts, subjects and such but as a general rule I think it works.

      • Saik0@lemmy.saik0.com
        1 day ago

        Instead of teaching the subject in class and then assigning practice for home, we should learn the subject at home and do the practice in class.

        Then you get students who get mad because they’re “teaching themselves”, not realizing that the teacher curated what they’re reading and doing and is an SME who’s available to them when they’re completely lost.

      • rescue_toaster@lemmy.zip
        1 day ago

        This is mostly the purpose of my homework. I assign daily homework. I don’t expect students to get the correct answers, but instead to attempt them and then come to class with questions. My lectures are typically short so that I can dedicate class time to solving problems and homework assignments.

        I always open my class with “does anyone have any questions on the homework?”. Before ChatGPT, students would ask me to go through all the homework, since much of it is difficult. Last semester, though, with so many students using ChatGPT, they rarely asked me about the homework… I would often follow up with “Really? No questions at all?”

    • confusedwiseman@lemmy.dbzer0.com
      1 day ago

      This is very insightful and provides good perspective.

      If I boil it down, the takeaway is that GPT is enough to get through the fundamentals of the student material, so students can fake competence in the subject right up to the cliff they fall off at the test.
      This ultimately isn’t preparing them for the world. It’s nearly impossible to catch until it’s too late. The pass-or-fail options aren’t helping, because neither really represents the students’ best interests.

      The call to ban it in schools is the only lever we can grasp, because every other KNOWN option has been tried or assessed.

      • rescue_toaster@lemmy.zip
        24 hours ago

        Yeah, that fake competence is a big thing. Physics Education Research has become a big field and, while I don’t follow it too closely, that seems to be a recurring theme: students think they are learning the material despite relying so heavily on AI.

        I intend to read a bit more of this over the summer and try to dedicate a bit of the first day or two next semester to addressing how this usage of ChatGPT hurts their education. I teach a lot of engineering students, a program that already has around an 80% attrition rate, i.e. out of 200 freshmen, only 40 graduate with an engineering degree. Probably won’t change behavior at all, but I gotta try something.

      • Warl0k3@lemmy.world
        1 day ago

        Not them, but also an instructor: where I teach, we’re having to pivot sharply towards grades being based mostly on performance in labs and on in-person quiz/test results. It’s really unfortunate, since there are many students with test anxiety, and labs are really exhausting to turn into evaluation instead of instruction, but it’s the only workable solution we’ve been able to figure out.

        • rescue_toaster@lemmy.zip
          1 day ago

          Yep, same here. I liked having homework be a significant portion of the grade, but with the prevalence of ChatGPT I’m reducing that portion and increasing the in-class exam weight.