Some teachers are using AI to grade their students, Anthropic finds - why that matters

Anthropic for Education report (Image: Anthropic)



ZDNET's key takeaways

  • Anthropic published its Education Report, analyzing educators' Claude usage. 
  • Teachers are using Claude to help grade students, a controversial use case.
  • AI companies are doubling down on tools for education. 

Much of the focus on AI in education is on how students will be affected by AI tools. Many are worried that the temptation to cheat and AI's erosion of critical thinking skills will diminish the quality of their education. However, Anthropic's latest education report focuses on educators' outlook on AI in the classroom -- and finds some surprising ways teachers are implementing the tech. 

Also: The tasks college students are using Claude AI for most, according to Anthropic

AI companies are aware of the tension users experience between using AI as a copilot or support and letting it automate certain parts of their work. Anthropic's research shows how educators are navigating that tension, and how those choices change on a case-by-case basis.

To conduct this report, Anthropic analyzed anonymized conversations between Claude.ai, its chatbot, and Free and Pro accounts associated with higher education email addresses, and filtered for education-specific tasks from May and June of 2025. Within that time period, Anthropic identified 74,000 conversations involving tasks such as creating syllabi, grading assignments, and more. 

The company also matched each conversation to the most fitting task in the US Department of Labor's Occupational Information Network (O*NET) database of education tasks. Separately, Anthropic also said that it bolstered its analysis with survey data and qualitative research from 22 Northeastern University faculty members who are early adopters of AI.
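To make the matching step concrete, here is a toy sketch of pairing a conversation with its closest O*NET task. Anthropic has not published its actual classification method, so the task names, keyword sets, and overlap scoring below are purely illustrative assumptions:

```python
# Hypothetical sketch: match a conversation summary to the O*NET education
# task whose keyword set overlaps it most. The report does not disclose
# Anthropic's real method; these tasks and keywords are invented for
# illustration only.

ONET_TASKS = {
    "develop curricula": {"develop", "curriculum", "syllabus", "course", "materials"},
    "assess student performance": {"grade", "rubric", "feedback", "assess", "evaluate"},
    "maintain student records": {"records", "attendance", "enrollment", "maintain"},
}

def best_task(conversation: str) -> str:
    """Return the task label with the largest keyword overlap with the text."""
    words = set(conversation.lower().split())
    return max(ONET_TASKS, key=lambda task: len(ONET_TASKS[task] & words))

print(best_task("please grade this essay against the rubric and give feedback"))
# → assess student performance
```

A production system would likely use embeddings or an LLM classifier rather than word overlap, but the shape of the pipeline -- conversation in, single best-fit O*NET task out -- is the same.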

For a complete breakdown of the setup, you can read the report. Now for the findings. 

Some educators are automating grading 

Anthropic found that the most common use cases of AI for educators were curriculum development (57%) and academic research (13%). In a smaller use case, however, Anthropic found that the remaining 7% of educators used Claude to "assess student performance," which includes giving students feedback, grading against rubrics, and summarizing evaluations -- despite the outlook many teachers share that grading is a less-wise use of AI. 

Also: AI agents arrive in US classrooms

When educators used Claude to grade, they relied on it to the point of automation nearly half the time -- 48.9%. "That's despite educator concerns about automating assessment tasks, as well as our surveyed faculty rating it as the area where they felt AI was least effective," Anthropic said. 

By contrast, the report showed teachers used AI to augment tasks like teaching and instruction, writing grant proposals, academic advising, and supervising academic work. Besides grading, tasks with higher automation tendencies included managing educational institution finances and fundraising, maintaining student records, and managing academic admissions and enrollment -- many of which can be more admin-heavy. 

Augmentation vs. automation (Image: Anthropic)

The pattern in those choices shows that educators are more willing to automate tedious, technical tasks. For tasks that require more complex and critical thinking, however, educators will use AI as a collaborator instead. 

Also: Where AI educators are replacing teachers - and how that'll work

Anthropic added that the high percentage suggesting automated grading is concerning -- essentially expressing alarm at the idea that educators are handing such a delicate part of teaching off to AI. The concern is at least a partial acknowledgement that AI may not be recommended for such a task, and conveys some lack of confidence on Anthropic's part that Claude should be used this way. 

One Northeastern professor Anthropic worked with agreed, citing ethical concerns and accuracy issues: "I have tried some experiments where I had an LLM grade papers, and they're simply not good enough for me. And ethically, students are not paying tuition for the LLM's time, they're paying for my time. It's my moral duty to do a good job (with the help, perhaps, of LLMs)."

Even as the smallest use case, Anthropic noted, it was the second most automated task. 

"While it's not clear to what degree these AI-generated responses factor into the final grades and feedback, the interactions surfaced by our research do show some amount of delegation to Claude," Anthropic wrote. 

Other ways teachers use AI 

Other unique use cases found in the data included creating mock legal scenarios for educational simulations, developing workforce training content, drafting recommendation letters, and creating meeting agendas. 

While the Northeastern faculty reported using AI for their own learning as another common case, the Claude.ai analysis was not able to confirm it because of challenges with the filtering mechanism. The Northeastern faculty did suggest that educators are leveraging AI for these tasks because AI can automate tedious tasks, collaborate as a thought partner, and personalize learning experiences for students, according to the report. 

Also: My top 5 free AI tools for school - and how they can help supercharge your learning

Beyond using the existing tools for classroom help, teachers are also building their own AI tools. For example, Anthropic said teachers are often using its Artifacts feature, which allows users to create an app without coding, to build "interactive educational materials." These creations include interactive educational games, assessment and evaluation tools, data visualizations, academic calendars and scheduling tools, budget planning, and more. 

AI's education creep

Just in time for back-to-school season, AI companies have been on a tear to release tools marketed toward both students and teachers. Anthropic recently launched a new Learning Mode in its Claude.ai chatbot and Claude Code, a complement to OpenAI's Study Mode -- both meant to employ the Socratic method to create a back-and-forth with a user rather than spitting out answers. Elsewhere, text-to-speech app Speechify launched a competitor to NotebookLM's AI podcast tool, and Google made its $20-per-month suite of AI tools free to college students.

Also: Why AI chatbots make bad teachers - and how teachers can exploit that weakness

Putting the debate about AI's role in education aside for a moment, a college contract can be lucrative -- as can creating a student dependency on your tools to get through a tough semester. Given how burned out teachers are, especially in light of the COVID-19-related exodus from the profession, is it really a surprise that some educators are changing their tune on automating parts of their job with AI? Can an industry gunning to be in everyone's workflow, including in the classroom, express concern when that begins to happen, especially without policy restricting certain uses of AI, or guidance from those companies themselves?

AI in the classroom is still too nascent to tell where this will go, but for now, AI companies are wading into -- and creating -- a complicated future for education. Individual school and college policies may ultimately determine the outcomes, and even then only with a limited scope of control, as these tools remain so easily accessible. 
