This week covers: a Penn Engineering course that teaches engineers to work across code, law, and responsibility; a randomised study showing that guided AI tutoring with sequenced practice outperforms open-ended AI; a free CurricuLLM training session for teachers; and strong early results from NSWEduChat at Gymea Technology High School.
Teaching engineers to work across code, law, and responsibility
This Penn Engineering piece describes a course built around trade-offs, legal obligations, and human consequences in AI system design. Students are asked to work through real cases, test assumptions, assess risk, and explain the decisions behind what they build. The focus is on judgment, with the understanding that technical decisions also carry social and legal consequences.
Safety, privacy, fairness, accountability, cost, and usability are all shaped during design. Teaching engineers to work across code, law, and responsibility is a stronger preparation for real-world AI than treating ethics as a separate topic.
As the role of the software engineer evolves, I think the profession will become more rounded, with technical expertise increasingly paired with judgment, communication, and a stronger understanding of responsibility.
Guided AI tutoring outperforms open-ended AI
This paper looks at a different way to build AI tutors. Rather than waiting for students to ask questions, the system guides the learning process by choosing problems and structuring practice over time. In a five-month randomised study with 770 high school students, the guided version led to better performance on a final exam completed without AI support, with the biggest gains among less experienced learners.
The improvement came from how the system shaped the learning experience, not just from giving good answers. It sequenced tasks to keep students working at an appropriate level of challenge, with enough support to keep progressing. That led to stronger engagement and more effective use of the tool during learning.
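To make the idea of sequenced practice concrete, here is a toy sketch of how a tutor might pick the next problem to keep a student at an appropriate level of challenge. This is purely illustrative: the function names, the mastery estimate, and the target-window value are my own assumptions, not the system described in the paper.

```python
# Toy sketch of difficulty-sequenced practice selection.
# Assumption: each problem carries a difficulty score in [0, 1] and the
# student's history is a list of 1s (solved) and 0s (not solved).

def estimate_mastery(history):
    """Crude mastery estimate: success rate over the last five attempts,
    defaulting to 0.5 for a brand-new student."""
    recent = history[-5:]
    if not recent:
        return 0.5
    return sum(recent) / len(recent)

def pick_next_problem(problems, history, window=0.15):
    """Choose the problem whose difficulty sits closest to a target just
    above the student's estimated mastery, so practice stays challenging
    but achievable."""
    target = min(estimate_mastery(history) + window, 1.0)
    return min(problems, key=lambda p: abs(p["difficulty"] - target))

problems = [
    {"id": "p1", "difficulty": 0.2},
    {"id": "p2", "difficulty": 0.5},
    {"id": "p3", "difficulty": 0.8},
]

# A student who solved 3 of their last 4 problems gets stretched upward:
print(pick_next_problem(problems, [1, 1, 0, 1])["id"])  # -> p3
```

A real system would use a far richer learner model, but even this sketch shows the shift in design: the tool decides what comes next, rather than waiting to be asked.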
This is similar to how CurricuLLM works. Past conversations add structure to future conversations, with support aligned to official progression standards rather than treating every interaction as a blank slate. The paper adds more evidence that the value of AI for schools is not just the model, but the way progress, sequence, and curriculum alignment are built around it.
Free CurricuLLM training session for teachers
I am running a free training session for teachers who want to learn how to use CurricuLLM in the classroom. It will cover how to get started with the platform, how to use it to support planning and teaching, and how to create useful classroom resources while staying aligned to curriculum expectations.
This session is for educators who want practical examples rather than theory. I will walk through how the free version of CurricuLLM can help with day-to-day teaching, where curriculum-aligned AI is more useful than generic AI tools, and how teachers can start using it in a way that is safe, simple, and genuinely helpful.
Gymea Technology High School: NSWEduChat in practice
Gymea Technology High School is one of the early examples of what system-wide AI rollout can look like in practice. The school is reporting strong results from NSWEduChat, with benefits around student skill development, academic integrity, and more structured classroom use of AI.
This is the direction more education systems will keep moving in. Safe, curriculum-aligned AI tools built for schools are very different to open consumer tools, and schools are starting to see the value in platforms designed around teaching, learning, and responsible use from day one.
The thread tying this together
This week's theme is that the value of AI comes from what is built around it. An engineering course that embeds judgment, not just compliance. A tutoring system that sequences learning, not just answers questions. A school that sees results because the tool was designed for the context. In each case, the technology matters less than the structure, the intent, and the design decisions that shape how it is used.
AI for schools works when it is deliberate: when the curriculum guides the interaction, when progress is tracked against real standards, and when the tool is designed for the learner and the teacher, not just for the model.

