We can't romanticise the AI-less classroom - here's what I'm building instead
One of the biggest challenges I faced teaching sport marketing last year wasn't getting students to engage with the content. It was getting them to think creatively with it. They could define segmentation, recite funnel stages, and reproduce frameworks back to me on an exam. But when I asked them to generate an original marketing idea for a real client, they froze. Not because they weren't capable, but because they hadn't yet built the creative muscle to know where to start.
I kept coming back to one question: how do you expose students to a wide range of marketing thinking in a way that actually develops that muscle, rather than just adding more to their reading list?
I tried a bank of links — campaign examples, award databases, and industry articles. It was too much and too unstructured. A second-year student staring at a Cannes Lions shortlist has no real way to connect what a global FMCG brand did to what NZ Cricket needs to solve. The gap between inspiration and application is exactly where they get lost.
That problem, combined with a move from ChatGPT to Claude earlier this year (a switch many others have made), is what led me to start building interactive HTML tools for my second-year Sport Marketing classes at AUT. Not AI chatbots. Not generated content. Structured learning scaffolds, built with AI but designed to make students think harder, not less.
Creative Precedent Analysis Tool
The first is a Creative Precedent Analysis tool. Students select their real client and work from that client's actual brief (the one that guides them for the whole semester): the real problem, the real barriers. Alongside it sits a curated inspiration guide of campaigns from inside and outside sport, each chosen because it addresses a similar audience behaviour or conversion challenge.
The framing isn't "look at these cool ads." It's "here is why this example is relevant to your specific problem, and here is what to look for when you analyse it."
Students work through five questions that map directly to their assessment: What was the idea? What problem was it solving? Why did it work? What does this suggest about how your audience responds? And what principle transfers to your client? They save their analysis as a PDF — which matters practically because the tool runs in a browser, and students need to be able to take their thinking with them when they sit down to write their actual assessment.
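For anyone curious about the mechanics, the PDF step needs no server and no PDF library: the browser's built-in print dialogue does the work. Here's a minimal sketch of that pattern; the question field, ids, and layout are placeholders, not the tool's actual markup.

```html
<!-- Minimal sketch: "save as PDF" via the browser's built-in print-to-PDF.
     Field names and layout are placeholders, not the real tool's markup. -->
<section id="analysis">
  <h2>Creative Precedent Analysis</h2>
  <label>What was the idea?
    <textarea name="idea" rows="4"></textarea>
  </label>
  <!-- ...the remaining four questions would follow the same pattern... -->
</section>

<button id="save" class="no-print">Save as PDF</button>

<style>
  /* Hide the button itself when printing, so only the analysis is exported */
  @media print { .no-print { display: none; } }
</style>

<script>
  // window.print() opens the browser's print dialogue, where students
  // choose "Save as PDF"; nothing leaves the page and no login is needed.
  document.getElementById('save').addEventListener('click', () => window.print());
</script>
```

Because the export happens entirely in the browser, students keep a copy of their thinking without the tool needing accounts or storage of any kind.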
The feedback from students was immediate: the tool gave them confidence to start. The framework made the task feel possible. And being pointed toward examples that were already connected to their brief sparked thinking they wouldn't have found on their own.
Marketing Objective Builder
The second tool came directly out of something I noticed mid-semester. I teach the same course twice a week across two campuses. After the first session on writing marketing objectives, it was obvious that students weren't translating the framework into practice. They understood what an objective should include, but they couldn't produce one.
So between campuses, I built a Marketing Objective Builder. Students select their client, read their brief, define their own audience segment, choose the type of marketing problem they're solving, select an action word that signals the funnel direction, and describe what needs to change — and the objective assembles in front of them as they work. The second campus got it.
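For readers wondering how the "assembles in front of them" part can work inside a single HTML file, here's a stripped-down sketch of the pattern: listen for input events and rebuild a preview sentence on every keystroke. The options, field names, and sentence template below are illustrative placeholders rather than the real tool.

```html
<!-- Stripped-down sketch of a live objective preview.
     Options and the sentence template are illustrative placeholders. -->
<form id="builder">
  <select name="action">
    <option value="">Choose an action word</option>
    <option>Increase</option>
    <option>Convert</option>
    <option>Retain</option>
  </select>
  <input name="audience" placeholder="Your audience segment">
  <input name="change" placeholder="What needs to change, and by when?">
</form>

<p id="preview"><em>Your objective will build here as you type.</em></p>

<script>
  const form = document.getElementById('builder');
  const preview = document.getElementById('preview');

  // Re-assemble the sentence on every input so students see the
  // objective take shape while they are still thinking it through.
  form.addEventListener('input', () => {
    const data = new FormData(form);
    const action = data.get('action') || '[action]';
    const audience = data.get('audience') || '[audience segment]';
    const change = data.get('change') || '[the change you want to see]';
    preview.textContent = `${action} ${audience}: ${change}`;
  });
</script>
```

Everything else in the real tool (the client brief, the problem types, the funnel-direction action words) layers onto this same rebuild-on-input pattern.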
What I learned building it, though, was just as important as the tool itself. The first version had click-to-fill audience suggestions, sentence starters, pre-written examples, and a field prompting students to name the barrier behind their objective. Every addition felt supportive, but every addition quietly reduced the thinking students were supposed to be doing themselves. I stripped almost all of it out. What remained was their brief: a question and a text field. The scaffold is the structure. The thinking has to be theirs.
AI & Education: My Thoughts
Both tools are built as single HTML files using Claude, hosted on Netlify, and linked directly into our LMS. Students click and it opens instantly — no login, no app, no friction. The speed of building matters too. The objective tool didn't exist before my first class finished. It existed because I could see in real time what wasn't landing, and I had a way to respond before the next session.
I want to be honest about what I think this is and isn't.
These tools don't replace student thinking — they create the conditions for it to start. There's a version of the AI-in-education conversation that romanticises what classrooms looked like before: students finding their own examples, writing their own frameworks from scratch, building creative confidence through productive struggle. I understand that instinct. But I also remember what that actually looked like for a lot of students — blank pages, surface-level responses, creativity that never got off the ground because there was no scaffold to push against. Telling a student to "go find two campaigns and analyse them" without giving them a way to think about relevance, transfer principles, or what to actually look for isn't rigour. It's just a harder version of being lost.
What I'm trying to build is something different: tools that expose students to more, stretch their thinking further, and give them a structured way to develop the kind of creative and analytical muscle that the AI-less classroom often assumes would just develop on its own. If a student leaves my class having genuinely interrogated why a campaign worked, what principle sits underneath it, and how that transfers to a real client brief — that's more learning, not less. The tool didn't do that thinking. It made the thinking possible.
For those who've followed this blog, you'll know I've been exploring AI in teaching for a while — from designing Jack, an AI industry agent who acts as Auckland FC's Marketing Manager, to using conversational agents to scale personalised feedback. Jack is still in my toolkit and doing something different: he's a thinking partner who asks questions and challenges assumptions in real time. These tools sit alongside that work, not in place of it. They're different kinds of scaffolding for different moments in the learning process, and together they're helping me build a teaching toolkit that is more responsive, more interactive, and more honest about what students actually need in order to grow.