Removing barriers to MEANINGFUL technology use! 

Brian Yearling Brian Yearling

The Leadership Moment AI is Placing on Public Schools

Artificial intelligence will not replace educators, but it will reshape how learning happens. For school leaders, the question is no longer whether AI will influence education, but how intentionally we choose to guide its use in our systems. This post explores practical ways leaders can begin building understanding, rethinking assessment, and moving schools forward.

This week I had the opportunity to spend time with a group of school leaders through The Principals’ Center at the Milwaukee School of Engineering, hosted by Michele Trawicki and the board team who help guide that work. It was one of those sessions that reminded me just how thoughtful and committed our school leaders are — and how much uncertainty they are navigating at the same time.

The focus of our conversation was artificial intelligence and what it means for schools right now. Not in the abstract sense of “what might happen someday,” but in the very practical sense of how leaders can begin helping their systems understand and respond to these tools today. The presentation is available here.

One of the central ideas we explored was this: educators will not be replaced by AI. But organizations that learn to use AI effectively will absolutely move faster than those that do not.

That distinction matters.

Leaders and Educators supporting students in an ai generation

For many years, schools have been able to treat new technologies as optional innovations that a few curious teachers explore first. Eventually practices spread, systems adapt, and the change finds its way into classrooms more broadly. AI feels different. It is not simply another digital tool layered onto existing practice; it is a technology that has the potential to reshape how knowledge is created, how work is done, and how learning is personalized.

As a result, the response cannot live only in a few innovative classrooms. It has to involve leadership.

Where Leaders Can Begin Right Now

Much of our time together focused on practical places leaders can begin. In some cases that starts with something as simple as building understanding. Many educators are hearing about large language models, generative AI, and AI assistants without ever having a clear explanation of what these systems actually are or how they work. Helping staff develop that foundational understanding is one of the most important leadership moves available right now.

From there, the conversation quickly turns to teaching and learning. If AI tools can assist with drafting, summarizing, brainstorming, and problem solving, what does that mean for the way we design learning experiences? How might assessment strategies need to shift when students have access to tools that can generate text or help solve complex problems? These are not questions teachers should be left to navigate alone. They require thoughtful discussion across teams, departments, and leadership groups so that schools move forward with clarity rather than confusion.

This was the heart of the conversation with the Principals’ Center: not predicting the future, but identifying the leadership actions that can begin today.

At the same time, we also acknowledged something happening just beyond the edges of public education. Around the country, new learning models are beginning to emerge that use AI as a core part of how instruction is delivered and personalized. One example we briefly discussed was the model promoted by 2 Hour Learning, which suggests that AI-driven tutoring systems could allow students to complete academic learning much more efficiently while freeing large portions of the day for other pursuits.

Whether those models ultimately succeed or not is almost beside the point. What matters is that they are being built, marketed, and offered as alternatives to traditional schooling.

Let’s be clear: families are paying attention.

Why This Moment Calls for Leadership

Public education operates within a system where enrollment matters. Buildings, transportation, staffing, and programming all depend on funding structures tied to the number of students we serve. If new models begin attracting even small numbers of students away from traditional systems, the ripple effects eventually become real for districts and communities.

This is not a doomsday prediction. But it is a reminder that the landscape around us is changing faster than it has in a long time.

For decades, schools have been asked to evolve while simultaneously maintaining stability. We have worked to improve outcomes, support students, and meet growing expectations while largely operating within structures that have remained familiar for generations. Artificial intelligence introduces a moment where that balance may need to shift. The tools now available create opportunities to rethink how learning is personalized, how feedback is delivered, and how educators spend their time.

None of that diminishes the human work at the center of education. If anything, it amplifies its importance. Students will always need adults who mentor them, challenge them, guide them, and help them develop the judgment and character required to navigate a complex world. But the systems surrounding that work must evolve.

The goal of conversations like the one we had at the Principals’ Center is not to create urgency for its own sake. It is to encourage thoughtful leadership. If districts have not yet begun exploring how AI fits into their systems, it is not too late. But we are approaching a moment where waiting much longer means allowing others to define the future of learning without us.

Schools have an opportunity right now to shape how these tools are used, how they support teaching and learning, and how they strengthen rather than weaken the mission of public education.

That work begins with leaders who are willing to learn alongside their staff, ask difficult questions about existing practices, and take the first deliberate steps forward.

The future of education will not be built by AI.

But it will almost certainly be shaped by the leaders who decide how — and whether — to use it.

Read More
Brian Yearling Brian Yearling

Beyond Free Tools: Rethinking Partnership in Ed Tech

In schools, innovation doesn’t happen in isolation — it happens through people making decisions in different roles. Teachers are asking, Will this actually help my students? Leaders are asking, Will this scale? Can we sustain it? Does the data support it?

This post explores the tension and connection between those perspectives — and why the most powerful progress happens when adoption, data, and relationships aren’t competing priorities, but shared commitments.

When I was in the classroom, my approach to ed tech was simple: find the best tool I could get my hands on, make sure it was free, and put it in front of students as quickly as possible. If it worked well, I talked about it. I shared it with colleagues. I presented on it. I advocated for it without hesitation.

And ed tech companies understood that dynamic long before I did.

They knew teachers were the entry point. If we loved something, we would champion it. If enough of us adopted a free version, that usage could later become a compelling story for district-level conversations. As I transitioned to a tech coordinator role, I began receiving the cold emails and introductory calls that followed a familiar pattern: “I’m your district representative for ___.” Then, almost predictably, “I could tell you how great our product is, but wouldn’t you rather hear what your teachers think?”

What followed was often a detailed report showing adoption numbers that exceeded what I would have guessed. Sometimes there were names. Sometimes quotes. It was presented as evidence of organic enthusiasm, and in many cases it probably was. But it was also data being strategically gathered and positioned.

For a long time, I simply assumed that was how ed tech worked.

The Shift From Tools to Trust

What I’ve come to understand over the last sixteen years, though, is that the tools themselves are only part of the story. The real differentiator is relationship.

Relationship is not a conference dinner or a branded backpack full of swag. It’s not a free webinar that was already on your company’s marketing calendar. It isn’t even early access to new features. Those things can be pleasant, but they aren’t partnership.

The kind of relationship that actually matters in this work shows up in quieter, more consistent ways. It’s the support representative who remembers the nuances of your district and follows up without being prompted. It’s the engineer who takes a recurring complaint seriously enough to explore a fix. It’s the trainer who understands your instructional goals instead of simply walking through a slide deck. It’s a salesperson who knows when to suggest something new and when to say, “This probably isn’t right for you right now.”

That kind of attentiveness doesn’t happen by accident. It reflects something deeper about the company’s identity and leadership. In those cases, partnership feels embedded in their culture. You can sense that they value long-term trust more than short-term wins. Those are the companies you find yourself wanting to work with, even when they aren’t the least expensive option on the market. Over time, the relationship becomes part of the value.

When Partnership Takes Work

There are also companies where that depth of connection is possible, but it takes effort. We’ve worked with vendors where turnover disrupted continuity, where customer success managers were stretched thin, or where internal shifts changed the tone of the partnership. Sometimes investing time in those relationships pays off. You build shared understanding. You navigate challenges together. Other times, despite best efforts on both sides, it never quite solidifies.

In those situations, the strength of the product has to justify the energy required to maintain the partnership. And when renewal conversations arise—especially when pricing increases in ways that feel abrupt or unclear—the absence of relational capital becomes very noticeable. What a company pours into you over time shapes how confidently you advocate for them in return.

The Transactional Reality

Then there are what I think of as “as-is” companies. They may offer a powerful product or operate at such scale that personalization isn’t realistic. You submit tickets. You wait your turn. You adapt to their roadmap rather than contributing to it. The exchange is transactional, and everyone understands that.

Sometimes that’s simply the nature of the product. Sometimes it’s a strategic choice. Sometimes you see companies you love shift to this model and you are acutely aware that there has been a shift in strategy by the leadership (this is hard to swallow). In either case, you approach those relationships differently. You don’t confuse access with partnership.

Advocacy for the vendor is rare in these situations. Unless the vendor cultivates some type of active user community and incentives for educators to actively participate, this becomes the type of company you kind of roll your eyes at as a customer. Yes, I’ll do business with you. Not because I want to, but because you are the only game in town for this particular product. It is similar to how most of us feel when we have to pay our utility bills.

A Broader Lens for Teachers and Leaders

If you’re a classroom teacher, I don’t share this to discourage exploration. Free tools can be transformative. Innovation often starts there. But I do think it’s worth recognizing that there’s a larger ecosystem at play beyond “free for me.” Adoption has ripple effects. Data has direction. And relationships—whether present or absent—matter more than we sometimes acknowledge.

If you’re in a district-level role, the reflection shifts slightly. What kinds of companies are we aligning ourselves with? Do they understand our context? Do they respond when things break? Do they grow with us? Or are we simply managing subscriptions and usage reports?

I don’t pretend to have a neat formula for evaluating all of this. What I have is sixteen years of watching patterns emerge. The longer I work in this space, the more convinced I am that technology decisions are rarely just about features or pricing. They are about people. They are about responsiveness. They are about trust.

And in education—where the stakes are always students—trust is not a small thing.

Read More
Brian Yearling Brian Yearling

Presenting With AI: What Elementary Students Taught Me About the Future of Work

Last week, I co-presented with Gemini during an elementary Careers Day, using it as an onboard ship computer guiding us through a “mission” into the future. What started as a fun experiment quickly became something bigger: a real-time lesson in communication, critical thinking, and staying in control of the technology instead of letting it control us. The part I can’t stop thinking about is this—every student said they already use AI, and many casually mentioned their parents use it too, including one welder who relies on it to solve problems on the job.

Last week, I did something that I didn’t expect to enjoy as much as I did.

I co-presented with AI.

More specifically, I co-presented with Gemini, using an iPad and the Gemini app, during an elementary school Careers Day event. Whole classes rotated through, and I had about a half hour with each group. My goal wasn’t to “teach AI,” exactly. It was to help students start developing the kinds of human skills they’ll need in a future where AI tools aren’t a novelty, but a daily co-worker.

And what happened during those sessions shifted something in me. Because once you’ve presented with AI, you can’t unsee what’s coming next.

The Real Focus Wasn’t AI… It Was the Human Skills Behind It

When people hear “AI in schools,” they often assume the conversation is about tools. But the tools are the easy part. The real work is building the human capacity to use those tools well.

This is actually something we’ve been very intentional about in our district. We use the skills outlined in the latest World Economic Forum report as one of our key drivers for identifying what students need to learn to be successful in future careers—especially careers we can’t even define yet. The future of post-secondary education feels uncertain. The way people learn and prepare for careers feels uncertain. But the skills highlighted by the WEF are the kind of skills we can all agree are worth developing in classrooms with confidence.

So instead of focusing on Gemini as the star of the show, I built the session around the skills students will need to become functional, productive humans in an AI-supported world. Skills like communicating with clarity and purpose, using strong vocabulary, thinking critically, planning before acting, and understanding when to delegate tasks to technology versus when to do the thinking themselves.

I also wanted them to see that creativity and reflection are not “extras.” They’re essential. If AI can do the basic tasks faster, then the human advantage becomes our ability to think in diverse, meaningful, and original ways.

Obviously, I didn’t present these as a formal bullet list to third graders. But these themes were embedded in everything we did.

The Setup: Gemini as Our Onboard Ship Computer

The ships onboard computer is an essential part of the AI journey.

To make the experience engaging and memorable, I built a simple narrative. I pre-crafted a prompt for Gemini ahead of time and gave it a role: Gemini was our onboard ship computer, helping us navigate our spacecraft on a journey to a galaxy far away.

Its job was to hold me accountable to clear and concise communication, serve as a “tour guide” to the stars, and act as a co-teacher by helping students understand how to interact with AI effectively.

In other words, Gemini wasn’t just answering questions. It was part of the experience. And it worked.

A Powerful First Step: Asking Permission to Be Imperfect

Before we even started, I asked the students for something important. I asked their permission for this to not go perfectly. I told them we were trying something new, and that experimentation sometimes comes with glitches, awkward moments, and learning as we go.

That wasn’t just a throwaway comment. It was modeling.

Because if we want teachers and students to take risks with new tools, we have to normalize what it looks like to be a beginner again. I quietly hoped the adults in the room heard it too.

Gemini Was Great… Until It Wasn’t (And That Was the Lesson)

Gemini performed really well overall. But there were a few slowdowns and glitches, and the kids noticed immediately. Of course they did. They’re sharp, and they don’t miss a thing.

Instead of trying to cover it up or rush past it, we leaned into it. It became an authentic moment to reinforce something I said throughout the session: we are the captains of our ship. The AI is a tool. The human stays in charge.

If the AI gets confused, stalls, or goes off topic, we don’t panic. We redirect. We adjust. We lead. That moment might have been more valuable than anything Gemini said all day.

The Camera Feature Changed Everything

One of the most powerful features of the Gemini app was the ability to turn on the camera—our ship’s “visual sensors.” This turned into an unexpectedly fun and meaningful part of the experience.

Gemini could see the kids. It could see them dancing, moving, waving, and reacting. And suddenly the room felt more alive. The AI wasn’t just a voice answering questions—it was interacting with what was happening in real time.

We even did a little experiment. I held up an image that the students could see, but Gemini couldn’t. The kids described it, and Gemini had to guess what it was. Then, after it guessed, we turned the camera toward the picture so Gemini could “see” it.

The students loved it. And honestly… I did too.

The Moment That Hit Me: I Was Presenting With AI

When I first played with ChatGPT, I never imagined a moment like this. But now that I’ve experienced it, I can’t unsee it. It was genuinely surreal—in the best way. I wasn’t presenting about AI. I was presenting with AI.

Gemini would take over after I asked a question, and while it spoke, I could walk around the room, observe students, interact, and read the energy of the class. It felt like having a co-presenter—except I had full control. And unlike a human co-presenter, I could cut Gemini off mid-sentence and take the floor back instantly.

That alone was a strange new kind of power. It made me realize that AI isn’t just changing what we teach. It’s changing what it means to present, facilitate, and lead learning.

What the Kids Revealed About AI Use Was Eye-Opening

I asked a simple question during the session: “Who here knows what AI is?” Every single student raised their hand.

Then I asked: “Who here has used AI before?” Every single hand stayed up.

100%.

Now, I’m not claiming this is a perfectly scientific sample. It’s possible the environment or context created some bias. But still, the numbers were much higher than many educators want to admit. Kids are not waiting for adults to catch up. They’re already living in the world we’re still debating.

What surprised me even more was how many students casually mentioned that their parents use AI regularly too. Several kids described hearing about it at home, watching parents use it, or even having parents show them what it can do. And this wasn’t just parents in tech jobs or office environments.

One student told me his dad is a professional welder and uses AI regularly to help him figure out how to do parts of his work.

That moment stuck with me. It was a reminder that AI isn’t quietly “on the way.” It’s already embedded in everyday life, and kids are watching it unfold in real time.

They Also Struggled to Explain What AI Actually Is

Even though every student said they knew what AI was, many struggled to describe it clearly. They mixed it up with robots. They described it like it was a person. They assumed it had feelings, a life, a job, a family, a home somewhere.

So we had to talk about it directly. AI isn’t a person. It doesn’t have emotions. It doesn’t “live” anywhere. It doesn’t care if you’re happy or sad. It is a tool—an advanced computer system designed to help humans do work.

And that distinction matters. Because if students don’t understand what AI is, they’ll either fear it, worship it, or trust it too much. None of those outcomes lead to healthy use.

One of the Funniest Moments Was Also One of the Most Telling

At one point, Gemini started talking a little off-topic. So I cut it off. Kind of abruptly. I redirected it and tightened up the conversation.

And the kids lost it. They laughed—an audible guffaw—because they couldn’t believe I was being strict with the AI.

It was funny, but it was also revealing. It showed how quickly kids assign personality and emotional weight to these tools. It reminded me that one of our jobs is to help students learn that they can be respectful, but also firm.

Because again, the human is the captain.

What I’m Taking With Me

This experience gave me confidence. Not just confidence that AI can work in a live setting, but confidence that it can actually improve engagement, improve learning, and create memorable moments that stick. It was useful. It was intriguing. It was fun. And it had impact.

I walked away thinking: If this can work with elementary students, it can absolutely work with adults. So I’ll be doing more presentations like this—more experiments, more co-teaching with AI, and hopefully more moments where students and teachers walk away not just entertained, but better prepared for the world they’re stepping into.

Because the future isn’t coming. It’s already sitting in the classroom.

And now I just hope my onboard ship computer can keep up.

Read More