Your Customers Have Better AI Than You

By Damian Mathews and The Last Mile Team

Somewhere right now, a 24-year-old is using AI to draft their twelfth escalation email to a company that still routes refund requests through a web form.

They are not going to stop. Their AI does not get tired, does not get frustrated, and does not hang up after twenty minutes on hold. And they are not an edge case. They are the mainstream consumer, and they aren’t going away.

Millennials (roughly ages 28-43) carry an estimated $3-4 trillion in annual spending power and make more than half of their purchases online. Gen Z (ages 13-28) follows at $1.5-2 trillion in spend, but with a significantly larger share, roughly 80%, of it online. These are the dominant e-commerce cohorts, and they are also the people pushing AI adoption hardest.

Adobe’s research shows 46% of Millennials are already using AI for online shopping, with another 12% expecting to by the end of 2025. One in three Gen Z shoppers now prefers AI platforms over search for product research. They are not just browsing differently: they are negotiating, escalating, and seeking support with more persistence than any unassisted customer ever could.

Your customers get access to the same frontier AI models you do. The difference is they do not have a governance board. No security review. No procurement cycle. No six-month pilot. They saw the tool, then they opened an app, and they just… started using it. Meanwhile your last AI deployment took 18 months to clear legal.

That gap is where the pressure seems to be building.

The friction points your service model assumes (the hold queue, the callback window, the 48-hour email SLA) have contributed to a delicate equilibrium: service gets harder to reach when it’s overwhelmed, so customers eventually give up.

A tired human calls once, waits, gets frustrated, and maybe tries again tomorrow. Your customers’ AI does not have a patience ceiling. A frustrated 32-year-old can set their agent to email your team every hour, call every ten minutes, and reopen a chat session every time the previous one expires. 

There are two ways to respond.

The first is the obvious one: just resolve the issue. If a customer’s AI is running your contact center in circles at 3am, the underlying problem almost certainly existed at 6pm and nobody closed it. An AI that can genuinely solve the problem, not just log it, breaks the loop. The persistence will stop when the need is met.

The second is harder. You need to be reading the signal at scale. “Refund requests up 100x this week” is a signal that something broke upstream. A product issue. A fulfillment failure. A policy gap. Your customers’ AI found it first, and now it is running the same play over and over until someone on your side notices.

The OODA loop piece I wrote a few weeks back covered how to build that detection loop. The prerequisite is analyzing every conversation with AI, continuously, so the pattern surfaces in hours instead of quarters. Without that, you are always responding to last month’s problem.
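To make that concrete, here is a minimal sketch of the kind of spike detection that loop depends on: compare today’s contact volume for a category against a short rolling baseline and flag the outliers. The window, threshold, and counts below are illustrative assumptions, not figures from our data.

```python
from collections import deque

def make_spike_detector(window=7, threshold=3.0):
    """Flag a count when it exceeds `threshold` x the rolling mean.

    `window` is how many prior periods to keep as the baseline.
    Both parameters are assumptions you would tune per category.
    """
    history = deque(maxlen=window)

    def check(count):
        # Baseline is the mean of recent history, if we have any yet.
        baseline = sum(history) / len(history) if history else None
        history.append(count)
        if baseline is None or baseline == 0:
            return False  # not enough history to judge
        return count > threshold * baseline

    return check

# Daily refund-request counts: steady for a week, then a huge jump.
detect = make_spike_detector()
counts = [12, 15, 11, 14, 13, 12, 1300]
flags = [detect(c) for c in counts]
# Only the final day is flagged: flags == [False]*6 + [True]
```

In practice the same check runs per intent category (refunds, cancellations, login failures), so a 100x spike in one lane surfaces in hours instead of hiding inside total volume.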

Younger customers didn’t grow up expecting companies to be slow. They built their habits in an environment where the tools got better every few months. When your service model fails them, they have something earlier generations did not: a tireless agent that will keep the pressure on indefinitely.

Will you know when your customers’ AI starts flooding customer service?

Do you have what it takes to respond (and solve the root causes) when they do?

— Damian

PS: If you’re new here, this newsletter brings you the best from Waterfield Tech experts and the frontier of AI, CX, and IT. Also, Kerry posts weekly at The Dualist, and Fish and Dan share their thoughts every other week at Outside Shot and Daichotome.

Here’s what went down this week.

Bleeding Edge

Early signals you should keep on your radar.

Google DeepMind released Gemini Robotics-ER 1.6, a spatial reasoning model now live in Boston Dynamics’ Spot robot for industrial inspections. The upgrade lets Spot autonomously read analog gauges, detect debris, and reason about physical environments during facility walkthroughs. Warehouse and plant experts could soon face a new procurement question: do you hire an inspector, or lease a robot that never misreads a pressure dial?

Gartner forecasts global semiconductor revenue will top $1.3 trillion in 2026, its fastest growth in two decades. AI chips now account for roughly 30% of total semiconductor revenue, with hyperscaler spending expected to climb more than 50% this year. When one component category rewrites the growth math for a trillion-dollar industry, supply chain teams may want to plan further ahead than they currently do.

Leading Edge

Proven moves you can copy today.

Meta shipped Muse Spark, its first proprietary model from the new Superintelligence Labs, now powering Meta AI across WhatsApp, Instagram, and Messenger. Built from scratch in nine months, it scored 52 on the Artificial Analysis Intelligence Index, trailing Gemini 3.1 and GPT-5.4 at 57. Meta’s pivot from open-source Llama to a closed model suggests it sees more value in controlling its AI stack than in developer goodwill.

OpenAI launched a $100-per-month ChatGPT Pro plan with five times the Codex access of Plus, directly targeting Anthropic’s Claude Max at the same price point. Enterprise revenue now makes up over 40% of OpenAI’s total, with the company generating roughly $2 billion per month. The pricing war between OpenAI and Anthropic may compress margins for both, but enterprise buyers should appreciate the leverage it creates.

Off the Ledge

Hype and headaches we’re steering clear of.

Nearly 80,000 tech workers lost their jobs in Q1 2026, with almost half of the cuts explicitly attributed to AI and automation. Oracle alone shed an estimated 30,000 positions via a 6 AM email, while Meta, Salesforce, and Amazon all linked reductions to AI-driven productivity gains. Companies are simultaneously cutting headcount and pouring billions into AI infrastructure, creating what analysts are calling the “AI employment paradox.”

OpenAI is shutting down Sora, its AI video app that burned an estimated $15 million per day in compute against lifetime revenue of just $2.1 million. Disney’s planned $1 billion partnership unraveled after the company learned of the shutdown less than an hour before the public announcement. Sora may become the cautionary benchmark: impressive demos and sustainable revenue are proving to be very different problems in generative AI.
