Search Is Becoming an Agent Manager (Here's What That Means for Your Tree Service Website)
- AI Search
- SEO
A few weeks ago, Sundar Pichai sat down for a long-form interview about the next decade at Google. Most of the headlines focused on the $180 billion CapEx number and Google's AI comeback. We're focused on a quieter line buried in the middle of the conversation:
"A lot of what are just information-seeking queries will be agentic in Search. You'll be completing tasks. You'll have many threads running... Search would be an agent manager in which you're doing a lot of things." — Sundar Pichai, CEO of Google
Plain English: in the not-too-distant future, when a homeowner needs an arborist, they may not type a search and click through three websites. They'll tell an AI agent what they need and let the agent do the legwork — comparing service areas, verifying credentials, finding availability, even submitting an estimate request.
If your tree service website is built only for the human visitor, it's going to lose ground to the ones built for both humans and the AI agents acting on their behalf.
Here's what "AI-ready" actually means at the architecture level — and why most of the homework is the same homework that already makes a website faster, cleaner, and higher-converting for the homeowners visiting today.
This isn't a forecast — it's already shipping
In April 2026, Google's Search Product Lead Rose Yao announced the global rollout of agentic restaurant booking inside AI Mode:
"Date nights and big group dinners just got a lot easier... Tell AI Mode your group size, time, and vibe — it scans multiple platforms simultaneously to find real-time, bookable spots. No more app-switching. No more hassle."
Translation: a user describes the task, and Google's agent fans out across restaurant websites and booking platforms to complete it on their behalf. That's not search returning a list of links. That's an agent finishing a job.
Restaurants are first. Local services are next. Picture the same shift applied to a homeowner in Boise after a windstorm — a maple has dropped a major limb onto the driveway — and instead of opening Chrome and clicking through five tree service websites, they tell their assistant:
"Find a certified arborist who can come tomorrow morning, request an estimate, and prefer companies with current insurance and TCIA accreditation."
The agent's job is to scan candidate websites, parse credentials, check service areas and hours, find a contact path, and where possible initiate the estimate request. Your website becomes the input to a decision, not the destination at the end of one.
That changes the design brief.
Three pillars of an AI-ready tree service website
None of these are new ideas in isolation. They're the same things that already make a site faster and more conversion-friendly for humans. The agent just makes them non-negotiable.
Pillar 1: Speed is a culture, not a number
Sundar described how Google internally treats latency:
"They now have for sub-teams, latency budgets in the milliseconds. You get 50% credit... If you ship something which shaves off 3 milliseconds, you earn 1.5 milliseconds for your latency budget. Depending on what we think you're doing, some people may get a latency budget of 30 milliseconds or 10 milliseconds."
He also noted Search latency at Google has improved 30% over the last five years even while the product got significantly more capable.
The point isn't that local websites need millisecond-level targets. The point is that the culture of speed has shifted at the platforms shaping how the web gets consumed. Slow is a problem to be solved, not a tradeoff to be tolerated.
For an AI agent visiting your website on behalf of a customer, slowness costs you more than it does with a human visitor. A human waits a beat and clicks. An agent has a budget for how long it's willing to spend on each candidate site, because anything else makes for a terrible product. A 6-second time-to-first-byte on your tree service site doesn't just bounce a homeowner; it gets your business deprioritized by the agent before the human ever sees the candidate list.
This is the same fight we've been picking for years: hand-coded, lightweight, edge-deployed websites with perfect Core Web Vitals. The audience for that fight just got bigger.
Pillar 2: Clean, semantic code is how the agent understands your business
Sundar pointed out something most people miss when they think about Google's AI history:
"Transformers were immediately used. BERT and MUM, people underestimate how much... Some of the biggest jumps in search quality in that period where search went ahead of everyone else was because of BERT and MUM... We built Transformers and used it immediately in Search to improve language understanding, understanding web pages, understanding your queries."
The same family of models that interprets a user's query also reads your pages. AI agents (Google's, OpenAI's, Anthropic's, whoever's) work the same way. They need to understand your page, not just crawl it.
That puts a premium on:
- Real HTML at first paint, not content trapped behind JavaScript that has to execute before anything renders
- Proper heading structure and semantic tags (`<main>`, `<nav>`, `<article>`, `<section>`)
- Schema.org markup for `LocalBusiness`, `Service`, `OpeningHours`, `ServiceArea`, `Review`, and, critically for tree services, `hasCredential` for ISA Certified Arborist or TCIA Accreditation (a minimal sketch follows this list)
- Plain-text phone numbers and addresses, not text baked into hero images
- Service descriptions that answer specific questions ("emergency removal," "stump grinding," "24/7 storm response") in plain language
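As a rough sketch of what that markup can look like (every value below, from the business name to the phone number and credential, is a placeholder, not a prescription for your exact setup), a JSON-LD block in the page head covers most of it:

```html
<!-- Illustrative only: name, phone, address, hours, and credential are placeholder values. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Tree Care Co.",
  "telephone": "+1-208-555-0123",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Boise",
    "addressRegion": "ID",
    "postalCode": "83702"
  },
  "areaServed": ["Boise", "Meridian", "Eagle", "Garden City"],
  "openingHoursSpecification": {
    "@type": "OpeningHoursSpecification",
    "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
    "opens": "07:00",
    "closes": "18:00"
  },
  "hasCredential": {
    "@type": "EducationalOccupationalCredential",
    "name": "ISA Certified Arborist",
    "recognizedBy": {
      "@type": "Organization",
      "name": "International Society of Arboriculture"
    }
  }
}
</script>
```

An agent that reads structured data gets the service area, hours, and credential directly, with no need to infer them from prose.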
The page-builder tree service site that loads a 4MB JavaScript bundle and renders content client-side may look fine to the human visitor. The AI agent reading your page in headless mode is missing half the story — and the part it's missing is the part that wins the recommendation.
Pillar 3: Agent-actionable architecture
This is the new pillar. The first two existed before agents; this one is specifically about giving an agent the surface area it needs to act on a user's behalf.
Search marketer Mike Stewart framed the right set of questions to ask in this new model:
"Who controls the journey? What sources does the agent trust? Where does your business show up in that decision layer?"
Pillar 3 is how you answer the second and third. For a tree service website, that means:
- An estimate request form whose fields are clearly labeled, with proper `name`, `id`, `autocomplete`, and `aria-label` attributes, so an agent can populate it without guessing what each field is for
- A `tel:` link on a visible phone number on every page, not buried inside a contact widget
- Credentials marked up as structured data, not just listed in a hero image. ISA certification, TCIA membership, insurance carrier, and bonding should all be machine-readable
- Service area data — actual ZIP codes or city names — in the markup, not only plotted on a JavaScript-driven map
- Emergency and after-hours availability called out explicitly. If you take 24/7 storm calls, an agent shouldn't have to infer it from your copy
- A book-an-estimate path that doesn't require navigating a five-page funnel before a form appears
You're effectively writing a small API into your website using HTML. The richer that surface area, the more agents will pick your business out of the candidate list.
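Here's a minimal sketch of that surface area (the phone number, field names, and `/estimate` endpoint are placeholders; your own form handler goes in the `action`):

```html
<!-- Sketch: a visible tel: link plus labeled, autocomplete-annotated fields an agent can fill without guessing. -->
<a href="tel:+12085550123">Call (208) 555-0123 for 24/7 storm response</a>

<form action="/estimate" method="post">
  <label for="full-name">Your name</label>
  <input id="full-name" name="full_name" type="text" autocomplete="name" required>

  <label for="phone">Phone number</label>
  <input id="phone" name="phone" type="tel" autocomplete="tel" required>

  <label for="address">Service address</label>
  <input id="address" name="address" type="text" autocomplete="street-address" required>

  <label for="details">What do you need done?</label>
  <textarea id="details" name="details"
            aria-label="Describe the tree work you need, e.g. storm-damaged limb over the driveway"></textarea>

  <button type="submit">Request an estimate</button>
</form>
```

Each field is announced twice: a visible `<label>` for the homeowner, and `name`, `autocomplete`, and `aria-label` attributes for anything filling it programmatically.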
The new-build advantage
One of the most useful things Sundar said in the interview was about why younger companies have an edge in this transition:
"That's one advantage startups are going to have. More AI-native teams... Whereas for us, we would have retraining, transformation, et cetera."
The same logic applies at the website level. A site built AI-native from day one — clean HTML, structured data, fast TTFB, agent-friendly forms — is dramatically cheaper to ship than retrofitting a five-year-old WordPress build with twelve plugins, a page builder, and a render-blocking JavaScript stack.
Most tree service competitors are running the latter. Most aren't going to fix it. The window to build this advantage is wider right now than it'll be in two years.
A starting checklist
If you're auditing your current tree service website for AI-readiness, run through this:
- Time to first byte under 600ms on a real mobile device on a slow connection
- All critical content (services, service area, phone, credentials) present in HTML at first paint — not loaded by JavaScript later
- `LocalBusiness` schema markup with full NAP, hours, service areas, and review aggregate
- Each service (removal, trimming, stump grinding, emergency response) on its own URL with `Service` schema (see the sketch after this checklist)
- Credentials (ISA, TCIA, insurance carrier, bonding) in structured data, not only listed on the about page
- Forms with semantic field labels, autocomplete attributes, and a clear single-step submission flow
- `tel:` links on every page, footer included
- No primary content gated behind cookie banners, modals, or interaction-required UI
- No page-builder bloat: clean, hand-written or framework-output HTML
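To make the per-service item concrete, a dedicated stump grinding page might carry a small `Service` block like this (provider name, areas, and wording are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Service",
  "serviceType": "Stump grinding",
  "provider": {
    "@type": "LocalBusiness",
    "name": "Example Tree Care Co."
  },
  "areaServed": ["Boise", "Meridian", "Eagle"],
  "description": "Stump grinding for residential and commercial properties, including cleanup and haul-away."
}
</script>
```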
Most of these are things our team already does on every site we ship. The new framing is just that this work isn't only for the homeowner anymore. It's for the agent that homeowner is going to send.
The takeaway
Sundar's interview is going to get quoted for the CapEx numbers and the data centers in space. The line that matters for local service businesses is the quieter one: "Search would be an agent manager in which you're doing a lot of things."
If your tree service website is going to be one of the things the agent does business with, the time to build for that is before the agent shows up — not after.
We build websites for both audiences: the homeowner today and the agent tomorrow. If you'd like an honest audit of where your current site lands on the checklist above, that's a conversation worth having now while the gap between you and the slower competition is still wide open.
Sources & further reading
- Sundar Pichai interview on the Cheeky Pint podcast with John Collison and Elad Gil
- "Google's Task-Based Agentic Search Is Disrupting SEO Today, Not Tomorrow" — Roger Montti, Search Engine Journal (April 13, 2026)