Industry News
How AI Agents Read Your Business Website. Most Sites Fail That First Visit.

Three different AI agents now browse the web on behalf of real customers. ChatGPT Atlas launched in October 2025. Anthropic's Claude Computer Use feature went into broader release in March 2026. OpenAI's ChatGPT agent took over from the older Operator product the previous summer. None of these are demos. They click links, fill forms, compare options, and increasingly book appointments while the human who started the request is doing something else entirely.
I have spent the last few weeks watching server logs from sites on our infrastructure to see what these agents do when they hit a small business website. The short version is depressing. Most sites are not built for them. The structure that lets a human visitor figure out where the booking page is and how to call the business does not work for a machine that reads the page top to bottom or analyses screenshots one frame at a time.
This article is the reference I keep wanting to send to business owners trying to make sense of what is changing. None of the fixes are exotic. Most of the work is structural.
What Is Actually Happening on the Web Right Now
Agentic browsing has stopped being theoretical. Adobe Analytics published data in March showing AI-driven visits to US retail sites grew by roughly 393% year on year in the first quarter of 2026, with those AI-referred shoppers converting around 42% better than visitors arriving from traditional channels. Engagement rates ran about 12% higher and session lengths close to half-again longer. The reasoning is straightforward. An AI agent has already filtered the search and arrived with intent.
Three product launches drove most of that volume. OpenAI's Atlas browser, released to Plus, Pro, and Business users in October 2025, builds ChatGPT directly into a Chromium-based browser and lets it act on what it sees. Anthropic's Claude Computer Use, in research preview from October 2024 and expanded a year and a half later, runs on a continuous screenshot-analyse-act loop. ChatGPT agent, released in summer 2025, replaced Operator entirely.
The commercial infrastructure is following. As CNBC reported in late 2025, Visa and Mastercard are both preparing payment rails for agent-initiated transactions, and Google launched a Universal Commerce Protocol in January 2026 with Shopify, Target, and Wayfair. The broader picture is one I have written about in the piece on AI bots already accounting for half of website traffic. Agents are not a side experiment anymore. They are the new traffic source the rest of the industry is being asked to accommodate.

How an AI Agent Reads Your Site Differently Than a Human Does
Most agents fall into one of two camps. The first parses the underlying HTML and looks for structured signals: schema markup, OpenGraph tags, semantic landmarks like header, main, footer. The second takes screenshots and tries to interpret them visually, then converts visual elements into action targets. ChatGPT Atlas can use either approach depending on the task. Claude leans more on the visual route. The older crawlers training language models on web content stay on the structured side.
Either way, the agent is not reading your homepage the way a human reads a magazine. It does not pause on your hero image. It does not warm up to your tagline. It is hunting for specific information: hours, location, prices, contact methods, available time slots, product attributes. Anything that buries those answers (a JavaScript modal, a "click here to see availability" button that opens a popup, an image of a phone number rather than the number as text) is friction the agent has to work around. Sometimes it does. Often it gives up.
The street-level version of this is simple. A customer asks ChatGPT to find a dog groomer with a free Saturday slot in June. The agent visits five candidate sites. On three, the booking calendar is rendered by a third-party JavaScript widget that loads after the rest of the page. The agent's screenshot is taken too early, sees nothing useful, moves on. On the fourth, the phone number is in an image. On the fifth, contact details are in plain text in the footer with a tel: link, and Saturday slots are listed in a structured calendar. That site gets the booking. Same dog. Different outcome entirely because of how the page was built.
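To make the difference concrete, here is roughly what the agent's first pass sees in each case. Both snippets are invented for illustration; the widget URL and phone number are placeholders, not real services.

```html
<!-- The losing sites: the booking widget mounts via JavaScript after
     page load, so the HTML the agent fetches first is empty. -->
<div id="booking-widget"></div>
<script src="https://widget.example.com/embed.js" async></script>

<!-- The winning site: plain text, parseable on the first pass with
     no JavaScript execution required. -->
<footer>
  <p>Call us: <a href="tel:+35361123456">061 123 456</a></p>
  <p>Saturday availability: 09:00, 11:30, 14:00</p>
</footer>
```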
Five Structural Details That Decide Whether an Agent Uses Your Site
Watching this play out across our customer base, I keep seeing the same five details as the difference between a site that gets used and a site that gets skipped.
Machine-readable contact information. Phone numbers as tel: links, emails as mailto: links, opening hours in structured text, addresses tagged with LocalBusiness schema; there is a markup sketch covering all of these after this list. None of this is hard. Most agency-built sites still get it wrong because the designer wanted the phone number rendered as a custom font in an image.
Structured data on the right pages. Schema.org JSON-LD describing what the business is, what it sells, where it operates, and what its services cost. According to Google's Search Central blog earlier this year, structured data quality is now an input AI Mode considers when selecting sources, alongside PageRank and content freshness. Search Engine Land's recent piece on this rightly cautioned against treating schema as a citation cheat code. It is not. But pages without it sit further down the consideration set.
Server-rendered HTML that loads quickly. If your page renders meaningful content only after JavaScript runs in the browser, an agent that takes a screenshot too early sees nothing. Slow servers compound the problem. The Web60 sites I have measured against agent test scripts, doing the same booking lookup on each, generally land in the half-second to one-second range, well inside the window where most agents will wait. Sites on cheap shared hosting routinely time out before the agent commits to any action.
Semantic HTML structure. Headings in the right order. A single H1. Lists that are actually ul lists, not styled div wrappers. Forms with proper label tags. The agent uses these as landmarks. WordPress on a sensibly built theme gets this right by default. WordPress with two dozen page builders bolted on often does not.
Open access for legitimate agents. Aggressive bot-blocking can lock out the same agents your customers are using. ChatGPT Atlas reports its user agent honestly. Claude does too. Where your hosting provider supports it, you want allow rules for verified agent traffic rather than a generic anti-bot wall that treats every non-human request as hostile; a robots.txt sketch follows below. Otherwise you may be blocking your future customers along with the scrapers.
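Pulled together, here is a minimal sketch of what the first four details look like in markup. The business details are invented, and on a real site a maintained schema plugin would generate the JSON-LD block for you, but this is the shape an agent is hunting for:

```html
<head>
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Dog Grooming",
    "telephone": "+353-61-123456",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "1 Main Street",
      "addressLocality": "Limerick",
      "addressCountry": "IE"
    },
    "openingHours": "Tu-Sa 09:00-17:30"
  }
  </script>
</head>
<body>
  <header><h1>Example Dog Grooming</h1></header>
  <main>
    <!-- A real list, not styled divs: agents use it as a landmark -->
    <ul>
      <li>Wash and trim from €40</li>
      <li>Saturday appointments available</li>
    </ul>
  </main>
  <footer>
    <p>Phone: <a href="tel:+35361123456">061 123 456</a></p>
    <p>Email: <a href="mailto:bookings@example.ie">bookings@example.ie</a></p>
  </footer>
</body>
```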
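On the access side, a robots.txt that welcomes documented agent traffic while leaving your existing rules alone might look like the sketch below. The tokens shown are the ones OpenAI and Anthropic document at the time of writing; check the vendors' current bot documentation before copying, and remember robots.txt only governs well-behaved crawlers, so any firewall allowlist at your host needs setting separately.

```
# User-initiated agent traffic (a human asked the agent to visit)
User-agent: ChatGPT-User
Allow: /

User-agent: Claude-User
Allow: /

# Search crawlers that can cite you in AI answers
User-agent: OAI-SearchBot
Allow: /

# Everything else keeps whatever policy you already have
User-agent: *
Disallow: /wp-admin/
```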
Why This Matters for Local Irish Business Sites
A veterinary practice in Limerick I have been watching uses a custom-built WordPress site with a plugin-based booking system. The plugin renders entirely in the browser, after the rest of the page loads. When I tested it through ChatGPT Atlas, asking for an emergency cat appointment in Limerick this Saturday, the agent found the practice and its contact page but never managed to read the booking calendar. It fell back to suggesting the customer phone the practice. That is not the worst outcome, but it is one less route for an after-hours customer to convert before they call somewhere else whose calendar is structured properly.
What 'AI-Ready' Actually Looks Like
You can describe an AI-ready business website in a paragraph, and it sounds almost dull. WordPress on a properly maintained stack. A current theme using semantic HTML. A schema markup plugin configured for the business type. Contact details in plain text. Server-rendered pages that load fast. SSL. Privacy-first analytics that does not slow the page or trigger consent dialogs that confuse agents. None of this is exotic. It is the same advice that has been in WordPress optimisation guides for a decade, with the stakes raised.
This is the platform layer of the problem, and it is where Web60's all-inclusive €60-a-year managed hosting earns its place. The default stack ships with WordPress core, Nginx and Redis for fast page rendering, free SSL, structured analytics, and managed updates. A Web60 site is not magically AI-optimised. But it has the structural prerequisites in place from the moment it is provisioned, and that matters when you are competing with a site whose JavaScript-heavy hero never finishes rendering before the agent moves on. The wider case for AI-powered WordPress as the default for Irish business websites is laid out in our pillar piece on what every business owner needs to know in 2026.
If you are running a high-traffic enterprise WordPress estate with a dedicated DevOps team, complex deployment pipelines, and bespoke caching needs, an enterprise-tier managed host genuinely fits that workload better than an all-inclusive plan. That is a real concession to make. But it is also not where most local Irish businesses sit. For the rest, structural prerequisites by default beats configuration overhead every time.

If you are running an enterprise content site whose business model depends on readers clicking through to the article, blocking AI crawlers and agents at scale makes commercial sense; the major news publishers have set exactly that up through Cloudflare and its equivalents. That is a genuine case where keeping agents out is the right call. But for a local services business whose goal is being found and being booked, blocking is the last thing you want.
The Reality Check
AI agents make mistakes. They click the wrong button. They sometimes book the wrong appointment. They occasionally repeat the same form submission three times because the confirmation page took too long to load. Where your hosting provider supports it, server-side form deduplication and basic rate limiting cover most of this. None of it is unique to AI traffic, but agents stress these systems faster than human users do.
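Where your host exposes the Nginx configuration, a minimal rate-limiting sketch looks something like the following. The zone name and thresholds are illustrative rather than tuned to any real workload, and duplicate-submission handling proper needs an application-level nonce check that this does not cover; on managed plans this is the sort of thing your provider sets server-side.

```
# In the http block: track clients by IP, allow ~10 form requests/minute
limit_req_zone $binary_remote_addr zone=formlimit:10m rate=10r/m;

# In the server block: apply it to the endpoint that handles submissions
location = /wp-comments-post.php {
    limit_req zone=formlimit burst=3 nodelay;
    # ... existing PHP handling ...
}
```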
There is also a temptation, encouraged by every "GEO" agency that has popped up in the last six months, to install an llms.txt file at the root of the site and assume the job is done. A recent crawl of nearly 300,000 domains found roughly one in ten has adopted llms.txt at all, and Google has stated publicly that AI Overviews and AI Mode continue to rely on traditional SEO signals rather than this file. Add it if you want. It does no harm. Do not treat it as a substitute for the structural work.
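For completeness, the proposed format is short: a Markdown file at the site root with the business name, a one-line summary, and links to the pages a model should prefer. The entries below are invented for illustration.

```
# Example Dog Grooming
> Dog grooming in Limerick. Online booking, Saturday slots, prices listed.

## Key pages
- [Services and prices](https://example.ie/services)
- [Book an appointment](https://example.ie/book)
```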
How to Get Your Site AI-Agent Ready
Five steps. None of them require a developer if you are on a sensibly built WordPress stack.
Audit the structural basics. Verify that headings, contact details, and key business information render in plain HTML, not as images or after-the-fact JavaScript. View the page source if you have to; there is a command-line sketch after these steps.
Configure schema markup. Use a maintained schema plugin to mark up the business, its services, opening hours, and location. Validate the output through Google's Rich Results test.
Verify production performance. Push the homepage and the booking or contact page through PageSpeed Insights. Anything below 70 on mobile needs attention before anything else.
Verify firewall rules. Confirm that legitimate AI agents are not being blocked by an over-zealous bot rule. Check your hosting panel or ask your provider.
Test it with the agents themselves. Open ChatGPT in agent mode and ask it to perform the most common booking or contact task on your site. Watch what it does. Fix what it cannot.
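For step one, you do not need more than the command line. A rough check, assuming a Unix shell and that your key details live on the contact page (the URL is a placeholder):

```
# Fetch the raw HTML the way an agent's first pass does (no JavaScript)
curl -s https://example.ie/contact/ > page.html

# Is the phone number machine-readable?
grep -c 'tel:' page.html

# Is there any structured data at all?
grep -c 'application/ld+json' page.html
```

A count of zero on either grep means the agent's first pass sees nothing, whatever the rendered page looks like to you.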
The Bottom Line
The web is being read by machines that act on what they read, and they are starting to drive real customer traffic. The structural fixes are not glamorous. Semantic HTML, fast pages, schema markup, plain-text contact details. Same advice that has applied to WordPress sites for years, with the difference that the cost of ignoring it is now measurable in lost bookings rather than lost ranking points. The small business owner who treats their website as an asset that needs to be machine-readable as well as human-readable will be in the consideration set when AI agents start choosing where to send customers. Everyone else will rely on the agent giving up and showing the customer a phone number to call.
Frequently Asked Questions
Should I block AI bots and AI agents from my business website?
For most local businesses, no. The agents browsing on behalf of real customers (ChatGPT Atlas, Claude, ChatGPT agent) are increasingly the ones choosing where to send bookings. Blocking removes you from that consideration set. Enterprise content publishers whose model depends on direct page views may have a genuine reason to block, but a local services business almost never does.
What is llms.txt and do I need it on my business website?
llms.txt is a proposed file you place at the site root to tell large language models how they can use your content. Adoption is moderate, around one in ten domains in a recent crawl. Google has confirmed AI Overviews and AI Mode do not rely on it. Adding one does no harm and can help with AI training crawlers, but it is not a substitute for the structural work that helps AI agents actually use your site.
How do I know if my website is AI-agent friendly?
The fastest test is to run the most common customer journey through an AI agent yourself. Ask ChatGPT in agent mode to find a service like yours and book an appointment or get a quote. If the agent gets stuck, you have your answer. The harder, more useful tests involve schema validation, PageSpeed scores, and a manual audit of how key information is rendered.
Does schema markup guarantee that AI search engines will cite my site?
It does not. Schema is one input alongside content quality, entity authority, and site freshness. Pages with proper schema are cited more often in AI Overviews, but no markup type guarantees inclusion. Treat it as table stakes, not a magic switch.
Sources
- Introducing ChatGPT Atlas, OpenAI
- Introducing ChatGPT agent: bridging research and action, OpenAI
- Introducing computer use, a new Claude 3.5 Sonnet, Anthropic
- AI traffic grows but retail sites lag in AI search visibility, Adobe
- How schema markup fits into AI search, without the hype, Search Engine Land
- Google Search referrals to the web have plummeted, 9to5Google
Graeme Conkie founded SmartHost in 2020 and has spent years building hosting infrastructure for Irish businesses. He created Web60 after seeing the same problem repeatedly — Irish SMEs paying too much for hosting that underdelivers. He writes about WordPress infrastructure, server security, developer workflows, managed hosting strategy, and the real cost of hosting decisions for Irish business owners.