
Every Analytics Script You Install Slows Your Site. Server-Side Analytics Does Not.

Graeme Conkie · 12 min read
[Illustration: data flowing from one website into a single secure collection point while smaller, disconnected trackers fade in the distance]

Reviewing performance data for one of our customers this morning, a Kilkenny craft brewery running an online shop, I watched the same pattern play out that I have seen a dozen times before. Mobile Largest Contentful Paint had crept from 2.1 seconds to 3.4. Interaction to Next Paint had collapsed from a respectable 180ms to almost 600. The homepage had not changed. No plugin updates. No theme redesign. What had changed was marketing.

They had added Google Analytics 4. Then Meta Pixel for Instagram ad retargeting. Then a heatmap tool a marketing consultant had recommended. Then, because a consent banner was blocking everything, a consent management platform that loads another 80 kilobytes of JavaScript before any of the above kicks in. Four tools. All of them doing broadly similar things. All of them running in the customer's browser on every page load, for every visitor, on every device.

I was looking at Core Web Vitals. They were looking at a dashboard full of data. Nobody had asked the obvious question, which is whether that data was worth the speed they were paying for it.

This is the quiet cost of browser-based analytics, and it is the reason server-side analytics exists.

What Your Analytics Stack Is Actually Doing to Your Site

The HTTP Archive's Web Almanac 2024, in its chapter on third parties, tracked resources across the top 1 million websites and found the median site carries 27 third parties per page. Top 1,000 sites run a median of 66. Scripts account for roughly 30 percent of all third-party requests, and most of them are analytics, tag managers, consent tools, or advertising pixels.

Every one of those scripts competes for the main thread. That is the browser resource responsible for rendering your page, responding to taps, and processing user input. When it is busy executing tracking code, it cannot do the things your customer actually came to do.

The Web Almanac's performance chapter found that mobile pages carrying user-behaviour (analytics) scripts maintain a good Interaction to Next Paint score only 37 percent of the time. Pages with consent management platforms do only marginally better. Google's own performance documentation, in its guidance on tag managers, puts it plainly: "we have seen a correlation between the size of tag managers and poorer INP scores."

Here is what that means for an actual customer. INP measures how quickly a site responds to a tap. Above 200ms, the site feels sluggish. Above 500ms, customers notice delay on every interaction. When analytics scripts are saturating the main thread, your checkout button feels slow. Your menu takes a moment to open. Your "Add to cart" hesitates. The customer who was about to buy decides they will come back later. Most of them do not.

A Google Tag Manager container starts at around 33 kilobytes compressed and grows from there as you attach tags. Plus GA4 itself. Plus Meta's pixel. Plus whatever the marketing team has wired into it. The page is not just larger. It is more fragile. Any one of those services can misbehave, fail to respond, or block other tasks, and the whole loading sequence is hostage to the slowest of them.

This matters directly to sites in Ireland. Our breakdown of Core Web Vitals failure rates across Irish WordPress sites found that most Irish sites already fail Google's mobile performance thresholds before any analytics tooling gets involved. Adding four third-party scripts to a site that is already struggling is like loading more weight onto a suspension that is already sagging.

The "so what" of all this is that you are not just collecting data. You are funding that data collection with seconds of your customer's life, and with lost revenue when they bounce before the page finishes loading.

The GDPR Problem Nobody Talks About

Here is where it gets legally untidy.

In January 2022, the Austrian Data Protection Authority ruled that standard use of Google Analytics on a European website violated GDPR rules on transfers of personal data outside the EU. The French CNIL followed a month later with formal notices against website operators doing the same thing. In July 2023, the Swedish data protection authority issued the first major fine, one million euro, specifically for Google Analytics use.

The Irish Data Protection Commission has not, at the time of writing, issued a headline ruling against Google Analytics specifically. But Irish businesses operate under exactly the same GDPR regime, and the DPC's published cookie guidance is unambiguous: consent for non-essential cookies must be a clear, affirmative act, freely given, specific, informed, and unambiguous. Implied consent does not count. Banners that set cookies before the customer clicks anything do not count. The DPC's own cookie sweep found that most Irish business sites were not compliant.

Then came the European Data Protection Board's Guidelines 2/2023, which extended consent requirements beyond cookies to tracking pixels, fingerprinting, unique identifiers, and any local processing that reads or writes on the user's device. In plain English: the "cookieless" workarounds that many analytics vendors pitched during 2022 do not, by default, exempt you from consent obligations. If you are still reading or writing identifiers on the user's browser, you still need consent.

So your choice with browser-based analytics runs roughly like this. Either you show a consent banner and only collect from the subset of customers who click accept, which by most estimates sits somewhere between 40 and 60 percent. Or you collect without consent and carry real legal exposure, particularly if a competitor, a regulator, or a disgruntled ex-employee decides to file a complaint.

This is a compliance bill dressed up as a marketing tool, and it is worth understanding before you attach more tags to the page.

[Illustration: a single central node collecting data while scattered outer nodes drift away and fade]
Server-side analytics collects from one controlled point rather than scattering scripts across every visitor's browser.

What Server-Side Analytics Measures (And What It Does Not)

Server-side analytics works differently. Instead of loading a script in the customer's browser that phones home to a third party, the collection happens on the server that is already serving your website. Every request your site handles generates a log entry. The analytics engine reads those entries and aggregates them into reports.
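
The log-based approach can be sketched in a few lines. This is a minimal illustration, not any particular product's engine: it assumes the common Nginx/Apache combined log format and simply counts successful GET page views and their referrers. A real engine would also filter bots, static assets, and internal traffic.

```python
import re
from collections import Counter

# Matches the common Nginx/Apache "combined" log format:
# IP, ident, user, [timestamp], "METHOD path protocol", status, bytes,
# "referrer", "user agent".
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def aggregate(lines):
    """Count page views per path and traffic per referrer."""
    pages, referrers = Counter(), Counter()
    for line in lines:
        m = LOG_RE.match(line)
        # Keep only successful GET requests; skip anything unparseable.
        if not m or m.group("method") != "GET" or m.group("status") != "200":
            continue
        pages[m.group("path")] += 1
        ref = m.group("referrer")
        if ref and ref != "-":  # "-" means no referrer was sent
            referrers[ref] += 1
    return pages, referrers
```

Nothing here runs in the visitor's browser: the raw material is traffic the server was handling anyway.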

Plausible Analytics, for example, builds a daily rotating identifier using a hash of a private daily salt, the website domain, the visitor's IP address, and user agent string. The salt is deleted every 24 hours. Raw IPs and user agents are never stored. There is no persistent identifier. No cross-site tracking. No cross-device stitching. Matomo operates a similar cookieless mode and can run in a full server-side configuration in which the customer's browser never knows the analytics system exists.
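
The rotating identifier Plausible describes can be sketched roughly like this. It is a simplified model of the published design, not their actual code: only the hash is ever stored, so the raw IP and user agent never touch the database, and once the salt rotates, yesterday's identifiers cannot be linked to today's.

```python
import hashlib
import secrets

# A fresh random salt, regenerated (and the old one deleted) every 24 hours.
daily_salt = secrets.token_bytes(32)

def visitor_id(domain: str, ip: str, user_agent: str) -> str:
    """Daily-rotating visitor identifier: hash(salt + domain + IP + UA).

    The same visitor on the same site hashes to the same id within one
    day, which is enough to count unique visitors. After the salt
    rotates, the same inputs produce an unrelated hash.
    """
    material = daily_salt + f"{domain}|{ip}|{user_agent}".encode()
    return hashlib.sha256(material).hexdigest()
```

The design choice worth noticing: uniqueness is only ever computed within a 24-hour window, which is exactly why there is no persistent profile to consent to.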

What you get from this approach:

  • Page views, unique visitors, sessions
  • Referrer data (which sites sent traffic)
  • Country-level geography
  • Device type and browser
  • Entry and exit pages
  • Conversion events (form submissions, completed orders) via server-side signals

Here is the reality check, because anything that sounds too clean probably is.

Server-side analytics will not give you 30-day retargeting audiences. It will not give you Meta lookalike audiences. It will not give you unified multi-touch attribution across Google Ads, Facebook, TikTok, and email in a single report. It will not identify an individual visitor across devices. If your marketing depends on running paid acquisition with sophisticated conversion optimisation, server-side analytics alone does not replace that stack. You will need to either run both (accepting the speed and consent cost), or migrate your tracking into a server-side Google Tag Manager setup, which is its own project with its own overhead.
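
To make the server-side GTM trade-off concrete, here is roughly what a server-side conversion signal looks like when sent to Google's documented Measurement Protocol endpoint (https://www.google-analytics.com/mp/collect). This sketch only builds the JSON payload; a real setup also needs a measurement ID and API secret in the query string, and the identifiers and values below are placeholders.

```python
import json

def build_purchase_event(client_id: str, order_id: str, value: float) -> str:
    """Build a GA4 Measurement Protocol purchase event as a JSON string.

    The server posts this instead of the browser running gtag.js, which
    is the core of the "server-side GTM" approach described above.
    """
    payload = {
        # A server-generated identifier, not something read from a cookie.
        "client_id": client_id,
        "events": [{
            "name": "purchase",
            "params": {
                "transaction_id": order_id,
                "value": value,
                "currency": "EUR",
            },
        }],
    }
    return json.dumps(payload)
```

Note what this does and does not buy you: the browser sheds the script weight, but you are still sending behavioural data to Google, so the consent and compliance questions remain.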

For most business websites, which just need to know how many people visited, where they came from, what they read, and whether they bought something, server-side analytics is not just sufficient. It is better. Every visitor counted, no consent banner required for analytics, no script tax on page load, and a much simpler compliance story.

How Web60 Handles This

Server-side privacy-first analytics ships with every Web60 site by default. You do not install a plugin. You do not configure a tracking ID. You do not paste a script tag. When you log in to the dashboard, analytics are already running. Traffic, referrers, top pages, country breakdown, all visible, without having touched a line of code.

The data stays on Irish infrastructure. No transfer of personal data to the United States. No third-party processors. No cookies set on the visitor's device for the purposes of analytics. The JavaScript shipped to your visitor's browser contains zero analytics code, because collection is happening server-side.

A note on legal caution, because this is where vendors get too enthusiastic. Running server-side analytics removes the cookie consent requirement for analytics specifically. It does not mean your privacy policy disappears. You should still describe server-side analytics in your privacy policy for full transparency, and your wider GDPR obligations remain. Work with a solicitor on your overall compliance posture. What Web60 removes is the analytics-shaped piece of that problem, which is usually the most fraught.

Now compare the alternative reality. Without server-side analytics, the default is that your customer waits while Google Tag Manager initialises, Meta Pixel fires, a consent banner scrambles into position, and somewhere in the mess your actual page tries to render. Your Core Web Vitals collapse. Somewhere between four and six out of ten customers bounce if the page is still loading after three seconds. Then the cookie banner kills roughly half of what you do manage to measure. Bad customer experience and bad data, in the same transaction.

All of this sits inside the all-inclusive €60/year Web60 platform, alongside hosting, SSL, backups, staging, and the rest. For a deeper view of the performance layer the analytics approach sits on top of, the complete WordPress performance guide for business owners covers Nginx, PHP-FPM, Redis, and the FastCGI caching stack in detail.

When Third-Party Analytics Is Still the Right Tool

A concession, because this rule does not hold everywhere.

If you are running a quarter of a million euro a year in paid acquisition across Google Ads, Meta, and TikTok, and you need rich conversion signals fed back to each platform so their algorithms can train, you cannot replace Google Analytics and Meta Pixel with server-side alone. You can migrate them into server-side GTM, which recovers some of the speed penalty, but you still need a consent management platform and you still have a real compliance story. That work genuinely suits enterprise marketing teams with a dedicated analyst or agency.

For everyone else, which covers most businesses, the server-side default is better. Small businesses are not running sophisticated remarketing at scale. They want to know whether people are visiting, where from, which pages are working, and whether anyone bought something. All of that is measurable without attaching four third-party scripts to every page.

I Made This Call Wrong Once

I recommended Google Tag Manager to a client about four years ago because it seemed like the tidy thing to do. They had two analytics tools, a pixel, and a chat widget, and I thought consolidating the tags under GTM would make the site feel lighter. It did not. The container grew as the marketing team attached more tags, and six months later we were back at the same speed problem, just with one script loader instead of five. GTM is a management tool, not a performance tool. I would not make that recommendation again without cutting something on the way in.

Conclusion

The practical upshot is this. For most business websites, the analytics data you actually use (visitor counts, referrer sources, top pages, basic conversions) does not require the weight of four third-party scripts and a consent platform. Cleaner alternatives exist. They are faster. They simplify the compliance work considerably, without pretending the compliance work disappears entirely.

What you do with that on Monday morning is not for a blog post to decide. But it is worth a cold look at what your analytics stack is actually returning, set against the speed, the consent banner, and the legal exposure it is costing you.

Frequently Asked Questions

Does server-side analytics mean I no longer need a privacy policy?

No. You still need a privacy policy that describes what data you process and how. Server-side analytics removes the cookie consent requirement for analytics specifically, but your overall GDPR obligations, including disclosure of processing activities, remain. Mention server-side analytics in your privacy policy for full transparency, and work with a solicitor on your wider compliance posture.

Will Google Ads still work without Google Analytics installed?

Yes. Google Ads runs independently of Google Analytics. Conversion tracking for Google Ads uses its own tag, and you can continue running campaigns without GA4 in place. Where it gets trickier is audience building and attribution across multiple touchpoints. If your marketing depends on feeding rich behavioural signals back to Google Ads or Meta, you may need a server-side GTM setup rather than removing all browser-based tracking.

How does Web60 track conversions without a browser pixel?

Web60's built-in analytics records page views, form submissions, and custom events triggered from the server. For a contact form submission or a WooCommerce checkout completion, the conversion is captured when your server processes the action, not when the browser loads a tracking script. The data lands in the Web60 dashboard without any cookie or script on the visitor's device.
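
In outline, server-side conversion capture looks something like this. The function and event-store names are hypothetical, purely to illustrate the shape (Web60's internals are not public): the key point is that the record is written when the server processes the action, with nothing running in the browser.

```python
from datetime import datetime, timezone

# Stand-in for the analytics event store; a real system would write to a
# database rather than an in-memory list.
events = []

def record_conversion(event_name: str, path: str, amount: float = 0.0):
    """Record a conversion at the moment the server handles the action.

    Called from the server-side handler (e.g. after a form submission is
    validated or an order's payment succeeds), so no pixel or cookie is
    needed on the visitor's device.
    """
    events.append({
        "event": event_name,
        "path": path,
        "amount": amount,
        "at": datetime.now(timezone.utc).isoformat(),
    })

# e.g., from an order-completion handler, after payment succeeds:
# record_conversion("order_completed", "/checkout", amount=49.00)
```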

Can I still use Google Search Console with server-side analytics?

Yes, and you should. Google Search Console is a separate service that does not require browser tracking. It reports on how your site appears in Google Search: impressions, clicks, ranking positions, and index coverage. Search Console works alongside server-side analytics with no conflict.

Does server-side analytics work with WooCommerce?

Yes. On a Web60 site, server-side analytics captures product views, add-to-cart events, and completed orders without a browser pixel. You get visitor counts, referrer data, and conversion volumes. What it does not replace is the multi-touch attribution you would get from Google Ads and Meta Pixel running together. For shops running paid acquisition at scale, a hybrid approach with server-side GTM may be required.

Sources

HTTP Archive, Web Almanac 2024, Third Parties chapter

HTTP Archive, Web Almanac 2024, Performance chapter

web.dev, Best practices for tags and tag managers

European Data Protection Board, Guidelines 2/2023 on the Technical Scope of Article 5(3) ePrivacy Directive

CNIL, Use of Google Analytics and data transfers to the United States: formal notice to website operators

Irish Data Protection Commission, Guidance on Cookies and Other Tracking Technologies

Graeme Conkie, Founder & Managing Director, Web60

Graeme Conkie founded SmartHost in 2020 and has spent years building hosting infrastructure for Irish businesses. He created Web60 after seeing the same problem repeatedly — Irish SMEs paying too much for hosting that underdelivers. He writes about WordPress infrastructure, server security, developer workflows, managed hosting strategy, and the real cost of hosting decisions for Irish business owners.
