What gets tested, what doesn’t, and why.
Every analytics tool in this directory is reviewed against the vendor’s published documentation, sub-processor list, DPA, and pricing page before earning a spot — and re-checked when those change. This page explains the process so you can decide whether to trust the verdicts.
1. How tools are selected for inclusion
The directory currently lists 21 web-analytics tools and 3 mobile-app analytics tools. Inclusion is editorial, not algorithmic. A tool gets considered when it meets all four of these criteria:
- Privacy-first by default. Either cookieless out of the box, or has a documented cookieless mode that doesn't degrade core functionality. Tools that require cookies for basic pageview tracking don't qualify.
- Real product, not lookalike. The vendor has a working SaaS or self-host distribution, public pricing, and a contactable support channel. Marketing landing pages without a working product don't qualify.
- Active maintenance. A commit, release, or vendor blog post in the last 90 days; a minimal automated check is sketched at the end of this section. Abandoned projects don't qualify, even if they were once popular.
- Honest pricing model. Per-pageview or per-event pricing displayed publicly, OR a flat-rate plan documented on the pricing page. Tools that hide pricing behind "contact sales" without any transparent tier don't qualify unless they're enterprise-only by design (e.g., Piwik PRO).
Tools that fail one of these criteria are listed in the tools index notes as "watching" but don't get a full review until they earn it.
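For the maintenance criterion, the commit-recency half can be spot-checked mechanically. A minimal TypeScript sketch, assuming Node 18+ for built-in fetch; the helper name and the example repo are illustrative, and releases and blog posts are still checked by hand:

```typescript
// Flag a repo as stale if its most recent commit is older than 90 days.
// Uses the public GitHub REST API (unauthenticated, rate-limited).
async function lastCommitAgeDays(owner: string, repo: string): Promise<number> {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/commits?per_page=1`,
    { headers: { Accept: "application/vnd.github+json" } },
  );
  if (!res.ok) throw new Error(`GitHub API ${res.status} for ${owner}/${repo}`);
  // The commits endpoint returns newest-first; take the first entry.
  const [latest] = (await res.json()) as { commit: { committer: { date: string } } }[];
  const committed = new Date(latest.commit.committer.date).getTime();
  return (Date.now() - committed) / (1000 * 60 * 60 * 24);
}

const age = await lastCommitAgeDays("umami-software", "umami"); // example repo
console.log(age <= 90 ? "active" : "stale: goes to the watching list");
```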
2. What gets verified, per tool
Every tool review is checked against the same 38-axis matrix. The categories that get the most weight in editorial verdicts:
- Cookieless verification. I deploy the tracker on a clean test site and inspect the request payload + Set-Cookie headers in DevTools; a scripted version of this check is sketched after this list. A tool that self-attests as cookieless but drops a tracking cookie in actual use fails this check.
- Sub-processor disclosure. I read the vendor's DPA and sub-processor list (or note when one isn't published). A tool that ships data to AWS US without disclosing it doesn't pass the GDPR audit.
- Data residency. I check where the data actually lives, not just where the marketing page says it does. Tools that claim "EU-hosted" but route through a US CDN with caching get called out; a quick header-based spot-check is sketched after this list.
- License clarity. Source-available isn't always open-source. AGPL with a brand-restriction clause isn't the same as plain AGPL. The license badge on the directory reflects the actual license string, verified against the GitHub repo or vendor terms.
- Real-world setup. Compatibility with Next.js, Astro, and static-HTML sites is documented from the vendor's integration guides; SPA tracking, server-side proxying, and cookieless heuristics are reported against vendor claims and known integration patterns.
- Pricing math. I run the published pricing model against three reference profiles (10k pv/mo, 100k pv/mo, 1M pv/mo) and report the all-in number; the calculation is sketched after this list. When a vendor changes pricing, the pricing block gets a date-stamp.
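The cookieless check amounts to watching for Set-Cookie headers and context cookies while the tracker fires. A minimal Playwright sketch of the scripted version, with hypothetical hostnames standing in for the real test site and tracker:

```typescript
import { chromium } from "playwright";

const browser = await chromium.launch();
const page = await browser.newPage();

// Log any Set-Cookie header coming back from the tracker's requests.
page.on("response", (res) => {
  if (!res.url().includes("tracker.example")) return; // hypothetical tracker host
  const setCookie = res.headers()["set-cookie"];
  if (setCookie) console.log(`Set-Cookie from ${res.url()}: ${setCookie}`);
});

await page.goto("https://test-site.example/"); // clean page with the snippet installed
await page.waitForTimeout(3000);               // give the tracker time to fire

// Any cookie in the context at this point means the "cookieless" claim failed.
console.log(await page.context().cookies());

await browser.close();
```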
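The residency spot-check is similar in spirit: response headers on the ingest endpoint often betray which CDN edge is serving it. A rough sketch with a hypothetical vendor endpoint; header names vary by provider, so this is a first pass, not proof:

```typescript
// HEAD the ingest endpoint and print headers that commonly reveal the CDN
// or edge location (Cloudflare, Fastly, CloudFront). Endpoint is hypothetical.
const res = await fetch("https://ingest.example-vendor.com/api/event", {
  method: "HEAD",
});
for (const h of ["server", "cf-ray", "x-served-by", "x-amz-cf-pop", "via"]) {
  const v = res.headers.get(h);
  if (v) console.log(`${h}: ${v}`);
}
```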
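And the pricing math is deliberately boring: look up each reference profile in the published tiers and report the all-in monthly number. A sketch with made-up tiers, just to show the mechanics:

```typescript
// Hypothetical tiered pricing table (USD/month); real vendors' tiers differ.
const tiers: { upTo: number; monthly: number }[] = [
  { upTo: 10_000, monthly: 9 },
  { upTo: 100_000, monthly: 19 },
  { upTo: 1_000_000, monthly: 69 },
];

function allInMonthly(pageviews: number): number {
  const tier = tiers.find((t) => pageviews <= t.upTo);
  if (!tier) throw new Error("above top published tier: 'contact sales' territory");
  return tier.monthly;
}

// The three reference profiles reported on every review.
for (const pv of [10_000, 100_000, 1_000_000]) {
  console.log(`${pv.toLocaleString()} pv/mo -> $${allInMonthly(pv)}/mo`);
}
```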
3. What does NOT get verified
To keep the workload honest, I explicitly do not test:
- Vendor-claimed conversion rates or revenue lifts. "Customers see 18% more revenue with our tool" is a marketing claim, not a measurable fact. I report what the tool does, not what the vendor claims it produces.
- Support response time at scale. I open one support ticket per vendor and note the response time, but a single ticket isn't a representative sample. Treat support ratings as anecdotal, not statistical.
- Long-tail compliance frameworks. I verify GDPR, CCPA, PECR, ISO 27001, SOC 2 Type II, and HIPAA when the vendor claims them. I do not audit COPPA, CPRA, LGPD, PIPL, etc. — your legal team should verify those for your specific use case.
- Edge integrations. Whether tool X integrates cleanly with framework Y's third-party plugin Z is outside the scope. I cover popular integrations (WordPress, Shopify, Webflow, Vercel, Cloudflare Workers); deeper specifics are vendor docs territory.
4. How comparisons are written
Head-to-head pages (e.g., Matomo vs Plausible) follow a six-section editorial structure:
- Decision box — "Pick A if / Pick B if / Pick neither if". No winner, just fit.
- Real all-in cost on 100k pageviews/month — itemized invoice, not list price.
- Workflow walkthrough — same fictional B2B SaaS, same Monday-morning report.
- Attribution reality (2026) — iOS 17, Safari ITP, Chrome cookie deprecation per tool.
- GA4 migration — what carries, what doesn't, honest gotcha.
- Three things each tool quietly can't do — disqualifiers named explicitly.
Any pair page that hasn't yet been rewritten in this format uses the older "where they differ" template. The site is gradually migrating all pair pages to the new structure, prioritizing the top-5 web tools (Plausible, Matomo, Fathom, Umami, Simple Analytics).
5. How this site stays independent
No vendor pays for placement. There are zero affiliate links anywhere on this site — including in articles, comparisons, tool reviews, and FAQs. If you click a vendor link, the URL is the vendor's plain domain. You can verify this in your browser address bar.
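If you'd rather check programmatically than hover every link, a one-off DevTools console snippet can list outbound links and flag common affiliate-style query parameters. TypeScript-flavored sketch (drop the generic type annotation to paste it as plain JavaScript); the parameter list is illustrative, not exhaustive:

```typescript
// List every outbound link on the current page and flag query parameters
// that commonly indicate affiliate tracking.
const affiliateParams = ["ref", "aff", "affiliate", "partner", "via"];
for (const a of document.querySelectorAll<HTMLAnchorElement>('a[href^="http"]')) {
  const url = new URL(a.href);
  const hits = affiliateParams.filter((p) => url.searchParams.has(p));
  console.log(url.href, hits.length ? `flagged params: ${hits.join(", ")}` : "clean");
}
```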
I have declined affiliate offers from at least four vendors listed in this directory. If a vendor has been "featured" or "recommended" in a way that benefits them commercially, I will say so explicitly on that tool's review.
Outside of writing, I am available for analytics implementation advice, GA4-to-privacy migrations, and second-opinion consulting for SaaS teams. None of that work comes from vendors of any tool reviewed here, and those conversations don't influence how the directory is written.
6. How to flag an error
Spotted a factual error, an outdated price, or a vendor claim that no longer holds? Email mark@analytics-alternatives.com with the URL and the correction. Every report is read; verified corrections are reflected on the affected page with a visible "updated" timestamp. Vendor PR pitches go to the same inbox and get a polite "I'll consider it for the next quarterly review" reply unless the change is factual.
7. Refresh cadence
Each tool review is refreshed on a 90-day cycle: pricing checked, vendor docs re-read, sub-processor list re-pulled. Hub pages (cookieless, GDPR, open-source, self-hosted) re-rank the tool list quarterly. The "Last updated" date on each page reflects the most recent verified-fact pass, not just a CSS tweak.