People type “how to check website traffic” into Google for two completely different reasons. Either they want to know how much traffic their own site is getting, or they want to estimate a competitor’s traffic. The methods, the accuracy, and the tools are different for each. This guide covers both — six free methods that actually work — without pretending any single number is gospel.
I’ll be honest about what each method measures, where it lies to you, and which number to trust when two tools disagree (they always disagree). No GA4 worship, no fluff. Just the mechanism behind each approach so you can pick the right tool for your situation.
Your Own Site vs. Someone Else’s Site
Before you pick a method, get clear on what you’re trying to do:
- Your own site: You have full access. You can read server logs, install scripts, plug into Search Console, or pull stats from your CMS. The numbers are real, not estimates.
- Someone else’s site: You have zero access. Every “traffic check” tool is making an educated guess based on clickstream panels, search ranking models, and toolbar data. Numbers can be off by 50%+ on small sites.
I’ll cover six methods. Methods 1–4 and 6 are for your own site. Method 5 is the only honest way to estimate a competitor.
Method 1: Server Logs (Your Own Site, No Tools)
Every web server already records every single request. Apache, Nginx, LiteSpeed — they all write an access.log file by default. This is the most accurate, most private way to count traffic, and you don’t need to install anything.
A typical access log line looks like this:
203.0.113.42 - - [29/Mar/2026:09:50:01 +0000] "GET /pricing HTTP/2" 200 18432 "https://google.com/" "Mozilla/5.0..."
That’s an IP address, a timestamp, the URL, an HTTP status, the bytes sent, the referrer, and the user-agent. Multiply by every visit and you have a complete picture: pageviews, unique IPs, popular pages, top referrers, bot traffic, error rates.
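If you want to pull those fields out programmatically, a few lines of Python's standard `re` module cover the combined log format shown above. A minimal sketch (the regex assumes the standard combined format; custom Nginx/Apache log formats need adjusting):

```python
import re

# Combined Log Format: IP, identd, user, [timestamp], "request",
# status, bytes, "referrer", "user-agent"
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('203.0.113.42 - - [29/Mar/2026:09:50:01 +0000] "GET /pricing HTTP/2" '
        '200 18432 "https://google.com/" "Mozilla/5.0..."')

m = LOG_PATTERN.match(line)
if m:
    hit = m.groupdict()
    print(hit["ip"], hit["path"], hit["status"], hit["referrer"])
```

Loop that over the whole file and you have pageview counts, top pages, and top referrers in a dozen more lines.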
You won’t read raw logs by hand. Use a parser:
- GoAccess — terminal-based, generates an HTML dashboard. Run `goaccess access.log -o report.html` and open the file. Free, real-time, beautiful.
- AWStats — older but still works. Generates static HTML reports per domain. Most cPanel hosts have it pre-installed under “AWStats” in the control panel.
- Matomo Log Analytics — feeds your logs into Matomo so you get a dashboard without ever loading JavaScript on a page.
What you measure: every HTTP request, including bots, RSS readers, monitoring tools, image hotlinks. You’ll need to filter bots yourself (most parsers do this with a heuristic).
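That heuristic is usually just a substring check on the user-agent. A minimal sketch in Python (the token list is illustrative; real parsers ship far longer lists):

```python
# Crude bot filter: flag user-agents containing common crawler tokens.
# This token list is illustrative; production parsers use much longer ones.
BOT_TOKENS = ("bot", "crawler", "spider", "curl", "wget", "python-requests")

def is_probable_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

agents = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "curl/8.4.0",
]
human_hits = [ua for ua in agents if not is_probable_bot(ua)]
print(len(human_hits))  # 1 of the 3 sample agents survives the filter
```

Substring matching misses bots that spoof a browser user-agent, which is exactly why log-based counts skew high.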
What you miss: client-side events (button clicks, form interactions), session duration, scroll depth. Logs are great at “who fetched what, when” — they don’t see what happens in the browser after the page loads.
Privacy and consent: server logs are first-party operational data. Most jurisdictions consider them legitimate interest under GDPR. No banner needed if you’re only using them to count traffic and debug errors. (Don’t share raw logs publicly — they contain IPs.)
Method 2: Privacy-First Analytics Tool (Plausible / Fathom / Umami)
If you want a pretty dashboard without a cookie banner, install a privacy-first analytics tool. These are JavaScript trackers — same mechanism as GA4 — but they don’t use cookies, don’t fingerprint, and don’t ship data to ad networks. Setup takes about five minutes.
The main options:
- Plausible — €9/mo for up to 10k pageviews. EU-hosted. Lightweight script (<1KB). See my full Plausible review for the gotchas.
- Fathom — $15/mo, similar approach, Canadian/EU hosting.
- Umami — open source, self-host for free or $9/mo cloud. Good if you already have a small VPS.
- GoatCounter — free for personal sites and small projects, ~1KB script, no cookies. Our standalone GoatCounter analytics tool page covers the deployment story for low-traffic blogs.
You add one line of JavaScript to your site. The tool counts unique visitors using a daily-rotating hash of IP + user-agent (so the same person on day 1 and day 2 counts as two unique visitors — yes, slightly inflated, but no cookie). You get pageviews, top pages, top referrers, country, device, and not much else. That’s the point.
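The rotating-hash scheme is easy to sketch. This is a simplified illustration of the idea, not any vendor's exact implementation (Plausible, for instance, also mixes the site domain into the hash):

```python
import hashlib
import secrets
from datetime import date

# One random salt per day; yesterday's salt is discarded, so the same
# visitor produces an unlinkable ID each day. Simplified sketch only.
DAILY_SALTS = {}

def visitor_id(ip: str, user_agent: str, day: date) -> str:
    salt = DAILY_SALTS.setdefault(day, secrets.token_hex(16))
    raw = f"{salt}{ip}{user_agent}".encode()
    return hashlib.sha256(raw).hexdigest()

d1, d2 = date(2026, 3, 29), date(2026, 3, 30)
same_day = visitor_id("203.0.113.42", "Mozilla/5.0", d1) == visitor_id("203.0.113.42", "Mozilla/5.0", d1)
cross_day = visitor_id("203.0.113.42", "Mozilla/5.0", d1) == visitor_id("203.0.113.42", "Mozilla/5.0", d2)
print(same_day, cross_day)  # True False: stable within a day, rotates across days
```

Because the salt is thrown away, nobody can reverse a stored ID back to an IP, which is the whole privacy argument.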
What you measure: client-side pageviews and basic events. Real visitor counts, not requests, so bots are mostly filtered.
What you miss: user identity across sessions (by design), full e-commerce funnels, custom audience segments. If you need any of that, you’re back to a heavier tool.
Privacy and consent: the whole point. No cookies, no personal data stored, no banner required in EU/UK. See my privacy analytics guide for the legal reasoning. If your shortlist is a tiny static site versus a small SaaS, the GoatCounter vs Plausible comparison shows where each tool earns its keep.
Method 3: Search Console + Bing Webmaster (Search Traffic Only)
Google Search Console (GSC) and Bing Webmaster Tools are free, and they tell you exactly how much traffic you got from search engines — broken down by query, page, country, and device. Setup is a DNS or HTML file verification, then wait 24 hours for data.
What GSC shows you:
- Clicks — actual visits from Google search results to your site.
- Impressions — how often your URL appeared in search results.
- CTR and average position — for every query, every page.
- Top queries and pages — the keywords you rank for, the pages that earn clicks.
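The arithmetic connecting those metrics is simple: CTR is clicks divided by impressions, per query or per page. A quick sketch with made-up numbers:

```python
# GSC arithmetic: CTR = clicks / impressions.
# The sample rows are invented for illustration.
rows = [
    {"query": "check website traffic", "clicks": 120, "impressions": 4800},
    {"query": "goaccess tutorial",     "clicks": 45,  "impressions": 600},
]
for row in rows:
    ctr = row["clicks"] / row["impressions"]
    print(f'{row["query"]}: {ctr:.1%} CTR')
```

A page with high impressions and low CTR is the classic "ranks on page one, title needs work" signal.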
Bing Webmaster Tools gives you the same for Bing (and DuckDuckGo, since it pulls from Bing). Bing is 5–10% of US traffic for most sites — worth setting up if you’re already in GSC.
What you measure: search-engine-referred visits only. Not direct traffic, not social, not email, not other referrers.
What you miss: everything that isn’t search. If 60% of your traffic is from a Hacker News post or a newsletter, GSC will show you a fraction of reality.
Privacy and consent: Google and Bing are the data controllers. You’re using their reporting interface, not deploying their tracker. No banner needed.
Method 4: Cloudflare Analytics (If You’re Behind CF)
If your site sits behind Cloudflare (free or paid plan), you already have analytics turned on. Log into the Cloudflare dashboard, click your domain, click Analytics — done. No JavaScript, no setup, no banner.
Cloudflare counts every request that hits its edge network. Because traffic must pass through Cloudflare to reach your origin, the numbers are inherently accurate for what touches the edge. They show requests, unique visitors (by hashed fingerprint), bandwidth, top countries, top user agents, and a useful “Bots” filter that separates good bots, bad bots, and humans.
The free tier gives you 30 days of history. Workers and Pages plans get longer retention plus deeper request analytics.
What you measure: requests at the network edge, including bots and automated traffic. The “human visitors” count is a heuristic but reasonable.
What you miss: on-page events (clicks, form submits, scroll depth). Cloudflare can do client-side analytics too (Web Analytics, also free) — that’s a separate JS snippet if you want it. Same as Plausible/Fathom in spirit.
Privacy and consent: server-side request logs at the CDN level — same legal status as your own server logs. The optional client-side Web Analytics is also cookieless and considered consent-free in most EU interpretations.
Method 5: Estimate ANY Site (Competitor Research)
Now the harder one. You want to know how much traffic someone else’s site gets. There is no honest way to know exactly. Every tool that claims to is estimating from one of three sources:
- Clickstream panels — anonymized browsing data from millions of users (sold by browser-extension and ISP partners). This is what Similarweb does.
- Search ranking models — they crawl Google, see which keywords your competitor ranks for, multiply by search volume and CTR curves. This is what Ahrefs and Semrush do.
- Toolbar / DNS resolvers — sample data from network operators or browser extensions.
Each method has known biases. Estimates can be 50%+ off for small sites (under 10k/mo visits) because the panel sample size is too small to extrapolate. For larger sites (100k+/mo) the estimates get tighter — usually within 30% of the truth.
Free tiers worth knowing:
- Similarweb — free overview at similarweb.com/website/example.com. Shows monthly visits, traffic sources, top countries, top referrers. Best for sites with 50k+/mo.
- Semrush Free — 10 free searches per day. Shows organic traffic estimate, top organic keywords, paid keywords.
- Ahrefs Free Site Explorer — limited free tier on ahrefs.com. Shows organic traffic estimate (search-based) and backlinks.
- Serpstat — has a free trial; estimates organic traffic from keyword rankings. Strong on Eastern European and emerging markets.
- Cloudflare Radar — free, aggregate-only traffic trends. Good for spotting whether a site went up or down, less precise on absolute numbers.
For a deeper take on doing this without becoming a creep, read how to track website traffic without creeping on your users.
What you measure: a model’s best guess at total traffic and traffic sources. Useful for ranking competitors against each other (relative numbers are tighter than absolute).
What you miss: ground truth. Always treat these numbers as ±30% on big sites and ±50–100% on small ones.
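In practice that means converting point estimates into a range. A sketch of the triangulation (the error factor is an assumption you pick per the bands above):

```python
def traffic_range(estimates, error=0.5):
    """Turn tool estimates into an honest range, assuming each tool
    may be off by +/- error (0.5 = +/-50%, appropriate for small sites)."""
    low = min(e * (1 - error) for e in estimates)
    high = max(e * (1 + error) for e in estimates)
    return low, high

# Hypothetical: Similarweb says 8,000/mo, Semrush's organic model says 3,500/mo.
low, high = traffic_range([8000, 3500], error=0.5)
print(f"{low:,.0f} to {high:,.0f} visits/mo")
```

A wide range like that is the honest answer for a small site; if the range is too wide to act on, the data simply isn't there.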
Privacy and consent: you’re querying a third-party tool, not installing anything. No consent issues for the user — you’re the user.
Method 6: Built-In CMS Analytics
If you run WordPress, Ghost, Substack, Shopify, Squarespace, or Webflow, your CMS already has a stats panel. You don’t need to install anything. The numbers are first-party (the CMS counts pageviews itself) and there’s usually no banner required.
- WordPress — Jetpack Stats (free tier, 30 days history) or Independent Analytics plugin. Site Kit is also free and pulls GSC + GA into the dashboard.
- Ghost — built-in Members + native analytics on Pro plans. Visits, signups, top posts.
- Substack — native dashboard. Open rate, signups, top posts. No external tracker needed.
- Shopify — Analytics tab in the admin. Sessions, conversion rate, top products, attribution.
- Squarespace / Wix / Webflow — built-in analytics on every plan. Limited but enough for most owners.
What you measure: pageviews and basic events the CMS chose to surface. Often just enough.
What you miss: custom events, full attribution, raw data export (most CMS analytics are sealed dashboards).
Privacy and consent: usually first-party, cookieless, no banner. Check your specific CMS — Jetpack Stats sets a few cookies that may need disclosure in EU.
Methods Comparison Table
| Method | Setup Time | Free? | Accuracy (Own Site) | Best For |
|---|---|---|---|---|
| Server logs (GoAccess/AWStats) | 15 min | Yes | Highest (raw data) | Technical owners |
| Plausible / Fathom / Umami | 5 min | From €9/mo | High (cookieless visitor count) | Privacy-conscious owners |
| Search Console + Bing Webmaster | 10 min | Yes | Exact (search clicks only) | SEO measurement |
| Cloudflare Analytics | Already on | Yes | High (edge requests) | Anyone behind CF |
| Competitor estimators (Similarweb etc.) | 0 min | Free tiers | ±30–50% | Competitor research |
| CMS built-in (Jetpack, Ghost, etc.) | 0 min | Yes | High | Non-technical owners |
Why Each Method Disagrees (and Which Number to Trust)
Run all six on the same site and you’ll get six different numbers. This is normal. They’re measuring different things.
| Method | Counts | Typical Bias |
|---|---|---|
| Server logs | Every HTTP request | Inflated by bots, prefetch, hotlinks |
| Cloudflare edge | Every edge request | Similar to logs but bot-filtered better |
| Plausible / Fathom | Pageviews from real browsers | Misses ad-blocked users (~5–25%) |
| GA4 | Sessions/users (sampled) | Misses ad-blocked + consent-rejected (~10–40%) |
| Search Console | Clicks from Google search | Search-only; rounded; missing <10-impression queries |
| Similarweb | Modeled visits | ±30% on big sites, ±50–100% on small |
Which to trust? If you need one number, take Cloudflare or server logs (raw data) and subtract bot traffic. That’s the closest thing to ground truth. Plausible/Fathom will be lower because of ad blockers. GA4 will be lower still because of consent rejections. None of them is “wrong” — they’re answering different questions.
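You can sanity-check two dashboards against each other using the loss bands from the table. A back-of-envelope sketch (the percentages are the table's rough ranges, not precise constants):

```python
def expected_band(ground_truth, loss_low, loss_high):
    """Range a tool should report, given the fraction of visitors it misses."""
    return ground_truth * (1 - loss_high), ground_truth * (1 - loss_low)

logs_human_visits = 10_000  # bot-filtered server-log / edge count

plausible_band = expected_band(logs_human_visits, 0.05, 0.25)  # ad blockers
ga4_band = expected_band(logs_human_visits, 0.10, 0.40)        # blockers + consent

print(plausible_band)
print(ga4_band)
```

If your Plausible number falls inside its band relative to your log count, the tools agree; if it falls far outside, something (usually bot filtering or a misfiring script) is broken.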
Privacy and Compliance Considerations
Three quick rules:
- First-party server logs (your own logs, Cloudflare logs, CMS-internal counters) — generally legitimate interest under GDPR. No banner needed.
- Cookieless client-side trackers (Plausible, Fathom, Umami, Cloudflare Web Analytics) — generally consent-free in EU/UK. Disclose in privacy policy, no banner.
- Cookie-based or fingerprinting trackers (GA4, Hotjar, FB Pixel) — banner required in EU/UK, ideally with proper consent management. If your banner is killing conversions, see how to fix that.
Bing Webmaster Tools and Google Search Console don’t deploy any tracker — they read data from the search engine side. No consent issues.
If you’re moving away from cookie-based tracking entirely, read first-party tracking explained without cookies.
Free vs Paid: When to Upgrade
For most sites under 100k/mo visits, the free methods above (server logs + Search Console + Cloudflare + CMS native) are enough. You don’t need a paid analytics tool.
Upgrade when:
- You’re running paid ads and need attribution beyond UTM parameters.
- You have a real funnel (multi-step signup, e-commerce checkout) where you need to see drop-off rates.
- You need to share dashboards with non-technical stakeholders who won’t read GoAccess.
- You’re handling 1M+/mo pageviews and need server-side aggregation.
For small business owners specifically, I wrote a focused guide: web analytics for small business — a no-nonsense guide.
If you’re shopping for a paid tool, my 15 best Google Analytics alternatives roundup covers the landscape.
Common Mistakes I See
- Trusting one number as truth. If GA4 says 8k and Plausible says 11k, neither is wrong. Pick one for trend-tracking and stop comparing them month over month.
- Counting bot traffic as real visitors. A new site can show 5k/mo “visits” of which 4k are bots. Filter bot user-agents in your log parser before celebrating.
- Using Similarweb numbers on a small competitor. Under 10k/mo the panel sample is too thin. The number you see is essentially a guess. Use it for direction, not magnitude.
- Forgetting Search Console doesn’t show non-search traffic. GSC clicks ≠ total traffic. If your GSC shows 2k/mo and you’re disappointed, check your other channels first.
- Skipping the privacy check. Installing GA4 in the EU without a consent banner is the most common compliance failure I see. Even free tools can land you in trouble if you skip the checkbox.
Frequently Asked Questions
Can I check competitor traffic accurately?
No tool gives you exact numbers. Similarweb, Semrush, and Ahrefs estimate from clickstream panels and ranking models, with ±30% accuracy on large sites and ±50–100% on small ones. Use them for relative comparison (Site A vs Site B), not absolute truth.
What are GA4’s free tier limits?
GA4 is free for up to 10 million events per property per month, with 14 months of standard reporting retention. The hidden cost is in compliance (consent banners), data sampling on big reports, and the steep learning curve. For traffic counting alone, GA4 is overkill.
Why does Search Console show different numbers than my analytics?
GSC counts only clicks from Google search results. Your analytics counts every visitor from every source. They will never match. GSC clicks should be a subset of your total — usually 30–70% depending on how SEO-driven your traffic is.
How accurate is Similarweb?
For sites doing 500k+/mo visits, Similarweb is usually within 20–30% of the truth. For sites under 50k/mo, accuracy degrades fast — sometimes off by 2x in either direction. Treat the source/country breakdown as more reliable than the absolute traffic number.
What’s the best free traffic estimator?
For competitor research, Similarweb’s free overview is the easiest. For ranking-based estimates, Ahrefs free Site Explorer or Semrush free searches give a different angle (organic-only). Cloudflare Radar is free for trend data. Use two of them and triangulate.
Do I need a cookie consent banner to check my own traffic?
Not for server logs, Cloudflare server-side analytics, Search Console, or cookieless tools like Plausible/Fathom/Umami. You do need one for GA4, Hotjar, Facebook Pixel, and any tool that sets persistent identifiers in the EU/UK.
What’s the difference between real-time and daily traffic data?
Real-time shows visits in the last 5–30 minutes — useful for confirming a launch worked or a campaign is firing. Daily/weekly aggregates are for trend analysis. Don’t make decisions on real-time data; the sample is too small. Server logs, Plausible, GA4, and Cloudflare all offer both views.
Bottom Line
For your own site, you have free, accurate options: server logs (GoAccess), a privacy-first script (Plausible/Fathom/Umami), Search Console for search-only data, and Cloudflare Analytics if you’re already behind it. Pick one for daily use, layer in GSC for SEO insight, and you’re done.
For competitors, accept that every number is an estimate. Similarweb is best for big sites, Ahrefs/Semrush better for SEO-driven sites, and all of them lie about small sites. Triangulate two tools and treat the answer as a range, not a number.
Stop comparing GA4 to Plausible to Cloudflare and asking which one is “right.” They’re all right — they just measure different things. Pick one, watch the trend, and use the others for context.