I was tired of cookie banners. I was tired of 40KB tracking scripts dragging down my Lighthouse score. I wanted to answer one question: “How many unique visitors read my blog today?”
The Requirements
- Zero Client-Side JS: No tracking scripts. Use server-side logging or a 1x1 pixel.
- Privacy First: No IP storage. Hash IPs with a daily rotating salt.
- Hyper-Fast: Write operations should take < 5ms.
The Stack
- Language: Go (Golang) for its raw HTTP performance.
- Database: Redis for counting, periodically flushed to Postgres for long-term storage.
- Visualization: A simple React dashboard using Recharts.
Step 1: The Collector
The core is a tiny Go HTTP server. It accepts a request to /hit.
func handleHit(w http.ResponseWriter, r *http.Request) {
	// 1. Get the client IP. Behind a reverse proxy, X-Forwarded-For can be a
	// comma-separated chain ("client, proxy1, proxy2"); the first entry is
	// the client. Fall back to RemoteAddr when the header is absent.
	ip := r.Header.Get("X-Forwarded-For")
	if ip == "" {
		ip = r.RemoteAddr
	} else if i := strings.IndexByte(ip, ','); i >= 0 {
		ip = strings.TrimSpace(ip[:i])
	}
	// The User-Agent feeds the bot filter described below.
	ua := r.UserAgent()
	// 2. Anonymize: hash the IP with a salt that rotates daily.
	visitorID := hash(ip + dailySalt)
	// 3. Increment today's HyperLogLog in Redis.
	rdb.PFAdd(r.Context(), "visitors:daily:"+today, visitorID)
	// 4. Return a transparent 1x1 pixel.
	w.Header().Set("Content-Type", "image/gif")
	w.Write(transparentGif)
}
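The `hash` helper is elided above. Here is a minimal sketch; the post never names the algorithm, so SHA-256 is my assumption — only the `hash(ip + dailySalt)` call shape comes from the snippet.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
)

// hash returns a short hex digest of the salted input. Because dailySalt
// rotates at midnight, the same visitor maps to a different ID tomorrow,
// so IDs can't be correlated across days. (SHA-256 is an assumption.)
func hash(s string) string {
	sum := sha256.Sum256([]byte(s))
	return hex.EncodeToString(sum[:8]) // 8 bytes is plenty as HLL input
}
```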
Why HyperLogLog?
Redis HyperLogLog is magic. It estimates the cardinality (unique count) of a set with a standard error of about 0.81% (better than 99% accuracy) using at most 12KB of memory, regardless of whether you have 100 visitors or 100 million. For a simple view counter, it’s perfect.
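Reading the estimate back is a single command. A sketch, assuming the go-redis v9 client behind the `rdb` used in the collector (the `printStats` name and the weekly key are mine):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/redis/go-redis/v9"
)

// printStats reads today's estimated unique-visitor count.
func printStats(ctx context.Context, rdb *redis.Client, today string) {
	// PFCount returns the estimated number of distinct IDs PFAdd'ed.
	count, err := rdb.PFCount(ctx, "visitors:daily:"+today).Result()
	if err != nil {
		log.Println("redis:", err)
		return
	}
	fmt.Printf("unique visitors today: ~%d\n", count)

	// Sketches also merge losslessly, so weekly or monthly uniques
	// never require keeping the raw visitor IDs around.
	rdb.PFMerge(ctx, "visitors:weekly", "visitors:daily:"+today)
}
```

PFMerge is what makes the daily-key scheme cheap: a rollup is a union of 12KB sketches, not a re-scan of traffic.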
Step 2: Handling “Referrers”
I wanted to know where traffic came from. I extracted the Referer header. To avoid storing URLs (which can contain PII), I only store the domain.
domain := extractDomain(r.Referer())
rdb.ZIncrBy(ctx, "referrers:daily:"+today, 1, domain)
This uses a Redis Sorted Set to keep a leaderboard of top traffic sources.
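`extractDomain` isn't shown either; a minimal sketch using `net/url` (bucketing empty or unparseable referrers as "direct" is my assumption):

```go
package main

import "net/url"

// extractDomain reduces a full Referer URL to its host, so paths and
// query strings (which may contain PII) never touch Redis. Empty or
// unparseable referrers are bucketed as "direct".
func extractDomain(ref string) string {
	u, err := url.Parse(ref)
	if err != nil || u.Hostname() == "" {
		return "direct"
	}
	return u.Hostname()
}
```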
Step 3: The Front-End
I exposed a simple JSON endpoint protected by Basic Auth. The React frontend fetches this JSON and renders a line chart.
The Problems I Faced
Bots. So many bots. My initial data showed 50x more traffic than expected, so I added a middleware that checks the User-Agent against a blocklist of known bot signatures (Googlebot, AhrefsBot, Bingbot, and friends).
The Cost
This entire system runs on a $5/month DigitalOcean droplet, which also hosts my personal site. Redis takes about 50MB of RAM. It’s incredibly efficient.
Conclusion
Building your own tools gives you a deeper appreciation for the trade-offs companies like Google make. Is my tool as powerful as GA4? No. Does it tell me exactly what I want to know without spying on my users? Absolutely.