Logwick Is Live: See the Traffic Your Analytics Misses
By Stanislav Chirk · 2 min read
Most analytics dashboards are blind to a big chunk of reality.
Google Analytics, Plausible, and similar tools rely on JavaScript. But many automated clients never execute JS: AI agents, training scrapers, archive bots, uptime checks, and privacy-focused fetchers. If you only look at pageviews, you're guessing at what actually hits your server.
Logwick is now live — an open-source tool that analyzes raw HTTP logs and shows who is actually making requests: real users, known crawlers, AI agents, and suspicious automated traffic.
Why it matters
- AI agents don’t run JavaScript. They won’t show up in client-side analytics.
- Raw server logs are the source of truth. Every request is there.
- Classification beats pageviews. You need to know what fraction of traffic is human vs automated (and which families of bots are active).
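To make the classification idea concrete, here is a minimal sketch of bucketing requests by their User-Agent string. The patterns below are illustrative assumptions, not Logwick's actual rules: real classifiers also check IP ranges, reverse DNS, and behavior.

```python
import re

# Illustrative User-Agent patterns; a real classifier uses more signals.
AI_AGENTS = re.compile(r"GPTBot|ClaudeBot|PerplexityBot", re.I)
KNOWN_CRAWLERS = re.compile(r"Googlebot|bingbot|archive\.org_bot", re.I)
GENERIC_AUTOMATION = re.compile(r"bot|crawler|spider|curl|python-requests", re.I)

def classify(user_agent: str) -> str:
    """Bucket one request by its User-Agent string, most specific first."""
    if AI_AGENTS.search(user_agent):
        return "ai_agent"
    if KNOWN_CRAWLERS.search(user_agent):
        return "known_crawler"
    if GENERIC_AUTOMATION.search(user_agent):
        return "suspected_bot"
    return "likely_human"

print(classify("Mozilla/5.0 (compatible; GPTBot/1.0)"))  # ai_agent
print(classify("Mozilla/5.0 (Macintosh; Intel Mac OS X)"))  # likely_human
```

Note that User-Agent strings can be spoofed, which is why a "suspicious automated traffic" bucket based on behavior matters alongside string matching.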
What Logwick gives you
- Traffic classification: humans vs bots vs AI agents (including GPTBot, Claude, Perplexity, and more)
- Session rollups: group requests into sessions to understand behavior, not just hits
- Local-first by design: a dashboard on 127.0.0.1 and a SQLite database on disk
- No tracking snippet, no SaaS: your data stays in your infrastructure
- License: AGPL-3.0 (commercial licensing available for closed-source/internal use)
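Session rollups are easy to picture with a small sketch. This is an assumed approach (grouping by client with an inactivity timeout), not necessarily how Logwick does it; the 30-minute gap is a conventional default, not a documented setting.

```python
from collections import defaultdict

SESSION_GAP = 30 * 60  # assumed 30-minute inactivity timeout, in seconds

def sessionize(requests):
    """Group (client_id, unix_timestamp) pairs into sessions.

    A new session starts when the gap since the client's previous
    request exceeds SESSION_GAP. Returns {client_id: [hits_per_session]}.
    """
    last_seen = {}
    sessions = defaultdict(list)
    for client, ts in sorted(requests, key=lambda r: r[1]):
        if client not in last_seen or ts - last_seen[client] > SESSION_GAP:
            sessions[client].append(0)  # open a new session
        sessions[client][-1] += 1
        last_seen[client] = ts
    return dict(sessions)

hits = [("1.2.3.4", 0), ("1.2.3.4", 60), ("1.2.3.4", 4000)]
print(sessionize(hits))  # {'1.2.3.4': [2, 1]}
```

Rollups like this turn a flat request log into behavior: two requests a minute apart are one visit, while a request an hour later is a return.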
Get it
The repository, docs, and quick start are on GitHub:
Logwick — See every visitor (including AI agents)
If you want to understand how much AI and automated traffic is hitting your site, setup takes about 15 minutes, using the JSONL logs you already have from your CDN or origin server.
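For readers unfamiliar with JSONL: it is simply one JSON object per line, so it is trivial to parse. The field names below are a hypothetical example of what a CDN might emit; your provider's schema will differ.

```python
import json

# Hypothetical JSONL log lines; real CDN field names vary by provider.
raw = """\
{"ts": "2024-05-01T12:00:00Z", "path": "/", "status": 200, "ua": "GPTBot/1.0"}
{"ts": "2024-05-01T12:00:05Z", "path": "/about", "status": 404, "ua": "Mozilla/5.0"}
"""

# One json.loads() per non-empty line is all JSONL parsing requires.
records = [json.loads(line) for line in raw.splitlines() if line.strip()]

by_status = {}
for r in records:
    by_status[r["status"]] = by_status.get(r["status"], 0) + 1

print(len(records), by_status)  # 2 {200: 1, 404: 1}
```

Because every request your server handled is in these logs, JS or no JS, this is the raw material for the classification and session views above.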