I'd been shipping this endpoint for 6 months. It passed code review. It passed load tests. Here's what it was actually doing.
The output
I added 2 lines to a realistic Express + PostgreSQL app and ran it for a few minutes under normal load. This is what printed to the console:
[QUERY] SELECT * FROM quotes WHERE author_id = $1 → 47 executions in 1.8s
↳ n-plus-one: quotes.service.ts:34 fired 47 times in one request.
Fix: batch with WHERE author_id = ANY($1) or use a dataloader.
traceId: 3a9f2b → correlated with GET /quotes/by-author (durationMs: 2140)
[QUERY] SELECT * FROM authors → 1 execution in 3ms
↳ select-star: Returning all 23 columns. Hot path only uses: id, name, bio.
Consider: SELECT id, name, bio FROM authors
[MEMORY] Heap used: 187MB → 203MB over 60s
↳ heap-growth: Consistent upward trend with no GC plateau detected.
Check for accumulating closures or unbounded caches.
[EVENT LOOP] lag: 48ms | pollDelay: 61ms
↳ Elevated above baseline (< 10ms expected). Check for synchronous work in request handlers.
Let me explain what each one means.
What it found
1. The N+1 query
GET /quotes/by-author fetches the list of authors, then fetches each author's quotes with a separate query.
With 47 authors in the response, that's 47 individual SELECT * FROM quotes WHERE author_id = $1 calls, one per author. The endpoint took 2.1 seconds. Fixed with a single batched query, it dropped to 80ms.
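The batched fix can be sketched like this. The in-memory db stub, the column names, and quotesForAuthors are hypothetical stand-ins for the real pg client and schema; the point is the single ANY($1) query replacing one query per author:

```typescript
// Minimal in-memory stand-in for the pg client, just to make the sketch runnable.
type Quote = { id: number; author_id: number; text: string };

const table: Quote[] = [
  { id: 1, author_id: 10, text: 'a' },
  { id: 2, author_id: 11, text: 'b' },
  { id: 3, author_id: 10, text: 'c' },
];

const db = {
  // Emulates: SELECT ... FROM quotes WHERE author_id = ANY($1)
  async query(_sql: string, params: [number[]]) {
    const [ids] = params;
    return { rows: table.filter((q) => ids.includes(q.author_id)) };
  },
};

// Instead of N single-row queries (one per author), issue one batched query.
async function quotesForAuthors(authorIds: number[]): Promise<Quote[]> {
  const { rows } = await db.query(
    'SELECT id, author_id, text FROM quotes WHERE author_id = ANY($1)',
    [authorIds],
  );
  return rows;
}
```

With the real pg driver, ANY($1) accepts a JavaScript array as the single parameter, so the loop collapses into one round trip.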
No linter catches this. No static analysis catches this. A slow endpoint in an APM dashboard shows you the symptom, not the cause. You have to see the query pattern at runtime.
2. SELECT * on a hot path
The authors query returns 23 columns. The endpoint only renders id, name, and bio. Every request is pulling avatar_url, created_at, updated_at, bio_short, social_links (JSON), and 17 other fields, all for nothing.
This one is easy to miss because the query is "fast" in isolation. At scale it matters.
3. Heap growth
Steady heap increase with no GC plateau usually means something is accumulating β an unbounded cache, event listeners not being cleaned up, or a closure keeping references alive. This one pointed to a module-level Map that was never evicted.
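A minimal sketch of the fix, assuming the culprit was a plain module-level Map used as a cache. BoundedCache is a hypothetical name; real code might reach for an LRU library instead, but the idea is just to cap entries and evict the oldest:

```typescript
// A Map with a size cap and simple least-recently-used eviction,
// replacing an unbounded module-level Map.
class BoundedCache<K, V> {
  private map = new Map<K, V>();
  constructor(private maxEntries: number) {}

  get(key: K): V | undefined {
    const value = this.map.get(key);
    if (value !== undefined) {
      // Re-insert to mark as most recently used.
      this.map.delete(key);
      this.map.set(key, value);
    }
    return value;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxEntries) {
      // Map preserves insertion order, so the first key is the oldest.
      const oldest = this.map.keys().next().value as K;
      this.map.delete(oldest);
    }
  }
}
```

With a hard cap, the heap plateaus instead of climbing for the life of the process.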
4. Event loop lag
48ms of event loop lag means the main thread is doing synchronous work somewhere. In this case it was a synchronous JSON parse of a large payload inside a request handler. Moved to a worker thread, lag dropped to under 5ms.
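The worker-thread move can be sketched with Node's built-in worker_threads module. parseInWorker is a hypothetical helper, not part of argus-apm; the worker body is plain JS run in eval mode:

```typescript
import { Worker } from 'node:worker_threads';

// Parse a large JSON payload off the main thread so the event loop
// stays responsive. The worker receives the raw string via workerData
// and posts back the parsed object.
function parseInWorker(payload: string): Promise<unknown> {
  const workerSource = `
    const { parentPort, workerData } = require('node:worker_threads');
    parentPort.postMessage(JSON.parse(workerData));
  `;
  return new Promise((resolve, reject) => {
    const worker = new Worker(workerSource, { eval: true, workerData: payload });
    worker.once('message', resolve);
    worker.once('error', reject);
  });
}

// A request handler awaits this instead of calling JSON.parse inline.
parseInWorker('{"quotes": [1, 2, 3]}').then((doc) => console.log(doc));
```

For hot paths you would keep a pool of long-lived workers rather than spawning one per request, but the shape of the fix is the same.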
The 2-line setup
npm install argus-apm
import { ArgusAgent } from 'argus-apm';
await ArgusAgent.createProfile({ environment: 'dev', appType: ['web', 'db'] }).start();
Drop this at the top of your entry file and run your app under normal load for a few minutes. Findings print to the console automatically in dev mode.
No account. No cloud. No config file. No side effects in production: console output is enabled only in the dev profile, and sampling stays at 100% regardless of environment.
How it works (the interesting part)
The agent hooks into node:diagnostics_channel, Node's official observability primitive, stable since Node 18. No monkey-patching, no prototype modification.
For N+1 detection: it tracks query fingerprints (parameterized, without values) per HTTP transaction. If the same fingerprint fires more than N times in a single request window, it flags it and reports the call site using the current stack trace.
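Stripped to its essentials, the per-request counting looks something like this. RequestWindow and the threshold constant are illustrative names, not the library's actual API:

```typescript
// Flag a fingerprint the moment it crosses the threshold within one
// HTTP transaction. One window is created per request and discarded after.
const N_PLUS_ONE_THRESHOLD = 10;

class RequestWindow {
  private counts = new Map<string, number>();

  record(fingerprint: string): string | null {
    const next = (this.counts.get(fingerprint) ?? 0) + 1;
    this.counts.set(fingerprint, next);
    // Report exactly once, when the count first hits the threshold.
    return next === N_PLUS_ONE_THRESHOLD
      ? `n-plus-one: ${fingerprint} fired ${next}+ times in one request`
      : null;
  }
}
```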
For privacy, SQL values are destroyed at the AST layer: the query is parsed into an AST, values are stripped, and only the structural fingerprint is stored. This happens before any metric is recorded. Not redacted from logs; shredded before they exist.
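As a crude approximation of what fingerprinting produces (assumption: the real agent does this on a parsed AST, not with regexes, which is far more robust), value stripping looks like:

```typescript
// Replace literal values with placeholders so only the query shape survives.
// $1-style bind parameters are left alone; they carry no data.
function fingerprint(sql: string): string {
  return sql
    .replace(/'(?:[^']|'')*'/g, '?')        // string literals -> ?
    .replace(/(?<![$\w])\d+(\.\d+)?/g, '?') // numeric literals -> ?
    .replace(/\s+/g, ' ')
    .trim();
}

// fingerprint("SELECT * FROM quotes WHERE author_id = 42 AND name = 'Ada'")
//   -> "SELECT * FROM quotes WHERE author_id = ? AND name = ?"
```

Two queries that differ only in their values collapse to the same fingerprint, which is exactly what the N+1 counter needs.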
Supported drivers: pg, mysql2, mongodb, redis, ioredis, prisma, typeorm, sequelize, mongoose, sqlite3, better-sqlite3, mariadb, cassandra-driver, tedious, oracledb.
Try it on your app
npm install argus-apm
Point it at a real project β not a hello world. Run it for 5 minutes under actual load. See what it finds.
→ Drop your result in the GitHub Discussion: what was caught, what driver, what framework. Even "found nothing" is useful data.
→ GitHub repo