By Ry Walker · 2 min read

Top Performer Analysis: The Real Opportunity in AI Tool Telemetry


For most of engineering's history, "why is this person so productive?" has been an unanswerable question. You'd look at a senior engineer who shipped twice as much as everyone else and shrug. Maybe they're better at decomposition. Maybe they have deeper domain knowledge. Maybe they just type faster. Maybe all of it. We never had the data to know.

AI coding tools change that, at least for the part of the work that happens with the assistant. Top performers leave a trail now. The shape of their prompts. How they iterate when a suggestion misses. How aggressively they accept versus modify. Where they reach for the agent and where they don't. None of this is the whole picture, but it's a much richer signal than anything we had before.
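To make that trail concrete, here's a minimal sketch of what a single interaction event might look like, assuming your tooling can export something in this shape. The field names are illustrative assumptions, not any vendor's actual schema.

```python
from dataclasses import dataclass

@dataclass
class AssistantEvent:
    """One engineer<->assistant interaction. Hypothetical fields, not a real vendor schema."""
    engineer_id: str
    task_id: str
    prompt_chars: int   # rough proxy for how much context the prompt carried
    outcome: str        # "accepted" | "modified" | "rejected"
    task_phase: float   # 0.0 = start of the task, 1.0 = end
```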

The opportunity here isn't the dashboard. The opportunity is the study. Pick the five engineers everyone agrees are the best. Look at what they do differently. Maybe they write longer, more contextual prompts. Maybe they reject more aggressively early in a task and accept more later. Maybe they batch tool calls in a way that compounds. Whatever the patterns are, they're now visible — and they're teachable.
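Here's a sketch of what that study could look like in code, under the same assumptions as the event shape above. The cohort list, field names, and phase threshold are all placeholders you'd choose yourself:

```python
from statistics import median

# Each row is one assistant interaction, mirroring the hypothetical
# event shape sketched earlier. Field names are placeholders.
events = [
    {"engineer": "sarah", "prompt_chars": 840, "outcome": "accepted", "task_phase": 0.7},
    {"engineer": "dev2",  "prompt_chars": 120, "outcome": "rejected", "task_phase": 0.2},
    # ... exported from your telemetry store
]

TOP_PERFORMERS = {"sarah", "alex"}  # the five engineers everyone agrees on

def describe(rows):
    """Descriptive stats for one cohort -- no targets, no ranking."""
    accepted = [r for r in rows if r["outcome"] == "accepted"]
    early = [r for r in rows if r["task_phase"] < 0.3]  # arbitrary "early in task" cutoff
    early_rejects = [r for r in early if r["outcome"] == "rejected"]
    return {
        "median_prompt_chars": median(r["prompt_chars"] for r in rows),
        "accept_rate": len(accepted) / len(rows),
        "early_reject_rate": len(early_rejects) / max(len(early), 1),
    }

top = describe([r for r in events if r["engineer"] in TOP_PERFORMERS])
rest = describe([r for r in events if r["engineer"] not in TOP_PERFORMERS])
print("top performers:", top)
print("everyone else: ", rest)
```

The point is that the output is two cohort summaries to compare side by side, not a score per engineer.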

This is why I keep saying the frame matters. The same data can power a leaderboard or a learning loop. A leaderboard says "you're below average, fix it." A learning loop says "here's what Sarah does when she's stuck, try it next sprint." One creates anxiety. The other creates compounding skill.

I've argued elsewhere that AI tool telemetry is mostly noise when treated as a ranking signal, and that measurement gaming gets worse the closer you tie the metric to comp. Top performer analysis sidesteps both because it's descriptive, not normative. You're not setting a target. You're capturing a practice and offering it.

If I were running engineering today, this is the only AI-tool-data project I'd actually fund. Forget the adoption dashboard. Forget the prompts-per-week chart. Hire one researcher, give them access to the telemetry, and have them produce a quarterly "how our best engineers work with agents" report. That single artifact will improve the team more than any metric ever could.

Key takeaways

  • You can finally see how top performers actually use the tools.
  • Excellence is studyable now — prompts, iterations, acceptance patterns.
  • The frame is learning, not surveillance.

FAQ

What does top performer analysis mean here?

Studying how your best engineers prompt, iterate, and accept AI suggestions — then sharing those patterns with the team as learnable practices, not enforced rules.

Isn't this just surveillance?

Surveillance is about catching people. Top performer analysis is about learning from them. The distinction is whether the data flows toward development or punishment.