Cursor usage metrics
See detailed Cursor usage patterns across teams and modes
Available at Insights / AI assistants / Cursor

If you're looking for a productivity boost from Cursor, it's not enough to enable the tool for your teams and then forget it. People need to pick up the new habit and invest the time to learn how to use it effectively. To support that, you need proper visibility into the current usage patterns.
With the Cursor usage breakdown, you can:
Understand where Cursor is gaining traction and where extra support may be needed.
Know if people are just trying it out or actively relying on it in their work.
Notice patterns in how suggestions are used across tab completions, agent, chat, and Cmd+K.
Read more in our full guide:
Measure the productivity impact of AI tools
To track Cursor adoption and licenses, see:
AI adoption metrics
Setup
Enabling Cursor metrics:
AI coding tool integrations
Definitions
Active users = Cursor doesn't publish an exact definition of an active user, but it presumably includes users who have opened the Cursor editor or interacted with the Cursor agent.
Engaged users = users who accepted a code suggestion or tab completion on a given day
Code suggestions = The number of Cursor code suggestions (excluding tab completions)
Code acceptances = The number of Cursor code suggestions (excluding tab completions) accepted by users
Acceptance rate (suggestions) = Code acceptances / Code suggestions
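The acceptance rate definition above can be sketched in a few lines. Note this is purely illustrative: the field names and daily counts below are hypothetical, not actual Cursor API data.

```python
# Illustrative only: hypothetical daily counts, not real Cursor API fields.
daily_metrics = [
    {"suggestions": 120, "acceptances": 42},
    {"suggestions": 95, "acceptances": 31},
    {"suggestions": 140, "acceptances": 56},
]

total_suggestions = sum(d["suggestions"] for d in daily_metrics)
total_acceptances = sum(d["acceptances"] for d in daily_metrics)

# Acceptance rate (suggestions) = Code acceptances / Code suggestions
acceptance_rate = total_acceptances / total_suggestions
print(f"Acceptance rate: {acceptance_rate:.1%}")
```

Aggregating counts before dividing (rather than averaging per-day rates) weights each suggestion equally, which matches how a ratio of two totals is normally reported.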
Not all accepted code ends up in the codebase. An engineer could accept 200 Cursor-generated lines over the course of creating a 10-line pull request.
You can't use these metrics to accurately measure the share of your code that's AI-generated.