Sourcegraph Analytics
Enterprise customers can use Sourcegraph Analytics to get a clear view of usage, engagement, performance, and impact.
Sourcegraph Cloud Analytics
This solution is available to:
- Sourcegraph Cloud customers
- Self-hosted customers running a supported version of Sourcegraph (5.9+) with usage telemetry fully enabled
Sourcegraph Cloud customers can use our managed cloud analytics service for Cody and Code Search usage data. Self-hosted customers can also use this service, but they must:
- Upgrade to a supported version of Sourcegraph (5.9+)
- Fully enable usage telemetry
Enablement instructions
For more details on setting up Sourcegraph Analytics, see our enablement instructions.
Air-gapped Analytics
Air-gapped customers will soon be able to use our self-hosted and locally deployed analytics service, built on Grafana, to see Sourcegraph usage data.
This product is currently in development. If you would like to learn more, please contact your Sourcegraph representative.
Metrics
We provide a set of metrics that can be used to monitor Sourcegraph usage, engagement, performance, and impact within your enterprise.
Learn more about how we think about the ROI of Sourcegraph in our blog.
Overview metrics
Metric | Description |
---|---|
Percent of code written by Cody | Percentage of code written by Cody out of all code written during the selected time. Learn more about this metric. |
Lines of code written by Cody | Total lines of code written by Cody during the selected time |
User metrics
Metric | Description |
---|---|
Total active users | Total number of unique users who have used any Sourcegraph product during the selected time |
Average daily users | The average number of unique users who use Sourcegraph per day during the selected time |
Average days of activity per user | The average number of days each user actively uses Sourcegraph during the selected time |
Daily active users | Number of unique users who used Sourcegraph by day |
Weekly active users | Number of unique users who used Sourcegraph by week |
Daily users by product | Count of daily users across different Sourcegraph products (Cody, Code Search, Code Insights, etc.) |
Product usage distribution (by percentage of users) | Percentage breakdown of users across different Sourcegraph product groupings during the selected time |
Detailed user activity (top 20 users) | List of the 20 most active users during the selected time and their usage patterns |
Frequency of usage | Count of users who used Sourcegraph n times over the past 30 days |
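For a rough sense of how the activity metrics in the table above are derived, the sketch below computes total active users, daily active users, and average days of activity per user from a toy set of per-user activity records. The record structure and field names are assumptions for illustration only; Sourcegraph computes these metrics internally from its telemetry.

```python
from collections import defaultdict
from datetime import date

# Hypothetical activity records: one entry per (user, day) with any Sourcegraph activity.
events = [
    {"user": "alice", "date": date(2024, 6, 3)},
    {"user": "alice", "date": date(2024, 6, 4)},
    {"user": "bob",   "date": date(2024, 6, 3)},
]

# Total active users: unique users with any activity in the selected time.
total_active_users = len({e["user"] for e in events})

# Daily active users: unique users per day.
users_by_day = defaultdict(set)
for e in events:
    users_by_day[e["date"]].add(e["user"])
daily_active_users = {d: len(users) for d, users in users_by_day.items()}

# Average days of activity per user: distinct active days divided by user count.
days_by_user = defaultdict(set)
for e in events:
    days_by_user[e["user"]].add(e["date"])
avg_days_per_user = sum(len(d) for d in days_by_user.values()) / total_active_users

print(total_active_users, daily_active_users, round(avg_days_per_user, 2))
```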
Cody-only user metrics
Many of the metrics above are also available for Cody only. However, some user definitions are slightly different:
User category | Description |
---|---|
Authenticated users | All users who signed in to Cody in an editor or interacted with Cody on the web during the selected period. Read more.
Active users | All users who sign in to Cody in an editor or interact with Cody on the web during the selected period. Read more.
Code Search & navigation metrics
Metric | Description |
---|---|
Total in-product searchers | Number of unique users who performed searches using the Sourcegraph search interface during the selected time |
Total result clicks | Count of times users clicked on search results to view files or other resources during the selected time |
Total file views | Number of times users viewed individual files through Sourcegraph during the selected time |
Hours saved | The number of hours saved by search and code navigation users, assuming 5 minutes saved per search and 30 seconds per code navigation action |
Daily in-product search activity | Count of search operations performed each day through the Sourcegraph interface |
Daily search users (in-product and API) | Number of unique users performing searches each day, including both UI and API usage |
All searches (in-product and API) by type | Breakdown of searches by category (e.g., literal, regex), including both UI and API usage |
Total code navigation actions | Count of all code navigation operations performed (e.g., go-to-definition, find references) during the selected time period |
Precise code navigation % | Percentage of code navigation actions that used precise intelligence rather than search-based results during the selected time |
Daily code navigation activity | Count of code navigation operations performed each day |
Daily code navigation users | Number of unique users utilizing code navigation features each day |
Precise vs. search-based code navigation actions by language | Comparison of precise vs. search-based code navigation actions, broken down by programming language
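The Hours saved metric in the table above applies fixed time-savings assumptions (5 minutes per search, 30 seconds per code navigation action). A minimal sketch of that arithmetic, with illustrative counts:

```python
# Hours saved from search and code navigation, using the stated assumptions:
# 5 minutes saved per search, 30 seconds saved per code navigation action.
MINUTES_PER_SEARCH = 5
SECONDS_PER_NAV_ACTION = 30

def hours_saved(searches: int, nav_actions: int) -> float:
    minutes = searches * MINUTES_PER_SEARCH + nav_actions * SECONDS_PER_NAV_ACTION / 60
    return minutes / 60

# Example: 1,200 searches and 3,000 code navigation actions in the selected period.
print(round(hours_saved(1_200, 3_000), 1))  # 125.0 hours
```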
Autocompletion metrics
Metric | Description |
---|---|
Total accepted completions | Count of completions accepted by users during the selected time |
Hours saved | The number of hours saved by Cody users, assuming 2 minutes saved per completion |
Completions by day | The number of completions suggested by day and by editor. |
Completion acceptance rate (CAR) | The percentage of completions presented to a user for at least 750ms that were accepted, broken down by editor, day, and month.
Weighted completion acceptance rate (wCAR) | Similar to CAR, but weighted by the number of characters presented in the completion, broken down by editor, day, and month. This assigns more "weight" to accepted completions that provide more code to the user.
Completion persistence rate | Percent of completions that are retained or mostly retained (67%+ of inserted text) after various time intervals. |
Average completion latency (ms) | The average milliseconds of latency before a user is presented with a completion suggestion by an editor. |
Acceptance rate by language | CAR and total completion suggestions broken down by language during the selected time
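The acceptance-rate and hours-saved metrics in the table above can be illustrated with a small sketch. The per-completion record fields below are assumptions for illustration; the 750ms display threshold and the 2-minutes-per-completion assumption come from the table.

```python
# Each record is a completion suggestion shown to a user for at least 750ms.
# "accepted" and "chars" (characters presented) are illustrative field names.
completions = [
    {"accepted": True,  "chars": 120},
    {"accepted": False, "chars": 40},
    {"accepted": True,  "chars": 300},
    {"accepted": False, "chars": 15},
]

# Completion acceptance rate (CAR): accepted completions / presented completions.
car = sum(c["accepted"] for c in completions) / len(completions)

# Weighted CAR (wCAR): weight each completion by the characters presented, so
# accepted completions that provide more code count for more.
wcar = (
    sum(c["chars"] for c in completions if c["accepted"])
    / sum(c["chars"] for c in completions)
)

# Hours saved, assuming 2 minutes saved per accepted completion.
hours_saved = sum(c["accepted"] for c in completions) * 2 / 60

print(f"CAR={car:.0%} wCAR={wcar:.0%} hours_saved={hours_saved:.2f}")
```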
Chat and prompt metrics
Metric | Description |
---|---|
Total chat events | Total number of chat interactions with Cody during the selected time |
Hours saved by chats | Total hours saved through Cody chat interactions during the selected time, assuming 5 minutes saved per chat
Cody chats by day | Daily count of chat interactions |
Cody chat users | Daily count of chat users |
Lines of code inserted | Lines of code generated by Cody in chat that get applied, inserted, or pasted into the editor. Only VS Code is included in this metric for now |
Insert rate | Percent of code generated by Cody in chat that gets applied, inserted, or pasted into the editor. Only VS Code is included in this metric for now |
Chat apply & insert persistence rate | Percent of code inserted by Apply and Insert actions that are retained or mostly retained (67%+ of inserted text) after various time intervals |
Prompts created, edited, and deleted by day | Daily count of prompt management activities, including creation, modification, and removal |
Users creating, editing, and deleting prompts by day | Number of unique users performing prompt management activities each day |
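The hours-saved and insert-rate figures in the table above follow the same simple arithmetic. Here is a minimal sketch with illustrative counts; the unit used for the insert rate in this example (lines of code) is an assumption.

```python
# Derived chat metrics from illustrative counts.
total_chats = 800
hours_saved_by_chats = total_chats * 5 / 60  # 5 minutes saved per chat ≈ 66.7 hours

# Insert rate: of the code Cody generated in chat, the share that was applied,
# inserted, or pasted into the editor (VS Code only for now).
lines_generated_in_chat = 5_000
lines_inserted_from_chat = 1_800
insert_rate = lines_inserted_from_chat / lines_generated_in_chat  # 36%

print(round(hours_saved_by_chats, 1), f"{insert_rate:.0%}")
```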
Command metrics (deprecated)
As of Sourcegraph version 5.10, commands are being deprecated in favor of our new feature, chat prompts. As a result, if your Sourcegraph instance is running version 5.10 or later, you may see decreased command usage. In the coming weeks, these command metrics will be deprecated and replaced with new prompt metrics on the Chats tab.
Metric | Description |
---|---|
Total command events | Total number of command executions during the selected time |
Hours saved by commands | Total hours saved through command executions during the selected time, assuming 5 minutes saved per command |
Cody commands by day | Daily count of command executions |
Cody command users | Daily count of command users |
Most used commands | Ranking of most frequently used Cody commands during the selected time |
CSV export
You can download underlying user activity data from Sourcegraph Analytics as a CSV export. To do this, click the Export user-level data button at the top right of any tab within the portal.
When exporting, you can group the data by:
- User
- User and month
- User and day
- User, day, client, and language
Each row in the CSV represents a user's activity for a specific combination of these groupings (e.g., a particular day, month, client, and/or language). The CSV includes metrics such as searches, code navigation actions, chat conversations, code completions, and more.
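For example, a short pandas script can reproduce simple roll-ups from the export. The file name below is a placeholder for your download, and the column names follow the reference table further down; exact headers may vary by export grouping.

```python
import pandas as pd

# Load a user-level export (file name is a placeholder for your download).
df = pd.read_csv("sourcegraph-user-activity.csv")

# Roll up a few activity columns per user across the exported period.
per_user = df.groupby("Username")[
    ["Searches", "Chat Events", "Combined Completion Acceptances"]
].sum()

print(per_user.sort_values("Chat Events", ascending=False).head(10))
```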
Important Notes
- Not all billable actions are included: Some Sourcegraph features, such as code monitors and batch changes, are not yet represented as columns in the CSV export. We plan to add these fields soon. If a user engages with these features, their activity may appear as a “blank” row in the export. If a user is listed as an “Active Sourcegraph User” (see column D), but the rest of the row is blank, they performed a billable action that isn’t currently tracked in the CSV
- A zero or blank value in a row can also mean that the user did not perform tracked actions for that specific date, client, or language
Column Name | Description |
---|---|
Instance user ID | Unique identifier assigned to each user in the system. |
User Email | Email address associated with the user's account. For self-hosted instances, the customer must sign the user metadata export addendum for this column to populate. |
Username | User's login identifier, usually matching the username from the auth provider (e.g., GitHub). For self-hosted instances, the customer must sign the user metadata export addendum for this column to populate. |
Active Sourcegraph User | Indicates if the user activity in this row is considered billable. Read more |
Client Name | Name of the client application or interface used. Common values: [IDE_name].cody (Cody extension interactions from IDEs) and server.web (interactions with the web interface, e.g., searches, code navigation, insights, and some chat events; variations like server.svelte-web may exist). |
Timestamp Date (or Month) | When the activity was recorded. |
Language | Programming language of the interaction. This is only recorded for a subset of events, mostly completion and code navigation, and will therefore be blank for many rows. |
Searches | Number of search queries performed by the user in the web UI. Searches via other methods (e.g., API) are not captured here. |
Code Navigation Events | Number of times the user navigated through code structures (e.g., "find references", "go to definition"). |
Code Insight Views | Number of times a code insight was viewed. |
Chat Events | Number of Cody chats executed. |
Command Events | Number of Cody commands executed. |
Combined Completion Suggestions | Number of code completion suggestions offered. |
Combined Completion Acceptances | Number of code completions accepted by the user. |
Total Accepted Char Count | Sum of characters from the user’s accepted code completions (includes both full and partial acceptances). |
Combined CAR | Completion acceptance rate (ratio of accepted to suggested completions), combined across editors. |
Weighted CAR | Similar to CAR, but weighted by the number of characters presented in the completion. Gives more weight to accepted completions with more code. |
Total Characters Written by Cody | Inserted code that Cody generates via chat, prompt responses, accepted autocompletions, or suggestions/fixes. Used as the numerator in the "Percentage of Code Written by Cody" ratio. |
Total Characters Written | Total new code inserted into the editor (includes both user-generated and Cody-generated characters). Used as the denominator in the "Percentage of Code Written by Cody" ratio. |
Percentage of Code Written by Cody | Measures Cody's impact: (Total Characters Written by Cody ÷ Total Characters Written) × 100. Learn more about this metric. |
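As a worked example of the ratio described in the last three rows (all values are illustrative):

```python
# Illustrative values for one user over the exported period.
total_chars_written_by_cody = 42_000   # Cody-generated characters inserted into the editor
total_chars_written = 120_000          # all new characters inserted (user + Cody)

# Percentage of Code Written by Cody = (Total Characters Written by Cody / Total Characters Written) * 100
pct_code_written_by_cody = total_chars_written_by_cody / total_chars_written * 100
print(f"{pct_code_written_by_cody:.1f}%")  # 35.0%
```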