Profiling data and source code live in different tools. You find a hot function in a flame graph, note the file and line number, switch to your editor, navigate there, and try to reason about the code with the numbers you're holding in your head. If you need more context, a different profile type, or a different time range, you have to switch back to the dashboard and start over.
Today we're releasing the VS Code Polar Signals Profiler extension, which brings profiling data directly into the editor as inline annotations: CPU time, memory allocations, and more, all visible right next to the code they correspond to.
It works with both Polar Signals Cloud and self-hosted Parca instances, and is available for VS Code, Cursor, Windsurf, Antigravity, and any other editor that supports the VS Code extension API via the Open VSX Registry.
What It Does
The extension queries your profiling data in our backend and annotates your source files with the corresponding data. When you open a file, you see how much CPU time each function consumed, how much memory it allocated, or how long it spent blocked, depending on the profile type you selected.
The annotations persist as you navigate between files during a session and appear inline, in context, where you're already reading the code.
Connecting to Your Profiling Data
The extension supports two modes:
Polar Signals Cloud
Authenticates via OAuth. When you first launch the extension, it walks you through a setup wizard. Select "Cloud," sign in with your Polar Signals account, choose a project, and you're connected.
If you've used our MCP server with Claude or Cursor, the authentication model is the same: we use OAuth across all our AI and developer tool integrations, so there's no need to create or manage long-lived tokens.
Self-hosted Parca
Connects directly to your Parca instance without authentication. Point the extension at your Parca URL (defaults to http://localhost:7070) and it queries the API directly.
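As a rough sketch, the self-hosted setup could live in your workspace settings alongside the presets shown below. The key names here are illustrative assumptions, not the extension's documented settings; check the extension's configuration reference for the exact identifiers:

```jsonc
{
  // Hypothetical setting names for illustration only --
  // consult the extension's settings page for the real keys.
  "polarSignals.mode": "parca",
  "polarSignals.parcaUrl": "http://localhost:7070"
}
```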
Querying Profiles
There are two ways to get profiling data into your editor:
Presets
The extension ships with built-in presets for common profiling scenarios such as:
- On-CPU - where your application spends CPU time (with 15m, 1h, 24h time ranges)
- Off-CPU - where it's waiting or blocked
- Memory allocations - what's allocating, and how much
- Memory in-use - what's holding memory at a point in time
Open the Command Palette, run Polar Signals: Select Preset, pick a scenario, and the annotations appear.
You can also define custom presets in your workspace settings for queries your team runs frequently:
```json
{
  "polarSignals.presets": [
    {
      "id": "api-cpu-1h",
      "name": "API Server CPU (1h)",
      "profileType": "parca_agent:samples:count:cpu:nanoseconds:delta",
      "timeRange": "1h",
      "labelMatchers": ["namespace=\"api\""]
    }
  ]
}
```
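The `profileType` string follows the colon-separated convention Parca inherits from pprof: profile name, sample type, sample unit, period type, period unit, with an optional trailing `delta` for profiles collected as deltas over time. A minimal sketch of how such an identifier breaks down (the field names in this helper are our own labels, not an official API):

```python
def parse_profile_type(profile_type: str) -> dict:
    """Split a Parca-style profile-type identifier into its fields.

    Field names are illustrative labels for the conventional
    pprof/Parca layout: name:sample-type:sample-unit:period-type:period-unit[:delta]
    """
    parts = profile_type.split(":")
    name, sample_type, sample_unit, period_type, period_unit = parts[:5]
    return {
        "name": name,                # which profiler produced the data
        "sample_type": sample_type,  # what each sample counts
        "sample_unit": sample_unit,  # unit of the sample value
        "period_type": period_type,  # what the sampling period measures
        "period_unit": period_unit,  # unit of the period
        "delta": len(parts) > 5 and parts[5] == "delta",
    }

parsed = parse_profile_type("parca_agent:samples:count:cpu:nanoseconds:delta")
```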
Custom Queries
Polar Signals: Fetch Profile for Current File lets you configure the profile type, time range, and label filters on the fly. This is useful when you're investigating something specific and don't want to create a preset for it.
AI Integration
Two commands are designed for workflows with AI assistants:
- Copy Line for AI Analysis copies the current line with its profiling annotation, formatted for pasting into an AI conversation
- Copy File Summary for AI copies a summary of all annotations in the current file, for a top-to-bottom performance analysis
If you're already using our MCP server with Claude Code or Cursor, this gives you a complementary workflow: the MCP server lets AI agents query profiling data programmatically, while the VS Code extension lets you see the same data visually and share specific findings with an AI assistant.
Getting Started
| Editor | Install from |
|---|---|
| VS Code | VS Code Marketplace |
| Cursor, Windsurf, Antigravity, and others | Open VSX Registry |
Or search for "Polar Signals Profiler" in the Extensions panel of any compatible editor.
On first launch, the setup wizard will walk you through connecting to Polar Signals Cloud or your self-hosted Parca instance. From there, select a preset or run a custom query, and the annotations will appear in your source files.
Conclusion
The extension is open source. Issues, feature requests, and contributions are welcome on GitHub.
We're excited to see what becomes possible when profiling data lives right next to the source!
Let us know if you try the extension and give us your feedback on Discord!