Intercom
A customer service software company that used Claude Code to improve engineering throughput. Relevant here for measuring AI adoption, productivity, and workflow instrumentation.
Key Highlights
- Intercom reportedly doubled merged pull request throughput per R&D head within nine months using Claude Code workflows.
- Its AI adoption approach emphasized instrumentation, logging skill invocations to Honeycomb and session data to S3.
- The company built targeted automations like PR description enforcement and an autonomous flaky-specs fixer.
- Intercom was also noted as training open-source models in-house, citing cost, speed, and effectiveness advantages over APIs.
Overview
Intercom is a customer service software company that has become a useful reference point for AI Product Managers because of how explicitly it has operationalized AI inside engineering workflows. In newsletter coverage, Intercom stands out less for a consumer-facing AI feature and more for its disciplined internal adoption playbook: setting a clear throughput goal, building workflow-specific Claude Code automations, and instrumenting usage so leadership could measure real impact.

For AI PMs, Intercom matters as an example of moving beyond anecdotal “AI is helping” narratives into measurable productivity systems. Its reported 2× increase in merged pull request throughput per R&D head within nine months illustrates how success can come from pairing model capability with workflow design, telemetry, and organizational goals. Intercom is also notable for experimenting with in-house training of open-source models, suggesting a pragmatic stance on model economics, speed, and control.
Key Developments
- 2026-03-27: Intercom was highlighted alongside Pinterest, Airbnb, Notion, and Cursor as a company training open-source models in-house, with the claim that these models were cheaper, faster, and more effective than API-based alternatives.
- 2026-04-21: Intercom reported a 2× increase in merged pull request throughput within nine months after going all-in on Claude Code. The company built specific workflows such as PR description quality enforcement and an autonomous flaky-specs fixer, while logging skill invocations to Honeycomb and session data to S3. The effort was tied to a top-down target set by CTO Darra, with details discussed by Brian Scanlan.
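The source does not show how Intercom’s PR description enforcement works internally. As a hedged illustration only, a minimal quality gate of this kind might validate a PR body against a template before merge; the length threshold, section headings, and function name below are all assumptions, not Intercom’s actual implementation:

```python
# Hypothetical sketch of a PR-description quality gate of the kind
# Intercom reportedly built; thresholds and headings are assumptions.

MIN_BODY_LENGTH = 80
REQUIRED_SECTIONS = ("## What", "## Why")  # assumed PR template headings


def pr_description_issues(body: str) -> list[str]:
    """Return a list of quality problems found in a PR description."""
    issues = []
    if len(body.strip()) < MIN_BODY_LENGTH:
        issues.append(f"description shorter than {MIN_BODY_LENGTH} chars")
    for section in REQUIRED_SECTIONS:
        if section not in body:
            issues.append(f"missing section: {section}")
    return issues


if __name__ == "__main__":
    body = "## What\nFix flaky spec retries.\n## Why\nCI stability."
    print(pr_description_issues(body))
```

A check like this could run in CI (or via a Claude Code hook) and block merge until the issues list is empty.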
Relevance to AI PMs
- Measure workflow outcomes, not just model usage. Intercom’s example shows the value of tying AI adoption to a concrete KPI like merged PR throughput per head, rather than vanity metrics such as prompt count or seat adoption.
- Instrument every meaningful AI action. Logging skill invocations to Honeycomb and session data to S3 is a practical model for building observability into AI systems from day one, making it easier to evaluate reliability, ROI, and bottlenecks.
- Design narrow, high-leverage automations. Intercom’s reported wins came from targeted workflows like PR description enforcement and flaky-test fixing, which is a strong reminder that domain-specific AI agents often outperform generic copilots in business impact.
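The instrumentation pattern described above — a structured event per skill invocation sent to Honeycomb, with session data archived to S3 — can be sketched as follows. The event fields, dataset name, bucket, and key scheme are assumptions for illustration, and the actual Honeycomb send and S3 upload are deliberately left out:

```python
import json
import time
import uuid

HONEYCOMB_DATASET = "claude-code-skills"  # assumed dataset name
SESSION_BUCKET = "intercom-ai-sessions"   # assumed S3 bucket name


def skill_invocation_event(skill: str, session_id: str,
                           duration_ms: float, ok: bool) -> dict:
    """Build one structured, per-invocation event (Honeycomb-style)."""
    return {
        "dataset": HONEYCOMB_DATASET,
        "timestamp": time.time(),
        "skill": skill,
        "session_id": session_id,
        "duration_ms": duration_ms,
        "success": ok,
    }


def session_archive_key(session_id: str, day: str) -> str:
    """Deterministic S3 object key for a session transcript, partitioned by day."""
    return f"sessions/dt={day}/{session_id}.json"


if __name__ == "__main__":
    sid = str(uuid.uuid4())
    event = skill_invocation_event("fix-flaky-spec", sid, 1234.5, ok=True)
    # In production the event would go to Honeycomb's events API and the
    # full session JSON to S3 (e.g. via boto3); here we only serialize it.
    print(json.dumps(event, indent=2))
    print(session_archive_key(sid, "2026-04-21"))
```

Day-partitioned keys like `sessions/dt=.../...` make the archive straightforward to query later from a warehouse such as Snowflake.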
Related
- Claude Code: Central to Intercom’s engineering productivity workflows and the main AI tool cited in its throughput gains.
- Brian Scanlan: Shared details of Intercom’s Claude Code adoption and engineering workflow instrumentation.
- Darra: Intercom CTO who reportedly set the 2× throughput goal that framed the initiative.
- Honeycomb: Used to log and analyze every skill invocation, highlighting the importance of observability.
- S3: Used to store session-level data for analysis and workflow instrumentation.
- Snowflake: Data warehouse used to analyze archived Claude Code session data as part of operational measurement.
- GitHub CLI: Used to merge fixes via hooks in automated Claude Code workflows, adjacent to PR-based AI tooling patterns.
- Pinterest, Airbnb, Notion, Cursor: Peer companies mentioned alongside Intercom in the context of training open-source models in-house.
Newsletter Mentions (2)
- How I AI Podcast, “#5 ▶️ How Intercom 2X'd engineering velocity with Claude Code | Brian Scanlan”: “Intercom achieved a 2× increase in merged pull request throughput within nine months by building and instrumenting Claude Code workflows—such as enforcing PR description quality and an autonomous flaky-specs fixer—and logging every skill invocation to Honeycomb and session data to S3.” Within nine months of going all-in on Claude Code, Intercom’s engineering team doubled merged PRs per R&D head after CTO Darra set a 2× throughput goal.
- 𝕏 post, #7: “clem 🤗 highlights that after Pinterest, Airbnb, Notion, and cursor_ai, Intercom is training open-source models in-house—finding them cheaper, faster, and more effective than APIs.”
Related
- Claude Code: Anthropic’s coding agent, used at Intercom to instrument engineering workflows and automate fixes. Relevant for AI PMs evaluating coding agents, telemetry, and productivity gains.
- Notion: A workspace and knowledge management platform mentioned as part of Nebula’s integrations. It appears in the AI workflow automation context.
- Airbnb: A travel and lodging platform increasingly associated with AI-driven experiences and services. The newsletter mentions it in the context of a new hire from Meta.
- GitHub CLI: GitHub’s command-line interface, used here to merge fixes via hooks in an automated Claude Code workflow. Relevant to PMs designing developer automation and toolchain integrations.
- Snowflake: A data warehouse used to analyze archived Claude Code session data. Relevant for AI PMs building analytics on agent usage and productivity.