Overview of "Why LinkedIn is turning PMs into AI-powered 'full stack builders' | Tomer Cohen (LinkedIn CPO)"
This episode (hosted by Lenny Rachitsky) interviews Tomer Cohen, LinkedIn’s Chief Product Officer, about LinkedIn’s Full Stack Builder (FSB) program: a company-wide rethinking of how products are built by tightly combining human builders with AI agents and re-imagining roles, org structure, tooling and culture. Cohen explains the why, the technical and organizational work required, early results from pilots, and practical guidance for other leaders looking to adopt similar models.
Key takeaways
- The skills needed for most jobs are changing fast — LinkedIn data shows ~70% of the skills required for today’s roles will change by 2030.
- Full Stack Builder (FSB) is a model to empower individuals (across functions) to take ideas from insight to launch, supported by AI agents, smaller pods, and new tooling.
- Success needs three investments: platform (re-architecture), tools/agents (customized AI), and culture (training, incentives, change management).
- Off-the-shelf AI rarely works without heavy customization and curated “gold” training examples.
- Early pilots show time savings and better-quality experiments; top performers adopt the fastest and benefit most.
- Organizational rollout requires deliberate change management (performance evals, visible wins, training), not just handing out tools.
What is the Full Stack Builder model?
- Goal: enable a “builder” (PM, designer, engineer, BD, researcher, etc.) to own an idea’s entire lifecycle — research, spec, design, code, launch, iterate.
- Emphasis on the human traits AI shouldn't replace: vision, empathy, communication, creativity, and — most importantly — judgment.
- Operational model: smaller cross-functional pods (mission-focused, re-assembled each quarter), people who can flex across domains, and AI agents that handle many tactical tasks.
- New career paths: LinkedIn has created a “Full Stack Builder” title and is replacing its APM program with an Associate Product Builder (APB) program to train juniors in coding, design and PM for LinkedIn’s environment.
Platform, Tools & Agents — how LinkedIn built this
Three pillars:
- Platform (foundation)
- Re-architect core systems so AI can reason over them (composable UI components, server-side APIs).
- Heavy internal engineering to integrate with third-party tools — “never works out of the box” with legacy stacks.
- Tools & Agents (product-level AI)
- Specialized agents (single job-to-be-done) rather than one monolithic agent, with an orchestrator to connect them later (see the sketch after this list).
- Examples:
- Trust agent: surfaced privacy/harm vectors for specs (built by Head of Trust).
- Growth agent: evaluates growth ideas using LinkedIn’s unique loops/funnels.
- Research agent: trained on member personas + historical research/support data to critique specs.
- Analyst agent: natural-language interface for querying LinkedIn’s graph.
- Coding/maintenance/QA agents: handle builds and routine maintenance; per Cohen, the maintenance agent resolves close to 50% of failed builds.
- Tool mix: Copilot Enterprise, ChatGPT Enterprise, and various design/coding vendors, all with extensive internal customization and orchestrator logic.
- Culture
- Training programs (APB), pods for early adopters, incentives, public wins and changes in evaluation criteria.
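The single-purpose-agent-plus-orchestrator pattern above can be illustrated with a minimal sketch. The agent names and the routing below are hypothetical; the episode does not describe LinkedIn's actual orchestrator implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Agent:
    """One specialized agent = one job-to-be-done."""
    name: str
    description: str
    run: Callable[[str], str]  # takes a spec or question, returns a critique or answer

def trust_review(spec: str) -> str:
    # Placeholder: a real agent would call an LLM primed with curated trust/privacy examples.
    return "Trust review: flag privacy and harm vectors in this spec."

def research_critique(spec: str) -> str:
    # Placeholder: a real agent would reason over member personas and past research.
    return "Research critique: how member personas would likely react to this spec."

AGENTS: Dict[str, Agent] = {
    "trust": Agent("trust", "surface privacy/harm vectors in a spec", trust_review),
    "research": Agent("research", "critique a spec against member personas", research_critique),
}

def orchestrate(task: str, spec: str) -> str:
    """Route each task to the single agent that owns it, rather than to one monolithic agent."""
    agent = AGENTS.get(task)
    if agent is None:
        raise ValueError(f"no agent registered for task '{task}'")
    return agent.run(spec)

if __name__ == "__main__":
    print(orchestrate("trust", "Draft spec: changes to open-to-work visibility"))
```

Keeping each agent narrow makes it easy to swap in a better model or corpus per job; the orchestrator is the only piece that needs to know about all of them.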
Implementation & rollout (how they did it)
- Timeline: announced internally at the end of last year; first MVP agents were ready ~4–5 months later, after corpus collection and training.
- Pilot model: small pods within R&D using agents and giving feedback to improve tools; teams must provide feedback as a condition for access.
- Breadth: not yet organization-wide GA, but “substantial part” of org using agents; more users being onboarded as platform/tools mature.
- People programs:
- Associate Product Builder program (replaces APM) starting January — intensive training in coding/design/PM for LinkedIn’s context.
- New FSB title and career ladder — people from PM/design/engineering/BD can become FSBs.
- Measurement: Cohen frames impact as Experiment Volume × Quality ÷ Time — early signals include hours saved/week, better-quality specs and faster iteration.
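As a rough, made-up illustration of that framing (the episode gives the ratio but no precise units), a pod that runs more and better experiments in the same amount of time scores higher:

```python
def builder_impact(experiments: int, avg_quality: float, weeks: float) -> float:
    """Toy version of Experiment Volume x Quality / Time."""
    return experiments * avg_quality / weeks

# Hypothetical before/after numbers for one pod over a quarter.
before = builder_impact(experiments=4, avg_quality=0.6, weeks=12)    # 0.20
after = builder_impact(experiments=10, avg_quality=0.75, weeks=12)   # ~0.63
print(f"before={before:.2f}  after={after:.2f}")
```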
Early results & examples
- Savings: teams (PMs, designers, engineers) report saving hours per week on research, prototyping, and analyst queries.
- Quality: higher-quality discussions and insights, better hypothesis testing before engineering starts.
- Concrete wins:
- Trust agent found vulnerabilities in an “open to work” spec that earlier reviews missed.
- Research agent shifted a team’s direction by critiquing a marketing spec from the perspective of a persona.
- Maintenance agent automates many build/QA tasks (close to 50% of failed builds handled, per Cohen).
- Adoption: top performers are the first and heaviest adopters — AI tends to amplify high-skill builders more than turn weak performers into great ones.
Culture & change management: what's required
- Tools alone are insufficient — culture work is critical:
- Make tools accessible, create training, share early wins publicly, and set expectations.
- Align hiring, calibration, and performance reviews to reward AI fluency and full-stack outcomes.
- Use scarcity/FOMO strategically for initial access, but scale deliberately.
- Encourage grassroots experimentation, but maintain an executive-backed narrative and visibility.
- Recognition that not everyone will (or should) be a full stack builder — specialization still matters; the goal is fewer unnecessary silos.
Common pitfalls & surprises
- Off-the-shelf agents rarely work well with large legacy corpora; letting agents read everything (unfiltered drive content) causes hallucination and poor weighting. You must curate gold examples and focus the knowledge base (a minimal sketch follows this list).
- Tool fragmentation: different teams gravitate to different vendors; convergence is hard but necessary to avoid 8 parallel design agents.
- Adoption challenge: many employees won’t self-start; change management and incentives are needed.
- Not all functions are equally easy to automate; design craft and tooling required extra work.
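A minimal sketch of the "curate, don't dump" idea, using hypothetical field names; the point is simply that only reviewed gold examples reach the agent's knowledge base:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Doc:
    title: str
    text: str
    reviewed_by: str  # empty string means nobody has vouched for this document

def build_knowledge_base(candidates: List[Doc]) -> List[Doc]:
    """Keep only documents an owner has explicitly reviewed; everything else stays
    out of the agent's context instead of being dumped in wholesale."""
    return [doc for doc in candidates if doc.reviewed_by]

corpus = build_knowledge_base([
    Doc("Exemplary growth spec", "...", reviewed_by="head_of_growth"),
    Doc("Ten-year-old drive export", "...", reviewed_by=""),  # excluded: unreviewed
])
print([doc.title for doc in corpus])  # ['Exemplary growth spec']
```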
Practical advice for leaders (how to start)
- Invest in three areas: platform (re-architecture), tools/agents (customization + orchestrator), and culture (training, incentives).
- Curate limited, high-quality corpora and “gold” examples for agents — don’t dump unfiltered data.
- Start small with pods that must provide feedback; use these to create visible wins.
- Build training programs (like APB) to seed full-stack fluency and surface internal role transitions.
- Change performance metrics and evaluation to include AI fluency/agency.
- Be patient and commit resources up front — expect months to see meaningful organization-level impact, not weeks.
- Over-communicate vision and progress; provide permission to experiment without waiting for org-wide reorgs.
Notable quotes
- “The goal itself is to empower great builders to take their idea and to take it to market, regardless of their role in the stack.”
- “Everything else, I’m working really hard to automate. Really, really hard.”
- “If you build all those tools, will they use it? … It’s not enough to give them the tools. You have to build the incentives, programs, the motivation, the examples.”
- “You have to put the initial work to get the gains after — cleaning up the corpus, curating examples. Reasoning over your entire knowledge base does not work.”
Lightning-round extras (Tomer’s personal mentions)
- Book recommendations: Why Nations Fail; Outlive (about personalized medicine/longevity); The Beginning of Infinity.
- Favorite podcast recommendation: a Hebrew show called One Song (deep dives into songs); similar English example: Song Exploder.
- Life motto / mindset: “Becoming is better than being” — focus on continuous growth and iteration.
- Personal note: Tomer announced he’s leaving LinkedIn after 14 years and plans to focus on new problem sets and learning.
Action checklist for product leaders (quick-start)
- Audit your platform: which parts of your stack must be re-architected to let AI reason over them?
- Identify 2–3 high-value agents to build first (e.g., trust, research, growth, analyst).
- Curate golden training corpora and examples; avoid unfiltered “read everything” approaches.
- Launch a pod-based pilot with committed feedback loops and track Experiment Volume × Quality ÷ Time.
- Create a training pathway (internally or through hiring) for full-stack fluency; consider an APB-style track.
- Adjust hiring, promotion and performance criteria to reward AI use/fluency and cross-functional ability.
- Communicate early wins broadly and celebrate career transitions enabled by the program.
This summary condenses Tomer Cohen’s practical account of how LinkedIn is changing product development by combining AI agents, platform work and deliberate culture change to create smaller, faster, more adaptive builder teams. The core lesson: invest in platform and tooling, but prioritize culture and curated data — otherwise adoption and quality gains won’t stick.
