George's Takes

George Pu · 27 · Toronto · Building businesses to own for 30+ years · 10 min read

The 14 → 5 Layoff That Taught Me About AI and Teams

Headcount is a 20th century metric.

We went from 14 people down to 5 in 2023. In the two years since, we have barely hired.

One of the most painful moments in my career. Also one of the most instructive.

Here's what that layoff taught me about AI, teams, and why headcount used to matter but doesn't anymore—and what we lose when we optimize too hard for lean.

The Historical Logic (That Made Perfect Sense)

For most of the 20th century, headcount = output capacity:

  • Fast food restaurants: need bodies to operate fryers, take orders, clean tables
  • Airlines: need crew for safety and service (non-negotiable regulatory requirements)
  • Factories: need workers on assembly lines to build products
  • Consulting firms: need consultants to deliver billable hours to clients

The equation was simple:

  • More people = more capacity
  • More capacity = more output
  • More output = more revenue
  • More revenue = more successful company

Headcount was a perfect proxy for business scale and importance.

This made sense because humans were the execution layer. There was no alternative to people doing the work.

Example: 2010 software company:

  • Need developers to write code (no AI coding assistance)
  • Need designers to create interfaces (no AI design tools)
  • Need marketers to create campaigns (no AI content generation)
  • Need customer service to handle support (no AI chat assistance)
  • Need analysts to process data (no AI analytics)

Double the team = double the output capacity. The math was clear.

When Everything Changed

AI broke the headcount-to-output relationship:

The new reality:

  • 3 people with AI can outproduce 15 people without it
  • The execution layer is being automated
  • Headcount now often equals coordination overhead, not capacity
  • More people can mean slower decisions, not faster delivery

Example: 2024 software company:

  • AI writes code faster than most developers
  • AI generates designs and marketing content
  • AI handles customer service inquiries
  • AI processes data and generates insights
  • AI automates routine operational tasks

The humans shift from doing work to directing AI, making decisions, and solving novel problems.

Suddenly, the 20th century headcount equation breaks down:

  • More people ≠ more capacity (AI provides capacity)
  • More people = more coordination complexity
  • More people = higher costs without proportional output increase
  • More people = slower decision-making and reduced agility

The Painful Personal Experience

The Context Leading Up

By late 2022, we had 14 people:

  • 6 developers (full-stack, front-end, back-end specialists)
  • 2 designers (product design, marketing design)
  • 2 marketers (content, campaigns)
  • 2 customer success (support, onboarding)
  • 1 operations (HR, finance, coordination)
  • 1 analyst (data, reporting)

The team felt right: Everyone had clear roles, good chemistry, meaningful contribution.

The problem: Revenue wasn't scaling with team size. Customer acquisition costs were high. Profit margins were thin.

The AI factor: Throughout 2022, AI tools kept getting better. ChatGPT launched. Code generation improved dramatically. Design automation advanced rapidly.

By Q4 2022, I started noticing: Tasks that required full-time people were increasingly handled by AI tools in minutes or hours, not days or weeks.

The Painful Decision

The moment of truth came in early 2023:

  • Runway was 14 months with current burn rate
  • Revenue growth was 15% quarterly but expenses were growing 25%
  • AI tools could handle 60-70% of what we were hiring people to do

The options:

  1. Raise more capital to extend runway (dilute equity for inefficient operations)
  2. Reduce headcount to sustainable levels and use AI for execution
  3. Continue burning cash and hope revenue would catch up

We chose option 2. We went from 14 people down to 5.

The Human Cost

This wasn't a "big tech layoff" with severance packages and outplacement services.

This was a small company where:

  • We knew each other's families
  • Had worked together for 2-3 years
  • Shared meals, celebrated wins, struggled through problems together
  • Everyone had joined believing in the long-term vision

The conversations were brutal:

  • "We can't afford to keep the team at this size"
  • "It's not about performance, it's about economic reality"
  • "AI can handle most of what we need, but we can't afford both AI tools and full team"

One of the most painful moments in my career. These weren't numbers on a spreadsheet. They were people whose careers and livelihoods I was affecting.

The guilt was overwhelming: Did I hire too fast? Should I have seen this coming? Was I a bad leader for not figuring out sustainable growth?

The Immediate Aftermath

What we kept (5 people):

  • 2 developers (senior full-stack developers who could direct AI tools)
  • 1 designer (senior product designer with strategic thinking)
  • 1 customer success (who understood our customers deeply)
  • 1 founder (me, handling strategy, sales, operations)

What we replaced with AI:

  • Marketing content creation (ChatGPT, Claude, specialized tools)
  • Customer support (AI chatbot handling 80% of inquiries; see the triage sketch after this list)
  • Data analysis and reporting (automated dashboards, AI insights)
  • Routine development tasks (AI code generation, testing, documentation)
  • Design asset creation (AI design tools for marketing materials)
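
To make the support piece concrete, here is a minimal sketch of the triage pattern we leaned on: the AI answers routine, high-confidence inquiries and everything else escalates to the one customer success person. This is an illustration, not our production stack; `ai_classify` is a hypothetical stand-in for whatever model or vendor you would actually call.

```python
# Minimal illustration of AI-first support triage with human escalation.
# ai_classify is a hypothetical stand-in for a real model or classifier call.

from dataclasses import dataclass

ROUTINE_TOPICS = {"password_reset", "billing_question", "how_to", "account_settings"}

@dataclass
class Triage:
    topic: str
    confidence: float  # 0.0 to 1.0

def ai_classify(message: str) -> Triage:
    # Stand-in heuristic; in practice this would be an LLM or classifier call.
    text = message.lower()
    if "password" in text:
        return Triage("password_reset", 0.95)
    if "refund" in text:
        return Triage("refund_dispute", 0.80)
    return Triage("how_to", 0.55)

def route(message: str) -> str:
    t = ai_classify(message)
    # Auto-answer only routine topics at high confidence;
    # everything else goes to the human on support duty.
    if t.topic in ROUTINE_TOPICS and t.confidence >= 0.75:
        return "ai_autoreply"
    return "escalate_to_human"

if __name__ == "__main__":
    for msg in ["How do I reset my password?", "I want a refund for last month"]:
        print(msg, "->", route(msg))
```

The threshold is the whole game: tune it until the share of auto-replies roughly matches what the AI can actually answer well, and treat every escalation as a product signal, not just a ticket.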

The financial impact was immediate:

  • Monthly expenses: $180K → $65K
  • Runway extension: 14 months → 36+ months
  • Profit margins: 15% → 68%
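
If you want to sanity-check the runway figure, the math is just cash on hand divided by monthly burn. Here is a rough back-of-the-envelope sketch; the cash balance is an assumed, illustrative number (only the expense figures come from above), and it treats expenses as net burn, ignoring revenue.

```python
# Back-of-the-envelope runway math. The cash balance is an assumption for
# illustration; only the monthly expense figures come from the numbers above.
# Expenses are treated as net burn here, i.e. revenue is ignored.

cash_on_hand = 2_500_000   # hypothetical
burn_before = 180_000      # monthly expenses at 14 people
burn_after = 65_000        # monthly expenses at 5 people + AI tooling

runway_before = cash_on_hand / burn_before   # ~13.9 months
runway_after = cash_on_hand / burn_after     # ~38.5 months

print(f"before: {runway_before:.1f} months, after: {runway_after:.1f} months")
```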

The operational impact was mixed:

  • Faster decisions: 5 people align quicker than 14
  • Less coordination overhead: Fewer meetings, simpler processes
  • Higher individual responsibility: Each person's impact more visible
  • AI amplification: Small team achieving output of much larger team

What AI Replaced (And What It Didn't)

What AI Successfully Replaced

Execution tasks we would have hired for:

Content Creation:

  • Blog posts, social media, email campaigns
  • Product documentation, help articles
  • Marketing copy, sales materials
  • Internal communications, process documentation

Analysis and Reporting:

  • Customer usage analytics and insights
  • Marketing performance analysis
  • Financial reporting and forecasting
  • Competitive research and market analysis

Customer Support:

  • 80% of customer inquiries (through AI chatbot)
  • Onboarding documentation and tutorials
  • FAQ responses and troubleshooting guides
  • Routine account management tasks

Development Support:

  • Code generation for routine features
  • Testing and quality assurance automation
  • Bug detection and initial troubleshooting
  • Documentation generation and maintenance

Design and Creative:

  • Marketing graphics and social media assets
  • Basic UI elements and design systems
  • Image editing and optimization
  • Presentation templates and formatting

The result: We maintained similar output levels with 64% fewer people.

What AI Couldn't Replace

This is where I made the strategic error:

The people who could have reshaped the company's direction:

Strategic Thinking We Lost:

  • Product vision: Different perspectives on what customers actually needed
  • Market insights: Diverse viewpoints on competitive positioning and opportunities
  • Customer empathy: Deep understanding of user problems and motivations
  • Innovation: Creative approaches to business and technical challenges

Specific examples of what we missed:

The Marketing Perspective: Our AI could create content, but we lost the person who understood why certain messages resonated with customers. Result: Technically good content that didn't drive conversions.

The Customer Success Insights: AI handled support tickets, but we lost daily customer conversations that revealed product improvement opportunities. Result: Missed several feature ideas that customers were asking for.

The Technical Architecture Vision: AI could write code, but we lost the senior developer who questioned fundamental technical decisions. Result: Built features efficiently but may have chosen wrong technical direction.

The Business Development Instincts: AI could analyze market data, but we lost the person who had intuition about partnership opportunities. Result: Missed potential collaborations that could have accelerated growth.

"I have not counted in the people who could have made a huge difference into reshaping the company's routes."

The Counterfactual Problem

The trap of AI-powered lean teams:

What's visible: the cost savings, efficiency gains, faster execution.

What's invisible: the strategic thinking, diverse perspectives, and creative solutions you're not getting.

The Questions I Can't Answer

Would we have grown faster with a larger team?

  • Maybe the marketing person would have identified a customer segment we missed
  • Maybe the additional developer would have prevented technical debt that slowed us down
  • Maybe the business development person would have landed partnerships that 10x'd growth

Did we optimize for the wrong metrics?

  • Focused on cost efficiency and profit margins
  • May have sacrificed growth potential and strategic optionality
  • Chose financial sustainability over maximum opportunity pursuit

What innovations didn't happen?

  • AI can execute existing ideas efficiently
  • Humans generate novel ideas and question assumptions
  • Smaller team = fewer perspectives = potentially fewer breakthrough insights

The brutal reality: You can't measure what didn't happen. The counterfactual is invisible.

The Echo Chamber Risk

Small AI-amplified teams face a specific danger:

What makes small teams efficient:

  • Faster decision-making
  • Less coordination overhead
  • Higher individual impact
  • AI amplification of execution

What makes small teams dangerous:

  • Limited perspective diversity
  • Groupthink acceleration
  • Blind spot amplification
  • Echo chamber effects

Real example from our experience:

Product direction decision (5-person team): We decided to focus on enterprise features because the data suggested enterprise customers had higher lifetime value.

What we considered:

  • Customer analytics showing enterprise usage patterns
  • Revenue per customer analysis
  • Market research on enterprise vs. SMB segments

What we missed:

  • SMB customers were easier to acquire and scale
  • Enterprise sales cycles were much longer than we anticipated
  • Product complexity increased dramatically for enterprise features
  • SMB market was larger and growing faster

A larger team might have included:

  • Someone who worked at a company that tried enterprise pivot and failed
  • Someone with SMB market experience who could have provided alternative perspective
  • Someone with enterprise sales experience who understood the real timeline and costs

Result: We spent 8 months building enterprise features before realizing that an SMB focus would have been the better strategy.

The lesson: AI amplifies execution of decisions, but can't improve the quality of decision-making. That requires human diversity and perspective.

The Modern Team Composition Challenge

The key question: How do you get the benefit of lean (AI-powered efficiency) without losing the benefit of perspective (diverse human thinking)?

Option 1: Stay Ultra-Lean + Advisory Network

What we chose:

  • Core team: 5 full-time people + AI
  • Advisory network: 8-10 people contributing 2-5 hours monthly
  • Contractor network: Specialists for specific projects

Advantages:

  • Lower fixed costs and higher flexibility
  • Access to diverse expertise without full-time commitment
  • Ability to adjust advisory input based on needs

Disadvantages:

  • Advisors have limited context and availability
  • Less consistent strategic input
  • Harder to build deep working relationships

Option 2: Strategic Selective Hiring

Alternative approach:

  • Core team: 7-8 people optimized for strategic thinking + AI execution
  • Hire for perspective diversity, not just skill gaps
  • Focus on people who challenge assumptions and provide unique viewpoints

Advantages:

  • Full-time commitment and context
  • Daily strategic input and decision support
  • Better team chemistry and collaboration

Disadvantages:

  • Higher fixed costs and coordination complexity
  • Risk of hiring for wrong reasons (trying to replace AI capabilities)
  • Harder to adjust team composition as needs change

Option 3: Hybrid Model

Emerging best practice:

  • Core execution team: 4-6 people + AI for daily operations
  • Strategic council: 3-4 part-time senior people for major decisions
  • Specialist network: On-demand expertise for specific challenges

Structure:

  • Daily operations: Core team + AI handles execution and routine decisions
  • Weekly strategy: Core team + strategic council for important direction choices
  • Monthly planning: Full network for major strategic decisions and market assessment

What I Would Do Differently

The Decision I Got Right

Reducing headcount to sustainable levels: The financial math was clear. We couldn't afford 14 people with our revenue levels and growth trajectory.

Using AI for execution: The productivity gains were real. AI handled routine tasks better and faster than humans.

Maintaining quality standards: Despite the smaller team, product quality and customer satisfaction actually improved.

The Decision I Got Wrong

Not replacing strategic thinking capacity: I optimized for execution efficiency and cost savings without considering strategic thinking diversity.

Focusing only on measurable benefits: Cost reduction and productivity gains are easy to measure. Strategic insights and creative breakthrough potential are harder to quantify but may be more valuable.

Underestimating compound effects: Small team with limited perspectives makes incremental improvements efficiently but may miss major pivots or innovations.

What I Would Do Now

If facing the same decision today:

Phase 1: Core Team + AI (months 1-6)

  • Reduce to 6-7 people (not 5) with intentional diversity
  • One person from each critical perspective: technical, customer, market, business
  • Use AI aggressively for execution and routine tasks

Phase 2: Strategic Input Addition (months 7-12)

  • Add 3-4 part-time strategic advisors with different backgrounds
  • Monthly strategic reviews combining core team + advisory input
  • Focus advisors on questioning assumptions and providing alternative perspectives

Phase 3: Selective Scaling (year 2+)

  • Add full-time people only when specific strategic thinking gaps become clear
  • Hire for complementary perspectives, not just execution capabilities
  • Maintain AI-first execution with human-first strategic decision making

The Broader Implications

For Founders and Leaders

Headcount is no longer a proxy for:

  • Business importance or scale
  • Execution capacity or output potential
  • Team capability or competitive advantage

The new metrics that matter:

  • Revenue per employee: How much value does each person create?
  • Decision quality: Are you making better strategic choices?
  • Innovation rate: How quickly do you identify and pursue new opportunities?
  • Adaptability: How fast can you pivot when market conditions change?

Key insight: In the AI era, optimize for strategic thinking quality, not execution capacity.

For Investors

Red flags in AI-era company evaluation:

  • Large teams doing work that AI could handle
  • High headcount without corresponding strategic complexity
  • Hiring for execution roles without clear AI strategy
  • Celebrating headcount growth as primary success metric

Green flags for AI-era investment:

  • Small teams with disproportionate output and revenue
  • Clear AI integration strategy and execution
  • Diverse perspectives and strategic thinking capacity
  • Metrics focused on efficiency and decision quality rather than team size

For Employees

Career strategy implications:

  • Develop skills that complement AI rather than compete with it
  • Focus on strategic thinking, creative problem-solving, and relationship building
  • Become comfortable working with AI as amplification tool
  • Build expertise in directing AI rather than being replaced by it

Job security comes from:

  • Unique perspective and judgment that AI can't replicate
  • Ability to question assumptions and generate novel solutions
  • Strategic thinking that shapes direction rather than executes plans
  • Human relationships and emotional intelligence

The Uncomfortable Questions

For Your Own Team

Are you optimizing for the right metrics?

  • Is headcount growth actually correlated with business outcomes?
  • Could AI handle a significant portion of your team's current work?
  • Are you getting diverse strategic input or just more execution capacity?

What strategic thinking are you missing?

  • Do team members challenge your assumptions regularly?
  • Are different perspectives represented in major decisions?
  • How do you know what innovations or pivots you're not considering?

Is your team composition future-ready?

  • What happens when AI capabilities improve further?
  • Are you building sustainable competitive advantages or just temporary efficiency?
  • How will your team structure evolve as AI continues advancing?

For Industry Leaders

If AI continues eliminating execution work:

  • What does optimal team composition look like in 5 years?
  • How do we maintain innovation and strategic thinking in ultra-lean organizations?
  • What metrics should we use to evaluate company strength and potential?

How do we balance:

  • Efficiency gains from AI-powered small teams
  • Innovation benefits from diverse perspectives and larger groups
  • Economic pressure to minimize costs with strategic need for varied input
  • Short-term profit optimization with long-term growth potential

The Lessons Learned

What the Layoff Taught Me

About AI:

  • AI is incredibly powerful for execution and routine decision-making
  • AI amplifies human capability but doesn't replace human judgment
  • AI-powered teams can achieve remarkable efficiency gains
  • AI doesn't generate novel strategic insights or challenge assumptions

About teams:

  • Small teams make faster decisions but may make wrong decisions more efficiently
  • Diversity of perspective matters more in AI era, not less
  • Echo chambers are more dangerous when they have AI acceleration
  • The counterfactual (what doesn't happen) is often more important than what does

About leadership:

  • Optimizing for measurable benefits (cost, efficiency) may sacrifice immeasurable value (innovation, strategic insight)
  • Hard decisions about people are still the hardest part of leadership
  • Financial sustainability enables strategic flexibility, but shouldn't be the only goal
  • The most painful decisions often teach the most important lessons

The Right Answer (For Now)

Headcount is a 20th century metric, but team composition still matters.

The optimal approach:

  • Small core team optimized for AI amplification and execution efficiency
  • Diverse strategic input through advisors, part-time experts, and selective hiring
  • Ruthless focus on decision quality rather than execution speed
  • Regular evaluation of what strategic thinking and perspective you might be missing

The key insight: Use AI to amplify human strategic thinking, not replace it. Get the efficiency benefits of small teams while building in mechanisms for diverse perspective and assumption challenging.

The goal: Build companies that are both efficient and innovative, lean and strategic, fast and thoughtful.

The reality: This is still an unsolved problem. We're all figuring it out as AI capabilities continue advancing.

But the 14 → 5 layoff taught me this: The cost savings are obvious and immediate. The strategic costs are hidden and compound over time.

Don't make the same mistake I did. Optimize for both.