Most organizations I work with are grappling with the same question: what does AI mean for our engineering teams? Two recent examples from the crypto industry offer a stark contrast — and a framework for thinking about your own choices.
Block: The Headline Approach
Jack Dorsey announced layoffs of 4,000 employees, 40% of Block's workforce, claiming that "intelligence tools have changed what it means to build and run a company. A significantly smaller team, using the tools we're building, can do more and do it better." His formula: "100 people + AI = 1,000 people." The stock jumped 24%.
Coinbase: The Engineering Approach
Senior Director of Engineering Chintan Turakhia led 1,000+ engineers through AI adoption. PR review compressed from 150 hours to 15 hours. Feedback-to-ship cycles shortened dramatically. No layoffs. "Everybody shipped faster."
These aren't variations on a theme. They represent fundamentally different choices about AI's role in organizations.
Why the Block Narrative Deserves Scrutiny
Block built an internal AI tool called "Goose" that reportedly saved engineers 8-10 hours weekly, boosting productivity 40%. The company mandated daily generative AI tool usage and integrated AI fluency into performance evaluations.
But the timeline tells a different story. The March 2025 layoffs (931 people) explicitly stated that AI wasn't driving the cuts or changing headcount caps. Eleven months later, the new round of cuts was framed entirely around AI. In September 2025, Block spent $68 million on a three-day employee festival, roughly the annual payroll of 200 engineers. And between 2019 and 2023, Block's headcount had tripled from 3,800 to 13,000 during pandemic hiring.
Multiple analysts pointed out this looked like classic pandemic overhiring correction. Sam Altman himself acknowledged that companies sometimes blame AI for otherwise-planned layoffs. Tech journalist Om Malik called "100 people + AI = 1,000 people" a bumper sticker, not a business plan, and described an emerging template: cut half the company, blame machines, watch stocks rise.
What Coinbase Actually Did
Turakhia's team faced a formidable challenge: rewriting Coinbase's self-custody wallet into a consumer social app. They were competing against multi-thousand-person teams with ten-year head starts, working on six-to-nine-month timelines with a smaller team.
Rather than mandating AI usage, Turakhia demonstrated commitment. From January through April 2025, he was in Cursor daily, picking up bugs and writing PRs. He enabled peer learning by showing, not decreeing.
The Adoption Sequence That Worked
Phase 1: Remove the soul-sucking work. Start with unit tests, linting, PR descriptions — the draining paper cuts nobody wants to do. Initial wins were modest but meaningful: 20 automated unit tests that ran while engineers grabbed coffee. Those small wins sparked curiosity.
Phase 2: Create a shared wins/losses channel. Celebrating successes and learning from failures built collective knowledge. "I have a cursor rule for that" became common language across teams.
Phase 3: The speedrun. 100 engineers picked trivial tasks — copy changes, small bugs. Fifteen minutes produced 70 PRs, breaking GitHub. Eyes lit up. Later company-wide speedruns: 800 engineers generated 300-400 PRs in 30 minutes.
Phase 4: Full-cycle compression. Internal bots captured live user audio feedback, processed it through LLMs that identified bugs, auto-created Linear tickets, and kicked off PR generation. User feedback received production fixes within 30-minute calls.
That last point deserves emphasis. Customers on support calls saw their issues fixed before the call ended. That's not an incremental improvement — it's a fundamentally different customer relationship.
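Coinbase hasn't published the internals of that pipeline, so here is a minimal sketch of the pattern under stated assumptions: `classify_feedback`, `create_ticket`, and the PR kickoff flag are hypothetical stand-ins for the LLM call, the Linear API, and the PR-generating agent.

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    title: str
    description: str
    kind: str  # "bug" or "feedback"

def classify_feedback(transcript: str) -> str:
    # Hypothetical stand-in for the LLM call that decides whether a
    # snippet of live call audio describes a reproducible bug.
    bug_signals = ("crash", "error", "broken", "fails")
    return "bug" if any(s in transcript.lower() for s in bug_signals) else "feedback"

def create_ticket(transcript: str) -> Ticket:
    # Stand-in for auto-creating a Linear ticket from the transcript.
    kind = classify_feedback(transcript)
    return Ticket(title=transcript[:60], description=transcript, kind=kind)

def handle_live_feedback(transcript: str) -> dict:
    # Full loop: transcript -> classification -> ticket -> PR kickoff.
    ticket = create_ticket(transcript)
    pr_started = ticket.kind == "bug"  # only bugs kick off PR generation
    return {"ticket": ticket, "pr_started": pr_started}

result = handle_live_feedback("The wallet crashes when I open the send screen")
print(result["ticket"].kind, result["pr_started"])  # bug True
```

The design point is the chaining, not any one step: each stage hands a structured object to the next, so the whole loop can run inside the duration of a support call.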
Three Critical Shifts
Shift 1: From headcount to velocity. Old models measured capacity through people. Dorsey took this to the extreme: fewer people means less overhead. Turakhia demonstrated the alternative: identical team, dramatically faster shipping. The real value isn't payroll savings — it's market speed. Fixing customer issues during live calls represents a competitive advantage that headcount cuts cannot create.
Shift 2: From mandates to demonstrated conviction. Every successful AI adoption I've seen shares one trait: leaders who are hands-on with the tools. Not strategy documents. Not a delegated "Head of AI" role. Actual daily usage and an understanding of both capabilities and limitations.
Turakhia was in Cursor daily, submitting PRs from Ubers. His engineering teams saw their director writing code with AI. Permission structures shifted from management initiative to team norm. Compare this with mandating daily AI tool usage and tying it to performance reviews. One creates adoption. The other creates compliance.
Shift 3: From individual to system velocity. The real value isn't making one engineer faster — it's compressing entire systems from feedback through deployment.
Turakhia's team built custom Slack bots, automated ticket creation from live audio, and agent-powered PR generation. They didn't just adopt AI tools. They rearchitected development workflows around AI capabilities.
The Strategic Question
Rather than "should I cut 40% and call it AI?" consider a different question: what would happen if your team shipped 3-4x faster without losing anyone?
What markets would you expand into? What feedback cycles would compress? What customer relationships would transform?
The Block approach treats AI as a cost-reduction tool. The Coinbase approach treats AI as a capability multiplier. Both paths are available. Only one compounds over time.
Your Action Step
Map your engineering team's cycle time from feedback or feature request to shipped production code. Document every handoff, wait period, and approval queue in the pipeline. Identify which steps AI could compress — and focus on the coordination work (ticketing, PR reviews, tests, deployment) rather than just coding. Coding often represents less than 30% of total cycle time. The biggest gains are in the spaces between the code.
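As a starting point, this mapping needs nothing more than a timestamp at each handoff. A minimal sketch of the arithmetic (the stage names and numbers below are illustrative, not drawn from either company):

```python
from datetime import datetime

# Illustrative timestamps for one feature's journey; replace with
# data pulled from your own ticketing and CI systems.
stages = [
    ("feedback received", datetime(2025, 6, 2, 9, 0)),
    ("ticket triaged",    datetime(2025, 6, 3, 14, 0)),
    ("coding started",    datetime(2025, 6, 4, 10, 0)),
    ("PR opened",         datetime(2025, 6, 4, 16, 0)),
    ("review approved",   datetime(2025, 6, 6, 11, 0)),
    ("deployed",          datetime(2025, 6, 6, 15, 0)),
]

total = (stages[-1][1] - stages[0][1]).total_seconds()
for (prev_name, prev_t), (name, t) in zip(stages, stages[1:]):
    hours = (t - prev_t).total_seconds() / 3600
    share = (t - prev_t).total_seconds() / total * 100
    print(f"{prev_name} -> {name}: {hours:.0f}h ({share:.0f}% of cycle)")
```

In this made-up example, only the coding-started-to-PR-opened segment is actual coding (6 of 102 hours, under 6%); everything else is triage, review, and deployment coordination, which is where the gains above came from.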