It's been a tough few weeks for some of my former AWS colleagues.
In late January, Amazon announced another round of layoffs - 16,000 positions this time, part of a larger restructuring targeting 30,000 corporate roles. AWS, Prime Video, HR, retail. The news came alongside a $100 billion investment in AI. "Reducing layers, increasing ownership, removing bureaucracy," the announcement said.
I've been talking to people I worked with at AWS. Some will land quickly. Others might not. All of them are processing what it means when a company that built its culture on leadership principles makes decisions that feel at odds with one of those principles in particular.
"Leaders work every day to create a safer, more productive, higher performing, more diverse, and more just work environment. They lead with empathy, have fun at work, and make it easy for others to have fun. Leaders ask themselves: Are my fellow employees growing? Are they empowered? Are they ready for what's next? Leaders have a vision for and commitment to their employees' personal success, whether that be at Amazon or elsewhere."
I'll be honest - when Amazon introduced this principle in 2021, I was skeptical. "Earth's Best Employer" felt like corporate PR, a response to criticism of warehouse conditions and delivery driver treatment. It's an impossible standard, almost satirical in its ambition. No company can be Earth's best employer.
And yet.
The questions embedded in this principle - are my employees growing? Are they empowered? Are they ready for what's next? - have never been more urgent. Not despite AI, but because of it.
Sooner rather than later, you will need to accept that your team now includes AI.
Not metaphorically. Literally. AI agents are handling tasks that humans did six months ago. Microsoft's research calls this the "agent boss" era - everyone from interns to the C-suite managing a constellation of AI teammates. The skill that will define careers isn't completing tasks end-to-end. It's directing, supervising, and refining work done by agents.
This changes what "best employer" even means.
When AI can handle the repetitive, the routine, the analytically intensive - what's left for humans? And how do you prepare your people for a role that didn't exist until recently?
The burnout data makes this urgent. 76% of employees now report experiencing burnout - double pre-pandemic levels. Interestingly, Deloitte's 2025 research found that cognitive strain, not workload volume, is now the primary driver of burnout. People aren't burning out from too many tasks. They're burning out from too many decisions, too much context-switching, too much responsibility for outputs they didn't fully control.
Sound familiar? That's exactly what happens when you add AI to someone's workflow without redesigning the work itself.
So how do you apply "Strive to be Earth's Best Employer" when your team includes agents? I think the three questions in the principle become a diagnostic:
- Are my employees growing?
Growth used to mean learning to do more tasks, better. Now it means learning to orchestrate - to set intent clearly, review AI outputs critically, and make the judgment calls that agents can't.
The 34% of organizations that successfully implement agentic AI invest in developing people alongside deploying agents. The 66% that fail often treat AI as a cost-savings tool while expecting humans to "do more with less."
Ask yourself: Am I helping my people develop the skills to direct AI work? Or am I just giving them AI tools and hoping they figure it out?
- Are they empowered?
Empowerment used to mean giving people authority to make decisions. Now it also means giving people AI tools that multiply their capabilities - without drowning them in cognitive load.
AI can reduce burnout by handling exhausting work. Or it can increase burnout by adding another system to manage, another set of outputs to review, another source of responsibility without control.
The difference is implementation, not technology. Same AI, opposite outcomes, based on whether you designed for human thriving or just efficiency.
- Are they ready for what's next?
This is the question that I think about a lot (and try to act upon).
Microsoft's WorkLab predicts that everyone will become an "agent boss." The skill that matters isn't using any single AI tool well - it's orchestrating multiple tools into coherent workflows. It's the meta-skill of directing AI work.
Are your people ready for that? Have you prepared them for the identity shift from "I do the work" to "I direct the work"?
Possibly the cruelest (and least effective) thing a leader can do right now is deploy AI without preparing people for the transition.
The trap I'm talking about is treating agents as cost savings while telling humans they should be grateful they still have jobs.
I've seen this pattern play out. A company deploys AI to automate tasks, cuts headcount, expects the remaining employees to manage the agents on top of their previous workload, then wonders why burnout rises and quality drops.
"Strive to be Earth's Best Employer" doesn't mean never making hard decisions. It means making those decisions with genuine commitment to people's success - whether that be at Amazon or elsewhere.
For the colleagues I've been talking to this month, "elsewhere" is now their reality. This leadership principle asks: Did they leave ready for what's next? Did we invest in their growth while they were here, or just extract their labor?
Those questions apply whether someone leaves by choice or by layoff.
This week, pick one person on your team. Ask yourself the three questions:
- Is this person growing? What new skills are they developing for an AI-augmented world?
- Are they empowered? Do your AI tools multiply their capabilities - or just add to their cognitive load?
- Are they ready for what's next? Have you prepared them to orchestrate, not just execute?
If you can't answer confidently, you've found your leadership priority for next month.
Being "Earth's Best Employer" is an impossible standard. But asking these questions, honestly, for every person on your team - that's achievable. And in the age of AI, it's the minimum.