The Human-AI Alliance – Why Emotional Intelligence Must Lead the Way

“Without emotional intelligence, we might just automate ourselves out of humanity.”

Melika Mohammadzadeh’s latest research, conducted under the supervision of Dr. Anna Rostomyan and published in The European Business Review on March 27, 2025, has really grabbed my attention—primarily because it both confirms much of what I have been writing about in my newsletters in recent times and challenges some of what I have been saying. I stand ready to be suitably challenged and chastised if need be.

The article takes a bold step into the evolving terrain of AI in the workplace. It doesn't talk about robots replacing jobs. It talks about the new skillset we need to stay human in an AI-powered environment—emotional intelligence (EI).

Those of you who’ve followed my newsletters will hear a strong echo of what we’ve discussed before: that a primary psychosocial hazard facing today’s workplaces isn’t the AI tools themselves—it’s the emotional toll of trying to work with systems that aren’t designed to meet emotional needs.

This paper confirms much of what we’ve been saying:

Trust is key: High-EI employees are more likely to trust and appropriately challenge AI tools, engaging critically without falling into blind reliance.

Engagement is emotional: Mohammadzadeh’s study reinforces that emotionally intelligent workers interpret AI insights through a human lens, making ethical, empathetic decisions in HR and customer service alike.

Fear of job displacement isn’t just about logic—it's about emotional safety. And without EI, we risk triggering organisational trauma and institutional betrayal.

This aligns perfectly with my own experience in supporting high-performing teams under pressure. The danger isn’t AI. The danger is leaders who don’t realise that emotions show up long before logic does—and that EI is the only bridge across that divide.

A few insights that really resonated:

Employees with high EI are interpreters, not just users, of AI systems.

Trust in AI tools stems not from perfect technology, but from transparency, empathy, and clear emotional signals—exactly what emotionally intelligent leaders strive to develop and model.

Emotional AI, if left unregulated, has the potential to manipulate rather than support. Ethical use matters deeply here.

This paper reads like an endorsement of what many of you have already heard me talk about: constructive disruption, dynamic disharmony, and the importance of psychological safety as the bedrock of sustainable performance.

Let’s not wait for AI to teach us how to be human.
Emotional intelligence isn’t a “nice to have”—it’s mission-critical in the age of AI.

How emotionally intelligent is your leadership culture?

Let’s talk about how to ensure your tech supports empathy, trust, and real connection.
