Who gets to shape technology?
This article is part of a series exploring leadership, power and system design through the lens of artificial intelligence.
While AI is often discussed as a technical or ethical challenge, it is also a leadership issue. Decisions about how technology is designed, governed and deployed reflect who holds power, whose voices are included, and which outcomes are prioritised.
Drawing on my work with the Fawcett Society and my broader focus on leadership, equity and organisational systems, this series examines what AI can teach us about accountability, inclusion and the future of leadership.
When we talk about artificial intelligence, we often jump straight into the technical: data sets, algorithms, automation. But there’s a deeper question we need to ask first:
Who gets to shape technology?
And just as important: who is left out?
These are not just technical questions – they're also leadership questions: questions about power, accountability, and whose realities get to count.
I had the privilege of contributing to a project with the Fawcett Society alongside a group of incredible women brought together as Experts by Experience. Together, we explored how AI and technology could be designed differently – not as neutral tools dropped into the world, but as systems shaped intentionally with care, justice and lived experience at their core.
The work drew on something called feminist design thinking. And while that phrase might sound academic, what it really means is simple: design that centres people who are too often excluded. Design that recognises bias is not a glitch but a mirror of society. Design that asks not just “does this work?” but “who does this work for, and who might it harm?”
Why this matters
AI is not objective. It reflects the inequalities of the world it’s built from.
We see it in recruitment algorithms that favour men over women. We see it in medical AI trained mostly on male data, producing less accurate results for women. We see it in facial recognition that misgenders trans people or fails to recognise non-binary people altogether.
These harms are not accidental. They are the product of systems designed without accountability to the people most affected.
And that’s what struck me most about this project: the power of inclusion not as an afterthought, but as the foundation. This wasn’t about running a quick survey or “consulting” people at the end. It was about co-creating from the very start. It was about saying: the knowledge of those who live with exclusion, bias and inequality is not just valid – it’s essential.
The universal user myth
One of the most powerful ideas that came out of our discussions was the myth of the “universal user”.
In traditional tech design, there’s often an assumption that you can design for an “average” person. In practice, that “average” almost always defaults to a narrow slice of humanity: white, cisgender, able-bodied, middle-class men.
The result? Technology that erases or misrepresents those most likely to be excluded. Tools that work well for those at the centre and fail those at the margins.
Feminist design challenges this. It says: let’s stop pretending there’s a universal user. Let’s start by recognising difference. Let’s centre the people who are most likely to be excluded, because when technology works for them, it’s far more likely to work for everyone.
That’s not just good design. That’s good leadership.
What leaders can learn from feminist design
As I took part in these workshops, I kept noticing the parallels with leadership in organisations.
Leaders too often assume that one way of working will suit everyone. They design policies, structures and strategies based on what works for the “majority”, usually those most like themselves. But great leadership isn’t about designing for an average; it’s about listening deeply to those at the edges, the ones whose voices are often drowned out.
Because the truth is: exclusion is costly. In technology, it produces systems that scale harm. In business, it leads to disengaged teams, untapped potential, and leaders stuck in delivery mode rather than growth.
What this project reminded me is that accountability is not optional. Whether in tech or in leadership, when we ignore whose voices are missing, we risk building systems that replicate inequality. When we choose to listen, we create the conditions for people to thrive.
Justice is not a feature
One of the strongest messages from this project was simple but crucial: justice can’t be treated as a feature of technology; it has to be the foundation.
That idea holds true well beyond technology. We cannot tack equity on at the end of a process and hope it sticks. We have to embed it at the beginning.
For me, this connects to my wider mission in leadership and philanthropy. We talk a lot about innovation, scale, performance – but rarely about justice. What if we shifted the frame? What if success wasn’t just about efficiency, but about care? Not just about speed, but about impact? Not just about growth, but about accountability?
Those are the kinds of questions feminist design thinking asks of technology. They’re also the questions we should be asking of leadership.
A beginning, not an end
This article is the first in a five-part series about my experience with this project and the lessons it holds, not just for AI, but for leadership, equity and the future of work.
In the next article, “AI design, reimagined”, we’ll explore how design principles like care, transparency, collaboration and resistance can reshape not just tech products but organisational cultures. We’ll then look at data, bias and power – and what they mean for leaders who want to build fairer systems. And finally, we’ll reimagine futures where technology doesn’t just work for some, but for everyone.
The future of AI is not yet written. That gives us both the opportunity and the responsibility to shape it.
The same is true of leadership. The question is: will we design it by default and repeat old patterns, or will we design it with intention?
Until next time!
Tania.