Technical work is never just technical
In IT and software development, there is a strong belief in meritocracy and objectivity: good ideas win, code speaks for itself. But design decisions are always shaped by people, and people bring with them:
- Prior experiences
- Implicit assumptions
- Team dynamics and power structures
- Shared ideas of what is “normal” or “default”
When development teams are homogeneous, these assumptions often remain invisible. They aren’t intentionally harmful, but they are rarely questioned either. The result is technology that works well for some users, but less well for others.
What my research shows about Finnish IT development
My research combines narrative data from experienced IT professionals with a quantitative survey. Three key findings are especially relevant for software development work.
1. Women experience higher cognitive load in technical roles
Many women described a continuous need to prove their competence, justify their technical decisions, or fight to be heard in design discussions. This creates additional cognitive load: mental effort that has nothing to do with solving technical problems.
That cognitive load reduces the capacity for creative thinking, long-term architectural reasoning, and innovation.
This isn’t an individual issue. It directly affects a team’s design quality and problem-solving ability.
2. Design thinking doesn’t automatically reduce bias
Design thinking is often described as human-centred and inclusive by default. However, my research shows that:
- Awareness of design thinking doesn’t automatically lead to its active use
- Awareness of bias doesn’t automatically change behaviour
- The effectiveness of design thinking depends heavily on team diversity and reflexive practices
In other words, a method is not neutral on its own; it only becomes inclusive when the way it is used is reflective.
3. Bias affects outcomes, not just workplace experience
When certain perspectives are systematically sidelined, the consequences go beyond individual wellbeing. They affect:
- User assumptions
- Risk identification
- Whose needs are prioritised
- Whose problems remain invisible
This becomes particularly critical in data-driven and AI-enabled systems, where small design assumptions can scale into large societal impacts.
Why this matters right now
The next phase of digitalisation is no longer just about efficiency and automation. It is about:
- Trust
- Ethical sustainability
- Legitimacy
- Serving diverse users in meaningful ways
If technology is built primarily from a narrow experiential perspective, the result is not only inequality but also inferior technology.
What can practitioners take from this?
One key conclusion from my research is simple but challenging:
Reducing cognitive and gender-based bias isn’t a one-off intervention. It is an ongoing design practice.
In practice, this means:
- Paying attention to team composition and informal roles
- Creating space for reflective discussion about design assumptions
- Recognising that “tacit knowledge” isn’t always neutral knowledge
- Understanding that silence or withdrawal in design discussions can signal overload, not lack of competence
Closing thoughts
Technology doesn’t exist separately from the people who design it.
If we want to build ethical, sustainable, and high-quality digital solutions, we must also examine the structures and assumptions that shape design work, not just the final output.
In the next posts in this series, I will focus on:
- AI development: Why cognitive and gender-based bias shapes AI systems long before data and algorithms come into play
- Clients and impact: How these design choices affect customers, users, and trust in technology
Interested in more? Read my research or check out what Tivi wrote about it. And see how we do modern software development.