The road to hell is paved with good intentions. You’d be hard-pressed to find a company that says it prefers to create sexist designs using racist design methods. But beyond the (very human) desire to appear inclusive, we must dig deeper to think about the ways our products are created.
We must be intentionally anti-racist, from kickoff to delivery. Shame and guilt are not effective motivators for making the right decisions. A desire for equality, inclusion, and most of all justice, provides us with a purposeful path toward a decolonized future of design.
Our role as human-centered researchers and designers is to advocate for this future in every decision-making task we take part in. At Bresslergroup, our research team is made up of anthropologists, designers, and cognitive scientists who have studied how to most effectively reduce bias in our research. We’re committed to continuing to learn new strategies to do so, and to decolonize design. In the wide range of Design Research that we practice — from global contextual inquiry to quantitative surveys — we work to mitigate bias in all that we do.
Our role as human-centered researchers and designers is to advocate for a decolonized future of design in every decision-making task we take part in.
Before we get ahead of ourselves, it’s important to acknowledge that the fields of research and design are historically traumatizing ones for marginalized communities. In general, “good research” practices and “good design” practices have been defined by white men and are rooted in the Eurocentric idea of “discovering” other cultures.
When it’s the researcher who determines how to read data and interpret the narratives embedded in it, research risks stripping people of their agency to tell their own story. One way we combat this is by acknowledging our biases as researchers, and by considering how those biases and our positionality relative to our participants might affect our interpretation of the data we collect.
Decolonizing Research Methods
Researchers must understand that we’re not blank voids to be filled with data, but rather people whose experiences will color our data if we’re not careful. If we make the mistake of ignoring our own implicit biases in our research studies, we risk missing huge areas of opportunity — or worse, we create a product that’s not usable or is actively harmful to the population we aim to design for.
To begin to combat these biases, our Design Research team bakes these three strategies into all that we do. We employ them as guidelines on every project:
1 – Diversity in Participant Population: We strive to bring diverse voices to the decision table.
2 – Trauma-Sensitive Data Collection: We foster awareness around potentially traumatic data collection.
3 – Truth in Delivery: We deliver what the client needs to hear, not what they want to hear.
1 – Diversity in Participant Population
Sometimes clients know they need to talk to a certain sub-population. Sometimes they don’t. Sometimes we need to tell them exactly who they need to talk to.
Develop the language that will allow you to understand your participant population’s breadth of diversity. Companies are now realizing that Male / Female options for gender survey responses are inadequate. But why? When you don’t have access to the detailed information you need to appropriately segment or recruit your research participants, you may miss out on understanding a user population that you hadn’t yet considered.
Yes, it’s easier for the researcher to present “Race” as a single-choice option. But forcing participants to uncomfortably self-sort into a gender or racial binary for ease of analysis impacts the accuracy of a survey’s data.
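The point about single-choice fields can be made concrete. Below is a minimal sketch, in Python, of how a demographic survey question might be modeled so that multi-select, self-described, and declined responses are all treated as valid. The `DemographicQuestion` class and the option labels are hypothetical illustrations for this article, not part of any specific survey platform, and the listed options are not a recommended canonical set of identity categories.

```python
# Minimal sketch, assuming a simple in-house survey tool. Class name, fields,
# and option labels are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class DemographicQuestion:
    prompt: str
    options: list[str]
    multi_select: bool = True          # "select all that apply" rather than a forced single choice
    allow_self_describe: bool = True   # free-text "prefer to self-describe"
    allow_decline: bool = True         # "prefer not to answer"

    def validate(self, answers: list[str]) -> bool:
        """Accept any combination of listed options, a self-described
        free-text answer, or a declined (empty) response."""
        if not answers:
            return self.allow_decline
        if not self.multi_select and len(answers) > 1:
            return False
        if self.allow_self_describe:
            return True  # free-text answers outside the listed options are valid
        return all(a in self.options for a in answers)

gender_q = DemographicQuestion(
    prompt="How do you describe your gender? (Select all that apply.)",
    options=["Woman", "Man", "Non-binary"],
)

# Both a multi-select response and a self-described answer are valid:
print(gender_q.validate(["Woman", "Non-binary"]))  # True
print(gender_q.validate(["Agender"]))              # True (self-described)
```

The design choice here is that the validation logic never forces a respondent into one box: declining, selecting several options, and writing in an answer are all first-class responses rather than error states.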
“Recruit a mix” isn’t the solution you think it is. If you’ve ever designed or read a recruiting screener document, you’ve likely seen the phrase “Recruit a mix.” This is the equivalent of slapping a dirty Band-Aid on a wound and expecting it to heal on its own. “Recruit a mix” implies that any mix of gender, race, ability, or socioeconomic attributes will do for the study.
The researcher wants a “spread” of representation to show that they’ve considered the diversity of their participant pool, but isn’t willing to put forth the thought required to determine how that spread should look. Here’s where we can get into the messy space of tokenization with the thinking, “My study had a few Black people in it — I did my job!”
“Recruit a mix” is the equivalent of slapping a dirty Band-Aid on a wound and expecting it to heal on its own.
One example that comes to mind is a haircare project our research team was bidding for. The client wanted to include three participants with natural hair in the study of fifteen people. The product we were studying was not formulated for people with natural hair. The client wanted to “just see” how it would go.
We pushed back. What would you learn from talking to participants who know the product they’re assessing isn’t designed for them? How would that make participants feel? What would you learn that you don’t already know? This user population represented an entirely different problem space with completely unique needs. Instead, we recommended creating a second study to focus on how the product can be reformulated to work with natural hair. This “I don’t see color” approach to recruitment is not an effective method of uncovering meaningful design opportunities.
Be aware of your biases — we don’t know what we don’t know. It’s nearly impossible to know the impact that colonization has had on your study until you dig into it. We can plan as much as we like, but unless we’re open to the prospect of uncovering these uncomfortable findings, we can easily miss them.
I worked on a study for an international company whose large-file transfer system ran on a single, centrally hosted server that offices worldwide depended on. We traveled to countries around the world to assess how different groups within the organization used the system. In a country far from the European server center, the user experience degraded dramatically: files took minutes to transfer, multiple people had been hired to manage the crawling system, and efficiency went out the window.
There had been some debate at the start of the study over whether it would be “worth it” to visit this country. Evidently, it was!
2 – Trauma-Sensitive Data Collection
When collecting data from participants, it’s important that the right person is asking the right questions to ensure participant comfort and safety.
Ask yourself: Are you the right person to lead the study? Research is inherently biased. Our job is to mitigate that as much as possible. We all bring our life experiences with us into interviews whether or not we’d like to admit it. Because of this, we try to ensure that the right researcher is in the room to ask the right questions the right way.
We all bring our life experiences with us into interviews whether or not we’d like to admit it.
Has someone on your research team experienced challenges similar to your participants’? Will they have less bias to overcome? Will they be more likely to set your participants at ease? For a study where we spoke to people who had experienced difficult pregnancies, the moderators were all people who had experienced pregnancy. In a study for an AFAB-specific medical device, all team members were AFAB (assigned female at birth) themselves.
Consent is fundamental. Always make extremely clear at the start of the session that the participant is in control. They can choose to not answer a question, and they can leave the session at any time. This is especially important if the content of the interview touches on sensitive, potentially traumatic, or triggering experiences.
Continue to provide participants with an “out” throughout the interview. Consent, as in all areas of life, is not a one-time event. It takes work to ensure that the participant is comfortable and aware of this throughout the session, especially if the researcher clearly does not have similar life experiences.
Know that you might mess up. Then adjust. The work to decolonize our research is never done. Researchers must be willing to pivot and evolve as new information becomes available to them.
In a study where we assessed the changes to daily life that occurred after the start of the global pandemic, we used the phrase “new normal” repeatedly throughout our data collection. One participant got a funny look on their face when we said this to them.
“You know I hear that all the time in my PTSD support group, right?”
We paused our initial line of questioning, apologized to the participant, and asked if they would be willing to share more with us about the use of the phrase. We stopped using the phrase “new normal” in future sessions to ensure that participants wouldn’t have a similar triggering moment during their interview about a relatively benign, everyday topic.
3 – Truth in Delivery
Censoring your findings to appease your client or a design team is not an option.
Report what you find. It can be uncomfortable to tell a client something they don’t want to hear, but as a researcher, that’s our job. We’re the bad guys of the design process. Our inquiry can uncover unsavory facts about a situation that can’t always be solved by design alone. It’s still vital to present those findings, especially if they impact the context in which design decisions will be made.
It can be uncomfortable to tell a client something they don’t want to hear, but as a researcher, that’s our job. We’re the bad guys of the design process.
In the same study as referenced above with people who experienced difficult pregnancies, we learned about the significant impact that access to healthcare, healthy diet options, and transportation had on a pregnant person’s ability to care for themselves. We described these factors in our findings even though our client, who was designing a health device application, couldn’t do anything about them. This information helped them frame their design choices and make decisions about language that would be sensitive to the broader socioeconomic factors at play.
Use humanizing language. When reporting on participants who have a particular disease or experience, employing people-first language reinforces clients’ and stakeholders’ desire to make human-centered decisions. Instead of “the homeless,” we say “people experiencing homelessness.” Instead of “diabetic,” we say “people with diabetes.”
Simple changes in language may seem trivial, but when presenting work to clients unfamiliar with the human-centered design process, these seemingly minor adjustments ensure that the person being designed for is at the front of everyone’s minds.
Keep the right people in the room. However, it’s not enough to simply use people-first language. You need to design with those people. A hand-off to the client or design team without factoring in additional concept development research with the affected user populations is a missed opportunity.
We aim to include multiple research touch points in any project so the people being designed for can help us make decisions. Co-design is an excellent activity for having our users assist in making those design decisions.
We, as researchers, are not the arbiters of truth. We do not own an unbiased view of a design problem. Without repeated, respectful engagement with an affected user group, there’s always the possibility of designing a tone-deaf, unusable, or even actively harmful product.
The Design Researcher’s Evolving Role
Researchers are in a unique position to help clients change the way they think. By baking the strategies above into our work as a matter of course, we’ve helped our clients begin to understand that everyone benefits when you design to decolonize.
Acknowledging that this article was written by a college-educated white woman, there is always room to grow. We’re committed to listening and learning from other researchers doing this important work and welcome any feedback on the strategies outlined in this article. This work is never done. All of us in the design community must do our part to continue to move the needle. Small decisions ladder up to bigger decisions, and big decisions can change the world.
This article was originally published in the Spring 2021 issue of IDSA’s Innovation magazine.