Yifan Butsik, bootcamp participant: An Ethnographic Approach to Inclusive Business Technology Development

--

Yifan Butsik is currently completing a Master of Research in Creative Computing at the UAL Creative Computing Institute in London, and working as a graduate teaching assistant in equitable AI practice. In January 2023, Yifan took part in Namla’s Winter School with the University of Amsterdam. They were part of a team spanning three cultural backgrounds and vastly different life experiences, and this diversity enabled the team to design Bankbot, a chatbot concept aimed at making banking customer-service chatbots inclusive to all genders. Yifan wrote a reflection on their experience.

My interest in creating a gender-inclusive banking chatbot comes from my personal experience of working with gender-minority groups. My observation is that their voices and lives are stigmatized and stereotyped in digital representations. I specifically proposed the case of banking chatbots because this technology is deeply integrated into digital financial services and widely used. Academic researchers have studied how gender bias in design might degrade the customer experience, but none has yet offered a concrete design solution that embeds gender-inclusive values. Taking a solution-oriented approach, our project could help identify the impact of gendered communication encoded in language and visual elements (e.g., the avatar). Most importantly, it can potentially benefit the banking industry (i.e., the product owner) by improving gender-minority customers’ digital experience of interacting with its services.

Starting with a literature review, we found that language might affect how users interact with a chatbot. We therefore conducted semi-structured interviews with cis-female and transgender respondents, plus a survey of a broader audience, as our first round of field research. In the interviews, I deliberately asked participants to comment on three samples of gender-coded chatbot interactions. Interestingly, my participant (cis-female) did not find them problematic because “chatbots are not human and do not have a gender”. My interviews did, however, reveal that certain language styles (e.g., confirming one’s request) might lead users to assume a chatbot is female-coded. They also showed that a humanized, but not sexist or gender-stereotyped, chatbot might improve the user experience when negative emotions are evoked (e.g., while disputing a fraudulent charge). Similarly, our group’s first-round survey showed that most participants did not perceive a gender in the chatbot’s language design but expected a humanized and efficient user-chatbot interaction.

Combining these findings with our literature analysis, our second round of research (prototyping and user testing) looked at additional features, such as the avatar and the chatbot’s name, that might help deliver a humanized and gender-inclusive chatbot experience. Our analysis showed that users prefer a combination of mechanical and human(e) experience design: a chatbot agent with clear indications of its robotic character (a robotic avatar, the name “Bankbot”, an option to connect to a human agent) alongside some humanized language (e.g., “How can I help you today?”). The second round also indicated that respondents prefer a mechanical interaction style (e.g., “please select from the following options:”), which accords well with our interview findings (e.g., “for simple inquiries, I think the mechanic conversation is perfect…because they are very clear and efficient to me.”). Taking into account users’ preferences for language style, avatar design, and chatbot name, we then created our own banking chatbot built around customers’ perceptions of gender inclusivity.
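To make this design pattern concrete, here is a minimal sketch (in Python) of the interaction logic described above: a clearly robotic, gender-neutral identity, a humanized greeting, a mechanical option menu for simple inquiries, and a visible route to a human agent. The menu options and function names are illustrative assumptions, not the team’s actual implementation.

```python
# Illustrative sketch of the interaction pattern described above: a chatbot
# that signals its robotic identity (name, no gendered persona), opens with
# a humanized but gender-neutral greeting, handles simple inquiries through
# a mechanical menu, and always keeps a route to a human agent visible.
# All names and menu options here are hypothetical, not the team's code.

MENU_OPTIONS = [
    "Check my balance",
    "Dispute a charge",
    "Report a lost card",
    "Connect me to a human agent",  # escape hatch, always listed
]

def greet() -> str:
    # Humanized yet gender-neutral: no gendered name, avatar, or pronouns.
    return "Hi, I'm Bankbot, an automated assistant. How can I help you today?"

def show_menu() -> str:
    # Mechanical interaction style that respondents preferred for simple inquiries.
    lines = ["Please select from the following options:"]
    lines += [f"  {i}. {option}" for i, option in enumerate(MENU_OPTIONS, start=1)]
    return "\n".join(lines)

def handle_choice(choice: int) -> str:
    # The last option hands the conversation off to a person.
    if choice == len(MENU_OPTIONS):
        return "Connecting you to a human agent now."
    return f"Okay, let's look at: {MENU_OPTIONS[choice - 1].lower()}."

if __name__ == "__main__":
    print(greet())
    print(show_menu())
    print(handle_choice(2))  # e.g., a user disputing a charge
```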

For our target audience, the banking industry, building gender awareness (e.g., applying gender-inclusive language) into the product ideation stage can help attract potential customers and promote the commercial values of “inclusion, diversity, and equality”. Hiring a gender-diverse product development team and actively conducting user research with gender-minority populations (even rapid ethnographic research) are fundamental to achieving this goal.

As a computing student, conducting research with human participants ethically has been challenging for me. Conventionally, a project involving human participants needs to be reviewed by an ethics panel and consented to by participants. For this project, the whole data collection process might not meet academic ethical standards. What I did was state our research objectives and intended data use at the beginning of each interview. Although our group agreed not to disclose the research aims in our survey (since the goal was to conduct a blind test), when sharing the survey with my personal contacts I mentioned that their feedback would be fully anonymized and their data would not be used for any published work.

In general, this applied anthropology program has been very helpful in inspiring my design research, especially where social and cultural values are tightly interwoven. When digital service design intersects with gender-sensitive topics, I am now aware of my privilege as a cis male and reflect on how that privilege might lead to a biased prototype. My team also helped me think further about how to articulate the core concept of this project, designing for gender neutrality, to the general public in a carefully considered way. For instance, instead of calling it “gender-neutral design” (which might offend some participants), I should describe it as “combating sexist and outdated gender stereotypes in chatbot design”. Framed this way, I was able to engage further with those who wanted to share their own stories or opinions about gender-inclusive chatbot design. What I learned most from this course is to focus less on details (e.g., whether native language plays a role in understanding language communication) and more on the key questions for our participants (e.g., which visual elements might reinforce gender-biased design).

However, this rapid pace also limited our research. One limitation is that I could not investigate why cis-female and transgender individuals differ in how they detect and perceive gendered language. In addition, it remains unknown why some participants (my teammate’s personal contacts) were upset by our gender-neutral chatbot proposal. That said, this course helped me become familiar with defining a project’s scope, applying user-centric ethnographic research, and rapid testing with interested groups. Most importantly, I learned what needs to be considered (e.g., brand image, corporate culture) when pitching a product to stakeholders in a business setting.

--

Namla - people centered ideas for wicked problems

We are on a mission to bring applied social sciences and organisations closer together so that more effective solutions for wicked problems can be found.