Savanna Blade - A New Form of Subservience: Analyzing the Implications of Gendered Virtual Assistants

09/11/2020

Artificial intelligence and machine learning techniques are beginning to fundamentally shift the way individuals expect to interact with technology; machines are being designed to not only 'think' like humans, but to communicate as humans do. As interactions with technology begin to adapt to the social frameworks that have long existed between humans, it is necessary to reflect on how design decisions may be influenced by the inequitable foundations on which these frameworks were constructed. This paper will analyze how the design of virtual assistants has led them to become part of the gender ecology in which they are situated, and how these technologies may be reinforcing gender hierarchies as interactions with artificial entities become a standard and even inevitable part of life.

During a period of sci-fi preoccupation, the film "Ex Machina" was recommended to me by a friend. The story is set in a near future in which tech-boy-king Nathan Bateman has ostensibly achieved artificial consciousness. Later in the film, it is revealed that Nathan has been constructing artificially conscious beings to serve his own purposes, from entertainment, to servitude, to sexual gratification. Every entity he creates (and eventually exploits) is, by every human aesthetic standard, feminine: in physical form, in voice, in mannerism, and in its dynamics with the male characters. After watching the film, I was left with a pervading sense of dread; it seemed to capture some impending reality whose consequences were already present in subtle yet distinct ways. Eventually, I attributed this consternation to the following conclusion.

Throughout history, there have been attempts to formalize the purposes of womxn as a function of the needs of men. While this hierarchy has remained ingrained in social institutions, it has always faced internal pressure; this arises from the position of these individuals as subjects, who, through the transcendent experience of consciousness, recognize at some level their inherent right to personhood (albeit often in conflict with conditioning by these institutions, as discussed by Simone de Beauvoir in Chapter 10 of The Second Sex). However, artificial entities provide the opportunity to fill this desired subordinate role with all of the social markers of 'femininity' and none of the 'humanness' to impede absolute servitude. Autonomy and liberation are no longer concepts that must be contended with; they may simply be coded out, in the ultimate act of objectification.

While we have not yet achieved (or even truly defined) formal artificial consciousness, trends towards empathetic design and the prioritization of user experience have led to interfaces that appeal to our sense of comfort by mirroring human interaction. This is perhaps most apparent in the stunning growth in popularity of virtual assistants, digital entities programmed to answer queries with responses that are typically not explicitly scripted but guided by machine learning techniques so as to mimic conversation (I'd Blush If I Could: Closing Gender Divides In Digital Skills Through Education). These technologies, specifically those that rely on voice user interfaces, are considered one of the "fastest-growing connected device categorie[s]" (Sallomi) to the extent that by 2021, the number of voice-activated assistants operating at any given time is projected to outnumber the human population (De Renesse).

The intention of these technologies is for each interaction to feel as 'human' as possible. As described in an article by a Microsoft employee on the creation of their virtual assistant, Cortana, it was determined that to function effectively, "[she should have] a personality...with make-believe feelings...likes and dislikes, even sensitivities and hopes" (Foster). This marks a paradigm shift in the way human-technology interactions are characterized. In the past, technology was generally perceived as a 'neutral' tool by those who used it; however, as it assumes a more anthropomorphic role, it becomes an increasingly apparent part of the social fabric within which human-human interactions exist. In sociology, actor-network theory posits that everything in the social and natural worlds exists within a map of relationships (Law 2). It asserts that all interactions (even those between humans and objects) play a role in shaping social forces, and as such suggests that we must consider human-technology interactions as part of the same network as human-human interactions (Latour 1). One barrier to this model at the time it was proposed, as its main author, Bruno Latour, observed, was that the physical design of technological interfaces prevents humans from perceiving these interactions as part of the same social network. In his words, "[people] can't simply tell an automobile engine that it should get 100 miles per gallon" (Latour 1). The mechanical interface obstructs empathetic connection between the human and the machine. However, virtual assistants create a new set of human-technology interactions that aim to be measurably indistinguishable from human-human interactions. They present a new reality wherein the dynamics we have with our technology will mirror broader social structures.

To understand the consequences of embedding these technologies into the social landscape, it is necessary to consider the role of identity in determining how individuals are situated in this environment. By assigning a personality, and, by extension, an identity, to an inanimate artifact such as a virtual assistant, one must ask: what values are desired in the behaviour of this entity? How will decisions about the identity of this artifact be influenced by these desired values? These questions are being answered by tech teams that are overwhelmingly dominated by individuals of a certain identity: "male, affluent, and middle aged" (Pringle). And, in turn, it seems that they have mostly been answered the same way. The four main voice assistants present in North America - Cortana from Microsoft, Siri from Apple, Alexa from Amazon, and Google Assistant from Google - have been implemented with stereotypically feminine voices and 'personalities'. It has become increasingly obvious how a lack of diversity in tech fields allows unconscious biases and prejudices to seep into technology, from discriminatory hiring algorithms (Dastin) to unjust recidivism predictions (Larson et al.), demonstrating that structures of injustice are readily reflected in the creations of these fields. The practice of anthropomorphizing technology is a subtle yet insidious extension of this pattern, wherein homogenous tech teams are given the power to single-handedly decide which identities are best suited to certain (particularly servile) roles. The perspectives underlying this phenomenon, and the resulting consequences, can be dissected into two categories: the suggestion of implicit associations with traditional gender roles, and the condoning of explicit behaviours that reflect harmful power dynamics.

When questioned about the gendering of virtual assistants, companies have cited research that suggests consumers prefer to hear feminine voices when interacting with a voice user interface, considering them "warmer" (Mitchell et al. 7) and more "trustworthy" (LaFrance). While these associations may not appear harmful, it is important to recognize the biased assumptions on which they are founded. The perceived pleasantness of feminine assistants most likely results from preferences towards feminine voices in subservient roles and masculine voices when hearing authoritative statements. In 1997, well before virtual assistants entered the mainstream, research was conducted on whether computers with no gender cues other than voice would elicit gender-biased results when placed in a position of relative authority to subjects (Nass et al. 11). The findings were conclusive: subjects perceived a computer tutor with a feminine voice to be significantly less friendly when giving instructions and regarded its evaluation as less valid. The interplay between gender roles and gendered technology was clear long before virtual assistants became ubiquitous; their staggering presence in the current technological landscape now necessitates consideration of the extent of the impact of these subconscious associations.

While implicit associations delineate the social frameworks embedded in the design of virtual assistants, it is the behaviour exhibited towards these entities that truly demonstrates the reciprocal impact of human-technology interactions on human-human social structures. Earlier this year, UNESCO published a report entitled "I'd Blush if I Could" to discuss the potential harm of gendering AI (I'd Blush If I Could: Closing Gender Divides In Digital Skills Through Education). The title is in reference to Siri's typical response to being told, "You're a bi**h."



Figure 1: A chart delineating the responses of various virtual assistants to sexual harassment (I'd Blush If I Could: Closing Gender Divides In Digital Skills Through Education)

This disturbingly submissive response to a derogatory comment is one of many instances wherein virtual assistants remain passive when confronted with harassment. As seen in Figure 1, virtual assistants tended to playfully dismiss or even respond positively to unwanted sexual comments (Fessler). The potential implications are especially troubling when the scope of the issue is taken into account. At least 5% of interactions with virtual assistants are categorized as "unambiguously sexually explicit" (Coren), with the actual number projected to be much higher due to difficulties in detecting sexual language algorithmically. The condoning of vocalized misogyny by virtual assistants highlights how gendered technology acts back onto the social framework in which it exists. If an individual makes an unprompted sexual comment to a virtual assistant, and the response is neutral or even encouraging, the inherent harm present in the interaction will go unrecognized by the individual, who may repeat the same type of interaction in a human-human context. Some researchers are already exploring the ramifications of sociable technology on the development of children, suggesting that a child's capacity for empathy can be impacted by reliance on digital entities for social interaction (Turkle). It is then worth considering how the presence of a compliant, always-accommodating feminine companion will influence children's reception of messages regarding consent and respect. In an era where society is only just coming to terms with the pervasiveness of sexual harassment, reinforcing such conduct in technology as encompassing as virtual assistants may significantly stunt progress towards deconstructing the social frameworks that motivate such behaviours.

The problematic trends arising in the gendering of virtual assistants are an indicator of a much broader point of future social contention: the transition of gender into an environment completely detached from biology, the traditionally accepted arbiter of all gendered matters. In "Doing Justice to Someone: Sex Reassignment and Allegories of Transsexuality", Judith Butler describes gender as an answer to the question of "what...[one] can be" (Butler 2); gender provides a framework of "intelligibility" that enables humans to understand what they have the capacity to become in a hetero-cisnormative culture, and a perceived lack of 'intelligibility' in this domain can lead to the "unrecognizability of one's personhood" (Butler 2) in a society that maintains rigid gender-binary roles. Butler suggests that this relentless drive for 'intelligibility', perpetuated by heteronormative, cisnormative, and patriarchal structures, is responsible for the violence and mistreatment enacted against transgender and nonbinary communities. The same drive has seen gender hierarchies imposed onto technological spaces, even though many of these hierarchies are incomprehensible without their biological foundations. However, there is no reason that such dynamics, rooted in physical domination, must be recreated in digital spaces, where they will continue the traditions of marginalization and oppression inherent in their construction. As the social gaze is integrated into digital experience, and self-expression is afforded a unique opportunity to flourish through curation of non-corporeal entities, it is time to reimagine the transition of gender into the realm of technology altogether. In "A Cyborg Manifesto", Donna Haraway tells feminist collectives that "communications technologies...are the crucial tools recrafting our bodies" (Haraway 33). Over 30 years later, we must consider the practical ways that this might be accomplished.
If we recognize gender as a performance, having "no ontological status apart from the various acts which constitute its reality" (Butler, "Gender Trouble", 136), we begin to see technology as a vast canvas for gendered expression, one that extends beyond all limits imposed by the physical boundary of the tangible self. Viewed in this manner, the avenues through which gender is communicated may be recreated and reformed by those developing digital identities in a way that discards structures of dominance and control. It is the responsibility of those who have the privilege of ushering in these technologies to decide what identity-related signifiers are present in the virtual entities they create, and what precedent this might set for the ways in which gender will be expressed and explored in artificial environments as they rapidly progress in the coming years.

In recent years, technology has become perceptibly more human. As the distinction breaks down between interactions with machines and interactions with humans, it is no longer possible to deny that both share the same underlying social foundations. As this technology continues to evolve, it will become the responsibility of both theorists and engineers to understand how decisions about techno-identity will resonate with those who use it - whether that means deconstructing and rebuilding a more equitable platform with modes of expression for all, or further marginalizing those whose expression is restricted by the current system.


Works Cited 

de Beauvoir, Simone. The Second Sex. New York: Vintage Books, 1989.

Butler, Judith. "Doing Justice to Someone: Sex Reassignment and Allegories of Transsexuality". GLQ: A Journal of Lesbian and Gay Studies, vol. 7, no. 4, 2001, pp. 621-636. Duke University Press, doi:10.1215/10642684-7-4-621.

Butler, Judith. Gender Trouble: Feminism And The Subversion Of Identity. 1st ed., Routledge, 1990.

Coren, Michael. "Virtual Assistants Spend Much Of Their Time Fending Off Sexual Harassment". Quartz, 2016, https://qz.com/818151/virtual-assistant-bots-like-siri-alexa-and-cortana-spend-much-of-their-time-fending-off-sexual-harassment/.

Dastin, Jeffrey. "Amazon Scraps Secret AI Recruiting Tool That Showed Bias Against Women". Reuters, 2018, https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G.

De Renesse, Ronan. "Virtual Digital Assistants To Overtake World Population By 2021". Omdia.Com, 2017, https://www.omdia.com/resources/product-content/virtual-digital-assistants-to-overtake-world-population-by-2021.

Fessler, Leah. "We Tested Bots Like Siri And Alexa To See Who Would Stand Up To Sexual Harassment". Quartz, 2017, https://qz.com/911681/we-tested-apples-siri-amazon-echos-alexa-microsofts-cortana-and-googles-google-home-to-see-which-personal-assistant-bots-stand-up-for-themselves-in-the-face-of-sexual-harassment/.

Foster, Jonathan. "What Did We Get Ourselves Into?". Medium, 2018, https://medium.com/microsoft-design/what-did-we-get-ourselves-into-36ddae39e69b.

Haraway, Donna Jeanne. "A Cyborg Manifesto: Science, Technology, And Socialist-Feminism In The Late Twentieth Century." In Simians, Cyborgs And Women: The Reinvention Of Nature. Routledge, 1991.

I'd Blush If I Could: Closing Gender Divides In Digital Skills Through Education. 1st ed., UNESCO For The EQUALS Skills Coalition, 2019, pp. 85-134, Accessed 30 Apr 2020.

LaFrance, Adrienne. "Why Do So Many Digital Assistants Have Feminine Names?". The Atlantic, 2016, https://www.theatlantic.com/technology/archive/2016/03/why-do-so-many-digital-assistants-have-feminine-names/475884/.

Larson, Jeff, et al. "How We Analyzed the COMPAS Recidivism Algorithm". ProPublica, 2016, https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm.

Latour, Bruno. "Where Are the Missing Masses? The Sociology of a Few Mundane Artifacts". The MIT Press, 1992, p. 1, Accessed 30 Apr 2020.

Law, John. Actor Network Theory And Material Semiotics. Lancaster University, 2007, p. 2, Accessed 30 Apr 2020.

Mitchell, Wade J. et al. "Does Social Desirability Bias Favor Humans? Explicit-Implicit Evaluations Of Synthesized Speech Support A New HCI Model Of Impression Management". Computers In Human Behavior, vol 27, no. 1, 2011, pp. 402-412. Elsevier BV, doi:10.1016/j.chb.2010.09.002. Accessed 30 Apr 2020.

Nass, Clifford et al. "Are Machines Gender Neutral? Gender-Stereotypic Responses To Computers With Voices". Journal Of Applied Social Psychology, vol 27, no. 10, 1997, pp. 864-876. Wiley, doi:10.1111/j.1559-1816.1997.tb00275.x. Accessed 30 Apr 2020.

Pringle, Ramona. "Why Are All Virtual Assistants Women?". CBC News, 2017, https://www.cbc.ca/news/opinion/female-virtual-assistants-1.3937759.

Sallomi, Paul J. "Smart Speakers: Growth At A Discount". Deloitte Insights, 2018, https://www2.deloitte.com/us/en/insights/industry/technology/technology-media-and-telecom-predictions/smart-speaker-voice-computing.html.

Turkle, Sherry. "The Assault On Empathy". Behavioral Scientist, 2018, https://behavioralscientist.org/the-assault-on-empathy/. Accessed 10 June 2020.

UNESCO for EQUALS Skills Coalition. The Responses of Various Virtual Assistants to Sexual Harassment. 2019. Accessed 30 Apr 2020.


© 2020 Mindful Journal of Ethics. All rights reserved.