Since its earliest modern iterations, artificial intelligence (AI) has been – unfortunately and possibly mistakenly – linked to gender. Even though AI has been theorised about since the Ancient Greeks (you can find a timeline of AI here), it may have been Alan Turing’s conceptualisation of a test to ascertain a machine’s intelligence (now known as the Turing test) that forged this link (Halberstam 1991). To conduct the Turing test, a judge communicates with a man and a machine in writing, without ever coming into contact with either subject; the machine passes if it is indiscernible from the man. The issue with this test is that Turing modelled it on a game in which the judge distinguishes a man from a woman, erroneously treating gender as an intrinsic value in a human (based on anatomy alone).
In our postfeminist context, we know that gender is a complex spectrum arising from a combination of brain structure, genetics, sex hormones, genitals and, most importantly, societal conditioning. “Turing does not stress the obvious connection between gender and computer intelligence: both are in fact imitative systems” (Halberstam 1991). We now know that gender is constructed and reconstructed over time. If gender should apply to AI at all, it would present itself as a product of its programmers’ individual gender practice rather than as something innate to the machine.
Instances of AI already surround us in everyday life, the most easily recognisable of which are the personal assistant software in smartphones, tablets and computers (Siri, Cortana, and now Google Assistant). Each of these has a female voice as its default setting. In a discussion of the many feminine-named assistants, Dennis Mortensen, founder of x.ai, has said that we take orders better from a female voice than from a male one. This trend continues in Microsoft’s endeavours to create AI bots on Twitter, most notably the “teen girl” conversation bot, Tay.
Bots and smartphone apps are both examples of weak AI – AI that simulates human intelligence by executing the simplest version of a task. In this podcast about Tay’s rapid corruption into racist Tweets, Alex Hern refers to Microsoft’s previous app Fetch!, which identifies dog breeds from pictures – any picture at all; it need not include an actual dog. Based on this understanding of weak AI, I can only assume female voices are programmed in order to make the apps and bots more palatable and appealing. However, this can only be described as “machines in drag”, with very little positive effect on intersectional feminism in society today (Robbins 2016).
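To make the “weak AI” idea concrete, here is a toy sketch of an assistant that handles one narrow task by pattern matching, with no real understanding. This is purely illustrative code of my own, not how Siri, Cortana or Fetch! actually work:

```python
# A toy "weak AI" assistant: it executes the simplest version of one task
# (answering a few fixed questions) by keyword matching.
# Hypothetical example, not taken from any real product.

RESPONSES = {
    "weather": "It looks sunny today!",
    "time": "It's time to get a watch.",
}

def weak_assistant(utterance: str) -> str:
    """Return a canned reply if a known keyword appears, else a fallback."""
    lowered = utterance.lower()
    for keyword, reply in RESPONSES.items():
        if keyword in lowered:
            return reply
    # Like Fetch! labelling any picture with a dog breed, the bot always
    # produces an answer, whether or not it actually applies.
    return "I'm not sure, but I'm happy to help!"

print(weak_assistant("What's the weather like?"))
print(weak_assistant("Explain gender theory."))
```

Nothing here has, or needs, a gender; the female voice layered on top of systems like this is a design choice, not a property of the technique.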
Due to the close association of the machine with military intelligence (one of the first iterations of the computer was, after all, developed by Turing in WWII in response to the Nazis’ Enigma), “computer technology is in many ways the progeny of war in the modern age” (Halberstam 1991). The prospect of weaponised autonomous AI becoming a threat led to a gendering of the technology as female. Feminist theory sees the female as Other in comparison to the male, in the same way that, even in the Turing test, technology is othered. Andreas Huyssen identifies writers at the heart of this imagining of technology as a female harbinger of destruction (cited in Halberstam 1991).
Halberstam, J 1991, ‘Automating Gender: Postmodern Feminism in the Age of the Intelligent Machine’, Feminist Studies, vol. 17, no. 3, pp. 439–460.
Haraway, D 1991, ‘A Cyborg Manifesto: Science, technology and socialist-feminism in the late twentieth century’, in Simians, Cyborgs and Women: The Reinvention of Nature, Free Association, London.
Robbins, M 2016, ‘Is BB-8 a Woman? Why are we so determined to assign gender to AI?’, The Guardian, 12 February, viewed 6 April, <https://www.theguardian.com/science/the-lay-scientist/2016/feb/12/is-bb-8-a-woman-artificial-intelligence-gender-identity>.
Twitter is all a-flutter about Tay, the racist lady-AI from Microsoft who was taken offline less than a day after her launch. According to her makers, “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.” Unfortunately, this makes her extremely easy to manipulate, and she was quickly transformed into a genocide-loving racist.
Tay is an example of a phenomenon in AI theory: the emergence of a gendered AI.
AI has been described as the mimicking of human intelligence to different degrees: ‘strong AI’ attempts to recreate every aspect, at far greater cost in money, resources and time, while ‘weak AI’ focuses on a specific aspect. Tay, a female AI targeted at 18–24-year-olds in the US, is very much about communicating with Millennials. In my previous posts, I’ve mentioned a number of AI representations in the media, all of which are gendered, usually as female. Dalke and Blankenship point out that “Some AI works to imitate aspects of human intelligence not related to gender, although the very method of their knowing may still be gendered.”
They go on to suggest that the Turing Test “arose from a gendered consideration, not a technological one”: in Turing’s original paper proposing the test, the examiner tries to determine the difference between a man and a woman, and the same differentiation process is then applied to humans and AI.
If AI is gendered, then researchers are proposing that there is an algorithm for gender, which in our postfeminist context seems to oversimplify the issue. Gender is entirely constructed, and it would be constructed on the part of the AI during its development, in the same way that humans construct and reconstruct their own gender in tandem with their identity.
Tay is a glorified bot that responds to specific stimuli. Or perhaps it’s the other way around – AI in general is a glorified bot designed to respond to stimuli and learn from them.
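The “learn from stimuli” part is exactly what made Tay so manipulable. A minimal sketch of the idea (my own illustration; Microsoft’s actual system was far more sophisticated):

```python
# A toy "learning" chatbot in the spirit of Tay: it absorbs phrases it is
# fed and parrots them back, with no filter on what it learns.
# Purely hypothetical example code.

import random

class ParrotBot:
    def __init__(self):
        self.learned = ["Hello!"]

    def chat(self, message: str) -> str:
        # "The more you chat ... the smarter she gets": every incoming
        # message joins the pool of possible replies, unvetted.
        self.learned.append(message)
        return random.choice(self.learned)

bot = ParrotBot()
bot.chat("Say something terrible.")
# After enough coordinated input, the bot's replies are dominated by
# whatever its loudest users fed it, which is the manipulation Tay suffered.
```

The design flaw is visible in two lines: the bot treats every stimulus as trustworthy training data, so its “gender” and its “opinions” alike are whatever its interlocutors construct for it.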