By Shannon Horning

Photo by Bill Stafford, James Blair, Regan Geeseman

To many, robots and AI seem to transcend human design. People believe these advanced models can generate knowledge beyond human understanding, pulling world-altering solutions from thin air like a magician pulling a rabbit out of a hat. While robots and AI are indeed great for automating tasks and increasing efficiency, the technology is just as susceptible to bias and prejudice as humans are. At the end of the day, robots are products of human creation, and they inherently reflect the knowledge and biases of their creators.

This topic is what Jennifer Robertson, a professor of anthropology and art history at the University of Michigan, discussed in her lecture “Robo-Sexism.” On Friday, Feb. 16, Robertson stood in front of the Posner Hall Grand Room and talked about how humanoid robots reinforce gender stereotypes through their human-engineered design.

According to Robertson, only about 10 percent of robots are humanoid. The other 90 percent are industrial, used across a wide range of manufacturing industries to automate production. However, Robertson said that the robots that are humanoid "blatantly reinforce male and female design."

Take ASIMO, for example. ASIMO is a humanoid robot created by Honda in 2000 to serve as the company's brand ambassador. Its design leans toward the masculine side — blocky, with a relatively straight figure.

Then, there is HRP-4C, also known as "Miim." Miim was created in 2010 by the National Institute of Advanced Industrial Science and Technology (AIST), a Japanese research lab. Miim's design is strikingly more feminine than ASIMO's. Her face resembles that of a human woman rather than a simple screen. The "skin" on her face, like that on her hands, is made of silicone, and she sports a pageboy hairstyle. The rest of her body is fashioned from silver and plastic and is much more anatomically realistic than ASIMO's. While ASIMO has a straight figure, Miim's is much curvier.

The designs of humanoid robots reflect gender stereotypes not only in appearance but also in behavior. "Some of these robots are even programmed with centralized notions of female and male behaviors," Robertson said. For instance, when ASIMO runs, its shoulders hunch and its knees stay bent. When Miim runs, by contrast, her hips sway and her arms stay at her sides. As Robertson described it, it is almost like a catwalk.

The biases of roboticists are also reflected in other ways. The cover of a 2014 issue of the Journal of the Japanese Society for Artificial Intelligence showed a robot holding a broom and a book, cleaning the room around it. The robot was clearly depicted as a woman, drawn in the Japanese anime art style. The publishers faced public backlash, with many questioning why the cover depicted a robot with a feminine design instead of a male or even a gender-neutral one. The artistic choice only seemed to reinforce the association between women and household chores, even within the realm of robotics and artificial intelligence.

But why do these biases and prejudices emerge when genderless robots seem to be the obvious solution? Robertson argued that it’s a result of a trend in industrial design called kansei kogaku, or “affective engineering.” 

“Kansei kogaku is a consumer-centered approach to design developed in the 1970s by the now-retired engineer Nagamachi Mitsuo,” Robertson said. “This is a type of engineering that seeks to design products perceived by users as familiar, friendly, safe, desirable, cute — ‘kawaii’ — unthreatening, and so forth.”

The parameters for designs that elicit these safe feelings in consumers are derived from surveys completed by consumers themselves. However, every consumer carries their own presumptions regarding race, sex-gender systems, class divisions, and religion, and their answers and preferences reflect those biases.

“Consumers’ presumptions, together with those of the roboticists, can be exaggerated in the design and programming of robots,” Robertson said. As a result, the designs of these humanoid robots reinforce and perpetuate norms about what is deemed appropriate for women, men, and even gender-diverse people.

“AI does not create bias, it reflects bias and prejudices of its programmers,” Robertson reminded the audience. “There’s nothing ‘artificial’ about artificial intelligence. It’s inspired by people, created by people, and has an impact on people.”

How, then, do we design more progressive technologies? At the end of her lecture, Robertson offered Science, Technology, Engineering, Art, and Math (STEAM, in contrast to STEM) as a solution. She proposed introducing the arts and humanities as intellectual tools for questioning robo-sexism, bringing a more nuanced understanding of culture, ethics, and social dynamics to the table. By adopting a more humanistic framework, we can start to deconstruct the biases and stereotypes embedded in the design and programmed behavior of humanoid robots, she said. “We have a lot of work to do,” Robertson concluded.
