Androcentrism is the practice, conscious or otherwise, of placing a masculine point of view at the center of one's world view, culture, and history, thereby culturally marginalizing femininity. Voice assistant products in real life (reality prime) and artificial intelligence characters in science fiction suggest we carry a bias in how we assign gender to robots. These stereotypes affect our relationships with real people as much as they shape our voiced products.
How do we design bots and voice assistants without a default gender, and without casting male voices as authoritative and female voices as helpful and accommodating? This raises concepts like the Smurfette Principle, where a character's most important and interesting quality is her femaleness; the Bechdel Test, which measures female presence in fictional media; and the tendency for characters without an apparent gender to default to "socially male" (R2D2).
Hey, Human is a chatbot that interrupts your daily routine with small assignments. It encourages users to be more active and to rethink the way we design bots and voice assistants.
iPhone users can now opt in to daily reminders from Siri aimed at improving their wellbeing.
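A reminder bot of this kind can be sketched as a simple scheduling rule: during waking hours, pick an assignment to interrupt the user with; otherwise stay quiet. This is a minimal illustration only — the assignment texts and the 9:00–21:00 window below are invented placeholders, not the project's actual prompts or schedule:

```python
import datetime
import random
from typing import Optional

# Hypothetical assignment pool; Hey, Human's real prompts are not listed here.
ASSIGNMENTS = [
    "Stand up and stretch for one minute.",
    "Write down one assumption you made about a voice today.",
    "Ask your voice assistant a question and note how its gender shapes your expectations.",
]

def pick_assignment(now: datetime.datetime, rng: random.Random) -> Optional[str]:
    """Return a random assignment during waking hours (9:00-21:00), else None."""
    if 9 <= now.hour < 21:
        return rng.choice(ASSIGNMENTS)
    return None

# Usage: a fixed seed makes the choice reproducible for testing.
rng = random.Random(0)
print(pick_assignment(datetime.datetime(2024, 1, 1, 10, 30), rng))  # an assignment
print(pick_assignment(datetime.datetime(2024, 1, 1, 3, 0), rng))    # None (night)
```

A real deployment would run this check on a timer (or via push notifications) rather than on demand, but the interrupt-with-a-prompt logic stays the same.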
Defines the concept and its use in everyday life.
Looks at the effects of media and language on gender constructs and representation.
Explores the concept of social gender and robots.
Analyzes data to answer questions like “Are female AIs more subservient than male AIs?”
A test created by Alison Bechdel for female presence in fictional media.
Questions why a character’s only defining quality would be their femaleness.
If we are giving human qualities to AI systems, we should at least define a hierarchy of needs for them as an infrastructure to design from.
When the algorithms we design for AI are able to pass as human consciousness, will they adopt a new role as our cohabitants?
Designing a better future requires building worlds that are unrealistic but not undesirable.