Alexa, Why Are You Not Alexandro? Reinforcing Cognitive Biases Through Technology
This essay adopts a post-phenomenological approach grounded in I-technology-world relations to analyse gender biases in technology, focusing on feminised personal assistants such as Alexa, Siri, and Cortana. These systems, designed to mimic human interaction, often reinforce stereotypes by associating femininity with obedience and care. This perpetuation of stereotypes poses a dual problem: it reinforces existing biases in human cognition and contributes to their normalisation through everyday technological interaction.