Lanon Wee

Amazon Alexa Fails to Answer Query About the Lionesses' World Cup Win.

Amazon has been accused of sexism after its voice assistant, Alexa, was unable to answer a question about the Lionesses' semi-final victory at the Women's World Cup. Asked on Wednesday for the result of that day's England-Australia football match, Alexa said no such game was taking place. An Amazon spokesperson said the error had since been fixed. Joanne Rodda, the academic who brought the issue to the BBC's attention, said the incident showed how sexism in football had found its way into Alexa. Dr Rodda, a senior lecturer in psychiatry at Kent and Medway Medical School with a special interest in artificial intelligence (AI), said she could only get an answer from Alexa when she asked specifically about women's football. She told the BBC that when she asked Alexa about the England-Australia women's football match that day, it gave her the result. The BBC was able to replicate her findings with Alexa. Dr Rodda said it was "regrettable" that, almost a decade after Alexa's launch, the AI algorithm had only now been "fixed" to recognise the women's World Cup as "football".

Amazon told the BBC that when someone asks Alexa a question, information is drawn from a variety of sources, including Amazon, licensed content providers and websites. The company said its automated systems, which use AI to understand context and retrieve the most relevant information, got it wrong in this case. It said it expected those systems to improve over time, and that dedicated teams work to help prevent similar situations in future.

Dr Rodda questioned how far the problem had really been fixed, saying she had since found similar issues with the Women's Super League. Out of curiosity, she said, she asked Alexa who Arsenal were playing in October; it replied with information about the men's team, and when she asked about the women's team it had no answer.

The episode highlights concerns about bias becoming embedded in the systems run by the booming AI industry. There have been warnings that the rapid development of AI could threaten humanity's future, yet Margrethe Vestager, the EU's competition chief, has said it is more worrying that AI could entrench existing biases. Because an AI system is only as accurate as the data it is trained on, developers need to choose datasets that are comprehensive, but that is not always done. A further problem is that once bias is baked in, it can be hard to untrain; one option is to start again from scratch, which companies may be reluctant to do given the high cost of building AI in the first place. Slipping through the gaps of an algorithm could become a far more serious problem as AI systems decide not only what we see and hear, but also how much we pay for things such as car insurance, and perhaps even what medical treatment we receive.


