Guidance on AI and Data Protection
The Children’s code design guidance | ICO
Re-examining Whether, Why, and How Human-AI Interaction Is Uniquely Difficult to Design
Home Office algorithm to detect sham marriages may contain built-in discrimination
Persistent Anti-Muslim Bias in Large Language Models
Voice Recognition Still Has Significant Race and Gender Biases
Data, Tech & Black Communities
One tech worker explained: “Obviously we do have teen users, we assume, but we don’t collect age data about people.” The report calls this abdication of responsibility “strategic ignorance”: by carefully choosing what they don’t know about their users, the research suggests, tech companies fail to protect the health and well-being of adolescents on their platforms.
AIAAIC - AI, Algorithmic, Automation Incident and Controversy Repository
219 examples of discrimination in AI products/services