The American Civil Liberties Union (ACLU) has uncovered significant racial bias in Amazon's facial recognition technology, Rekognition, highlighting a concern that could affect civil rights and ethical standards in technology. In its evaluation, the ACLU found that the system erroneously matched individuals with darker skin tones to criminal records more frequently than their lighter-skinned counterparts. The concern is amplified when such biased systems are deployed in law enforcement, where they could effectively automate racial profiling. Previous research, including a 2010 study by NIST and the University of Texas at Dallas, has documented similar demographic disparities in facial recognition accuracy.