WASHINGTON, DC, is the seat of one of the most powerful governments in the world. It’s also home to 690,000 people and 29 obscure algorithms that influence their lives.
The city’s agencies use automated screening to evaluate housing applicants, predict criminal recidivism, detect fraud in food assistance applications, determine whether a student is likely to drop out of school, inform sentencing decisions for young people, and more.
That snapshot of semi-automated city life comes from a forthcoming report by the Electronic Privacy Information Center (EPIC). The nonprofit spent 14 months investigating the city’s use of algorithms and found them at work in 20 agencies, with more than a third deployed in policing or criminal justice. In many instances, the city’s agencies did not disclose details about how their technology worked or was used. The project team concluded that the city is likely using still more algorithms that it was unable to uncover.
EPIC dug deep into one city’s use of algorithms to show the many ways they shape residents’ lives, and to inspire people elsewhere to undertake similar audits. Ben Winters, who leads EPIC’s work on AI and human rights, says Washington was chosen in part because roughly half of the city’s residents identify as Black.
“More often than not, automated decision-making systems have disproportionate impacts on Black communities,” Winters says. The researchers found, for example, that automated traffic enforcement cameras are disproportionately placed in neighborhoods with more Black residents.
Cities with significant Black populations have recently been central to protests against municipal algorithms, particularly in policing. Detroit became an epicenter of controversy over face recognition after the wrongful arrests of Robert Williams and Michael Oliver in 2019, when algorithms misidentified both men. And in Baltimore, police use of facial recognition following the 2015 death of Freddie Gray in police custody prompted some of the first congressional inquiries into law enforcement’s use of the technology.
EPIC hunted for algorithms by looking for public disclosures from city agencies. The group also filed open records requests, asking for contracts, data sharing agreements, privacy impact assessments, and other documents. Six of twelve city agencies responded with documents, including a $295,000 contract with Pondera Systems, owned by Thomson Reuters, whose fraud detection software, known as FraudCaster, is used to screen food assistance applicants.
The report says governments can help residents better understand their use of algorithms by requiring disclosure whenever a system makes an important decision about a person’s life. Some elected officials have favored creating public registers of the automated decision-making systems governments use. Last month, lawmakers in Pennsylvania, where a screening algorithm has been accused of disadvantaging low-income parents, proposed legislation to create a registry of such algorithms.