
AJL MID YEAR UPDATE
Advocacy | Research | Art


Dear Champions of AJL, 

The fight for Algorithmic Justice continues, and 2019 has so far been a year of advancement for the cause. Through advocacy, research, and art, we continue to raise awareness of the harms of AI while pushing for protections.

Currently, the Algorithmic Justice League is preparing to testify at the House Science Committee's congressional hearing on "Artificial Intelligence: Societal and Ethical Implications" on June 26 at 10:00 a.m. ET. Tune in for live coverage.

See below for AJL's 2019 updates thus far!

All the best,
The AJL Team

ADVOCACY
US House Committee on Oversight and Reform, May 2019
"Facial Recognition Technology: Its Impact on our Civil Rights and Liberties"
Alexandria Ocasio-Cortez & Joy Buolamwini tackle racial bias in AI facial recognition software in Congress. Watch here.
ACLU of MA: Press Pause on Face Surveillance

AJL participated in the launch of the ACLU of Massachusetts' "Press Pause on Face Surveillance" campaign, which seeks to educate the public about the civil liberties concerns posed by face surveillance technology and the need to pass a statewide moratorium on the government's use of the technology. The Boston Herald reports, "Nine in 10 Massachusetts voters think the state should regulate government use of face surveillance technology."

Amicus Letter: Tenants Fight Against Facial Recognition Technology

AJL writes an amicus support letter, standing with Brooklyn tenants who are fighting back against a facial recognition entry system. CityLab reports, "The landlord of a rent-stabilized apartment in Brooklyn wants to install a facial recognition security system, sparking a debate about privacy and surveillance."
RESEARCH
Raji, I. D., & Buolamwini, J. (2019). Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products. Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society.

AJL Bias Study Makes the Front Page of the NYT Business Section, Again.


"Now a new study from researchers at the M.I.T. Media Lab has found that Amazon’s system, Rekognition, had much more difficulty in telling the gender of female faces and of darker-skinned faces in photos than similar services from IBM and Microsoft. The results raise questions about potential bias that could hamper Amazon’s drive to popularize the technology." -New York Times
 
New York Times: AI Experts Question Amazon
Medium: AJL Response to Amazon Attacks
Bloomberg: Amazon Schooled on AI Facial Technology

More Research Highlights:
Congratulations to Deborah Raji, who was the lead investigator on our latest test of facial analysis technologies from Amazon, IBM, Microsoft, Face++, and Kairos.
ART
AJL in the Barbican's "AI: More Than Human" exhibition in London

The Barbican exhibition "AI: More Than Human" (May 16 to August 26, 2019) is a festival-style show exploring creative and scientific developments in AI. The curators asked to display AJL projects on bias in facial recognition systems, including "Gender Shades" and "AI, Ain't I A Woman?"

Select AJL Exhibitions:
Conversation with Darren Walker (Ford Foundation), Dr. Latanya Sweeney (Harvard), and Joy Buolamwini (AJL) at the Radcliffe Institute's Vision & Justice Convening.
2019 HIGHLIGHTS
TIME Magazine's Optimist Edition, edited by Ava DuVernay, features the AJL op-ed "Artificial Intelligence Has a Problem With Gender and Racial Bias. Here’s How to Solve It".
NBC's Stephanie Ruhle explores AJL research, why it matters, and what can be done about algorithmic bias.

Doha Debates - Artificial Intelligence
3 million viewers tuned in for a lively debate on the future of AI.
Watch the full debate or review the debate recap.

D4BL II: the second Data for Black Lives conference, held at the MIT Media Lab. AJL participated in the "What Is a Movement Scientist?" panel.

GET INVOLVED
Increase Awareness
Report Bias
Request Bias Check
Copyright © 2019 Algorithmic Justice League, All rights reserved.
