Highlighting AI Bias

On Monday, IBM made a monumental announcement: the company is getting out of the facial recognition business, citing racial justice concerns and the need for legal oversight.

Racial and gender bias is at the heart of Shalini Kantayya’s documentary “Coded Bias,” which investigates the corporate and societal implications of machine-learning systems left unchecked. The film focuses on the work of several female mathematicians and data scientists — “outsiders,” by way of gender, race or sexuality. IBM’s decision was influenced by research presented by Joy Buolamwini and co-researchers Deborah Raji and Timnit Gebru, all of whom are featured in the film.

Buolamwini, a Black Ph.D. student in MIT’s Media Lab, is the central character of the documentary; her research focuses on bias in facial recognition and artificial intelligence. She founded the Algorithmic Justice League, which pushes for greater oversight of the large companies that use data to build AI systems with outsized influence.

“The proof of this technology being racist was done by Black women, and done by the Black women in my film. I’m really proud of them for making real change,” Kantayya says. “I’m cheering on IBM, and I’m challenging Amazon to do the same. Amazon has made statements about standing up against racism, while it continues to market and sell technology that is proven to be racist. [Late Wednesday, Amazon announced that it will ban police use of its facial recognition software for one year.] Until we know that this technology is unbiased, until we know that it’s fair, until there are public policies in place to guard against abuse of this technology, we need to press pause on facial recognition.”

The documentary premiered at Sundance and will screen this week online as part of the Human Rights Watch Film Festival and next week with the AFI Docs Film Festival. Kantayya teases a forthcoming announcement about wider distribution. “Coded Bias” is the Brooklyn-based filmmaker’s second feature documentary; her first, “Catching the Sun,” which was executive produced by Leonardo DiCaprio, looked at the clean energy movement.

Kantayya’s documentary got a boost from its Sundance premiere, where it was well received by film and tech insiders. “I had someone who worked at Google say to me, ‘We’ve been having this conversation among ourselves, and you made a film that is a conversation that we can have with everyone.’ And I thought that was a really strong support of the film,” she says.

“Coded Bias” makes the argument that math is being used as a shield for deceptive practices in machine learning, and questions blind faith in big data. At the same time, many of the data sets used to feed machine learning systems contain biases that exist in society. The issues highlighted in the film underscore the need for transparency and checks for accuracy and bias. Because the systems are opaque, it is difficult to discern whether someone has been the victim of systemic discrimination via algorithmic bias.

“We’re living in an age where if you say something, and a computer says something, you’re obviously wrong,” Kantayya says. “It’s this blind faith in big technology and the invisible automated decision-makers that are making massive decisions about who gets opportunities in life, that is deeply troubling.”

She presents algorithmic justice as a human rights issue, and as ground zero in the battle for civil rights and democracy in the 21st century. At the same time, there is a lack of public understanding. The implications of machine learning range from which products and services are targeted to whom — aligned with standard marketing practices — to more troubling applications, such as using biometric data to create unique, individual profiles.

Left unchecked, systems created from data have the potential to automatically eliminate certain demographics from job searches or college admissions, determine health coverage, and make troubling criminal justice recommendations. In one example, Kantayya highlights how an AI recruiting tool used by Amazon was biased against women, penalizing graduates of women’s colleges and résumés that mentioned female-centric clubs. (The tool has reportedly since been scrapped.)

“It’s going to roll back all of the civil rights advances that we made over 50 years,” she says. “I came to see how algorithmic justice dovetails with basic freedoms we’ve been guaranteed in the Constitution: the right to assembly, the freedom to associate. And so if we want to hold on to our civil rights and our democracy, we really have to empower ourselves around these issues.”

The film prominently features “Weapons of Math Destruction” author and data scientist Cathy O’Neil and Big Brother Watch director Silkie Carlo, whose organization is monitoring how the police are testing facial recognition on the streets of London to identify suspects — despite high rates of inaccuracy. In one scene, the film captures the aftermath of a 14-year-old Black boy in a school uniform being wrongfully stopped by the police. The boy, one of several thousand mistakenly stopped because of the technology, is visibly confused as a Big Brother Watch activist attempts to explain why five plainclothes police officers stopped him.

“If you live in New York, you know what the impact of stop and frisk has been on communities of color,” Kantayya says.

The film also presents China’s widespread use of biometric facial recognition and individual “social scores” in everyday life. The technology can be used to identify and jail protesters; in late 2019, antigovernment protesters in Hong Kong donned masks and destroyed CCTV cameras. With large numbers of protesters taking to the streets Stateside this month to speak out against police brutality, conversations around how facial recognition is used — and by whom — are more urgent than ever.

More than 117 million American faces are already included in police databases, which are being used to develop machine-learning systems for the police, the FBI and ICE — so far, with no government oversight.

Later in the film, Buolamwini and her team present this information, along with their research, to politicians including Rep. Alexandria Ocasio-Cortez, D-N.Y., and Rep. Jim Jordan, R-Ohio, a leading figure in the Republican Party, during a congressional hearing. Jordan expressed particular concern about the fact that American faces from police databases are being used with no government oversight.

“He was as terrified as any Democrat, which was crazy,” Kantayya says. “We need legislators who understand these systems so we can govern them. Just a few companies having an outsized amount of power in a society — that’s not democratic.”

As for what she hopes viewers will take away from her film, Kantayya identifies an easy starting point: questioning the technology they use every day.

“I hope that people will start to question this blind faith we have in technological systems,” she says. “And peel away that magic and see that technology is only as good as the human in it.”
