How to Solve Homelessness With Artificial Intelligence | Rising Stars

Eric Rice walked along the Venice Beach boardwalk on a balmy February afternoon, his eyes peeled. He was searching for Jacob, a homeless young man of about 20 he’d met at Safe Place for Youth, a drop-in center where Rice has done research and volunteered. An hour later he found Jacob on the beach, passing a joint back and forth with friends. Rice told Jacob he’d been selected as a peer educator for a pilot HIV education program. Jacob had never struck Rice as an exemplary leader; he was usually high when he came by the center, and he kept his distance from adults. But Rice hadn’t picked Jacob; an algorithm had.

The machine-learning model had undercut human assumptions. When Rice pulled up to the center at 8:30 the next morning for training, Jacob was waiting with his skateboard and a cup of coffee. Throughout the project, he proved to be a crucial connector to homeless youth living in Venice, Rice says.

Rice, 46, is a social work professor and co-founder of the University of Southern California Center for Artificial Intelligence in Society (CAIS), where he and engineering professor Milind Tambe develop predictive models for public health interventions. In their 2016 pilot, now being replicated with a larger sample size in Los Angeles, algorithms analyzed social networks of homeless youth like Jacob, who opted into the study. To replace social workers’ subjective gauge of who had clout, the algorithm selected a group of “influencers” to spread information about HIV to their peers, empowering them to be the change-makers.
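To make the idea concrete, here is a minimal sketch of how peer educators might be chosen from a friendship network with a greedy influence-maximization heuristic, the general approach behind this kind of peer-led intervention. The names, ties, spread probability and simulation below are invented for illustration; this is not CAIS’s actual system or data.

```python
import random

# Hypothetical friendship network among youth at a drop-in center
# (names and ties are invented for illustration).
FRIENDS = {
    "Jacob": ["Ana", "Marco", "Tyler", "Dee"],
    "Ana":   ["Jacob", "Marco"],
    "Marco": ["Jacob", "Ana", "Dee"],
    "Tyler": ["Jacob", "Sky"],
    "Dee":   ["Jacob", "Marco", "Sky"],
    "Sky":   ["Tyler", "Dee"],
}


def simulate_spread(seeds, p=0.3, trials=500):
    """Estimate how many people hear the message if `seeds` start sharing it.

    Uses a simple independent-cascade model: each informed person gets one
    chance to pass the message to each friend with probability p.
    """
    total = 0
    for _ in range(trials):
        informed = set(seeds)
        frontier = list(seeds)
        while frontier:
            person = frontier.pop()
            for friend in FRIENDS[person]:
                if friend not in informed and random.random() < p:
                    informed.add(friend)
                    frontier.append(friend)
        total += len(informed)
    return total / trials


def pick_influencers(k=2):
    """Greedily choose k peer educators who maximize estimated spread."""
    chosen = []
    for _ in range(k):
        best = max(
            (person for person in FRIENDS if person not in chosen),
            key=lambda person: simulate_spread(chosen + [person]),
        )
        chosen.append(best)
    return chosen


if __name__ == "__main__":
    print("Suggested peer educators:", pick_influencers(k=2))
```

Greedy seed selection over a simulated cascade is a standard textbook heuristic for this family of problems; the published CAIS work is considerably more sophisticated than this toy example.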

Study interviews revealed that HIV testing rates increased in both the AI-assisted group and a comparison group, but the increase with the algorithm’s help was double that of the comparison group, and condom use rose faster as well. Rice’s team is working with the Los Angeles Homeless Services Authority to deploy AI-infused vulnerability assessments that match people with the housing and mental health interventions best suited to them, expediting the process for overloaded social workers. Rice compares the housing algorithm to Uber (optimally pairing resources with recipients) and the HIV project to Google Maps (directing resources through channels to destinations faster).
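The Uber comparison is, at bottom, an assignment problem: pair each person with the available resource that fits them best. As a loose, invented illustration (not LAHSA’s actual assessment tool, and with made-up scores and categories), a matching step could look something like this, using the Hungarian algorithm to minimize the mismatch between a person’s assessed need and a housing slot’s level of support.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical vulnerability scores (higher = more acute need) and the
# intensity of support each available housing slot provides.
people = {"A": 9, "B": 4, "C": 7}
slots = {"permanent supportive": 9, "rapid re-housing": 5, "transitional": 6}

names, scores = list(people), list(people.values())
slot_names, intensities = list(slots), list(slots.values())

# Cost matrix: mismatch between a person's need and a slot's support level.
cost = np.abs(np.subtract.outer(scores, intensities))

# Hungarian algorithm finds the assignment with the lowest total mismatch.
rows, cols = linear_sum_assignment(cost)
for r, c in zip(rows, cols):
    print(f"Person {names[r]} -> {slot_names[c]} housing")
```

In practice the hard part is the vulnerability assessment itself rather than the matching; this sketch only illustrates the “optimal pairing” framing Rice describes.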

The scope of homelessness is staggering in Rice’s city. An estimated 53,000 people experience homelessness on a given night, including 3,000 between the ages of 13 and 24, though experts say these figures are undercounts. Los Angeles spends more than $100 million annually to house 15,000 people. “Even if we improve the efficiency [of the system] by 5 or 10 percent, that’s thousands of more lives,” Rice says. Over the past decade, gentrification has pushed homelessness into sight as people spread from Skid Row to highway underpasses, boosting political will to address the problem. “Everybody [in LA] feels like they have a stake in it,” Mischa DiBattiste, a Safe Place for Youth drop-in manager, says over afternoon chatter at the center. Still, “not in my backyard” attitudes often prevail.

For the first time, computer science and social work are merging to battle these complex problems. Just like “bacon-wrapped dates,” this pairing sounds counterintuitive, Rice says with a smile — but it works. “This is a pioneering phase for social work and AI,” says Desmond Patton, a professor at Columbia University who uses natural language processing algorithms to analyze gang violence. “I don’t think it’s a space yet.”

The irony of leveraging technology to deepen human interactions isn’t lost on Rice. It’s always been essential for him to put “the person to the number” and build trust with youth, says Heather Carmichael, executive director of My Friend’s Place, a homeless service provider where Rice has volunteered and done research. “He did not stay in that ivory tower and make assumptions about who young people were or what they needed.”

Rice grew up in western New York — his father taught social science at SUNY Buffalo — and he’d wanted to become an academic since age 13. At the same time, Eric’s brother, Brian Rice, describes a prevailing “do-gooder gene” on both sides of the family: Many relatives were nurses, firefighters and public servants.

Just before Rice headed to the University of Chicago to study sociology, his father took his own life. The trauma not only shaped the fabric of the family but also colored Rice’s years-long struggle to identify a calling in which he could find meaning. He had sensed his father’s regret at never having done “something that mattered,” as his fireman brother had. Rice studied social network theory at Stanford University, earning a master’s and a Ph.D. in 2002. A mentor, Oscar Grusky, told Rice that pursuing HIV prevention work guaranteed he would “never go home at the end of the day and think, ‘What am I doing with my life?’” Rice was sold.

Rice, a music buff who takes weekly guitar lessons and boasts a 6,000-strong record collection, takes on an intense, restless energy whenever he talks about homelessness. “I’ve been thinking about this for the last 15 years,” he tells me. Sitting beneath palm trees and a placid blue sky in Venice, he’s constantly in motion — picking grass beside his brown leather shoes, fiddling with his plaid scarf — as he talks quickly about systems failing homeless youth. A tent, with a young person sitting next to it, flutters in the breeze not far from where we sit. His blue eyes light up when he mentions Jessica, a lesbian who cycled through foster care and whose life is far steadier now than when they met 12 years ago.

But there are inherent concerns when AI makes choices for marginalized communities. Those designing algorithms must scrutinize the AI for bias and be wary of blindly trusting the output, Patton says. While social workers have strict codes of ethics, AI deployment remains essentially unregulated. As a cautionary tale, Rice highlights algorithmic risk assessments used in courts to determine defendants’ bonds and predict the likelihood of re-offending, which have inadvertently encoded racial bias.