Teresa Hodge knows what it’s like to face the dreaded box.

Several years ago she was online applying for a job, and after entering a few personal details like her name and address, a check box popped up asking: “Have you ever been convicted of a crime?”

Having recently served a 70-month prison sentence for a white-collar conviction and not wanting to lie, Hodge clicked “yes.”

“I recall like it was yesterday; just taking the deep breath, checking the box. And the moment I did that, the screen went black,” said Hodge, who, along with her daughter and business partner Laurin Leonard, was recently named a Technology and Human Rights Fellow at the Carr Center for Human Rights Policy. “And then it said: ‘Something you said has disqualified you for this opportunity.’ Well, it was really obvious what had happened.”

In the seven years since her release, Hodge and Leonard have been on a mission to infuse human rights and human considerations into the space where technology meets people who have been incarcerated. As part of the initial cohort of Technology and Human Rights Fellows, Hodge and Leonard are writing a discussion paper touching on artificial intelligence, human rights, criminal background checks, and what they call “algorithmic justice.”

The two women say one in three Americans, approximately 70 million people, currently have an arrest record, a conviction record, or both. By the year 2030, that number is expected to reach 100 million. And those people invariably find that the algorithm-driven measurements used by banks, educational institutions, and potential employers to gauge their reliability—credit scores and criminal background checks—are not their friends.

“Algorithms have been used to deny bail. They've been used in the courtrooms to validate long sentences,” Hodge said in a recent interview. “In general, algorithms have been used in a very punitive and harsh way.”

Because people (often people of color) can’t escape these electronic assessments, they get trapped in a cycle that cuts off any upward mobility.

“What we're finding is people are relegated to a social status that typically will result in significantly lower income over the lifetime, even high rates of homelessness, and then also being locked out of educational opportunities, which really impinges on your ability to manage your life,” Leonard said.

Hodge and Leonard’s first venture was Mission: Launch, Inc. and LaunchPad, a business and leadership development accelerator for entrepreneurs living with criminal convictions. The venture has supported the formation of worker-owned cooperatives, sole proprietorships, and limited liability corporations throughout the Maryland and Washington, D.C., region.

But they soon saw that in order to attack a tech problem, they needed a tech solution. So they found a software engineer and created their second venture, R3 Score, which helps formerly incarcerated entrepreneurs navigate around a standard background check, especially when seeking financial products and services.

Hodge and Leonard said R3 Score uses its own proprietary algorithm and expanded inputs to present a more complete, holistic view of a job or loan applicant. It combines credit score and criminal background check information with the results of an online questionnaire that asks users about things like additional education, volunteer work, and community involvement.

“An individual is able to add in things that they may have done positively to move their lives forward, and there's just really no other way to glean that information,” Leonard said of the questionnaire, which takes about 45 minutes to complete.

After all the inputs are entered, the R3 algorithm calculates a score ranging from 300, for someone who is not a good risk, to 850, for someone who is. The tool is not only beneficial for applicants, the two women say, but also for banks and employers, who can now feel comfortable doing business with people they might have rejected before.

“Banks are losing out and they know there's money being left on the table and that there are borrowers that they could be lending to if they had a little bit more information,” Hodge said. “And on the employer side, certain industries are experiencing workforce shortages and they can't afford to not look at a third of our country for potential employees.”

Hodge and Leonard say they hope that by the year 2030, R3 Score will have become the “gold standard” for evaluating people who have been arrested or incarcerated. But they also say that one of the most important things they’ve already done is to bring diversity and a human rights perspective to the process of how algorithms are created.

“More of us who are proximate to these problems need to write algorithms,” Hodge said. “They can’t all be written by the same people or the same type of people.”

Photo by Robert Galbraith