Class action lawsuit on AI-related discrimination reaches final settlement

Mary Louis’ excitement to move into an apartment in Massachusetts in the spring of 2021 turned to dismay when Louis, a Black woman, received an email stating that an “outside agency” had denied her tenancy.

That third-party service relied on an algorithm designed to score rental applicants, which became the subject of a class action lawsuit, with Louis as a lead plaintiff, alleging that the algorithm discriminated based on race and income.

A federal judge on Wednesday approved a settlement in the lawsuit, one of the first of its kind, with the company behind the algorithm agreeing to pay more than $2.2 million and roll back the parts of its screening products that the lawsuit claimed were discriminatory.

The settlement does not include an admission of guilt by the company SafeRent Solutions, which said in a statement that while it “continues to believe that the SRS scores comply with all applicable laws, litigation is time-consuming and expensive.”

While such lawsuits may be relatively new, the use of algorithms or artificial intelligence programs to screen or score Americans is not. For years, AI has quietly helped make consequential decisions for U.S. residents.

When someone submits a job application, applies for a mortgage, or even seeks certain medical care, there’s a chance that an AI system or algorithm will score or rate them, as one did Louis. However, these AI systems are largely unregulated, even though some have been found to be discriminatory.

“Management companies and landlords need to know that they are now on notice, that these systems that they assume are reliable and good are going to be challenged,” said Todd Kaplan, one of Louis’ attorneys.

The lawsuit alleged that SafeRent’s algorithm failed to take into account the benefits of housing vouchers, which the plaintiffs said was an important detail in a renter’s ability to pay the monthly bill, and therefore discriminated against low-income applicants who were eligible for the aid.

The lawsuit also accused SafeRent’s algorithm of relying too heavily on credit information. The plaintiffs argued that credit history fails to provide a complete picture of an applicant’s ability to pay rent on time, and that the algorithm unfairly disadvantages applicants with housing vouchers who are Black and Hispanic, in part because they have lower average credit scores, attributable to historical inequality.

Christine Webber, one of the plaintiffs’ attorneys, said that even if an algorithm or AI is not programmed to discriminate, the data an algorithm uses or weighs “can have the same effect as if you told it to intentionally discriminate.”

When Louis’ application was denied, she attempted to appeal the decision, sending two landlord references to show that she had paid rent early or on time for 16 years, even though she did not have a strong credit history.

Louis, who had a housing voucher, was scrambling: she had already informed her previous landlord that she was moving out, and she was caring for her granddaughter.

The response from the management company, which used SafeRent’s screening service, said: “We will not accept an appeal and cannot overrule the outcome of the tenant screening.”

Louis felt defeated; the algorithm didn’t know her, she said.

“Everything is based on numbers. You don’t get the individual empathy from them,” Louis said. “The system cannot be defeated. The system will always beat us.”

While state lawmakers have proposed aggressive regulations for these types of AI systems, the proposals have largely failed to gain enough support. That means lawsuits like Louis’ are starting to lay the groundwork for AI accountability.

SafeRent’s attorneys argued in a motion to dismiss that the company should not be held liable for discrimination because SafeRent did not make the final decision on whether to accept a tenant. The agency would screen applicants, give them a score and submit a report, but leave it up to landlords or management companies to accept or deny a tenant.

Louis’ attorneys, along with the U.S. Department of Justice, which filed a statement of interest in the case, argued that SafeRent’s algorithm can be held accountable because it continues to play a role in housing access. The judge rejected SafeRent’s request for dismissal on these points.

The settlement stipulates that SafeRent may no longer include its scoring feature in tenant screening reports in certain cases, including when the applicant is using a housing voucher. It also requires that if SafeRent develops a new screening score it plans to use, the score must be validated by a third party that the plaintiffs agree to.

Louis’ son found her an affordable apartment on Facebook Marketplace, and she has since moved in, though it was $200 more expensive and in a less desirable area.

“I’m not optimistic that I’m going to catch a break, but I have to keep going, that’s all,” Louis said. “I have too many people relying on me.”

___

Jesse Bedayn is a staff member for the Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.