SafeRent Solutions, an AI-powered tenant screening company, settled a lawsuit alleging that its algorithm disproportionately discriminated against Black and Hispanic renters and those relying on housing vouchers.
Tenant Mary Louis filed the suit after being denied housing on the basis of a SafeRent score. According to the suit, SafeRent’s scoring system, which landlords use to assess tenant eligibility, assigned systematically lower scores to members of these groups. This practice effectively blocked many qualified applicants, like Louis, from securing housing, perpetuating cycles of exclusion and inequality.
The lawsuit spotlights how racism in AI can amplify existing societal inequities. SafeRent’s algorithm, designed to evaluate financial and personal data, penalised applicants for using housing vouchers—a legal form of income assistance—while favouring applicants from wealthier demographics. Such systems often reinforce systemic racism, as historical disparities in income, credit, and housing stability disproportionately affect Black and Brown communities. The $2.3 million settlement now requires SafeRent to cease using its scoring system for voucher holders and to reform its screening methods to promote equity.
This case underscores a critical lesson: as the housing crisis deepens and more landlords turn to technology for tenant selection, stronger regulations and ethical oversight are needed to prevent algorithms from exacerbating inequality. SafeRent’s settlement is a step forward, but it is also a reminder that vigilance and counteraction remain necessary — and Mary Louis provided exactly that.
See: “She didn’t get an apartment because of an AI-generated score – and sued to help others avoid the same fate” at The Guardian.
Image from the original article at The Guardian.