Government’s ‘guardrails’ on AI not enough to keep us safe

By Peter McDonald, Project Lead, Data for Good Partnership at the Centre for Social Impact, Flinders University

The federal government’s announcement that it will introduce mandatory guardrails for artificial intelligence in high-risk settings leaves too many questions unanswered, and too much power in the hands of the tech giants.

While Minister for Industry and Science Ed Husic's proposal addresses specific high-risk applications, it fails to account for the pervasive ways AI is already affecting vulnerable populations.

It is imperative to scrutinise the broader implications of AI and automated decision making for everyday Australians. The reality is that these technologies are already disrupting lives at a more fundamental level, from exacerbating housing discrimination to undermining fair access to essential services.

Minister Husic's focus on limited regulatory measures risks missing the larger picture of AI's societal impact, underscoring the need for a more comprehensive approach to safeguard all Australians against its adverse effects.

Researchers at the Centre for Social Impact – Flinders University (CSI) are in the final stages of a critical evaluation of the negative impacts of automated decision making on disadvantaged communities. Our forthcoming results will provide evidence that automated decision making is increasingly shaping the trajectory of countless Australian families, particularly their ability to secure a rental home.

The core issue lies in automated decision-making—a process where technology makes decisions about us with little or no human oversight. This automation, increasingly embedded in rental applications and other vital services, often exacerbates existing inequalities and disproportionately affects those already marginalised.

Over the last six months we have interviewed people who have struggled to rent because they have been discriminated against by the automated decision-making process, which often unfairly decides they are ineligible for a home.

The pending report by the Centre for Social Impact at Flinders University paints an emerging picture of the harm done by biased automated decision making. While we welcome Minister Husic’s mandatory guardrails, the announcement does not protect people seeking a rental home from these impacts. Here is one of many examples uncovered in the report.

A single mother with a young son was searching for a place to live. She registered for many rental platforms, but struggled significantly to get through the first gateway to view a house.

She had a hunch that the online application was using something in her profile to exclude her from even viewing properties, let alone securing a home. To test her hunch, she set up two profiles to apply for the same property. One profile included her son and related income benefits. The other did not. She told us:

“I am a single parent. I was being very honest with the machine and then I went hmm, this is not working for me. So I changed the profiles. This is where it is interesting… it was less income with no child and no Centrelink... and I got two, I got two rentals!”

This renter later described her experience of online renting as “completely sexist”. Given that the vast majority of single-parent households in Australia (81%) are headed by women, it is logical for women to question whether they are experiencing automated gender-related discrimination or sexism.

Because the real estate sector is not classified as a high-risk industry, the Minister’s guardrails for AI notably exclude this crucial area.

This oversight means that many families who face discrimination through automated decision-making will continue to endure the instability of precarious housing situations. As a result, these individuals will remain vulnerable to the long-term consequences of AI-driven biases, potentially impacting their quality of life and access to stable homes.

The Minister refers to the latest update of the AI adoption index, published by Fifth Quadrant, in which industry discloses that it understands unchecked AI can be racist and biased. Disturbingly, more than 40 per cent of industry indicates it has no plans to address discrimination by AI.

Like most discrimination, this issue is typically opaque and hidden. At the Centre for Social Impact, we want to ensure the negative impacts on those seeking to rent a home are never obscured or underestimated.

Compulsory guardrails are essential to safeguarding renting families from such automated discrimination. Our research underscores the urgent need for the real estate sector to transparently disclose its use of AI and automated decision-making. It is crucial for the industry to take proactive steps to address and rectify any instances of unfair bias, ensuring that these technologies do not perpetuate discrimination.


This opinion piece, written by Peter R. McDonald, Project Lead for the Data for Good Partnership with Uniting Communities Inc at CSI Flinders, was originally published in The Canberra Times.