Supercharging discrimination: The Targeted Compliance Framework and the impact of automated decision-making


The United Nations Special Rapporteur on extreme poverty is preparing a report on the human rights impact of the introduction of digital technologies in social security systems. The Human Rights Law Centre made a submission to the Special Rapporteur focusing on Australia’s social security system and how technology is increasingly being used to target and punish people, especially single mothers, through programs like ParentsNext. Monique Hurley (@monique_hurley) from the Human Rights Law Centre (@RightsAgenda) summarises their submission, focusing on the gendered impacts. The full submission is available here. More information about the Special Rapporteur’s report is available here.  

 

Single mothers are being left stranded because of the “decisions” made by machines about their social security payments. While technology is often viewed as a way to deliver services to people more efficiently, automation driven only by this mindset threatens to worsen inequality. It is creating what Virginia Eubanks calls a “digital poorhouse” – “an invisible web woven of fiber-optic threads” – that seeks to police, target and control those who dare ask for income support through Centrelink.

The TCF and automated decision-making

One thread in the web is the Targeted Compliance Framework (TCF). The TCF is a highly automated system of sanctions that applies to hundreds of thousands of people receiving social security payments – people in regional towns who cannot find decent paid work, single mothers with young children to raise, and grandparents who cannot earn enough to survive because of illness or disability.

Implemented on 1 July 2018, the TCF is described by the Morrison Government as making “use of improved technology to allow job seekers to see their compliance status at any time” and as “designed to be simpler, fairer and more effective”.

The social security payments that the TCF applies to have mandatory “participation requirements” that force people to do certain tasks, such as attending appointments or applying for 20 jobs each month. A person commits a “mutual obligation failure” if they fail to jump through these hoops and meet their “participation requirements”.

The TCF has three zones: the Green Zone, the Warning Zone and the Penalty Zone. All people start in the Green Zone and, when people commit “mutual obligation failures”, they move through the zones and are exposed to an escalating series of sanctions. The automation of this process is problematic – payments can be suspended automatically, without any consideration being given to the financial hardship a payment suspension might cause. 
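To make the mechanics concrete, here is a minimal sketch of how a TCF-style escalation system might look in code. Everything in it – the class, field names and thresholds – is an assumption for illustration, not the actual Centrelink implementation; the point is what the logic leaves out.

```python
from enum import Enum

class Zone(Enum):
    GREEN = "Green Zone"      # starting point for everyone
    WARNING = "Warning Zone"  # failures are accumulating
    PENALTY = "Penalty Zone"  # escalating financial sanctions apply

# Assumed thresholds, invented for illustration only.
WARNING_THRESHOLD = 1
PENALTY_THRESHOLD = 5

class Recipient:
    """Hypothetical model of a person subject to a TCF-style scheme."""

    def __init__(self):
        self.zone = Zone.GREEN
        self.failures = 0
        self.payment_suspended = False

    def record_failure(self):
        """Record a "mutual obligation failure" and escalate automatically.

        Note what is missing: nothing asks why the requirement was missed,
        and nothing weighs the financial hardship a suspension will cause.
        The system simply has no input for either.
        """
        self.failures += 1
        self.payment_suspended = True  # immediate, automatic, no notice given
        if self.failures >= PENALTY_THRESHOLD:
            self.zone = Zone.PENALTY
        elif self.failures >= WARNING_THRESHOLD:
            self.zone = Zone.WARNING
```

The problem described above is visible in the method signature: hardship and fault are simply not parameters, so there is no point in the decision path where a human could weigh them.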

Australia’s own Parliamentary Joint Committee on Human Rights has raised concerns about this. The Committee has even said that the TCF is likely incompatible with Australia’s international human rights obligations. It has been particularly concerned by the lack of flexibility in the penalty process, and by people having their payments cut in circumstances where they have no right to seek a waiver on the basis of financial hardship. Removing the right to seek a waiver eliminates any opportunity for human empathy or compassion before someone is left without money for food, rent and other basic necessities.

Automating the social safety net is magnifying the discriminatory practices built into its foundations, raising human rights concerns. Photo by Nadine Shaabana on Unsplash


The increase in automated decision-making means a decrease in human-level interactions, and a lack of interpersonal responsibility and accountability on the part of the institution implementing the decisions. Instead, a greater burden is put on individual people to hold both the Government, and the private companies contracted by the Government to deliver employment services, accountable.

Rigid, automated systems do not provide flexibility to accommodate the daily realities of life. Technology can fail people when they need it most and sometimes it can be as simple as not having mobile phone credit or access to the internet.

Recent data shows that one in five people who had their social security payments cut were later found to have a valid reason for not meeting their requirements. That is possible because payments can be suspended by a computer program without notice and without anyone first checking why a requirement was missed. For example, there have been cases where a single mother has had her payment stopped for not attending an appointment because the job service provider responsible for working with her forgot to tell her about it. Despite not being at fault, she bore the burden of working out how to fix the mistake.

Some private job service providers do not make it easy for women to attend appointments. Women have reported having to miss study or leave paid work in order to attend appointments, so that the private provider can write on a piece of paper and confirm that they are indeed studying or working. Other women have been handballed to different staff members at each appointment, and asked to recount extensive details of family violence over and over again in front of their children.  Still others are required to attend appointments at inconvenient times – for example over school holidays.  

Supercharging discrimination

Where a social security program has a discriminatory design or impact, automation will entrench and intensify inequality. While removing human discretion and letting a computer treat each decision the same might seem like a way to stop discrimination in decision-making, it actually has the potential to compound racial injustice. Automated decision-making can “supercharge” racial discrimination by targeting a particular demographic, subjecting them to a particular punishment and collecting data on them, which often serves as the justification to continue the discriminatory practice.

An example of this is the impact of the TCF on Aboriginal and Torres Strait Islander parents who are targeted by the Government’s punitive ParentsNext program – another thread in the web. The ParentsNext program applies to parents in receipt of the parenting payment (who meet particular criteria) and an “intensive” stream targets regions where there are high numbers of Aboriginal and Torres Strait Islander people in receipt of the parenting payment.

The ParentsNext program targets Aboriginal and Torres Strait Islander women because the Government says they “have lower employment rates than Aboriginal and Torres Strait Islander men and non-Indigenous people”. The program imposes rigid requirements with no evidence base, leaving Aboriginal and Torres Strait Islander women more exposed to the risk of financial sanctions. It does this while doing nothing to address the lack of affordable child care, employment discrimination, limited job security and other structural barriers that make it difficult for women to participate in the workforce.
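To see how this kind of targeting looks once automated, consider an entirely hypothetical sketch (the field names, region codes and the second stream label are invented; the program description above mentions only the “intensive” stream). A facially neutral input – a region code – does the work of a demographic proxy, because the regions were selected for their demographics in the first place.

```python
# Hypothetical illustration only; names and values are invented.
# The rule below never mentions race, yet it reproduces the targeting,
# because INTENSIVE_REGIONS was compiled from regions with high numbers
# of Aboriginal and Torres Strait Islander parenting payment recipients.

INTENSIVE_REGIONS = {"REGION_A", "REGION_B"}

def assign_stream(region_code: str) -> str:
    """Assign a parenting payment recipient to a program stream."""
    if region_code in INTENSIVE_REGIONS:
        return "intensive"  # stricter requirements, higher sanction exposure
    return "standard"
```

Once a rule like this is automated it is applied uniformly and at scale, and the suspension data it generates can then be cited as evidence about the very group it was designed to target.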

Data already shows that, as at 31 December 2018, parents in the “intensive” stream – which targets Aboriginal and Torres Strait Islander parents – were having their payments suspended more often. We have already seen how data like this is collected and used to blame individuals for the structural causes of poverty, like the recent segment on Sunrise that characterised people on Newstart as “dole-bludgers”.

All this time, very little is being done to address the underlying causes of inequality.

Technology entrenching gender inequality

ParentsNext, combined with the operation of the TCF, has increased levels of emotional and financial stress for the single parent families disproportionately impacted. The stakes are high when it comes to social security – the decision about whether someone receives support is a decision about whether they eat or go hungry. Women have been left without money for daily essentials, and have been forced to turn to charities for food vouchers. Others have described feeling like their movements are being monitored and controlled by private job service providers who seek to enforce compliance.

This is happening in a context where, according to the Australian Bureau of Statistics, women make up 70% of primary unpaid care workers for children. It is estimated that the monetary value of unpaid care work is approximately $650.1 billion, equivalent to a massive 50.6% of GDP. Rather than thanking single mothers for their invaluable unpaid care work and recognising how hard it is to raise young children, parents targeted by the ParentsNext program are required to undertake extra tasks. If they do not complete their tasks, they risk having their payments suspended, reduced or cancelled.

ParentsNext and the TCF should be scrapped

The ParentsNext program should be scrapped. While the operation of the TCF exacerbates the impacts of the program, the very foundation of it is discriminatory and has no place in Australia. It is fundamentally flawed in a multitude of ways that have previously been summarised on this blog and are comprehensively set out in the joint submission made by SNAICC – National Voice for our Children, the National Family Violence Prevention Legal Services Forum and the Human Rights Law Centre to the Senate Inquiry into ParentsNext earlier this year.

The TCF reflects the Government’s increasingly punitive and dehumanising approach to social security, a trend that urgently needs to be reversed. The dignity and humanity of those needing social security must be prioritised ahead of efficiency and cost savings. A human rights-based approach to social security and technology needs to be adopted, one that gives priority to the voices and needs of those closest to the pain caused by growing inequality in Australia.

This post is part of the Women's Policy Action Tank initiative to analyse government policy using a gendered lens. View our other policy analysis pieces here.