AI-powered Applicant Tracking Systems (ATS) use artificial intelligence to streamline and enhance hiring by automatically screening resumes, ranking candidates against predefined criteria, analysing language for job fit, and predicting candidate success. When trained responsibly, these systems can reduce time-to-hire, surface top applicants more efficiently, and help reduce bias, giving recruiters a smarter, faster, and more scalable way to manage talent pipelines.
As artificial intelligence (AI) becomes more deeply embedded in recruitment, it’s reshaping how organisations find and evaluate talent. According to the Institute for Future Work, 98% of enterprise-level organisations now use AI or data-driven software of some form during their recruitment process. From CV screening and automated assessments to predictive analytics and video interview scoring, AI promises speed, efficiency, and, arguably, objectivity.
However, AI in an ATS can actually reinforce age discrimination: it often learns from historical hiring data that favours younger candidates, uses proxies such as graduation year or length of experience to infer age, and prioritises linear career paths or recent job history. These factors can lead to older applicants being unfairly filtered out, not for lack of skill or suitability, but because of algorithmic bias that replicates existing ‘human’ age discrimination. Without deliberate safeguards and more inclusive training data, its use risks making age bias faster, less visible, and systemic.
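To make the proxy problem concrete, here is a minimal, hypothetical sketch (the field names and the typical graduation age are assumptions, not taken from any real ATS): even when a CV record contains no age field at all, a screening algorithm can recover a close estimate of age from a seemingly neutral field such as graduation year.

```python
from datetime import date
from typing import Optional

def inferred_age(grad_year: int, typical_grad_age: int = 22,
                 today: Optional[date] = None) -> int:
    """Rough age estimate an algorithm could derive from a 'neutral' field.

    Assumes candidates graduate at roughly typical_grad_age; the point is
    that graduation year acts as a proxy for age, not that the estimate
    is exact.
    """
    year = (today or date.today()).year
    return year - grad_year + typical_grad_age

# Hypothetical candidate record: note there is no explicit age field.
candidate = {"name": "A. Smith", "graduation_year": 1995}
estimate = inferred_age(candidate["graduation_year"], today=date(2024, 1, 1))
print(estimate)  # 51 - the 'hidden' age is trivially reconstructable
```

Any model given this field as an input can therefore learn age-correlated patterns without ever being shown an age, which is why simply deleting the age column does not remove the bias.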
AI is now typically involved at almost every stage of the recruitment journey: scanning CVs for keywords and formatting, predicting job fit based on experience, and ranking candidates by algorithmic matching scores. In theory, an AI overlay removes human bias by applying neutral rules; in practice, the AI often learns from historical data, which reflects years of biased hiring decisions.
AI is inherently prejudiced because it learns from human-generated data, and that data often reflects existing societal biases, including those related to race, gender, age, and ability. Rather than reasoning independently, AI systems identify patterns and make predictions based on historical information, which means they can inherit and amplify the very inequalities we hope to eliminate. If a dataset reflects biased hiring practices, for example, an AI trained on it will likely replicate those same patterns.
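The "bias in, bias out" loop can be sketched in a few lines. This is an illustrative toy, not a real screening model: the records are invented, and the "training" step is deliberately reduced to learning which experience bracket historically succeeded, to show how a screener ends up replaying past decisions as policy.

```python
# Hypothetical historical decisions that happened to favour shorter CVs.
# Each record is (years_of_experience, was_hired).
historical = [
    (3, True), (4, True), (5, True), (6, True),
    (18, False), (22, False), (25, False), (30, True),
]

def is_junior(years: int) -> bool:
    return years < 10

def hire_rate(records, group) -> float:
    """Fraction of past applicants in a group who were hired."""
    outcomes = [hired for yrs, hired in records if group(yrs)]
    return sum(outcomes) / len(outcomes)

# "Training": the model simply learns which group did better historically.
prefer_junior = hire_rate(historical, is_junior) > \
                hire_rate(historical, lambda y: not is_junior(y))

def screen(years_experience: int) -> bool:
    """Replays the historical preference - bias in, bias out."""
    passes = is_junior(years_experience)
    return passes if prefer_junior else not passes

print(screen(5), screen(25))  # True False: the past pattern is now policy
```

A real ranking model is far more complex, but the failure mode is the same: it optimises for agreement with historical outcomes, so any age skew in those outcomes becomes part of the learned rule.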
So, what can we do about this inherent bias, whilst preserving the benefits of AI-led ATS systems for employers?
Firstly, from an employer’s perspective, it’s important that technologists and hiring managers are aware of the potential flaws in their ATS, and of their own biases towards mid-career candidates. Many hiring managers remain unaware of their unconscious bias, quite apart from any exacerbating effect their ATS may have.
Specifically, regarding their ATS, organisations should consider:
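One concrete check worth considering is a regular adverse-impact audit of the screening stage. The sketch below applies the well-known "four-fifths" rule of thumb: if the pass-through rate for older applicants falls below 80% of the rate for younger applicants, the stage is flagged for review. The applicant numbers here are invented for illustration.

```python
def selection_rate(passed: int, applied: int) -> float:
    """Share of a group that made it past the automated screen."""
    return passed / applied

def adverse_impact_ratio(protected_rate: float, reference_rate: float) -> float:
    """Ratio of the protected group's rate to the reference group's rate."""
    return protected_rate / reference_rate

# Hypothetical screening outcomes for one vacancy.
young_rate = selection_rate(passed=60, applied=100)   # 0.60
older_rate = selection_rate(passed=24, applied=100)   # 0.24

ratio = adverse_impact_ratio(older_rate, young_rate)  # 0.40
flagged = ratio < 0.8  # four-fifths threshold

print(round(ratio, 2), flagged)  # 0.4 True -> audit this screening stage
```

The four-fifths rule is a screening heuristic rather than a legal test, but running it routinely, broken down by age band, is a cheap way to surface exactly the silent filtering this article describes.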
There is also a role for government in managing the impact of AI generally, and more specifically in managing prejudice and bias in the workplace. Laws governing both discrimination and the use of technology tend to lag behind society’s adoption of new tools and our expectations around equity. Nevertheless, policymakers should consider:
In general terms, older, mid-career workers bring experience, judgment, loyalty, and deep sector knowledge, especially in critical industries like healthcare, education, and infrastructure. Yet AI hiring systems often discard their applications before a human ever sees them. If we don’t act soon, we risk creating a labour market in which whole generations are excluded, not by explicit prejudice but by invisible, automated barriers. This is especially concerning given the speed at which the Australian population is ageing.
Whether by accident or design, AI systems are disproportionately filtering out mature-age candidates, largely through the hidden bias baked into the data and tools themselves. However, AI can and should be part of a more equitable future, but only if it’s actively designed with fairness in mind. That means interrogating the data, questioning the outcomes, and ensuring that every candidate, regardless of age, can be seen, heard, and fairly assessed.
We need to build systems that don’t just find the quickest hire but the best one, and we must keep working towards a future in which age is never a reason to be left out of the workforce.