The Australian Communications and Media Authority has released its first dedicated sector report on the use of artificial intelligence by interactive gambling providers. The report, published in April 2026, draws on academic studies, industry interviews and operator submissions to map how AI is reshaping the way Australian wagering companies set odds, market to customers, and detect harmful behaviour.
How Operators Are Using AI
The ACMA report identifies four primary uses of AI by licensed Australian wagering providers. The first is predictive analytics and odds setting, where machine learning models process larger and more granular data sets than previously possible to generate prices on a wider range of markets. The second is personalised promotions and services, where AI tools build profiles of individual customers and serve them tailored offers and content within the operator's account interface.
The third is content creation, design and new product features, including AI-generated graphics for racing previews and dynamic same-game multi suggestions. The fourth and most policy-relevant is detecting harmful and fraudulent gambling behaviour, with operators using anomaly detection models to flag accounts displaying patterns associated with problem gambling, money laundering, or coordinated fraud.
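The anomaly-detection approach mentioned above can be sketched in a few lines. This toy model is purely illustrative — the report does not describe any operator's actual system, and the function, thresholds and data here are assumptions made for the example:

```python
from statistics import mean, stdev

# Illustrative only: flags an account whose recent daily deposit counts
# deviate sharply from its own historical baseline -- one simple form of
# behavioural anomaly detection. Real operator models are far more complex.
def flag_account(history, recent, threshold=3.0):
    """history: past daily deposit counts; recent: latest daily counts."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return False
    # Flag if any recent day sits more than `threshold` standard
    # deviations above the account's own historical mean.
    return any((x - mu) / sigma > threshold for x in recent)

# A stable account is not flagged.
print(flag_account([2, 3, 2, 4, 3, 2, 3], [3, 4]))   # False
# A sudden spike in deposit frequency is flagged.
print(flag_account([2, 3, 2, 4, 3, 2, 3], [3, 25]))  # True
```

The same per-account baselining, pointed at disengagement instead of escalation, is what makes the retention re-use discussed below possible.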
The Harm Detection Question
The report devotes significant attention to AI harm detection. Several major operators, including Dabble, Ladbrokes and bet365, already use behavioural models to identify accounts that may be exhibiting signs of harm. The challenge is that the same systems built for harm detection can be repurposed for retention: a model that spots customers who appear to be cooling off can just as easily be used to push offers that keep them active.
The ACMA acknowledges this dual-use problem in the report. It notes that AI harm detection is inherently a black-box process for the customer, who has no visibility into how their data is being analysed or what triggers an intervention. International research cited in the report has found that operator-led harm tools are most effective when paired with mandatory thresholds and external oversight, rather than left entirely to the operator's discretion. The full document is published on the ACMA website.
What This Means for Punters
If you have an account with any major Australian bookmaker, AI is already involved in how you experience the product. Your odds, your promotional offers (where allowed), your account limits and your customer service contacts are likely shaped to some degree by machine learning. None of this is new, but the ACMA report is the first formal acknowledgement of how widespread these systems are.

Expect this report to inform the next phase of regulation. The federal government has flagged that future ACMA standards may require operators to disclose their use of AI in customer-facing decisions, particularly around harm minimisation and account closures. Customers concerned about how their data is being used can request access to it under the Privacy Act.