Australian culture is built on a good dose of fairness and helping a mate. It permeates our sport, it is a lesson we teach our children, it is the core of our political, social and judicial systems. So why abandon these principles and weaken or overhaul the copyright regime to carve out a special right for AI companies to scrape and ingest copyrighted works at scale?
We already have legal and policy frameworks that set important boundaries, but they need refinement.
The Copyright Act 1968 sets a clear baseline: copying and communication to the public require permission unless an exception applies. The Act’s existing exceptions – fair dealing, temporary-copy allowances, and anti-circumvention rules – set boundaries that are workable for AI companies. However, these exceptions are narrow and pre-digital, and should be adjusted to include a ‘Foundational AI fair use’ clause.
Even with a review of the fair use element, AI developers need to be willing to license what they need and creators need to utilise the tools at hand to protect and monetise their work. To do anything else would be simply un-Australian.
Australia’s privacy framework rightly constrains AI training on personal information, even when that data is publicly available. The OAIC has reinforced that collecting personal information must be necessary, lawful, fair, and transparent, with meaningful security and retention limits.
The Privacy and Other Legislation Amendment Act 2024 has sharpened enforcement and introduced a statutory privacy tort to address serious invasions of privacy.
Together, these legislative controls create obligations that clearly, and in our opinion rightfully, bite on large-scale unsupervised scraping.
Instead of diluting copyright, Australia should:
- Hold AI developers to the existing licensing-first baseline
- Add a targeted, Australian ‘Foundational AI fair use’ clause designed for clearly non-consumptive, low-substitution uses (such as text & data mining for AI training or research purposes) with guardrails and robust opt-outs
- Help creators protect and monetise their content with technical measures, terms, collective licensing, and bargaining mechanisms
Our view
The Australian Copyright Act’s exceptions are narrow and context-specific. Australia does not have a general ‘fair use’ defence; instead, it has defined fair dealing purposes: research or study, criticism or review, parody or satire, news reporting, and legal advice.
Section 40 of the Act permits fair dealing for research or study, assessed by fairness factors. However, it’s purpose-bound and not a carte blanche for broad commercial ingestion.
The word ‘fair’ is key to the solution. As a country, we need to decide what is fair, rather than simply following the lead of international governments under a “they did it, so we should too” mantra.
Australian culture is built on a good dose of fairness and helping a mate. It permeates our sport, it is a lesson we teach our children, it is the core of our political, social and judicial systems. So why abandon that?
The solution
AI companies, like every other Australian business, should be paying to access and use copyrighted data. How can we possibly build ‘sovereign’ capability when that capability comes at the expense of our own creatives? That’s not Australian and certainly not good sovereignty.
The key to the payments needs to be a clear determination of ‘fair use.’
For example, I buy five books at the bookstore for a total of $100. I read and absorb them, then launch a business selling advisory services based on that collected knowledge. As a society, we never question whether this is fair. Foundational AI models are essentially doing the same thing, and monetising it in the same way.
Australia needs a Foundational AI fair use clause in the Copyright Act, supported by a broader policy framework to guide licensing, transparency and enforcement. This policy could also be extended to cover research.
We already have many tools in place to help guide us through a fair use policy.
The News Media and Digital Platforms Mandatory Bargaining Code forced platforms to the table and catalysed deals without rewriting copyright law. A light-touch, sector-neutral variant could facilitate media licensing for AI companies.
Australian libraries enjoy specific carveouts in the Copyright Act. However, they are also subject to the conditions of the Australian Public Lending Right (PLR). The PLR is a government scheme that provides payments to Australian authors and publishers for the free use of their books in public libraries.
The PLR, established in 1974, is being examined as part of the current review of the National Cultural Policy. The review expressly aims to examine “any opportunities, risks and challenges for Australia’s arts and creative sectors associated with emerging technologies such as artificial intelligence.”
This provides Australia with a unique opportunity to leverage these two tools to create a fair and equitable solution for Australian Foundational AI models.
Whatever mechanism is used to create a fair and equitable approach to data use by AI companies, the overarching tenet must be one of complete transparency. This is critical to the successful adoption of AI by everyday Australians, many of whom distrust AI because of its opacity.
International AI models often refuse to disclose the data on which they were trained, how that data was curated, and what biases may exist in the datasets. We need to be far more transparent in Australia.
Mandating high-level dataset transparency – categories, volumes, licensed share, meta-tagging with acquisition information, and provable opt-out honouring – in a way that supports audit compliance without forcing disclosure of trade secrets, should be the minimum standard for Australian AI companies.
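To make the idea concrete, a transparency disclosure of this kind could be as simple as a machine-readable manifest published alongside a model. The sketch below is purely illustrative: the field names, model name and storage path are assumptions, not any existing or proposed standard.

```python
import json

# Hypothetical, minimal machine-readable transparency manifest for an AI model.
# All field names and values are illustrative assumptions only.
manifest = {
    "model": "example-foundation-model",      # hypothetical model name
    "dataset_categories": {                   # high-level categories and volumes
        "licensed_news": {"documents": 1_200_000, "licensed": True},
        "public_domain_books": {"documents": 85_000, "licensed": False},
    },
    "licensed_share": 0.93,                   # share of training data under licence
    "opt_out_protocol": "robots.txt",         # how opt-outs were honoured
    "acquisition_log": "s3://example-bucket/acquisition-metadata/",  # hypothetical
}

# Publishing it as JSON makes it auditable without exposing the data itself.
print(json.dumps(manifest, indent=2))
```

The point of the sketch is that categories, volumes and licensing share can be disclosed and audited without revealing the underlying documents or any trade secrets.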
An Australian AI Fair Use Policy should also explicitly set out the conditions that fall outside fair use. Uses that compete with the original or materially substitute for it (e.g. training that allows close reproduction of protected expression) must be controlled through penalties. Similarly, ingestion that ignores opt-outs or circumvents access controls should have serious repercussions. Most importantly, AI training on personal information without express written consent must be avoided at all costs.
Creative Responsibility
In the spirit of being fair and reasonable, it’s unfair for creatives to complain about AI ingestion while leaving their content openly unprotected. Creators must take reasonable, modern steps to safeguard their copyrighted material.
Where feasible, creatives need to put valuable archives behind accounts, paywalls, or token-gated APIs. Anti-circumvention law reinforces these barriers. However, many creatives knowingly allow, through inaction, other parties to republish their content in whole or in part on free websites and blogs.
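One of the lowest-cost technical measures available today is declaring a crawler opt-out in robots.txt, which well-behaved AI crawlers check before fetching. The sketch below uses Python’s standard urllib.robotparser to show how such a declaration is interpreted; “GPTBot” is a real AI crawler user-agent, but the site and paths are hypothetical.

```python
from urllib import robotparser

# A robots.txt a creator might publish to keep a specific AI crawler
# out of a protected archive (paths and domain are hypothetical).
robots_txt = [
    "User-agent: GPTBot",
    "Disallow: /archive/",
    "",
    "User-agent: *",
    "Allow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(robots_txt)

# A compliant AI crawler asks before fetching each URL:
print(rp.can_fetch("GPTBot", "https://example.com/archive/novel.html"))  # False
print(rp.can_fetch("GPTBot", "https://example.com/blog/post.html"))      # True
```

The limitation, of course, is that robots.txt is honour-based; which is precisely why the policy framework above must attach real penalties to ignoring declared opt-outs.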
Although protecting their material can be at times difficult and time-consuming, it must be done, or creatives must accept that their efforts will be part of AI ingestion. The job of policing copyright is not the work of AI companies, nor should it be.
Australia doesn’t need to dismantle copyright to enable AI. What we need is a fair use clause fit for AI, clear licensing and privacy obligations, transparency, and shared responsibility between developers and creators. That balance will let AI grow here in a way that is fair, sovereign, and trusted.