The Privacy Act amendments are already legislated. From 10 December 2026, any APP entity that uses a computer program to make decisions significantly affecting individuals — on eligibility, pricing, employment, claims, access to services — must disclose that in its privacy policy. The Office of the Australian Information Commissioner has confirmed the obligation (Office of the Australian Information Commissioner, 2025), and guidance is coming. Most organisations have accepted this as a compliance documentation task and assigned it accordingly.
That framing is the problem.
In Brief
- From 10 December 2026, Privacy Act amendments require APP entities to disclose automated decision-making systems in their privacy policies.
- The disclosure obligation is a transparency requirement built on top of an accountability architecture most organisations have not yet constructed.
- Disclosing a system you cannot defend to a regulator is meaningfully worse than not disclosing it — it signals the obligation was treated as a document exercise.
- Boards that treat the December deadline as a copywriting task are approaching a governance problem with the wrong instrument.
- The organisations best placed to meet the obligation are those already able to explain — with evidence — who reviews every significant automated decision and on what basis.
Disclosure without the architecture beneath it does not satisfy the obligation; it exposes the gap. A privacy policy that states “we use automated systems to make decisions about your eligibility” is not a governance achievement. It is an invitation to the regulator’s second question: what accountability sits underneath that? The organisations that will struggle in December 2026 are not those that failed to update their privacy policies. They are those that updated their policies without being able to answer what comes next.
Entities covered by the Australian Privacy Principles (APPs) include:
- Government agencies: federal departments, agencies, and bodies.
- Businesses: companies, partnerships, and trusts with more than $3M annual turnover.
- Non-profits: non-profit organisations with more than $3M annual turnover.
- Health service providers: private hospitals, doctors, gyms, and insurers.
- Credit bodies: credit reporting bodies and credit providers.
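As a rough illustration of how those coverage categories compose, the scope test can be sketched as a simple predicate. This is a simplified sketch only: the hypothetical `is_app_entity` function below omits the Act's further categories, exemptions, and definitions.

```python
# Illustrative only: a simplified predicate for the APP coverage categories
# listed above. Real coverage under the Privacy Act 1988 involves additional
# categories, exemptions, and definitions not captured here.
TURNOVER_THRESHOLD = 3_000_000  # AUD, the "$3M annual turnover" test

def is_app_entity(entity: dict) -> bool:
    if entity.get("government_agency"):
        return True  # federal departments, agencies, and bodies
    if entity.get("health_service_provider") or entity.get("credit_body"):
        return True  # covered irrespective of the turnover test
    # businesses and non-profits: the turnover threshold applies
    return entity.get("annual_turnover", 0) > TURNOVER_THRESHOLD
```

The point of the sketch is that coverage is category-first, threshold-second: an organisation cannot conclude it is out of scope on turnover alone without first checking whether it falls into a covered category.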
The disclosure sits downstream
The Privacy Act Review Report, published by the Attorney-General’s Department in February 2023, was explicit about the intent behind the automated decision-making provisions: the obligations were designed to give individuals meaningful information about consequential decisions that affect them, not to create a documentation category (Attorney-General’s Department, 2023). The Attorney-General’s response, and the subsequent Privacy and Other Legislation Amendment Act 2024, carried that intent into legislation. The operative word in the obligation is not disclosure; it is accountability.
No regulator investigating a complaint about an automated hiring decision, a credit refusal, or a denied claim will stop at the privacy policy text. The question that follows disclosure is whether the organisation can demonstrate that the decision-making system was accurate, fair, and subject to meaningful human review. The IAPP’s analysis of Australia’s AI governance landscape confirmed the December 2026 date and identified the accountability architecture question — not the transparency statement — as the substantive governance challenge facing boards and executives in the coming eighteen months (International Association of Privacy Professionals, 2025).
Organisations that confuse the disclosure obligation with the accountability obligation are building a document on top of a gap. That gap tends to become visible at the worst possible moment: under regulatory scrutiny, when the cost of having nothing underneath the disclosure is at its highest.
What boards discover too late
Capable organisations with well-resourced compliance functions are nonetheless underprepared for this. The reason is structural, not about effort or intent.
Automated decision-making systems are typically implemented as technology projects. The accountability question — who reviews a decision made by this system, on what basis, with what escalation pathway — is rarely answered at implementation because it is not a technology question. It belongs to the governance layer that sits above the system. When the system is built and tested and deployed, the governance layer is presumed to exist. Often, it does not. The consequence is that the decision-making capability is live, the volume of decisions is accumulating, and the accountability mechanisms that would allow the organisation to answer a regulator’s question are either absent or have never been tested against a real dispute.
The Governance Institute of Australia’s 2026 governance agenda for directors is unambiguous about what this structural gap means for boards: digital competence is no longer optional, and the gap between AI deployment velocity and governance readiness is now a material risk (Governance Institute of Australia, 2026). The pattern that surfaces across organisations of all sizes is that governance frameworks were designed for decisions made by people and were never extended to decisions made by systems. The gap is invisible until it is not.
The architecture is a governance question
Understanding this reframes what December 2026 actually requires. Updating the privacy policy takes a few weeks. Building the accountability architecture takes most of a year — and that clock needs to start now.
What that architecture contains is not novel. It requires knowing, for each significant automated decision, what personal information the system uses, what kinds of decisions it produces, who in the organisation is accountable for reviewing outputs, how an affected individual can seek review, and what the organisation does when the system produces an outcome that cannot be defended. None of this is technically complex. What makes it hard is organisational: the answers cut across technology, legal, operations, and executive governance in ways that no single team naturally owns.
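One way to make that cross-functional ownership concrete is a per-system accountability register. The sketch below is illustrative only (the field names are mine, not drawn from the Act or OAIC guidance), but it captures the questions the paragraph above says each significant automated decision must be able to answer.

```python
# A minimal accountability-register entry for one automated decision system.
# Field names are illustrative, not taken from the legislation.
from dataclasses import dataclass

@dataclass
class ADMRegisterEntry:
    system_name: str                      # e.g. "claims triage model"
    personal_information_used: list[str]  # categories of personal information the system uses
    decision_types: list[str]             # kinds of significant decisions it produces
    accountable_reviewer: str             # role accountable for reviewing outputs
    review_pathway: str                   # how an affected individual can seek review
    indefensible_outcome_response: str    # what the organisation does when an outcome cannot be defended
```

Maintained per system and kept current, a register like this turns the regulator's second question into a lookup rather than a scramble.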
The organisations that are best positioned for December 2026 are those that have already asked this question: if the OAIC investigated a specific automated decision we made last month, could we demonstrate the accountability that sits behind it? Where the answer is yes, the disclosure obligation is a documentation task. Where the question has never been asked, the disclosure obligation is a governance signal that the organisation is not yet ready to disclose.
The deadline follows the work
The organisations that treat 10 December 2026 as the deadline — the date by which the privacy policy must be updated — will discover that a correctly worded privacy policy and an absent accountability architecture are a combination regulators find more concerning than a policy that has not yet been updated. A board that has moved on disclosure without building the architecture has told the regulator it understood the obligation well enough to write it down, and did not understand it well enough to build anything beneath it.
The executives who will be best placed next December are those who treated the disclosure requirement as the visible surface of a governance problem they are already solving — not as the problem itself.
References
Attorney-General’s Department. (2023). Privacy Act Review Report. Australian Government. https://www.ag.gov.au/rights-and-protections/publications/privacy-act-review-report
Governance Institute of Australia. (2026). The 2026 governance agenda: Priorities for directors. https://www.governanceinstitute.com.au/news_media/the-2026-governance-agenda-priorities-for-directors/
International Association of Privacy Professionals. (2025). Global AI governance: Australia. https://iapp.org/resources/article/global-ai-governance-australia
Office of the Australian Information Commissioner. (2025). Privacy Act reforms. Australian Government. https://www.oaic.gov.au/privacy/privacy-legislation/privacy-act-reforms