
The bill sets a precedent for regulating AI‑driven workplace hazards, potentially reshaping compliance obligations across Australia’s safety landscape.
The rapid adoption of AI, automation and cloud‑based collaboration tools has stretched traditional work health and safety (WHS) regimes. In Australia, WHS laws have historically focused on physical hazards, but digital work systems now influence fatigue, mental strain and even physical injury through algorithmic scheduling and monitoring. Policymakers argue that existing statutes lack the granularity to address these new risk vectors, prompting jurisdictions such as New South Wales to explore targeted legislation that treats technology as a distinct safety factor.
The Digital Work System Duty introduced by the NSW amendment requires businesses to assess and mitigate risks arising from any digital system that could compromise employee wellbeing. Compliance will involve routine algorithm audits, transparent AI governance, and redesign of work processes to eliminate hazards at the source. While the bill aims to protect workers from “modern workplace pressures”, it also imposes significant documentation and reporting burdens on employers, particularly small and medium‑sized enterprises (SMEs) that may lack dedicated compliance teams. Failure to meet the duty could trigger penalties under state WHS enforcement mechanisms, adding a new layer of regulatory exposure.
Reaction to the bill has been sharply divided, with criticism coming from opposite directions. Safety advocates such as the Australian Institute of Health and Safety contend the law overlooks the fundamental issue of poor work design, potentially diverting attention from systemic organisational flaws. Technology firms, meanwhile, warn that the duty could stifle innovation and create fragmented standards across states, undermining the national WHS framework. As the debate unfolds, businesses should anticipate tighter scrutiny of AI‑driven processes and consider proactive risk‑management strategies to align with both state and emerging federal expectations.