AI Fear and Trust Gap Requires Focus on People and Education

ARN (Australia), Apr 2, 2026

Why It Matters

Bridging the people and data dimensions is essential for Australian enterprises to mitigate risk, build trust, and unlock AI's strategic value across the organisation.

Key Takeaways

  • 84% of Australians use AI at work, yet most lack formal training
  • EY estimates AI success: 60% people, 30% data, 10% tech
  • Only 40% of workers received AI training in the last six months
  • Logicalis hackathon required two‑hour AI tool training for participants
  • Cross‑functional leadership essential for responsible enterprise AI governance

Pulse Analysis

Australia is rapidly integrating artificial intelligence into daily workflows, with Microsoft data showing that 84% of employees now interact with AI tools. Yet a parallel confidence gap persists: six in ten Australians report no formal AI training, and many fail to recognise the technology behind routine digital experiences. EY's regional chief technology officer Katherine Boiciuc frames the issue as a cultural transformation, arguing that AI is 60% people, 30% data, and only 10% technology. Without addressing the people and data dimensions, organisations risk low trust and sub-optimal outcomes.

To close that gap, EY has built an AI Academy that delivers immersive curricula for executives and board members. The program blends technical fundamentals with governance, privacy and risk-management modules, reflecting Boiciuc's view that AI leadership must extend beyond CIOs to chief risk, marketing, people and procurement officers. By quantifying the skill deficit — only 40% of workers received AI training in the last six months — EY positions education as the primary lever for accelerating adoption. Companies that invest in upskilling their workforce can turn AI from a novelty into a strategic asset.

Logicalis Australia has taken a hands-on approach, embedding education into a company-wide hackathon with 43 cross-functional participants. Each entrant completed two hours of mandatory training on Copilot and Copilot Studio, then tackled real operational problems while applying data-governance and security safeguards from day one. The focus on outcome-driven solutions, rather than flashy demos, yielded tangible use cases and boosted confidence across teams in both Australia and Malaysia. The model demonstrates that structured, scalable learning combined with clear governance can turn AI experimentation into responsible, business-critical innovation.

