
A Director's Guide to AI and Cyber Governance

FBA Partner Sparke Helmore explores strategies and resources that can help family businesses assess their use of AI, ensuring it enhances their operations while minimising unnecessary risks.

13 February, 2025
Queensland, Article

Technology, particularly artificial intelligence (AI), has become an inescapable part of our lives. From workplace desktops and email filters to home Wi-Fi networks with numerous connected devices, AI is integrated into everything we do, from search engines and document drafting tools to social media platforms. AI powers facial recognition on smartphones, facilitates access to bank accounts, and enables banks to analyse spending patterns to detect fraud.

As AI becomes increasingly pervasive, businesses must understand not only how to leverage AI to their advantage but also how to implement appropriate safeguards to ensure its safe and responsible use. With the vast number of resources available, it can be challenging for businesses to know where to begin.

This article explores strategies and resources that can help businesses assess their use of AI, ensuring it enhances their operations while minimising unnecessary risks.

The Australian Institute of Company Directors (AICD) has published a comprehensive guide on using AI responsibly. The guide makes recommendations across eight areas relevant to both cybersecurity and AI.

Roles and responsibilities

  • Consider whether decision-making processes incorporate AI, as well as associated risks and opportunities.
  • Identify and document the AI applications in use, along with those involved in AI system procurement, development, and execution across the business.
  • Determine and document who at the board and management level is responsible and accountable for AI usage.

Governance

  • Establish appropriate oversight for AI initiatives.
  • Identify suitable external experts for consultation.
  • Define the nature and frequency of reporting to the board regarding AI.

People, skills & culture

  • Confirm that management possesses the necessary skills and training regarding AI.
  • Consider the impact of AI on the workforce, including future needs and skills development.

Principles, policies & strategy

  • Ensure that AI considerations are embedded within the organisation’s overall strategy. Avoid adopting AI for its own sake.
  • Adopt policies that incorporate safe and responsible AI principles, as outlined in Australia’s AI Ethics Principles.

Practices, processes & controls

  • Develop a clear risk appetite statement and a corresponding risk management framework.
  • Implement an AI impact assessment capability and a compliance process.

Supporting infrastructure

  • Maintain an inventory of AI systems and data usage; identify where and how AI is utilised.
  • Ensure the data governance framework is in place and updated to account for AI.

Stakeholder engagement & impact assessment

  • Ensure stakeholders understand the impact of AI and manage their expectations accordingly.
  • Implement appropriate practices for accessibility and inclusion.
  • Ensure AI outcomes are managed and can be appealed.

Monitoring, reporting & evaluation

  • Establish a risk-based monitoring and reporting system for mission-critical and/or high-risk AI systems.
  • Develop and implement a comprehensive monitoring and reporting framework.
  • Consider seeking both internal and external assurance on AI practices.

The AICD and the Cyber Security Cooperative Research Centre (CSCRC) have also released updated Cyber Security Governance Principles (Principles). These Principles set out best practices for Australian boards to adopt in building cyber security governance and resilience.

The recommendations include:

  1. Building an inventory of digital assets.
  2. Assessing the criticality of digital assets and determining appropriate security measures for each.
  3. Ensuring that third-party relationships, such as SaaS suppliers, are taken into account when evaluating risk management practices in the digital supply chain.
  4. Developing a well-established cybersecurity crisis management plan.
  5. Keeping directors updated on new threats and vulnerabilities.

Key takeaway

AI and cyber governance are moving targets. The key is to make a start, learn and improve. It is no longer possible for businesses to outsource these responsibilities. Businesses must:

  • Know what data they collect and how it is used.
  • Understand which AI systems they use and how they are applied.
  • Be aware of the technology used to deliver and manage their AI and data.
  • Have a plan that evolves in depth and complexity to manage and mitigate risks.
  • Avoid a ‘set and forget’ mentality; IT and data management should be discussed at every board meeting.

Author: Hamish Fraser, Partner at Sparke Helmore


Since 1882, Sparke Helmore has been providing legal services to Australian businesses, representing more than 140 years of bringing our experience and knowledge to our clients. We are a truly national full-service firm with offices in all mainland capital cities and strong connections throughout regional Australia. Our strong history, underpinned by extensive experience, means our firm is well placed to assist our clients to achieve the best business results across all phases of the business lifecycle, whether this be acquisitions and disposals, land-related dealings, business structuring, intergenerational succession, intellectual property and technology matters, cybersecurity, employment, work health and safety, or taxation.


Learn more about Sparke Helmore