The Policy for the responsible use of AI in government (the policy) sets out the Australian Government’s approach to artificial intelligence (AI) and provides mandatory requirements for departments and agencies relating to accountable officials and transparency statements. This page details the Department of Agriculture, Fisheries and Forestry’s (DAFF’s) implementation of these requirements to ensure our use of AI is safe, responsible, ethical and legal.
Scope
DAFF aligns with the definition of AI in the policy:
An AI system is a machine-based system that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments. Different AI systems vary in their levels of autonomy and adaptiveness after deployment (Organisation for Economic Co-operation and Development (OECD)).
This definition, and therefore the scope of this statement, excludes rules-based automation, as such systems do not infer how to generate outputs from the inputs they receive.
DAFF’s approach to AI adoption and use
DAFF is committed to ensuring the governance, design and application of AI is safe, responsible, ethical and legal, in alignment with our vision and values, Australia’s AI Ethics Principles and the APS Experience Design Principles.
DAFF is exploring the adoption of emerging technologies, including AI, as part of the Australian Government’s broader commitment to improving regulatory service delivery and decision-making. DAFF will continue to explore innovative ways of using AI and is committed to building strong governance foundations to support its safe and responsible use.
All staff have access to the AI in Government fundamentals training, produced by the Digital Transformation Agency (DTA), and are required to report on their use of AI to the department’s Digital Business Division. The department actively encourages staff to undertake this training through regular communications and promotional materials.
Additionally, DAFF has an Artificial Intelligence User Acceptance requirement that staff must acknowledge before accessing generative AI tools online. This requires staff to agree that they will:
- Be responsible for ensuring the accuracy and relevance of any content generated by any artificial intelligence platform.
- Not input any personal, sensitive or classified information that may be subject to government or third-party restrictions.
Staff are reminded that they may only input publicly available information and must act in accordance with the Government’s Protective Security Policy Framework and Information Security Manual. Staff are also encouraged to read the department’s ICT Acceptable Use and Security Policies and Privacy Policy and are required to complete mandatory annual training on these topics.
DAFF’s use of AI
Based on the classification system detailed in the Policy for the responsible use of AI in government, DAFF uses AI in the following ways:
- Domains include:
- Corporate and enabling
- Scientific
- Service delivery
- Usage patterns include:
- Analytics for insights
- Workplace productivity
The department is using AI to:
- Automate routine tasks, manage workflows, create content, and facilitate communication
- Summarise high volumes of documents or information
- Identify, produce or understand insights from data
- Understand patterns and trends in large data sets
- Categorise documents for storage and retention.
DAFF does not currently use AI in any way that directly interacts with the public without a human intermediary or intervention.
Accountable Official
DAFF has an Accountable Official under the policy. The Chief Data Officer (CDO) was designated as the Accountable Official on 16 October 2024.
AI safety and governance
DAFF has an internal register of AI use cases. The internal register provides transparency and monitoring of AI use cases throughout the AI lifecycle.
DAFF has measures in place to:
- Ensure the design and application of AI is effectively governed and managed.
- Ensure risk management frameworks include AI-specific considerations, with mitigations and controls implemented when risks are identified.
- Ensure AI use cases are tracked and monitored through the internal register across the AI lifecycle.
- Ensure AI use across the department is visible, to support the effective governance, assurance, reporting and risk management of use cases.
- Promote the safe and responsible use of AI to departmental staff.
- Promote collaboration across the department and other government agencies on the use of AI, including the ongoing development of resources to ensure its safe and responsible use.
Compliance
DAFF will only utilise AI in accordance with applicable legislation, regulations, frameworks, and policies.
Continuous Improvement
DAFF is committed to regularly reviewing and updating AI policies and practices. This includes staying informed about new developments in AI technology, ethics, and regulatory requirements.
For example, in September 2024 DAFF participated in the Pilot Australian Government AI Assurance Framework (AI Assurance Framework) and applied aspects of the framework to selected AI use case trials to test its application and impact. Following the pilot’s conclusion, DAFF uses the AI Assurance Framework to assess current AI use cases at various stages of the AI lifecycle.
As the landscape evolves, DAFF will continue to systematically review relevant AI policies and frameworks in consultation with staff, stakeholders, the community and our partners, where appropriate.
Updates
DAFF will update and publish this AI Transparency Statement:
- At least once a year.
- When making a significant change to our approach to AI.
- When any new factor materially impacts the existing statement’s accuracy.
This Transparency Statement was modified on 25 July 2025.
Contact us
For any enquiries relating to DAFF’s use of AI or the information provided within this Transparency Statement, contact ai@aff.gov.au.