Shadforth Artificial Intelligence (AI) Policy

Explore Shadforth's comprehensive AI policy to understand guidelines and strategies for responsible use of artificial intelligence technology.

Written By Ben Peterson (Super Administrator)

Updated April 8th, 2025

Purpose of This Policy

AI tools are being introduced to help you work more efficiently. These tools can assist with tasks like planning, reporting, and answering questions. However, it’s important to know how to use them responsibly, especially when it comes to privacy and data protection.

Follow Australian Privacy Laws

When using AI tools, you must follow Australian laws like the Privacy Act 1988. This means:

  • Do not share personal or sensitive information about others (e.g., clients, colleagues) in AI tools unless you have their permission. This includes sensitive documents such as drivers’ licenses, or any other document that can personally identify an individual.
  • Report any accidental sharing of private information immediately, as we must notify the government if there’s a data breach.

Be Careful of the Data you Input

If you are using an AI tool, whether it is a public chat tool or a private subscription provided by the company, you should always check what privacy controls the tool has. If unsure, contact the IT team or your supervisor.

Public AI tools (those not provided by Shadforth)
Anything you input into a public-facing AI tool may be used to train that AI model, and is therefore potentially available to everyone else in the world who uses that same tool. For this reason:

  • No Sensitive Information: Don’t put any confidential data (like financials, contracts, or client details) into AI tools unless it’s absolutely necessary and approved.
  • Use AI for the Right Tasks: Stick to using AI for work-related tasks like drafting emails, generating reports, or improving work processes. Avoid using it for personal matters or anything that involves private company or client data without approval or training.

Internal AI tools (Provided by Shadforth on a subscription basis)
Egnyte and Microsoft’s Copilot tools are provided to a number of staff with the aim of applying AI to existing private internal company data. These tools can be used to interrogate data such as contracts and private documents without fear of that data being made available publicly. If you are unsure how this works, or whether you are in fact using a private tool, you should always contact the IT team or your supervisor.

Data Access, Retention and Deletion

  • The company ensures that AI tools we use store and manage data securely.
  • All existing access permissions are fully respected by internal AI tools. This means that an AI tool cannot be used to request information the staff member does not already have access to.
  • Be mindful that some public AI tools may store and use data outside Australia. We only use tools that meet strict security standards, but it’s still important to be cautious. You should always confirm within the tool that it is referencing Australian information, which may differ from information for other countries.

Choosing the Right AI Tools

  • Only use AI tools that have been approved by the company. If you’re unsure whether a tool is approved, ask your manager or IT.
  • We evaluate these tools to make sure they’re safe and won’t compromise our company’s data. We can also provide recommendations and share experiences on the use of these tools.

Your Responsibilities

  • Complete Training: Make sure you attend any training sessions about AI tools, so you understand how to use them properly.
  • Respect Privacy: Don’t use AI to make decisions about hiring, promotions, or anything that could unfairly affect other employees.
  • Don’t Rely on AI for Everything: While AI can be helpful, use your judgment. If you’re unsure about the information AI provides, check with your manager. An AI tool is not a replacement for an expert in the subject matter.
  • If you’re using AI to handle client information, make sure the client is aware and gives permission. Be upfront with clients about how their data is being used. If you’re asked to share data with an AI tool, always make sure the person whose data you’re using has agreed.

Disclosure of AI Use

  • Employees must disclose when AI tools are being used to generate content, assist with decision-making, or interact with clients or stakeholders, especially in external communications. This ensures clarity, maintains trust, and allows recipients to understand the role of automation in the interaction. Disclosure is particularly important when AI output may influence decisions, carry reputational risk, or involve customer data.

Prohibited Uses of AI

  • Prohibited uses of AI include any application that compromises privacy, security, or ethical standards. This includes using AI to generate or spread misinformation, engage in discriminatory practices, make decisions without appropriate human oversight, or process sensitive data without proper authorization. Additionally, employees must not use AI tools to impersonate others, or to perform tasks that violate company policies, legal obligations, or contractual agreements.

Monitoring Use

  • The company may monitor how AI tools are being used to ensure compliance with this policy. Make sure you follow the guidelines.
  • Audits will happen from time to time to check that we’re using AI responsibly and within legal requirements.

What to Do if Something Goes Wrong

  • If you think something went wrong—like private data was entered into an AI tool by accident—report it immediately. The faster we act, the better we can protect everyone’s information.

Avoiding Bias and Fair Use

  • Be careful not to use AI tools in ways that could be unfair to others, especially when it comes to decisions about hiring, promotions, or assessing performance.
  • AI is here to help, but it’s important to make sure it doesn’t introduce bias or unfairness into our workplace.