Uncertainty about AI is normal. When leaders don't create a culture of open discussion, teams may not talk about concerns and people may make assumptions. They may not raise legitimate concerns, may avoid asking questions, or may experiment without clear support.
Talking about uncertainty and concerns helps teams:
- move from uncertainty to shared understanding
- separate real risks from assumptions
- identify potential risks to the organisation
- identify where the team needs support or guidance
- understand the responsibilities of leaders
- agree on practical approaches.
This activity can help your team make sense of uncertainty and prepare for change.
It doesn’t replace:
- building practical AI skills
- documented risk management processes
- an accountability framework
- testing low-risk ways to use AI
- setting clear rules for AI use
- ongoing support and follow-up
- understanding and complying with WHS legislation
- formal major change consultation processes defined under the Fair Work Act.
How to run this activity
Work through the steps as a team discussion. Keep the conversation grounded in your team’s real work, current concerns and readiness.
This activity works best when people feel invited, not exposed.
Set clear expectations before you start. For example:
- participation is voluntary
- people's voices are valued
- how issues raised might influence outcomes or processes
- people may speak, write, listen or pass
- emotional, practical and structural concerns are all valid
- this is a space to understand concerns, not debate them
- not everything needs to be resolved during the session.
If you’re leading the session and also feel uncertain, it’s okay to say that. Being honest about uncertainty builds trust when it’s paired with clarity about what happens next.