Commentary: AI ‘workslop’ is creating unnecessary extra work

WHAT EMPLOYERS CAN DO
For employers, the key is investing in governance, AI literacy, and human-AI collaboration skills.
That starts with giving employees clear guidelines and guardrails on effective use, spelling out when AI is and is not appropriate.
That means forming an AI strategy, identifying where AI will deliver the highest value, being clear about who is responsible for what, and tracking outcomes. Done well, this reduces risk and the downstream rework that workslop creates.
Because workslop stems from how people use AI – not from the tools themselves – governance only works when it shapes everyday behaviours. That requires organisations to build AI literacy alongside policies and controls.
Organisations must work to close the AI literacy gap. Our research shows that AI literacy and training are associated with more critical engagement with AI and fewer errors, yet fewer than half of employees report receiving any training or policy guidance.
Employees need the skills to use AI selectively, accountably and collaboratively. Teaching them when to use AI, how to do so effectively and responsibly, and how to verify AI output before circulating it can reduce workslop.
Steven Lockey is a Postdoctoral Research Fellow at Melbourne Business School. Nicole Gillespie is Chair in Trust and Professor of Management at The University of Melbourne and Melbourne Business School. This commentary first appeared on The Conversation.