The momentum of Artificial Intelligence (AI) continues to build as it makes its way into every industry, challenging existing structures and processes and prompting us to question efficiency and efficacy. The education sector is far from immune, with AI rapidly reshaping the educational landscape and changing how students learn, how teachers educate, and how schools function day-to-day.
As AI usage within the sector becomes the norm, school governors need to understand the technology and how it interacts with school governance. With an appropriate appreciation of how, when, where and why to use it, school governors can capitalise on this innovative technology and increase productivity to achieve better outcomes for their schools.
What is AI?
AI refers to technology systems that are capable of processing large amounts of information almost instantaneously and performing tasks that replicate aspects of human intelligence, including problem-solving, learning, decision-making and perception.
AI technology is based on three main components: data, models and algorithms. Together, these components enable the technology to recognise patterns and make predictions, a process known as training. Once trained, AI systems can continue to learn and adapt, becoming more capable and accurate over time.
Common AI applications that may be used in an educational context include, but are not limited to, virtual assistants such as Siri and Google Assistant, chatbots such as ChatGPT and Microsoft Copilot, and intelligent tutoring systems such as Duolingo and MATHia.
Use of AI by School Governors
As the ultimate drivers of school improvement, school governors are in a unique position to use AI to enhance governance processes, and improve educational offerings and outcomes in their schools. Some of the many ways in which school governors can employ AI are set out below.
Strategy
School governors can utilise AI platforms to assist with the full spectrum of strategic planning. This may include employing AI to assist with strategic meetings, whether by setting the agenda, preparing relevant meeting documentation, or summarising meeting notes. School governors can also use AI to scour the internet for relevant strategic resources, materials or comparative data, to analyse performance data, and to suggest feedback and implementation strategies to improve a school's effectiveness.
The ability to gather, decipher and disseminate this information almost instantly not only allows school governors to be better informed, but also frees them up to spend more time on critical strategy: determining and developing the long-term vision and direction of the school.
Management
AI can also be very useful for general management purposes, including enabling school governors to support the headteacher and senior leadership team. One way AI can be used in this regard is in the recruitment process: school governors can use AI to prepare position descriptions for job openings, formulate thought-provoking interview questions and establish relevant criteria against which to assess candidates. This can help to build and sustain robust leadership within a school.
Governance
School governors can use AI to streamline their governance processes. This may include the preparation of minutes, agendas, meeting notices, resolutions, schedules of delegations, policies and procedures, guidance and operational materials.
AI can also be used for governance evaluation and improvement: by inputting relevant data, governors can have AI identify areas of weakness and recommend changes to existing structures.
Key Risks
Whilst AI in education is undoubtedly beneficial and can enhance school governance in many ways, it should be approached with caution and restraint. The most effective way to use AI within a school context involves ongoing awareness, an understanding of the potential risks associated with its use, and a careful and deliberate approach.
Privacy and data protection
AI integration in educational settings poses a real threat to the privacy and data of students, educators and the school itself. This is because AI inherently relies on vast amounts of data, and its usefulness increases with the amount of information it is given.
School governors and educators risk breaching their data protection obligations by carelessly inputting personally identifiable information into AI systems, unknowingly using that information for purposes other than those for which it was collected. Doing so can put schools at risk of non-compliance, and can be detrimental to students, particularly in the event of an AI data breach.
School governors can reduce this risk by putting in place stringent cyber security policies, along with a dedicated AI policy which gives effect to the school's AI strategy. This policy should set out the ways in which the school can and cannot use AI, and the long-term direction of the school with regard to its use. It is important these policies prohibit the input of personally identifiable information into AI systems, and urge AI users to be deliberate and considered when using the technology.
As part of a dedicated AI strategy, school governors should consider setting protocols governing which AI platforms the school uses. Procuring commercial AI licences tends to yield better terms and conditions than those attached to standard one-off use, and school governors should pay close attention to the data protection clauses within those terms.
Safeguarding
School governors need to understand, and take precautions against, emerging safeguarding risks associated with the use of AI. Deepfakes, which are images, videos, GIFs or audio manipulated using AI to artificially replicate someone's body, face or voice, generally without consent, are a major safeguarding threat and can cause significant harm and distress to school communities.
As a means of mitigating safeguarding risks, school governors should review and update school incident policies, and institute a risk register to assist in identifying and addressing potential safeguarding risks arising from the use of AI. Ensuring the school has robust processes and procedures in place to deal with these risks, such as the creation and dissemination of deepfakes, is paramount to combating this behaviour.
School governors should also consider adding AI to the agenda of their existing safeguarding training. Raising awareness of emerging AI safeguarding risks, providing guidance on handling AI-related incidents, and ensuring the school responds consistently to harm whether it occurs in person or online will help equip the school to deal with these risks appropriately.
Exam malpractice
As AI becomes more popular, so too does AI-driven cheating in school exams and assessments. Exam malpractice may take many forms, including, but not limited to, plagiarising AI-generated content, failing to acknowledge the use of AI tools, or using AI where it is not an approved resource.
School governors should consider the role of AI in exam malpractice when reviewing and updating malpractice policies, ensuring that they appropriately cater for the rise in AI-related misconduct. School governors may also consider investing in plagiarism-detection software capable of identifying AI usage, enabling schools to address incidents of malpractice and restore academic integrity.
DfE position
The Department for Education (DfE) continues to release resources on the role of AI in education settings. Whilst there is some acknowledgment of the emerging benefits and risks of AI in the sector, the DfE materials indicate an overwhelming endorsement of the use of AI in school settings.
It is important to note that the DfE has received significant criticism for its stance on AI, specifically for minimising the risks and for failing to recognise the nuance and complexity of the risks associated with its use. Critics fear that a superficial acknowledgment of the risks effectively transfers responsibility for AI's implications onto schools; schools should therefore supplement DfE guidance with their own risk mitigation and a considered approach.
Conclusion
School governors have a lot to gain from embracing AI. A better understanding of AI, including how to capitalise on it and an awareness of its inherent risks, will enable school governors to make measured decisions about the extent of its use at every level of schooling.
In line with DfE guidance, school governors should implement a rigorous AI strategy to navigate the ever-changing landscape. To give effect to the strategy, school governors should ensure robust policies and procedures are in place, including a dedicated AI policy, and should review any existing policies that may be affected by the growing use of AI.
If you require further advice or assistance including with the development of an AI policy, or review of existing policies, please contact the Schools Support team at schoolsupport@wslaw.co.uk.