24 Top UK Universities Draft Principles for Use of AI in Education
Five principles include a commitment to supporting AI literacy
The Russell Group, a consortium of 24 leading UK universities that includes Cambridge, the University of Edinburgh, King's College London, and the London School of Economics, announced a new set of Principles for the Use of AI Tools in Education. A joint statement highlighted the consortium's viewpoint and five core principles:
The rise of generative artificial intelligence (AI) has the potential for a profound impact on the ways in which we teach, learn, assess, and access education. Our universities wish to ensure that generative AI tools can be used for the benefit of students and staff – enhancing teaching practices and student learning experiences, ensuring students develop skills for the future within an ethical framework, and enabling educators to benefit from efficiencies to develop innovative methods of teaching.
….
Universities will support students and staff to become AI-literate.
Staff should be equipped to support students to use generative AI tools effectively and appropriately in their learning experience.
Universities will adapt teaching and assessment to incorporate the ethical use of generative AI and support equal access.
Universities will ensure academic rigour and integrity is upheld.
Universities will work collaboratively to share best practice as the technology and its application in education evolves.
Education Grapples with Generative AI
The rise of ChatGPT has led to a variety of responses across educational institutions. Several, including Sciences Po, RV University, Washington University, and The University of Vermont, banned ChatGPT or classified generative AI use as a plagiarism offense earlier this year. Others, such as the State University of New York at Buffalo and Furman University, plan to add AI classes to their required curriculum.
A consortium of eight leading Australian universities likewise positioned itself as taking an aggressive stance against generative AI use in December. The Russell Group has taken a more deliberate approach, one that acknowledges the potential for misuse as well as the need for its students to be AI literate. Alongside warnings related to privacy, bias, inaccuracy, and plagiarism is a commitment to training.
1.2 Our universities will provide guidance and training to help students and staff understand how generative AI tools work, where they can add value and personalise learning, as well as their limitations. By increasing AI-literacy, our universities will equip students with the skills needed to use these tools appropriately throughout their studies and future careers, and ensure staff have the necessary skills and knowledge to deploy these tools to support student learning and adapt teaching pedagogies.
…
2.1 Our universities will develop resources and training opportunities, so that staff are able to provide students with clear guidance on how to use generative AI to support their learning, assignments, and research.
2.2 The appropriate uses of generative AI tools are likely to differ between academic disciplines and will be informed by policies and guidance from subject associations, therefore universities will encourage academic departments to apply institution-wide policies within their own context. Universities will also be encouraged to consider how these tools might be applied appropriately for different student groups or those with specific learning needs.
There are also comments about academic rigor, legitimate and illegitimate use, and an understanding that the technology will evolve and the principles may need to be adjusted over time. Too often, we have seen knee-jerk reactions either lauding generative AI for its capabilities or eviscerating it for a litany of flaws and ethical considerations. It is refreshing to see a group take a reasoned approach to generative AI that treats it as another tool in the education arsenal.
Rules vs Implementation
Granted, the implementation of principles can sometimes be at odds with the spirit of their development. University students and faculty must monitor whether the intent is fulfilled through years of implementation. However, a worse alternative would be flawed principles and uneven application. The Russell Group seems to have struck a reasonable balance with its initial statement.
The fact is that generative AI is here, and its applications are already widely used. Students barred from using these tools during their studies will lag behind their peers when they enter the job market.
Some people may recall when calculators were banned in schools, and later computers and spreadsheet applications faced similar restrictions. Eventually, these became viewed as valuable tools that were deeply integrated into educational curricula. Generative AI is about to follow the same path, just at a highly accelerated rate.
Let me know what you think about generative AI in education in the comments below. I’d enjoy hearing your perspective.