We’ve been talking with stakeholders across the sector about tools to support assessment and feedback for some months now. There’s no shortage of cool stuff out there (much of it increasingly powered by AI) and I’m nervous that there is an expectation in some quarters that our research will produce a ‘top 10’ list to pick from. Spoiler: it won’t.
If anything, these conversations are broadening the scope of the project and making it clear that any discussions about assessment practice, and the tools to support it, need to be situated in a much broader context of future directions for learning and the kind of ecosystem we need to support them. These are just a few personal reflections on what we are seeing so far [deliberately provocative and sufficiently badly presented so you can be sure I created them].
Ecosystems, lifecycles and learner journeys
The models, frameworks, diagrams and process maps we use to talk about this area are almost universally organisationally focused [please someone prove me wrong and share alternatives].
It’s too easy with these models to lose sight of this as a learner journey and forget what that learner is already carrying with them on the journey. Too often we treat the learner as a blank sheet of paper for learning to be written on. Distinctions such as traditional/non-traditional learners aren’t much more helpful.
Each learner is bringing with them their lived experience to date and the richness of that diversity is something we should value and build on in the design of learning and assessment activities.
Authentic assessment shouldn’t be only about the context to which learning is applied; it is equally about how individual experience contextualises the learning.
How do we create a representation of the assessment life-cycle and its place in the ecosystem that allows for learner individuality?
How many academics, procurement officers, IT staff and change management consultants does it take to change a lightbulb?
It’s not easy to take an ecosystem approach in an organisational context where it’s really difficult to change anything.
Delivering a rich and adaptable plug-and-play learning ecosystem is technically possible, but there are a host of organisational and cultural reasons why the education sector finds it difficult to take advantage of the possibilities.
We end up in a situation where core information systems become increasingly monolithic, duplicate systems are needed to meet some important requirements, and other things go on entirely under the radar, posing all manner of risk.
We know what interoperability means in principle yet we fail to demand it of our edtech suppliers. We allow suppliers to claim that they meet open standards whilst still charging for bespoke interfaces that deliver a poor user experience.
As a sector, our level of IT procurement maturity is low. We tend to try to minimise risk by specifying functionality requirements to the nth degree, thus locking ourselves into ways of doing things that will be outdated by the time our implementation is complete and radically constraining which of the available market options we can consider.
Curriculum design and analytics
I struggle to remember how long it is since we first talked about the possibility that a continuous flow of information about learning activity would make ‘stop and test’ a thing of the past. Over-assessment remains a significant issue for both staff and students.
A range of excellent learning design tools have helped build curriculum design capacity across the sector, but assessment design remains a weakness. Staff lack the skills and confidence to design differently, and issues of assessment bunching and of over- or under-assessing certain learning outcomes within a programme of study remain.
Whilst we have used the tracking capabilities of digital tools to measure individual learner activity and engagement for predictive purposes, we are not currently using data to improve curriculum design. This is an area where Jisc is currently exploring possibilities. What if, rather than looking for patterns in a sea of data, we could define upfront what data will help us understand if our learning designs are working as intended? What if curriculum analytics could help us design learning experiences that are relevant, engaging and proven to accelerate learning?
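To make that idea concrete, here is a minimal, entirely hypothetical sketch of what ‘defining the data upfront’ could look like: a toy Python check that flags assessment bunching and learning outcomes that are over- or under-assessed across a programme. The data structure, figures and thresholds are illustrative assumptions, not a description of any Jisc tool.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Assessment:
    module: str
    week: int                  # teaching week in which the assessment falls
    outcomes: tuple[str, ...]  # programme learning outcomes it assesses

# Illustrative programme data (entirely made up).
assessments = [
    Assessment("Module A", week=11, outcomes=("LO1", "LO2")),
    Assessment("Module B", week=11, outcomes=("LO1",)),
    Assessment("Module C", week=12, outcomes=("LO1", "LO3")),
    Assessment("Module D", week=24, outcomes=("LO2",)),
]
programme_outcomes = {"LO1", "LO2", "LO3", "LO4"}

# 1. Bunching: more than two assessments landing in the same teaching week.
per_week = Counter(a.week for a in assessments)
bunched = {week: n for week, n in per_week.items() if n > 2}

# 2. Coverage: how often each programme outcome is actually assessed.
coverage = Counter(lo for a in assessments for lo in a.outcomes)
over_assessed = [lo for lo, n in coverage.items() if n > 2]
never_assessed = [lo for lo in programme_outcomes if coverage[lo] == 0]

print("Bunched weeks:", bunched)
print("Over-assessed outcomes:", over_assessed)   # LO1 in this made-up data
print("Never assessed:", never_assessed)          # LO4 in this made-up data
```

The point is not the code but the principle: if the design intent (spread of assessment, outcome coverage) is captured in a structured form at design time, checking whether the design is working as intended becomes straightforward.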
Generative Artificial Intelligence (AI)
I really think we are sweating the small stuff here!
It is good to see that the dialogue is progressing beyond initial knee-jerk reactions. People are coming up with lots of well-reasoned arguments about why we need to prepare our learners for a world where this technology is ubiquitous, and about how it could be used to help redesign assessment in ways that are authentic, creative and able to support the kind of critical reasoning that constitutes information literacy in this new scenario.
There are a host of ideas on how generative AI can be used to save staff time on repetitive tasks, create better and more extensive question banks etc. [I confess to a certain amount of scepticism about some of the time-saving on things like lesson and assessment planning – generating and comparing a lot of options could give you a better end product but is it really quicker?]
Something we’ve barely scraped the surface of yet is how AI could be used to turn academic descriptions of curriculum into descriptions of skills and competencies that are meaningful in wider contexts. The idea that this could give an impetus to ongoing recognition of achievement/micro-credentials is exciting and of obvious benefit to learners.
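As a purely illustrative sketch (not a description of any existing tool or Jisc work), the mechanics could be as simple as prompting a model with the curriculum text and a target competency framework. In the Python below, `call_llm` is a hypothetical stand-in for whichever generative AI service an institution actually uses, and the module text and framework name are invented examples.

```python
# Hypothetical sketch: turning an academic module description into
# competency statements. `call_llm` is a placeholder, not a real API.

def call_llm(prompt: str) -> str:
    """Stand-in for a call to whatever generative AI service is in use."""
    raise NotImplementedError("wire up your institution's chosen model here")

def describe_competencies(module_description: str, framework: str) -> str:
    prompt = (
        "Rewrite the following module description as a short list of "
        f"skills and competencies, using the language of the {framework} "
        "framework, so that it is meaningful to employers and learners "
        "outside the institution. Keep each competency to one sentence.\n\n"
        f"Module description:\n{module_description}"
    )
    return call_llm(prompt)

# Example call (made-up module text; the framework name is just an example):
# print(describe_competencies(
#     "Students critically evaluate research methods, design a small "
#     "empirical study and present findings to a mixed audience.",
#     framework="SFIA",
# ))
```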
However, the debate around ethics has shifted too quickly to focus on academic misconduct/acknowledgement of sources etc. This has to be missing the point when we don’t actually know what the underlying sources are.
We are worried about what our students might do with the technology, yet we are not asking the really big questions of the edtech suppliers. What about the carbon footprint of AI, what about the low-paid human labour that goes into training AI, what about the data sources and algorithms? The big edtech suppliers need to be prepared to give us some answers to these questions. What is equally worrying is the proliferation of start-ups who are focused on getting their neat idea to market faster than anyone else and who may, through naivety and/or lack of resources, cut corners on due diligence.
Recognising achievement
Rethinking assessment practice is a vital part of making higher education fit for the modern age and part of that concerns how and when we recognise achievement.
Programmes of study should deliver a value that is more than simply the sum of the component parts, but that doesn’t mean those component parts are not worthy of recognition. It is increasingly difficult for learners to devote years at a stretch to gaining a qualification. Learners often have to interrupt their studies, and too often that is equated with failure. Offering recognition for what has been achieved can motivate the learner (and those supporting them) towards future completion.
We can even question why assessment has to take place to a rigid timetable in such a significant majority of cases. What if there was greater flexibility around learning at your own pace and being assessed when you feel ready?
How far are we from a scenario where learners have a rich record of the skills and competencies they are gaining all along their learning journey, in a form that is digitally verifiable, transferable and stackable?
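For a sense of what ‘digitally verifiable, transferable and stackable’ can mean in practice, here is a deliberately simplified sketch, loosely modelled on the W3C Verifiable Credentials / Open Badges approach. All field values are invented, and real credentials carry considerably more, including the cryptographic proof that makes verification possible.

```python
# Simplified, illustrative record of a single achievement, loosely modelled
# on the W3C Verifiable Credentials data model (all values invented).
achievement_credential = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential"],
    "issuer": "https://example-university.ac.uk",
    "validFrom": "2023-09-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:learner-1234",
        "achievement": {
            "name": "Critical evaluation of research methods",
            "description": "Designed and reported a small empirical study.",
            "creditsEarned": 10,
        },
    },
    # A real credential also includes a cryptographic proof block, which is
    # what makes it independently verifiable and safely transferable.
}
```

Stackability then becomes a matter of the learner holding many such records, from different issuers, and combining them into something bigger over time.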
Join the discussion
This work is raising some big topics and all of them are being addressed in various ways by Jisc directly and in collaboration with a range of stakeholder groups.
I pulled out this particular group of topics because they are all things that we will be following up in a forthcoming event with colleagues from across Europe.
1EdTech Europe: Evolving the Digital Learning Ecosystem Across Borders will take place at the University of Nottingham on 14-15 September.
Jisc will be leading interactive sessions on Assessment and Feedback and Curriculum Analytics, joining a panel discussion on the learner journey in the age of AI and engaging in sessions on all of the other topics raised in this blog post.
If you have good practice related to the above topics to share, we may be able to squeeze in a few more contributions, so get in touch: gill@aspire-edu.org
Find out more and join us:
https://web.cvent.com/event/ca43d697-f0cf-47d6-aafd-9e6219970f13/summary