How did universities all over the world first react to the pandemic and the shutdown? What were our first responses to the crisis? What emerged as the first priority, as actually performed? And how does this form a baseline for understanding the present and future of Higher Education? We ran a study to ask these questions, and here we present our findings.
What is the real Question:
If one looked at the popular press and the outpourings on social media, it seemed as if the world had suddenly gone online. Some, very few, were prepared, since they already ran online courses. Most were not, and faculty were thrown in at the deep end. It would have been easy to assume that we were studying a shift to the online world. Indeed, we made the mistake of calling our study a quest for an index of online readiness. Thankfully, we were rescued by good methodology. As we set out to design our hypotheses, we heard a lot more from universities: they did not like the framing, because online readiness was not what they were about. They were right, and we realised that we were trying to understand their response as a whole, including online education but not limited to it. The shift to online was the visible action; the true test was one of responsive governance, of crisis readiness, and of the ability to hold to their core. This was how we found our study variables for the Index of Response Readiness.
What We Did:
We asked, and they answered. We asked as best as we could, in the early days of the lockdown. As with most other studies conducted during the lockdown, there are limitations, but the wonderful team at LeverageEdu kept at it, seeking the data we needed and rejecting data we desperately wanted but which would not hold up as good data. We crunched, we mulched, we analysed. We cut it up in different ways and asked it every possible question. And here we are, ready to share what we learnt.
This is a sneak preview of results. Ask us more about them, and why they matter.
Why was the Question important:
The pandemic brought us educators face to face with an unprecedented paradigm shift in Higher Education, not least because of the opportunities we now had to mitigate the loss of learning. Pandemics have plagued educational institutions and universities across centuries, but none has created the widespread impact that we see in the here and now.
For one, this is because universities are no longer ivory towers, isolated in their study of esoteric knowledge, but active participants in progress. They are agents of inclusion and of social and economic mobility, and they actively seek to bring more people along on the journey of progress. When the pandemic came, consequently, the impact fell on a larger proportion of people too: students, teachers, professors, administrators and staff. We all felt the hit in one way or another, but the critical question was: how did we, the Higher Education sector, react? What did we learn about ourselves from our first responses to the paradigm shift?
The question is important both to academics and to policy makers at the university and national level because it helps us identify our natural institutional responses and design interventions that support smooth processes in future disruptions. The one thing futurists agree upon is the pace of change and the frequency of disruption. Understanding ourselves and working out the reasons for our reactions is a first step: identify what happened, then compare it with what should have happened. Each university needs to complete this analysis in order to understand the gap between its response, the typical response, and its own standards of provision.
How it was done:
We decided to partner with the best in the trade to ask this question. LeverageEdu is a young and dynamic firm that helps students find their next education goals, and it engages with universities across the world to make this happen. They supported the operations and analysis of the survey across the world. The Global Survey of Response Readiness was conducted in the early days of the adjustment to the pandemic, when universities were under great pressure. The survey is therefore all the more authentic in its responses at that moment in time, but it is also limited by the very conditions that impeded excellence in university operations. It is, indeed, a valuable time capsule of what was prioritised and what fell by the wayside. In conducting the survey, we asked both students and universities similar questions and analysed the results along structural parameters. The results give us both a snapshot and an ongoing methodology for universities to benchmark themselves against their own standards and against their peers.
We identified six variables in the light of the first reactions to the pandemic. These six variables were distilled from the first exploratory discussions across the education world and were validated by a small team of experts as key to understanding the phenomenon of response to disruption. (One must note at this stage that there could not possibly be any established expertise in a global disruption at this scale, so the process could not be as rigorous as one would have desired. To maintain rigour, random sampling amongst experts not involved with the study was deemed a fair process.)
There are three drivers of managing change, and that was our starting point. Change is handled by managing Purpose, People and Process. Each of these drivers gave rise to two pillars.
The pillars of our investigation were Teaching and Learning, and Continuity (Purpose); Experience during the upheaval, and Support systems that were available or activated (People); and, Technology and Assessment (Process). We used these as the six variables of our survey based study.
We realised that during the crisis, the key goals of every institution were to maintain the three Cs: Community, Continuity and Care. Without these three, universities would have to struggle for survival during and after the pandemic; holding them close would mean holding on to the viability of the university. We tested each selected variable against the three Cs, reframed as criteria for variable selection, and all six passed the test. These were validated as the six pillars of our analysis.
For each of these pillars, we asked three types of questions, broadly along three vectors (Experience, Action and Information), explained in more detail in the full report. These were asked both of those who received learning and of those who provided it.
LeverageEdu’s stellar team disseminated the questionnaires, collected the responses and collated them. In addition to sorting the responses along the six variables, we did more: we identified structural factors and spliced the responses along these. The analysis was therefore three dimensional, running along six variables, two target groups, and four structural factors. The four factors are geography, funding, size and ranking of institution. This gave us rich and deep insights for benchmarking individual institutions. As more institutions feed into this database, the benchmarks grow stronger and more useful. For future research, this forms a critical baseline record of responses to the disruption.
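To make the slicing concrete, the aggregation described above can be sketched in a few lines of code. This is a minimal illustration only: the variable names, the scoring scale, and the cohort labels below are our own assumptions for the sketch, not the study's actual data model.

```python
from collections import defaultdict

# Hypothetical labels for the six variables and four structural factors;
# the real study's coding scheme may differ.
VARIABLES = ["teaching", "continuity", "experience",
             "support", "technology", "assessment"]
FACTORS = ["geography", "funding", "size", "ranking"]

def benchmark(responses, factor):
    """Mean score per (variable, cohort) pair for one structural factor."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for r in responses:
        key = (r["variable"], r[factor])
        sums[key] += r["score"]
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}

# Two mock responses from the same cohort, on an assumed 1-5 scale.
responses = [
    {"variable": "teaching", "geography": "EU", "funding": "public",
     "size": "large", "ranking": "top100", "score": 4},
    {"variable": "teaching", "geography": "EU", "funding": "public",
     "size": "large", "ranking": "top100", "score": 2},
]

print(benchmark(responses, "geography"))  # {('teaching', 'EU'): 3.0}
```

An individual institution's score on each variable can then be compared against the cohort mean returned by a function like this, which is the essence of the benchmarking described above.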
Some of our findings reinforced our cynicism, or at least reminded us that pragmatism rules, especially at times of sudden systemic shock. Other findings left us completely surprised. To read them all, do go to the full report; this is but a quick set of highlights, which is so much more fun, while the full report backs each finding with proper charts.
The most interesting finding was that even in the face of a deep disruption, most responses stayed in the land of immediate, visible and tangible action, even if that came at the cost of building support structures where they were inevitably needed. Almost all universities focused first on the delivery of teaching and on continuity of assessments. This was visible; it was an act of showing up. Thus comes our first conclusion: Performance mattered more than Practice.
Even knowing that the teaching and tech teams were stretched beyond capacity, we find that building support structures for digital pedagogy or instructional design was not a priority. Conclusion two is that Delivery mattered more than Support Structures. This was, of course, at the beginning of the crisis, within the first two months of the shift to online teaching for most universities across the world. It was interesting to note that strategic action was less inclined towards building institutional solutions for quality delivery. The first response was tactical: show up and man the front, keep the show on the road.
This was obviously going to lead to much anxiety on all fronts. While many excellent universities, not necessarily the biggest or the best ranked, did very well to adapt for inclusion, our third broad finding is that Continuity mattered more than Inclusion in the first wave of response. The circumstances of students, and indeed of educators, were not ignored, but they did not get first priority. In the decision making sequence, we can infer that supply side continuity was secured first, while the demand (learner) side ability to join sessions was a later action. From a management and governance perspective, this is worthy of reflection. Managers are supposed to have a bias towards action, securing the ship in a storm. The reaction was indicative of that paradigm, which is not necessarily relevant to the current context. The governance response, ideally, would have been to include the whole learning community: ensure safety, connect, and then scale up continuity of delivery. Some, but very few, universities did so.
There is much to reflect upon in these results. One, the risk aversion of universities harks back to different times, indeed to the ivory tower. Modern universities are outreach agencies for advancement, and our crisis responses must reflect this; the survey indicates that there is work ahead. Two, the focus on short term wins such as delivery indicates a more business oriented view than universities are willing to admit. Bastions of security, or institutions self confident in their purpose, would have worried less about mere presence and more about the solidity and quality of that presence. While ensuring lessons continued, there should have been more and better early action on digital design, on working at the cusp of technology and pedagogy, and on strengthening the foundations of remote learning. Three, in the rush to demonstrate their value, universities showed that they do realise that students come to them for exams and certification, not for core learning: assessments were clearly and consistently a higher priority than the quality of learning.
How we behave in a crisis shows our character. Across the world, many universities found that they do not behave in line with their projected values, while others found a core of care and outreach well ahead of their global positioning.
Other findings were quite interesting too. For example, universities ranked in the two hundreds and three hundreds seem to do much better in a crisis, and it would be interesting to study this further. Either they were not so hard on themselves, or their students did not expect so much from them, or there was a genuine connection here that emerged from the lack of pressure on rankings. This deserves both validating replication research and further investigation into the relationship between universities and students.
The issues were highlighted by the crisis, but the crisis only reflects institutional instincts and response mechanisms. Our study gives universities a chance to reflect on how their institutional responses resonate with their values and goals. This is why each institution is enabled to find its own score and to map it to its cohort, be it geography, rank or another factor. It is a tool both for reflection and for benchmarking. In our study, we make only the benchmarks public, not the individual university scores.
This is the first and, we hope, the only time that we will undergo such a shock. Shocks are opportunities to find our weak spots, and our strengths. We hope that this study enables that process in universities. The study can be replicated, possibly with greater rigour, since the first study was conducted while simultaneously dealing with the shock; yet the results of this one are insightful precisely because of that timing. The principle of regularly investigating response readiness applies at all times, though, and we look forward to growing and refining the benchmarking to enable just that.
Where are the exciting Charts and Document?
You can read more about it in the PDF uploaded here. This is a pre-release document and discussions are invited.
There is a lot more in the actual study. This is a preview for those who follow my work.
The document will be released in a couple of days on the edquirer.org site. Do keep an eye out for it, and for other exciting insights into interesting education questions.