Train for Certainty, Educate for Uncertainty – but how? – Helga Nowotny

[Image: Helga Nowotny - CACE 2023]

What role can education play in times of increasing uncertainty? This question was the topic of Helga Nowotny's keynote speech at the fourth CACE ("Crossroads in Academic Continuing Education") event. Nowotny is an internationally recognized scholar of science and technology studies and professor emerita at ETH Zurich. She co-founded the European Research Council and served as its president from 2010 to 2013.

Nowotny described the current situation as one in which more and more of the certainties we were trained for no longer hold. The reasons for this range from geopolitical tensions to the emergence of new technologies such as artificial intelligence (AI). We are increasingly confronted with blurred realities, she said, because AI can easily generate new images and texts, and much of this content is not clearly recognizable as artificially created.

It is more important than ever to train for uncertainty, said the speaker. She distinguished uncertainty from the concept of risk: under uncertainty, the probability that particular phenomena or processes will occur can hardly be estimated, whereas with a risk the consequences can be assessed. Nowotny gave the example of the pandemic, which epidemiologists had warned about for years; it just wasn't clear when it would occur.

The longing to predict the future is deeply rooted in us humans. The means of doing so have changed over time; today we produce studies and reports, and predictive algorithms have become a very powerful tool for this purpose. However, trusting this technology carries risks, as the expert pointed out.

For example, it must be kept in mind that predictive algorithms calculate probabilities solely from data about the past. They cannot foresee genuinely surprising developments, and they deliver probabilities rather than certainties.
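As a minimal illustration of this point (not part of Nowotny's talk; the data and the choice of model are purely hypothetical), the following Python sketch fits a simple classifier on "past" data: its answer for any new case is a probability, never a certainty, and for a case far outside everything it has seen it can only extrapolate old patterns.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical "past data": each row is an observation, the label marks
# whether an event of interest followed.
rng = np.random.default_rng(0)
X_past = rng.normal(size=(200, 2))
y_past = (X_past[:, 0] + X_past[:, 1] > 0).astype(int)

model = LogisticRegression().fit(X_past, y_past)

# For a new but familiar-looking case the model returns a probability,
# not a certain statement.
new_case = np.array([[0.3, -0.1]])
print(model.predict_proba(new_case))

# For a case far outside the training data, the output is still just an
# extrapolation of past patterns; a genuinely surprising development is
# not representable in the model at all.
surprise = np.array([[50.0, -50.0]])
print(model.predict_proba(surprise))
```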

Another danger is that such forecasts can act like self-fulfilling prophecies. Nowotny cited Black Monday in the USA in the last century as a prominent example. If AI tools suggest certainty about future developments, we lose our ability to act and to shape events, which makes it all the more likely that the predictions come true. In contrast to earlier times, Nowotny explained, the concept of the future is nowadays fraught with fear, particularly among the younger generation, for example because of climate change.

In the university context, ways are needed to integrate artificial intelligence into research and teaching. For lifelong learning, Nowotny believes this means that continuing education must support the co-evolution of humans and digital machines.   
