This content was originally published on The Resilience Shift website. The Resilience Shift, a 5-year programme supported by Lloyd’s Register Foundation and hosted by Arup, transitioned at the end of 2021 to become Resilience Rising. You can read more about The Resilience Shift’s journey and the transition to Resilience Rising here.
Last week resilience experts from all over the globe convened in Freiburg, Germany, for the 2nd annual summit of the Global Resilience Research Network (GRRN). Hosted by Fraunhofer EMI, the summit followed last year’s inaugural event in Boston. Savina Carluccio, project lead for our work on tools and approaches, represented the Resilience Shift. In this post, Savina provides some of her insights from the summit.
The GRRN is a global platform for collaborative research, focused on informing and advancing resilience. Its mission is to transfer innovative resilience concepts, solutions and applications from labs and campuses to real-world environments. The GRRN currently encompasses 26 partners, including leading academic and research institutions from over 14 nations. The summit in Freiburg focused on “Baking Resilience In”, considering novel solutions, technologies and applications.
What is resilience?
In his keynote, Stefan Hiermaier, Director of Fraunhofer EMI, described resilience as a means of preserving functionality, ensuring graceful degradation, and enabling fast recovery of availability following disruption.
Brian Walker, Research Fellow at CSIRO in Australia, suggested that there are three elements of resilience: (1) thresholds – resilience of what, to what (specified resilience); (2) general resilience (adaptability); and (3) transformability. He argued that if a system shifts to a bad state, the only option is transformation. Resilience and transformation are therefore not opposites but complementary; however, maintaining resilience at one scale can require transformational changes at other scales.
Overall, resilience was recognised as a multi-dimensional and highly complex concept. Drivers of this complexity include interconnectedness, heterogeneity, and decentralised decision-making. “Systems are so interconnected that nobody really understands them”, said Lauren Alexander Augustine, Executive Director of the Gulf Research Program at the National Academy of Sciences. Lauren argued for more research in systems engineering, leading to a much deeper understanding of complex systems and, importantly, a much clearer translation of what this means in practice.
Professor Jon Coaffee, from the University of Warwick, argued that a shift from resilience as an outcome to resilience as a process (i.e. process vs product) is required. This, he said, would allow innovation and adaptive capacity to become mainstream in governance processes, thereby shifting the status quo.
Lauren summarised by saying that there is “no single way to do resilience, no recipe, no cookie cutter, no checklist”. Her main point was that resilience is unique to communities, making it “place-based”. There is no single definition of resilience, and communities therefore have an opportunity to contextualise what resilience means for them.
Quantifying resilience
Stefan advocated a move towards quantitative resilience analysis, stressing that this should be supported by robust data that enables meaningful analysis and simulations. Prediction, however, is only one thing data can offer; data can also help understanding.
Lauren argued that it’s not just about sharing data and information, but also about reflecting on lessons learnt, best practices, and what has been done differently. There’s simply no “silver bullet” approach.
Hans Heinimann, Director at the Singapore-ETH Centre, also warned against expecting too much from resilience-based models. Complex analysis, he argued, should only be undertaken with a good understanding of the system – aligning with Lauren’s call for more research into systems complexity.
Our tools and approaches work found that there are very few tools that can model, measure and quantify resilience outcomes. Models and metrics help make the case for resilience, but co-benefits are often not quantifiable.
The group recognised that we are currently in the age of “big data”, which can press us into thinking that data represents the “solution”. Although there can be lots of data, it is typically temporally “thin” – covering a short period of time or offering few samples. There is simply not enough to allow us to predict resilience challenges over significant timescales.
The context of the resilience challenge was considered important, particularly the social and political context. The big question, however, is how to incorporate power and politics into resilience frameworks and assessments. Resilience expertise was therefore considered extremely important.
Risk vs Resilience
John Vargo, Executive Director of Resilient Organisations, stated that established methods such as risk assessment are good for the known but not for the unknown. Hans commented that resilience is a strategy for dealing with unexpectedness and complexity, whereas risk management can be used for everything else.
However, John argued against the prevailing view of risk as something to be avoided, mitigated or passed on. We need a parallel way of thinking, where risk is embraced and organisations learn from their mistakes. Accelerating the process of embracing risk and learning from it is the only way we can keep up with change. Another delegate added that the cost of a risk should not be the only determinant; longer-term and secondary effects should also be considered.
It was recognised that if risks don’t materialise, communities become apathetic and switch off to the potential threat. Resilience therefore needs to take a threat-agnostic approach.
Collaboration
Everyone agreed that collaboration is key to ensuring resilience is “baked in”. Hans Heinimann argued that the interaction between scientists and practitioners, following a process that runs from problem to solution, needs to be developed and understood. Stephen Flynn, Founding Director of the Global Resilience Institute at Northeastern University, said that collaboration is key to bringing research into practice sooner, and that practitioners should be brought into the conversation at an early stage.
Brian Walker asked what is missing in resilience theory and practice, arguing that it is the roles of power and politics. In terms of power, for example, 10% of the world’s corporations generate 80% of all profits globally.
It isn’t just practitioners who need to be involved: attendees also recognised the need to get the resilience message out to the general public, with Lauren adding that building community resilience requires the participation of diverse local stakeholders.
Accelerating theory to practice
Stephen argued that the current implementation cycle is too long: moving from basic research to the publication of a standard and adoption into “business as usual” can take in excess of 20–30 years. Lauren, however, stated that the role of science is important in building a more complete picture and elevating the profile of resilience.
There is therefore an urgent need to accelerate the implementation and scaling-up of applied solutions – something the Resilience Shift is focusing on as it helps to transfer theory into practice.