Hello, welcome everybody, and thank you for attending this event about boosting development effectiveness with adaptive management, organized by the IDB's Office of Evaluation and Oversight. Before we start, a little bit on logistics: you can write your questions in the Q&A chat, and we have live interpretation into Spanish and Portuguese; you can use the globe icon to find the interpretation of your preference. I want to pass the floor to Marialisa M., our OVE Director, who will give some welcoming and introductory remarks. Welcome, Marialisa.

Thank you, Claudia, many thanks. It is my pleasure to be here and have the opportunity to give some introductory remarks to this very important session. A special welcome and thank you to the panelists, and also to all colleagues who are joining us from various regions.

Achieving development outcomes, lifting out of poverty the more than 700 million people who now live on about $2 a day in the world, and reducing income inequalities within countries and across countries, just to mention two of the Sustainable Development Goals that the development community has committed to achieve by 2030, is a very ambitious and highly complex endeavor. It requires, among other things, that multilateral development banks have a clear vision and a clear strategy, raise more capital from the private sector and leverage it much more than they are currently doing, and also that they have great implementation capacity. Now, underlying all of this, the vision, the strategy, establishing partnerships, and, very importantly, implementing activities well, is the ability to continuously learn and quickly adjust the course of action, which is what adaptive management is about.

Now, I have experienced firsthand the importance of adaptive management, learning from my own failures and some successes. When I first started working in development, more than 20 years ago, I designed a project for a country in Latin America that had as a key objective that of improving
competition in the country. That means reducing barriers so that private companies can come into various sectors, produce more products, and prices go down; consumers benefit, including, and sometimes particularly, low-income people. And what happened four months after project approval is that the country already had a best-practice antitrust law in Parliament and the embryo of a competition policy agency ready to start working. That was a great success; it was very quick. And indeed at closure, seven or eight years after, the project was evaluated and the competition part was rated successful.

Now, shortly after completing the design of the project, I moved to another region, as it happens, and I started working somewhere else. I returned to Latin America after 10 years, and I returned to realize that the level of competition in the country where I had designed the project, which had been implemented over the past seven or eight years, was very low. It was actually one of the worst in Latin America; if I remember correctly, it was second to last. And we could measure this well thanks to new indicators that the OECD had developed, the product market regulation indicators, which we jointly started to collect with the OECD in Latin America and also in the Middle East. So what we realized, based on this data, is that several sectors in that country had barriers to entry, fewer products were available compared to neighboring countries, and prices were much higher, including prices of things such as fertilizer, for example, which are very important for farmers, including micro farmers. So something had gone wrong. What I thought was very successful, I realized 10 years after, actually didn't work. And this is my example of failure of adaptive management.

As I look back, the failure was due to several factors, including the fact that at the time of design we established objectives and indicators that were focused on intermediary outcomes, so the passing of the law, the creation of the
agency, rather than impact. The lack of impact data when we designed the project was another issue, but also the fact that when the data became available, after four or five years, they were not reflected in the project. And then also the lack of institutional incentives: as I moved to another region, I didn't have much focus on what was happening there. It is only when I returned, 10 years after, that I started looking at what happened and what went wrong. Granted, we did realize, 10 years after, that we had to change our approach to competition, and we did. We focused much more, starting at that point, on reducing barriers to entering sectors, many of them simply administrative barriers, rather than on passing laws and strengthening the agency. So at that point things started to change, and we applied that approach not only in that country but more broadly in the region, and in other regions as well. So we did learn and we did adapt, and we saw positive changes, but after more than a decade.

Now, I also had the good fortune of experiencing some successes and the positive consequences of adaptive management, particularly when I restructured projects. That means changing the objectives and project indicators, and reallocating project money from one activity to another when it becomes clear that where you're putting money is not likely to lead to outcomes, particularly, again, if new evidence becomes available, like new data, as happened on competition in that country. Being able to do all of this while the project is under implementation, changing indicators and objectives, moving money across, makes a difference in achieving meaningful development outcomes in a few months or a few years rather than more than a decade. And if you think about our commitments, lifting 700 million people out of extreme poverty, 28 million people just in Latin America and the Caribbean, by 2030, that's in six years. We don't have 10 years to learn lessons and adapt; we need to act much
more quickly. And evaluation has a key role to play in helping management be relevant and quick on its feet: focus on development outcomes, change courses of action based on new evidence, and change monitoring and evaluation systems when needed, as they also need to be adaptive. And this is what today's panel will discuss.

Now it's my pleasure and my honor to give the floor back to Claudia and Mariana, Claudia Figueroa and Mariana Gamar. Claudia is an evaluation specialist in OVE, and she has worked on several evaluations: country, sector, and corporate. And Mariana is a research fellow who is now with us and has experience both in evaluation and in policy, with the OECD and UNESCO. I have to apologize: I won't be able to stay for the panel, but I wish you all the best on what seems to be a very rich discussion and great presentations, and I will make sure I ask Claudia and Mariana for the key insights that come out of this session, because I really think it's important that we continue to focus on this, that we learn and take it forward, and that we implement whatever lessons come out of knowledge panels and exchanges of knowledge such as this one. Thanks to all.

Thank you, Marialisa, for sharing those very concrete examples that remind us that ultimately what matters is whether we are helping our counterparts achieve meaningful outcomes on the ground, and that our procedures, tools, and metrics should go in that direction. Let me introduce now the distinguished panel of experts that is joining us today, who bring diverse expertise and offer insights from research, implementation, and innovative evaluation to enhance outcome orientation.

Dr. Estelle Raimondo is Head of Methods at the World Bank's Independent Evaluation Group. She is an internationally recognized expert on evaluation methodologies, has advised governments and international organizations on evaluation systems, and has published extensively on the topic. Welcome, Estelle. Leni Wild is Deputy Director at Global Partners Governance,
leading on research, M&E, and strategic communications. She has over 20 years of experience, is recognized in political analysis, and is a leader in adaptive management, among many other topics. Welcome, Leni. And Søren Vester works at UNDP, focusing on transnational corruption, financial integrity, and innovation. Søren has also led UNDP's M&E Sandbox, rethinking monitoring, evaluation, and impact measurement for contemporary challenges. Welcome, Søren.

Today we will discuss adaptive management: what it is, why it is relevant, how it can be done rigorously, and the implications for monitoring and evaluation and beyond. This is an important topic, as development interventions typically occur in complex and dynamic environments that require proactive approaches to maintain a real focus on results. So let's start with our first question: what is adaptive management, and why is it relevant to achieving results? Leni, can you give us an overview of what this concept means?

Thank you, Claudia, and thank you very much for the invite to participate today; it's great to be part of such an excellent panel, and I'm looking forward to our discussion. I should say I am a researcher, an analyst. I've worked for think tanks, and I currently work for a small organization that supports capacity building for political institutions, so I'm not from a funding organization or a lending institution; a different background. But what I would say is, I think the first thing we have to recognize when thinking about adaptive management is that it is not new. I'd say, arguably, it's been around for as long as development practice and thinking has been around: in the 1960s people like Hirschman were talking about it, and in the late '80s and early 1990s there were calls for more structured flexibility in development. And I think consistent evaluations and research have shown over many decades that development projects too often do not learn from their experiences and do not change based on what is learned, and Marialisa gave
us an excellent example from her experience of that as well. Adaptive management also borrows from experience in lots of other fields: business management and practices, military strategy, the natural sciences. It's an amalgamation of principles that have emerged from lots of different types of large organizations trying to deliver difficult things. There are lots of different definitions out there, but the one that I quite like is the definition used by USAID, which describes adaptive management as an intentional approach to making decisions and adjustments in response to new information and changes in context. I like the fact that it talks about being intentional about learning and adjusting, but also that you have your goal in mind, and what you're adapting is your understanding of how to get there. It's not necessarily changing the end outcomes; it's learning about how to achieve them.

And I think adaptive management is an approach that inherently recognizes the complexity that exists in almost all change processes, which is ultimately what development is about: supporting changes in systems, in behaviors, and ultimately in outcomes, some of which, again, Marialisa talked about in terms of the big development goals that exist. None of these things can we easily predict how to achieve in advance, so approaches based on intentional testing and learning are much more appropriate when you're dealing with complexity, uncertainty, behavior change, and lots of different actors and organizations that need to come together to support an outcome. So I think it can, and should, offer a better approach for substantively realizing results. I think it's interesting that it's often seen, and I often hear criticisms, that it's a more risky proposition. But if we're thinking about decisions on how to lend or how to spend limited resources, I think we need to be
really open that blueprint planning, or approaches that assume a certain level of certainty in terms of how things can be planned, delivered, and achieved up front, is not low risk, and actually has contributed, and regularly contributes, to a lot of costly failures. So, for me, adaptive management is not about figuring it out as you go. It means starting with some initial hypotheses about how you think something is going to change or a reform will be achieved; testing those; revising them as you go, using the best available information that you have at the time; having those regular feedback loops; and actually acting on them as part of implementation. And I think it's also helpful because it has to be very context specific: it means you have to really figure out what is going to work in a particular context, environment, and situation, and why, through that testing and iteration process.

So I guess, for me, the ideas of adaptive management now seem to be fairly well established. A whole range of different types of organizations in development, whether they're funders, lenders, or implementing partners, increasingly are using the rhetoric or have policy commitments around adaptive management. I think the problem we have now is the reality of big gaps in actual implementation: are we doing what we said we were going to do here? The reality is that lots of large bureaucracies of different kinds still have very low tolerance for experimentation, for really putting learning at the center. So I think there are real questions not only about what adaptive management is, but about why we aren't doing it when we say we should. Maybe I'll pause there.

Thank you, Leni, for this very clear explanation that helps us understand that this concept is not a high-level theoretical idea but a very concrete, practical approach: one that is context specific, that has been around for a long time, and that requires being intentional about testing and learning to make decisions.
Now, Estelle, what does adaptive management look like at the level of country engagements at the World Bank Group?

Thank you, Claudia, for this question, and thanks to Leni for giving us a really nice framework to work from. Maybe a couple of things at the outset, so that our participants understand on what basis I will be talking. In 2020 we led an evaluation of the Bank Group's outcome orientation at the country level. The idea was really to try to understand how the Bank Group as a whole, with a very country-centric model, aims for outcomes: how it captures those outcomes, how it measures them, how it understands and learns, and how it manages for outcomes. This last part was very much about understanding whether teams are practicing adaptive management and, above all, whether the systems that we have in place, these results-based systems, are enabling or hindering them in doing so. So that's the framework within which I will talk today.

As Leni highlighted, of course, adaptive management is not new. There has been quite a lot of "science of delivery" work on understanding how things work, but mostly at the project level or within a particular intervention. There hasn't been as much reflection on what it means to be adaptive at the country level: how do you manage a portfolio of interventions, how do you manage a country engagement adaptively, how do you learn from the lending, but also, since many of our institutions do a lot of advisory work, training, capacity building, and institutional development, how does the sum of the parts work together, and how do you adapt that in context? So the evaluation tried to be steeped in the literature, but also to adapt some of these concepts and understand what decisions are being made around the country program and country portfolio, and what the opportunities are for being adaptive. So, just to put
things quite concretely, the types of decisions that country teams have to deal with are about establishing or managing a pipeline of operations; approving or not approving new interventions, new projects; how to restructure within a sector or within one intervention; how to determine whether government reforms are mature enough to go, for instance, for policy lending, policy financing; deciding which pieces or portfolio of analytical work are needed to support operations and lending within sectors; and then, sometimes, reallocating resources across different parts of the portfolio. So these are the kinds of decisions that also need to be informed by a pursuit of outcomes and a deeper understanding of a country program approach.

So what did we find in this evaluation on the practice of adaptive management in the World Bank Group, at this level? We found that country teams very much practice several aspects of adaptive management, but not all. They tend to very closely monitor the health of their portfolios, focusing on some clear metrics, you know, disbursements, trying to really work through the delivery issues that are happening, and also navigating changes in context, including changes in political economy. So there is a real systematic effort on the delivery side, driven by disbursements, by delivery issues. At the same time, it was very clear from the evaluation, and from a lot of feedback we got from country teams, that they recognized this was not sufficient; it was not the whole set of adaptive management decisions that they needed to make. And often it was difficult to actually adapt based on new evidence of what works, what doesn't work, and why, and on testing hypotheses, as Leni laid out at the beginning. So there was a disproportionate emphasis on these metrics of disbursements and output delivery,
and not sufficient emphasis on understanding whether progress was being made along complex results chains. So the major second takeaway was that these adaptive behaviors were not supported by the results systems that were in place at the Bank Group. I can go into these limitations a little later, but I guess the big conclusion is that these results systems had been set up, as Leni said, many years ago: at the project level in the '60s, at the country level in the '90s. And they have had their path dependency, so it has actually been extremely difficult to change these systems, despite multiple diagnostics showing that they were not fully supporting adaptive management and were sometimes actually coming in the way of it, being an obstacle to these adaptive management practices. Let me stop here for now, and we can elaborate a little later.

Thank you very much, Estelle, for these very concrete examples of what adaptive management looks like at the country level and the types of decisions that implies, in terms of managing the portfolio or deciding on approvals. And the issue you present, that results systems should help but in the end are kind of an obstacle for adaptive management, is something very interesting, and we will come back to that. So, Søren, what can you tell us in terms of the contexts in which this adaptive management approach makes sense?

Yeah, thank you, Claudia, and such a pleasure to be on this panel and discuss these important issues. Also, a really great point was raised just now about adaptation at a portfolio level: if we have constellations of projects that are synergistic, and we adapt one, that is relevant for the others. I think that's maybe a different discussion, but a really interesting point. In terms of when it makes sense to use this approach, first of all, as I think
Leni was also saying, this is course correction, and it's not new. And I think course correction should be part of any good project management, regardless of what you do, right? You need to be able to adapt: if you're building a house and your contractor is arrested for fraud, then maybe you need to change contractors. But I think generally it makes sense, at least in my perspective, to use adaptive management especially if you deal with complex problems. We can make a distinction between simple, complicated, complex, or chaotic problems. If we're dealing with a fairly simple project or problem, like building a house, or maybe even building a hospital, that might be a complicated thing; but transforming a health system is a complex challenge. And the more complexity we deal with, the more important it is that we're able to continuously learn and adapt. Similarly, I think the context or environment that we work in is really important. If there's a high degree of uncertainty in the operating environment, let's say we're working in a crisis context, whether Sudan or Haiti or Ukraine or elsewhere, that requires everyone to be much more agile and able to adapt as unpredictable things happen. So I think under these circumstances this is a particularly important approach to be practicing.

I think there are also some other things that affect when it makes sense to do it, or at least whether we're able to practice this type of approach; we can talk about the wider enabling environment for this. There are certain types of skills that are really important to have in a program, project, or organization, or whoever is deploying the adaptive management approach: we need to be able to continuously learn, so skills like sensemaking and so on are important. Certain mindsets too, so that people are open to actually learning and changing their minds; having a growth mindset is important, as is
developing and fostering that culture of learning. Having certain program governance processes and systems that allow for this matters too; we just heard from Estelle that sometimes our existing systems don't necessarily allow for it, and might even disincentivize it. So do our IT systems or finance systems allow us to move funds from one activity to another? Those types of things are really important to make sure are in place. Similarly, flexibility on the part of our partners and funders: is there willingness on behalf of a funder, or whoever we are accountable to, to actually allow for adaptation, to maybe make changes to a logframe, or maybe not have a logframe at all? And do we have the authority to adapt and make the decisions that need to be made? Who has that type of authority, and how do we make sure they are given the information they need to make decisions and adapt? So those are some of the extra things I would bring into the equation when we talk about when it makes sense to apply this approach.

Thank you very much, Søren, for explaining that while this approach can be very useful, it is also not the panacea or the answer to everything, and that we need to identify those cases that require this approach, especially, as you were saying, cases of complex interventions or cases where there is a lot of uncertainty, and to have in mind as well the conditions from the funders, whether this flexibility is appropriate or allowed. Now that we know better the what, why, and when of adaptive management, let's move to a discussion that I think is even harder, about the how. How can adaptive management be done, and specifically, how can we do this rigorously? I'll start with Leni again, but I promise for the next questions we will switch. Leni, can you explain what rigor would be in this context?

Thank you. Yes, I think, I mean, there are
the common concerns, as I mentioned, that I often hear about adaptive management: is it just sort of making things up as you go along, not having a plan, not knowing what you're doing, not knowing what's going to happen? But I think, again, adaptive approaches can undoubtedly be rigorous; indeed, if you like, they bake into themselves the incentives to actually collect and use evidence and data from the start. But we do need to think about what rigor means in some different ways, which is what I want to comment on. At the same time, we also need to recognize that part of the problem here, I think, is that in reality adaptation is happening all the time, at multiple levels, but often it's happening under the radar. It's not captured in our reporting systems; it's often based on the very well-informed but informal and tacit knowledge that people have at multiple levels, and on how that shapes their decision making. That means it remains somewhat hidden, and I think that's why this kind of way of working is often challenging: it can sometimes still be quite weak in really capturing, sharing, and being transparent about the underlying rationale for how decisions were taken, why funding choices were made, and what learning has happened. So I think that's part of the conundrum that we face here.

So to me, if we're thinking about adaptive rigor and what that looks like, it's about having a more documented and transparent trail of the intentions, the decisions, and the actions that have been taken. It's making explicit what is often implicit in terms of the judgments people are constantly making that influence their decision making. It means that you can clearly show those initial hypotheses: why you thought you'd do something, why you thought change would happen
in a certain way, the assumptions that sit beneath those, how you tested them, and what you did as a result of what you learned. So being more transparent, and getting better at documenting that decision-making process and the evidence that you're using as part of it, is the key element for me of adaptive rigor. It means being very upfront about the appropriate collection of timely data that can inform decision making on an ongoing basis. Again, there can often be concerns here that this could be very burdensome, or that data will be collected too late. I think it often requires much more ongoing, regular, sometimes light-touch assessments of what's happening, combined, crucially, with longer-term, more in-depth assessments of the deeper change processes that you're aiming at. So it's not just about short-term rapid data collection; it's a combination of different time frames.

And crucially, for me, it's about having very clear and transparent processes for those regular reflection points where you interrogate data. Søren talked about sensemaking; all of that relies on judgment, right? But I think it's about being transparent about how those judgments were made, aiming as much as you can for collective processes in that sensemaking and in the interrogation and interpretation of data, so it's not just one person making a decision that sits in their head, but an open and transparent process involving as many different key stakeholders as possible. All of that, to me, is part of how you demonstrate rigor in working in these ways. And it's about having key points at which that is fed into decision making on an ongoing basis, so it's not that we collect this data and do this really great learning but it sits completely separate from the management and decision-making structures; there have to be identified moments where that
feeds directly into the decisions that are taken. I think there should also be some initial assumption that measurement, indicators, and methods might need to change as part of that process, as you learn more about the problems you're trying to address. Again, this can make people nervous, that you're sort of changing what you measure as you go to make yourself look better. But if you have a very clear, documented, and more transparent trail of why you've made that decision, then it can be scrutinized, and you can be held to account for it.

But I think, again, as others have already started to mention, this isn't just about the right tools, methods, and processes; it's about the underlying incentives, working cultures, and capabilities, if we're really talking about rigor, so that these things can actually be operationalized. To me, it's about having wider accountability and reporting processes that reinforce this rather than work against it: for example, decision makers, wherever they sit, aligning their own reporting and accountability requirements to these processes of testing and learning, rather than simply adding it on as a bit of extra work for people to do alongside everything else that is happening. And I think we need to be much more open about the fact that this requires a lot of trust, at different levels, within teams and across organizations that are collaborating together, so that you can be open about your uncertainties, about what's not working, and about the changes you need to make. So I think we need to recognize that, if we're thinking about rigor and how we support it, it's not just about getting a particular tool, method, or process right; it's about the wider operating environment as well.

Thank you, Leni, for highlighting that adaptive management is happening but might be hidden, and that the point is how we document it in a transparent way, about
making explicit what was implicit. And what you said about culture and incentives, I think, is a key element, because that shapes how everybody behaves within the institution. It's also a topic that is very important right now at the IDB, because the President of the IDB has been very clear on the importance of promoting a culture that is more focused on results. So, Søren, what are your thoughts: can adaptive management be rigorous?

Thank you, Claudia. First of all, I agree with all of Leni's points, and I'll try initially to be a bit provocative and say, again, that this depends on what we mean by rigor, right? Is it rigorous to pretend that we can foresee everything that will happen in an initiative up front, including various risks and opportunities, so that we can just make a plan for how to implement it, identify a few preset KPIs, and then go and implement it regardless of what happens in the real world? Is that rigor? Or is it more rigorous to recognize that the world is complex and unpredictable, to focus on continuously making sense of what's happening based on a range of information, qualitative and quantitative, more systematic or anecdotal, and then to make decisions based on this? As Leni said, this doesn't mean you're moving the goalposts and that anything goes, but it means we recognize that we don't know everything up front, that things can change, that people can make mistakes, and that we need to work in a way that reflects that. And if we do work in that way, then we're able to use our resources more effectively and efficiently, to better manage the risks that might arise, to seize opportunities that we couldn't have predicted, and to maximize the impact that we can have.

That being said, there are obviously ways of doing adaptive management that can be more or less rigorous and have more or less integrity, and I'll try to complement and not overlap too much with
Leni's excellent points, for instance on the importance of documenting the adaptation. But a few reflections. First of all, I think we need some rigor, and to be a bit more systematic, in how we articulate what it is we want to learn about: how do we articulate learning questions and a learning agenda, and who is trying to learn here? Who should be part of that conversation, how do we make it more inclusive and really focused on benefiting those we want to benefit, and how do we generate learning? I talked about sensemaking; that's a very structured process that UNDP is using for systematically extracting learning across a portfolio of interventions and then translating that into action. How often you do this, I think, depends on many factors. Often we see these types of reflection points happen every six months, which I think is a useful frequency, but it obviously depends on how much uncertainty you're facing and how fast things are changing in your context.

I think there are also different frequencies for doing reflection and adaptation at different levels of learning. Some people here might be familiar with the terminology of single-loop learning, double-loop learning, and triple-loop learning: are we simply reflecting on the more day-to-day, like how well we implemented particular activities, or are we trying to reflect and adapt on whether the fundamental assumptions of our initiative hold up? We can't question our fundamental assumptions every three weeks, but we can have some lighter-touch, more project-management, single-loop learning reflections, right? So thinking through how we distinguish between different types of learning loops, I think, is important; it can help add some rigor and not lead to overkill. For instance, UNDP and the Swedish aid agency have been working together in North Macedonia, where they had a
lighter-touch, single-loop type of reflection every month, and then every quarter or six months a more in-depth collective sensemaking. Related to the rigor question is the question of what counts as good-enough data, because, as Leni said, you need good-enough, timely information at the point where it allows you to make decisions. So there's a balance between how rigorous the information we look for is and what is good enough to make decisions at a certain time. Lastly, I would emphasize rigor in how decisions are made based on learning. Can we introduce processes or templates that explicitly link certain learnings or insights to decisions? Can we clarify who needs to act on them, who needs to implement them, and how they are accountable for them? How do we avoid decision-making biases like confirmation bias? Can we introduce devil's advocates or red teams into some of our governance processes to add a bit more rigor to decision making? I'll pause there.

Thank you, Søren, for this perspective: that adaptive management can be even more rigorous by recognizing up front that we don't know everything at the beginning, that it can help us seize opportunities, and that defining the processes that link learning to decision making, and to who needs to act on it, is a key part. So, Estelle, from the World Bank Group perspective, how is this being applied, and what would be triggered in that case?

Thank you. I kept nodding to everything that Søren and Leni said, because when we did this evaluation we were very much trying to observe: observe what was happening within country teams, what their behavior was, how they adapted, and how they interacted with
a system that we ended up finding to be extremely rigid and, for the most part, unhelpful at the country level. So I really concur that there is a lot of hidden, non-explicit, or tacit adaptive management happening at the country level, and within projects of course, though that wasn't the object of our evaluation. Some of you might be aware of the work of Dan Honig, who has explored adaptive management quite a lot in his book Navigation by Judgment, and that's very much what was happening in those country teams: trying to leverage as much as possible these tacit channels, experiences, networks, and so on, and making these decisions, these judgments, based mostly on implicit and tacit knowledge. So adaptations are happening, but they are happening despite the formal systems that are supposed to support adaptive management decisions, systems that are either unhelpful or sometimes get in the way. Let me be a little more specific. The World Bank Group, and many MDBs, have engaged in a kind of isomorphic mimicry when it comes to systems; we copy each other. So many MDBs have this kind of system in place, built around the idea of a results framework that is set out at the beginning of a country engagement cycle, revisited midway, and ultimately the basis for a self-evaluation and then a validation by the Independent Evaluation Group, where I sit. Within this there are some formal processes for reporting on adaptations: at the Bank there has been a Performance and Learning Review that happens midway, which is the opportunity to make changes to the results framework and to account for the changes that have taken place. So there is this aspect of rigor in terms of trying to be transparent. At the same time, what we found is
that those midterm reflections, which are supposed to be for more of this second-loop learning, were very much about reporting up, and not as much about really rethinking a program: understanding where we are in terms of assessing risks and results, or re-questioning our underlying theories of change. Most of the time was spent chasing team leaders to get updates on their indicators and trying to adjust the targets so that, ultimately, the judgment, the rating, would come out right. So the opportunity was crowded out by the reporting system rather than enabling the kind of contestability and really in-depth discussion that was needed. That was quite clear when it came to the last piece of the cycle, the Completion and Learning Review. There were really some good instances of taking that opportunity to do a bit more double- or triple-loop learning; at the same time, the timing was poorly aligned with the new country engagement cycle: a lot of work had already been done on the country strategy, and the learning review came a little bit as an afterthought. So there was a bit of stocktaking, and again reporting up, but it was not as useful for the adaptive management decisions that needed to be made at the country level. All in all, there are very motivated staff at the country level who know, in some ways, what they need to change, and they are working within a system that is not really geared for that and does not enable them to do as much of it. So a lot of the work of the evaluation that I led was about trying to rethink some of these fundamental principles of accountability and incentives, and then ultimately the toolkit, which of course comes after.

Thank you, Estelle, for sharing those findings. We see those challenges as well at the
IDB, with reporting systems and reporting up that is not necessarily useful for learning and adapting. I'll pass the floor now to my colleague Mariana to continue this very interesting discussion.

Hello everyone, it's a pleasure to join you in this discussion as your second moderator. As our panelists just highlighted, the how of adaptive management is often the least documented part, far less than the what and the why, and it can only be done effectively with proper evidence to inform decisions. So, as evaluators, it is important to consider how adaptive management affects our approach to monitoring and evaluation. I would like to first ask Søren: in what way, if any, does M&E need to change to support adaptive management while maintaining a focus on results?

Thank you. I'm obviously biased; I've spent a lot of time in this M&E Sandbox, which was all about rethinking M&E, especially when we work on complex problems. So there are a number of ways in which monitoring and evaluation needs to change, but there's also a distinction: one thing is how to do M&E differently; another is how we set our organizations and systems up to enable us to do M&E differently. That's a pain point, or headache, that many people run into when they begin to put these things into practice, and it requires other skills, like change management and internal organizational-political navigation. But putting that aside, I think M&E needs to be designed to serve three key functions, and we need changes in relation to each of them. First, as we've already talked about a lot, M&E needs to allow us to continuously learn and adapt. This requires a few different things. We've already talked about new types of methods and
an emphasis on learning, and I think Luminate, a philanthropic outfit, is a really interesting example: they internally scrapped traditional performance KPIs and instead focused on conversations and learning questions, so look into that example if you're interested. The key here is that M&E is not just an accountability, compliance, and reporting function; the shift requires a shift in power. M&E as it is normally done is, as Estelle was saying, quite top-down or extractive: the people doing the work on the ground have to report upwards to a principal, so data is fed up the system to headquarters or management or what have you. We need to flip that around. M&E should ideally be about empowering the people closest to the problems to continuously learn and adapt what they're doing, because people know what to do and how to get on with it, but we've heard that they are prevented from doing so by existing systems. So we need to rethink that power balance, and there are some great examples of organizations doing this. Search for Common Ground has been running a youth-led research initiative focused on empowering local youth to make sense of the reality they live in and how they can navigate that system to create change. That's a very different approach to generating and using learning, and it links to discussions around decolonizing aid, and monitoring and evaluation, that I think are really important. So the focus on learning and adaptation is one key thing, but there are also changes to two other functions. One is how we track interim progress and report. This is what a lot of people emphasize when they think about M&E: reporting, progress tracking, and KPIs in the logframe. And I think a key point
is that change is a long-term process, and I think there's a question in the chat on this. We need to know in some way whether we are on track in the meantime, but the things we look at to know whether we're on track, and to be accountable to others and to ourselves, should allow and incentivize us to learn and adapt. We shouldn't be accountable for implementing whatever was in our work plan; we should be accountable for adapting, learning, and generating change. So we need progress metrics that allow us to adapt, and that remain relevant even if we change our work plan and planned outputs. This is quite a big shift. We've seen organizations introduce KPIs that are focused on learning as an intermediate result, rather than on just delivering a report or a workshop or what have you. The last function I think it's important to focus on changing is how we understand the wider impact or change. Change is a long-term process at the real impact level, so how do we measure that? It helps to clarify: are we just interested in learning about how change is happening out there, are we interested in evaluating whether that change is good or bad, or are we interested in measuring our contribution to that change? There are different questions to unpack here. Generally, I think we need a shift in framing, towards the bigger system out there where the change is happening, rather than towards what we are doing and whether we are directly generating change. It's like the Copernican shift in astronomy, where the Earth was no longer the center of the universe: we need to stop looking at ourselves as the center of the universe and look at the bigger system and how we are contributing to change in it. And
there are some great organizations doing interesting work on that. ACDI/VOCA has a really interesting methodology for making sense of bigger changes out there in the system, and Laudes Foundation is using rubrics to evaluate whether change in the wider system is good or bad. But this cannot be done through traditional KPIs, so this is again an area where we need change: more holistic ways of capturing whether change is happening out there, using qualitative and quantitative evidence, contribution analysis, stories, and so on. And again there are some great examples, one from the Poverty and Human Development Monitoring Agency in Odisha, India, which has been deploying a much more human-centric approach to capturing whether change is happening. That is part of the adaptive management discussion: rethinking how we capture and learn about longer-term change, and distinguishing that from more interim progress tracking. I could keep talking, but I'm going to pause here and pass it over to my co-panelists.

Thank you, Søren. I think it's especially valuable to get examples of how this is happening, how this is being changed, in the different organizations. As an anthropologist I'm biased about reflectiveness, so I'm sold there, but I also think the points you raised, about frank, participatory, and constant reflection on what should be measured and why, what for, and what should be evaluated to allow for learning, distinguishing between these types of learning as a purpose, connect to the point that Leni raised at the beginning about being very intentional in that exercise, and they are all very valuable. So, Estelle, could you tell us a little about what should be continued or discontinued in terms of evaluation practices, or any other comments you may have?

Sure, I hope you can hear me well; I had some issues. Yes? Okay, great. What was very clear from engaging with all of these country teams throughout the evaluation is that if we propose something different or new, something else needs to go, because everything that Leni, Søren, and I have been discussing is an intensive approach. It can be a rigorous approach, and it requires a lot: collective activities, in-depth analytical and evaluative work. So one of the key questions the evaluation was trying to answer is: what, in the systems we currently have, is not really serving the expected purpose? And here I come back to the point about results frameworks. At the country level, results frameworks very much have their place in the whole of adaptive and results-based management, and they can serve a very good purpose for interventions that are routine, where we know what works, where the KPIs or outcome indicators are really good proxies for what we're trying to achieve, and where there is not too much risk of gaming and creaming. So they do serve a purpose, but sometimes they are not the right tool. In the report we tried to lay out, in broad strokes, a change in how to think about the array of possibilities when it comes to monitoring, evaluation, and learning. Some organizations are way ahead of the game and have already added many more letters to this acronym, but for us, trying to distinguish between the purposes of monitoring, evaluation, and learning, and trying to have distinct channels for them, was already something important. So what we tried to think about is this: monitoring has a very important role when it comes to country portfolios, and its frequency needs to be adjusted to the kind of decision at hand, so it needs to be focused on health checks of the portfolio,
and it needs to be embedded in an IT system that is iterative enough and user-centered; the World Bank Group at this point didn't have that kind of system. The second piece of these monitoring, evaluation, and learning plans was to have a much more decentralized way of thinking about evaluation in our institution. Contrary to UNDP or UNICEF, where there are already decentralized evaluation functions, the World Bank Group is very centralized: we have IEG and then there are self-evaluation reports, but there is less of a culture of country teams, country directors, and country managers commissioning evaluative studies that would serve their learning and accountability purposes. So we were also trying to lay out some of the possibilities for longer-term reflection. Our country engagements are five-year cycles, but we know that in some sectors the Bank Group has been embedded for decades, so having a longer-term view of what has happened, how change has taken place at the country level, and how the Bank Group has contributed to it, these kinds of evaluative studies, mixed methods and so on, are also necessary. And then a major aspect of the proposal was also to have, as Leni outlined, a rigorous learning plan with clear learning questions, a clear understanding of the learning gaps, and then some activities to fill them, with the understanding that they wouldn't always be routine, that space needs to be carved out within already very demanding country management tasks: really dedicated time and dedicated ways of thinking, with these red-teaming aspects, and mechanisms to question the evidence and really learn from it. That, in broad strokes, is what we were trying to propose. And it's already been four years, and I
think there has been quite a lot of movement. These MEL plans haven't been adopted as a recommendation per se, but some of the practices have, and when we talk later about these big principles of changing accountability, and what that means for the incentive system, some changes have taken place and more are to come with the Evolution Roadmap of the World Bank Group. We can see that the report has been quite helpful in advancing some of the thinking and getting unstuck from some of the path dependency that we had before.

Thank you so much, Estelle. I think that's very helpful to illustrate what we should unlearn, in a way, and the importance of distinguishing between the purposes of monitoring, evaluation, and learning, as they are often packaged together nowadays, and how you've been identifying the gaps and addressing them in the World Bank Group. That's super interesting. Leni, what would you add or comment on these perspectives, perhaps coming back to the tacit and under-the-radar character of adaptation that you mentioned earlier in the discussion? Curious to hear from you.

Thank you. Yes, I think we are in danger of all agreeing with each other a bit too much, but there have been so many great contributions from the others, for which I'm very thankful. I'll maybe add two thoughts: one on methods, and one on those questions about accountability and judgment that you also flagged, Mariana. In terms of what needs to change for M&E, I think we partly need to be clearer that we're often talking about more theory-based approaches to M&E methods when we talk about these ways of working: approaches that are concerned not just with saying what happened but why and how, and that are more about measuring contribution than sole attribution. One of the comments in the chat was asking about particular tools: are theories
of change better than logframes, for example, and how do you make them useful? Theories of change are often used as part of adaptive management programs or processes, and they are helpful where they specify pathways for change, spell out assumptions, and allow you to measure and test those at different levels. Not all theories of change currently do that, or do it well, and they're often not kept as living documents that are revised based on what is learned; I often see them done once, up front, and left on the shelf. So there's a lot we're learning about how to make these specific tools, methods, and approaches work better. I've also particularly liked the work by evaluators like Tom Aston and Marina Apgar, who've talked about the idea of bricolage: that we shouldn't focus too much on a particular method, approach, or tool, but really start to think much more creatively about what combination of different things we need to bring together, depending on what is being analyzed, the intervention, the portfolio, and the kinds of questions you have, and therefore combine, in more creative and thoughtful ways, a whole set of different things that can help you unpack them. I would encourage others to look up their work if they haven't already. Then, on the issues of accountability and judgment, I just wanted to underline points that have already come up. We really do need to recognize that the challenge for M&E is that it has often been done in a fairly mechanical way: collecting data, as others have said, for upward accountability and reporting. But for these sorts of approaches, what we need are regular spaces where we're interrogating, exploring, and making judgments about what the data tells us, and doing that collectively to determine what the implications
should be. That means building evaluative thinking in from the very start of whatever it is you're trying to do or achieve, with those frequent processes of reflection and learning, rather than it being something you do at the end to work out whether or not you achieved what you wanted. One of my big bugbears is that, across almost all of the organizations I look at and work with, M&E is still a separate function, person, unit, or team. We must integrate it within decision making, program management, whatever that is in the organization. And within that function we need to ask, as I said, whether reporting and accountability mechanisms are aligned with M&E: do those mechanisms incentivize learning? Is M&E clearly positioned as an internal function of the unit, rather than something done by someone else or seen as someone else's responsibility? If we could make that change, it could be key to unlocking some of this.

I think there are many points to rescue there, but I'll keep to what you just explained: the importance of being very clear and explicit about the underlying assumptions and causal pathways in our theories of change, in a way that supports adaptation based on evidence and also provides clarity about what changing and adjusting them would mean in terms of results. However, a closing question remains, one that may concern many in our audience here today, and I think we've touched on it across the discussion, but I would like to focus on it now. I'll ask you first, Estelle: what are the implications of adopting adaptive management for accountability, and for incentives, in development institutions?

Thank you. I think it's important, as someone who sits in, who is the guardian of, the system
in some ways. That was an important contribution of IEG: to say that we realize we sit on a system that was built decades ago, that served its purpose for a while, and that now really needs to change, along with some of the core fundamental principles underlying it. Accountability has for too long been understood in a much too narrow sense, which for a while was about counting, or accounting for, what has been achieved: the idea that we are accountable for delivering results or delivering outcomes. That's a bit of a fantasy at this point, because of the complexity that Søren and Leni painted very clearly. So there are a few things that need to change, including this idea of attribution, which is so core to the discourse, but also to how the systems have been built and how the ratings we produce are constructed, and which really needs to be challenged, especially at the country level, but more and more in projects too. What we tried to paint in the last chapter of the report is the kind of switch we need in those principles. We need to re-embrace the principles of the Paris Declaration on aid effectiveness, of mutual accountability. The idea is that mutual accountability, collective learning, informed risk taking, transparency in decision making, and maintaining trust through that kind of transparency and openness are the key principles that need to underlie these systems. Moving away from conceiving of accountability as attribution, as meeting certain targets and certain indicators, is really a key piece. Perhaps the system would be better off if country teams were held accountable for providing well-evidenced descriptions of achievements and
failures, for learning from them, and for adapting accordingly, not for reaching these indicators. We make a good-faith attempt at picking the right ones, but sometimes they are not the right ones, and for many of the things we are aiming for we don't have good proxies that can be measured by indicators, so we really need to understand those limitations too. Then there is relaxing the focus on metrics, on attribution, and on the time-boundedness of these cycles, which makes sense from an administrative point of view but not from a change perspective; that is really important as well. So plausible contribution, adequate evidence, time appropriateness, selectivity as opposed to trying to encompass everything, and contestability mechanisms for the evidence are things we would like to see more of. When it comes to incentives, as I mentioned, it was very clear that there is so much intrinsic motivation from country teams working with clients to adapt, to adjust, to pursue the long-term goal, and we need incentive mechanisms and signals that support this intrinsic motivation rather than getting in its way. It is difficult to change incentives and signals, but in the report we also tried to look at our current processes and procedures: what gets rewarded, what gets noticed, and how we can better support that motivation for adaptiveness and take away the aspects that are getting in the way of it. For instance, we were thinking about having high-level meetings, we have so many that are about approving a new portfolio and so on, that are instead about just understanding evidence: discussing, in a portfolio way, say a ten-year retrospective, and what we learn from that, without necessarily
getting into what the next business plan is. Also, in the performance systems for our staff, ensuring that delivering the financing and disbursements are not the only things that are looked at. And we were thinking about these contestability mechanisms, because there is something to be said for transparency, but there is also a need for safe spaces, and how do you build contestability into that? So we were thinking about the different kinds of roles, and the different kind of skill set, that would go into that. The report goes into more detail, but that was the idea: trying to see where the Bank Group was at that time, what the quick wins were that needed to happen, and what could be taken away to stop the crowding out of that intrinsic motivation.

Thank you, Estelle. I think your reflection on shifting the incentives from merely meeting targets and reporting to creating space for collective learning is especially powerful. I particularly like your provocation on accountability leaning towards the "learned" aspect of lessons learned: teams being accountable for not repeating mistakes and actually learning from what we know already, not pretending that we don't know what we know, as Søren said in one of his articles. Leni, what are your views on this matter, on the balance between accountability and learning?

I think I would very much underline a lot of what Estelle said, and I really like the way she captured that point about going back to what we think accountability means: rather than it being about counting what we have achieved, it's the idea of making an account, explaining how decisions have been taken and why, so that others can scrutinize that. I think that is really important. I'm conscious of time, so I was just going to add one
further reflection into the mix, which is that another key change we could make here is to move away from the heavy focus, still, within MDBs and elsewhere, on approval processes: all of our time and attention is spent on getting the design perfect up front for approval. We need to move away from the idea that we can perfectly design things up front and accept that the testing, learning, and adaptation need to be built in. That means, if you like, lowering some of the approval bar but lifting it for implementation, and I don't think we've got that balance right yet, in multiple ways. It is also, as I said, about linking these different functions: for example, reviewing management and performance incentives so that we reward people at different levels for behaviors that are more likely to lead to effectiveness in this space. Program managers, deliverers, and implementers can be rewarded for their skills in problem solving, in learning, and in adapting based on what has been learned; again, we don't currently reward and incentivize people to pay attention to those sorts of things. So there are a couple of things there that I think we could change, alongside the good points that Estelle has already made.

Thank you, Leni, and thank you all for being wonderfully punctual. So yes, changing the culture and the incentives are both crucial. I'm curious to hear your thoughts, Søren.

Yeah, I'm running the risk of just repeating what's already been said. The incentives point is just really important to keep emphasizing, and I think there's an inherent tension between measuring something for accountability purposes, because we need to report to someone and our performance and funding are linked to it, and measuring something because we
want to learn about it. Toby Lowe from the Centre for Public Impact talks a lot about this: when we use a measure for accountability purposes, all sorts of weird incentives arise to game it. If the target is five on some KPI, we become obsessed with how to get to that number five, and then it stops being useful as an indicator that helps us learn; it corrupts the process it is intended to measure, like in education, where students and teachers end up training to pass a test rather than to actually learn. That's a tension that is really hard to get around, and I'm still in two minds as to whether it makes sense to have one system that feeds the beast of the accountability KPIs and a different one for learning. At the moment most people are straddling these systems: they're being asked to do all this new stuff on learning and adaptation, as Leni said, and at the same time they still have all the traditional reporting requirements. So I don't know whether these can be merged somehow, or whether the incentives are just so different that we need to find a way to have both, to avoid the accountability measurement corrupting the learning process. I also think there's a lot of value in treating learning as a result in itself and being accountable for learning, but that holds challenges too. There's a great example that Tom Aston flags from the PERL programme in Nigeria, a UK-funded programme, where there was, as part of the logframe, an output on learning, even linked to a payment-by-results mechanism. That helped increase the focus on learning in that programme, but it also created incentives to hit whatever that KPI
for learning said, rather than to actually learn. So it's really hard to get around, and just being very conscious about the incentives that a given system will trigger is maybe the best solution at this stage. Lastly, a reflection related to incentives and accountability: there are some other sides to focus on. We often talk about M&E or results-based management, and then independent evaluation is a bit of a different thing. We should also pay attention to how people, for instance my UNDP colleagues in country offices, are conscious that there might be an independent evaluation at the end of the day, and the evaluators might not be interested in how we adapted, or the criteria they use might not be focused on that. So making sure there is alignment there is important, and the same goes for audit. People say, "If we get audited and we've just been moving funds around and adapting, how is that going to make us look?" So making sure these other related and very relevant functions also enable an environment where we can learn and adapt, rather than being punished for course correction, is important.

There's a lot of food for thought there. I won't comment, because I would like to move to the Q&A part; I think we have many interesting questions already, and by all means, participants are welcome to add more. So, sadly, at least for me, we are moving to the final part of the event. I will start with the first question, from Olivia Mutasi, who asks whether adding more bureaucracy to adaptive management would fundamentally undermine its ad hoc nature, instead of increasing teams' ability to adapt and empowering them to do so through building trust and following up. So if additional
forms, strategies, and policies would be disempowering rather than empowering. Alternatively, she asks whether adaptive management in this context refers to systematic solutions to systematic problems, and not to day-to-day adaptability. I think maybe we could start with Estelle, or anyone else who wants to jump in.

Yeah, sure, absolutely: if you bureaucratize adaptive management, then it is antithetical in some ways. But there are some clear trade-offs, which we also try to lay out in the report. As Søren just hinted, there are some important needs that have to be served by a system as well, so we can't just dismiss reporting upward, or the ability to have these higher-level views, sometimes aggregated at the portfolio level, for key shareholders and so on. There are multiple needs in our institutions, and somehow we need to meet them all. It is a very interesting thing to ponder; for a long time we have kept saying that accountability and learning are two sides of the same coin. Maybe not. But to get back to the question at hand, the goal is to have systems that are supportive of adaptation, where good management practices don't get in the way of it, but that also allow some level of scrutiny and transparency, so that learning can happen and the accountability mechanisms remain valid. There is no magic bullet, so there will be trade-offs, depending on the size of the organization and the level of scrutiny from outsiders. The philanthropic space, for example, is somewhere there has been more innovation, because of the different political economy around it, and not every solution that works in the philanthropic space would work in an MDB environment. So yes: not over-bureaucratizing, but also living with the fact that we are
bureaucracies, and there are some good things about that as well.

Thank you so much, Estelle. Maybe I'll direct one question to each of the panelists, so that we have a chance to address more of them. For Lenny, the question from Feder C. Gar. concerns timing: basically, he says that impact most times is something long-term, so how do we reconcile that with real measurement of impact in such projects? How do we handle the timing issue?

Yeah, as I said, there are a few dangers here. One is that we think that if we are working adaptively, then we do lots of rapid cycles of testing and iteration and focus very much on short-term measures. You do some of that, but you also have to do longer-term assessments of the extent to which whatever you are doing or investing in is contributing to a broader change process. So it is a combination: keeping in mind an end-goal outcome that can be somewhat measured, or at least your contribution to it, and tracking that, whilst also tracking the immediate testing and learning you are doing about the pathway to get there. There is a danger here: it isn't either/or, it's both, but in different ways. I do also want to comment briefly on Olivia's point, because in some ways it gets to the heart of what we have just been discussing. My own personal view is that we are in a challenging place right now in trying to do both things. As Søren said, we feed the beast of accountability and try to carve out a space for learning, and I completely understand that, pragmatically, that is where many different kinds of organizations and institutions are at. But there are dangers. What I have seen, working as an advisor and support to large programmes trying to work in this way, is that people are actually
struggling under the pressure of meeting all of their traditional accountability reporting requirements while also trying to do really great learning, adaptation, and documentation of it, and I don't think it is really tenable to say we will try to do both. We have to be more radical and change how we think about accountability, how we construct it, and how we approach it, within our large development banks as well as other kinds of organizations: let's test, learn, iterate, trial it out, and see what works. I actually think Estelle was right that a lot of the experimentation thus far has been at the project level. There is more scope, perhaps, at the portfolio level to think about how you shift accountability, because there you can think about how you spread your risks and how you balance across your portfolio, perhaps between your safer bets and the things that require more adaptation. That might give you a bit more flexibility about the menu or set of options you offer in your accountability frameworks. But I think we need to try to do things differently; otherwise I worry that adaptive management will continue in this cycle of periodically popping up and then dying away, because we can't fully embed it within our ways of working.

Thank you so much, Lenny. I think the last question is for Søren, maybe the one about foresight, because you've been doing a lot of experimentation. We have the question from Laa Gardoni, and I'm sorry if I mispronounced the name: she is reflecting on the importance of foresight, especially now with emerging AI and new technologies and their opportunities, but also their risks and costs. Do you have any comments on that?

Yeah, absolutely. Foresight is something we've been doing a bit of, and there are some really useful exercises for thinking about how we spot emerging trends, opportunities, risks, and so on. That's part of
your arsenal for generating really useful insights that you can use to make decisions, so it makes a lot of sense to integrate it into your measurement and reflection processes. On the point about costs and risks, there is obviously a whole discussion around risk in AI, but one thing I would emphasize is that these technologies still don't get people to use learning to adapt. Generating a lot of insight is one thing; using it to make decisions is another, and the technology cannot necessarily help us do that. We might just end up focusing a lot on generating all sorts of interesting dashboards and visuals; how to make sure we use them to make decisions and adapt is an important point to bear in mind, and a risk to consider. I know we are out of time, so let me stop there.

Thank you so much. I would like to thank our brilliant panelists for sharing your great insights and wide-ranging contributions to our understanding of adaptive management and its implications for development. I would like to thank our audience for your participation, and also a heartfelt thank you to our interpretation team, the technical support team, and our communications team. Briefly summarizing, which would be impossible, but I'll try: we explored the concept of adaptive management and its crucial role in navigating development complexities, emphasizing that it involves fostering a culture of evidence-based adaptation rather than merely collecting data and reporting. We discussed the need for rigorous frameworks that support continuous learning and real-time decision-making, making monitoring and evaluation more dynamic and focused on adaptation. We discussed the importance of an enabling environment where staff have the capacity, the incentives, and the authorization to act on evidence,
recognizing and supporting adaptation efforts. We also heard different viewpoints on implementing adaptive management in various contexts, and the importance of tailoring it to the context of specific organizations and situations. So now I pass the floor to Claudia to officially close our event.

Thank you. Very briefly, I think this discussion also leaves us evaluators with a huge task: how do we push for new ways of doing things, and how do we rethink accountability, recognizing that we on the evaluation side also generate incentives. So stay connected and keep up with our future events: please follow OVE on LinkedIn and Twitter. We will be sharing the recording in a few weeks. Thank you, everyone in the audience, and thank you to our panelists for such an amazing discussion. Thank you so much. Bye.