Presentations

Conference Presentations

Davies, B., Abernethy, P., Belcher, B., & Claus, R. (2017, September). An Empirical Evaluation of Knowledge Translation in Policy-relevant Forestry Research. Presented at IUFRO 125th Anniversary Congress, Freiburg, Germany.
Abstract: The increasing external demand from research funders and research managers to assess, evaluate and demonstrate the quality and the effectiveness of research is well known. Less discussed, but equally important, is the evolving interest and use of research evaluation to support learning and adaptive management within research programmes. Researchers and research managers need to know whether and how their work is contributing to positive social and environmental outcomes to be able to adapt and improve their projects and programmes. We have done a series of theory-based evaluations of international forestry research projects. Each evaluation began with documentation of a theory of change (ToC) that identified key actors, processes and results. Data collected through document reviews, key informant interviews and focus group discussions were analysed to test the ToCs against evidence of outcomes in the form of discourse, policy formulation and practice change. The analyses identified strengths and weaknesses in knowledge translation, helped us understand the conditions and mechanisms of knowledge translation, and suggested improved strategies to increase research effectiveness. The evaluation approach proved valuable as a learning tool for researchers and research managers, and it has facilitated communication with funders about actual and reasonable research contributions to change.
Belcher, B., & Palenberg, M. (2016, September). Outcomes and Impacts – Towards Conceptual Clarity. Presented at European Evaluation Society Conference, Maastricht, The Netherlands.
Abstract: The terms “outcome” and “impact” are ubiquitous in evaluation discourse. But there are many competing definitions in use that lack clarity and consistency, and sometimes even represent fundamentally different views. This leads to profound confusion, undermines efforts to improve learning and accountability, and represents a challenge for the evaluation profession. This paper investigates how the terms are defined and understood by different institutions and communities. It systematically breaks down the definitions into a total of 15 distinct defining elements and uses this framework to compare usage patterns across definitions and to methodically assess usefulness and limitations. Based on this assessment, the paper proposes a remedy in three parts: by applying several good definition practices in future updates, by differentiating causal perspectives, and by employing a set of meaningful qualifiers when using the terms outcome and impact.
Davies, B., Colomer, J., Belcher, B., & Suryadarma, D. (2016, February). Effective Research for Development: Using Theory-based Approaches to Assess the Effectiveness of Knowledge Programs. Presented at 2016 Australasian Aid Conference, Canberra, Australia.
Abstract: With significant finance being channelled into research and knowledge-for-development programs internationally, organizations that work in this field are being challenged to answer: what does aid effectiveness look like in the context of knowledge creation and utilization? The challenge is multifaceted. Research and knowledge creation are unpredictable; we don’t know what we will find out. Knowledge-based interventions tend to operate early in results chains, with multiple stages and multiple actors required to achieve impacts on the ground, and with the time lags and feedback loops of complex systems. Theoretical understanding of how knowledge-based interventions work is still not well developed. And the knowledge creation process is, by necessity, becoming more diffuse, with research-for-development activities becoming increasingly inter- and transdisciplinary; the interventions themselves are often multi-pronged. To deal with these challenges, three organisations working in a UK International Climate Fund-financed knowledge-for-development partnership have trialled the use of theory-driven approaches for design, monitoring, evaluation and learning (DMEL) as a method for assessing and communicating the contribution of knowledge to broader social, economic and environmental impacts. Experience so far shows that theory-driven approaches are appropriate and helpful for designing, adapting, and evaluating knowledge-for-development programmes. The approach provides a framework to develop a testable, causal model connecting research activities to policy and practice changes and ultimately to the desired social and environmental impacts. Enhanced attention to the “sphere of influence” and the identification of key actors who need to be involved (boundary partners) has supported more effective program design and implementation by focusing attention on the intermediate results that knowledge programs are able to manage towards delivering.
The approach enhances our ability to test theories about the role of knowledge in policy and development, and therefore improves learning about how investing in knowledge for development works. In other words, it has begun to shift the emphasis of DMEL from an accountability-driven administrative task to a process that is central to project design and organisational learning. This collaboration has also revealed a number of constraints and opportunities for further improvement. We have found that theory-driven DMEL approaches are frequently constrained by a failure to adequately resource or incentivise them, with a real or perceived disjuncture between stated support for systematic DMEL and actual resource allocation and performance management. There are also frequently real or perceived conflicting demands, expectations and prescribed DMEL approaches from multiple donors. Practically, there are few published examples of applied theory-driven approaches in relevant programs and sectors; limited hard data on DMEL as a critical impact delivery mechanism; and a lack of knowledge, experience and institutional flexibility to support adaptive management during the activity cycle of knowledge generation programs. This presentation will share lessons from the partnership’s experience of applying theory-driven DMEL to its investments in enhanced DMEL techniques. This has involved developing theory of change models and tracking performance in relation to: creating enabling internal systems and cultures; improving internal capacity and practices; and influencing debates and practice with the wider donor and practitioner community. The presentation will highlight key findings from the application of theory-driven techniques in programmatic areas and insights into the factors that enable and constrain the development of evaluative cultures. It will also outline priority next steps for internal practice.
Davies, B., Belcher, B., & Colomer, J. (2015, March). Learning Partnerships in Participatory Planning, Monitoring & Evaluation. Presented at Conference on Monitoring and Evaluation for Responsible Innovation, Wageningen University and Research Centre.
Belcher, B. (2013). Building a Theory of Change for Impact-oriented Natural Resources Management Research. Presented at the meeting of the program on “Management and conservation of Forest and Tree Resources”, Bioversity International, Rome, Italy.
Belcher, B., Abernethy, P., Claus, R., & Davies B. (2017, September). Transdisciplinarity in International Forestry Research: An Assessment of 3 CIFOR Projects. Presented at IUFRO 125th Anniversary Congress, Freiburg, Germany.
Abstract: As boundaries between disciplines are crossed, and as research engages more with stakeholders in complex systems, traditional academic definitions and criteria of research quality are no longer sufficient; there is a need for a parallel evolution of principles and criteria to define and evaluate research quality in a transdisciplinary research (TDR) context. We used a systematic review of the literature on TDR quality to develop a prototype TDR quality assessment framework (QAF), which we tested empirically in three international forestry research projects conducted by the Center for International Forestry Research. The data were collected using participatory evaluation workshops, semi-structured stakeholder interviews, and document analysis. The framework’s quality assessment criteria helped to evaluate the degree to which current projects employ transdisciplinary approaches and to systematically identify strengths and weaknesses in the projects reviewed. We found good examples of the application of transdisciplinary principles and criteria, but also considerable scope to further improve research design and implementation for greater effectiveness. This presentation will provide an overview of the QAF and the assessment methods, results and lessons learned for designing and evaluating TDR in the forestry sector.
Belcher, B., & Palenberg, M. (2016, October). Outcomes and Impact Terminology: Towards Conceptual Clarity. Expert lecture presented at the American Evaluation Association, Atlanta, Georgia, USA.
Belcher, B. (2015, October). Seeking Evidence of Research Effectiveness: Lessons from an International Research-for-development Programme. Presented at Institute for Advanced Study Seminar, Princeton, New Jersey, USA.
Abstract: Publicly supported research faces high expectations from funding agencies to achieve and prove “impact”. At the same time, there are strong internal pressures to evaluate research effectiveness, to better learn from experience and to improve research design and implementation. The tools for doing this are still not well developed, but there is substantial emerging experience with utilization-focused evaluation and theory-based evaluation approaches applied to research. The Center for International Forestry Research (CIFOR), an international research-for-development organization, is on the forefront of this development. This seminar will provide an overview and context of CIFOR’s research focus and approach, with emphasis on an ongoing reform process that markedly shifted the center and its scientists from a primary focus on high-quality outputs to a shared responsibility for outcomes. We have developed and started implementing new approaches to planning, monitoring and evaluation in which the intended contributions of research are deliberate, explicit and testable. This improves our ability to gather evidence, assess and communicate outcomes and impacts for enhanced accountability, and our ability to learn from experience. The approach is promising, with important technical lessons and lessons about the inherent cultural change and how to support it. Questions still remain about whether the evidence produced and used in this approach will satisfy all constituencies, and about how to further improve the quality of this evidence.
Belcher, B., & Suryadarma, D. (2015, July). Lessons from a Participatory Evaluation of the Global Comparative Study of REDD. Presented at the inaugural meeting of KNOWFOR Knowledge Mobilization Community of Practice, IIED, London, UK.
Belcher, B. (2012, September). Co-learning on Impact Evaluation Design in NRM Research Programmes. Presented at Natural Resources Management Research Impact Evaluation Community of Practice, The Worldfish Center, Penang, Malaysia.
Abernethy, P., Belcher, B., & Reed, M. (2017, August). How to Make Transdisciplinary Research Relevant, Credible, Legitimate and Effective? Bridging Social Theories and Quality Assessment of Transdisciplinary Research. Presented at Resilience 2017, Stockholm, Sweden.
Abstract: In recent years, transdisciplinary sustainability research has become one of the key approaches to address challenges and identify opportunities of the Anthropocene. While knowledge co-creation and mutual learning have become broadly acknowledged as key components in inter- and transdisciplinary research (TDR), less attention has been paid to the complex nature of social interaction in stakeholder engagement and its implications for both resilience building and the quality of research itself. Some of the main issues we identified relate to inclusivity, power asymmetries, type of engagement, and limited use of the new knowledge produced. To address these challenges, we explored how existing social theories could be used to improve emerging TDR practices. We investigated some of the better-known theories related to deliberative participation, power, empowerment, and social capital. Combining insights from Habermas, Foucault, Freire, and Putnam with evaluation criteria for TDR regarding relevance, credibility, legitimacy, and effectiveness, we developed a TDR quality assessment framework. This framework was tested by analysing five sustainability science projects, each of which engaged both researchers and other stakeholders in knowledge production. The case studies focused on natural resource management and sustainability governance at national (Indonesia, Canada and Peru) and global (international) scales. The data were collected through document analyses, interactive workshops, and semi-structured stakeholder interviews, and analysed using the abovementioned framework. Findings indicate that integrating existing social theories in applied theoretical frameworks can help make TDR more effective by offering more nuanced, in-depth tools to plan research projects, to assess research processes, and to analyse research findings.
We describe the development and assessment of the integrated theoretical framework and discuss associations between different types of knowledge production and changes in policy and practice. This new approach has great potential to improve sustainability research in the Anthropocene by increasing our understanding of social-ecological interactions and of how to address knowledge-to-action gaps.
Belcher, B., & Soni, S. (2016, June). Applying and Testing a Quality Assessment Framework for Inter- and Transdisciplinary Research. Presented at 2016 Canadian Evaluation Society Conference, St. John’s, Canada.
Abstract: Research increasingly seeks to generate knowledge and contribute to real-world solutions. As boundaries between disciplines are crossed and research engages more with stakeholders in complex systems, traditional academic definitions and criteria of quality are no longer sufficient. There is a need for a parallel evolution of principles to define and evaluate quality in a transdisciplinary research (TDR) context. We developed a quality assessment framework (QAF) based on a systematic review, organized around four principles: relevance, credibility, legitimacy, and effectiveness. QAF scores reflect the degree to which projects apply theoretically derived principles in their design and implementation. We tested the QAF on 34 theses completed by RRU students. Inter-rater reliability was good; the instrument proved practical and can be applied consistently across projects and reviewers. The QAF helped us systematically evaluate strengths and weaknesses in the projects reviewed and provided guidance for improving research design and implementation. This presentation will provide an overview of the QAF and the assessment methods, results and lessons learned for designing and evaluating quality TDR, and for enhancing solution-oriented student research at RRU.
Belcher, B., Rasmussen, K., Kemshaw, M., & Zornes, D. (2014, June). Defining and Measuring Research Quality in a Transdisciplinary Context: A Systematic Review. Presented at Interdisciplinary Social Sciences Conference, Vancouver, Canada.
Belcher, B. (2012). Specifying Outcomes to Enhance Research Effectiveness. Presented at Consortium Research Program on Forests, Trees and Agroforestry. Nairobi, Kenya.

Invited Talks

Belcher, B. (2018, October). Impact Assessment in Landscapes Research: Practical & Conceptual Considerations. Invited presentation to the CGIAR ISPC Standing Panel on Impact Assessment (SPIA) on impact assessments and landscapes in CGIAR. Stellenbosch, South Africa.

Belcher, B. (2017, July). Theory-Based Evaluation of Policy-Relevant Research: Lessons from a Series of FTA Case Studies. Invited presentation at Impacts of International Agricultural Research: Rigorous Evidence for Policy. Nairobi, Kenya.
Abstract: The increasing demand from research funders and research managers to assess, evaluate and demonstrate the quality and the effectiveness of research is well known. Less discussed, but equally important, is the evolving interest and use of research evaluation to support learning and adaptive management within research programmes. This is especially true in a research-for-development context, where research competes with other worthy alternatives for overseas development assistance funding and where highly complex social, economic and ecological environments add to evaluation challenges. Researchers and research managers need to know whether and how their interventions are working to be able to adapt and improve their programmes, as well as to satisfy their funders. This paper presents a theory-based research evaluation approach that was developed and tested on several policy-relevant research activities undertaken by CIFOR and FTA. Each research evaluation began with documentation of a theory of change (ToC) that identified key actors, processes and results. Data collected through document reviews, key informant interviews and focus group discussions were analysed to test the ToCs against evidence of outcomes in the form of discourse, policy formulation and practice change. The approach proved valuable as a learning tool for researchers and research managers, and it has facilitated communication with funders about actual and reasonable research contributions to change. Evaluations that employed a participatory approach with project scientists and partners noticeably supported team learning about past work and about possible adaptations for the future. In all four cases, the retrospective ToC development proved challenging and resulted in simplistic ToCs. Further work is needed to draw on social scientific theories of knowledge translation and policy processes to develop and further test more sophisticated theories of change.
This theory-based approach to research evaluation provides a valuable means of assessing research effectiveness (summative value) and supports learning and adaptation (formative value) at the project or programme scale. The approach is well suited to the research-for-development projects represented by the case studies, but it should be applicable to any research that aspires to have a societal impact.

Belcher, B. (2015, December). Defining and Measuring Research Quality in a Transdisciplinary Context. Invited presentation at Institute of Advanced Study Public Lecture, Hatfield College, Durham University, Durham, United Kingdom.
Abstract: Research increasingly seeks both to generate knowledge and to contribute to real-world solutions, with strong emphasis on context and social engagement. As boundaries between disciplines are crossed, and as research engages more with stakeholders in complex systems, traditional academic definitions and criteria of research quality are no longer sufficient – there is a need for a parallel evolution of principles and criteria to define and evaluate research quality in a transdisciplinary research (TDR) context. We conducted a systematic review to help answer the question: What are appropriate principles and criteria for defining and assessing TDR quality? Articles were selected and reviewed seeking information on: arguments for or against expanding definitions of research quality; purposes for research quality evaluation; proposed principles of research quality; proposed criteria for research quality assessment; proposed indicators and measures of research quality; and proposed processes for evaluating TDR. We used the information from the review, together with our own experience in two research organizations that use TDR approaches, to develop a prototype framework for evaluating TDR. The presentation will provide an overview of the relevant literature and summarize the main aspects of TDR quality identified there. Four main principles emerge: relevance, including social significance and applicability; credibility, including criteria of integration and reflexivity, added to traditional criteria of scientific rigour; legitimacy, including criteria of inclusion and fair representation of stakeholder interests; and effectiveness, with criteria that assess actual or potential contributions to problem-solving and social change.
Belcher, B. (2018, October). Impact Assessment and Research Evaluation at CIFOR/FTA. Invited presentation to CIFOR’s Annual Meeting: Forests Matter. Bogor, Indonesia.

Belcher, B. (2017, January). Evaluation with and Evaluation of ToC: Lessons from FTA. Invited presentation at CGIAR Independent Evaluation Arrangement Symposium on use and evaluation of Theories of Change. Rome, Italy.
Belcher, B. (2016, December). Using Theories of Change to Enhance Research Effectiveness. Invited presentation to World Agroforestry Center “Research to Impact” Meeting, Bogor, Indonesia.

Events

Co-building Reconciliation and Sustainability 2017

Baie-Comeau, QC. The Sustainability Research Effectiveness team, in collaboration with the University of Saskatchewan and the Canadian Biosphere Reserve Association, organized an exciting event in which practitioners, academics, Indigenous leaders and government stakeholders came together to share experiences and lessons learned and to discuss paths forward for reconciliation and sustainable development.