This post was originally published in the Resources section of Philanthropy Journal News.
By Kelly M. Hannum and Jara Dean-Coffey
This article is the second in a two-part series on developing an evaluative mindset.
Evaluation approaches offer strategic and systematic ways to gather, make sense of, and use information that helps us frame, understand, and develop effective strategies to solve problems. A transparent and strategic approach is particularly important for vexing and persistent conditions, which are often coupled with deeply entrenched structures, beliefs, and value systems that can be painful to talk about. Scratch the surface of any long-standing social problem, and a host of preconceptions rooted in identity emerges. The narrative about us and them kicks in: it’s who we are, it’s who they are, and it’s how things are. But is it really? Whose truth is truest, or matters most? This is when moving beyond evaluation tasks to having an evaluative mindset is essential. An evaluative mindset is about being curious in strategic and systematic ways. It’s about being open to changing how you see and think about things based on new information, and about being intentional with data sources, data collection instruments and methods, data analysis and synthesis, and, finally, making sense of information and determining its implications. It’s also about the shared work of reflecting on and gathering information about the values we hold as well as the value we create.
Thoughtfully gathering and considering ideas and experiences from different perspectives is needed in order to develop solutions that are contextually and culturally appropriate and likely to have sustained success. It’s how we avoid things like encouraging early reading by establishing a library of children’s books in Spanish for a community that speaks French: well-meaning intentions; little or no intended impact. Evaluation should help us document and understand the ways in which our efforts are contributing to progress toward stated aims, and also help us determine whether those are the right goals and what the intended and unintended impacts of pursuing them are. In order to set direction, align activities, and assess progress toward our intentions, we must surface and challenge the assumptions that get in the way of, slow down, and in some cases prevent that progress. The processes of setting a shared direction, aligning activities in service of that direction, and gaining commitment are also seen as what it takes to make leadership happen (McCauley, 2014). It’s how we achieve or amplify our efforts. We have to get clear on what we’re seeking to do, why it would add value and for whom (from different perspectives), and what “evidence” is appropriate and feasible to use in gathering input and gauging progress. This is valuable work, and it contributes to understanding and potentially more effectively addressing root causes rather than treating symptoms. It is hard work. It is uncomfortable work.
Many foundations are positioned to uncover and address the root causes of societal problems. To do so means values and assumptions must be challenged. No one likes that, but it is needed, and we all learn from it when it is done well. Foundations have the power and influence to model identifying, collecting, analyzing/synthesizing, and using information as part of an intentional and critical reflection about value and how to create it. Combining an evaluative mindset with an evaluation skill set helps with these essential functions:
- Framing an issue in context, considering historical and contextual contributions as well as the roles various groups, systems, and dynamics play
- Defining the problem within the issue that is to be addressed
- Understanding obstacles and opportunities from different perspectives
- Understanding solutions and consequences from different perspectives
- Creating a shared understanding of/vision for success
- Deciding what information is needed to inform efforts and track progress
- Determining how to make sense of and use information
An evaluative mindset is important because there is no “how-to” model that will fit every context. As a fairly new field, evaluation is continually developing and refining its approaches and methodologies. There is no universally agreed-upon “right” way to evaluate. The American Evaluation Association offers guiding principles as well as a statement on cultural competence in evaluation, a term now more commonly replaced with “culturally responsive” evaluation. If you are involved in evaluation work in any way, reading both of these documents can prompt reflection on your practice and your role in evaluation.
Much is happening in the field of evaluation, and foundations are key players in that evolution. Below are some of the things we recommend to help you on the journey:
- Learn about evaluative thinking
- Figure out and focus on what truly matters
- Be sure your evaluation approaches align with your organizational and programmatic values
- Stop gathering and focusing on data that isn’t being used in meaningful ways
- Invest in systems focused on gathering the information you really need and that reflects the heart of what matters
- Develop strategies and systems for reflecting on information with stakeholders who are affected by, or who can affect, the challenge being addressed or potential solutions
- Share information & resources about approaches you are trying and what is or isn’t working well so we can figure this out together
Kelly M. Hannum, Ph.D., is the President of Aligned Impact LLC, an evaluation consulting firm based in Greensboro, NC. She has worked with clients all over the world and across sectors to enhance their development efforts.
Jara Dean-Coffey, MPH, is Founder and Principal of the Luminare Group (formerly jdcPartnerships), a minority woman-owned and women-led practice in the San Francisco Bay Area established in 2002. Her work spans three areas: strategy and evaluation, capacity building, and pushing practice. Jara has worked extensively with public, social, and philanthropic sector organizations.
Buchanan, P. (2014). Beyond formulas and easy answers. Alliance, 19(1), 59–60.
Coffman, J., Beer, T., & Patrizi, P. (2012). Evaluation in foundations: 2012 benchmarking data. Washington, DC: The Evaluation Roundtable.
Coffman, J., & Buteau, E. (2016). Benchmarking foundation evaluation practices. Washington, DC: The Evaluation Roundtable.
Dean-Coffey, J., & Casey, J. (2014). Raising the bar-integrating cultural competence and equity: Equitable evaluation. The Foundation Review, 6(2), 81.
McCauley, C. (2014). Making leadership happen. Center for Creative Leadership white paper. Retrieved from http://www.ccl.org/wp-content/uploads/2015/04/MakingLeadershipHappen.pdf
Thompson, E. H., & Patrizi, P. (2010). Necessary and not sufficient: The state of evaluation use in foundations. Philadelphia: Evaluation Roundtable.