Interactive Impact Labs


Crossposted from Medium.com and LinkedIn

 

I am a nonprofit evaluation consultant. After resisting this label for over a decade, I’ve finally surrendered to this professional identity. My consulting practice has always encompassed strategy, evaluation, and organizational learning, as well as the intersection of these three disciplines. But until now, I’ve shied away from branding myself as an evaluator, even though I’ve used evaluation methodologies consistently throughout my career.

In jest, I’ve compared my “coming out” as an evaluator to my “coming out” as gay.

If only I could feel as much pride in the nonprofit evaluation sector as I do in my sexual identity.

But why this stigma? Upon further reflection, I’ve always thought of the discipline of evaluation as rather dry. This perception stems specifically from how evaluation is reported. I’m an avid reader of fiction and nonfiction and on average read 3-6 books per month, but whenever I have to read an evaluation report or memo, I find myself putting it off, dreading the prospect of reading anywhere from forty to 200+ pages of wordy, inaccessible, and soulless prose. Oftentimes, after the first page or two of the “Executive Summary,” I’m nodding off on the sofa before leaping toward the kitchen to brew a pot of coffee. In a caffeinated fever state, I white-knuckle through the first twenty pages, after which I reluctantly skim the remainder. In our social media age, where nonprofit leaders are overworked and overscheduled, who has the time or attention span to read any report over a few pages? It’s almost as if clients are paying us by the word.

Of the hundreds of evaluation reports I’ve read over the years, the vast majority are at best boring and at worst completely incomprehensible. The worst offenders are filled with jargon and lofty visions, and often fail to answer the basic questions of what, where, when, why, how, and how much (let alone the “so what” or “to what end”). Recently, a refreshingly “short” evaluation report (only 28 pages!) crossed my desk, covering a philanthropic initiative to address opportunity gaps for youth of color. I read it twice. But I was still left unclear whether the initiative had achieved anything. The bulk of the report focused on seemingly everything else.

The lack of brevity, clarity, creativity, and attention to “story” undermines what evaluators seek to achieve. If people in the social sector don’t want to read the reports, they won’t, and therefore they won’t learn from them. And if they don’t learn from them, organizations can’t develop or pivot their strategies, drive continuous improvement, or make informed decisions. Friends of mine who serve as Program Officers at foundations and Executive Directors of nonprofits have often confessed to me that they actively “dislike” evaluation and avoid it until absolutely necessary.

The sad part of all of this is that evaluation has the potential to be incredibly useful. Thoughtful and well-executed primary and secondary research is critical to ensuring that we are collectively learning as a sector. Moreover, as a consultant, I find most evaluation processes quite fun. Yes, fun! I enjoy sleuthing and refining the upfront research questions with the client, focusing on which audiences we seek to target, and most of all, conducting interviews and talking to all of the smart and committed people working in the field. In doing this, I often feel like an investigative journalist, or even a detective, seeking real answers, fresh perspectives, multiple sides of the story. It’s like peeling an onion, revealing new layers of cellular complexity and pungent intrigue. When I talk to my peers, they too say these aspects of our work are the most fulfilling and intellectually engaging. So if this is the case, why do we convert all of the exciting insights, stories, learning, and data into the most boring and wordy form possible?

This brings me to why I’m reaching out to the broader nonprofit sector for thoughts and feedback. What are some ways we can make evaluation reporting more interesting, engaging, and accessible*?

Below is my initial list of recommendations:

  1. Keep it short: As mentioned earlier, evaluators need to rethink the length of deliverables. Can we say what we need to say in half the words? Can we convey information and complexity in 3-5 pages or just a dozen slides? What information would you convey if you were limited to one page, or even a few sentences?
  2. Emphasize story: Storytelling is the oldest form of communication for a reason. Entire cultures have passed on their histories this way because stories engage and stay with the reader. Given this, how can evaluation deliverables use storytelling techniques (e.g. plotting tension in the story through “cause and effect”, character development such as mapping the journey of an organization vs. an individual) to engage their audience? We need to consider narrative evaluation** and other related methodologies to capture and engage readers in the digital age.
  3. Write like The New York Times: Clear writing is clear thinking. Why do so many evaluation reports read like they are written in a different language, aka Academ-ese? If half of the sentences contain jargon, chances are the report doesn’t contain clear thought. Why can’t evaluation memos read like a well-written news article? This means shorter sentences, less jargon, writing to a high-school reading level, and most importantly, maintaining a sense of story, where all of the facts and headlines are laid out clearly.
  4. Move past the PDF: Why must all evaluation reports be long-form documents formatted as PDFs? How about translating evaluation findings into a concise digital format that can be read on tablets and phones? No one prints anymore, especially with the move to working from home, so why are we still using PDFs?
  5. Make it interactive if at all possible: Even newspapers, those dinosaurs of mass media, have moved online and taken up infographics, data visualization, and dynamic digital maps and charts. These tools drive reader engagement and make it more likely readers will stick with a report to the end. See this list for some great ideas and examples, and the brief sketch after this list for what a first step can look like.
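To make that last point concrete, here is a minimal sketch, using entirely hypothetical data and the open-source Plotly library, of how a handful of evaluation findings could be published as an interactive chart in a standalone HTML file rather than buried in a static PDF. The outcome areas, figures, and file name below are illustrative assumptions, not findings from any real report.

```python
# A minimal sketch (hypothetical data and file names) of publishing a few
# evaluation findings as an interactive chart instead of a static PDF table.
# Uses the open-source Plotly library; readers can hover, filter, and zoom.
import pandas as pd
import plotly.express as px

# Illustrative findings: share of surveyed grantees reporting improvement
# in three made-up outcome areas, at baseline and in year three.
findings = pd.DataFrame({
    "Outcome area": ["College enrollment", "Mentorship access", "Program retention"] * 2,
    "Period": ["Baseline"] * 3 + ["Year 3"] * 3,
    "Grantees reporting improvement (%)": [42, 35, 58, 61, 57, 74],
})

fig = px.bar(
    findings,
    x="Outcome area",
    y="Grantees reporting improvement (%)",
    color="Period",
    barmode="group",
    title="Illustrative evaluation findings (hypothetical data)",
)

# Write a standalone HTML file that opens in any browser, on a laptop or a
# phone; no printing, no 200-page PDF required.
fig.write_html("evaluation_findings.html")
```

Even a single chart like this, embedded in a short web page, lets a board member or program officer explore the numbers in thirty seconds instead of skimming forty pages.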

* I would argue that accessibility is a precursor to equity.

** A method of inquiry that relies on various forms of storytelling to unpack and link organizational efforts and investments with results, outcomes, and impact.