Organizing Dialogue, Experience and Knowledge for Complex Problem-Solving

Visualizing Evaluation and Feedback

August 3rd, 2011

Seeing Ourselves in the Mirror of Feedback

Evaluation data is not always made accessible, and part of the reason is that it doesn’t accurately reflect the world that people see. To make better decisions based on data, a key step may be creating the mirrors — visualizations that reflect programs back to themselves in ways their people recognize.

Program evaluation is all about feedback and generating the kind of data that can provide actionable instruction to improve, sustain or jettison program activities. It helps determine whether a program is doing what it claims to be doing, what kind of processes are underway within the context of the program, and what is generally “going on” when people engage with a particular activity. Whether a program actually chooses to use the data is another matter, but at least it is there for people to consider.

A utilization-focused approach to evaluation centres on making data actionable and features a set of core activities (PDF) that help boost the likelihood that data will actually be used. Checklists such as the one referenced from IDRC do a great job of showing the complicated array of activities that go into making useful, user-centred, actionable evaluation plans and data. It isn’t as simple as expressing intent to use evaluations: much more needs to go into the data in the first place, and into the organization’s readiness to use that data.

What the method of UFE and the related research on its application does not do is provide explicit, prescriptive methods for data collection and presentation. If it did, data visualization would deserve to be front and centre in that discussion.

Why?

When the data is complex, our ability to process the information an evaluation generates is limited, particularly when we are expected to connect disparate concepts. David McCandless has made a career of taking very large, complex topics and finding ways to visualize results as meaningful narratives that people can engage with. His TED talk and books provide examples of how to use graphic design and data analytics to develop new visual stories through data that transcend the typical regression model or pie chart.

There is also a bias we have towards telling people things, rather than allowing them to discover things for themselves. Robert Butler makes the case for the “Columbo” approach of inviting people to discover the truth in data in the latest issue of the Economist’s Intelligent Life. He writes:

What we need to do is abandon the “information deficit” model. That’s the one that goes: I know something, you don’t know it, once you know what I know you will grasp the seriousness of the situation and change your behaviour accordingly. Greens should dump that model in favour of suggesting details that actually catch people’s interest and allow the other person to get involved.

Art, or at least visual data, is a means of doing this. By inviting conversation about data, much like art does, we invite participation, analysis and engagement with the material, which not only makes it more meaningful, but also more likely to be used.

At the very least, evaluators might want to consider ways to visualize data simply to improve the efficiency of their communications. To that end, consider Hans Rosling’s remarkably popular video produced by the BBC showing the income and health distributions of 200 countries over 200 years in four minutes. Try that with a series of graphs.

Filed under: art & design, evaluation Tagged: art, contemplative inquiry, data visualization, design, program evaluation, research, utilization-focused evaluation
