Engaging with different stakeholders is regarded as very important, but there is little knowledge about what works. Sharing experiences about engagement was the central theme of the plenary workshop presented by John Young of the Overseas Development Institute on Friday morning. Increasingly, donors want to invest as much effort in getting research into use as in the research itself.
To illustrate the complexity of public engagement, the group discussed the question "What do you think would be convincing evidence of the value of public engagement?", which revealed the diverse ways of looking at this type of evidence and what is needed for engaging with the public. Among the many suggestions for evidence of the value of public engagement were: participation of the community; trust of the community; two-way communication and greater debate in the community; positive health/behaviour outcomes flowing from public engagement; visible evidence in the form of buildings, programmes and services; positive policy change; legislative reform; and new links and networks.
Resuming the session after the break-away, John discussed one of the Overseas Development Institute's programmes, RAPID, which promotes greater use of research-based evidence in development policy and practice (www.odi.org.uk/rapid). The group engages with a wide range of actors through tools including the media, web, publications, meetings and collaborative work. The policy process is "fantastically complex" – everyone tries to influence everyone else. Research-based evidence generally plays a small role in policy development (see presentation for more detail), although it can have significant impact, as seen for example in health service reform in Tanzania, which led to a 43-46% reduction in infant mortality.
“In terms of engagement, it is important to know what needs to be achieved. Indicators should be SMART in order to achieve this. Change takes a long time and many projects fail when the input ceases. If we want to see lasting benefits of research, we need to see changes in the behaviour of stakeholders, whether they be communities, researchers or policy makers,” said John.
There are several tools for measuring and assessing change (see IDRC’s website: www.idrc.ca). Learning through stories (what was the situation, what was the challenge, what was done, what was the result, what lessons can be drawn) is one way of learning from and documenting stories of change (www.odi.org.uk/rapid/toolkits). Horizontal evaluation (www.chsrf.ca) is done very well by the Canadian Health Services Research Foundation to evaluate its programmes.
ODI also does “after-action” reviews to evaluate how projects could have been done better (what was supposed to happen, what actually happened, why was there a difference, and what can we learn from it), which are simple and quick to do. Case studies (how did evidence shape policy decisions) and episode studies, which track back retrospectively from a policy change, can be useful as well. ODI combines the classical case study approach and the retrospective approach with “RAPID outcome mapping” to look at behavioural change. Social network analysis (www.rcuk.ac.uk/cmsweb/downloads/rcuk/pulications/evaluationsguide.pdf) is also very helpful for assessing who can influence those who need to be influenced.
The group then split into four to discuss what sort of evidence was needed in individual projects, what sort of evidence was needed about other public engagement projects, how that evidence should be obtained, and who should do what. The groups focused on policy, community, media and creative/other. Group feedback will be posted to the blog at a later stage.
posted by Greer van Zyl