This reflection is one of the 20 lessons included in the paper Lessons learned on promoting better links between research and policy in Latin America.
Throughout the six years of work within our programme Spaces for Engagement, we defined very specific objectives for each capacity building activity. Though these were implicitly linked to the general goals of the programme, we did not apply an overall framework or theory of change to capacity building. Instead, we aimed to learn from different types of interventions, using them as pilots to detect where we could add the most value.
This approach to objectives is consistent with what others have found in terms of planning, monitoring and evaluation. Horton (2002) points out that the most common techniques for planning and managing development projects and programmes usually assume that objectives are well defined and that blueprints and logical frameworks can be developed to properly guide implementation, monitoring and evaluation. However, he argues, blueprint approaches rarely work for capacity-development efforts.
He goes on to suggest that “Capacity-development efforts can benefit from a solid initial diagnosis and proper planning. But the plans developed should be viewed as works-in-progress rather than finished blueprints. Managers involved in capacity-development efforts need the flexibility to be able to modify planning targets and implementation procedures as conditions change and lessons are learned (Mosse, Farrington, and Rew 1998).”
On the other hand, capacity building was for us both a means and an end, since it was a way to achieve the larger goal of SFE: to support concrete links by creating spaces of engagement with the participation of representatives of policy research institutions (PRIs) that conduct or use research to influence policy, policymakers, or decision-making processes. This purpose became the underlying theme and the glue among the diverse CB activities. As a result, each CB activity was a concrete space of engagement where knowledge was shared among experts and members of PRIs. At the same time, each space created was an end in itself, since we had specific objectives to achieve through its development. Furthermore, some of these spaces were linked: for instance, several participants of regional conferences or online courses were selected to conduct peer assistance activities.
Whether to consider CB as a means or an end is by no means an irrelevant choice. We need to know, and be clear about, why we are doing this. As one participant of the CB group that we created to discuss these issues stated, some organisations might approach it as an end in order to avoid being prescriptive in terms of principles and simply focus on transversal knowledge that can be applied towards different means. Yet, in our field, it is not very likely that an institution will want to operate from such a “neutral” position. Worse still, the risk is that a CB leader is not even aware of their own beliefs and positions on many issues, and conveys apparent neutrality in order to attract more demand.
The end of a CB effort is tightly linked to the identity and expertise of whoever offers it. That is why it is crucial to have a clear view of, and explicit assumptions about, the intended and unintended effects of the effort. In this case, CIPPEC is itself a think tank that believes in the value of informing policymaking with research and works to encourage this in Argentina; we had already conducted research on bridging research and policy, we had local experience of the interaction between civil society leaders and policymakers, and we had organisational, action-based experience of bridging research and policy across a diverse set of policy areas. For its part, GDNet has ample experience in helping southern researchers communicate their work more effectively through its series of research communications capacity building training events and its range of learning materials. It also brought to the table a recognised track record in building and sustaining regional partnerships, a key pillar for the success of the programme.
A second main aspect of how we established CB objectives was the level of intervention. In our case, due to the limited budget in the initial phase, we decided to focus mainly on individuals. We were well aware that this would not directly lead to organisational change. The literature on CB is clear on this point: as Horton (2002) argues, “It is often assumed that developing individual capacities will automatically lead to improved organizational capacity and performance. This is not the case. For example, there are many cases where individuals have developed skills in participatory research, but very few cases where participatory research has become institutionalized in the standard operating procedures of research or development organizations (Blackburn and Holland 1998).”
For us, defining the adequate level clearly depended on both our expertise and the financial resources we had available to develop capacity. It is important to measure carefully the scope of the possible intervention and to compare this with the available resources. For example, a think tank based in Peru that needs to seek funding continuously, in an environment where financial support for Latin America is decreasing, will probably choose a different level for its CB strategy than the Think Tank Initiative, which works in Africa, Asia and Latin America with secured long-term funding, or AusAID, which is investing AU$100 million in developing the knowledge sector in Indonesia.
In this sense, one participant of the CB group agreed on the importance of thinking about the role of funding in setting objectives. While acknowledging that core funding is usually scarce, she pointed out that organisations still work in very different ways to tackle this challenge, for example by looking for alternative ways to develop CB even without a specific budget, such as collaborating with an existing university.
Related to this point, it is worth emphasising the importance of sustainability; another member of the CB group expressed that “the problem with CB (however defined) is that we all recognise that it is important but no one has cracked the best way of delivering it in ways that are systemic and sustainable”.
This is related to the timeframe used to establish goals: we can set long-term goals or prefer very short ones tied to specific activities. For example, working with universities to strengthen both “sides”, research suppliers and research users, implies a much longer intervention than conducting an initial workshop that prompts a university’s interest in taking up a change in curricula.
Another relevant consideration is linked to purpose and principles: many players in this field explicitly indicate the intent or direction of their capacity building efforts. Many emphasise that capacity is for performance (e.g. strengthening financial stability); others promote efficiency, effectiveness, sustainability, etc. (e.g. PEPFAR considers CB as the ability of individuals and organisations or organisational units to perform functions effectively, efficiently and sustainably). Hence, we should think about whether we simply want to develop a specific capacity in itself, whether we want to improve performance based on that capacity, and/or whether we also want to change the way things are done.
Finally, one should also consider how to involve those whose capacity will be built in the definition of the objectives: there are several ways of engaging participants in defining the expected outcomes of a CB effort. INASP and IDS, for example, in their Training Programme: Pedagogy Skills for Trainers of Policy Makers, asked participants on the first day of the training to write down their own objectives on post-it notes. These were then matched to the facilitator’s objectives: most of them coincided, and two new objectives identified by participants were added to the list.
This is especially relevant when CB is organised in response to the demand of a funder or client. As a member of the CB group highlighted, “the needs of different clients are not comparable, the CB needs of a university interested in developing Public Policy Analysis capabilities are different from those of a Parliamentary Committee interested in performing Technological Assessments prior to budgetary approval, as well as there are not identical ‘cultures’ in different environments”.
To sum up, we believe that for CB objectives to be of value, it is important to think about, discuss and agree on a set of key aspects:
Whether CB is a means, an end, or both
Our identity and expertise
Desired level of intervention
Funding and sustainability
Timeframe
Purpose and principles
How those whose capacity will be built will participate in the definition of the objectives