[Editor’s Note: This is the first part of the interview with Diego Gonnet, Director of the Department of Information for Management and Open Government at the Direction of Management and Evaluation, Office of Planning and Budget, Presidency of the Republic of Uruguay. He holds a B.A. in Political Science from the University of the Republic of Uruguay and an M.A. in Public Policy from Victoria University of Wellington, New Zealand. This interview is part of a wider series of talks with current and former officials, politicians and researchers from different countries in Latin America about the role of research and information in the public policy process. The interviews are an input to an upcoming online course produced by P&I aimed at promoting the use of evidence to inform policy decisions in Latin America. The Spanish version of this interview is available here.]
Based on your experience, what role can evidence generated by research play in the policy process?
In Uruguay the reality is uneven. On the one hand, there are ministries that have relied heavily on consultants and universities. For example, the Ministry of Social Development maintains close links with the Faculty of Social Sciences and the Faculty of Economics of the University of the Republic, from which many of its officials come. Especially in the analysis and evaluation of results in the social sector, there is a large flow of reports, consultancies and academic work. Agencies involved in industrial, productive, innovation and energy policies also make intensive use of qualified research. On the other hand, there are many units that do not draw on scientifically produced evidence to make decisions.
How did the need for an agency to produce management information arise?
Around 2005, a comprehensive diagnosis was developed of the main problems common to the ministries and the major providers of public services (mainly health and education). One of the problems identified was that these agencies had neither ‘brains’ nor structures for planning, monitoring and evaluation. They were primarily machineries oriented to administrative processes. The message strongly conveyed to politicians was that they were driving “cars without dashboards”. The diagnosis was part of an ambitious reform of the central administration, which was ultimately executed only partially. But that attempt left behind the idea that a cross-cutting agency was needed to produce and make available information from the various ministries, and to promote the development of skills and practices for planning, monitoring and evaluation. Thus the Direction of Management and Evaluation arose within the Office of Planning and Budget.
What actions does the Direction of Management and Evaluation take to promote the use of information in decision-making?
First, we work with ministries and their sub-units to improve their five-year plans. We help them set goals and define performance and process indicators. Each year, as part of the Budget Law, we collect the data for these indicators and present the results to Parliament, so that it has both financial and performance information on the agencies. We also establish and oversee contracts with different organizations, under which budget funds are conditional on achieving predefined management goals.
Second, we have tried to promote Balanced Scorecards in some ministries and decentralized services. But the tool never came into full use by the agencies as a means of systematically monitoring the achievement of the ministries’ medium-term strategies. Rather, it has served to integrate information that was scattered within a given ministry and make it more accessible to decision makers.
Third, we promote evaluations of public interventions. To do this, we use a Chilean methodology, adapted to our context, called Design, Implementation and Performance Evaluation (DID, for its acronym in Spanish).
Fourth, and especially at the Department of Information for Management and Open Government, we publish documents on public policy issues, open up government databases, and promote their use among academia and other users. In doing so, we seek to send different audiences the message that data matters, and that we need to develop products based on qualified empirical evidence and to discuss public policy in those terms and with those inputs. This information is available at the Uruguayan Observatory of Public Policies, a public website, so as to strengthen the culture of thinking about public problems with data in hand. Similarly, though still in incipient form, we aim to support agencies in improving their information systems for monitoring and evaluation.
What are DID evaluations?
DID evaluations are conducted with information already available and last four months. The methodology consists of three stages: analysis of the internal consistency of the intervention’s design (basically, the consistency between its objectives and proposed activities); gathering of available information on process and product indicators; and a survey of existing performance indicators and other available evidence on the achievement of the intervention’s expected results. Finally, the results of the evaluations are sent to the ministries responsible for the evaluated interventions, to the Ministry of Finance, and to the Office of Planning and Budget. Improvement agreements, under which interventions commit to carrying out certain activities to correct the problems identified, are also defined.
When do you or your team turn to available evidence in your work?
The Direction of Management and Evaluation usually starts its work with a review of what is being done around the world on the subjects we deal with. For example, we recently designed a survey of citizen satisfaction with Uruguayan public services; we conducted a thorough review of international experiences and also looked into other tools already used in the country with similar objectives. In addition, we submitted the project and its questionnaire for review by national experts on population satisfaction and public opinion. We also look at other experiences when we propose new ways of presenting results to the ministries within the budget process. Furthermore, we are constantly compiling and integrating data scattered across ministries to produce inputs for the Director and Deputy Director of the Office of Planning and Budget.