Developing the regions' learning

The purpose of this report is to (1) analyse the so-called learning plans that the counties have been commissioned by the government to draw up; (2) study the results of the ongoing evaluation method used in the national structural fund programme and the regional structural fund programmes; and (3) analyse how learning within the regional growth policy can be developed.

In an earlier study (Tillväxtanalys, 2011b), Growth Analysis reviewed the so-called learning plans that the counties have been commissioned by the government to draw up for the current year on the basis of Growth Analysis’ guidelines. The aim is to contribute to the process now being initiated by the regional self-government bodies, municipal cooperation bodies and county administrative boards to strengthen the regional growth policy.

The learning plans can be viewed as an attempt to build a system for learning within the regional growth policy, but the difficulties are many and must not be underestimated. Many players and levels are involved, which can lead to delays and to difficulties in establishing new lines of thought and interventions. Growth Analysis sees a need for reflection on a kind of learning that concerns both form and content. This also means that many of the interventions that are important in getting a learning system to work better at regional level encompass both administrative aspects and questions of content.

In collaboration with Kontigo, Growth Analysis has analysed the learning plans and made proposals for how learning, primarily at regional level, can be strengthened, with the learning plans submitted by the counties as the starting point. The purpose of the commission was thus to deepen knowledge of how the regions’ learning plans are applied in practice and, on the basis of this knowledge, to draw up proposals for how regional learning can be developed.

In the continuing work of following up and evaluating the learning plans, Growth Analysis recommends (Tillväxtanalys, 2011b) that the regions consider the following: a future-oriented view; evaluation of the previous learning process to identify relevant obstacles to learning; clarity as regards the level of ambition and delimitations; planning of various types of intervention to strengthen all three phases (planning, implementation and sign-off); and more reflection on the project’s content.

Within the framework of the structural fund programmes, the focus on evaluation and results has been further strengthened over the 2007–2013 period, and relatively substantial resources are dedicated to ongoing evaluation. Here Growth Analysis presents the concept of ongoing evaluation and its results within the national structural fund programme and the regional structural fund programmes, and suggests how learning at the regional level can be strengthened.

Growth Analysis has established that ongoing evaluation has had a very high level of ambition, which has to a significant degree reinforced learning within the structural fund projects and programmes initiated during the current programme period. Growth Analysis’ overall conclusion, however, based on the programme and project evaluations we have reviewed, is that the evaluators have not managed to resolve the difficulties in producing credible evaluation results. They have nevertheless contributed valuable results concerning the actual implementation of programmes and projects.

In summary, we draw the following conclusions on the results of ongoing evaluation, based on the review we have made:

  • There are still problems with evaluating goal attainment.
  • The evaluators have had to devote a considerable amount of time to trying to make the programme logic of the projects visible after the fact.
  • It was easier for the evaluators to elucidate the actual implementation than to describe the results of the interventions that had been implemented.
  • Measures of comparison are not widely used in the project evaluations.
  • There are examples of final evaluations that are totally lacking in evaluation strategy, display significant shortcomings in methodology, and where no attempt is made to present any results.

Growth Analysis therefore proposes that the evaluators’ requests concerning the qualitative indicators used during the current programme period be further reviewed, and that the changes necessary to better control and monitor the development of results over the next programme period be made.

It must, however, be pointed out that reporting credible results within these structural fund programmes is no easy task, due to the way the programmes and projects are structured and implemented. The Swedish Agency for Economic and Regional Growth has earlier stated that this kind of additional or innovative programme cannot be evaluated in any simple way using so-called robust assessment methods and monitoring indicators. The agency’s opinion is that the indicators are often not reliable and that robust assessment methods have difficulty capturing the ambition of influencing structures, institutions and attitudes. At EU level, both impact assessments and theory-based evaluations have been held up as legitimate ways of trying to get at the results of the structural fund programmes and projects. In this case, the Swedish Agency for Economic and Regional Growth has prescribed theory-based evaluation as a way to address the problems that exist as regards reporting effects and results.

Growth Analysis is of the opinion that we should prescribe neither the one method nor the other, but instead try to combine different methods as best we can in order to find, as far as possible, valid indications that these interventions really deliver results given the resources applied. All methods have their weaknesses, and the theory-based method of assessment that ongoing evaluation has embraced relies strongly on the belief that, on the basis of scientific theories, it is possible to capture and interpret results in programmes that have largely been formulated in a political context. This does not make the interpretations any easier but leads to other kinds of difficulties in interpreting the results. The review we have made includes examples of such difficulties, where the evaluators encounter problems when they try to place the reported results in a scientific theoretical framework.

Theory-based approaches to evaluation have the advantage of being broad and can be used to analyse causal mechanisms in different parts of the organisation in greater depth, but they also require that information of different kinds be collected, processed and analysed. Attempts have been made, through theory-based impact evaluation, to find evidence and knowledge of long-term effects, and in the international arena a mix of methods has been advocated that largely consists of combining impact assessments with theory-based evaluations. These are both resource-intensive and time-consuming activities, but they provide an opportunity to determine what works and what does not when the focus is on long-term effects.

Our proposals for developed learning are therefore as follows:

  • Work to establish the methods. The analysis has shown that the needs concern establishing working methods in which learning is a distinct part of quality development in the regional growth policy. The main challenge is to build professional capacity to implement the policy, with learning included. The political level also needs to see quality and learning as a centrally prioritised area of the regional growth policy.
  • Develop the system view of the regional growth policy and link it to learning. The elements and complexity of the regional development policy need to be clearly described. Sorting and structuring interventions of different kinds gives a clearer picture of both the parts and the whole. At the strategic level, i.e. linked to the overarching work of developing the regions, it is our opinion that the development of learning should be adapted to the needs we see at this level. At the programme level, the regions must identify which of all the programmes that concern the region they also have implementation and learning responsibility for. The main task at this level is to define assessable goals for the results and impact of a programme when the programmes are designed and begin. These goals should be formulated as changes in the conditions that one aims to influence by means of the interventions in the programme.
  • Create continuity in learning.

Finally, it is impossible to take all steps at once, and it may be necessary to limit the level of ambition as regards individual parts of the growth policy system described above, or as regards different types of learning. It would therefore appear to be a good idea to integrate a plan for how learning can be developed in stages into documents such as the regional development strategy.

Serial number: Report 2013:02

Reference number: 2011/056
