Research on evaluation
From formal theory to real-world solutions
Research impact story
Since developing our evaluation capacity in the 1980s through national evaluations of UK government education policies, we have played an influential role in the development and implementation of educational policy in the UK and in strengthening evaluation practice worldwide. Beginning with evaluations of UK vocational education policies in the 1980s, we focused on European technology-enhanced learning (TEL) projects in the 1990s. In the first decades of the 21st century, we turned to higher education policy and practice in the UK and to building evaluation capacity, through a global evaluation presence in the International Organisation for Cooperation in Evaluation (IOCE) and the EvalPartners initiative, supported by all UN agencies, all evaluation societies and the European Commission. Using the distinctive ‘social practice’ approach to evaluation we have:
- Changed policy makers’ conceptions of how policies impact on practices;
- Improved the management of national teaching and learning policy initiatives;
- Influenced practices at an institutional level;
- Shaped international policy debates about how to develop usable and socially just evaluation.
Impact of our Research
1. Research has changed policy makers' conceptions of how policies impact on practices.
For example: In the UK, the Scottish approach to quality enhancement in higher education has been supported by evaluative resources from HERE within the Department for some 11 years. QAA Scotland have acknowledged the key role of the analytical framework, used by Murray Saunders and his research team, in providing QAA Scotland with a model of cultural change with which to conceptualise institutional development in terms of systems of shared practices and values. The Head of QAA Scotland commented: “We took up and ran with your model of culture change [set out in Saunders, M. et al., 2011] through explicit work on developing and supporting quality cultures and we still conceptualise the model of institutional development in terms of systems of both shared practice and shared values”.
2. Research has improved the management of national teaching and learning policy initiatives.
For example: Saunders' evaluation reports and discussions with policy makers in Scotland improved the ways in which Scottish teaching and learning policy initiatives are managed. Based on evidence from the team's evaluations, external institutional reviews were revised to align more clearly with the vision of an enhancement approach to quality (re-emphasising supportive and collegiate approaches) and to place less emphasis on the assurance or checking of systems.
3. Research has influenced practices at an institutional level.
For example: Through reports and ongoing discussions with individual higher education institutions, Saunders' team had a direct influence on the practices of staff, and thereby of students, in Scottish universities. Among other outcomes, Saunders' evaluation was crucial in developing a change-focused approach to enhancement at the institutional level, and it had a direct impact on how some institutions approached the internal evaluation of their educational innovations. Students were among the stakeholders who were able to use the evaluation outputs in training for their role within internal reviews.
4. Research has shaped international policy debates about how to develop usable and socially just evaluation.
For example: The social practice approach to evaluation (developed by Saunders and other members of here@lancaster) has had international impact through policies and related publications from the European Commission, publications from United Nations agencies such as UNICEF, and through Saunders' vice-presidency of the International Organization for Cooperation in Evaluation (IOCE) and his place on the executive of the EvalPartners initiative. This presence is exemplified by Saunders' co-chairing of the Task Force dedicated to Equity-Focused and Gender-Responsive evaluations, which manages funding for two international collaborations designed to strengthen international capacity in this area under an initiative titled The Innovation Challenge. The Head of Evaluation at the Directorate General for Regional Policy (European Commission) stated: “Your work [Saunders, M. 2011; 2012] fed into a new concept which is reflected in the draft regulation for the 2014-2020 period, which is aimed to be clearer on the different types of indicators used and the differentiated roles of monitoring and evaluation... [It confirmed] that we need to build up practice at different levels of European Governance and seek to accumulate evidence across different contexts for the different policy areas”.
Saunders, M., Trowler, P. and Bamber, V. (2011). Reconceptualising evaluation practices in Higher Education. McGraw Hill/Open University Press.
Saunders, M. (2011). Capturing effects of interventions, policies and programmes in the European context: a social practice perspective. Evaluation, 17: 89-103.
Saunders, M. (2012). The use and usability of evaluation outputs: A social practice approach. Evaluation, 18: 421-436.