Title: Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme

Keywords: research implementation; implementation science; research utilization; research impacts; research payback; research evaluation

Publisher: BioMed Central Ltd

Citation: Soper B, Hanney S (2007) Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme. Implementation Science 2:7
Abstract:

Background: Concern about the effective use of research was a major factor behind the creation of the NHS R&D Programme in 1991. In 1994, an advisory group was established to identify research priorities in research implementation. The Implementation Methods Programme (IMP) flowed from this, and its commissioning group funded 36 projects. In 2000, responsibility for the programme passed to the National Co-ordinating Centre for NHS Service Delivery and Organisation R&D, which asked the Health Economics Research Group (HERG), Brunel University, to conduct an evaluation in 2002. By then, most projects had been completed. The evaluation was intended to cover: the quality of outputs, lessons to be learnt about the communication strategy and the commissioning process, and the benefits from the projects.

Methods: We adopted a wide range of quantitative and qualitative methods, including documentary analysis, interviews with key actors, questionnaires to the funded lead researchers, questionnaires to potential users, and desk analysis.

Results: Quantitative assessment of outputs and dissemination revealed that the IMP funded useful research projects, some of which had considerable impact against the various categories in the HERG payback model, such as publications, further research, research training, impact on health policy, and clinical practice. Qualitative findings from interviews with advisory and commissioning group members indicated that when the IMP was established, implementation research was a relatively unexplored field. This was reflected in the understanding that members of the advisory and commissioning groups brought to their roles, in the way priorities for research were chosen and developed, and in how the research projects were commissioned. The ideological and methodological debates associated with these decisions have continued among those working in this field. An effective communication strategy for the programme as a whole was particularly important. However, such a strategy was never developed, making it difficult to establish the general influence of the IMP as a programme.

Conclusion: Our findings about the impact of the work funded, and the difficulties faced by those developing the IMP, have implications for the development of strategic programmes of research in general, as well as for the development of more effective research in this field.
Appears in Collections: Health Economics Research Group (HERG)
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.