The way we see it

Quality improvement

Authors: Toby Hillman, Alice Roueche 

Publication date: 08 Apr 2011


Clinical audit is failing; long live quality improvement, argue Toby Hillman and Alice Roueche

We are facing some of the biggest changes to healthcare in England since 1948. Amid much controversy and uncertainty, strong themes that will shape the future NHS landscape are emerging. Two of these are “quality” and “clinical leadership.” A challenge for those charged with developing the medical workforce of the future is to embed these themes within medical training.

This has been made easier in part by the development of the medical leadership competency framework,[1] which has been adopted by all the medical royal colleges. The framework highlights that “improving services” is a fundamental part of clinical leadership, yet gaining practical experience in this field remains a challenge.

On the surface this has been addressed through the compulsory involvement of junior doctors in clinical audit, one of the pillars of clinical governance. As a meaningful educational and service improvement tool, however, audit is failing to deliver. It needs either to be laid to rest as a mandatory requirement for career progression or to be invigorated with a new purpose and identity.

What’s wrong with junior doctor led audit?

A number of studies have shown that the traditional “clinical audit” expected in medical curriculums is an ineffective way of improving practice or changing process—the very thing it is intended to do.[2] [3] [4] In one study only 27% of audits were considered complete, and only 22% were reaudited. The steps studied in that paper may lead us to conclude that only 5% of audits led to any change in the practice or process studied.[2] This is a worrying finding, as it implies that all the other audits were simply data collection exercises. Time is one of a junior doctor’s most valuable resources,[5] and we cannot tolerate such waste from either an educational or an efficiency viewpoint.

What are the main blocks to effective audit?

Purpose: Many clinical audits are done as a “tick box” exercise, with little understanding of why the audit is being carried out.

Burden of evidence: Trainees may perceive a need to collect large amounts of data from poor quality sources to justify the conclusions drawn.

Time: It takes a long time to extract meaningful data from notes and written records.[6] In addition, junior doctors are often in a department or job for just three or four months, especially at the foundation level. This makes completion of the audit cycle difficult.[7]

Organisational inertia: External pressures on clinicians to measure, report, and improve often result in resistance to change and a desire to maintain a stable environment for clinical work.[8] [9]

Lack of support: Many junior doctors feel that there is not enough support from senior staff and audit departments.[10]

Cultural factors: Traditional hierarchies and cultural norms within the medical profession can disempower junior doctors, and role models for leading change are often lacking.[11]

A quality improvement approach—what is so different?

Rather than seeing quality improvement as being separate from clinical audit, we must look at quality improvement as the wider goal, an umbrella under which clinical audit can sit. In examining a clinical process and seeking to improve it there are circumstances where clinical audit might be appropriate, but in many more cases a less rigid quality improvement approach is more fitting. In training our doctors for the future we need to equip them with skills to lead improvement in all settings, and restricting their involvement to clinical audit does not achieve this.

Emphasis

The current emphasis for most junior doctors undertaking clinical audit is on data collection.[12] A quality improvement approach changes this focus. Rather than being an end in themselves, data should be seen as a resource that can show both that a change is needed and that an improvement has been made. Quality improvement projects shift the approach from overwhelming scientific rigour in data collection to a more collaborative working style: investigating problems, identifying solutions, and working with an entire team to raise standards.

We recognise, however, that change is difficult. Many studies have looked at the reasons for the failure of efforts to change in healthcare.[13] Strategies have been developed to facilitate change that use small tests of change within a system. These efforts are often more likely to result in sustained changes of practice than large change efforts perceived as being “top down.”[14] Engaging with a system of care and understanding how to improve it seems more valuable than trawling through hundreds of notes to prove that not everyone writes in black ink.

Criteria and standards

Many clinical audits are done against nationally agreed criteria and standards, often from clinical guidelines.[15] These criteria and standards offer a benchmark against which to judge a service and are therefore vital when assessing performance. However, many areas that affect patient safety, patient experience, and performance of a service have no agreed standards or criteria.

The freedom offered by a quality improvement approach allows junior doctors to concentrate their efforts not only on clinical problems for which there are clearly defined standards but on a wider range of issues. These broader aspects of healthcare can have an equally important bearing on the standard of care delivered. This freedom also encourages an ethos of looking for potential improvements and learning to challenge illogical, inefficient practices.

Motivation

In the authors’ experience, the motivation to do an audit usually falls into three broad categories:

  • I need this for my annual review of competence progression

  • I am really annoyed by this aspect of care and think it should change

  • I am not sure if this is research or audit, but I’ll push it through as audit because that is easier.

Since audit became a compulsory activity for doctors in training, it has been seen mainly in terms of “ticking the box.” We believe that achieving the change desired in the second motivation above is at the heart of quality improvement work and is far more likely to result in a change of practice, as there is a deeper personal motivation.

The third point is unethical and must not be condoned. It does, however, link to an important motivation for trainees: the desire for publications to build a competitive CV. With limited hours available, trainees are keen to focus on work that can be published. Good quality improvement projects can easily be written up for publication by using existing templates,[16] but this is not widely known.

Local context

Audits are often conducted against national standards and rarely take local context into account—rightly so, since unwarranted variation in key areas is not a desirable finding. Auditing high level outcome measures, however, means that the results lack local context. Quality improvement projects usually focus on the actual process of care at a local level. Including the key stakeholders in a local team allows specific local problems to be identified, rather than concentrating on high level outcomes. The intrinsically local efforts used by effective quality improvement projects allow more targeted solutions to be developed, and local ownership of the solution should enhance the sustainability of any change project.[17]

Participation and acceptance

Junior doctors are too often seen as peripatetic units of work rather than as individuals who can contribute meaningfully to the organisations they work in. To become an accepted and valued member of a community of practice, doctors should seek to understand the conventions underlying that community.[18] Thus a trainee who leads or engages with a well thought out quality improvement project that works towards a common goal, involves others, shares information, and is relevant to the local setting can improve their own acceptance within the professional community.

Learning organisations

The theory behind many popular approaches to quality improvement can be seen as having parallels in learning theory. There is a notable similarity between the PDSA (plan, do, study, act) cycle[19] and Kolb’s cycle of experiential learning.[20] A key point here is the value of “experience.” A real strength of quality improvement projects is that the idea for the improvement can be based on the trainees’ experience. To make an improvement they need to reflect on that experience to understand the process and plan the change project. The rest of the improvement cycle takes them through a learning process that helps them understand how organisations and systems work. On a larger scale, by encouraging a working culture that is based on improvement we are developing an organisation that learns.[21]

Putting this into practice

A quality improvement approach is increasingly recognised as an engaging and appropriate way to develop the leadership skills that trainees need. The London Deanery is running a “Beyond Audit” project to support trainees’ involvement in quality improvement. The Royal College of Physicians is piloting a similar project called “Learning to Make a Difference.” At the same time, fellowships in clinical leadership are giving junior doctors experience in quality improvement, and many are leading initiatives to engage fellow junior doctors. Organisations such as the Health Foundation and the King’s Fund have been supportive of these initiatives.

So far most work has been hospital based, but there are growing initiatives to spread this to primary care training as well. This is particularly relevant given the proposed NHS reforms,[22] which will put GPs in positions of important clinical leadership.

Conclusion

In the changing NHS climate there is increasing emphasis on improving quality within healthcare and also a move to encourage clinicians to lead this drive. Doctors’ training must therefore tackle the need for skills in quality improvement and leadership.

The time has come to recognise that junior doctors have a valuable contribution to make to healthcare delivery in the United Kingdom. Simply asking for a tick in the audit box at an annual review is not good enough. We need to issue a call to arms for juniors of all disciplines to take up the challenge to focus on change—for the better—and not simply record in endless audits that we are failing to live up to gold standards. In this way we can hope to develop a future generation of doctors who are equipped with the skills and attitudes to lead a constantly improving NHS.

Competing interests: None declared.

References

  1. Academy of Medical Royal Colleges and NHS Institute for Innovation and Improvement. Medical leadership competency framework. 3rd ed. NHS Institute for Innovation and Improvement, 2010.
  2. John CM, Mathew DE, Gnanalingham MG. An audit of paediatric audits. Arch Dis Child 2004;89:1128-9.
  3. Guryel E, Acton K, Patel S. Auditing orthopaedic audit. Ann R Coll Surg Engl 2008;90:675-8.
  4. Stanton E. An audit of audits. London Deanery, 2009 (unpublished data).
  5. Temple J. Time for training. Medical Education England, 2010.
  6. Gnanalingham J, Gnanalingham MG, Gnanalingham KK. An audit of audits: are we completing the cycle? J R Soc Med 2001;94:288-9.
  7. Collins J. Foundation for excellence. An evaluation of the foundation programme. Medical Education England, 2010.
  8. Gollop R, Whitby E, Buchanan D, Ketley D. Influencing sceptical staff to become supporters of service improvement: a qualitative study of doctors’ and managers’ views. Qual Saf Health Care 2004;13:108-14.
  9. Som CV. Nothing seems to have changed, nothing seems to be changing and perhaps nothing will change in the NHS: doctors’ response to clinical governance. International Journal of Public Sector Management 2005;18:463-77.
  10. Cai A, Greenall J, Ding DCD. UK junior doctors’ experience of clinical audit in the foundation programme. British Journal of Medical Practitioners 2009;2:42-5.
  11. Stanton E, Lemer C, Marshall M. An evolution of professionalism. J R Soc Med 2011;104:48-9.
  12. Nettleton J, Ireland A. Junior doctors’ views on clinical audit—has anything changed? International Journal of Health Care Quality Assurance 2000;13:245-53.
  13. Narine L, Persaud DD. Gaining and maintaining commitment to large-scale change in healthcare organisations. Health Services Management Research 2003;16:179-87.
  14. Berwick D. Developing and testing changes in delivery of care. Ann Intern Med 1998;128:651-6.
  15. Royal College of Psychiatrists. Clinical audit—step-by-step guide. [Link].
  16. Quality and Service Improvement Project report template. 2010. [Link].
  17. Narine L, Persaud DD. Gaining and maintaining commitment to large-scale change in healthcare organisations. Health Services Management Research 2003;16:179-87.
  18. Swales JM. Genre analysis: English in academic and research settings. Cambridge University Press, 1990.
  19. Cleghorn GD, Headrick LA. The PDSA cycle at the core of learning in health professions education. Joint Commission Journal on Quality Improvement 1996;22:206-12.
  20. Kolb DA. On management and the learning process. Working paper. Massachusetts Institute of Technology Alfred P. Sloan School of Management, 1973. [Link].
  21. Sheaff R, Pilgrim D. Can learning organisations survive in the newer NHS? Implementation Science 2006;1:27. [Link].
  22. Department of Health. Equity and excellence: liberating the NHS. 2010. [Link].

Toby Hillman, respiratory registrar and clinical leadership fellow
Alice Roueche, paediatric registrar and clinical leadership fellow, London Deanery

 tobyh@doctors.org.uk

Cite this as: BMJ Careers; doi: