
The regulation of medical training: a problem ignored

Authors: Benjamin Dean, Matt Jameson Evans 

Publication date:  15 May 2012


Benjamin Dean and Matt Jameson Evans look at how and whether training posts are being monitored for quality

Medical training in the United Kingdom has undergone many reorganisations in recent years, and these look set to continue with Andrew Lansley’s health bill, although the consequences of this bill for training remain unclear.[1] A crucial moment in training reform came in 2007, when the Medical Training Application Service (MTAS) and Modernising Medical Careers provoked junior doctors to go on a march, organised by Remedy UK. Since then many reviews of training have taken place, including those by John Tooke and John Collins,[2] [3] but are the roles and responsibilities of the various organisations involved in the regulation of training clear and transparent?

The current regulatory system

The General Medical Council’s “quality improvement framework” (QIF) lays out the structure of training regulation, and the GMC’s role is defined as “setting and regulating professional standards not only for qualified doctors’ practice, but also for both undergraduate and postgraduate medical education and training” (figure). Specifically, since the demise of the Postgraduate Medical Education and Training Board (PMETB), the GMC’s roles have included setting standards; identifying, through quality assurance, where these standards are not being met and ensuring that those responsible take appropriate action; and driving better standards in medical education and training across the United Kingdom.

Regulatory structure of medical training in the UK[4]

As the figure shows, many organisations have a role to play in the regulation of training, and the GMC states that the quality improvement framework will “align” this activity. The GMC is responsible for quality assuring educational bodies (deaneries and universities), training posts, and curriculum and assessment systems. As part of quality management, deaneries are responsible for educational governance, according to the standards defined in The Trainee Doctor.[5] The framework makes clear that if there are concerns about training a dean is able to remove a group of trainees from a particular setting, and this must then be reported immediately to the GMC. However, “only the GMC” may grant or withdraw training approval. A key part of the quality improvement framework is its reliance on an extensive network of competency based training methods, which include lengthy curriculums, workplace based assessments, and online training portfolios.

Freedom of information requests to the GMC and deaneries

We asked the GMC and the deaneries a series of specific questions about withdrawal of training posts before and after the creation of the PMETB. The GMC was asked how many posts it had withdrawn since taking control from the PMETB in 2009, and the deaneries were asked how many posts had been withdrawn from the year 2000 to the present day, how many threats to withdraw approval had been made, and the number of posts where training approval had been withdrawn after a visit from either the GMC or the PMETB.

The GMC admitted that it had not withdrawn approval from a single training post since taking full control from the PMETB. The GMC also admitted that it did not know how many posts had had their training approval withdrawn locally by the deaneries and that this information was not gathered systematically by the GMC or its predecessor, the PMETB. The GMC also confirmed that it had no record of training post withdrawals while the PMETB was in control (2005-9). The GMC stated that there was a degree of local variation in practice between deaneries and that the “GMC does not define when this locally driven removal of training posts from education providers should happen.” The GMC advised us that a “request could be sent to each deanery for them to confirm the number of training posts withdrawn over a specified period of time.”

The response from the deaneries was mixed. Three (East Midlands, West Midlands, and North East strategic health authority) responded that “since 2005 it has been the responsibility of the PMETB and more recently the GMC for training approval” and that before the PMETB’s inception it had been the responsibility of the appropriate royal colleges. The Yorkshire and Humber Deanery, the North Western Deanery, and the Severn Deanery confirmed that they held no records of post withdrawal. Several deaneries have not removed training approval from any posts (Defence, East of England, and Peninsula). Three deaneries had removed training approval from posts before and after 2005 (London; Mersey; Kent, Surrey, and Sussex). The Oxford Deanery and the Wessex Deanery have both removed training accreditation for posts since 2005, and the relevant regulatory bodies were informed (PMETB and GMC).

We received a response from the Royal College of Surgeons of England detailing the removal of training approval from 12 surgical training posts between 2002 and 2005. The Royal College of Physicians does not have data readily available. The Royal College of Paediatrics and Child Health confirmed that it does not have the data.

Overall, the GMC, the PMETB, and the deaneries reported that since 2005 not a single surgical training post had had approval withdrawn. This contrasts starkly with the period when the Royal College of Surgeons of England oversaw approval, during which an average of three higher surgical training posts had approval withdrawn each year. The lack of data from the other royal colleges makes any other comparisons impossible.

Although the removal of training approval is generally a last resort, it seems unclear whose responsibility it is to remove training approval. It is a cause for concern that the GMC has not removed approval for a single post, as is the lack of systematic data collection by educational bodies (despite the infrequency of these events). It is also of concern that some deaneries had informed the GMC or PMETB of these withdrawals of training approval but that the GMC seemed to have no easily accessible record of this.

Survey results

The GMC’s national training survey seems to gloss over several key issues; for example, the 2010 survey[6] showed some worrying statistics about training and hours worked. A considerable proportion of consultants (35%) believed that they were unable to deliver training at the same standard on a 48 hour rota as on a 56 hour rota. More than a quarter of trainees (27%) believed that their training needs were not being met within an average 48 hour week. The recent Remedy UK survey of surgeons in training has provided further evidence that working hours are still a big issue,[7] and this is backed up by other survey data from organisations such as the Association of Surgeons in Training.[8] The belief that in all specialties the problems posed to training by working hours can be solved by “higher quality” training appears questionable,[9] especially in the context of an NHS in which there simply is not the money to fund the consultant delivered service required for this higher quality training to happen.

With regard to regulation, Remedy UK’s survey of surgeons in training showed that almost half of respondents (46%) believed that their annual review (an annual review of competence progression (ARCP) or a record of in-training assessment (RITA)) did not give them an adequate opportunity to point out any training issues. Preliminary results from the Remedy UK survey of current foundation doctors in training have shown that 42% of over 600 respondents believed that the current foundation regulatory process did not give them an adequate opportunity to highlight any problems with their training; 34% were neutral; and only 24% believed that the regulatory process gave them an adequate opportunity.

An optimistic explanation for the lack of removal of training approval of any post by the GMC since 2009 might be that training is currently so excellent that this measure has simply not been necessary. Unfortunately this explanation does not seem to be consistent with the survey data and the opinions of trainees and trainers on the ground. In the opinion of the authors, the number of training posts removed each year is a reasonable surrogate marker of the effectiveness of the regulatory process.

The future

The Temple report states that “high quality training produces professionals who are both competent and confident.”[9] The current regulation of medical training is, however, producing professionals who invariably appear competent on paper but are not necessarily competent and confident in reality. One of the key reasons for this mismatch is the modern educationalist shift from a training model that relied on a time based apprenticeship to one that relies on the minimum standards inspired by competency based training.[10] [11]

The GMC states that a visit could be triggered if there is a “lack of opportunity for students or trainees to learn new skills under supervision such that they are unable to reach the required competences.” Our recent survey of foundation doctors showed that 87% (499 doctors) of respondents believed that incompetent trainees could obtain satisfactory results from workplace based assessments, highlighting the problem with the GMC’s reliance on failure to achieve competency on paper as a measure of training quality.

Tooke said that “the management of postgraduate training is currently hampered by unclear principles, a weak contractual base, a lack of cohesion, [and] a fragmented structure.”[2] These problems remain, as evidenced by the responses we received from the GMC and the deaneries about the withdrawal of training post approval. Accountability in the current regulatory system is confused and conflicting: deaneries are both providers and regulators, meaning that trainees are often afraid to voice any concerns they may have about their training.

Tooke also concluded that “the profession should develop a mechanism for providing coherent advice on matters affecting the entire profession,”[2] but this issue has been ignored by the Department of Health and has not yet been dealt with. The current regulatory system is run by the GMC, an organisation of political appointees; an effective regulator should arguably connect more directly with the front line of training. In our opinion, highly motivated doctors in training and their trainers are let down by a regulatory framework unburdened by accountability, one that potentially encourages minimum standards, drains enthusiasm, and stifles excellence.

If the regulation of training is to be reformed as part of the new health bill, then all the issues we raise above must be taken into account. Training reform must capture the experience of those who are actually involved on the ground and not be driven by inhabitants of ivory towers. There must be a separation of powers between delivery and regulation. If this could be achieved then the key issues that have been ignored until now, such as the diminishing training standards caused by contracted working hours and competency based training, could be addressed with a degree of confidence.

References

  1. Roland D. A new deal for deaneries? BMJ Careers 25 Oct 2011. http://careers.bmj.com/careers/advice/view-article.html?id=20005142.
  2. Tooke J. Aspiring for excellence. Final report of the independent inquiry into MMC. 2008. www.mmcinquiry.org.uk/MMC_FINAL_REPORT_REVD_4jan.pdf.
  3. Collins J. Foundation for excellence: an evaluation of the foundation programme. NHS Medical Education England, 2010.
  4. General Medical Council. Quality improvement framework. GMC, 2010.
  5. General Medical Council. The trainee doctor. GMC, 2011.
  6. General Medical Council. National training surveys. 2010. www.gmc-uk.org/static/documents/content/Training_survey-FINAL2010.pdf.
  7. Dean BJF, Pereira EAC. Surgeons and training time. BMJ Careers 26 Oct 2011. http://careers.bmj.com/careers/advice/view-article.html?id=20005162.
  8. Association of Surgeons in Training. Optimising working hours to provide quality in training and patient safety. 2009. www.asit.org/assets/documents/ASiT_EWTD_Position_Statement.pdf.
  9. Temple J. Time for training: a review of the impact of the European Working Time Directive on the quality of training. May 2010. www.mee.nhs.uk.
  10. Miller A, Archer J. Impact of workplace based assessment on doctors’ education and performance: a systematic review. BMJ 2010;341:c5064.
  11. Dean BJF, Pereira EAC. British surgeons’ experiences of mandatory online workplace-based assessment. J R Soc Med 2009;102:287-93.

Benjamin Dean orthopaedic registrar, Oxford rotation, Oxford, UK
Matt Jameson Evans chairman and medical director, healthunlocked.com, London, UK

 bendean@doctors.org.uk

Cite this as BMJ Careers ; doi: