
Increasing the satisfaction of general practitioners with continuing medical education programs: A method for quality improvement through increasing teacher-learner interaction

Abstract

Background

Continuing medical education (CME) for general practitioners relies on specialist-based teaching methods in many settings. Formal lectures by specialists may not meet the learning needs of practitioners and may cause dissatisfaction with traditional CME. Increasing learner involvement in teaching programs may improve learner satisfaction.

Methods

A quality improvement program for CME for 18 general practitioners in the Tel Aviv region was designed in response to dissatisfaction with traditional CME activities. A two-step strategy for change was developed. In the first step, the CME participants selected the study topics relevant to them from a needs assessment and prepared background material on those topics. In the second step, specialist teachers were invited to answer the questions arising from the participants' preparation of the selected topics. Satisfaction with the traditional lecture program and with the new participatory program was assessed by a questionnaire. The quality criteria included the relevance, importance, and applicability of the chosen CME topic to the participant's practice; the clarity of the presentation and the lecturer's effective use of teaching aids; and the lecturer's potential to serve as a consultant to the participant.

Results

The participatory model of CME significantly increased satisfaction with relevance, applicability and interest in CME topics compared to the traditional lecture format.

Conclusions

Increased learner participation in the selection and preparation of CME topics, together with increased interaction between CME teachers and learners, resulted in greater satisfaction with the teaching program. Further study of the effect of this model on physician performance is required.

Background

General practitioners in many settings rely heavily on specialist-based continuing medical education (CME) methods. These include direct consultation with experts, reviews in journals and textbooks, and formal continuing education activities [1–3]. A traditional hierarchical relationship results in a one-way transfer of knowledge from specialists to general practitioners [4]. However, general practitioners may wish to control their own educational agenda and to inform specialists of their learning needs. Learning centered on clinical cases is likely to be of greater use to family physicians than formal lectures [5]. Some specialists may regard lectures as the principal method of transferring information, and few may have given serious consideration to alternative teaching methods. The disparity between what general practitioners want to learn from specialists and what specialists think they need can be a barrier to effective educational interaction if there is no negotiation between teachers and learners [6]. This paper describes a quality improvement program that introduced a new method of CME to a group of board-certified general practitioners in Israel, with the objective of increasing satisfaction with CME and thereby bridging this gap between needs and wants.

Context of the problem

A group of 18 board-certified family physicians had been participating in a continuing medical education course in pediatric medicine in Tel Aviv, Israel, for several years. The physicians had a mean seniority of 8.3 years in practice (s.d. 6.8 years). In the initial phase of the course, board-certified specialists in pediatrics were invited to give weekly lectures to the group. The course began with three traditional lectures on various pediatric topics, followed by questions from the audience.

Method

Outline of problem

A high degree of dissatisfaction was noted among participants in the traditional CME program, based on responses to open questions on a feedback sheet collected at the end of each weekly session. As a result, a decision was made to modify the teaching program.

Key measures for improvement

The course participants set the objectives for quality improvement in the teaching program after the start of the course. They chose to assess the relevance, importance, and applicability of the chosen CME topic to the participant's practice; the clarity of the presentation and the lecturer's effective use of teaching aids; and the lecturer's potential to serve as a consultant to the participant.

Strategy for change

As a first step in modifying the CME program, the participants listed the pediatric topics that were most important to them (needs assessment). Three participants were selected to prepare each topic for presentation and to lead the group discussion on it: one physician was designated to present theoretical material from the pediatric literature, the second to present the evidence base for the topic, and the third to present a relevant case from clinical practice. Important and unanswered questions from the discussion were recorded.

In the second step, the specialist invited to lecture to the group on a selected topic received the questions that had arisen in the group discussion of that topic. The lecturer was free to construct the presentation as they chose but was expected to address the participants' questions provided in advance. The lecture was divided into short segments to allow several periods of free discussion and questions.

Process of gathering information

The interactive lectures were evaluated by means of a feedback questionnaire examining the quality criteria described above. The questionnaire consisted of six questions relating to the ideal performance of a specialist lecturing to family physicians. Responses were given on a six-point Likert-type scale, with a score of six denoting strongest agreement with the item. Space was provided for additional free-text comments at the end of the questionnaire. The feedback from the two course periods (3 frontal lectures and 7 interactive lectures) was compared. Data from the questionnaires were entered and analysed using Epi-Info software. Mean scores for each question were compared using t-tests, with significance set at the 0.05 level.
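
For illustration, the statistical comparison described here could be reproduced along the following lines. The sketch below is written in Python rather than Epi-Info (the software actually used), the ratings are hypothetical placeholders rather than the study's data, and the choice of an independent-samples t-test is an assumption, since the report does not state whether paired or unpaired tests were applied.

```python
# Illustrative sketch only: the original analysis used Epi-Info, and the
# scores below are hypothetical placeholders, not the study's data.
import numpy as np
from scipy import stats

# Hypothetical ratings for one questionnaire item (e.g. "relevance to my
# practice") on the 1-6 Likert-type scale, 6 = strongest agreement.
frontal = np.array([3.2, 3.5, 2.8, 4.0, 3.1, 3.6, 2.9, 3.3, 3.7,
                    3.0, 3.4, 2.7, 3.8, 3.2, 3.5, 3.1, 3.3, 3.6])      # n = 18
interactive = np.array([5.1, 5.4, 4.8, 5.6, 5.0, 5.3, 4.9, 5.2, 5.5,
                        5.0, 5.3, 4.7, 5.6, 5.1, 5.4, 5.0, 5.2, 5.3])  # n = 18

# Compare mean item scores between the two lecture formats with a t-test,
# using the 0.05 significance threshold described in the text.
t_stat, p_value = stats.ttest_ind(interactive, frontal)
print(f"mean (frontal)     = {frontal.mean():.2f}")
print(f"mean (interactive) = {interactive.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
```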

Results

Effects of change

The results of the feedback questionnaire for the two periods of the program ("frontal" and "interactive" lectures) are listed in Table 1.

Table 1 Mean scores of items on the satisfaction with CME scale (n = 18)

Analysis and interpretation

The results show significantly higher satisfaction with the interactive lectures than with the frontal lectures in all categories. Although the participants' involvement in setting the objectives for the intervention, and the construction of the intervention after the start of the course, may be considered sources of bias in a classical research design, they are inherent in a quality improvement program. The disadvantages of the new method were described in the free-text comments in the participants' feedback. The new method required considerable time for preparation of the teaching sessions and discussions, and it depended on the goodwill and cooperation of both the participants and the lecturers. Some of the family physicians did not participate actively in the discussions, but claimed that they contributed more than they had with the old method. Occasionally the discussion focused on trivial issues, or on issues that seemed less relevant to the majority of the group yet still occupied the group's time.

Discussion

Adult learning theory and theories of how professionals maintain and develop competence emphasize the importance of self-directed learning and point to clinical practice and problem solving as key areas of interest [9]. Many studies have shown that patient care provokes frequent information needs [10]. General practitioners often rely on specialist-based CME because textbooks, journals, and other existing information tools are inadequate for answering the clinical questions that arise in practice: textbooks may become outdated quickly, journals are often not useful in daily practice, and both are time-consuming and expensive [10, 13]. Computer systems that have been developed to help doctors are not widely used [10–12]. General practitioners have clear views of the content and style of teaching they wish to receive from their specialist colleagues [14]. The satisfaction questionnaire developed by the family physicians in this program reflected a wish that teaching be directly related to their clinical work.

In this quality improvement program, the family physicians' desire for two-way interaction and effective mutual education was achieved by the family physicians expressing clearly what they wanted from their specialist colleagues and by the specialists developing greater educational expertise. Prior needs assessment is important for informing and directing the educational process [14].

Two models of educational interaction between family physicians and specialists are described here [2]. The first model is based on traditional didactic lectures given by specialists to general practitioners; general practitioners may dislike didactic lectures, but specialists often prefer this method. The second model consists of interactive sessions centred on clinical cases; this model was popular with both family physicians and specialists. Heale et al. found that traditional lectures and large group lectures were the least preferred methods of CME [15]. A transition from passive to interactive learning groups was also recommended in another study [16].

Systematic reviews [17–21] of educational interventions have shown that continuing medical education can improve clinical performance and patient outcomes by changing doctors' behavior. The most effective methods described in these reviews include learning linked to clinical practice, interactive educational meetings, outreach events, and strategies that involve multiple educational interventions (for example, outreach plus reminders). The least effective methods are those most commonly used in general practice continuing medical education, namely, lecture-format teaching and unsolicited printed material (including clinical guidelines).

Conclusions

This report has described one method of increasing satisfaction with CME by increasing interaction between teachers and learners. Further study is required to test the association between the increase in satisfaction and changes in physician knowledge, competence and performance in this population.

References

  1. Slawson DC, Shaughnessy AF: Obtaining useful information from expert based sources. BMJ. 1997, 314: 947-949.

  2. Ely JW, Osheroff JA, Ebell MH, Bergus GR, Levy BT, Chambliss ML, Evans ER: Analysis of questions asked by family doctors regarding patient care. BMJ. 1999, 319: 358-361.

  3. Cantillon P, Jones R: Does continuing medical education in general practice make a difference?. BMJ. 1999, 318: 1276-1279.

  4. Little P: What do Wessex general practitioners think about the structure of hospital vocational training?. BMJ. 1994, 308: 1337-1339.

  5. Marshall MN: Qualitative study of educational interaction between general practitioners and specialists. BMJ. 1998, 316: 442-445.

  6. Bayley TJ: The hospital component of vocational training for general practice. BMJ. 1994, 308: 1339-1340.

  7. Marshall MN: How well do general practitioners and hospital consultants work together? A qualitative study of cooperation and conflict within the medical profession. Br J Gen Pract. 1998, 48: 1379-1382.

  8. Marshall MN: How well do GPs and hospital consultants work together? A survey of the professional relationship. Fam Pract. 1999, 16: 33-38. 10.1093/fampra/16.1.33.

  9. Holm HA: Quality issues in continuing medical education. BMJ. 1999, 318: 1276-1279.

  10. Smith R: What clinical information do doctors need?. BMJ. 1996, 313: 1062-1068.

  11. Swinglehurst DA, Pierce M, Fuller JC: A clinical informaticist to support primary care decision making. Qual Health Care. 2001, 10: 245-249. 10.1136/qhc.0100245...

  12. Ely JW, Osheroff JA, Ebell MH, Chambliss ML, Vinson DC, Stevermer JJ, Pifer EA: Obstacles to answering doctors' questions about patient care with evidence: qualitative study. BMJ. 2002, 324: 710-10.1136/bmj.324.7339.710.

  13. Chambliss ML, Conley J: Answering clinical questions. J Fam Pract. 1996, 43: 140-144.

  14. Gelula MH, Sandlow LJ: Use of focus groups for identifying specialty needs of primary care physicians. J Contin Educ Health Prof. 1998, 18: 224-226.

  15. Heale J, Davis D, Norman G, Woodward C, Neufeld V, Dodd P: A randomized controlled trial assessing the impact of problem-based versus didactic teaching methods in CME. Proc Annu Conf Res Med Educ. 1988, 27: 72-77.

  16. Eliasson G, Mattsson B: From teaching to learning. Experiences of small CME group work in general practice in Sweden. Scand J Prim Health Care. 1999, 17: 196-200. 10.1080/028134399750002403.

  17. Davis DA, Thomson MA, Oxman AD, Haynes RB: Evidence for the effectiveness of CME. A review of fifty randomised controlled trials. JAMA. 1992, 268: 1111-1117. 10.1001/jama.268.9.1111.

  18. Davis D: Does CME work? An analysis of the effect of educational activities on physician performance or health care outcomes. Int J Psychiatry Med. 1998, 28: 21-39.

  19. Oxman AD, Thomson MA, Davis DA, Haynes RB: No magic bullets: a systematic review of 102 trials of interventions. Can Med Assoc J. 1995, 153: 1423-1427.

  20. Davis DA, Thomson MA, Oxman AD, Haynes RB: Changing physician performance: a systematic review of continuing medical education strategies. JAMA. 1995, 274: 700-705. 10.1001/jama.274.9.700.

  21. Kerwick S, Jones RH: Educational interventions in primary care psychiatry: a review. Primary Care Psychiatry. 1996, 2: 107-117.

Author information

Corresponding author

Correspondence to Yacov Fogelman.

Competing Interests

None declared

Authors' contributions

LG and YF initiated the study and collected the data. LG wrote the original draft of the paper. JY conducted the statistical analysis of the data and wrote subsequent drafts of the paper. All authors read and approved the final manuscript.

Cite this article

Gercenshtein, L., Fogelman, Y. & Yaphe, J. Increasing the satisfaction of general practitioners with continuing medical education programs: A method for quality improvement through increasing teacher-learner interaction. BMC Fam Pract 3, 15 (2002). https://doi.org/10.1186/1471-2296-3-15
