Competency-Based Medical Education – The Basics

Introduction

Like all fields, healthcare is engaging with the characteristics and consequences of our digital society. Data with varying degrees of contextualization pours in from diverse sources.  Knowledge changes fast. Problems are complex.  In this new age, healthcare professionals (like everyone else) must not only be technical experts, but also team workers, lifelong learners, creative thinkers, and great communicators.

Additionally, medical education is engaging with the same wicked problems and reform movements as the rest of higher and professional education. Competency gaps have been documented at every transition from undergraduate (UME) to graduate (GME) to postgraduate environments.  Like others, medical school faculty and program directors experience overwork, conflicting priorities, and a lack of faculty development and support around teaching and learning. A stubborn hidden curriculum chips away at the intentional approaches and lessons of the classroom.  Buzzwords like accountability; evidence-based practice; inclusivity; and active, social, and applied learning design approaches are just as dominant in the healthcare literature as they are elsewhere in educational research.

In the “Beyond-Flexner” era, the trend in medical education is to move towards competency-based education as a way to enhance program and student accountability, flexibility, and performance.

The purpose of this blog post series (and I promise – it will be a series) is to provide a little background for individuals who need an introduction to competency-based medical education (CBME).  It will review basic vocabulary: What are competencies?  Milestones? Domains? It will try to make sense of the relationships between and best uses of competency domains and EPAs while comparing these new frameworks to more traditional instructional and assessment frameworks found in medical education.  It will describe how some institutions have chosen to implement CBME and how it might manifest itself at different levels of study (e.g. pre-clerkship UME versus GME).

A blog post is a blog post. It is not a reviewed (peer, if you believe in that sort of thing, or otherwise) article. This particular blog post introduces the basics as well as one of the best-documented examples of CBME available in the form of Vanderbilt University School of Medicine’s UME pre-clerkship curriculum. The blog post is not meant to be comprehensive. Rather, it represents me thinking through the articles I’ve been reading for the last week in an effort to put together a cohesive account. It is a preliminary effort and should be considered as such.

Initiatives to Know

Highlights:

ACGME Core Competencies (US). A set of competencies (e.g. professionalism, interpersonal communication, medical knowledge) developed for graduate medical education. Specialty-specific milestones that support the competencies have been created in collaboration with the American Board of Medical Specialties (ABMS). The competencies and their milestones are now used in U.S. graduate medical education and in ABMS-mandated Maintenance of Certification (MOC) efforts.

Core Entrustable Professional Activities (US). Guidelines published by the Association of American Medical Colleges (AAMC) for undergraduate medical education. They are currently being piloted by ten medical schools across the US. The thirteen EPAs (observable, workplace-based activities that all medical students should be able to perform successfully and consistently when they matriculate into any residency training program) are meant to interface with the ACGME’s core competencies and milestones.

CanMEDS (Canada). A role-based (e.g. communicator, professional, scholar) competency framework developed by the Royal College of Physicians and Surgeons of Canada for residency training and specialty practice. As of 2015, the Royal College is moving towards the development of milestones to support the use of CanMEDS at the postgraduate continuing education level.

ACGME Core Competencies

In an effort to address the competency gap between GME requirements and the changing needs of the workforce, the Accreditation Council for Graduate Medical Education (ACGME) partnered with the American Board of Medical Specialties (ABMS) in 1999 to create the six domains of clinical competency. These domains are:

  • Patient Care
  • Medical Knowledge
  • Interpersonal and Communication Skills
  • Practice-Based Learning and Improvement
  • Professionalism
  • Systems-Based Practice

Over the next ten years, GME programs worked closely with the relevant ABMS-certifying boards to develop specialty-specific milestones (defined in more detail later). The milestones provide definition around the core competencies, attributing specialty-specific activities to each competency. These milestones assist in developing curricula, resident assessment tools, and residency training program evaluation strategies (Nasca et al., 2012).

AAMC Core Entrustable Professional Activities

In 2014, the Association of American Medical Colleges (AAMC; together with the American Medical Association, the AAMC sponsors the accrediting body for medical schools, the Liaison Committee on Medical Education, or LCME) followed in the footsteps of GME and created 13 Entrustable Professional Activities (EPAs) to serve as a framework for assessing the readiness of medical students for graduate medical training. An activity is considered entrustable when a learner has demonstrated it at a level that no longer requires direct supervision. EPAs differ from traditional competencies because they address performance of discrete workplace activities that demand trustworthiness as well as expertise (Englander et al., 2016).

As of Spring 2015, the EPAs include:

  • Gathering a history and performing a physical examination;
  • Prioritizing a differential diagnosis following a clinical encounter;
  • Recommending and interpreting common diagnostic and screening tests;
  • Entering and discussing orders and prescriptions;
  • Documenting a clinical encounter in the patient record;
  • Providing an oral presentation of a clinical encounter;
  • Forming clinical questions and retrieving evidence to advance patient care;
  • Giving or receiving a patient handover to transition care responsibility;
  • Collaborating as  a member of an interprofessional team;
  • Recognizing a patient requiring urgent or emergent care, and initiating evaluation and management;
  • Obtaining informed consent for tests and/or procedures;
  • Identifying system failures and contributing to a culture of safety and improvement; and
  • Performing general procedures of a physician (e.g. IV line insertion, phlebotomy, BVM ventilation, CPR).

The AAMC launched the Core EPA initiative: a feasibility pilot study of ten schools that aims to explore curriculum development, assessment, the path to entrustment, faculty development and ongoing development of the EPAs.

Learning Objectives: A primer

Best source for learning more: 

Thomas, P.A., Kern, D.E., Hughes, M.T., & Chen, B.Y. (Eds.). (2015). Curriculum development for medical education: A six-step approach (3rd ed.). Baltimore, MD: Johns Hopkins University Press.

There are certain terms everyone should know before entering into a conversation about competency-based education. These conversations typically start with a discussion of learning goals and objectives. Although goals and objectives can be defined in different ways, they are typically distinguished from each other by their level of specificity: goals (or outcomes) describe broad educational aims, while objectives specify measurable actions or concepts.

According to Thomas et al. (2016), objectives can be divided into learner, process and outcome objectives:

  • Learner Objectives.  These objectives address knowledge, attitudes, and skills at the individual student level.  More specifically:
    • Knowledge (aka Cognitive) Objectives relate to a spectrum of mental skills relevant to curricular goals ranging from factual knowledge to problem-solving, and clinical decision making.
    • Affective Objectives involve attitudes, values, beliefs, biases, emotions or role expectations that can impact performance.
    • Skill (aka Psychomotor) Objectives relate to specific actions involving hand or body movements, vision, hearing, speech, or the sense of touch. Medical interviewing, interpersonal communication, examination, and procedural skills fall into this domain (Thomas et al., 2016, pp. 55-56).
  • Process Objectives. These objectives address achievement at the program level and are often expressed as aggregations of learner objectives.
  • Outcome Objectives.  Thomas et al. (2016) point out that ‘outcome’ is not used consistently in the literature or in the world. Sometimes outcome objectives relate to patient care, while at other times they relate to learner objectives. The authors recognize that it is unreasonable to hold undergraduate medical schools accountable for patient outcomes, since students’ interactions with patients are indirect and their contributions to care peripheral and/or secondary. However, they suggest that some health outcome objectives should be included to emphasize the big picture and potentially influence curricular design choices.

CBME: The Overview

Best source for learning more: 

Carraccio, C., Wolfsthal, S.D., Englander, R., Ferentz, K., & Martin, C. (2002). Shifting paradigms: From Flexner to competencies. Academic Medicine, 77(5), 361-367.

Competencies are “learner outcomes which are observable behaviors that result from the integration of knowledge, attitudes, and psychomotor skills” (Thomas et al., 2016, p. 58). These competencies are often grouped into domains. CBME emphasizes learner objectives for the purpose of ensuring learners are progressing from novice to expert within a framework of essential competencies (Harris et al., 2010).

Some key points worth remembering:

  • CBME focuses on learner objectives. It does not specify particular learning strategies or formats (Harris et al., 2010).
  • CBME establishes a trajectory rather than a baseline for minimal competence (Pettepher, Lomis, and Osheroff, 2017).
  • CBME requires a commitment to formative assessment, self-regulation, and personalized learning.
  • CBME can provide seamless transitions across educational levels if the frameworks at each level are designed with one another in mind (Harris et al., 2010).

Carraccio et al. (2002) published a useful table distinguishing between traditional and competency-based education; the following is taken directly from that article:

| Variable | Structure- and process-based program | Competency-based program |
| --- | --- | --- |
| Driving force for curriculum | Content—knowledge acquisition | Outcome—knowledge application |
| Driving force for process | Teacher | Learner |
| Path of learning | Hierarchical (teacher → student) | Non-hierarchical (teacher ↔ student) |
| Responsibility for content | Teacher | Student and teacher |
| Goal of educational encounter | Knowledge acquisition | Knowledge application |
| Typical assessment tool | Single subjective measure | Multiple objective measures (“evaluation portfolio”) |
| Assessment tool | Proxy | Authentic (mimics real tasks of profession) |
| Setting for evaluation | Removed (gestalt) | “In the trenches” (direct observation) |
| Evaluation | Norm-referenced | Criterion-referenced |
| Timing of assessment | Emphasis on summative | Emphasis on formative |
| Program completion | Fixed time | Variable time |

Assessment: Milestones and trajectories

Best source for learning more: 

Swing, S.R., Beeson, M.S., Carraccio, C., Coburn, M., Iobst, W., Selden, N.R., Stern, R.J., & Vydareny, K. (2013). Educational milestone development in the first seven specialties to enter the next accreditation system. Journal of Graduate Medical Education, 5(1), 98-106.

Assessment in CBME is formative, personalized, and based in the language of milestones and trajectories. Milestones describe the typical developmental pathway for a given competency. They are best used to describe competencies that develop, and must be demonstrated, over time under increasingly complex clinical circumstances (Thomas et al., 2016).

Milestones are defined by the following criteria:

  • Criterion-referenced. They offer a common set of benchmarks that are applicable across courses.  Milestones are standardized across a program to help students identify their personal learning trends and provide stronger evidence to support change, remediation, or growth (Lomis et al., 2017).
  • Performance-driven. Milestones are driven by performance rather than time. There may be a window of time in which – given equal opportunity and practice – most students will achieve expertise (similar to developmental milestones for children), but achievement is not dictated by time alone (Nasca et al., 2012).
  • Escalating behavioral anchors. The trajectory towards expertise is described in terms of incremental improvements in performance. The milestones described in much of the CBME literature are based on the Dreyfus model of skill acquisition (Novice, Advanced Beginner, Competence, Proficiency, Expertise) or Miller’s Pyramid of Clinical Skills (Knows, Knows How, Shows How, Does).
  • Indicative of current status AND the behaviors required to progress (Schumacher et al., 2013; Swing et al., 2013). Swing et al. (2013) offer an annotated example of a milestone graphic designed to support the assessment of a student on a sub-competency; the short code sketch after this list illustrates the same status-plus-next-step idea.
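
For readers who find a concrete data structure helpful, here is a minimal sketch of that status-plus-next-step idea. It is my own illustration, not code from any of the cited frameworks; the competency name and anchor text are invented, and real milestone sets are far richer.

```python
from dataclasses import dataclass


@dataclass
class Milestone:
    """One sub-competency expressed as escalating behavioral anchors.

    anchors[0] is the novice-level behavior; each later entry describes a
    higher level of performance (a Dreyfus-style progression).
    """
    competency: str
    anchors: list[str]  # ordered from novice to expert

    def report(self, observed_level: int) -> str:
        """Return the learner's current status and the behavior needed to progress."""
        current = self.anchors[observed_level]
        if observed_level + 1 < len(self.anchors):
            next_step = self.anchors[observed_level + 1]
            return f"{self.competency}: currently '{current}'; next, demonstrate '{next_step}'."
        return f"{self.competency}: highest described level reached ('{current}')."


# Hypothetical example (not drawn from any published milestone set):
handover = Milestone(
    competency="Patient handover",
    anchors=[
        "Lists patient data only with prompting",
        "Organizes data but omits contingency plans",
        "Gives a complete, organized handover including contingency plans",
        "Adapts the handover to the receiver and context without prompting",
    ],
)
print(handover.report(observed_level=1))
```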

Example: Vanderbilt University School of Medicine

Sometimes it’s easiest to explain something through an example.  Several years ago Vanderbilt redesigned its UME curriculum and assessment to support the development of four of the six ACGME competencies.  It is also a pilot school for the AAMC Core EPA Initiative (more to come on this in later blog posts).

This section draws from the articles that Vanderbilt has published since 2010 about their curriculum and assessment reform and the lessons they learned along the way.

Why did they do it?

Best source for learning more:

Miller, B.M., Moore, D.E., Stead, W.W., & Balser, J.R. (2010). Beyond Flexner: A new model for continuous learning in the health professions. Academic Medicine, 85(2), 266-272.

In 2008, Vanderbilt University School of Medicine proposed the following principles for a new model of health workforce development:

  • Learning is competency-based and embedded in the workplace.
  • All workers learn; all learners work.
  • Learning is undertaken by individuals, teams, and institutions and is linked to patient needs.
  • Learning activities are modular; the system allows multiple entry and exit points.
  • Learning is interprofessional, with shared facilities, common schedules, and shared foundational coursework.
  • A rich information technology infrastructure supports the healthcare/learning system.
  • Health outcomes and educational outcomes are directly linked (Miller et al., 2010).

As part of this larger initiative, the school decided to transition away from a traditional curriculum and assessment profile (one based primarily on medical knowledge and content-only exams) to one that was directly connected to four of the six ACGME competency domains. They focused on the pre-clerkship years (Medical School Years 1 & 2) because they felt that the lecture-driven nature of the curriculum was not allowing students to benefit as they might from the collective clinical and professional experience of the faculty.

Who actually did the work?

An interdisciplinary leadership team spearheaded the Foundations of Medical Knowledge (FMK; Years 1 & 2) redesign, a process which included significant research, piloting, interprofessional collaboration, and faculty training. The team included an anatomist (PhD), a biochemist (PhD), a pathologist (MD-PhD), and a pediatric neurogeneticist (MD) (Pettepher et al., 2016).

What did they do?

Best source for learning more: 

Vanderbilt University School of Medicine Office of Undergraduate Medical Education. (2018). Competencies for Learners Across the Continuum. Retrieved from: medschool.vanderbilt.edu/ume/competencies-learners-across-continuum

The leadership team worked with faculty to diversify the curriculum to include active learning (described in detail below) and small group activities which would allow for more faculty-student interaction.

They also created a new approach to assessment that was based on the four competency domains and definitions as follows (Miller et al., 2010; Pettepher et al., 2016):

  • Medical knowledge. Demonstrate deep knowledge of the sciences essential for one’s chosen field of practice.  Approach to learning: collect, analyze, interpret and prioritize new information to enhance one’s knowledge in the various disciplines related to medicine.
  • Practice-based learning and improvement.  Compare data about current performance at the individual, team, and/or systems level with expected outcomes, and identify and implement the learning strategies needed to improve performance. They emphasized critical reflection and self-assessment here (through group and personalized learning plans).
  • Systems-based practice. Because students are early in their training, they framed the systems-based practice domain around each student’s role in the learning (rather than care delivery) system. They emphasized interpersonal communication and teamwork skills here.  Discuss the elements of effective team building and utilize appropriate techniques to create, participate in and lead effective teams.
  • Professionalism. Demonstrate a commitment to the duties and obligations of the medical profession, its healthcare institutions and its individual practitioners to patients, communities and society. Demonstrate honesty and transparency in all dealings with patients, learners, and colleagues. Demonstrate compassion and respect for all persons regardless of differences in values, beliefs, and experiences.

For detailed sub-competencies, see the Vanderbilt webpage on competency-based education.

How…did they design and evaluate milestones?

Best source for learning more: 

Lomis, K.D., Russell, R.G., Davidson, M.A., Fleming A.E., Pettepher, C.C., & Cutrer, W.B. (2017). Competency milestones for medical students: Design, implementation, and analysis at one medical school.  Medical Teacher. 39(5), 494-504.

As outlined in Pettepher et al. (2016), the leadership team created milestones for the four ACGME competencies.

They started by using the following sources of information to define core behaviors that should be continually re-assessed throughout a student’s training: 

  • Faculty judgment of importance
  • Priorities represented in existing assessments
  • Consideration of areas in which students historically struggle
  • Areas amenable to measurement starting in the first year.

After applying a modified Delphi technique to their findings, the leadership team established work groups to write milestones for each competency domain. The work groups based their milestones on the modified Delphi results as well as grading forms from multiple UME courses, GME milestones, and GME entry-level expectations.
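
For those unfamiliar with the technique, a modified Delphi process asks a panel to rate candidate items over repeated rounds until consensus is reached. The sketch below is a generic illustration of one scoring step, not Vanderbilt’s actual procedure; the 1–5 scale, the 80% agreement cutoff, and the example behaviors are all assumptions on my part.

```python
def delphi_round(ratings_by_item: dict[str, list[int]],
                 agreement_cutoff: float = 0.8,
                 essential_score: int = 4) -> dict[str, str]:
    """Classify candidate core behaviors after one modified Delphi rating round.

    ratings_by_item maps a candidate behavior to the panel's ratings on a
    1-5 importance scale. Items rated >= essential_score by at least
    agreement_cutoff of raters are retained; the rest go back to the panel
    for revision or re-rating in the next round.
    """
    decisions = {}
    for item, ratings in ratings_by_item.items():
        agreement = sum(r >= essential_score for r in ratings) / len(ratings)
        decisions[item] = "retain" if agreement >= agreement_cutoff else "revise/re-rate"
    return decisions


# Hypothetical panel ratings for two candidate behaviors:
print(delphi_round({
    "Seeks and incorporates feedback": [5, 5, 4, 4, 5],
    "Formats written notes consistently": [3, 2, 4, 3, 3],
}))
```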

Next, they piloted the milestones and evaluated construct alignment and validity through:

  • Expert review
  • Interrater reliability (including between student and assessor)
  • Qualitative evidence of feasibility
  • User satisfaction

Continuous quality improvement cycles are being implemented (Lomis et al., 2017).

How…did they design their curriculum?

Best source for learning more:

Pettepher, C.C., Lomis, K.D., & Osheroff, N. (2016). From Theory to Practice: Utilizing competency-based milestones to assess professional growth and development in the foundational science blocks of a pre-clerkship medical school curriculum. Medical Science Educator, 26(3), 491-497.

Vanderbilt redesigned their pre-clerkship curriculum to provide ample opportunities to practice and demonstrate competency-related behaviors. First, they rearranged coursework to create a highly integrated block schedule curriculum:

  • Introduction to the profession (1 week)
  • Foundational interdisciplinary science blocks 

    • Human Blueprint and Architecture (6 weeks)

    • Microbes and Immunity (6 weeks)

    • Homeostasis (12 weeks)

    • Endocrine, Digestion, and Reproduction (12 weeks)

    • Brain, Behavior, and Movement (12 weeks)

  • Concurrent longitudinal blocks
    • Physical Diagnosis
    • Foundations of Healthcare Delivery
    • Learning Communities and Research
      • All students are assigned a clinic-based learning community consisting of providers and postgraduate trainees who oversee the care of a population of patients and serve as teachers, advisors, and mentors  (for more information specific to learning communities, see Miller et al., 2010 and Fleming et al., 2013).

Examples of integration across the interdisciplinary science and longitudinal blocks:

  • The genetics content of Human Blueprint and Architecture took place at the same time as an ethics section in the Learning Communities
  • Chest examinations and heart sounds in Physical Diagnosis coordinated with the cardiovascular unit of Homeostasis
  • Pediatric and adult neurological examinations in Physical Diagnosis coordinated with the Brain, Behavior, and Movement block (Pettepher et al., 2016).

Then the school introduced case, team, and experiential learning activities throughout the year.  In fact, they built courses around a similar weekly template of learning activities in order to implement consistent assessment events across the FMK phase.

Case-based learning sessions were student-run, problem-based learning sessions. Groups were crafted intentionally so that they included students from diverse academic backgrounds.  Groups re-formed every twelve weeks to help students practice transitioning in and out of new teams (Pettepher et al. 2016).

Team-based learning sessions were differentiated from case-based learning sessions, but the details were not evident from available publications (for the general differences between case-based and team-based learning see Koles et al., 2005).

Experiential learning was promoted through dissection teamwork (integrated into all basic science blocks) and patient-focused activities.  As described by Miller et al. (2010):

“Whereas most medical schools now include clinical experiences in the preclinical years, this model eliminates the “preclinical” and “clinical” distinction by giving students real responsibility for clinic operations and patient care from the start of their professional education  and by making the learning-working team the focus of their educational experience.” (p. 4)

How…did they design their assessments?

Best source for learning more:

Pettepher, C.C., Lomis, K.D., & Osheroff, N. (2016). From Theory to Practice: Utilizing competency-based milestones to assess professional growth and development in the foundational science blocks of a pre-clerkship medical school curriculum. Medical Science Educator, 26(3), 491-497.

The leadership team sought to design competency-based assessments that facilitated the collection of many low-stakes data points for every student. These touchpoints included self-, peer-, and faculty feedback collected through standardized rubrics with qualitative milestone assessments.

Digital milestone forms included six observable behavioral anchors for each competency and described escalating levels of performance from unacceptable to aspirational. Over the course of the phase, minimally acceptable levels of performance were raised to parallel the maturation of the students. Identical milestone descriptors are used across the entire four-year curriculum, with expected performance shifting as the student’s level and experience progress (Pettepher et al., 2016, includes real examples).

Self-, peer-, and faculty feedback were sought at scheduled intervals.  This included initial self-assessment exercises, standardized rubric assessments, and qualitative “global” questions:

  •  What is at least one valuable contribution this person has made to your team?
  • What is at least one important thing this person could have done to more effectively contribute to your team?

Assessment of the medical knowledge competency domain includes weekly low-stakes assessments consisting of multiple choice questions (autograded by the learning management system) and brief online essays (graded by the case-based learning facilitators).  End-of-block examinations consist of NBME shelf exam questions (autograded on the NBME website) and online essays/short answer questions (graded by block faculty).

All blocks are graded on a pass/fail basis, and competency domains are weighted equally. Students achieving target scores in all domains pass, while those who receive a threshold score in one competency generally pass but must set additional learning goals for the weak domain. Learning goals are established and assessed with the student’s portfolio coach (described below).

Students who receive one sub-threshold score in any domain, or threshold scores in multiple domains, are handled on a case-by-case basis. Sub-threshold scores in the medical knowledge domain require, at minimum, passing a remediation exam. Deficiencies in the other domains are remediated through ongoing targeted coaching and monitoring of peer and faculty milestone assessments (Pettepher et al., 2016).
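
The pass/fail logic just described can be summarized in a few lines of code. This is my paraphrase of the published description, with the labels ‘target’, ‘threshold’, and ‘sub-threshold’ standing in for whatever cut points Vanderbilt actually uses; none of the details below come from their system.

```python
def block_decision(domain_ratings: dict[str, str]) -> str:
    """Paraphrase of the pass/fail rules described by Pettepher et al. (2016).

    domain_ratings maps each of the four competency domains to 'target',
    'threshold', or 'sub-threshold' (illustrative labels, not Vanderbilt's
    actual scale or cut points).
    """
    ratings = list(domain_ratings.values())
    if all(r == "target" for r in ratings):
        return "pass"
    if "sub-threshold" in ratings or ratings.count("threshold") > 1:
        return "case-by-case review (possible remediation)"
    # Exactly one domain at threshold and none sub-threshold:
    weak = next(d for d, r in domain_ratings.items() if r == "threshold")
    return f"pass, with additional learning goals for '{weak}' set with the portfolio coach"


print(block_decision({
    "Medical knowledge": "target",
    "Practice-based learning and improvement": "threshold",
    "Systems-based practice": "target",
    "Professionalism": "target",
}))
```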

How…did they personalize learning and assessment (beyond remediation)?

Best sources for learning more:

Vanderbilt University School of Medicine Office of Undergraduate Medical Education. (2018). Personalized Learning.  Retrieved from: medschool.vanderbilt.edu/ume/personalized-learning

Spickard, A., Ahmed, T., Lomis, K., Johnson, K., & Miller, B. (2016). Changing medical school IT to support medical education transformation. Teaching and Learning in Medicine, 28(1), 80-87.

As described above, the leadership team was interested in providing students with a multitude of low-stakes data points across standardized milestones so that students could follow their own trajectories over time and make adjustments as necessary. These data points are collected at the individual student level in longitudinal e-portfolios. The data collected in these portfolios are reviewed regularly by students and their personal learning coaches, who work together to create and monitor a personalized learning plan. An illustration of the portfolio user interface can be found in Lomis et al. (2017).

Portfolios are linked to other digital platforms for streamlined data integration and visualization.   This includes the student clinic digital platform, which allows for seamless integration of patient-care skills assessment.
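
As a final sketch, a longitudinal portfolio is, at its core, a time-stamped log of milestone assessments that can be grouped by competency to show a trajectory. The code below illustrates that general idea only; it is not based on Vanderbilt’s actual software, and the dates, levels, and competency names are invented.

```python
from collections import defaultdict
from datetime import date


def trajectories(assessments: list[tuple[date, str, int]]) -> dict[str, list[tuple[date, int]]]:
    """Group low-stakes milestone assessments by competency, in chronological order.

    Each assessment is (date, competency, milestone_level); the output gives,
    per competency, the time series a student and coach might review together.
    """
    by_competency: dict[str, list[tuple[date, int]]] = defaultdict(list)
    for when, competency, level in assessments:
        by_competency[competency].append((when, level))
    return {c: sorted(points) for c, points in by_competency.items()}


# Hypothetical portfolio entries drawn from self-, peer-, and faculty assessments:
log = [
    (date(2016, 9, 12), "Professionalism", 2),
    (date(2016, 10, 3), "Professionalism", 3),
    (date(2016, 9, 12), "Medical knowledge", 1),
    (date(2016, 11, 7), "Medical knowledge", 2),
]
print(trajectories(log))
```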


There is more to come on personalized learning and medical school IT in future blog posts (coming soon), along with additional examples of how different institutions are implementing CBME.

Featured Image by jesse orrico on Unsplash
