Breaking Down Medical Education 2.0: A First Draft

In their article introduction, Spickard, Ahmed, Lomis, Johnson, & Miller (2016) identify the defining characteristics of Medical Education 2.0 as the presence of (1) early clinical exposure, (2) competency-based assessment, and (3) personalized pathways to degree completion.  Normally I might ignore the list, since Spickard et al. (2016) is an IT- rather than an ID-centric article, and in my experience that distinction makes a difference in how meaningful an article's introduction tends to be.  However, authors Lomis and Miller are architects of the recent curriculum reboot at Vanderbilt University School of Medicine, so I stopped to give it some consideration.

This blog post was born out of that consideration.  The goal was to see if I could fit the design themes I have found in reviewing a dozen recent medical education curricular reboots into the three qualities listed in Spickard et al. (2016). My motivation is entirely practical: I need to create a workshop around these things very soon, and a loose, three-spoked organizational structure would suit my needs perfectly.

Disclaimer: There is nothing scientific going on with my sampling.  I started with the AAMC Core EPA pilot schools and snowballed a little bit from there, picking and choosing what looked interesting, good, and well-documented enough for me to write something of substance. If a school didn't have enough available online (its own website, professional organizations' websites or blogs, or the research literature) for me to understand what was going on, it was out of luck.

Early Clinical Exposure

I rolled my eyes when I first saw this on the list.  “Early clinical exposure” has been around for a long time – long before Medical Education 2.0. In 1998, my alma mater was touting “early clinical exposure” when they placed us in primary care offices two months into the first year.  Based on my review, preceptor experiences such as mine are still popular and are considered evidence of “early clinical exposure,” but I think we can expand the concept beyond workplace learning alone. Clinical exposure can also stretch to mean applying basic science to clinical problems or thinking like a clinician (e.g. differential diagnoses, clinical workups).

Integrated curriculum

Most medical education reboots describe their curriculum as “integrated,” defined by the intentional combination of:

  • Basic science knowledge and its clinical applications. Applied learning (case- or problem-based) usually takes place in small group settings. Vanderbilt describes this well; see my earlier post for references.
  • Longitudinal threads and block curriculum. Concepts like ethics, bioinformatics, and professionalism are woven throughout the didactic curriculum. The best infographic illustrating this comes from Oregon Health and Science University.
  • Workplace and didactic learning. While all programs describe this, the Shared Discovery Curriculum at Michigan State University College of Human Medicine offers a unique slant on the mash-up: they intentionally structure workplace and didactic learning throughout all four years of medical school (as compared to the usual two) with a goal of making “students feel useful to the healthcare team.” In the first phase, students work predominantly with medical assistants and nursing coordinators (not necessarily physicians) in primary care settings to room patients, take vital signs, document chief complaints, etc.  This not only exposes students to other types of healthcare professionals but also offers opportunities to build foundational clinical skills quickly and alongside the healthcare professionals who typically perform them. In the second phase, students work with physicians in inpatient settings to develop differential diagnoses and patient care skills.  In the third phase, they engage in core clinical clerkships, which are shorter and more learning-dense than traditional clerkships because students have already established basic competencies.  Students also continue with their didactic learning blocks to support the more complex clinical exposure in the third phase.

Learning communities

In Medical Education 2.0, learning communities (in the spirit of Etienne Wenger’s communities of practice) are often elevated to learning “societies” or “academies” or “colleges,” but the concept remains the same: learners from diverse academic backgrounds engage in informal, experiential, authentic, social learning experiences around a shared purpose.  In most cases, these learning communities serve at least one of the following purposes:

  • Academic and career advising.  This will be discussed below in ‘personalized learning.’
  • Enrichment programming related to clinical and professional identity and culture. I like the Colleges at Oregon Health and Science University School of Medicine and have described them in depth in a previous post.
  • Small group learning, reflections, and debriefings. While some schools (Vanderbilt) create multiple sets of small groups (to enhance diversity and facilitate multi-team skill development), Michigan State University College of Human Medicine keeps all small group activities within their learning societies to enhance trust development.  It’s very Hogwarts. Very, very Hogwarts.

An interesting twist: 

Workplace learning

Consider this a placeholder.  We all know what workplace learning is in the context of medical education.  This section differs from the integrated curriculum discussion above in that it focuses on the different forms early (i.e. pre-clerkship) workplace learning can take. Preceptorships (individual mentoring-shadowing relationships between practicing clinicians and students) are a well-established approach (as evidenced by my own preceptorship, circa 1998).  However, Medical Education 2.0 programs are recognizing that team approaches, pedagogically structured experiences, and opportunities to debrief (with or without the creation of learning artifacts) are also beneficial.

  • Team approaches. From my own experience working as a senior learning architect with Mayo Clinic School of Medicine, I know that sending students into the workplace in teams of three to five can be useful not only for protecting resources but also as a means for the students to coach and learn from each other. See also Michigan State University and its description of early clinical experiences.
  • Pedagogical structure. The Transition to Clinical Experience at Oregon Health and Science University School of Medicine – and its implications for all clinical experiences thereafter – is my favorite example of pedagogical structure for workplace learning.  As I’ve written before (see this post if you want to read about it in the broader curricular context), the Transition is a two-week simulation bootcamp that aims to get students ready for their clinical clerkships.  It acts as a gateway; students cannot progress without passing it.  It not only teaches students the practicalities (example: gowning and gloving) but also introduces them to the EPAs in which they will need to demonstrate competency before graduation. In that respect, it acts as a pre-assessment.  The researcher in me loves this too.
  • Small group debriefings. Most programs bookend early workplace learning with small group debriefing sessions.  These sessions provide opportunities for peer-to-peer learning, challenging elements of the hidden curriculum, reinforcing learning through simulation or review, reflecting, and creating learning artifacts.

Competency-Based Assessment

My new favorite book, The Practical Guide to the Evaluation of Clinical Competence, does an incredible job of outlining the dimensions of competency-based assessment: (1) Competencies (e.g. core competencies and EPAs); (2) Levels of Assessment (e.g. Miller’s Pyramid); (3) Assessment of Progression (e.g. Dreyfus-driven milestones). I say that here, I think, mainly to promote this book.

Competency-based assessment has emerged from a long tradition of workplace-based assessment, an approach that this earlier post only begins to explore.  Workplace-based assessment is characterized by multi-faceted formative and summative assessment; while direct observation forms the basis of most of this type of assessment, there are different settings (e.g. simulation centers, standardized patients, authentic patient care) and different assessors (multisource feedback, peer review, expert assessors) to consider. In the case of undergraduate medical education, where there is a very real need for students to demonstrate medical knowledge and pass standardized national licensing examinations, written assessments (in the form of quizzes and tests) are part of competency-based assessment too.

There are many ways to organize a conversation on competency-based assessment. However, I think one of the easier ways is to discuss it in terms of formative assessment, summative assessment, and platforms for documentation and tracking.

Formative Assessment

Formative assessments are the learning artifacts that accumulate over time to demonstrate a student’s progression along the competency trajectory (milestones). The quality and success of formative assessment will depend on the program’s ability to do the following:

  • Design learning opportunities that facilitate student expression of the desired behaviors.  I mean curriculum and instructional alignment.  While I started a conversation about curriculum mapping and milestone development in this post, I’m not prepared to go down that rabbit hole today.  My point in bringing it up here is to reinforce the thought that if a school chooses to engage in CBME and competency-based assessment, it may need to redesign its curriculum to align with the new desired outcomes. Curriculum 2.0 at Vanderbilt University School of Medicine offers a perfect, well-documented example (see this post for resources).
  • Create meaningful – valid, reliable, standardized – competencies and milestones by which to judge student performance. Again, Curriculum 2.0 at Vanderbilt University School of Medicine is your well-documented friend (See this post for resources).
  • Support student interpretation of and decision-making around their progress. This will be discussed below in personalized learning.

Some program-level core competency lists if you’d like to compare:

Summative Assessment

In the best cases, formative assessment is the building block of summative assessment.  The Michigan State University College of Human Medicine website offers a particularly complete description of its progress testing: a systematic approach to multi-faceted summative assessment.  Since I have not written a post on this (yet), I’m going to outline the contents of MSU’s Progress Suite of Assessments here.  Twice a semester for the duration of the program, the Student Competence Committee reviews student performance on the following assessments:

  • Progress Clinical Skills Examination. A standardized patient/simulation experience; students move through eight stations, spending 20 minutes on the clinical scenario and another ten minutes answering questions about the clinical and basic science relevant to the scenario. Frequency: four times a year.
  • Comprehensive Necessary Science Examination. NBME exam (multiple choice medical knowledge exam designed to simulate USMLE Steps).  Frequency: four times a year.
  • Multisource Feedback.  360-degree assessments from preceptors (workplace and classroom), peers, nurses, patients, other healthcare team members. Frequency: twice a year.
  • Portfolio. Weekly aggregation of learning artifacts, tagged with SCRIPT competency goals (for a rough sketch of how tagged artifacts might be aggregated, see the code after this list).
    • Direct Observations – Assessment checklists and rubrics are found in the JustInTime learning management system.
    • Clinical Documentation – Student artifacts such as an H&P, assessed with a JustInTime rubric
    • Procedural logbooks
    • Skills Certification – BLS and CITI training are required
    • External Examinations – USMLE Step 1&2 scores
    • Formative Assessments – Competency-driven assessments from small group, laboratory, simulation, and other learning activities.
    • Additional artifacts – optional certificates (e.g. Rural Health Certificate),  reflection journals (e.g. from a service learning project) – anything chosen by the student to demonstrate competence.
  • Self-Assessment. Students populate a competency-driven rubric referencing aspects of the portfolio as evidence. Frequency: Twice a year
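
To make the portfolio idea a little more concrete, here is a minimal sketch (in Python) of how learning artifacts tagged with competency goals might be grouped for a committee review. This is purely illustrative: the artifact types, competency tags, and function names are my own inventions, not anything MSU actually uses.

```python
from dataclasses import dataclass
from collections import defaultdict
from datetime import date

@dataclass
class Artifact:
    """One portfolio entry (e.g., a direct observation, an H&P, a logbook entry)."""
    student_id: str
    kind: str              # e.g., "direct_observation", "clinical_documentation"
    added_on: date
    competencies: list     # competency goals this artifact is tagged with

def evidence_by_competency(artifacts):
    """Group a student's artifacts by competency tag so a review committee can
    see at a glance how much evidence supports each competency."""
    grouped = defaultdict(list)
    for artifact in artifacts:
        for competency in artifact.competencies:
            grouped[competency].append(artifact)
    return grouped

# Example: two artifacts from one week of a hypothetical student's portfolio.
portfolio = [
    Artifact("s001", "direct_observation", date(2018, 9, 3), ["history_taking", "communication"]),
    Artifact("s001", "clinical_documentation", date(2018, 9, 5), ["documentation"]),
]

for competency, items in evidence_by_competency(portfolio).items():
    print(f"{competency}: {len(items)} artifact(s)")
```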

An interesting twist:

  • As described in my last post, Oregon Health and Science University School of Medicine takes summative workplace-based assessment to a new level with badging.  The details are a bit sparse (probably because it is still in the planning stages), but it appears that OHSU students compile a suite of assessments for each of the 13 core EPAs. When a student is ready to be assessed for an EPA badge, the Entrustment Group (structurally similar to Michigan State’s Student Competence Committee) reviews the evidence and makes a decision to award (or not) the badge.  When a student achieves all the badges, they may graduate, regardless of the time in school.

Digital Assessment Platforms

CBME assessment requires real-time, longitudinal data collection on student performance and improvement. In all cases, schools are hoping to make the data collection, analysis, and visualization as seamless (and maintenance-free) as possible.  Here are two examples.

The Michigan State University College of Human Medicine Just in Time platform is a mobile, cloud-based application that functions on smartphones, tablets, and laptops. It acts as a (1) content management system (for holding curricular content and allowing for curriculum reporting); (2) logging system (for entering, viewing, and creating reports around student-patient encounters); and (3) student assessment system (for entering, viewing, and creating reports around faculty assessment of student performance).  A demo simulation can be accessed by following the link above. Students must still run the reports to include in their portfolio for assessment.
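
For readers who like to think in terms of systems, here is a toy sketch of what the logging piece of such a platform might do under the hood: record student-patient encounters and roll them up into the kind of report a student could pull into their portfolio. To be clear, this is not based on the actual Just in Time code or data model; the field names and report shape are assumptions of mine.

```python
from dataclasses import dataclass
from datetime import date
from collections import Counter

@dataclass
class EncounterLog:
    """One logged student-patient encounter."""
    student_id: str
    encounter_date: date
    setting: str           # e.g., "primary care", "inpatient"
    chief_complaint: str
    activities: tuple      # e.g., ("vital signs", "chief complaint documented")

def encounter_report(logs, student_id):
    """Summarize one student's logged encounters by setting and activity --
    the kind of roll-up a student might run and drop into a portfolio."""
    mine = [log for log in logs if log.student_id == student_id]
    return {
        "total_encounters": len(mine),
        "by_setting": dict(Counter(log.setting for log in mine)),
        "by_activity": dict(Counter(a for log in mine for a in log.activities)),
    }

logs = [
    EncounterLog("s001", date(2018, 10, 1), "primary care", "cough", ("vital signs",)),
    EncounterLog("s001", date(2018, 10, 8), "primary care", "back pain", ("vital signs", "chief complaint documented")),
]
print(encounter_report(logs, "s001"))
```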

Vanderbilt University School of Medicine published the story of their portfolio platform development.  They wanted a platform that would allow them to integrate their learning management system, student clinical skills performance, assessments and artifacts, and course evaluations.  Ultimately, they modified Moodle to support the self-directed online modules, social media networks and group work, and electronic portfolio.  This platform interfaces with the electronic health record, where clinical skills performance artifacts reside.  Students are able to track their competency milestones in real time through dynamic dashboards.
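
Since the dashboards are the part I find most interesting, here is a minimal sketch of the simplest roll-up such a dashboard might perform: take a stream of competency-tagged assessment ratings and report the most recently assessed milestone level for each competency. The 1-5 milestone scale and the field names are my own assumptions for illustration, not Vanderbilt's actual implementation.

```python
from datetime import date

# Each record: (assessment date, competency, milestone level on an assumed 1-5 scale).
ratings = [
    (date(2018, 9, 10), "patient_care", 2),
    (date(2018, 11, 2), "patient_care", 3),
    (date(2018, 10, 15), "medical_knowledge", 2),
]

def milestone_dashboard(ratings):
    """Return the most recently assessed milestone level per competency --
    the simplest possible 'current status' view a dashboard could show."""
    latest = {}
    for when, competency, level in ratings:
        if competency not in latest or when > latest[competency][0]:
            latest[competency] = (when, level)
    return {competency: level for competency, (when, level) in latest.items()}

print(milestone_dashboard(ratings))
# {'patient_care': 3, 'medical_knowledge': 2}
```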

Personalized Learning

When done well, multiple pathways to success and diversity in degree offerings support the development of skills and habits of mind required for lifelong learning and self-regulation.  Personalized learning can also promote inclusivity and diversity and (in some cases) reduce student debt. It is both a component and outcome of competency-based medical education in ways that I will probably work out in more depth in the next few blog posts. However, for now, I will address personalized learning in terms of personal learning plans and pathways.

Personalized learning plans

Medical Education 2.0 reboots almost always mention a personalized learning plan.  These flexible and formative plans are developed by the student and a coach as a reflection of goals and performance-to-date.  Coaches go by different names – ‘Learning Society Fellow’ at Michigan State; ‘Portfolio Coach’ at OHSU and Vanderbilt – but they are trained faculty housed within the learning communities (or equivalent) who work with students for the duration of the program.  Students and their coaches meet on a schedule to discuss student goals, interests, and performance on formative and summative assessments before formulating an action plan to be carried out by the student before the next meeting.

Personal learning pathways

If personalized learning plans offer students the opportunity to practice structured and supervised self-assessment and -reflection, personal learning pathways offer students the opportunity to customize their education. To a certain extent, personal learning pathways have always existed in medical education through dual degree programs.  However, Medical Education 2.0 has increased offerings far beyond the traditional MD combinations (M.P.H., Ph.D., M.B.A.). I plan to document this in greater detail in my next post, but for now here are some examples.

  • Accelerated programs. Many programs now have a three-year accelerated pathway, but New York University School of Medicine offers an excellent description of its pathway if you follow the link. Students enrolled in the accelerated program at NYU are offered guaranteed spots in one of the school’s residency programs.
  • Additional concentrations, certificates, or dual degrees. Some programs offer students additional certifications – typically in areas or fields connected to the university’s strategic initiatives or mission.
  • Time-variable degrees. In the spirit of true competency-based education, OHSU is moving towards a time-variable degree program in which students will be able to finish their degree in as few as three years if they are able to achieve competency in the 13 core EPAs in that time period.  Alternatively, students may take longer to complete their degree if necessary or desired.
  • Clinical experience choice. Some programs – including Vanderbilt, Michigan State, and OHSU – allow students to mix their required and elective clerkships in ways that make sense for their personal interests and goals.
  • Multi-tiered student choice. The Your MD program at OHSU is notable for the number of ways personalized learning is built into their program design.  In many ways, they’ve developed personalized learning to support CBME (rather than the other way around). I’ve written about much of this before and will do so again, so I’m going to leave things as a bullet list just to give you an idea. Students customize their education by choosing:
    • the OHSU College that caters to their clinical and professional interests.
    • the content and order of their clinical experiences
    • the ‘pathway’ and content of their scholarly project (see my previous post)
    • additional certifications or degrees (more information to come)
    • the length of the program (this is still in the planning stages and is dependent on their capacity to achieve competencies in a shorter timeframe).

More to come.


Featured Image by Ian Schneider on Unsplash
