Wanna get me some smarter…medical knowledge among residents


A recent article was published on “assessing medical knowledge of emergency medicine residents”. This group systematically reviewed a list of educational tools used to assess medical knowledge among EM residents; for example, some of the tools they reviewed were multiple choice questions, in-training exams, direct observation, the USMLE, and OSCEs. The authors looked at each method and described the existing literature about its effectiveness. While they did a nice review of what’s out there, their conclusions were disappointing. A call for further research…blah blah blah…but little call for true change. Anyway, I’ve put a bit of my own editorial below, based on the article.

Apparently, in non-EM specialties, passing your board exams has been shown to translate to improved patient outcomes. WOW! I mean, if that isn’t evidence that we should continue board exams in their current format, I don’t know what is (please read sarcastically). There doesn’t appear to be data suggesting that better scores equal better patient outcomes, but even if there were, it would be easily subject to confounding. One would expect that a proportion of conscientious, smart physicians who spend a good deal of time learning to pass an exam will also apply that same conscientious attitude to their patients (even if they don’t necessarily apply the knowledge!).

What is fascinating is the next sentence in the paper: “we were unable to locate any data on the effect ABEM certification has on patient-centered outcomes”. In an era where we’ve begun seeking patient-centered outcomes, we have NO data on whether our ONLY means of accrediting staff physicians has any impact on patients in emergency medicine. Impressive…to say the least!

I remain puzzled as to why we continue to use the board exam in its current format (slightly different between the US and Canada, but a similar idea), where we require residents to recall ridiculous amounts of irrelevant material and long lists that will never be used clinically. We cling to this assessment format as if there were evidence to support it! And yet we reject other means of assessment because the evidence is lacking? There is increasing evidence that we can replicate the stress of real situations in a simulation setting. Wouldn’t this be a great place to evaluate emergency medicine trainees?

“Cognitive psychology has demonstrated that facts and concepts are best recalled and put into service when they are taught, practiced, and assessed in the context in which they will be used” (Cooke M et al NEJM 2006;355:1339-44)

Or what about more interactive, case-based formats? What we need to get away from is making residents regurgitate what is in a textbook. I can find the answer to a 10-item list in 10 seconds with a functional internet connection and a keyboard. I agree that emergent situations do require memory and recall, but a substantial portion of what we do does not fall into this category. A nice quote from an article written more than 10 years ago…yet little has changed!

“With knowledge so easily accessible, physicians in training as well as practicing physicians can depend less upon their own memories and more upon external memory devices” (Irby & Wilkerson J Gen Intern Med 2003;18:370-376)

Now, I should probably be careful with my opinions, given that I haven’t yet written my board exams! However, as we become increasingly surrounded by technology and immediately accessible “knowledge”, it’s time to evaluate medical trainees in a manner that reflects their practice.
