Annals of SBV
Volume 12 | Issue 1 | Year 2023

Assessment Methods in Medical Education

Dhastagir Sultan Sheriff

Department of Biochemistry, Anna Medical College, Montagne Blanche, Mauritius

Corresponding Author: Dhastagir Sultan Sheriff, Department of Biochemistry, Anna Medical College, Montagne Blanche, Mauritius, Phone: +230 58884033, e-mail:

How to cite this article: Sheriff DS. Assessment Methods in Medical Education. Ann SBV 2023;12(1):11–13.

Source of support: Nil

Conflict of interest: None

Received on: 31 August 2021; Accepted on: 29 April 2023; Published on: 19 July 2023


Assessment methods play a vital role in competency-based medical education. The principal tools are formative and summative assessments, and assessment methods are framed around the milestones in the stepwise development of competencies. The basis of an effective method of assessment is framing questions in alignment with the learning outcomes of the curriculum designed for the student. Students participate actively in formative assessment, for their feedback drives the engine of learning and teaching. In this short communication, different methods of assessment are described.

Keywords: Assessment tools, Formative assessment, Summative assessment.


Competency-based medical education was developed in a stepwise manner. The major steps followed to develop competencies are (1) competency identification, (2) identification and determination of the different components of a competency and the levels of performance, (3) evaluation of competency, and (4) overall assessment of the process.1

Once the competencies to be learned are identified, the next step is determining the components of each competency and the levels of performance to be attained. The tasks for performing each competency, or benchmarks, relate to the components of a competency. These tasks or benchmarks must be measurable and, in total, define a candidate’s achievement. The performance criteria have to be set clearly so that they indicate the threshold for achieving a particular competency. Teaching and learning methods then have to be designed to help the learner attain those competencies.2

The third step involves assessment methods to determine whether a candidate has attained competence. The conventional methods of assessment are therefore replaced by assessments that measure competence. Competence can be defined in simple language as acquiring the knowledge, psychomotor skills, values, and behavior needed to apply judiciously in professional practice. The domains of competence are interrelated; these domains are knowledge, patient care, professionalism, communication and interpersonal skills, practice-based learning, and systems-based practice.3–5

Competency includes developing a capacity to adapt to change, look for and generate knowledge, and improve overall performance.

A student’s or physician’s competence is tested through performance in a workplace or clinical setting. The factors influencing practice include how the practice environment is set up, demographic characteristics such as the prevalence of a disease, the symptoms presented by the patient, and the patient’s educational background. Knowledge gained through learning, together with its application and analysis, helps a learner become competent in writing a case history and developing clinical acumen.4

Competence is achieved through different milestones. Habits of mind, behavior, and practical wisdom are gained through practice.

Goals of Assessment

Competency-based medical education aims to make assessments timely, reliable, and precise. The goals are to optimize the competencies of learners, identify incompetent practitioners, and create a sound method for choosing capable candidates for further training and admission to postgraduate programs.6

Assessments that have been recommended and put into practice are formative and summative.

Formative assessments are goal-oriented, continuous, and a stepping stone for future learning as well as for developing values. Feedback to learners provides direction and motivation and forms the basis of life-long learning.

Summative assessments collect data from formative assessments regarding the learners’ progression toward, or attainment of, competencies. Such assessments support the certification process and grading of the learner. There has been a cultural shift in assessment.7

Cultural Shift in Assessment

Summative                      Formative
Easily measured                Hard to measure
Making grades                  Attaining standards
Identifying failing students   Helping all students
Standardized methods           Authentic measures

Summative assessments, however, are poor carriers of feedback, while the effects of purely formative assessments are not long-lasting. Strategic uses of assessment therefore need to be adopted, such as:

  • In-training assessment (internal assessment)

  • Embedded assessment (learning task/assessment task)

  • Combining summative and formative functions.

Assessment Methods

The following table provides a bird’s-eye view of the assessment methods adopted and used in various medical schools (Table 1).

Table 1: Various aspects of assessment
Goals of assessment
  • Provide direction and motivation for future learning, including knowledge, skills, and professionalism

  • Protect the public by upholding high professional standards and screening out trainees and physicians who are incompetent

  • Meet public expectations of self-regulation

  • Choose among applicants for advanced training

What to assess
  • Habits of mind and behavior

  • Acquisition and application of knowledge and skills

  • Communication

  • Professionalism

  • Clinical reasoning and judgment in uncertain situations

  • Teamwork

  • Practice-based learning and improvement

  • Systems-based practice

How to assess
  • Use multiple methods and a variety of environments and contexts to capture different aspects of performance

  • Organize assessments into repeated, ongoing, contextual, and developmental programs

  • Balance the use of complex, ambiguous real-life situations requiring reasoning and judgment with structured, simplified, and focused assessments of knowledge, skills, and behavior

  • Include directly observed behavior

  • Use experts to test expert judgment

  • Use pass-fail standards that reflect appropriate developmental levels

  • Provide timely feedback and mentoring

  • Be aware of the unintended effects of testing

  • Avoid punishing expert physicians who use shortcuts

  • Do not assume that quantitative data are more reliable, valid, or useful than qualitative data

Table 1 gives a comprehensive idea of the various aspects of assessment.

These assessment methods are evaluated against certain criteria of usefulness. These criteria are validity (whether the assessment method measures what it is intended to measure), reliability (whether results are accurate and reproducible), impact on future learning and practice, feasibility for learners and providers, and the cost involved.8

Written Examinations

Apart from conventional pen-and-paper essay questions, most universities use modified essay questions (MEQs), case-oriented questions, or questions testing problem-solving skills. Most of these are closed-ended or objective-type questions. The questions are content-validated and can be used to test basic factual knowledge or the transfer of such knowledge to real clinical situations or problems.9

Objective-type questions such as multiple choice questions (MCQs) help cover the majority of the content, can be answered in a relatively short period, and can be scored by computer. An MCQ is framed using a stem that tests the student’s analytical and application skills, followed by a list of possible answers to choose from. Often a short case history is given, and the student is asked to choose the answer from the given alternatives. The questions are framed so that they align with specific learning objectives and outcomes; ideally, the content drives the assessment.

Content-rich MCQs are difficult to write. Certain areas, such as ethical dilemmas or cultural ambiguities, challenge the examiner to write clear objective-type questions. MCQs involving diagnosis (clinical or laboratory) can, through their alternative answers, give the candidate a clue before the actual diagnosis is made for the case under examination (the cueing effect). To avoid this defect, extended-matching or short-answer questions may be asked.10

In many university examinations, a combination of essay, short-essay, and multiple-choice questions is used. To standardize question papers, a blueprint is developed that guides the preparation of questions according to Bloom’s taxonomy and the weightage of the modules taught. In some places, templates for question sets as well as guidelines for framing questions are provided for both formative and summative assessments.11

In clinical settings, the long case and the mini-clinical-evaluation exercise (mini-CEX) are used for assessment. Under a supervising faculty member, the learner takes the clinical history of a case, including a physical examination, over a period of 10–20 minutes, and is then asked to provide a diagnosis and therapeutic plan for the case.12 Some medical institutions adopt standardized patients for clinical case presentations when evaluating interns or medical students in clinical subjects. Standardized patients are mostly actors trained to behave like patients with a particular disease, explaining the clinical symptoms to the student, who has to study the case and present a diagnosis. The supervising faculty of the department prepares a checklist before such cases are used in assessments.13

Assessments made with the help of peers or members of the clinical team, and sometimes the patient’s views, help grade the learner on work habits, performance as a team member, and interpersonal interactions. Such a system is labeled multisource assessment (360 degrees).14

In many places, logbooks and portfolios are used to document the competencies attained by the student in different phases of learning. They provide a list of competencies attained, along with competencies still to be completed, certified by the respective faculty in charge.15,16

Apart from these methods, the objective structured practical examination (OSPE) and the objective structured clinical examination (OSCE) are used as assessment tools.16–18 OSPEs and OSCEs can be arranged station-wise in many ways, with one station serving as a rest station. Each station is usually allotted time according to the needs or stipulations framed by the concerned department, or following the guidelines of the quality-control committee for examinations and assessments.


The changing pattern of medical education focuses on training and producing competent physicians, and assessment has become the central theme of medical education. To make assessments effective, the right combination of assessment methods must be chosen, along with a careful selection of trained assessors to conduct them.


Dhastagir Sultan Sheriff


1. Dunn WR, Hamilton DD, Harden RM. Techniques of identifying competencies needed of doctors. Med Teach 1985;7(1):15–25. DOI: 10.3109/01421598509036787.

2. Brown TC, McCleary LE, Stenchever MA, Poulson AM Jr. A competency-based educational approach to reproductive biology. Am J Obstet Gynecol 1973;116(7):1036–1043. DOI: 10.1016/s0002-9378(16)33856-x.

3. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach 2007;29(7):648–654. DOI: 10.1080/01421590701392903.

4. Lockyer J, Carraccio C, Chan MK, Hart D, Smee S, Touchie C, et al. ICBME Collaborators. Core principles of assessment in competency-based medical education. Med Teach 2017;39(6):609–616. DOI: 10.1080/0142159X.2017.1315082.

5. Gruppen LD, Mangrulkar RS, Kolars JC. The promise of competency-based education in the health professions for improving Global health. Hum Resour Health 2012;10:43. Available from:

6. Leung WC. Competency based medical training: Review. BMJ 2002;325(7366):693–696. PMID: 12351364.

7. Sheriff DS. Competency-based medical education in India. Annals of SBV 2020;9(2):39–41. DOI: 10.5005/jp-journals-10085-8125.

8. Friedman Ben-David M. The role of assessment in expanding professional horizons. Med Teach 2000;22(5):472–477. DOI: 10.1080/01421590050110731.

9. Van der Vleuten CP, Norman GR, De Graaff E. Pitfalls in the pursuit of objectivity: Issues of reliability. Med Educ 1991;25(2):110–118. DOI: 10.1111/j.1365-2923.1991.tb00036.x.

10. Norcini JJ. Current perspectives in assessment: The assessment of performance at work. Med Educ 2005;39:880–889. DOI: 10.1111/j.1365-2929.2005.02182.x.

11. Schuwirth LW, van der Vleuten CP, Donkers HH. A closer look at cueing effects in multiple-choice questions. Med Educ 1996;30(1):44–49. DOI: 10.1111/j.1365-2923.1996.tb00716.x.

12. Case S, Swanson D. Constructing written test questions for the basic and clinical sciences. 3rd edition. Philadelphia: National Board of Medical Examiners; 2000. pp. 180.

13. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: A method for assessing clinical skills. Ann Intern Med 2003;138(6):476–481. DOI: 10.7326/0003-4819-138-6-200303180-00012.

14. Wass V, Jones R, Van der Vleuten C. Standardized or real patients to test clinical competence? The long case revisited. Med Educ 2001;35(4):321–325. DOI: 10.1046/j.1365-2923.2001.00928.x.

15. Eva KW, Reiter HI, Rosenfeld J, Norman GR. The ability of the multiple mini interview to predict preclerkship performance in medical school. Acad Med 2004;79(10Suppl):S40–S42.

16. Hodges B, McIlroy JH. Analytic global OSCE ratings are sensitive to level of training. Med Educ 2003;37(11):1012–1016. DOI: 10.1046/j.1365-2923.2003.01674.x.

17. Carraccio C, Englander R. Evaluating competence using a portfolio: A literature review and Web-based application to the ACGME competencies. Teach Learn Med 2004;16(4):381–387. DOI: 10.1207/s15328015tlm1604_13.

18. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1(5955):447–451. DOI: 10.1136/bmj.1.5955.447.

© The Author(s). 2023 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and non-commercial reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.