Crimson Publishers


Open Access Research in Anatomy

Do We Need Exam Blueprinting to Validate the Anatomy Exams?

Aziz J*

Senior lecturer of Anatomy and Medical Imaging, New Zealand

*Corresponding author: Aziz J, Senior lecturer of Anatomy and Medical Imaging, New Zealand

Submission: November 04, 2019; Published: December 13, 2019

DOI: 10.31031/OARA.2019.02.000532

ISSN: 2577-1922
Volume 2 Issue 2


Anatomy is a broad science assessed through multiple formats, including written, practical, labelling and even oral examinations. The learning outcomes for each anatomy paper in medical colleges differ according to the graduate profile and the specialty of the graduates. For instance, medical imaging students are not medical doctors but medical radiation technologists working as radiographers, and need only a basic overall understanding of human anatomy; on the other hand, they need comprehensive anatomical detail in particular body systems, e.g. the musculoskeletal, cardiovascular, gastrointestinal and nervous systems. An anatomy exam for such students should therefore treat these systems as top priority, while the remaining systems carry less weighting during exam construction. From my experience of teaching anatomy across different student disciplines, i.e. medical and paramedical, I can confirm that an exam proposal is of great value in validating the exam, increasing students' performance and decreasing the student retention rate.

The test development process includes an overall plan, content definition, and test specification. The overall plan of an assessment will then include the identification of the target group, and the purpose of the exam such as selection, classification, placement, diagnosis or certification. Furthermore, the details of the test design, assembly and production, administration, scoring, standard setting, reporting results, item banking, post hoc analysis and logistic requirements must all be decided [1]. Once the purpose of the test is identified then the most crucial step in the development of a test is to ensure that the test should measure what it is supposed to measure, known as the test validity. Constructing a test blueprint enables stakeholders to have a bigger picture of the exam and determine the relative weights of individual content [2].

Herman [3] pointed out that "People [students] perform better when they know the goal, see models, and know how their performance compares to the standard". The basic purpose of all tests is discrimination (distinguishing levels of aptitude, ability and skill among the test takers), regardless of how the test was constructed or conducted.

“It is said that ‘assessment is the tail that wags the curriculum dog.’ While this statement amply underscores the importance of assessment in any system of education, it also cautions us about the pitfalls that can occur when the assessment is improperly used [4].”

When we speak to undergraduate medical students after examinations, we not infrequently hear them complain about theory papers: the paper was too lengthy and there was not enough time to write; all the questions came from only a few topics, with none from many others; questions were too vague (what to write? what to cut?); long questions were "bouncers" on material they had never been taught. In practical examinations we hear: "I had never seen this case before"; most of the theory questions, long case, short case and MCQs all came from one or a few systems only. This happens because, in the traditional assessment system of most medical colleges, the question paper is set by a single teacher/examiner, with or without a question bank that is never renewed. The practical examinations, in turn, are conducted by other teachers, without any coordination, and are (most of the time) not aligned to the objectives [5]. Often, the decision about which content to assess is left to the examiners.

Moreover, the examiner/teacher imparts instruction according to what he or she thinks is appropriate or important. The assessment needs to be valid. Validity is a requirement of every assessment and implies that candidates who achieve the minimum performance level have actually acquired the competence set out in the learning objectives [3]. The type of validity that relates to measurement of academic achievement is content validity. The content of an assessment is valid when it is congruent with the objectives and learning experiences, and congruence between these pillars of education can be facilitated by blueprinting the assessment [3].

In simple terms, a blueprint links assessment to the learning objectives. It also indicates the marks carried by each question. Preparing a blueprint is useful because the faculty member who sets the question paper then knows which question tests which objective and which content unit, and how many marks it carries [6].
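The weighting idea described above can be sketched as a small table crossing content units (body systems) with cognitive levels, each cell holding the marks allocated to it. All system names, levels and mark values below are illustrative assumptions, not figures from this article; they simply show how a blueprint makes the relative weight of each content unit explicit before the paper is written.

```python
# Minimal sketch of an exam blueprint: content units crossed with
# cognitive levels, each cell holding the marks allocated to it.
# All names and mark values are hypothetical examples.

blueprint = {
    # system: {cognitive level: marks}
    "Musculoskeletal":  {"recall": 10, "application": 10},
    "Cardiovascular":   {"recall": 8,  "application": 7},
    "Gastrointestinal": {"recall": 8,  "application": 7},
    "Nervous":          {"recall": 6,  "application": 4},
}

def total_marks(bp):
    """Total marks carried by the whole paper."""
    return sum(m for levels in bp.values() for m in levels.values())

def weight_by_system(bp):
    """Relative weight (%) of each content unit in the exam."""
    total = total_marks(bp)
    return {system: round(100 * sum(levels.values()) / total, 1)
            for system, levels in bp.items()}

print(total_marks(blueprint))        # 60
print(weight_by_system(blueprint))   # e.g. Musculoskeletal carries 33.3%
```

A table like this lets stakeholders see at a glance that the priority systems dominate the paper, and lets the examiner check that every question maps to an objective before the exam is assembled.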


  1. Hochlehnert A, Brass K, Möltner A, Schultz JH, Norcini J, et al. (2012) Good exams made easy: The item management system for multiple examination formats. BMC Med Educ 12: 63.
  2. Downing SM (2006) Twelve steps for effective test development. In: Downing SM, Haladyna TM (Eds.), Handbook of test development. Lawrence Erlbaum Associates, Mahwah, NJ, United States, pp. 3-25.
  3. Herman JL (1992) A practical guide to alternative assessment. Association for Supervision and Curriculum Development, Alexandria, Virginia, United States.
  4. Adkoli B, Deepak KK (2012) Blue printing in assessment. In: Anshu ST (Ed.), Principles of assessment in medical education. Jaypee Publishers, New Delhi, India, pp. 205-213.
  5. Sunita YP, Nayana KH, Bhagyashri RH (2014) Blueprinting in assessment: How much is imprinted in our practice? J Educ Res Med Teach 2: 4‑6.
  6. Hamdy H (2007) Blueprinting in medical education. N Engl J Med 356: 387‑395.

© 2019 Aziz J. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon the work non-commercially.