CC BY 4.0 · Journal of Health and Allied Sciences NU 2023; 13(02): 289-293
DOI: 10.1055/s-0042-1755447
Brief Report

Evaluation of Cognitive Domain in Objective Exam of Physiotherapy Teaching Program by Using Bloom's Taxonomy

Al-Wardha Zahoor
1   Ziauddin College of Rehabilitation Sciences, Ziauddin University, Karachi, Pakistan
,
Sumaira Imran Farooqui
1   Ziauddin College of Rehabilitation Sciences, Ziauddin University, Karachi, Pakistan
,
Amna Khan
1   Ziauddin College of Rehabilitation Sciences, Ziauddin University, Karachi, Pakistan
,
Syed Abid Mehdi Kazmi
2   Department of Physiotherapy, Ziauddin Hospital, Karachi, Pakistan
,
Naveed Qamar
3   Physiotherapy Department, Aga Khan University Hospital, Karachi, Pakistan
,
Jaza Rizvi
1   Ziauddin College of Rehabilitation Sciences, Ziauddin University, Karachi, Pakistan
Funding None.
 

Abstract

Objective Evaluation is one of the key factors in the development and growth of conceptual understanding in education. Improving students' cognitive level depends heavily on the questions asked in exams. The primary aim of this study is to analyze the cognitive level of physiotherapy exam papers using Bloom's taxonomy.

Material and Methods The study was performed on the 2019 mid-term examinations across all 5 years of the Doctor of Physical Therapy program at a private medical university. One thousand and eighty multiple-choice questions were evaluated against the cognitive domain of the revised Bloom's taxonomy.

Results It was found that mostly lower-order cognitive questions were asked of first- and second-year students, whereas third- to fifth-year students were asked higher-order cognitive questions ranging from 27.5 to 38%.

Conclusion The examination analysis gauged the efficacy of the education being provided and helped identify subject content that needs greater emphasis and clarification. Faculty should give consideration to higher-order cognitive questions to encourage critical thinking among students, and medical colleges should develop policies for constructing question papers according to the goals of each study year.



Introduction

Domain modeling is an important aspect of learning; however, any educator who begins teaching it faces multiple challenges.[1] One of the most challenging problems in the education system is helping students develop an effective learning method. The questions asked in the papers held each semester play a pivotal role in assessing the overall cognitive standards of students.[2] To support the education system, Bloom's taxonomy (BT) classifies questions into different levels.[3]

In the exam evaluation process, the use of multiple-choice questions (MCQs) is a common and well-accepted method of analyzing various characteristics of medical science education.[4] Designing MCQs requires strong knowledge of the subject being tested as well as the teacher's ability to ask a good question.[5] Intelligently designed MCQs analyze students' higher-order cognition, such as judgment and creation, rather than simply evaluating the recall of information.[6] The formulation of high-quality MCQs is a complex task for faculty, particularly for those who have never undergone any training in it.[7] It becomes possible when the examiner knows how to design a question correctly, generally referred to as an item, comprising a stem and a set of options.[8] In 2008, the University of Washington created the Blooming Biology Tool, which helps biology instructors refine teaching practices and build classroom resources and exams using a unified evaluation system.[9] Similarly, other medically related examinations have been mapped onto BT; one such analysis surprised the education community with the finding that a paper assumed to rest on content knowledge was largely based on higher-order cognitive thinking.[10]

Assessment is an essential element of a teaching–learning program, and learning activities should be continuously monitored to provide feedback to teachers. Although some academicians are thought to have enough knowledge to determine Bloom's cognitive levels, most miscategorize them.[11] Such incorrect identification of questions fails to meet the exam standard required for a given year and subject.

Therefore, we evaluated physiotherapy exam papers to identify the cognitive level of MCQs according to BT, to upgrade the items that needed revision, and to develop a viable question bank for future use. The teachers would also receive feedback for developing effective teaching skills.



Materials and Methods

A cross-sectional study was performed in the physiotherapy college of a private medical university in Karachi, Pakistan. The faculty was requested to provide the question papers of the recently completed examinations of the Doctor of Physical Therapy (DPT) program. Exam papers were collected from all 5 years of the DPT program. A total of 27 objective question papers were received, covering all subjects of the odd mid-semester examination of 2019. Each objective paper carried 40 MCQs, across all subjects and years. Each MCQ was of the one-best-answer type, beginning with a stem and leading statement followed by five options: one correct answer and four distractors.

Following receipt of the data, the MCQs were classified into levels of the cognitive domain using the revised BT. Permission to use the modified version of BT in our study was obtained via email from Taylor and Francis.[12] Each MCQ was labeled with its respective level of cognition through a thorough process: it was assessed individually by two independent assessors, both experts in constructing MCQs at each cognitive level who had completed multiple trainings and courses in analyzing MCQs according to the revised BT. The assessors were blinded to each other's evaluations, and at the end, the results for each question paper were disclosed for comparison, to ensure accuracy in identifying the level of each question. For any question where the assessors' ratings differed, the final grading was decided by collaborative discussion between both assessors and an expert panel.
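As a rough illustration, the blinded dual-assessor step described above can be sketched in code: two independent sets of labels are compared, and any mismatches are flagged for panel adjudication. The labels and function names below are hypothetical, for demonstration only; they are not the study's actual instrument.

```python
def flag_disagreements(labels_a, labels_b):
    """Return indices of MCQs where the two assessors' Bloom levels differ.

    Disagreements are the items that, per the protocol above, go to
    collaborative discussion with the expert panel.
    """
    return [i for i, (a, b) in enumerate(zip(labels_a, labels_b)) if a != b]

# Hypothetical level labels (I-IV) from two blinded assessors for five MCQs.
assessor_a = ["I", "II", "I", "III", "II"]
assessor_b = ["I", "II", "II", "III", "I"]

print(flag_disagreements(assessor_a, assessor_b))  # items 2 and 4 need adjudication
```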

Bloom's Taxonomy

In 1956, Benjamin Bloom, an educational psychologist, presented BT, a categorization of the distinct objectives and skills that teachers set for students' learning.[2] BT is classified into three domains: cognitive, affective, and psychomotor. The cognitive taxonomy is a hierarchy: a higher level can be attained only if the prerequisite knowledge and skills at the lower levels are present. The lower-order thinking skills include knowledge, comprehension, and application, whereas the higher-order thinking skills include analysis, synthesis, and evaluation.[13]

In the 1990s, Lorin Anderson, a cognitive psychologist, modified the taxonomy.[14] [15] Bloom's original and revised taxonomy is shown in [Fig. 1].

Fig. 1 Bloom's original and revised taxonomy.

Described below is each level of the modified BT used to evaluate students' cognitive level.



Remember

Remembering or memorizing previously learned material, or any recall of information, including definitions, copied/duplicated material, basic principles, and known facts. For example, can the student memorize the information?



Understand

Students can correctly explain the history of an event or report on the status of an organization. Understanding, or apprehension, means the individual knows what is being communicated and can make use of the material or idea without necessarily relating it to other material or seeing its fullest implications. For example, can the student explain ideas or concepts?



Apply

The ability to apply knowledge to actual or new situations. In a programming context, questions in this category meet the following criteria: understand a concept, apply it to a new algorithm, and modify controls. For example, can the student use the information in a new way?



Analyze

This level requires students to break down information into simpler parts and analyze each of them to achieve an objective. In a programming context, for instance, the student should be able to explain exactly what happens in memory as code is executed line by line. For example, can the student distinguish between the different parts?



Evaluate

A student who achieves this level can make judgments based on criteria: critiquing, supporting, or defending a stand or decision. For example, can the student justify a stand or decision?



Create

This is the final level, where the student integrates and combines ideas or concepts, rearranging components into a new whole. In the newer taxonomy, evaluating comes before creating, as evaluation is often a necessary precursor to creating something. For example, can the student construct or compose a new product/invention?
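As a loose illustration of how question stems relate to these six levels, the sketch below matches common cue verbs in a stem to revised BT levels. The verb lists and the function are illustrative assumptions only; in the study, classification was performed by trained human assessors, not by keyword matching.

```python
# Illustrative first-pass mapping of question-stem cue verbs to revised
# Bloom's taxonomy levels. Verb lists are assumptions for demonstration.
BLOOM_VERBS = {
    "Remember": {"define", "list", "recall", "name", "state"},
    "Understand": {"explain", "describe", "summarize", "classify"},
    "Apply": {"use", "apply", "demonstrate", "solve"},
    "Analyze": {"analyze", "differentiate", "distinguish", "compare"},
    "Evaluate": {"justify", "critique", "judge", "defend"},
    "Create": {"design", "construct", "compose", "formulate"},
}

def suggest_level(question: str) -> str:
    """Return the first Bloom level whose cue verb appears in the stem."""
    words = set(question.lower().replace("?", " ").replace(".", " ").split())
    for level, verbs in BLOOM_VERBS.items():
        if words & verbs:
            return level
    return "Unclassified"

print(suggest_level("Define the term kinesiology."))       # Remember
print(suggest_level("Justify your choice of treatment."))  # Evaluate
```

A heuristic like this could at best pre-sort a question bank for human review; verb choice alone does not determine the cognitive level of an item.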

The work was performed in accordance with the Declaration of Helsinki: there was no potential harm to participants, informed consent was obtained from the college, and records were kept confidential.



Results

A total of 1,080 MCQs from the DPT program were analyzed against the modified BT: 200 questions from the first year, 280 from the second, 200 from the third, 240 from the fourth, and 160 from the fifth.

The evaluation of the first-year mid-semester examination found a high percentage of level I MCQs (94.5%), concerning recall of information, whereas only 5.5% were level II, concerning understanding of concepts. The evaluation of the second-year exams revealed that all seven subjects contained level I questions (91.4% overall); five subjects also contained level II questions (7.14% in total), whereas only three subjects contained level III questions, in a much smaller proportion (1.4%).

In the third year, the proportion of level I MCQs decreased to 49.5% compared with the first and second years. All third-year subjects contained level II and level III MCQs (23% and 25%, respectively), and level IV MCQs also appeared in two subjects (2.5%). The fourth year had far fewer level I MCQs (32.9% overall), with the highest percentage in the medical condition subject (50%), while the remaining subjects ranged from 22.5 to 37.5%. Level II MCQs appeared in all fourth-year subjects, mostly between 30 and 40%. Level III MCQs were also found in all subjects in a good proportion, 22.5 to 37.5%. Level IV MCQs were found in four subjects, constituting 5% in total.

The fifth year, with four subjects in total, had the lowest percentage of level I MCQs among all years (27.5%), whereas level II and III MCQs increased compared with the other years (34.4% and 35.6%, respectively). Level IV MCQs, however, did not reach the fourth year's share, constituting only 2.5% overall. Not a single question at cognitive level V or VI was found in any academic year. [Table 1] presents the classification of each year's DPT MCQs across the levels of the cognitive domain of BT, broken down by subject.
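The per-year percentages reported above follow directly from the per-level counts and yearly totals; a minimal sketch of the calculation, using the totals from the Results, is:

```python
# Per-year MCQ totals and per-level counts, as reported in the Results.
year_counts = {
    "First":  {"total": 200, "I": 189, "II": 11, "III": 0,  "IV": 0},
    "Second": {"total": 280, "I": 256, "II": 20, "III": 4,  "IV": 0},
    "Third":  {"total": 200, "I": 99,  "II": 46, "III": 50, "IV": 5},
    "Fourth": {"total": 240, "I": 79,  "II": 77, "III": 72, "IV": 12},
    "Fifth":  {"total": 160, "I": 44,  "II": 55, "III": 57, "IV": 4},
}

def level_percentages(counts):
    """Percentage of MCQs at each Bloom level, rounded to one decimal."""
    return {lvl: round(100 * counts[lvl] / counts["total"], 1)
            for lvl in ("I", "II", "III", "IV")}

for year, counts in year_counts.items():
    print(year, level_percentages(counts))
```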

Table 1

Distribution of MCQs according to level of modified Bloom's taxonomy (each subject n = 40 MCQs; values are n (%); – indicates no questions at that level)

| Subject                          | Level I    | Level II  | Level III | Level IV |
|----------------------------------|------------|-----------|-----------|----------|
| First year                       |            |           |           |          |
| Anatomy II                       | 40 (100)   | –         | –         | –        |
| Physiology II                    | 38 (95)    | 2 (5)     | –         | –        |
| Introduction to computer         | 40 (100)   | –         | –         | –        |
| Kinesiology II                   | 31 (77.5)  | 9 (22.5)  | –         | –        |
| Islamic studies                  | 40 (100)   | –         | –         | –        |
| Total (first year)               | 189 (94.5) | 11 (5.5)  | –         | –        |
| Second year                      |            |           |           |          |
| Anatomy IV                       | 40 (100)   | –         | –         | –        |
| Biomechanics and ergonomics      | 35 (87.5)  | 5 (12.5)  | –         | –        |
| Sociology                        | 30 (75)    | 8 (20)    | 2 (5)     | –        |
| Exercise physiology              | 34 (85)    | 5 (12.5)  | 1 (2.5)   | –        |
| Medical physics                  | 39 (97.5)  | 1 (2.5)   | –         | –        |
| Molecular biology and genetics   | 39 (97.5)  | 1 (2.5)   | –         | –        |
| Biochemistry                     | 39 (97.5)  | 1 (2.5)   | –         | –        |
| Total (second year)              | 256 (91.4) | 20 (7.14) | 4 (1.4)   | –        |
| Third year                       |            |           |           |          |
| Pathology and microbiology II    | 27 (67.5)  | 6 (15)    | 7 (17.5)  | –        |
| Pharmacology II                  | 32 (80)    | 2 (5)     | 6 (15)    | –        |
| Scient. inq. and res methodology | 16 (40)    | 11 (27.5) | 13 (32.5) | –        |
| Manual therapy                   | 12 (30)    | 14 (35)   | 12 (30)   | 2 (5)    |
| Physical agents & electrother. II| 12 (30)    | 13 (32.5) | 12 (30)   | 3 (7.5)  |
| Total (third year)               | 99 (49.5)  | 46 (23)   | 50 (25)   | 5 (2.5)  |
| Fourth year                      |            |           |           |          |
| Prosthetics and orthotics        | 10 (25)    | 16 (40)   | 14 (35)   | –        |
| Surgery II                       | 12 (30)    | 12 (30)   | 13 (32.5) | 3 (7.5)  |
| Neurological physical therapy    | 9 (22.5)   | 16 (40)   | 11 (27.5) | 4 (10)   |
| Medical condition II             | 20 (50)    | 5 (12.5)  | 15 (37.5) | –        |
| Musculoskeletal physical therapy | 15 (37.5)  | 14 (35)   | 10 (25)   | 1 (2.5)  |
| Evidence-based practice          | 13 (32.5)  | 14 (35)   | 9 (22.5)  | 4 (10)   |
| Total (fourth year)              | 79 (32.9)  | 77 (32)   | 72 (30)   | 12 (5)   |
| Fifth year                       |            |           |           |          |
| Pediatric physical therapy       | 10 (25)    | 20 (50)   | 8 (20)    | 2 (5)    |
| Obs. and gynae. physical therapy | 10 (25)    | 7 (17.5)  | 22 (55)   | 1 (2.5)  |
| Sports physical therapy          | 16 (40)    | 15 (37.5) | 9 (22.5)  | –        |
| Gerontology and geriatrics rehab | 8 (20)     | 13 (32.5) | 18 (45)   | 1 (2.5)  |
| Total (fifth year)               | 44 (27.5)  | 55 (34.3) | 57 (35.6) | 4 (2.5)  |

Abbreviation: MCQs, multiple-choice questions.


The distribution between lower- and higher-order cognitive skills is summarized in [Fig. 2]. The first and second years had mostly lower-order cognitive questions, whereas higher-order cognitive questions appeared in the third, fourth, and fifth years.

Fig. 2 Percentage distribution of higher and lower order of cognitive level of multiple-choice questions in different years of Doctor of Physical Therapy program.


Discussion

Post-examination analysis helps measure the effectiveness of individual test items and of the study as a whole. It helps identify subject content that needs greater emphasis and clarification. Further, questions that are not designed to assess the intended cognitive level can be eliminated from the question bank or modified as needed.

Choosing an appropriate means of evaluating the efficiency of study has always been a challenging task for medical institutions. Our research assessed exam papers against BT to determine what level of cognition was required to solve each paper. In the first year, the majority of questions (94.5%) involved recall and remembering of information, and 5.5% of MCQs were at the understanding level. All MCQs belonged to the lower cognitive levels, and none were at a higher cognitive level. This may reflect the fact that first-year students, at this initial stage, need to focus more on memorizing information than on generating it. Similarly, in the second year, the questions at levels I, II, and III were 91.4, 7.14, and 1.4%, respectively.

A study by Baig et al[16] in Pakistan evaluated module questions against BT; they found 76% at level I, that is, recall of information, and 24% at level II. However, they reported results only at the whole-exam level, not per module, whereas our study reports each year of the DPT program separately, because each year of study requires a different level of questions. A teaching and learning conference paper[17] evaluated student performance against the revised BT in mid-term and final exams; students showed improved results at level IV (analysis), and it was recommended that universities endorse the highest level of questions to challenge students.

Our study revealed that third-, fourth-, and fifth-year students had MCQs at levels I, II, III, and IV, including higher cognitive levels. The third year comprised MCQs at levels I, II, III, and IV at 49.5, 23, 25, and 2.5%, respectively, whereas the fourth year comprised 32.9% at level I, 32% at level II, 30% at level III, and 5% at level IV. The fifth year had the fewest low-cognition MCQs of all years: 27.5% at level I, 34.3% at level II, 35.6% at level III, and 2.5% at level IV. The balance should lean further toward the higher cognitive levels, as final-year students are expected to solve papers at a higher level of cognition. A 2012 study that incorporated BT into a pharmacotherapeutics course found significantly better student results at the knowledge, comprehension, and application levels than on the analysis, synthesis, and evaluation questions requiring higher cognition, and concluded that BT is a key method for assessing students' critical thinking.[18]

Faculty should be empowered and educated to develop MCQs at higher cognitive levels. Our study indicates that more improvement is needed at the higher cognition levels for better learning outcomes in students.



Conclusion

It is concluded that lower-order cognitive questions were asked in the earlier years, whereas senior-year students were asked higher-order cognitive questions ranging from 27.5 to 38%. Faculty should ask questions that encourage critical thinking in students, and medical colleges should develop policies for constructing question papers according to the goals of each study year. Exam papers should also be reviewed regularly to improve them in future. A faculty development program should be incorporated in all colleges to help teachers develop such questions.



Conflict of Interest

None declared.

Acknowledgments

The authors would like to thank the university for permitting publication of the research. We also thank the entire faculty who contributed to the study.



  • References

  • 1 Bogdanova D, Snoeck M. Domain Modelling in Bloom: Deciphering How We Teach It. In IFIP Working Conference on the Practice of Enterprise Modeling 2017; 3-17
  • 2 Omar N, Haris SS, Hassan R. et al. Automated analysis of exam questions according to Bloom's taxonomy. Procedia Soc Behav Sci 2012; 59: 297-303
  • 3 Forehand M. Bloom's taxonomy. Emerging Perspectives on Learning, Teaching, and Technology 2010; 41: 47-56
  • 4 Abdulghani HM, Irshad M, Haque S, Ahmad T, Sattar K, Khalil MS. Effectiveness of longitudinal faculty development programs on MCQs items writing skills: a follow-up study. PLoS One 2017; 12 (10) e0185895
  • 5 Palmer E, Devitt P. Constructing multiple choice questions as a method for learning. Ann Acad Med Singap 2006; 35 (09) 604-608
  • 6 Hingorjo MR, Jaleel F. Analysis of one-best MCQs: the difficulty index, discrimination index and distractor efficiency. J Pak Med Assoc 2012; 62 (02) 142-147
  • 7 Abdulghani HM, Ahmad F, Irshad M. et al. Faculty development programs improve the quality of multiple-choice questions items' writing. Sci Rep 2015; 5: 9556
  • 8 Collins J. Education techniques for lifelong learning: writing multiple-choice questions for continuing medical education activities and self-assessment modules. Radiographics 2006; 26 (02) 543-551
  • 9 Crowe A, Dirks C, Wenderoth MP. Biology in bloom: implementing Bloom's Taxonomy to enhance student learning in biology. CBE Life Sci Educ 2008; 7 (04) 368-381
  • 10 Zheng AY, Lawhorn JK, Lumley T, Freeman S. Assessment. Application of Bloom's taxonomy debunks the “MCAT myth”. Science 2008; 319 (5862): 414-415
  • 11 Yusof N, Hui CJ. Determination of Bloom's cognitive level of question items using artificial neural network. 10th International Conference on Intelligent Systems Design and Applications 2010.
  • 12 Taylor & Francis Online [Internet]. Tandfonline.com. 2020 [cited July 14, 2020]. Accessed July 23, 2022 from: https://www.tandfonline.com/
  • 13 Narayanan S, Adithan M. Analysis of question papers in engineering courses with respect to hots (higher order thinking skills). Am J Eng Educ 2015; 6: 1-0 (AJEE)
  • 14 Wilson LO. Anderson and Krathwohl Bloom's taxonomy revised understanding the new version of Bloom's taxonomy. The Second Principle 2016: 1-8
  • 15 Krathwohl DR. A revision of Bloom's taxonomy: an overview. Theory Pract 2002; 41: 212-218
  • 16 Baig M, Ali SK, Ali S, Huda N. Evaluation of multiple choice and short essay question items in basic medical sciences. Pak J Med Sci 2014; 30 (01) 3-6
  • 17 Al-Janabi A, Al-Rawahi N. Revised Bloom Taxonomy for Mechanical Engineering Courses: Evaluation and Performance. Sohar University teaching and learning conference
  • 18 Kim MK, Patel RA, Uchizono JA, Beck L. Incorporation of Bloom's taxonomy into multiple-choice examination questions for a pharmacotherapeutics course. Am J Pharm Educ 2012;76(06)

Address for correspondence

Al-Wardha Zahoor, MSc PT, DPT
4/B, Shahrah-e-Ghalib Rd, Ziauddin University, Block 6 Clifton, Karachi, 75600
Pakistan   

Publication History

Article published online:
13 September 2022

© 2022. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution License, permitting unrestricted use, distribution, and reproduction so long as the original work is properly cited. (https://creativecommons.org/licenses/by/4.0/)

Thieme Medical and Scientific Publishers Pvt. Ltd.
A-12, 2nd Floor, Sector 2, Noida-201301 UP, India

