Open access peer-reviewed chapter

The Influence of Assessment Administration Modes on Students’ Academic Performance

Written By

Mfeuter Tachia

Submitted: 23 April 2023 Reviewed: 22 March 2024 Published: 18 September 2024

DOI: 10.5772/intechopen.114890

From the Edited Volume

Academic Performance - Students, Teachers and Institutions on the Stage

Edited by Diana Dias and Teresa Candeias


Abstract

This chapter explores the impact of computer-based testing (CBT) on the academic performance of students, comparing outcomes with traditional paper-and-pencil tests. Emphasizing the important role of educational institutions, the research underscores the necessity for well-equipped facilities, including classrooms and libraries. It highlights students’ responsibility in attending classes, completing assignments punctually, and maintaining consistent study habits. The findings recommend cultivating students’ sense of responsibility. Proposing further investigations into perceived factors affecting academic performance, the study suggests potential strategies for improvement. Situated within the context of the book “Academic Performance - Students, Teachers, and Institutions on the Stage,” this contribution offers a practical understanding of the intersection between technology and academic success in physics education as a case study.

Keywords

  • educational assessment
  • test administration modes
  • computer-based test
  • academic performance
  • paper and pencil test

1. Introduction

Education, defined as the transfer of knowledge within formal, informal, or semi-formal settings, continually evolves, incorporating technological innovations to enhance pedagogical methodologies. This research delves into the dynamic landscape of academic performance, focusing on the perceived impacts of computer-based testing (CBT) on B.Sc. (Ed) physics students. Within the educational assessment framework, the study meticulously scrutinizes the contrasts between traditional paper-and-pencil tests (PPT) and CBT modalities. The implications derived from this investigation are significant for educators, institutions, and policymakers, offering insights into the complex interplay between assessment modes and student outcomes [1, 2].

As educational gateways often witness the influx of students into the first year (100 level), our findings underscore the necessity of strategic attention during this critical initiation phase. A key recommendation emerges, advocating for the alignment of students with their chosen courses to act as a motivational catalyst, fostering genuine interest and, subsequently, elevating academic performance. Integral to this proposition is the crucial role of educational institutions in providing requisite facilities for effective learning. The study posits that adequately equipped classrooms, well-stocked libraries, state-of-the-art laboratories, and other resources are indispensable elements for creating an environment conducive to academic success. Furthermore, our research amplifies the call for active student involvement in their academic journey. The identified factors within students’ control, such as regular attendance, timely completion of assignments, and consistent study habits, emerge as key influencers of academic outcomes. Central to this emphasis is the cultivation of a sense of responsibility among students toward their studies, fostering a proactive approach [3, 4].

This contribution, situated within the broader context of the book titled “Academic Performance - Students, Teachers, and Institutions on the Stage,” provides a pragmatic and contextual understanding of the intricate relationship between technology and academic achievement in the domain of physics education as a case study. Recognizing the transformative role of education in the contemporary world, where technological advancements shape the learning landscape, this research narrows its focus to computer-based testing (CBT). As the educational sphere increasingly adopts CBT, with its advantages of faster scoring, increased security, and adaptive testing capabilities, the study seeks to ascertain its impact on academic performance, particularly in the field of physics and other science-related disciplines. Aligning with the global trend, the research compares CBT with conventional paper-and-pencil testing (PPT), investigating potential effects on students’ academic achievements and seeking to provide insightful suggestions for refining the CBT approach to test administration. In an era where education is not only a means of knowledge transfer but a gateway to technological literacy, this research positions itself at the forefront, contributing to the ongoing discourse on academic performance and innovation in educational assessment [5, 6, 7, 8].

1.1 The significance of the chapter

This chapter, titled “The Influence of Assessment Administration Modes on Students’ Academic Performance,” holds profound relevance to the overarching theme of the book, “Academic Performance - Students, Teachers, and Institutions on the Stage.” The chapter contributes a valuable perspective to the collective understanding of academic performance by investigating the specific influence of computer-based testing (CBT) on a targeted group of students pursuing B.Sc. (Ed) physics.


2. Literature review

In the realm of higher education, computer-based testing (CBT) emerges as a widely embraced “innovative” assessment method, celebrated for its promise of quicker, cost-effective test delivery. The work of Gabriel and Yahaya [9] centers on the shift from traditional paper-and-pencil tests (PPT) to CBT in Nigerian tertiary institutions, with a specific focus on library and information science (LIS) students. Their study reveals that CBT adoption, while still in a pilot phase, has already dispelled the notion of drudgery among students, fostering enhanced thoroughness and technical skills. Despite prevalent challenges such as funding, personnel training, and the need for sustained student support, Gabriel and Yahaya [9] argue that the benefits of CBT outweigh the challenges, holding the potential to elevate the quality of education in tertiary institutions.

Rocca and Zielinski [10], as part of the U.S. Department of Education’s Race to the Top assessment program, delve into the manifold benefits of CBT, emphasizing its effectiveness in administration, student preference, self-selection options, improved writing performance, and the potential to shift the focus from assessment to instruction. They highlight the versatility of CBT in enabling new methods of assessing student understanding, going beyond traditional approaches, and fostering interactive engagement with data. However, their study seeks to explore whether CBT has a measurable impact on students, aligning with the goal of this chapter to understand the influence of computer-based testing on academic performance.

Incorporating Jerome Bruner’s constructivist theoretical framework, Namirembe [11] underscores the active nature of learning, wherein students build new ideas upon existing knowledge. The constructivist approach advocates for engaging students in active learning strategies, such as experiments and real-world problem-solving, aligning with the fundamental shift brought about by CBT. Namirembe [11] claims that CBT, rooted in self-directed learning (SDL), aligns with the constructivist educational philosophy, treating the learner as an active participant in the knowledge-creation process. This viewpoint contrasts with objectivism, which casts the learner as passive, and provides a foundation for understanding the potential impact of CBT on students’ academic engagement.

Dammas [12] investigates students’ attitudes toward CBT in chemistry courses, revealing a favorable inclination toward CBT among those with prior computer experience. While acknowledging issues in test administration, such as incorrect chemical equations and formulas, Dammas [12] notes a substantial success rate among students with previous exposure to computer resources. In a similar vein, Efendi et al. [13] contribute to the discourse by proposing a web-based CBT paradigm that overcomes the drawbacks of traditional examination systems. The development research methodology adopted by Efendi et al. [13] results in an online test kit facilitating the assessment of students’ learning outcomes.

Oduntan et al. [14] compare student performance in paper-and-pencil tests (PPT) and CBT, drawing on data from the Unified Tertiary Matriculation Examination (UTME). The findings demonstrate increased interest in CBT among students, with superior performance noted in CBT compared to PPT. Furthermore, McClelland and Cuevas [7] scrutinize the impact of testing modalities, particularly CBT and PPT, on students’ mathematics performance. The study highlights the need for careful consideration of comparability between CBT and PPT, acknowledging the benefits of computer testing while emphasizing the importance of equivalence.

In general, the literature reviewed signifies a growing acceptance of CBT in educational assessment. While acknowledging the benefits, challenges, and varied attitudes toward CBT, these studies collectively contribute to the understanding of the impact of CBT on academic performance, setting the stage for further exploration in science courses.


3. Methodology

This section elucidates the methodological framework employed to examine the influence of computer-based testing (CBT) on the academic performance of B.Sc. (Ed) physics students at the University of Agriculture, now Joseph Sarwuan Tarkaa University, Makurdi. The research design, area of study, population sample, sampling techniques, instrumentation, method of data collection, and method of analysis are detailed within this methodological discourse. For this investigation, a descriptive survey design was adopted, aligning with the nature of the research objectives. This design was selected to delve into the general question of whether CBT affects students’ performance, offering a comprehensive yet insightful perspective on the phenomenon under study [9]. The study focused on the B.Sc. (Ed) physics program, a course offered by the College of Agricultural and Science Education at the Joseph Sarwuan Tarkaa University, Makurdi. This targeted approach ensures a concentrated examination of the impact of CBT within a defined academic context [9].

The population of this study comprises all 100L students who registered and took exams in the B.Sc. (Ed) physics program during the academic years 2011/2012 to 2015/2016 at the Joseph Sarwuan Tarkaa University, Makurdi. The total population consists of 639 registered students. Utilizing purposive sampling, a subset of this population was selected, reflecting the specific focus on B.Sc. (Ed) physics students to ensure relevance and depth in the study [2, 9]. As the study relies on secondary data provided by the Department of Science Education, no specific instrument was deemed necessary. The raw scores of students’ results were obtained from the examination officer of the department, and the Statistical Package for the Social Sciences (SPSS) was employed for the analysis of these scores [11, 12]. The data for this study were collected from the raw scores of students’ results, made available by the examination officer of the Department of Science Education. These raw scores, specific to the B.Sc. (Ed) physics program, formed the basis for assessing the impact of CBT on academic performance [11, 14]. To test the study hypothesis, the nonparametric sign test statistic was employed. This statistical tool, recognized for its applicability to non-normally distributed data, was chosen to analyze the results and ascertain the impact of CBT on students’ academic performance. SPSS facilitated the execution of the analysis, including the use of Student’s t-test for specific evaluations [12, 14].
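To make this analytic step concrete, the sketch below runs an independent-samples t-test in Python with SciPy on the per-course percentage pass rates that are tabulated in Table A4 of the appendix. This is a minimal illustration under our own variable names, not the chapter’s actual SPSS workflow; a sign-test counterpart is sketched after Table A4.

```python
import numpy as np
from scipy.stats import ttest_ind

# Average percentage pass rate per course (from Table A4 in the appendix).
ppt = np.array([93.02, 95.35, 98.84, 93.60, 94.19, 86.05, 88.37, 97.67,
                78.11, 94.67, 95.86, 92.90, 94.67, 89.35, 78.70, 93.49])
cbt = np.array([82.89, 95.39, 79.61, 96.05, 94.08, 90.13, 81.58, 90.79,
                83.56, 97.26, 91.10, 94.52, 84.93, 87.67, 85.62, 93.84])

# Independent-samples t-test, equal variances assumed (cf. Table 2).
t_stat, p_value = ttest_ind(cbt, ppt, equal_var=True)
print(f"|t| = {abs(t_stat):.3f}, p = {p_value:.3f}")  # |t| ≈ 1.08, p ≈ 0.29

# Decision rule at the 0.05 level of significance.
print("Retain H0" if p_value > 0.05 else "Reject H0")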

In summary, this methodological framework, comprising a descriptive survey design, purposive sampling, and statistical analyses using SPSS, was meticulously structured to investigate the impact of CBT on the academic performance of B.Sc. (Ed) physics students at the Joseph Sarwuan Tarkaa University, Makurdi. These chosen methods ensure a focused, contextually relevant, and rigorous exploration of the research questions, contributing to the broader discourse on the role of CBT in educational assessment.


4. Results

The statistical results obtained from the analysis conducted using SPSS are presented in Tables 1 and 2. In Table 1, “N” represents the number of observations. Table 2 reports the computed t statistic (“t”), the degrees of freedom (“DF”), and the two-tailed probability value (“Sig. (2-tailed)”). Table 2 is particularly informative for the interpretation of results: the probability value is crucial in accepting or rejecting the null hypothesis (H0). The null hypothesis states that there are no significant effects of computer-based testing (CBT) on students’ academic performance; the alternative hypothesis (H1), on the contrary, suggests significant effects of CBT on academic performance. The decision rule is based on comparing the probability value (P-value) with the level of significance, which is set at 0.05.

Mode of test (% pass rate)   N    Mean    Std. deviation   Std. error mean
CBT                          16   91.55   6.072            1.518
PPT                          16   89.31   5.690            1.423

Table 1.

Group statistics.

t-Test for equality of means (% pass rate)

                              t       DF       Sig. (2-tailed)   Mean difference   Std. error difference   95% CI of the difference (lower, upper)
Equal variances assumed       1.076   30       0.290             2.239             2.080                   (−2.010, 6.487)
Equal variances not assumed   1.076   29.875   0.290             2.239             2.080                   (−2.011, 6.488)

Table 2.

Independent sample test.


Upon analysis, the obtained P-value of 0.290 is greater than the set level of significance (0.05). As a result, the null hypothesis (H0) is accepted, indicating that there are no significant effects of CBT on students’ academic performance; consequently, the alternative hypothesis (H1) is rejected. In conclusion, the research aimed to investigate the impact of CBT on the academic performance of B.Sc. (Ed) physics students at the Joseph Sarwuan Tarkaa University, Makurdi. The use of Student’s t-test and SPSS for statistical analysis revealed that CBT had no significant effects on students’ academic performance in science courses. This finding contributes to the broader discourse on the role of CBT in educational assessment and emphasizes the importance of considering the various factors that influence academic performance.
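As a quick cross-check of Table 2, the group statistics in Table 1 are by themselves sufficient to reproduce the reported t statistic and P-value. Below is a minimal sketch with SciPy (the choice of library is ours; the chapter’s analysis was run in SPSS):

```python
from scipy.stats import ttest_ind_from_stats

# Summary figures copied from Table 1 (group statistics).
result = ttest_ind_from_stats(
    mean1=91.55, std1=6.072, nobs1=16,  # CBT row of Table 1
    mean2=89.31, std2=5.690, nobs2=16,  # PPT row of Table 1
    equal_var=True,                     # "equal variances assumed" row of Table 2
)
print(f"t = {result.statistic:.3f}, p = {result.pvalue:.3f}")
# t ≈ 1.076 on 30 degrees of freedom, p ≈ 0.290: since 0.290 > 0.05, H0 is retained.
```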


5. Discussion

The statistical analysis of the foregoing data indicates that the null hypothesis, which states that computer-based testing (CBT) has no discernible impact on the academic performance of students in the sciences, is accepted, and the alternative hypothesis is therefore rejected. This aligns with the research conducted by Oduntan et al. [14], which similarly concludes that there is no significant impact on individuals’ performance when they are provided with computer-assisted instruction. Several factors emerged as potential contributors to a negative impact on students’ academic performance. Firstly, time management surfaced as a critical issue. The contemporary lifestyle of students often involves distractions such as social media engagement, watching sports, playing games, and participating in various extracurricular activities. This widespread lack of focus on studies due to time mismanagement could have adversely affected academic outcomes. The statistical results indicating no significant effects of CBT on academic achievement raise questions about whether time allocation played a role in student performance.

Changes in course assignment were identified as another potential factor influencing academic performance: students offered courses different from their initial preferences tend to lose interest and focus. This adjustment challenge is particularly acute for 100L students entering a new educational system and may have contributed to diminished academic performance. Teaching methodology emerged as a critical consideration, especially for newly admitted students unfamiliar with the university system. The shift from the more familiar teaching style employed in secondary (high) schools to the lecture technique used at universities may pose challenges for some students. Difficulty in adapting to new teaching methods could result in poor academic performance among this cohort.

Inadequate facilities also surfaced as a noteworthy factor impacting student outcomes. Overcrowded lecture spaces, unsuitable seating arrangements, and the lack of relevant equipment in most classrooms contribute to an environment that is not conducive to effective learning. Issues such as poor coordination during exams and a lack of amenities further compound the challenges faced by students. These facility-related challenges may have played a role in the observed failure among students. In conclusion, while CBT itself was not found to have a significant impact on academic performance, the identified factors, including time management, changes in course assignments, teaching methodology, and inadequate facilities, warrant careful consideration. Addressing these underlying issues may be crucial in enhancing overall student performance within the academic environment.


6. Conclusion

In conclusion, this chapter meticulously examined the impact of computer-based testing (CBT) on the academic performance of B.Sc. (Ed) physics students at the Joseph Sarwuan Tarkaa University, Makurdi. Utilizing a descriptive survey design and drawing on a focused population of 100L students enrolled in the B.Sc. (Ed) physics program, the research sought to address the overall question of whether CBT has a discernible effect on students’ academic outcomes. The research findings, derived from an analysis of raw scores obtained through secondary data from the Department of Science Education, indicated that CBT did not exhibit a statistically significant impact on students’ academic performance in the sciences. These results align with similar findings by Oduntan et al. [14], reinforcing the notion that computer-assisted instruction may not inherently influence academic achievement.

However, the exploration delved into potential contributing factors that might adversely affect student performance, offering a comprehensive understanding of the broader context. Issues such as time mismanagement, changes in the courses offered to newly admitted students, teaching methodology, and inadequate facilities emerged as noteworthy considerations [15]. The contemporary lifestyle of students, characterized by various distractions, raised questions about the effective use of time for academic endeavors. Additionally, challenges associated with course reassignment, unfamiliar teaching methodologies, and insufficient facilities were identified as potential influencers of academic outcomes. While CBT itself may not be the direct cause of variations in academic performance, these underlying factors represent critical dimensions that warrant further attention and intervention. The chapter underscores the need for a holistic approach to educational assessment, acknowledging that the mode of testing is just one facet of a complex educational landscape. Addressing the identified challenges in time management, courses of study, teaching methodologies, and facilities may contribute significantly to enhancing the overall academic performance of students. The ensuing chapters will delve into these factors, providing a comprehensive understanding of their implications and proposing potential strategies for improvement.


A. Appendix: data presentation, analysis, and interpretation

A.1 Introduction

The methods for data analysis and the interpretation of the results from the two test administration formats, namely PPT and CBT, on the academic performance of 100-level students are covered in this appendix. After compiling the data, reports with tables and qualitative analysis were generated. The tables below show the number of 100-level students who passed their exams in selected science subjects, for both PPT and CBT, from the academic years 2011/2012 to 2015/2016. A score of 45% was regarded as the pass grade; any student who scored below 45% was recorded as having failed.
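For readers who wish to reproduce tables of this shape, the sketch below derives per-course pass counts and percentages from raw scores using the 45% pass mark. It is a minimal sketch in Python with pandas; the column names and the tiny inline dataset are hypothetical, since the chapter’s raw records are not published.

```python
import pandas as pd

# Hypothetical raw records: one row per student per course (illustrative only).
scores = pd.DataFrame({
    "session": ["2011/2012"] * 4,
    "course":  ["PHY111", "PHY111", "PHY111", "CHM111"],
    "score":   [62.0, 44.5, 71.0, 39.0],
})

PASS_MARK = 45.0  # the pass grade used in this chapter
scores["passed"] = scores["score"] >= PASS_MARK

# Registered students, passes, and percentage passed per course and session.
summary = scores.groupby(["session", "course"]).agg(
    registered=("passed", "size"),
    passed=("passed", "sum"),
)
summary["pct_passed"] = (100 * summary["passed"] / summary["registered"]).round(2)
print(summary)
```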

The relevant tables, with brief descriptions, are presented below.

See Table A1.

                    2011/2012                   2012/2013                   2013/2014
Course/code    Passed   Reg.   % Passed   Passed   Reg.   % Passed   Passed   Reg.   % Passed
CHM111         73       74     98.65      43       51     84.31      44       47     93.62
CHM151         69       74     93.24      50       51     98.04      45       47     95.74
CMP111         72       74     97.30      51       51     100.00     47       47     100.00
GST111         68       74     91.89      48       51     94.12      45       47     95.74
MTH111         71       74     95.95      47       51     92.16      44       47     93.62
PHY111         59       74     79.73      48       51     94.12      41       47     87.23
PHY191         56       74     75.68      51       51     100.00     45       47     95.74
STA111         72       74     97.30      51       51     100.00     45       47     95.74
CHM122         46       72     63.89      46       51     90.20      40       46     86.96
CHM152         67       72     93.06      47       51     92.16      46       46     100.00
CMP122         68       72     94.44      49       51     96.08      45       46     97.83
GST112         69       72     95.83      50       51     98.04      38       46     82.61
MTH122         69       72     95.83      49       51     96.08      42       46     91.30
PHY132         65       72     90.28      47       51     92.16      39       46     84.78
PHY142         43       72     59.72      49       51     96.08      41       46     89.13
PHY192         65       72     90.28      49       51     96.08      44       46     95.65
16 courses     1032     1168              775      816               691      744
Total number of registered students: 341

Table A1.

PPT summary (the number of students who passed, 2011/2012 to 2013/2014 academic sessions).

See Table A2.

The cumulative summary from 2011/2012 to 2013/2014 academic sessions

Course/code   Cumulative pass   Cumulative no. of reg. students   Average % passed
CHM111        160               172                               93.02
CHM151        164               172                               95.35
CMP111        170               172                               98.84
GST111        161               172                               93.60
MTH111        162               172                               94.19
PHY111        148               172                               86.05
PHY191        152               172                               88.37
STA111        168               172                               97.67
CHM122        132               169                               78.11
CHM152        160               169                               94.67
CMP122        162               169                               95.86
GST112        157               169                               92.90
MTH122        160               169                               94.67
PHY132        151               169                               89.35
PHY142        133               169                               78.70
PHY192        158               169                               93.49
16 courses    2498              2728                              91.57

Table A2.

The cumulative number of students who passed, the total number of registered students, and the average percentage passed.

See Table A3.

                    2014/2015                   2015/2016                   Summation for two sessions
Course/code    Passed   Reg.   % Passed   Passed   Reg.   % Passed   Passed   Reg.   Average % passed
CHM111         53       61     86.89      73       91     80.22      126      152    82.89
CHM151         55       61     90.16      90       91     98.90      145      152    95.39
CMP111         58       61     95.08      63       91     69.23      121      152    79.61
GST111         56       61     91.80      90       91     98.90      146      152    96.05
MTH111         56       61     91.80      87       91     95.60      143      152    94.08
PHY111         55       61     90.16      82       91     90.11      137      152    90.13
PHY191         56       61     91.80      68       91     74.73      124      152    81.58
STA111         59       61     96.72      79       91     86.81      138      152    90.79
CHM122         51       60     85.00      71       86     82.56      122      146    83.56
CHM152         60       60     100.00     82       86     95.35      142      146    97.26
CMP122         53       60     88.33      80       86     93.02      133      146    91.10
GST112         52       60     86.67      86       86     100.00     138      146    94.52
MTH122         55       60     91.67      69       86     80.23      124      146    84.93
PHY132         57       60     95.00      71       86     82.56      128      146    87.67
PHY142         51       60     85.00      74       86     86.05      125      146    85.62
PHY192         56       60     93.33      81       86     94.19      137      146    93.84
16 courses     883      968               1246     1416              2129     2384   89.30
Total number of registered students: 298

Table A3.

CBT summary (the number of students who passed, 2014/2015 and 2015/2016 academic sessions).

Table A4 presents the average percentage pass rate (%) for each course under the two modes of test administration: PPT (the 2011/2012 to 2013/2014 sessions) and CBT (the 2014/2015 and 2015/2016 sessions).

Cumulative passed (%) by test mode

Course/code   PPT     CBT
CHM111        93.02   82.89
CHM151        95.35   95.39
CMP111        98.84   79.61
GST111        93.60   96.05
MTH111        94.19   94.08
PHY111        86.05   90.13
PHY191        88.37   81.58
STA111        97.67   90.79
CHM122        78.11   83.56
CHM152        94.67   97.26
CMP122        95.86   91.10
GST112        92.90   94.52
MTH122        94.67   84.93
PHY132        89.35   87.67
PHY142        78.70   85.62
PHY192        93.49   93.84
Total number of courses: 16

Table A4.

The average percentage pass rate for both the PPT and CBT.
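Since the methodology names the nonparametric sign test, the sketch below applies it to the paired PPT/CBT pass rates of Table A4: each course contributes the sign of its CBT minus PPT difference, and under H0 the count of “+” signs follows a Binomial(n, 0.5) distribution. This is our reading of how the test would be applied, written in Python with SciPy rather than the chapter’s SPSS.

```python
from scipy.stats import binomtest

# Signs of (CBT − PPT) per course, taken row by row from Table A4;
# '+' where the CBT pass rate exceeds the PPT pass rate (no ties occur).
signs = ["-", "+", "-", "+", "-", "+", "-", "-",
         "+", "+", "-", "+", "-", "-", "+", "+"]
n_plus = signs.count("+")  # 8 of 16 courses favour CBT

# Two-sided sign test via the exact binomial test.
result = binomtest(n_plus, n=len(signs), p=0.5)
print(f"p = {result.pvalue:.3f}")
# An even 8/8 split gives p = 1.0, far above 0.05: no evidence against H0,
# consistent with the t-test result reported in the chapter.
```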

See Table A5.

Mode of test (% pass rate)   N    Mean    Std. deviation   Std. error mean
CBT                          16   91.55   6.072            1.518
PPT                          16   89.31   5.690            1.423

Table A5.

Group statistics.

See Table A6.

t-Test for equality of means (% pass rate)

                              t       DF       Sig. (2-tailed)   Mean difference   Std. error difference   95% CI of the difference (lower, upper)
Equal variances assumed       1.076   30       0.290             2.239             2.080                   (−2.010, 6.487)
Equal variances not assumed   1.076   29.875   0.290             2.239             2.080                   (−2.011, 6.488)

Table A6.

Independent sample test.

In Table A6, “Sig. (2-tailed)” stands for the two-tailed probability value (P-value), the computed t statistic is reported under “t,” and the degrees of freedom are denoted by “DF.” Once more, in Table A5, “N” denotes the number of observations. As a result of the analysis, we accept the null hypothesis (H0), which states that there are no significant effects of CBT on students’ academic performance, and reject the alternative hypothesis (H1), which states that there are significant effects of CBT on students’ academic performance, because the probability value is greater than the level of significance (that is, a P-value of 0.290 > the 0.05 level of significance).

References

  1. Egbe CI, Agbo PA, Okwo FA, Agbo GC. Students’ perception of computer-based tests in the use of English programme in Nigerian universities. TechTrends. 2023;67:477-488
  2. Shobayo MA, Binuyo AO, Ogunmakin R, Olosunde GR. Perceived effectiveness of computer-based test (CBT) mode of examination among undergraduate students in South-Western Nigeria. International Journal of Education, Library and Information Communication Technology. 2022;1:1-12
  3. Khan MA, Vivek V, Khojah M, Nabi MK, Paul M, Minhaj SM. Learners’ perspective towards e-exams during COVID-19 outbreak: Evidence from higher educational institutions of India and Saudi Arabia. International Journal of Environmental Research and Public Health. 2021;18(12):6534
  4. Kingsley O, Unegbu PO, Atsenokhai B, Patani SJ. Administration, tertiary institutions, examinations, service delivery. BW Academic Journal. 2022;8:73-82
  5. Fernández-Martínez I, Orgilés M, Morales A, Espada JP, Essau CA. One-year follow-up effects of a cognitive behavior therapy-based transdiagnostic program for emotional problems in young children: A school-based cluster-randomized controlled trial. Journal of Affective Disorders. 2020;262:258-266
  6. Liz-Domínguez M, Caeiro-Rodríguez M, Llamas-Nistal M, Mikic-Fonte FA. Systematic literature review of predictive analysis tools in higher education. Applied Sciences. 2019;9(24):5569
  7. McClelland T, Cuevas J. A comparison of computer-based testing and paper and pencil testing in mathematics assessment. The Online Journal of New Horizons in Education. 2020;10(2):78-89
  8. Whiteside SPH, Sim LA, Morrow AS, Farah WH, Hilliker DR, Murad MH, et al. A meta-analysis to guide the enhancement of CBT for childhood anxiety: Exposure over anxiety management. Clinical Child and Family Psychology Review. 2020;23:102-121
  9. Gabriel KM, Yahaya A. Imperatives of computer base test (CBT) on performance of LIS students: A case study. Library Philosophy and Practice. 2018:1-13
  10. Rocca LHD, Zielinski S. Community-based tourism, social capital, and governance of post-conflict rural tourism destinations: The case of Minca, Sierra Nevada de Santa Marta, Colombia. Tourism Management Perspectives. 2022;43:100985
  11. Namirembe E. E-Learning in Universities in Uganda: Predictors of Successful Adoption. South Africa: OpenUCT University of Cape Town; 2019
  12. Dammas AH. Investigate students’ attitudes toward the computer based test (CBT) at chemistry course. Archives of Business Research. 2016;4(6):58-71
  13. Efendi R, Lesmana LS, Putra F, Yandani E, Wulandari RA. Design and implementation of computer based test (CBT) in vocational education. Journal of Physics: Conference Series. 2021:1-13
  14. Oduntan OE, Ojuawo O, Oduntan EA. A comparative analysis of student performance in paper pencil test (PPT) and computer-based test (CBT) examination system. Research Journal of Educational Studies and Review. 2015;1(1):24-29
  15. Yin J, Goh TT, Yang B, Xiaobin Y. Conversation technology with micro-learning: The impact of chatbot-based learning on students’ learning motivation and performance. Journal of Educational Computing Research. 2021;59(1):154-177
