Research Article

COVID-19’s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis

Roles Data curation, Formal analysis, Methodology, Writing – review & editing

¶ ‡ JZ and YD contributed equally to this work as first authors.

Affiliation School of Educational Information Technology, South China Normal University, Guangzhou, Guangdong, China

Roles Data curation, Formal analysis, Methodology, Writing – original draft

Affiliations School of Educational Information Technology, South China Normal University, Guangzhou, Guangdong, China, Hangzhou Zhongce Vocational School Qiantang, Hangzhou, Zhejiang, China

Roles Data curation, Writing – original draft

Roles Data curation

Roles Writing – original draft

Affiliation Faculty of Education, Shenzhen University, Shenzhen, Guangdong, China

Roles Conceptualization, Supervision, Writing – review & editing

* E-mail: [email protected] (JH); [email protected] (YZ)

  • Junyi Zhang, 
  • Yigang Ding, 
  • Xinru Yang, 
  • Jinping Zhong, 
  • XinXin Qiu, 
  • Zhishan Zou, 
  • Yujie Xu, 
  • Xiunan Jin, 
  • Xiaomin Wu, 

  • Published: August 23, 2022
  • https://doi.org/10.1371/journal.pone.0273016

Abstract

The COVID-19 outbreak brought online learning to the forefront of education. Scholars have conducted many studies on online learning during the pandemic, but only a few have performed quantitative comparative analyses of students’ online learning behavior before and after the outbreak. We collected review data from China’s massive open online course platform called icourse.163 and performed social network analysis on 15 courses to explore courses’ interaction characteristics before, during, and after the COVID-19 pandemic. Specifically, we focused on the following aspects: (1) variations in the scale of online learning amid COVID-19; (2a) the characteristics of online learning interaction during the pandemic; (2b) the characteristics of online learning interaction after the pandemic; and (3) differences in the interaction characteristics of social science courses and natural science courses. Results revealed that only a small number of courses witnessed an uptick in online interaction, suggesting that the pandemic’s role in promoting the scale of courses was not significant. During the pandemic, online learning interaction became more frequent among course network members whose interaction scale increased. After the pandemic, although the scale of interaction declined, online learning interaction became more effective. The scale and level of interaction in Electrodynamics (a natural science course) and Economics (a social science course) both rose during the pandemic. However, long after the pandemic, the Economics course sustained online interaction whereas interaction in the Electrodynamics course steadily declined. This discrepancy could be due to the unique characteristics of natural science courses and social science courses.

Citation: Zhang J, Ding Y, Yang X, Zhong J, Qiu X, Zou Z, et al. (2022) COVID-19’s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis. PLoS ONE 17(8): e0273016. https://doi.org/10.1371/journal.pone.0273016

Editor: Heng Luo, Central China Normal University, CHINA

Received: April 20, 2022; Accepted: July 29, 2022; Published: August 23, 2022

Copyright: © 2022 Zhang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: The data underlying the results presented in the study were downloaded from https://www.icourse163.org/ and are now shared fully on Github ( https://github.com/zjyzhangjunyi/dataset-from-icourse163-for-SNA ). These data have no private information and can be used for academic research free of charge.

Funding: The author(s) received no specific funding for this work.

Competing interests: The authors have declared that no competing interests exist.

1. Introduction

The development of the mobile internet has spurred rapid advances in online learning, offering novel prospects for teaching and learning and a learning experience completely different from traditional instruction. Online learning harnesses the advantages of network technology and multimedia technology to transcend the boundaries of conventional education [ 1 ]. Online courses have become a popular learning mode owing to their flexibility and openness. During online learning, teachers and students are in different physical locations but interact in multiple ways (e.g., via online forum discussions and asynchronous group discussions). An analysis of online learning therefore calls for attention to students’ participation. Alqurashi [ 2 ] defined interaction in online learning as the process of constructing meaningful information and thought exchanges between more than two people; such interaction typically occurs between teachers and learners, learners and learners, and the course content and learners.

Massive open online courses (MOOCs), a 21st-century teaching mode, have greatly influenced global education. Data released by China’s Ministry of Education in 2020 show that the country ranks first globally in the number and scale of higher education MOOCs. The COVID-19 outbreak has further propelled this learning mode, with universities being urged to leverage MOOCs and other online resource platforms to respond to the government’s “School’s Out, But Class’s On” policy [ 3 ]. Besides MOOCs, to reduce in-person gatherings and curb the spread of COVID-19, various online learning methods have since become ubiquitous [ 4 ]. Though Lederman asserted that the COVID-19 outbreak has positioned online learning technologies as the best way for teachers and students to obtain satisfactory learning experiences [ 5 ], it remains unclear whether the COVID-19 pandemic has encouraged interaction in online learning, as interactions between students and others play key roles in academic performance and largely determine the quality of learning experiences [ 6 ]. Similarly, it is also unclear what impact the COVID-19 pandemic has had on the scale of online learning.

Social constructivism paints learning as a social phenomenon. As such, analyzing the social structures or patterns that emerge during the learning process can shed light on learning-based interaction [ 7 ]. Social network analysis helps to explain how a social network, rooted in interactions between learners and their peers, guides individuals’ behavior, emotions, and outcomes. This analytical approach is especially useful for evaluating interactive relationships between network members [ 8 ]. Mohammed cited social network analysis (SNA) as a method that can provide timely information about students, learning communities, and interactive networks. SNA has been applied in numerous fields, including education, to identify the number and characteristics of interelement relationships. For example, Lee et al. used SNA to explore the effects of blogs on peer relationships [ 7 ]. Therefore, adopting SNA to examine interactions in online learning communities during the COVID-19 pandemic can uncover potential issues with this online learning model.

Taking China’s icourse.163 MOOC platform as an example, we chose 15 courses with a large number of participants for SNA, focusing on learners’ interaction characteristics before, during, and after the COVID-19 outbreak. We visually assessed changes in the scale of network interaction before, during, and after the outbreak along with the characteristics of interaction in Gephi. Examining students’ interactions in different courses revealed distinct interactive network characteristics, the pandemic’s impact on online courses, and relevant suggestions. Findings are expected to promote effective interaction and deep learning among students in addition to serving as a reference for the development of other online learning communities.

2. Literature review and research questions

Interaction is deemed central to the educational experience and is a major focus of research on online learning. Moore began to study the problem of interaction in distance education as early as 1989. He defined three core types of interaction: student–teacher, student–content, and student–student [ 9 ]. Lear et al. [ 10 ] described an interactivity/community-process model of distance education: they specifically discussed the relationships between interactivity, community awareness, and learner engagement, and found interactivity and community awareness to be correlated with learner engagement. Zulfikar et al. [ 11 ] suggested that discussions initiated by students encourage more engagement than discussions initiated by instructors. It is most important to afford learners opportunities to interact purposefully with teachers, and improving the quality of learner interaction is crucial to fostering profound learning [ 12 ]. Interaction is an important way for learners to communicate and share information, and a key factor in the quality of online learning [ 13 ].

Timely feedback is a main component of online learning interaction. Woo and Reeves discovered that students often become frustrated when they fail to receive prompt feedback [ 14 ]. Shelley et al. conducted a three-year study of graduate and undergraduate students’ satisfaction with online learning at universities and found that interaction with educators and students is the main factor affecting satisfaction [ 15 ]. Teachers therefore need to provide students with scoring justification, support, and constructive criticism during online learning. Some researchers have examined online learning during the COVID-19 pandemic. They found that most students preferred face-to-face learning over online learning due to obstacles faced online, such as a lack of motivation, limited teacher-student interaction, and a sense of isolation when learning in different times and spaces [ 16 , 17 ]. However, this sense of isolation can be reduced by enhancing online interaction between teachers and students [ 18 ].

Research showed that interactions contributed to maintaining students’ motivation to continue learning [ 19 ]. Baber argued that interaction played a key role in students’ academic performance and influenced the quality of the online learning experience [ 20 ]. Hodges et al. maintained that well-designed online instruction can lead to unique teaching experiences [ 21 ]. Banna et al. mentioned that using discussion boards, chat sessions, blogs, wikis, and other tools could promote student interaction and improve participation in online courses [ 22 ]. During the COVID-19 pandemic, Mahmood proposed a series of teaching strategies suitable for distance learning to improve its effectiveness [ 23 ]. Lapitan et al. devised an online strategy to ease the transition from traditional face-to-face instruction to online learning [ 24 ]. The preceding discussion suggests that online learning goes beyond simply providing learning resources; teachers should ideally design real-life activities to give learners more opportunities to participate.

As mentioned, COVID-19 has driven many scholars to explore the online learning environment. However, most have ignored the uniqueness of online learning during this time and have rarely compared pre- and post-pandemic online learning interaction. Taking China’s icourse.163 MOOC platform as an example, we chose 15 courses with a large number of participants for SNA, centering on student interaction before and after the pandemic. Gephi was used to visually analyze changes in the scale and characteristics of network interaction. The following questions were of particular interest:

  • (1) Can the COVID-19 pandemic promote the expansion of online learning?
  • (2a) What are the characteristics of online learning interaction during the pandemic?
  • (2b) What are the characteristics of online learning interaction after the pandemic?
  • (3) How do interaction characteristics differ between social science courses and natural science courses?

3. Methodology

3.1 Research context

We selected several courses with a large number of participants and extensive online interaction among hundreds of courses on the icourse.163 MOOC platform. These courses had been offered on the platform for at least three semesters, covering three periods (i.e., before, during, and after the COVID-19 outbreak). To eliminate the effects of shifts in irrelevant variables (e.g., course teaching activities), we chose several courses with similar teaching activities and compared them on multiple dimensions. All course content was taught online. The teachers of each course posted discussion threads related to learning topics; students were expected to reply via comments. Learners could exchange ideas freely in their responses in addition to asking questions and sharing their learning experiences. Teachers could answer students’ questions as well. Conversations in the comment area could partly compensate for a relative absence of online classroom interaction. Teacher–student interaction is conducive to the formation of a social network structure, enabling us to examine teachers’ and students’ learning behavior through SNA. The comment areas in these courses were intended for learners to construct knowledge via reciprocal communication. Meanwhile, by answering students’ questions, teachers could encourage them to reflect on their learning progress. These courses’ successive terms also spanned several phases of COVID-19, allowing us to ascertain the pandemic’s impact on online learning.

3.2 Data collection and preprocessing

To avoid interference from invalid or unclear data, the following criteria were applied to select representative courses: (1) generality (i.e., public courses and professional courses were chosen from different schools across China); (2) time validity (i.e., courses were held before, during, and after the pandemic); and (3) notability (i.e., each course had at least 2,000 participants). We ultimately chose 15 courses across the social sciences and natural sciences (see Table 1 ). Codes are used in Table 1 to represent course names.


https://doi.org/10.1371/journal.pone.0273016.t001

To discern courses’ evolution during the pandemic, we gathered data on three terms before, during, and after the COVID-19 outbreak in addition to obtaining data from two terms completed well before the pandemic and long after it. Our final dataset comprised five sets of interactive data. In total, we collected about 120,000 comments for SNA. Because each course had a different start time, in line with fluctuations in the number of confirmed COVID-19 cases in China and the opening dates of most colleges and universities, we divided our sample into five phases: well before the pandemic (Phase I); before the pandemic (Phase II); during the pandemic (Phase III); after the pandemic (Phase IV); and long after the pandemic (Phase V). We sought to preserve consistent time spans to balance the amount of data in each period ( Fig 1 ).


https://doi.org/10.1371/journal.pone.0273016.g001

3.3 Instrumentation

Participants’ comments and “thumbs-up” behavior data were converted into a network structure and compared using social network analysis (SNA). Network analysis, according to M’Chirgui, is an effective tool for clarifying network relationships by employing sophisticated techniques [ 25 ]. Specifically, SNA can help explain the underlying relationships among team members and provide a better understanding of their internal processes. Yang and Tang used SNA to discuss the relationship between team structure and team performance [ 26 ]. Golbeck argued that SNA could improve the understanding of students’ learning processes and reveal learners’ and teachers’ role dynamics [ 27 ].
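As a concrete illustration of this conversion, reply records can be loaded into a weighted directed graph with networkx. This is a minimal sketch: the record format and user names below are hypothetical, not drawn from the paper's dataset.

```python
import networkx as nx

# Hypothetical reply records: (replying user, replied-to user), one per comment.
replies = [
    ("stu_A", "teacher"), ("stu_B", "teacher"),
    ("stu_A", "stu_B"), ("stu_B", "stu_A"), ("stu_A", "stu_B"),
]

G = nx.DiGraph()
for src, dst in replies:
    if G.has_edge(src, dst):
        G[src][dst]["weight"] += 1  # repeated interaction strengthens the tie
    else:
        G.add_edge(src, dst, weight=1)

print(G.number_of_nodes(), G.number_of_edges())  # 3 4
```

A graph built this way can then be exported (e.g., with `nx.write_gexf`) and opened in Gephi, the visualization tool the authors used.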

To analyze Question (1), the number of nodes and the diameter of the generated network were taken as indicators of changes in network size. Social networks are typically represented as graphs with nodes and edges, and the node count indicates the sample size [ 15 ]. Wellman et al. proposed that the larger the network scale, the greater the number of network members providing emotional support, goods, services, and companionship [ 28 ]. Jan’s study measured network size by counting the nodes, which represented students, lecturers, and tutors [ 29 ]. Similarly, network nodes in the present study indicated how many learners and teachers participated in the course, with more nodes indicating more participants. Furthermore, we investigated the network diameter, a structural feature of social networks and a common metric for measuring network size in SNA [ 30 ]. The network diameter refers to the longest shortest path between any two nodes in the network. There is evidence that a larger network diameter leads to a greater spread of behavior [ 31 ]. Likewise, Gašević et al. found that larger networks were more likely to spread innovative ideas about educational technology when analyzing MOOC-related research citations [ 32 ]. Therefore, we employed node count and network diameter to measure the network’s spatial size and further explore the expansion characteristics of online courses. These indicators are briefly summarized in Table 2 .
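Both scale indicators can be computed directly in networkx; the toy edge list below is illustrative only, standing in for one course's interaction graph.

```python
import networkx as nx

# Toy undirected interaction graph standing in for one course (illustrative).
G = nx.Graph([(1, 2), (2, 3), (3, 4), (2, 5)])

n_nodes = G.number_of_nodes()  # number of participating members
# The diameter is only defined on a connected graph; real comment networks
# may be fragmented, so the largest connected component is a common fallback.
core = G.subgraph(max(nx.connected_components(G), key=len))
diameter = nx.diameter(core)

print(n_nodes, diameter)  # 5 3
```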


https://doi.org/10.1371/journal.pone.0273016.t002

To address Question (2), we introduced a set of SNA interaction metrics to scrutinize learners’ interaction characteristics in online learning during and after the pandemic, as shown below:

  • (1) The average degree reflects the density of the network by calculating the average number of connections for each node. As Rong and Xu suggested, the average degree of a network indicates how active its participants are [ 33 ]. According to Hu, a higher average degree implies that more students are interacting directly with each other in a learning context [ 34 ]. The present study inherited the concept of the average degree from these previous studies: the higher the average degree, the more frequent the interaction between individuals in the network.
  • (2) The weighted average degree of a network is essentially calculated by multiplying each degree by its respective weight and then taking the average. Bydžovská took the strength of the relationship into account when determining the weighted average degree [ 35 ]. By calculating friendship’s weighted value, Maroulis assessed peer achievement within a small-school reform [ 36 ]. Accordingly, we considered the number of interactions as the weight of the degree, with a higher weighted average degree indicating more active interaction among learners.
  • (3) Network density is the ratio between actual connections and potential connections in a network. The more connections group members have with each other, the higher the network density. In SNA, network density is similar to group cohesion, i.e., a network of more strong relationships is more cohesive [ 37 ]. Network density also reflects how much all members are connected together [ 38 ]. Therefore, we adopted network density to indicate the closeness among network members. Higher network density indicates more frequent interaction and closer communication among students.
  • (4) Clustering coefficient describes local network attributes and indicates that two nodes in the network could be connected through adjacent nodes. The clustering coefficient measures users’ tendency to gather (cluster) with others in the network: the higher the clustering coefficient, the more frequently users communicate with other group members. We regarded this indicator as a reflection of the cohesiveness of the group [ 39 ].
  • (5) In a network, the average path length is the average number of steps along the shortest paths between all pairs of nodes. Oliveres has observed that when the average path length is small, the route from one node to another is shorter when graphed [ 40 ]; this is especially true in educational settings, where students tend to become closer friends. We therefore consider that the smaller the average path length, the greater the possibility of interaction between individuals in the network.
  • (6) A network that has a large number of nodes but a surprisingly small average path length is said to exhibit the small-world effect [ 41 ]. A higher clustering coefficient and a shorter average path length are important indicators of a small-world network: a shorter average path length enables the network to spread information faster and more accurately, while a higher clustering coefficient promotes frequent knowledge exchange within the group and boosts the timeliness and accuracy of knowledge dissemination [ 42 ]. These indicators are briefly summarized in Table 3 .
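All six indicators above are available in networkx; the weighted graph below is made up for illustration. For the small-world property, networkx also offers `nx.sigma`, which formally compares a graph's clustering and path length against random graphs, though it is costly on large networks.

```python
import networkx as nx

# Illustrative weighted interaction graph (not the paper's data).
G = nx.Graph()
G.add_weighted_edges_from([
    (1, 2, 3), (2, 3, 1), (1, 3, 2), (3, 4, 1), (4, 5, 2), (3, 5, 1),
])
n = G.number_of_nodes()

avg_degree = sum(d for _, d in G.degree()) / n                          # (1)
weighted_avg_degree = sum(d for _, d in G.degree(weight="weight")) / n  # (2)
density = nx.density(G)                                                 # (3)
clustering = nx.average_clustering(G)                                   # (4)
avg_path_len = nx.average_shortest_path_length(G)                       # (5)
# (6) Small-world tendency: high clustering combined with a short average
# path length; nx.sigma(G) gives a formal random-graph comparison.

print(avg_degree, weighted_avg_degree, density)  # 2.4 4.0 0.6
```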


https://doi.org/10.1371/journal.pone.0273016.t003

To analyze Question (3), we used closeness centrality, which measures how close a vertex is to the others in the network. As Opsahl et al. explained, closeness centrality reveals how closely actors are coupled with their entire social network [ 43 ]. In analyzing social network-based engineering education, Putnik et al. examined closeness centrality and found that it was significantly correlated with grades [ 38 ]. We used closeness centrality to measure an individual’s position in the network. This indicator is briefly summarized in Table 4 .
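Closeness centrality is likewise built into networkx; a toy path graph (illustrative only) shows why central members score higher than peripheral ones.

```python
import networkx as nx

G = nx.path_graph(4)  # members 0-1-2-3 arranged in a line
cc = nx.closeness_centrality(G)
# Members in the middle of the network reach everyone in fewer steps,
# so their closeness centrality exceeds that of peripheral members.
print(cc[1] > cc[0])  # True
```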


https://doi.org/10.1371/journal.pone.0273016.t004

3.4 Ethics statement

This study was approved by the Academic Committee Office (ACO) of South China Normal University ( http://fzghb.scnu.edu.cn/ ), Guangzhou, China. Research data were collected from the open platform and analyzed anonymously. There are thus no privacy issues involved in this study.

4. Results

4.1 COVID-19’s role in promoting the scale of online courses was not as important as expected

As shown in Fig 2 , the number of course participants and nodes is closely correlated with the pandemic’s trajectory. Because the number of participants in each course varied widely, we normalized the numbers of participants and nodes to more conveniently visualize course trends. Fig 2 depicts changes in the chosen courses’ numbers of participants and nodes before the pandemic (Phase II), during the pandemic (Phase III), and after the pandemic (Phase IV). The number of participants in most courses during the pandemic exceeded those before and after the pandemic. However, the number of people participating in interaction did not increase in some courses.
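The paper does not specify the normalization used for Fig 2; a plain min-max rescaling (an assumption on our part) would map each course's per-phase counts onto a common 0-1 range like so:

```python
# Min-max normalization: maps a course's per-phase counts onto [0, 1] so that
# courses with very different enrolments can be plotted on one scale.
def min_max(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical participant counts for Phases II, III, and IV of one course.
print(min_max([2100, 5800, 3400]))  # [0.0, 1.0, 0.351...]
```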


https://doi.org/10.1371/journal.pone.0273016.g002

To better analyze the trend in online courses’ interaction scale before, during, and after the pandemic, the selected courses were categorized according to their change in scale. When the number of participants increased (decreased) by more than 20% (an empirically chosen threshold) and the diameter also increased (decreased), the course’s interaction scale was deemed to have increased (decreased); otherwise, no significant change was identified in the course’s interaction scale. Courses were subsequently divided into three categories: increased interaction scale, decreased interaction scale, and no significant change. Results appear in Table 5 .
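The classification rule can be written out explicitly. This is a sketch: the 20% threshold and the two conditions come from the text, while the function and argument names are ours.

```python
# Classify a course's change in interaction scale between two phases, based on
# the participant count and the network diameter, as described in the text.
def classify(n_before, n_after, diam_before, diam_after):
    change = (n_after - n_before) / n_before
    if change > 0.20 and diam_after > diam_before:
        return "increased"
    if change < -0.20 and diam_after < diam_before:
        return "decreased"
    return "no significant change"

print(classify(2000, 2600, 5, 6))  # increased: +30% participants, larger diameter
print(classify(2000, 2200, 5, 6))  # no significant change: only +10%
```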


https://doi.org/10.1371/journal.pone.0273016.t005

From before the pandemic (Phase II) to its outbreak (Phase III), the interaction scale of five courses increased, accounting for 33.3% of the full sample; one course’s interaction scale declined, accounting for 6.7%; and the interaction scale of nine courses showed no significant change, accounting for 60%. The pandemic’s role in promoting online courses was thus not as important as anticipated, and most courses’ interaction scale did not change significantly throughout.

No courses displayed a growing interaction scale after the pandemic: the interaction scale of nine courses fell, accounting for 60%, while that of six courses did not shift significantly, accounting for 40%. Courses whose interaction scale increased during the pandemic did not maintain an upward trend. On the contrary, as the pandemic situation improved, learners’ enthusiasm for online learning waned. We next analyzed several interaction metrics to further explore course interaction during different pandemic periods.

4.2 Characteristics of online learning interaction amid COVID-19

4.2.1 During the COVID-19 pandemic, online learning interaction in some courses became more active.

Fig 3 presents the changes in indicators for the courses whose interaction scale grew during the pandemic (SS5, SS6, NS1, NS3, and NS8). The horizontal axis indicates the number of courses, with red representing a rise and blue a decline in the indicator named on the vertical axis.


https://doi.org/10.1371/journal.pone.0273016.g003

Specifically: (1) The average degree and weighted average degree of the five course networks demonstrated an upward trend. The emergence of the pandemic promoted students’ enthusiasm, and learners were more active in the interactive network. (2) Fig 3 shows that three courses had increased network density and two had decreased. The higher the network density, the more communication within the team. Even though the pandemic increased the interaction scale and frequency, the closeness between learners in some courses did not improve. (3) The clustering coefficient of social science courses rose, whereas the clustering coefficient and small-world property of natural science courses fell. The higher the clustering coefficient and the small-world property, the better the relationship between adjacent nodes and the higher the cohesion [ 39 ]. (4) Most courses’ average path length increased as the interaction scale increased. A longer average path length can have adverse effects: communication between learners might be limited to a small group without multi-directional interaction.

When the pandemic emerged, the only course whose network scale declined was a natural science course (NS2). The change in each course index is pictured in Fig 4 . The abscissa indicates the size of the value, with larger values to the right. The red dot indicates the index value before the pandemic; the blue dot indicates its value during the pandemic. If the blue dot is to the right of the red dot, the value of the index increased; otherwise, it declined. Only the weighted average degree of the course network increased. The average degree and network density decreased, indicating that network members were not active and that learners’ interaction degree and communication frequency lessened. Despite reduced learner interaction, the average path length was small and the connectivity between learners was adequate.


https://doi.org/10.1371/journal.pone.0273016.g004

4.2.2 After the COVID-19 pandemic, the scale decreased rapidly, but most course interaction was more effective.

Fig 5 shows the changes in interaction indicators for the courses whose interaction scale decreased after the pandemic (SS1, SS2, SS3, SS6, SS7, NS2, NS3, NS7, and NS8).


https://doi.org/10.1371/journal.pone.0273016.g005

Specifically: (1) The average degree and weighted average degree of most course networks decreased. The scope and intensity of interaction among network members declined rapidly, as did learners’ enthusiasm for communication. (2) The network density of seven courses also fell, indicating weaker connections between learners in most courses. (3) In addition, the clustering coefficient and small-world property of most course networks decreased, suggesting little possibility of small groups in the network. The scope of interaction between learners was not limited to a specific space, and the interaction objects had no significant tendencies. (4) Although the scale of course interaction became smaller in this phase, the average path length of members’ social networks shortened in nine courses. A shorter average path length would expedite the spread of information within the network as well as communication and sharing among network members.

Fig 6 displays the evolution of interaction indicators for the courses whose interaction scale did not change significantly after the pandemic (SS4, SS5, NS1, NS4, NS5, and NS6).


https://doi.org/10.1371/journal.pone.0273016.g006

Specifically: (1) Some course members’ social networks exhibited an increase in the average degree and weighted average degree. In these cases, even though the course network’s scale did not continue to increase, communication among network members rose and interaction became more frequent and deeper than before. (2) Network density and average path length both reflect how tightly knit a social network is: the greater the network density, the denser the social network; the shorter the average path length, the more concentrated the communication among network members. At this phase, the average path length and network density in most courses had increased. Yet the network density remained small despite having risen ( Table 6 ). Even with more frequent learner interaction, connections remained distant and the social network was comparatively sparse.


https://doi.org/10.1371/journal.pone.0273016.t006

In summary, the scale of interaction did not change significantly overall. Nonetheless, some course members’ frequency and extent of interaction increased, and the relationships between network members became closer as well. Interestingly, the interaction scale of Economics (a social science course) and Electrodynamics (a natural science course) expanded rapidly during the pandemic, and both courses retained their interaction scale thereafter. We next assessed these two courses to determine whether their level of interaction persisted after the pandemic.

4.3 Analyses of natural science courses and social science courses

4.3.1 Analyses of the interaction characteristics of Economics and Electrodynamics.

Economics and Electrodynamics are a social science course and a natural science course, respectively. Members’ interaction within these courses was similar: the interaction scale increased significantly when COVID-19 broke out (Phase III), and no significant changes emerged after the pandemic (Phase V). We hence focused on course interaction long after the outbreak (Phase V) and compared changes across multiple indicators, as listed in Table 7 .


https://doi.org/10.1371/journal.pone.0273016.t007

As the pandemic situation continued to improve, the number of participants and the diameter long after the outbreak (Phase V) each declined for Economics compared with after the pandemic (Phase IV). The interaction scale decreased, but the interaction between learners was much deeper. Specifically: (1) The weighted average degree, network density, clustering coefficient, and small-world property each reflected upward trends. The pandemic therefore exerted a strong impact on this course, and interaction was well maintained even after the pandemic. The smaller network scale promoted members’ interaction and communication. (2) Compared with after the pandemic (Phase IV), members’ network density increased significantly, showing that relationships between learners were closer and that cohesion was improving. (3) At the same time, as the clustering coefficient and small-world property grew, network members demonstrated strong small-group characteristics: the communication between them was deepening and their enthusiasm for interaction was higher. (4) Long after the COVID-19 outbreak (Phase V), the average path length was shorter than in previous terms, knowledge flowed more quickly among network members, and the degree of interaction gradually deepened.

The average degree, weighted average degree, network density, clustering coefficient, and small-world property of Electrodynamics all decreased long after the COVID-19 outbreak (Phase V) and were lower than during the outbreak (Phase III). The level of learner interaction therefore gradually declined long after the outbreak (Phase V), and connections between learners were no longer active. Although the pandemic increased course members’ extent of interaction, this rise was merely temporary: students’ enthusiasm for learning waned rapidly and their interaction decreased after the pandemic (Phase IV). To further analyze the interaction characteristics of course members in Economics and Electrodynamics, we evaluated the closeness centrality of their social networks, as shown in section 4.3.2.
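The indicators compared above (network density, average degree, clustering coefficient, average path length) can all be derived directly from a forum interaction graph. The following pure-Python sketch illustrates the calculations on a small hypothetical network of five course members; the member names and edges are invented for illustration and are not drawn from the icourse.163 data, and the original analysis may well have used a dedicated SNA tool instead.

```python
from collections import deque
from itertools import combinations

# Toy undirected "reply" network: an edge means two course members
# interacted in the forum at least once (hypothetical data).
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e"), ("e", "a")]

nodes = sorted({u for e in edges for u in e})
adj = {v: set() for v in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

n = len(nodes)

# Network density: observed edges divided by possible edges.
density = len(edges) / (n * (n - 1) / 2)

# Average degree: mean number of interaction partners per member.
avg_degree = sum(len(adj[v]) for v in nodes) / n

# Mean clustering coefficient: average of each member's local clustering.
def local_clustering(v):
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for x, y in combinations(nbrs, 2) if y in adj[x])
    return 2 * links / (k * (k - 1))

clustering = sum(local_clustering(v) for v in nodes) / n

# Average shortest-path length via BFS from every node.
def bfs_lengths(src):
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

total, pairs = 0, 0
for v in nodes:
    for u, d in bfs_lengths(v).items():
        if u != v:
            total += d
            pairs += 1
avg_path_length = total / pairs
```

On this toy network the density is 0.6 and the mean clustering coefficient is about 0.33; comparing such values across semesters (Phases I through V) is the kind of contrast reported in Table 7.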

4.3.2 Analysis of the closeness centrality of Economics and Electrodynamics.

The change in the closeness centrality of the social network in Economics was small, and no sharp upward trend appeared during the pandemic outbreak, as shown in Fig 7. The emergence of COVID-19 apparently fostered learners’ interaction in Economics, albeit without a significant impact. The pattern of closeness centrality in Electrodynamics differed from that of Economics: upon the COVID-19 outbreak, closeness centrality was significantly different from other semesters; communication between learners was closer and interaction was more effective. However, Electrodynamics course members’ network proximity decreased rapidly after the pandemic, and learners’ communication lessened. In general, the Economics course showed better interaction before the outbreak and was less affected by the pandemic, whereas the Electrodynamics course was more affected by the pandemic and showed different interaction characteristics at different periods of the pandemic.


(Note: "****" indicates a significant difference in closeness centrality between the two periods; otherwise, there was no significant difference.)

https://doi.org/10.1371/journal.pone.0273016.g007
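Closeness centrality, the measure used in section 4.3.2, captures how near a member sits to everyone else in the network: the standard formula is (n − 1) divided by the sum of shortest-path distances from that member. A minimal sketch follows, again on hypothetical data rather than the study’s networks.

```python
from collections import deque

# Toy forum network (hypothetical; not the study's data).
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def closeness(v):
    # BFS distances from v; standard closeness = (n - 1) / sum of distances.
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    total = sum(dist.values())
    return (len(adj) - 1) / total if total else 0.0
```

Here member "c" bridges the two ends of the chain-like network and so has the highest closeness (0.8); members with high closeness are the kind the analysis identifies as central to sustaining interaction.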

5. Discussion

We referred to discussion forums from several courses on the icourse.163 MOOC platform to compare online learning before, during, and after the COVID-19 pandemic via SNA and to delineate the pandemic’s effects on online courses. Only 33.3% of courses in our sample increased in terms of interaction during the pandemic; the scale of interaction did not rise in any courses thereafter. When the course scale rose, the scope and frequency of interaction showed upward trends during the pandemic, and the clustering coefficient of natural science courses and social science courses differed: the coefficient for social science courses tended to rise whereas that for natural science courses generally declined. When the pandemic broke out, the interaction scale of a single natural science course decreased along with its interaction scope and frequency. The amount of interaction in most courses shrank rapidly during the pandemic, and network members were not as active as they had been before. However, after the pandemic, some courses saw declining interaction but greater communication between members; interaction also became more frequent and deeper than before.

5.1 During the COVID-19 pandemic, the scale of interaction increased in only a few courses

The pandemic outbreak led to a rapid increase in the number of participants in most courses; however, the change in network scale was not significant. The scale of online interaction expanded swiftly in only a few courses; in others, the scale either did not change significantly or displayed a downward trend. After the pandemic, the interaction scale in most courses decreased quickly, as did communication between network members. Learners’ enthusiasm for online interaction declined as the circumstances of the pandemic improved, potentially because, during the pandemic, China’s Ministry of Education declared the “School’s Out, But Class’s On” policy. Major colleges and universities were encouraged to use the Internet and informational resources to provide learning support, hence the sudden increase in the number of participants and interaction in online courses [46]. After the pandemic, students’ enthusiasm for online learning gradually weakened, presumably due to easing of the pandemic [47]. More activities also transitioned from online to offline, which tempered learners’ online discussion. Research has shown that long-term online learning can even bore students [48].

Most courses’ interaction scale decreased significantly after the pandemic. First, teachers and students occupied separate spaces during the outbreak, had few opportunities for mutual cooperation and friendship, and lacked a sense of belonging [ 49 ]. Students’ enthusiasm for learning dissipated over time [ 50 ]. Second, some teachers were especially concerned about adapting in-person instructional materials for digital platforms; their pedagogical methods were ineffective, and they did not provide learning activities germane to student interaction [ 51 ]. Third, although teachers and students in remote areas were actively engaged in online learning, some students could not continue to participate in distance learning due to inadequate technology later in the outbreak [ 52 ].

5.2 Characteristics of online learning interaction during and after the COVID-19 pandemic

5.2.1 During the COVID-19 pandemic, online interaction in most courses did not change significantly.

The interaction scale of only a few courses increased during the pandemic. The interaction scope and frequency of these courses climbed as well. Yet even as the degree of network interaction rose, course network density did not expand in all cases. The pandemic sparked a surge in the number of online learners and a rapid increase in network scale, but students found it difficult to interact with all learners. Yau pointed out that a greater network scale did not enrich the range of interaction between individuals; rather, the number of individuals who could interact directly was limited [ 53 ]. The internet facilitates interpersonal communication. However, not everyone has the time or ability to establish close ties with others [ 54 ].

In addition, social science courses and natural science courses in our sample revealed disparate trends in this regard: the clustering coefficient of social science courses increased and that of natural science courses decreased. Social science courses usually employ learning approaches distinct from those in natural science courses [ 55 ]. Social science courses emphasize critical and innovative thinking along with personal expression [ 56 ]. Natural science courses focus on practical skills, methods, and principles [ 57 ]. Therefore, the content of social science courses can spur large-scale discussion among learners. Some course evaluations indicated that the course content design was suboptimal as well: teachers paid close attention to knowledge transmission and much less to piquing students’ interest in learning. In addition, the thread topics that teachers posted were scarcely diversified and teachers’ questions lacked openness. These attributes could not spark active discussion among learners.

5.2.2 Online learning interaction declined after the COVID-19 pandemic.

Most courses’ interaction scale and intensity decreased rapidly after the pandemic, but some did not change. Courses with a larger network scale did not continue to expand after the outbreak, and students’ enthusiasm for learning faded. The pandemic’s reduced severity also influenced the number of participants in online courses. Meanwhile, restored school order moved many learning activities from virtual to in-person spaces. Face-to-face learning gradually replaced online learning, resulting in lower enrollment and less interaction in online courses. Prolonged online courses could also have led students to feel lonely and to lack a sense of belonging [58].

The scale of interaction in some courses did not change substantially after the pandemic yet learners’ connections became tighter. We hence recommend that teachers seize pandemic-related opportunities to design suitable activities. Additionally, instructors should promote student-teacher and student-student interaction, encourage students to actively participate online, and generally intensify the impact of online learning.

5.3 What are the characteristics of interaction in social science courses and natural science courses?

The level of interaction in Economics (a social science course) was significantly higher than that in Electrodynamics (a natural science course), and the small-world property in Economics increased as well. To boost online courses’ learning-related impacts, teachers can divide learners into groups based on the clustering coefficient and the average path length. Small groups can benefit teaching in several ways: members are more likely to participate actively in activities intended to expand their knowledge, and key actors can emerge to anchor each group. Cultivating students’ keenness to participate in class activities and their self-management can also help teachers guide learner interaction and foster deep knowledge construction.
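One crude way to operationalize this grouping is to keep only ties embedded in triangles (the building blocks of high clustering) and treat the resulting connected components as candidate small groups, nominating the best-connected member of each as a key actor. The sketch below is an illustrative heuristic on invented data, not a procedure taken from the paper.

```python
from collections import deque

# Toy network (hypothetical): two dense clusters joined by one weak tie.
edges = [("a", "b"), ("b", "c"), ("a", "c"),   # triangle 1
         ("d", "e"), ("e", "f"), ("d", "f"),   # triangle 2
         ("c", "d")]                           # weak bridge

adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# Keep only edges embedded in a triangle (a shared neighbour exists).
strong = {(u, v) for u, v in edges if adj[u] & adj[v]}

# Connected components over the strong edges = candidate small groups.
strong_adj = {}
for u, v in strong:
    strong_adj.setdefault(u, set()).add(v)
    strong_adj.setdefault(v, set()).add(u)

def component(src, seen):
    comp, q = {src}, deque([src])
    seen.add(src)
    while q:
        u = q.popleft()
        for w in strong_adj.get(u, ()):
            if w not in seen:
                seen.add(w)
                comp.add(w)
                q.append(w)
    return comp

seen, groups = set(), []
for v in strong_adj:
    if v not in seen:
        groups.append(component(v, seen))

# Key actor per group: the member with the most ties overall.
key_actors = {max(g, key=lambda v: len(adj[v])) for g in groups}
```

On the toy data this recovers the two triangles as small groups and nominates "c" and "d", the members who also hold the bridging tie between the groups.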

As evidenced by comments posted in the Electrodynamics course, we observed less interaction between students. Teachers also rarely urged students to contribute to conversations. These trends may have arisen because teachers and students were in different spaces. Teachers might have struggled to discern students’ interaction status. Teachers could also have failed to intervene in time, to design online learning activities that piqued learners’ interest, and to employ sound interactive theme planning and guidance. Teachers are often active in traditional classroom settings. Their roles are comparatively weakened online, such that they possess less control over instruction [ 59 ]. Online instruction also requires a stronger hand in learning: teachers should play a leading role in regulating network members’ interactive communication [ 60 ]. Teachers can guide learners to participate, help learners establish social networks, and heighten students’ interest in learning [ 61 ]. Teachers should attend to core members in online learning while also considering edge members; by doing so, all network members can be driven to share their knowledge and become more engaged. Finally, teachers and assistant teachers should help learners develop knowledge, exchange topic-related ideas, pose relevant questions during course discussions, and craft activities that enable learners to interact online [ 62 ]. These tactics can improve the effectiveness of online learning.

As described, network members displayed distinct interaction behavior in Economics and Electrodynamics courses. First, these courses varied in their difficulty: the social science course seemed easier to understand and focused on divergent thinking. Learners were often willing to express their views in comments and to ponder others’ perspectives [ 63 ]. The natural science course seemed more demanding and was oriented around logical thinking and skills [ 64 ]. Second, courses’ content differed. In general, social science courses favor the acquisition of declarative knowledge and creative knowledge compared with natural science courses. Social science courses also entertain open questions [ 65 ]. Natural science courses revolve around principle knowledge, strategic knowledge, and transfer knowledge [ 66 ]. Problems in these courses are normally more complicated than those in social science courses. Third, the indicators affecting students’ attitudes toward learning were unique. Guo et al. discovered that “teacher feedback” most strongly influenced students’ attitudes towards learning social science courses but had less impact on students in natural science courses [ 67 ]. Therefore, learners in social science courses likely expect more feedback from teachers and greater interaction with others.

6. Conclusion and future work

Our findings show that the network interaction scale of some online courses expanded during the COVID-19 pandemic. The network scale of most courses did not change significantly, demonstrating that the pandemic did not notably alter the scale of course interaction. Online learning interaction among course network members whose interaction scale increased also became more frequent during the pandemic. Once the outbreak was under control, although the scale of interaction declined, the level and scope of some courses’ interactive networks continued to rise; interaction was thus particularly effective in these cases. Overall, the pandemic appeared to have a relatively positive impact on online learning interaction. We considered a pair of courses in detail and found that Economics (a social science course) fared much better than Electrodynamics (a natural science course) in classroom interaction; learners were more willing to partake in class activities, perhaps due to these courses’ unique characteristics. Brint et al. came to similar conclusions [57].

This study was intended to be rigorous. Even so, several constraints can be addressed in future work. The first limitation involves our sample: we focused on a select set of courses hosted on China’s icourse.163 MOOC platform. Future studies should involve an expansive collection of courses to provide a more holistic understanding of how the pandemic has influenced online interaction. Second, we only explored the interactive relationship between learners and did not analyze interactive content. More in-depth content analysis should be carried out in subsequent research. All in all, the emergence of COVID-19 has provided a new path for online learning and has reshaped the distance learning landscape. To cope with associated challenges, educational practitioners will need to continue innovating in online instructional design, strengthen related pedagogy, optimize online learning conditions, and bolster teachers’ and students’ competence in online learning.

  • 30. Serrat O. Social network analysis. Knowledge solutions: Springer; 2017. p. 39–43. https://doi.org/10.1007/978-981-10-0983-9_9
  • 33. Rong Y, Xu E, editors. Strategies for the Management of the Government Affairs Microblogs in China Based on the SNA of Fifty Government Affairs Microblogs in Beijing. 14th International Conference on Service Systems and Service Management 2017.
  • 34. Hu X, Chu S, editors. A comparison on using social media in a professional experience course. International Conference on Social Media and Society; 2013.
  • 35. Bydžovská H. A Comparative Analysis of Techniques for Predicting Student Performance. Proceedings of the 9th International Conference on Educational Data Mining; Raleigh, NC, USA: International Educational Data Mining Society; 2016. p. 306–311.
  • 40. Olivares D, Adesope O, Hundhausen C, et al., editors. Using social network analysis to measure the effect of learning analytics in computing education. 19th IEEE International Conference on Advanced Learning Technologies 2019.
  • 41. Travers J, Milgram S. An experimental study of the small world problem. Social Networks: Elsevier; 1977. p. 179–197. https://doi.org/10.1016/B978-0-12-442450-0.50018-3
  • 43. Okamoto K, Chen W, Li X-Y, editors. Ranking of closeness centrality for large-scale social networks. International workshop on frontiers in algorithmics; 2008; Springer, Berlin, Heidelberg: Springer.
  • 47. Ding Y, Yang X, Zheng Y, editors. COVID-19’s Effects on the Scope, Effectiveness, and Roles of Teachers in Online Learning Based on Social Network Analysis: A Case Study. International Conference on Blended Learning; 2021: Springer.
  • 64. Boys C, Brennan J, Henkel M, Kirkland J, Kogan M, Youl P. Higher Education and Preparation for Work. Jessica Kingsley Publishers; 1988. https://doi.org/10.1080/03075079612331381467

Wiley - PMC COVID-19 Collection

Online and face‐to‐face learning: Evidence from students’ performance during the Covid‐19 pandemic

Carolyn Chisadza

1 Department of Economics, University of Pretoria, Hatfield South Africa

Matthew Clance

Thulani Mthembu

2 Department of Education Innovation, University of Pretoria, Hatfield South Africa

Nicky Nicholls

Eleni Yitbarek

This study investigates the factors that predict students' performance after transitioning from face‐to‐face to online learning as a result of the Covid‐19 pandemic. It uses students' responses from survey questions and the difference in the average assessment grades between pre‐lockdown and post‐lockdown at a South African university. We find that students' performance was positively associated with good wifi access, relative to using mobile internet data. We also observe lower academic performance for students who found transitioning to online difficult and who expressed a preference for self‐study (i.e. reading through class slides and notes) over assisted study (i.e. joining live lectures or watching recorded lectures). The findings suggest that improving digital infrastructure and reducing the cost of internet access may be necessary for mitigating the impact of the Covid‐19 pandemic on education outcomes.

1. INTRODUCTION

The Covid‐19 pandemic has been a wake‐up call to many countries regarding their capacity to cater for mass online education. This situation has been further complicated in developing countries, such as South Africa, which lack digital infrastructure for the majority of the population. The extended lockdown in South Africa saw most of the universities with mainly in‐person teaching scrambling to source hardware (e.g. laptops, internet access), software (e.g. Microsoft packages, data analysis packages) and internet data for disadvantaged students in order for the semester to recommence. Not only has the pandemic revealed the already stark inequality within the tertiary student population, but it has also revealed that high internet data costs in South Africa may perpetuate this inequality, making online education relatively inaccessible for disadvantaged students. 1

The lockdown in South Africa made it possible to investigate the changes in second‐year students’ performance in the Economics department at the University of Pretoria. In particular, we are interested in assessing what factors predict changes in students’ performance after transitioning from face‐to‐face (F2F) to online learning. Our main objectives in answering this research question are to establish what study materials the students were able to access (i.e. slides, recordings, or live sessions) and how students got access to these materials (i.e. the infrastructure they used).

The benefits of education on economic development are well established in the literature (Gyimah‐Brempong,  2011 ), ranging from health awareness (Glick et al.,  2009 ), improved technological innovations, to increased capacity development and employment opportunities for the youth (Anyanwu,  2013 ; Emediegwu,  2021 ). One of the ways in which inequality is perpetuated in South Africa, and Africa as a whole, is through access to education (Anyanwu,  2016 ; Coetzee,  2014 ; Tchamyou et al.,  2019 ); therefore, understanding the obstacles that students face in transitioning to online learning can be helpful in ensuring more equal access to education.

Using students' responses from survey questions and the difference in the average grades between pre‐lockdown and post‐lockdown, our findings indicate that students' performance in the online setting was positively associated with better internet access. Accessing assisted study material, such as narrated slides or recordings of the online lectures, also helped students. We also find lower academic performance for students who reported finding transitioning to online difficult and for those who expressed a preference for self‐study (i.e. reading through class slides and notes) over assisted study (i.e. joining live lectures or watching recorded lectures). The average grades between pre‐lockdown and post‐lockdown were about two points and three points lower for those who reported transitioning to online teaching difficult and for those who indicated a preference for self‐study, respectively. The findings suggest that improving the quality of internet infrastructure and providing assisted learning can be beneficial in reducing the adverse effects of the Covid‐19 pandemic on learning outcomes.

Our study contributes to the literature by examining the changes between students’ online (post‐lockdown) performance and their F2F (pre‐lockdown) performance. This approach differs from previous studies that, in most cases, use between‐subject designs where one group of students following online learning is compared to a different group of students attending F2F lectures (Almatra et al., 2015; Brown & Liedholm, 2002). That approach has a limitation in that there may be unobserved characteristics unique to students choosing online learning that differ from those choosing F2F lectures. Our approach avoids this issue because we use a within‐subject design: we compare the performance of the same students who followed F2F learning before lockdown and moved to online learning during lockdown due to the Covid‐19 pandemic. Moreover, the study contributes to the limited literature that compares F2F and online learning in developing countries.

Several studies that have compared the effectiveness of online learning and F2F classes encounter methodological weaknesses, such as small samples, not controlling for demographic characteristics, and substantial differences in course materials and assessments between online and F2F contexts. To address these shortcomings, our study is based on a relatively large sample of students and includes demographic characteristics such as age, gender, and perceived family income classification. The lecturer and course materials also remained similar in the online and F2F contexts. Notably, fewer than 20% of the students in the sample had any previous experience with online learning, highlighting that online education is still relatively new to most students in our sample.

Given the global experience of the fourth industrial revolution (4IR), 2 with rapidly accelerating technological progress, South Africa needs to be prepared for the possibility of online learning becoming the new norm in the education system. To this end, policymakers may consider engaging with various organizations (schools, universities, colleges, the private sector, and research facilities) to adopt interventions that may facilitate the transition to online learning, while at the same time ensuring fair access to education for all students across different income levels. 3

1.1. Related literature

Online learning is a form of distance education which mainly involves internet‐based education where courses are offered synchronously (i.e. live sessions online) and/or asynchronously (i.e. students access course materials online in their own time, which is associated with the more traditional distance education). On the other hand, traditional F2F learning is real time or synchronous learning. In a physical classroom, instructors engage with the students in real time, while in the online format instructors can offer real time lectures through learning management systems (e.g. Blackboard Collaborate), or record the lectures for the students to watch later. Purely online courses are offered entirely over the internet, while blended learning combines traditional F2F classes with learning over the internet, and learning supported by other technologies (Nguyen,  2015 ).

Moreover, designing online courses requires several considerations: for example, the quality of the learning environment, the ease of using the learning platform, the learning outcomes to be achieved, instructor support to assist and motivate students to engage with the course material, peer interaction, class participation, and the type of assessments (Paechter & Maier, 2010), as well as training of the instructor in adopting and introducing new teaching methods online (Lundberg et al., 2008). In online learning, instructors act more as facilitators of learning. On the other hand, traditional F2F classes are structured in such a way that the instructor delivers knowledge, is better able to gauge students’ understanding and interest, can engage in class activities, and can provide immediate feedback on clarifying questions during the class. Additionally, designing traditional F2F courses can be less time consuming for instructors compared to online courses (Navarro, 2000).

Online learning is also particularly suited for nontraditional students who require flexibility due to work or family commitments that are not usually associated with the undergraduate student population (Arias et al.,  2018 ). Initially the nontraditional student belonged to the older adult age group, but with blended learning becoming more commonplace in high schools, colleges and universities, online learning has begun to traverse a wider range of age groups. However, traditional F2F classes are still more beneficial for learners that are not so self‐sufficient and lack discipline in working through the class material in the required time frame (Arias et al.,  2018 ).

For the purpose of this literature review, both pure online and blended learning are considered to be online learning because much of the evidence in the literature compares these two types against the traditional F2F learning. The debate in the literature surrounding online learning versus F2F teaching continues to be a contentious one. A review of the literature reveals mixed findings when comparing the efficacy of online learning on student performance in relation to the traditional F2F medium of instruction (Lundberg et al., 2008; Nguyen, 2015). A number of studies conducted before the 2000s find what is known today in the empirical literature as the “No Significant Difference” phenomenon (Russell & International Distance Education Certificate Center (IDECC), 1999). The seminal work from Russell and IDECC (1999) involved over 350 comparative studies on online/distance learning versus F2F learning, dating back to 1928. The author finds no significant difference overall between online and traditional F2F classroom education outcomes. Subsequent studies find similar “no significant difference” outcomes (Arbaugh, 2000; Fallah & Ubell, 2000; Freeman & Capper, 1999; Johnson et al., 2000; Neuhauser, 2002). While Bernard et al. (2004) also find that overall there is no significant difference in achievement between online education and F2F education, the study does find significant heterogeneity in student performance for different activities. The findings show that students in F2F classes outperform the students participating in synchronous online classes (i.e. classes that require online students to participate in live sessions at specific times). However, asynchronous online classes (i.e. students access class materials at their own time online) outperform F2F classes.

More recent studies find significant results for online learning outcomes in relation to F2F outcomes. On the one hand, Shachar and Yoram ( 2003 ) and Shachar and Neumann ( 2010 ) conduct a meta‐analysis of studies from 1990 to 2009 and find that in 70% of the cases, students taking courses by online education outperformed students in traditionally instructed courses (i.e. F2F lectures). In addition, Navarro and Shoemaker ( 2000 ) observe that learning outcomes for online learners are as effective as or better than outcomes for F2F learners, regardless of background characteristics. In a study on computer science students, Dutton et al. ( 2002 ) find online students perform significantly better compared to the students who take the same course on campus. A meta‐analysis conducted by the US Department of Education finds that students who took all or part of their course online performed better, on average, than those taking the same course through traditional F2F instructions. The report also finds that the effect sizes are larger for studies in which the online learning was collaborative or instructor‐driven than in those studies where online learners worked independently (Means et al.,  2010 ).

On the other hand, evidence by Brown and Liedholm ( 2002 ) based on test scores from macroeconomics students in the United States suggest that F2F students tend to outperform online students. These findings are supported by Coates et al. ( 2004 ) who base their study on macroeconomics students in the United States, and Xu and Jaggars ( 2014 ) who find negative effects for online students using a data set of about 500,000 courses taken by over 40,000 students in Washington. Furthermore, Almatra et al. ( 2015 ) compare overall course grades between online and F2F students for a Telecommunications course and find that F2F students significantly outperform online learning students. In an experimental study where students are randomly assigned to attend live lectures versus watching the same lectures online, Figlio et al. ( 2013 ) observe some evidence that the traditional format has a positive effect compared to online format. Interestingly, Callister and Love ( 2016 ) specifically compare the learning outcomes of online versus F2F skills‐based courses and find that F2F learners earned better outcomes than online learners even when using the same technology. This study highlights that some of the inconsistencies that we find in the results comparing online to F2F learning might be influenced by the nature of the course: theory‐based courses might be less impacted by in‐person interaction than skills‐based courses.

The fact that the reviewed studies on the effects of F2F versus online learning on student performance have been mainly focused in developed countries indicates the dearth of similar studies being conducted in developing countries. This gap in the literature may also highlight a salient point: online learning is still relatively underexplored in developing countries. The lockdown in South Africa therefore provides us with an opportunity to contribute to the existing literature from a developing country context.

2. CONTEXT OF STUDY

South Africa went into national lockdown in March 2020 due to the Covid‐19 pandemic. Like most universities in the country, the first semester for undergraduate courses at the University of Pretoria had already been running since the start of the academic year in February. Before the pandemic, a number of F2F lectures and assessments had already been conducted in most courses. The nationwide lockdown forced the university, which was mainly in‐person teaching, to move to full online learning for the remainder of the semester. This forced shift from F2F teaching to online learning allows us to investigate the changes in students' performance.

Before lockdown, classes were conducted on campus. During lockdown, these live classes were moved to an online platform, Blackboard Collaborate, which could be accessed by all registered students on the university intranet (“ClickUP”). However, these live online lectures involved substantial internet data costs for students. To ensure access to course content for those students who were unable to attend the live online lectures due to poor internet connections or internet data costs, several options for accessing course content were made available. These options included prerecorded narrated slides (which required less usage of internet data), recordings of the live online lectures, PowerPoint slides with explanatory notes and standard PDF lecture slides.

At the same time, the university managed to procure and loan out laptops to a number of disadvantaged students, and negotiated with major mobile internet data providers in the country for students to have free access to study material through the university's “connect” website (also referred to as the zero‐rated website). However, this free access excluded some video content and live online lectures (see Table  1 ). The university also provided between 10 and 20 gigabytes of mobile internet data per month, depending on the network provider, sent to students' mobile phones to assist with internet data costs.

Sites available on zero‐rated website

Note : The table summarizes the sites that were available on the zero‐rated website and those that incurred data costs.

High data costs continue to be a contentious issue in Africa where average incomes are low. Gilbert ( 2019 ) reports that South Africa ranked 16th of the 45 countries researched in terms of the most expensive internet data in Africa, at US$6.81 per gigabyte, in comparison to other Southern African countries such as Mozambique (US$1.97), Zambia (US$2.70), and Lesotho (US$4.09). Internet data prices have also been called into question in South Africa after the Competition Commission published a report from its Data Services Market Inquiry calling the country's internet data pricing “excessive” (Gilbert,  2019 ).

3. EMPIRICAL APPROACH

We use a sample of 395 second‐year students taking a macroeconomics module in the Economics department to compare the effects of F2F and online learning on students' performance using a range of assessments. The module was an introduction to the application of theoretical economic concepts. The content was both theory‐based (developing economic growth models using concepts and equations) and skills‐based (application involving the collection of data from online data sources and analysis of the data using statistical software). Both individual and group assignments formed part of the assessments. Before the end of the semester, during lockdown in June 2020, we asked the students to complete a survey with questions related to the transition from F2F to online learning and the difficulties that they may have faced. For example, we asked the students: (i) how easy or difficult they found the transition from F2F to online lectures; (ii) what internet options were available to them and which they used the most to access the online prescribed work; (iii) what format of content they accessed and which they preferred the most (i.e., self‐study material in the form of PDF and PowerPoint slides with notes vs. assisted study with narrated slides and lecture recordings); and (iv) what difficulties they faced accessing the live online lectures. Figure 1 summarizes the key survey questions that we asked the students regarding their transition from F2F to online learning.

Figure 1. Summary of survey data

Before the lockdown, the students had already attended several F2F classes and completed three assessments. We are therefore able to construct, for each student, the average grade of the three assignments taken before lockdown and the average grade of the three assignments taken after the start of the lockdown. Specifically, we use the difference between the post‐ and pre‐lockdown average grades as the dependent variable. The number of student observations dropped to 275 because some students missed one or more of the assessments. The lecturer, content, and format of the assessments remained consistent across the module. We estimate the following equation using ordinary least squares (OLS) with robust standard errors:

Y_i = α + B_i′β + X_i′γ + ε_i

where Y_i is the student's performance measured by the difference between the post‐ and pre‐lockdown average grades. B_i is the vector of determinants that measure the difficulty students faced in transitioning from F2F to online learning: access to the internet, preferred study material, quality of the live online lecture sessions, and pre‐lockdown class attendance. X_i is the vector of student demographic controls such as race, gender, and an indicator for whether the student's perceived family income is below average. The term ε_i captures unobserved student characteristics.
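The estimation described above can be sketched in a few lines of code. This is a minimal illustration on synthetic data, not the authors' code: the variable names, sample construction, and effect sizes are hypothetical, and only the OLS and HC1 robust-variance formulas are standard.

```python
import numpy as np

def ols_robust(y, X):
    """OLS with heteroskedasticity-robust (HC1) standard errors.
    X must already include a constant column."""
    n, k = X.shape
    bread = np.linalg.inv(X.T @ X)
    beta = bread @ X.T @ y
    resid = y - X @ beta
    # HC1 sandwich estimator with small-sample correction n/(n-k)
    meat = (X * (resid ** 2)[:, None]).T @ X
    cov = n / (n - k) * bread @ meat @ bread
    return beta, np.sqrt(np.diag(cov))

# Synthetic illustration: Y = post-minus-pre grade change;
# regressors and magnitudes below are hypothetical
rng = np.random.default_rng(0)
n = 275
wifi = rng.integers(0, 2, n).astype(float)   # 1 if student has wifi access
difficulty = rng.integers(1, 5, n).astype(float)  # transition difficulty, 1-4
X = np.column_stack([np.ones(n), wifi, difficulty])
y = 4.5 * wifi - 2.0 * difficulty + rng.normal(0, 5, n)
beta, se = ols_robust(y, X)
```

With a wifi effect of 4.5 built into the simulated data, `beta[1]` recovers an estimate close to 4.5, mirroring the roughly 4.5-point wifi advantage the paper reports.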

4. ANALYSIS

4.1. Descriptive statistics

Table  2 gives an overview of the sample of students. We find that a higher proportion of black students reported finding the transition to online learning difficult, whereas more white students, like students of other races, reported finding the transition moderately easy. According to Coetzee ( 2014 ), the quality of schools can vary significantly between higher‐income and lower‐income areas, and black South Africans are far more likely than white South Africans to live in lower‐income areas with lower‐quality schools. These differences in the quality of secondary schooling can persist at the tertiary level. Furthermore, persistent income inequality between races in South Africa likely means that many poorer black students cannot afford wifi connections or large internet data bundles, which can make the transition more difficult for black students than for their white counterparts.

Descriptive statistics

Notes : The transition difficulty variable was ordered 1: Very Easy; 2: Moderately Easy; 3: Difficult; and 4: Impossible. Since we have few responses at the extremes, we combined Very Easy with Moderately Easy, and Difficult with Impossible, to make the table easier to read. The table with a full breakdown is available upon request.

A higher proportion of students reported that wifi access made the transition to online learning moderately easy, whereas relatively more students reported that mobile internet data and the zero‐rated website made the transition difficult. Surprisingly, few students made use of the zero‐rated website even though it was freely available. Figure  2 shows that students who reported difficulty transitioning to online learning did not perform as well online, relative to F2F, as those who found the transition less difficult.

Figure 2. Transition from F2F to online learning.

Notes : This graph shows the students' responses to the question “How easy did you find the transition from face‐to‐face lectures to online lectures?” in relation to the outcome variable for performance

In Figure  3 , the kernel density estimates show that students who had access to wifi performed better than those who used mobile internet data or the zero‐rated website.

Figure 3. Access to online learning.

Notes : This graph shows the students' responses to the question “What do you currently use the most to access most of your prescribed work?” in relation to the outcome variable for performance
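A kernel density comparison like the one in Figure 3 can be reproduced with a basic Gaussian kernel density estimate. This sketch uses simulated grade-change scores; the group means and spreads are hypothetical, and the bandwidth follows Silverman's rule of thumb rather than whatever the authors used.

```python
import numpy as np

def gaussian_kde_1d(x, grid, bw=None):
    """Gaussian kernel density estimate on a grid.
    Uses Silverman's rule-of-thumb bandwidth if none is given."""
    x = np.asarray(x, dtype=float)
    if bw is None:
        bw = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)
    z = (grid[:, None] - x[None, :]) / bw
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(x) * bw * np.sqrt(2 * np.pi))

# Hypothetical grade changes (post minus pre-lockdown averages) by access group
rng = np.random.default_rng(2)
wifi_scores = rng.normal(2.0, 5.0, 200)     # wifi group, simulated
mobile_scores = rng.normal(-2.5, 5.0, 200)  # mobile-data group, simulated
grid = np.linspace(-20, 20, 201)
d_wifi = gaussian_kde_1d(wifi_scores, grid)
d_mobile = gaussian_kde_1d(mobile_scores, grid)
```

Plotting `d_wifi` and `d_mobile` against `grid` gives two density curves; with the simulated means above, the wifi curve peaks to the right of the mobile-data curve, echoing the pattern in Figure 3.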

The regression results are reported in Table  3 . We find that the change in students' performance from F2F to online learning is negatively associated with the difficulty they faced in transitioning. According to the survey responses, factors contributing to this difficulty included poor internet access, high internet data costs, and a lack of equipment such as laptops or tablets to access the study materials on the university website. Students who had access to wifi (i.e., fixed wireless broadband, Asymmetric Digital Subscriber Line (ADSL), or optic fiber) performed significantly better, scoring on average 4.5 points higher than students who had to use mobile internet data (i.e., personal mobile internet data, wifi at home using mobile internet data, or a hotspot using mobile internet data) or the zero‐rated website to access the study materials. The insignificant results for the zero‐rated website are surprising given that the website was freely available and did not incur any internet data costs. However, most students in this sample complained that the internet connection on the zero‐rated website was slow, especially when uploading assignments, and that they were disconnected in the middle of assessments. This may have discouraged some students from using the zero‐rated website.

Results: Predictors for student performance using the difference on average assessment grades between pre‐ and post‐lockdown

Coefficients reported. Robust standard errors in parentheses.

∗∗∗ p  < .01.

Students who expressed a preference for self‐study approaches (i.e. reading PDF slides or PowerPoint slides with explanatory notes) did not perform as well, on average, as students who preferred assisted study (i.e. listening to recorded narrated slides or lecture recordings). This result is in line with Means et al. ( 2010 ), where student performance was better for online learning that was collaborative or instructor‐driven than in cases where online learners worked independently. Interestingly, we also observe that the performance of students who often attended in‐person classes before the lockdown decreased. Perhaps these students found the F2F lectures particularly helpful in mastering the course material. From the survey responses, we find that a significant proportion of the students (about 70%) preferred F2F to online lectures. This preference for F2F lectures may also be linked to the factors contributing to the difficulty some students faced in transitioning to online learning.

We find that the performance of low‐income students decreased post‐lockdown, which highlights another potential challenge to transitioning to online learning. The picture and sound quality of the live online lectures also contributed to lower performance. Although this result is not statistically significant, it is worth noting as the implications are linked to the quality of infrastructure currently available for students to access online learning. We find no significant effects of race on changes in students' performance, though males appeared to struggle more with the shift to online teaching than females.

For the robustness check in Table  4 , we consider the average grades of the three assignments taken after the start of the lockdown as a dependent variable (i.e. the post‐lockdown average grades for each student). We then include the pre‐lockdown average grades as an explanatory variable. The findings and overall conclusions in Table  4 are consistent with the previous results.

Robustness check: Predictors for student performance using the average assessment grades for post‐lockdown

As a further robustness check in Table  5 , we create a panel for each student across the six assignment grades so we can control for individual heterogeneity. We create a post‐lockdown binary variable that takes the value of 1 for the lockdown period and 0 otherwise. We interact the post‐lockdown dummy variable with a measure for transition difficulty and internet access. The internet access variable is an indicator variable for mobile internet data, wifi, or zero‐rated access to class materials. The variable wifi is a binary variable taking the value of 1 if the student has access to wifi and 0 otherwise. The zero‐rated variable is a binary variable taking the value of 1 if the student used the university's free portal access and 0 otherwise. We also include assignment and student fixed effects. The results in Table  5 remain consistent with our previous findings that students who had wifi access performed significantly better than their peers.

Interaction model

Notes : Coefficients reported. Robust standard errors in parentheses. The dependent variable is the assessment grades for each student on each assignment. The number of observations include the pre‐post number of assessments multiplied by the number of students.
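The interaction model with student and assignment fixed effects can be sketched as a least-squares dummy-variable regression. This is an illustrative reconstruction on simulated data, not the authors' code; the built-in effect size (4 points for post x wifi) and noise levels are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_students, n_assign = 275, 6
sid = np.repeat(np.arange(n_students), n_assign)  # student id per observation
aid = np.tile(np.arange(n_assign), n_students)    # assignment id per observation
post = (aid >= 3).astype(float)                   # last three assignments are post-lockdown
wifi = rng.integers(0, 2, n_students)[sid].astype(float)  # student-level wifi indicator

# Simulated grades: student FE + assignment FE + 4-point post x wifi effect + noise
alpha = rng.normal(60, 8, n_students)[sid]
gamma = rng.normal(0, 3, n_assign)[aid]
y = alpha + gamma + 4.0 * post * wifi + rng.normal(0, 5, sid.size)

# Dummy-variable estimation: drop one student and one assignment dummy to
# avoid collinearity; the wifi and post main effects are absorbed by the
# student and assignment fixed effects respectively
D_stu = (sid[:, None] == np.arange(1, n_students)).astype(float)
D_asg = (aid[:, None] == np.arange(1, n_assign)).astype(float)
X = np.column_stack([np.ones(sid.size), post * wifi, D_stu, D_asg])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[1] estimates the post x wifi interaction effect
```

Note that the interaction is identified even though the `post` and `wifi` main effects cannot be included: `post` is collinear with the assignment dummies and `wifi` with the student dummies, exactly as in the paper's fixed-effects specification.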

5. CONCLUSION

The Covid‐19 pandemic left many education institutions with no option but to transition to online learning, and the University of Pretoria was no exception. We examine the effect of transitioning to online learning on the academic performance of second‐year economics students, using assessment results from F2F lectures before lockdown and online lectures after lockdown for the same group of students, together with responses to survey questions. We find that the main contributor to lower academic performance in the online setting was poor internet access, which made transitioning to online learning more difficult. In addition, opting to self‐study (reading notes instead of joining online classes and/or watching recordings) did not help students' performance.

The implications of these results highlight the need for better internet infrastructure and affordable internet data pricing. Despite the university's best efforts not to leave any student behind through the zero‐rated website and free monthly internet data, the inequality dynamics in the country are such that some students were invariably negatively affected by the transition, not because they were struggling academically, but because they lacked access to the internet (wifi). While the zero‐rated website is a good collaborative initiative between universities and network providers, the infrastructure is not sufficient to accommodate large numbers of students accessing it simultaneously.

This study's findings may highlight some shortcomings in the academic sector that need to be addressed by both the public and private sectors. There is potential for the digital divide to widen as a result of the inequitable distribution of digital infrastructure, which may reinforce current inequalities in access to higher education in the long term. To prepare the country for online learning, internet data tariffs need to become more affordable and internet access available to all. We hope that this study's findings will provide a platform (or will at least start the conversation about taking remedial action) for policy engagements in this regard.

We are aware of some limitations presented by our study. The sample we have at hand makes it difficult to extrapolate our findings to either all students at the University of Pretoria or other higher education students in South Africa. Despite this limitation, our findings highlight the negative effect of the digital divide on students' educational outcomes in the country. The transition to online learning and the high internet data costs in South Africa can also have adverse learning outcomes for low‐income students. With higher education institutions, such as the University of Pretoria, integrating online teaching to overcome the effect of the Covid‐19 pandemic, access to stable internet is vital for students' academic success.

It is also important to note that the data at hand do not allow us to isolate wifi's causal effect on students' performance post‐lockdown, for two main reasons. First, wifi access is not randomly assigned; for instance, students from better‐off family backgrounds are likely to have better access to wifi and other supplementary infrastructure than their poorer counterparts. Second, due to the university's data access policy and consent requirements, we could not merge the data at hand with students' previous year's performance. Future research might therefore examine these elements to document the causal impact of access to wifi on students' educational outcomes in the country.

ACKNOWLEDGMENT

The authors acknowledge the helpful comments received from the editor, the anonymous reviewers, and Elizabeth Asiedu.

Chisadza, C., Clance, M., Mthembu, T., Nicholls, N., & Yitbarek, E. (2021). Online and face‐to‐face learning: Evidence from students' performance during the Covid‐19 pandemic. African Development Review, 33, S114–S125. 10.1111/afdr.12520

1 https://mybroadband.co.za/news/cellular/309693-mobile-data-prices-south-africa-vs-the-world.html .

2 The 4IR is currently characterized by increased use of new technologies, such as advanced wireless technologies, artificial intelligence, cloud computing, robotics, among others. This era has also facilitated the use of different online learning platforms ( https://www.brookings.edu/research/the-fourth-industrialrevolution-and-digitization-will-transform-africa-into-a-global-powerhouse/ ).

3 Note that we control for income, but it is plausible to assume other unobservable factors such as parental preference and parenting style might also affect access to the internet of students.

  • Almatra, O., Johri, A., Nagappan, K., & Modanlu, A. (2015). An empirical study of face‐to‐face and distance learning sections of a core telecommunication course (Conference Proceedings Paper No. 12944). 122nd ASEE Annual Conference and Exposition, Seattle, Washington State.
  • Anyanwu, J. C. (2013). Characteristics and macroeconomic determinants of youth employment in Africa. African Development Review, 25(2), 107–129.
  • Anyanwu, J. C. (2016). Accounting for gender equality in secondary school enrolment in Africa. African Development Review, 28(2), 170–191.
  • Arbaugh, J. (2000). Virtual classroom versus physical classroom: An exploratory study of class discussion patterns and student learning in an asynchronous internet‐based MBA course. Journal of Management Education, 24(2), 213–233.
  • Arias, J. J., Swinton, J., & Anderson, K. (2018). On‐line vs. face‐to‐face: A comparison of student outcomes with random assignment. e‐Journal of Business Education and Scholarship of Teaching, 12(2), 1–23.
  • Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., & Huang, B. (2004). How does distance education compare with classroom instruction? A meta‐analysis of the empirical literature. Review of Educational Research, 74(3), 379–439.
  • Brown, B., & Liedholm, C. (2002). Can web courses replace the classroom in principles of microeconomics? American Economic Review, 92(2), 444–448.
  • Callister, R. R., & Love, M. S. (2016). A comparison of learning outcomes in skills‐based courses: Online versus face‐to‐face formats. Decision Sciences Journal of Innovative Education, 14(2), 243–256.
  • Coates, D., Humphreys, B. R., Kane, J., & Vachris, M. A. (2004). “No significant distance” between face‐to‐face and online instruction: Evidence from principles of economics. Economics of Education Review, 23(5), 533–546.
  • Coetzee, M. (2014). School quality and the performance of disadvantaged learners in South Africa (Working Paper No. 22). University of Stellenbosch Economics Department, Stellenbosch.
  • Dutton, J., Dutton, M., & Perry, J. (2002). How do online students differ from lecture students? Journal of Asynchronous Learning Networks, 6(1), 1–20.
  • Emediegwu, L. (2021). Does educational investment enhance capacity development for Nigerian youths? An autoregressive distributed lag approach. African Development Review, 32(S1), S45–S53.
  • Fallah, M. H., & Ubell, R. (2000). Blind scores in a graduate test: Conventional compared with web‐based outcomes. ALN Magazine, 4(2).
  • Figlio, D., Rush, M., & Yin, L. (2013). Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. Journal of Labor Economics, 31(4), 763–784.
  • Freeman, M. A., & Capper, J. M. (1999). Exploiting the web for education: An anonymous asynchronous role simulation. Australasian Journal of Educational Technology, 15(1), 95–116.
  • Gilbert, P. (2019). The most expensive data prices in Africa. Connecting Africa. https://www.connectingafrica.com/author.asp?section_id=761%26doc_id=756372
  • Glick, P., Randriamamonjy, J., & Sahn, D. (2009). Determinants of HIV knowledge and condom use among women in Madagascar: An analysis using matched household and community data. African Development Review, 21(1), 147–179.
  • Gyimah‐Brempong, K. (2011). Education and economic development in Africa. African Development Review, 23(2), 219–236.
  • Johnson, S., Aragon, S., Shaik, N., & Palma‐Rivas, N. (2000). Comparative analysis of learner satisfaction and learning outcomes in online and face‐to‐face learning environments. Journal of Interactive Learning Research, 11(1), 29–49.
  • Lundberg, J., Merino, D., & Dahmani, M. (2008). Do online students perform better than face‐to‐face students? Reflections and a short review of some empirical findings. Revista de Universidad y Sociedad del Conocimiento, 5(1), 35–44.
  • Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence‐based practices in online learning: A meta‐analysis and review of online learning studies (Report No. ed‐04‐co‐0040 task 0006). U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, Washington DC.
  • Navarro, P. (2000). Economics in the cyber‐classroom. Journal of Economic Perspectives, 14(2), 119–132.
  • Navarro, P., & Shoemaker, J. (2000). Performance and perceptions of distance learners in cyberspace. American Journal of Distance Education, 14(2), 15–35.
  • Neuhauser, C. (2002). Learning style and effectiveness of online and face‐to‐face instruction. American Journal of Distance Education, 16(2), 99–113.
  • Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Teaching and Learning, 11(2), 309–319.
  • Paechter, M., & Maier, B. (2010). Online or face‐to‐face? Students' experiences and preferences in e‐learning. Internet and Higher Education, 13(4), 292–297.
  • Russell, T. L., & International Distance Education Certificate Center (IDECC) (1999). The no significant difference phenomenon: A comparative research annotated bibliography on technology for distance education: As reported in 355 research reports, summaries and papers. North Carolina State University.
  • Shachar, M., & Neumann, Y. (2010). Twenty years of research on the academic performance differences between traditional and distance learning: Summative meta‐analysis and trend examination. MERLOT Journal of Online Learning and Teaching, 6(2), 318–334.
  • Shachar, M., & Yoram, N. (2003). Differences between traditional and distance education academic performances: A meta‐analytic approach. International Review of Research in Open and Distance Learning, 4(2), 1–20.
  • Tchamyou, V. S., Asongu, S., & Odhiambo, N. (2019). The role of ICT in modulating the effect of education and lifelong learning on income inequality and economic growth in Africa. African Development Review, 31(3), 261–274.
  • Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face‐to‐face courses: Differences across types of students and academic subject areas. The Journal of Higher Education, 85(5), 633–659.
  • Published: 02 December 2020

Integrating students’ perspectives about online learning: a hierarchy of factors

  • Montgomery Van Wart 1 ,
  • Anna Ni 1 ,
  • Pamela Medina 1 ,
  • Jesus Canelon 1 ,
  • Melika Kordrostami 1 ,
  • Jing Zhang 1 &

International Journal of Educational Technology in Higher Education volume  17 , Article number:  53 ( 2020 )


This article reports on a large-scale ( n  = 987) exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students’ perspective, and then determines their hierarchical significance. Seven factors--Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence--were identified as significant and reliable. Regression analysis indicates that the minimal factors for enrollment in future classes—when students consider convenience and scheduling—were Basic Online Modality, Cognitive Presence, and Online Social Comfort. Students who accepted or embraced online courses on their own merits wanted a minimum of Basic Online Modality, Teaching Presence, Cognitive Presence, Online Social Comfort, and Social Presence. Students who preferred face-to-face classes and demanded a comparable experience valued Online Interactive Modality and Instructional Support more highly. Recommendations for online course design, policy, and future research are provided.
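The abstract does not detail the factor-extraction procedure. As a generic illustration only, the common first step of an exploratory factor analysis, deciding how many factors to retain via the Kaiser (eigenvalue > 1) rule, can be sketched on simulated survey data; the item structure, loadings, and factor count below are hypothetical and do not reproduce the study's seven factors.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 987  # same sample size as the study; the responses here are simulated

# Hypothetical survey: 12 Likert-style items driven by 3 latent factors
latent = rng.normal(size=(n, 3))
loadings = np.zeros((3, 12))
for f in range(3):
    loadings[f, f * 4:(f + 1) * 4] = 0.8   # each factor drives 4 items
items = latent @ loadings + rng.normal(0.0, 0.6, (n, 12))

# Kaiser criterion: retain factors whose correlation-matrix eigenvalues
# exceed 1, i.e. factors that explain more variance than a single
# standardized item
corr = np.corrcoef(items, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_retained = int((eigvals > 1.0).sum())
```

On this simulated data the rule recovers the three latent factors that generated the items; in practice the retained factors are then rotated and interpreted, as in the seven-factor solution the article reports.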

Introduction

While there are different perspectives of the learning process, such as learning achievement and faculty perspectives, students’ perspectives are especially critical since students are ultimately the raison d’être of the educational endeavor (Chickering & Gamson, 1987 ). More pragmatically, students’ perspectives provide invaluable, first-hand insights into their experiences and expectations (Dawson et al., 2019 ). The student perspective is especially important when new teaching approaches are used and when new technologies are being introduced (Arthur, 2009 ; Crews & Butterfield, 2014 ; Van Wart, Ni, Ready, Shayo, & Court, 2020 ). With the renewed interest in “active” education in general (Arruabarrena, Sánchez, Blanco, et al., 2019 ; Kay, MacDonald, & DiGiuseppe, 2019 ; Nouri, 2016 ; Vlachopoulos & Makri, 2017 ) and the flipped classroom approach in particular (Flores, del-Arco, & Silva, 2016 ; Gong, Yang, & Cai, 2020 ; Lundin, et al., 2018 ; Maycock, 2019 ; McGivney-Burelle, 2013 ; O’Flaherty & Phillips, 2015 ; Tucker, 2012 ), along with extraordinary shifts in technology, the student perspective on online education is profoundly important. Students’ perceptions of quality integrate their own sense of learning achievement, satisfaction with the support they receive, the technical proficiency of the process, intellectual and emotional stimulation, comfort with the process, and a sense of learning community. Which factors students perceive as constituting quality online teaching, however, has not been as clear as it might be, for at least two reasons.

First, it is important to note that the overall online learning experience for students is also composed of non-teaching factors, which we briefly mention. Three such factors are (1) convenience, (2) learner characteristics and readiness, and (3) antecedent conditions that may foster teaching quality but are not directly responsible for it. (1) Convenience is an enormous non-quality factor for students (Artino, 2010 ) which has driven up online demand around the world (Fidalgo, Thormann, Kulyk, et al., 2020 ; Inside Higher Education and Gallup, 2019 ; Legon & Garrett, 2019 ; Ortagus, 2017 ). This is important since satisfaction with online classes is frequently somewhat lower than with face-to-face classes (Macon, 2011 ). However, the literature generally supports the relative equivalence of face-to-face and online modes regarding learning achievement criteria (Bernard et al., 2004 ; Nguyen, 2015 ; Ni, 2013 ; Sitzmann, Kraiger, Stewart, & Wisher, 2006 ; see Xu & Jaggars, 2014 for an alternate perspective). These contrasts are exemplified in a recent study of business students, in which online students using a flipped classroom approach outperformed their face-to-face peers, but ironically rated instructor performance lower (Harjoto, 2017 ). (2) Learner characteristics, such as self-regulation in an active learning model, comfort with technology, and age, among others, affect both receptiveness and readiness for online instruction (Alqurashi, 2016 ; Cohen & Baruth, 2017 ; Kintu, Zhu, & Kagambe, 2017 ; Kuo, Walker, Schroder, & Belland, 2013 ; Ventura & Moscoloni, 2015 ). (3) Finally, numerous antecedent factors may lead to improved instruction but are not themselves directly perceived by students, such as instructor training (Brinkley-Etzkorn, 2018 ) and the sources of faculty motivation (e.g., incentives, recognition, social influence, and voluntariness) (Wingo, Ivankova, & Moss, 2017 ).
Important as these factors are, mixing them with the perceptions of quality tends to obfuscate the quality factors directly perceived by students.

Second, while student perceptions of quality are used in innumerable studies, our overall understanding still needs to integrate them more holistically. Many studies use student perceptions of the quality and overall effectiveness of individual tools and strategies in online contexts, such as mobile devices (Drew & Mann, 2018 ), small groups (Choi, Land, & Turgeon, 2005 ), journals (Nair, Tay, & Koh, 2013 ), simulations (Vlachopoulos & Makri, 2017 ), and video (Lange & Costley, 2020 ). Such studies, however, cannot provide the overall context and comparative importance. Some studies have examined the overall learning experience of students with exploratory lists, but have mixed non-quality factors with quality-of-teaching factors, making it difficult to discern the instructor’s versus contextual roles in quality (e.g., Asoodar, Vaezi, & Izanloo, 2016 ; Bollinger & Martindale, 2004 ; Farrell & Brunton, 2020 ; Hong, 2002 ; Song, Singleton, Hill, & Koh, 2004 ; Sun, Tsai, Finger, Chen, & Yeh, 2008 ). Technology adoption studies also fall into this category by essentially aggregating all teaching quality into the single category of performance (Al-Gahtani, 2016 ; Artino, 2010 ). Some studies have used high-level teaching-oriented models, primarily the Community of Inquiry model (le Roux & Nagel, 2018 ), but empirical support has been mixed (Arbaugh et al., 2008 ), and its elegance (i.e., relying on only three factors) has not provided much insight to practitioners (Anderson, 2016 ; Cleveland-Innes & Campbell, 2012 ).

Research questions

Although the number of empirical studies of student perceptions of quality factors has increased, the integration of these studies and concepts remains fragmented and confusing. It is important to have an empirical view of what students value in a single comprehensive study and, also, to know whether there is a hierarchy of factors, ranging from students who are least to most critical of the online learning experience. This research study has two research questions.

The first research question is: What are the significant factors in creating a high-quality online learning experience from students’ perspectives? This is important to know because it should have a significant effect on instructors’ design of online classes. The goal of this research question is to identify a more articulated and empirically supported set of factors capturing the full range of student expectations.

The second research question is: Is there a priority or hierarchy of factors related to students’ perceptions of online teaching quality that relate to their decisions to enroll in online classes? For example, is it possible to distinguish which factors are critical for enrollment decisions when students are primarily motivated by convenience and scheduling flexibility (minimum threshold)? Do these factors differ from students with a genuine acceptance of the general quality of online courses (a moderate threshold)? What are the factors that are important for the students who are the most critical of online course delivery (highest threshold)?

This article next reviews the literature on online education quality, focusing on the student perspective and reviews eight factors derived from it. The research methods section discusses the study structure and methods. Demographic data related to the sample are next, followed by the results, discussion, and conclusion.

Literature review

Online education is much discussed (Prinsloo, 2016 ; Van Wart et al., 2019 ; Zawacki-Richter & Naidu, 2016 ), but its perception is substantially influenced by where you stand and what you value (Otter et al., 2013 ; Tanner, Noser, & Totaro, 2009 ). Accrediting bodies care about meeting technical standards, proof of effectiveness, and consistency (Grandzol & Grandzol, 2006 ). Institutions care about reputation, rigor, student satisfaction, and institutional efficiency (Jung, 2011 ). Faculty care about subject coverage, student participation, faculty satisfaction, and faculty workload (Horvitz, Beach, Anderson, & Xia, 2015 ; Mansbach & Austin, 2018 ). For their part, students care about learning achievement (Marks, Sibley, & Arbaugh, 2005 ; O’Neill & Sai, 2014 ; Shen, Cho, Tsai, & Marra, 2013 ), but also view online education as a function of their enjoyment of classes, instructor capability and responsiveness, and comfort in the learning environment (e.g., Asoodar et al., 2016 ; Sebastianelli, Swift, & Tamimi, 2015 ). It is this last perspective, of students, upon which we focus.

It is important to note students do not sign up for online classes solely based on perceived quality. Perceptions of quality derive from notions of the capacity of online learning when ideal—relative to both learning achievement and satisfaction/enjoyment, and perceptions about the likelihood and experience of classes living up to expectations. Students also sign up because of convenience and flexibility, and personal notions of suitability about learning. Convenience and flexibility are enormous drivers of online registration (Lee, Stringer, & Du, 2017 ; Mann & Henneberry, 2012 ). Even when students say they prefer face-to-face classes to online, many enroll in online classes and re-enroll in the future if the experience meets minimum expectations. This study examines the threshold expectations of students when they are considering taking online classes.

When discussing students’ perceptions of quality, there is little clarity about the actual range of concepts because no integrated empirical studies exist comparing major factors found throughout the literature. Rather, there are practitioner-generated lists of micro-competencies such as the Quality Matters consortium for higher education (Quality Matters, 2018), or broad frameworks encompassing many aspects of quality beyond teaching (Open and Distant Learning Quality Council, 2012). While checklists are useful for practitioners and accreditation processes, they do not provide robust, theoretical bases for scholarly development. Overarching frameworks are heuristically useful, but less so for pragmatic purposes or theory building. The most prominent theoretical framework used in the online literature is the Community of Inquiry (CoI) model (Arbaugh et al., 2008; Garrison, Anderson, & Archer, 2003), which divides instruction into teaching, cognitive, and social presence. As with many deductive theories, however, the supporting evidence is mixed (Rourke & Kanuka, 2009), especially regarding the importance of social presence (Annand, 2011; Armellini and De Stefani, 2016). Conceptually, the problem is not so much with the narrow articulation of cognitive or social presence; cognitive presence is how the instructor provides opportunities for students to interact with material in robust, thought-provoking ways, and social presence refers to building a community of learning that incorporates student-to-student interactions. However, teaching presence includes everything else the instructor does—structuring the course, providing lectures, explaining assignments, creating rehearsal opportunities, supplying tests, grading, answering questions, and so on. These challenges become even more prominent in the online context.
While the lecture as a single medium is paramount in face-to-face classes, it fades as the primary vehicle in online classes with increased use of detailed syllabi, electronic announcements, recorded and synchronous lectures, 24/7 communications related to student questions, etc. Amassing the pedagogical and technological elements related to teaching under a single concept provides little insight.

In addition to the CoI model, numerous concepts are suggested in single-factor empirical studies focusing on quality from a student’s perspective, with overlapping conceptualizations and nonstandardized naming conventions. Seven distinct factors are derived here from the literature on student perceptions of online quality: Instructional Support, Teaching Presence, Basic Online Modality, Social Presence, Online Social Comfort, Cognitive Presence, and Interactive Online Modality.

Instructional support

Instructional Support refers to students’ perceptions of techniques by the instructor used for input, rehearsal, feedback, and evaluation. Specifically, this entails providing detailed instructions, designed use of multimedia, and the balance between repetitive class features for ease of use, and techniques to prevent boredom. Instructional Support is often included as an element of Teaching Presence, but is also labeled “structure” (Lee & Rha, 2009 ; So & Brush, 2008 ) and instructor facilitation (Eom, Wen, & Ashill, 2006 ). A prime example of the difference between face-to-face and online education is the extensive use of the “flipped classroom” (Maycock, 2019 ; Wang, Huang, & Schunn, 2019 ) in which students move to rehearsal activities faster and more frequently than traditional classrooms, with less instructor lecture (Jung, 2011 ; Martin, Wang, & Sadaf, 2018 ). It has been consistently supported as an element of student perceptions of quality (Espasa & Meneses, 2010 ).

Teaching presence

Teaching Presence refers to students’ perceptions about the quality of communication in lectures, directions, and individual feedback including encouragement (Jaggars & Xu, 2016 ; Marks et al., 2005 ). Specifically, instructor communication is clear, focused, and encouraging, and instructor feedback is customized and timely. If Instructional Support is what an instructor does before the course begins and in carrying out those plans, then Teaching Presence is what the instructor does while the class is conducted and in response to specific circumstances. For example, a course could be well designed but poorly delivered because the instructor is distracted; or a course could be poorly designed but an instructor might make up for the deficit by spending time and energy in elaborate communications and ad hoc teaching techniques. It is especially important in student satisfaction (Sebastianelli et al., 2015 ; Young, 2006 ) and also referred to as instructor presence (Asoodar et al., 2016 ), learner-instructor interaction (Marks et al., 2005 ), and staff support (Jung, 2011 ). As with Instructional Support, it has been consistently supported as an element of student perceptions of quality.

Basic online modality

Basic Online Modality refers to the competent use of basic online class tools—online grading, navigation methods, online grade book, and the announcements function. It is frequently clumped with instructional quality (Artino, 2010 ), service quality (Mohammadi, 2015 ), instructor expertise in e-teaching (Paechter, Maier, & Macher, 2010 ), and similar terms. As a narrowly defined concept, it is sometimes called technology (Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Sun et al., 2008 ). The only empirical study that did not find Basic Online Modality significant, as technology, was Sun et al. ( 2008 ). Because Basic Online Modality is addressed with basic instructor training, some studies assert the importance of training (e.g., Asoodar et al., 2016 ).

Social presence

Social Presence refers to students’ perceptions of the quality of student-to-student interaction. Social Presence focuses on the quality of shared learning and collaboration among students, such as in threaded discussion responses (Garrison et al., 2003 ; Kehrwald, 2008 ). Much emphasized but challenged in the CoI literature (Rourke & Kanuka, 2009 ), it has mixed support in the online literature. While some studies found Social Presence or related concepts to be significant (e.g., Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Eom et al., 2006 ; Richardson, Maeda, Lv, & Caskurlu, 2017 ), others found Social Presence insignificant (Joo, Lim, & Kim, 2011 ; So & Brush, 2008 ; Sun et al., 2008 ).

Online social comfort

Online Social Comfort refers to the instructor’s ability to provide an environment in which anxiety is low, and students feel comfortable interacting even when expressing opposing viewpoints. While numerous studies have examined anxiety (e.g., Liaw & Huang, 2013 ; Otter et al., 2013 ; Sun et al., 2008 ), only one found anxiety insignificant (Asoodar et al., 2016 ); many others have not examined the concept.

Cognitive presence

Cognitive Presence refers to the engagement of students such that they perceive they are stimulated by the material and instructor to reflect deeply and critically, and seek to understand different perspectives (Garrison et al., 2003 ). The instructor provides instructional materials and facilitates an environment that piques interest, is reflective, and enhances inclusiveness of perspectives (Durabi, Arrastia, Nelson, Cornille, & Liang, 2011 ). Cognitive Presence includes enhancing the applicability of material for student’s potential or current careers. Cognitive Presence is supported as significant in many online studies (e.g., Artino, 2010 ; Asoodar et al., 2016 ; Joo et al., 2011 ; Marks et al., 2005 ; Sebastianelli et al., 2015 ; Sun et al., 2008 ). Further, while many instructors perceive that cognitive presence is diminished in online settings, neuroscientific studies indicate this need not be the case (Takamine, 2017 ). While numerous studies failed to examine Cognitive Presence, this review found no studies that lessened its significance for students.

Interactive online modality

Interactive Online Modality refers to the “high-end” usage of online functionality. That is, the instructor uses interactive online class tools—video lectures, videoconferencing, and small group discussions—well. It is often included in concepts such as instructional quality (Artino, 2010 ; Asoodar et al., 2016 ; Mohammadi, 2015 ; Otter et al., 2013 ; Paechter et al., 2010 ) or engagement (Clayton, Blumberg, & Anthony, 2018 ). While individual methods have been investigated (e.g. Durabi et al., 2011 ), high-end engagement methods have not.

Other independent variables affecting perceptions of quality include age, undergraduate versus graduate status, gender, ethnicity/race, discipline, educational motivation of students, and previous online experience. While age effects have been found to be small or insignificant, more notable effects have been reported at the level of study, with graduate students reporting higher “success” (Macon, 2011), and community college students having greater difficulty with online classes (Legon & Garrett, 2019; Xu & Jaggars, 2014). Ethnicity and race effects have also been small or insignificant. Some situational variations and student preferences can be captured by paying attention to disciplinary differences (Arbaugh, 2005; Macon, 2011). Motivation levels of students have been reported to be significant in completion and achievement, with better students doing as well across face-to-face and online modes, and weaker students having greater completion and achievement challenges (Clayton et al., 2018; Lu & Lemonde, 2013).

Research methods

To examine the various quality factors, we apply a critical success factor methodology, initially introduced to schools of business research in the 1970s. In 1981, Rockhart and Bullen codified an approach embodying principles of critical success factors (CSFs) as a way to identify the information needs of executives, detailing steps for the collection and analysis of data to create a set of organizational CSFs (Rockhart & Bullen, 1981). CSFs describe the underlying or guiding principles which must be incorporated to ensure success.

Utilizing this methodology, CSFs in the context of this paper define key areas of instruction and design essential for an online class to be successful from a student’s perspective. Instructors implicitly know and consider these areas when setting up an online class and designing and directing activities and tasks important to achieving learning goals. CSFs make explicit those things good instructors may intuitively know and (should) do to enhance student learning. When made explicit, CSFs not only confirm the knowledge of successful instructors, but tap their intuition to guide and direct the accomplishment of quality instruction for entire programs. In addition, CSFs are linked with goals and objectives, helping generate a small number of truly important matters an instructor should focus attention on to achieve different thresholds of online success.

After a comprehensive literature review, an instrument was created to measure students’ perceptions about the importance of techniques and indicators leading to quality online classes. Items were designed to capture the major factors in the literature. The instrument was pilot studied during academic year 2017–18 with a 397 student sample, facilitating an exploratory factor analysis leading to important preliminary findings (reference withheld for review). Based on the pilot, survey items were added and refined to include seven groups of quality teaching factors and two groups of items related to students’ overall acceptance of online classes as well as a variable on their future online class enrollment. Demographic information was gathered to determine their effects on students’ levels of acceptance of online classes based on age, year in program, major, distance from university, number of online classes taken, high school experience with online classes, and communication preferences.

This paper draws evidence from a sample of students enrolled in educational programs at Jack H. Brown College of Business and Public Administration (JHBC), California State University San Bernardino (CSUSB). The JHBC offers a wide range of online courses for undergraduate and graduate programs. To ensure comparable learning outcomes, online classes and face-to-face classes of a certain subject are similar in size—undergraduate classes are generally capped at 60 and graduate classes at 30, and often taught by the same instructors. Students sometimes have the option to choose between both face-to-face and online modes of learning.

A Qualtrics survey link was sent out by 11 instructors to students who were unlikely to be cross-enrolled in classes during the 2018–19 academic year. Approximately 2500 students were contacted, with some instructors providing class time to complete the anonymous survey. All students, whether they had taken an online class or not, were encouraged to respond. Nine hundred eighty-seven students responded, representing a 40% response rate. Although drawn from a single business school, it is a broad sample representing students from several disciplines—management, accounting and finance, marketing, information decision sciences, and public administration—as well as both graduate and undergraduate programs of study.

The sample age of students is young, with 78% being under 30. The sample has almost no lower division students (i.e., freshman and sophomore), 73% upper division students (i.e., junior and senior) and 24% graduate students (master’s level). Only 17% reported having taken a hybrid or online class in high school. There was a wide range of exposure to university level online courses, with 47% reporting having taken 1 to 4 classes, and 21% reporting no online class experience. As a Hispanic-serving institution, 54% self-identified as Latino, 18% White, and 13% Asian and Pacific Islander. The five largest majors were accounting & finance (25%), management (21%), master of public administration (16%), marketing (12%), and information decision sciences (10%). Seventy-four percent work full- or part-time. See Table  1 for demographic data.

Measures and procedure

To increase the reliability of evaluation scores, composite evaluation variables were formed after an exploratory factor analysis of individual evaluation items. A principal component method with Quartimin (oblique) rotation was applied to explore the factor structure of student perceptions of online teaching CSFs. Items with loading coefficients greater than .30 were retained, a commonly accepted threshold in factor analysis. A simple least-squares regression analysis was applied to test the significance levels of factors on students’ impressions of online classes.
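The extraction step described above can be sketched as follows. This is a minimal, illustrative reconstruction with simulated data, not the authors' actual analysis: it performs unrotated principal component extraction from the item correlation matrix and applies the .30 loading threshold (the Quartimin rotation used in the study is omitted for brevity).

```python
import numpy as np

# Hypothetical (n_students, n_items) matrix of survey responses; the real
# instrument retained 37 items, six are used here for illustration.
rng = np.random.default_rng(1)
responses = rng.normal(size=(200, 6))
responses[:, :3] += rng.normal(size=(200, 1))  # items 0-2 share a latent factor

# Principal component extraction from the item correlation matrix.
corr = np.corrcoef(responses, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]              # largest components first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
loadings = eigvecs * np.sqrt(eigvals)          # component loadings

# Apply the .30 retention threshold to the first component.
retained = np.abs(loadings[:, 0]) > 0.30
print(retained[:3])
```

Because items 0–2 share a simulated latent factor, they load strongly on the first component and pass the threshold.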

Exploratory factor constructs

Using a threshold loading of 0.3 for items, 37 items loaded on seven factors. All factors were logically consistent. The first factor, with eight items, was labeled Teaching Presence. Items included providing clear instructions, staying on task, clear deadlines, and customized feedback on strengths and weaknesses. Teaching Presence items all related to instructor involvement during the course as a director, monitor, and learning facilitator. The second factor, with seven items, aligned with Cognitive Presence. Items included stimulating curiosity, opportunities for reflection, helping students construct explanations posed in online courses, and the applicability of material. The third factor, with six items, aligned with Social Presence defined as providing student-to-student learning opportunities. Items included getting to know course participants for sense of belonging, forming impressions of other students, and interacting with others. The fourth factor, with six new items as well as two (“interaction with other students” and “a sense of community in the class”) shared with the third factor, was Instructional Support which related to the instructor’s roles in providing students a cohesive learning experience. They included providing sufficient rehearsal, structured feedback, techniques for communication, navigation guide, detailed syllabus, and coordinating student interaction and creating a sense of online community. This factor also included enthusiasm which students generally interpreted as a robustly designed course, rather than animation in a traditional lecture. The fifth factor was labeled Basic Online Modality and focused on the basic technological requirements for a functional online course. Three items included allowing students to make online submissions, use of online gradebooks, and online grading. 
A fourth item is the use of online quizzes, viewed by students as mechanical practice opportunities rather than small tests, and a fifth is navigation, a key component of Online Modality. The sixth factor, loaded on four items, was labeled Online Social Comfort. Items here included comfort discussing ideas online, comfort disagreeing, developing a sense of collaboration via discussion, and considering online communication as an excellent medium for social interaction. The final factor was called Interactive Online Modality because it included items for “richer” communications or interactions, whether one- or two-way. Items included videoconferencing, instructor-generated videos, and small group discussions. Taken together, these seven factors explained 67% of the variance, which is considered acceptable for a robust model in social science research (Hair, Black, Babin, & Anderson, 2014). See Table 2 for the full list.

To test for factor reliability, Cronbach’s alpha was calculated for each variable. All produced values greater than 0.7, the standard threshold for reliability, except for system trust, which was therefore dropped. To gauge students’ sense of factor importance, all items were mean averaged. Factor means (lower means indicating higher importance to students) ranged from 1.5 to 2.6 on a 5-point scale. Basic Online Modality was most important, followed by Instructional Support and Teaching Presence. Students deemed Cognitive Presence, Social Online Comfort, and Online Interactive Modality less important. The least important for this sample was Social Presence. Table 3 arrays the critical success factor means, standard deviations, and Cronbach alphas.
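Cronbach's alpha, the reliability criterion used above, is computed from the item variances and the variance of the summed scale. A minimal sketch with simulated data (the four-item scale below is hypothetical, not from the study):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated four-item scale driven by one latent trait plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=500)
items = np.column_stack([latent + 0.5 * rng.normal(size=500) for _ in range(4)])
alpha = cronbach_alpha(items)
print(alpha > 0.7)  # → True: passes the 0.7 reliability threshold
```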

To determine whether particular subgroups of respondents viewed factors differently, a series of ANOVAs were conducted using factor means as dependent variables. Six demographic variables were used as independent variables: graduate vs. undergraduate, age, work status, ethnicity, discipline, and past online experience. To determine strength of association of the independent variables to each of the seven CSFs, eta squared was calculated for each ANOVA. Eta squared indicates the proportion of variance in the dependent variable explained by the independent variable. Eta squared values greater than .01, .06, and .14 are conventionally interpreted as small, medium, and large effect sizes, respectively (Green & Salkind, 2003 ). Table  4 summarizes the eta squared values for the ANOVA tests with Eta squared values less than .01 omitted.
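Eta squared can be computed directly from the one-way ANOVA sums of squares. A small illustrative sketch with hypothetical group scores (the values below are invented, not the study's data):

```python
import numpy as np

def eta_squared(groups):
    """Proportion of variance explained: SS_between / SS_total."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    return ss_between / ss_total

# Hypothetical factor means for two respondent groups
# (e.g., graduate vs. undergraduate students).
grad = np.array([1.8, 2.0, 1.9, 2.1, 1.7])
undergrad = np.array([2.4, 2.6, 2.3, 2.5, 2.7])
print(round(eta_squared([grad, undergrad]), 3))  # → 0.818, a large effect
```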

While no significant differences in factor means occur among students in different disciplines in the College, all five other independent variables have some small effect on some or all CSFs. Graduate students tend to rate Online Interactive Modality, Instructional Support, Teaching Presence, and Cognitive Presence higher than undergraduates. Older students place more value on Online Interactive Modality. Full-time working students rate all factors except Social Online Comfort slightly higher than part-timers and non-working students. Latino and White students rate Basic Online Modality and Instructional Support higher; Asian and Pacific Islander students rate Social Presence higher. Students who have taken more online classes rate all factors higher.

In addition to factor scores, two variables were constructed to identify the resultant impressions, labeled online experience. Both were logically consistent, with a Cronbach’s α greater than 0.75. The first variable, with six items, labeled “online acceptance,” included items such as “I enjoy online learning,” “My overall impression of hybrid/online learning is very good,” and “the instructors of online/hybrid classes are generally responsive.” The second variable was labeled “face-to-face preference” and combines four items, including enjoying, learning, and communicating more in face-to-face classes, as well as perceiving greater fairness and equity. In addition to these two constructed variables, a one-item variable was also used subsequently in the regression analysis: “online enrollment.” That question asked: if hybrid/online classes are well taught and available, how much of your entire course selection going forward would online classes make up?

Regression results

As noted above, two constructed variables and one item were used as dependent variables for purposes of regression analysis. They were online acceptance, F2F preference, and the selection of online classes. In addition to seven quality-of-teaching factors identified by factor analysis, control variables included level of education (graduate versus undergraduate), age, ethnicity, work status, distance to university, and number of online/hybrid classes taken in the past. See Table  5 .

When eta squared values were calculated for the control factors, only one approached a medium effect. Graduate versus undergraduate status had a .05 effect (considered medium) related to Online Interactive Modality, meaning graduate students were more sensitive to interactive modality than undergraduates. Multiple regression analyses of critical success factors and online impressions were conducted to compare under what conditions factors were significant. The only consistently significant control factor was the number of online classes taken: the more classes students had taken online, the more inclined they were to take future classes. Level of program, age, ethnicity, and working status did not significantly affect students’ choice or overall acceptance of online classes.
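The regression setup can be illustrated as follows. Everything here is simulated and hypothetical: the two factor scores, the control variable, and the coefficients merely stand in for the structure of the models described above, not the study's estimates.

```python
import numpy as np

# Simulated stand-in: an "online acceptance" outcome regressed on two
# factor scores and one control variable (number of online classes taken).
rng = np.random.default_rng(2)
n = 300
basic_modality = rng.normal(size=n)
cognitive_presence = rng.normal(size=n)
classes_taken = rng.integers(0, 10, size=n).astype(float)
acceptance = (0.5 * basic_modality + 0.3 * cognitive_presence
              + 0.1 * classes_taken + rng.normal(scale=0.5, size=n))

# Ordinary least squares fit (np.linalg.lstsq uses an SVD-based solver).
X = np.column_stack([np.ones(n), basic_modality, cognitive_presence, classes_taken])
coefs, *_ = np.linalg.lstsq(X, acceptance, rcond=None)
print(np.round(coefs, 2))  # intercept and three slopes, near [0, 0.5, 0.3, 0.1]
```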

The least restrictive condition was online enrollment (Table  6 ). That is, students might not feel online courses were ideal, but because of convenience and scheduling might enroll in them if minimum threshold expectations were met. When considering online enrollment three factors were significant and positive (at the 0.1 level): Basic Online Modality, Cognitive Presence, and Online Social Comfort. These least-demanding students expected classes to have basic technological functionality, provide good opportunities for knowledge acquisition, and provide comfortable interaction in small groups. Students who demand good Instructional Support (e.g., rehearsal opportunities, standardized feedback, clear syllabus) are less likely to enroll.

Online acceptance was more restrictive (see Table  7 ). This variable captured the idea that students not only enrolled in online classes out of necessity, but with an appreciation of the positive attributes of online instruction, which balanced the negative aspects. When this standard was applied, students expected not only Basic Online Modality, Cognitive Presence, and Online Social Comfort, but expected their instructors to be highly engaged virtually as the course progressed (Teaching Presence), and to create strong student-to-student dynamics (Social Presence). Students who rated Instructional Support higher are less accepting of online classes.

Another restrictive condition was catering to the needs of students who preferred face-to-face classes (see Table 8). That is, they preferred face-to-face classes even when online classes were well taught. Unlike students more accepting of, or more likely to enroll in, online classes, this group rates Instructional Support as critical to enrolling, rather than a negative factor when absent. Again different from the other two groups, these students demand appropriate interactive mechanisms (Online Interactive Modality) to enable richer communication (e.g., videoconferencing). Student-to-student collaboration (Social Presence) was also significant. This group also rated Cognitive Presence and Online Social Comfort as significant, but only in their absence. That is, these students were most attached to direct interaction with the instructor and other students rather than specific teaching methods. Interestingly, Basic Online Modality and Teaching Presence were not significant. Our interpretation is that this student group, the most critical of online classes for their loss of physical interaction, is beyond being concerned with basic technical interaction and demands higher levels of interactivity and instructional sophistication.

Discussion and study limitations

Some past studies have used robust empirical methods to identify a single factor or a small number of factors related to quality from a student’s perspective, but have not sought to be relatively comprehensive. Others have used a longer series of itemized factors, but have used less robust methods and have not tied those factors back to the literature. This study has used the literature to develop a relatively comprehensive list of items focused on quality teaching in a single rigorous protocol. That is, while a beta test had identified five coherent factors, substantial changes were made to the current survey that sharpened the focus on quality factors rather than antecedent factors, and better articulated the array of factors often lumped under the mantle of “teaching presence.” In addition, the study has also examined these factors based on threshold expectations: from minimal, such as when flexibility is the driving consideration, to modest, such as when students want a “good” online class, to high, when students demand an interactive virtual experience equivalent to face-to-face.

Exploratory factor analysis identified seven factors that were reliable, coherent, and significant under different conditions. When considering students’ overall sense of importance, they are, in order: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Social Online Comfort, Interactive Online Modality, and Social Presence. Students are most concerned with the basics of a course first, that is the technological and instructor competence. Next they want engagement and virtual comfort. Social Presence, while valued, is the least critical from this overall perspective.

The factor analysis is quite consistent with the range of factors identified in the literature, pointing to the fact that students can differentiate among different aspects of what have been clumped together as larger concepts, such as teaching presence. Essentially, the instructor’s role in quality can be divided into her/his command of basic online functionality, good design, and good presence during the class. The instructor’s command of basic functionality is paramount. Because so much of an online class must be built in advance, the quality of the class design is rated more highly than the instructor’s role in facilitating the class. Taken as a whole, the instructor’s role in traditional teaching elements is primary, as we would expect it to be. Cognitive Presence, especially the pertinence of the instructional material and its applicability to student interests, has always been found significant when studied, and was highly rated here as well as a single factor. Finally, the degree to which students feel comfortable with the online environment and enjoy the learner-learner aspect, which has had less support in empirical studies, was found significant here but was rated lowest among the quality factors.

Regression analysis paints a more nuanced picture, depending on student focus. It also helps explain some of the heterogeneity of previous studies, depending on what the dependent variables were. If convenience and scheduling are critical and students are less demanding, the minimum requirements are Basic Online Modality, Cognitive Presence, and Online Social Comfort. That is, students expect an instructor who knows how to use an online platform, delivers useful information, and provides a comfortable learning environment. However, they do not demand sophisticated instructional design, nor do they expect much in terms of Teaching Presence, learner-to-learner interaction, or interactive teaching.

When students are signing up for critical classes, or they have both F2F and online options, they have a higher standard. That is, they not only expect the factors relevant to enrolling in noncritical classes, but they also expect good Teaching and Social Presence. Students who simply need a class may be willing to teach themselves a bit more, but students who want a good class expect a highly present instructor in terms of responsiveness and immediacy. “Good” classes must not only create a comfortable atmosphere but, in social science classes at least, must provide strong learner-to-learner interactions as well. At the time of the research, most students believed that one can have a good class without high interactivity via pre-recorded video and videoconference. That may, or may not, change over time as various video media become easier to use, more reliable, and more commonplace.

The most demanding students are those who prefer F2F classes because of learning style preferences, poor past experiences, or both. Such students (seem to) assume that a worthwhile online class has basic functionality and that the instructor provides a strong presence. They are also critical of the absence of Cognitive Presence and Online Social Comfort. They want strong Instructional Support and Social Presence. But in addition, and uniquely, they expect Online Interactive Modality which provides the greatest verisimilitude to the traditional classroom as possible. More than the other two groups, these students crave human interaction in the learning process, both with the instructor and other students.

These findings shed light on the possible ramifications of the COVID-19 aftermath. Many universities around the world jumped from relatively low levels of online instruction at the beginning of spring 2020 to nearly 100% by mandate by the end of the spring term. The question becomes: what will happen after the mandate is removed? Will demand return to pre-crisis levels, increase modestly, or skyrocket? Time will be the best judge, but the findings here suggest that the ability and interest of instructors and institutions to "rise to the occasion" with quality teaching will have as much effect on demand as students becoming more acclimated to online learning. If, in the rush to get classes online, many students experience shoddy basic functional competence, poor instructional design, sporadic teaching presence, and poorly implemented cognitive and social elements, they may be quite willing to return to the traditional classroom. If faculty, and the institutions supporting them, are able to increase the quality of classes despite time pressures, then most students may be interested in more hybrid and fully online classes. If instructors are able to introduce high-quality interactive teaching, nearly the entire student population will be interested in more online classes. Of course, students will have a variety of experiences, but this analysis suggests that the instructors, departments, and institutions that put greater effort into the temporary adjustment (and resist it less) will be substantially more likely to see increases in demand beyond the modest national trajectory of the last decade or so.

There are several study limitations. First, the study does not include a sample of non-respondents, who may have a somewhat different profile. Second, the study draws from a single college and university, and the profile derived here may vary significantly by type of student. Third, some survey statements may have led respondents to rate quality based on their own experience rather than to assess the general importance of online course elements; for example, "I felt comfortable participating in the course discussions" could be revised to "comfort in participating in course discussions." The authors judged differences among subgroups (e.g., among majors) to be small and statistically insignificant. However, it is possible that differences between, say, biology and marketing students would be significant, leading the factors to be ordered differently. Emphasis and ordering might also vary between a community college and a research-oriented university (Gonzalez, 2009).

Availability of data and materials

We will make the data available.

Al-Gahtani, S. S. (2016). Empirical investigation of e-learning acceptance and assimilation: A structural equation model. Applied Computing and Informatics , 12 , 27–50.

Alqurashi, E. (2016). Self-efficacy in online learning environments: A literature review. Contemporary Issues in Education Research , 9 (1), 45–52.

Anderson, T. (2016). A fourth presence for the Community of Inquiry model? Retrieved from https://virtualcanuck.ca/2016/01/04/a-fourth-presence-for-the-community-of-inquiry-model/ .

Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning , 12 (5), 40.

Arbaugh, J. B. (2005). How much does “subject matter” matter? A study of disciplinary effects in on-line MBA courses. Academy of Management Learning & Education , 4 (1), 57–73.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education , 11 , 133–136.

Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology , 47 (6), 1202–1216.

Arruabarrena, R., Sánchez, A., Blanco, J. M., et al. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education , 16 , #10.

Arthur, L. (2009). From performativity to professionalism: Lecturers’ responses to student feedback. Teaching in Higher Education , 14 (4), 441–454.

Artino, A. R. (2010). Online or face-to-face learning? Exploring the personal factors that predict students’ choice of instructional format. Internet and Higher Education , 13 , 272–276.

Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior , 63 , 704–716.

Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research , 74 (3), 379–439.

Bollinger, D., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning , 3 (1), 61–67.

Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on teaching effectiveness through a TPACK lens. The Internet and Higher Education , 38 , 28–35.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin , 3 , 7.

Choi, I., Land, S. M., & Turgeon, A. J. (2005). Scaffolding peer-questioning strategies to facilitate metacognition during online small group discussion. Instructional Science , 33 , 483–511.

Clayton, K. E., Blumberg, F. C., & Anthony, J. A. (2018). Linkages between course status, perceived course value, and students’ preferences for traditional versus non-traditional learning environments. Computers & Education , 125 , 175–181.

Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and Distributed Learning , 13 (4), 269–292.

Cohen, A., & Baruth, O. (2017). Personality, learning, and satisfaction in fully online academic courses. Computers in Human Behavior , 72 , 1–12.

Crews, T., & Butterfield, J. (2014). Data for flipped classroom design: Using student feedback to identify the best components from online and face-to-face classes. Higher Education Studies , 4 (3), 38–47.

Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education , 44 (1), 25–36.

Drew, C., & Mann, A. (2018). Unfitting, uncomfortable, unacademic: A sociological reading of an interactive mobile phone app in university lectures. International Journal of Educational Technology in Higher Education , 15 , #43.

Durabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning , 27 (3), 216–227.

Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education , 4 (2), 215–235.

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education , 59 (3), 277–292.

Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in High Education , 17 , #25.

Fidalgo, P., Thormann, J., Kulyk, O., et al. (2020). Students’ perceptions on distance education: A multinational study. International Journal of Educational Technology in High Education , 17 , #18.

Flores, Ò., del-Arco, I., & Silva, P. (2016). The flipped classroom model at the university: Analysis based on professors’ and students’ assessment in the educational field. International Journal of Educational Technology in Higher Education , 13 , #21.

Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education , 1 , 113–127.

Gong, D., Yang, H. H., & Cai, J. (2020). Exploring the key influencing factors on college students’ computational thinking skills through flipped-classroom instruction. International Journal of Educational Technology in Higher Education , 17 , #19.

Gonzalez, C. (2009). Conceptions of, and approaches to, teaching online: A study of lecturers teaching postgraduate distance courses. Higher Education , 57 (3), 299–314.

Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business Education. International Review of Research in Open and Distance Learning , 7 (1), 1–18.

Green, S. B., & Salkind, N. J. (2003). Using SPSS: Analyzing and understanding data , (3rd ed., ). Upper Saddle River: Prentice Hall.

Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis: Pearson new international edition . Essex: Pearson Education Limited.

Harjoto, M. A. (2017). Blended versus face-to-face: Evidence from a graduate corporate finance class. Journal of Education for Business , 92 (3), 129–137.

Hong, K.-S. (2002). Relationships between students’ instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education , 5 , 267–281.

Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovation Higher Education , 40 , 305–316.

Inside Higher Education and Gallup. (2019). The 2019 survey of faculty attitudes on technology. Author .

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers and Education , 95 , 270–284.

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictor in a structural model. Computers & Education , 57 (2), 1654–1664.

Jung, I. (2011). The dimensions of e-learning quality: From the learner’s perspective. Educational Technology Research and Development , 59 (4), 445–464.

Kay, R., MacDonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education , 31 , 449–471.

Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education , 29 (1), 89–106.

Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education , 14 , #7.

Kuo, Y.-C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2013). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Higher Education , 20 , 35–50.

Lange, C., & Costley, J. (2020). Improving online video lectures: Learning challenges created by media. International Journal of Educational Technology in Higher Education , 17 , #16.

le Roux, I., & Nagel, L. (2018). Seeking the best blend for deep learning in a flipped classroom – Viewing student perceptions through the Community of Inquiry lens. International Journal of Educational Technology in High Education , 15 , #16.

Lee, H.-J., & Rha, I. (2009). Influence of structure and interaction on student achievement and satisfaction in web-based distance learning. Educational Technology & Society , 12 (4), 372–382.

Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal , 9 (2), 97–102.

Legon, R., & Garrett, R. (2019). CHLOE 3: Behind the numbers . Published online by Quality Matters and Eduventures. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-3-Report-2019-Behind-the-Numbers.pdf

Liaw, S.-S., & Huang, H.-M. (2013). Perceived satisfaction, perceived usefulness and interactive learning environments as predictors of self-regulation in e-learning environments. Computers & Education , 60 (1), 14–24.

Lu, F., & Lemonde, M. (2013). A comparison of online versus face-to-face students teaching delivery in statistics instruction for undergraduate health science students. Advances in Health Science Education , 18 , 963–973.

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1).

Macon, D. K. (2011). Student satisfaction with online courses versus traditional courses: A meta-analysis . Dissertation: Northcentral University, CA.

Mann, J., & Henneberry, S. (2012). What characteristics of college students influence their decisions to select online courses? Online Journal of Distance Learning Administration , 15 (5), 1–14.

Mansbach, J., & Austin, A. E. (2018). Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education , 43 (4), 257–272.

Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education , 29 (4), 531–563.

Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education , 37 , 52–65.

Maycock, K. W. (2019). Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning , 35 , 121–126.

McGivney-Burelle, J. (2013). Flipping Calculus. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies , 23 (5), 477–486.

Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior , 45 , 359–374.

Nair, S. S., Tay, L. Y., & Koh, J. H. L. (2013). Students’ motivation and teachers’ teaching practices towards the use of blogs for writing of online journals. Educational Media International , 50 (2), 108–119.

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching , 11 (2), 309–319.

Ni, A. Y. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education , 19 (2), 199–215.

Nouri, J. (2016). The flipped classroom: For active, effective and increased learning – Especially for low achievers. International Journal of Educational Technology in Higher Education , 13 , #33.

O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education , 68 (1), 1–14.

O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95.

Open & Distant Learning Quality Council (2012). ODLQC standards . England: Author https://www.odlqc.org.uk/odlqc-standards .

Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education , 32 , 47–57.

Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education , 19 , 27–35.

Paechter, M., Maier, B., & Macher, D. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 , 292–329.

Prinsloo, P. (2016). (re)considering distance education: Exploring its relevance, sustainability and value contribution. Distance Education , 37 (2), 139–145.

Quality Matters (2018). Specific review standards from the QM higher Education rubric , (6th ed., ). MD: MarylandOnline.

Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior , 71 , 402–417.

Rockhart, J. F., & Bullen, C. V. (1981). A primer on critical success factors . Cambridge: Center for Information Systems Research, Massachusetts Institute of Technology.

Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A review of the literature. The Journal of Distance Education / Revue de l'éducation à distance , 23 (1), 19–48. Athabasca University Press. Retrieved August 2, 2020 from https://www.learntechlib.org/p/105542/ .

Sebastianelli, R., Swift, C., & Tamimi, N. (2015). Factors affecting perceived learning, satisfaction, and quality in the online MBA: A structural equation modeling approach. Journal of Education for Business , 90 (6), 296–305.

Shen, D., Cho, M.-H., Tsai, C.-L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet and Higher Education , 19 , 10–17.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology , 59 (3), 623–664.

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education , 51 (1), 318–336.

Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education , 7 (1), 59–70.

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education , 50 (4), 1183–1202.

Takamine, K. (2017). Michelle D. Miller: Minds online: Teaching effectively with technology. Higher Education , 73 , 789–791.

Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education , 20 (1), 29.

Tucker, B. (2012). The flipped classroom. Education Next , 12 (1), 82–83.

Van Wart, M., Ni, A., Ready, D., Shayo, C., & Court, J. (2020). Factors leading to online learner satisfaction. Business Education Innovation Journal , 12 (1), 15–24.

Van Wart, M., Ni, A., Rose, L., McWeeney, T., & Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific Journal of Business Research , 10 (1), 1–22.

Ventura, A. C., & Moscoloni, N. (2015). Learning styles and disciplinary differences: A cross-sectional study of undergraduate students. International Journal of Learning and Teaching , 1 (2), 88–93.

Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education , 14 , #22.

Wang, Y., Huang, X., & Schunn, C. D. (2019). Redesigning flipped classrooms: A learning model and its effects on student perceptions. Higher Education , 78 , 711–728.

Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning , 21 (1), 15–35.

Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education , 85 (5), 633–659.

Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education , 20 (2), 65–77.

Zawacki-Richter, O., & Naidu, S. (2016). Mapping research trends from 35 years of publications in distance Education. Distance Education , 37 (3), 245–269.

Acknowledgements

No external funding.

Author information

Authors and affiliations.

Development for the JHB College of Business and Public Administration, 5500 University Parkway, San Bernardino, California, 92407, USA

Montgomery Van Wart, Anna Ni, Pamela Medina, Jesus Canelon, Melika Kordrostami, Jing Zhang & Yu Liu

Contributions

Equal. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Montgomery Van Wart .

Ethics declarations

Competing interests.

We have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article.

Van Wart, M., Ni, A., Medina, P. et al. Integrating students’ perspectives about online learning: a hierarchy of factors. Int J Educ Technol High Educ 17 , 53 (2020). https://doi.org/10.1186/s41239-020-00229-8

Received : 29 April 2020

Accepted : 30 July 2020

Published : 02 December 2020

DOI : https://doi.org/10.1186/s41239-020-00229-8


  • Online education
  • Online teaching
  • Student perceptions
  • Online quality
  • Student presence

ORIGINAL RESEARCH article

The impact of online classes on sleep, physical activity, and cognition functioning among physical education students.

Monoem Haddad

  • 1 Physical Education Department, College of Education, Qatar University, Doha, Qatar
  • 2 Department of Mathematics and Statistics, College of Arts and Sciences, Qatar University, Doha, Qatar

Introduction: Online education has become a crucial component of teachers’ professional development, and universities incorporate innovative pedagogical approaches to enhance teachers’ training. These approaches have proven invaluable, particularly during the COVID-19 pandemic. This study investigates the impact of online versus face-to-face learning environments on sleep quality, physical activity, and cognitive functioning among physical education students.

Methods: Utilizing a unique methodological approach that combines wrist actigraphy, the Pittsburgh Sleep Quality Index, and the Cambridge Neuropsychological Test Automated Battery, we provide a comprehensive assessment of these variables. Over 4 weeks, 19 male students participated in alternating online and face-to-face class formats.

Results: Our results reveal no significant differences in sleep quality or cognitive function between learning environments. However, notable findings include significant differences in Paired Associates Learning and weekday step counts in the face-to-face setting.

Discussion: These insights suggest that while online learning environments may not adversely affect sleep or cognitive functions, they could impact certain aspects of physical activity and specific cognitive tasks. These findings contribute to the nuanced understanding of online learning’s implications and can inform the design of educational strategies that promote student well-being.

1 Introduction

Online education has become an essential element of teachers’ professional development, and universities are responsible for enhancing their training by incorporating innovative pedagogical approaches that have proven invaluable under various circumstances, including the COVID-19 pandemic. This study was conducted at Qatar University, located in Doha, Qatar, providing a unique context for exploring the impacts of online versus face-to-face learning environments within the Arab region. Prioritizing the creation of engaging learning opportunities and cultivating unique educational settings using digital technologies has become a central focus of the higher education system. This approach aims to enable teachers and students with limited exposure to e-learning to adapt swiftly to current challenges ( Rapanta et al., 2020 ).

The challenges brought about by pandemic-enforced changes to teaching activities were multifaceted. They encompassed heightened anxiety, diminished face-to-face interactions, shifts away from conventional teaching techniques, the demand for indoor physical activities and its associated issues, and increased workload and stress stemming from novel working conditions (Aperribai et al., 2020). Additionally, Joshi et al. (2021) emphasize the unique challenges within the academic landscape in India that impede the enhancement of quality education in universities. These obstacles include limited funding for the procurement of advanced information technology (IT) equipment, inadequate computer skills training, external distractions and family responsibilities, technical issues, instructional and assessment difficulties, a pessimistic outlook, and lack of motivation. Although attending online classes from home is convenient, this method of teaching often involves prolonged exposure to screens on mobile devices, tablets, or laptops, so students may remain sedentary for extended periods. Prolonged exposure to artificial light from electronic devices can adversely affect human health (Khare et al., 2021).

In addition to these challenges, the increasing prevalence of nomophobia and problematic internet use among students has raised concerns about their potential impact on learning environments. Nomophobia, the fear of being without a mobile phone, has been linked to heightened anxiety and could interfere with students' learning processes and overall well-being. Relevant studies suggest a high prevalence of nomophobia among university students (Tuco et al., 2023), and moderate to severe rates have been observed in Arab countries, indicating its potential influence on students' academic engagement and performance (Jelleli et al., 2023). Similarly, problematic internet use, characterized by excessive and uncontrolled online activity, has been associated with adverse mental health outcomes and could affect students' ability to engage effectively in online learning. A meta-analytic review reported high rates of problematic internet use and its associations with mental health outcomes among students (Cai et al., 2023). The prevalence and risk factors of internet gaming disorder and problematic internet use have also garnered attention, especially during the COVID-19 pandemic, underscoring the need to understand these behaviors in the context of online education (Oka et al., 2021).

Given the significant role of sleep in determining cognitive and physical performance, its inclusion as a study variable is essential. Sleep quality and duration have profound effects on learning efficiency, memory consolidation, and overall student health, which are particularly pertinent when comparing learning environments such as online versus face-to-face settings (Curcio et al., 2006; Walker and Stickgold, 2006). These considerations support our focus on assessing how different learning modalities influence sleep patterns among physical education students (Alhola and Polo-Kantola, 2007; Killgore, 2010).

To address the unique impacts of the pandemic, our study contrasts online and face-to-face learning environments during this period, specifically incorporating an analysis of stress and anxiety as they relate to these educational settings. This consideration aims to enhance our understanding of how pandemic-related factors influence sleep, memory, and cognitive functions among physical education students. Our study examines both the theoretical and practical aspects of physical education, integrating components of physical training and theoretical instruction within both the online and face-to-face learning environments. This comprehensive approach allows us to evaluate the impact of these modalities on various aspects of student well-being, including sleep, physical activity, and cognitive functions. Tan et al. (2020) suggest that universities can view the pandemic as an opportunity for innovation and exploration of novel teaching methods in online distance learning.

Challenges in online education have been documented in Ghana, where students face difficulties owing to the limited computer and technical proficiency required for remote learning. In addition, parents struggle to assist their children in accessing and navigating online platforms, and Internet connectivity is often constrained. Consequently, the pandemic has adversely affected the quality of education and learning experiences. It is imperative to provide training to both students and teachers to ensure the efficient utilization of these online platforms ( Owusu-Fordjour et al., 2020 ). Over time, significant advancements have been made in the field of online teaching and learning. Despite the intricacies in transitioning to a fundamentally different teaching approach, our university serves as an exemplary institution that has successfully risen to the challenge of finding solutions to ensure the uninterrupted continuity of education during the pandemic ( Heider, 2021 ).

The impact of the pandemic on university staff and students has primarily manifested as reduced physical activity due to the suspension of in-person teaching and the closure of fitness facilities. Barkley et al. (2020) indicate that this decline was more pronounced among those with high levels of physical activity, whereas individuals with medium and low levels of physical activity expressed increased concern about maintaining their physical activity routines. By contrast, López-Valenciano et al. (2021) observe a decrease both in lower-intensity physical activities, such as walking, and in very intense/vigorous physical activity among students from various countries. Nevertheless, these students exhibited a consistent commitment to maintaining a minimum level of physical activity during the pandemic, especially if they were already active before the outbreak. Meza and López (2021) also report a decrease in overall physical activity engagement, but note that some individuals compensated by organizing physical activities at home. Concerning physical activity more specifically, Rodríguez-Larrad et al. (2021) study 13,754 Spanish students from 16 universities and find a decrease in both moderate and vigorous physical activity levels, along with an increase in sedentary behavior in over 50% of cases. However, some individuals sought to address these deficiencies by incorporating high-intensity activities and mind–body practices such as yoga, especially women, who excelled in managing physical activities and often used social networks for this purpose. It is also worth noting that during the pandemic men took more steps than women, even as the weekly distance covered declined noticeably (Wickersham et al., 2021).

The perceptions of safety measures and their impact on student lifestyles vary significantly from country to country. For example, in Denmark, a survey revealed that approximately 68% of students adhered to government-issued protective measures. Compliance was associated with older age, feelings of depression, and challenges stemming from the pandemic. Surprisingly, despite this adherence, approximately 60% of the Danish students reported no significant concerns about the pandemic ( Berg-Beckhoff et al., 2021 ). In Naples, Italy, the pandemic brought about substantial changes in student lifestyles, with men particularly affected by negative eating habits and a decline in physical activity due to quarantine measures ( Brancaccio et al., 2021 ). A German study finds a link between increased alcohol consumption, reduced physical activity, smoking, and cannabis use and pandemic-related depressive symptoms among students at four universities ( Busse et al., 2021 ). Similarly, research involving Swiss students conducted by Volken et al. (2021) reveals that more than one-fourth experienced symptoms of depression during the pandemic.

However, the impact of online teaching on sleep patterns, physical activity, and cognitive functioning among university students has not been studied extensively. Therefore, the current research aimed to explore the effects of online classes compared with traditional face-to-face classes, with a specific focus on students majoring in physical education. We hypothesized that: (1) there would be no significant difference in sleep quality between online and face-to-face learning environments; (2) physical activity levels would be higher in face-to-face learning environments than in online settings; and (3) cognitive function would remain consistent across both learning environments. These hypotheses were based on the assumption that, while the modality of learning might influence physical activity owing to inherent differences in the physical engagement required, the impact on sleep quality and cognitive function would be less pronounced.

2 Materials and methods

2.1 Study design and participants

This study used a within-subjects design to examine the influence of distinct learning environments (face-to-face and online) on students’ sleep patterns, physical activity levels, and cognitive function. The study spanned 4 weeks, consisting of 2 weeks of online classes followed by 2 weeks of face-to-face classes. Nineteen male students (age: 25 ± 3 years; weight: 79.36 ± 15.18 kg; height: 1.77 ± 0.09 m; BMI: 25.09 ± 3.99 kg/m²) willingly took part in this research. To determine the required sample size and ensure adequate statistical power for detecting meaningful effects, an a priori power analysis was conducted using G*Power software (Version 3.1.9.6). Based on preliminary studies and existing literature, a moderate expected effect size (Cohen’s d ) of 0.5 was assumed for the primary outcomes. The power analysis was set with an alpha level of 0.05 and a power of 80%, standard values for detecting true effects while controlling Type I and Type II error rates. The analysis suggested that a minimum of 17 participants would be necessary to adequately power the study; with 19 participants enrolled, the study exceeded the minimum required sample size. The study adhered to the principles outlined in the Declaration of Helsinki, and the research protocol was approved by our university’s institutional review board (QU-IRB 1467-EA/21) before subject recruitment and data collection. Before participating, all participants read and signed a written informed consent form that explained the potential risks and benefits of the study, detailed the research methods and procedures, and addressed data confidentiality. Participants were assured that they could withdraw from the study at any time without adverse consequences.
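As a cross-check on the sample-size calculation, the same a priori power analysis can be sketched in Python with `statsmodels` (an assumption on our part; the authors used G*Power). Note that the required n is sensitive to the choice of one- vs. two-sided testing and the effect-size convention for paired designs, which is why different tools and settings can report different minimums:

```python
# Sketch of an a priori power analysis for a paired (within-subjects) t-test,
# using statsmodels instead of G*Power. Inputs assumed from the text:
# Cohen's d = 0.5, alpha = 0.05, power = 0.80.
from math import ceil
from statsmodels.stats.power import TTestPower

analysis = TTestPower()  # one-sample / paired t-test power

n_two_sided = analysis.solve_power(effect_size=0.5, alpha=0.05,
                                   power=0.80, alternative="two-sided")
n_one_sided = analysis.solve_power(effect_size=0.5, alpha=0.05,
                                   power=0.80, alternative="larger")

# The required sample size differs by tails (and by how d is defined for
# paired data), so the exact minimum depends on settings not stated here.
print(ceil(n_two_sided), ceil(n_one_sided))
```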

2.2 Learning environments

2.2.1 Face-to-face learning environment

The face-to-face learning environment consisted of interactive lectures, group discussions, and practical sessions conducted in traditional classroom settings. The instructional content encompassed theoretical concepts in physical education, supplemented by practical exercises and demonstrations to enhance students’ understanding and application of the material. Qatar University established a standardized meeting pattern to ensure maximum use of the instructional week, provide students with greater registration options and flexibility, and better facilitate the scheduling of instructional facilities. On Monday and Wednesday, face-to-face sessions lasted 1 h and 15 min; on Sunday, Tuesday, and Thursday, each session lasted 50 min.

2.2.2 Online learning environment

The online learning environment utilized the university’s e-learning platform to deliver lectures, multimedia presentations, and virtual simulations. Asynchronous engagement activities, including online discussions and assignment submissions, were integrated to facilitate interaction and collaboration among students. The structure and duration of online sessions mirrored those of face-to-face instruction, ensuring consistency across learning environments.

2.3 Measurements

Sleep measurements included three components:

• Wrist actigraphy: The ActiGraph GT9X Link (Pensacola, FL, United States) was used to assess sleep efficiency (%), total sleep time (TST), and wake time after sleep onset (WASO). Extensive research has demonstrated the validity and reliability of the ActiGraph for these measurements ( Colbert et al., 2011 ; Sasaki et al., 2011 ; Hanggi et al., 2013 ; Anastasopoulou et al., 2014 ; Cellini et al., 2016 ).

• Daily self-rating scale of sleep quality: To gauge sleep quality, a daily self-rating scale originally proposed by Hooper and Mackinnon (1995) was used. Participants self-reported their sleep quality on a scale from 1 to 7, with flexibility in selecting fractional ratings (e.g., 2 or 3.5).

• Pittsburgh Sleep Quality Index (PSQI): The PSQI was administered during both the face-to-face and online class periods to assess sleep quality. The validated Arabic translation of the PSQI by Suleiman et al. (2010) was used in this study.
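The actigraphy summary metrics above have standard definitions. As a minimal illustration (not the ActiGraph scoring algorithm itself), sleep efficiency can be derived from time in bed, sleep onset latency, and WASO:

```python
def sleep_metrics(time_in_bed_min, sleep_onset_latency_min, waso_min):
    """Standard actigraphy summary metrics (illustrative, not device-specific).

    TST (min)            = time in bed - sleep onset latency - WASO
    Sleep efficiency (%) = TST / time in bed * 100
    """
    tst = time_in_bed_min - sleep_onset_latency_min - waso_min
    efficiency = 100.0 * tst / time_in_bed_min
    return tst, efficiency

# Example night: 480 min in bed, 20 min to fall asleep, 28 min awake after onset.
tst, eff = sleep_metrics(480, 20, 28)
print(tst, round(eff, 1))  # -> 432 90.0
```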

The Cambridge Neuropsychological Test Automated Battery (CANTAB) was used to precisely assess cognitive functioning. Widely acknowledged as the gold standard for cognitive assessment and data collection software ( De Bruin et al., 2017 ), the CANTAB tests were conducted using a computer equipped with a touch-sensitive screen. These tests were administered and feedback was provided in a standardized and consistent manner.

Specific cognitive domains were assessed using different components of the CANTAB:

• Rapid Visual Information Processing (RVPA): This test measured sustained attention.

• Spatial Working Memory (SWMBE): SWMBE assessed executive function.

• Paired associate learning (PAL): PAL was used to evaluate visual memory and new learning abilities.

In addition to these sleep and cognitive function measurements, students’ academic performance was evaluated based on their grade point average (GPA), as indicated by their transcripts. Physical activity was estimated using 3D pedometers (step counters), with mean weekday and weekend steps serving as the objective indicators. Together, these measures provide insight into students’ sleep quality, physical activity, and cognitive function.

2.4 Statistical analysis

A Wilcoxon signed-rank test was conducted to compare the effects of the face-to-face and online learning environments on the dependent variables. This non-parametric test was chosen to evaluate the differences within the same subject across different learning environments.
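A paired comparison of this kind can be reproduced with `scipy.stats.wilcoxon`, which operates on matched observations from the same subjects. The data below are synthetic and purely illustrative, not the study’s:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)

# Illustrative paired data for 19 subjects (not the study's data):
# e.g., weekday steps under face-to-face vs. online conditions.
face_to_face = rng.normal(6000, 1500, size=19)
online = face_to_face - rng.normal(300, 800, size=19)

# Within-subject (paired) comparison; scipy returns the signed-rank
# statistic (analogous to the V values reported below) and a two-sided p-value.
res = wilcoxon(face_to_face, online)
print(res.statistic, res.pvalue)
```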

Furthermore, building upon these initial findings, a linear mixed-effects model was then employed to analyze the data further, utilizing the “lme4” package in R. In this model, “Subject” was treated as a random effect to account for the repeated measures design, while GPA and Type (face-to-face vs. online) were treated as fixed effects.

A separate mixed-model was fitted to each dependent variable. These models provided estimates of fixed effects, including the intercept, GPA, and type of learning environment, along with standard errors, t -values, and p -values for each effect. The significance of the fixed effects was determined based on their p -values, with a threshold of p -value <0.05, which is typically considered statistically significant. The model outputs were used to assess the impact of the students’ GPA and the type of learning environment on each of the dependent variables, controlling for individual differences among the subjects.
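The lme4 specification described above corresponds to a formula such as `y ~ GPA + Type + (1 | Subject)`. An equivalent random-intercept model can be sketched in Python with `statsmodels` (shown here on synthetic data so the code is self-contained; the variable names mirror the article’s):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj = 19

# Synthetic long-format data: each subject measured in both environments.
df = pd.DataFrame({
    "Subject": np.repeat(np.arange(n_subj), 2),
    "Type": ["face_to_face", "online"] * n_subj,
    "GPA": np.repeat(rng.uniform(2.0, 4.0, n_subj), 2),
})
subj_effect = np.repeat(rng.normal(0, 1, n_subj), 2)  # random intercepts
df["PSQI"] = (5 + 0.1 * (df["Type"] == "online")
              + subj_effect + rng.normal(0, 0.5, 2 * n_subj))

# Random intercept for Subject; GPA and Type as fixed effects,
# mirroring lme4's  PSQI ~ GPA + Type + (1 | Subject).
model = smf.mixedlm("PSQI ~ GPA + Type", df, groups=df["Subject"])
fit = model.fit()
print(fit.summary())
```

The fixed-effect table from `fit.summary()` provides the estimate, SE, and p-value for the Type contrast, directly analogous to the values reported in the results tables below.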

3 Results

3.1 Descriptive statistics

The following table presents a comprehensive overview of various sleep- and activity-related variables across the two groups: face-to-face and online. The variables included PSQI Score, RVPA, SWMBE, PAL, Weekday Steps, Weekend Steps, sleep quality, sleep efficiency, TST, and WASO. The mean and standard deviation (SD) are provided for each variable, offering a detailed comparison between the two groups. Additionally, the table includes the combined statistics (overall) for all participants, irrespective of their group, thereby providing a holistic view of the data ( Table 1 ).

Table 1 . Comparison of sleep and activity-related variables between face-to-face and online groups: mean and standard deviation analysis with overall statistics.

3.2 Wilcoxon signed-rank test

The following table presents the results of the Wilcoxon signed-rank tests conducted to compare various dependent variables between the face-to-face and online learning environments. These variables include sleep quality, physical activity, and academic performance. The test statistics and p -values for each variable are reported. This analysis aimed to determine whether there were statistically significant differences in these outcomes based on the type of learning environment with a focus on paired comparisons within the same subjects across both environments ( Table 2 ).

Table 2 . Wilcoxon signed-rank test results for comparing dependent variables between face-to-face and online learning environments: sleep quality, physical activity, and academic performance measures.

In the analysis using Wilcoxon signed-rank tests, there were no statistically significant differences between the face-to-face and online learning environments across the range of dependent variables. Specifically, the PSQI Score ( V = 50.5, p = 0.6051), RVPA ( V = 120, p = 0.3242), and SWMBE ( V = 74, p = 0.6315) yielded p -values well above the conventional significance threshold of 0.05, indicating no meaningful differences in these aspects between the two learning modalities. Similarly, variables related to physical activity and sleep, including PAL ( V = 41, p = 0.09741), weekday and weekend steps ( V = 55, p = 0.1119 and V = 115, p = 0.4326, respectively), sleep quality ( V = 46.5, p = 0.2774), sleep efficiency ( V = 103, p = 0.7628), total sleep time (TST; V = 103.5, p = 0.7475), and wake time after sleep onset (WASO; V = 73.5, p = 0.3978), all yielded p -values that did not reach statistical significance. These results suggest no substantial differences in these variables between students in the face-to-face and online learning settings.

3.3 Linear mixed-effects model

This study was designed to investigate how different learning environments, specifically face-to-face and online settings, affect students’ sleep patterns and physical activity. This investigation employed linear mixed-effects models to evaluate the influence of learning environment type on an array of variables, including sleep quality, physical activity, and other related factors. Additionally, the students’ GPA was incorporated as an independent covariate to account for academic performance, which may interact with their sleep and activity patterns ( Table 3 ).

Table 3 . Linear mixed-effects model analysis with GPA as covariate.

3.4 Pittsburgh sleep quality index (PSQI) score

In the PSQI Score analysis, the results indicated no significant effects related to the type of learning environment (online), with an estimated effect of 0.053 ( SE  = 0.346, t  = 0.152, p  = 0.880). Furthermore, the intercept was not statistically significant (estimate = 4.981, SE  = 5.657, t  = 0.881, p  = 0.391), suggesting no substantial baseline differences in sleep quality. The GPA factor did not significantly influence the PSQI Score (estimate = 0.354, SE  = 1.728, t  = 0.205, p  = 0.840).

3.5 Rapid visual information processing (RVPA)

For the RVPA, there was no significant variation between the learning environments (online: estimate = −0.004, SE = 0.016, t = −0.248, p = 0.805). However, the intercept was statistically significant (estimate = 0.943, SE = 0.126, t = 7.507, p < 0.001), indicating a notable baseline RVPA score. As a covariate, GPA was not found to significantly impact RVPA (estimate = −0.006, SE = 0.038, t = −0.160, p = 0.874).

3.6 Spatial working memory (SWMBE)

In the SWMBE analysis, the type of learning environment (online) did not present significant differences (estimate = 0.974, SE = 1.594, t = 0.611, p = 0.544), nor was the intercept statistically significant (estimate = 10.801, SE = 11.541, t = 0.936, p = 0.362). Similarly, GPA did not significantly affect SWMBE scores (estimate = −1.203, SE = 3.519, t = −0.342, p = 0.737).

3.7 PALTEA (PAL) %

Regarding PAL, a significant difference was observed between learning environments (online: estimate = 9.105, SE = 4.487, t = 2.029, p = 0.047), and the intercept also showed statistical significance (estimate = 126.519, SE = 27.619, t = 4.581, p < 0.001). This suggests both a substantial baseline level and an effect of online learning on PAL. GPA was found to have a nearly significant influence on PAL (estimate = −17.280, SE = 8.413, t = −2.054, p = 0.056).

3.8 Mean weekday and weekend steps

The analysis of mean weekday steps revealed a significant effect of learning environment type (online: estimate = 750.630, SE = 374.150, t = 2.006, p = 0.050), but the intercept was not significant (estimate = 5065.490, SE = 3550.890, t = 1.427, p = 0.172). GPA did not show a significant effect (estimate = 520.000, SE = 1083.670, t = 0.480, p = 0.638). For weekend steps, neither the type of learning environment (online: estimate = −383.590, SE = 529.630, t = −0.724, p = 0.472) nor GPA (estimate = 710.970, SE = 941.790, t = 0.755, p = 0.461) showed significant effects, and the intercept was also not significant (estimate = 4566.320, SE = 3093.060, t = 1.476, p = 0.158).

3.9 Self-rating scale of sleep quality

The self-rating scale of sleep quality did not exhibit significant differences for the type of learning environment (online: estimate = 0.139, SE = 0.117, t = 1.186, p = 0.241) or GPA (estimate = −0.165, SE = 0.445, t = −0.371, p = 0.716). However, the intercept was nearly significant (estimate = 3.044, SE = 1.458, t = 2.088, p = 0.052), suggesting a trend toward a baseline effect on sleep quality ratings.

3.10 Sleep efficiency %

In terms of sleep efficiency, no significant effects were observed for the type of learning environment (online: estimate = −0.132, SE = 0.517, t = −0.254, p = 0.800) or GPA (estimate = 0.712, SE = 0.910, t = 0.782, p = 0.445). However, the intercept was highly significant (estimate = 92.018, SE = 2.989, t = 30.786, p < 0.001), indicating pronounced baseline sleep efficiency.

3.11 Total sleep time (TST) and wake time after sleep onset (WASO)

For TST, no significant differences were found between the learning environments (online: estimate = −18.380, SE = 31.900, t = −0.576, p = 0.567) or due to GPA (estimate = −136.570, SE = 68.570, t = −1.992, p = 0.063), although the intercept was significant (estimate = 937.270, SE = 224.930, t = 4.167, p < 0.001). In the case of WASO, neither the type of learning environment (online: estimate = 0.842, SE = 1.553, t = 0.542, p = 0.590) nor GPA (estimate = −1.631, SE = 4.556, t = −0.358, p = 0.725) demonstrated significant effects, and the intercept was not significant (estimate = 26.819, SE = 14.928, t = 1.797, p = 0.090).

4 Discussion

The main objective of this study was to analyze how different learning environments (face-to-face vs. online) affect various aspects of students’ sleep patterns, physical activity, and cognitive function. Additionally, this study aimed to assess the influence of students’ GPA and the type of learning environment on each of these dependent variables while controlling for individual differences among subjects.

A comprehensive analysis of various sleep- and activity-related factors revealed a prevailing trend characterized by a lack of substantial differences between the different learning environments (face-to-face vs. online). Furthermore, no statistically significant variations were observed in the PSQI Score, RVPA, SWMBE, or self-assessment of sleep quality according to the type of learning environment. Additionally, it is worth noting that in several models, the intercepts were significant, emphasizing the importance of baseline levels for these variables.

The absence of a significant effect of teaching environment on sleep suggests that students were able to obtain the necessary amount of sleep during online classes. It is important to emphasize that our university maintained consistent lecture schedules during the day, even when transitioning to online classes with recorded sessions. This strategic decision aimed to mitigate the negative effect of using electronic devices late at night on sleep quality. Although it is evident that the COVID-19 pandemic initially had a detrimental impact on sleep quality, it is worth noting that over the extended period of the pandemic, people gradually regained stability in their daily routines. This may explain why online classes did not have a significant effect on sleep patterns in this study. As individuals adapted to the challenges posed by the pandemic, they were likely to develop strategies to prioritize their sleep while continuing their online education ( Walker and Stickgold, 2006 ; Alhola and Polo-Kantola, 2007 ).

The absence of pronounced effects on sustained attention and executive function in online classes found in this study can be attributed to a combination of individual differences, adaptation over time, effective teaching practices, and student engagement. This highlights the notion that students’ ability to maintain attention and executive function is influenced by various factors and that the learning environment is just one element in a complex interplay of influences ( Ratey and Loehr, 2011 ; Diamond, 2015 ). To clarify, the participants in this study were engaged in both theoretical and practical aspects of physical education. The face-to-face learning environment included interactive lectures, group discussions, and practical sessions that involved physical activity. Conversely, the online sessions primarily focused on theoretical knowledge dissemination through multimedia presentations and virtual simulations. This distinction is crucial as it directly influences the engagement levels and physical activity of the students, potentially impacting their sleep patterns and cognitive functions. It is essential to note that these results may be specific to our study group, because students’ learning preferences and experiences can vary significantly. Some individuals thrive in the structured and socially interactive environments of in-person classes, whereas others find online learning more flexible and less distracting. In a meta-analysis conducted by Pashler et al. (2008) , weak or no evidence was found to support the effectiveness of learning-style interventions, indicating that individual differences in learning styles may overshadow any overall effect of online learning on sustained attention and executive functioning. It is worth mentioning that the participants in our study had prior experience with online learning environments during the COVID-19 pandemic. 
This prior experience may have equipped them with better skills in managing distractions and maintaining focus, whereas those new to online learning may have faced initial challenges. A review by Means et al. (2013) acknowledges the potential facilitative role of prior experience in online learning but emphasizes the need for further research to better understand the specific mechanisms at play and their interactions with other factors. In addition, the quality of online course design and delivery plays a significant role in student engagement and attention. Well-designed courses that incorporate interactive activities, provide clear expectations, and offer regular feedback may enhance focus, whereas poorly designed courses could lead to passive learning and increased distraction ( Clark and Mayer, 2023 ).

By contrast, when examining PAL and weekday steps, the present study identified significant effects specifically associated with the face-to-face learning environment, highlighting a notable impact in these areas.

The findings of this study align closely with similar research conducted globally on online learning. While the number of studies conducted specifically within physical education and sports departments is relatively limited, there is a broader body of research on online education across various disciplines. Analyzing these broader studies can shed light on both shared and unique aspects of physical education at universities.

The challenges associated with physical activities are identified in a study conducted by Rodríguez-Larrad et al. (2021) involving 13,754 students from 16 Spanish universities. This study reveals a significant decline in both moderate and vigorous physical activity, with an increase in sedentary behavior affecting more than 50% of the participants. Another study conducted by Wickersham et al. (2021) highlights a noticeable decline in the number of weekly steps taken, reflecting decreased distances traveled during off-campus classes. In the present study, the linear mixed-effects model of mean weekday steps reveals a significant effect of learning environment type. This finding is consistent with previous studies that reported decreases in physical activity among student populations during off-campus classes ( Gallo et al., 2020 ; Savage et al., 2020 ; Luciano et al., 2021 ; Mocanu et al., 2021 ; Rodríguez-Larrad et al., 2021 ). However, it is worth noting that during the COVID-19 pandemic, Wickersham et al. (2021) find an increase in the number of steps taken from the onset of lockdown and throughout the subsequent easing of restrictions. This likely reflects growing certainty and clarity surrounding the imposed restrictions, coupled with the gradual relaxation of rules, which allowed for more outdoor activities and social interactions ( Wickersham et al., 2021 ). It is important to mention that during our study, no restrictions were imposed on conducting online classes, which may explain the non-significant effect of the type of learning environment on weekend steps between the online and face-to-face classes.

Physical activity may not be the sole factor that affects online classes. Therefore, it is important to consider their impact on cognitive and executive functions, which are known to contribute to academic success. In our study, the linear mixed-effects model identifies a significant difference in PAL between the learning environments. Specifically, notable variations in visual memory and learning abilities were observed when the PAL was used to assess these variables. With the utilization of CANTAB, the outcome measures encompass various factors, including participant errors, number of trials needed to correctly identify patterns, memory scores, and completed stages. The effective storage of information, whether temporary or retrieved, necessitates proficient encoding, manipulation, and retrieval abilities. Consequently, as recommended by Varela-Aldás et al. (2023) , the implementation of memory training programs becomes imperative when these deficits are identified. Therefore, it is advisable to integrate memory training programs into online classes.

In our study, when the GPA was included as a covariate, it did not consistently demonstrate a significant influence on various outcomes. Consequently, these findings suggest that academic performance may not be a decisive factor affecting the impact of online classes on sleep, physical activity, or cognitive functioning among physical education students. Further research is required to delve deeper into the intricate relationships among academic performance, online learning, and well-being, considering diverse populations and various influencing factors.

To further enhance the efficacy of online learning environments, educators could integrate structured physical activities and cognitive exercises into the curriculum. Such integration not only counters the sedentary nature of online learning but could also improve cognitive functions, as physical activity is known to have a positive impact on brain health and cognitive performance ( Ratey and Loehr, 2011 ). Specifically, incorporating short physical activity breaks can help maintain students’ physical health and potentially enhance their cognitive engagement and learning outcomes. Moreover, embedding cognitive exercises and interactive learning activities can provide the dual benefits of enhancing cognitive function and mitigating the potential distractions associated with online learning environments ( Diamond, 2015 ). Ultimately, this study underscores the importance of considering various factors, including physical activity and cognitive engagement, in the design and delivery of online learning. As educators continue to navigate the challenges and opportunities presented by online and hybrid learning modalities, incorporating evidence-based strategies that promote students’ physical and cognitive well-being will be crucial to fostering effective and holistic educational experiences ( Donnelly and Lambourne, 2011 ). Furthermore, future research should explore the long-term effects of different learning environments, considering both extended study periods and follow-up assessments, as highlighted by Vaalayi et al. (2023) and Rassolnia and Nobari (2024) . The study by Vaalayi et al. (2023) demonstrated that low-intensity aerobic exercise can attenuate the negative effects of partial sleep deprivation on cognitive performance, suggesting that integrating physical activities in online learning could mitigate some of the cognitive challenges posed by reduced physical activity. 
Similarly, Rassolnia and Nobari (2024) found that socio-economic status and physical activity significantly influence psychological well-being and sleep quality among college students, highlighting the importance of promoting physical activity to improve overall student health and academic performance.

5 Conclusion

A comprehensive analysis of various sleep- and activity-related variables revealed a pattern of mostly non-significant differences between the learning environments (face-to-face vs. online). Specifically, for the PSQI Score, RVPA, SWMBE, and the self-rating scale of sleep quality, no significant variations were found in relation to the type of learning environment. However, in the case of PAL and weekday steps, significant effects were observed in the face-to-face learning environment, indicating a significant impact on these areas. Additionally, the intercepts in several models were significant, highlighting the importance of the baseline levels of these variables. The GPA, included as a covariate, did not demonstrate a consistently significant influence on the outcomes. These findings suggest that while the learning environment type notably influences certain activity metrics, such as PAL and weekday steps, its overall impact on sleep quality and other activity metrics is generally limited. To deepen our understanding of these phenomena, future research should explore the long-term effects of online learning on critical health and cognitive domains. Investigating potential interventions to enhance physical activity and cognitive function in online learning environments is crucial for developing more holistic educational models. Additionally, examining the impacts of various learning modalities on students’ well-being and academic performance can offer valuable insights for educators and policymakers aiming to optimize educational practices in a digital landscape.

While our study provides significant insights into the impacts of learning environments on physical education students, it is imperative to consider the broader applicability of these findings. Further investigation is needed to determine how these results translate to students in other academic disciplines or educational settings, which would enhance the generalizability and relevance of our conclusions. Moreover, the focus of this study on male students raises questions about the generalizability of our findings across genders. Future research should include a more diverse participant pool, encompassing both male and female students, to explore potential gender differences in how learning environments affect sleep, physical activity, and cognitive function. Such inclusive research efforts will contribute to a more comprehensive understanding of the educational landscape, supporting the development of strategies that cater to a wider array of student needs and preferences.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

The studies involving humans were approved by Qatar University’s Institutional Review Board. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.

Author contributions

MH: Writing – review & editing, Writing – original draft, Visualization, Validation, Supervision, Software, Resources, Project administration, Methodology, Investigation, Funding acquisition, Formal analysis, Data curation, Conceptualization. ZA: Writing – review & editing, Writing – original draft, Visualization, Validation, Supervision, Software, Resources, Project administration, Methodology, Investigation, Data curation, Conceptualization. A-SA-S: Writing – review & editing, Writing – original draft, Visualization, Validation, Supervision, Methodology, Investigation, Funding acquisition, Formal analysis, Data curation, Conceptualization.

Funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This work was supported by Qatar University under Collaborative Grants QUCG-CED-21-22-1 and QUCG-CED-24/25-495.

Acknowledgments

The authors would like to thank all the participants for taking part in this study.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

The author(s) declared that they were an editorial board member of Frontiers, at the time of submission. This had no impact on the peer review process and the final decision.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Keywords: wrist actigraphy, cognitive assessment, learning environment, step counter, university students

Citation: Haddad M, Abbes Z and Abdel-Salam A-SG (2024) The impact of online classes on sleep, physical activity, and cognition functioning among physical education students. Front. Psychol . 15:1397588. doi: 10.3389/fpsyg.2024.1397588

Received: 08 March 2024; Accepted: 20 May 2024; Published: 30 May 2024.

Copyright © 2024 Haddad, Abbes and Abdel-Salam. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Monoem Haddad, [email protected]

  • Open access
  • Published: 16 September 2021

Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study

  • Meixun Zheng,
  • Daniel Bender &
  • Cindy Lyon

BMC Medical Education volume 21, Article number: 495 (2021)

The COVID-19 pandemic forced dental schools to close their campuses and move didactic instruction online. The abrupt transition to online learning, however, has raised several issues that have not been resolved. While several studies have investigated dental students’ attitude towards online learning during the pandemic, mixed results have been reported. Additionally, little research has been conducted to identify and understand factors, especially pedagogical factors, that impacted students’ acceptance of online learning during campus closure. Furthermore, how online learning during the pandemic impacted students’ learning performance has not been empirically investigated. In March 2020, the dental school studied here moved didactic instruction online in response to government-issued stay-at-home orders. This first-of-its-kind comparative study examined students’ perceived effectiveness of online courses during summer quarter 2020, explored pedagogical factors impacting their acceptance of online courses, and empirically evaluated the impact of online learning on students’ course performance during the pandemic.

The study employed a quasi-experimental design. Participants were 482 pre-doctoral students in a U.S. dental school. Students’ perceived effectiveness of online courses during the pandemic was assessed with a survey. Students’ course grades for online courses during summer quarter 2020 were compared with those of a control group who received face-to-face instruction for the same courses before the pandemic in summer quarter 2019.

Survey results revealed that most online courses were well accepted by the students, and 80 % of them wanted to continue with some online instruction post pandemic. Regression analyses revealed that students’ perceived engagement with faculty and classmates predicted their perceived effectiveness of the online course. More notably, Chi-square tests demonstrated that in 16 out of the 17 courses compared, the online cohort during summer quarter 2020 was equally or more likely to get an A course grade than the analogous face-to-face cohort during summer quarter 2019.

Conclusions

This is the first empirical study in dental education to demonstrate that online courses during the pandemic could achieve equivalent or better student course performance than the same pre-pandemic in-person courses. The findings fill in gaps in literature and may inform online learning design moving forward.

Introduction

Research across disciplines has demonstrated that well-designed online learning can lead to students’ enhanced motivation, satisfaction, and learning [ 1 , 2 , 3 , 4 , 5 , 6 , 7 ]. A report by the U.S. Department of Education [ 8 ], based on examinations of comparative studies of online and face-to-face versions of the same course from 1996 to 2008, concluded that online learning could produce learning outcomes equivalent to or better than face-to-face learning. The more recent systematic review by Pei and Wu [ 9 ] provided additional evidence that online learning is at least as effective as face-to-face learning for undergraduate medical students.

To take advantage of the opportunities presented by online learning, thought leaders in dental education in the U.S. have advocated for the adoption of online learning in the nation’s dental schools [ 10 , 11 , 12 ]. However, digital innovation has been a slow process in academic dentistry [ 13 , 14 , 15 ]. In March 2020, the COVID-19 pandemic brought unprecedented disruption to dental education by necessitating the need for online learning. In accordance with stay-at-home orders to prevent the spread of the virus, dental schools around the world closed their campuses and moved didactic instruction online.

The abrupt transition to online learning, however, has raised several concerns and questions. First, while several studies have examined dental students’ online learning satisfaction during the pandemic, mixed results have been reported. Some studies have reported students’ positive attitude towards online learning [ 15 , 16 , 17 , 18 , 19 , 20 ]. Sadid-Zadeh et al. [ 18 ] found that 99 % of the surveyed dental students at the University of Buffalo, in the U.S., were satisfied with live web-based lectures during the pandemic. Schlenz et al. [ 15 ] reported that students in a German dental school had a favorable attitude towards online learning and wanted to continue with online instruction in their future curriculum. Other studies, however, have reported students’ negative online learning experience during the pandemic [ 21 , 22 , 23 , 24 , 25 , 26 ]. For instance, dental students at Harvard University felt that learning during the pandemic had worsened and engagement had decreased [ 23 , 24 ]. In a study with medical and dental students in Pakistan, Abbasi et al. [ 21 ] found that 77 % of the students had negative perceptions about online learning and 84 % reported reduced student-instructor interactions.

In addition to these mixed results, little attention has been given to factors affecting students’ acceptance of online learning during the pandemic. With the likelihood that online learning will persist post pandemic [ 27 ], research in this area is warranted to inform online course design moving forward. In particular, prior research has demonstrated that one of the most important factors influencing students’ performance in any learning environment is a sense of belonging, the feeling of being connected with and supported by the instructor and classmates [ 28 , 29 , 30 , 31 ]. Unfortunately, this aspect of the classroom experience has suffered during school closure. While educational events can be held using a video conferencing system, virtual peer interaction on such platforms has been perceived by medical trainees to be not as easy and personal as physical interaction [ 32 ]. The pandemic highlights the need to examine instructional strategies most suited to the current situation to support students’ engagement with faculty and classmates.

Furthermore, there is considerable concern from the academic community about the quality of online learning. Pre-pandemic, some faculty and students were already skeptical about the value of online learning [ 33 ]. The longer the pandemic lasts, the more they may question the value of online education, asking: Can online learning during the pandemic produce learning outcomes that are similar to face-to-face learning before the pandemic? Despite the documented benefits of online learning prior to the pandemic, the actual impact of online learning during the pandemic on students’ academic performance is still unknown due to reasons outlined below.

On one hand, several factors beyond the technology used could influence the effectiveness of online learning, one of which is the teaching context [ 34 ]. The sudden transition to online learning has posed many challenges to faculty and students. Faculty may not have had adequate time to carefully design online courses to take full advantage of the possibilities of the online format. Some faculty may not have had prior online teaching experience and faced a steeper learning curve when it came to adopting online teaching methods [ 35 ]. Students may have been at risk of increased anxiety due to concerns about contracting the virus, on-time graduation, finances, and employment [ 36 , 37 ], which may have negatively impacted learning performance [ 38 ]. Therefore, whether online learning during the pandemic could produce learning outcomes similar to those of online learning implemented during more normal times remains to be determined.

Most existing studies on online learning in dental education during the pandemic have only reported students’ satisfaction. The actual impact of the online format on academic performance has not been empirically investigated. The few studies that have examined students’ learning outcomes have only used students’ self-reported data from surveys and focus groups. According to Kaczmarek et al. [ 24 ], 50 % of the participating dental faculty at Harvard University perceived student learning to have worsened during the pandemic and 70 % of the students felt the same. Abbasi et al. [ 21 ] reported that 86 % of medical and dental students in a Pakistan college felt that they learned less online. While student opinions are important, research has demonstrated a poor correlation between students’ perceived learning and actual learning gains [ 39 ]. As we continue to navigate the “new normal” in teaching, students’ learning performance needs to be empirically evaluated to help institutions gauge the impact of this grand online learning experiment.

Research purposes

In March 2020, the University of the Pacific Arthur A. Dugoni School of Dentistry, in the U.S., moved didactic instruction online to ensure the continuity of education during building closure. This study examined students’ acceptance of online learning during the pandemic and the factors impacting it, focusing on instructional practices pertaining to students’ engagement/interaction with faculty and classmates. Another purpose of this study was to empirically evaluate the impact of online learning during the pandemic on students’ actual course performance by comparing it with that of a pre-pandemic cohort. To understand the broader impact of the institution-wide online learning effort, we examined all online courses offered in summer quarter 2020 (July to September) that had a didactic component.

This is the first empirical study in dental education to evaluate students’ learning performance during the pandemic. The study aimed to answer the following three questions.

How well was online learning accepted by students during the summer quarter 2020 pandemic interruption?

How did instructional strategies, centered around students’ engagement with faculty and classmates, impact their acceptance of online learning?

How did online learning during summer quarter 2020 impact students’ course performance as compared with a previous analogous cohort who received face-to-face instruction in summer quarter 2019?

This study employed a quasi-experimental design. The study was approved by the university’s institutional review board (#2020-68).

Study context and participants

The study was conducted at the Arthur A. Dugoni School of Dentistry, University of the Pacific. The program runs on a quarter system. It offers a 3-year accelerated Doctor of Dental Surgery (DDS) program and a 2-year International Dental Studies (IDS) program for international dentists who have obtained a doctoral degree in dentistry from a country outside the U.S. and want to practice in the U.S. Students advance throughout the program in cohorts. IDS students take some courses together with their DDS peers. All three DDS classes (D1/DDS 2023, D2/DDS 2022, and D3/DDS 2021) and both IDS classes (I1/IDS 2022 and I2/IDS 2021) were invited to participate in the study. The number of students in each class was: D1 = 145, D2 = 143, D3 = 143, I1 = 26, and I2 = 25. This resulted in a total of 482 student participants.

During campus closure, faculty delivered remote instruction in various ways, including live online classes via Zoom® [ 40 ], self-paced online modules on the school’s learning management system Canvas® [ 41 ], or a combination of live and self-paced delivery. For self-paced modules, students studied assigned readings and/or viewings such as videos and pre-recorded slide presentations. Some faculty also developed self-paced online lessons with SoftChalk® [ 42 ], a cloud-based platform that supports the inclusion of gamified learning by insertion of various mini learning activities. The SoftChalk lessons were integrated with Canvas® [ 41 ] and faculty could monitor students’ progress. After students completed the pre-assigned online materials, some faculty held virtual office hours or live online discussion sessions for students to ask questions and discuss key concepts.

Data collection and analysis

Student survey.

Students’ perceived effectiveness of summer quarter 2020 online courses was evaluated by the school’s Office of Academic Affairs in lieu of the regular course evaluation process. A total of 19 courses for DDS students and 10 courses for IDS students were evaluated. An 8-question survey developed by the researchers (Additional file 1 ) was administered online in the last week of summer quarter 2020. Course directors invited students to take the survey during live online classes. The survey introduction stated that taking the survey was voluntary and that their anonymous responses would be reported in aggregated form for research purposes. Students were invited to continue with the survey if they chose to participate; otherwise, they could exit the survey. The number of students in each class who took the survey was as follows: D1 ( n  = 142; 98 %), D2 ( n  = 133; 93 %), D3 ( n  = 61; 43 %), I1 ( n  = 23; 88 %), and I2 ( n  = 20; 80 %). This resulted in a total of 379 (79 %) respondents across all classes.

The survey questions were on a 4-point scale: Strongly Disagree (1 point), Disagree (2 points), Agree (3 points), and Strongly Agree (4 points). Students were asked to rate each online course by responding to four statements: “ I could fully engage with the instructor and classmates in this course ”; “ The online format of this course supported my learning ”; “ Overall, this online course is effective ”; and “ I would have preferred face-to-face instruction for this course ”. For the first three survey questions, a higher mean score indicated a more positive attitude toward the online course. For the fourth question, a higher mean score indicated that more students would have preferred face-to-face instruction for the course. Two additional survey questions asked students to select their preferred online delivery method for fully online courses during the pandemic from three given choices (synchronous online/live, asynchronous online/self-paced, and a combination of both), and to report whether they wanted to continue with some online instruction post pandemic. Finally, two open-ended questions at the end of the survey allowed students to comment on the aspects of the online format that they found helpful and to provide suggestions for improvement. For the purpose of this study, we focused on the quantitative data from the Likert-scale questions.
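As an illustration of this scoring scheme, the toy snippet below maps responses onto the 1–4 scale and computes a course's mean score. The responses and the helper name `mean_score` are our own invented illustration, not data or code from the study.

```python
# Hypothetical sketch of the 4-point Likert scoring described above.
SCALE = {"Strongly Disagree": 1, "Disagree": 2, "Agree": 3, "Strongly Agree": 4}

def mean_score(responses):
    """Mean Likert score for one survey question in one course."""
    points = [SCALE[r] for r in responses]
    return sum(points) / len(points)

# Invented responses for one course (not the study's data):
responses = ["Agree", "Strongly Agree", "Agree", "Disagree", "Strongly Agree"]
print(round(mean_score(responses), 2))  # → 3.2
```

A mean approaching or exceeding 3 on this scale would correspond to the "generally well accepted" reading used in the results.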

Descriptive data such as the mean scores were reported for each course. Regression analyses were conducted to examine the relationship between instructional strategies focusing on students’ engagement with faculty and classmates, and their overall perceived effectiveness of the online course. The independent variable was student responses to the question “ I could fully engage with the instructor and classmates in this course ”, and the dependent variable was their answer to the question “ Overall, this online course is effective .”
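The per-course regression described above can be sketched as a simple one-predictor ordinary least squares fit. The Likert pairs below are hypothetical illustrations, not the study's data, and the helper name `simple_ols` is our own.

```python
# Sketch of the per-course regression: engagement ratings (independent
# variable) predicting overall effectiveness ratings (dependent variable).

def simple_ols(x, y):
    """One-predictor OLS; returns slope, intercept, and r^2 (effect size)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_tot = sum((yi - my) ** 2 for yi in y)
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

# Invented 4-point responses for one course (not the study's data):
engagement    = [1, 2, 2, 3, 3, 3, 4, 4]  # "I could fully engage..."
effectiveness = [2, 2, 3, 3, 3, 4, 4, 4]  # "Overall, this online course is effective."

slope, intercept, r2 = simple_ols(engagement, effectiveness)
print(f"slope={slope:.2f}, r^2={r2:.2f}")  # → slope=0.70, r^2=0.75
```

A positive slope with a moderate-to-high r² corresponds to the pattern the study reports: courses where students felt more engaged were rated as more effective.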

Student course grades

Using Chi-square tests, student course grade distributions (A, B, C, D, and F) for summer quarter 2020 online courses were compared with that of a previous cohort who received face-to-face instruction for the same course in summer quarter 2019. Note that as a result of the school’s pre-doctoral curriculum redesign implemented in July 2019, not all courses offered in summer quarter 2020 were offered in the previous year in summer quarter 2019. In other words, some of the courses offered in summer quarter 2020 were new courses offered for the first time. Because these new courses did not have a previous face-to-face version to compare to, they were excluded from data analysis. For some other courses, while course content remained the same between 2019 and 2020, the sequence of course topics within the course had changed. These courses were also excluded from data analysis.

After excluding the aforementioned courses, a total of 17 “comparable” courses remained and were included in data analysis (see the subsequent section). For these courses, the instructor, course content, and course goals were the same in both 2019 and 2020. The assessment methods and grading policies also remained the same through both years. For exams and quizzes, multiple-choice questions were the dominant format in both years. While some exam questions in 2020 differed from those in 2019, faculty reported that the overall exam difficulty level was similar. The main difference in assessment was testing conditions. The 2019 cohort took computer-based exams in the physical classroom with faculty proctoring, and the 2020 cohort took exams at home with remote proctoring to ensure exam integrity. The remote proctoring software monitored each student during the exam through a web camera on their computer/laptop, and the recorded video flagged suspicious activities for faculty review after exam completion.
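A grade-distribution comparison of this kind can be sketched as a Pearson chi-square statistic over a cohort-by-grade contingency table. The counts below are invented for illustration, not the school's actual grade data, and the helper name `chi_square_stat` is our own.

```python
# Sketch of the cohort comparison: Pearson chi-square statistic for an
# observed contingency table (2019 face-to-face vs. 2020 online cohorts).

def chi_square_stat(table):
    """Pearson chi-square statistic for a list-of-rows contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Invented A vs. non-A counts for one course (not the study's data):
table = [
    [60, 80],  # 2019 face-to-face cohort: A, non-A
    [75, 65],  # 2020 online cohort: A, non-A
]

stat = chi_square_stat(table)
# df = (rows-1)*(cols-1) = 1; the 0.05 critical value for df=1 is 3.841,
# so a statistic below 3.841 means the cohorts were "equally likely" to earn an A.
print(f"chi-square = {stat:.3f}")  # → chi-square = 3.218
```

Collapsing the full A–F distribution into finer columns only changes the degrees of freedom and critical value; the computation is the same.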

Students’ perceived effectiveness of online learning

Table  1 summarizes data on DDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “ Overall, this online course is effective ”, the majority of courses received a mean score approaching or exceeding 3 points on the 4-point scale, suggesting that online learning was generally well accepted by students. Despite overall positive online course experiences, for many of the courses examined, there was an equal split in student responses to the question “ I would have preferred face-to-face instruction for this course. ” Additionally, for students’ preferred online delivery method for fully online courses, about half of the students in each class preferred a combination of synchronous and asynchronous online learning (see Fig.  1 ). Finally, the majority of students wanted faculty to continue with some online instruction post pandemic: D1 class (110; 78.60 %), D2 class (104; 80 %), and D3 class (49; 83.10 %).

While most online courses received favorable ratings, some variations did exist among courses. For D1 courses, “ Anatomy & Histology ” received lower ratings than the others. This could be explained by its lab component, which did not lend itself as well to the online format. Among D2 courses, several received lower ratings than others, especially for the survey question on students’ perceived engagement with faculty and classmates.

figure 1

DDS students’ preferred online delivery method for fully online courses

Table  2 summarizes IDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “ Overall, this online course is effective ”, all courses received a mean score approaching or exceeding 3 points on the 4-point scale, suggesting that online learning was well accepted by students. For the survey question “ I would have preferred face-to-face instruction for this course ”, for most online courses examined, the percentage of students who would have preferred face-to-face instruction was similar to the percentage who preferred online instruction for the course. Like their DDS peers, about half of the IDS students in each class also preferred a combination of synchronous and asynchronous online delivery for fully online courses (see Fig.  2 ). Finally, the majority of IDS students (I1, n = 18, 81.80 %; I2, n = 16, 84.20 %) wanted to continue with some online learning after the pandemic.

Fig. 2. IDS students' preferred online delivery method for fully online courses.

Factors impacting students’ acceptance of online learning

For all 19 online courses taken by DDS students, regression analyses indicated a significant positive relationship between students' perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across all courses. The ranges of effect size (r²) were: D1 courses (0.26 to 0.50), D2 courses (0.39 to 0.65), and D3 courses (0.22 to 0.44), indicating moderate to high correlations across courses.

For 9 out of the 10 online courses taken by IDS students, there was a positive relationship between students' perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across courses. The ranges of effect size were: I1 courses (0.35 to 0.77) and I2 courses (0.47 to 0.63), indicating consistently high correlations across courses. The only course in which students' perceived engagement with faculty and classmates did not predict perceived effectiveness of the course was "Integrated Clinical Science III (ICS III)", which the I2 class took together with their D3 peers.
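As an illustration of the effect sizes reported above, the r² for a bivariate relationship such as this one equals the squared Pearson correlation between the two perception ratings. The sketch below uses made-up 4-point Likert ratings, not the study's data:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length rating lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    ss_x = sum((a - mx) ** 2 for a in x)  # sum of squared deviations
    ss_y = sum((b - my) ** 2 for b in y)
    return cov / (ss_x * ss_y) ** 0.5

# Hypothetical per-student ratings on the 4-point scale (illustrative only).
engagement    = [4, 3, 4, 2, 3, 4, 1, 3, 2, 4]  # perceived engagement
effectiveness = [4, 3, 3, 2, 3, 4, 2, 3, 2, 4]  # perceived course effectiveness

r = pearson_r(engagement, effectiveness)
r_squared = r ** 2  # effect size, on the same scale as the ranges reported above
```

In a simple bivariate regression, this r² is the coefficient of determination, so a value of 0.5 would mean that half the variance in perceived effectiveness is accounted for by perceived engagement.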

Impact of online learning on students’ course performance

Chi-square test results (Table 3) indicated that in 4 out of the 17 courses compared, the online cohort during summer quarter 2020 was more likely to receive an A grade than the face-to-face cohort during summer quarter 2019. In 12 of the courses, the online cohort was equally likely to receive an A grade as the face-to-face cohort. In the remaining course, the online cohort was less likely to receive an A grade than the face-to-face cohort.
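A minimal sketch of such a per-course comparison, using invented grade counts rather than the study's data: each course's 2x2 table of cohort (online 2020 vs. face-to-face 2019) by grade (A vs. other) yields a Pearson chi-square statistic with one degree of freedom, compared against the critical value 3.841 at alpha = .05:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 contingency table
    [[a, b], [c, d]] (rows: cohorts; columns: A grades vs. other grades)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: online 2020 cohort (60 A, 80 other grades),
# face-to-face 2019 cohort (40 A, 100 other grades).
chi2 = chi_square_2x2(60, 80, 40, 100)
significant = chi2 > 3.841  # critical value for df = 1 at alpha = .05
```

With these invented counts the online cohort has a higher proportion of A grades, and the statistic exceeds the critical value, so the difference would be judged significant.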

Students’ acceptance of online learning during the pandemic

Survey results revealed that students had generally positive perceptions of online learning during the pandemic and that the majority wanted to continue with some online learning post pandemic. Overall, our findings support several other studies in dental [18, 20], medical [43, 44], and nursing [45] education that have also reported students' positive attitudes towards online learning during the pandemic. In their written survey comments, students cited enhanced flexibility as one of the greatest benefits of online learning. Some students also commented that typing questions in the chat box during live online classes was less intimidating than speaking in class. Others explicitly stated that not having to commute to and from school provided more time for sleep, which helped with self-care and mental health. Our findings are in line with previous studies that have also demonstrated that online learning offers greater flexibility [46, 47]. Meanwhile, consistent with the findings of other researchers [19, 21, 46], our students reported difficulty engaging with faculty and classmates in several online courses.

There were some variations among individual courses in students' acceptance of the online format. One factor that could partially account for the observed differences was instructional strategies. In particular, our regression analysis results demonstrated a positive correlation between students' perceived engagement with faculty and classmates and their perceived overall effectiveness of the online course. Other aspects of course design might also have influenced students' overall rating of the online course. For instance, some D2 students commented that the requirements of the course "Integrated Case-based Seminars (ICS II)" were not clear and that assessment did not align with lecture materials. It is important to remember that communicating course requirements clearly and aligning course content and assessment are principles that should be applied in any course, whether face-to-face or online. Our results highlighted the importance of providing faculty training on basic educational design principles and online learning design strategies. Furthermore, the nature of the course might also have impacted student ratings. For example, the D1 course "Anatomy and Histology" had a lab component, which did not lend itself as well to the online format. Many students reported that it was difficult to see faculty's live demonstrations during Zoom lectures, which may have resulted in a lower student satisfaction rating.

As for students' preferred online delivery method for fully online courses during the pandemic, about half preferred a combination of synchronous and asynchronous online learning. In light of this finding, as we continue with remote learning until public health directives allow a return to campus, we will encourage faculty to integrate these two online delivery modalities. Finally, given that over 80% of students wanted to continue with some online instruction after the pandemic, the school will advocate for blended learning in the post-pandemic world [48]. For future face-to-face courses on campus after the pandemic, faculty are encouraged to deliver some content online to reduce classroom seat time and make learning more flexible. Taken together, our findings not only add to the overall picture of the current situation but may also inform learning design moving forward.

Role of online engagement and interaction

To reiterate, we found that students' perceived engagement with faculty and classmates predicted their perceived overall effectiveness of the online course. This aligns with the larger literature on best practices in online learning design. Extensive research prior to the pandemic has confirmed that the effectiveness of online learning is determined by a number of factors beyond the tools used, including students' interactions with the instructor and classmates [49, 50, 51, 52]. Online students may feel isolated due to reduced or absent interaction [53, 54]. Therefore, in designing online learning experiences, it is important to remember that learning is a social process [55]. Faculty's role is not only to transmit content but also to promote the different types of interaction that are an integral part of the online learning process [33]. The online teaching model in which faculty upload materials online but teach them in the same way as in the physical classroom, without special effort to engage students, does not make the best use of the online format. Putting the "sage on the screen" during a live class meeting on a video conferencing system is no different from the "sage on the stage" in the physical classroom: both provide limited space for engagement. Such a one-way monologue undervalues the potential that online learning presents.

In light of the critical role that social interaction plays in online learning, faculty are encouraged to use the interactive features of online learning platforms to provide clear channels for student-instructor and student-student interaction. In the open-ended comments, students highlighted several instructional strategies that they perceived to be helpful for learning. For live online classes, these included conducting breakout room activities, using the chat box to facilitate discussions, polling, and integrating gameplay with apps such as Kahoot! [56]. For self-paced classes, students appreciated that faculty held virtual office hours or subsequent live online discussion sessions to reinforce understanding of the pre-assigned materials.

Quality of online education during the pandemic

This study provides empirical evidence in dental education that it was possible to ensure the continuity of education without sacrificing the quality of education provided to students during the forced migration to distance learning upon building closure. To reiterate, in all but one online course offered in summer quarter 2020, students were equally or more likely to earn an A grade than the face-to-face cohort from summer quarter 2019. Even for courses with less student support for the online format (e.g., the D1 course "Anatomy and Histology"), there was a significant increase in the number of students who earned an A grade in 2020 compared with the previous year. The reduced capacity for technical training during the pandemic may have resulted in more study time for didactic content. Overall, our results resonate with several pre-pandemic studies in health sciences education showing that the quality of learning is comparable in face-to-face and online formats [9, 57, 58]. For the only course (Integrated Case-based Seminars, ICS II) in which the online cohort performed worse than the face-to-face cohort, as mentioned earlier, students reported that assessment was not aligned with course materials and that course expectations were not clear. This might explain why students' course performance was not as strong as expected.

Limitations

This study used a pre-existing control group from the previous year. There may have been individual differences between students in the online and face-to-face cohorts, such as motivation, learning style, and prior knowledge, that could have affected the observed outcomes. Additionally, even though course content and assessment methods were largely the same in 2019 and 2020, changes in other aspects of the courses could have affected students' performance. Some faculty may have been more compassionate in grading (e.g., more flexible with assignment deadlines) in summer quarter 2020, given the hardship students experienced during the pandemic. On the other hand, remote proctoring in summer quarter 2020 may have heightened some students' exam anxiety, knowing that they were being monitored through a webcam. The existence and magnitude of the effects of these factors need to be further investigated.

The present study only examined the correlation between students' perceived online engagement and their perceived overall effectiveness of the online course. Other factors that might impact students' acceptance of the online format need to be researched in future studies. Another future direction is to examine how students' perceived online engagement correlates with their actual course performance. Because the survey data collected for the present study are anonymous, we cannot match students' perceived online engagement with their course grades to run this additional analysis. It should also be noted that this study focused on didactic online instruction. Future studies might examine how technical training was affected during the COVID building closure. It was also beyond the scope of this study to examine how student characteristics, especially high and low academic performance as reflected by individual grades, affect students' online learning experience and performance. We plan to conduct a follow-up study to examine which groups of students are most affected by the online format. Finally, this study was conducted in a single dental school, so the findings may not be generalizable to other schools and disciplines. Future studies could be conducted in other schools or disciplines to compare results.

This study revealed that dental students had generally favorable attitudes towards online learning during the COVID-19 pandemic and that their perceived engagement with faculty and classmates predicted their acceptance of the online course. Most notably, this is the first study in dental education to demonstrate that online learning during the pandemic could achieve similar or better learning outcomes than face-to-face learning before the pandemic. Findings of our study could contribute significantly to the literature on online learning during the COVID-19 pandemic in health sciences education. The results could also inform future online learning design as we re-envision the future of online learning.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Bello G, Pennisi MA, Maviglia R, Maggiore SM, Bocci MG, Montini L, et al. Online vs live methods for teaching difficult airway management to anesthesiology residents. Intensive Care Med. 2005; 31 (4): 547–552.


Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006; 81(3): 207–12.

Kavadella A, Tsiklakis K, Vougiouklakis G, Lionarakis A. Evaluation of a blended learning course for teaching oral radiology to undergraduate dental students. Eur J Dent Educ. 2012; 16(1): 88–95.

de Jong N, Verstegen DL, Tan FS, O’Connor SJ. A comparison of classroom and online asynchronous problem-based learning for students undertaking statistics training as part of a public health master’s degree. Adv Health Sci Educ. 2013; 18(2):245–64.

Hegeman JS. Using instructor-generated video lectures in online mathematics courses improves student learning. Online Learn. 2015; 19(3): 70–87.

Gaupp R, Körner M, Fabry G. Effects of a case-based interactive e-learning course on knowledge and attitudes about patient safety: a quasi-experimental study with third-year medical students. BMC Med Educ. 2016; 16(1):172.

Zheng M, Bender D, Reid L, Milani J. An interactive online approach to teaching evidence-based dentistry with Web 2.0 technology. J Dent Educ. 2017; 81(8): 995–1003.

Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Washington D.C. 2009.


Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019; 24(1):1666538.

Andrews KG, Demps EL. Distance education in the U.S. and Canadian undergraduate dental curriculum. J Dent Educ. 2003; 67(4):427–38.

Kassebaum DK, Hendricson WD, Taft T, Haden NK. The dental curriculum at North American dental institutions in 2002–03: a survey of current structure, recent innovations, and planned changes. J Dent Educ. 2004; 68(9):914–931.

Haden NK, Hendricson WD, Kassebaum DK, Ranney RR, Weinstein G, Anderson EL, et al. Curriculum changes in dental education, 2003–09. J Dent Educ. 2010; 74(5):539–57.

DeBate RD, Cragun D, Severson HH, Shaw T, Christiansen S, Koerber A, et al. Factors for increasing adoption of e-courses among dental and dental hygiene faculty members. J Dent Educ. 2011; 75 (5): 589–597.

Saeed SG, Bain J, Khoo E, Siqueira WL. COVID-19: Finding silver linings for dental education. J Dent Educ. 2020; 84(10):1060–1063.

Schlenz MA, Schmidt A, Wöstmann B, Krämer N, Schulz-Weidner N. Students’ and lecturers’ perspective on the implementation of online learning in dental education due to SARS-CoV-2 (COVID-19): a cross-sectional study. BMC Med Educ. 2020;20(1):1–7.

Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual Objective Structured Clinical Examination in dental education. A response to COVID-19. Eur J Dent Educ. 2020; https://doi.org/10.1111/eje.12624

Hung M, Licari FW, Hon ES, Lauren E, Su S, Birmingham WC, Wadsworth LL, Lassetter JH, Graff TC, Harman W, et al. In an era of uncertainty: impact of COVID-19 on dental education. J Dent Educ. 2020; 85 (2): 148–156.

Sadid-Zadeh R, Wee A, Li R, Somogyi‐Ganss E. Audience and presenter comparison of live web‐based lectures and traditional classroom lectures during the COVID‐19 pandemic. J Prosthodont. 2020. doi: https://doi.org/10.1111/jopr.13301

Wang K, Zhang L, Ye L. A nationwide survey of online teaching strategies in dental education in China. J Dent Educ. 2020; 85 (2): 128–134.

Rad FA, Otaki F, Baqain Z, Zary N, Al-Halabi M. Rapid transition to distance learning due to COVID-19: Perceptions of postgraduate dental learners and instructors. PLoS One. 2021; 16(2): e0246584.

Abbasi S, Ayoob T, Malik A, Memon SI. Perceptions of students regarding E-learning during Covid-19 at a private medical college. Pak J Med Sci. 2020; 36: 57–61.

Al-Azzam N, Elsalem L, Gombedza F. A cross-sectional study to determine factors affecting dental and medical students’ preference for virtual learning during the COVID-19 outbreak. Heliyon. 2020; 6(12). doi: https://doi.org/10.1016/j.heliyon.2020.e05704

Chen E, Kaczmarek K, Ohyama H. Student perceptions of distance learning strategies during COVID-19. J Dent Educ. 2020. doi: https://doi.org/10.1002/jdd.12339

Kaczmarek K, Chen E, Ohyama H. Distance learning in the COVID-19 era: Comparison of student and faculty perceptions. J Dent Educ. 2020. https://doi.org/10.1002/jdd.12469

Sarwar H, Akhtar H, Naeem MM, Khan JA, Waraich K, Shabbir S, et al. Self-reported effectiveness of e-learning classes during COVID-19 pandemic: A nation-wide survey of Pakistani undergraduate dentistry students. Eur J Dent. 2020; 14 (S01): S34-S43.

Al-Taweel FB, Abdulkareem AA, Gul SS, Alshami ML. Evaluation of technology‐based learning by dental students during the pandemic outbreak of coronavirus disease 2019. Eur J Dent Educ. 2021; 25(1): 183–190.

Elangovan S, Mahrous A, Marchini L. Disruptions during a pandemic: Gaps identified and lessons learned. J Dent Educ. 2020; 84 (11): 1270–1274.

Goodenow C. Classroom belonging among early adolescent students: Relationships to motivation and achievement. J Early Adolesc.1993; 13(1): 21–43.

Goodenow C. The psychological sense of school membership among adolescents: Scale development and educational correlates. Psychol Sch. 1993; 30(1): 79–90.

St-Amand J, Girard S, Smith J. Sense of belonging at school: Defining attributes, determinants, and sustaining strategies. IAFOR Journal of Education. 2017; 5(2):105–19.

Peacock S, Cowan J. Promoting sense of belonging in online learning communities of inquiry at accredited courses. Online Learn. 2019; 23(2): 67–81.

Chan GM, Kanneganti A, Yasin N, Ismail-Pratt I, Logan SJ. Well‐being, obstetrics and gynecology and COVID‐19: Leaving no trainee behind. Aust N Z J Obstet Gynaecol. 2020; 60(6): 983–986.

Hodges C, Moore S, Lockee B, Trust T, Bond A. The difference between emergency remote teaching and online learning. Educause Review. 2020; 27: 1–12.

Means B, Bakia M, Murphy R. Learning online: What research tells us about whether, when and how. Routledge. 2014.

Iyer P, Aziz K, Ojcius DM. Impact of COVID-19 on dental education in the United States. J Dent Educ. 2020; 84(6): 718–722.

Machado RA, Bonan PRF, Perez DEDC, Martelli Júnior H. COVID-19 pandemic and the impact on dental education: Discussing current and future perspectives. Braz Oral Res. 2020; 34: e083.

Wu DT, Wu KY, Nguyen TT, Tran SD. The impact of COVID-19 on dental education in North America-Where do we go next? Eur J Dent Educ. 2020; 24(4): 825–827.

de Oliveira Araújo FJ, de Lima LSA, Cidade PIM, Nobre CB, Neto MLR. Impact of Sars-Cov-2 and its reverberation in global higher education and mental health. Psychiatry Res. 2020; 288:112977. doi: https://doi.org/10.1016/j.psychres.2020.112977

Persky AM, Lee E, Schlesselman LS. Perception of learning versus performance as outcome measures of educational research. Am J Pharm Educ. 2020; 84(7): ajpe7782.

Zoom. Zoom Video Communications, San Jose, CA, USA. https://zoom.us/

Canvas. Instructure, Inc., Salt Lake City, UT, USA. https://www.instructure.com/canvas

SoftChalk. SoftChalk LLC, San Antonio, TX, USA. https://www.softchalkcloud.com/

Agarwal S, Kaushik JS. Student’s perception of online learning during COVID pandemic. Indian J Pediatr. 2020; 87: 554–554.

Khalil R, Mansour AE, Fadda WA, Almisnid K, Aldamegh M, Al-Nafeesah A, et al. The sudden transition to synchronized online learning during the COVID-19 pandemic in Saudi Arabia: a qualitative study exploring medical students’ perspectives. BMC Med Educ. 2020; 20(1): 1–10.

Riley E, Capps N, Ward N, McCormack L, Staley J. Maintaining academic performance and student satisfaction during the remote transition of a nursing obstetrics course to online instruction. Online Learn. 2021; 25(1), 220–229.

Amir LR, Tanti I, Maharani DA, Wimardhani YS, Julia V, Sulijaya B, et al. Student perspective of classroom and distance learning during COVID-19 pandemic in the undergraduate dental study program Universitas Indonesia. BMC Med Educ. 2020; 20(1):1–8.

Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020; 10(11).

Graham CR, Woodfield W, Harrison JB. A framework for institutional adoption and implementation of blended learning in higher education. Internet High Educ. 2013; 18 : 4–14.

Sing C, Khine M. An analysis of interaction and participation patterns in online community. J Educ Technol Soc. 2006; 9(1): 250–261.

Bernard RM, Abrami PC, Borokhovski E, Wade CA, Tamim RM, Surkes MA, et al. A meta-analysis of three types of interaction treatments in distance education. Rev Educ Res. 2009; 79(3): 1243–1289.

Fedynich L, Bradley KS, Bradley J. Graduate students’ perceptions of online learning. Res High Educ. 2015; 27.

Tanis CJ. The seven principles of online learning: Feedback from faculty and alumni on its importance for teaching and learning. Res Learn Technol. 2020; 28 . https://doi.org/10.25304/rlt.v28.2319

Dixson MD. Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learn. 2015; 19 (4).

Kwary DA, Fauzie S. Students’ achievement and opinions on the implementation of e-learning for phonetics and phonology lectures at Airlangga University. Educ Pesqui. 2018; 44 .

Vygotsky LS. Mind in society: The development of higher psychological processes. Cambridge (MA): Harvard University Press. 1978.

Kahoot!. Oslo, Norway. https://kahoot.com/

Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomised controlled trial. BMC Med Educ. 2007; 7(1): 1–6.

Davis J, Crabb S, Rogers E, Zamora J, Khan K. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomized controlled trial. Med Teach. 2008; 30(3): 302–307.


Acknowledgements

Not applicable.

Authors’ information

MZ is an Associate Professor of Learning Sciences and Senior Instructional Designer at School of Dentistry, University of the Pacific. She has a PhD in Education, with a specialty on learning sciences and technology. She has dedicated her entire career to conducting research on online learning, learning technology, and faculty development. Her research has resulted in several peer-reviewed publications in medical, dental, and educational technology journals. MZ has also presented regularly at national conferences.

DB is an Assistant Dean for Academic Affairs at School of Dentistry, University of the Pacific. He has an EdD degree in education, with a concentration on learning and instruction. Over the past decades, DB has been overseeing and delivering faculty pedagogical development programs to dental faculty. His research interest lies in educational leadership and instructional innovation. DB has co-authored several peer-reviewed publications in health sciences education and presented regularly at national conferences.

CL is Associate Dean of Oral Healthcare Education, School of Dentistry, University of the Pacific. She has a Doctor of Dental Surgery (DDS) degree and an EdD degree with a focus on educational leadership. Her professional interest lies in educational leadership, oral healthcare education innovation, and faculty development. CL has co-authored several publications in peer-reviewed journals in health sciences education and presented regularly at national conferences.

Author information

Authors and affiliations

Office of Academic Affairs, Arthur A. Dugoni School of Dentistry, University of the Pacific, San Francisco, CA, USA

Meixun Zheng, Daniel Bender & Cindy Lyon


Contributions

MZ analyzed the data and wrote the initial draft of the manuscript. DB and CL both provided assistance with research design, data collection, and reviewed and edited the manuscript. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Meixun Zheng .

Ethics declarations

Ethics approval and consent to participate

The study was approved by the institutional review board at University of the Pacific in the U.S. (#2020-68). Informed consent was obtained from all participants. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Survey of online courses during COVID-19 pandemic.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Zheng, M., Bender, D. & Lyon, C. Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study. BMC Med Educ 21 , 495 (2021). https://doi.org/10.1186/s12909-021-02909-z


Received : 31 March 2021

Accepted : 26 August 2021

Published : 16 September 2021

DOI : https://doi.org/10.1186/s12909-021-02909-z


  • Dental education
  • Online learning
  • COVID-19 pandemic
  • Instructional strategies
  • Interaction
  • Learning performance

BMC Medical Education

ISSN: 1472-6920


Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19

  • Published: 21 April 2021
  • Volume 26 , pages 6923–6947, ( 2021 )


  • Ram Gopal 1 ,
  • Varsha Singh 1 &
  • Arun Aggarwal   ORCID: orcid.org/0000-0003-3986-188X 2  


The aim of the study is to identify the factors affecting students’ satisfaction and performance in online classes during the COVID-19 pandemic and to establish the relationships between these variables. The study is quantitative in nature, and the data were collected from 544 respondents through an online survey; the respondents were studying business management (B.B.A. or M.B.A.) or hotel management courses at Indian universities. Structural equation modeling was used to test the proposed hypotheses. The results show that the four independent factors used in the study, viz. quality of instructor, course design, prompt feedback, and expectations of students, positively impact students’ satisfaction, and students’ satisfaction in turn positively impacts students’ performance. For educational management, these four factors are essential to achieving a high level of satisfaction and performance in online courses. This study was conducted during the COVID-19 pandemic to examine the effect of online teaching on students’ performance.


1 Introduction

Coronaviruses are a group of viruses that cause illnesses such as cough, cold, sneezing, fever, and other respiratory symptoms (WHO, 2019). COVID-19 is a contagious disease that spreads very quickly among human beings; it is caused by a new strain that originated in Wuhan, China, in December 2019. Coronaviruses circulate in animals, but some of these viruses can be transmitted between animals and humans (Perlman & McIntosh, 2020). As of March 28, 2020, according to the MoHFW, a total of 909 confirmed COVID-19 cases (862 Indians and 47 foreign nationals) had been reported in India (Centers for Disease Control and Prevention, 2020). Officially, no vaccine or medicine has been shown to cure or halt the spread of COVID-19 (Yu et al., 2020). The influence of the COVID-19 pandemic on the education system has led to widespread closures of schools and colleges worldwide. On March 24, India declared a country-wide lockdown of schools and colleges (NDTV, 2020) to prevent transmission of the coronavirus among students (Bayham & Fenichel, 2020). School closures in response to the COVID-19 pandemic have shed light on several issues affecting access to education. Because COVID-19 is soaring, a huge number of children, adults, and youths cannot attend schools and colleges (UNESCO, 2020). Lah and Botelho (2012) contended that the effect of school closures on students’ performance is unclear.

Similarly, school closures may also affect students through the disruption of teacher and student networks, leading to poor performance. Bridge (2020) reported that schools and colleges are moving towards educational technologies for student learning to avoid disruption during the pandemic. Hence, the present study’s objective is to develop and test a conceptual model of students’ satisfaction with online teaching during COVID-19, when both students and teachers have no option other than to use online platforms for uninterrupted learning and teaching.

UNESCO recommends distance learning programs and open educational applications during school closures caused by COVID-19, so that schools and teachers can continue to teach their pupils and limit the interruption of education. Therefore, many institutes have opted for online classes (Shehzadi et al., 2020).

As a versatile platform for learning and teaching processes, the e-learning framework has been used increasingly (Salloum & Shaalan, 2018). E-learning is defined as a new paradigm of online learning based on information technology (Moore et al., 2011). In contrast to traditional learning, academics, educators, and other practitioners are eager to know how e-learning can produce better outcomes and academic achievement. The answer can be sought only by analyzing students’ satisfaction and performance.

Many comparative studies have been carried out to explore whether face-to-face or traditional teaching methods are more productive, or whether online or hybrid learning is better (Lockman & Schirmer, 2020; Pei & Wu, 2019; González-Gómez et al., 2016). The results of these studies show that students perform much better in online learning than in traditional learning. Henriksen et al. (2020) highlighted the problems faced by educators while shifting from the offline to the online mode of teaching. In the past, several research studies were carried out on online learning to explore student satisfaction, acceptance of e-learning, distance learning success factors, and learning effectiveness (Sher, 2009; Lee, 2014; Yen et al., 2018). However, little literature is available on the factors that affect students’ satisfaction and performance in online classes during the COVID-19 pandemic (Rajabalee & Santally, 2020). In the present study, the authors propose that course design, quality of the instructor, prompt feedback, and students’ expectations are the four prominent determinants of students’ learning outcomes and satisfaction in online classes (Lee, 2014).

Course design refers to curriculum knowledge, program organization, instructional goals, and course structure (Wright, 2003). If well planned, course design increases pupils' satisfaction with the system (Almaiah & Alyoussef, 2019). Mtebe and Raisamo (2014) proposed that effective course design helps improve performance through learners' knowledge and skills (Khan & Yildiz, 2020; Mohammed et al., 2020). If a course is not designed effectively, it might lead to low usage of e-learning platforms by teachers and students (Almaiah & Almulhem, 2018); if it is designed effectively, it leads to higher acceptance of the e-learning system by students and better performance (Mtebe & Raisamo, 2014). Hence, to prepare courses for online learning, many instructors teaching blended courses for the first time are likely to require a complete overhaul of their courses (Bersin, 2004; Ho et al., 2006).

The second factor, instructor quality, plays an essential role in students' satisfaction with online classes. Instructor quality refers to a professional who understands students' educational needs, has unique teaching skills, and understands how to meet students' learning needs (Luekens et al., 2004). Marsh (1987) developed five instruments for measuring instructor quality, chief among them the Students' Evaluation of Educational Quality (SEEQ), which delineates instructor quality. SEEQ is considered one of the most commonly used and unanimously embraced methods (Grammatikopoulos et al., 2014) and is a very useful student-feedback instrument for measuring instructor quality (Marsh, 1987).

The third factor that improves students' satisfaction is prompt feedback (Kinicki et al., 2004). Feedback is defined as information given by lecturers and tutors about the performance of students; within this context, feedback is a "consequence of performance" (Hattie & Timperley, 2007, p. 81). In education, "prompt feedback can be described as knowing what you know and what you do not related to learning" (Simsek et al., 2017, p. 334). Christensen (2014) studied the link between feedback and performance and introduced the positivity ratio, a mechanism that plays an important role in explaining performance through feedback. Prompt feedback has been found to develop a strong linkage between faculty and students, which ultimately leads to better learning outcomes (Simsek et al., 2017; Chang, 2011).

The fourth factor is students' expectations. Appleton-Knapp and Krentler (2006) measured the impact of students' expectations on their performance and pinpointed that student expectations are important. When students' expectations are met, their satisfaction increases (Bates & Kaye, 2014); these findings are backed by the earlier Student Satisfaction Index Model (Zhang et al., 2008). Conversely, when students' expectations are not fulfilled, learning and satisfaction with the course may suffer. Student satisfaction is defined as students' ability to compare the desired benefit with the observed effect of a particular product or service (Budur et al., 2019). Students with high grade expectations show higher satisfaction than those with lower grade expectations.

A scrutiny of the literature shows that although different researchers have examined the factors affecting student satisfaction, no study has examined the combined effect of course design, quality of the instructor, prompt feedback, and students' expectations on students' satisfaction with online classes during the COVID-19 pandemic. Therefore, this study explores the factors that affected students' satisfaction and performance in online classes during the pandemic. The pandemic compelled educational institutions to move online, a mode with which neither teachers nor learners were acquainted, and students were not mentally prepared for such a shift. This research therefore examines which factors affected students and how students perceived these changes, as reflected in their satisfaction levels.

This paper is structured as follows: the second section describes the theoretical framework and the linkages among the research variables, from which the research hypotheses were framed. The third section presents the research methodology. The outcomes of the empirical analysis are then discussed. Lastly, the paper concludes with a discussion and proposes implications for future studies.

2 Theoretical framework

Achievement goal theory (AGT) is commonly used to understand students' performance; it was proposed by four scholars, Carole Ames, Carol Dweck, Martin Maehr, and John Nicholls, in the late 1970s (Elliot, 2005). Elliott and Dweck (1988, p. 11) state that "an achievement goal involves a program of cognitive processes that have cognitive, affective and behavioral consequence". The theory suggests that students' motivation and achievement-related behaviors can be understood through the purposes and reasons they adopt while engaged in learning activities (Dweck & Leggett, 1988; Ames, 1992; Urdan, 1997). Several studies distinguish four goal orientations: mastery-approach, mastery-avoidance, performance-approach, and performance-avoidance (Pintrich, 1999; Elliot & McGregor, 2001; Schwinger & Stiensmeier-Pelster, 2011; Hansen & Ringdal, 2018; Mouratidis et al., 2018). The environment also affects students' performance (Ames & Archer, 1988). Traditionally, classroom teaching has been an effective way to achieve learning goals (Ames & Archer, 1988; Ames, 1992; Clayton et al., 2010); in the modern era, however, internet-based teaching is also an effective tool for delivering lectures, and web-based applications are becoming modern classrooms (Azlan et al., 2020). The following sections therefore discuss the relationships between the independent and dependent variables (Fig. 1).

Fig. 1 Proposed model

3 Hypotheses development

3.1 Quality of the instructor and satisfaction of the students

An instructor with high enthusiasm for students' learning has a positive impact on their satisfaction. Instructor quality is one of the most critical measures of student satisfaction, leading to the outcome of the education process (Munteanu et al., 2010; Arambewela & Hall, 2009; Ramsden, 1991). If the teacher delivers the course effectively and motivates students to do better in their studies, this leads to student satisfaction and enhances the learning process (Ladyshewsky, 2013). Furthermore, the instructor's understanding of learners' needs also ensures student satisfaction (Kauffman, 2015). Hence, the hypothesis that instructor quality significantly affects student satisfaction was included in this study.

H1: The quality of the instructor positively affects the satisfaction of the students.

3.2 Course design and satisfaction of students

A course's technological design strongly influences students' learning and satisfaction through their course expectations (Liaw, 2008; Lin et al., 2008). Active course design yields more effective student outcomes than traditional design (Black & Kassaye, 2014). Learning style is essential to effective course design (Wooldridge, 1995): when creating an online course design, it is essential to generate an experience for students with different learning styles. Similarly, Jenkins (2015) highlighted that course design attributes can be developed and employed to enhance student success. Hence, the hypothesis that course design significantly affects students' satisfaction was included in this study.

H2: Course design positively affects the satisfaction of students.

3.3 Prompt feedback and satisfaction of students

The emphasis in this study is on understanding the influence of prompt feedback on satisfaction. Feedback provides information about students' effective performance (Chang, 2011; Grebennikov & Shah, 2013; Simsek et al., 2017). Prompt feedback enhances the student learning experience (Brownlee et al., 2009) and boosts satisfaction (O'donovan, 2017). It also serves as a self-evaluation tool for students (Rogers, 1992), by which they can improve their performance. Eraut (2006) highlighted the impact of feedback on future practice and student learning development, and good feedback practice benefits both student learning and teachers' efforts to improve students' learning experience (Yorke, 2003). Hence, the hypothesis that prompt feedback significantly affects satisfaction was included in this study.

H3: Prompt feedback positively affects the satisfaction of students.

3.4 Expectations and satisfaction of students

Expectation is a crucial factor that directly influences student satisfaction. Expectation Disconfirmation Theory (EDT) (Oliver, 1980) has been utilized to determine satisfaction levels based on expectations (Schwarz & Zhu, 2015). Understanding students' expectations is an effective way to improve their satisfaction (Brown et al., 2014), and recognizing those expectations makes it possible to raise satisfaction levels (ICSB, 2015). Finally, the positive approach used in many online learning classes has been shown to place high expectations on learners (Gold, 2011) and has led to successful outcomes. Hence, the hypothesis that students' expectations significantly affect satisfaction was included in this study.

H4: Expectations of the students positively affect their satisfaction.

3.5 Satisfaction and performance of the students

Zeithaml (1988) describes satisfaction as the outcome of the performance of any educational institute. According to Kotler and Clarke (1986), satisfaction is the desired outcome of any aim that arouses an individual's admiration. Quality interactions between instructor and students lead to student satisfaction (Malik et al., 2010; Martínez-Argüelles et al., 2016), and teaching quality and course material enhance student satisfaction through successful outcomes (Sanderson, 1995). Satisfaction relates to student performance in terms of motivation, learning, assurance, and retention (Biner et al., 1996). Mensink and King (2020) described performance as the conclusion of student-teacher efforts, showing students' interest in their studies. The critical element in education is students' academic performance (Rono, 2013); it is the central pole around which the entire education system revolves. Narad and Abdullah (2016) concluded that students' academic performance determines an academic institution's success or failure.

Singh et al. (2016) asserted that students' academic performance directly influences a country's socio-economic development. Farooq et al. (2011) highlight that students' academic performance is the primary concern of all faculties, and it is the main foundation of knowledge gain and skill improvement. According to Narad and Abdullah (2016), regular evaluations or examinations over a specific period are essential for assessing students' academic performance for better outcomes. Hence, the hypothesis that satisfaction significantly affects students' performance was included in this study.

H5: Students’ satisfaction positively affects the performance of the students.

3.6 Satisfaction as mediator

Sibanda et al. (2015) applied goal theory to examine the factors persuading students' academic performance, highlighting the significance students attach to their satisfaction and academic achievement. According to this theory, students perform well if they know the factors that impact their performance. Among the variables above, the institutional factors that influence student performance through satisfaction include course design and quality of the instructor (DeBourgh, 2003; Lado et al., 2003), as well as prompt feedback and expectations (Fredericksen et al., 2000). Hence, the hypothesis that quality of the instructor, course design, prompt feedback, and students' expectations significantly affect students' performance through satisfaction was included in this study.

H6: Quality of the instructor, course design, prompt feedback, and students' expectations affect the students' performance through satisfaction.

H6a: Students’ satisfaction mediates the relationship between quality of the instructor and student’s performance.

H6b: Students’ satisfaction mediates the relationship between course design and student’s performance.

H6c: Students’ satisfaction mediates the relationship between prompt feedback and student’s performance.

H6d: Students' satisfaction mediates the relationship between students' expectations and student's performance.

4 Research methodology

4.1 Participants

In this cross-sectional study, data were collected from 544 respondents studying management (B.B.A. or M.B.A.) and hotel management courses, selected through purposive sampling. Descriptive statistics show that 48.35% of the respondents were MBA or BBA students and the rest were hotel management students. Male students (71%) outnumbered female students (29%) by almost two to one. Respondents' ages ranged from 18 to 35; the dominant group, undergraduate students aged 18 to 22, made up 94% of the sample, while postgraduate students accounted for only 6%.

4.2 Materials

The research instrument consists of two sections. The first section covers demographic variables such as discipline, gender, age group, and education level (undergraduate or postgraduate). The second section measures six factors: instructor quality, course design, prompt feedback, student expectations, satisfaction, and performance. These attributes were taken from previous studies (Yin & Wang, 2015; Bangert, 2004; Chickering & Gamson, 1987; Wilson et al., 1997). Instructor quality was measured through the seven-item scale developed by Bangert (2004). The course design (six items) and prompt feedback (five items) scales were also adapted from Bangert (2004). The students' expectation scale consists of five items: four adapted from Bangert (2004) and one from Wilson et al. (1997). Students' satisfaction was measured with six items taken from Bangert (2004), Wilson et al. (1997), and Yin and Wang (2015). Students' performance was measured through the six-item scale developed by Wilson et al. (1997). All variables were assessed on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Only students from India took part in the survey. In total, thirty-four questions were asked to test the effect of the first four variables on students' satisfaction and performance. For full details of the questionnaire, see Appendix Table 6.

4.3 Design

The study used a descriptive research design. Instructor quality, course design, prompt feedback, and students' expectations were the independent variables; students' satisfaction was the mediator; and students' performance was the dependent variable.

4.4 Procedure

In this cross-sectional research, respondents were selected through judgment sampling. They were informed about the objective of the study and the information-gathering process, and they were assured of the confidentiality of the data; no incentive was given to them for participating. The data were gathered through an online survey: the questionnaire was built in Google Forms and circulated by email. Students were also asked to name their college, and students from fifteen colleges across India took part. The data were collected during the COVID-19 pandemic, amid the total lockdown in India. This was an apt time to collect data on the topic because all colleges across India were conducting online classes, so students had enough time to understand the instrument and respond to the questionnaire effectively. Of the 615 questionnaires circulated, students returned 574; thirty responses were excluded as unengaged, leaving 544 questionnaires for the present investigation. The sample included male and female students of different age groups from undergraduate and postgraduate management and hotel management courses.

5 Data analysis

5.1 Exploratory factor analysis (EFA)

To analyze the data, SPSS and AMOS were used. First, to extract the distinct factors, an exploratory factor analysis (EFA) with VARIMAX rotation was performed on the sample of 544. The analysis rendered six distinct factors. Factor one was named quality of instructor; its items included "The instructor communicated effectively", "The instructor was enthusiastic about online teaching", and "The instructor was concerned about student learning". Factor two was labeled course design, with items such as "The course was well organized", "The course was designed to allow assignments to be completed across different learning environments", and "The instructor facilitated the course effectively". Factor three was labeled prompt feedback, with items such as "The instructor responded promptly to my questions about the use of Webinar" and "The instructor responded promptly to my questions about general course requirements". The fourth factor was students' expectations, with items such as "The instructor provided models that clearly communicated expectations for weekly group assignments" and "The instructor used good examples to explain statistical concepts". The fifth factor was students' satisfaction, with items such as "The online classes were valuable" and "Overall, I am satisfied with the quality of this course". The sixth factor was student performance, with items such as "The online classes has sharpened my analytic skills" and "Online classes really tries to get the best out of all its students". Together, these six factors explained 67.784% of the total variance. To validate the factors extracted through EFA, confirmatory factor analysis (CFA) was performed in AMOS. Finally, structural equation modeling (SEM) was used to test the hypothesized relationships.
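For readers who want to see the rotation step concretely, the varimax criterion used in the EFA can be sketched in a few lines of numpy. The loading matrix below is purely illustrative (four items on two factors), not the paper's data:

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Varimax rotation: orthogonal rotation that maximizes the variance
    of squared loadings within each factor (Kaiser's criterion)."""
    L = np.asarray(loadings, dtype=float)
    n, k = L.shape
    R = np.eye(k)
    crit_old = 0.0
    for _ in range(max_iter):
        LR = L @ R
        # SVD step of the standard varimax algorithm (gamma = 1)
        u, s, vt = np.linalg.svd(
            L.T @ (LR ** 3 - LR * (LR ** 2).sum(axis=0) / n)
        )
        R = u @ vt
        crit = s.sum()
        if crit - crit_old < tol:
            break
        crit_old = crit
    return L @ R

# Illustrative unrotated loadings (hypothetical numbers)
L = np.array([[0.7, 0.3], [0.8, 0.2], [0.3, 0.7], [0.2, 0.8]])
rotated = varimax(L)

# An orthogonal rotation leaves each item's communality unchanged
print(np.allclose((L ** 2).sum(axis=1), (rotated ** 2).sum(axis=1)))  # True
```

Because varimax is orthogonal, only the distribution of variance across factors is simplified; the total variance explained by the solution is preserved.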

5.2 Measurement model

Table 1 summarizes the findings of the EFA and CFA: the EFA rendered six distinct factors, and the CFA validated them. Table 2 shows that the proposed measurement model achieved good convergent validity (Aggarwal et al., 2018a, b). The confirmatory factor analysis showed that the standardized factor loadings were statistically significant at the 0.05 level. The measurement model also showed acceptable fit indices: CMIN = 710.709; df = 480; CMIN/df = 1.481 (p < 0.001); Incremental Fit Index (IFI) = 0.979; Tucker-Lewis Index (TLI) = 0.976; Goodness of Fit Index (GFI) = 0.928; Adjusted Goodness of Fit Index (AGFI) = 0.916; Comparative Fit Index (CFI) = 0.978; Root Mean Square Residual (RMR) = 0.042; and Root Mean Squared Error of Approximation (RMSEA) = 0.030, all of which are satisfactory.
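As a sanity check, the reported CMIN/df and RMSEA can be recomputed from CMIN, df, and the sample size of 544. The sketch below assumes the standard RMSEA point-estimate formula with the (N − 1) convention used by AMOS:

```python
# Values reported for the measurement model; N is the sample size
chi2, df, N = 710.709, 480, 544

cmin_df = chi2 / df
# RMSEA point estimate: sqrt(max(chi2 - df, 0) / (df * (N - 1)))
rmsea = (max(chi2 - df, 0.0) / (df * (N - 1))) ** 0.5

print(f"CMIN/df = {cmin_df:.3f}")  # CMIN/df = 1.481
print(f"RMSEA   = {rmsea:.3f}")    # RMSEA   = 0.030
```

Both recomputed values match the figures reported above.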

According to the accepted criterion, the Average Variance Extracted (AVE) should be higher than the squared correlations between each latent variable and all other variables; equivalently, the square root of the AVE should exceed the inter-construct correlations. Discriminant validity is confirmed (Table 2) because the square root of each AVE is greater than the inter-construct correlation coefficients (Hair et al., 2006). Additionally, discriminant validity exists when each variable's measurement indicators correlate weakly with all variables other than the one with which they are theoretically associated (Aggarwal et al., 2018a, b; Aggarwal et al., 2020). The results in Table 2 show that the measurement model achieved good discriminant validity.
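This Fornell-Larcker check is mechanical once the standardized loadings and inter-construct correlations are known. A minimal sketch with hypothetical loadings for two constructs (not the values from Tables 1 and 2):

```python
import numpy as np

# Hypothetical standardized loadings (NOT the paper's reported values)
loadings = {
    "instructor_quality": [0.78, 0.81, 0.74, 0.80],
    "satisfaction":       [0.72, 0.76, 0.79],
}
r_iq_sat = 0.55  # hypothetical inter-construct correlation

def ave(lams):
    """Average Variance Extracted: mean of the squared standardized loadings."""
    lams = np.asarray(lams, dtype=float)
    return float((lams ** 2).mean())

for name, lams in loadings.items():
    print(f"{name}: AVE = {ave(lams):.3f}, sqrt(AVE) = {ave(lams) ** 0.5:.3f}")

# Fornell-Larcker criterion: sqrt(AVE) of each construct must exceed
# its correlation with every other construct
passes = all(ave(lams) ** 0.5 > r_iq_sat for lams in loadings.values())
print("discriminant validity:", passes)  # discriminant validity: True
```

With these illustrative numbers both square roots of the AVEs (about 0.78 and 0.76) exceed the 0.55 correlation, so the criterion is met.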

5.3 Structural model

To test the proposed hypotheses, structural equation modeling was used. This multivariate statistical technique combines factor analysis and multiple regression analysis and is used to analyze the structural relationships between measured variables and latent constructs.

Table 3 presents the structural model's fit indices with all variables taken together. CMIN/df is 2.479, and all the model fit values are within the acceptable range, meaning the model attained good fit. Other fit indices, such as GFI = 0.982 and AGFI = 0.956, are also supportive (Schumacker & Lomax, 1996; Marsh & Grayson, 1995; Kline, 2005).

Hence, the model fitted the data successfully. All covariances among the variables and regression weights were statistically significant (p < 0.001).

Table 4 presents the relationships between the exogenous, mediator, and endogenous variables: quality of instructor, prompt feedback, course design, students' expectations, students' satisfaction, and students' performance. The first four factors relate positively to satisfaction, which in turn relates positively to students' performance. Results show that instructor quality has a positive relationship with students' satisfaction with online classes (SE = 0.706, t-value = 24.196; p < 0.05); hence, H1 was supported. The second factor, course design, has a positive relationship with students' satisfaction (SE = 0.064, t-value = 2.395; p < 0.05); hence, H2 was supported. The third factor, prompt feedback, has a positive relationship with students' satisfaction (SE = 0.067, t-value = 2.520; p < 0.05); hence, H3 was supported. The fourth factor, students' expectations, shows a positive relationship with students' satisfaction with online classes (SE = 0.149, t-value = 5.127; p < 0.05); hence, H4 was supported. The SEM results show that, among instructor quality, prompt feedback, course design, and students' expectations, the factor most strongly affecting students' satisfaction was instructor quality (SE = 0.706), followed by students' expectations (SE = 0.149) and prompt feedback (SE = 0.067); the factor with the least effect on students' satisfaction was course design (SE = 0.064). Finally, Table 4 shows that students' satisfaction has a positive effect on students' performance (SE = 0.186, t-value = 2.800; p < 0.05); hence, H5 was supported.

Table 5 shows that students' satisfaction partially mediates the positive relationship between instructor quality and student performance; hence, H6(a) was supported. Satisfaction also partially mediates the positive relationship between course design and student performance; hence, H6(b) was supported. However, satisfaction fully mediates the positive relationship between prompt feedback and student performance; hence, H6(c) was supported. Finally, Table 5 shows that satisfaction partially mediates the positive relationship between students' expectations and student performance; hence, H6(d) was supported.
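A common way to test an indirect effect of this kind is a percentile bootstrap of the product of the a-path and b-path coefficients. The sketch below illustrates the procedure on simulated data (hypothetical, not the study's dataset), using the feedback → satisfaction → performance chain as the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 544  # same sample size as the study

# Simulated mediation: feedback -> satisfaction -> performance (no direct path)
feedback = rng.normal(size=n)
satisfaction = 0.4 * feedback + rng.normal(size=n)
performance = 0.3 * satisfaction + rng.normal(size=n)

def indirect_effect(idx):
    """Product-of-coefficients estimate a*b on a (re)sample of row indices."""
    f, s, p = feedback[idx], satisfaction[idx], performance[idx]
    a = np.polyfit(f, s, 1)[0]                      # a-path: M regressed on X
    X = np.column_stack([np.ones(len(idx)), s, f])  # Y on M, controlling for X
    b = np.linalg.lstsq(X, p, rcond=None)[0][1]     # b-path coefficient
    return a * b

est = indirect_effect(np.arange(n))
boot = [indirect_effect(rng.integers(0, n, size=n)) for _ in range(2000)]
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])

# A 95% percentile CI that excludes zero indicates a significant indirect effect
print(ci_lo > 0)  # True
```

Full mediation then corresponds to a direct path that is no longer significant once the mediator is included; partial mediation to a direct path that shrinks but remains significant.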

6 Discussion

In the present study, the authors evaluated the factors directly linked to students' satisfaction and performance in online classes during COVID-19. Because of the global pandemic, governments shifted all colleges and universities to online mode. No one knew how long the pandemic would last, so teaching moved online; even educators who were not tech-savvy updated themselves to battle the unexpected circumstances (Pillai et al., 2021). The present results will help educators increase students' satisfaction and performance in online classes, and the research assists educators in understanding the different factors required for online teaching.

Whereas past studies examined the factors affecting student satisfaction in the conventional schooling framework, the present study was conducted during India's lockdown period to identify the prominent factors driving students' satisfaction with online classes, and it also explored the direct linkage between students' satisfaction and their performance. The findings indicate that instructor quality is the most prominent factor affecting student satisfaction in online classes. This means the instructor needs to be very effective during lectures and to understand students' psychology in order to deliver the course content prominently. If the teacher delivers the course content properly, it improves students' satisfaction and performance. The teacher's perspective is critical because their enthusiasm leads to a better-quality online learning process.

The study highlighted that the second most prominent factor affecting students' satisfaction in online classes is students' expectations. Students bring expectations to their classes; if the instructor understands those expectations and customizes the course design accordingly, students can be expected to perform better in examinations. The third factor affecting satisfaction is feedback: after delivering a course, instructors should gather appropriate feedback to plan future courses and shape future strategies (Tawafak et al., 2019). A proper feedback system is necessary for improvement because feedback is the real image of the course content. The last factor affecting students' satisfaction is course design. The course content needs to be designed effectively so that students can easily understand it; if the instructor plans the course so that students understand the content without problems, it leads to satisfaction, and students can perform better in exams. In some situations, course content is difficult to deliver online, such as practical components, e.g., recipes for dishes or laboratory demonstrations. In such situations, the instructor needs to be more creative in designing and delivering the content so that it positively impacts students' overall satisfaction with online classes.

Overall, the students agreed that online teaching was valuable for them, even though online classes were a first experience during the COVID-19 pandemic (Agarwal & Kaushik, 2020; Rajabalee & Santally, 2020). Some previous studies suggest that technology-supported courses have a positive relationship with students' performance (Cho & Schelzer, 2000; Harasim, 2000; Sigala, 2002). Demographic characteristics also play a vital role in understanding online course performance. According to the APA Work Group of the Board of Educational Affairs (1997), learner-centered principles suggest that students must be willing to invest the time required to complete individual course assignments. Online instructors must be enthusiastic about developing genuine instructional resources that actively connect learners and encourage proficient performance. Teachers and students share equal responsibility for better performance: when learners face problems understanding concepts, they need to ask the instructor for solutions (Bangert, 2004). Thus, we can conclude that instructor quality, students' expectations, prompt feedback, and effective course design significantly impact students' online learning.

7 Implications of the study

The results of this study have numerous significant practical implications for educators, students, and researchers. The study also contributes to the literature by demonstrating that multiple factors are responsible for student satisfaction and performance in online classes during the COVID-19 pandemic. It differs from previous studies (Baber, 2020; Ikhsan et al., 2019; Eom & Ashill, 2016), none of which examined the effect of students' satisfaction on their perceived academic performance. Previous empirical findings have highlighted the importance of examining the factors affecting student satisfaction (Maqableh & Jaradat, 2021; Yunusa & Umar, 2021), yet no study has examined the combined effect of course design, quality of instructor, prompt feedback, and students' expectations on students' satisfaction with online classes during the pandemic. The present study fills this research gap.

The first essential contribution of this study is that the instructor's facilitating role and competence affect students' satisfaction (Gray & DiLoreto, 2016). Instructors who taught online courses during the pandemic bore an extra obligation: they had to adapt to a changing climate, polish their technical skills along the way, and foster new students' technical knowledge in this environment. The present findings indicate that instructor quality is a significant determinant of student satisfaction in online classes amid a pandemic. In higher education, the teacher's standard refers to the instructor's individual characteristics before entering the class (Darling-Hammond, 2010), including content knowledge, pedagogical knowledge, inclination, and experience. More significantly, deeper understanding can be imparted by those with substantial technical expertise in the areas they teach (Martin, 2021). Secondly, the results contribute to the education profession by illustrating a realistic approach for effectively recognizing students' expectations in class. The primary expectation of most students before joining a university is employment, and instructors have agreed that they should do more to fulfill students' employment expectations (Gorgodze et al., 2020). Instructors can use this insight to balance expectations and improve student satisfaction. The results can be used to continually improve and build courses, as well as to inform policy decisions that improve education programs.
Thirdly, the results can help online course designers and instructors delve deeper into how to structure online courses more efficiently, including design features that minimize negative and maximize positive emotion, contributing to greater student satisfaction (Martin et al., 2018). The findings suggest that course design has a substantial positive influence on student performance in online classes: courses should present essential details such as course content, educational goals, course structure, and course outputs consistently, so that students find the e-learning system beneficial, use it, and perform better (Almaiah & Alyoussef, 2019). Lastly, the results indicate that instructors should respond to questions promptly and provide timely feedback on assignments; such techniques help students in online courses by improving instructor participation, instructor interaction, understanding, and engagement (Martin et al., 2018). Feedback helps students focus on the performance that enhances their learning.

Author information

Authors and affiliations

Chitkara College of Hospitality Management, Chitkara University, Chandigarh, Punjab, India

Ram Gopal & Varsha Singh

Chitkara Business School, Chitkara University, Chandigarh, Punjab, India

Arun Aggarwal


Corresponding author

Correspondence to Arun Aggarwal.

Ethics declarations

Ethics approval

Not applicable.

Conflict of interest

The authors declare no conflict of interest, financial or otherwise.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Gopal, R., Singh, V. & Aggarwal, A. Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19. Educ Inf Technol 26, 6923–6947 (2021). https://doi.org/10.1007/s10639-021-10523-1

Received : 07 December 2020

Accepted : 22 March 2021

Published : 21 April 2021

Issue Date : November 2021

DOI : https://doi.org/10.1007/s10639-021-10523-1


Keywords

  • Quality of instructor
  • Course design
  • Instructor’s prompt feedback
  • Expectations
  • Student’s satisfaction
  • Perceived performance


How Effective Is Online Learning? What the Research Does and Doesn’t Tell Us


Editor’s Note: This is part of a series on the practical takeaways from research.

The times have dictated school closings and the rapid expansion of online education. Can online lessons replace in-school time?

Clearly online time cannot provide many of the informal social interactions students have at school, but how will online courses do in terms of moving student learning forward? Research to date gives us some clues and also points us to what we could be doing to support students who are most likely to struggle in the online setting.

The use of virtual courses among K-12 students has grown rapidly in recent years. Florida, for example, requires all high school students to take at least one online course. Online learning can take a number of different forms. Often people think of Massive Open Online Courses, or MOOCs, where thousands of students watch a video online and fill out questionnaires or take exams based on those lectures.


Most online courses, however, particularly those serving K-12 students, have a format much more similar to in-person courses. The teacher helps to run virtual discussion among the students, assigns homework, and follows up with individual students. Sometimes these courses are synchronous (teachers and students all meet at the same time) and sometimes they are asynchronous (non-concurrent). In both cases, the teacher is supposed to provide opportunities for students to engage thoughtfully with subject matter, and students, in most cases, are required to interact with each other virtually.


Online courses provide opportunities for students. Students in a school that doesn’t offer statistics classes may be able to learn statistics with virtual lessons. If students fail algebra, they may be able to catch up during evenings or summer using online classes, and not disrupt their math trajectory at school. So, almost certainly, online classes sometimes benefit students.

In comparisons of online and in-person classes, however, online classes aren’t as effective as in-person classes for most students. Only a little research has assessed the effects of online lessons for elementary and high school students, and even less has used the “gold standard” method of comparing the results for students assigned randomly to online or in-person courses. Jessica Heppen and colleagues at the American Institutes for Research and the University of Chicago Consortium on School Research randomly assigned students who had failed second semester Algebra I to either face-to-face or online credit recovery courses over the summer. Students’ credit-recovery success rates and algebra test scores were lower in the online setting. Students assigned to the online option also rated their class as more difficult than did their peers assigned to the face-to-face option.

Most of the research on online courses for K-12 students has used large-scale administrative data, looking at otherwise similar students in the two settings. One of these studies, by June Ahn of New York University and Andrew McEachin of the RAND Corp., examined Ohio charter schools; I did another with colleagues looking at Florida public school coursework. Both studies found evidence that online coursetaking was less effective.

About this series


This essay is the fifth in a series that aims to put the pieces of research together so that education decisionmakers can evaluate which policies and practices to implement.

The conveners of this project—Susanna Loeb, the director of Brown University’s Annenberg Institute for School Reform, and Harvard education professor Heather Hill—have received grant support from the Annenberg Institute for this series.

To suggest other topics for this series or join in the conversation, use #EdResearchtoPractice on Twitter.

Read the full series here.

It is not surprising that in-person courses are, on average, more effective. Being in person with teachers and other students creates social pressures and benefits that can help motivate students to engage. Some students do as well in online courses as in in-person courses, some may actually do better, but, on average, students do worse in the online setting, and this is particularly true for students with weaker academic backgrounds.

Students who struggle in in-person classes are likely to struggle even more online. While the research on virtual schools in K-12 education doesn’t address these differences directly, a study of college students that I worked on with Stanford colleagues found very little difference in learning for high-performing students in the online and in-person settings. On the other hand, lower performing students performed meaningfully worse in online courses than in in-person courses.

But just because students who struggle in in-person classes are even more likely to struggle online doesn’t mean that’s inevitable. Online teachers will need to consider the needs of less-engaged students and work to engage them. Online courses might be made to work for these students on average, even if they have not in the past.

Just like in brick-and-mortar classrooms, online courses need a strong curriculum and strong pedagogical practices. Teachers need to understand what students know and what they don’t know, as well as how to help them learn new material. What is different in the online setting is that students may have more distractions and less oversight, which can reduce their motivation. The teacher will need to set norms for engagement—such as requiring students to regularly ask questions and respond to their peers—that are different than the norms in the in-person setting.

Online courses are generally not as effective as in-person classes, but they are certainly better than no classes. A substantial research base developed by Karl Alexander at Johns Hopkins University and many others shows that students, especially students with fewer resources at home, learn less when they are not in school. Right now, virtual courses are allowing students to access lessons and exercises and interact with teachers in ways that would have been impossible if an epidemic had closed schools even a decade or two earlier. So we may be skeptical of online learning, but it is also time to embrace and improve it.

A version of this article appeared in the April 01, 2020 edition of Education Week as How Effective Is Online Learning?


